Metric API: decimal numbers should be rounded, not returned in scientific notation
Expected Behavior
For example, when an input is stopped in Graylog, this metric goes down: org.graylog2.shared.buffers.InputBufferImpl.incomingMessages
We monitor this metric and compare it to a threshold. Both are plain numbers and can be compared.
Current Behavior
When the number drops far below 1, the API returns it in scientific notation, which many tools cannot handle. Example: 3.67007355196561e-06
Moreover, these numbers are meaningless. I currently see this metric in the API: org.apache.logging.log4j.core.Appender.error : 3.07894633440346e-120
e-120! That's a lot of zeros! And converted to a "normal" format, the number is extremely long.
My monitoring software cannot parse scientific notation, so when the number falls far below 1 and switches to scientific notation, it reports that there is no problem:
| CRITICAL | HARD | 3 | CRITICAL: org.graylog2.shared.buffers.InputBufferImpl.incomingMessages : 0.000544687209986029 |
| OK | HARD | 1 | OK: org.graylog2.shared.buffers.InputBufferImpl.incomingMessages : 3.67007355196561e-06 |
Possible Solution
Please round decimal numbers and do not use scientific notation for small numbers.
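As a workaround on the consumer side, the scientific-notation string can be converted to a plain, rounded decimal before the comparison. A minimal Java sketch, assuming a fixed number of decimal places (the helper name and the choice of 6 places are my own):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class PlainMetricValue {
    // Parse a metric value that may arrive in scientific notation and
    // return it as a plain decimal string rounded to `scale` places.
    public static String toPlain(String value, int scale) {
        return new BigDecimal(value)
                .setScale(scale, RoundingMode.HALF_UP)
                .toPlainString();
    }

    public static void main(String[] args) {
        // "3.67007355196561e-06" becomes "0.000004"
        System.out.println(toPlain("3.67007355196561e-06", 6));
        // Extreme values like 3.07894633440346e-120 round to "0.000000"
        System.out.println(toPlain("3.07894633440346e-120", 6));
    }
}
```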
Steps to Reproduce (for bugs)
Stop an input, watch the metric org.graylog2.shared.buffers.InputBufferImpl.incomingMessages, and wait for it to fall below 1.
Context
Your Environment
- Graylog Version: 3.3.6
- Java Version: java-1.8.0-openjdk-headless-1.8.0.265.b01
- Elasticsearch Version: 6.8
- MongoDB Version: 4.2.10
- Operating System: Windows 10
- Browser version: Firefox 82
Might be a good fix for 5.0.
Let's also check whether this format is exposed by our Prometheus exporter.
We are exporting our metrics as JSON. Scientific notation for double values is standard behaviour for JSON. We would have to force our JSON serialization to use a plain decimal form for double values when we serialize the metrics. Also, we should probably introduce configuration parameters to enable the new format and to allow setting a precision for rounding. Otherwise, it would be a potentially breaking change.
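For reference, a sketch of what forcing plain decimals could look like, assuming Jackson is the JSON serializer in use (the module name is hypothetical, and a real change would also need the configuration switches mentioned above):

```java
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;

import java.io.IOException;
import java.math.BigDecimal;

// Hypothetical module: overrides the default Double serializer so that
// values are written as plain decimal numbers instead of scientific notation.
public class PlainDoubleModule extends SimpleModule {
    public PlainDoubleModule() {
        addSerializer(Double.class, new JsonSerializer<Double>() {
            @Override
            public void serialize(Double value, JsonGenerator gen, SerializerProvider provider)
                    throws IOException {
                // toPlainString() never emits an exponent
                gen.writeNumber(new BigDecimal(value.toString()).toPlainString());
            }
        });
    }
}
```

Registering it via `new ObjectMapper().registerModule(new PlainDoubleModule())` would then serialize 3.67e-06 as 0.00000367.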
Our Prometheus exporter also uses scientific notation for double values.
For both our JSON export and Prometheus export of metrics, we are using third-party libraries, and this is their standard behaviour. I'm not really sold on the idea of interfering with this. For our JSON export, we could change the format with some effort, but I wouldn't touch the Prometheus export at all.
@boosty, @bernd, what do you think?
@thll I agree. Let's not touch the serialization. The scientific notation is part of the JSON spec. So every JSON parser must be able to read it. I agree that the notation is hard to read for humans, but it shouldn't be problematic for JSON parsers.
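A quick sanity check of this point: Java's own number parsing, like any JSON parser following RFC 8259, reads the exponent form without trouble, and the threshold comparison from the report works as expected (values taken from the examples above):

```java
public class SciNotationCheck {
    public static void main(String[] args) {
        // The value exactly as the metrics API returns it
        double v = Double.parseDouble("3.67007355196561e-06");
        double threshold = 0.001;
        // The comparison itself is unaffected by the notation
        System.out.println(v < threshold); // true
    }
}
```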
@thll Thanks for the investigation, and @bernd for your comment 👍
I agree with you. If this is valid behaviour, then let's close this. As there were no further comments or upvotes from other users, I assume that not too many people run into this issue.