This is about the development process of using the Grafana Simple JSON Datasource plugin to map template variable identifiers to text labels through HTTP requests, and more. This work has already yielded the »grafana-metadata-api«. Thanks to @roh, @weef and @wtf for supporting it.
Problem
Occasionally, we run into performance issues with InfluxDB. We are sure this is our own fault, as we did not take care of proper database design early on.
Reminder
Don’t have too many series
Tags containing highly variable information like UUIDs, hashes, and random strings will lead to a large number of series in the database, known colloquially as high series cardinality. High series cardinality is a primary driver of high memory usage for many database workloads.
We definitely want to store associated metadata in a separate storage location rather than inside the main InfluxDB database holding the measurement data, in order to counter the denormalization inherent in the current schema design.
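To make the intended split concrete, here is a hedged sketch; the measurement, tag, station and sensor names are made up for illustration. The idea is to keep only compact, stable identifiers in the InfluxDB tag set and resolve them to human-readable labels from a separate metadata store.

```python
# Denormalized (current situation): human-readable labels travel inside the
# tag set of every single data point, which bloats storage and drives up
# series cardinality.
point_denormalized = {
    "measurement": "ldi_readings",                      # made-up name
    "tags": {
        "station_name": "Ulmenstrasse, Dresden",        # text label as tag
        "sensor_type": "SDS011",
    },
    "fields": {"P1": 13.9, "P2": 8.4},
}

# Normalized (intended design): tag only by stable identifiers ...
point_normalized = {
    "measurement": "ldi_readings",
    "tags": {"station_id": "28", "sensor_id": "661"},   # compact IDs only
    "fields": {"P1": 13.9, "P2": 8.4},
}

# ... and keep the associated labels in a separate metadata store, e.g. a
# PostgreSQL/PostGIS table or a small HTTP metadata service.
station_labels = {"28": "Ulmenstrasse, Dresden"}
```

Grafana then only needs a way to translate those identifiers back into readable labels for the template variable UI, which is exactly what the Simple JSON Datasource approach below provides.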
Grafana feature request
The other day, Mark Bell asked for a feature which is also important to us:
Use Case: You may store metrics based on an ‘ID’ property but wish to have the template variable selection UI use a more human friendly label. e.g. You track metrics by domain with an internal domain ID but wish to use the domain’s URL in the template variable selector UI.
@torkelo I can take a cut at implementing this, what are your thoughts on implementation? For my specific use case I would want to be able to provide an arbitrary JS function to perform the value -> text conversion as I need to hit an external service for the lookup. I was thinking an initial implementation could be adding a config value in the dashboard JSON that defines the mapping function. UI support could be added later to handle more trivial mappings with pre built mapping functions (e.g. regex substitutions).
Also connected to this would be the ability to edit the full dashboard JSON via the UI, although export -> edit -> import would function as a workaround if this proves to be difficult.
Map Grafana template variable value to display text using the Simple JSON Datasource plugin.
Michael Everett solved it using the Simple JSON Datasource plugin:
For anyone who is interested I was able to solve my particular use case by using the Simple JSON Datasource plugin.
That said, the plugin did not support template variable queries at the time, but my pull request fixing the issue has been merged into master. An updated Grafana.net release of the plugin should follow at some point.
With it, you can use custom HTTP endpoints as data sources in Grafana; they only have to implement four methods. When used with query-based template variables, the HTTP endpoint will receive a /search API request whose body is a JSON object of the form { "target": "{template query content here}" }. You can parse the query content however you'd like.
Returning a plain array of values such as ["custom value 1", "custom value 2"] from your endpoint will create an underlying list of template variable values of the form [{ "text": "custom value 1", "value": 0 }], where the text property is the returned value of each array item and the value property is its index in the returned array.
Alternatively, you can return an array of text/value objects like [{ "text": "label", "value": 123 }], and Grafana will use the text property as the template variable's label and the value property as its raw value.
It is also possible to inject other template variables into the query in regex form and send them to the endpoint for processing.
This won't solve all aliasing scenarios, but having an arbitrary HTTP data source that can be used for template variables, including dynamically injecting other template variables into the template query, is a nice tool to have.
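Putting the quoted description together, such a template variable endpoint can be a very small web service. The following is a minimal sketch assuming Flask; the LABELS mapping, station IDs and port number are made-up placeholders, and in our setup the lookup would be backed by the metadata database instead.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical lookup table; in practice this would be backed by the
# metadata store (e.g. a PostgreSQL/PostGIS database).
LABELS = {
    28: "Ulmenstrasse, Dresden",
    1071: "Gerichtsstrasse, Berlin",
}

@app.route("/")
def health():
    # The Simple JSON Datasource probes this URL; answering 200 OK is enough.
    return "OK"

@app.route("/search", methods=["POST"])
def search():
    # Query-based template variables arrive as {"target": "<template query>"}.
    target = request.get_json(force=True).get("target", "")
    # Parse `target` however you like; this sketch ignores it and returns
    # all known stations as text/value objects.
    return jsonify([
        {"text": label, "value": station_id}
        for station_id, label in LABELS.items()
    ])

@app.route("/query", methods=["POST"])
@app.route("/annotations", methods=["POST"])
def unused():
    # The plugin expects these endpoints to exist as well; empty responses
    # are fine when the datasource is only used for template variables.
    return jsonify([])

if __name__ == "__main__":
    app.run(port=8000)
```

With the datasource pointed at such a service, a query-based template variable shows the text labels in the dropdown while interpolating the raw identifiers into the InfluxDB queries.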
To support interactive filtering, the data is currently stored in a highly denormalized form, as that was the most convenient thing to do when getting started with Grafana.
However, this is obviously inefficient in terms of both storage space and computation, as the text labels required for filtering are stored in each and every timeseries record.
While we gave this a try with the Environmental Metadata Library, we had to go through all of »Erneuerung der Luftdatenpumpe« to finally settle the foundation of the LDI data plane v2 on top of an appropriate PostGIS database. Holding this information loosely in JSON files just doesn't deliver on the requirements here.