Data acquisition for specific measurement fields gets rejected

Dear @Geraldine and @lorenzo,

we just found a minor issue with the data submission from one of your nodes, where the problem is probably at least partly on the backend side.

Have a look for yourself

You can also see it when subscribing to your error channels (following Fehlersignalisierung bei Datenakquise (Backend) and MQTT error signalling - Kotori DAQ), for example:

mosquitto_sub -h swarm.hiveeyes.org -p 1883 -t 'hiveeyes/+/lema-bkr/+/error.json' -v
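
For reference, roughly the same can be done from Python; a minimal sketch using the paho-mqtt 1.x callback API, with the broker and topic details from above:

import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    # Subscribe to the error channels of all nodes on this channel path.
    client.subscribe("hiveeyes/+/lema-bkr/+/error.json")

def on_message(client, userdata, message):
    # Print topic and error payload, similar to "mosquitto_sub ... -v".
    print(message.topic, message.payload.decode("utf-8"))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("swarm.hiveeyes.org", 1883)
client.loop_forever()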

One of the error messages is:

hiveeyes/57eaf23a-968f-4740-b747-137e3ab00580/lema-bkr/node-5/error.json {
    "timestamp": "2018-05-31T13:46:15+00:00",
    "message": "400: {\"error\":\"partial write: field type conflict: input field \\\"Humidite\\\" on measurement \\\"lema_bkr_node_5_sensors\\\" is type float, already exists as type string dropped=1\"}\n",
    "type": "<class 'influxdb.exceptions.InfluxDBClientError'>",
    "description": "Error processing MQTT message \"{\"Humidite\": \"56.20\"}\" from topic \"hiveeyes/57eaf23a-968f-4740-b747-137e3ab00580/lema-bkr/node-5/data.json\"."
}

Don’t panic

This problem is not uncommon - you are not alone with this specific issue. See also:

Background

Although InfluxDB does not use an explicit database schema definition, fields keep their data type once they have been created implicitly, so the data type of the first data point counts.

In the words of the InfluxDB documentation about Schemaless Design, this means:

InfluxDB is a schemaless database. You can add new measurements, tags, and fields at any time. Note that if you attempt to write data with a different type than previously used (for example, writing a string to a field that previously accepted integers), InfluxDB will reject those data.
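
To illustrate this, here is a minimal sketch using the Python influxdb client (the same library Kotori apparently uses, judging by the exception type above) which reproduces the rejection against a scratch database; host, database and measurement names are made up for this example:

from influxdb import InfluxDBClient
from influxdb.exceptions import InfluxDBClientError

client = InfluxDBClient(host="localhost", port=8086)
client.create_database("scratch")
client.switch_database("scratch")

# The first write creates the field "Humidite" with a string type.
client.write_points([{"measurement": "demo", "fields": {"Humidite": "0.40"}}])

# Writing a float to the same field is now rejected with a
# "field type conflict" error, just like in the error message above.
try:
    client.write_points([{"measurement": "demo", "fields": {"Humidite": 56.2}}])
except InfluxDBClientError as ex:
    print(ex)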

In our case, the field in question currently has a string data type:

> show field keys from lema_bkr_node_5_sensors

fieldKey fieldType
-------- ---------
Humidite string
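
The same check can also be run programmatically; a small sketch using the Python influxdb client, where host and database name are placeholders for the real ones on the backend:

from influxdb import InfluxDBClient

# Host and database name are placeholders for the real ones on the backend.
client = InfluxDBClient(host="localhost", port=8086, database="hiveeyes_example")
result = client.query('SHOW FIELD KEYS FROM "lema_bkr_node_5_sensors"')
for point in result.get_points():
    print(point["fieldKey"], point["fieldType"])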

Subsequent attempts to store measurements in the correct data format (float) into the same database/measurement collection will obviously fail. As our backend system Kotori does not have adequate feedback or recovery mechanisms in place yet, this issue requires manual resolution.

Solution

We have different options to solve this problem.

Abandon the field or collection

You can simply use a different field name when storing into the same collection, or you can abandon the collection altogether and use a different channel address for submitting subsequent measurement data. This is a countermeasure everyone can apply when running into this or similar issues.
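
As a sketch of both variants, using paho-mqtt; the new field name "Humidite2" and the new channel address "node-5b" are just examples, pick whatever fits your setup:

import json
import paho.mqtt.publish as publish

# Variant 1: keep the channel address, but use a fresh field name,
# so a new field with a float type gets created.
publish.single(
    "hiveeyes/57eaf23a-968f-4740-b747-137e3ab00580/lema-bkr/node-5/data.json",
    json.dumps({"Humidite2": 56.2}),
    hostname="swarm.hiveeyes.org", port=1883)

# Variant 2: keep the field name, but submit to a different channel address,
# which ends up in a fresh measurement collection.
publish.single(
    "hiveeyes/57eaf23a-968f-4740-b747-137e3ab00580/lema-bkr/node-5b/data.json",
    json.dumps({"Humidite": 56.2}),
    hostname="swarm.hiveeyes.org", port=1883)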

Fix the database

As the database is not yet accessible from the outside, we would have to go there and fix the data schema of the measurement collection. We will check if this is possible.

With kind regards,
Andreas.

Solution: Fix the database

After taking a quick look

> select * from lema_bkr_node_5_sensors;

time                Humidite
----                --------
1527580477290941518 0.40\n1
1527580477296935178 0.40\n1
1527580477842240877 0.40\n1
[...]
1527603728686691360 80\n-7
1527603728944774587 0\n-12
1527603729098979258 70\n-1
1527603764594225326 70\n-1
1527603785177458058 0\n-63
[...]

this reveals that the data seems to be “just garbage”, probably from an early test cycle. As this measurement collection also happens to contain only 39 data points

> select count(*) from lema_bkr_node_5_sensors;

time count_Humidite
---- --------------
0    39

we decided to just drop the measurement collection

> drop measurement lema_bkr_node_5_sensors;

Please get back to us if we should restore the old measurement from the backup; otherwise, we hope you are happy with the resolution of this problem.
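
For the record, one way to retain such points before dropping a measurement is to copy them into a separate measurement first; a minimal sketch using the Python influxdb client, with placeholder host, database and backup names:

from influxdb import InfluxDBClient

# Host and database name are placeholders for the real ones on the backend.
client = InfluxDBClient(host="localhost", port=8086, database="hiveeyes_example")

# Copy all points, keeping tags intact, into a backup measurement ...
client.query(
    'SELECT * INTO "lema_bkr_node_5_sensors_backup" '
    'FROM "lema_bkr_node_5_sensors" GROUP BY *',
    method="POST")

# ... and only then drop the original measurement.
client.query('DROP MEASUREMENT "lema_bkr_node_5_sensors"', method="POST")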

Immediately after dropping the collection, valid data seems to get stored

> select * from lema_bkr_node_5_sensors;

time                Humidite
----                --------
1527776942979978611 55.3
1527776943105662769 55.3
1527776943146680829 55.3
> select count(*) from lema_bkr_node_5_sensors;

time count_Humidite
---- --------------
0    42

Have fun!

With kind regards,
Andreas.


Hi Andreas,

thank you for your kind feedback.

I will take a look at it to fix that issue.

Regards

Lorenzo

Hi Lorenzo,

it should already be fixed, no worries. Unless this problem comes up again with one of your nodes, we can just stop caring about it ;].

With kind regards,
Andreas.