@Thias said it’s only the /* topic suffix which has been disabled. Before, he was receiving discrete payloads containing just a flat dictionary of decoded CayenneLPP fields, so it was easy to rewrite the MQTT topic address using an appropriate Mosquitto bridge configuration snippet in order to make Kotori ingest the measurement data.
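For reference, while the per-field topics were still being published, a bridge configuration along these lines could do such a topic rewrite. This is only a sketch: the application ID, access key, and local topic prefix are placeholders, not the exact snippet @Thias used.

```
# /etc/mosquitto/conf.d/ttn-bridge.conf -- sketch with placeholder credentials
connection ttn-eu
address eu.thethings.network:1883
remote_username my-ttn-app
remote_password ttn-account-v2.XXXXXXXX
try_private false
# Republish the remote "my-ttn-app/devices/+/up/#" topics locally
# under the "hiveeyes/my-ttn-app/" prefix.
topic devices/+/up/# in 0 hiveeyes/my-ttn-app/ my-ttn-app/
```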
While the decoder is no rocket science at all, we haven’t managed to integrate it into Kotori in time. I will now be traveling until the 7th of December or so and would like to apologize that this means some data loss for your hive monitoring setup.
The decoder is already half-finished within the ttnlogger program, so you might be able to pick it up from there and put it into something which republishes the telemetry data to the MQTT bus on swarm.hiveeyes.org in the appropriate format. However, you will probably have to put some hours of work into it to make that happen.
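A minimal sketch of such a republisher, assuming paho-mqtt, the TTN v2 broker at eu.thethings.network, and the Kotori topic convention hiveeyes/{user}/{location}/{hive}/data.json; the application ID, access key, and target topic below are placeholders.

```python
import json
import paho.mqtt.client as mqtt

APP_ID = "my-ttn-app"                   # placeholder TTN application ID
ACCESS_KEY = "ttn-account-v2.XXXXXXXX"  # placeholder TTN access key
TARGET_TOPIC = "hiveeyes/thias/balcony/hive1/data.json"  # placeholder

# Connection to the Hiveeyes MQTT bus where Kotori listens.
target = mqtt.Client()
target.connect("swarm.hiveeyes.org", 1883)
target.loop_start()

def on_connect(client, userdata, flags, rc):
    # Subscribe to all device uplinks of the TTN application.
    client.subscribe("{}/devices/+/up".format(APP_ID))

def on_message(client, userdata, msg):
    # Forward the flat dictionary of decoded fields to Kotori.
    uplink = json.loads(msg.payload)
    fields = uplink.get("payload_fields", {})
    if fields:
        target.publish(TARGET_TOPIC, json.dumps(fields))

source = mqtt.Client()
source.username_pw_set(APP_ID, ACCESS_KEY)
source.on_connect = on_connect
source.on_message = on_message
source.connect("eu.thethings.network", 1883)
source.loop_forever()
```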
If you really get into it, you might want to run it on a local Raspberry Pi machine or even on elbanco itself within a screen or tmux session as a provisional solution.
Sorry again and with kind regards,
Andreas.
Appendix
These are pointers to the appropriate functions of the TTNDatenpumpe and InfluxDatabase classes within ttnlogger.
We’ve just installed ttnlogger on elbanco; this might help you in the interim. The documentation about how to invoke this utility can be found within the README section Synopsis of ttnlogger.
It should be easy to figure out how to invoke ttnlogger. I would recommend using a test database/measurement first and, as soon as you can confirm everything is right, you might want to consider switching the output to your personal database/measurement.
In order to keep it running after logout, you might want to invoke it within a tmux session, like:
tmux new -s ttn-thias ttnlogger testdrive ttn-account-v2.UcOZ3_gRRVbzsJ1lR7WfuINLN_DKIlc9oKvgukHPGck testdb data
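You can detach from that session with Ctrl-b d and pick it up again later with tmux attach -t ttn-thias.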
Thanks! It’s basically working after a (Python 2 -> 3?) bugfix.
It’s running right now in a screen environment on elbanco and data is visible in a dashboard duplicated from my original live data dashboard.
The next thing would be to get rid of the database and measurement arguments in favor of deriving both from the dev_id. I’ll see what I can do, but I don’t expect to get this implemented properly by myself, since some structural rewriting is needed.
I somehow got it working with dynamic InfluxDB database and measurement assignments which are derived from the dev_id. The PR has been extended accordingly.
The code on elbanco is now running from a separate branch, which can be deleted before pulling my commits to master.
With this change implemented, the TTN dev_ids will have to follow a certain scheme:
A dev_id of the form hiveeyes-USER-LOCATION-NAMEOFHIVE will be written to the database hiveeyes-USER, while the measurement will be LOCATION-NAMEOFHIVE; replace the uppercase placeholders with your individual (lowercase) names.
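In code, the mapping could be sketched like this; the helper name is hypothetical (not the actual ttnlogger implementation), and dashes are assumed to appear only as separators between the first three parts.

```python
def decode_devid(dev_id):
    """Derive InfluxDB database and measurement names from a TTN dev_id,
    e.g. "hiveeyes-thias-balcony-hive1" -> ("hiveeyes-thias", "balcony-hive1").
    """
    realm, user, location, hive = dev_id.split("-", 3)
    return "{}-{}".format(realm, user), "{}-{}".format(location, hive)
```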
For performance reasons we unfortunately had to disable publishing Uplink Fields to MQTT in our EU region. This means that the values from payload decodes are no longer published to individual [AppID]/devices/[DevID]/up/[Field] topics. Until this is resolved, you can subscribe to [AppID]/devices/[DevID]/up and read the payload_fields field from the uplinks you receive there.
Nov 18, 2019 - 11:00 CET
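To verify that workaround quickly, one can watch the single uplink topic directly with mosquitto_sub; the application ID and access key below are placeholders.

```
mosquitto_sub -h eu.thethings.network -u 'my-ttn-app' -P 'ttn-account-v2.XXXXXXXX' -t 'my-ttn-app/devices/+/up' -v
```

Each message is a JSON document whose payload_fields attribute carries the flat dictionary of decoded CayenneLPP readings.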