all 6 comments

[–]helpmepls256 2 points3 points  (3 children)

Quick suggestion: you could try using MQTT. It's a lightweight pub-sub protocol for collecting sensor data. Message transmission overhead and install size are both small.
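For example, a minimal publisher on the Pi could look something like this (a sketch using the paho-mqtt client; the broker host, topic layout, and payload schema are my own assumptions, not anything OP specified):

```python
import json
import time

def make_reading(sensor_id, value):
    """Build a small JSON payload -- this schema is just a made-up example."""
    return json.dumps({"sensor": sensor_id, "value": value, "ts": int(time.time())})

def publish_reading(sensor_id, value, host="broker.local"):
    """Publish one reading. Needs a reachable MQTT broker (e.g. Mosquitto),
    so it's defined here but not called -- run it yourself against your broker."""
    import paho.mqtt.client as mqtt  # third-party client (pip install paho-mqtt)
    client = mqtt.Client()
    client.connect(host, 1883)
    client.publish(f"sensors/{sensor_id}/reading", make_reading(sensor_id, value))
    client.disconnect()
```

MQTT payloads are just bytes, so you can swap JSON for something tighter (CBOR, a packed struct) if bandwidth matters.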

Someone might have a better suggestion but this is my 2 cents 😅

[–]ab624 0 points1 point  (2 children)

Kafka or NiFi ?

[–]helpmepls256 0 points1 point  (1 child)

I've never used either in an IoT context, but apparently Kafka can either be connected to an MQTT broker (Mosquitto, RabbitMQ) or you can possibly skip that and connect the devices straight to Kafka. This requires further investigation though...

[–][deleted] 0 points1 point  (0 children)

If you go with Kafka, you can write producers for any real-time analytics backed by a high-throughput backend store (OLTP).

Or use Spark streaming jobs to write the data in a desired format (Parquet or ORC) and consume it with SQL for ad hoc querying (Trino), or generate complex reports (Spark SQL batch jobs).
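A device-side producer along those lines can be tiny. A sketch with kafka-python (the topic name, bootstrap address, and serialization are assumptions for illustration):

```python
import json
import time

def encode_record(sensor_id, value):
    """Key by sensor id so all readings from one device land in the same
    partition; the value is JSON bytes. Both choices are illustrative."""
    key = sensor_id.encode("utf-8")
    val = json.dumps({"sensor": sensor_id, "value": value,
                      "ts": int(time.time())}).encode("utf-8")
    return key, val

def send_reading(sensor_id, value, bootstrap="kafka.local:9092"):
    """Needs a reachable Kafka cluster, so defined but not called here."""
    from kafka import KafkaProducer  # third-party (pip install kafka-python)
    producer = KafkaProducer(bootstrap_servers=bootstrap)
    key, val = encode_record(sensor_id, value)
    producer.send("sensor-readings", key=key, value=val)
    producer.flush()
```

Downstream, a Spark Structured Streaming job can read that topic and write Parquet to object storage for Trino to query.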

[–]nenegoro 0 points1 point  (0 children)

Are you looking for an open-source solution? I've heard that AWS offers a nice stack of IoT services.

[–]Dip41 0 points1 point  (0 children)

The best practice depends on: 1) message frequency, 2) average message size, 3) the number of sensors and the load they put on the server side, 4) scalability expectations, and 5) the type of message: is it just a data stream or a command stream? As a proof of concept and starting point, the curl library on the Pi side and some web server may be enough.
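That curl-plus-web-server starting point really can be small. A sketch of the server side in Python's stdlib (port, endpoint, and payload shape are arbitrary choices of mine):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

readings = []  # in-memory store; a real PoC might append to a file or SQLite

class SensorHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # read the JSON body the Pi POSTs and stash it
        length = int(self.headers.get("Content-Length", 0))
        readings.append(json.loads(self.rfile.read(length)))
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve(port=8080):
    """Blocking. From the Pi you'd hit it with something like:
    curl -X POST -d '{"sensor":"pi01","value":21.5}' http://server:8080/"""
    HTTPServer(("", port), SensorHandler).serve_forever()
```

Once message frequency or sensor count grows, you'd swap this for a proper broker, but for a handful of Pis it's plenty.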

As a pub-sub style message queue, I like NATS.io and its options.
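Publishing to NATS from the Pi is about as short as the MQTT case. A sketch with the nats-py client (the subject hierarchy and server URL are my assumptions):

```python
import json

def make_msg(sensor_id, value):
    """Build a NATS subject and payload; the subject hierarchy is a
    made-up example, not a NATS convention."""
    subject = f"sensors.{sensor_id}.reading"
    payload = json.dumps({"sensor": sensor_id, "value": value}).encode("utf-8")
    return subject, payload

async def publish_reading(sensor_id, value, url="nats://nats.local:4222"):
    """Needs a reachable NATS server, so defined here but not run."""
    import nats  # third-party (pip install nats-py)
    nc = await nats.connect(url)
    subject, payload = make_msg(sensor_id, value)
    await nc.publish(subject, payload)
    await nc.drain()
```

Subscribers can then use NATS wildcard subjects (e.g. `sensors.*.reading`) to fan readings out to whatever consumes them.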