InfluxDB

Needs advice on TimescaleDB, MongoDB, and InfluxDB

Hi all, I am trying to decide on a database for time-series data. The data could be a simple series, like statistics over time, or it could be a multi-level nested JSON. I have been experimenting with InfluxDB for the former case, a simple list of variables over time, and the continuous queries are powerful too. The complexity arises in the latter case, where InfluxDB requires the nested JSON to be flattened before it is saved. The nested JSON could contain objects, lists of objects, or objects under objects, and a complete flattening doesn't leave the data in a state that supports the queries I have in mind.

[
  {
    "timestamp": "2021-09-06T12:51:00Z",
    "name": "Name1",
    "books": [
      { "title": "Book1", "page": 100 },
      { "title": "Book2", "page": 280 }
    ]
  },
  {
    "timestamp": "2021-09-06T12:52:00Z",
    "name": "Name2",
    "books": [
      { "title": "Book1", "page": 320 },
      { "title": "Book2", "page": 530 },
      { "title": "Book3", "page": 150 }
    ]
  }
]

Sample query: within a time range, for name xyz, find all book titles for which the number of pages is less than 400.

If I flatten it completely, it results in fields like books_0_title, books_0_page, books_1_title, books_1_page, and so on. By losing the nested context, it becomes hard to return one field (title) where a condition on another field (page) is satisfied.
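One direction I'm considering instead of flattening is to write one point per book, so title and page stay on the same row; here is a rough sketch with the influxdb-client Python library (the bucket and measurement names are placeholders, and if titles are unbounded the usual tag-cardinality caveat applies):

# Sketch only: model each book as its own point instead of flattening.
# "library" (bucket) and "book_stats" (measurement) are made-up names.
from datetime import datetime

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
write_api = client.write_api(write_options=SYNCHRONOUS)

record = {
    "timestamp": "2021-09-06T12:51:00Z",
    "name": "Name1",
    "books": [{"title": "Book1", "page": 100}, {"title": "Book2", "page": 280}],
}

# One point per book: name and title become tags, page stays a field.
ts = datetime.fromisoformat(record["timestamp"].replace("Z", "+00:00"))
for book in record["books"]:
    point = (
        Point("book_stats")
        .tag("name", record["name"])
        .tag("title", book["title"])
        .field("page", book["page"])
        .time(ts)
    )
    write_api.write(bucket="library", record=point)

# The sample query then becomes a plain filter instead of a cross-field lookup.
flux = '''
from(bucket: "library")
  |> range(start: 2021-09-06T12:00:00Z, stop: 2021-09-06T13:00:00Z)
  |> filter(fn: (r) => r._measurement == "book_stats" and r.name == "Name1")
  |> filter(fn: (r) => r._field == "page" and r._value < 400)
  |> keep(columns: ["_time", "title", "_value"])
'''
for table in client.query_api().query(flux):
    for row in table.records:
        print(row.values["title"], row.get_value())

With that shape, the page condition is a single filter rather than a scan across books_N_* fields.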

I appreciate any suggestions. Even generic advice on handling time-series data and choosing a database is welcome!

Software Developer (IoT) at Vappar
Needs advice on React, Python, and InfluxDB

I am facing a problem with high latency. How can I make it real-time?

My stack is Python (reading data from serial), InfluxDB (data store), and React (front end showing the data on a web app via an API).

The cycle is: read data, store it, and then show it on the front end via the API (from the DB).

Replies (2)
Software Design Engineer at AN10

Hi,

If you are pulling a huge amount of data from the DB, you can enable pagination to fetch it in chunks, and while pulling from the DB, only select the information you actually need to show on the frontend.
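A rough sketch of that idea with the influxdb-client Python library (the bucket, measurement, and page size below are placeholders):

# Sketch only: page through results with Flux limit()/offset so the API never
# ships the whole range to the frontend in one response.
from influxdb_client import InfluxDBClient

client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
query_api = client.query_api()

PAGE = 500
offset = 0
while True:
    flux = f'''
    from(bucket: "sensors")
      |> range(start: -1h)
      |> filter(fn: (r) => r._measurement == "serial_reading")
      |> keep(columns: ["_time", "_field", "_value"])
      |> limit(n: {PAGE}, offset: {offset})
    '''
    tables = query_api.query(flux)
    rows = [record.values for table in tables for record in table.records]
    if not rows:
        break
    # hand this chunk to the API response / frontend here
    offset += PAGE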

Software Architect at Northern Gas and Power

Hi Muhammed, chances are the high latency is due to one of the following reasons:

1. The Python API and InfluxDB are not located in the same region (e.g. the API machine instance is in us-west and the InfluxDB machine instance is in us-east).

2. The Python API is connecting to InfluxDB via a public (external) IP instead of a private internal IP.
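A quick way to confirm which of these applies is to time a trivial request against InfluxDB's /ping endpoint from the machine running the Python API, once via the internal address and once via the public one (the URLs below are placeholders):

# Sketch only: compare round-trip time to InfluxDB over the private vs. public address.
import time
import requests

for url in ("http://10.0.0.12:8086/ping",            # placeholder private/internal IP
            "http://influx.example.com:8086/ping"):  # placeholder public address
    start = time.perf_counter()
    try:
        resp = requests.get(url, timeout=5)
        elapsed_ms = (time.perf_counter() - start) * 1000
        print(f"{url}: HTTP {resp.status_code} in {elapsed_ms:.1f} ms")
    except requests.RequestException as exc:
        print(f"{url}: failed ({exc})")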

Muhammad Asif Azam · December 18th 2020 at 8:20PM

No sir, I am working on a local system and I have 5-8 seconds of latency, which is too high if any alarm or event occurs. I want something like read-and-show: skip the step of storing in the database and then having the front end query the DB before showing the data.
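A minimal sketch of that read-and-show idea, assuming the Python reader pushes each value straight to the browser over a WebSocket and the InfluxDB write happens off the hot path (the websockets library, port, and payload shape are placeholders; the handler signature matches websockets 11+):

# Sketch only: push readings to the browser as they arrive instead of waiting
# on a store-then-query round trip. The serial read is stubbed out.
import asyncio
import json

import websockets  # pip install websockets


async def read_serial():
    # Placeholder for the real pyserial read loop; yields one reading per second.
    value = 0
    while True:
        await asyncio.sleep(1)
        value += 1
        yield {"value": value}


async def stream(websocket):
    async for reading in read_serial():
        await websocket.send(json.dumps(reading))
        # a fire-and-forget InfluxDB write could happen here, off the hot path


async def main():
    async with websockets.serve(stream, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever


if __name__ == "__main__":
    asyncio.run(main())

The React app would then open ws://host:8765 and render messages as they arrive, instead of polling the API.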

Anand Babu Vinayagam · January 4th 2021 at 5:36AM

A Firestore database would be a better option for front-end queries to the database, but it does not support SQL.

Needs advice on Logstash, Grafana, and Go

Hi everyone. I'm trying to build my own syslog monitoring setup.

  1. To get the logs, I'm unsure which way to go:
     1.1 Use Logstash as a TCP server.
     1.2 Implement a Go TCP server (a rough sketch of this option follows the list).

  2. To store and plot the data:
     2.1 Use the Elasticsearch tools.
     2.2 Use InfluxDB and Grafana.
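A tiny sketch of the TCP-receiver idea from option 1.2, written here in Python for brevity; the same accept/read-lines structure translates directly to Go's net package, and the port is arbitrary:

# Sketch only: accept TCP connections and read newline-delimited syslog frames.
import asyncio


async def handle_client(reader, writer):
    peer = writer.get_extra_info("peername")
    while line := await reader.readline():
        message = line.decode(errors="replace").rstrip()
        # parse / forward to storage (InfluxDB, Loki, Elasticsearch, ...) here
        print(f"{peer}: {message}")
    writer.close()
    await writer.wait_closed()


async def main():
    server = await asyncio.start_server(handle_client, "0.0.0.0", 5140)
    async with server:
        await server.serve_forever()


if __name__ == "__main__":
    asyncio.run(main())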

I would like to know: which is the cheaper and more scalable solution?

Or even if there is a better way to do it.

Replies (3)
Recommends Loki and Grafana

Hi Juan,

A very simple and cheap (in resource usage) option here would be to use Promtail to send syslog data to Loki and visualise it in Grafana using the native Loki data source. I have recently put together this setup; Promtail and Loki are less resource intensive than Logstash/Elasticsearch, the setup and configuration are simple, and it works very nicely.

Sunil Chaudhari · October 27th 2021 at 1:23AM

Hi,

Is Promtail available for PCF?

Gary Wilson · October 27th 2021 at 1:38PM

Hi @sunilmchaudhari, I do not know. I assume by PCF you are referring to Pivotal Cloud Foundry, which I have no knowledge of, sorry. Promtail is a Go binary, so if you can get the log data into a syslog, you can process it with Promtail.

Team Lead at XYZ

For syslog, you can certainly use the TCP input. I'm really interested to know what your syslog client is (i.e. what will ship the logs to Logstash). In any case, check whether that client can be configured with multiple Logstash host/port pairs so that it acts as a load balancer; this will increase throughput.

Also check Logstash's pipeline-to-pipeline communication: https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html. This helps implement the distributor pattern, where multiple types of data arrive at the same input and you want to route filtering and processing based on type. It increases parallelism.

About Elasticsearch: it's a native component and fits perfectly with Logstash, so you can use Elasticsearch for storage and search. It's also one of Grafana's data sources.
