Hi everyone. I'm trying to build my own syslog monitoring setup.
To collect the logs, I'm unsure which way to go:
1.1 Use Logstash as a TCP server.
1.2 Implement a Go TCP server.
To store and plot the data:
2.1 Use the Elasticsearch tools.
2.2 Use InfluxDB and Grafana.
I would like to know which is the cheaper and more scalable solution, or whether there is a better way to do it altogether.
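For reference, option 1.2 can be sketched in a few dozen lines. This is a minimal, self-contained illustration, not a production server: it listens on an ephemeral localhost port, feeds itself one RFC 3164-style line to stay runnable, and only decodes the `<PRI>` field (the facility/severity math and the helper name `parsePriority` are my own choices here, not anything from the thread).

```go
package main

import (
	"bufio"
	"fmt"
	"net"
	"strconv"
	"strings"
)

// parsePriority decodes the leading "<PRI>" field of a syslog line,
// e.g. "<13>Oct 11 ...". PRI = facility*8 + severity.
func parsePriority(line string) (facility, severity int, rest string, err error) {
	if !strings.HasPrefix(line, "<") {
		return 0, 0, "", fmt.Errorf("no priority field")
	}
	end := strings.Index(line, ">")
	if end < 0 {
		return 0, 0, "", fmt.Errorf("unterminated priority field")
	}
	pri, err := strconv.Atoi(line[1:end])
	if err != nil {
		return 0, 0, "", err
	}
	return pri / 8, pri % 8, line[end+1:], nil
}

func main() {
	// Listen on an ephemeral port so the sketch needs no free well-known port.
	ln, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		panic(err)
	}
	defer ln.Close()

	done := make(chan string)
	go func() {
		conn, err := ln.Accept()
		if err != nil {
			panic(err)
		}
		defer conn.Close()
		// Syslog over TCP is newline-framed, so a line scanner suffices here.
		scanner := bufio.NewScanner(conn)
		for scanner.Scan() {
			fac, sev, msg, err := parsePriority(scanner.Text())
			if err != nil {
				continue // skip malformed lines
			}
			done <- fmt.Sprintf("facility=%d severity=%d msg=%s", fac, sev, msg)
		}
	}()

	// Simulate a syslog client sending one message.
	conn, err := net.Dial("tcp", ln.Addr().String())
	if err != nil {
		panic(err)
	}
	fmt.Fprintf(conn, "<13>Oct 11 22:14:15 myhost app: hello\n")
	conn.Close()

	fmt.Println(<-done) // facility=1 severity=5 msg=Oct 11 22:14:15 myhost app: hello
}
```

A real server would accept many connections in a loop, handle timeouts, and forward the parsed records to your storage backend.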
Hi Juan,
A very simple and cheap (in resource usage) option here would be to use Promtail to ship syslog data to Loki and visualise it in Grafana using the native Loki data source. I recently put this setup together: Promtail and Loki are less resource-intensive than Logstash/Elasticsearch, the setup and configuration are simple, and it works very nicely.
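The setup described above needs only a small Promtail config. A minimal sketch, assuming a local Loki on port 3100 and syslog listening on 1514 (both port choices are assumptions; adjust to your environment):

```yaml
# promtail-config.yaml (sketch): receive syslog over TCP, push to Loki.
server:
  http_listen_port: 9080

positions:
  filename: /tmp/positions.yaml

clients:
  - url: http://localhost:3100/loki/api/v1/push   # assumed local Loki

scrape_configs:
  - job_name: syslog
    syslog:
      listen_address: 0.0.0.0:1514   # point your syslog clients here
      labels:
        job: syslog
    relabel_configs:
      - source_labels: [__syslog_message_hostname]
        target_label: host
```

With this in place, Grafana only needs the Loki data source pointed at the same Loki instance.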
Hi @sunilmchaudhari, I do not know. I assume by PCF you are referring to Pivotal Cloud Foundry, which I have no knowledge of, sorry. Promtail is a Go binary, so if you can get log data into syslog, you can process it with Promtail.
For syslog, you can certainly use the TCP input. I'm really interested to know what your syslog client is (i.e. what will ship logs to Logstash). In any case, check whether that client can be configured with multiple Logstash host:port pairs so that it acts as a load balancer; this will increase throughput.

Also check Logstash's pipeline-to-pipeline communication: https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html This helps implement the distributor pattern, where multiple types of data arrive on the same input and you route filtering and processing based on type. It increases parallelism.

About Elasticsearch: it is a native component that fits Logstash perfectly, so you can use Elasticsearch for storage and search. It is also one of Grafana's data sources.
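The distributor pattern mentioned above is configured in `pipelines.yml`. A rough sketch, assuming a single syslog type routed to one processing pipeline (the port, pipeline names, and Elasticsearch URL are placeholders):

```yaml
# pipelines.yml (sketch): intake pipeline fanning out by type.
- pipeline.id: intake
  config.string: |
    input { tcp { port => 5140 type => "syslog" } }
    output {
      if [type] == "syslog" {
        pipeline { send_to => ["syslog-processing"] }
      }
    }
- pipeline.id: syslog-processing
  config.string: |
    input { pipeline { address => "syslog-processing" } }
    filter { grok { match => { "message" => "%{SYSLOGLINE}" } } }
    output { elasticsearch { hosts => ["http://localhost:9200"] } }
```

Adding another data type is then just another conditional branch in the intake pipeline's output, plus its own processing pipeline.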