Experience working with end-user business teams to plan, architect, and implement log-analytics solutions using Elasticsearch, Logstash, Beats, Kibana and Kafka
– Experience architecting and implementing Kafka, Kafka Connect and various plugins for enterprise use cases at scale
– Implement data-ingestion and data-transformation pipelines using Logstash and/or Elastic Beats that conform to a logical data model.
– Design and develop data ingestion scripts to load data into Elasticsearch from disparate custom data sources.
– Design and develop data-quality improvement methods and scripts to address the completeness, currency, accuracy and availability of the data.
– Enable API-based data consumption from Elasticsearch by external applications
– Write efficient queries to extract data from Elasticsearch as payloads for visualization in external dashboards.
– Manage Kafka and Elasticsearch clusters while creating tools to automate operations and improve the reliability and performance of the clusters.
– Participate in capacity planning and growth projections.
– Support application deployments in production environments, including performance and functional testing.
– Monitor and maintain system health and ensure efficient operation; automate common tasks.
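The ingestion and query responsibilities above can be sketched in Python. This is a minimal, illustrative example: the index name `logs-app` and the `level`/`@timestamp` field names are assumptions, not taken from the posting. It builds the NDJSON body expected by Elasticsearch's `_bulk` API and an aggregation query suitable as a dashboard payload, without requiring a live cluster:

```python
import json


def build_bulk_body(index, docs):
    """Build an NDJSON body for Elasticsearch's _bulk API.

    Each document is preceded by an "index" action line; the trailing
    newline is required by the _bulk endpoint.
    """
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    return "\n".join(lines) + "\n"


def build_error_histogram_query(since="now-1h"):
    """Query payload counting ERROR-level log events per minute,
    usable as the data source for an external dashboard."""
    return {
        "size": 0,
        "query": {
            "bool": {
                "filter": [
                    {"term": {"level": "ERROR"}},
                    {"range": {"@timestamp": {"gte": since}}},
                ]
            }
        },
        "aggs": {
            "errors_per_minute": {
                "date_histogram": {
                    "field": "@timestamp",
                    "fixed_interval": "1m",
                }
            }
        },
    }


if __name__ == "__main__":
    # Hypothetical index and document, for illustration only.
    body = build_bulk_body("logs-app", [{"level": "ERROR", "msg": "boom"}])
    print(body, end="")
    print(json.dumps(build_error_histogram_query(), indent=2))
```

In practice the bulk body would be POSTed to `/_bulk` with `Content-Type: application/x-ndjson` (or sent via the official `elasticsearch` Python client's bulk helpers), and the query to `/logs-app/_search`.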
Required Qualifications:
– At least 3 years of development experience with the ELK Stack (Elasticsearch, Logstash, Kibana) and Kafka
– Experience with cross-cluster replication, index lifecycle management (ILM) and hot-warm architectures
– Experience in installation and configuration of Elasticsearch, Logstash and Kibana in a distributed environment
– Experience in installation and configuration of Kafka in a distributed environment
– Current hands-on experience with at least two programming or scripting languages, e.g. Python, Java, Scala, shell scripting
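The ILM and hot-warm qualification above can be illustrated with a minimal policy sketch. The phase timings, sizes, and the `data: warm` node attribute are illustrative assumptions, not requirements from the posting; the dictionary is the JSON body that would be sent with `PUT _ilm/policy/<policy-name>`:

```python
import json

# Sketch of an ILM policy for a hot-warm architecture (all timings and
# sizes are illustrative): roll over the active index on the hot tier,
# migrate indices to warm-attributed nodes after 2 days, and delete
# them after 30 days.
logs_policy = {
    "policy": {
        "phases": {
            "hot": {
                "actions": {
                    "rollover": {"max_size": "50gb", "max_age": "1d"}
                }
            },
            "warm": {
                "min_age": "2d",
                "actions": {
                    # "data: warm" is a hypothetical custom node attribute.
                    "allocate": {"require": {"data": "warm"}},
                    "forcemerge": {"max_num_segments": 1},
                },
            },
            "delete": {
                "min_age": "30d",
                "actions": {"delete": {}},
            },
        }
    }
}

if __name__ == "__main__":
    # In practice: PUT _ilm/policy/logs-policy with this JSON body.
    print(json.dumps(logs_policy, indent=2))
```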
– Salary: INR / month
– Experience: 3–6 years
– Qualification: Graduation