DevOps Engineer (Elasticsearch Experience)

Infinitive Inc, McLean, VA, US
Job type: Full-time
Job description

About Infinitive:

  • Infinitive is a data and AI consultancy that enables its clients to modernize, monetize and operationalize their data to create lasting and substantial value.
  • We possess deep industry and technology expertise to drive and sustain adoption of new capabilities.
  • We match our people and personalities to our clients' culture while bringing the right mix of talent and skills to enable high return on investment.
  • Infinitive has been named to Consulting Magazine's “Best Small Firms to Work For” list seven times, most recently in 2024.

Infinitive has also been named a Washington Post “Top Workplace”, a Washington Business Journal “Best Places to Work”, and a Virginia Business “Best Places to Work.”

About the Role:

  • We are seeking a skilled DevOps Engineer with data engineering experience to join our dynamic team.
  • The ideal candidate will have expertise in Elasticsearch, CI/CD, Git, and Infrastructure as Code (IaC), along with hands-on data engineering experience.
  • You will be responsible for designing, automating, and optimizing infrastructure, deployment pipelines, and data workflows.
  • This role requires close collaboration with data engineers, software developers, and operations teams to build scalable, secure, and high-performance data platforms.
Key Responsibilities:

DevOps & Infrastructure Management:

  • Design, deploy, and manage Elasticsearch clusters, ensuring high availability, scalability, and performance for search and analytics workloads.
  • Develop and maintain CI/CD pipelines for automating build, test, and deployment processes using tools like Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
  • Manage and optimize version control workflows using Git, ensuring best practices for branching, merging, and release management.
  • Implement Infrastructure as Code (IaC) solutions using Terraform, CloudFormation, or Ansible for cloud and on-prem infrastructure.
  • Automate system monitoring, alerting, and incident response using tools such as Prometheus, Grafana, Elastic Stack (ELK), or Datadog.
Data Engineering & Pipeline Automation:

  • Collaborate with data engineering teams to design and deploy scalable ETL/ELT pipelines using Apache Kafka, Apache Spark, Kinesis, Pub/Sub, Dataflow, Dataproc, or AWS Glue.
  • Optimize data storage and retrieval for large-scale analytics and search workloads using Elasticsearch, BigQuery, Snowflake, Redshift, or ClickHouse.
  • Ensure data pipeline reliability and performance, implementing monitoring, logging, and alerting for data workflows.
  • Automate data workflows and infrastructure scaling for high-throughput real-time and batch processing environments.
  • Implement data security best practices, including access controls, encryption, and compliance with industry standards such as GDPR, HIPAA, or SOC 2.
Required Skills & Qualifications:

  • 3+ years of experience in DevOps, Data Engineering, or Infrastructure Engineering.
  • Strong expertise in Elasticsearch, including cluster tuning, indexing strategies, and scaling.
  • Hands-on experience with CI/CD pipelines using Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
  • Proficiency in Git for version control, branching strategies, and code collaboration.
  • Experience with Infrastructure as Code (IaC) using Terraform, CloudFormation, Ansible, or Pulumi.
  • Solid experience with cloud platforms (AWS, GCP, or Azure) and cloud-native data engineering tools.
  • Proficiency in Python, Bash, or Scala for automation, data processing, and infrastructure scripting.
  • Hands-on experience with containerization and orchestration (Docker, Kubernetes, Helm).
  • Experience with data engineering tools, including Apache Kafka, Spark Streaming, Kinesis, Pub/Sub, or Dataflow.
  • Strong understanding of ETL/ELT workflows and distributed data processing frameworks.
Preferred Qualifications:

  • Experience working with data warehouses and lakes (BigQuery, Snowflake, Redshift, ClickHouse, S3, GCS).
  • Knowledge of monitoring and logging solutions for data-intensive applications.
  • Familiarity with security best practices for data storage, transmission, and processing.
  • Understanding of event-driven architectures and real-time data processing frameworks.
  • Certifications such as AWS Certified DevOps Engineer, Google Cloud Professional Data Engineer, or Certified Kubernetes Administrator (CKA).