Data Engineer, AWS • Nashville, Tennessee, United States

Responsibilities:
- Software and Data Engineering: Use your expertise in Java and Kubernetes to build and maintain robust data solutions. Experience with DevOps and monitoring solutions, along with strong knowledge of the SDLC, is required. Build and manage dozens of data pipelines that source and transform data to meet business requirements.
- Innovation and Learning: Quickly learn new technologies by applying your current skills and staying ahead of industry trends. Identify the new skills you need to develop and adopt new technologies into your skill set within a month.
- Team Collaboration: Collaborate within a Pod of 4+ data engineers, working toward common objectives and engaging clients in a consultative fashion.
- Data Movement and Transformation: Use Apache NiFi and Apache Flink for data movement, streaming, and transformation services, ensuring efficient and reliable data workflows (a minimal Flink sketch appears at the end of this posting).

Qualifications:
- 3+ years of experience in data and cloud application engineering.
- 2+ years of experience working with Snowflake and AWS cloud services.
- 3+ years of experience in a data warehousing environment.
- 3+ years of experience building data pipelines (e.g., NiFi, dbt, Apache Airflow, or Matillion).

Technical Skills:
- Java: Advanced Java programming skills.
- Kubernetes: Proficiency with Kubernetes for container orchestration.
- AWS: Strong knowledge of AWS services.
- Prometheus: Expertise in using Prometheus for monitoring (see the monitoring sketch at the end of this posting).
- GitHub: Proficiency with GitHub for version control.
- Jenkins: Experience with Jenkins for continuous integration and delivery.
- Terraform: Strong knowledge of Terraform for infrastructure as code.
- Kafka: Experience with Kafka for streaming data.

Nice to Have:
- Domain Expertise: Experience in financial data analysis and in risk and compliance data management.
- Learning Agility: Ability to quickly learn new technologies by applying current skills.
- Adaptability: Ability to adapt to new technologies quickly and efficiently.
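For illustration, here is a minimal sketch of the kind of Flink-on-Kafka streaming job the Data Movement and Transformation responsibility describes. It is not this team's actual pipeline: the broker address, topic, group id, and class name are hypothetical, and the print() call stands in for a real sink such as Snowflake.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class TransactionPipeline {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical broker and topic names, for illustration only.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("transactions-raw")
                .setGroupId("transaction-pipeline")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> raw = env.fromSource(
                source, WatermarkStrategy.noWatermarks(), "kafka-source");

        // A trivial transformation step: normalize records before sinking.
        raw.map(String::trim)
           .filter(line -> !line.isEmpty())
           .print(); // stand-in for a real sink

        env.execute("transaction-pipeline");
    }
}
```

In practice a job like this would be packaged and submitted to a Flink cluster running on Kubernetes, which is where the Java and Kubernetes skills listed above intersect.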
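Similarly, a minimal sketch of exposing pipeline metrics for Prometheus to scrape, using the classic Prometheus Java simpleclient (io.prometheus:simpleclient and simpleclient_httpserver). The metric name, label, and port are hypothetical, not taken from this posting.

```java
import io.prometheus.client.Counter;
import io.prometheus.client.exporter.HTTPServer;

public class PipelineMetrics {
    // Hypothetical counter tracking records processed per pipeline.
    static final Counter RECORDS = Counter.build()
            .name("pipeline_records_processed_total")
            .help("Total records processed by the pipeline.")
            .labelNames("pipeline")
            .register();

    public static void main(String[] args) throws Exception {
        // Expose /metrics on port 9400 for Prometheus to scrape.
        HTTPServer server = new HTTPServer(9400);

        // Inside a pipeline, increment the counter per record processed.
        RECORDS.labels("transactions").inc();
    }
}
```

A Prometheus scrape job pointed at port 9400 would then collect pipeline_records_processed_total, the sort of signal that backs the dashboards and alerts implied by the monitoring requirement above.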