Job Description
Pay Range: $92.74/hr - $97.74/hr
Requirements / Must Have
- Bachelor's degree in Computer Science, Engineering, or related field.
- 5+ years of hands-on experience in data engineering.
- Expertise with Apache Airflow, Snowflake, SQL, Python, and API integrations.
- Experience with SAP data replication using HVR or similar tools.
- Familiarity with cloud platforms such as AWS or Azure.
- Experience with Linux administration, data lakes, and data quality frameworks.
- Strong analytical and problem-solving abilities.
- Strong communication skills.
- Ability to work collaboratively across teams.
Experience
- Data pipeline development and extract-load (EL) processes.
- Designing and implementing data integration architectures.
- Working with AWS services and modern data engineering technologies.
- Performance optimization of data pipelines and systems.
- Data warehousing, analytics, and cross-functional technical collaboration.
Responsibilities
- Perform hands-on data integration engineering tasks, including pipeline development and EL processes.
- Serve as the technical expert for complex data engineering challenges.
- Design and implement scalable, secure, and efficient data integration architectures.
- Use Snowflake, SQL, Airflow, Python, AWS, API technologies, and HVR to develop and optimize solutions.
- Monitor and maintain data quality and integrity.
- Collaborate with data science, warehousing, and analytics teams to deliver insights.
- Identify and resolve performance bottlenecks to ensure optimal system efficiency.
Should Have
- Ability to stay current with emerging technologies.
- Strong decision-making skills.
- Team-focused, collaborative mindset.
Skills
- Snowflake
- Apache Airflow
- SQL
- Python
- Shell scripting
- API gateways and web services
- AWS or Azure
- HVR or SAP replication tools
- Linux administration
Qualification and Education
- Minimum Bachelor's degree (advanced degree a plus).