Job Description
We are looking for a skilled Data Engineer to join our team in Cleveland, Ohio. This long-term contract position offers the opportunity to contribute to the development and optimization of data platforms, with a primary focus on Snowflake and Apache Airflow. You will play a key role in ensuring efficient data management and processing to support critical business needs.
Responsibilities:
- Design, develop, and maintain data pipelines using Snowflake and Apache Airflow.
- Collaborate with cross-functional teams to implement scalable data solutions.
- Optimize data processing workflows to ensure high performance and reliability.
- Monitor and troubleshoot issues within the Snowflake data platform.
- Develop ETL processes to support data integration and transformation.
- Work with tools such as Apache Spark, Hadoop, and Kafka to manage large-scale data operations.
- Implement robust data warehousing strategies to support business intelligence initiatives.
- Analyze and resolve data-related technical challenges promptly.
- Provide support and guidance during Snowflake deployments across subsidiaries.
- Document processes and ensure best practices for data engineering are followed.
Requirements:
- Proficiency in Snowflake and Apache Airflow for data engineering tasks.
- Strong experience with Apache Spark, Hadoop, and Kafka.
- Expertise in Python programming for data processing and automation.
- Solid understanding of ETL processes and data transformation techniques.
- Background in data warehousing and business intelligence methodologies.
- Familiarity with Snowflake schema design and optimization.
- Ability to work independently and adapt quickly to changing workloads.
- Excellent problem-solving skills and attention to detail.