Role : Sr. Python Developer & Lead
Location : Detroit, MI (Onsite)
Type : Contract
Job Requirements
The Senior Data Engineer & Technical Lead (SDET Lead) will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.
Key Responsibilities
1. Data Engineering : Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
2. Workflow Orchestration : Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability (a minimal sketch follows this list).
3. CI/CD Pipeline Development : Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
4. Testing & Quality : Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
5. Secure Development : Implement secure coding best practices and design patterns throughout the development lifecycle.
6. Collaboration : Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
7. Documentation : Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
8. Mentorship : Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
9. Troubleshooting : Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
10. Cross-Team Knowledge Sharing : Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.
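As a rough illustration of the pipeline and orchestration work described in items 1 and 2, the sketch below shows a minimal Airflow DAG wrapping a single PySpark batch transform. It is not part of the requirements themselves: it assumes Airflow 2.4+ and PySpark, and every DAG, column, and path name in it is a hypothetical placeholder.

```python
# Minimal sketch only: a daily Airflow DAG running one PySpark batch step.
# Assumes Airflow 2.4+ and PySpark; all names and paths are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform_orders():
    """Hypothetical batch transform: aggregate raw orders into daily totals."""
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily").getOrCreate()
    raw = spark.read.parquet("/data/raw/orders")  # placeholder input path
    daily = (
        raw.groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("total_amount"))
    )
    daily.write.mode("overwrite").parquet("/data/curated/orders_daily")
    spark.stop()


with DAG(
    dag_id="orders_daily_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule" replaced "schedule_interval" in Airflow 2.4
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(task_id="transform_orders", python_callable=transform_orders)
```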
Technical Experience (includes all of the above skills, plus the following) :
1. Hands-on Data Engineering : Minimum of 5 years of practical experience building production-grade data pipelines using Python and PySpark.
2. Airflow Expertise : Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.
3. CI/CD for Data Projects : Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment (see the test sketch after this list).
4. Cloud & Containers : Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads. Appreciation for twelve-factor design principles.
5. Python Fluency : Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.
6. Version Control : Proficiency with Git.
7. Unix/Linux : Strong command-line skills.
8. SQL : Solid understanding of SQL for data ingestion and analysis.
9. Collaborative Development : Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.
10. Engineering Mindset : Writes code with an eye for maintainability and testability; excited to build production-grade software.
11. Education : Bachelor's or graduate degree in Computer Science, Data Analytics, or a related field, or equivalent work experience.
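To make the "automated testing" in item 3 concrete, here is one possible shape of a unit test for a pipeline transform, in the TDD style the posting asks for. It is a sketch under assumptions, not a prescribed standard: it uses pytest with a local SparkSession, and the transform, column names, and tax rate are all hypothetical.

```python
# Sketch of a pytest unit test for a PySpark transform; all names hypothetical.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_total_with_tax(df, rate):
    """Transform under test: add a rounded, tax-inclusive total column."""
    return df.withColumn("total_with_tax", F.round(F.col("amount") * (1 + rate), 2))


@pytest.fixture(scope="module")
def spark():
    # A local single-threaded session keeps the test self-contained and fast.
    session = SparkSession.builder.master("local[1]").appName("pipeline-tests").getOrCreate()
    yield session
    session.stop()


def test_add_total_with_tax(spark):
    df = spark.createDataFrame([(1, 100.0), (2, 50.0)], ["id", "amount"])
    rows = add_total_with_tax(df, rate=0.06).orderBy("id").collect()
    assert [r.total_with_tax for r in rows] == [106.0, 53.0]
```

In a CI/CD pipeline of the kind described above, tests like this would typically run inside a Docker image on every pull request, before any deployment step.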
Key Skills
APIs, Docker, Jenkins, REST, Python, AWS, NoSQL, MySQL, JavaScript, PostgreSQL, Django, Git
Employment Type : Full Time
Experience : 5+ years
Vacancy : 1