Trident Consulting is seeking a "GCP Python Data Engineer" for one of our clients in Dearborn, MI, a global leader in business and technology services.
Role : GCP Python Data Engineer
Location : Dearborn, MI (4 days per week in office)
Duration : Contract
Rate : $55-60
Job Summary :
The Data Engineer will be responsible for supporting the Credit Global Securitization (Client) team's upskilling initiative by contributing to data engineering efforts across cloud and traditional platforms. This role is intended to accelerate development and delivery. The engineer will work closely with cross-functional teams to build, optimize, and maintain data pipelines and workflows using GCP, Python, and ETL tools.
Experience : 8 to 12 years (overall)
Required Technical Skills :
At least 3 years of hands-on experience with Google Cloud Platform (GCP), specifically using Astronomer / Composer for orchestration.
Strong proficiency in Python for data engineering and automation.
Experience with RDBMS technologies such as DB2 and Teradata.
Exposure to Big Data ecosystems and distributed data processing.
Nice-to-Have Technical Skills :
Prior experience with ETL tools like DataStage or Informatica.
Job Description :
Responsibilities :
The Data Engineer will play a key role in developing and maintaining scalable data pipelines and workflows. The engineer will work with GCP tools like Astronomer / Composer and leverage Python for automation and transformation tasks. The role involves integrating data from RDBMS platforms such as DB2 and Teradata, and supporting ETL processes using tools like DataStage or Informatica.
This position is part of a strategic effort to enhance the delivery capabilities of the customer team and extend the longevity of current project resources. The engineer will collaborate with existing team members, including Software Analysts and Scrum Masters, and will be expected to contribute to knowledge sharing and process improvement.
Specifically :
Develop and implement solutions using GCP, Python, and Big Data technologies to enhance data analysis capabilities.
Utilize Python for scripting and automation to streamline geospatial data processing tasks.
Integrate and manage data workflows using Cloud Composer to ensure efficient data pipeline operations.
Leverage GCP Cloud to deploy scalable applications and services.
Conduct thorough data analysis to provide actionable insights for business decision-making.
Ensure data integrity and accuracy through rigorous testing and validation processes.
Provide technical expertise and support to team members on data management and analysis.
Stay updated with the latest advancements in geospatial technologies and incorporate them into projects.
Optimize cloud resources to achieve cost-effective and high-performance solutions.
Collaborate with stakeholders to understand requirements and deliver tailored solutions.
Document processes and methodologies to maintain knowledge continuity and facilitate training.
Contribute to the development of best practices and standards for data management.
Trident Consulting is a premier IT staffing firm providing high-impact workforce solutions to Fortune 500 and mid-market clients. Since 2005, we've specialized in sourcing elite technology and engineering talent for contract, direct hire, and managed services roles. Our expertise spans cloud, AI / ML, cybersecurity, and data analytics, supported by a 3M+ candidate database and a 78% fill ratio. With a highly engaged leadership team and a reputation for delivering hard-to-fill, niche talent, we help organizations build agile, high-performing teams that drive innovation and business success. Learn more : tridentconsultinginc.com.