Role : Data Engineer
Location : Okemos, MI (Hybrid / Onsite)
Duration : Long term
Rate : Market
Face-to-face interview is a must. Candidates must be U.S. citizens or green card holders.
Responsibilities :
- Participates in the analysis and development of technical specifications, programming, and testing of Data Engineering components.
- Participates in creating data pipelines and ETL workflows to ensure that design and enterprise programming standards and guidelines are followed. Assists with updating the enterprise standards when gaps are identified.
- Follows technology best practices and standards and escalates any issues as deemed appropriate. Follows architecture and design best practices (as guided by the Lead Data Engineer, BI Architect, and Architectural team).
- Responsible for assisting in configuration and scripting to implement fully automated data pipelines, stored procedures, and functions, and ETL workflows that allow data to flow from on-premises data sources to cloud-based data platforms (e.g. Snowflake) and application platforms (e.g. Salesforce), where data may be consumed by end customers.
- Follows standard change control and configuration management practices.
- Participates in 24-hour on-call rotation in support of the platform.
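The pipeline responsibility above (moving data from on-premises sources into cloud platforms such as Snowflake) can be sketched at its simplest as an extract-transform-load script. This is a hypothetical, self-contained illustration only: it uses sqlite3 in place of the actual Oracle/SQL Server sources and Snowflake target, and the table and function names are invented. Real pipelines at this level would use StreamSets, Informatica, or the Snowflake Python connector.

```python
import sqlite3

def run_pipeline(source: sqlite3.Connection, target: sqlite3.Connection) -> int:
    """Extract rows from the source, normalize them, load them into the target.

    A minimal ETL sketch; production pipelines add incremental loads,
    error handling, logging, and scheduling on top of this shape.
    """
    # Extract: pull rows from the (stand-in) on-premises source
    rows = source.execute("SELECT id, name FROM customers").fetchall()
    # Transform: trim whitespace and normalize case
    cleaned = [(row_id, name.strip().upper()) for row_id, name in rows]
    # Load: write into the (stand-in) cloud target
    target.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, name TEXT)")
    target.executemany("INSERT INTO customers VALUES (?, ?)", cleaned)
    target.commit()
    return len(cleaned)

# Demo with in-memory databases standing in for source and target systems
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)", [(1, " alice "), (2, "bob")])
tgt = sqlite3.connect(":memory:")
loaded = run_pipeline(src, tgt)
```

The same extract/transform/load stages map directly onto the stored procedures and ETL workflows the role describes; only the endpoints and tooling change.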
Required Skills / Qualifications :
- Database platforms : Snowflake, Oracle, and SQL Server
- OS platforms : Red Hat Enterprise Linux and Windows Server
- Languages and tools : PL/SQL, Python, T-SQL, StreamSets, Snowflake Cloud Data Platform, and Informatica PowerCenter, Informatica IICS, or IDMC
- Experience creating and maintaining ETL processes that use Salesforce as a destination
- Drive and desire to automate repeatable processes
- Excellent interpersonal and communication skills, and the willingness to collaborate with teams across the organization