ETL Informatica Developer Jobs in Minneapolis, MN

- ADF ETL Data Engineer | Chelsoft Solutions Co | Minneapolis, MN, United States
- Engagement Lead (Informatica MDM) | Capgemini | Minneapolis, MN, United States
- ETL Technical Lead | Tata Consultancy Services | Minneapolis, MN
- Net Developer | HighCloud Solutions | Saint Paul, Minnesota, USA
- eLearning Developer | Marcomm Inc | Minneapolis, MN, US
- Sr Manager Informatica P360 | nVent | St Louis Park, MN, US
- Developer | The Toro Company | MN, United States
- MoveWorks Developer | Syntricate Technologies | Arden Hills, MN, United States
- Reports Developer | Noblesoft Technologies | Minneapolis, MN, US
- Snowflake Developer | VDart Inc | Minneapolis, MN, US
- Informatica SnapLogic Developer | VirtualVocations | Saint Paul, Minnesota, United States
- Sr. ETL Developer | Horizontal Talent | Minneapolis, MN, US
- Developer | Diverse Lynx | Minneapolis, MN, United States
- .NET Developer | Vodastra | Minneapolis, MN, US
- Data Engineer - Sr. ETL Developer | 020 Travelers Indemnity Co | St. Paul, MN
- Operations & Technology Transformation Senior Consultant, Guidewire Data Migration - ETL Developer | Deloitte | Minneapolis, Minnesota, US
- Informatica Application Developer | Highmark Health | MN, Working at Home, Minnesota
- Java Developer | The Dignify Solutions LLC | Minneapolis, MN, United States
- Data Engineer - Sr. ETL Developer | Travelers | St. Paul, MN
ADF ETL Data Engineer
Chelsoft Solutions Co | Minneapolis, MN, United States | Full-time
Required Skills: 5 years of data engineering experience with a focus on data warehousing; 2 years of experience creating pipelines in Azure Data Factory (ADF); 3 years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL.
Top Responsibilities:
Develop and manage effective working relationships with other departments, groups, and personnel with whom work must be coordinated or interfaced.
Communicate efficiently with the ETL architect, applying requirements and business-process knowledge to transform the data in a way that is geared toward the needs of end users.
Assist in the overall architecture of the ETL design and proactively provide input on designing, implementing, and automating the ETL flows.
Investigate and mine data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions.
Develop ETL pipelines and data flows in and out of the data warehouse using a combination of Azure Data Factory and Snowflake toolsets.
Develop idempotent ETL process designs so that interrupted, incomplete, or failed processes can be rerun without errors using ADF data flows and pipelines (see the rerunnable MERGE sketch after this list).
Work in Snowflake virtual warehouses as needed and automate data pipelines using Snowpipe for tedious ETL problems (see the Snowpipe sketch after this list).
Capture changes in data dimensions and maintain versions of them using Streams in Snowflake, and schedule them using Tasks (see the Streams and Tasks sketch after this list).
Optimize every step of the data movement, not only at the source and in transit but also when the data is at rest in the database, for accelerated responses.
Build a highly efficient orchestrator that can schedule jobs, execute workflows, perform data quality checks, and coordinate dependencies among tasks (also illustrated in the Streams and Tasks sketch after this list).
Test ETL system code, data design, pipelines, and data flows. Perform root cause analysis on all processes and resolve production issues. Conduct routine tests on databases, data flows, and pipelines.
Document implementations and test cases, and build the deployment documents needed for CI/CD.
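To make the idempotent-rerun item concrete, here is a minimal Snowflake SQL sketch: the MERGE is keyed on the business key and filtered to a single batch, so re-executing an interrupted or failed run converges to the same end state instead of creating duplicates. The table names (stg_claims, dw_claims), columns, and batch_id variable are hypothetical and not taken from the posting.

```sql
-- Hypothetical rerunnable load step: a failed or interrupted batch can simply be
-- re-executed without creating duplicate rows in the target.
SET batch_id = '2024-06-01-daily';   -- would normally be supplied by the ADF pipeline

MERGE INTO dw_claims AS tgt
USING (
    SELECT claim_id, member_id, claim_amount, load_batch_id
    FROM stg_claims
    WHERE load_batch_id = $batch_id
) AS src
    ON tgt.claim_id = src.claim_id
WHEN MATCHED THEN UPDATE SET
    tgt.member_id     = src.member_id,
    tgt.claim_amount  = src.claim_amount,
    tgt.load_batch_id = src.load_batch_id
WHEN NOT MATCHED THEN INSERT (claim_id, member_id, claim_amount, load_batch_id)
    VALUES (src.claim_id, src.member_id, src.claim_amount, src.load_batch_id);
```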
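For the Snowpipe automation item, a minimal sketch assuming a hypothetical external stage @raw_stage and landing table raw_claims. With AUTO_INGEST enabled, cloud-storage event notifications trigger the COPY as files arrive, instead of a hand-run or scheduled load.

```sql
-- Hypothetical continuous-ingest pipe: files dropped into the stage by ADF or the
-- source system are loaded automatically as they land.
CREATE OR REPLACE PIPE raw_claims_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_claims
  FROM @raw_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Inspect recent load history for the target table while troubleshooting.
SELECT *
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
    TABLE_NAME => 'RAW_CLAIMS',
    START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));
```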
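For the change-capture and orchestration items, a minimal Streams-and-Tasks sketch: a stream records changes on a hypothetical staging table, a scheduled task merges them into the dimension (shown as a simple type-1 update rather than full versioning), and a dependent task runs a basic data quality check to illustrate task-to-task dependencies. All object names are hypothetical.

```sql
-- Hypothetical change capture: the stream tracks changed rows on the staging table.
CREATE OR REPLACE STREAM member_stream ON TABLE stg_members;

-- Root task: scheduled hourly, but runs only when the stream actually has data.
CREATE OR REPLACE TASK load_dim_member
  WAREHOUSE = etl_wh
  SCHEDULE  = 'USING CRON 0 * * * * UTC'
  WHEN SYSTEM$STREAM_HAS_DATA('MEMBER_STREAM')
AS
  MERGE INTO dim_member AS tgt
  USING (
      -- keep only the post-image rows so the MERGE sees one row per key
      SELECT member_id, address
      FROM member_stream
      WHERE METADATA$ACTION = 'INSERT'
  ) AS src
      ON tgt.member_id = src.member_id
  WHEN MATCHED THEN UPDATE SET
      tgt.address = src.address, tgt.updated_at = CURRENT_TIMESTAMP()
  WHEN NOT MATCHED THEN INSERT (member_id, address, updated_at)
      VALUES (src.member_id, src.address, CURRENT_TIMESTAMP());

-- Child task: a simple data quality check that runs only after the merge succeeds.
CREATE OR REPLACE TASK check_dim_member
  WAREHOUSE = etl_wh
  AFTER load_dim_member
AS
  INSERT INTO dq_results (check_name, failed_rows, run_at)
  SELECT 'dim_member_null_keys', COUNT(*), CURRENT_TIMESTAMP()
  FROM dim_member
  WHERE member_id IS NULL;

-- Tasks are created suspended; resume children before the root.
ALTER TASK check_dim_member RESUME;
ALTER TASK load_dim_member RESUME;
```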
Required Skills / Attributes:
5 years of data engineering experience with a focus on data warehousing.
2 years of experience creating pipelines in Azure Data Factory (ADF).
5 years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools.
5 years of experience with relational databases such as Oracle, Snowflake, SQL Server, etc.
3 years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL (see the stored procedure sketch after this list).
2 years of experience with GitHub, SVN, or similar source control systems.
2 years of experience processing structured and unstructured data.
Experience with HL7 and FHIR standards and processing files in these formats (see the FHIR query sketch after this list).
3 years analyzing project requirements and developing detailed specifications for ETL requirements.
Excellent problem-solving and analytical skills with the ability to troubleshoot and optimize data pipelines.
Ability to adapt to evolving technologies and changing business requirements.
Bachelor's or advanced degree in a related field such as Information Technology / Computer Science, Mathematics / Statistics, Analytics, or Business.
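As a concrete reference for the stored procedure requirement, a minimal Snowflake Scripting sketch (the posting equally accepts Oracle PL/SQL or SQL Server T-SQL). The procedure name, table, and retention logic are hypothetical.

```sql
-- Hypothetical housekeeping procedure: deletes rows older than a retention window
-- and reports how many were removed.
CREATE OR REPLACE PROCEDURE purge_expired_claims(retention_days NUMBER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
DECLARE
  deleted_rows NUMBER DEFAULT 0;
BEGIN
  DELETE FROM dw_claims
   WHERE loaded_at < DATEADD(day, -1 * :retention_days, CURRENT_TIMESTAMP());
  deleted_rows := SQLROWCOUNT;
  RETURN 'Deleted ' || deleted_rows || ' row(s) from dw_claims';
END;
$$;

CALL purge_expired_claims(365);
```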
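For the HL7/FHIR item, a minimal sketch of querying FHIR JSON that has been landed in a VARIANT column, assuming a hypothetical raw_fhir_bundles table where each row's payload holds one FHIR Bundle. LATERAL FLATTEN unnests the Bundle's entry array so each resource becomes its own row.

```sql
-- Hypothetical query over landed FHIR Bundles: pull out Patient resources.
SELECT
    b.payload:id::string                  AS bundle_id,
    e.value:resource:resourceType::string AS resource_type,
    e.value:resource:id::string           AS resource_id
FROM raw_fhir_bundles AS b,
     LATERAL FLATTEN(input => b.payload:entry) AS e
WHERE e.value:resource:resourceType::string = 'Patient';
```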
Preferred Skills / Attributes:
2 years of batch or PowerShell scripting.
2 years of experience with Python scripting.
3 years of data modeling experience in a data warehouse environment.
Experience or familiarity with Informatica Intelligent Cloud Services (specifically Data Integration).
Experience designing and building APIs in Snowflake and ADF (e.g., REST, RPC).
Experience with State Medicaid / Medicare / Healthcare applications.
Azure certifications related to data engineering or data analytics.