Only taking USC, GC or H4 EAD
ALL ROLES ARE HYBRID IN THE OFFICE (2-3 DAYS / WEEK) UNLESS REMOTE IS NOTED
Please submit qualified candidates and include FULL LEGAL NAME as it appears on the passport
Job Description
Requirements:
- 6+ years in data engineering
- Strong ETL / ELT experience with data pipeline development for operational use.
- Expertise in SQL, Python / Spark, and data pipeline tools.
- Proficiency in designing and architecting data pipelines for reporting and downstream applications using open-source software and cloud.
- Deep understanding of databases, Delta Lake (operational & analytical data stores), and reporting concepts for cloud-based solutions.
- Ability to analyze and interpret legacy code from SSIS, Ab Initio, and SAS.
- Cloud expertise, with a preference for GCP.
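To illustrate the kind of ETL work the requirements above describe, here is a minimal extract-transform-load sketch in plain Python. The field names and data are hypothetical, and a production pipeline for this role would use Spark and Delta Lake rather than in-memory structures; this only shows the pattern.

```python
# Minimal ETL sketch (hypothetical fields; real pipelines here would use
# Spark/Delta Lake as noted in the requirements).

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: drop records missing an id and normalize the name field."""
    for r in records:
        if r.get("id") is None:
            continue
        yield {"id": r["id"], "name": r.get("name", "").strip().title()}

def load(records):
    """Load: collect records into a target store (here, a dict keyed by id)."""
    return {r["id"]: r for r in records}

raw = [{"id": 1, "name": "  ada lovelace "}, {"id": None, "name": "x"}]
store = load(transform(extract(raw)))
```

The same three-stage shape carries over to Spark: `extract` becomes a source read, `transform` a chain of DataFrame operations, and `load` a write to a Delta table.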
What is your work auth status?
Where do you live?
Please provide your years of experience in each of the following:
- ETL / ELT
- SQL
- Python / Spark
- Building data pipelines
- Delta Lake
- Ability to analyze legacy code (SSIS, Ab Initio, SAS)
- Cloud (note which flavor)

TIPS FOR SUBMITTALS - these tips will help with movement
Understand the Job Requirements:
- Thoroughly review the job description, required skills, and client preferences.
- Confirm the candidate aligns with the role's technical and cultural expectations.