Job Title: Data Engineering Contractor
Department: Technology
About the Role:
Join our dynamic team of talented engineers with a proven track record in constructing data
warehouses, lakes, and pipelines. We are on a mission to fuel Unite Us' ongoing expansion and
enhance its positive influence on both the healthcare industry and individuals nationwide. As part
of our team, your role will be instrumental in collecting, processing, and delivering data to both
internal and external stakeholders. This pivotal work empowers us to intelligently invest in
emerging opportunities, quantifying the value and ROI that Unite Us networks bring to our
customers. By leveraging your expertise, you will help establish Unite Us as a thought leader in
the dynamic social care landscape. Become an integral part of our team and help us shape
the future of healthcare and make a meaningful impact on the lives of people across the
country.
What You'll Do:
- Implement a data architecture and infrastructure that aligns with business objectives
- Collaborate closely with Application Engineers and Product Managers to ensure that the technical infrastructure robustly supports client requirements
- Create ETL and data pipeline solutions for efficient loading of data into the warehouse, along with tests to ensure reliability and optimal performance
- Collect, validate, and provide high-quality data, ensuring data integrity
- Champion data democratization efforts, facilitating accessibility to data for relevant stakeholders
- Guide the team on technical best practices and contribute substantially to the architecture of our systems
- Support operational work, such as onboarding new customers to our data products and participating in the team's on-call rotation
- Engage with cross-functional teams, including Solutions Delivery, Business Intelligence, Predictive Analytics, and Enterprise Services, to address and support any data-related technical issues or requirements
You’re a great fit for this role if:
- At least 6-8 years of experience working with data warehouses, data lakes, and ETL pipelines
- Proven experience building optimized data pipelines using Snowflake and dbt
- Expertise in orchestrating data pipelines using Apache Airflow, including authoring, scheduling, and monitoring workflows
- Exposure to AWS and proficiency in cloud services such as EKS (Kubernetes), ECS, S3, RDS, IAM, etc.
- Experience designing and implementing CI/CD workflows using GitHub Actions, Codeship, Jenkins, etc.
- Experience with tools like Terraform, Docker, and Kafka
- Strong experience with Spark using Scala and Python
- Advanced SQL knowledge, with experience authoring and running complex queries, and strong familiarity with Snowflake and various relational databases like Redshift, Postgres, etc.
- Experience with data modeling and system design, architecting scalable data platforms and applications for large enterprise clients
- A dedicated focus on building high-performance systems
- Exposure to building data quality frameworks
- Strong problem-solving and troubleshooting skills, with the ability to identify and resolve data engineering issues and system failures
- Excellent communication skills, with the ability to communicate technical information to non-technical stakeholders and collaborate effectively with cross-functional teams
- The ability to envision and construct scalable solutions that meet the diverse needs of enterprise clients with dedicated data teams
Nice to have:
- Previous engagement with healthcare and/or social determinants of health data products
- Experience leveraging agentic-assisted coding tools (e.g., Cursor, Codex AI, Amazon Q, GitHub Copilot)
- Experience working with R
- Experience processing health care eligibility and claims data
- Exposure to Matillion ETL
- Experience using and building solutions to support various reporting and data user tools