Are you an experienced Data Migration Engineer with expertise in Snowflake and dbt (Data Build Tool)?
About the role: We are seeking a Data Migration Specialist with extensive experience in dbt (Data Build Tool) and Snowflake, and solid proficiency in Python within the AWS ecosystem. The role involves developing, maintaining, and optimizing scalable data pipelines and integrations that enable data-driven decision-making across platforms. Please do not apply without current and extensive dbt and Snowflake experience.
About the team: This diverse team provides post-sale technical support services to our University customers, including installation, troubleshooting, problem resolution, and maintenance of products and services.
Key Responsibilities
- Designing, implementing, and maintaining data pipelines using dbt and Snowflake
- Developing and automating Python scripts for data transformation, validation, and delivery
- Managing data workflows and deployments across the AWS ecosystem (S3, Lambda, ECS, IAM, etc.)
- Collaborating with internal and external teams to deliver efficient, secure data integrations
- Troubleshooting and resolving data pipeline or performance issues
- Applying best practices for CI/CD, testing, and version control in data workflows
- Contributing to ETL orchestration and scheduling using Matillion
Required Skills & Experience
- Current, hands-on experience with dbt and Snowflake (required); please do not apply without this experience
- Experience with Matillion ETL or similar data orchestration tools
- Familiarity with Airflow, Dagster, or other workflow orchestration frameworks
- Current and extensive Python development skills for automation and data processing
- Solid understanding of AWS services related to data engineering
- Experience with SQL, schema design, and performance optimization
- Familiarity with Git and collaborative development practices