We're looking for a Data Engineer to build and maintain scalable data pipelines and cloud data infrastructure on GCP. The role focuses on BigQuery, Dataflow, and modern ETL/ELT to support analytics and ML workflows.
MUST HAVES
- A problem solver able to analyze and research complex issues and propose actionable solutions and/or strategies.
- Solid understanding of and hands-on experience with major cloud platforms.
- Experience in designing and implementing data pipelines.
- Strong Python, SQL, and GCP skills.
Responsibilities
- Build and optimize batch/streaming pipelines using Dataflow, Pub/Sub, and Composer.
- Develop and tune BigQuery models, queries, and ingestion processes.
- Implement IaC (Terraform), CI/CD, monitoring, and data quality checks.
- Ensure data governance, security, and reliable pipeline operations.
- Collaborate with data science teams and support Vertex AI-based ML workflows.
Must-Have
- Strong Python, SQL, and GCP skills.
- 3-5+ years of data engineering experience.
- Hands-on GCP experience (BigQuery, Dataflow, Pub/Sub).
- Solid ETL/ELT and data modeling experience.
Nice-to-Have
- GCP certifications, Spark, Kafka, Airflow, dbt/Dataform, Docker/K8s.