Role : GCP Data Engineer.
Location : USA (Remote).
Duration : Long Term Contract.
Overview :
- We are seeking a highly skilled GCP Data Engineer to design, develop, and optimize large-scale data pipelines and analytics solutions on Google Cloud Platform. The ideal candidate has hands-on experience with Python, Apache Spark, BigQuery, and related GCP services, and a passion for building scalable data architectures that drive business insight and efficiency.
Key Responsibilities :
- Design and implement high-performance data pipelines and transformation workflows using GCP services such as Cloud Dataflow, Dataproc, and Cloud Composer.
- Develop and maintain BigQuery-based data models for analytics and reporting.
- Integrate and process data from diverse sources using Python, PySpark, and Apache NiFi.
- Collaborate with cross-functional teams to translate business requirements into robust data engineering solutions.
- Optimize performance across data systems through effective partitioning, indexing, and query tuning.
- Manage data quality, governance, and scalability within cloud-based environments.
- Work closely with data scientists, analysts, and application teams to ensure seamless data access and reliability.
- Ensure adherence to data security, regulatory compliance, and best practices in cloud-based architectures.
Required Skills & Experience :
- Strong expertise in Python, PySpark, and SQL programming.
- Deep understanding of Google Cloud Platform components, including BigQuery, Cloud Composer, Dataflow, Dataproc, and Cloud SQL.
- Experience with Apache Spark, Apache NiFi, and Hadoop-based ecosystems.
- Expertise in designing, managing, and optimizing large-scale distributed data applications.
- Familiarity with APIs, REST architecture, and integration techniques.
- Strong analytical, problem-solving, and communication skills in collaborative settings.