GCP Data Engineer

Triunity Software
Phoenix, AZ, US
Full-time

Job Description

Nature: Day One On-site

Duration: 24 Months

Candidates Required: 4

Experience: 5 to 8 Years

The following is the job description for the role of a Data Engineer:

Mandatory Skill Set: Apache Spark, Hive, Hadoop, BigQuery, BigTable, Cloud Composer, Dataflow, Google Cloud Storage, Python, SQL, Shell Scripting, Git.

Good-to-Have Skill Set: CI/CD, Jenkins, Security and Networking, Scala, GCP Identity and Access Management (IAM).

Responsibilities:

1. Data Processing: Design, develop, and maintain scalable, efficient data processing pipelines using technologies such as Apache Spark, Hive, and Hadoop.

2. Programming Languages: Apply proficiency in Python, Scala, SQL, and Shell Scripting for data processing, transformation, and automation.

3. Cloud Platform Expertise: Bring hands-on experience with Google Cloud Platform (GCP) services, including but not limited to BigQuery, BigTable, Cloud Composer, Dataflow, Google Cloud Storage, and Identity and Access Management (IAM).

4. Version Control and CI/CD: Implement and maintain version control using Git, and establish continuous integration/continuous deployment (CI/CD) pipelines for data processing workflows.

5. Jenkins Integration: Use Jenkins to automate the building, testing, and deployment of data pipelines.

6. Data Modeling: Work on data modeling and database design to ensure optimal storage and retrieval of data.

7. Performance Optimization: Identify and implement performance optimization techniques for large-scale data processing.

8. Collaboration: Collaborate with cross-functional teams, including data scientists, analysts, and other engineers, to understand data requirements and deliver solutions.

9. Security and Networking: Apply basic knowledge of GCP Networking and GCP IAM to ensure secure and compliant data processing.

10. Documentation: Create and maintain comprehensive documentation for data engineering processes, workflows, and infrastructure.
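As a rough sketch of the filter-and-aggregate shape the pipeline work in responsibilities 1 and 2 typically takes (all names are hypothetical; production code would use PySpark or Apache Beam rather than plain Python):

```python
from collections import defaultdict

def aggregate_events(records):
    """Count events per user from raw (user_id, event_type) records,
    dropping malformed rows, as a cleansing step in a pipeline would."""
    counts = defaultdict(int)
    for user_id, event_type in records:
        if event_type:  # skip rows with an empty event type
            counts[user_id] += 1
    return dict(counts)

raw = [("u1", "click"), ("u1", "view"), ("u2", "view"), ("u3", "")]
print(aggregate_events(raw))  # → {'u1': 2, 'u2': 1}
```

In Spark or Beam the same logic would be expressed as a filter followed by a keyed count, which the framework then parallelizes across workers.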

Qualifications:

1. Proven experience with Apache Spark, Hive, and Hadoop.

2. Strong programming skills in Python, Scala, SQL, and Shell Scripting.

3. Hands-on experience with GCP services, including BigQuery, BigTable, Cloud Composer, Dataflow, Google Cloud Storage, and Identity and Access Management (IAM).

4. Familiarity with version control using Git and experience implementing CI/CD pipelines.

5. Experience with Jenkins for automating data pipeline processes.

6. Basic understanding of GCP Networking.

7. Excellent problem-solving and analytical skills.

8. Strong communication and collaboration skills.
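The Git-plus-Jenkins workflow in qualifications 4 and 5 can be sketched as a minimal declarative Jenkinsfile (stage names, commands, and the Composer bucket path are hypothetical, not taken from the posting):

```groovy
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                // Run unit tests for the pipeline code (hypothetical test suite)
                sh 'python -m pytest tests/'
            }
        }
        stage('Deploy') {
            when { branch 'main' }
            steps {
                // Publish DAGs to a Cloud Composer bucket (hypothetical bucket name)
                sh 'gsutil cp dags/*.py gs://example-composer-bucket/dags/'
            }
        }
    }
}
```

Gating the deploy stage on the `main` branch is what ties the CI/CD pipeline back to the Git workflow: feature branches get tested, but only merged code ships.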
