Hi,
Our client is looking for a GCP Data Engineer for a contract project in Hartford, CT (Hybrid). Below is the detailed requirement.
Position : GCP Data Engineer
Location : Hartford, CT (Hybrid)
Duration : Long Term Contract
Required Skills : Python, Apache Spark, BigQuery, PySpark, Cloud Composer, Cloud Dataflow, Cloud Dataproc, Cloud SQL
Responsibilities :
- Should have strong knowledge of large-scale search applications and experience building high-volume data pipelines on GCP.
- Should have experience in building data transformation and processing solutions using GCP.
- Should have knowledge of Hadoop architecture and HDFS commands, and experience designing and optimizing queries against data in the HDFS environment.
- Should have the ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
- Should have experience with Apache NiFi, REST APIs, and Apache Spark.
- Should have strong collaboration and communication skills within and across teams.
- Should have strong knowledge of Google Cloud Composer, Cloud Dataproc, Cloud SQL, Google BigQuery, Hadoop, Hive, PySpark, and Python.