Job Description
Overview
CTG is seeking to fill an ETL Big Data Engineer position for our client in Phoenix, AZ.
Duration: 12 months
Key Skills: Genesys, Oracle, PL/SQL, Apache Kafka, API development, Denodo, Google Cloud Platform (GCP)
Duties:
Design, develop, and optimize ETL pipelines for large-scale data processing.
Build real-time data streaming solutions using Apache Kafka.
Integrate data across systems via APIs and Denodo data virtualization.
Deploy and maintain cloud-based data solutions on GCP.
Ensure data quality, security, and reliability across workflows.
Experience & Education:
Proven experience in ETL, Big Data, and data integration projects.
Hands-on experience with data pipelines, data warehousing, and analytics.
Bachelor’s degree in Computer Science, IT, or a related field preferred.
Excellent verbal and written English communication skills and the ability to interact professionally with a diverse group are required.
CTG does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services for this role.
To Apply:
To be considered, please apply directly to this requisition using the link provided. For additional information, please contact JoAnn Abramo at JoAnn.Abramo@ctg.com. Kindly forward this to any other interested parties. Thank you!