Job Description
IQ Clarity’s client is seeking a Data Engineer to design, build, and maintain scalable, cloud-based data platforms and pipelines. This role partners closely with engineering, analytics, and business stakeholders to ensure high-quality, reliable data is available for analytics, reporting, and advanced data use cases.
Responsibilities:
- Design, build, and maintain scalable, reliable data pipelines using modern cloud-based orchestration frameworks
- Develop, test, and optimize analytics-ready data models using dbt and analytics engineering best practices
- Architect and maintain modern data lake and lakehouse storage patterns to support batch and streaming workloads
- Build and support batch and near–real-time data processing workflows using distributed processing frameworks
- Manage and optimize cloud data storage and compute services to ensure performance, scalability, and cost efficiency
- Establish and enforce data quality, testing, and governance standards, including dbt tests and documentation
- Collaborate with analytics, data science, application, and business teams to translate requirements into scalable data solutions
- Monitor data pipelines, troubleshoot data quality or performance issues, and improve system reliability
- Support CI/CD pipelines and infrastructure automation for data workflows and platforms
- Ensure data security, access controls, and governance best practices are followed
- Contribute to engineering best practices, documentation, and code reviews
- Evaluate and integrate tools and technologies that improve platform scalability, developer experience, and operational efficiency
Requirements:
- 5+ years of experience in data engineering or analytics engineering roles
- Strong proficiency in Python and SQL
- Hands-on experience designing and building data pipelines on AWS
- Experience with AWS data services such as S3, Glue, Redshift, RDS, Lambda, Athena, and related analytics tooling
- Strong experience with dbt for data transformations, testing, and analytics engineering workflows
- Experience with workflow orchestration frameworks (e.g., Airflow or similar tools)
- Experience working with distributed data processing technologies (e.g., Spark or similar)
- Experience designing or supporting modern data lake or lakehouse architectures
- Familiarity with streaming or messaging platforms (e.g., Kafka or cloud-native equivalents)
- Experience with infrastructure-as-code and containerization tools (e.g., Terraform, CloudFormation, Docker)
- Solid understanding of CI/CD concepts as applied to data systems
- Ability to communicate effectively with both technical and non-technical stakeholders
Nice to Have:
- Experience with streaming data platforms (e.g., Kinesis, Kafka)
- Exposure to analytics, BI, or visualization tools
- Experience in regulated or compliance-focused data environments
- Exposure to ML or AI-related data pipelines and feature generation use cases
- Experience supporting self-service analytics at scale
- Prior mentoring or informal technical leadership experience
- AWS certifications
IQ Clarity, LLC is an Equal Opportunity Employer.
No C2C or third-party candidates