About the Role:
We are looking for a highly skilled and motivated Data Engineer to join our growing team in Dallas, TX. In this role, you will design and implement scalable data pipelines, build robust data infrastructure, and work closely with cross-functional teams to support advanced analytics and data-driven decision-making across the organization.
Key Responsibilities:
- Build, optimize, and maintain scalable and reliable data pipelines (ETL/ELT)
- Develop data architectures and models to support analytics and reporting requirements
- Integrate and transform data from diverse sources such as APIs, flat files, logs, and databases
- Collaborate with Data Scientists, Analysts, and Software Engineers to understand data needs
- Ensure high data quality and integrity through validation and monitoring processes
- Automate workflows using orchestration tools like Apache Airflow or Prefect
- Monitor, troubleshoot, and optimize performance of data systems and pipelines
- Maintain and manage data infrastructure on cloud platforms (AWS, GCP, or Azure)
- Implement data governance, security, and compliance best practices
- Create and maintain documentation of data architecture and pipelines
Required Skills & Qualifications :
Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field3+ years of professional experience in Data Engineering or a similar roleStrong hands-on experience with Python and SQLExperience with data processing frameworks like Apache Spark, Kafka, or FlinkProficiency with cloud data platforms (e.g., AWS Redshift, GCP BigQuery, Azure Synapse)Solid understanding of relational and NoSQL databasesExperience with orchestration tools (Airflow, Luigi, Prefect)Familiarity with CI / CD processes and version control systems (e.g., Git)Strong analytical and problem-solving skillsExcellent verbal and written communication skillsPreferred Qualifications :
- Experience with real-time or streaming data systems
- Knowledge of data governance, privacy, and compliance standards (e.g., GDPR, HIPAA)
- Experience with Infrastructure-as-Code (IaC) tools like Terraform or CloudFormation
- Exposure to containerization and DevOps tools (Docker, Kubernetes)
- Familiarity with business intelligence tools like Tableau, Power BI, or Looker