Senior Data Engineering Manager (50% Leadership / 50% Hands-On)
Full-Time | Hybrid | Dallas–Fort Worth (Preferred) | Also open to TX, Florida, Charlotte, Raleigh, Kansas City, St. Louis, Nashville, Chicago, or Denver

A nationwide real estate industry leader is hiring a Senior Data Engineering Manager to lead a modern data engineering organization and build scalable enterprise data solutions. The role is an even 50/50 blend of leadership and hands-on development.
You will manage a team of three experienced engineers:
- Principal Data Engineer
- Senior Data Engineer
- Snowflake Platform Engineer
This position is hybrid, with a strong preference for candidates who can be onsite in DFW for the first few weeks of onboarding, then a few days per month thereafter.
What You’ll Do
- Lead, mentor, and manage a team of three data engineers
- Split responsibilities evenly between leadership and hands-on engineering
- Define the data engineering strategy, roadmap, and architecture
- Build, optimize, and maintain scalable ETL/ELT pipelines and data workflows
- Architect data models and storage solutions on AWS and Snowflake
- Ensure strong performance, security, governance, and reliability across platforms
- Own the full data lifecycle: ingestion, transformation, testing, quality, and monitoring
- Implement CI/CD, automated testing, and data observability tooling
- Build APIs and data services (REST, GraphQL)
- Develop real-time and streaming pipelines using Kinesis, Kafka, or similar technologies
- Deploy event-driven solutions using AWS serverless services (Lambda, SQS, SNS)
- Partner with BI, Product, Engineering, and business teams on scalable data solutions
What You Bring
- 8+ years of data engineering experience, including 3+ years in a management role
- Ability to operate in a 50/50 leadership and hands-on capacity
- Deep expertise with AWS, Snowflake, and modern data engineering tools
- Experience with dbt, Airflow, Matillion, GitHub, Fivetran, or similar technologies
- Strong SQL and Python skills
- Experience designing distributed data systems and multi-environment data ecosystems
- Skill in building real-time/streaming data pipelines (Kinesis, Kafka, etc.)
- Experience with CI/CD, data quality frameworks, and observability
- Excellent communication skills with technical and non-technical stakeholders
- Familiarity with Terraform or other Infrastructure-as-Code tools
- Hands-on API design and integration experience