About the Role
We are seeking a highly skilled Databricks Data Engineer with strong expertise in modern data engineering, Azure cloud technologies, and Lakehouse architectures. This role is ideal for someone who thrives in dynamic environments, enjoys solving complex data challenges, and can lead end-to-end delivery of scalable data solutions.
What We're Looking For
- 8+ years designing and delivering scalable data pipelines in modern data platforms
- Deep experience in data engineering, data warehousing, and enterprise-grade solution delivery
- Ability to lead cross-functional initiatives in matrixed teams
- Advanced skills in SQL, Python, and ETL/ELT development, including performance tuning
- Hands-on experience with Azure, Snowflake, and Databricks, including system integrations
Key Responsibilities
- Design, build, and optimize large-scale data pipelines on the Databricks Lakehouse platform
- Modernize and enhance cloud-based data ecosystems on Azure, contributing to architecture, modeling, security, and CI/CD
- Use Apache Airflow and similar tools for workflow automation and orchestration
- Work with financial or regulated datasets while ensuring strong compliance and governance
- Drive best practices in data quality, lineage, cataloging, and metadata management
Primary Technical Skills
- Develop and optimize ETL/ELT pipelines using Python, PySpark, Spark SQL, and Databricks Notebooks
- Design efficient Delta Lake models for reliability and performance
- Implement and manage Unity Catalog for governance, RBAC, lineage, and secure data sharing
- Build reusable frameworks using Databricks Workflows, Repos, and Delta Live Tables
- Create scalable ingestion pipelines for APIs, databases, files, streaming sources, and MDM systems
- Automate ingestion and workflows using Python and REST APIs
- Support downstream analytics for BI, data science, and application workloads
- Write optimized SQL/T-SQL queries, stored procedures, and curated datasets
- Automate DevOps workflows, testing pipelines, and workspace configurations
Additional Skills
- Azure: Data Factory, Data Lake, Key Vault, Logic Apps, Functions
- CI/CD: Azure DevOps
- Orchestration: Apache Airflow (plus)
- Streaming: Delta Live Tables
- MDM: Profisee (nice-to-have)
- Databases: SQL Server, Cosmos DB
Soft Skills
- Strong analytical and problem-solving mindset
- Excellent communication and cross-team collaboration
- Detail-oriented with a high sense of ownership and accountability