Job Title: Tableau to Databricks Migration Specialist
Location: Remote
Type: Contract
Duration: Long Term
Job Summary:
We are seeking a highly skilled Tableau to Databricks Migration Specialist to lead and execute the migration of enterprise-level reporting and analytics from Tableau dashboards to the Databricks Lakehouse Platform. The ideal candidate will have hands-on experience with data visualization, ETL processes, SQL optimization, and translating BI logic from Tableau to scalable notebooks, dashboards, or SQL queries in Databricks.
Key Responsibilities:
Assessment & Planning
- Assess existing Tableau dashboards, data sources, and underlying logic.
- Map and document the current state of Tableau reporting architecture.
- Develop a comprehensive migration roadmap, identifying dependencies, blockers, and priorities.
Migration Execution
- Translate Tableau visualizations, calculations, and data connections into equivalent Databricks SQL or notebooks.
- Migrate underlying data sources to Delta Lake format if needed.
- Rebuild KPIs and data models in Databricks using PySpark, SQL, and Unity Catalog.
Data Engineering Support
- Ensure data pipelines feeding Tableau are transitioned or refactored for Databricks.
- Work with data engineering teams to build or modify ETL/ELT pipelines using Databricks Workflows.
Performance Optimization
- Optimize SQL queries and data models for performance within the Databricks Lakehouse.
- Tune visualizations and jobs for large-scale data analytics.
Testing & Validation
- Validate data consistency between Tableau and Databricks dashboards.
- Perform functional, regression, and performance testing for each migrated report.
Collaboration & Documentation
- Collaborate with business stakeholders, BI analysts, and data engineers.
- Train teams on using Databricks dashboards and replacing legacy Tableau assets.
- Document the new architecture, logic, and workflows for future maintainability.
Required Skills & Experience:
- 5+ years of experience with Tableau development and dashboarding
- 3+ years working with Databricks (Delta Lake, Unity Catalog, SQL, Notebooks)
- Proficiency in SQL, Python (PySpark), and data modeling
- Strong knowledge of ETL/ELT processes and data warehousing concepts
- Experience migrating BI tools/reports to cloud-based platforms (preferably Databricks or similar)
- Familiarity with version control, CI/CD, and DevOps practices in data environments
- Experience working with large datasets and ensuring data quality and consistency
Preferred Qualifications:
- Exposure to tools like Power BI, Looker, or Apache Superset (optional)
- Experience with Azure Data Factory, AWS Glue, or Databricks Workflows
- Tableau and/or Databricks certifications are a plus
Soft Skills:
- Strong analytical and problem-solving mindset
- Excellent communication and stakeholder management
- Ability to work in fast-paced, cross-functional teams