Role: Machine Learning Consultant
Location: Remote
Duration: Long-Term Contract
Key Responsibilities:
- Architecture Design: Design and build scalable systems and infrastructure to deploy, scale, and monitor machine learning solutions on the Databricks Lakehouse Platform, leveraging its unified approach to data lakes and warehouses for both structured and unstructured data.
- Platform Enablement: Enable data science teams to operationalize ML workflows from data ingestion through feature engineering to model training, deployment, and monitoring. Experience with the Databricks Online Feature Store and Asset Bundles is essential.
- Tooling & Integration: Define and implement MLOps practices using Databricks components such as MLflow (experiment tracking, model registration, and deployment), Delta Lake (data versioning and logging), and Unity Catalog (data governance and access control), as well as Delta Sharing.
- CI/CD Implementation: Architect robust CI/CD pipelines using Azure DevOps or GitHub Actions to automate testing and deployment of ML code and models, supporting MLOps 3.0 principles.
- Collaboration: Work closely with data engineers, data scientists, and DevOps teams to ensure platform implementations meet business requirements and technical standards.
- Monitoring & Optimization: Implement comprehensive monitoring of model performance, data drift, and bias to ensure ongoing model health and trigger timely retraining.
- Technical Leadership: Provide leadership in system design, infrastructure standardization, and documentation to promote best practices and facilitate cross-team knowledge sharing.
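To make the Monitoring & Optimization responsibility concrete, here is a minimal, self-contained Python sketch of the kind of data-drift check that could feed a retraining trigger. It uses the common Population Stability Index (PSI) heuristic; the function names, bin count, and the 0.2 threshold are illustrative assumptions, not part of the role description or any Databricks API.

```python
import math

def psi(baseline, recent, bins=10):
    """Population Stability Index between two numeric samples.

    Buckets both samples over a shared range and sums
    (recent_frac - baseline_frac) * ln(recent_frac / baseline_frac).
    """
    lo = min(min(baseline), min(recent))
    hi = max(max(baseline), max(recent))
    width = (hi - lo) / bins or 1.0  # guard against constant samples

    def fractions(sample):
        counts = [0] * bins
        for x in sample:
            i = min(int((x - lo) / width), bins - 1)
            counts[i] += 1
        # Small epsilon avoids log(0) for empty buckets.
        return [max(c / len(sample), 1e-6) for c in counts]

    b, r = fractions(baseline), fractions(recent)
    return sum((rf - bf) * math.log(rf / bf) for bf, rf in zip(b, r))

def needs_retraining(baseline, recent, threshold=0.2):
    """Rule of thumb: PSI above ~0.2 signals significant drift."""
    return psi(baseline, recent) > threshold

if __name__ == "__main__":
    training_scores = [i % 10 for i in range(1000)]
    serving_scores = [(i % 10) + 5 for i in range(1000)]
    print(needs_retraining(training_scores, training_scores))  # identical data: no drift
    print(needs_retraining(training_scores, serving_scores))   # shifted data: drift
```

In a Databricks deployment, the baseline sample would typically come from the training snapshot in Delta Lake and the recent sample from serving logs, with the boolean result used to schedule a retraining job.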
Required Qualifications:
- 8+ years in data architecture, platform engineering, or MLOps, with 5+ years of hands-on experience with Databricks environments on major cloud platforms (AWS, Azure, or GCP).
- Strong technical skills with Apache Spark, Delta Lake, MLflow, Unity Catalog, and Databricks Runtime for ML.
- Proficiency in Python and SQL, with familiarity with containerization technologies such as Docker and Kubernetes.
- Deep understanding of data modeling, ETL design, and modern data lakehouse architectures.
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
- Excellent communication and problem-solving skills, adept at translating technical designs for non-technical stakeholders.
Preferred Qualifications:
- Databricks certifications (e.g., Databricks Certified Machine Learning Associate / Professional).
- Experience with generative AI, LLMOps, and related frameworks such as LangChain or Hugging Face.