Description : "Responsibilities
Advanced Data Solutions & Engineering
- Prototype and operationalize advanced AI solutions, including GenAI and LLM-based systems.
- Build and integrate cloud-native data pipelines using tools such as Snowflake, Airflow, and Vertex AI.
- Implement retrieval-augmented generation (RAG) pipelines and multimodal data solutions.
- Drive automation, observability, and performance optimization across AI and data workflows.
Innovation & Applied AI
- Lead initiatives to explore, validate, and scale emerging AI technologies.
- Translate research and prototypes into production-ready capabilities.
- Collaborate across teams to embed AI-driven insights and automation into business processes.
- Evaluate and shape next-generation AI trends, including agentic systems and autonomous workflows.
Technology Leadership & Best Practices
- Champion hands-on experimentation and rapid solution delivery while maintaining technical excellence.
- Define and promote engineering standards that balance agility, scalability, and governance.
- Collaborate with security, compliance, and governance partners to ensure responsible data and AI usage.
- Mentor engineers and architects in modern data and AI development practices.
Collaboration & Knowledge Sharing
- Act as a trusted advisor to business and technology leaders on data-driven innovation.
- Lead internal workshops and training sessions to accelerate AI adoption.
- Represent the organization in external forums, conferences, and publications focused on data and AI innovation.
Qualifications
- 10+ years of experience in enterprise data architecture or engineering, with a strong hands-on focus on AI and cloud-native data platforms.
- Proven experience designing, implementing, and optimizing large-scale AI systems, including LLM-based, GenAI, and agentic AI applications.
- Expertise in Python, SQL, and modern data frameworks (e.g., PySpark, Airflow, Snowflake, LangChain, Hugging Face, Vertex AI, OpenAI).
- Strong background in data modeling, distributed systems, and cloud architecture (AWS, GCP, or Azure).
- Experience developing and deploying AI/ML/GenAI pipelines leveraging vector databases and RAG frameworks.
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
Preferred :
- Experience with agentic AI design patterns, including tool-use orchestration, autonomous workflow agents, or AI copilots.
- Proficiency in API design, microservices, and containerization (Docker, Kubernetes).
- Demonstrated ability to rapidly prototype new AI concepts and transition successful PoCs into production-grade systems.
Role Description :
- GenAI, LLMs, RAG pipelines, vector databases
- Python and SQL (expert level)
- Cloud-native data platforms (Snowflake, Airflow, Vertex AI, AWS, Azure)
- Modern AI frameworks (LangChain, Hugging Face, PySpark)
- Data modeling and cloud architecture
- Building AI/ML pipelines end to end
- Hands-on prototyping and productionizing of AI solutions
Competencies : Data Architecture and Modeling