Roles and Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL workflows using Python and SQL.
- Work with structured, semi-structured, and unstructured data to ensure efficient ingestion, transformation, and storage.
- Optimize data processes for performance, scalability, and cost-efficiency.
- Manage and monitor data workflows to ensure accuracy, consistency, and reliability.
- Implement data quality checks, validation, and error-handling mechanisms in pipelines.
- Integrate data from multiple sources (databases, APIs, cloud storage, streaming platforms).
- Work with cloud environments (AWS / Azure / GCP) for data storage, processing, and orchestration.
- Maintain clear documentation of data workflows, schemas, and business logic.
- Support and troubleshoot production issues in data pipelines and recommend improvements.
Nice to Have
- Awareness of data security and governance (IAM roles, encryption).
- Knowledge of monitoring / logging tools (CloudWatch, Grafana, ELK stack).
Benefits and Perks
- Health Insurance
- Employee Stock Ownership Plan
- Maternity / Paternity Paid Leaves
- Skills Enhancement Certifications
- 5-Days Working
- Flexible Working Hours
- Leave Encashments
- Referral Bonus
- Yearly Paid Leaves
- Yearly Trip
- Festival / Event Celebrations
- Company Anniversary Celebrations
- Birthday / Marriage Anniversary Leave
Culture
Life @ Albiorix!
Explore the vibrant side of tech life at Albiorix: innovation, passion, fun, and celebrations. Join us in shaping the future!