Hello there,
My name is Himanshu Sharma, and I serve as the Recruitment Lead at Kanak-IT INC. I am reaching out to share an excellent career opportunity for the role of DevOps Engineer with our esteemed client. If you are interested, please share your updated resume at Himanshu01@kanakits.com.
Job Description
Position: DevOps Engineer
Location: Boston, MA (Hybrid)
Duration: Long-term contract
Description
The client is seeking an experienced DevOps Engineer to support its cloud data warehouse modernization initiative, migrating from a SQL Server/AWS-based system to a Snowflake-based data platform. The DevOps Engineer will be responsible for developing, maintaining, and optimizing data pipelines and integration processes that support analytics, reporting, and business operations, and will design and implement CI/CD pipelines, automate data pipeline deployments, and ensure operational reliability across Snowflake, Informatica, and Apache Airflow environments.
DETAILED LIST OF JOB DUTIES AND RESPONSIBILITIES:
- Build and maintain CI/CD (continuous integration / continuous delivery) pipelines for Snowflake, Informatica (IICS), and Airflow DAG (Directed Acyclic Graph) deployments
- Implement automated code promotion between development, test, and production environments (a minimal sketch appears after this list)
- Integrate testing, linting, and security scanning into deployment processes
- Develop IaC (Infrastructure as Code) using Terraform or similar tools to manage Snowflake objects, networking, and cloud resources
- Manage configuration and environment consistency across multi-region/multi-cloud setups
- Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls)
- Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance (see the Airflow sketch after this list)
- Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage
- Optimize pipeline performance, concurrency, and cost governance in Snowflake
- Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates
- Support user access provisioning and RBAC alignment across Snowflake, Informatica, and Airflow
- Troubleshoot platform and orchestration issues, lead incident response during outages
- Enforce DevSecOps practices including encryption, secrets management, and key rotation
- Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements
- Participate in testing, deployment, and release management for new data workflows and enhancements
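
To illustrate the automated code promotion duty above, here is a minimal sketch that applies versioned SQL scripts to a target environment using the snowflake-connector-python package. The account, credentials, database naming scheme, and scripts directory are hypothetical placeholders, not details of the client's environment.

import pathlib

import snowflake.connector  # pip install snowflake-connector-python


def promote_sql_scripts(env: str, script_dir: str = "sql") -> None:
    """Apply versioned SQL scripts to the target environment's database."""
    conn = snowflake.connector.connect(
        account="my_account",                 # placeholder account identifier
        user="deploy_user",                   # placeholder service account
        password="***",                       # in practice, read from a secrets manager
        database=f"ANALYTICS_{env.upper()}",  # hypothetical DEV/TEST/PROD naming scheme
    )
    try:
        # Apply scripts in lexicographic order, e.g. 001_tables.sql, 002_views.sql
        for script in sorted(pathlib.Path(script_dir).glob("*.sql")):
            conn.execute_string(script.read_text())  # handles multi-statement files
    finally:
        conn.close()


if __name__ == "__main__":
    promote_sql_scripts("test")

In a real CI/CD pipeline this kind of step would typically run from GitLab, GitHub Actions, Azure DevOps, or Jenkins, with credentials injected from the pipeline's secrets store.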
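And a minimal sketch of the Airflow DAG alerting duty, assuming Airflow 2.4+ (the `schedule` argument); the DAG name, owner, callback, and task body are illustrative assumptions only.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Hypothetical alert hook: in practice this might post to Slack,
    # PagerDuty, or email rather than just logging.
    ti = context["task_instance"]
    print(f"ALERT: task {ti.task_id} in DAG {ti.dag_id} failed")


default_args = {
    "owner": "data-platform",                  # placeholder team name
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,  # fires on any task failure
}

with DAG(
    dag_id="example_snowflake_load",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:

    def load_to_snowflake():
        # Placeholder for the actual load/transform step.
        pass

    PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
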
Required Qualifications
- 3-7+ years in DevOps, Cloud Engineering, or Data Platform Engineering roles
- Hands-on experience with:
  - Snowflake (roles, warehouses, performance tuning, cost control)
  - Apache Airflow (DAG orchestration, monitoring, deployments)
  - Informatica (IICS pipeline deployment automation preferred)
- Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar
- Proficiency with Terraform, Python, and shell scripting
- Deep understanding of cloud platforms: AWS, Azure, or GCP
- Experience with containerization (Docker, Kubernetes), especially for Airflow
- Strong knowledge of networking concepts and security controls
Preferred Knowledge, Skills & Abilities
- Experience migrating from SQL Server or other legacy DW platforms
- Knowledge of FinOps practices for Snowflake usage optimization
- Background in healthcare, finance, or regulated industries a plus
Soft Skills
- Effective communication with technical and non-technical stakeholders
- Ability to troubleshoot complex distributed data workloads
- Strong documentation and cross-team collaboration skills
- Proactive and committed to process improvement and automation
- Detail-oriented, with a focus on data accuracy
Education and Certification
Bachelor's degree, or equivalent years of experience, in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field.