Role : Data Engineer with ITIL (Snowflake, SnowSQL, Python, PySpark), ServiceNow, Jenkins
Location : New York, NY (Hybrid, 3 days per week in office)
An in-person interview is required.
Recent banking experience is a MUST.
Overview :
We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will have expertise in data pipeline development and data warehousing, along with a solid understanding of ITIL processes to support operational efficiency.
Key Responsibilities :
- Design, develop, and maintain scalable data pipelines using SQL, Python, and PySpark.
- Build and optimize data warehouses leveraging Snowflake for efficient data storage and retrieval.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Monitor and troubleshoot data workflows, ensuring data quality and performance.
- Document data processes and procedures following ITIL best practices.
Technical Skills :
- SQL : Strong proficiency in writing complex queries for data extraction, transformation, and loading (ETL).
- Python & PySpark : Experience in scripting and big data processing using Python and PySpark frameworks.
- Data Warehouse : Hands-on experience with designing, implementing, and managing data warehouses.
- Snowflake : Deep understanding of Snowflake platform features, architecture, and best practices.
Process Knowledge :
- Familiarity with the ITIL framework, especially Incident Management and Problem Management.
- Proven experience handling incident resolution, root cause analysis, and problem tracking within ITSM tools.
Qualifications :
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 10 years of experience in data engineering or related roles.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Preferred Certifications :
- ITIL Foundation or higher.
- Snowflake Certification.
- Any relevant certifications in big data, Python, or SQL.