ONLY LOCAL CANDIDATES WILL BE CONSIDERED. IF YOU ARE OUTSIDE THE AREA, YOU WILL NOT RECEIVE A CALLBACK. THIS ROLE IS FULLY ONSITE IN IRVINE, CA, FOR 60 TO 90 DAYS. AFTER THAT, THE ROLE WILL DROP TO 3 DAYS ONSITE, 2 DAYS REMOTE (PENDING CONTINUED PERFORMANCE). UNFORTUNATELY, WE ARE UNABLE TO TRANSFER/SPONSOR VISAS AT THIS TIME.
KORE1, a nationwide provider of staffing and recruiting solutions, has an immediate opening for a Data Engineer: MS SQL, AWS or Snowflake DW, ELT modern data pipelines, & Python required; Airflow DAG, SSIS ETL, MySQL a plus.
We are looking for a talented, hands-on Data Engineer to join a team of Business Intelligence Developers and Data Engineers. This is a key position within the company, central to our data strategy, driving optimization, reliability, and robust business intelligence capabilities across the organization. It is ideal for data professionals who want to have influence and apply their technical expertise in a collaborative, data-driven organization focused on innovation and growth.
Essential Duties and Responsibilities
- Develop and maintain SQL Server and Snowflake Data Warehouses and Data Lakes with a focus on data governance, security, and performance optimization.
- Manage and optimize database solutions within Snowflake, SQL Server, MySQL, and AWS RDS environments.
- Build and optimize ETL/ELT pipelines using Python, Airflow DAGs, SSIS, etc.
- Work proficiently with data tools such as SSMS, Visual Studio, SQL Profiler, Query Store, Redgate, etc.
- Perform light operational tasks, including database backup & restore.
- Collaborate on and support organizational data development projects, working closely with the Business Intelligence and Business Analyst teams.
- Configure and manage data integration platforms.
- Leverage business intelligence tools, ideally Power BI, to generate data warehouse monitoring reports (including failed SQL jobs) for remediation.
- Help ensure and maintain database security, integrity, and compliance, following industry best practices.
Qualifications
- A bachelor's degree is required (any field is acceptable); a bachelor's or master's degree in Computer Science, Information Systems, Data Science, or an equivalent field is a plus.
- 5+ years of hands-on experience as a Data Engineer, including:
  - Databases: Proven mastery of Microsoft SQL Server, including T-SQL, is required. AWS RDS is strongly preferred. MySQL skills are a plus.
  - Data warehouses / data lakes: 2+ years of hands-on experience working with AWS or Snowflake data warehouse and data lake environments. Hands-on experience with Snowflake data warehouses and data lakes specifically is a plus.
  - ETL: Hands-on experience developing ELT/ETL data pipelines with Python is required. Airflow DAGs are a big plus. SSIS is a plus/preferred.
  - Other: Basic Power BI skills are required (we have a separate BI reporting team; however, we use Power BI to monitor our DW). Proficiency with scripting and automation, including Python, PowerShell, or R. Experience with and knowledge of data integration and analytics tools such as Boomi, Redshift, or Databricks is desirable.
- Excellent communication and organizational skills are essential.
- Compensation depends on experience but is typically $100-145K.