ESSENTIAL JOB FUNCTIONS FOR THIS POSITION:
Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud technologies.
Assemble large, complex sets of data that meet non-functional and functional business requirements.
Identify, design, and implement internal data-related process improvements.
Work with stakeholders, including data, design, product, and executive teams, and assist them with data-related technical issues.
Conduct configuration and design of applications to better leverage enterprise data and systems.
Prepare data for prescriptive and predictive modeling.
Use effective communication to work with application vendors.
Assist in the creation and quality assurance review of design documents and test results to ensure all project requirements are satisfied.
Ability to advise on and implement improvements to data warehousing and data workflow architecture.
Think outside the box to identify improvement and efficiency opportunities that streamline business and operational workflows.
Document high-level business workflows and transform them into low-level technical requirements.
Ability to analyze complex information sets and communicate that information in a clear, well-thought-out, and well-organized manner.
Ability to communicate at varying levels of detail (30,000 ft. view, 10,000 ft. view, granular level) and to produce corresponding documentation at varying levels of abstraction.
Be an advocate for best practices and continued learning.
Ability to communicate with business stakeholders on the status of projects and issues.
Ability to prioritize and multitask among duties at any given time.
Solid communication and interpersonal skills.
Comply with company policies and procedures and all applicable laws and regulations.
Perform general DBA work as needed.
Maintain and troubleshoot existing ETL processes.
Create and maintain BI reports.
Additional duties as assigned.
REQUIRED EDUCATION:
REQUIRED SKILLS:
Demonstrated experience with SQL in a large database environment.
Direct experience utilizing SQL to develop queries or profile data (see the sketch after this list).
Strong hands-on experience with Microsoft Fabric.
DAX experience.
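By way of illustration of the SQL query development and data-profiling experience listed above, the following is a minimal sketch of the kind of profiling query this might involve, written for a Microsoft Fabric notebook; the `sales.orders` table and its columns are hypothetical, and `spark` is assumed to be the SparkSession a Fabric notebook provides.

```python
# Minimal data-profiling sketch (illustrative only; sales.orders and its columns are hypothetical).
# Assumes a Microsoft Fabric notebook, where a SparkSession named `spark` is already available.
profile = spark.sql("""
    SELECT
        COUNT(*)                                              AS row_count,
        COUNT(DISTINCT order_id)                              AS distinct_order_ids,
        SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END)  AS null_customer_ids,
        MIN(order_date)                                       AS earliest_order,
        MAX(order_date)                                       AS latest_order
    FROM sales.orders
""")
profile.show(truncate=False)
```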
REQUIRED MICROSOFT FABRIC SKILLS:
Strong grasp of OneLake concepts: lakehouses vs. warehouses, shortcuts, mirroring, item/workspace structure.
Hands-on with Delta Lake (Parquet, Delta tables, partitioning, V-Order, Z-ordering, vacuum retention).
Understanding of Direct Lake, Import, and DirectQuery trade-offs and when to use each.
Experience designing star schemas and modern medallion architectures (bronze/silver/gold); see the PySpark sketch after this list.
Spark/PySpark notebooks (jobs, clusters, caching, optimization, broadcast joins).
Data Factory in Fabric (Pipelines): activities, triggers, parameterization, error handling/retries.
Dataflows Gen2 (Power Query/M) for ELT, incremental refresh, and reusable transformations.
Building/optimizing semantic models; DAX (measures, calculation groups, aggregations).
Ability to multitask, think on his/her feet and react, apply attention to detail and follow up, and work effectively and collegially with management staff and end users.
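As a brief sketch of the Delta Lake and medallion-architecture items listed above, the following PySpark shows a bronze-to-silver step with partitioning, Z-ordering, and vacuum retention; the table names, paths, and columns are assumptions rather than part of this posting, and the optimize/vacuum calls rely on the delta-spark API available in Fabric Spark runtimes.

```python
# Bronze-to-silver medallion sketch (illustrative only; paths, tables, and columns are hypothetical).
# Assumes a Microsoft Fabric notebook with a lakehouse attached and a predefined `spark` session.
from pyspark.sql import functions as F
from delta.tables import DeltaTable

# Read the raw bronze table and apply light cleansing/typing for the silver layer.
bronze = spark.read.format("delta").load("Tables/bronze_orders")

silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .filter(F.col("order_id").isNotNull())
)

# Write the silver table, partitioned by a commonly filtered column.
(
    silver.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("silver_orders")
)

# Compact small files and Z-order by a frequent join/filter key, then apply vacuum retention.
DeltaTable.forName(spark, "silver_orders").optimize().executeZOrderBy("customer_id")
spark.sql("VACUUM silver_orders RETAIN 168 HOURS")
```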
ManpowerGroup is committed to providing equal employment opportunities in a professional, high quality work environment. It is the policy of ManpowerGroup and all of its subsidiaries to recruit, train, promote, transfer, pay and take all employment actions without regard to an employee's race, color, national origin, ancestry, sex, sexual orientation, gender identity, genetic information, religion, age, disability, protected veteran status, or any other basis protected by applicable law.
Engineer • Carmel, IN, US