Lead Specialist, Azure Data Solutions - 25050

World Wildlife Fund, Washington, DC, United States
Job type
  • Full-time
Job description

Job Locations

US-DC-Washington

Overview

World Wildlife Fund (WWF), one of the world's leading conservation organizations, seeks a Lead Specialist, Azure Data Solutions.

The Lead Specialist, Azure Data Solutions is a hands-on data engineering role responsible for designing, implementing, and maintaining end-to-end solutions using Microsoft Fabric and Azure services. This role involves configuring and managing Fabric workspaces, developing data models and pipelines, integrating data sources into OneLake and Blob Storage, and ensuring data quality and integrity. The specialist also creates and manages Power BI reports and dashboards, optimizes solutions for performance and reliability, and stays updated with the latest Fabric and Azure best practices.

The role also serves as a trusted proxy for the Director of Data Solutions in meetings, representing the team's work and providing guidance on data best practices.

Salary Range: $126,600 to $181,900

Please note: Applicants must be legally authorized to work in the U.S. This position is not eligible for employment visa sponsorship. In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification form upon hire.

  • All applicants are strongly encouraged to submit a cover letter
  • This position is NOT eligible for relocation
  • This position is hybrid - 2 days per week at our DC office

Responsibilities

Designs, implements, and maintains data solutions using Microsoft Fabric and Azure services.

  • Configures and manages Fabric workspaces, semantic models, dataflows, and pipelines, applying best practices such as medallion architecture.
  • Designs Fabric architectures and environments to ensure scalability, reliability, and governance.
  • Supports the integration of data sources into OneLake, Azure SQL Database, and Azure Blob Storage.
  • Develops and maintains ETL/ELT processes using Python and PySpark with Fabric Data Engineering and Azure Data Factory (a brief illustrative sketch follows this list).
  • Builds semantic models (e.g., star schema) and delivers both simple and advanced Power BI reports.
  • Evaluates and migrates legacy workflows and systems into Microsoft Fabric, modernizing pipelines and solutions.
  • Ensures data quality and integrity through data validation and cleansing processes.
  • Understands Azure permissions and access requirements (e.g., service principals, security groups, least-privilege) to coordinate requests with infrastructure/security teams.
  • Monitors and optimizes Fabric data solutions for performance, scalability, and cost efficiency.
  • Provides technical support and troubleshooting for data-related issues.
  • Collaborates with other WWF teams building their own pipelines, offering best practices guidance or stepping in to support delivery.
  • Identifies opportunities to enhance Microsoft Fabric and Azure data services and stays current with the latest updates and best practices.
  • Performs other duties as assigned
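
For illustration only, the sketch below shows the kind of medallion-style PySpark transformation this role describes: raw ("bronze") records are cleansed and validated into a "silver" table. The path, table, and column names are hypothetical; in a Fabric Data Engineering notebook, a Spark session is already available as "spark".

    from pyspark.sql import SparkSession, functions as F

    # Fabric notebooks provide a session automatically; creating one
    # here just keeps the sketch self-contained.
    spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

    # Bronze: land raw records as-is (hypothetical OneLake path).
    bronze = spark.read.json("Files/raw/donations/")

    # Silver: validate and cleanse -- deduplicate on the business key,
    # enforce types, and drop rows failing basic quality checks.
    silver = (
        bronze
        .dropDuplicates(["donation_id"])
        .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
        .withColumn("donated_at", F.to_timestamp("donated_at"))
        .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
    )

    # Persist the cleansed layer as a Delta table for downstream use.
    silver.write.mode("overwrite").format("delta").saveAsTable("silver_donations")

A "gold" layer of business-level aggregates, and the star-schema semantic models behind the Power BI reports, would then build on tables like this one.
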
Key Competencies

Core capabilities for this role include analytical thinking, problem solving, communication skills, and business knowledge. Specifically, the following competencies are essential:

  • Interpersonal Communication and Collaboration - Collaborates effectively with colleagues and receives and incorporates feedback well.
  • Technical Adaptability - Embraces new technologies, adjusts to workflow modifications, and thrives in dynamic environments.
  • Data Analysis/Analytical Thinking - Investigates problems and finds the ideal solution in a timely, efficient manner; reads, understands, and interprets data to solve business challenges or to create a compelling narrative that informs business decisions.
  • Big Picture Thinking - Solves complex problems with a "big picture" perspective and a sensitivity to many moving parts.

Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or a related field

MUST HAVE:

  • Minimum 6 years of relevant data experience.
  • Strong, hands-on expertise in Microsoft Fabric (workspaces, semantic modeling, dataflows, pipelines, governance).
  • Proficiency with SQL; SQL DBA experience is a plus.
  • Experience with Python and PySpark for data engineering and workflow automation.
  • Experience integrating and managing data in OneLake, Azure SQL Database, and Azure Blob Storage.
  • Experience with Power BI (data modeling, DAX, visualization best practices).
  • Knowledge of data architecture best practices in Fabric (e.g., medallion design, workspace setup).
  • Experience migrating data pipelines or systems from legacy platforms into Microsoft Fabric.
  • Strong analytical and problem-solving skills.
  • Excellent communication and teamwork abilities, including the ability to communicate effectively with all levels of the organization and to explain difficult concepts, new technology, and new ways of working.
  • Willingness to learn and adapt to new technologies and methodologies.
  • A self-starter who brings the energy and resilience to continually drive improvements and change.
  • Committed to building and strengthening a culture of inclusion within and across teams.

Preferred:

  • Experience with other Azure services such as Azure Synapse Analytics and Azure Databricks.
  • Familiarity with Azure AI tools (Azure AI Foundry, Cognitive Services, ML frameworks).
  • Experience advising or supporting SQL DBA activities in Azure managed environments (e.g., dedicated SQL databases, VMs).
  • Familiarity with data governance, metadata management, and security best practices.
  • Experience guiding other teams in data engineering best practices and solution alignment.
  • Experience working in Agile or iterative project environments.
  • Awareness of cloud cost optimization practices in Azure and Fabric.
  • Relevant Azure and/or Microsoft Fabric certifications.
  • Identifies and aligns with WWF's core values:
  • COURAGE - We demonstrate courage through our actions, we work for change where it's needed, and we inspire people and institutions to tackle the greatest threats to nature and the future of the planet, which is our home.
  • INTEGRITY - We live the principles we call on others to meet. We act with integrity, accountability, and transparency, and we rely on facts and science to guide us and to ensure that we learn and evolve.
  • RESPECT - We honor the voices and knowledge of people and communities that we serve, and we work to secure their rights to a sustainable future.
  • COLLABORATION - We deliver impact at the scale of the challenges we face through the power of collective action and innovation.
To Apply:

  • ","469777815" : "hybridMultilevel"}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1">
  • Submit cover letter and resume through our Careers Page, Requisition #25050.
  • ","469777815" : "hybridMultilevel"}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="1">
  • Due to the high volume of applications, we are not able to respond to inquiries via phone.
  • World Wildlife Fund (WWF) promotes equal employment opportunities for all qualified individuals regardless of age, race, color, sex, religion, national origin, disability, or veteran status, or any other characteristic protected under applicable law.
