Data Architect - Senior

Acestack • United States

3 days ago
Job type
  • Full-time
Job description

Project Name :

Data Management Platform Projects

Scope :

The Government of Alberta's modernization initiatives are shifting legacy systems to a cloud-native Azure Data Management Platform, alongside on-premises geospatial systems. This transformation requires a Data Architect to design, implement and manage scalable, secure, and integrated data solutions.

Ministries such as Environment and Protected Areas, Transportation and Economic Corridors, and Service Alberta rely on complex data from systems like ServiceNow, ERP platforms, and geospatial tools. The Data Architect will enable seamless ingestion, transformation, and integration of this data using Azure services including Data Factory, Synapse Analytics, Data Lake Storage, and Purview.

Azure Databricks will be used to support advanced data engineering, analytics, and machine learning workflows. The Data Architect will ensure that data pipelines are optimized for both batch and real-time processing, supporting operational reporting, predictive modeling, and automation.

Downstream systems will consume data via APIs and data services. The Data Architect will design and manage these interfaces using Azure API Management, ensuring secure, governed, and scalable access to data.

Security, governance, and compliance are critical. The Data Architect will implement role-based access controls, encryption, data masking, and metadata management to meet FOIP and other regulatory requirements.

As data volumes and complexity grow, the Data Architect will ensure the platform remains extensible, reliable, and future-ready, supporting new data sources, ministries, and analytical capabilities.

Duties :

Design and implement scalable, secure, and high-performance data architecture on Microsoft Azure, supporting both cloud-native and hybrid environments.

Lead the development of data ingestion, transformation, and integration pipelines using Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.

Architect and manage data lakes and structured storage solutions using Azure Data Lake Storage Gen2, ensuring efficient access and governance.

Integrate data from diverse source systems, including ServiceNow and geospatial systems, using APIs, connectors, and custom scripts.

Develop and maintain robust data models and semantic layers to support operational reporting, analytics, and machine learning use cases.

Build and optimize data workflows using Python and SQL for data cleansing, enrichment, and advanced analytics within Azure Databricks.

Design and expose secure data services and APIs using Azure API Management for downstream systems.

Implement data governance practices, including metadata management, data classification, and lineage tracking.

Ensure compliance with privacy and regulatory standards (e.g., FOIP, GDPR) through role-based access controls, encryption, and data masking.

Collaborate with cross-functional teams to align data architecture with business requirements, program timelines, and modernization goals.

Monitor and troubleshoot data pipelines and integrations, ensuring reliability, scalability, and performance across the platform.

Provide technical leadership and mentorship to data engineers and analysts, promoting best practices in cloud data architecture and development.

Other duties as needed

Equipment Requirements :

The resource will require their own equipment.

"The contractor is responsible for providing all necessary equipment, including computers, software, printers, supplies, desks, and chairs. Computers must run a modern version of Windows (preferred) or MacOS that is compatible with Azure Virtual Desktop (AVD) and related remote access software, which will be installed on the resource's computer. The Province will ensure the contractor's resources have the required access and credentials to the Government of Alberta's systems."

Working Hours :

Standard hours of work are 08:15 to 16:30 Alberta time, Monday through Friday, excluding holidays observed by the Province.

Work must be done from within Canada, due to network and data security issues.

Notes on Location :

The resource will primarily work remotely but must be available for on-site meetings as required. These meetings may involve strategic, analytical, or technical discussions and may include engagement with team members, senior managers, directors, executive directors, or business clients.

On-Site Meeting Frequency :

Meetings may occur up to 3-4 times per fiscal month, but the actual frequency will depend on the specific initiative and will be determined on an on-demand basis.

On-Site Location :

Meetings will take place in Edmonton, Alberta, at one of the Government of Alberta buildings.

The resource will be provided with details regarding the meeting location in advance.

This policy ensures flexibility for remote work while maintaining the ability to collaborate effectively during key on-site engagements. Travel time and any associated expenses to and from Edmonton, and/or for travel within Alberta, will be at no cost to the Province.

Incumbency :

This is a net new role.

Background Check Required :

Standard Background Check :

The Supplier shall, prior to commencement of the Services, provide the Province, on its request and at no cost to the Province, with criminal record checks.

Mandatory Training Courses :

Once hired, the resource will be required to complete all mandatory training, including but not limited to Freedom of Information and Protection of Privacy Act (FOIP) training and security awareness training. There may also be some optional courses.

Anticipated Interview Dates :

Interviews will be held between November 6 and 10, 2025.

Refer to the Job Posting attachments for the proposed form of contract applicable to this Contingent Resource Request

Scoring Methodology :

Financial / Pricing : 20%

Resource Qualifications : 20%

Interview Process : 60%

SUBMISSION MUST INCLUDE :

RESUME

ALL REQUIRED EXPERIENCE MUST BE DESCRIBED IN RESUME UNDER THE JOB / PROJECT WHERE EXPERIENCE WAS ATTAINED.

EACH JOB / PROJECT MUST CONTAIN THE TERM OF THE JOB / PROJECT IN THE FORMAT MMM / YYYY to MMM / YYYY.

RESOURCE REFERENCES

Three references, for whom similar work has been performed, must be provided. The most recent reference should be listed first. Reference checks may or may not be completed to assist with scoring of the proposed resource.


Additional Details

  • Payment Terms : Y030 - due net within 30 days
  • Maximum Extension Term (Months) : 12

Qualification

Assessment

Must Have

Education

Yes / No - College diploma or Bachelor's degree in Computer Science or a related field of study.

Yes

Work Experience

Duration - Databricks Platform Administration and Operational Optimization

3 years

Duration - Enterprise-Wide Data Architecture and Strategic Alignment

8 years

Duration - Experience Designing Analytics-Ready Data Platforms and Enabling Business Insights

4 years

Duration - Experience using version control systems

4 years

Duration - Experience with Azure Infrastructure, Services, and Authentication

5 years

Duration - Hands-on Experience in Python and SQL for Data Engineering

6 years

Duration - Hands-on Experience with Azure Databricks and Delta Lake

3 years

Nice to Have

Professional Licenses / Certification

Yes / No - Certification in The Open Group Architecture Framework (TOGAF).

Yes

Work Experience

Duration - AI-driven code generation, analysis & automation

1 year

Duration - Direct, hands-on experience performing business requirement analysis related to data...

8 years

Duration - Experience and strong technical knowledge of Microsoft SQL Server, including database...

8 years

Duration - Experience building scalable ETL pipelines.

2 years

Duration - Experience in data governance, security, and metadata management within a Databricks...

2 years

Duration - Experience in Designing and Integrating RESTful APIs.

3 years

Duration - Experience in Message Queueing Technologies, implementing message queuing using tools...

3 years

Duration - Experience working with cross-functional teams.

5 years

Duration - Experience working with ServiceNow and Azure-based Data Management.

1 year
