Python Programming Jobs in Costa Mesa, CA
Create a job alert for this search
Python programming • Costa Mesa, CA
- New listing: Senior Data Architect (Azure Databricks / Python)
  loanDepot | Irvine, California
- GenAI Python Systems Engineer-Director
  PricewaterhouseCoopers Advisory Services LLC | Irvine, CA
- Senior Manager, Statistical Programming
  BioSpace | Irvine, CA, United States
- Recreation Programming Assistant
  Chapman University | Orange, CA, US
- Senior Software Engineer (Python)
  NetApp | Irvine, CA, US
- Programming / Data Analytics (Adjunct)
  Maricopa Community Colleges | Mesa Community College
- Firmware Engineer - Python Test Automation
  Anduril | Costa Mesa, California, United States
- Senior Manager, Statistical Programming
  AbbVie, Inc | Irvine, CA, United States
- Principal, Statistical Programming
  Edwards Lifesciences | USA, California - Irvine
- Programmer Analyst (Programming / Analysis)
  Boeing | Irvine, CA, United States
- Software Engineer - Backend (Python)
  LingaTech, Inc. | Irvine, CA, United States
- Software Engineer / Senior Software Engineer (Strong Python)
  Autonomous Medical Devices Incorporated | Santa Ana, California, United States
- Lead Python Data Engineer (Capital Markets)
  Lorven Technologies | Irvine, California, United States

Senior Data Architect (Azure Databricks / Python)
loanDepot | Irvine, California | Full-time
Description
Position at loanDepot

Position Summary:
The Senior Data Architect is responsible for designing, developing, and maintaining robust, scalable, and high-performance enterprise data architectures within a modern cloud environment. This role requires deep expertise in Big Data solutions (Delta Lake architecture), modern data warehouse practices and operations, semantic layering, dimensional modeling (star schemas), transactional OLTP databases (3NF modeling), and advanced data modeling techniques.

The ideal candidate will have at least 15 years of data modeling experience specifically within Business Intelligence and Analytics contexts, extensive hands-on experience with batch and streaming data processing, and strong expertise with Apache Spark, Databricks, and Spark Structured Streaming. Required skills include proficiency in Python programming, Azure cloud technologies, semantic modeling, and modern CI/CD deployment practices. Experience in ML engineering is highly desirable.

The candidate must be able to collaborate quickly and effectively with data and engineering teams, clearly document source-to-target mappings, and reverse engineer existing database objects such as stored procedures, views, and complex SQL queries.
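For candidates less familiar with the dimensional-modeling terminology above, the idea behind a star schema is that a fact table of additive measures references compact dimension tables by surrogate key. The following is a minimal plain-Python sketch of that pattern; every table, column, and value here is a hypothetical illustration, not taken from the posting:

```python
# Star-schema sketch: a date dimension plus a fact table of measures.
# All names and figures below are hypothetical illustrations.

# Dimension table: surrogate key -> descriptive attributes
dim_date = {
    1: {"date": "2024-01-15", "month": "2024-01", "quarter": "Q1"},
    2: {"date": "2024-02-20", "month": "2024-02", "quarter": "Q1"},
}

# Fact table: each row carries surrogate keys and additive measures
fact_fundings = [
    {"date_key": 1, "amount": 350_000},
    {"date_key": 1, "amount": 420_000},
    {"date_key": 2, "amount": 275_000},
]

def fundings_by_quarter(facts, dates):
    """Join facts to the date dimension and sum amounts per quarter."""
    totals = {}
    for row in facts:
        quarter = dates[row["date_key"]]["quarter"]
        totals[quarter] = totals.get(quarter, 0) + row["amount"]
    return totals

print(fundings_by_quarter(fact_fundings, dim_date))  # {'Q1': 1045000}
```

In a real Databricks / Delta Lake deployment the same join-and-aggregate would run in Spark over Delta tables; the point of the sketch is only the key-based fact-to-dimension relationship that star schemas optimize for.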
Responsibilities :
- Lead architecture design and implementation of enterprise-scale data platforms leveraging Databricks, Delta Lake, Azure cloud, and modern Big Data technologies.
- Design, build, and maintain modern data warehouse solutions using dimensional modeling (star schema) and semantic layering to optimize analytics and reporting capabilities.
- Define and enforce data modeling standards, guidelines, and best practices within analytics and BI contexts.
- Architect robust batch processing and real-time streaming solutions using Apache Spark, Databricks, Kafka, Kinesis, and Spark Structured Streaming.
- Effectively collaborate with engineering teams to rapidly deliver data architecture solutions and support agile development practices.
- Provide clear, comprehensive source-to-target documentation, data lineage mappings, and semantic layer definitions.
- Reverse engineer existing database structures, including stored procedures, views, and complex SQL logic, to document existing data processes and support modernization initiatives.
- Provide technical leadership, mentoring, and guidance to data engineering teams, ensuring alignment with architectural standards and best practices.
- Evaluate and continuously improve existing data architectures, optimize performance, and recommend enhancements for efficiency and scalability.
- Collaborate closely with stakeholders to define long-term data strategies and clearly communicate architectural decisions.
- Ensure compliance with industry standards, data governance practices, regulatory requirements, and security guidelines.
- Champion modern DevOps and CI/CD practices for data and analytics pipelines.
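The source-to-target documentation called out in the responsibilities above can be kept machine-readable so it doubles as executable transformation logic. A minimal hypothetical sketch (column names and transforms are invented for illustration, not loanDepot's actual format):

```python
# Hypothetical source-to-target mapping for one target table.
# Each entry: target column -> (source column, transform applied).
SOURCE_TO_TARGET = {
    "loan_id":    ("src_loan_nbr", lambda v: v.strip()),
    "funded_amt": ("src_amt_cents", lambda v: v / 100),
    "state_code": ("src_state", lambda v: v.upper()),
}

def apply_mapping(source_row, mapping):
    """Produce a target row by applying each documented transform."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in mapping.items()}

row = {"src_loan_nbr": " A123 ", "src_amt_cents": 2500000, "src_state": "ca"}
print(apply_mapping(row, SOURCE_TO_TARGET))
# {'loan_id': 'A123', 'funded_amt': 25000.0, 'state_code': 'CA'}
```

Keeping the mapping as data rather than prose makes lineage documentation testable and reviewable in the same CI/CD pipeline as the code it describes.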
Requirements:
Preferred Qualifications:
Why work for #teamloanDepot:

loanDepot (NYSE: LDI) is a digital commerce company committed to serving its customers throughout the home ownership journey. Since its launch in 2010, loanDepot has revolutionized the mortgage industry with a digital-first approach that makes it easier, faster, and less stressful to purchase or refinance a home. Today, as the nation's second largest non-bank retail mortgage lender, loanDepot enables customers to achieve the American dream of homeownership through a broad suite of lending and real estate services that simplify one of life's most complex transactions. With headquarters in Southern California and offices nationwide, loanDepot is committed to serving the communities in which its team lives and works through a variety of local, regional, and national philanthropic efforts.

Base pay is one part of our total compensation package and is determined within a range. This provides the opportunity to progress as you grow and develop within a role. The base pay for this role is between $186,000 and $210,000 per year. Your base pay will depend on multiple individualized factors, including your job-related knowledge/skills, qualifications, experience, and market location.

We are an equal opportunity employer and value diversity in our company. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.