We are looking for a senior-level Data Platform Engineer who will own the entire data lifecycle, from raw ingestion to clean, standardized tables to production-grade KPI calculation and API exposure. This is a true "full-stack data" role: you will deeply understand every data source, build and maintain the PostgreSQL core, implement all business-critical metrics and calculations, and expose them reliably via FastAPI (or GraphQL) endpoints consumed by our applications and dashboards. This is a high-ownership, hands-on engineering position with end-to-end responsibility for data correctness, performance, and usability.

Key Responsibilities
Proactively request, receive, and integrate data from numerous external and internal sources in varied formats (CSV, JSON, Excel, APIs, database dumps, etc.). Build robust, automated Python-based ingestion pipelines that parse, validate, cleanse, enrich, and standardize incoming data. Transform heterogeneous sources into a single, consistent schema in PostgreSQL with full auditability and error handling. Ensure compliance with FAIR(T) principles; familiarity with the ISO 8000 data-quality standard is expected.
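For illustration only, a minimal sketch of the kind of ingestion step this role owns, assuming pandas and SQLAlchemy; the connection string, file layout, staging table, and column names are hypothetical placeholders:

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string; real pipelines would be configured,
# scheduled, and wrapped in full audit logging and error handling.
engine = create_engine("postgresql+psycopg2://user:pass@localhost/warehouse")

def ingest_vendor_csv(path: str) -> int:
    """Parse, validate, standardize, and load one vendor CSV extract."""
    df = pd.read_csv(path)

    # Standardize column names to the warehouse convention.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

    # Basic validation: drop rows missing required keys.
    required = ["asset_id", "site_id", "reported_at"]
    bad = df[df[required].isna().any(axis=1)]
    df = df.drop(bad.index)

    # Cleanse and enrich: parse timestamps, tag the source for auditability.
    df["reported_at"] = pd.to_datetime(df["reported_at"], errors="coerce", utc=True)
    df["source_file"] = path

    # Load into a staging table; promotion to core tables happens in SQL.
    df.to_sql("stg_vendor_feed", engine, if_exists="append", index=False)
    return len(df)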
Design, evolve, and performance-tune PostgreSQL schemas, tables, indexes, materialized views, and complex SQL functions. Maintain referential integrity, slowly changing dimensions, and reference/lookup tables. Develop intimate, expert-level knowledge of every field, entity, and business meaning in the database. Own database migrations, versioning, documentation, and query optimization.
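Again purely illustrative, a sketch of the kind of reporting rollup this involves, assuming SQLAlchemy with psycopg2; the view, table, and column names are hypothetical:

from sqlalchemy import create_engine, text

# Hypothetical rollup for reporting: a materialized view over a core table,
# indexed for the most common lookup pattern and refreshed after ingestion.
CREATE_VIEW = """
CREATE MATERIALIZED VIEW IF NOT EXISTS mv_daily_utilization AS
SELECT site_id,
       date_trunc('day', reported_at) AS day,
       avg(utilization_pct)           AS avg_utilization
FROM   asset_readings
GROUP  BY site_id, date_trunc('day', reported_at)
"""
CREATE_INDEX = """
CREATE INDEX IF NOT EXISTS ix_mv_daily_utilization
    ON mv_daily_utilization (site_id, day)
"""

engine = create_engine("postgresql+psycopg2://user:pass@localhost/warehouse")

def refresh_rollups() -> None:
    with engine.begin() as conn:
        conn.execute(text(CREATE_VIEW))
        conn.execute(text(CREATE_INDEX))
        conn.execute(text("REFRESH MATERIALIZED VIEW mv_daily_utilization"))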
Translate business requirements into precise, reproducible KPI calculations (revenue, utilization, compliance ratios, performance scores, etc.). Implement all KPIs as reusable, version-controlled logic in both SQL (stored functions/views) and Python modules. Guarantee 100% consistency between metrics shown in dashboards, APIs, and reports. Build a central, well-documented library of metrics that the entire company trusts as the single source of truth.
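As a hedged sketch of what "one KPI, two implementations, guaranteed consistent" could look like in practice (the table, column, and function names are hypothetical, and SQLAlchemy is assumed):

from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://user:pass@localhost/warehouse")

# The same KPI defined once in SQL (for dashboards and reports) and once in
# Python (for the API layer), with a check asserting they never diverge.
UTILIZATION_SQL = text("""
    SELECT sum(active_hours) / NULLIF(sum(available_hours), 0) AS utilization
    FROM   asset_usage
    WHERE  site_id = :site_id AND day BETWEEN :start AND :end
""")

def utilization(active_hours: float, available_hours: float) -> float | None:
    """Utilization KPI: active hours as a share of available hours."""
    return None if available_hours == 0 else active_hours / available_hours

def check_consistency(site_id: int, start: str, end: str) -> None:
    """Assumes the requested range contains data for the given site."""
    with engine.connect() as conn:
        sql_value = conn.execute(
            UTILIZATION_SQL, {"site_id": site_id, "start": start, "end": end}
        ).scalar_one()
        totals = conn.execute(
            text("SELECT sum(active_hours), sum(available_hours) "
                 "FROM asset_usage WHERE site_id = :site_id "
                 "AND day BETWEEN :start AND :end"),
            {"site_id": site_id, "start": start, "end": end},
        ).one()
    assert abs(sql_value - utilization(totals[0], totals[1])) < 1e-9

Checks like this, run in CI against both definitions, are one way the metrics library stays the single source of truth.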
Design and maintain a clean, performant API layer using FastAPI (or GraphQL) that exposes KPIs, aggregated datasets, and granular queries to frontend applications, dashboards, and third-party consumers. Create parameterized, secure, and cache-friendly endpoints with clear contracts and OpenAPI/schema documentation. Optimize query performance for high-concurrency API traffic and large reporting workloads.
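A minimal FastAPI sketch of such an endpoint, with hypothetical route, parameter, and model names; the KPI value is stubbed so the example is self-contained:

import datetime as dt
from fastapi import FastAPI, Query
from pydantic import BaseModel

app = FastAPI(title="KPI API")

class UtilizationOut(BaseModel):
    site_id: int
    start: dt.date
    end: dt.date
    utilization: float | None

@app.get("/kpis/utilization", response_model=UtilizationOut)
def get_utilization(
    site_id: int = Query(..., description="Canonical site identifier"),
    start: dt.date = Query(...),
    end: dt.date = Query(...),
) -> UtilizationOut:
    # In a real service this would call the shared KPI library backed by
    # PostgreSQL; a constant stands in here to keep the sketch runnable.
    return UtilizationOut(site_id=site_id, start=start, end=end, utilization=0.87)

FastAPI derives the OpenAPI schema and interactive documentation automatically from the parameter and response-model declarations, which is what keeps the endpoint contracts explicit for consumers.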
Define and enforce canonical data models for core entities (assets, sites, transactions, customers, events, classifications, etc.). Keep the data warehouse clean, de-duplicated, and future-proof as new sources and features are added. Collaborate closely with engineering, product, and analytics teams to evolve models without breaking existing consumers.
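One possible shape of such canonical models, sketched with SQLAlchemy 2.0 declarative mappings; the entity names follow the posting, but the fields are hypothetical:

from sqlalchemy import ForeignKey
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column

class Base(DeclarativeBase):
    pass

class Site(Base):
    __tablename__ = "sites"
    site_id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]

class Asset(Base):
    __tablename__ = "assets"
    asset_id: Mapped[int] = mapped_column(primary_key=True)
    site_id: Mapped[int] = mapped_column(ForeignKey("sites.site_id"))
    classification: Mapped[str]

Versioning these models alongside migrations is one way to evolve the warehouse without breaking existing consumers.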
Required skills include expert-level PostgreSQL (advanced SQL, performance tuning).
Job Type: Full-time
Pay: $100,000.00 - $130,000.00 per year
Benefits:
Work Location: Hybrid remote in Bridgewater, NJ 08807