Base pay range: $100,000.00/yr - $150,000.00/yr
OptionMetrics, headquartered in New York, NY, is a dynamic and innovative technology company providing financial information and research derived from the option markets. We empower businesses with solutions that drive data-driven decisions. Our data and analytics models are used by over 350 investment banks, hedge funds, asset management firms, and academic institutions worldwide.
OptionMetrics is an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law. We do not discriminate against applicants who inquire about or discuss pay.
What you'll do:
- Ensure the smooth and reliable flow of data that powers our analytics and products.
- Monitor, maintain, and improve systems and pipelines that deliver critical financial data to clients worldwide.
- Implement quality checks and data validation processes to ensure accuracy and integrity.
- Collaborate with development and infrastructure teams to build resilient, efficient, and secure workflows.
- Contribute to process improvements, optimize performance across cloud platforms, and support data delivery speed, integrity, and consistency.
Objectives:
- Customer Distribution & Delivery – Deliver data feeds/files to customers (SFTP, APIs, etc.) with integrity checks and audit logs.
- Exception Handling – Build processes for handling late/missing vendor data, schema changes, and bad data without disrupting downstream systems.
- Resiliency – Incorporate backup/retry logic, failover planning, and disaster-recovery support in pipelines.
- Audit Support – Maintain data lineage, access logs, and documentation.
- Continuous Improvement – Identify repetitive operational tasks and propose efficiency improvements.
- Collaboration with Dev & Infra Teams – Ensure quality-designed workflows, platform resiliency, and secure data-handling practices are enforced.
- SLA Adherence – Track data processing against SLAs.
Responsibilities:
- Monitor and maintain ETL/ELT workflows for uptime and performance.
- Implement data validation, quality checks, and reconciliation to ensure accuracy.
- Reduce mean time to detect (MTTD) and mean time to resolve (MTTR) for data issues.
- Troubleshoot and resolve operational issues in data pipelines.
- Partner with Dev and Data QA teams to ingest and normalize feeds from market data vendors.
- Optimize workflows for cost and performance across cloud platforms.
- Maintain documentation and runbooks for operational processes.
- Support incident management and root-cause analysis for data movement failures.
- Visualize data flows and error tracking.
Required skills and qualifications:
- Strong SQL and scripting skills (Python, Bash).
- Experience with workflow orchestration.
- Familiarity with data warehouses such as Snowflake.
- Experience with FTP and similar data distribution platforms.
- Experience with cloud-native data services such as AWS SQS, plus general AWS knowledge.
- Basic observability skills (monitoring, logging, alerting).
- Financial data knowledge is preferred.
What we offer:
- A collaborative environment where everyone's input makes a difference.
- Paid time off: vacation, personal, sick days, and holidays.
- Pre-tax commuter benefits (NJT, MTA, etc.) and 401(k) plan.
- Full medical and dental insurance coverage.