Start / End Dates: 08/12/2025 - 07/12/2026
Work Location: US Remote; strong preference for a resource on the West Coast
Max Bill Rate: BR/Hr
Job Description: We are seeking an experienced Data Engineer to support our infrastructure work. This role will be primarily responsible for building and modifying high-complexity pipelines and alerting systems for our team in Legal.
We're looking for someone senior, with at least 5+ years of experience in SQL and Python-based pipeline development (Dataswarm, Airflow, etc.). Beyond years of experience, given the technical complexity of the role, we'd ideally like someone who has worked on data warehousing and has built (and led end-to-end projects to build) high-complexity monitoring and alerting systems. We're a very cross-functional (XFN) team, so someone who is communicative, collaborative, comfortable navigating ambiguity, tenacious in finding answers, and cognizant of privacy and security would do very well in this role.
Pipeline development: Engineers are responsible for scheduling and querying massive tables, ensuring efficient data extraction and transformation (a brief illustrative pipeline sketch follows this list of responsibilities).
Collaborating with attorneys: Work closely with attorneys and Data Scientists to understand what data is needed, where it is located, when it is required, and which tables are relevant.
Cross-functional collaboration: Collaborate with various XFN teams to launch new systems and products. Responsibilities include preparing documentation, coordinating with multiple stakeholders, developing a plan for building the system, identifying collaborators, and ensuring successful execution.
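For context only, here is a minimal sketch of the kind of scheduled extract/transform pipeline work described above, assuming a recent Airflow 2.x installation. The DAG id, schedule, and callable body are hypothetical placeholders, not the team's actual code.

```python
# Illustrative only: a minimal Airflow DAG of the kind of scheduled
# extract/transform work described above. The DAG id and logic are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_transform(**context):
    """Placeholder for querying a large source table and writing a
    transformed result to a downstream reporting table."""
    # In a real pipeline this would run parameterized SQL against the
    # warehouse (e.g. via a provider hook) and validate the output.
    print(f"Running extract/transform for {context['ds']}")


with DAG(
    dag_id="legal_data_pipeline_example",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_and_transform",
        python_callable=extract_and_transform,
    )
```

In practice, pipelines like this would run parameterized SQL against the warehouse and feed the monitoring and alerting checks referenced in the requirements below.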
The client would love to see "non-traditional" candidates and would prefer someone who is hungry for a new opportunity and excited to learn over someone who is overly comfortable and not a go-getter. They think this would be a great fit for someone who may be looking for a career pivot and has prior experience leading end-to-end projects, for example candidates who have worked at startups and want to break into big tech.
Nice-to-Have Skills
Volunteer work or an internship with a law firm, or experience working in privacy.
Experience remediating data loss.
Must-Have Skills
5+ years of experience with SQL and Python-based pipeline development (Dataswarm, Airflow, etc.)
Experience with data warehousing and with building (and leading end-to-end projects to build) high-complexity monitoring and alerting systems (a brief alerting sketch follows this list).
Experience building ETL pipelines
Strong preference for someone based on the West Coast.
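For illustration only, a minimal sketch of the row-count style data-quality check that the monitoring and alerting requirement refers to; the function name, table name, and threshold are hypothetical placeholders.

```python
# Illustrative only: a minimal data-quality check of the kind used in
# monitoring/alerting pipelines. Table name and threshold are hypothetical.
def check_row_count(table: str, actual_rows: int, min_rows: int) -> None:
    """Fail loudly if a daily load looks incomplete, so the orchestrator
    (e.g. Airflow) can mark the task failed and trigger its alerting
    callbacks or on-call notifications."""
    if actual_rows < min_rows:
        raise ValueError(
            f"{table}: expected at least {min_rows} rows, got {actual_rows}"
        )


# Example usage: would raise and surface an alert for an under-filled load.
# check_row_count("legal_holds_daily", actual_rows=12, min_rows=1000)
```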