Job Type
Full-time
Description
As a Senior Data Pipeline Engineer, you will play a crucial role in designing, building, and maintaining robust data pipelines for our customers. Your expertise will drive the efficient collection, storage, processing, and transformation of large-scale data sets in support of the U.S. Army's energy and water use metering program. The total solution is a system-of-systems, with developed applications integrating with the Army enterprise cloud and data environment. You will work closely with cross-functional teams to ensure seamless integration and optimal performance.
Responsibilities:
Pipeline Development:
- Design, develop, and optimize end-to-end data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Collaborate with cross-functional teams to understand data requirements and translate them into scalable pipeline solutions.
- Implement best practices for data integration, ensuring high performance, reliability, and scalability.
Data Transformation and Quality:
- Transform raw data into usable formats, ensuring data quality, consistency, and accuracy.
- Handle data validation, cleansing, and error handling to maintain data integrity.
- Monitor and proactively maintain data pipelines to ensure high service availability.
Performance Optimization:
- Continuously improve pipeline performance by identifying bottlenecks and implementing optimizations.
- Work with cloud-based technologies (e.g., AWS, GCP, Azure) to enhance scalability and efficiency.
Collaboration and Leadership:
- Partner with Data Scientists, Analysts, and other stakeholders to understand their data needs.
- Lead discussions on system enhancements, process improvements, and data governance.
- Mentor junior engineers and contribute to the growth of the data engineering team.
Requirements
REQUIRED QUALIFICATIONS:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a focus on building and maintaining data pipelines.
- 1-2 years of experience building NiFi data flows (or similar) for Kafka and Hadoop-based NoSQL databases.
- Proficiency in ETL tools, SQL, and scripting languages (Python, Scala, etc.).
- Experience with Data Catalog and Accumulo indexes for information retrieval and discovery.
- Experience with API-led design.
- Experience with the ELK stack a plus.
- Experience with cloud-based data platforms (e.g., AWS S3, Redshift, Google BigQuery).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.
- Security+ Certification.
- Active US Government clearance at the Secret level or higher.
- Effective written and verbal communication skills for collaboration with both customers and fellow team members.
- Ability to sit for extended periods of time.
- Ability to regularly lift at least 25 pounds.
- Ability to commute to the designated onsite work location as required.

QBE is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender, gender identity and/or expression, age, disability, Veteran status, genetic information, pregnancy (including childbirth, lactation, or other related medical conditions), marital status, neurodivergence, ethnicity, ancestry, caste, military/uniformed service-member status, or any other characteristic protected by applicable federal, state, local, or international law.