Job Description
We are looking for a skilled Big Data Engineer to join our team in Westfield, Indiana. In this role, you will design, implement, and optimize big data solutions that support our business objectives. The ideal candidate has extensive experience in data engineering and a passion for building scalable systems.
Responsibilities:
- Design, develop, and implement scalable big data solutions using Python, Apache Spark, and other relevant technologies.
- Build and optimize ETL pipelines to efficiently handle large volumes of structured and unstructured data.
- Manage and process data using frameworks such as Apache Hadoop and Apache Kafka.
- Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
- Utilize cloud platforms like Amazon Web Services (AWS) to deploy and maintain data systems.
- Ensure the security, reliability, and performance of big data architectures.
- Troubleshoot and resolve issues related to data systems and pipelines.
- Monitor and analyze system performance to identify opportunities for improvement.
- Stay updated on emerging technologies and incorporate them into data engineering practices as appropriate.
Requirements:
- A minimum of 10 years of experience in big data engineering or a related field.
- Proficiency in Python or a comparable programming language.
- Strong expertise in data processing frameworks, including Apache Spark, Hadoop, and Kafka.
- Solid experience with ETL processes and pipeline development.
- Familiarity with cloud platforms, particularly Amazon Web Services (AWS).
- Proven ability to design and implement scalable data architectures.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills to collaborate effectively with technical and non-technical teams.