This is a full-time / permanent position. You will be working directly with an end client.
The interview process will be initiated as soon as possible.
We're excited to hear back from you.
Job Description:
Role: AWS Data Engineer
Salary: $$Base + Benefits + Bonus + Relocation
Location: Memphis, TN - 5 days onsite
Job Description: The Sr. Data Engineer is responsible for designing, developing, and maintaining data pipelines and infrastructure on the AWS platform. The Sr. Data Engineer works with large volumes of data, ensuring its quality, reliability, speed, and accessibility. Tasks may include data ingestion, transformation, storage, data sharing and consumption, and implementing data security and privacy measures. This role is crucial in enabling efficient and effective data-driven decision-making.
Requirements
- Extensive experience with enterprise data solutions, including but not limited to data warehouse modernization, enterprise data lakes, cloud migration, automation, and reducing engineering toil.
- Working experience with the AWS Well-Architected Framework.
- Proficient in programming languages such as Python and SQL for database querying and manipulation.
- Strong understanding of AWS services related to data engineering, such as Amazon S3, Amazon Redshift, Amazon Aurora PostgreSQL, AWS Glue, AWS Lambda, AWS Step Functions, AWS Lake Formation, Amazon DataZone, Amazon Kinesis, Amazon MSK, and Amazon EMR.
- Knowledge of database design principles and experience with database management systems.
- Experience with data storage technologies, such as relational databases (e.g., SQL Server, PostgreSQL), and with distributed data processing frameworks (e.g., PySpark).
- Understanding of Extract, Transform, Load (ETL) processes and experience with ETL tools such as AWS Glue and SQL Server Integration Services are essential.
- Skilled at integrating disparate data sources and ensuring data quality and consistency.
- Understanding and experience with orchestration tools like Apache Airflow, AWS Glue Workflows, AWS Step Functions, and notification services.
- Familiarity with infrastructure as code (IaC) tools such as Terraform, along with Git and DevOps pipelines.
- Strong analytical thinking and problem-solving abilities are essential to effectively identify and resolve data-related issues.
- Ability to analyze complex data sets, identify patterns, and derive actionable insights.
- Awareness of data governance practices, data privacy regulations, and security protocols is crucial.
- Experience implementing data security measures and ensuring compliance with relevant standards is desirable.
- Passion for mentoring less experienced engineers.
Proficient in the following computer languages:
- Python
- SQL
AWS technologies to include:
- Glue
- S3
- Redshift
- Lambda
- Lake Formation
- DataZone
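For illustration, the following is a minimal sketch of the kind of Glue-based ETL job this role involves, written in Python against the AWS Glue PySpark API; the S3 paths, field names, and mappings are hypothetical placeholders, not part of the client's environment.

import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

# Hypothetical Glue job: read raw JSON from S3, standardise a few fields,
# and write curated Parquet back to S3 for downstream consumption.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source and target locations are placeholders, not real buckets.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-bucket/orders/"]},
    format="json",
)

# Keep only the fields needed downstream and cast types where required.
cleaned = raw.apply_mapping([
    ("order_id", "string", "order_id", "string"),
    ("order_ts", "string", "order_ts", "timestamp"),
    ("amount", "double", "amount", "double"),
])

glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)

job.commit()

In practice, a job like this would typically be scheduled and chained with others through AWS Glue Workflows, Step Functions, or Airflow, per the orchestration requirement above.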
Responsibilities
- Build and maintain scalable and reliable data pipelines, ensuring the smooth flow of data from various sources to the desired destinations in the AWS cloud environment.
- Work closely with stakeholders to understand their data requirements and design data solutions that meet their needs.
- Understand data models / schemas and implement ETL (Extract, Transform, and Load) processes to transform raw data into a usable format in the destination.
- Monitor and optimize the performance of data pipelines, troubleshoot any issues that arise, and ensure data quality and integrity.

Regards,
Piyush Kumar
Resource Manager | Syntricate Technologies Inc.
Email: piyush@syntricatetechnologies.com | Web: www.syntricatetechnologies.com
We're hiring! Connect with us on LinkedIn and visit our Jobs Portal.