Job Description
Must have Java, AWS (S3 or Glue), and automation experience.
- 6-8 years of IT experience with a focus on testing and automation using Java and Groovy.
- Experience applying REST Assured concepts in API test automation.
- Experience with Databricks and on-prem environments, Structured Streaming, Delta Lake concepts, and Delta Live Tables is required
- Experience with Spark (Scala) and Java programming
- Data lake concepts such as time travel, schema evolution, and optimization
- Experience leading and architecting enterprise-wide initiatives, specifically test automation for ETL across multiple testing phases, data migration, and transformation.
- Understanding of streaming data pipelines and how they differ from batch systems
- Understanding of ETL and ELT, and of ETL/ELT tools such as AWS Database Migration Service (DMS)
- Familiarity or expertise with Great Expectations or other data quality/data validation frameworks is a bonus
- Advanced SQL experience (joins, aggregation, window functions, common table expressions, RDBMS schema design, performance optimization)
- Indexing and partitioning strategy experience
- Ability to debug, troubleshoot, and implement solutions to complex technical issues
- Experience with large-scale, high-performance enterprise big data application deployment and testing
- Architecture experience in an AWS environment is a bonus
- Familiarity with AWS Lambda, specifically how to push and pull data and how to use AWS tools to view data when processing massive volumes at scale, is a bonus
- Experience with GitLab and CloudWatch, and the ability to write and maintain GitLab CI/CD pipeline configurations
- Experience configuring and optimizing AWS Lambda functions, and experience with S3
- Familiarity with Schema Registry and message formats such as Avro, ORC, etc.
- Ability to thrive in a team-based environment
- Experience briefing the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior management