Job Title: AWS Engineer - Tokenization & Data Security
Job Location: McLean, VA or Dallas, TX (onsite, 5 days/week)
Candidates must be local to one of these locations or in a neighboring state.
Rate: $60-65/hr on C2C
4 openings, to be closed before Dec 2. Profiles are required immediately; interviews will be confirmed this week.
Candidates should have a well-established LinkedIn profile (not recently created).
We are seeking a skilled AWS Engineer with strong experience in tokenization, data security, and containerized application management. The ideal candidate will have deep technical expertise in AWS cloud services, automation, scripting, and data protection frameworks. This role requires hands-on experience with EKS clusters, CFT updates, and secure data handling (PII / confidential data masking and tokenization).
Key Responsibilities
- Design, build, and maintain containerized applications on AWS (EKS / ECS) using automation and custom scripts.
- Implement and manage tokenization and detokenization processes to protect PII and confidential data (a sketch follows this list).
- Work with data security tools to ensure compliance with internal and external security standards (e.g., PCI DSS, GDPR).
- Build and maintain infrastructure as code (IaC) using CloudFormation Templates (CFTs) and automation pipelines.
- Develop and manage scripts (Python, Shell, Ansible, etc.) to automate application builds and deployments.
- Collaborate with security and data engineering teams to implement data masking, token mapping, and encryption solutions.
- Monitor, optimize, and troubleshoot EKS clusters, ensuring high performance and scalability.
- Maintain documentation on infrastructure design, tokenization workflows, and data protection measures.
- Participate in audits, reviews, and assessments of data security systems.
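
For context on the tokenization and detokenization work above, here is a minimal sketch of a vault-style flow in Python. It is illustrative only: the in-memory dictionary stands in for a hardened token vault, which in practice would be a dedicated product such as Protegrity or Thales CipherTrust (listed under the required skills below), and all names are hypothetical.

```python
"""Minimal sketch of vault-style tokenization for PII fields.

Illustrative only: the in-memory dict stands in for a hardened
token vault; the helper names here are hypothetical.
"""
import secrets

_vault: dict[str, str] = {}  # token -> original value (stand-in for a secure vault)


def tokenize(value: str) -> str:
    """Swap a sensitive value for a random token with no mathematical link to it."""
    token = f"tok_{secrets.token_hex(16)}"
    _vault[token] = value  # the token-to-value mapping lives only in the vault
    return token


def detokenize(token: str) -> str:
    """Recover the original value; restricted to callers with vault access."""
    return _vault[token]


if __name__ == "__main__":
    t = tokenize("123-45-6789")  # e.g., a Social Security number
    print(t)                     # the token is safe to store or log downstream
    print(detokenize(t))         # original value comes back only via the vault
```

The key property shown is that the token carries no derivable relationship to the original value, which is the basis for reducing audit scope under standards like PCI DSS.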
Required Skills & Experience
- 6-10 years of total IT experience with a strong focus on AWS Cloud Engineering and Data Security.
- Hands-on experience with AWS services: EC2, EKS, Lambda, S3, IAM, CloudFormation, and KMS.
- Proven experience in containerization and Kubernetes (EKS) management, including upgrades and patching.
- Proficiency in Python scripting and automation for build/deployment processes.
- Strong understanding of tokenization concepts, token mapping, and data masking techniques.
- Experience with data security tools used for tokenization/detokenization and encryption key management (e.g., Protegrity, Thales CipherTrust, Voltage SecureData, or similar).
- Deep knowledge of PII and confidential data protection standards.
- Experience updating and maintaining CloudFormation Templates (CFTs) and other IaC frameworks (see the sketch below).
- Solid understanding of security compliance frameworks (PCI DSS, GDPR, HIPAA).
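
To ground the CFT item, here is a short sketch of applying a template update with boto3. The stack name, template file, and region are hypothetical placeholders, and AWS credentials are assumed to be configured in the environment.

```python
"""Sketch: pushing a CloudFormation Template (CFT) update via boto3.

Stack name and template path are hypothetical placeholders.
"""
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("eks-stack.yaml") as f:  # hypothetical CFT file
    template_body = f.read()

cfn.update_stack(
    StackName="tokenization-eks-stack",     # hypothetical stack name
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # required when the CFT manages IAM resources
)

# Block until CloudFormation finishes (raises if the update rolls back).
cfn.get_waiter("stack_update_complete").wait(StackName="tokenization-eks-stack")
print("Stack update complete")
```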
Nice-to-Have Skills
- Exposure to ETL tools and data pipelines (Informatica, IICS).
- Familiarity with DevSecOps and integrating security within CI/CD pipelines.
- Knowledge of AWS KMS, encryption mechanisms, and key rotation policies (see the sketch below).
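
For the KMS item, a brief boto3 sketch of direct encryption under a KMS key and enabling automatic key rotation. The key alias is a hypothetical placeholder, and an existing customer-managed symmetric key is assumed.

```python
"""Sketch: KMS encryption and key rotation with boto3.

The alias below is a hypothetical placeholder for an existing
customer-managed symmetric key.
"""
import boto3

kms = boto3.client("kms", region_name="us-east-1")
KEY_ALIAS = "alias/pii-tokenization-key"  # hypothetical key alias

# Encrypt a small secret directly under the customer-managed key.
ciphertext = kms.encrypt(KeyId=KEY_ALIAS, Plaintext=b"123-45-6789")["CiphertextBlob"]

# Decrypt: KMS identifies the key from metadata embedded in the ciphertext blob.
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]

# Rotation APIs take the key ID (not an alias), so resolve it first.
key_id = kms.describe_key(KeyId=KEY_ALIAS)["KeyMetadata"]["KeyId"]
kms.enable_key_rotation(KeyId=key_id)  # annual automatic rotation of key material
print(kms.get_key_rotation_status(KeyId=key_id)["KeyRotationEnabled"])
```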
Soft Skills
- Strong analytical and problem-solving abilities.
- Excellent communication and documentation skills.
- Ability to collaborate with cross-functional teams (DevOps, Data, Security).
- Self-driven with a proactive approach to automation and process improvement.