Job Description
10+ years of experience, including 3+ years designing big data solutions using the Hadoop technology ecosystem.
Lead tool selection, technical design, development, testing, implementation, and support of key projects implementing Big Data technologies in client data centers or on cloud providers such as Amazon Web Services.
Lead the definition of guidelines, standards, strategies, security policies and change management policies to support the Big Data platforms.
Research and evaluate technical solutions, including various Hadoop distributions, NoSQL databases, and data integration and analytical tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, and maintainability.
Partner with project teams (project managers, architects, business analysts, and developers) on tools, technology, and methodology related to the design and development of Big Data solutions.
Ability to present and discuss strategies and technical information in a manner that builds rapport, persuades others, and fosters understanding among technical and non-technical audiences.
Proven ability to work in a startup-like environment with little supervision.
Significant programming experience in multiple languages such as Python, Java, Ruby, and C#.
Ability to manage many simultaneous technical projects and initiatives.
Thorough knowledge of database technologies, ranging from RDBMS to NoSQL databases to enterprise search.
Thorough knowledge of data integration technologies such as Informatica (PowerCenter, IDS), Denodo, etc.
Experience with large-scale systems integration involving on-premises technology and public cloud platforms.
Experience working with IT development methodologies such as Agile and DevOps.
Excellent verbal, written, and interpersonal communication skills.
Additional Information
All your information will be kept confidential according to EEO guidelines.
Data Architect • Baltimore, MD, United States