POC : Sam Chavez
ATTENTION ALL SUPPLIERS!!!
READ BEFORE SUBMITTING
- An UPDATED CONTACT NUMBER and EMAIL ID are a MANDATORY REQUEST from our client for all submissions
- Limited to 1 submission per supplier. Please submit your best.
- We prioritize endorsing profiles with complete and accurate information
- Avoid submitting duplicate profiles. Duplicates will be rejected / disqualified immediately.
- Make sure the candidate's interview schedule is up to date. Please ask the candidate to keep their lines open.
- Please submit profiles within the max proposed rate.
- Please make sure to TAG the profiles correctly if the candidate has WORKED FOR INFOSYS as a SUBCON or FTE.
MANDATORY : Please include in the resume the candidate's complete and updated contact information (phone number, email address, and Skype ID), as well as a set of 5 interview time slots over the 72-hour period after the profile is submitted, during which the hiring managers can reach the candidate. PROFILES WITHOUT THE REQUIRED DETAILS and TIME SLOTS will be REJECTED.
Job Title : Technology Lead | OpenSystem | Python - OpenSystem Data Engineering Sr Developer / Snowflake Sr Developer
Work Location & Reporting Address : Birmingham, AL 35225 (REMOTE, but with occasional travel if needed by the client. LOCAL CANDIDATES ARE HIGHLY PREFERRED. Open to candidates willing to relocate to the client's location)
Contract duration : 12
MAX VENDOR RATE : market rate0-market rate6 per hour max
Target Start Date : 01 Dec 2025
Does this position require Visa independent candidates only? Yes
Must Have Skills :
- Snowflake SQL, data sharing and data exchange, Security & Governance
- Python, PySpark
- AWS Glue, Lambda
- Experience with CI/CD (Continuous Integration and Continuous Deployment) tools such as GitHub Actions, Harness, or any other tool
- Previous work experience in the Financial Crimes domain will be helpful
Nice to Have Skills :
Detailed Job Description :
- Convert the enhancements in the legacy PySpark Cloudera datamart code repo to Snowflake-compatible Snowpark code (see the sketch after this list)
- Set up a Python / .NET based data connector in AWS Lambda to automate data pulls from Snowflake and ingestion into Palantir
- Produce a comprehensive report confirming that the data ingested into Palantir matches the source data in Cloudera
- Resolve integration and testing issues during the integration and QA phases
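For context on the first item above, a PySpark-to-Snowpark conversion is largely an API translation exercise, since the two DataFrame APIs are intentionally similar. The following is a minimal, illustrative sketch only; the table names, column names, and connection parameters are placeholders and not details of this engagement.

# Illustrative sketch only: the same aggregation written in legacy PySpark
# (Cloudera) and in Snowpark (Snowflake). All object names and credentials
# below are placeholders.

# --- Legacy PySpark version (shown for comparison) ---
# from pyspark.sql import SparkSession
# from pyspark.sql import functions as F
# spark = SparkSession.builder.appName("datamart_job").getOrCreate()
# txns = spark.read.table("datamart.transactions")
# summary = (txns.filter(F.col("txn_date") >= "2024-01-01")
#                .groupBy("account_id")
#                .agg(F.sum("amount").alias("total_amount")))
# summary.write.mode("overwrite").saveAsTable("datamart.txn_summary")

# --- Snowpark equivalent ---
from snowflake.snowpark import Session
from snowflake.snowpark import functions as F

# In practice, credentials would come from a secrets manager, not literals.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "DATAMART",
    "schema": "PUBLIC",
}).create()

txns = session.table("TRANSACTIONS")
summary = (txns.filter(F.col("TXN_DATE") >= "2024-01-01")
               .group_by("ACCOUNT_ID")
               .agg(F.sum("AMOUNT").alias("TOTAL_AMOUNT")))
summary.write.mode("overwrite").save_as_table("TXN_SUMMARY")

In a migration like this, the typical changes are method-name casing (groupBy vs. group_by, saveAsTable vs. save_as_table), session setup, and replacing Hive/HDFS reads and writes with Snowflake tables or stages; the transformation logic itself usually carries over with minimal edits.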
Minimum Years of Experience :
5-8 years of experience in ETL and data warehousing, with at least 3 years of Snowflake experience
Certifications Needed :
No
Top 3 responsibilities you would expect the Subcon to shoulder and execute :
- Data-oriented experience with technologies such as Data Modeler, Data Analyst, BIDW Analyst, OFSA, Big Data, OBIEE, etc.
- Work experience in Data Modeler, Data Analyst, BIDW Analyst, OFSA, Big Data, OBIEE, etc.
- Coordinate design and development with Data Products Partners, Data Scientists, Data Management, Data Modelers, and other technical partners
Interview Process (Is face to face required?)
No
Any additional information you would like to share about the project specs / nature of work :