Manager Notes:
- This position can be 100% remote
- Candidates need to have previous AI/ML experience
- Work history in finance preferred
- Experience with Databricks, Spark, Python, and Snowflake
- Must have strong SQL skills
Role Responsibilities:
- Contribute to the design and implementation of data products and features in collaboration with product owners, data analysts, and business partners using Agile/Scrum methodology
- Contribute to the overall architecture, frameworks, and patterns for processing and storing large data volumes
- Contribute to the evaluation of new technologies, tools, and frameworks centered around high-volume data processing
- Translate product backlog items into logical units of engineering work
- Implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem
- Build utilities, user-defined functions, libraries, and frameworks to better enable data flow patterns
- Work with engineering leads and other teams to ensure quality solutions are implemented and engineering best practices are defined and followed
- Build and incorporate automated unit tests and participate in integration testing efforts
- Utilize and advance continuous integration and deployment frameworks
- Troubleshoot data issues and perform root cause analysis
- Work across teams to resolve operational and performance issues
The following qualifications and technical skills will position you well for this role:
- Bachelor's degree in Computer Science or a related technical discipline
- 7+ years of experience in large-scale software development, including 5+ years of big data experience
- Programming experience; Python or Scala preferred
- Experience working with Hadoop and related processing frameworks such as Spark, Hive, etc.
- Experience with messaging, streaming, and complex event processing tooling and frameworks
- Experience with data warehousing concepts, SQL, and SQL analytical functions
- Experience with workflow orchestration tools like Apache Airflow
- Experience with source code control tools like GitHub or Bitbucket
- Ability to communicate effectively with team members, both verbally and in writing
- Interest in and ability to quickly pick up new languages, technologies, and frameworks
- Experience in Agile/Scrum application development
The following skills and experience are also relevant to our overall environment, and nice to have:
- Experience with Java
- Experience working in a public cloud environment, particularly AWS and Databricks
- Experience with cloud warehouse tools like Snowflake
- Experience working with NoSQL data stores such as HBase, DynamoDB, etc.
- Experience building RESTful APIs to enable data consumption
- Experience with infrastructure-as-code tools such as Terraform or CloudFormation and automation tools such as Jenkins or CircleCI
- Experience with practices like continuous development, continuous integration, and automated testing
These are the characteristics that we strive for in our own work. We would love to hear from candidates who embody the same:
- Desire to work collaboratively with your teammates to come up with the best solution to a problem
- Demonstrated experience and ability to deliver results on multiple projects in a fast-paced, agile environment
- Excellent problem-solving and interpersonal communication skills
- Strong desire to learn and to share knowledge with others