Must be located in:
- Nashville, TN
- Dallas, TX
- Denver, CO
Company Description:
FortyAU is a software consulting company headquartered in Nashville, Tennessee, focused on building custom solutions for a variety of clients, including Fortune 500 companies, small businesses, and dynamic start-ups.
Founded by developers for developers, our team is solely focused on solving problems by writing great code for our clients.
We look for versatile developers who understand code at a fundamental level and are always willing to grow and expand their skill sets.
We offer the opportunity to work across a variety of projects in a supportive environment where no two days are the same.
Interested in learning more?
Role Description:
- Design, develop, and maintain scalable data pipelines and database systems for a variety of clients
- Collaborate with software engineers, product teams, and external stakeholders to deliver solutions that offer real business value
- Ensure data quality and integrity through rigorous testing, monitoring, and validation activities
Basic Experience:
- 3+ years of relevant professional experience
- Deploying production code written in object-oriented languages (Python, Java, R, etc.)
- Moving data using modern tooling (Fivetran, Airbyte, Kafka, etc.)
- Transforming data using modern tooling (dbt, custom code, stored procedures, etc.)
- Visualizing data for various stakeholders (Looker, Tableau, Superset, Power BI, Retool, Hex, Observable Framework, D3.js, etc.)
- Orchestrating scalable cloud pipelines (Airflow, Prefect, Dagster, etc.)
- Developing in at least one of the major cloud providers (AWS, GCP, Azure)
- Defining cloud infrastructure as code (Terraform, CloudFormation, Helm, native AWS / GCP / Azure configuration, etc.)
- Abiding by software development best practices (CI/CD, unit testing, version control, etc.)
- Communicating with cross-functional teams and external stakeholders
- Being curious, smart, and kind
Preferred Experience:
- Familiarity with containerization and orchestration technologies like Docker and Kubernetes
- Working with big data and distributed systems
- Designing APIs
- Developing in all of the major cloud providers (AWS, GCP, Azure)
- Deploying both batch and streaming data pipelines
- Managing data governance systems
- Deep knowledge of a STEM field
- Full-stack application development
- Using AI application frameworks (LangGraph, Autogen, etc.)
- Staying updated with emerging data technologies
If you fit the description above, we would love to have a conversation.
Sorry, no corp-to-corp. Must be a full-time candidate with the ability to work in the U.S.