Date Posted : 08 / 29 / 2025
Hiring Organization : Rose International
Position Number : 487850
Industry : Telecommunications
Job Title : Data Engineer
Job Location : Remote, USA
Work Model : Remote
Work Model Details : Remote from home (PST & MST)
Shift : Regular business hours
Employment Type : Temporary
FT / PT : Full-Time
Estimated Duration (In months) : 13
Min Hourly Rate ($) : 65.00
Max Hourly Rate ($) : 80.00
Must Have Skills / Attributes : Dashboards, Data Mining, Databricks, Oracle, Python, Scala, Snowflake, SQL, Tableau
Experience Desired : Exp. designing & maintaining scalable data pipelines using Databricks (Spark / Scala / Python / SQL) (7+ yrs); Tableau dashboard development (parameters, performance optimization, & data-driven visualization) (7+ yrs); Strong SQL development skills for data extraction, transformation, and performance tuning (7+ yrs); Expertise in Snowflake data loading and update patterns (7+ yrs)
Required Minimum Education : Bachelor’s Degree
Job Description
Only Senior Data Engineer candidates located within the PST or MST time zones will be considered for this remote position. Qualified candidates must be available to work directly for Rose International on a W2 basis; this is not a corp-to-corp opportunity.

Required Skills, Experience, & Abilities :
- Hands-on experience designing and maintaining scalable data pipelines using Databricks (Spark / Scala / Python / SQL) and Snowflake
- Strong SQL development skills for data extraction, transformation, and performance tuning
- Ability to conduct source-to-target data reconciliation, auditing, and anomaly detection across large datasets (see the sketch after this list)
- Proven experience troubleshooting Spark performance issues
- Expertise in Snowflake data loading and update patterns
- Tableau dashboard development, including parameters, performance optimization, and data-driven visualization
- Solid understanding of the data validation lifecycle
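As a rough illustration of the reconciliation work described above, here is a minimal PySpark sketch. It assumes a Databricks/Spark environment with the Snowflake Spark connector available; the table names, column names, and connection options are placeholders for illustration, not details from this posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical Snowflake connection options; in practice these would come
# from a secrets manager rather than being hard-coded.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "COMPUTE_WH",
}

source = spark.table("bronze.orders")          # placeholder Databricks source table
target = (
    spark.read.format("snowflake")             # short-form connector name used on Databricks
    .options(**sf_options)
    .option("dbtable", "ORDERS")               # placeholder Snowflake target table
    .load()
)

# Compare row counts and a simple column total between source and target;
# a fuller reconciliation would also hash business keys and compare partitions.
src_count, tgt_count = source.count(), target.count()
src_total = source.agg(F.sum("amount")).first()[0]
tgt_total = target.agg(F.sum("AMOUNT")).first()[0]

if src_count != tgt_count or src_total != tgt_total:
    raise ValueError(
        f"Reconciliation mismatch: source ({src_count}, {src_total}) "
        f"vs target ({tgt_count}, {tgt_total})"
    )
```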
Preferred Skills :
- Familiarity with advanced data profiling, anomaly detection techniques, and predictive modeling for identifying inconsistencies (a small profiling sketch follows this list)
- Experience collaborating with business stakeholders to translate analytical requirements into reliable data products
- Knowledge of documenting data lineage, pipeline logic, and metadata management practices
- Strong communication skills with the ability to clearly explain technical findings to non-technical stakeholders
- Demonstrated ability to work independently with minimal oversight while managing priorities effectively
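A minimal sketch of the kind of profiling-based anomaly flagging mentioned above; the table and column names are placeholders, and a real implementation would use more robust statistics than a simple z-score threshold.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("bronze.orders")   # placeholder table with a numeric "amount" column

# Profile the column, then flag rows more than 3 standard deviations from the mean.
stats = df.agg(F.mean("amount").alias("mu"), F.stddev("amount").alias("sigma")).first()
anomalies = df.where(F.abs((F.col("amount") - stats["mu"]) / stats["sigma"]) > 3)

print(f"Flagged {anomalies.count()} of {df.count()} rows for manual review")
```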
Role & Responsibilities :
- Design, build, and maintain scalable and auditable data pipelines using Databricks (Spark / Scala / Python / SQL) and Snowflake
- Own the data validation lifecycle, including source-to-target reconciliation using Oracle, data profiling, and anomaly detection
- Conduct deep-dive data mining to uncover patterns, inconsistencies, and opportunities for business action
- Develop and maintain data visualizations and dashboards in Tableau to support transparency and decision-making
- Implement data quality checks and automated alerting for pipeline failures, anomalies, and audit flags (a minimal example follows this list)
- Collaborate with analysts, scientists, and business stakeholders to gather requirements and deliver reliable data products
- Optimize existing pipelines for performance, scalability, and operational simplicity
- Document data lineage, pipeline logic, and metadata clearly and consistently
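A minimal sketch of a data quality gate with a simple alert hook, along the lines of the responsibilities above. The checks, table name, and webhook URL are placeholders; an actual pipeline would use whatever alerting tooling the team has in place.

```python
import json
import urllib.request

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("silver.orders")   # placeholder pipeline output table

# Simple post-load checks; real pipelines would also persist these as audit metrics.
checks = {
    "row_count_positive": df.count() > 0,
    "no_null_keys": df.where(F.col("order_id").isNull()).count() == 0,
    "no_future_dates": df.where(F.col("order_date") > F.current_date()).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # Hypothetical webhook alert; swap in the team's actual alerting channel.
    payload = json.dumps({"text": f"Data quality checks failed: {failed}"}).encode()
    request = urllib.request.Request(
        "https://hooks.example.com/alerts",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)
    raise RuntimeError(f"Pipeline halted; failed checks: {failed}")
```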
Only those lawfully authorized to work in the designated country associated with the position will be considered.

Please note that all position start dates and durations are estimates and may be reduced or lengthened based upon a client's business needs and requirements.

Benefits :
For information and details on employment benefits offered with this position, please visit here. Should you have any questions / concerns, please contact our HR Department via our secure website.
California Pay Equity :
For information and details on pay equity laws in California, please visit the State of California Department of Industrial Relations' website here.