Autonomy Software Engineer - Mapping and Localization
Posted July 28, 2025
About UCR
UCR (Under Control Robotics) builds multipurpose robots to support human workers in the world's toughest jobs, turning dangerous work from a necessity into a choice. Our work demands reliability, robustness, and readiness for the unexpected, on time, every time. We're assembling a mission-driven team focused on delivering real impact in heavy industry, from construction and mining to energy. If you're driven to build rugged, reliable products that solve real-world problems, we'd love to talk.
Position Overview
At UCR, building robots is a team sport. As a Robotics Autonomy Engineer, you'll take ownership and lead the development of autonomy systems that power our multipurpose robots across diverse and unstructured environments. You'll design, implement, and optimize cutting-edge localization, mapping, navigation, and SLAM systems, including advanced techniques such as 3D Gaussian Splatting, that enable our robots to perceive, understand, and act in the real world with confidence.
Responsibilities
- Develop and maintain real-time mapping, localization, and navigation software for mobile robotic systems
- Build scalable SLAM pipelines using a mix of sensors, including LiDAR, vision, and IMU
- Implement 3D scene representations using cutting-edge techniques such as 3D Gaussian Splatting, NeRFs, and other neural or volumetric methods
- Integrate localization and mapping modules with motion planning and control systems
- Deploy robust autonomy stacks to on-board compute platforms and validate them in both simulation and real-world testing
- Analyze and tune performance of perception and SLAM systems in challenging environments
- Collaborate with mechanical, electrical, and software engineers to develop co-designed autonomy solutions
- Write clean, modular, production-quality code with thorough documentation and testing
- Operate and support robots during field testing and customer deployment
Requirements
- 4+ years of experience working in robotics, autonomy, or a closely related field
- Strong foundation in SLAM, probabilistic localization, 3D reconstruction, and navigation algorithms
- Deep experience with C++ and Python, especially in real-time robotics or embedded systems
- Experience building and deploying autonomy stacks using frameworks such as ROS or ROS 2
- Proven ability to develop algorithms for sensor fusion and state estimation (e.g., EKF, UKF, particle filters)
- Hands-on experience with real robot systems, whether ground, legged, or aerial platforms
- Familiarity with 3D mapping techniques including voxel grids, mesh reconstruction, and Gaussian Splatting
- Demonstrated rapid growth and technical ownership on complex autonomy projects
- Ability to prioritize and execute tasks in a fast-paced, dynamic environment
- Excellent communication and collaboration skills across disciplines
Nice to Have
- Experience with GPU-accelerated vision or perception pipelines (CUDA, TensorRT)
- Exposure to deep learning-based SLAM, view synthesis, or scene understanding techniques
- Experience with multi-robot SLAM, loop closure, or graph optimization frameworks
- Contributions to open-source robotics or perception libraries
- Comfort debugging hardware/software integration in field settings
- Experience with autonomy in unstructured or GPS-denied environments
- Strong understanding of simulation frameworks (e.g., Gazebo, Isaac Sim, Unity Robotics)
How to Apply
To apply, submit your resume here or email people@ucr.bot. To increase your chances of being selected for an interview, we encourage you to include a public portfolio of your most representative work featuring your individual contributions and public demonstrations of autonomy or SLAM systems.