About Slip Robotics
Slip Robotics is transforming freight logistics with autonomous robotic systems that load and unload trailers faster, more safely, and more reliably than traditional methods. Our robots operate in demanding real-world environments alongside people and heavy cargo every day. We are a small, high-impact team based in Atlanta, building products that are already deployed in production.
The Role
We are looking for a Robotics Perception Engineer to own the sensing and perception stack on our autonomous mobile robots. You will work across LiDAR, cameras, IMUs, and other sensor modalities to give our robots a reliable understanding of their environment—including trailers, pallets, dock infrastructure, and people. This is a hands-on role where your work ships to production robots operating in live freight facilities.
We need someone who has shipped software on real robots at real customer sites. You should have experience deploying perception systems that work reliably in unstructured or semi-structured environments, and you should be comfortable debugging sensor issues on physical hardware.
Key Responsibilities
- Design, implement, and maintain the perception pipeline for autonomous mobile robots, including object detection, segmentation, localization, and mapping
- Integrate and calibrate LiDAR, depth cameras, and other sensors on production robot platforms
- Develop robust point cloud processing and filtering for reliable operation
- Build and maintain sensor fusion systems that combine multiple modalities for accurate environment understanding
- Create and manage datasets for perception model training, validation, and regression testing
- Collaborate closely with navigation, controls, and hardware teams to ensure perception outputs meet downstream requirements
- Debug and resolve perception failures on deployed robots using logs, recorded sensor data, and on-site investigation
- Contribute to CI/CD pipelines for perception software testing and deployment
Requirements
- 5+ years of professional experience building perception systems for physical robots at a robotics or autonomous vehicle company
- Strong proficiency in C++ and Python
- Deep experience with LiDAR processing (point cloud filtering, clustering, registration, SLAM)
- Hands-on experience with camera-based perception
- Production experience with ROS or ROS 2
- Experience with sensor calibration (intrinsic, extrinsic, multi-sensor)
- Familiarity with machine learning frameworks (PyTorch, TensorFlow) for perception tasks
- Track record of deploying perception software to real robots in real environments—not just simulation
- Strong debugging skills and comfort working with physical hardware
- Excellent communication skills and ability to work in a small, fast-moving team
Nice to Have
- Master’s degree or higher in Computer Science, Robotics, or a related field
- Experience with AWS IoT, Greengrass, or similar edge-cloud architectures
- Experience in logistics, warehousing, or freight environments
- Background in Gazebo or other physics simulation for sensor modeling
Remote Work Requirements
This role can be filled remotely. Remote team members are expected to have reliable high-speed internet access and a dedicated workspace. You will need to travel to our Atlanta facility periodically for hardware integration and testing. A company laptop and necessary peripherals will be provided.
Benefits
- Competitive salary and equity in an early-stage robotics company
- Comprehensive benefits including health, dental, and vision
- Permissive time off policy
- A small team where your work has direct, visible impact on shipped products
- The chance to work on robots that are operating in production today—not a research project