Spaice Tech

Senior Research Engineer – Virtual Worlds & Simulation

Remote

About SPAICE

SPAICE is building the autonomy operating system that empowers satellites and drones to navigate and interact with the world – regardless of the environment. From GPS-denied zones on Earth to the unexplored frontiers of space, our Spatial AI delivers unprecedented levels of autonomy, resilience, and adaptability.

At SPAICE, you’ll work on real missions alongside leading aerospace and defense contractors, shaping the future of space and autonomous systems. If you're looking for a place where your work has a real, tangible impact – SPAICE is that place.

About the Role – Research Engineer (Virtual Worlds)

Traditionally, virtual worlds were built for human players. Today, we also build synthetic worlds for artificial agents: intelligent satellites and aerial robots that learn to navigate, observe, and make decisions.

In this role, you will create high-fidelity environments where synthetic and real data are virtually indistinguishable. You will build simulation and synthetic data pipelines that train, test, and validate the autonomy stack behind tomorrow’s missions. Your work will bridge realistic physics, sensor behavior, and procedural world generation, while pioneering next-generation generative techniques for multimodal realism across cameras, LiDAR, radar, and beyond.

As part of a deeply multidisciplinary team, you will lay the foundation of SPAICE’s autonomy verification, from Software-in-the-Loop/Hardware-in-the-Loop simulation ecosystems to large-scale distributed simulations supporting thousands of coordinated agents across Earth and space.

What you might work on

  • Designing and developing photorealistic virtual worlds for drones and satellites — on Earth and in orbit

  • Building generative + procedural scene generation pipelines (neural rendering, dynamic environments, materials, weather)

  • Modeling multimodal sensors with realistic physics and spatial–temporal coherence (camera, IR, LiDAR, radar/RF)

  • Integrating geospatial data (terrain, satellite imagery) into high-fidelity scene assets

  • Combining game engines and robotics simulators (Unreal Engine, Omniverse, Gazebo, Isaac/AirSim, Simulink)

  • Developing custom tools and SDKs to accelerate autonomy testing workflows

  • Interfacing virtual environments with Software-in-the-Loop (SIL) and Hardware-in-the-Loop (HIL) systems

What we are looking for

  • Solid background in scene generation: neural scene representations, procedural modeling, and simulation-aware environments

  • Proficiency in computer graphics: rendering, materials, lighting, procedural content, GPU programming, or similar

  • Strong experience in C++ and modern real-time engines (preferably Unreal Engine)

  • Understanding of physics-based realism and sensor modeling (rendering for machine perception)

  • Ability to integrate multiple tools and robotic simulators into a cohesive workflow

  • Comfortable collaborating with AI, controls, and aerospace teams, bringing a combined research and engineering mindset

  • Bonus: Familiarity with sensor simulation (e.g., LiDAR, cameras, radar) or synthetic data generation

  • Bonus: Experience with ROS, Isaac Sim, AirSim, or Unity in robotics or autonomous systems contexts

Perks & Benefits

  • Competitive salary, based on your experience and impact.

  • Equity options. You'll be part of our journey from the ground up.

  • Well-being perks. Access to premium gyms, climbing centers, and wellness programs.

  • Team retreats & offsites. Recent ones included a half-marathon in Formentera and a Los Angeles retreat during Oscars weekend.
