The problem: Every minute matters in fire response. As climate change amplifies the intensity of wildfires, bringing longer fire seasons, drier fuels, and faster winds, new ignitions spread faster and put more communities at risk. Today, most wildfires are detected by bystanders and reported via 911, meaning it can take hours to detect a fire, verify its exact location and size, and dispatch first responders. Fire authorities need a faster way to detect, confirm, and pinpoint fires so that they can respond quickly and prevent small flare-ups from becoming devastating infernos.
About Pano: We are a 150+ person growth-stage hybrid-remote start-up, headquartered in San Francisco. We are the leader in early wildfire detection and intelligence, helping fire professionals respond to fires faster and more safely—with the right equipment, timely information, and enhanced coordination—so that they can stop a new ignition before it grows. Pano AI combines advanced hardware, software, and artificial intelligence into an easy-to-use, web-based platform. Leveraging a network of ultra-high-definition, 360-degree cameras atop high vantage points, as well as satellite and other data feeds, Pano AI produces a real-time picture of threats in a geographic region and delivers immediate, actionable intelligence.
Pano AI is on TIME's list of the 100 Most Influential Companies of 2025! MIT Technology Review listed Pano as one of the top 15 climate tech companies to watch in 2024, and Fast Company named Pano AI one of the Top 10 most innovative companies in AI of 2023. We’ve also been featured in the Wall Street Journal, Bloomberg, and CNBC News. Pano AI’s dozens of government and enterprise customers span 16 states in the U.S., five states in Australia, and BC, Canada, and we are currently monitoring over 30 million acres of land. Pano AI has raised $89M in venture capital funding from Giant Ventures, Liberty Mutual Ventures, Tokio Marine Future Fund, Congruent Ventures, Initialized Capital, Salesforce Ventures, and T-Mobile Ventures. Learn more at https://www.pano.ai/.
The Role:
Join our data platform team building critical infrastructure for wildfire detection! As our Geospatial Data Engineer, you'll design and implement systems that process spatial data from distributed camera networks and satellite imagery to identify wildfires in their earliest stages.
You'll work with multi-modal geospatial datasets—managing camera telemetry, spatial indexes, coordinate transformations, and large-scale image repositories from remote sensing systems deployed across vulnerable landscapes. Your infrastructure will handle real-time ingestion and processing of spatial data streams. The role involves building robust data pipelines for both structured geospatial data (map layers, third-party incident data, terrain models) and unstructured image data at scale. You'll collaborate with AI engineers and platform architects to optimize data workflows, implement efficient spatial querying systems, and maintain the reliability of data infrastructure that operates 24/7 in production.
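To give a concrete, purely illustrative flavor of the spatial-querying side of this work, here is a minimal sketch assuming a PostGIS-backed PostgreSQL database; the camera_sites table, its columns, and the connection details are hypothetical placeholders, not a description of Pano's actual stack.

```python
import psycopg2

# Hypothetical connection string and schema, for illustration only.
conn = psycopg2.connect("dbname=geodata user=engineer")

# Find camera sites within a radius of a reported ignition point.
# Casting to geography makes ST_DWithin/ST_Distance measure in meters,
# and a GiST index on the geography column can keep the lookup fast.
QUERY = """
    SELECT site_id,
           ST_Distance(location::geography,
                       ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography) AS meters
    FROM camera_sites
    WHERE ST_DWithin(location::geography,
                     ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
                     %s)
    ORDER BY meters;
"""

def cameras_near(lon: float, lat: float, radius_m: float = 20_000):
    """Return (site_id, distance_in_meters) rows near a point."""
    with conn.cursor() as cur:
        cur.execute(QUERY, (lon, lat, lon, lat, radius_m))
        return cur.fetchall()
```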
We're looking for someone with hands-on experience in geospatial data engineering who wants to apply their skills to a meaningful problem. If you have strong fundamentals in spatial databases, data pipeline development, and large-scale data processing—and you're interested in working on infrastructure that directly supports wildfire detection—we'd like to hear from you.
Key Responsibilities:
Develop pipelines to ingest, process, and publish data, including data from Pano's proprietary equipment and relevant publicly available datasets (see the sketch after this list)
Develop internal tools for key processes such as dataset creation, data management (e.g. video creation), and results evaluation
Build and manage large-scale cloud pipelines that process images and structured data sources
Manage and optimize data systems across in-house and cloud environments, taking advantage of the latest technical developments
Enable Pano to scale AI detection by focusing on the overall system lifecycle, including monitoring and improving system reliability and optimizing platform costs
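As a hedged illustration of the first responsibility above, the sketch below ingests a public GeoJSON incident feed into a PostGIS table; the feed URL, the incidents table, and its columns are placeholder assumptions chosen for this example.

```python
import json
import requests
import psycopg2

# Placeholder URL for a public GeoJSON incident feed; any real feed,
# schema, and credentials would differ.
FEED_URL = "https://example.com/incidents.geojson"

def ingest_incidents(conn) -> int:
    """Fetch a GeoJSON feed and upsert each feature as a PostGIS geometry."""
    features = requests.get(FEED_URL, timeout=30).json()["features"]
    # The connection context manager commits on success, rolls back on error.
    with conn, conn.cursor() as cur:
        for feat in features:
            cur.execute(
                """
                INSERT INTO incidents (incident_id, properties, geom)
                VALUES (%s, %s, ST_SetSRID(ST_GeomFromGeoJSON(%s), 4326))
                ON CONFLICT (incident_id) DO UPDATE
                    SET properties = EXCLUDED.properties,
                        geom = EXCLUDED.geom
                """,
                (
                    feat["id"],
                    json.dumps(feat["properties"]),
                    json.dumps(feat["geometry"]),
                ),
            )
    return len(features)
```

In a production pipeline this step would typically be scheduled, monitored, and batched, but the core ingest-transform-publish shape is the same.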
Minimum Qualifications:
3+ years of software engineering experience plus a BS in Computer Science or equivalent
2+ years of experience programming in Python
2+ years of experience building data systems on cloud service providers
2+ years of experience working with relational databases (e.g. PostgreSQL)
Preferred Qualifications:
Experience with Google Cloud
Experience with containerization, including Kubernetes
Experience with CI/CD
Familiarity with vector and raster data formats (see the sketch after this list)
Excellent communication skills
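For candidates wondering what working with vector and raster formats looks like in practice, here is a small illustrative sketch using geopandas and rasterio; the file names are hypothetical stand-ins for the map layers and terrain models mentioned above.

```python
import geopandas as gpd
import rasterio

# Illustrative file names; real terrain models and map layers would differ.
DEM_PATH = "terrain_model.tif"           # raster: GeoTIFF elevation grid
LAYER_PATH = "camera_viewsheds.geojson"  # vector: polygon map layer

with rasterio.open(DEM_PATH) as dem:
    elevations = dem.read(1)             # first band as a NumPy array
    print("raster CRS:", dem.crs, "bounds:", dem.bounds)

    # Reproject the vector layer into the raster's CRS so the two
    # datasets can be compared in the same coordinate space.
    viewsheds = gpd.read_file(LAYER_PATH).to_crs(dem.crs)
    print("vector features:", len(viewsheds))
```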
Final compensation for full-time employees is determined by a variety of factors, including job-related qualifications, education, experience, skills, knowledge, and geographic location. In addition to base salary, full-time roles are eligible for stock options. Our benefits package also includes comprehensive medical, dental, and vision coverage, a matching 401(k) plan, and flexible paid time off.