Exa is building a search engine from scratch to serve every AI application. We build massive-scale infrastructure to crawl the web, train state-of-the-art embedding models to index it, and develop extremely high-performance vector databases in Rust to search over it. We also own a $5m H200 GPU cluster and routinely run batch jobs across tens of thousands of machines. This isn't your average startup :)
As a Web Crawler engineer, you'd be responsible for crawling the entire web. Basically, you'd be building Google-scale crawling!
Desired Experience
You have extensive experience building and scaling web crawlers, or would be excited to ramp up very quickly
You have experience with a high-performance language (C++, Rust, etc.)
You’re comfortable optimizing a system to an exceptional degree
You care about the problem of finding high-quality knowledge and recognize how important this is for the world
Example Projects
Build a distributed crawler that can handle 100M+ pages per day
Optimize crawl politeness and rate limiting across thousands of domains (see the sketch after this list)
Design systems to detect and handle dynamic content, JavaScript rendering, and anti-bot measures
Create intelligent crawl scheduling and prioritization algorithms for maximum coverage efficiency
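To give a flavor of the politeness problem mentioned above, here is a minimal sketch in Rust of enforcing a per-domain minimum delay between fetches. The PolitenessScheduler type and fixed 500ms delay are hypothetical illustrations, not part of Exa's stack; a real crawler would also need concurrency, robots.txt handling, and adaptive per-host delays.

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Hypothetical sketch: tracks the last scheduled fetch per domain and
/// enforces a minimum delay between requests to the same host.
struct PolitenessScheduler {
    min_delay: Duration,
    last_fetch: HashMap<String, Instant>,
}

impl PolitenessScheduler {
    fn new(min_delay: Duration) -> Self {
        Self { min_delay, last_fetch: HashMap::new() }
    }

    /// Returns how long the caller should wait before fetching `domain`,
    /// and records the planned fetch time.
    fn schedule(&mut self, domain: &str) -> Duration {
        let now = Instant::now();
        let wait = match self.last_fetch.get(domain) {
            Some(&last) => {
                let ready = last + self.min_delay;
                if ready > now { ready - now } else { Duration::ZERO }
            }
            None => Duration::ZERO,
        };
        self.last_fetch.insert(domain.to_string(), now + wait);
        wait
    }
}

fn main() {
    // Assumed politeness budget: at most one request per domain every 500ms.
    let mut sched = PolitenessScheduler::new(Duration::from_millis(500));
    for domain in ["example.com", "example.com", "example.org"] {
        let wait = sched.schedule(domain);
        println!("{domain}: wait {wait:?} before fetching");
    }
}
```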
This is an in-person opportunity in Singapore.