Who We Are
At Intropic, we’re building the future of financial intelligence, where deep market expertise meets the power of AI. Founded in Canary Wharf, London’s financial centre, we exist to transform complex data into clarity, precision, and action. Our culture is shaped by truth-seeking, velocity, and ownership: values that drive how we build, learn, and collaborate every day. We move fast, think independently, and hold ourselves to the highest standards of integrity and impact. Here, curiosity isn’t just encouraged; it’s essential. If you’re driven by challenge, inspired by innovation, and ready to amplify your intelligence alongside a team of exceptional thinkers, Intropic is where your ideas can truly compound.
We’re seeking a proactive and technically adept Market Data Engineer who thrives on building robust, low-latency systems. If you’re passionate about financial markets and skilled at developing real-time data pipelines, we’d love to meet you.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, Finance, or a related field
- Proficiency in Python plus strong familiarity with C++ or Java
- Solid understanding of real-time feed handling, message protocols, and distributed data architecture
- Hands-on experience with market data sources (e.g., Bloomberg, Refinitiv) and familiarity with cloud platforms (e.g., AWS, GCP) and technologies like Kafka
- Skilled in building scalable data pipelines (ETL, streaming) and ensuring data quality, integrity, and performance
- Excellent attention to detail, a problem-solving mindset, and the ability to manage tight SLAs
- Effective communicator with strong stakeholder collaboration skills
Nice to have
- Comfortable implementing monitoring, self-service tools, and operational dashboards
- Experience in quant finance or working with trading and research teams
- Background in systems performance optimization and high-throughput data environments
What You’ll Be Doing
- Design, develop, and maintain real-time and historical market data pipelines from various sources
- Ensure high data integrity and system availability while handling large-scale data processing
- Monitor key metrics and build tools to streamline data access and operations
- Collaborate across product teams to align infrastructure with strategic needs
- Deliver clean, well-tested, and maintainable code in a fast-paced startup environment