We are seeking highly motivated and talented people to join the Boerboel team. As a member of the team, you will work with passionate and curious colleagues who thrive on solving complex problems. Boerboel is an office-first firm: while we offer flexibility to our employees, remote work is not available at this time.
About us:
Boerboel Trading is an electronic trading firm, active across a wide range of asset classes and regions, with offices in NYC, Chicago, London, and Malta. We apply rigorous quantitative analysis to deploy systematic trading strategies over multiple time horizons. Trading and research employees of the firm come from a wide spectrum of quantitative disciplines, such as engineering, computer science, mathematics, and physics, and they apply scientific methods to the analysis of extensive financial data sets.
Summary:
We operate high-performance infrastructure processing 20PB+ of market data. We're building a production ML platform to organise features, orchestrate model training, and serve predictions for systematic strategies. As an ML DevOps Engineer, you'll design and implement the feature store, data lake organisation, and ML pipelines that enable our research and trading teams to work efficiently at scale.
Objectives:
- Feature store & data lake: Build scalable infrastructure for time-series feature storage, retrieval, and versioning optimized for ML workloads
- MLOps pipelines: Design end-to-end workflows for data ingestion, feature engineering, model training, backtesting, and deployment
- Data ingestion layer: Convert raw data streams into structured, queryable formats (Parquet/Delta Lake); a minimal sketch of this kind of work follows this list
- Production serving: Deploy feature computation and model inference with appropriate latency characteristics
- Integration: Work with existing data capture and execution systems
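To give a flavour of the ingestion and serving work above, here is a minimal sketch assuming a pandas/pyarrow stack; the paths, schema, and tick values are illustrative only, not Boerboel's actual data model.

```python
import pandas as pd
import pyarrow as pa
import pyarrow.dataset as ds
import pyarrow.parquet as pq

# Hypothetical raw ticks with nanosecond timestamps from an upstream capture feed.
ticks = pd.DataFrame({
    "ts": pd.to_datetime([
        "2024-01-02 09:30:00.000000100",
        "2024-01-02 09:30:00.000000350",
    ]),
    "symbol": ["ABC", "ABC"],
    "price": [101.25, 101.26],
    "size": [200, 150],
})
ticks["date"] = ticks["ts"].dt.date.astype(str)

# Land the stream as a Hive-partitioned Parquet dataset (partitioned by date and symbol).
table = pa.Table.from_pandas(ticks, preserve_index=False)
pq.write_to_dataset(table, root_path="/data/lake/ticks", partition_cols=["date", "symbol"])

# Researchers then query the lake lazily, with predicate pushdown on the partition columns.
dataset = ds.dataset("/data/lake/ticks", format="parquet", partitioning="hive")
abc = dataset.to_table(filter=ds.field("symbol") == "ABC").to_pandas()
print(abc.head())
```

Hive-style partitioning keeps the typical "one symbol, one day" research query cheap via partition pruning; at production scale the same layout would be written by a distributed engine rather than a single process.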
Experience:
- Python depth: Strong Python engineering with focus on data pipelines and ML infrastructure
- ML platform experience: Built feature stores, MLOps pipelines, or similar systems at scale (e.g. at Netflix, Yandex, Criteo, or comparable tech companies)
- Data lake expertise: Experience ingesting, organising, and serving large datasets (TB-PB scale)
- Production mindset: Deployed systems with reliability, monitoring, and performance requirements
- Distributed computing: Familiarity with Spark, Dask, Ray, or similar frameworks
Required skillset:
- Python-first with some C++ integration points
- Modern data stack: Parquet, Delta Lake, distributed compute
- Your choice of orchestration tools (Airflow, Prefect, etc.); see the pipeline sketch after this list
- Working at scale: 20PB+ of storage, nanosecond-resolution data, real-time streams
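As a rough illustration of the orchestration side, here is a minimal sketch using Prefect (one of the tools named above); the task names, paths, and date argument are hypothetical placeholders, and the same shape translates directly to Airflow.

```python
from prefect import flow, task

@task
def ingest_raw(date: str) -> str:
    # Pull the day's raw captures and land them as partitioned Parquet.
    return f"/data/lake/ticks/date={date}"

@task
def build_features(tick_path: str) -> str:
    # Compute and version time-series features from the landed partition.
    return tick_path.replace("/ticks/", "/features/")

@task
def train_and_backtest(feature_path: str) -> None:
    # Fit the model on the feature snapshot and hand it to the backtest harness.
    print(f"training against {feature_path}")

@flow
def daily_pipeline(date: str = "2024-01-02") -> None:
    raw = ingest_raw(date)
    features = build_features(raw)
    train_and_backtest(features)

if __name__ == "__main__":
    daily_pipeline()
```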
Preferred:
- C++ exposure (helpful for integration but not required)
- Real-time or time-series data experience
- Knowledge of Delta Lake, Iceberg, or other modern table formats
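On the table-format point, a short sketch of why Delta-style versioning matters for reproducible training, using the `deltalake` Python package; the path and feature columns are illustrative only.

```python
import pandas as pd
from deltalake import DeltaTable, write_deltalake

path = "/data/lake/features_demo"

# Version 0: an initial feature snapshot.
v0 = pd.DataFrame({"symbol": ["ABC"], "ret_1m": [0.0012]})
write_deltalake(path, v0, mode="overwrite")

# Version 1: a later append, e.g. the next day's features.
v1 = pd.DataFrame({"symbol": ["ABC"], "ret_1m": [-0.0004]})
write_deltalake(path, v1, mode="append")

# Read the latest table, or pin the exact version a model was trained on.
latest = DeltaTable(path).to_pandas()
as_of_v0 = DeltaTable(path, version=0).to_pandas()
print(len(latest), len(as_of_v0))  # 2 rows vs. 1 row
```

Pinning the table version a model was trained against is what keeps backtests and retraining runs reproducible while the lake keeps ingesting.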