About Thunes
Thunes is the Smart Superhighway for money movement around the world. Thunes’ proprietary Direct Global Network allows Members to make payments in real-time in over 130 countries and more than 80 currencies.
Thunes’ Network connects directly to over 7 billion mobile wallets and bank accounts worldwide, via more than 350 different payment methods, such as GCash, M-Pesa, Airtel, MTN, Orange, JazzCash, Easypaisa, AliPay, WeChat Pay and many more.
Members of Thunes’ Direct Global Network include gig economy giants like Uber and Deliveroo, super-apps like Grab and WeChat, MTOs, fintechs, PSPs and banks. Thunes’ Direct Global Network differentiates itself through its worldwide reach, in-house Smart Treasury Management Platform and Fortress Compliance Infrastructure, ensuring Members of the Network receive unrivalled speed, control, visibility, protection and cost efficiencies when making real-time payments globally.
Headquartered in Singapore, Thunes has offices in 12 locations, including Barcelona, Beijing, Dubai, London, Manila, Nairobi, Paris, Riyadh, San Francisco, São Paulo and Shanghai. For more information, visit: https://www.thunes.com/
Context of the role
The ideal candidate is a highly driven, self-motivated "Product-minded Data Scientist" with strong full-stack capabilities, genuinely excited about building and evolving the predictive engines that power a high-growth fintech.
You will operate at the intersection of data science, product thinking, and business outcomes, combining a startup mindset with the rigor required of a regulated financial institution. Your responsibility goes beyond model accuracy: you will own the end-to-end lifecycle of predictive capabilities, from metric definition, through model design and validation, to reliable production deployment and continuous improvement. Models must be robust, explainable, fair, and trusted by the business, not just technically sound.
In this role, you will design and productionise our core predictive capabilities that directly influence quality of service, liquidity management, risk, and operational decision-making. You will proactively partner with a wide range of stakeholders to translate ambiguous business problems into scalable data products, ensuring predictions are actionable, measurable, and embedded into real workflows. You will help define success metrics, challenge assumptions, and continuously refine models based on business feedback and changing market conditions.
Key Responsibilities
- Ensure production-ready implementation of ML projects: write production-grade data transformations to prepare datasets, and build automated training and inference pipelines
- Take responsibility for the quality of your deliverables: rather than handing over a notebook, you will wrap models in inference scripts, register them in a model registry, and get them deployment-ready for the engineering team
- Implement rigorous model interpretability frameworks so that every prediction can be explained to non-technical stakeholders, keeping our solutions compliant and auditable
- Participate in the design and analysis of A/B tests to validate the business impact of model deployments (e.g. reducing fraud false positives without hurting acceptance rates)
- Deliver high-quality, testable, reproducible, and documented code on time, using GitLab and best practices such as data versioning and feature stores
- Understand, apply, and champion the principles of rigorous statistical analysis (hypothesis testing, confidence intervals), and communicate error metrics clearly to non-technical stakeholders
Professional Experience and Qualifications
- A degree in Statistics, Mathematics, Computer Science, Economics, or a related quantitative field
- 5+ years of relevant industry experience, with a preferred background in Fintech, Payments, or Financial Services
- Deep expertise in Time-Series Forecasting: Proficient with libraries for time-series solutions (e.g. Nixtla, Prophet, DeepAR, or Chronos). Strong understanding of non-stationarity, seasonality and calendar effects (e.g. holiday spikes), and backtesting strategies
- Motivated and curious, with an appetite for learning ML deployment best practices
- Proficient in Python and core ML libraries: Mastery of Scikit-Learn, XGBoost, and Pandas / Polars
- Model Explainability: Deep expertise in SHAP (Shapley values), LIME, or counterfactual explanations. Experience using cloud explainability services (e.g. AWS SageMaker Clarify, GCP Vertex Explainable AI) to detect bias and generate model explainability reports for audit purposes
- Proficient in the cloud-native AI tech stack: Deep hands-on experience with a cloud-native ML ecosystem (e.g. AWS SageMaker or GCP Vertex AI, including their Feature Store and Model Registry components)
- Cloud Data Ops: Proven ability to build optimised ETL pipelines and query large Data Lakes using distributed frameworks (e.g. Spark, Athena, BigQuery)
- Experience in Pipeline Orchestration: Familiarity with orchestrating workflows (e.g. Airflow, Step Functions, dbt)
- Passionate about code quality and reproducibility: Strict adherence to Git, code reviews, and environment management
- Interest in the Fintech Industry and market innovations
- Certifications: AWS Certified Machine Learning - Specialty is highly preferred
Sound like you? Apply now!