Role Overview
We are seeking a Senior Databricks AI Engineer to design, build, and operate AI platforms natively on the Databricks Lakehouse. This role is Databricks-first: you will own large-scale data, feature, and AI pipelines built with Spark, Delta Lake, MLflow, Feature Store, and Unity Catalog. The focus is on engineering reliable AI systems at scale, not standalone model experimentation.
Key Responsibilities (Databricks-Centric)
- Architect and implement end-to-end AI pipelines on Databricks Lakehouse
- Build scalable data ingestion, transformation, and feature pipelines using Spark
- Design AI-ready Delta Lake architectures (Bronze/Silver/Gold)
- Implement AI lifecycle management using MLflow (tracking, registry, serving)
- Develop reusable Feature Store assets to support enterprise AI use cases
- Operationalize AI workloads using Databricks Workflows and Jobs
- Enable batch and real-time inference using Databricks-native serving
- Enforce data governance, lineage, and access control via Unity Catalog
- Optimize Spark clusters for performance, reliability, and cost
- Establish Databricks AI engineering standards and best practices
- Partner with data science teams to productionize AI solutions
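The medallion (Bronze/Silver/Gold) layering and Unity Catalog governance described above can be sketched in Spark SQL. All catalog, schema, table, and group names below are illustrative placeholders, not prescribed conventions:

```sql
-- Illustrative medallion layout under a Unity Catalog three-level
-- namespace (catalog.schema.table); all names are hypothetical.

-- Bronze: raw ingested records
CREATE TABLE IF NOT EXISTS main.bronze.events_raw (
  event_id    STRING,
  payload     STRING,
  ingested_at TIMESTAMP
) USING DELTA;

-- Silver: cleaned and conformed records
CREATE TABLE IF NOT EXISTS main.silver.events AS
SELECT event_id, payload, ingested_at
FROM main.bronze.events_raw
WHERE event_id IS NOT NULL;

-- Gold: aggregated, AI-ready features
CREATE TABLE IF NOT EXISTS main.gold.event_counts AS
SELECT event_id, COUNT(*) AS event_count
FROM main.silver.events
GROUP BY event_id;

-- Governance: grant read access to a data-science group via Unity Catalog
GRANT SELECT ON TABLE main.gold.event_counts TO `data-science`;
```

The three-layer split keeps raw data immutable in Bronze while Silver and Gold tables stay reproducible from it, and Unity Catalog grants enforce access at the table level rather than per cluster.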
Required Technical Skills (Databricks-First)
Databricks Platform
- Deep hands-on experience with Databricks Lakehouse
- Advanced expertise in Apache Spark (PySpark & Spark SQL)
- Strong command of Delta Lake (ACID, OPTIMIZE, Z-ORDER, VACUUM)
- Production experience with MLflow on Databricks
- Hands-on with Databricks Feature Store
- Strong experience with Unity Catalog for governance and security
- Databricks Jobs, Workflows, Repos, and SQL Warehouses
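The Delta Lake maintenance operations named above (OPTIMIZE, Z-ORDER, VACUUM) look roughly like this in Spark SQL; the table name is a hypothetical placeholder:

```sql
-- Compact small files and co-locate rows by a frequently filtered column
OPTIMIZE main.gold.event_counts ZORDER BY (event_id);

-- Remove data files no longer referenced by the table and older than
-- 7 days (168 hours), the conventional default retention window
VACUUM main.gold.event_counts RETAIN 168 HOURS;
```

OPTIMIZE with Z-ordering improves scan performance on selective filters, while VACUUM controls storage cost; shortening the retention window below the default limits time-travel history, so the two are typically tuned together.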
AI Engineering (Platform-Oriented)
- Feature engineering at scale using Spark
- Model packaging, versioning, and promotion using MLflow
- AI pipeline orchestration and automation
- Batch and near-real-time inference pipelines
- Model monitoring and retraining workflows on Databricks
Nice-to-Have (Still Databricks-Aligned)
- Databricks AutoML
- Databricks Model Serving APIs
- Photon performance tuning
- Multi-workspace Databricks deployments
- AI enablement in regulated environments (banking / finance)
Experience & Qualifications
- 10+ years in data engineering, AI engineering, or platform engineering
- 5+ years building enterprise solutions on Databricks
- Proven experience delivering production-grade AI platforms
- Strong understanding of distributed systems and Spark internals
- Ability to lead architecture decisions and mentor engineers
What Success Looks Like
- AI workloads run reliably, securely, and cost-effectively on Databricks
- Data scientists deploy models without friction using platform tooling
- Feature reuse and governance are enforced through Databricks-native services
- AI delivery cycles are shortened through automation and standardization
Preferred Certifications
- Databricks Certified AI Engineer
- Databricks Certified Data Engineer Professional
- Databricks Certified Machine Learning Professional
About Unison Group
Unison Consulting was launched in Singapore, a hub of the financial industry, in September 2012, with innovative visions in the technocratic arena. We are a boutique next-generation technology company with strong business interests in liquidity risk, ...