About Unity Advisory
Unity Advisory is a new-generation professional services firm built for an AI-enabled world. We operate a lean, conflict-free, and client-centric model that integrates advanced technology and AI into every workstream.
With no audit practice, we are free from traditional conflicts and legacy silos. This allows us to move faster, collaborate openly, and focus entirely on creating value for clients. Our flat structure and collaborative culture empower exceptional people to deliver exceptional work.
We combine deep advisory expertise with cutting-edge data, AI, and commercial insight to help clients navigate complex challenges faster, smarter, and with greater clarity. At Unity, we are redefining how expert advisory is delivered—one innovative engagement at a time.
The Role
We are looking for a highly skilled, commercially aware Senior Data Engineer to join Unity Advisory in an internal-facing role. You will build and operate the firm’s Core Internal AI & Data Platform, with a strong emphasis on LLM-powered data extraction, unstructured data processing, Retrieval-Augmented Generation (RAG), and AI-enabled knowledge systems.
This is a senior, hands-on engineering role responsible for designing and productionising LLM-native data pipelines and retrieval architectures that transform fragmented, unstructured internal data into structured, trusted, AI-ready assets.
A key component of this role is architecting and optimising our data platform around Snowflake as a central analytical and AI data foundation, ensuring seamless integration between structured data, unstructured document corpora, vector search, and GenAI applications.
This role sits at the intersection of data engineering, GenAI systems, and modern cloud data platforms, with clear opportunity to shape standards, influence architecture, and scale a next-generation internal data capability.
What You'll Do
Unity operates in a fast-paced, high-trust environment. You will lead the design and implementation of systems that unlock the value of both structured and unstructured data.
LLM-Powered Data Extraction & Structuring
Design and deploy pipelines that use Large Language Models (LLMs) to extract, normalise, and structure data from unstructured sources (documents, PDFs, contracts, reports, emails, call transcripts, project artefacts).
Implement Text2SQL, schema inference, entity extraction, classification, summarisation, and semantic enrichment workflows.
Build automated ingestion frameworks that transform raw first-party data into analytics-ready and AI-consumable formats within Snowflake.
Leverage Snowflake capabilities (e.g. Snowpark, Snowflake Cortex, or native AI features where appropriate) to operationalise AI-driven transformations at scale.
Retrieval-Augmented Generation (RAG) & Knowledge Systems
Architect and maintain RAG pipelines combining LLMs with:
Structured data stored in Snowflake
Unstructured document repositories and object storage
Vector databases and embedding indexes
Design embedding pipelines and semantic search layers integrated with Snowflake data models.
Build internal AI-powered search and conversational assistants grounded in trusted enterprise data.
Optimise relevance, grounding accuracy, latency, and response quality for interactive AI applications used by internal teams.
Implement hybrid retrieval approaches combining SQL-based filtering in Snowflake with vector similarity search.
AI-Native Data Platform Architecture
Design scalable data architectures spanning ingestion, transformation, storage, vectorisation, and model interaction layers.
Build and operate production-grade pipelines for:
Structured analytics data in Snowflake
Semi-structured data (JSON, logs, event streams)
Unstructured data (documents, transcripts, knowledge repositories)
Embedding generation and indexing workflows
Develop robust, analytics-ready data models and semantic layers within Snowflake to ensure consistent, governed data consumption.
Embed observability across both data and LLM pipelines (evaluation, hallucination detection, lineage tracking, usage analytics).
LLMOps, Evaluation & Optimisation
Evaluate GenAI systems using experimentation frameworks, offline benchmarks, and real-world user feedback.
Improve response quality through prompt engineering, retrieval optimisation, grounding strategies, and agent orchestration.
Implement reliable deployment and monitoring strategies for GenAI systems interacting with Snowflake-backed data.
Own production rollouts of internally facing GenAI applications.
Establish repeatable LLMOps patterns covering testing, performance optimisation, cost control, and governance.
Data Quality, Governance & Compliance
Implement data quality, lineage, and governance frameworks across Snowflake and AI workflows.
Ensure LLM-driven extraction systems are reliable, auditable, and compliant with internal standards.
Design privacy-aware architectures for sensitive advisory data.
Align AI systems with regulatory and firm-level compliance requirements.
Collaboration & Technical Leadership
Partner closely with AI engineers, full-stack engineers, and platform leads to embed AI components into internal applications.
Contribute to architectural decisions, Snowflake optimisation strategies, model selection, and infrastructure design.
Provide technical guidance and mentorship to junior engineers.
Lead complex initiatives or major components of the internal AI data platform.
Operate effectively within agile delivery processes and shared operational ownership.
What You Bring
GenAI & LLM Systems Expertise
Extensive hands-on experience building production-grade GenAI applications, including:
Retrieval-Augmented Generation (RAG)
LLM-powered search or conversational systems
Multi-agent orchestration
Text2SQL and structured data reasoning
Strong understanding of:
Large Language Models (LLMs)
Vector databases and semantic search
Embedding pipelines and retrieval optimisation
Prompt engineering and evaluation methodologies
Snowflake & Modern Data Platform Expertise
Strong hands-on experience designing and operating Snowflake-based data platforms in production.
Deep understanding of:
Data modelling and transformation in Snowflake
Snowpark and programmatic data processing
Performance optimisation, cost management, and warehouse scaling
Secure data sharing and governance controls
Experience integrating Snowflake with AI-powered application layers.
Data Engineering & Cloud Foundations
4+ years’ experience in data engineering, analytics engineering, or closely related roles.
Strong Python expertise and a language-agnostic engineering mindset.
Experience processing large-scale distributed datasets (e.g. Spark or equivalent paradigms).
Experience deploying data-intensive and GenAI systems in production on AWS, Azure, or GCP.
Familiarity with modern AI tooling ecosystems (e.g. Hugging Face, LangChain, DSPy, or equivalent frameworks).
Commercial & Leadership Capability
Strong commercial awareness — understands how AI systems drive measurable business outcomes in a professional services context.
Comfortable owning systems end-to-end, from architecture design to operational reliability.
Experience serving as a technical lead or senior individual contributor.
Excellent communicator, able to translate complex AI and data platform concepts to non-technical stakeholders.
Comfortable operating in fast-growth, entrepreneurial environments with evolving requirements.
Nice to Have
Experience building internal AI copilots or enterprise knowledge assistants.
Exposure to Snowflake Cortex or other native AI capabilities within modern cloud data platforms.
Experience fine-tuning or adapting foundation models.
Background in advisory, consulting, or project-based firms.
Experience helping scale AI/data capabilities from early-stage foundations.
Graduate degree in Computer Science, Engineering, Statistics, or a related quantitative discipline.
Why This Role Matters
Unity Advisory is building a next-generation Snowflake-centric AI & Data Platform designed to:
Unlock value from fragmented, unstructured firm knowledge
Operationalise LLM-driven data extraction and reasoning
Enable AI-native workflows across advisory teams
Power internal copilots, semantic search, and decision-support systems
Establish governed, reliable, AI-ready data foundations
This role is central to that mission.
You will not only build data pipelines — you will help define how Unity Advisory operationalises GenAI on top of Snowflake at enterprise scale.
Working at Unity Advisory
We offer a truly hybrid and flexible working environment and the opportunity to be at the forefront of AI-driven advisory services. You’ll be part of a high-impact engineering function, empowered to shape how we scale our systems and processes. This is an exciting opportunity to join a fast-growing business and accelerate your career in data and AI engineering.
Additional Information
At Unity Advisory, we are committed to providing an inclusive and accessible recruitment process. In line with the Equality Act 2010, we will make reasonable adjustments for any candidate who requires support to attend or take part in an interview. If you need any adjustments or support, please let us know when scheduling your interview or in your application cover letter. We are dedicated to ensuring everyone has an equal opportunity to succeed and are here to support you throughout the process.
PLEASE NOTE: We do not accept unsolicited CVs from third-party agencies.