
Samaya AI was an early-stage startup building a Human–AI Knowledge Network to help professionals make sense of complex financial and research data. Co-founded by one of the authors of the original Retrieval-Augmented Generation (RAG) paper, the company aimed to turn advanced AI research into a practical product that could support expert decision-making.
As the first in-house Product Designer, I joined a team of former Meta, Google, and AWS engineers and researchers to define how cutting-edge AI could evolve into a usable, trustworthy product experience.
When I joined, Samaya was moving fast in multiple directions, pushing the limits of AI experimentation without a clear design framework to connect it all. My main challenge was to bring structure and consistency to a rapidly evolving environment, translating complex AI workflows into interfaces that users could understand and trust.
The key challenges were:
Fragmented visual language and disconnected interaction patterns.
Complex AI workflows that were powerful but difficult to navigate or interpret.
No shared design system or process to unify ongoing experimentation into a coherent product.
To address these issues, I focused first on establishing solid UX and design foundations that could scale with the pace of research and development.
Early versions were functional but inconsistent in both UI and UX, showing the need for a shared design language and clearer structure.
Building the foundations
I created Samaya’s first design system in Figma, defining tokens, components, and documentation for typography, colour, spacing, and behaviour.
The goal was to bring structure and consistency to a fast-moving AI product, reducing design debt and creating a shared visual language between design and engineering.
Design System Overview – Core Tokens
Samaya’s first design system, built to unify visual language and enable faster, more consistent iteration across teams.
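To give a sense of how core tokens like these carry from Figma into code, here is a minimal sketch of a Tailwind theme extended with design tokens. The token names and values are illustrative placeholders, not Samaya's actual system; the point is that the names designers use in Figma are the same ones engineers use in code.

```ts
// tailwind.config.ts — illustrative sketch only; token names and values
// are placeholders, not Samaya's actual design tokens.
import type { Config } from "tailwindcss";

const config: Config = {
  content: ["./src/**/*.{ts,tsx}"],
  theme: {
    extend: {
      // Colour tokens mirror the Figma variables one-to-one,
      // so "brand-600" means the same thing in design and code.
      colors: {
        brand: {
          50: "#eef4ff",
          600: "#3450d4",
          900: "#101c52",
        },
        surface: "#ffffff",
        "surface-muted": "#f6f7f9",
      },
      // Spacing scale shared with Figma auto-layout values.
      spacing: {
        "space-1": "4px",
        "space-2": "8px",
        "space-3": "12px",
        "space-4": "16px",
      },
      // Typography tokens defined as size + line-height pairs.
      fontSize: {
        "body-sm": ["13px", { lineHeight: "20px" }],
        body: ["15px", { lineHeight: "22px" }],
        "heading-lg": ["24px", { lineHeight: "32px" }],
      },
    },
  },
};

export default config;
```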
Designing AI-powered product experiences
Partnering with the AI and product teams, I helped shape the core user experiences behind Agentic Search, Market Data visualisation, Tables, and a reimagined Search bar that introduced templates and advanced query options.
My focus was on creating clarity and trust in how users interact with AI, ensuring that complex financial data could be explored intuitively while the system's reasoning remained transparent.
Agentic Search
Agentic Search was designed to help financial experts ask complex questions in natural language and receive traceable, context-rich answers.
I focused on making AI reasoning transparent and interactions predictable, creating a balance between conversational flexibility and user control.

Agentic Search flow in Figma, mapping conversational interactions and data retrieval logic to refine user intent and feedback loops.
Figma prototype of Agentic Search, used to test clarity, trust, and explainability in AI-assisted queries.
Market Data
The Market Data experience aimed to turn dense, structured financial datasets into clear, interactive insights. My focus was on visual hierarchy, data readability, and seamless integration between raw information and AI-generated analysis, allowing experts to interpret data faster and with greater confidence.
Market Data interface showing interactive charts and structured financial datasets presented with visual hierarchy and contextual insights.
Collaboration and scale
I collaborated closely with engineers to align the design system with Tailwind and Storybook, ensuring a one-to-one match between Figma and production.
This alignment removed guesswork, improved design–dev velocity, and helped the team scale faster without compromising visual quality.
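As a sketch of what that alignment looks like in practice, each component variant in the Figma library maps to a Storybook story, so design and engineering review the same states side by side. The Button component and its props below are hypothetical examples, not Samaya's production code.

```ts
// Button.stories.tsx — illustrative sketch only; the Button component and
// its variants are hypothetical, shown to explain the Figma ↔ Storybook mapping.
import type { Meta, StoryObj } from "@storybook/react";
import { Button } from "./Button";

const meta: Meta<typeof Button> = {
  title: "Design System/Button",
  component: Button,
  // Controls mirror the variant props defined on the Figma component,
  // so designers and engineers inspect the same set of states.
  argTypes: {
    variant: { control: "select", options: ["primary", "secondary", "ghost"] },
    size: { control: "select", options: ["sm", "md", "lg"] },
  },
};
export default meta;

type Story = StoryObj<typeof Button>;

// Each story corresponds to a named variant in the Figma library,
// which makes drift between design and production easy to spot in review.
export const Primary: Story = {
  args: { variant: "primary", size: "md", children: "Run query" },
};

export const Secondary: Story = {
  args: { variant: "secondary", size: "md", children: "Cancel" },
};
```

Keeping variant names identical across Figma, the Tailwind theme, and Storybook stories is what made drift visible early, before it reached production.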
Although the product was still evolving, the new design foundations delivered immediate results:
1. Established the first cohesive design language, creating a visual and behavioural foundation that unified the product experience.
2. Improved efficiency and alignment between design and development by introducing a shared system built around Tailwind and Storybook.
3. Enhanced clarity and user confidence in AI-driven workflows by improving feedback loops and visual transparency.
Reflection
Working at Samaya meant navigating constant ambiguity and translating technical AI research into human-centred experiences.
The process strengthened my ability to define scalable systems, create trust in complex interfaces, and bring clarity to areas of high uncertainty.
It also taught me to stay comfortable operating without perfect data, designing for clarity when information is still forming.




