Operationalizing GenAI: From Fragmented Pilots to a Unified Ecosystem

Established the foundational interaction patterns that scaled Agentic AI across 5 core product lines, reducing engineering redundancy and accelerating time-to-market.

Role:

AI Experiences Lead & Design System Contributor

Contribution:

AI design patterns, UX, product strategy

Team:

Design system, Gen AI

Duration:

1 year

Platform:

Web app, React Storybook

Tools:

In the race to launch AI features, it is easy to assume that users want a magic button that does everything for them. However, as I began leading the AI design initiatives at Cornerstone, I discovered the opposite was true. Our users felt alienated by "black box" automation and lacked the context to trust the results. I realized we needed to pivot from pure automation to augmentation.


This project tells the story of how we established the company’s first AI interaction patterns. By moving beyond simple text generation to a fully Agentic experience, I designed a conversational interface that helps users navigate complex, legacy workflows without losing control. Here is how we turned user skepticism into trust and built a system that scales across the entire product suite.

Role & Goal

Served as the primary point of contact for AI-enabled experiences at Cornerstone, leading the Admin experience and partnering with another designer on the Learner experience. Reviewed incoming AI use cases and recommended new design patterns and components.

My role spanned:

Defining AI design patterns

Influencing AI-enabled design direction

Evangelizing AI-enabled guidelines

Contributing to MVP development

Leading and hosting workshops

Key Results

Shattered Boundaries

Introducing AI guidelines into product strategy workshops helped teams shape cohesive AI interaction design.

Solved multiple complex product use cases

Established AI interaction patterns by incorporating design requests gathered from product planning workshops.

Reduced inconsistency and redundant work

Implementing centralized AI design patterns and components ensured cohesive experiences and eliminated redundant development effort.

2.5k beta AI interactions

The AI Companion generated 2,500 navigation searches in its first beta month, showing strong early engagement and practical value for daily admin tasks.

50% estimated task-time reduction

The new design delivered strong usability signals, earning a 4.5/5 ease-of-use score in testing. Every participant completed the task successfully, reinforcing the clarity of the redesigned workflow.

5 AI-enabled features shipped / planned

Five new AI capabilities were released or planned across products, bringing assistance, recommendations, and intelligent summarization into core flows.

The Challenge: Innovation in Silos

In the race to adopt AI, Cornerstone faced a 'Fragmentation Debt': five different product teams were building isolated AI features, leading to inconsistent interactions and wasted engineering hours. At the same time, HR admins hesitated to use AI due to a lack of control and context.

The High-Demand, High-Feasibility Sweet Spot

While introducing AI guidelines in product strategy workshops, I simultaneously collected AI use case initiatives from all product lines. To quickly respond to product team needs, we prioritized the "Content Creation" scenario—which had the highest demand and lowest technical barrier—to launch our first AI editor.

Insight: The team aimed for rapid, high-impact AI deployment but lacked unified patterns. Consequently, my focus shifted to establishing alignment by sharing AI UX guidelines in workshops.

Unified Patterns: One set of components for all 5 products.

Several products were developing AI features independently, resulting in inconsistent UX and redundant development.
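
To make the "one set of components" idea concrete, the sketch below shows what a shared prop contract for an AI surface might look like when published through the design system's React Storybook. The component name, product keys, and states are hypothetical illustrations, not the actual Cornerstone API.

```tsx
// Hypothetical sketch of a shared AI panel contract consumed by every product
// line instead of each team building its own variant.
import React from "react";

// A small, fixed set of states keeps loading, review, and error handling
// consistent across products.
export type AIGenerationState = "idle" | "generating" | "review" | "error";

export interface AIPanelProps {
  /** Which product line embeds the panel (illustrative keys). */
  product: "learning" | "performance" | "recruiting" | "content" | "skills";
  state: AIGenerationState;
  /** Draft produced by the model, always shown for human review first. */
  draft?: string;
  onAccept: (finalText: string) => void;
  onDiscard: () => void;
  onRegenerate: () => void;
}

// Minimal presentational shell; styling and variants live in Storybook stories.
export function AIPanel({ state, draft, onAccept, onDiscard, onRegenerate }: AIPanelProps) {
  if (state === "generating") return <p>Generating…</p>;
  if (state === "error") return <p>Something went wrong. Please try again.</p>;
  if (state === "review" && draft !== undefined) {
    const reviewedDraft = draft; // captured as const for the click handlers
    return (
      <section aria-label="AI assistant">
        <p>{reviewedDraft}</p>
        {/* The user always confirms; the panel never auto-applies AI output. */}
        <button onClick={() => onAccept(reviewedDraft)}>Use this draft</button>
        <button onClick={onRegenerate}>Regenerate</button>
        <button onClick={onDiscard}>Discard</button>
      </section>
    );
  }
  return null;
}
```

The design choice worth noting is the fixed state model: once every team consumes the same states, loading, review, and error behavior stop diverging across products.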

Finding the Right Problem

Quality, Context, and the Efficiency Gap in AI Workflows

User testing revealed skepticism regarding the quality and contextual accuracy of AI outputs, suggesting a need for increased human oversight. However, the current workflow failed to save time due to excessive manual data entry for fields the system should have inferred automatically. This highlights a critical design challenge: effectively balancing Augmentation and Automation to minimize manual friction while ensuring the generated content remains meaningful and under user control.

Customer Feedback:

"This doesn’t save me much time. There are still too many fields to fill out. If I have to enter all of that manually in the UI, the benefit is minimal."

"I need more control over the process."

"The results are disappointing."

"The AI lacks the necessary context."

Idea: What if we guided the user through the workflow, letting context accumulate as steps progress so the AI can generate higher quality content?
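
A rough sketch of that idea, under assumed names (the step and context shapes below are illustrative, not the production schema): each step of the guided flow contributes one piece of context, and generation only runs against the accumulated whole.

```ts
// Illustrative sketch: a guided workflow accumulates context step by step,
// then passes the full context to a single generation call.
interface DraftContext {
  audience?: string;                  // gathered in step 1
  learningGoals?: string[];           // gathered in step 2
  tone?: "formal" | "conversational"; // gathered in step 3
  sourceMaterial?: string;            // optional upload in a later step
}

// Each step asks the user one focused question instead of one giant prompt.
type Step = (ctx: DraftContext) => Promise<Partial<DraftContext>>;

async function runGuidedFlow(steps: Step[]): Promise<DraftContext> {
  let ctx: DraftContext = {};
  for (const step of steps) {
    const answers = await step(ctx); // the UI gathers one more piece of context
    ctx = { ...ctx, ...answers };    // context accumulates across steps
  }
  return ctx;
}

// Placeholder for whatever backend call actually produces the content.
declare function generateDraft(ctx: DraftContext): Promise<string>;

// Richer accumulated context feeds one higher-quality generation at the end.
async function createDraft(steps: Step[]): Promise<string> {
  return generateDraft(await runGuidedFlow(steps));
}
```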

Shifting Paradigm: From "Chatting" to "Working"

We moved beyond the generic 'Chatbot' pattern. I defined an Agentic Interaction Model centered on 'Human-in-the-loop' workflows. This approach allows the AI to perform complex tasks (drafting, summarizing, analyzing) while ensuring the user remains the final decision-maker.
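
One way to read that model in code terms (the types below are an illustrative sketch, not the production contract): the agent can only propose actions, and nothing executes until the user explicitly approves it.

```ts
// Illustrative human-in-the-loop contract: the agent proposes, the user decides.
type AgentTask = "draft" | "summarize" | "analyze";

interface ProposedAction {
  id: string;
  task: AgentTask;
  /** Plain-language explanation of what the agent intends to do and why. */
  rationale: string;
  /** The user sees the output before it is applied anywhere. */
  preview: string;
  status: "proposed" | "approved" | "rejected";
}

// Execution is gated on explicit approval; the user remains the final
// decision-maker for every change the agent makes.
async function applyIfApproved(
  action: ProposedAction,
  execute: (a: ProposedAction) => Promise<void>
): Promise<void> {
  if (action.status !== "approved") return; // no silent automation
  await execute(action);
}
```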

“I got confused about what this agentic mode is capable of and how it works”

Design Iteration: Reducing the "Blank Canvas" Effect

Designing for an AI-first learning platform presented a unique challenge: how do we integrate the power of large language models (LLMs) without overwhelming the user with an empty text box?

Impact: Defining the "AI-First" Strategy

We introduced Generative AI across key product workflows to accelerate search, simplify decision-making, and reduce admin effort. Early releases demonstrated meaningful adoption and time-savings, while design-system integrations and cross-team workshops established scalable patterns for future AI-driven experiences.

2.5k

beta AI interactions

The AI Companion generated 2,500 navigation searches in its first beta month, showing strong early engagement and practical value for daily admin tasks.

50%

Estimated task-time reduction

The new design delivered strong usability signals, earning a 4.5/5 ease-of-use score in testing. Every participant completed the task successfully, reinforcing the clarity of the redesigned workflow.

5

Features shipped / planned

Five new AI capabilities were released or planned across products, bringing assistance, recommendations, and intelligent summarization into core flows.

2

strategic opportunities

Gen AI + Design System workshops surfaced two high-impact opportunities: streamlining component creation and accelerating concept-to-design productivity.

Users and clients appreciated the final solution for its ease of use and clarity.

"This work was pivotal to our market strategy and was showcased as the headline innovation at Connect Live, Cornerstone's global customer conference, demonstrating our commitment to an AI-first future."

Want to see the full case study?

I’d be happy to share it with you.
Reach out and we can set a time to talk.