
Conversational API Generator

Conversational AI Chatbot, Developer Tooling, 2024

Background and Challenge

Workday's Extend platform enables developers to build custom applications on top of Workday's ecosystem — a task that requires navigating hundreds of proprietary Workday APIs alongside various partner APIs. For years, there was no formal library, no centralized documentation, and no built-in way to test APIs and preview their responses. Developers were left to piece things together on their own.


This created a significant pain point: developers were spending enormous amounts of time searching for the right API for their use case, often relying on word-of-mouth, scattered internal wikis, or trial and error. The question my team set out to answer was: how might we give Workday developers a fast, intelligent, and approachable way to discover, understand, and test the right API for any given use case?

[Image: Firefox browser mockup on a Windows desktop]

The Team

  • 1 UX Designer (Yours Truly)

  • 1 Product Manager

  • 1 AI Specialist Engineer

  • 2 Frontend Engineers

Initial Research & Brainstorming

This project was one of the most ambiguous I have taken on. There was no existing product to reference, no formal user research, and no established design pattern for what we were building. I had to move quickly and learn on the fly, which meant leaning heavily on the people around me and trusting my instincts as a designer.


Here's how I got up to speed:


  • Held multiple 1:1 sessions with the AI developer and frontend engineer to understand the technical possibilities and constraints from the start. I wanted to design within the realm of what was actually buildable — not in a vacuum.


  • Independently researched conversational AI, as this project was my first introduction to designing for AI.


  • Regularly looped in UX peers and my manager for design critiques throughout the process. Without a formal user research function, these conversations became my primary feedback loop, and they were invaluable.


  • Leaned into the ambiguity. Rather than waiting for full clarity before designing, I started ideating early and treated each concept as a hypothesis to test with the team.

Iteration is Key

This project went through four meaningful design iterations before reaching the final product. Each pivot was driven by a new insight about the user, about the technology, or about what the experience needed to feel like to a developer.


1. API Playground — Structured Search

My first iteration framed the experience as an "API Playground" — a structured search interface where developers would select from dropdown filters to narrow down relevant APIs. The page also included educational cards at the top to help developers understand the difference between API types.


While this approach felt logical and systematic, it quickly became clear that the structured filter model was too rigid. The AI-powered recommendations panel — the most exciting and useful part of the product — was squeezed into the bottom half of the screen, without enough room to breathe or showcase its value. The interface was visually crowded, and the AI chat element felt like an afterthought.

[Image: 1st iteration — API Playground with structured filters]

2. Expanding the Results Surface

The core insight from Iteration 1 was that the Copilot recommendation needed more visual real estate. In my second iteration, I restructured the layout so that the AI-generated API results took center stage as a full-width grid of cards. Each card surfaced key data: the API name, a type badge, targeted business object, scope, and security permissions — everything a developer needs to evaluate an API at a glance.
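The card anatomy described above can be sketched as a small data model. This is a minimal illustrative sketch only — every field name and example value here is my own assumption for the sake of the example, not Workday's actual API schema or the product's real implementation:

```typescript
// Hypothetical shape of the data each recommendation card surfaced.
// Field names and values are illustrative assumptions, not Workday's schema.
interface ApiRecommendation {
  name: string;                 // API name shown as the card title
  type: string;                 // rendered as a type badge, e.g. "REST"
  businessObject: string;       // the targeted business object
  scope: string;                // functional scope of the API
  securityPermissions: string[]; // permissions required to call it
}

// An example card a developer could evaluate at a glance:
const exampleCard: ApiRecommendation = {
  name: "Get Workers",
  type: "REST",
  businessObject: "Worker",
  scope: "Staffing",
  securityPermissions: ["Worker Data: Public"],
};

console.log(`${exampleCard.name} [${exampleCard.type}] → ${exampleCard.businessObject}`);
```

Surfacing all five fields on one card is what lets a developer evaluate an API without opening a detail view — the design goal this iteration was testing.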


This iteration showed real promise. However, feedback from stakeholders and my UX peers pointed to a lingering issue: the input model wasn't quite right. Dropdowns and structured fields were still leading the experience, which still forced users to sort through hundreds of APIs, reading each one, to find a solution.

[Image: 2nd iteration — full-width grid of API recommendation cards]

Developers don't think in dropdowns or tabs. We need to meet them where they are.

3. Shifting to Natural Language Input

Iteration 3 was a turning point. Informed by conversations with developers and the broader direction of Workday's Copilot product, I replaced the dropdown filters with a single open-ended text area: "Describe your use case or the kind of API you would like." This was a much more natural way for a developer to interact with an AI tool — just describe what you need in plain language and let the AI do the heavy lifting.


I also introduced a "Resources" helper panel within the Copilot section, which surfaced suggested use case categories to help developers who weren't sure how to phrase their request. This gave the Copilot a warm, conversational quality — rather than just a blank prompt, it offered a gentle starting point.


This iteration also reintroduced the educational cards at the top of the page and placed the chatbot at the bottom, with an expandable output container.

[Image: 3rd iteration — natural language input with Resources panel]

4. Final Design — A Conversational Experience


The final design brought everything together into a fully conversational, two-panel layout. On the left, developers describe their use case and can scroll through a timestamped history of their past queries. On the right, Copilot surfaces tailored API recommendations with deep detail: endpoint names, scopes, security requirements, and a "Preview Response" link for immediate testing.


I'm proud that the final design also allowed developers to continue the conversation — asking Copilot follow-up questions about a specific API directly within the interface or using the Preview Response button to compare the responses of multiple suggested APIs. This transformed the tool from a simple lookup utility into something much more powerful: an intelligent dialogue partner for navigating Workday's API ecosystem.

Final Design Interface
Final Iteration "Preview and Compare"

Conclusion and Learnings

This project was my first professional introduction to AI, specifically chatbots and conversational AI. Through my independent research, I absorbed as much as I could — from the UI trends emerging in AI products to the backend processes by which an agent gathers information.


The project also taught me a great deal about designing with ambiguity. When there is no clear roadmap, your instincts as a designer matter more than ever — and so does the quality of the relationships you build with your engineering counterparts. My 1:1s with the AI and frontend developers were not just informational; they were collaborative design sessions that shaped the product in meaningful ways.


I also learned that sometimes the most important design decision is the simplest one: replacing four dropdowns with a single text box. Meeting users where they are — in plain language, in the flow of their thinking — is often all it takes to make a product click.
