AI x Customer Research - March '25
AI designs: from insights to test-ready prototype views, faster (a workflow), plus the options on the market and the data privacy issues to watch -

Read time: 9 minutes
There’s loads of inspiring content on using AI design generators but…
Starting from scratch in an AI design tool with a one-line prompt doesn’t make any sense to me.
And yet…that’s what people are doing in YouTube demo videos of the latest, greatest AI design generation tools.
Prompt: “A modern, elegant finance app with savings and stock investing.”
Really? What designer actually starts there?
Most of you probably think of me as a researcher by now. The truth is, I started as a marketer, then became a designer - and then pulled together everything I’d been doing to understand customers into a more official research role.
In other words, I know what it’s like to design, and what design requirements look like. They don’t look like that prompt. “You need to design a modern, elegant finance app with savings and stock investing,” said no stakeholder ever.
I’ve done a bunch of my own silent tests of AI design generators. I’ve watched all the tutorials online. But here’s the thing I keep thinking is missing:
How can we actually get from “these are our customer insights” to “here are user test-ready views in a prototype” in record time with AI?
That’s what I’m focusing on in this issue. I hope you find it helpful.
Here come the workflow, example prompts and data privacy considerations for AI design tools -
In this edition:
🖼️ From Insights to Prototype-ready Designs, the AI Way: Turn qualitative insights into testable designs — fast.
🕹️ Prompt Like A Designer: Guiding Different Design Generation Tools
Three AI tools, three approaches. How to prompt each of them, with tested examples that match how each tool thinks.
🔐 Privacy Concerns: Before You Share Your PRD with AI… Read This
If you’re using AI view generators, you’re feeding proprietary product plans into a black box. Here’s what to consider.
WORKFLOW UPGRADES
📝 From Insights to Prototype-Ready Designs: The AI Way
I’ll keep this short - I think I said enough in the intro!
Here’s the full workflow I used this month to turn real qualitative insights into testable designs — fast.
Step 1: Let insights tell you what to design
The goal: Figure out where the riskiest assumptions still live.
Take the final insights from your qual analysis and feed them into your chosen LLM.
Ask the model to assess which user needs and product ideas were strongly supported by the research — and which weren’t.
It should return a list of assumptions we’re still making.
Then prompt:
“Based on these assumptions, what are the most important product requirements to include in our next prototype for testing?”
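If you’d rather script this step than paste into a chat window, here’s a minimal sketch using Anthropic’s Python SDK (any LLM API works similarly). The file name and model alias are placeholders I made up for the example, not part of my workflow.

```python
# Minimal sketch of Step 1 as an API call. Assumes the official anthropic SDK
# and an ANTHROPIC_API_KEY in your environment; the file name and model alias
# are placeholders.
import anthropic

client = anthropic.Anthropic()

with open("qual_insights.md") as f:  # the final insights from your qual analysis
    insights = f.read()

step1_prompt = (
    "Here are the final insights from our qualitative analysis:\n\n"
    f"{insights}\n\n"
    "Assess which user needs and product ideas were strongly supported by the "
    "research and which weren't, and return a list of the assumptions we're "
    "still making.\n\n"
    "Based on these assumptions, what are the most important product "
    "requirements to include in our next prototype for testing?"
)

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder - use whichever model you prefer
    max_tokens=1500,
    messages=[{"role": "user", "content": step1_prompt}],
)
assumptions = response.content[0].text
print(assumptions)
```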
Step 2: Turn that into a PRD
Feed the assumptions and any other relevant details back into your LLM to generate a PRD (product requirements doc).
The resulting PRD is your source of truth: what needs to be in the prototype to test what matters. Make sure to double-check the functionality included and add your perspective.
Heads up: most LLMs I’ve tested this process with tend to overbuild. Expect to push your LLM to simplify the PRD output before finalizing this step.
Use a prompt like this:
“Draft a concise PRD with required views. Add the key components for each view. Make sure the functionality added is focused on testing the assumptions above. Include any other functionality that isn’t related to the assumptions but must be there to cover the essentials a real design and test scenario would include (ex: login, profile, nav bar, etc.).”
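Sticking with the scripted version: a sketch like this keeps the Step 1 turns in the message history so the PRD stays grounded in those assumptions. It reuses client, step1_prompt and assumptions from the Step 1 sketch above.

```python
# Sketch of Step 2, reusing client, step1_prompt and assumptions from Step 1.
# Keeping the earlier turns in the history is what grounds the PRD in them.
prd_prompt = (
    "Draft a concise PRD with required views. Add the key components for each "
    "view. Make sure the functionality added is focused on testing the "
    "assumptions above. Include any other functionality that isn't related to "
    "the assumptions but must be there to cover essentials that a real design "
    "and test scenario would include (ex: login, profile, nav bar, etc.)."
)

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder
    max_tokens=2000,
    messages=[
        {"role": "user", "content": step1_prompt},
        {"role": "assistant", "content": assumptions},
        {"role": "user", "content": prd_prompt},
    ],
)
prd = response.content[0].text

with open("prd_draft.md", "w") as f:  # review, simplify, and add your perspective
    f.write(prd)
```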
Step 3: Generate the screens
From here, I used Claude, UX Pilot, and Uizard to generate visual designs of various fidelity levels.
Tip: Most design view generators seem to work best when given view-by-view details, rather than trying to generate an entire multi-view flow at once.
Split your PRD results into individual requirement sets per design view (see the sketch after these tips).
Check whether they need more essential details, like style guidelines (font, color…), to give you results that are closest to export-ready.
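If your PRD draft comes back as markdown with one heading per view - that’s an assumption about its shape, and the heading marker may differ in your output - a tiny script like this splits it into per-view requirement sets you can feed to a design tool one at a time.

```python
# Sketch: split a markdown PRD into one requirement set per design view,
# assuming the LLM used a "## " heading per view (adjust to match your output).
import re

with open("prd_draft.md") as f:
    prd = f.read()

# Split just before each "## " heading; drop anything that isn't a view section.
sections = re.split(r"\n(?=## )", prd)
views = [s.strip() for s in sections if s.strip().startswith("## ")]

for i, view in enumerate(views, start=1):
    title = view.splitlines()[0].lstrip("# ").strip()
    filename = f"view_{i:02d}_{title.lower().replace(' ', '_')}.md"  # assumes simple titles
    with open(filename, "w") as f:
        f.write(view)
    print(f"Saved requirements for: {title}")
```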
Step 4: Review + revise
The goal here isn’t to get you 100% of the way to test-ready views. So far, none of my tests on real scenarios have landed there (and many of the more trustworthy sources on AI-generated design views say something similar).
Expect to:
Get views that are pretty good, but not awesome.
Tweak the prompts a bit to generate one more variation.
Accept imperfect details you can change quickly yourself in Figma or another design tool after exporting your AI designs and importing them there (ex: changing lemon yellow accents to sunflower yellow when Uizard doesn’t understand hex color codes).
Tip: Especially when you’re on a free trial of one of the design-specific tools, keep an eye on your total available credits. Generating many iterations just to get the accent color you wanted is probably a poor use of them.
Step 5: Final prototype build
Once the views were selected and exported, I imported them into Figma to create a clickable prototype. You could also experiment with AI-native prototyping tools like Bolt or Lovable — but that’s a topic for another issue!
PROMPTING PLUS
🕹️ Prompt Like a Designer: Guiding Different Design Generation Tools
Here’s what most people don’t tell you: you can’t use the same prompt across these tools. Each one thinks differently. Here’s how to prompt each of them — and what to expect back.
Scroll to the end for prompt examples per tool -
🟡 UX Pilot
Best for: Turning specific PRDs into usable, test-ready UI.
UX Pilot handles long, structured prompts with a lot of control over layout and content. It will do its best to follow instructions, and when you’re clear, the screens come back with strong hierarchy and reasonable interpretations.
How it thinks:
Structured input = structured output. It does well when you define the goal of the page, the layout by section, the user context, and the navigation model. If you leave things vague, it defaults to generic UI.
Prompt tips:
Include layout and content by section
Be precise about required modules and functionality
Use the “enhance prompt” option to clean up fluff without losing detail
Prompt screen by screen (not by full flow)
Example results:

Tips: There’s a Figma plugin that doesn’t include the full feature set, but at least you can stay in your comfort zone. Pack each prompt with as much context as you can afford (each screen generation and variation costs 6 credits).
⚠️ Heads up: If you want to use it on a trial, you’ll get 90 credits to start. But going through the onboarding tutorial costs you 6!
〰️
🟡 Uizard
Best for: Exploring early concepts fast — less great for detailed execution.
Uizard’s 300-character prompt limit makes it tough to get precision when you already know what needs to be on a screen. But its “creative exploration” mode generates 4 screen variants at a time, which makes it surprisingly useful during early-stage idea generation.
How it thinks:
It responds well to tone, purpose, and screen type — not detailed layout. Think of it as a visual brainstormer that understands general UX patterns, not a spec follower.
Prompt tips:
Start with a clear product concept
Highlight must-have modules for the test
Specify tone or visual style (“modern,” “calm,” etc.)
Say which screen you're creating (home, dashboard, etc.)
Don't expect hex codes or deep structure fidelity
Prioritize what kind of screen it is, and call out anything that should be on it that isn’t typical hygiene for that kind of view (home, profile, shopping cart…)
Example results:

〰️
🟡 Claude
Best for: Designing from logic up — great wireframe generator, not a visual tool.
Claude won’t render full-fidelity screens, but it can be a powerful wireframe partner. It excels at turning detailed requirements into structured layouts, especially when you want to control spacing, styling, and hierarchy.
How it thinks:
It likes structure and reasoning. The more detailed your PRD, the more precise and usable the output. Unlike the other tools, it can carry your full workflow — from assumptions → PRD → screens — all in one chat. Here, the control is really in your hands. And the outputs are bad when your inputs were bad first. 😅
Prompt tips:
Give it full spacing, font, grid system and icon set rules
Include hierarchy, UX writing tone, and visual cues
Ask it what’s missing based on your testing goal
Don’t expect finished visuals — use it to get to the what belongs where stage
Example results:

💐 Want my prompts?
DATA PRIVACY
🔐 Before you upload your future product… think about this -
We’re now using AI tools to generate designs based on product concepts that haven’t launched.
And when you’re using PRDs as input, you should be getting more valuable design results - but you might also be sending sensitive, proprietary product strategy straight into a third-party black box.
Here are the basics to know:
Uizard: Their privacy policy literally says nothing about AI use of your data - whether it’s used for training, how it’s shared with third parties, and so on.
UX Pilot: Your data is encrypted at rest and in transit, and they say they don’t use it to train any models.
Claude: They do use your inputs for training unless you use the Temporary Chat option - like a private browser window.
A few essential guidelines for using these tools -
Do the equivalent of redacting your PRDs and requirement lists before generating AI designs - give the most basic description that still gets you a starting point, then add the relevant, proprietary details in Figma yourself (see the sketch after this list).
Assume anything in a freemium design tool is not private by default.
Don’t tell them anything you wouldn’t want leaked. If a competitor found your inputs, would it still be okay?
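As a rough illustration of that first guideline, here’s a hypothetical redaction pass you could run before pasting a PRD into a freemium tool. The codename map is made up - the point is just to strip anything identifying and add it back in Figma later.

```python
# Hypothetical sketch: scrub proprietary names and specific figures from a PRD
# before it goes into a freemium design tool. The replacement map is invented -
# adapt it to whatever actually identifies your product.
import re

REDACTIONS = {
    "Project Nightingale": "the product",  # internal codename (example)
    "Acme Bank": "a partner",              # named partner (example)
}

def redact(text: str) -> str:
    for secret, generic in REDACTIONS.items():
        text = text.replace(secret, generic)
    # Blur specific figures (targets, conversion rates, pricing...)
    return re.sub(r"\b\d+(\.\d+)?\b", "[number]", text)

with open("prd_draft.md") as f:
    print(redact(f.read()))
```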
WHAT’S COMING NEXT?
Here’s what’s coming in the next few editions -
I’m reshuffling priorities to see what’s most in-demand among people like you!
If there’s something you specifically want someone to deep dive into, let me know!
See you next time!
-Caitlin