The rise of generative AI tools like Figma Make is turning design into “Vibe Designing,” paralleling “Vibe Coding.” Using natural language prompts, designers can create prototypes and iterate with AI assistance or manually. While this fosters creativity and efficiency, challenges include design fixation, generic outcomes, and the risk of diminishing designers’ critical skills.
What is Generative AI?
Generative AI models are trained on text, images, code, and design assets, and can generate new text, images, videos, code, or designs based on that training data. Users interact with generative AI by prompting it with natural language (e.g. writing “create an image of a blue horse riding an elephant”) and with other multimodal inputs (e.g. attaching example images, code files, sketches, and design systems). For example, the thumbnail image for this article was generated with generative AI, using a prompt containing the contents of this article.
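To make the interaction concrete, here is a minimal sketch of prompting an image-generation model from code. It assumes the OpenAI Node SDK and an API key in the environment; the model name and prompt are purely illustrative (and not how this article’s thumbnail was actually made).

```typescript
// Minimal sketch: prompting an image-generation API with natural language.
// Assumes the OpenAI Node SDK and OPENAI_API_KEY in the environment;
// the model name and prompt text are illustrative.
import OpenAI from "openai";

const client = new OpenAI();

async function generateThumbnail(): Promise<string | undefined> {
  const result = await client.images.generate({
    model: "dall-e-3",
    prompt: "A blue horse riding an elephant, flat illustration style",
    size: "1024x1024",
  });
  // The API responds with one or more generated images (URL or base64).
  return result.data?.[0]?.url;
}

generateThumbnail().then((url) => console.log("Generated image:", url));
```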
AI features built on generative models are becoming increasingly popular in creative software such as Canva, Photoshop, and Figma. In all of these apps you can now prompt AI to help you in your creative workflow. Some are calling this new form of human-AI collaboration “co-creativity”. But it is not yet known if and how generative AI will change design workflows in the long run.
Vibe Coding & Vibe Designing
A related concept is “vibe coding”. Earlier this year (2025), the term was coined by Andrej Karpathy, co-founder of OpenAI. Vibe coding means using AI-assisted coding tools to develop software through natural language: for example, telling Claude Code or Cursor to write a piece of code for you.
Then, a few months later, in July 2025, Figma released Figma Make to all users: an AI-driven tool that lets you “vibe design” using natural language descriptions. Figma Make is available to anyone with a paid account or an educational account (e.g. design students).
What does this mean for the future?
Similarly to vibe coding, I think vibe designing might change what designers’ workflows look like in the future, especially by blurring the distinction between front-end developers and UI designers. I believe future UI designers or front-end developers, equipped with these tools and some additional education in front-end development or design, could bridge the gap and own the end-to-end user interface design and development process.
I’m not saying either of these roles will magically disappear, but I don’t see why a combined job role wouldn’t be possible, especially with the help of these new tools.
Advantages of Vibe Designing
Vibe designing could allow designers to create much more compelling prototypes, as Emmet Connolly argued in his talk at Hatch conference. Instead of showcasing 50 flat frames in Figma, they could generate functional prototypes of their ideas.
Using generative AI in the design process could also help designers iterate faster, especially in the earlier stages of the design process, and might encourage more creative exploration.
Vibe designing could also introduce more designers to coding, making the development of functional prototypes in code feel much less off-limits.
Risks of Vibe Designing
There are of course also risks with using vibe coding for design, as research on how GenAI is used in design practice indicates.
Using generative AI in design workflows could potentially lead to:
- Design fixation – Using AI to design might “lock” designers into a specific idea or path
- Generic output – GenAI might produce highly standardized, bare-bones prototypes that lack creative character
- Unintended biases and cultural insensitivity – Overrelying on AI without critical thinking might lead to the adoption of AI-generated designs that perpetuate unintended biases or cultural insensitivities
- Loss of creative skills and critical thinking abilities – Long-term overreliance on AI might erode human design skills and critical thinking abilities if we become lazy in our creative work
Guidelines for using generative AI in design workflows
As a part of my job as a design lecturer, I am currently developing guidelines for using generative AI in design workflows. These are mostly developed for students, but can be applicable for practitioners as well.
1. When to use AI in the Design Process
Use AI to assist in – not replace – your own creativity, judgment, and empathy throughout the design cycle.
Tools should primarily support faster early-stage design (e.g. initial market research, ideation, developing briefs, personas, and storyboards) and selectively assist in later stages (e.g. evaluation).
2. How to use
Tips – Match the AI tool to the design task. You can:
- Use text-based tools (e.g. ChatGPT and Claude) for developing briefs, plans, and personas. But treat any AI-generated (synthetic) data as a temporary placeholder – not evidence – until you have validated it with real users. Be critical of possible hallucinations.
- Use visual tools (e.g. Figma Make) for quick UI ideation. Make sure that the AI outputs you adopt fit the bigger context of your project, such as the overall visual design direction, your intended user groups, and the goals of the design.
- Use code-based tools (e.g. Claude Code and Cursor) for creating functional prototypes. You can design your prototypes manually first, then use vibe-coding tools to quickly generate functional versions that you can share with others or test with users (see the sketch after this list).
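To give a sense of what such a functional prototype can look like, here is a small React + TypeScript component of the kind a vibe-coding tool might produce from a prompt like “a sign-up form with inline email validation”. The component name and validation rule are hypothetical; treat it as a sketch, not a reference implementation.

```tsx
// Illustrative only: the kind of small functional prototype a vibe-coding
// tool might generate from a short natural-language prompt. The component
// name and validation rule are hypothetical examples.
import { useState } from "react";

export function SignupPrototype() {
  const [email, setEmail] = useState("");
  // Very rough email check, enough for a clickable prototype
  const isValid = /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email);

  return (
    <form onSubmit={(e) => e.preventDefault()}>
      <label>
        Email
        <input
          type="email"
          value={email}
          onChange={(e) => setEmail(e.target.value)}
        />
      </label>
      {/* Inline feedback makes the prototype feel real in user tests */}
      {email.length > 0 && !isValid && <p>Please enter a valid email.</p>}
      <button type="submit" disabled={!isValid}>
        Sign up
      </button>
    </form>
  );
}
```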
3. How to prompt
- Treat prompts as design artifacts: define the inputs (context, constraints, data), system instructions, output format, a few examples, and safety criteria (accessibility, bias, inclusivity). Keep versions of the prompts you use (one possible structure is sketched after this list).
- Provide multimodal inputs (e.g. text assets, images, sketches, screenshots, flow diagrams) in your prompts to improve specificity, add context, and reduce textual ambiguity.
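One possible way to treat a prompt as a versioned design artifact is to capture its parts in a small structured record. The interface and field names below are my own suggestion, not a standard, so adapt them to your project.

```typescript
// A sketch of one way to keep prompts as versioned design artifacts.
// The interface and field names are suggestions, not an established format.
interface DesignPrompt {
  version: string;            // e.g. "v3"; keep old versions under version control
  systemInstructions: string; // role, tone, and constraints for the model
  context: string;            // project background, target users, brand
  constraints: string[];      // platform, content, and scope rules
  outputFormat: string;       // what you expect back
  examples: string[];         // few-shot examples or reference screens
  safetyCriteria: string[];   // accessibility, bias, and inclusivity checks
}

export const onboardingPrompt: DesignPrompt = {
  version: "v3",
  systemInstructions: "You are assisting a UI designer; follow WCAG 2.2 AA.",
  context: "Mobile onboarding flow for a budgeting app aimed at students.",
  constraints: ["3 screens max", "no dark patterns", "works with screen readers"],
  outputFormat: "One React component per screen, TypeScript, no backend.",
  examples: ["Attached screenshot of the current sign-up screen."],
  safetyCriteria: ["contrast of at least 4.5:1", "inclusive imagery", "plain language"],
};
```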
4. What to think about
- For AI-generated designs, check accessibility (e.g. color contrast, font size and legibility), representational bias, cultural sensitivity, and inclusivity before adoption. Revise or discard outputs as needed; you can always generate a new version (a small contrast-checking helper is sketched after this list).
- Actively critique AI outputs; strive for originality and situated appropriateness rather than simply accepting what the AI produces.
- Reduce design fixation by rotating between different types of AI tools, prompts, and settings, and add non-AI (manual) ideation blocks to the design process.
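For the color-contrast part of that checklist, the check can even be automated. Below is a small helper that computes the WCAG 2.x contrast ratio between two sRGB colors; the 4.5:1 threshold mentioned in the comment applies to normal body text at level AA.

```typescript
// WCAG 2.x contrast ratio between two sRGB colors, given as [r, g, b] in 0-255.
// A ratio of at least 4.5:1 is required for normal body text (level AA).
function channelToLinear(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance([r, g, b]: [number, number, number]): number {
  return (
    0.2126 * channelToLinear(r) +
    0.7152 * channelToLinear(g) +
    0.0722 * channelToLinear(b)
  );
}

export function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 > l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: grey text (#767676) on white is roughly 4.54:1, just passing AA.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));
```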
5. When to AVOID using generative AI in the Design Process
- Avoid using AI for critical decisions, evaluations, or final aesthetics; rely on your own judgement for selections and evaluations.
- Do not skip user studies. AI can produce false, misleading, or low-quality outputs that do not fit the context.
- Avoid using AI in any way that compromises safety or privacy
- Avoid uploading sensitive data, confidential materials, or personally identifiable information – whether your own, your classmates’, or that of participants in any user studies or interviews
- Align practice with responsible AI policies, accessibility standards, and applicable regulations (e.g., EU AI Act principles).
- Use institution-approved tools
6. What to document
If you use AI, document and disclose in your hand-ins when and how GenAI was used in the coursework. Mention:
- Tools used
- Prompt(s) used
- The rationale behind your own judgement and validation. For example:
- What did AI produce, and what did you accept, edit, or reject? Why?
- What parts of the AI generated design did you iterate yourself? Why?
- Which parts of the design are primarily AI generated and which parts are primarily manually designed?
- What aspects of accessibility did you check and did you make any manual changes?
- Any interesting relevant side-by-side comparisons of AI vs. human iterations?
- Reflect on: What did you learn? When was AI used appropriately, and when was it not? How did you perceive the quality of the AI-generated outputs? Did you consider any ethical issues?
If you use AI generated design assets in your final work, think about:
- Do you have the rights to use the output? Attribute relevant sources and check for copyright/licensing.
TL;DR: I believe tools such as Figma Make are going to lead to vibe designing. It’s like vibe coding, but for design.
I will update this blog with new posts as I continue to explore this topic. I hope you enjoyed this one, and let me know what you think about the future of designing with AI!