
I recently attended a Belfast Design event with a talk from Maria McGinn on “What GenAI Reveals About How We See UX Design and Where Designers Need to Evolve Next”. A lot of the talk was about using AI to build product, which is something I’m interested in with the rise of vibe-coding. Maria raised something I hadn’t really considered before: we should be prompting with the problem, rather than the solution. At the moment, I find myself describing to AI what a tool should be and how it should work, but really, I should be telling it why I want it to do these things.
AI is bad at recreating something from your mind, but good at solving a problem from your mind. I’ve noticed that whenever I try to use AI to add styling to a product, if I’m too specific, it will misinterpret what I’m saying. This happens for two reasons:
Really, I should be giving AI the base building blocks to apply styling with and let it do the detailed thinking. Similarly, when designing the UX of a product, we should be giving it the base building blocks in the form of the problem it will solve, rather than a list of features. This lets AI determine the best features to include in the solution.
Now that I think about it, this makes a lot of sense. This is how we should be approaching design ourselves, even without AI: not focussing on features but on solving problems. I think it will be really interesting to see how the role of the designer changes because of this; I could imagine a future that is more focussed on defining problems than on creating features.