
In This Episode
Part of our ongoing series testing whether AI can actually handle UX jobs.
We're back with our AI tools testing series! After hitting some technical snags with the first episode on UX research, we jumped straight into testing AI tools for UX writing, and the results were eye-opening.
What We Tested
We started with 4 tools—Claude, Writesonic, Frontitude, and Copy.ai—but quickly narrowed it down when two proved unusable for serious UX writing work.
The Verdict
The Frontitude plugin came out on top for UX writing tasks, showing real promise for the field. But here's the catch: even the best tool still needs human oversight and fine-tuning before its output is ready to ship.
The Big Questions
Chapters
(00:00) Intro and catch-up
(07:38) Product thumbs up/down
(24:59) Onboarding the 4 AI tools
(40:51) Understanding marketing and UX writing
(42:08) Copy.ai's confusing onboarding flow
(50:05) Testing error scenarios with Claude
(1:00:41) Testing error scenarios with Frontitude
(1:07:12) Comparing both on the error scenario
(1:12:41) Testing navigation labels with Claude
(1:15:35) Why AI cannot replace human designers just yet
(1:20:43) Testing navigation labels with Frontitude
(1:23:00) Comparing both on the navigation label scenario
(1:24:17) Ranking the AI tools for UX writing
(1:27:00) The role of UX writers in the industry
(1:30:34) Adapting to AI as a UX writer
(1:36:36) The future