How To Test And Measure Content In UX

Vitaly Friedman · 2025-02-13
Content testing is a simple way to test the clarity and comprehensibility of the content on a page — be it a paragraph of text, a user flow, a dashboard, or anything in between. Our goal is to understand how well users actually perceive the content we present to them.

It’s not only about finding pain points and things that cause confusion or keep users from finding the right answer on a page, but also about whether our content clearly and precisely articulates what we actually want to communicate.
This article is part of our ongoing series on UX. You can find more details on design patterns and UX strategy in Smart Interface Design Patterns 🍣 — with live UX training coming up soon. Free preview.

Banana Testing
Figure: Banana Testing: Replace all key actions with the word “Banana,” then ask users to suggest what each one could be.
A great way to test how well your design matches a user’s mental model is Banana Testing: we replace all key actions with the word “Banana,” then ask users to suggest what each action could trigger.

Not only does it tell you whether key actions are understood immediately and placed where users expect them, but also whether your icons are helpful and whether interactive elements such as links and buttons are perceived as such.
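If your prototype runs in the browser, one quick way to set this up is a small script that swaps the labels before the session starts. The snippet below is only a rough sketch of that idea; the bananify helper, the selector list, and the data-original-label attribute are illustrative names, not part of any testing tool.

```typescript
// Rough sketch: replace every visible action label in a prototype with "Banana"
// so that participants have to infer each action from context alone.
// Run it against a prototype or test build, never against production.
function bananify(root: Document = document): void {
  const actions = root.querySelectorAll<HTMLElement>(
    "button, a, [role='button'], input[type='submit']"
  );
  actions.forEach((el) => {
    // Keep the original label around so the page can be restored after the session.
    el.dataset.originalLabel =
      el instanceof HTMLInputElement ? el.value : el.textContent ?? "";
    if (el instanceof HTMLInputElement) {
      el.value = "Banana";
    } else {
      el.textContent = "Banana";
    }
  });
}

bananify();
```

Then ask participants to walk through the page and explain what they expect each “Banana” to do; wherever their guesses diverge from the real action, the label, icon, or placement needs another look.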
Content Heatmapping
Figure: Content heatmapping, a simple technique to evaluate content and how well it performs.
One reliable technique to assess content is content heatmapping. We give participants a task, then ask them to highlight things that are clear or confusing. We could define any other dimensions or lenses as well — e.g., phrases that inspire more or less confidence.

Then we map all highlights into a heatmap to identify patterns and trends. You could run it with print-outs in person, but it could also happen remotely in FigJam or Miro — as long as your tool of choice has a highlighter feature.
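The aggregation step is easy to script once the highlights come out of your tool as structured data. Below is a minimal sketch, assuming each participant’s highlights are exported as a list of sentence indices with a “clear” or “confusing” mark; the buildHeatmap function and the data shape are made up for illustration.

```typescript
// Minimal sketch: roll per-participant highlights up into a content heatmap.
// Each highlight records which sentence was marked and whether it was marked
// as clear or as confusing; we count the marks per sentence across sessions.
type Mark = "clear" | "confusing";

interface Highlight {
  sentenceIndex: number; // position of the sentence on the page, 0-based
  mark: Mark;
}

function buildHeatmap(sessions: Highlight[][]): Map<number, Record<Mark, number>> {
  const heat = new Map<number, Record<Mark, number>>();
  for (const session of sessions) {
    for (const { sentenceIndex, mark } of session) {
      const counts = heat.get(sentenceIndex) ?? { clear: 0, confusing: 0 };
      counts[mark] += 1;
      heat.set(sentenceIndex, counts);
    }
  }
  return heat;
}

// Three participants highlighting a five-sentence page:
const heatmap = buildHeatmap([
  [{ sentenceIndex: 2, mark: "confusing" }],
  [{ sentenceIndex: 2, mark: "confusing" }, { sentenceIndex: 0, mark: "clear" }],
  [{ sentenceIndex: 4, mark: "clear" }],
]);
console.log(heatmap.get(2)); // { clear: 0, confusing: 2 }
```

Sentences that collect “confusing” marks from several participants are usually the first candidates for a rewrite.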
Run Moderated Testing Sessions

The little techniques above help you discover content issues, but they don’t tell you what is missing in the content, or what doubts, concerns, and issues users have with it. For that, we need to uncover user needs in more detail.

Too often, users say that a page is “clear and well-organized,” but when you ask them specific questions, you notice that their understanding is vastly different from what you were trying to bring into the spotlight.

Such insights rarely surface in unmoderated sessions — it’s much more effective to observe behavior and ask questions on the spot, be it in person or remotely.
Test Concepts, Not Words
Figure: Removing doubts before they happen by front-loading key details.
Before testing, we need to know what we want to learn. First, write up a plan with goals, customers, questions, and a script. Don’t tweak words alone — testing broader concepts is better. In the session, avoid asking users to read aloud, as that’s usually not how people consume content. Ask questions and wait silently.

After the task is completed, ask users to explain the product, flow, and concepts to you. But don’t ask them what they like, prefer, feel, or think. And whenever possible, avoid the word “content” in testing, as users often perceive it differently.

Choosing The Right Way To Test
There are plenty of different tests that you could use:
- Banana test 🍌
  Replace key actions with “bananas,” ask users to explain them.
- Cloze test 🕳️
  Remove words from your copy, ask users to fill in the blanks (a small generator sketch follows this list).
- Reaction cards 🤔
  Write up emotions on 25 cards, ask users to choose.
- Card sorting 🃏
  Ask users to group topics into meaningful categories.
- Highlighting 🖍️
  Ask users to highlight helpful or confusing words.
- Competitive testing 🥊
  Ask users to explain competitors’ pages.
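For the Cloze test, the material can be generated straight from your existing copy. The sketch below blanks out every fifth word and keeps the removed words as an answer key; makeClozeTest is a hypothetical helper, and the every-fifth-word interval is a common default rather than a fixed rule.

```typescript
// Rough sketch: build a Cloze test from existing copy by blanking every nth word.
// The removed words are returned separately as the answer key.
function makeClozeTest(copy: string, every = 5): { text: string; answers: string[] } {
  const answers: string[] = [];
  const words = copy.split(/\s+/);
  const text = words
    .map((word, i) => {
      if ((i + 1) % every === 0) {
        answers.push(word);
        return "_____";
      }
      return word;
    })
    .join(" ");
  return { text, answers };
}

const cloze = makeClozeTest(
  "Content testing is a simple way to test the clarity and understanding of the content on a page."
);
console.log(cloze.text);    // "Content testing is a _____ way to test the _____ and understanding of the _____ on a page."
console.log(cloze.answers); // ["simple", "clarity", "content"]
```

Participants fill in the blanks, and the share of correctly or sensibly recovered words gives you a rough sense of how predictable and clear the copy is.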
When choosing the right way to test, consider the following guidelines:
- Do users understand?
  Interviews, highlighting, Cloze test
- Do we match the mental model?
  Banana testing, Cloze test
- What word works best?
  Card sorting, A/B testing, tree testing
- Why doesn’t it work?
  Interviews, highlighting, walkthroughs
- Do we know user needs?
  Competitive testing, process mapping

Wrapping Up
In many tasks, there is rarely anything more impactful than the careful selection of words on a page. However, it’s not only the words themselves that matter but also the voice and tone you choose to communicate with customers.

Use the techniques above to test and measure how well people perceive your content, but also check how they perceive the end-to-end experience on the site.

Quite often, the right words used incorrectly on a key page convey the wrong message or create a suboptimal experience. Even if the rest of the product performs remarkably well, a user who gets blocked on a critical page will be gone before you even blink.

Useful Resources

- Practical Guide To Content Testing, by Intuit
- How To Test Content With Users, by Kate Moran
- Five Fun Ways To Test Words, by John Saito
- A Simple Technique For Evaluating Content, by Pete Gale
New: How To Measure UX And Design Impact
Meet Measure UX & Design Impact (8h), a new practical guide for designers and UX leads on how to measure and show your UX impact on business. Use the code 🎟 IMPACT to save 20% off today. Jump to the details.