
Crazy Eights Fun

 

Recently, my team set out to redesign a portion of an SPA (single-page application) that we had already spent time designing and reviewing. The company strategy had changed considerably since our first go-around, so we needed to iterate on the design. To kick-start our ideation efforts, I led the team through Crazy Eights, a collaborative activity that stimulates idea generation.

Our workshop began with each participant folding a piece of paper into 8 squares; then we set a timer and each of us spent 8 minutes sketching 8 new ideas, 1 per square.

Our Crazy Eights activity yielded a few strong concepts in a very short amount of time. It bonded us as designers and problem solvers as we shared our "eights" with each other. And it helped us reject ideas very quickly, too. Arguably, though, most of the activity's value came from the analysis portion, when we expounded on our ideas and negotiated their merits. There, we got to rethink our own ideas while also giving feedback on our teammates'.

Have you ever tried Crazy Eights? If not, I suggest you do. The entire workshop can be completed in less than 30 minutes while providing a launchpad for team bonding and idea generation.

SUGGESTED READING
Yael Levey. "How to: Run a Crazy Eights exercise to generate design ideas"


How to Calculate and Interpret an NPS Score

Imagine you've conducted a survey of your users asking them how likely they are to recommend your product, on a scale of 0 to 10. Now you have those results and you'd like to make sense of them. This how-to will explain how to calculate your NPS score, and how to interpret it.
The following chart serves as a reminder of the NPS categories:

  • 0-6: Detractors
  • 7-8: Passives
  • 9-10: Promoters

Here is the formula you will apply to determine the NPS score:

NPS = % of promoters - % of detractors
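
To compute this programmatically, here is a minimal Python sketch; the function name and the sample responses below are my own hypothetical illustration, assuming your raw survey data is a list of 0-10 ratings:

    def calculate_nps(responses):
        """Return the Net Promoter Score for a list of 0-10 ratings."""
        total = len(responses)
        promoters = sum(1 for r in responses if r >= 9)   # 9-10: Promoters
        detractors = sum(1 for r in responses if r <= 6)  # 0-6: Detractors
        # Passives (7-8) count toward the total but not toward the score.
        return 100 * (promoters - detractors) / total

    responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]  # hypothetical survey results
    print(calculate_nps(responses))  # 5 promoters, 2 detractors -> 30.0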

And to help you understand how your product is doing, consider the following rubric:

  • 0 or below: your product is failing in terms of customer satisfaction
  • 0-30: your product is doing "quite well" but probably needs improvement
  • 30-70: your product is doing great in terms of customer satisfaction
  • 70-100: most of your customers are brand advocates
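
As a rough sketch, the rubric above could be expressed as a simple lookup in Python; note that the rubric's ranges overlap at their boundaries (0, 30, 70), so the tie-breaking below is my own assumption:

    def interpret_nps(score):
        """Map an NPS score (-100 to 100) to the rubric above."""
        if score <= 0:
            return "failing in terms of customer satisfaction"
        if score <= 30:
            return "doing quite well, but probably needs improvement"
        if score <= 70:
            return "great in terms of customer satisfaction"
        return "most of your customers are brand advocates"

    print(interpret_nps(30))  # doing quite well, but probably needs improvement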

CITE:

  1. "9 Practical Tips for an Effective NPS Data Analysis and Reporting." Retently (blog). https://www.retently.com/blog/nps-data-analysis-reporting/
  2. Jennifer Rowe. "Analyzing your Net Promoter Score℠ survey results (Professional Add-on and Enterprise Add-on)." Zendesk (support article). https://support.zendesk.com/hc/en-us/articles/203981113-Analyzing-your-Net-Promoter-Score-survey-results-Professional-Add-on-and-Enterprise-Add-on-


Compare and Contrast – A Technique for Better Specs

SUMMARY

By providing compare-and-contrast opportunities in our specs, we infuse UX into the UX/UI handoff experience.

When we provide specs for a product enhancement, the developers are already familiar with the product (assuming they built the original).

Since they are familiar with the current product, an annotated compare-and-contrast spec makes the desired changes easier to comprehend. We work with their current mental model and gently usher them into the new design. The annotations call out what has changed, and the side-by-side presentation allows them to

  1. not work from memory, and
  2. be able to quickly identify the differences visually.

Compare and contrast is a teaching strategy employed at all levels of curriculum because it is so effective at scaffolding knowledge. By bringing this technique into software development, we can make the process more delightful for everybody involved - especially the engineering staff that needs to interpret the specifications, but also the stakeholders who need to understand and sign off on designs before they move to the development stage.

Since employing this technique in my specs, I find that there is much less back-and-forth, much less “mansplaining,” and simply a lot more calm around the design handoff process. If you haven’t tried it yet, I strongly suggest you do.


App Store Descriptions are Gifts to UX Designers

When I’m starting to research a new client’s digital product, one of my favorite places to go is the app stores. This is where companies have an opportunity to identify their product in the most concise terms possible.

Let’s take a look at how Instagram describes itself in the app stores: ‘Over 300 million users love Instagram! … It’s a simple way to capture and share the world’s moments on your phone … customize … transform … share your photos and videos … follow … Facebook, Twitter … ’ [note: I’ve added the ellipses to cull the key words].

If we view their marketing copy on their website, it is different. It includes their slogan: ‘Capture and Share the World's Moments’; but it doesn’t emphasize simplicity or the number of users. The keywords here are: fast, beautiful, fun, easy, free, Facebook, Twitter.

It’s hard to argue with a company as successful as Instagram. We can assume that they market their product differently on the web vs in the app store for a reason, probably based on analytics. With less prestigious companies, however, such as newer startups, I have found that such differences are not intentional but mistakes. For example, an LA startup I worked with had contrasting copy in their Android vs iOS descriptions, which revealed several user experience holes in the company’s product.

When a company is inconsistent with its marketing copy, it reveals a truth about the company: the organization is fractured, and one hand doesn’t know what the other is doing. For example, the marketing team isn’t aligned with the engineering or product teams, and thus they are chasing different goals. Ultimately, it’s the user who pays for these inefficiencies. An analogy is a child whose parents disagree: one parent says so-and-so is bad, whereas the other says so-and-so is fine. The child ends up confused as to which message is the correct one. Similarly, if a company sends mixed messages to its users, those users will wind up confused and/or miss out on understanding key features of the product.

For a user experience designer, these kinds of mistakes are opportunities to do good - to help a company align its message. We can find out who is responsible for writing the copy; for posting the copy to the app store; and for driving the strategy behind the copy. And then we unite those people through documentation - maybe just emails - but ideally, through healthy discussions, whiteboarding and Google Docs. Soon, the message will become succinct; teams will become unified; the design will reflect that unified message - and users will reap the benefits.

Lastly, I’d like to emphasize how much a unified message can affect a company’s revenue. If a company's message is unified, it will be reflected in how its users talk about the product. As in the game of telephone, if the message is simple, it will remain intact after countless people have repeated it. The same goes for a digital product's message - it has more of a chance of going viral if it's succinct - and, of course, reflective of a succinct product.


Focus Groups vs Usability Testing, in brief

I was at a General Assembly social event last evening, and someone asked me if I could explain UX to him. Part of our conversation involved user testing, at which point he suggested that focus groups sounded like the same thing as user testing. I explained that they are different: focus groups are usually marketing-driven, while usability testing is led by UX designers and conducted one-on-one.

Below is a summary comparison, culled from Steve Krug's 'Don't Make Me Think', which sits on my coffee table.

Focus groups are best conducted in the planning stages of designing a product or feature. A focus group typically involves 5-10 people talking about their feelings and how they might feel doing such and such, so focus groups help define what a product might be - and, to some extent, whether it should even be designed in the first place. Most often, marketing leads these research sessions.

Usability testing is best conducted throughout the entire life cycle of the product - before, during, and after things are built. It involves 1 person using - or attempting to use - the product. This form of testing helps us see where users get stuck when using our product.


A quote on Agile

If it's not negotiable it's not agile. — Anonymous, at allaboutagile.com