User testing and user feedback

Over the past two years, I’ve tried some relatively unconventional strategies to user-test Talk Hiring. I’m proud to have built a product simple enough that people who claim to be “bad at computers” can use it without trouble.

First off, a book.

I recommend The Mom Test as a good primer on how to elicit feedback from users without biasing them. Or, as the author describes it: “how to talk to customers & learn if your business is a good idea when everyone is lying to you.” At the bottom of the book’s website, you can enter your email and get some of the content for free.

Face-to-face guerrilla testing strategies:

Train stations. I’m not talking about Penn Station, Grand Central, or the like. I would go to my local suburban train station, which has two tracks. Commuters waiting on the platform are about as captive an audience as they come. Plus, with published train schedules, you can figure out exactly how long they’ll be waiting. I would head over when the trains to New York Penn Station and Hoboken were both stopping there within 15 minutes of each other, to maximize foot traffic. It was pretty easy to walk the platform and find people willing to try out a new product, many of them for free. Once I moved to the Bronx, I did the same at the local Fordham train station.

Hiring events. Talk Hiring helps people improve their interviewing skills, so naturally, we found good user testing subjects at hiring events. Specifically, I targeted very busy hiring events where job seekers had to wait to talk to employers. Suburban job fairs are relatively empty compared to NYC job fairs, where the lines can easily snake around the block for the chance to talk to a handful of employers. I would walk the line and ask people if they wanted to test out whatever I was working on. People waiting on line at a job fair feel a combination of anxiety and boredom, and many welcomed the chance to try something new.

Stopping people on the street. This did not work well at all for me. Some may have more luck with it, but I found that people hate the intrusion because they’re always in the middle of doing something or getting somewhere. In my neighborhood of the Bronx (Morris Park), I also found that many people didn’t speak English. The Bronx has a Hispanic majority, but I suspect some people used this as an excuse to avoid me.

Paying people for participating in user tests. Carrying a bunch of cash in my back pocket seemed unsafe, so instead I kept an assortment of Amazon and Uber gift cards in different denominations in my backpack. In NYC, some of the older folks didn’t really use Amazon, but they did use Uber to get around in emergencies.

Online user testing strategies:

Userbob. If you have something online that people can test from home, Userbob is the most affordable online user testing service I’ve found. It charges $1 per minute and recruits the study participants for you. If I wanted quick feedback on something I was building, I could spend $25 and have 5 people spend 5 minutes each testing it out, usually within a day. Time increments are flexible down to the minute, and Userbob shares screen and audio recordings of each participant working through the test.

Collecting user feedback:

Quick Zoom screen share calls with users as they encounter problems. A few months ago, we integrated live chat into Talk Hiring. When a user reaches out about a problem (and I have a free moment), I try to hop on a Zoom call with them ASAP. This has been incredibly helpful: I can see exactly what is confusing or broken for the user, ask follow-up questions, and strategize about which product changes could help future users. I’ve made numerous small changes to our product because of these quick screen share calls. I learn a lot more from a screen share than from a live chat conversation.

Weekly calls with pilot users. When Talk Hiring was in its early stages, we ran pilots with job training programs. One of the requirements for running a pilot was a weekly phone call with someone at the program, which gave us quick, constant feedback on what was working and what wasn’t. I also used these calls to talk through the bugs we had fixed and the features we had shipped in the past week. Users love hearing when you build something that was their idea.

User surveys. After every mock interview, users can fill out an optional three-question survey. We automatically tag each response with the specific user and career readiness program it came from, which lets us look for trends in our feedback both across organizations and within them. This is pretty common stuff, so I won’t go into too much detail here.
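The tagging-and-trends idea above is simple enough to sketch in a few lines. This is not Talk Hiring’s actual code; the field names, program names, and ratings below are hypothetical, just to show how tagged survey responses can be grouped by program and averaged to surface per-organization trends:

```python
from collections import defaultdict

# Hypothetical survey responses, each tagged with the career
# readiness program the user came from (names and ratings are made up).
responses = [
    {"program": "Program A", "rating": 4},
    {"program": "Program A", "rating": 5},
    {"program": "Program B", "rating": 3},
    {"program": "Program B", "rating": 2},
    {"program": "Program B", "rating": 4},
]

def average_rating_by_program(responses):
    """Group tagged feedback by program and average the ratings."""
    totals = defaultdict(lambda: [0, 0])  # program -> [rating sum, count]
    for r in responses:
        totals[r["program"]][0] += r["rating"]
        totals[r["program"]][1] += 1
    return {prog: s / c for prog, (s, c) in totals.items()}

print(average_rating_by_program(responses))
# {'Program A': 4.5, 'Program B': 3.0}
```

The same grouping works for spotting within-organization trends: swap the grouping key from program to question, cohort, or time period.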