Ask Your Target Market

Get inside the minds of your target market with AYTM's intuitive online survey tool - the first affordable Internet survey software with a built-in consumer panel. Use the online survey creator to build your survey, define the criteria of your target market, and watch the results begin to pour in immediately.

Tuesday, June 7, 2011

Survey Design Manners: Remembering to Say Please & Thank You

When designing a survey with AYTM, you have the option of scripting a personalized introduction in text or with an opening video. You can also finish the survey with a text message. Why is this helpful?  Because our goal is to make the survey experience as pleasant and engaging as possible for our respondents.  What better way to engage respondents than to welcome them with a non-boilerplate message? Imagine how a personal video intro from you can pique their interest and help them willingly open the door to their insights.  What better way to close the survey experience than with an equally pleasing thank you?

Let’s look at some examples.


The introduction to a customer feedback survey could be something like, “Please complete this very brief survey so that we can better serve you in the future.”  Or how about this: “We’d love your feedback on your recent experience with our company. Your feedback, along with that of other valued clients, will help us make ongoing improvements.” 

These openings tell the survey participants why the survey is being done. They also make it clear what’s really in it for them: better service or improved products in the future. 

You can even go a little further and use the introduction message to encourage candid feedback. Consider saying something like, “We’re going to ask you some questions about your experience with our company.  We truly value your candid feedback.”  Simply saying that you want frank, honest feedback encourages them to comply.

What about the closing message? It’s a great opportunity to remind them that their time was well spent and that you appreciate their help. Consider these options:

  • “Thank you for your feedback. We truly appreciate your time.”
  • “Thank you for completing our customer feedback survey. Your responses will help us to better serve you in the future.”


Maybe you have an idea for a new software application and you’re going to ask your target market some questions about possible feature requirements.  With such a survey you might have an introduction like, “We’re going to ask you several questions about possible features for a new software application. Please note that we want your honest feedback—there are no wrong answers.” Short, relevant, and sends the right message.

If you are surveying your own list and you want to add some extra credibility to the request, you can do so like this: “This research is being sponsored by our Vice President of Product Development, Jane Jones. Jane and her team are eager for honest feedback about several feature ideas currently under consideration.” Adding the executive’s name creates a sense of urgency, and makes it clear that the research really is important.

What about the customized closing text?  Now that they have invested four or five minutes of their time answering your questions, you want to close on a high note.  You can simply say thank you, which is perfectly fine, or you can go a bit further:

  • “This survey was sent to you by Joe Jones of Brand X.  For more information on the survey and how the results will be used, please contact Joe at” 
  • “Thanks for your time. Your input will help guide new product development at Brand X. If you have any questions, please call 999.888.7777, ext 111.”
The text box also supports hyperlinks, so you could even offer them a link to a page with additional information.

One Warning
Never use a survey for the purpose of lead generation.  Mixing research with sales is spammy and widely considered unethical. 


Is Customized Text Required?
No. But it helps, especially if you are using the AYTM platform to survey your own customers or other in-house lists. If you repeatedly ask your list members to take surveys and provide their “valuable feedback” but it feels impersonal or bland, they will get turned off. If the research is important, use customized text that reflects this.

Bottom line: use customized introductions and closing text to tell people that you want their candid feedback, to thank them for their help, and to let them know that you plan to use the data for something important.  And besides, it’s just a nice way to say “please” and “thank you.”

Thursday, June 2, 2011

Avoiding Bias in Survey Design

Ever been at a friend’s for dinner and complained the food was under-cooked or over-salted?
Of course not. People are polite. And while that makes for pleasant dinner conversation, it also makes for misleading research results.

“Politeness” is just one of the factors that can lead to weak survey data. So when we are writing surveys, we need to do everything possible to get honest, objective data—even if it’s “bad news.”

Ask For It
In the survey invitation text and opening screen, tell the survey participant that you value their honest opinions. Let them know you want candid feedback.  And remind them that their answers are anonymous (to imply that even if they say negative things about your brand or products, there will be no repercussions).

Here is an example of such text, as written for a magazine publisher’s survey:
“We value your opinion, and would like your candid feedback on our recent redesign. Your responses will be kept strictly anonymous and will only be used in aggregate with that of other subscribers.”

Avoid Over-reliance on Agree Scales
A common approach is to craft a series of statements followed by a scale like this one:
  • Strongly Disagree
  • Disagree
  • Neutral
  • Agree
  • Strongly Agree

While this is certainly fine in some cases, it can also be a crutch. And it can lead to overly polite responses.

Consider a case where we want to measure behaviors from people who have recently purchased laptop computers. We might ask:
 “Please indicate your agreement with the following statements. Considering your most recent purchase of a laptop computer:
  • I asked friends or family members for advice.
  • I researched online before making a purchase.
  • The brand of laptop was an important selection criterion to me.
  • The processor speed was an important selection criterion to me.”

For this example, the agreement scale would be shown with each item, or in a grid-style layout.
Research suggests that people will be inclined to agree with such statements more than is actually true. One can debate whether this is due to politeness or convenience, but it doesn’t really matter: we know it happens.  So while it can be fine to use some agreement-scale questions, don’t overuse them. Instead consider “Importance” scales or other options. [If you want more information about this specific issue, read this recent article from Vovici’s Jeffrey Henning:]

Avoid Leading Questions
It’s the classic husband’s dilemma: the wife twirls in a new outfit and then asks, “Does this make me look fat?” He knows what answer she expects, and he’s going to give it to her.
Sometimes survey questions are also leading. Maybe not quite so blatantly, but still.
Which of these options do you think is best?
Option A: “Do you think our new web store is easier to use than our previous one?”, followed by an agreement scale.
Option B: “How does our new web store compare to our previous version?”
  • It’s easier to use
  • The ease of use is the same
  • It’s more difficult to use
  • No opinion

The correct answer is that Option B is less leading. Not just because it doesn’t use an agreement scale, but also because the question itself is more objective.

Randomize Answer Options
Often in surveys we have a list of answer options. Here’s a simple example:
Which of the following types of stores have you shopped in during the past 7 days?
  • “Dollar” store
  • Clothing (Women’s)
  • Clothing (Men’s)
  • Convenience
  • Department
  • Electronics
  • Grocery
  • None of these

If every survey taker sees this list in the same order, there is a risk that the responses will skew toward the items at the top, because survey takers tend to focus more on the top of a list than on the bottom. What’s the solution? Randomize the list. Of course, always anchor logical choices (like “None of these”) at the bottom.
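If you are scripting a survey programmatically rather than using a survey builder, the randomize-with-an-anchor idea can be sketched in a few lines. This is a minimal illustration (the function name and option list are hypothetical, not part of any AYTM API): shuffle the substantive choices, then append the anchored ones in their original order.

```python
import random

def randomized_options(options, anchored=("None of these",)):
    """Shuffle answer options while keeping anchored choices at the bottom."""
    # Shuffle only the substantive choices.
    shuffled = [o for o in options if o not in anchored]
    random.shuffle(shuffled)
    # Append anchored choices (e.g. "None of these") in their original order.
    shuffled += [o for o in options if o in anchored]
    return shuffled

options = [
    '"Dollar" store',
    "Clothing (Women's)",
    "Clothing (Men's)",
    "Convenience",
    "Department",
    "Electronics",
    "Grocery",
    "None of these",
]

# Each respondent gets a fresh ordering; "None of these" always appears last.
print(randomized_options(options))
```

Each call produces a different ordering of the store types, while the anchored choice stays put at the end for every respondent.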

Honest Data = Good Data
Our goal is to make it easy and comfortable for survey participants to give us honest, candid answers. Anything we can do to make that happen will help ensure we get quality data from our research surveys.