Ask Your Target Market


Get inside the minds of your target market with AYTM's intuitive online survey tool - the first affordable Internet survey software with a built-in consumer panel. Use the online survey creator to build your survey, define the criteria of your target market, and watch the results begin to pour in immediately.

Tuesday, June 7, 2011

Survey Design Manners: Remembering to Say Please & Thank You


When designing a survey with AYTM, you have the option of scripting a personalized introduction in text or with an opening video. You can also finish the survey with a text message. Why is this helpful? Because our goal is to make the survey experience as pleasant and engaging as possible for our respondents. What better way to engage respondents than to welcome them with a non-boilerplate message? Imagine how a personal video intro from you can pique their interest and help them willingly open the door to their insights. What better way to close the survey experience than with an equally pleasing thank you?

Let’s look at some examples.

EXAMPLE 1.  CUSTOMER FEEDBACK SURVEY

The introduction to a customer feedback survey could be something like, “Please complete this very brief survey so that we can better serve you in the future.” Or how about this: “We’d love your feedback on your recent experience with our company. Your feedback, along with that of other valued clients, will help us make ongoing improvements.”

These openings tell the survey participants why the survey is being done. They also make it clear what’s really in it for them: better service or improved products in the future. 

You can even go a little further and use the introduction message to encourage candid feedback. Consider saying something like, “We’re going to ask you some questions about your experience with our company. We truly value your candid feedback.” Simply saying that you want frank, honest feedback encourages them to comply.

What about the closing message? It’s a great opportunity to remind them that their time was well spent and that you appreciate their help. Consider these options:

  • “Thank you for your feedback. We truly appreciate your time.”
  • “Thank you for completing our customer feedback survey. Your responses will help us to better serve you in the future.”


EXAMPLE 2.  PRODUCT CONCEPT TESTING SURVEY

Maybe you have an idea for a new software application and you’re going to ask your target market some questions about possible feature requirements. With such a survey you might have an introduction like, “We’re going to ask you several questions about possible features for a new software application. Please note that we want your honest feedback—there are no wrong answers.” It’s short, relevant, and sends the right message.

If you are surveying your own list and you want to add some extra credibility to the request, you can do so like this: “This research is being sponsored by our Vice President of Product Development, Jane Jones. Jane and her team are eager for honest feedback about several feature ideas currently under consideration.” Adding the executive’s name creates a sense of urgency and makes it clear that the research really is important.

What about the customized closing text?  Now that they have invested four or five minutes of their time answering your questions, you want to close on a high note.  You can simply say thank you, which is perfectly fine, or you can go a bit further:

  • “This survey was sent to you by Joe Jones of Brand X.  For more information on the survey and how the results will be used, please contact Joe at JJones@987654321.com.” 
  • “Thanks for your time. Your input will help guide new product development at Brand X. If you have any questions, please call 999.888.7777, ext 111.”
The text box also supports hyperlinks, so you could even offer them a link to a page with additional information.

One Warning
Never use a survey for lead generation. Mixing research with sales is spammy and widely considered unethical.


IS IT NECESSARY TO SAY PLEASE AND THANK YOU?

No. But it helps, especially if you are using the AYTM platform to survey your own customers or other in-house lists. If you repeatedly ask your list members to take surveys and provide their “valuable feedback,” but the request feels impersonal or bland, they will get turned off. If the research is important, use customized text that reflects this.

Bottom line: use customized introductions and closing text to tell people that you want their candid feedback, to thank them for their help, and to let them know that you plan to use the data for something important.  And besides, it’s just a nice way to say “please” and “thank you.”


Thursday, June 2, 2011

Avoiding Bias in Survey Design


Ever been at a friend’s for dinner and complained the food was under-cooked or over-salted?
Of course not. People are polite. And while that makes for pleasant dinner conversation, it also makes for misleading research results.

“Politeness” is just one of the factors that can lead to weak survey data. So when we are writing surveys, we need to do everything possible to get honest, objective data—even if it’s “bad news.”

Ask For It
In the survey invitation text and opening screen, tell the survey participants that you value their honest opinions. Let them know you want candid feedback. And remind them that their answers are anonymous (to imply that even if they say negative things about your brand or products, there will be no repercussions).

Here is an example of such text, as written for a magazine publisher’s survey:
“We value your opinion, and would like your candid feedback on our recent redesign. Your responses will be kept strictly anonymous and will only be used in aggregate with that of other subscribers.”

Avoid Over-reliance on Agree Scales
A common approach is to craft a series of statements followed by a scale like this one:
  • Strongly Disagree
  • Disagree
  • Neutral
  • Agree
  • Strongly Agree

While this is certainly fine in some cases, it can also be a crutch. And it can lead to overly polite responses.

Consider a case where we want to measure behaviors from people who have recently purchased laptop computers. We might ask:
“Please indicate your agreement with the following statements. Considering your most recent purchase of a laptop computer:
  • I asked friends or family members for advice.
  • I researched online before making a purchase.
  • The brand of laptop was an important selection criterion to me.
  • The processor speed was an important selection criterion to me.”

For this example, the agreement scale would be displayed for each item, or in a grid-style display.
Research suggests that people are inclined to agree with such statements more often than their actual behavior warrants. One can debate whether this is due to politeness or convenience, but it doesn’t really matter: we know it happens. So while it can be fine to use some agreement-scale questions, don’t overuse them. Instead, consider “importance” scales or other options. [If you want more information about this specific issue, read this recent article from Vovici’s Jeffrey Henning: http://blog.vovici.com/blog/bid/21978/Acquiescence-Bias-Agree-Disagree-Scale-Best-Practices]

Avoid Leading Questions
It’s the classic husband’s dilemma: the wife twirls in a new outfit and then asks, “Does this make me look fat?” He knows what answer she expects, and he’s going to give it to her.
Sometimes survey questions are also leading. Maybe not quite so blatantly, but still.
Which of these options do you think is best?
Option A: “Do you think our new web store is easier to use than our previous one?”, followed by an agreement scale.
Option B: “How does our new web store compare to our previous version?”
  • It’s easier to use
  • The ease of use is the same
  • It’s more difficult to use
  • No opinion

The correct answer is that Option B is less leading. Not just because it doesn’t use an agreement scale, but also because the question itself is more objective.

Randomize Answer Options
Often in surveys we have a list of answer options. Here’s a simple example:
Which of the following types of stores have you shopped in during the past 7 days?
  • “Dollar” store
  • Clothing (Women’s)
  • Clothing (Men’s)
  • Convenience
  • Department
  • Electronics
  • Grocery
  • None of these

If every survey taker sees this list in the same order, there is a risk that the responses will skew toward the items at the top: survey takers tend to focus more on the top of a list than on the bottom. What’s the solution? Randomize the list. Of course, always anchor logical choices (like “None of these”) at the bottom.
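If you’re curious what’s happening under the hood when a tool randomizes options, here is a minimal Python sketch of the idea (the option list and function name are ours, purely for illustration; this is not AYTM’s actual code):

    import random

    # The store-types question from above, minus the anchored choice.
    options = [
        '"Dollar" store',
        "Clothing (Women's)",
        "Clothing (Men's)",
        "Convenience",
        "Department",
        "Electronics",
        "Grocery",
    ]
    anchored = ["None of these"]  # logical choices stay pinned to the bottom

    def randomized_options(options, anchored):
        """Return a fresh per-respondent ordering with anchored items last."""
        shuffled = options[:]      # copy, so every respondent gets a new shuffle
        random.shuffle(shuffled)   # uniform in-place shuffle
        return shuffled + anchored

    print(randomized_options(options, anchored))

Each respondent gets a different ordering, so any top-of-list bias is spread evenly across the options instead of piling onto the same few items.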

Honest Data = Good Data
Our goal is to make it easy and comfortable for survey participants to give us honest, candid answers. Anything we can do to make that happen will help ensure we get quality data from our research surveys.

Monday, May 30, 2011

When to Use a Consumer Panel Versus In-House Lists


Imagine for a moment that you’re making a trip to, say, Japan. The travel agent has given you two options, and the only significant difference between the two itineraries is where the requisite 12-hour layover occurs. One layover is on an isolated tropical atoll with a charming village; the other is in a bustling city. Both flights will get you there, but which suits your preferences best?

One of the first decisions you’ll make for your AYTM online survey project is where to source your sample. Will you use the AYTM panel or provide your own list?  AYTM is very flexible, so like our imaginary trip, either option will get you there. The choice is entirely up to you. How to choose? Consider the following pros and cons of using our panel versus your own list.

BENEFITS OF USING THE AYTM PANEL
  • Convenience: Because the AYTM panel is integrated with our platform, this is a very efficient way to start data collection. Once your survey is programmed and you’ve selected your population criteria, you’re ready to launch, and data collection is usually done within a day or two.
  • Ample Selects: The other benefit of using AYTM is the large number of selects (we call them “house tags”) in our panel. Not only do we have standard items such as gender, age, income level and so on, we also have 168 other options, such as:
    • Been on a cruise
    • Dog owner
    • Facebook user
    • Gamer
    • Home Wifi
    • Leases car
    • Local voter
    • RV owner

Why are ample selects so important? Because the ability to select specific demographic and psychographic variables ensures that you send your survey to the right target market. 
If AYTM house tags don’t fit your specific need, you can write a custom psychographic prescreening question to control exactly who gets to answer your survey. Since our analogy involves Japan, you could, for instance, survey only those who pass the screening question “Do you eat sashimi at least twice a month?”


BENEFITS OF USING IN-HOUSE LISTS
  • Cost: It may be free, since you won’t have to purchase panel access. Although AYTM panel costs are very low compared to standard market research pricing, it’s still a cost you could avoid by using your own list.
  • Proprietary Information: If you’re interested in surveying a very particular audience (say, your current customers or people who subscribe to your newsletter), then obviously you have those lists and AYTM does not.

RISKS OF IN-HOUSE LISTS
With the exception of surveying a specific population that’s unique to your organization, your own in-house lists may have some shortcomings.  Be honest and ask yourself the following questions: 
  • Is your list accurate?  If 20 percent of your email addresses are out of date, that’s going to cause delay and could lead to an insufficient final sample size.
  • Are you likely to get a good response rate? If you’ve surveyed your list before, you may know what percentage of invitation recipients are likely to respond; otherwise, this can be a real wildcard. Response rates can vary from as little as one-half of one percent to as high as 40 percent, though rates at the high end are extremely rare. Be honest with yourself: if you use your own in-house lists, will you get a decent response? (A sketch after this list shows how list accuracy and response rate compound.)
  • Do you have enough demographic and psychographic information in your list to know that you are sending invitations to the right people? If you don’t have enough data (say, income level), you can’t select your target, and you may not want to turn people off by having them opt in to take a survey only to be rejected. In addition, you might not want to tip off your customers that you’re specifically interested in certain income ranges. Again, this is something an AYTM panel can take care of. If you use the AYTM tool to survey your own list, AYTM allows you to automatically build in demographic questions that allow for some nifty analysis on the survey results page.
  • Do you plan on doing a lot of surveys? Be very careful not to over-survey your in-house lists. Treat them with respect. A good rule of thumb is no more than three surveys a year per participant. Some researchers advocate that it’s okay to send more, and it varies somewhat by the target market, but think about it for yourself. If one company sent you a survey every month, would that start to feel a bit onerous? Perhaps a little presumptuous about your time?
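To see why list accuracy and response rate both matter, here is a quick Python sketch of how they compound (the numbers are hypothetical, chosen only to illustrate the arithmetic):

    import math

    def invitations_needed(target_completes, valid_address_rate, response_rate):
        """How many invitations an in-house list must send.

        Stale addresses and low response rates compound: each factor
        shrinks the pool that the next factor applies to.
        """
        return math.ceil(target_completes / (valid_address_rate * response_rate))

    # 20% of addresses out of date, 5% of delivered invites completed:
    print(invitations_needed(400, valid_address_rate=0.80, response_rate=0.05))  # 10000

In this hypothetical, reaching 400 completes requires a list of 10,000 names, which is why an honest look at your list is so important.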


Ask yourself some honest questions before you decide whether to use an in-house list or a panel for your online survey project.  Like our flight to Japan, both might get you to your destination, but they offer a very different experience along the way.  

Thursday, May 26, 2011

Employee Surveys: Don't Bury Your Head in the Sand


Many business professionals who use online surveys focus their efforts on customers or prospects, and while that's certainly a very valuable type of research, there's an interesting opportunity that’s often overlooked. Why not survey the organization's employees as well? Employees can be a great source of insight, and not just because of their insider view. Just as we want our customers to be loyal and enthusiastic about our company, we want that same energy from our employees. What better way to engender loyalty than to show them we’re listening to their ideas, wants and concerns, and applying what we learn in a meaningful way?

The Data:
What sort of information might we find out by surveying employees? The possibilities are boundless! Employee surveys can be designed to tackle topics such as:
  • How satisfied are they with the company’s IT and phone systems?
  • Do they have any ideas or suggestions for improving the company’s facilities?
  • Have they thought of new products or services the company should be offering?
  • Have they found easier, innovative ways to accomplish their work?
  • How satisfied are they with their jobs?

In one recent example, the human resources department of a mid-size company conducted a survey of its employees to understand what kind of training programs they would like. The HR folks were quite surprised by some of the responses. Some items they expected to rate highly didn’t, and some that they thought were real long shots actually turned out to be quite popular. As I’ve cautioned here before, survey results are only valuable if you analyze them without expectational bias. I won't give away the results, but by casting a wide net, this company collected opinions on training topics as disparate as computer programs, conflict resolution, leadership training and sales skills—all neatly handled by an online survey.

The Risks:
If you do decide to do employee surveys, there is one critical caveat: You must follow up. There are few things more demoralizing to employees than feeling they’re not listened to, and one of them is being asked for their opinion but never seeing any results of that query. Really, it’s better to have never asked in the first place. If you do an employee survey, be sure to put some of the results in an employee newsletter or discuss them at a company meeting. Get upper-level buy-in by having an executive talk about the results and what's going to be done with them—and then make sure to live up to those promises.

Conclusions:

Employees often have great insights about internal and customer-related topics, and online surveys can be a very effective tool for tapping that source. Staff can contribute to a company’s success by being given the opportunity to offer input, but the technique is only effective if the information is used in a meaningful way. You might even consider making the surveys anonymous. Not everybody is willing to put their name on their ideas, especially if they have feedback that is less than positive, and that feedback might well be the most valuable you receive.

Monday, May 23, 2011

Clear Objectives: Insurance for Great Surveys


What is the single most important step to take before you start planning an online survey project? Setting clear objectives. Without clear objectives you are likely to end up with a bloated or rambling questionnaire design. Even worse? At the end of the project, you are likely to find that the data you collected didn’t adequately meet your most important needs.

With online surveys, a clear set of objectives is an insurance plan you just can’t pass on.

How to State Objectives
When we’re thinking about objectives for market research projects, it’s often helpful to think in terms of questions or hypotheses. Some examples of goals stated as questions are as follows:
  • How can we improve our sales success within a specific customer group?
  • Why have we experienced a sharp increase in customer defections?
  • Which of three new product concepts should we invest in developing further?
  • In which of several possible geographic areas should we expand sales distribution?
  • What types of marketing messages are likely to resonate with our target market as we roll out this new product or solution?
  • What percentage of our target market knows we exist?
  • What percentage of a specific customer group prefers our competitors and why?
These are all examples of specific questions that can define a research project’s scope. They are precise enough to be meaningful and clearly map to important business issues.  While your own questions may be different, these examples can be used as a guide.
In addition to framing objectives as questions, you can also frame them as hypotheses. Then, the survey is designed to test the hypotheses. If you already have hypotheses, that’s a great way to define a project’s scope. Here are some examples:
  • We have a hypothesis that preference for our brand is notably higher among 18- to 35-year-olds than among those 36 to 49.
  • We have a hypothesis that our customers prefer competitor A’s packaging.
  • We have a hypothesis that we need to add feature A before we add feature B, to optimize near-term sales.
  • We have a hypothesis that sales of new product A are lackluster because our retail partners are not adequately trained on it.

By using questions or hypotheses to specify project objectives, you will be able to effectively document the project’s scope. This will be the information you need to design an effective survey and will also be a handy reference for getting buy-in from team members or colleagues.

How Many?
The most successful custom survey projects have one or two high-level objectives, three at the most. Of course, under that level, there can be several sub-goals. For example, in a case where the primary goal is to measure brand awareness among a target customer group, appropriate sub-goals might be (a) to understand how awareness varies by age range, and (b) to test how it compares relative to competitive brands.

Bonus Benefit
As an added bonus, clear objectives will help you avoid scope creep. It is a common scenario: you start with a very precise study topic. Then word gets out that you’re planning to do a survey project. Suddenly you are bombarded with requests to add “just” two or three questions to the questionnaire: “As long as you’re talking to our customers, can you please ask them about color trends?” or, “Can you please find out if they’ve seen this recent ad we ran?” Being able to point to a clear, agreed-upon, documented project scope will help you stand firm.

Monday, May 9, 2011

Open-Ended Questions: Great in Small Doses

Most online surveys rely heavily on “closed-ended” questions. For example, we often have questions that are followed by five-point scaled responses (such as “very dissatisfied” up to “very satisfied”), or that allow the survey taker to “check all that apply” or “check one” from a list. Those are all common approaches in an online survey and are often very appropriate choices. However, one or two carefully constructed open-ended questions can give you a wealth of information.

An open-ended question is literally open: the question is displayed, followed by a blank text box in which the respondent can type an answer. Examples include “Please describe in your own words your last shopping experience at our store” and “Please describe what’s most important to you when you’re purchasing blue jeans.” Here’s another common example, brand awareness: “For your next purchase of an automobile, what brands are you most likely to consider?”

The Value

Open-ended questions are useful for three types of goals:
1. When you want to hear people describe their opinion or attitude in their own words: What words do they use to describe their satisfaction or experience with something?
2. When you want to measure awareness: Open-ended questions are a great way to measure awareness. What is their top-of-mind awareness of a brand or a product? For an awareness question, do they name your brand? Do they name brands that you wouldn’t even have considered in your competitive set? It happens, and it’s fascinating when it does…maybe not happy, but fascinating.
3. When you want to discover something: A great discovery question is, “What else can we do to improve your experience with our product?” or “What do you like best about X?” and “like least?” Simple questions like these can elicit very unexpected answers; answers that may reveal important opportunities.

The Catches

Use them judiciously: Unfortunately, people don’t like to type, so it’s a balancing act. We can have one or two open-ended questions and expect meaningful results, but we have to understand that we can’t have an entire survey constructed of open-ended questions. People get tired or bored, and they’ll drop out. A good general rule of thumb is you want no more than one open-ended question for every ten closed-ended questions.

Be realistic: Some people really don’t like to type and might only give you a one- or two-word response. You can’t force them to write in complete sentences or with correct grammar. Not all respondents will answer an open-ended question at all, let alone answer as clearly and completely as you’d like. Typically, 30 to 40 percent of respondents will write something that’s at least minimally helpful, and obviously, it will vary by topic.

Plan for analysis time: Somebody is actually going to have to read through these text responses, perhaps even code them, and that takes time. One inexpensive option is to use a word cloud tool to see what words occur the most frequently, and that will save you some time.
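As a concrete example of that frequency idea, here is a minimal Python sketch that tallies the most common words in a batch of open-ended responses (the sample responses and stopword list are made up for illustration):

    import re
    from collections import Counter

    # In practice, these would come from your survey's text-response export.
    responses = [
        "The checkout was slow but the staff were friendly",
        "Friendly staff, great prices",
        "Slow checkout line, otherwise fine",
    ]

    # Words too common to be informative.
    stopwords = {"the", "but", "were", "was", "a", "and", "otherwise"}

    words = []
    for text in responses:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in stopwords:
                words.append(word)

    # The same counts a word cloud visualizes with font size.
    print(Counter(words).most_common(5))

A tally like this won’t replace reading the responses, but it’s a fast first pass at spotting recurring themes.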

WORTHWHILE? YES

Market research professionals routinely use open-ended questions, but they use them prudently. We don’t want to wear out our respondents by asking for too much effort.

Friday, April 29, 2011

Planning Survey Samples: Super Sizing Optional


So you’re planning to do an online survey and you’re wondering, “How many completes do I need?” There is a short answer, but first, some context.

How confident is confident enough?

You want to have enough data that you can feel confident in the results. The question is, how confident do you want to be? Are you looking for directional data, or do you want data whose representativeness of the target population you can actually calculate?

When you watch the news, you sometimes hear polling results. A common description is, “this data is based on a 95 percent confidence level, plus or minus 3 percent.” What does that mean? Simply that if we were to run this same poll over and over again, we’d expect the results to fall within plus or minus 3 percent of what we observed 95 percent of the time. That’s pretty darn representative, and I’d feel good about that data.

So how would you figure out what that is? Well, if you really do want reliable data, something that represents a broader target market, then you need an estimate of that target market’s variability. In fact, one of the biggest myths in market research is that the required sample size is based on the size of the population of interest. That’s not quite true. The formula statisticians use to determine the required sample size is based on the variability of the population. Now, those of us who do a lot of consumer market research know that there are common levels of variability we’re likely to encounter. Based on this, many market researchers use rules of thumb. Given the variability we typically encounter in consumer markets, we know that a sample size of 400 to 600 completes will often give us very nice data. In some cases, though, that can be overkill. For example, if you’re studying a market where you know (from past research) that most people have very similar opinions and behaviors, then you can get away with a smaller sample size and still have representative data.
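For the statistically curious, here is that formula in its most common textbook form, sketched in Python (we assume the standard 95 percent confidence z-value of 1.96 and, absent better information, maximum variability of p = 0.5; your own study may justify different inputs):

    import math

    def required_sample_size(margin_of_error, confidence_z=1.96, p=0.5):
        """Standard sample size for estimating a proportion.

        p = 0.5 assumes maximum variability (the conservative default);
        if past research shows less variability, a smaller sample suffices.
        """
        n = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
        return math.ceil(n)

    print(required_sample_size(0.05))  # 385 completes for +/-5% at 95% confidence
    print(required_sample_size(0.03))  # 1068 completes for +/-3% at 95% confidence

Notice that population size never enters the calculation, which is exactly the myth debunked above; only the variability and the precision you want matter.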

POLITICAL REQUESTS

The reality is that sometimes the sample size requirement is not based on objective needs, but on political ones. You may have internal colleagues who are simply stuck on certain numbers when it comes to survey samples—we see it happen all of the time. Some people believe that they must have 500 completes to feel good about the data, or for some the number is 1,000 or more. If budgets permit, it’s certainly nice to have all that data to play with.

SUBGROUP ANALYSIS

If you plan to do any subgroup analysis, that becomes a sample size consideration. For example, if you know you’re going to want to analyze data by gender, you might want to make sure that you have enough data to compare the two groups. Or, if you have a question in your survey about purchase intent with 4 answer options, you may want enough data to use those 4 “buckets” for subgroup analysis. In reality, many market research sample sizes are based on subgroup analysis needs.
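The arithmetic is simple: the scarcest subgroup drives the total. Here is a quick Python sketch (the purchase-intent shares are hypothetical):

    import math

    def total_completes_needed(per_group_minimum, group_shares):
        """Total sample needed so every subgroup hits its minimum count."""
        return max(math.ceil(per_group_minimum / share)
                   for share in group_shares.values())

    # Hypothetical answer "buckets" and their expected shares of respondents:
    shares = {"definitely": 0.15, "probably": 0.35,
              "probably not": 0.30, "definitely not": 0.20}
    print(total_completes_needed(100, shares))  # 100 / 0.15 -> 667 total completes

If you want at least 100 respondents in a bucket that you expect only 15 percent of people to choose, you need roughly 667 completes overall.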

SAMPLE SIZE CALCULATORS

There are also online sample size calculators, like this one. And if you’re really interested in getting into the statistics of sample size calculations, here is a great article.

THE SHORT ANSWER

For some consumer studies, directional data of 100 responses can be used for quick hits. However, if you want data that can be used to extrapolate your findings to the broader population, then a sample size of 400-600 is preferred…but add in some political needs or subgroup analysis, and you may need to go higher.