Ask Your Target Market


Get inside the minds of your target market with AYTM's intuitive
online survey tool - the first affordable Internet survey software with a built-in consumer panel. Use the online survey creator to build your survey, define the criteria of your target market, and watch the results begin to pour in immediately.

Tuesday, June 7, 2011

Survey Design Manners: Remembering to Say Please & Thank You


When designing a survey with AYTM, you have the option of scripting a personalized introduction in text or with an opening video. You can also finish the survey with a text message. Why is this helpful? Because our goal is to make the survey experience as pleasant and engaging as possible for our respondents. What better way to engage respondents than to welcome them with a non-boilerplate message? Imagine how a personal video intro from you can pique their interest and help them willingly open the door to their insights. What better way to close the survey experience than with an equally pleasing thank you?

Let’s look at some examples.

EXAMPLE 1.  CUSTOMER FEEDBACK SURVEY

The introduction to a customer feedback survey could be something like, “Please complete this very brief survey so that we can better serve you in the future.”  Or how about this, “We’d love your feedback on your recent experience with our company. Your feedback, along with that of other valued clients, will help us make ongoing improvements.” 

These openings tell the survey participants why the survey is being done. They also make it clear what’s really in it for them: better service or improved products in the future. 

You can even go a little further and use the introduction message to encourage candid feedback. Consider saying something like, “We’re going to ask you some questions about your experience with our company. We truly value your candid feedback.” Simply saying that you want frank, honest feedback encourages them to give it.

What about the closing message? It’s a great opportunity to remind them that their time was well spent and that you appreciate their help. Consider these options:

  • “Thank you for your feedback. We truly appreciate your time.”
  • “Thank you for completing our customer feedback survey. Your responses will help us to better serve you in the future.”


EXAMPLE 2.  PRODUCT CONCEPT TESTING SURVEY

Maybe you have an idea for a new software application and you’re going to ask your target market some questions about possible feature requirements. With such a survey you might have an introduction like, “We’re going to ask you several questions about possible features for a new software application. Please note that we want your honest feedback—there are no wrong answers.” It’s short, relevant, and sends the right message.

If you are surveying your own list and you want to add some extra credibility to the request, you can do so like this: “This research is being sponsored by our Vice President of Product Development, Jane Jones. Jane and her team are eager for honest feedback about several feature ideas currently under consideration.” Adding the executive’s name creates a sense of urgency, and makes it clear that the research really is important.

What about the customized closing text?  Now that they have invested four or five minutes of their time answering your questions, you want to close on a high note.  You can simply say thank you, which is perfectly fine, or you can go a bit further:

  • “This survey was sent to you by Joe Jones of Brand X.  For more information on the survey and how the results will be used, please contact Joe at JJones@987654321.com.” 
  • “Thanks for your time. Your input will help guide new product development at Brand X. If you have any questions, please call 999.888.7777, ext. 111.”
The text box also supports hyperlinks, so you could even offer them a link to a page with additional information.

One Warning
Never use a survey for lead generation. Mixing research with sales is spammy and widely considered unethical.


IS IT NECESSARY TO SAY PLEASE AND THANK YOU?

No. But it helps, especially if you are using the AYTM platform to survey your own customers or other in-house lists. If you repeatedly ask your list members to take surveys and provide their “valuable feedback” but it feels impersonal or bland, they will get turned off. If the research is important, use customized text that reflects this.

Bottom line: use customized introductions and closing text to tell people that you want their candid feedback, to thank them for their help, and to let them know that you plan to use the data for something important.  And besides, it’s just a nice way to say “please” and “thank you.”


Thursday, June 2, 2011

Avoiding Bias in Survey Design


Ever been at a friend’s for dinner and complained the food was under-cooked or over-salted?
Of course not. People are polite. And while that makes for pleasant dinner conversation, it also makes for misleading research results.

“Politeness” is just one of the factors that can lead to weak survey data. So when we are writing surveys, we need to do everything possible to get honest, objective data—even if it’s “bad news.”

Ask For It
In the survey invitation text and opening screen, tell the survey participant that you value their honest opinions. Let them know you want candid feedback. And remind them that their answers are anonymous (to imply that even if they say negative things about your brand or products, there will be no repercussions).

Here is an example of such text, as written for a magazine publisher’s survey:
“We value your opinion, and would like your candid feedback on our recent redesign. Your responses will be kept strictly anonymous and will only be used in aggregate with that of other subscribers.”

Avoid Over-reliance on Agreement Scales
A common approach is to craft a series of statements followed by a scale like this one:
  • Strongly Disagree
  • Disagree
  • Neutral
  • Agree
  • Strongly Agree

While this is certainly fine in some cases, it can also be a crutch. And it can lead to overly polite responses.

Consider a case where we want to measure behaviors from people who have recently purchased laptop computers. We might ask:
“Please indicate your agreement with the following statements, considering your most recent purchase of a laptop computer:”
  • I asked friends or family members for advice.
  • I researched online before making a purchase.
  • The brand of laptop was an important selection criterion to me.
  • The processor speed was an important selection criterion to me.

For this example, the agreement scale would be displayed for each item, or in a grid-style display.
Research suggests that people are inclined to agree with such statements more than is actually true. One can debate whether this is due to politeness or convenience, but it doesn’t really matter: we know it happens. So while it can be fine to use some agreement scale questions, don’t overuse them. Instead, consider “importance” scales or other options. [If you want more information about this specific issue, read this recent article from Vovici’s Jeffrey Henning: http://blog.vovici.com/blog/bid/21978/Acquiescence-Bias-Agree-Disagree-Scale-Best-Practices]

Avoid Leading Questions
It’s the classic husband’s dilemma: the wife twirls in a new outfit and then asks, “Does this make me look fat?” He knows what answer she expects, and he’s going to give it to her.
Sometimes survey questions are also leading. Maybe not quite so blatantly, but still.
Which of these options do you think is best?
Option A: “Do you think our new web store is easier to use than our previous one?”, followed by an agreement scale.
Option B: “How does our new web store compare to our previous version?”
  • It’s easier to use
  • The ease of use is the same
  • It’s more difficult to use
  • No opinion

The correct answer is that Option B is less leading. Not just because it doesn’t use an agreement scale, but also because the question itself is more objective.

Randomize Answer Options
Often in surveys we have a list of answer options. Here’s a simple example:
Which of the following types of stores have you shopped in during the past 7 days?
  • “Dollar” store
  • Clothing (Women’s)
  • Clothing (Men’s)
  • Convenience
  • Department
  • Electronics
  • Grocery
  • None of these

If every survey taker sees this list in the same order, there is a risk that the responses will skew toward the items at the top; survey takers tend to focus more on the top of a list than on the bottom. What’s the solution? Randomizing the list. Of course, always anchor logical choices (like “None of these”) at the bottom.
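
To make this concrete, here is a minimal sketch in Python of per-respondent randomization with an anchored final option. The function and sample data are illustrative, not AYTM’s API; platforms like AYTM handle this for you when you mark a choice as anchored.

```python
import random

def randomized_options(options, anchored=("None of these",)):
    """Shuffle answer options per respondent, keeping anchored
    choices (e.g. "None of these") fixed at the bottom."""
    movable = [o for o in options if o not in anchored]
    random.shuffle(movable)  # a new random order for each respondent
    return movable + [o for o in options if o in anchored]

options = ['"Dollar" store', "Clothing (Women's)", "Clothing (Men's)",
           "Convenience", "Department", "Electronics", "Grocery",
           "None of these"]
print(randomized_options(options))  # different order every call
```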

Honest Data = Good Data
Our goal is to make it easy and comfortable for survey participants to give us honest, candid answers. Anything we can do to make that happen will help ensure we get quality data from our research surveys.

Monday, May 30, 2011

When to Use a Consumer Panel Versus In-House Lists


Imagine for a moment that you’re making a trip to, say, Japan. The travel agent has given you two options, and the two itineraries differ significantly only in where the requisite 12-hour layover occurs. One stops at an isolated tropical atoll with a charming village, while the other stops in a bustling city. Both flights will get you there, but which suits your preferences best?

One of the first decisions you’ll make for your AYTM online survey project is where to source your sample. Will you use the AYTM panel or provide your own list?  AYTM is very flexible, so like our imaginary trip, either option will get you there. The choice is entirely up to you. How to choose? Consider the following pros and cons of using our panel versus your own list.

BENEFITS OF USING THE AYTM PANEL
  • Convenience:  Because the AYTM panel is integrated with our platform, this is a very efficient way to start data collection. Once your survey is programmed and you’ve selected your population criteria, you’re ready to launch, and data collection is usually done within a day or two. 
  • Ample Selects: The other benefit of using AYTM is the large number of selects (we call them “house tags”) in our panel. Not only do we have standard items such as gender, age, income level and so on, we also have 168 other options, such as:
    • Been on a cruise
    • Dog owner
    • Facebook user
    • Gamer
    • Home Wifi
    • Leases car
    • Local voter
    • RV owner

Why are ample selects so important? Because the ability to select specific demographic and psychographic variables ensures that you send your survey to the right target market. 
If AYTM house tags don’t fit your specific need, you can write a custom psychographic prescreening question to control who gets to answer your survey. Since our analogy involves Japan, you could, for instance, survey only those who pass the screening question “Do you eat sashimi at least twice a month?”


BENEFITS OF USING IN-HOUSE LISTS
  • Cost: It may be free, since you won’t have to purchase panel access. Although AYTM panel costs are very low compared to standard market research pricing, it’s still a cost you could avoid by using your own list.
  • Proprietary Information: If you’re interested in surveying a very particular audience (say, your current customers or people who subscribe to your newsletter), then obviously you have those lists and AYTM does not.

RISKS OF IN-HOUSE LISTS
With the exception of surveying a specific population that’s unique to your organization, your own in-house lists may have some shortcomings.  Be honest and ask yourself the following questions: 
  • Is your list accurate?  If 20 percent of your email addresses are out of date, that’s going to cause delay and could lead to an insufficient final sample size.
  • Are you likely to get a good response rate?  If you’ve surveyed your list before, you may know what percentage of invitation recipients are likely to respond; otherwise, this can be a real wildcard. Response rates can vary from as little as one-half of one percent to as high as 40 percent, though rates at the high end are extremely rare. Be honest with yourself: if you use your own in-house lists, will you get a decent response?
  • Do you have enough demographic and psychographic information in your list to know that you are sending invitations to the right people?  If you don’t have enough data (say, income level), you can’t select your target, and you may not want to turn people off by having them opt in to take a survey only to be rejected. In addition, you might not want to tip off your customers that you’re specifically interested in certain income ranges. Again, this is something an AYTM panel can take care of. If you use the AYTM tool to survey your own list, AYTM lets you automatically build in demographic questions that allow for some nifty analysis on the survey results page.
  • Do you plan on doing a lot of surveys? Be very careful not to over-survey your in-house lists. Treat them with respect. A good rule of thumb is no more than three surveys a year per participant. Some researchers advocate that it’s okay to send more, and it varies somewhat by target market, but think about it for yourself. If one company sent you a survey every month, would that start to feel a bit onerous? Perhaps a little presumptuous about your time?


Ask yourself some honest questions before you decide whether to use an in-house list or a panel for your online survey project.  Like our flight to Japan, both might get you to your destination, but they offer a very different experience along the way.  

Thursday, May 26, 2011

Employee Surveys: Don't Bury Your Head in The Sand


Many business professionals who use online surveys focus their efforts on customers or prospects, and while that's certainly a very valuable type of research, there's an interesting opportunity that’s often overlooked. Why not survey the organization's employees as well?  Employees can be a great source of insight, and not just because of their insider view. Just as we want our customers to be loyal and enthusiastic about our company, we want that same energy from our employees. What better way to engender loyalty than to show them we’re listening to their ideas, wants and concerns, and applying what we learn in a meaningful way?

The Data:
What sort of information might we find out by surveying employees? The possibilities are boundless! Employee surveys can be designed to tackle topics such as:
  • How satisfied are they with the company’s IT and phone systems?
  • Do they have any ideas or suggestions for improving the company’s facilities?
  • Have they thought of new products or services the company should be offering?
  • Have they found easier, innovative ways to accomplish their work?
  • How satisfied are they with their jobs?

In one recent example, the human resources department of a mid-size company conducted a survey of its employees to understand what kind of training programs they would like. The HR folks were quite surprised by some of the responses. Some items they expected to rate highly didn’t, and some that they thought were real long shots actually turned out to be quite popular. As I’ve cautioned here before, survey results are only valuable if you analyze them without expectational bias. I won't give away the results, but by casting a wide net, this company collected opinions on training topics as disparate as computer programs, conflict resolution, leadership training and sales skills—all neatly handled by an online survey.

The Risks:
If you do decide to do employee surveys, there is one critical caveat: you must follow up. There are few things more demoralizing to employees than feeling they’re not listened to, and one of them is being asked for their opinion and never seeing any results of that query. Really, it’s better never to have asked in the first place. If you do an employee survey, be sure to put some of the results in an employee newsletter or discuss them at a company meeting. Get upper-level buy-in by having an executive talk about the results and what's going to be done with them—and then make sure to live up to those promises.

Conclusions:

Employees often have great insights about internal and customer-related topics, and online surveys can be a very effective tool for tapping that source. Staff can contribute to a company’s success by being given the opportunity to offer input, but the technique is only effective if the information is used in a meaningful way. You might even consider making the surveys anonymous. Not everybody is willing to put their name on their ideas, especially if they have feedback that is less than positive, and that feedback might well be the most valuable you receive.

Monday, May 23, 2011

Clear Objectives: Insurance for Great Surveys


What is the single most important step to take before you start planning an online survey project? Setting clear objectives. Without clear objectives you are likely to end up with a bloated or rambling questionnaire design. Even worse? At the end of the project, you are likely to find that the data you collected didn’t adequately meet your most important needs.

With online surveys, a clear set of objectives is an insurance plan you just can’t pass on.

How to State Objectives
When we’re thinking about objectives for market research projects, it’s often helpful to think in terms of questions or hypotheses. Some examples of goals stated as questions are as follows:
  • How can we improve our sales success within a specific customer group?
  • Why have we experienced a sharp increase in customer defections?
  • Which of three new product concepts should we invest in developing further?
  • In which of several possible geographic areas should we expand sales distribution?
  • What types of marketing messages are likely to resonate with our target market as we roll out this new product or solution?
  • What percentage of our target market knows we exist?
  • What percentage of a specific customer group prefers our competitors and why?
These are all examples of specific questions that can define a research project’s scope. They are precise enough to be meaningful and clearly map to important business issues.  While your own questions may be different, these examples can be used as a guide.
In addition to framing objectives as questions, you can also frame them as hypotheses. Then, the survey is designed to test the hypotheses. If you already have hypotheses, that’s a great way to define a project’s scope. Here are some examples:
  • We have a hypothesis that preference for our brand is notably higher among 18- to 35-year-olds than among those 36 to 49.
  • We have a hypothesis that our customers prefer competitor A’s packaging.
  • We have a hypothesis that we need to add feature A before we add feature B, to optimize near-term sales.
  • We have a hypothesis that sales of new product A are lackluster because our retail partners are not adequately trained on it.

By using questions or hypotheses to specify project objectives, you will be able to effectively document the project’s scope. This will be the information you need to design an effective survey and will also be a handy reference for getting buy-in from team members or colleagues.

How Many?
The most successful custom survey projects have one or two high-level objectives, three at the most. Of course, under that level, there can be several sub-goals. For example, in a case where the primary goal is to measure brand awareness among a target customer group, appropriate sub-goals might be (a) to understand how awareness varies by age range, and (b) to test how it compares to competitive brands.

Bonus Benefit
As an added bonus, clear objectives will help you avoid scope creep. It’s a common scenario: you start with a very precise study topic. Then word gets out that you’re planning to do a survey project. Suddenly you are bombarded with requests to add “just” two or three questions to the questionnaire: “As long as you’re talking to our customers, can you please ask them about color trends?” or, “Can you please find out if they’ve seen this recent ad we ran?”  Being able to point to a clear, agreed-upon, documented project scope will help you stand firm.

Monday, May 9, 2011

Open-Ended Questions: Great in Small Doses

Most online surveys rely heavily on “closed-ended” questions. For example, we often have questions that are followed by five-point scaled responses (such as “very dissatisfied” up to “very satisfied”), or that allow the survey taker to “check all that apply” or “check one” from a list. Those are all common approaches in an online survey and are often very appropriate choices. However, one or two carefully constructed open-ended questions can give you a wealth of information.

An open-ended question is literally open: the question is displayed, followed by a blank text box in which the respondent can type an answer. Examples include “Please describe in your own words your last shopping experience at our store” or “Please describe what’s most important to you when you’re purchasing blue jeans.” Here’s another common example, for brand awareness: “For your next purchase of an automobile, what brands are you most likely to consider?”

The Value

Open-ended questions are useful for three types of goals:
1. When you want to hear people describe their opinion or attitude in their own words: What words do they use to describe their satisfaction or experience with something?
2. When you want to measure awareness: Open-ended questions are a great way to measure awareness. What is their top-of-mind awareness for a brand or a product? For an awareness question, do they name your brand? Do they name brands that you wouldn’t even have considered in your competitive set? It happens, and it’s fascinating when it does…maybe not happy, but fascinating.
3. When you want to discover something: A great discovery question is, “What else can we do to improve your experience with our product?” or “What do you like best about X?” and “like least?” Simple questions like these can elicit very unexpected answers; answers that may reveal important opportunities.

The Catches

Use them judiciously: Unfortunately, people don’t like to type, so it’s a balancing act. We can have one or two open-ended questions and expect meaningful results, but we have to understand that we can’t have an entire survey constructed of open-ended questions. People get tired or bored, and they’ll drop out. A good general rule of thumb is you want no more than one open-ended question for every ten closed-ended questions.

Be realistic: Some people really don’t like to type and might only give you a one- or two-word response. You can’t force them to write in complete sentences or grammatically correct words. Not all respondents will answer an open-ended question at all, let alone as clearly and completely as you’d like. Typically, 30 to 40 percent of respondents will write something at least minimally helpful, and obviously, it will vary by topic.

Plan for analysis time: Somebody is actually going to have to read through these text responses, perhaps even code them, and that takes time. One inexpensive option is to use a word cloud tool to see what words occur the most frequently, and that will save you some time.
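
If you want a quick pass before (or instead of) a word cloud tool, a few lines of Python can tally word frequencies across your text responses. A minimal sketch; the stop-word list and sample responses are purely illustrative:

```python
import re
from collections import Counter

# Common filler words to ignore (extend as needed for your topic).
STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "it", "is", "i", "in"}

def top_words(responses, n=10):
    """Count word frequencies across open-ended responses."""
    words = re.findall(r"[a-z']+", " ".join(responses).lower())
    return Counter(w for w in words if w not in STOP_WORDS).most_common(n)

responses = [
    "I wish it came in more colors",
    "More colors and a longer battery life",
    "Battery life is too short",
]
print(top_words(responses))
# [('more', 2), ('colors', 2), ('battery', 2), ('life', 2), ...]
```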

Worthwhile? Yes

Market research professionals routinely use open-ended questions, but they use them prudently. We don’t want to wear out our respondents by asking for too much effort.

Friday, April 29, 2011

Planning Survey Samples: Super Sizing Optional


So you’re planning to do an online survey and you’re wondering, “How many completes do I need?”  There is a short answer, but first, some context.

How confident is confident enough?

You want enough data to feel confident in the results. The question is, how confident do you want to be? Are you looking for directional data, or do you want data whose representativeness of the target population you can actually calculate?

When you watch the news, you sometimes hear polling results. A common description is, “this data is based on a 95 percent confidence level, plus or minus 3 percent.” What does that mean? It simply means that if we ran the same poll over and over again, we would expect the results to fall within plus or minus 3 percentage points of these figures 95 percent of the time. That’s pretty darn representative, and I’d feel good about that data.

So how would you figure out what that is? Well, if you really do want reliable data, something that represents a broader target market, then you need an estimate of that target market’s variability. In fact, one of the biggest myths in market research is that the required sample size is based on the size of the population of interest. That’s not quite true. The formula statisticians use to determine the required sample size is based on the variability of the population. Now, those of us who do a lot of consumer market research know there are common levels of variability that we’re likely to encounter. Based on this, many market researchers use rules of thumb. Given the variability we typically encounter in consumer markets, we know that a sample size of 400 to 600 completes will often give us very nice data. In some cases, though, that can be overkill. For example, if you’re studying a market where you know (from past research) that most people have very similar opinions and similar behaviors, then you can get away with a smaller sample size and still have representative data. 
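
For the statistically curious, here is the standard formula as a short Python sketch. Notice that population size doesn’t appear in it at all; only the variability estimate (p) and the precision you want drive the answer:

```python
import math

def sample_size(margin_of_error, p=0.5, z=1.96):
    """Completes needed to estimate a proportion: n = z^2 * p(1-p) / e^2.
    p = 0.5 assumes maximum variability; z = 1.96 gives 95% confidence."""
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

print(sample_size(0.05))  # plus/minus 5% -> 385 completes
print(sample_size(0.04))  # plus/minus 4% -> 601 (the 400-600 rule of thumb)
print(sample_size(0.03))  # plus/minus 3% -> 1068 (the news-poll standard)
```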

POLITICAL REQUESTS

The reality is that sometimes the sample size requirement is not based on objective needs, but on political ones. You may have internal colleagues who are simply stuck on certain numbers when it comes to survey samples—we see it happen all of the time. Some people believe that they must have 500 completes to feel good about the data, or for some the number is 1,000 or more. If budgets permit, it’s certainly nice to have all that data to play with.

SUBGROUP ANALYSIS

If you plan to do any subgroup analysis, that becomes a sample size consideration. For example, if you know you’re going to want to analyze data by gender, you might want to make sure you have enough data to compare the two groups. Or, if you have a question in your survey about purchase intent with 4 answer options, you may want enough data to use those 4 “buckets” for subgroup analysis. In reality, many market research sample sizes are based on subgroup analysis needs. 

SAMPLE SIZE CALCULATORS

There are also online sample size calculators. And if you’re really interested in getting into the statistics of sample size calculations, there are good articles that dig into the details.

THE SHORT ANSWER

For some consumer studies, directional data from 100 responses can be used for quick hits. However, if you want data that you can use to extrapolate your findings to the broader population, then a sample size of 400-600 is preferred…but add in some political needs or subgroup analysis, and you may need to go higher.



Tuesday, April 26, 2011

Market Research for Start-ups: Is My Baby Ugly?

Do you have a new company and perhaps a new product? Online surveys can be a great way to gather important information before you start marketing that product. Sure, you may think your baby is beautiful, but will everybody else? In your heart, you know you need a sanity check. You want a gauge on likelihood of success. And you may want to identify likely sales objections before you actually encounter them. Or, maybe you simply want to know if the product idea is easy to convey; will target customers embrace it readily, or is this going to be a long sales cycle? These are all great research goals.

Here are some examples of cool things you can do with online surveys before you reveal your new baby to the world.

Getting product concept feedback
In an online survey, you can describe a product and ask some follow up questions to get feedback. When you describe your product, keep in mind a few things: 
  • Keep the product description very brief, two to three sentences at most. Nobody wants to read a 100-word product description.
  • Make sure that your description is objective. This is not a sales pitch. If you want honest feedback, then you want to describe your product objectively. Instead of “Hey, we’ve invented the world’s coolest widget; don’t you want to buy one?” say, “Here’s a description of a new widget. It has these features. It’s different from other products in that it has feature Y.”

After the product description, you can ask some simple follow-up questions. You might ask, “If this product were available today, how likely would you be to purchase it?” or perhaps, “How likely would you be to purchase this type of product in the next six months?” Notice that I am specifying time ranges. This keeps it a little less hypothetical. If you simply ask, “How likely are you to purchase this product?” it is very hypothetical: ever, in the next year, in the next five years, maybe? You get overly positive results that way, and you really do not want that. You want honest data. Putting in a timeframe helps mitigate the risk of getting overly rosy data.

Identifying Demand Drivers & Deterrents
You can also test possible demand drivers and deterrents. Use skip logic (http://askyourtargetmarket.com/pages/writing_and_editing) to have those who indicate interest answer a follow-up question about why they like the concept, and skip those who indicate little or no interest to a parallel question that measures demand deterrents. 
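
As a sketch of that branching (the question wording and the five-point intent scale are assumptions for illustration, not AYTM’s editor syntax):

```python
DRIVERS_Q = ("Which of the following are reasons you are likely "
             "to purchase this product?")
DETERRENTS_Q = ("Which of the following are reasons you are unlikely "
                "to purchase this product?")

def next_question(purchase_intent):
    """Skip logic: interested respondents get the demand-drivers
    follow-up; everyone else gets the parallel deterrents question."""
    if purchase_intent in ("Very likely", "Somewhat likely"):
        return DRIVERS_Q
    return DETERRENTS_Q

print(next_question("Very likely"))    # -> drivers question
print(next_question("Very unlikely"))  # -> deterrents question
```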

One approach is to simply ask, “Which of the following are reasons you are likely to purchase this product?” Now, you likely have some hypotheses to offer as answer options. You might have four hypothesized reasons people would buy this product:
  • I think it will save me time
  • I think it will look great in my home
  • I think it will last longer than other similar products I’ve owned in the past
  • I like that it has feature X
  • Other (please specify)

In this example, your online survey would tell you what percent of people agree with each of those statements. Alternatively, you could have each of those items rated on a five-point scale.

Similarly, perhaps you have some hypotheses about what might prevent them from purchasing the product, such as:
  • My spouse wouldn’t want this in our house
  • It takes up too much space
  • It sounds hard to install
  • I don’t have time to use it
  • Other (please specify)
Measuring the prevalence of likely sales objections can be very helpful. Imagine if one of the objections could be addressed easily through modified packaging, by adding or removing a simple feature, or maybe just by describing the product differently. This kind of information could save many headaches.

So, Is Your Baby Ugly?

Hopefully not. But with a little customer insight, you will know how to dress your baby for the occasion.


Thursday, April 21, 2011

Selecting Survey Demographics

Demographics are those attributes or variables that describe the population of interest for your survey. Common demographic variables include age, income range, marital status, education level, ethnic group and gender.  The data collected in a survey is only valuable if you know who is answering your questions, so it’s vital to choose demographic criteria wisely when setting up a survey project.

To select demographics for your next survey, use these three considerations:

  1. What is your objective? The step of selecting demographics comes after setting project objectives. Based on those objectives, the type of population you need should be fairly easy to infer. If you’re doing research related to a bicycle accessory product, you’ll probably want people of a certain age range, and probably of a certain minimum income level so they can afford to spend on things like bicycle accessories.  If your product is designed to appeal more to women or to men, then you might choose to target your research toward one gender. Clear objectives will help define demographics.

  2. What is feasible? When selecting demographics for your survey, avoid the temptation of selecting too many. Why is this important?

  • The more demographic variables you require, the higher the cost. This is true with any sample source.
  • A large number of demographic variables can limit your pool of respondents drastically, reducing the feasibility of your project and increasing the study’s cost and timeline. Just for context, a typical survey project has two to four demographic requirements.
  • An overly narrow focus becomes artificial. Unless you have a very large marketing budget, you likely can’t market only to men who make $50,000 to $150,000 a year, live in suburban areas, have at least two years of college, are married, and are of a specific ethnic group. With too many demographic selections, you quickly reach the point where your survey sample bears no relation to marketing reality.

  3. What does your internal audience need? You need to know the requirements of the people who will use your survey results—the internal clients. Here’s a common scenario: You’re sharing the results of a survey project with some colleagues and hear disappointed utterances such as, “Oh, you didn’t include higher income ranges?” and “Oh, I see you included people who are retired…” The fact that your internal clients have questions about the demographics used to select participants will make them less likely to use the research results. You need to know early on about any specific demographic requirements so you can consider them in your planning process.

Conclusions: Having clear objectives, limiting criteria for feasibility (and reality), and identifying internal client needs are all important aspects of designing a good survey project. If you take these three items into consideration when selecting demographics, you’ll be able to prioritize the right demographic variables for your next AYTM survey.

Monday, April 18, 2011

How to Read Data


The idea of reading survey results seems simple enough. Until you do it. Even a “short” survey yields a lot of data. How can you simplify it so you can efficiently identify the most important results?
Let’s tackle this for two common situations.

1: Data from single choice questions
These questions are effective when you want people to prioritize a choice. Because each respondent can select only one answer, the responses add up to 100%, giving you an effective “rank order” of responses.  In some cases, one item clearly stands out at the top of the results—and you have a clear “winner.” But what if some of the items are close? Or if no single item really rises above the others?

The brutal reality is that there may not be a “winning” answer. Let’s imagine a single-choice question about color preferences. The result may be that blue, green and purple are all within 5% of each other (ranging from 20-25% each) at the top of the list. You might have preferred to see a winner, but the data show that’s not reality among the population of interest. In this case, we have “bands”: band 1 colors are the top 3, band 2 colors are in the middle, and band 3 is at the bottom (those selected by fewer than 10%):
  • Blue 25%
  • Green 21%
  • Purple 20%
  • Red 12%
  • Orange 12%
  • Yellow 6%
  • Black 4%

Single choice questions don’t always yield a “winner,” and that’s ok. Using the “bands” approach can help you get to the “so what” quickly.

How do you define a band? Think about what is actionable for your topic area. For your topic, will a 10% difference in preferences cause you to make a different choice? To add a different feature? To offer a different color option? In most cases, probably not. A 20% difference is more likely to lead to action. A 30% difference? Almost certainly so. For example, if 60% of your customers like feature A and 30% like feature B, that will likely lead you to focus on feature A. If instead, that had been 40% versus 30%, the data alone would not lead you to choose feature A.
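
If you like, you can even compute bands mechanically. A rough sketch: sort the results and start a new band whenever there is a sizable drop to the next item (the 5-point gap threshold here is a judgment call, not a standard):

```python
def band_results(results, gap=5):
    """Split single-choice percentages into bands: a new band starts
    whenever the drop to the next item exceeds `gap` points."""
    items = sorted(results.items(), key=lambda kv: -kv[1])
    bands = [[items[0]]]
    for (label, pct), (_, prev_pct) in zip(items[1:], items):
        if prev_pct - pct > gap:
            bands.append([])
        bands[-1].append((label, pct))
    return bands

results = {"Blue": 25, "Green": 21, "Purple": 20, "Red": 12,
           "Orange": 12, "Yellow": 6, "Black": 4}
for i, band in enumerate(band_results(results), 1):
    print(f"Band {i}:", band)
# Band 1: [('Blue', 25), ('Green', 21), ('Purple', 20)]
# Band 2: [('Red', 12), ('Orange', 12)]
# Band 3: [('Yellow', 6), ('Black', 4)]
```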

If you want to get really precise, you can import your data into a “crosstab” program and do actual significance testing. But in many cases, that can be overkill. Most people want to use the data to make decisions, not to know with precision whether a 5% difference is attributable to minor variability in the data set.
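
That said, if you do want that precision, the standard two-proportion z-test is only a few lines of code; no crosstab package required. A sketch with made-up numbers: with 200 respondents per group, 60% versus 30% is clearly significant, while 40% versus 35% is not:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-test; |z| > 1.96 means the difference is
    significant at the 95% level."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

print(two_prop_z(0.60, 200, 0.30, 200))  # about 6.0 -> significant
print(two_prop_z(0.40, 200, 0.35, 200))  # about 1.0 -> not significant
```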

2: Data from multiple-choice questions
Multiple-choice questions are great for capturing the realistic situation where more than one attitude, behavior or preference may exist. As an example, let’s imagine you have a question about favorite online news sources. In this hypothetical case, you want to know “all that apply” because of two key hypotheses:
  • You have a hypothesis that some customers are online news enthusiasts and visit many such sites.
  • You have a hypothesis that some news sources attract the same people. For example, you think NYT.com and HuffingtonPost.com have a lot of overlap.


So in this case, a “check all that apply” makes sense.

But how would you interpret the data if you offered 5 news sources and the percentages add up to over 300%? That simply means many people checked most of your options, which supports hypothesis #1.
In this case, you may also want to look at some subgroups. This is the “filter by answers” option in AYTM. For example, take all of the people who selected NYT.com and see what else they checked. Then take all of the people who selected CNN.com and see what they selected. Comparing subgroups can tell more of a story than looking only at the data in aggregate. Maybe the NYT subgroup also visits every other site, but the CNN group tends to focus on CNN alone? That would be fascinating!
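
Here is a minimal sketch of that kind of subgroup comparison, assuming each respondent’s checked options are stored as a set (the data shape is illustrative, not AYTM’s export format):

```python
from collections import Counter

responses = [  # one set of checked options per respondent
    {"NYT.com", "HuffingtonPost.com", "CNN.com"},
    {"NYT.com", "HuffingtonPost.com"},
    {"CNN.com"},
    {"CNN.com"},
    {"NYT.com", "BBC.com"},
]

def subgroup_overlap(responses, selected):
    """Filter to respondents who checked `selected`, then tally
    everything else that subgroup checked."""
    group = [r for r in responses if selected in r]
    other = Counter(s for r in group for s in r if s != selected)
    return len(group), other

n, other = subgroup_overlap(responses, "NYT.com")
print(n, other)
# 3 Counter({'HuffingtonPost.com': 2, 'CNN.com': 1, 'BBC.com': 1})
```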

But what if you offered 5 news source answer options, the percentages add up to over 400%, and that strikes you as a bit high? In such cases, a follow-up question could have been a safety net. Question 1 would have been a check-all-that-apply: “Which of the following news sources do you visit at least once a week?” And question 2 could have been a single-choice follow-up: “Which of the following sources have you visited most recently?” That gives you the best of both worlds.

In Either Case
In the case of single or multiple-choice questions, be prepared for unexpected results: percentages that are higher than you expected, or lower. And be open-minded. The data may not tell the story you were hoping for, but it will tell a story. 

Thursday, April 14, 2011

Market Research Surveys: Keep Your Guests Happy


Do you know what’s worse than throwing a party and having nobody show up? Having one where guests are bored and unhappy.

Same thing applies to surveys. Once participants start your survey, you want them fully engaged—not holding up the walls. You want their active participation, so that they carefully read your questions and answer options, and give you thoughtful answers.


Survey Respondent Engagement

What are some strategies for keeping people fully engaged while completing your survey? Here are three.

Tip No. 1: Avoid Scary Starts. Make sure your first few questions are not so onerous that participants are scared off. If the first one, two, or even three questions seem too intrusive or boring, they’ll simply drop out. After all, at this point participants haven’t completed many questions, so they don’t have any real investment. If you do have questions that are a bit more cognitively difficult or lengthy, place them toward the middle or end of your survey instrument. At that point, there is a sense that, “I’ve already invested five minutes in this survey, so I might as well just finish it up.” In any case, we should keep such content to a bare minimum.

Tip No. 2: This Is Not a Test. Avoid questions that sound like you’re quizzing people. Few people have fond memories of tests like the SAT, so make sure the tone of your questions is friendly and simple. For example, examine your word choices. Change the word “utilize” to “use.” The word “enable” often can be “allow.” You get the point. Select words you would use in a real conversation. Avoid sentences that will need to be re-read three times to get the point.

Tip No. 3: Ask for Opinions. People don’t like to simply report what they do; it starts to feel a little intrusive. For example, asking too many questions in a row about what they’ve purchased recently or how much they spent on a recent grocery store trip gets boring and lacks emotional engagement. To keep them involved, include questions that ask for their opinions. Opinion-oriented questions might ask what brands they like best, or what they would like to see in a new product, or “What could this company do to improve your satisfaction?” Such questions are much more interesting. Sure, you may need some of the boring stuff—but mix it up.

Is Your Survey Boring?

Before you send out that survey (and we’ve mentioned this in other AYTM blog articles), have somebody read it out loud to you so you can hear how somebody else would actually read the questions, the instructions, and the answer options. If you find yourself zoning out after a couple of questions, that’s a good clue that your survey guests will be unhappy.




Tuesday, April 12, 2011

Big News from Ad:Tech in SF: Now Access Opinions from Millions of US Consumers through AYTM


 



Ask Your Target Market Integrates uSamp’s SampleMarket 2.0 Panel Access Platform to Serve Up Millions of uSamp’s U.S. Survey Respondents to AYTM's DIY Market Research Survey Solution

At ad:tech San Francisco, AYTM Showcases uSamp Relationship and
Unveils Major Product Upgrade

            SAN FRANCISCO (April 12, 2011) –  Ask Your Target Market (www.AYTM.com), a sophisticated new online survey platform, and uSamp (www.uSamp.com), one of the world's fastest growing technology and online sample companies, today announced a partnership that makes uSamp’s proprietary panel of more than 1.7 million screened and vetted U.S. survey respondents available to AYTM clients.  The companies made the announcement at ad:tech San Francisco, here through April 13.
            Powering the partnership is uSamp’s SampleMarket 2.0, the next generation of its market-leading sample access platform, which offers real-time, self-service access to the uSamp panel.  AYTM will seamlessly integrate access to the uSamp panel, with virtually no alteration to the elegant, intuitive user interface that has made AYTM popular with those needing quick, affordable market research surveys.
            “With the AYTM platform, we have created a service that heretofore didn't exist,” said Lev Mazin, CEO and co-founder, Ask Your Target Market.  “While there are web applications offering affordable, self-service email marketing, on-demand printing and online advertising, nothing else provides powerful, self-service access to affordable target market research. Our integration with uSamp makes projects of virtually any scale feasible on the AYTM platform, entirely within our environment.”
            For uSamp, the partnership is yet another example of the advantages of the company's technology, through its API (Application Programming Interface) open platform solution, and the quality of its millions of validated and screened panelists.
            “We’re delighted to provide AYTM with real-time access to millions of richly profiled respondents and our robust SampleMarket platform interface,” said Matt Dusig, CEO and co-founder of uSamp.  “AYTM has opened an entirely new niche for businesses that previously lacked the access, expertise or budget to conduct market research.   We share that sensibility with AYTM – that the boundaries of market research are now expanding to encompass businesses of all sizes and complexities.”
            Separately, AYTM announced the addition of dozens of highly requested new features to its suite of online survey tools.  Survey authors can now choose among six types of questions, add videos or images to illustrate their survey, and analyze and customize results in more detail.  Agencies will be able to use the AYTM survey platform on a “white label” basis and offer self-branded market research services to their clients as part of their in-house portfolio of services.
            “Thanks to the uSamp relationship and our battery of enhancements, we believe AYTM will now deliver a significantly better market research experience for all of our stakeholders -- researchers, survey respondents and end clients,” AYTM’s Mazin said.  “With our easy interface, pleasing design palette, simple navigation and lightning-fast results, conducting market research has never been this easy.”

About AYTM.com
AYTM.com (Ask Your Target Market) is the leading innovator in DIY online market research. Only at AYTM can you define your exact target audience by drilling down into a panel of millions of US consumers and find your ideal research respondents based upon their psychographic and demographic characteristics. Then, write a survey using up to 6 question types, skip logic, images, and video. Launch your survey and see results streaming back in minutes. Use our powerful, easy-to-learn analytic tools to help you understand your data in ways that deliver the greatest insight. AYTM has meticulously created the best user experience for researchers and survey takers alike. AYTM offers the ability to survey our respondents or your own. Agencies can even use the AYTM platform under white label, self-brand research performed on behalf of their clients, and benefit from a new revenue stream. With AYTM.com, market research has never been this easy.


About uSamp
uSamp (www.uSamp.com) is one of the world’s fastest growing technology and online sample companies, providing global survey panelists and an innovative sampling platform for use in market research. uSamp develops collaborative market research tools to foster more rewarding, profitable relationships between organizations and the people they serve. Founded in 2008, uSamp acquired DMS Insights in June 2010 and now has 175 team members worldwide and 5.1 million global market research panelists. The company’s web-based panel platform is transforming the management and delivery of online panels for market researchers, offering unprecedented access to their panels. uSamp’s deep well of proprietary technologies includes SampleMarket™, PanelNet™, PanelShield™, Opinion Place® River and real-time Panel Book Search – cutting-edge solutions for accessing, branding, sampling and managing panels. uSamp is based in Los Angeles, with offices in Dallas, London, New Delhi and Trumbull, CT.

# # #

Media Contacts:

For AYTM
Lisa Santos
(415) 364-8601

For uSamp
Ken Greenberg
Edge Communications, Inc. 
818/990-5001