Case Study: Google Forms vs. MARE Surveys for product feedback


Running a website survey can give you incredibly valuable information that you can use to improve your services or products, get website feedback, discover bugs, gauge customer interaction or net promoter score, and more.

However, the technique that you use to conduct a website survey can drastically change the results of your survey, and your ability to use those results in a constructive way.

To study this, we decided to run a website survey in our app using two different survey platforms to see how the results would differ.

In this case study, we tested Google Forms against our very own MARE exit surveys.

Before we get to the results, let’s take a look at the two surveys we ran.

Google Forms product feedback survey


We ran a Google Forms survey in our web app from February 1st to April 22nd. We emailed the invite link to all users, then sent it again to users who didn’t open the first email.

Our open rates were on par with other emails we had sent, as were our click-through rates. Roughly 40% of our responses came from the email blast.

In addition, we added a link to the survey in our app using a bright orange button asking for feedback, displayed prominently on every page.

Running a MARE exit survey for product feedback


After the Google Forms survey had run, we decided to test a MARE exit survey in the app. We didn’t send any email notices about this survey; instead, we set it up to display when a user looked like they were about to leave the app (exit intent).

We also required a minimum of 10 page views per user before the survey would trigger, so that it wouldn’t pop up right after logging in, which could be annoying.
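
MARE handled this trigger configuration for us, but if you’re curious what exit intent plus a page-view threshold looks like under the hood, here’s a rough TypeScript sketch. It’s illustrative only: the storage key, the exit-intent heuristic, and the showExitSurvey helper are assumptions, not MARE’s actual implementation.

```typescript
// Illustrative sketch only; not MARE's implementation or API.
declare function showExitSurvey(): void; // hypothetical helper that renders the survey overlay

const MIN_PAGE_VIEWS = 10; // don't show the survey to brand-new sessions

function trackPageView(): number {
  // Persist a simple per-user page-view counter across pages.
  const views = Number(localStorage.getItem("pageViews") ?? "0") + 1;
  localStorage.setItem("pageViews", String(views));
  return views;
}

function onExitIntent(callback: () => void): void {
  // A common exit-intent heuristic: the cursor leaves through the top of the
  // viewport, which usually means the user is heading for the tab bar.
  document.addEventListener("mouseout", (event: MouseEvent) => {
    if (!event.relatedTarget && event.clientY <= 0) {
      callback();
    }
  });
}

let surveyShown = false;

trackPageView();
onExitIntent(() => {
  const views = Number(localStorage.getItem("pageViews") ?? "0");
  if (!surveyShown && views >= MIN_PAGE_VIEWS) {
    surveyShown = true;
    showExitSurvey();
  }
});
```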

This survey was significantly shorter than the first one, but asked similar questions about the respondent’s experience using the app that day, as well as their overall experience with MARE Surveys.

What were the results?

Here’s a quick rundown of the results of each survey.

MARE Exit Survey

4 Questions
62 individual respondents
186 questions answered
44 days

Average answers per respondent: 3

Contextual data collected:

  • Username
  • Pages viewed before and after taking the survey
  • Original user campaign source
  • User browser and device
  • User level (Basic vs Insights user)
  • List of any other surveys answered by each respondent
  • Conversion goals accomplished by this user

Google Forms

14 Questions
30 individual respondents
295 questions answered
81 days

Average answers per respondent: 9.83

Contextual data collected:

  • Username
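
Since the two surveys ran for different lengths of time, it helps to sanity-check the per-respondent averages above and put the respondent counts on a per-day basis. A quick sketch of the arithmetic:

```typescript
// Quick check of the per-respondent averages quoted above, from the raw counts.
const mare = { questionsAnswered: 186, respondents: 62, days: 44 };
const googleForms = { questionsAnswered: 295, respondents: 30, days: 81 };

console.log(mare.questionsAnswered / mare.respondents);                            // 3
console.log((googleForms.questionsAnswered / googleForms.respondents).toFixed(2)); // "9.83"

// The surveys ran for different lengths of time, so respondents per day is a
// fairer comparison of response rate.
console.log((mare.respondents / mare.days).toFixed(2));               // "1.41"
console.log((googleForms.respondents / googleForms.days).toFixed(2)); // "0.37"
```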

Takeaways


So what conclusions can we draw from this data?

First off, we have to understand that the surveys weren’t exactly the same, which means some of the data isn’t as conclusive as we would like. However, there are still some great insights we can take from this test.

Let’s take a look.

1. Much higher response rate

Though fewer questions were answered overall (which was expected, since the exit survey had fewer questions available), the response rate of the MARE exit survey was much higher than that of the Google Forms survey.

Even though the MARE survey ran for roughly half as long (44 days versus 81), it collected more than twice the number of individual respondents (62 versus 30).

Not only that, but the Google Forms survey was even emailed out directly to the entire user base. This goes to show that sending your website surveys via an email newsletter isn’t always the best way to get survey responses.

Conclusion: If you want to get survey responses from your users, using an exit survey within your web app is very effective.

2. Number of questions answered

The average number of answers per respondent was much higher for the Google Forms survey than for the MARE exit survey – 9.83 versus 3, respectively.

However, the Google Forms survey had 14 questions and the MARE exit survey had 4 (we actually had 6, but we used skip logic to only display up to 4 per respondent).

In all, I don’t believe we can draw any conclusions from this data, as the number of responses is too low and the difference in the number of questions is too great.

We will need to run another test in the future with the exact same survey to test this further.

Conclusion: none

3. Contextual data

This one is hands down in MARE’s favor. The Google Forms survey basically didn’t provide us with any contextual data at all (aside from the email address of the respondent). With Google Forms you can look at each individual response, but you don’t see what the respondent was doing when they answered the survey.

For example:

  • What action were they taking in the app?
  • Had they been on the site for a while that day before answering the survey?
  • Did they engage heavily with the app after they answered the survey?
  • How long had they been a user?
  • What technology were they using? (browser, device etc.)

None of this information is available with Google Forms, but it was available with the MARE exit survey.
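
To make that concrete, here’s roughly the shape of the contextual record that came back with each MARE response in this test, based on the list of fields in the results above. The type and field names are illustrative, not MARE’s actual schema:

```typescript
// Illustrative only: a rough shape for the context attached to each response.
interface SurveyResponseContext {
  username: string;
  pagesViewedBefore: string[];          // pages viewed before taking the survey
  pagesViewedAfter: string[];           // pages viewed after taking the survey
  campaignSource: string;               // original user campaign source
  browser: string;                      // user browser
  device: string;                       // user device
  userLevel: "Basic" | "Insights";      // user plan level
  otherSurveysAnswered: string[];       // other surveys answered by this respondent
  conversionGoalsCompleted: string[];   // conversion goals accomplished by this user
}
```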

Conclusion: If context matters to the results of your survey, Google Forms comes up way short.

4. Future potential

What can we do to act on the survey results we received from each service?

First, with Google Forms we have the email address of the user, so we can reach out to them directly to learn more about their answers, or even to offer them assistance if they need it.

However, this is pretty much where the future potential of Google Forms ends.

With the MARE exit survey data, however, we can do much more.

Because MARE tracks user behavior, we can see what kind of actions each respondent takes after the survey.

For example, we can look at the data in a month or two to see whether users who were happy with the service continued to use it, upgraded their accounts, or became blog subscribers.

Likewise, we can see the actions of users who were less satisfied with the service to try to understand what roadblocks they came across.

This future potential means that we have the ability to actually act on the information we collect with the MARE exit survey, and we can even easily follow up with the same users over time.

We can also segment our retargeting ads based on survey responses, or customize a user’s experience based on their answers.
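
As a sketch of what that follow-up analysis might look like once the response and behavior data are exported, here’s one way to segment respondents by their answers and check what they did afterwards. The record shape, the satisfaction scale, and the goal names are assumptions for illustration, not MARE’s API:

```typescript
// Illustrative sketch: joining survey answers with later behavior.
interface RespondentFollowUp {
  username: string;
  satisfactionScore: number;            // e.g. 1 (unhappy) to 5 (happy)
  conversionGoalsCompleted: string[];   // goals reached after answering
  activeSixtyDaysLater: boolean;        // still using the app two months on
}

function segmentRespondents(records: RespondentFollowUp[]) {
  const happy = records.filter((r) => r.satisfactionScore >= 4);
  const unhappy = records.filter((r) => r.satisfactionScore <= 2);

  // Did happy respondents stick around or upgrade?
  const happyRetention =
    happy.length > 0
      ? happy.filter((r) => r.activeSixtyDaysLater).length / happy.length
      : 0;
  const upgraded = happy.filter((r) =>
    r.conversionGoalsCompleted.includes("upgraded-account")
  );

  // Unhappy respondents are the ones to follow up with, or to build a
  // retargeting / custom-experience segment around.
  return { happyRetention, upgraded, unhappy };
}
```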

Bottom line

Ok, so we understand that we’re pretty biased. Yes, we created MARE.

But we also believe that this case study clearly shows some of the advantages of running a MARE survey over a more basic website survey.

We’d love to hear your feedback, questions, or comments about this case study in the comments below.

