TLDR: WATCH THE VIDEO (click above)
I canceled ClassPass (hey, nobody’s perfect). And then something interesting happened.
They sent me a survey to try to figure out why I left. So I decided to do a teardown – first of the email they sent me and then of the survey design.
And it’s not good. This video breaks down what’s wrong and how it could be better - from the email they sent to hook me, to the design of the survey meant to understand me.
Watch and find out 💡
4 things you’ll learn from watching this:
Common mistakes on incentives to get people to take your survey
If you should set time expectations to get people to do your survey
Specific wording for survey questions - and what to do instead.
How to design a survey to really understand why people are canceling.
📣 Have a friend who would enjoy these teardowns? Click the button below to refer them (& earn some great rewards)👇
Next week’s teardown will be real (Easter egg, anyone?) 😉 See you there 🎬
👉 This is a NEW series of product teardowns. Subscribe below to get future ones. Hit “subscribe.”
👂 Transcript:
Okay, I just canceled ClassPass and they sent me a survey to try to figure out why. So we're going to do three things in this video. We're going to evaluate the email they sent to convince me to take the survey, then how the survey itself is designed, and then finally rebuild it to see how they could have done better - how to get insights that actually translate into product changes they could make.
Okay, so first, this email is doing two things. It's anchoring me on how the survey takes five to seven minutes to finish, and it's dangling a $25 gift card you could have a chance to win. Now this is interesting, because that $25 could be very appealing if it were guaranteed.
However, they're saying “chance to win.” And when people evaluate these lotteries or sweepstakes, they're actually thinking about how likely it is that they'll win. You have a mega lottery, and when you enter it you know it's very unlikely you'll win, but the prize is huge. Then you have scratchers, where the prize is small but winning is plausible. So when you're designing these sweepstakes, you usually want to fall into one bucket or the other: a really big prize that's unlikely to be won but worth a shot, or a small prize that's highly likely. This $25 gift card bridges the middle a little too much.
It's probably low certainty. People look at that and think they're probably not going to win it, and it's not big enough to change my day and make me rearrange my schedule to take the survey. So if I were them, I'd offer either a smaller, certain prize or a really big, highly uncertain prize to capture one of those buckets.
And then second, there's this time expectation of 5-7 minutes. Now, I don't know what's going on in the survey writer's head. In one world, they could be thinking, look, 5-7 minutes is super short: I designed a 15-minute survey and cut it down to only 5 minutes, you're welcome. In another world, they could be thinking that if I tell people 5 minutes, they'll create time in their day: they know how long it's going to take and they'll start it when they're ready. I think both may be a tad misguided. So we ran a study to figure this out, with 3 conditions.
One condition had no expectations: we didn't tell you how long the task takes. In another condition we told you it would take 2 minutes, and in another, 10 minutes. And by the way, both were accurate; we just anchored people differently on how much they were doing. What do you think happened? In one world, the 10-minute condition is very kind: you're letting me know how much time it'll take, and I can make time in my day for it. Instead, the 10-minute condition performed worse than no time expectation at all. Why? Our hypothesis is that when you tell people 10 minutes, they say, great, I'll do it later. When is later? Later is never. With two minutes, you think, okay, I can do this really quickly. With no expectation, people may not have a number in their head at all, even if the task isn't exceptionally quick. So setting time expectations feels like something most designers want to do.
We all have these little progress bars: let me tell you how long it will take. I worry deeply about this, because when people see how long something will take, they may decide it's not something they want to start - and our goal is to get people to start. I once went on a hike, and somebody said, “Let's just go on this hike, Kristen.” I said okay. About halfway through I asked, “How long is this hike?” “Oh, it's seven miles.” And I was very happy - I wanted the hike - but I never would've started it if you'd told me upfront it was seven miles. Now that I'm on the hike, am I really going to turn around? So the problem with setting time expectations up front is that people may decide not to start at all.
You're introducing psychological friction right away - versus after you've started, when you understand what you're getting into and it's easier to continue and finish. Speaking of finishing, let's do this. “Take survey now.” Okay. The first thing I noticed is that they didn't save my data.
So the bummer for the survey designers is that I already went halfway through it, and now I either have to redo it or they learn nothing from the half I completed. Okay. “While you were subscribed, did you use your ClassPass membership?” A seemingly simple question - but if we were redesigning the survey, it's actually quite difficult, because what does “use” mean?
For me, I used it, but I didn't fully use all my credits every month. So what's in my head is different from what's in the survey designer's head. The goal of a good survey question is to make those the same: take ambiguous words like “use” and define them for people, so their answers represent an interpretation you can actually do something with.
But for now, I'll say yes. Okay. “When you were using ClassPass, what do you think the typical discount was?” Honestly, I have zero idea, and “no idea” is not an option. This is another problem with survey questions: if you ask a question, people will give you an answer. By the way, interviews have the same problem.
When you're sitting down with somebody and asking them a question, they'll give you an answer; we just don't know how confident they are. So if I say 20%, which is roughly my guess, and they asked me afterwards, “How confident are you in your answer?” I would say low confidence.
Then they'd know something: that I basically just guessed. Maybe it's a holistic sense I have, but if I'd said “high confidence,” it's possible I actually understood or knew what that discount was. We'll go with “next.” Great. So if you've done any product design, you know this would likely cause people to bounce.
They may not complete it, and the people who do complete it are a special breed of humans. As survey designers, we thank you for completing this - you know who you are - but most of us just won't. So I worry a lot about surveys like this: they self-select for a different type of person than your average user.
The first thing I notice here is that they anchor first on cost, which is going to prime me to think the reasons I left were about cost. For me, the reason I left was more about the availability of classes - it's not about any of these. So I may answer “agree” to some of them, because I do agree, but cost didn't actually drive my cancellation. This data will be pretty noisy, because I could strongly agree that it was more than I was willing to spend, and yet that isn't a predictor of my leaving the ClassPass subscription. So we're just going to answer a few of these.
“There's not enough businesses near me.” That's what mine was. Great. “I was not able to get the bookings or times I wanted.” I guess that's true too, because there weren't enough bookings. By the way, I really like this question: “A change in my life prevented me from using ClassPass.”
You know, we're pretty narcissistic as product people. We think everything's about our product, and if we can only change the product, then people would start using it. And in reality, there's lots of other things going on in people's lives. Could be relocation, could be injury, could be job. Maybe the job gave you a fitness bonus and you're using that instead.
More likely, people just moved. Or maybe they were getting in shape for a wedding and stopped after the wedding. Just guessing here. “Not enough businesses near me.” I thought I already answered that. Strongly agree. “What is the primary reason?” If I were them, I'd move this question up.
We'll talk about optimal survey design in a minute, but an incremental improvement would be to ask about the primary reason first, and then structure the survey to ask more about that reason. Because a stronger understanding of the primary reason is what the survey designers are really hoping to get in exchange for taking up so much of your time.
You're going to have to cut the survey to get down to that 5-7 minutes; you're going to have to optimize. Whew, this is so long. Also, in survey design terms, this question has too many answer options. What you'd really want a rational person to do is read each one and evaluate how likely it is to be the primary reason.
That's the optimal way to assess the primary reason. Obviously, people are probably not going to do that; they're just going to skim. So for this: “There are not enough businesses near me.” What's the second reason? And by the way, this is the second time I've taken it, and I'm still pausing to find where my reason is.
Okay, I'll say next. I'm only 17% of the way through, and there's an error. Sadly, they're requiring me to answer all of these questions. Which ones didn't I answer? Again, pretty noisy: if I'm required to answer them, I'm just going to fill them in, and they won't actually be predictive of my real answers.
Oh, okay. Let's see. I do not strongly disagree. I don't know. We're going to see if I can do this. Let's go here. Now, as you can see, I'm a pretty committed survey taker if I'm answering these. Okay, more questions. “When you were a member of ClassPass, how much did you spend per month on fitness?” This is a pretty hard question to answer, for a couple of reasons.
In general, to think about spend per month, I'd have to look at every month over, say, the last six months, think about my spend, and give ClassPass an average. Obviously, I'm not going to do that. The better way to ask this would be, “How much did you spend last month on fitness?”
Get people to think about the most recent instance. What did you eat yesterday, versus what do you normally eat for lunch? The recent, specific version is easier, because people can report what they actually remember rather than what they think they did. When we report what we think we did, we tend to be more idealistic - we have a social desirability bias, thinking about the ideal person we are in front of other people.
And the second issue with this question is the word “fitness.” What does fitness mean? Could fitness be my new cool workout clothes, or the dumbbells I bought? Or is it a gym membership? More likely, what they mean is what people are actually spending on classes.
So it's really difficult for me to answer this question in a way they could interpret. If I were looking at this data, I'd also suspect people are underestimating what they're actually spending. “When you were a member of ClassPass, how many days per week did you exercise?”
Again, this question should really be “Last week, how much did you exercise?” because that's easier for me to actually assess. Another option would be “Did you exercise more or less with ClassPass?” which is a much easier question to answer. And I know, because I've done this before, that there are all of these errors.
You can't say $0. One of the problems is that if you don't move the control, it registers as an error, and that's bad for the data. So we'll say, okay, I'll say zero. I think this is still going to be an error. This is a long survey, and we're not going to get through the full thing.
We're only 50% of the way done. Wow. But we've already talked about how, if I were ClassPass, I'd rebuild this survey. The first thing would be to simplify the ask - “Why did you cancel?” - and instead of having 17 questions, I would have categories.
I canceled for cost reasons; I canceled because the classes I wanted weren't on there; I canceled because of a change in job or home. Probably four categories would be optimal. And then, after that, you deep dive into what that actually looks like for somebody.
And the way I would deep dive is not by asking people more about the reason, but by getting them to think about what would bring them back - what change in ClassPass could bring them back. That gets people to contemplate the real reason they canceled. So imagine ClassPass said, “We have these three offers,” and showed me mockups of the ClassPass interface. One offer: three more studios opened up near you - do you want to rejoin? Another: rejoin with a 20% discount. Another: a 60% discount on classes. Even at 60% off, I still wouldn't rejoin, because I don't have the classes near me. So what we're doing here is trying to isolate the variable that would actually drive people to change their behavior when we ask these questions.
What people say is nice to know, but it's not as close as we want to be to what people would actually do differently, or what ClassPass should do differently because of it. And that's really the insight ClassPass needs from this cancellation opportunity: what do they do differently?
So if I answered something about cost as my primary reason, a smart survey would then show me 3 different incentives or credit systems to figure out whether one of them would cause me to rejoin. We're jumping to solutions here, but really we're trying to get at the say-do gap.
And bridging that gap is what would actually drive people to rejoin. The final thing I would do is add an open text field and have people write in. With ChatGPT and all these other AI tools, analyzing text is getting easier and easier. We didn't do this before because it was difficult to analyze all that text.
It's now getting easier. And second, when people use their own words, you can take those words and put them into the product. So if I say something like, “I really want more strength classes,” you could look at ClassPass's filtering and ask: do they have a filter for strength? Maybe there are strength classes, but they're just not visible to people.
So by asking these open-text questions, you're not just getting more detail; you're able to use the user's own words in a better way. And then the final hack would be thinking about what ClassPass did really well - what I will miss. There's a nice debate here: do you focus on your strengths or your weaknesses?
Many times people will say: focus on your strengths, that's your zone of genius, double click on that. So it's also possible, in product terms, for ClassPass to double click on its strengths. What is it that I would miss about the product? And can they re-emphasize that for current users in a way that would increase retention?
I don't think I can finish this with you guys - it's too long. We're already at 15 minutes. I know the survey writer didn't assume we'd be on a Loom talking through it, so we can't fully critique that 5-7 minute estimate, but this survey is just too long. Okay, we'll do one more page. I'm just going to guess; I'm so curious what's next. Oh gosh. “Occasionally.” Okay, there we go. This is a nice question here: “What is the one thing we could do to make you more likely to reactivate your membership?” I think this question still isn't phrased wonderfully, because it's just in theory, in the future - people will answer “I want it free.” We want people to make a real trade-off, to think, “In the next week, this discount is here. Would you take it?” So when you're asking these types of questions, you want to be more tangible and have people react to something as if it's an opportunity cost in their life.
So it's still going to cost you something, but this discount may be available.
*Questions about your product? Email kristen@irrationallabs.com.