This is a guest post that was originally published on the Cubeit blog.
Paul Graham says: "Make something people want." Sound advice, but it leaves would-be entrepreneurs with a serious problem.
How do you know what people want?
Before you go ahead and put the deposit down on that island you are going to buy with all the cash from your IPO, you need to know if what you are building will get past the 500k user barrier.
Validating an idea before you actually spend resources building a product is now a tenet of the startup methodology.
But how exactly do you validate a product without building it?
One way is the MVP. (But wait, isn’t the P in MVP for Product?)
Well yes. But if you read what we have already written about the Minimum Viable Product (MVP) methodology, we’ve gone out on a limb and said that the MVP doesn’t necessarily have to be a product.
Validation can be accomplished with zero lines of code. For example, the acceptability of a voice-enabled digital assistant (like Siri) could be tested by putting a person at the other end (rather than AI) and gauging the user's reaction when a query they pose is answered.
To understand how we tried to validate Cubeit, it is sort of necessary for me to make a pitch to you right now, so bear with me.
What is Cubeit?
To give you a bit of a background, Cubeit is a filing system on your phone. You can add any type of content — from links and documents to photos and videos, to tasks and calendar entries. It also makes organizing and consuming content on the phone super easy by displaying every piece of content as cards.
The next step is collaboration. You can share a set of links (called a collection or a Cube), just by swiping right, and the Cube instantly becomes collaborative. Anyone you share with can add whatever content they want.
Think of Cubeit as a collaborative Dropbox, except optimized for use on mobile, and hence much faster and easier to use.
However, from a product perspective, we needed to figure out quickly what people would use it for so that we could build the integrations required, and make the experience foolproof. But how did we do that without actually giving people something to use?
Note: All the tests below were conducted after we had a broad direction from our earlier user interviews. With the tests below, we were primarily looking for quantitative metrics.
How did we test without a product?
- Landing Page: This is now a standard methodology: set up a landing page, drive traffic to it using ads and blog posts, and measure clicks and conversions. Since we specifically wanted to validate messaging and use cases, we tried multiple variations (different use cases), all very distinct from each other. One stood out at around 20% conversion; the others hovered around 10%. We weren't really convinced by this approach, mostly because we didn't know what a good conversion rate was. Should we be happy with 15%, 20%, 25%? There was no foolproof number. In the end, we decided to aim for the moon: take the best-performing version and keep optimizing upward.
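As an aside, even without a benchmark for what a "good" conversion rate is, you can at least check whether the gap between two variants is real rather than noise. A simple way is a two-proportion z-test; the sketch below uses only the standard library, and the visitor counts are hypothetical, not our actual traffic.

```python
from math import sqrt, erf

def conversion_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: compare variant A's conversion rate
    against variant B's. Returns the z-score and two-sided p-value."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 20% vs 10% conversion, 500 visitors per variant
z, p = conversion_significance(100, 500, 50, 500)
```

With samples that size, the 20%-vs-10% gap is overwhelmingly significant; with only 50 visitors per variant, it often wouldn't be, which is worth knowing before declaring a winner.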
- Mockups of the app shared as albums on Facebook: The hypothesis was that showing the app in action would communicate the idea as well as landing page copy, and we benchmarked a fixed number of likes and shares as the success metric for the experiment. This experiment fell flat on its face, but we learned that the audience was probably not right for that particular piece of content. (We weren't spending any money on ads or to promote the content, and our guess was that we had already exhausted our page's audience capacity for likes and shares.) In any case, it didn't work.
- Video: The natural next step was video, but a video depends immensely on execution. Dropbox is an exception: they got immense traction with a video that had minimal production value (aside from all the easter-egg references that piqued the tech community), but that was also because their product was revolutionary and the first of its kind. For those of us not building the next time machine, production value matters immensely, and we didn't have that kind of $$ to spend on validation. (Also, we needed multiple use cases tested, which meant multiple videos, and more $$$.)
- The Blog as MVP: This was when I mucked around a bit on Reddit and hit upon the idea of using a blog as an MVP: explain the product in a long 1,000-word article, the hypothesis being that if you can't get people excited after taking 5 minutes of their time, then your idea isn't as great as you thought it was. This was hardcore, and we felt that if the blog got X recommendations (we posted on Medium), we could feel confident about going ahead and putting out a beta. A great example is how Mattermark evolved from a blog into a product.
So we ran a three-post series on Medium, trying to validate use cases and setting benchmarks for each post. Here's what we learned:
- Organic traffic on Medium is easy to achieve — One of the biggest problems with content marketing is getting traffic. Medium sort of solves that problem. Once your article gets some initial interest (highlights and recommends), organic traffic from within Medium really picks up, meaning you have to do less to get traffic to your post.
- Focus on the story — Medium is a great place to tell a story, with its visually pleasing layout and powerful editing tools. And that's exactly what a pre-launch startup needs for its niche: a great story. Forget the two-word catchphrases for now; if you get the story right, all that will follow.
- If you ask, people will give — We wanted to make sure people were not distracted from the story, so we didn’t link our website anywhere at all in the post. Instead, we asked people to recommend the post if they thought it was a workable idea, and if they didn’t, we asked them to comment and tell us why. It worked, and we got a ton of great feedback on how to improve the product.
- Validation is hard — However much you try, you can never truly validate an idea until you see people using the product. So everything from landing page conversion rates to click-through rates on FB ads only gets you part of the way there. The uncertainty between your validation exercise and 500k downloads is something you'll have to live with.
I've written this post as a tool for people wondering how to validate an idea. What I have learned is that, unfortunately, there is no slam dunk. (Except maybe paying customers, which is a problem for a B2C startup.) You will know you've hit the big time only when the downloads counter crosses 500k.
So here’s to more uncertainty, long hours, and not knowing what tomorrow brings.