November 11th, 2015

Week 8 – Prototype 1 Synthesis

By Mary

Two weeks ago, the online learning innovation group put out their first working prototype, called “Cases.” Cases is a mobile app aimed at continuous case learning. It lets healthcare professionals form a group of peers, upload short video cases, and discuss those cases with video responses.

We built an MVP (“minimum viable product”) of Cases in a handful of days. Then we put it in the hands of 5 different Mayo Clinic groups. We also brought the concept (and a working demo) to 9 different external organizations to test for interest.

Launching new products is often a humbling experience, and this was no different.

In terms of external interest, we’ve had plenty of “that’s cool!” but no takers yet, which signals that our ratio of “perceived value proposition” to “assumed work required to use the app” (no, that’s not an official ratio) is out of whack.

In the lead-up to launch, we had been pleased with how easily our group organizers could (1) think of who they wanted to discuss cases with, and (2) create a couple of short video cases. But upon launch, we immediately hit problems. Our metrics clearly showed that invitees were not joining the application, and those who did join were not engaging in discussion.

The Desktop Wall

One problem was a straight-up usability issue. People were getting an email invitation to join their group, and they were reading that email on their computers, not their phones. To get the product out quickly, we had optimized the application only for mobile, so we blocked desktop use with a message asking people to open the link on their phones instead.

That turned out to be too much of a wall, so we had to spend a couple of days cleaning up the desktop view of the application. Video still worked best on phones, so we put in a feature that lets people send a text message to their phone with a link to the case they are viewing.
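For anyone curious about the mechanics, here is a minimal sketch of what that “text me this link” feature could look like on the server side, assuming a Flask backend and Twilio for SMS delivery. The route, field names, and URLs below are hypothetical illustrations, not the actual Cases code.

    # Minimal sketch of a "text me this link" endpoint (assumption: Flask + Twilio).
    # Route, field names, and URLs are illustrative, not the actual Cases implementation.
    from flask import Flask, request, jsonify
    from twilio.rest import Client

    app = Flask(__name__)
    sms = Client("TWILIO_ACCOUNT_SID", "TWILIO_AUTH_TOKEN")  # hypothetical credentials

    @app.route("/cases/<case_id>/text-link", methods=["POST"])
    def text_case_link(case_id):
        # The desktop view posts the viewer's phone number; we text back a deep link
        # so they can finish watching (and recording) the case on their phone.
        phone = request.form["phone"]
        sms.messages.create(
            to=phone,
            from_="+15555550123",  # hypothetical sending number
            body=f"Continue this case on your phone: https://cases.example.com/c/{case_id}",
        )
        return jsonify({"sent": True})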

The Trouble with Video

People seemed to enjoy watching a 1-2 minute video case, but we were not getting responses. This could partially be attributed to the “no one wants to be first into the pool” problem that all community sites must overcome. But as we studied what was going on, we believed that video played a part.

One person didn’t film a video response because they had just left the gym and didn’t want to look unprofessional; they forgot to come back and do it later. Other people were hesitant to be wrong in front of their peers: video either felt too heavy, or they didn’t like that they could not easily edit their responses later. Other things got in the way as well, such as playback speed or background noise while listening or recording.

While there are very interesting trends these days around video in apps such as Periscope and Snapchat, those trends are largely driven by younger audiences. Our conclusion from this quick test is that video is too far outside the comfort zone of most healthcare professionals today, save for the newest generation in the field.

Was Testing Video a Mistake?

Going into our MVP, we knew that video was a big risk. We had one advisor who straight up told us, “this will never work.” But therein lies the rub. If an innovation group never does things that people are skeptical about, it will never innovate. We have to take risks, but test quickly.

We also have to take smart risks. One could debate whether video qualifies here, but as noted above, video is an increasingly common way to communicate on mobile phones. It had the potential to humanize the interactions in the app and bring groups closer together. There are already text-based case discussion platforms out there; while the existence of a competitor is not a reason to shy away from an idea, we did want to push the limits of the state of the art and see if it cracked open new behaviors.

Ultimately, if we fail, we just need to do it quickly, and try to learn as much as possible.

Persevere, Pivot or Kill?

Given what we have seen, our recommendation would be to do a hard “pivot.” We still believe that case learning is an interesting space. We still think that crowdsourcing content is an appealing model. We are uncertain about whether self-organizing groups can work, but we like the potential there to reduce customer acquisition costs.

We believe the experiment has shown that video will not work; however, we do not think a text-only approach would be innovative enough. Our recommendation, if we were not already switching gears for a second prototype, would have been to stay focused on case learning but do a complete overhaul of the product design based on our lessons so far.

This Week

In week 9, we are synthesizing what we have learned so far, reviewing our quantitative metrics, and doing additional qualitative research to understand how people viewed the app experience.

We are also prepping for our open house in Rochester, MN next Tuesday from 3pm to 6pm CT. Anyone at Mayo Clinic is welcome to join us and walk through our journey so far. Open houses will be hosted in Arizona and Florida after the first of the year; those sites will be able to see the full journey, including Prototype #2.

We also have to come up with a second idea to prototype, and so minds are starting to turn towards new problem spaces and potentially creative new solutions.

Tags: Experiments, Prototype, Synthesis, Testing, Weekly Updates
