Post mortem of a soft-launch

Psonar made a soft launch on Wednesday, letting in the first handful of the hundreds of users who had pre-registered over the last few weeks. I thought I'd try to draw out a few lessons from our experience that might help others in a similar situation.

The Psonar project is nothing if not ambitious, "iTunes in the cloud" at the very least, so we've taken our time getting to a minimum viable product. That doesn't mean we haven't been gathering feedback from prospective users, and the current release owes an enormous debt to the groups of teenagers who have given up their time over the last six months to talk to us about their music needs and desires.

First up - why a soft launch? As a tiny company, we think it gives us the best opportunity to get answers to the pressing questions a launch is meant to answer.

Let's start with the technical issues: does the SongShifter application work smoothly on the wide variety of computers and mobile music devices it actually encounters in the wild? Can our servers handle the load? We use a fairly standard set of tools to monitor the operational aspects of the service: Nagios checks the health of the servers, databases and so on, while we receive email notification of any errors occurring in the client-side SongShifter application. One of our first challenges on the day was to tune these to a useful level of sensitivity. The level of detail you want during development doesn't scale well with the number of users, and we were initially almost overwhelmed by the automated feedback, making it very difficult to pick out the genuine issues. Launching softly, initially to techie friends, meant that where issues were encountered we could contact people directly and get some quality bug reports.
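One practical way to keep that automated feedback at a useful level of sensitivity is to batch duplicate error reports rather than emailing each one individually. This is only a sketch of the idea, not Psonar's actual code; the class and field names are invented for illustration:

```python
import time
from collections import defaultdict


class ErrorBatcher:
    """Hypothetical sketch: collapse repeated error reports into one
    summary per distinct (component, message) pair, so a burst of users
    hitting the same bug produces one email instead of hundreds."""

    def __init__(self, window_seconds=300):
        self.window = window_seconds          # how long to accumulate duplicates
        self.counts = defaultdict(int)        # (component, message) -> occurrences
        self.first_seen = {}                  # (component, message) -> first timestamp

    def record(self, component, message):
        # Called for every incoming error report.
        key = (component, message)
        self.counts[key] += 1
        self.first_seen.setdefault(key, time.time())

    def flush(self, now=None):
        # Return one summary line per distinct error whose window has elapsed.
        now = now if now is not None else time.time()
        due = [k for k, t in self.first_seen.items() if now - t >= self.window]
        summaries = []
        for key in due:
            component, message = key
            summaries.append(f"[{component}] {message} (x{self.counts[key]})")
            del self.counts[key], self.first_seen[key]
        return summaries
```

A periodic task would call `flush()` and email the summaries, keeping the inbox readable while losing no information about how widespread each issue is.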

For example, the error reports we were sending ourselves originally identified only the component concerned in the subject line. We very quickly changed this to include the username as well (where appropriate), which makes scanning the inbox much faster when you're on Skype trying to resolve a particular user's issue.
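The change itself is tiny. A minimal sketch of the kind of subject-line builder described above, assuming a hypothetical `report_subject` helper (the name and format are ours, not Psonar's):

```python
def report_subject(component, error, username=None):
    """Build an email subject for an automated error report.

    Including the username where known lets you scan the inbox for one
    user's reports while talking to them."""
    who = f" user={username}" if username else ""
    return f"[{component}]{who}: {error}"


# With a username the report is attributable at a glance;
# without one it falls back to the component-only form.
report_subject("SongShifter", "device sync failed", username="alice")
report_subject("SongShifter", "device sync failed")
```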

Alongside this, we also wanted to gather usability feedback, and here a controlled launch also helps. Being able to engage users directly in conversation is vital, and the four of us simply can't manage that level of engagement from a standing start for hundreds of users. We're using Zendesk for our support site, but getting used to it will take time as it's a new tool for most of the team.

It's hard to see a product afresh after you've spent so long working with it every day. For example, SongShifter has a fairly minimal user interface, but it does have two buttons on the title bar that seem to attract an awful lot of clicks. What's worse, neither offers any immediate visual clue as to what has happened as a result of the click. That'll be changing later today for sure!

The final, and most important, set of questions surrounds the actual business and service goals. Are we providing a basic service people really want to use? How many would be interested in a premium service? Does the website encourage users to hang around playing with the various features? These (and more) can all be measured via a selection of KPIs, but again, fine-tuning the data gathering takes a bit of experience. Here we're using a lot of custom logging plus Google Analytics on the website, mainly reporting into ugly emails and rapidly thrown-together web pages; it's going to take a concerted effort to gather all the relevant stats together for each facet of the business.
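For the custom-logging side, even something very simple gets you started: count named events per day and pull them into a report later. A rough sketch under our own assumptions (the `KpiLog` class and the event names are hypothetical, not Psonar's internals):

```python
from collections import Counter
from datetime import date


class KpiLog:
    """Hypothetical sketch: tally named events (sign-ups, tracks shifted,
    premium-page views, ...) per day for a simple KPI report."""

    def __init__(self):
        self.daily = {}  # "YYYY-MM-DD" -> Counter of event names

    def record(self, event, day=None):
        # Log one occurrence of an event, defaulting to today.
        day = day or date.today().isoformat()
        self.daily.setdefault(day, Counter())[event] += 1

    def report(self, day):
        # Sorted event -> count mapping for one day; empty if nothing logged.
        return dict(sorted(self.daily.get(day, Counter()).items()))
```

Stitching counts like these into one page per facet of the business is exactly the "concerted effort" mentioned above.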

Next week we'll be letting in the rest of our Facebook community, and I'll be back to report on how it all holds up.