Accuracy

by Admin 29th March 2012 10:52

Our users "get" that we are fast, very cost-effective and extremely flexible. Surveys can be set up, deployed and the results reviewed online in minutes. Costs can be as low as £25, and you can set up a survey whenever and with whomever you want.

A couple of users have asked the excellent question: "How accurate is Usurv?" It's an easy question to ask, but it takes a bit of thought to answer. The only real test is to do a census, i.e. interview everyone in the entire population. That is hugely expensive; only governments can do it, and even then only once every ten years.

So what is accuracy? If you are conducting research for a new rheumatoid arthritis, HIV or cancer drug, the accuracy required is very different from what you need when collecting feedback on ice cream brands, or even when running a political poll.

Political polling is one area where research agencies really do care about accuracy. It's high profile and always in the press. Several companies (YouGov, Ipsos MORI, TNS and others) conduct political polling and publish their results in newspapers. If you study the polls, there are many days when two different research agencies field the same poll. The question usually asked, with only minor changes in wording, is "If there was a general election tomorrow, who would you vote for?", with Labour, Conservative, Liberal Democrats and so on as the options.

In fact, between 13th May 2010 and 30th September 2011 the same question was asked by two research agencies on the same day 83 times. How accurate were these polls? How close were the results to each other? On average, over those 83 days, the predictions for the Conservative vote varied by 2%, the Liberal Democrat vote by 2% and the Labour vote by 3%.
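To make that arithmetic concrete, here is a minimal sketch of the comparison in Python. The figures below are invented purely for illustration; the real analysis used the 83 matched polling days described above.

```python
# Sketch of the same-day poll comparison, using made-up illustrative
# numbers. Each pair holds the predictions of two agencies for one
# party, published on the same day.

paired_polls = {
    "Conservative":     [(37, 39), (36, 35), (40, 37)],  # (agency A %, agency B %)
    "Labour":           [(40, 42), (41, 38), (39, 42)],
    "Liberal Democrat": [(11, 13), (12, 10), (13, 12)],
}

for party, pairs in paired_polls.items():
    # Average absolute gap between the two agencies' same-day predictions
    avg_gap = sum(abs(a - b) for a, b in pairs) / len(pairs)
    print(f"{party}: predictions varied by {avg_gap:.1f}% on average")
```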

Our conclusion? They are all clearly very accurate. The average prediction for the Liberal Democrat vote over the same period was 12%, and this differed by 2%. Whatever poll you look at, the picture is for all practical purposes the same: the Liberal Democrats could only command a minority share of the vote. Whether that share was 12%, 10% or 14% is immaterial; the opinion someone would form reading an article based on the poll would be the same.

So on to our test: take results from three survey sources (Usurv, an iPhone app and a "traditional online panel") and collect data on exactly the same questions. Eighteen different questions were fielded over a six-month period, using exactly the same wording. The questions covered brands, opinions, preferences and behaviour.

So how close were the answers on average? Usurv differed by 3% from the average of the three sources, the "traditional online panel" by 3% and the iPhone app by 4%.
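Again, a short sketch shows how such a figure is calculated. The percentages below are hypothetical placeholders, not the real answers from the eighteen questions: for each question we take the mean of the three sources' results, then average each source's distance from that mean across all the questions.

```python
# Sketch of the three-source comparison, with hypothetical percentages.
results = [
    # (Usurv %, traditional panel %, iPhone app %) for each question
    (62, 60, 57),
    (45, 48, 50),
    (71, 74, 66),
]

sources = ["Usurv", "traditional online panel", "iPhone app"]
deviations = [0.0, 0.0, 0.0]

for answers in results:
    mean = sum(answers) / len(answers)          # average of the three sources
    for i, value in enumerate(answers):
        deviations[i] += abs(value - mean)      # each source's distance from it

for name, total in zip(sources, deviations):
    print(f"{name} differed from the average by {total / len(results):.1f}%")
```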

Our conclusion? These are all clearly very accurate.  

Anything else?

Use the approach that is fastest, most flexible and most cost-effective.

Anything else?

If a 3% deviation from the average is not good enough, then you'll have to conduct a census.