Price Change

by Admin 12th July 2012 08:01

Today we are pleased to announce a price reduction.  The cost of asking the first question in a survey reduces from 50p to 20p per respondent.  The aim of Usurv has always been to democratise the availability of market research.  This cost reduction reflects an important step in doing that.  Small businesses, bloggers and individuals are now able to ask a question of 50 respondents, get results back in minutes and pay only £10.

Larger businesses who want 1,000 respondents can achieve that by spending £200.
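The pricing above is simple per-respondent arithmetic. As a minimal sketch (the helper name and structure are our own illustration, not part of Usurv's product), the cost of the first question works out as:

```python
def first_question_cost(respondents, pence_per_respondent=20):
    """Cost in pounds of asking one question, at 20p per respondent."""
    return respondents * pence_per_respondent / 100

print(first_question_cost(50))    # small business: 50 respondents for £10
print(first_question_cost(1000))  # larger business: 1,000 respondents for £200
```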

Fishing the river

by Admin 11th June 2012 12:00

Usurv uses a recruitment technique called River Sampling.  An accepted definition of river sampling is an online sampling method that recruits respondents via a survey invitation while they are engaged in some other online activity, e.g. visiting a website, hence "fishing the river".

Usurv’s River Sampling 2.0 differs from standard river sampling in a number of crucial respects.

Unlike standard river sampling, which entices potential respondents away from a site, River 2.0 gets potential participants to complete short surveys (one to five questions) embedded in the site itself, accompanied by a motivating message such as "please answer our survey and we will donate x to our charity of the month" or "help us to keep our content free by answering the questions below".

Enabling potential participants to answer questions within their current digital footprint brings a crucial advantage: completion rates are extremely high (30%), so self-selection bias is minimised. Questions are answered there and then, with minimal disruption to an internet user's online activity. This is important because there is evidence that the representativeness of a participating group decreases exponentially as non-response levels increase (Leeflang and Olivier, 1980).

Waiting for a bus

by Admin 23rd April 2012 14:55

Ever waited for a bus only to have three come along?  Or, perhaps worse still, to find it was full?  Not a great feeling.  It's bad enough when you are on the side of the road, but when you are waiting for important business information, and paying good money, it really is a bit galling.

One characteristic of Omnibus research is that it runs to a schedule; usually something like questions by Thursday, programming on Friday, fieldwork over the weekend and data tables on Monday.  Not much help if you have a question on a Tuesday and need results as soon as possible.  This lack of flexibility extends beyond timing: sample size is usually fixed (it is necessary to ask 1,000 respondents), data provision is almost always data tables, and there are no respondent-targeting options, e.g. if your question is only relevant to females.

Usurv offers a complete Omnibus re-think.  We don't run to a schedule; we are always ready to leave.  You are guaranteed to be the only person on the bus, and your question won't be number 24 in a survey.  You can ask as many or as few people as you like (and only females, if that is what is relevant to you).  Your results are provided online as soon as fieldwork is finished (usually within an hour) and can be interrogated using our online reporting software.

Think of a journey in a taxi, but for the cost of a bus ticket.

Accuracy

by Admin 29th March 2012 10:52

Our users "get" that we are fast, very cost effective and extremely flexible. Surveys can be set up, deployed and the results reviewed online in minutes. Costs can be as low as £25 and you can set up a survey whenever and with whoever you want.

A couple of users have asked the excellent question: "How accurate is Usurv?"  This is an easy question to ask, but in reality it takes a bit of thought to answer.  The only real test is to do a census, i.e. interview everyone in the entire population.  Clearly that is hugely expensive; only governments can do it, and even then only once every ten years.

So what is accuracy? If you are conducting research for a new rheumatoid arthritis, HIV or cancer drug, accuracy is a very different matter than if you are collecting feedback on ice cream brands, or even conducting political polling.

Political polling, now there is an issue that research agencies like to be accurate about.  It's high profile and always in the press. Several companies (YouGov, Ipsos MORI, TNS, etc.) conduct political polling and publish their results in newspapers.  If you study polling, there are many days when two different research agencies field the same poll.  The question usually asked, with very minor changes in wording, is "If there was a general election tomorrow, who would you vote for?": Labour, Conservative, Liberal Democrats and so on.

In fact, between 13th May 2010 and 30th September 2011, the same question was asked by two research agencies on the same day 83 times. How accurate were these polls? How close were the results to each other?  On average, over the 83 days when two different companies fielded the same poll, the prediction for the Conservative vote varied by 2%, the Liberal Democrat vote by 2% and the Labour vote by 3%.
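The comparison above amounts to taking the mean absolute difference, in percentage points, between two agencies' same-day predictions. A minimal sketch of that calculation (the figures below are invented for illustration, not the actual 83-day dataset):

```python
def average_gap(poll_a, poll_b):
    """Mean absolute difference, in percentage points, between two
    series of same-day vote-share predictions."""
    return sum(abs(a - b) for a, b in zip(poll_a, poll_b)) / len(poll_a)

# Invented Conservative-share predictions from two agencies on three days:
agency_1 = [37, 36, 38]
agency_2 = [35, 38, 36]
print(average_gap(agency_1, agency_2))  # → 2.0 percentage points
```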

Our conclusion?  They are all clearly very accurate.  The average prediction over the same period for the Liberal Democrat vote was 12%, and this differed by 2%. The picture, whichever poll you look at, is for all purposes the same: they could only command a minority share of the vote.  Whether this was 12%, 10% or 14% is immaterial; the opinion someone would form reading an article based on the poll would be the same.

So on to our test: take results from three survey sources (Usurv, an iPhone app and a "traditional online panel") and collect data on exactly the same questions.  Eighteen different questions were fielded over a six-month period, using exactly the same wording.  The questions covered brands, opinions, preferences and behaviour.

So how close were the answers on average? Usurv differed from the average by 3%, the "traditional online panel" by 3% and the iPhone app by 4%.
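The three-source comparison works by averaging the answers across sources for each question, then measuring how far each source sits from that average. A sketch of the idea, using invented numbers rather than the real eighteen-question dataset:

```python
def deviation_from_average(results):
    """For each source, the mean absolute deviation (in percentage points)
    from the cross-source average, over a set of shared questions."""
    n_questions = len(next(iter(results.values())))
    averages = [sum(vals[i] for vals in results.values()) / len(results)
                for i in range(n_questions)]
    return {src: sum(abs(v, ) - a if False else abs(v - a) for v, a in zip(vals, averages)) / n_questions
            for src, vals in results.items()}

# Invented answers (% agreeing) to two shared questions:
results = {"Usurv": [40, 60], "panel": [46, 54], "app": [43, 57]}
print(deviation_from_average(results))  # → {'Usurv': 3.0, 'panel': 3.0, 'app': 0.0}
```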

Our conclusion? These are all clearly very accurate.  

Anything else?

Use the approach that is fastest, most flexible and most cost effective.

Anything else?

If a 3% deviation from average is not good enough, then you'll have to conduct a census.

Change4Life

by Admin 6th February 2012 15:39

Yesterday saw the launch of the latest Change4Life campaign, the first alcohol awareness campaign to use the Change4Life brand.  The campaign highlights the risks of regularly exceeding the safe alcohol limit and focuses on reducing alcohol consumption by encouraging booze breaks, going out later, swapping to a smaller glass and so on.

Most people have heard of the safe limit of 21 units for men and 14 for women, but what does that equate to in reality?  How much do people really think it is safe to drink?  In our survey of 500 people, approximately 1 in 7 felt that 2 pints of strong beer or 2 large glasses of wine daily is acceptable.  When these people become aware of the recent campaign they may be in for a rude awakening, as drinking at that level triples your chances of mouth cancer.  The better news is that over 70% think the safe limit for themselves is one pint of beer or one glass of wine a day, much more in line with government guidelines.

Since before Samuel Pepys described drinking "great drafts of claret", we have, as a nation, been in love with our drink.  So what, if anything, is likely to get us to reduce our alcohol intake?  There doesn't seem to be a single silver bullet that we think will help us reduce our drinking levels.  However, there are some ideas that may help start a change in our nation's attitude to alcohol.

Fourteen percent think setting a budget for a night out may help, though when the economy starts to roll again this may be less of an issue.  Other ideas that may help include: not storing alcohol in the house (13%), buying a soft drink when it is your round (10%) and only drinking with a meal (11%).  Given the drain on the NHS budget of treating alcohol-related disease, there will be many hoping that at least one (if not all) of these ideas contributes something to reducing our love of alcohol.

For full results click here.

500 people were interviewed on 6th February 2012