This short note was written after the ESOMAR Congress discussion session I (Ray Poynter) led, and is an attempt to reflect the many strands that emerged during the debate. If you would like a copy of the PowerPoint I used as the stimulus for the discussion, it can be downloaded at the bottom of this post.
The first question raised in the debate was based on the following observations:
- Before the Internet, we assumed people only rarely completed surveys, that they were selected by the research process, and that they fitted a random sampling model.
- Since the adoption of online access panels, people are doing 50+ surveys a year, choosing to be members of panels, and cannot be assumed to be a random sample.
Is this an important difference?
The consensus view was that this change was more apparent than real. Conventional research had, in reality, been re-sampling the same people, ignoring non-response bias, and turning a blind eye to a variety of professional respondents. At least with the rise of online panels the issue is being aired openly and investigated.
The second question was based on the following two observations:
- Turnarounds are getting faster, thinking time is shrinking, more research is becoming commodity research, and more so-called insight is simply the reporting of the low-hanging fruit, where the first plausible story is the only story reported.
- Research prices are falling, largely driven by the online access panel price wars, creating less scope for experimentation and more risk avoidance.
Again we asked, is this an important difference? Again, the consensus was that this change was one of degree rather than a transformation. Timelines have been getting shorter since the invention of the telephone, not to mention the fax (if you are too young to remember the fax, ask one of the old-timers). Research needs to adapt to the new realities of cost and time frames.
The third set of observations looked at online panels and the growth in outsourcing:
- Most online research is conducted via third-party panels, which are busy merging and combining.
- Systems, software, scripting, and tabulating are increasingly outsourced.
One of the changes that the growth of panels and outsourced solutions has created is that a small consultancy can use the same sample sources as one of the large multinational agencies, the same systems, and potentially even the same Indian outsourcer for tables. There was a consensus that commodity research is going to become more similar, and that more and more tasks will be outsourced.
The fourth set of observations built on the previous points and looked at the issue of lower barriers to entry:
- Anybody can use a panel company: small agencies, clients, non-researchers.
- Data collection systems can be very cheap and can include a wide variety of additional functionality, e.g. tables.
- Quant research can be conducted from anywhere.
- Small companies and non-researchers are perfectly viable (economically).
There was more disagreement about whether the Internet was raising or lowering the barriers to entry. Whilst some speakers pointed out how easy it is for a small agency to buy top-quality sample, scripting, and hosting, others pointed to the way the large agencies continue to increase their share. There was, however, agreement that the 'worth' of people who know how to conduct good research is rising, and that not having access to these good people can be a major barrier to entry for new companies.
About half the debate focused on Web 2.0 and the implications for market research. As an example of the difference between Web 1.0 and Web 2.0 we started with the following table:
Web 1.0 → Web 2.0
- Encyclopaedia Britannica → Wikipedia
- News & Editorials → Blogs
- Downloadable Movies → YouTube
- Photo Albums → Flickr
- Newspapers & TV → User Generated Media
- Amazon → Price and review sites
- WalMart → eBay
- Online chat → MySpace
Most speakers, but not all, accepted that Web 2.0 was something different and that the rules were being remade. One speaker from China wanted to know what to call the next revolution after Web 2.0: the one driven by the engine of China as it increasingly becomes the leader in more areas of world trade and leapfrogs existing technologies, perhaps by focusing on mobile approaches. Web 3.0, he wondered?
The existing research paradigm was highlighted with the chart below, raising the question: what will Research 2.0 be like?
Research 1.0 → Research 2.0
- We select when to do surveys → ?
- We select the respondents → ?
- We pick the questions → ?
- We pick the answers → ?
- We keep the process secret → ?
- We keep the results secret → ?
- We treat customers as lab rats → ?
Several speakers talked about issues such as co-creation, user-generated media, allowing respondents to self-profile (as they do on MySpace), and the need to create communities.
A key conclusion of the session was that Research 2.0 will need new skills and a greater understanding of how people are taking over the Internet. It was also clearly understood that any researcher who wishes to be competent at Research 2.0 will need to master conventional research.
The final point, which I made as moderator, was that while we have experienced about two years of rapid change on the Internet and in the way it impacts market research, we are probably going to have another 20 years of similarly fast, large, and transformational change.
Download is_research_changing_v2.pps