This paper explores issues that come into play in designing and fielding high-quality global web-based research, using the framework of survey errors to point out several current weaknesses in the field. The current focus of multi-national web-based research has been on the technical capabilities of such research rather than on how to minimize error. The results of several experiments designed to demonstrate the need for a local approach to global web research are discussed.
The majority of web businesses do not provide a good customer experience: they are unusable and/or do not meet customers' expectations and preferences. One reason for this state of affairs is that good websites depend on both good marketing and good software development and usability. Unfortunately, these two disciplines are largely ignorant of each other, with different traditions as well as different goals and metrics for success. Marketers rely on the metric they are most familiar with: volume. On the web, this means the number of visitors to the site and how long they stayed. However, both disciplines are critical to building a web business that makes customers happy and drives loyalty. More sophisticated metrics and research methods are needed to better understand how customers experience a site, incorporating both usability and marketing issues. This paper describes the state of customer experience on the web, discusses various methods for understanding customer experience on a website, and argues for a new multi-method approach that incorporates the critical elements of several different methods.
Over the last few years there have been tremendous advances in the world of conjoint analysis in terms of software, mathematics, topics considered, and approaches. These advances create exciting new opportunities for conjoint analysis over the Internet. This paper seeks to set a context for conjoint analysis by reviewing how we got to where we are now, and then explores the new possibilities that are opening up because of the web. Finally, the paper tries to draw the threads together by raising questions about where these techniques will take us next.
This paper describes the transitions in business use of the Internet and the research opportunities these offer. It reviews some of the different research techniques currently used, including what works and what does not, and reports the results of offering respondents a web-completion option on a global tracking study. Finally, the paper maps out the developments in the next generation of computer-assisted interviewing (CAI).
This chapter reviews the research environment of today and makes recommendations for the majority, in Internet language the 'newbies'. If you are amongst the 1% of hard-core enthusiasts (the digerati), this chapter will not seem very new. However, even for aficionados, some of the warnings may be of relevance. The first section looks at the historical context, the recent ESOMAR position paper, a two-factor categorisation of Internet interviewing, and an informal review of the status quo in Europe. Having established a framework, the chapter then reviews the dos and don'ts of Internet interviewing.
The philosophy behind the Delphi Technique for forecasting is to gather a group of experts, all of whom have different perspectives on the issue being studied. Through a series of iterative surveys and discussions, the learning of the group is meant to simulate the learning of the marketplace or a society over time. Experience has shown Delphi processes to be highly accurate in predicting future market structures and events. However, the process can require six months of time and a quarter of a million dollars of budget. This paper discusses the use of the Internet for conducting a Delphi process, reducing the time required from the twenty to twenty-six weeks of a traditional Delphi process to a mere five weeks. Cost savings are also significant.
Internet interviewing is still in its infancy. How do we know, therefore, that the results obtained by such a technique are plausible, and if so, what special considerations do we need to take into account when interpreting results from this type of survey? These were the questions we set out to answer in a carefully controlled parallel study using a traditional CATI telephone survey and a Web-based survey of the same population, with the same questionnaire.
This paper deals with a study based on a large opinion survey conducted on the Net by CNN after the death of Lady Diana. It aims to analyse the content of the discourse about Diana. SORGEM and IBM-ECAM collaborated in devising a methodology of automatic textual analysis dedicated to qualitative studies.
This paper describes current developments in on-line focus group techniques and the applicability of such methodologies. It argues that, while on-line focus groups have limitations in terms of their applications, there are clear and common instances where Internet-based approaches are not just feasible but preferable compared with traditional face-to-face approaches.
This paper presents case histories of several Internet-based research programs intended to evaluate and enhance Web sites. Findings from these on-line studies are compared to an off-line study in which Internet sites were evaluated using a traditional question-and-answer methodology similar to that commonly utilized in the assessment of advertising. The paper concludes with a discussion of the recommended applications of different Web site research methodologies.
The Internet offers several versatile new technologies which can be applied not only to interviewing Internet users but also as a new and superior means of solving problems for which CAPI is presently used. Internet-based interviewing is not just for self-completion interviews but is useful to interviewers as well. Early work in this area has been concerned with how Internet interviews may be achieved at all, even with a simple questionnaire. This paper considers the next stage, when researchers will wish to field major questionnaires of the kind supported by other CAI tools. It examines the alternatives available for interviewing over the Internet and considers the strengths and weaknesses of each in the context of major questionnaires. The conclusion of this analysis is that an organisation must be prepared to use different CAI techniques for different applications. The author presents a methodology for preparing surveys based on this analysis and shows its application in the case of the RegioLicht Electronic Community experiment in the Netherlands.
The following paper presents a new research tool, N Viz, which enables online researchers to randomly sample the visitors of individual websites and web pages. The first part reviews the relevance of the tool for advertisers. The N Viz methodology is then comprehensively described, including some experiences from ongoing tests in the United States and in Germany, and lastly two example applications are outlined.