The paper deals with the problems and prospects of creating and using software for social surveys and market research in Ukraine. The experience of SOCIS (an associate member of GALLUP International) is examined. To make the Ukrainian situation for social and market research easier to understand, a brief description of the social and economic environment in Ukraine is given at the beginning of the paper, and the status and position of Ukraine amongst other European countries is examined. This seems especially important now that more and more multinational companies are entering the Ukrainian market, and more and more research companies operating internationally are offering their clients the possibility to conduct market research in Ukraine. Special attention is given to the situation within the computer and software markets in Ukraine. A retrospective review of sociological science in Ukraine is followed by a detailed analysis of the past and present of Ukrainian experience in conducting large-scale opinion polls and market research. The current situation is analyzed with respect to the existing practice of the SOCIS company, which was among the first independent private companies to conduct social surveys and market research, at a time when the words 'marketing' and 'market research' were known only to a few specialists. The necessity of using up-to-date hardware and software was fully recognised by SOCIS from the very first minute of its existence, but the use of modern Western developments was rather limited for economic reasons.
This paper describes ways in which new methods of data collection enable us to compile accurate and timely data on interviewer performance. Traditional ideas of interviewer quality analysis are considered, and a new approach is proposed that is only possible because of the new technologies now available. The paper also highlights some of the pitfalls to be found along the technological path, and reflects upon them with the benefit of hindsight. It is based on direct experience of the period from July 1993 to December 1994, during which BMRB went from being totally non-CAPI to (nearly) completely CAPI for its 50 face-to-face field days per annum.
The paper concentrates on two new tools for data collection developed by Infratest Burke in conjunction with Quantime in 1994. The first part demonstrates the value of using scanned images in a CAPI process. Computer-assisted personal interviewing undoubtedly delivers cleaner data in shorter turnaround times and can cope with more complex questionnaires than pen-and-paper. In the past, administering a CAPI interview that required many show cards was a problem: ideally, these should be presented in random order, or according to an even more complex scheme. Now that the computer can search for and display the pictures according to rules set out by the researcher, even more face-to-face pen-and-paper work can be converted to CAPI. In the near future, we can expect hardware developments that will allow us to display not only pictures but also videos on an interviewer's PC in a cost-efficient way. Our colleagues from Burke Marketing Research in Cincinnati, Ohio, have already been using this feature for about four years, but only on stationary PCs in malls. In the second part of this paper, I will show how Infratest Burke is using sound in its CATI interviews. This has been possible for many years by using tape recorders, but the administration and analysis are very difficult. With sound cards in the interviewers' workstations, we can work much more flexibly when playing and recording sound. Unfortunately, we are still limited by the quality of the telephone lines, but we are quite certain that this will change quickly.
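The rule-driven, randomized presentation of scanned show cards described above can be sketched as follows. This is a minimal illustration, not Infratest Burke's actual software: the image file names are hypothetical, and the only "rule" implemented is a per-respondent random order, seeded so it can be reproduced for quality checks.

```python
import random

def show_card_order(image_files, respondent_id):
    """Return a reproducible random presentation order for one respondent.

    Seeding the generator with the respondent ID keeps the order
    reproducible for later checking, while still varying it
    across respondents to avoid order effects.
    """
    rng = random.Random(respondent_id)
    order = list(image_files)
    rng.shuffle(order)
    return order

# Hypothetical scanned show cards for one question
cards = ["brand_a.bmp", "brand_b.bmp", "brand_c.bmp", "brand_d.bmp"]

# Each respondent sees the same cards, generally in a different order
print(show_card_order(cards, respondent_id=101))
print(show_card_order(cards, respondent_id=102))
```

In a real CAPI instrument, the interviewing software would display each image in the computed order and record the order alongside the answers.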
Marketing researchers are constantly attuned to the methodology of data collection, understanding this facet of research along a number of dimensions: collection procedure validity, demand characteristics, length of field execution, and research cost. Today's marketing efforts require researchers to operate faster, better, and at lower cost than we do now. MarketWare has commercialized a technology for collecting consumer purchasing data which addresses the issues of speed, validity, and expense. Using a virtual reality computer simulation, the Visionary Shopper® research service puts consumers in a laboratory-type setting where they are asked to shop as they normally would on a 3-dimensional virtual reality computer system. Consumers can maneuver around store shelves and can pick up, examine, and purchase products. The computer system automatically records what they choose to pick up and look at and what they choose to buy. This technology has been used in many parts of the world to investigate the impact of pricing, promotion, packaging, shelf set, and shelf assortment on consumer purchasing behavior. As a simulation, Visionary Shopper gives the consumer the ability to interact realistically with realistic products in a familiar context. The system is predictive in that it relies on measures of consumer behavior (not intent) and does not require back data or norms for the data to be understood. Compared to in-market tests, the system is fast (usually 3-4 weeks per test), flexible (changes are made to a computer program, not a store), and relatively inexpensive. The system is confidential, both for the client, who does not expose marketing strategies to the world, and for the respondent, who interacts with a computer rather than an interviewer, avoiding demand characteristics. This paper will discuss virtual reality in relation to marketing research, specifically the Visionary Shopper system, presenting its advantages and comparing its use to other technologies.
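The automatic recording of shopper behavior described above amounts to keeping an event log per respondent. The sketch below is an illustrative assumption, not the Visionary Shopper data format: the event names, fields, and products are invented to show how pick-up and purchase events can be separated in analysis.

```python
from dataclasses import dataclass, field

@dataclass
class ShoppingSession:
    """Records what one simulated shopper picks up and buys."""
    respondent_id: int
    events: list = field(default_factory=list)

    def pick_up(self, product):
        # Logged whenever the shopper takes a product off the shelf
        self.events.append(("pick_up", product))

    def purchase(self, product):
        # Logged when the shopper puts a product in the basket
        self.events.append(("purchase", product))

    def purchases(self):
        return [p for (kind, p) in self.events if kind == "purchase"]

    def examined_but_not_bought(self):
        bought = set(self.purchases())
        return [p for (kind, p) in self.events
                if kind == "pick_up" and p not in bought]

# One simulated shopping trip
session = ShoppingSession(respondent_id=7)
session.pick_up("Brand A cola")
session.pick_up("Brand B cola")
session.purchase("Brand B cola")

print(session.purchases())                # ['Brand B cola']
print(session.examined_but_not_bought())  # ['Brand A cola']
```

Separating "examined" from "bought" is what lets such a system study packaging and shelf placement from behavior rather than stated intent.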
This paper presents an actual Market Information System established at the Marketing Institute at the Copenhagen Business School. It is a user-oriented system, aimed at helping the market manager make full use of the information value of different databases. The scientific approach to the MIS is classic information theory, including statistical methods and model building. The foundation is the numerous electronic databases integrated in the system, and the paper reveals a number of problems found in these databases that must be solved before the demand for information will increase. The Market Information System is implemented in research and education, and a full demonstration will take place.
In this paper the process of large-scale quantitative research is examined with the aim of finding ways to reduce costs and improve quality. The paper focuses on the role IT plays in fieldwork preparation, fieldwork management, data collection and analysis. These activities usually require the most effort in terms of time and manpower and are critical for the quality of the output of the process. The Brand Power Experience study is used as a "worst case" to illustrate the way NIPO uses IT in these stages. The case is chosen for its extremity: in an international face-to-face survey, carried out in tens of countries all over the world, more than 1,000 brands per country were tested for awareness, image, and many other aspects. NIPO carried out fieldwork in the Netherlands and was the only participant using CAPI. Unlike many participants, NIPO's standard procedure appeared to hold up very well. During the exploration, NIPO's information-processing strategies (standardization, decentralisation and information systems) are discussed. The benefits of electronic information systems in particular become clear. Besides the known methodological advantages and high data quality they offer, they improve communication between research, service and fieldwork departments. Better integration opens up possibilities for a more efficient research process. An investigation into the profits and costs of NIPO's research systems shows that the return on investment in information technology can be considerable. Finally, new challenges for information technology are identified, which are currently being evaluated by NIPO. These challenges relate to systems for paying and checking interviewers, fieldwork management, and using audio and video in face-to-face interviews.
Bewitching developments in information technology have prompted us to present this paper. I am sure that all of you have heard the magic words that come with the new technology: Pentium, P6, RISC, CD-ROM, POWER PC, Windows 95, edutainment, object-oriented programming, voice recognition, virtual reality, neural networks, ATM, the electronic highway, and what have you. Today we will try to give you an idea of the importance of these developments, so that you will at least know in which direction you have to go. To be honest with you, I believe most roads in the IT jungle still have to be built and that these roads will need an awful lot of bridges. Mr. Shota Hatton of Kozo Keikaku Engineering and Mr. Richard Miller of Consumer Pulse, who have a lot of experience with building new roads, are here today to tell you more about what they have achieved with multimedia in market research.
The avalanche of information in the media, and in particular on TV, has led media users to demand more and more return on their investments. Today it is common practice for advertisers with an international dimension to determine the level of effective coverage to be reached in each country. Initiative Media has implemented the IMPROVE system in Europe over the last couple of years, which allows an international coordinator to answer homogeneously for each country by counting raw people-meter data according to local specifications. Thanks to this experience, we know some of the ways to succeed in technological transfer in the media. First, we will see why the approach that we use at present is possible and why it could not be applied a couple of years ago. Then we will see how data from different origins can be processed using the same tool (IMPROVE) and how this tool responds to an advertiser's problem, no matter which country is concerned. Finally, by means of a simplified but real case study, we will see how responding to the same brief can create fundamental differences (or similarities) in the choice of day-parts according to the country.
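The notion of effective coverage computed from raw people-meter data can be sketched as follows. This is a minimal illustration under stated assumptions, not IMPROVE's actual specification: the panel, the contact counts, and the three-contact threshold are invented, and real systems would apply panel weights and local counting rules.

```python
def effective_reach(exposures, threshold):
    """Share of panel members exposed at least `threshold` times.

    `exposures` maps each panel member to their number of contacts
    with the campaign, as counted from raw people-meter data.
    """
    reached = sum(1 for n in exposures.values() if n >= threshold)
    return reached / len(exposures)

# Hypothetical contact counts for a five-person panel
exposures = {"p1": 0, "p2": 2, "p3": 3, "p4": 5, "p5": 1}

# Effective coverage at a 3+ contact level: 2 of 5 panel members
print(effective_reach(exposures, threshold=3))  # 0.4
```

The "local specifications" mentioned above would enter such a computation as country-specific thresholds, weights, and definitions of a contact, while the tool itself stays the same.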
Information technology has improved to such an extent that computer assistance is now possible for telephone interviews (CATI) as well as for personal interviews (CAPI). Data quality is increased by using the computer because sequential effects are avoided by randomizing question and answer lists, the interviewer is led through the questionnaire (automatic filtering), and answers can be checked for consistency while the interview is being conducted. On the other hand, the interview setting is changed when using the computer, and may be experienced as less "personal" by the participants. This raises the question whether it is possible to compare the results of studies with different data collection methods, and whether it is possible to change the data collection method in ongoing studies without changes in results. This article describes the research design and the results of a method study conducted by GfK in 1994. In a laboratory study and a field study with more than 600 interviews, the results of three data collection methods, CATI, CAPI, and the traditional paper-and-pencil interview (PPI), were compared. In conclusion, one can say that the data collection method does indeed influence the results. There are statistically significant differences between PPI, CATI, and CAPI in the number of answers to open questions, in the number of yes responses, and in the assessment of scales. These differences could be traced back partly to varying response spontaneity, as measured by response latency. The method study also showed some ways to minimize these differences.
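The three quality mechanisms named above (randomized answer lists, automatic filtering, and in-interview consistency checks) can be sketched in a few lines. The questions, the filter, and the consistency rule are hypothetical illustrations, not GfK's actual instrument; the simulated answers merely stand in for respondent input.

```python
import random

def administer(respondent_id):
    """One interview record, illustrating the three CAPI/CATI mechanisms."""
    rng = random.Random(respondent_id)  # reproducible per respondent

    # 1. Randomize the answer list to avoid sequential (order) effects
    answers = ["very good", "good", "fair", "poor"]
    rng.shuffle(answers)
    record = {"answer_order": answers}

    # 2. Automatic filter: the follow-up is asked only if applicable
    record["owns_car"] = rng.choice([True, False])  # stand-in for a real answer
    if record["owns_car"]:
        record["car_age_years"] = rng.randint(0, 20)  # stand-in answer

    # 3. Consistency check while the interview is running
    if record.get("car_age_years", 0) > 60:
        raise ValueError("implausible car age; interviewer must re-ask")
    return record

print(administer(respondent_id=1))
```

On paper, such checks can only happen after fieldwork; running them during the interview is what produces the cleaner data the abstract refers to.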
It is a well-known fact that the registration of purchases of fast-moving goods is a difficult problem which requires automation, both to reduce the effort of the households which have to provide the information and to reduce the work of the research organization which processes the data. It is clear that one would like to use procedures which avoid coding, printing, mailing, punching and data cleaning as much as possible. A consortium of research groups has developed a completely automated system for information processing (CASIP) for this purpose, which satisfies these requirements and some more (Saris, 1994). In this paper the system is explained and compared with alternative systems designed for the same purpose.