This paper describes an operational model used within the scope of a budget process for a warranty extension product. The model is built on a database recording the costs incurred for several thousand vehicles over a period of five years. Each component was modelled in two different ways: the classic way, using linear or intrinsically linear models fitted by regression together with traditional distribution functions; and an alternative way based on neural networks. The transition from a calibration achieved by traditional methods to one achieved by neural networks does not in any way affect the model itself. On the other hand, the tremendous flexibility of neural networks has significantly improved the model's predictive capabilities without appreciably adding to its complexity.
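As a rough illustration only (the paper's data and model specification are not reproduced here), the following Python sketch contrasts the two calibration routes the abstract describes, fitting a linear regression and a small neural network to invented component-cost records; all variable names and figures are hypothetical.

```python
# Hedged sketch: the paper's data are not public, so this only illustrates
# the two calibration routes it contrasts -- a linear regression and a small
# neural network fitted to the same (invented) component-cost records.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform([0, 0], [5, 150_000], size=(2_000, 2))   # age (years), mileage
y = 40 + 12 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(0, 10, 2_000)  # stand-in costs

linear = LinearRegression().fit(X, y)                    # classic calibration
neural = make_pipeline(StandardScaler(),
                       MLPRegressor(hidden_layer_sizes=(16,), max_iter=2_000,
                                    random_state=0)).fit(X, y)

probe = np.array([[3.0, 60_000.0]])                      # a 3-year-old, 60,000-km vehicle
print(linear.predict(probe), neural.predict(probe))
```

Because both estimators expose the same fit/predict interface, swapping the regression for the network leaves the surrounding budget model untouched, which mirrors the abstract's point about the transition between calibrations.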
Customer satisfaction analyses often suffer from the difficulty of compressing the large amount of data gathered into relevant, concrete recommendations for action, so that the practical significance and operational usability of the results are frequently disputed. The approach suggested here describes how a model of the factors influencing customer satisfaction, from which initial measures can be derived, can be produced from empirically gathered data using a special kind of neural network. Relevant, precise recommendations for action are then derived by testing candidate measures with the neural network as a simulation tool. This reduces the volume of information in a customer satisfaction survey to a list of measures to be implemented by the decision maker.
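A minimal sketch of the simulation idea, assuming an invented three-driver satisfaction model; the driver names, data, and network size are illustrative, not the study's.

```python
# Hedged sketch of the "neural network as simulation tool" idea: train a
# network on survey data (drivers -> overall satisfaction), then simulate a
# measure by improving one driver and reading off the predicted effect.
# Drivers and ratings are invented for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
drivers = ["service_quality", "waiting_time", "price_fairness"]
X = rng.uniform(1, 5, size=(500, 3))                     # 1-5 ratings per driver
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.2, 500)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3_000, random_state=1).fit(X, y)

baseline = net.predict(X).mean()
scenario = X.copy()                                      # measure: improve waiting-time
idx = drivers.index("waiting_time")                      # satisfaction by one point
scenario[:, idx] = np.clip(scenario[:, idx] + 1.0, 1, 5)
print(f"simulated satisfaction lift: {net.predict(scenario).mean() - baseline:+.2f}")
```

Ranking several such what-if runs by predicted lift is one plausible way to arrive at the prioritised list of measures the abstract mentions.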
We shall look for a segmentation of small-car buyers in France with similar product expectations. After discussing the difficulties, we shall describe the data collection on which the segmentation is based before describing the methodology. The method presented here, using neural nets, aims to eliminate redundancy in the data and to weight the variables automatically in order to cluster the buyers into segments that are homogeneous in their product expectations. The last part describes the results of applying this method to the automobile market for the Peugeot-Citroën group.
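The abstract does not name the network type; assuming a Kohonen-style self-organizing map (a common neural choice for this kind of segmentation), a sketch using the third-party minisom package might look as follows, with invented expectation ratings.

```python
# Hedged sketch: a self-organizing map is assumed here, not confirmed by the
# abstract. Each buyer is mapped to a cell of a small grid; cells act as
# segments homogeneous in the expectation items. Data are invented.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(2)
X = rng.uniform(1, 7, size=(300, 6))          # 300 buyers x 6 expectation items
X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize before mapping

som = MiniSom(3, 3, input_len=6, sigma=1.0, learning_rate=0.5, random_seed=2)
som.random_weights_init(X)
som.train_random(X, num_iteration=5_000)

segments = [som.winner(x) for x in X]         # each buyer -> a map cell (segment)
print(segments[:5])
```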
The process of categorising customers is traditionally based either on simple slotting procedures or on more extensive survey data. The first procedure can be inaccurate and/or uncertain; the latter is complicated, time-consuming and, above all, expensive. It was therefore a prioritised objective for Telenor Mobil to compose an algorithm that predicts customer segments in a more straightforward way from easily accessible data: that is, to predict segment membership for all subscribers in the customer database based on variables already registered.
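A hedged sketch of the prediction task as described: train a classifier on subscribers with known (survey-based) segment labels, then score the rest of the base from registered variables. The data, variable names, and the choice of a scikit-learn neural network are illustrative assumptions, not Telenor Mobil's algorithm.

```python
# Hedged sketch: fit a classifier on surveyed subscribers with known segment
# labels, check it on a holdout, then score the whole base from registered
# variables. Everything here is invented for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.normal(size=(1_000, 4))               # e.g. tenure, usage, spend, age
y = (X[:, 0] + X[:, 1] > 0).astype(int)       # stand-in survey-based segments

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=3)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2_000,
                                  random_state=3)).fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")  # then score all subscribers
```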
In the U.S., the self-administered diary method has achieved status as the industry standard for measuring radio, but data users are increasingly convinced that something better must surely exist. Indeed, Arbitron itself continues to develop what could become a major advance in U.S. radio measurement, the portable personal meter. At Arbitron, however, we also believe that the quality of radio audience measurement has improved, and can continue to improve, through the use of new technologies now. While the basic measurement instrument, the diary, has seen relatively gradual evolution (graphic enhancements, etc.), we have achieved significant improvements in other areas of research quality by investing in technologies that are quite new. Among the technology-based enhancements discussed in this paper:
Two factors are necessary for a method to be useful across a variety of country settings. First, a robust, unconstrained analysis technique is required to make sense of the available information. Second, data of comparable format and scope of measurement are needed to enable parallel analysis across countries. Artificial intelligence neural networks and Consumer Confidence Surveys fit these criteria. This work provides an example of the power of this combination.
In order to achieve effective targeting, media researchers require access to detailed data for every type of medium. Unfortunately, existing surveys tend to be too specific: television studies, for instance, deal only with broad target groups, and few surveys cover more than one type of medium. The ideal situation would be for media researchers to have access to single-source data providing detailed purchasing information combined with usage of all types of medium. In practice this is rarely achievable, due to cost, contractual arrangements and so on. The traditional solution to this problem has been the use of so-called fusion techniques; however, the success of these techniques is marginal and their practical use in doubt, due to a number of well-documented problems. Over the past few years, Pulse Train has been experimenting with neural networks - a general method of training a computer, loosely modelled on the structure of the brain - to learn the relationships between questions in a detailed survey, which then allows answers to those questions to be imputed within other surveys. The technique, which we call NDA (Neural Data Ascription), has met with considerable success, providing significantly improved results over current fusion methods.
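NDA itself is proprietary and not specified in the abstract; the following sketch only illustrates the general ascription idea described, with an invented donor/recipient split and a scikit-learn network standing in for the actual method.

```python
# Hedged sketch of ascription in the spirit of NDA (not the Pulse Train
# method itself): train a network on a "donor" survey carrying both the
# common variables and the detailed answers, then impute the detailed
# answers into a "recipient" survey that has only the common variables.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(4)
common_donor = rng.normal(size=(800, 5))             # shared demographics/media usage
detail_donor = (common_donor[:, 0] > 0).astype(int)  # stand-in detailed purchase answer

imputer = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2_000,
                        random_state=4).fit(common_donor, detail_donor)

common_recipient = rng.normal(size=(200, 5))         # recipient survey lacks the detail
imputed_detail = imputer.predict(common_recipient)   # ascribed answers
print(imputed_detail[:10])
```

Unlike record-matching fusion, which pairs whole respondents, a model of this kind predicts each missing answer directly from the shared variables, which is one plausible reason an ascription approach could outperform matching.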
Media researchers often focus on the need for precise methodological conclusions from large-scale experimental research. In Arbitron's experience, however, many innovations have their roots in a very simple exercise: listening to comments by survey and panel participants. Time and again, participants have led us to think of the survey process in new ways, and their insights have paid off with new techniques that proved effective when tested conventionally. The paper begins with a review of literature discussing respondent perceptions of the survey process. Arbitron's older research in this area is also summarized. More detail is presented about two recent studies which will shape future research. We present panelist feedback from tests with non-functional portable-meter mock-ups, knowledge that is guiding our 1994 research agenda. And we present our latest effort to learn from participant experience: a neural network-based analysis of respondent comments in Arbitron radio diaries.
Rule induction and neural networks have been applied to audience data to produce promising results for understanding and predicting audience behaviour. Potential deployment designs suggest that this will be a fruitful way to assist the task of scheduling.
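As an illustration of how the two techniques named in the abstract might be paired (the study's actual data and models are not given), a decision tree stands in for rule induction, producing readable rules, and a small network serves as the predictive model, both on invented viewing records.

```python
# Hedged sketch pairing the two techniques the abstract names: a decision
# tree as a simple form of rule induction (readable rules) and a neural
# network for prediction, both on invented audience-viewing records.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
X = rng.uniform(size=(600, 3))                # e.g. hour, genre score, lead-in rating
watched = ((X[:, 0] > 0.7) & (X[:, 2] > 0.5)).astype(int)

tree = DecisionTreeClassifier(max_depth=2, random_state=5).fit(X, watched)
print(export_text(tree, feature_names=["hour", "genre", "lead_in"]))  # induced rules

net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2_000, random_state=5)
print("NN accuracy:", net.fit(X, watched).score(X, watched))
```

The induced rules explain audience behaviour to a scheduler in plain terms, while the network supplies the raw predictive accuracy, which matches the understanding-plus-prediction split the abstract describes.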
This work demonstrates the learning ability and capacity of artificial intelligence neural networks, and how effective they are at extracting information from large data sources. The combination of new data collection techniques and technology appropriated from scientific fields gives consumer marketers vast capabilities. At the center of this approach is the artificial intelligence neural network. The impact of adopting this method of data understanding warrants special attention and emphasis, hence the term 'Neural Analysis'.
Developments in a variety of disciplines have provided the components needed to assemble a system of artificial intelligence for use in formulating marketing strategy. Based on artificial intelligence neural networks, the concept of neural marketing is presented. A flexible new technique, neural marketing offers the ability to measure and interpret expectations. Neural networks learn from data: in a process that mirrors human trial-and-error learning, they find the cause-and-effect relationships present in that data. This ability to learn is complemented by a facility to generalize the acquired knowledge and apply it to new experiences. Market researchers will find neural networks valuable in any situation requiring forecasting and prediction. Neural marketing takes the next step by uniting all data sources, marketing practitioners, and a new strategic intelligence. This paper reviews neural network theory, presents the concept of neural marketing, and gives general examples of the benefit neural marketing provides for measuring expectations.
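To make the learn-then-generalize claim concrete, a toy sketch: a single linear neuron adjusted by trial and error on half of some invented data, then evaluated on unseen cases. This is illustrative only, not the paper's system.

```python
# Hedged toy sketch of trial-and-error learning followed by generalization:
# a single linear neuron fitted by gradient descent on the first 100 cases,
# then applied to 100 cases it has never seen. Data are invented.
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(200, 2))
y = X @ np.array([0.8, -0.4]) + 0.1           # hidden relationship to recover
w, b = np.zeros(2), 0.0

for _ in range(500):                          # trial ...
    err = X[:100] @ w + b - y[:100]           # ... and error
    w -= 0.05 * X[:100].T @ err / 100         # adjust weights toward smaller error
    b -= 0.05 * err.mean()

unseen_err = np.abs(X[100:] @ w + b - y[100:]).mean()
print(f"mean error on unseen cases: {unseen_err:.4f}")  # generalization
```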
Direct marketing is increasing in popularity as a means of selling to consumers and business customers; it is nevertheless one of the most wasteful and ineffective forms of marketing. Direct marketers talk glibly of 2% response rates from a mailing being a success, when perhaps they should consider the reality of 98% failure. In the age of the greening of the western world, the implication is that ineffective mailings to consumers produce a vast amount of waste. Spending on market research by direct marketers is pitifully small, and yet the use of lifestyle and segmentation research could make a significant contribution to refining the direct marketing process, maximising opportunity and reducing costs and waste. This paper examines the current state of research in direct marketing - principally in the U.S. (where the experience is markedly greater) but also with observations from Europe - and how the direct marketing community can refine existing approaches and utilise new research techniques to advantage. Discussion extends from macro approaches, including simple socio-demographic analysis of databases, through the more sophisticated applications of geodemographic clustering and lifestyle segmentation, to the potential application of risk analysis and neural network driven expert systems. The paper concludes that the industry could make greater use of the advanced research technology at its disposal to decrease waste and increase effectiveness.