By Neal Sandin
AI continues to permeate society, spreading to every corner of our homes and workplaces. People are using it to make emails more professional and resumes more appealing. Unfortunately for market research, some are also using it to answer questions.
This is a significant issue. The entire purpose of qualitative market research is to understand people’s unique wants, needs, and desires, as well as the issues and barriers they struggle against. If respondents provide us with AI-generated responses instead, we are failing on a fundamental level. After all, AI generates answers compiled from huge amounts of data from millions of users. While undoubtedly useful, it cannot accurately reflect the unique circumstances of the individual. AI can tell us where rents are rising, but it cannot tell us the feelings, emotions, and hopes of a specific tenant.
One obvious solution is to simply rely less on methodologies such as online bulletin boards, which allow users to reply to questions at their leisure. Face-to-face interaction, whether in person or via Zoom or Teams, precludes using generative AI like ChatGPT. Direct connection would seem to make for more engagement and better results.
However, sometimes online bulletin boards are the best tool for the job, such as when people try a product over the course of several days. Nor has there been a significant drop in the use of online bulletin boards, or similar forms of market research, since the advent of ChatGPT and other AI tools.
Another solution is to simply disregard AI-generated answers and replace those respondents. This works when the AI use is obvious, but as teachers and professors around the world can attest, no person or program can infallibly identify AI-generated responses.
So, the question becomes: why? Why do people use generative AI instead of giving their own opinion? It is easy to fault the respondents, saying that they are lazy or in it only for the money. That may occasionally be the case, but the answer is hardly that simple. After all, most people want their opinions and frustrations heard and taken seriously. Yet some turn to AI to generate a response. It is disheartening to find that, even when asked for a personal opinion in a confidential and safe setting, people do not feel empowered to give it.