
Market research is and will always be about uncovering insights that lead to solutions. The process of getting there (i.e., the methodology) will never be a solution in itself, but it will help inform the final outcomes. It is important to understand how methodology will be influenced in the coming years so that every market researcher knows exactly what is in their toolkit and doesn't fall behind the curve. Let's take a look at what will change and what will evolve.
The first and most obvious is faster computing. Computers are consistently getting faster and more efficient, and they will continue to do so for many years to come. This means we will be able to start applying big data algorithms alongside our primary market research. We will be able to analyze data more quickly and build automations that aren't feasible today because of computing power constraints.
The second is fiber optics boosting internet speeds. This is a massive evolution in our internet that, quite frankly, should have happened many years ago if not for regulations. Fiber optics will multiply average internet speeds roughly fifty-fold, giving us the ability to share information at drastically faster rates. This will not only speed up research, but also change the game. We will be able to hold incredibly high-quality webcam conversations and share videos while watching facial reactions as well as screens. We can run multiple streams at once and then download our recordings moments after the research has been completed.
The third is sensors. Sensors will probably have the biggest long-term effect on the way we do market research and, more specifically, on how we read human behavior and thoughts. Sensor technology is currently advancing so rapidly that programmers can barely keep up. Take the Tesla car, for example. Tesla built the car with a set of unused sensors and then released its Autopilot software a year later, allowing the car to drive itself on highways. Another example is the 3D Touch that Apple integrated into its newer iPhones, which introduces pressure as another input. Movement and touch are not the only things we can sense, however. We are also starting to read brain activity, emotional responses, and other physiological signals.
There is much, much more than the few examples I have outlined here. In 10 years we may be able to conduct an interview and have a pretty good read on the respondent's emotional state throughout the conversation, opening up a whole new dimension of market research.
Lastly, we will also start to see more and more platforms integrate into one giant solution. This has already started to happen in many different fields as companies try to become the one company that does it all (especially when it comes to technology). One app will probably be able to stream webcam interviews and also conduct surveys. It's just a matter of time.
