Post Disruption: Market research, what next?

Based on my personal experiences

Around 2011 the owner of the company that I then worked for came to Toronto and challenged the employees to go on social media. At that point, few had considered how important this was about to become for our jobs.

In the pre-historic days of market research, consumer data were collected by walking around the city, knocking on doors and interviewing people face to face. I caught the tail end of this when I started in market research in Russia (which is another story). Then, call centres became industry standard and most survey research was conducted on the phone. However, qualitative discussions were still held in person at focus group facilities.

In the 1990s, online surveys took over as the main means of eliciting consumer data. Rather than finding consumers through random digit dialing, companies now started to build online panels of people who were willing to answer surveys. There was some concern about the representativeness of these panels at first, but it was soon put to rest when the considerable cost savings of this approach became apparent. Initially, research companies needed specific expertise, expensive software and sufficient server capacity to program, host and run these online panels and surveys. Quite a few even tried to develop their own software.

The 2000s saw a rise in self-serve online research solutions. The idea of software as a service (SaaS), and ‘freemium’ products such as SurveyMonkey made it possible for anyone to collect the data they wanted. It seemed no longer necessary to have any particular expertise or resources. At that time, I felt that many established market researchers were reluctant to open their eyes to the new reality. A few companies were blazing ahead, but the majority sat back and hoped it wouldn’t get too bad.

I realized myself around 2012 that things would never be the same again. When I had the opportunity to start my own business in 2014, I jumped with both feet into the world of SaaS, social media and startups, looking for ways to innovate and carve out a niche for myself in the insights community. I read Clayton Christensen and Eric Ries. I experimented with different tools and methodologies. I immersed myself in WeAreWearables Toronto – then a monthly event at the MaRS Discovery District, an innovation hub that offers services for startups and scaleups – and in various other incubators and accelerators around the city.

Innovation was the buzzword, along with agile, real-time, lean and design thinking. The word disruption became popular in my neck of the woods a little later. Hard to believe that this was only six years ago. Now the word already seems a little tired. I feel that the dust of disruption has settled in market research, and firms have made adjustments.

Those who started off with the new technologies – for example Qualtrics, or the Canadian company Vision Critical – have probably done very well (not that I know their books). The traditional firms have incorporated various technologies into their offering. Many have social media analysis products. Some run large-scale customer feedback platforms for their clients. Some use virtual reality goggles for concept testing or shopper studies. Some offer online communities or hold virtual discussion groups. And to cut costs, many have moved business functions into the cloud and make wide use of outsourcing.

Whether this is sufficient to keep traditional market research organizations profitable, I don’t know. What I have learned in my own business is that I am still selling ATUs, segmentations, concept tests and one-on-one interviews. After the frenzy to be innovative and different, what clients seem to appreciate is my expertise. They trust me to know how market research is done properly, to execute the project for them, and to deliver them results that reflect a thorough understanding of their business questions, and in a form that adheres to their internal processes and requirements.

That’s why I stopped following the hype in the last couple of years. Disruption happened, but that was in the past. No amount of research automation can substitute understanding. And understand the customer we must.

But…I recently started renting a co-working desk at a business centre. I am easily twenty years older than the majority of the other tenants there. So now I am back in a startup environment, and I see that while the hype has decreased, startups are here to stay for the foreseeable future. Businesses that are trying to disrupt various fields are numerous and, with the myriad applications of AI, far from done.

It will remain important to understand how new technologies fit into and change existing businesses. For example, I am intrigued by the anatomy of the Cloud: where servers are located, how data are moved around and what implications different factors have for data loading speeds, data security and so on. When some people I recruit for surveys complain about a survey taking longer to load than they expected, is that because their Internet is slow, or because the survey hosting company has switched from its own servers to the cloud?

Data governance also interests me. With many market research companies using subcontractors, it is almost impossible to see all the way down the supply chain where data may be stored, processed or transferred to. The business risk related to data governance has increased exponentially for market research firms. Good times for lawyers and insurance companies.

But understanding AI, Cloud and computing in general is and will be immensely important for anyone interested in the affairs of this world. I think this is a topic that they should teach kids about at school, so that future citizens can make informed decisions about it. I want to learn more. So far, I have read three books on AI – it’s a start.

Barbara’s AI reading list:

  • Kartik Hosanagar: A Human’s Guide to Machine Intelligence
  • Ajay Agrawal, Joshua Gans, and Avi Goldfarb: Prediction Machines
  • Virginia Eubanks: Automating Inequality

Update, also… Janelle Shane: You Look Like a Thing and I Love You


Ensuring equitable access to healthcare in the age of algorithms and AI

Yesterday, Dr. Peter Vaughan, chair of the board of directors of Canada Health Infoway, spoke at Longwoods’ Breakfast with the Chiefs.

After outlining the current state and future perspectives of digitization in healthcare, his main message was two-fold: 1. We are at risk of a “failure of imagination”, i.e. we cannot fathom all the possible futures that digital disruption might confront us with and hence fail to plan for their pitfalls adequately. 2. There is great potential for algorithms to be built in such a way as to solidify and deepen inequalities that currently exist in our system, and we need government oversight of such algorithms to prevent this from happening.

The first point is easy to understand; the second may need a little more explanation. Algorithms are widely used to determine what information is presented to us online and what choices are offered to us. We are all familiar with websites offering us items we ‘might also like’, based on our past choices and on what other purchasers have bought.

At a time when data from various sources can be linked to create sophisticated profiles of people, it would be easy for a healthcare organization to identify individuals who are potentially ‘high cost’ and to deny them service or restrict their access to services. Bias can creep into algorithms quickly. If people of a certain age, ethnic background or location are deemed to be ‘higher risk’ for some health issues or for unhealthy behaviours, and this is built into an algorithm that prioritizes ‘lower risk’ customers, then anyone who shares that profile is discriminated against, no matter how they actually behave.
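As a toy illustration of this kind of profile-based bias – every field name, weight and threshold here is hypothetical, invented purely for the example – a risk score that mixes individual factors with a group-level factor like postal code penalizes everyone in that area regardless of their own behaviour:

```python
# Toy risk score (all names, weights and thresholds are hypothetical).
# It mixes individual factors with one group-level factor (postal code),
# so two people with identical behaviour can get different outcomes.

HIGH_RISK_POSTCODES = {"M5B", "M5T"}  # hypothetical 'higher risk' areas

def risk_score(age, smoker, postcode):
    score = 0.0
    score += 0.02 * age             # individual factor
    score += 1.5 if smoker else 0.0 # individual factor
    if postcode in HIGH_RISK_POSTCODES:
        score += 2.0                # group factor: applied to everyone there
    return score

# Two non-smokers, same age, same behaviour, different neighbourhoods:
a = risk_score(40, False, "M5B")  # 0.8 + 2.0 = 2.8
b = risk_score(40, False, "M4K")  # 0.8
# If the service only accepts people scoring below 2.0, person 'a' is
# excluded purely because of where they live.
```

The point of the sketch is that the discrimination is invisible from the outside: the score looks like a neutral number, and the group-level term never appears in any individual interaction.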

Discrimination is often systemic, unless a conscious effort is made to break the cycle of disadvantaged circumstances leading to failure to thrive leading to lower opportunity in the future. As Dr. Peter Vaughan pointed out, we in Canada value equitable access to healthcare, education and other public goods. We expect our government to put safeguards in place against discrimination based on background and circumstances. But how can this be done?

Private, for-profit enterprises have a right to segment their customers and offer different services to different tiers, based on their profitability or ‘life-time customer value’. Companies do this all the time; it is good business practice. But what about a private digital health service that accepts people with low risk profiles into their patient roster, but is unavailable to others, whose profile suggests they may need a lot of services down the line? Is this acceptable?

And if the government were to monitor and regulate algorithms related to the provision of public goods (such as healthcare) who has the right credentials to tackle this issue? People would be needed who understand data science – how algorithms are constructed and how AI feeds into them – and social sciences – to identify the assumptions underpinning the algorithms – and ethics. Since technology is moving very fast, we should have started training such people yesterday.

And how could algorithms be tested? Should this be part of some sort of an approval process? Can testing be done by individuals, relying on their expertise and judgement? Or could there be a more controlled way of assessing algorithms for their potential to disadvantage certain members of society? Or a potential for automation of this process?

I am thinking there may be an opportunity here to develop a standardized set of testing tools that algorithms could be subjected to. For example, one could create profiles that represent different groups in society and test-run them as fake applicants for this or that service.
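One minimal sketch of such a testing tool – the profile fields, the decision rule and the `audit` helper are all hypothetical, invented for illustration – is to build matched synthetic applicants that are identical except for a single attribute, run each through the decision system, and flag any attribute that flips the outcome:

```python
# Sketch of a standardized algorithm audit using matched synthetic profiles.
# All profile fields and the decision rule below are hypothetical examples.

def audit(decide, base_profile, attribute, values):
    """Run profiles identical except for one attribute through a decision
    function and collect the outcome for each attribute value."""
    return {v: decide(dict(base_profile, **{attribute: v})) for v in values}

# Hypothetical decision rule under audit:
def decide(profile):
    return profile["income"] > 30000 and profile["postcode"] != "M5B"

base = {"income": 50000, "age": 40, "postcode": "M4K"}
outcomes = audit(decide, base, "postcode", ["M4K", "M5B"])
print(outcomes)
# -> {'M4K': True, 'M5B': False}
# Identical applicants, different outcomes: the postcode dependence is
# flagged for human review.
```

A real audit would of course need far richer profiles, statistical rather than single-case comparisons, and access to the system under test – but the matched-pair idea is the core of it.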

Also, algorithms change all the time, so one would perhaps need to have a process of re-certification in place to ensure continued compliance with the rules.

And then, there would be the temptation for companies to game the system. So, if a standardized set of test cases were developed to test algorithms for social acceptability, companies may develop code to identify and ‘appease’ these test cases but continue discriminating against real applicants.

In any case, this could be an interesting and important new field for social scientists to go into. However, one must be willing to combine the ‘soft’ social sciences with ‘hard’ stats and IT skills, and find the right learning venues to develop these skills.

Much food for thought. Thank you, Dr. Peter Vaughan!

Microsoft spitting into Google’s soup

Has your computer been sending you ceaseless reminders to update to Windows 10? I have just updated mine, and now I understand why Microsoft is so adamant that everyone gets the new operating system. Because it is not just an operating system.  

Pumpkin soup

If you choose the ‘Express Setup’ features, you will give Microsoft access to all sorts of data that would not normally be shared with your operating system provider. It will allow Microsoft to make the sorts of rich data connections that so far only Google with its web of interconnected and super-user friendly services has been able to gather (and profit from handsomely).

My new operating system is also very persuasive in getting me to use its new browser, Edge, and so far I have found it difficult to keep Google as my default search engine. What a huge coup for Microsoft and a major threat for Google! I read an article or two praising Edge, saying what a great new thing it is, but who knows who pays these blog writers… Also, when you search in Edge, the old Bing logo ominously appears – not very reassuring, since nobody really liked Bing, right?

But…so far I have not figured out how to keep Google as my default search tool and perhaps I’m starting to like Edge. So what’s Google going to do about that?

Have you upgraded your Windows yet?

Disclaimer: I am not in IT, perhaps I am not fully understanding all the technical details, but the business strategy seems pretty clear…

How pharma may or may not win in a digital world

Healthcare is going digital at a rapid pace. A recent article by McKinsey & Company titled ‘How pharma can win in a digital world’ outlines emerging trends in digital health and how pharma needs to evolve to keep up with the times.

A number of predictions in this article are, I believe, misguided and reflect a common, but incorrect understanding of the potential of digital in health.

Digital health collage 2

Prediction # 1: “Patients are becoming more than just passive recipients of therapies”

Patients have certainly become more knowledgeable about their own health and about available therapies. And hopefully, health-related apps are helping people lead a healthier lifestyle and stay on top of their medical conditions and medications. However, patients have never been passive recipients of therapies. Patients have always had the choice of taking or not taking their pill, cutting it in half, skipping a dose, forgetting to take it, taking it with food when they are not supposed to etc.

Having served pharmaceutical clients for more than a decade, I have frequently observed that it is difficult for someone within the industry to understand that the medicine they are producing is not the be-all and end-all of a patient’s existence. Life is a busy thing. You work, you look after your family, you eat, you entertain yourself, and you may have a health problem that benefits from taking a medication. The act of taking a pill consumes a fraction of your time and attention. Medicines for health issues that are non-symptomatic may be forgotten because the patient does not feel sick. Medicines for chronic, life-threatening conditions may have suboptimal compliance because the patient would rather not be constantly reminded about his or her precarious situation. For acute conditions, compliance wanes as soon as the patient feels better. Side effects deter patients from taking their pill, etc. Compliance would not be such a huge unsolved problem for pharma if patients were ‘passive recipients of therapies’.

Prediction # 2: “Patients will be actively designing the therapeutic and treatment approaches for themselves with their physicians”

I have read this type of statement numerous times in articles about the future of pharma. Perhaps I am lacking understanding of what’s technologically possible nowadays, but for now let’s assume I have a pretty good handle on it. Designing a pharmaceutical product is an extremely specialized and complex process that involves scientists and labs. A chemical or biological compound with certain properties is created to address a specific health issue, and this compound cannot be easily customized. Rather, it is created and then subjected to rigorous testing, costing hundreds of thousands of dollars (or more), and if it does not hold up to scrutiny, then it’s back to the lab for more experiments and tweaking before another round of expensive testing resumes.

Physicians who spend most of their time in clinical practice do not design therapeutic and treatment approaches. They are merely the retailers of those approaches, acting as consultants to their patients and advising them which approach may be best suited for them. And patients will not be actively designing their own therapies unless they are experimenting with mixing pills and brewing up concoctions of their own invention (caution: don’t try this at home, kids!).

With substantially increased access to information patients can play a much more active role in selecting treatments, but they will not design them.

The alchemist

Prediction # 3: “Medicine will be personalized to address individual patients’ needs” (not in McKinsey article, but can be found in many other publications on digital health).

The move towards personalized medicine is certainly well underway. However, it does not mean that a therapy will be designed on the spot for the individual sitting in front of his or her physician. Again, the physician is the expert mechanic using existing wrenches and bolts to fix the car. The inventor who comes up with new wrenches and bolts does not deal directly with the customer whose car broke down.

Tools

Personalized medicines are medicines that target issues more precisely than was previously possible. While physicians used to set off a grenade to blast away your breast cancer, and half of your body as well, they now use a precision rifle that locks in on the malignant area and eliminates not much else. And depending on your genetic profile, there are different bullets that are most effective for your particular type of problem. So the array and precision of weaponry in the physician’s arsenal has increased vastly, and affordable genetic tests have contributed to better targeting of the weapons. But none of these things are designed on the spot, while you’re sitting in the examining room, nor will this be possible for a long, long time.

Explanation: Poor understanding of digital vs. physical contributes to common misperceptions

How do these misconceptions come about and why do smart people write these things?

The past five to ten years of our experience of living in a digital world have greatly impacted our beliefs in how easily things can get done and our feeling of agency. Want to customize your new car? Just click on the features that you want – sunroof, heated seats and the colour red – and you can get this exact model without any effort on your part. Select the perfect outfit? Choose the style, colour and size, and get it delivered to your doorstep the next day. Don’t like part of your video? Just delete and replace.

The ease of these digital experiences has gotten us into the mindset that things can be designed instantaneously, delivered rapidly and modified on the spot. We rarely think about the physical realities that enable our digital experiences. To give you the experience of ‘designing’ the perfect outfit for yourself, the maker has to come up with new styles to attract your desire, run efficient manufacturing to put the piece together with acceptable quality and at an affordable price, secure the supply chain that enables the manufacturing, build in agility to adapt supply to demand quickly, and create a distribution system to bring the piece to you. None of these things is done at the click of a button, but through the hard work of setting up systems, negotiating agreements, fine-tuning machinery and materials, and implementing physical processes.

It’s the same for pharmaceutical products. They are chemical compounds, after all.

Digital opportunities

However, the potential of digital solutions to transform the way we care for ourselves and the way healthcare is provided to us is undisputed. From life tracker apps that help you remember to take your pills on time to smart contact lenses that monitor blood glucose levels without pricking your finger to ingestible sensors that give you peace of mind that your schizophrenic brother has actually taken his medication, digital interfaces, algorithms and sensors can deliver great value to the patient.

The question is how this translates into business opportunities. Many people believe that pharmaceutical companies should transform themselves from being “a products-and-pills company to a solutions company” (see McKinsey & Company article). The idea is to not only provide medicine to the patient but also digital tools for monitoring of the patient’s condition, for communicating with the patient’s circle of care, for scheduling and reminders, for supporting rehabilitation after events and for outcomes tracking. From a patient perspective, this could certainly be a valuable offering. From a business perspective, the value proposition is less clear.

First of all, pharma companies do not typically have the expertise to develop digital solutions in house. Some form alliances with tech companies. Novartis and Google are developing smart contact lenses for people with diabetes and are scheduled to start trials this year. Otsuka and Proteus Digital Health have teamed up to embed a digital sensor into a schizophrenia medication to track compliance, and have submitted the first digitally enhanced new drug application to the FDA. J&J has set up a series of incubators and rewards startups for coming up with interesting ideas in digital health. Merck sponsors health hackathons.

What does the pharmaceutical company get out of this? Will physicians choose their medication over competitive products because it comes with a digital value add? Is the digital component just another cost factor that is necessary to stay competitive these days, or is there a revenue model somewhere? It seems that there is currently a climate of experimentation without a clear business model path ahead, not unlike many other areas of digital development.

In crowded markets with little product differentiation, it is possible that the companion app could become the deciding factor in recommending one drug over the other. However, it is hard to imagine that it would play any role if there were differences in efficacy or side effect profile between the compounds. A tricky little question is also what to do with patients who need to switch from one product to another. Should they be denied continued usage of the app?

To be truly solutions providers, pharma companies would need to be structured differently: around disease states, not around products. It would make more sense to form a company that is, say, a ‘cardiology broker’, offering great digital tools to manage a variety of cardiologic conditions and giving patients access to the full gamut of cardiology drugs available. The sales reps for this company would not overtly or covertly ‘push’ one or two drugs, but would advise physicians on what is new in the field and impartially discuss the merits of the different options. There are some attempts by pharma companies to become leaders in a therapeutic space and assume the role of expert provider – for example Roche or Novartis in oncology, where both companies have a large product portfolio. However, by and large, this type of business model does not apply to how pharma companies are organized and how they make money currently. It would be more applicable to private payors, and we see some organizations in the U.S. moving in this direction.

Low-hanging but sour fruit

The obvious area where digital tools can be used very effectively to drive engagement is patient-related. Arguably, a more engaged patient will likely be more compliant and stay on therapy longer, resulting in immediate benefit to the bottom line.

However, while many companies try to be patient-centric, any direct engagement with a patient carries the risk of an adverse event report with it. While adverse event reporting systems have been set up to keep patients from harm, unfortunately, reporting requirements are ridiculously broad. Nobody is keen on generating massive amounts of adverse event reports for their drugs. So digital engagement of patients has to be done with all sorts of caveats to reduce the risk of learning about an adverse event. Some companies stay away from direct engagement with patients altogether for that reason; others have taken the plunge and struggle to come up with creative ways around the problem.

Another challenge in engaging with patients through digital tools and platforms is finding appropriate engagement formats for particular audiences. A platform that has been designed to help kids with pain through gamified challenges and ‘levels’ may not be the right approach to engage a 70-year-old cancer patient. Very little testing and research has been done to date to find out what tools best support patients with certain conditions. The key here is to be open to a multi-platform approach. While a game may be great at motivating one audience, a combination of text reminders and phone support may be best suited to keep another audience adherent to their treatment. Unfortunately, many of the vendors that design patient engagement tools on behalf of pharma are either all digital or not digital at all. What would be needed is a new type of vendor who can pull together various types of tools and customize them for a particular target patient population.

Low-hanging sweet, sweet fruit

One area where pharma could employ digital innovation easily and with sustained impact is in the way companies communicate with physicians. While almost everyone has switched to iPads for detailing over the past few years, pharma companies (in Canada, my home turf) still have limited understanding of how digital can be used to improve access and deliver value to physicians. Knowledge about different forms of digital engagement is lacking in marketing departments, where people think Twitter and Instagram are for self-absorbed teenagers with too much time on their hands. There is also a feeling that digital is not important to the physicians who are core to the business. However, as the years go by and younger physicians become key opinion leaders and high prescribers, companies may find that they have missed the boat in establishing a digital rapport with these individuals.

Only recently have some companies started to think about conducting media audits and finding out from their core target how they use digital tools and what might be of value to them. Putting some effort and resources into understanding the myriad ways digital can be used, and physician preferences in this regard, is relatively simple and will almost certainly have a payoff within a five-year timeframe. There will likely be some resistance from the sales folks, who tend to see alternatives to face-to-face engagements as a threat to their position. However, I believe that the 21st century sales rep needs to be an expert in offline and online relationship building. Pharmaceutical companies need to figure out how to integrate different forms of digital and non-digital engagement optimally, and create internal structures and tools to maximize value for the customer.

Cartoon

McKinsey & Company article source:

http://www.mckinsey.com/insights/pharmaceuticals_and_medical_products/how_pharma_can_win_in_a_digital_world

Image sources:

‘Digital health collage’: Made the image myself

‘The alchemist’: https://openclipart.org/detail/222415/alchemist

‘Tools’: DeWalt DWMT72163 118-pc mechanics tool set, from http://toolguyd.com/dewalt-ratchets-sockets-mechanics-tool-sets/

‘Cartoon’: I’ve seen this cartoon on the web many times, but don’t know who made it originally. I’ve copied it from https://effectivesoftwaredesign.files.wordpress.com/2015/12/wheel.png?w=640


Wearable Tech and Health – not quite there yet?

Wearable tech is revolutionizing healthcare delivery – at least that’s what the pundits have been predicting for a number of years. The array of devices that are under development or already commercially available is stunning.

Looking only at devices that are used by patients on a day-to-day basis, there are three different areas of usage for wearable tech:

  1. Continuous monitoring of chronic illnesses

Many patients with chronic illnesses need to monitor certain biophysical parameters that indicate how well they are doing, if their medications are working and when an exacerbation of their condition would warrant visiting a healthcare professional.

Wearable devices that can sense and accurately measure heart rhythm, breathing rate, or blood glucose levels enable continuous tracking of critical markers and can help alert patients and their healthcare providers early to any arising problems.

  2. Improving the lives of people with disabilities

This is an area in which assistive devices have had a long history (think: hearing aids, wheel chairs, etc.). Digital sensor technology is now making devices more accurate, more personalized and more helpful.

Some examples of new technology that improves daily living include:

  • eSight Eyewear: A device consisting of a high-end camera, video processing software, a processing unit and high-quality OLED video screens that project a real-time image, allowing legally blind people to see.
  • Sensimat Systems: A series of pressure sensors that are placed under a wheel chair cushion. The sensors use a proprietary algorithm to monitor the seating pattern of the wheel chair user, and send a notification via smart phone when it is time to change position to minimize the risk of pressure sores.
  • TAPS Wearable: Velcro touch pads that can be worn on top of clothing or on the wheel chair. Each pad is a trigger for a smart phone app to play a pre-programmed phrase. This helps people who have difficulty speaking (for example due to ALS or cerebral palsy) to communicate more easily.
  3. Recovery and rehabilitation devices

Also an area in which assistive devices have had their place for a long time, digital enhancements now tailor these types of wearables more to the patient’s needs. A number of companies are working on solutions to increase patients’ mobility – typically using some form of exoskeleton, together with sensors and algorithms to help with movement and recovery.


How do these new technologies fit into our healthcare system and how accessible will they be to patients who can benefit from their use?

Our healthcare system is already set up to evaluate new assistive devices, and potentially pay for them. Device makers would have to prove that their inventions are useful, that they enable patients to live more independently and/or return to work earlier, and that they save or reduce disability payments or insurance costs.

Those who develop the wearables have to figure out which of the many institutions that share healthcare costs in our country they should approach to be considered for funding.

Funding is more difficult for wearable devices used in monitoring chronic illness. In most cases, there is no precedent for continuous patient monitoring. Not only is the patient’s engagement in the process required, but a whole new infrastructure approach to healthcare is needed on the provider side. Currently, neither private practices nor hospitals are set up to receive, monitor and act upon the myriad patient data coming in through wearable devices.

Many barriers impede adoption of new technologies for patient monitoring:

  • Concern about the reliability of incoming data – how accurate is the wrist-mounted heart monitor, are there differences between different devices and who is at fault if the device either gives a false positive and triggers an unnecessary medical intervention, or a false negative that puts the patient’s health at risk?
  • Integration with existing technology – how will data come into the clinic, will it be compatible with currently used IT solutions, how can staff easily access the data and how will confidentiality and privacy be safeguarded?
  • Integration into existing work flows – who will review the data, at what intervals, and which actions should follow particular cues? Will healthcare professionals need special training on how to read the data? Is extra staff required? How can incoming data be standardized to avoid confusion?
  • And last, but not least, who pays for the extra time that clinic staff spends on continuous patient monitoring?

Many of us are still in the phase of excitement over the wealth of possibilities that wearable tech affords us for delivering better healthcare. The successful players will be the ones who figure out how the possible can be turned into the doable, and profitable, within the constraints of our infrastructure and funding environment.

Confessions of a First Time Wearables User

Since I started my business in healthcare-focused market research, I have been paying attention to wearable devices. They have great potential for monitoring health parameters and improving care for certain chronic conditions.

The business press has been making a big deal of wearable devices, predicting exponential market growth over the next few years.

Wearables market growth

I am interested in the user perspective – how useful are these devices, actually? Some statistics show that, similar to fitness club memberships, many people who purchase fitness tracking wristbands abandon them after a few months of use.

As an anthropologist, I believe the best way to learn about a certain area of life is to immerse yourself in it, to experience what it feels like and to understand how it works. So I started going to these meet-ups for people engaged in the world of wearable devices. They are awesome!

Run by the inspiring wearables guru Tom Emrich in the style of a Steve Jobs keynote, these events feature companies in the wearables space presenting their prototypes, and the audience gets to try things out in the post-presentation mix-and-mingle. My favorite so far has been the mind-controlled beer tap.

I have met many people in the wearables community, and they are certainly very different in style and outlook from my usual clientele (executives from pharma companies). However, I have hesitated to take the plunge into trying a wearable myself.

I am a pretty fit person: I work out two to three times a week to maintain my health and my sanity, eat fairly healthily, and most of the time walk to public transit rather than taking the car. Whether I run five minutes less today than I did last week is not really important to me, as long as I get some exercise every few days. Competing with others on fitness goals does not interest me at all. But I realized that not trying a wearable myself would deprive me of insights that could be essential for the user research I am so interested in doing.

So I bought a Garmin Vivofit last week. Three things enticed me to purchase this device rather than some of the other ones that are very popular (Jawbone Up, Fitbit, Fuelband).

  1. It shows the time. I am of a generation that still wears a wrist watch, and wearing a fitness wristband and a watch separately seemed silly.
  2. Its battery life is supposed to be one year. Charging devices is a big pain, and in my household we are competing for outlets and charger cables to charge the various cell phones, iPods etc. for the next morning.
  3. It has a red progress bar that shows up after you have been sitting around for too long. My occupation requires a lot of sitting in front of the computer. I tend to push myself to concentrate for just half an hour longer, then another, then another, and then become all tense because I have not taken enough breaks. So a little nudge to get up and walk around seemed like a very useful feature to me.

Garmin Progress Bar

Here are my first experiences with the device:

  • Putting it on is quite uncomfortable. You have to press the clip’s pegs into the holes on the band, and doing that hurts the inside of my wrist. Watch strap makers have certainly figured that one out better. Maybe if you are a tough man you don’t mind. But I’m a lady.
  • The red progress bar is very useful. It has actually helped me take more frequent breaks when I am doing computer work, and I feel better after getting up and walking around for a few minutes.
  • The red progress bar is dumb. This so-called smart device apparently registers only walking activity, i.e. when I swing my left arm back and forth. I was frustrated to see the red bar show up after I spent an hour in the kitchen preparing dinner, and after I was in the back yard, raking and bagging leaves. Apparently, either the sensor or the algorithm fails to recognize these as physical activities.
  • The red progress bar can be fooled. Just for fun, I tried out swinging my arm back and forth for a minute while I was sitting at the dinner table, and it actually tricked the device into registering this as physical activity, so the red bar disappeared.
  • The dashboard that shows my steps and my sleep is kind of interesting. I have only worn the device for a few days, so can’t say yet how useful this data is going to be long-term, if at all, but it’s neat to look at in a narcissistic way – the same way I look at my Twitter account from time to time and delight in the fact that I actually have some followers.
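The behaviour I observed is consistent with a simple threshold-crossing step counter: a step is counted each time the acceleration magnitude swings above a cutoff, whether you are walking or waving your arm at the dinner table. Here is a minimal sketch of that logic (purely hypothetical – Garmin’s actual algorithm is not public, and the threshold value is illustrative):

```python
# Naive step detection from a series of accelerometer magnitude samples.
# A step is counted each time the magnitude crosses above a threshold
# after having dropped below it. Hypothetical sketch, not Garmin's code.

def count_steps(samples, threshold=1.2):
    """samples: acceleration magnitudes in g; returns the step count."""
    steps = 0
    below = True  # are we currently below the threshold?
    for a in samples:
        if below and a > threshold:
            steps += 1
            below = False
        elif a <= threshold:
            below = True
    return steps

# Walking produces rhythmic peaks, so steps are counted --
# but so does swinging your arm while sitting at the dinner table,
# which is why the device can be fooled.
print(count_steps([1.0, 1.5, 0.9, 1.6, 1.0, 1.4, 0.8]))  # 3
```

A counter this simple cannot distinguish raking leaves (vigorous, but without the rhythmic arm swing) from sitting still, which matches what I saw.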

Anyway, it has been a very interesting experiment so far, and definitely proof of the value of ‘walking in the shoes of’ to really understand something.

The true potential of wearables is difficult to tell at the moment. There could be all sorts of useful applications that have not yet been developed or that have not yet gained broad acceptance. After a lot of enthusiasm in the media, there seems to be a bit of a backlash now.

Here’s a recent piece from The Atlantic, with quotes from tech opinion leaders, all questioning the enthusiasm for wearables:

Atlantic article

And here’s an article written by a health IT consultant about the more technical challenges of integrating mobile health monitoring devices into electronic medical records.

http://medicalconnectivity.com/2014/11/04/challenges-using-patient-generated-data-for-patient-care/

While I share some of the skepticism, the wearables space is certainly an area worth watching, with great growth opportunities for companies that ‘get it right’. I am excited to be part of this journey.

Man vs Machine

So, Big Data. The market research industry continues to struggle with the concept. It was one of the buzzwords of 2013. Some have come up with a big data offering. Some are searching for a point of view on it. Some counter with small data. Many still have only a vague sense of what we are talking about.

I have asked many colleagues and clients what this concept means to them, in the hopes of developing a brilliant solution that would make me wildly successful. Well, that seems to be taking some time, but in the meantime I’d like to share what I have learned so far. Since I work in the healthcare sector, that is my focus below.

1. Big Data (in Pharma) is IMS data

For some of my pharmaceutical clients, all they can think of when asked about large data sets is IMS data. IMS captures and sells information about the prescribing behaviour of physicians at the pharmacy level. Through this data, pharmaceutical companies track the sales of their products.

2. Big Data (in Hospitals) is Patient Records and Interaction Statistics

Healthcare providers, particularly hospitals and other large organizations, capture vast amounts of data on patients flowing through the system. The analysis of this data is largely off the radar screen of traditional market research, and falls under the discipline of health informatics.

3. Big Data is Social Media data

This is a view that many market researchers adopted when social media first appeared on our professional horizon as another form of human expression. Last year’s MRIA NET Gain conference, dedicated to big data, featured a number of presentations in this area.

For those who do not want to develop their own proprietary solutions, subscription-based social media analysis tools are available and used by both end clients and market research vendors.

4. Big Data analysis is a different way of saying Data Mining

Some sectors have worked with large data sets for some time. I am thinking of scanner data in retail, and loyalty programs (Air Miles, Petro Points etc.). Fifteen or so years ago, the statistical techniques used to sift through such data sets were called ‘data mining’.

This practice is still ongoing, and the data sets keep growing as more and more customer touch points are added. Some think of this type of analysis when they hear the words ‘big data’.

5. Big Data is Data that is created by Machines

This type of big data is rarely mentioned and obviously not in the forefront of a market researcher’s mind. However, it has grown exponentially and is increasingly viewed and used as a source of customer information.

For market researchers, the question (and the fear) is to what extent human analysts are still needed, and to what extent ‘the machine’ can do it on its own. And how we can make sure we are still needed.

We say: “You need an expert to interpret what your data means.” We say: “A consultant is needed to guide the analysis process.” We say: “Meaningful data analysis is the development and testing of hypotheses, and only people can come up with those.”

And we are right.

How many times have I looked at the results of a statistical analysis and said, “This does not make any sense.” And then we discarded the analysis and started fresh, because results need to make sense. To another human. To your client. They need to lead to actionable insights and recommendations.

So we are still needed. But…

But many, many processes are now automated, from data analysis to producing charts to highlighting key insights in those charts. Far fewer people are needed to work with the data than before. Take a look at www.beyondcore.com – it will get you thinking.

Some companies function with very little market research, in the sense of interactions between real live researchers and real live respondents. Machine-generated user data streaming back from devices guides the refinement of their products. New apps are developed by split testing: seeing how early users interact with certain features and following the logic of their clicks. Service companies integrate their customer interface and CRM software with their enterprise management systems. Automated cues let managers at different levels know how they are performing, and notify them if there is a problem.
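The split-testing loop described above can be sketched in a few lines: assign each early user to a feature variant, tally clicks against impressions, and let the click-through rates decide. A minimal, hypothetical sketch (the variant names and numbers are illustrative, not from any real product):

```python
import zlib

# Minimal A/B split test: deterministically assign each user to a
# variant, tally clicks vs. impressions, and pick the variant with
# the higher click-through rate. Hypothetical sketch of the logic,
# not any particular product's implementation.

def assign_variant(user_id, variants=("A", "B")):
    # Stable hash (crc32) so the same user always sees the same variant.
    return variants[zlib.crc32(user_id.encode()) % len(variants)]

def click_rate(clicks, impressions):
    return clicks / impressions if impressions else 0.0

def winner(stats):
    """stats maps variant -> (clicks, impressions); returns the best variant."""
    return max(stats, key=lambda v: click_rate(*stats[v]))

# 6.1% click-through beats 4.5%, so variant B ships.
stats = {"A": (45, 1000), "B": (61, 1000)}
print(winner(stats))  # B
```

In a real product this decision would of course require a significance test rather than a raw comparison, but the point stands: no researcher ever speaks to a respondent.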

Technical skills are essential for survival. How can you tell a successful agency these days? If you look at their ‘careers’ page, most of the open positions are for developers (i.e. IT people). Those who win in providing business intelligence are either companies who are focused on the digitalization and automation of data collection and analysis, or companies who make intelligent use of available software products and platforms within the research process.

Do you know what Hadoop is? A wireframe? CSS? If not, perhaps it is time to google it right now…

Man vs Machine

 

Our online and offline worlds

I am a student of human behaviour. When the Internet first became a thing, we used to compare online and offline behaviour. As if we were one person while we sat at our computers and another person in real life. Maybe it was like that in those days. I am talking ten, fifteen years ago.

The world of communication was segmented into different channels: television, radio, print etc. Talking like that does not even make sense any more. Today a typical user experience includes interacting with content and with people across a number of platforms, in a more or less fluid fashion.

Vinu George, Market Intelligence and Customer Insight Manager at Microsoft, recently described this in an article in VUE magazine as follows: “We are now moving to a five-screen world…large-screen TVs, gaming consoles, laptops/PCs, tablets and smartphones. Content is now consumed and created across these screens.” We move from one screen to the next to the next, reading, watching, posting, commenting, sharing online, sharing online offline (Look, mom, have you seen this video?).

Up until recently, I have not been a technology junkie at all. But with four of the five interfaces at my disposal, and discovering the infinite possibilities of social media, I find it more and more difficult to differentiate between online and real life – it is all just life.

Having school-aged children also gives me a privileged view into the future of online immersion. Many parenting experts advise parents to limit screen time for their kids, which I agree with. The trouble is, it is not just brain-dead consumption of junk that is going on; there are lives being lived, and they are lived in part through electronic platforms.

As a market researcher, I wonder if our methodologies really address this level of immersion in the online world and the fluidity with which online and offline experiences are intertwined. Rather than focusing on one interviewing medium and throwing in a bit of social media analysis or a few ethnographic observations for good measure, how much richer and more insightful could a truly integrated multi-media exploration of behaviours and attitudes be?

Mom and baby