From Market Research to User Experience: The Evolution of Consumer Insights
- Chris Zhang
- Nov 10, 2024
- 12 min read
Updated: Nov 11, 2024
Strictly speaking, both market research and user research are offspring of ethnographic research. Market research is the older brother, while user research is the younger sister.
Let's start by discussing what ethnographic research actually is. In the late colonial era, Western powers like Britain and France were eager to find more effective ways to understand and manage the indigenous populations in their colonies. This is where anthropologists came in. What exactly did these anthropologists do? Early anthropologists, such as Bronisław Kasper Malinowski, the author of "Argonauts of the Western Pacific," would travel to colonial territories—in Malinowski's case, the Trobriand Islands of the Western Pacific—to study the local populations, documenting their lifestyles, customs, religious rituals, and social activities. Ethnographic research was the method they used—comprising observation, recording, interviews, and systematic analysis and description.
By the 1950s, some large American corporations began employing anthropologists or psychologists as consultants to better understand their consumers. This marked the early form of market research. For instance, Ernest Dichter, the famous American business consultant and marketing genius, rose to prominence during this period.

Dichter was born in 1907 in Vienna, the capital of Austria. After earning his doctorate in psychology from the University of Vienna, he and his wife fled the Nazis and arrived in New York in 1938 with just 100 dollars. In 1939, he sent his resume to six companies, emphasizing his expertise in using Freudian psychoanalysis to optimize marketing strategies. It must be said that Dichter was incredibly adept at marketing himself—what we might today call "crafting a personal brand." First, he differentiated himself from his peers by positioning himself as a "young psychologist from Vienna," hinting at academic authority. At the same time, he portrayed himself as a mysterious "manipulator," suggesting that psychology was a powerful, almost magical force that could help companies profit by manipulating consumers. Sure enough, four of the six companies responded positively, and his first client was Compton Advertising, where Dichter's task was to help design a marketing campaign for Procter & Gamble's Ivory Soap.
At the time, Ivory Soap was one of Procter & Gamble's flagship products and among their earliest offerings. Before Dichter got involved, Ivory Soap's main selling points were its "purity" and the fact that it floats. However, through consumer interviews, Dichter discovered that, particularly for female consumers, bathing was actually a kind of ritual. This was why women tended to have specific brand preferences for soap—preferences that weren't just about scent, price, appearance, or texture but also involved something emotional, deeper, something we "didn't yet know." At the time, this was a bold idea—that consumers did not make purchasing decisions merely to meet functional needs. Products were also mirrors that reflected one's personality and identity, and purchasing decisions were often driven by subconscious preferences and fears, largely unrelated to the product itself. Thus, the first step in marketing was to define the brand's "soul," to understand and guide consumer desires. Dichter then came up with the tagline for Ivory Soap: "Wash Your Troubles Away," implying that the soap could not only clean the body but also wash away life's troubles.
The case that truly made Dichter a rising star in the marketing world was his campaign for Chrysler's Plymouth convertibles. At the time, Chrysler had developed the slogan "Unlike Any Car Before It" to highlight the uniqueness of the vehicle. However, they overlooked a critical issue—"being different" led consumers to worry about safety, as unfamiliarity often equated to a lack of safety. Dichter traveled to Detroit and used his usual method—interviews. After concluding his research, Dichter observed that convertibles represented a man's desire for youth, freedom, and secret romantic aspirations. Owning a convertible was almost every middle-aged man's dream. But interestingly, when these men actually went to buy a car, they often brought their wives along, and in the end, they opted for a practical, conventional model. So, if Chrysler wanted to boost sales of the convertible, it needed to recognize the influence of spouses in the final purchasing decision and shift its advertising to women's magazines.
As a result, sales of the Plymouth convertible soared.
Another successful case in Dichter's career was the marketing of the Barbie doll. In 1945, American businesswoman Ruth Handler co-founded the toy company Mattel with her husband Elliot Handler and Harold Matson. In 1959, she introduced a doll with a curvy figure and an alluring waistline, which she named "Barbie." Handler approached Dichter for marketing advice. After interviewing consumers, Dichter found that while mothers generally considered Barbie to be a vulgar product, children were eager to emulate Barbie's charm and allure. Dichter told Handler, "You should give the purchasing power to the children and tell mothers that Barbie will teach their daughters how to grow up to be graceful individuals." Handler took Dichter's advice and began airing ads during the "Mickey Mouse Club" TV show. As more children saw Barbie, sales of the doll skyrocketed. In the first year alone, Mattel sold 350,000 Barbie dolls at three dollars each.
In his book "The Strategy of Desire," Dichter famously described human motivation as an "iceberg," with two-thirds hidden from view—so deeply buried that even consumers themselves are often unaware of it. In most cases, what people are truly buying is not the product itself, but the psychological difference and the illusion of the brand image.
Of course, Dichter faced considerable criticism, much of it from the academic world. Some scholars argued that his studies of consumer motivation lacked objective experimental validation, while others considered his advertising methods absurd, vulgar, and manipulative—particularly in the way they played on American women's emotions and instilled anxiety in them. For example, to boost typewriter sales, Dichter suggested that typewriters should evoke men's fantasies about the female body, recommending a more concave keyboard design to make the machine seem more seductive. He believed that smoking, for men, was akin to a legitimate excuse to decompress and relax after a long day's work, likening it to sucking on the nipple of a "great world breast." Dichter even suggested that lipstick should resemble the shape of the male genitalia—though abstract enough not to be overtly obvious.
Dichter also believed that, for women, baking a cake was like giving birth, especially at the moment when they took the fragrant, freshly baked cake or bread out of the oven. Hence, ready-to-bake cake mixes that required nothing but adding water posed a threat: they minimized the woman's role, making her feel marginalized and ultimately replaceable. Dichter advocated instilling the idea that "cooking is an expression of love" in female consumers. He advised cake mix companies to add one simple instruction to their ads—"add an egg." The egg symbolized fertility and subtly addressed the subconscious guilt women might feel about using a pre-mixed product, making it feel like they were contributing more. This is very similar to how some influencers induce anxiety among their audiences today.
A side note here—perhaps today's young women might find it hard to understand why American housewives of the past felt guilty or anxious about using instant cake mixes. But think about it this way: when you're not feeling well, would you prefer your boyfriend to cook something for you himself or order McDonald's delivery? Which one feels like a genuine expression of love? If he dares to choose McDonald's, I can't say for sure if he'll feel guilty, but he's definitely in for a scolding.
Despite the controversies, advertising agencies were more than willing to pay Dichter's hefty consulting fees. In 1946, Dichter founded his own company—the Institute for Motivational Research—in New York, which soon established over a dozen offices across the United States and abroad. Corporate executives lined up like "desperate patients" waiting for an appointment with "Dr. Dichter." By 1956, Dichter's consulting fee had reached $30,000, and the annual revenue of the Institute for Motivational Research soared to $750,000—equivalent to about $8.25 million in 2024.
Now, let's talk about user research.
Back in the machine age, roughly from 1880 to 1945, the large-scale production of steel had become a reality, making the construction of massive bridges, skyscrapers, and steel giants no longer a distant dream. The world's first modern power station—Britain's Deptford station, completed in 1891—had an output of only 800 kilowatts (compared with the 22.5-million-kilowatt capacity of China's Three Gorges Dam today), but it was enough to power central London. Electric lighting drastically improved living and working environments, eliminating the pollution from gas lighting and reducing the risk of fires to an unprecedented low.
In 1841, the world's first national machine tool standard—Joseph Whitworth's uniform screw-thread standard in Britain—was established. A consistent and reliable power supply enabled machines to be arranged in sequence according to the production process, creating the very first assembly lines. With assembly lines came mass production, and the direct benefit of mass production was a reduction in costs. In 1910, Ford's Model T was priced at $780, dropping to $360 by 1916, and down to just $290 by 1924 (equivalent to about $5,156 in 2023). During that era, whoever mastered mass production technology gained a competitive edge.
But having equipment wasn't enough; factories also needed more advanced management methods. Led by American engineer Frederick Winslow Taylor, a whole generation of production managers began promoting the concept of "scientific management." From worker selection and training to performance evaluation, a series of standardized, process-driven management methods and systems was developed to make labor more efficient. Decades later, Japan's Toyota developed the "Andon" system along the same lines, allowing any worker on the production line to stop the entire line upon discovering a quality issue—encouraging proactive problem reporting rather than reactive correction.
Fordism, named after the practices introduced by the Ford Motor Company in the early 20th century, refers to the production method and management philosophy that defined much of industrial capitalism. Its core characteristics included assembly line production, extreme division of labor, and high wage standards. Fordism laid a solid foundation for modern capitalism, making consumer goods affordable for the masses while keeping production highly efficient. However, it largely ignored the value of the individual as a human being, along with their internal desires and motivations. Ford's perspective on workers was inherently instrumental; they were seen as parts of the production system, executing monotonous, repetitive tasks. While this approach significantly improved productivity, workers lost control over their labor and became subjects of exploitation in such an environment.
If we look at the timeline of the American consumer market's evolution, it can be broadly divided into three phases: the mass consumption era, the brand consumption era, and the rational consumption era. The formation of the mass consumption era is closely linked to Fordism. In 1924, Calvin Coolidge—who as the 29th Vice President had succeeded to the presidency a year earlier, becoming the 30th President—was elected in his own right. His presidency coincided with the golden decade of economic recovery in the U.S. after World War I. The rapid advances in the steel industry, chemical production, automobile manufacturing, and radio technology not only created numerous jobs but also brought an abundance of material choices for the American public. As mentioned earlier, mass production drastically reduced the cost of consumer goods, making consumption no longer the privilege of the wealthy few—ordinary people could now improve their quality of life through consumption.
In 1939, as Nazi Germany invaded Poland, World War II broke out. Because the U.S. mainland remained largely untouched by wartime destruction, the country experienced rapid economic growth again after the victory in 1945, with the boom peaking in the 1960s.
Overproduction accelerated the transition from the mass consumption era to the brand consumption era, as Americans' pursuit of quality and brands grew increasingly strong. High consumption even became a way to flaunt income and social status. This was the backdrop for Dichter's success. Dichter argued that in modern society, brands served as a replacement for aristocratic titles and family pedigrees—everyone was searching for products that aligned with the image of who they wanted to become. Therefore, every product carried a special meaning, often linked to sex, insecurity, or the desire for social status. For instance, in the early 1950s, Dichter advised banks to emphasize "overdraft" in their ads, persuading people that having access to money mattered more than low loan rates. Credit cards were even more powerful—the magic wasn't in the credit limit but in giving consumers a sense of security and a seemingly endless source of power.
Similarly, consumers often experience some degree of guilt after purchasing indulgent products, such as cigarettes, spirits, in-game purchases, or fried foods. The advertising strategy for such products, therefore, has to position them as "rewards."
In the early 20th century, Fordism did indeed solve the contradiction between rapidly growing market demand and industrial production capacity. However, the uniform, standardized products it produced could not satisfy the increasingly individualized demands of the brand consumption era, and thus, Post-Fordism emerged.
Post-Fordism emphasizes flexible production models and innovation, placing particular importance on consumer feedback regarding products and services. It pushed forward the concept of user experience, with designers and engineers gradually realizing that users were no longer satisfied with "one size fits all" products. Instead, they desired something "made for me." If Fordism's central idea was "We know what consumers need, let's produce it and sell it to them," then Post-Fordism's core philosophy became "Consumers know what they need, and companies must acknowledge the consumer's dominant role and adapt proactively to dynamic consumer demands."
In the 1950s, the famous American industrial designer Henry Dreyfuss wrote: "When the point of contact between the product and the people becomes a point of friction, then the industrial designer has failed. On the other hand, if people are made safer, more comfortable, more eager to purchase, more efficient—or just plain happier—by contact with the product, then the designer has succeeded." Look at those key words: safety, comfort, efficiency, happiness. Dreyfuss's perspective succinctly captures the principles behind all modern experience-oriented design.
For example, before the 19th century in Western countries, office chairs were reserved exclusively for aristocrats, wealthy individuals, officials, and scholars. This was because the majority of people were still engaged in physical labor. After the Industrial Revolution, traditional labor modes were transformed, and factories needed literate people to handle logistics, manage warehouses, and perform accounting tasks. This is where the concept of the administrative clerk emerged—essentially the first wave of middle management in companies. The first mass-produced office chair was designed by American inventor Thomas Warren and was exhibited by the American Chair Company at the Great Exhibition in London in 1851. Although its style may seem outdated now, its functionality was almost identical to today's office chairs, featuring a swiveling seat, plush velvet cushioning, and four wheels for easy mobility.

The second generation arrived in 1904, when American architect Frank Lloyd Wright designed the world's first height-adjustable office chair. While adjustable height was a useful feature, the chair itself wasn't particularly comfortable. Wright's primary design focus was ensuring visual harmony between the office desk, the chair, and his own Larkin Building, rather than the comfort of the person sitting in it. It's worth noting that during that era, managers generally believed that "comfort equals laziness," which is why designers at the time did not prioritize comfort and focused instead on functionality.

With the advent of the brand consumption era, office chair design went through two phases—first, the aesthetics-driven phase, followed by the ergonomics-driven phase. For instance, the aluminum office chairs designed by Charles and Ray Eames featured smooth lines and vibrant colors, making them a hot commodity during the aesthetics-driven phase. In 1976, American furniture designer William Eugene Stumpf created the world's first ergonomic office chair—the Ergon Chair—for Herman Miller. This chair didn’t boast flashy aesthetics, and it used the most affordable foam materials, but it provided superior spinal support.

In the 1990s, the tech boom drove demand for more advanced and functional designs. Stumpf, together with designer Don Chadwick and again working for Herman Miller, created the Aeron Chair in 1994—the same ergonomic chair that many of us use today, complete with a reclining backrest and breathable mesh fabric.

In conclusion, this is how the concept of user experience emerged. While user experience is a core focus for designers and engineers in today's tech companies, the stories above illustrate that user experience is not a product of modern technology—it is an experience, encompassing the emotions, cognition, attitudes, and behaviors that arise when users interact with products and services. It can be traced back thousands of years: to ancient Chinese feng shui practices, derived from geomancy, and later to Hippocrates in ancient Greece, who described how a surgeon's workspace and tools should be arranged around human needs—designs should follow the body's natural movements and postures, minimizing discomfort and reducing the learning curve for patients and caregivers.
German psychologist Marc Hassenzahl, a professor at the University of Siegen, defined user experience in terms of two dimensions: the pragmatic (usability) and the hedonic (pleasure). Usability is about helping users solve pain points in their life or work, while pleasure is about giving users an opportunity for self-expression. A truly good product balances the two, ensuring that it not only remains easy to use in the long run but also enriches the user's life without causing obstacles or frustration.
Of course, many of the user experience design principles and methods we know today emerged from the electronics and technology sectors, thanks largely to Apple. In 1984, Apple co-founder Steve Jobs drew inspiration from Xerox's Alto computer research and launched the world’s first commercially successful all-in-one personal computer featuring a mouse and a graphical user interface—the Macintosh 128K. Soon after, Microsoft introduced Windows. Suddenly, people no longer needed to learn programming languages; everything could be accomplished with a mouse. Apple and Microsoft transformed complex, esoteric computer technology into delightful, user-friendly products. When Jobs returned to Apple in 1997, he once again made Apple the leader in user experience, especially with the release of the iPhone in 2007, which completely transformed the way we interact with digital technology.
Yet for a long time, there remained a status disparity between user research and market research, which is why even today many traditional industries still refer to their consumers as "clients" rather than "users." It wasn't until the 2000s, particularly after the 2008 global financial crisis, that companies began to recognize the value of user research; leading tech and internet companies in China started building their user research teams only after 2010. Why? With the end of the financial crisis, the brand consumption era also ended. Consumers were no longer obsessed with big brands as they had been; instead, they began to embrace niche, emerging products—even those without a brand name—so long as they were tailored to their needs. Meanwhile, advances in computer science, artificial intelligence, and the rise of neuroscience provided new tools and methodologies for user experience and research—tools with inherently higher scientific rigor. These tools enhanced the efficiency of researchers, product managers, designers, and engineers, propelling the optimization, upgrading, and iteration of products and services, and making user experiences increasingly personalized and customizable.
However, this doesn't mean that brands and marketing are obsolete. From a biological, psychological, and neuroscientific standpoint, our purchasing decisions are still primarily driven by emotions, not rationality. In 2009, a company named Buyology was founded in New York. "Buyology" is an ingenious term—a blend of "Buy," meaning to consume, and "-ology," indicating a field of study. It sounds phonetically similar to "Biology," hinting that this institution studied consumer behavior through the lens of life sciences. Essentially, this company was doing exactly what Ernest Dichter's Institute for Motivational Research had done sixty years prior.
To be continued