A few weeks ago, I interviewed with a large company in downtown Seattle. I’d done research on the company and its offerings, studied their current site, watched a couple of their training videos, and written a series of questions to root out information about the position, overseeing an aspect of their customer outreach. I was prepared and somewhat confident entering the interview.
Initially, I met with the hiring manager. I felt we had a good conversation about the challenges and opportunities associated with the position. In addition, I responded to his questions with examples, and discussed my experience tracking metrics at Microsoft.
The second interviewer presented stock questions, for which I gave examples. One question, however, probed as to how I went “deep” in response to requests from executives or managers. I gave examples of examining historical information, conferring with people in the field (who usually have their fingers on the pulse of customers’ needs, requests, and complaints), and talking to sales people (who not only have their fingers on customers’ pulses, but are blunt about everything that’s wrong).
He wasn’t satisfied with the answer. So I gave another answer, and he still insisted, “How did you go deep?”
The answer he was probably seeking was “conduct formal customer research,” rather than rely on quantitative web metric tools. That sounds good in hindsight, but it wasn’t viable at the time, when there was neither budget nor interest in doing much more than “make it better.”
My interview with the third person was a complete disaster. He focused on the work I’d done at Dell ten years earlier, questioning how I could come up with the revised messaging and “bucketing” for Dell Professional Services without doing a card sort, and why I felt it necessary to revise a website, even though, as I explained, it melded together pages developed by Dell along with those from a recently acquired company. For the latter, they’d stripped off the company logo and replaced it with the Dell logo.
I further elaborated that salespeople weren’t selling Dell Professional Services because the site was confusing. More importantly, no one felt it did a good job of representing the offerings.
The interviewer was incredulous that I could launch into extensive web revisions and marketing projects without doing formal customer research and testing.
The sites that I’d overseen at Intel, Dell, and Microsoft had gone through usability testing, but the timing was based on budget, not need. I’d worked on the Intel Home Computing website for nine or more months before it was handed over to a research team.
I remember standing behind a one-way mirror, biting my nails as a man fumbled to figure out what to click to reach the desired content. A few minutes later, a woman sailed through the exercise, clicking on the “appropriate” links each time.
In the end, I made a few tweaks to the site. For the most part, my intuition about navigation, messaging, naming of links, and customer behavior produced an effective site, with good traffic flow, and clicks to the desired product and offer pages.
I used this knowledge when I demolished the Dell Professional Services website, and a few years later, the Dell Enterprise Services site. Both sites received high marks when they were eventually tested.
The Microsoft Learning website had 900 pages when I walked in the door. A year later, I had reduced it to fewer than 130 pages by eliminating redundancy and orphan pages, and by using tabs to group content. One page with five tabs seems better than five discrete pages.
Nevertheless, the interviewer to whom I relayed this story wasn’t impressed. He still felt 130 pages was too many. Point taken, but reducing a site by 85% should warrant at least a tiny kudos.
I was upset by the interview, until I sauntered over to my favorite marketing blog, written by Seth Godin. He’d posted an article titled “More people are doing marketing badly…” He theorizes that most people doing marketing are actually good at doing something else, and they’re merely making it up as they go along. Plus, there’s no rule book validating their work.
The cure, he offers, is “Noticing. Notice what is working in the real world and try to figure out why. Apply it to your work. Repeat. Learn to see, to discern the difference between good and bad, between useful and merely comfortable.”
That’s the crux. Accomplished designers, writers, and marketers have an instinct for what will appeal to a target audience. They know how to create compelling messaging, fashion appealing offers, and turn content and graphics into effective advertisements, brochures, one-pagers, and web pages. They can do this without conducting a survey, doing a card sort, or testing their potential design. Their insight, experience, and savoir-faire enable them to discern what will work and what won’t.
Physicians are the same. Can you imagine going to a doctor with a list of issues and, instead of listening, the doctor insists on ordering a battery of tests?
“Let’s schedule a stress test.”
“But doctor, I only have a hangnail.”
“An upper GI would be advisable.”
“But it’s just a rash on my back.”
Physicians rely on their book knowledge, expertise, observation, and their patients’ responses to questions, along with poking and prodding to reach an initial diagnosis. If necessary, tests are ordered to confirm their instincts. Or they might prescribe a treatment plan, and then monitor the outcomes.
It’s the same with marketing, especially if you’re marketing an established product or service. Interviews with stakeholders (other marketers, salespeople, field personnel, etc.), research using existing channels (sales records, success of previous promotions, analysts’ reports, etc.), and intuition provide a framework for arriving at an approach.
And yes, it’s fabulous when you can confirm, refine, or maybe scrap an approach by using metrics, derived from customer research and testing. However, most marketers don’t have this luxury for daily decision-making.
In addition, by depending solely on marketing research, you risk overlooking details, because research looks for the average. And those details could be the key to reaching and engaging customers.
Here’s an example. I’d been doing marketing for Dell Professional and Enterprise Services for several years. The consumer services group was revamping their offerings, and organized a road show to introduce the offerings to Dell and contract sales teams. I was sent to Lake City, Florida, population 12,000, where the Dell call center was the big employer in town.
At one point in my spiel, having given it to several teams, I asked, “How do you sell services?” A man at the back of the room explained that his car had a spare tire “just in case.” Even a BMW has a spare tire. In the same vein, a person should also have a spare tire, a service agreement, for their PC “just in case.” I later learned he sold the most service contracts on his team.
A year later, when I worked with Dell Legal to research a name for a tool to run health checks on PCs, I settled on Dell PC TuneUp, and used a car analogy in the messaging, explaining that just as you periodically tune up a car, you should tune up your PC.
I’m certainly not against customer research and testing, or gathering metrics via web and social media tools. When I worked for Microsoft Learning, I would spend days using Webtrends, and later Omniture and a collection of social media tools. However, research and testing are just one component of the marketing formula. Their key benefits are validating existing strategies and approaches, discovering new applications and markets, and rooting out emerging trends.
Understanding your audiences’ challenges, needs, and opportunities, being able to patch together seemingly disparate developments and trends, and healthy doses of experience and intuition are key to great marketing.