Why Great Marketers Must Be Great Skeptics





Slide 0

Rand Fishkin, Wizard of Moz | @randfish | rand@moz.com Why Great Marketers Must Be Great Skeptics


Slide 1

This Presentation Is Online Here: bit.ly/mozskeptics


Slide 2

Defining Great Skepticism


Slide 3

I have some depressing news…


Slide 4


Slide 5


Slide 6

Does anyone in this room believe that the Earth doesn’t revolve around the Sun?


Slide 7


Slide 8

The Earth (and everything in the solar system, including the Sun) revolves around our system’s gravitational Barycenter, which is only sometimes near the center of the Sun.


Slide 9

Let’s try a more marketing-centric example...


Slide 10

In 2009, Conversion Rate Experts built us a new landing page and increased our subscribers by nearly 25%. What did they do? (Via CRE’s Case Study)


Slide 11

One of the most commonly cited facts about CRE’s work is the “long landing page.”


Slide 12

The Crap Skeptic: “Let’s change our landing page to be a long one right now!”
The Good Skeptic: “We should A/B test a long landing page in our conversion funnel.”
The Great Skeptic: “How do we know page length was responsible? What else changed?”


Slide 13

The Crap Skeptic: “I do believe sadly it’s going to take some diseases coming back to realize that we need to change and develop vaccines that are safe.”
The Good Skeptic: “Listen, all magic is scientific principles presented like ‘mystical hoodoo’ which is fun, but it’s sort of irresponsible.”
The Great Skeptic: “The good thing about science is that it’s true whether or not you believe in it.”


Slide 14

In fact, we’ve changed our landing pages numerous times to shorter versions and seen equal success. Length, it would seem, was not the primary factor in this page’s success.


Slide 15

What separates the crap, good, & great?


Slide 16

The Crap Skeptic: assumes one belief-reinforcing data point is evidence enough; doesn’t question what’s truly causal vs. merely correlated; doesn’t seek to validate.


Slide 17

The Good Skeptic: doesn’t make assumptions about why a result occurred; knows that correlation isn’t necessarily causal; validates assumptions w/ data.


Slide 18

The Great Skeptic: seeks to discover the reasons underlying the results; knows that correlation doesn’t imply causality; thoroughly validates, but doesn’t let imperfect knowledge stop progress.


Slide 19

Will more conversion tests lead to better results? Testing


Slide 20

Obviously the more tests we run, the better we can optimize our pages. We need to build a “culture of testing” around here.


Slide 21

Via Wordstream’s What is a Good Conversion Rate?


Slide 22

Do Those Who Test More Really Perform Better? (Via Wordstream’s What is a Good Conversion Rate?)


Slide 23

Hmm… There’s no correlation between those who run more tests across more pages and those who have higher conversion rates. Maybe the number of tests isn’t the right goal.
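A quick way to run this check on your own data is a simple correlation between test volume and conversion rate. This is a minimal sketch; the per-account numbers are hypothetical, and only the method is the point.

```python
# Minimal sketch: does "number of tests run" correlate with conversion rate?
# The per-account data below is hypothetical -- only the method is the point.
from statistics import mean, stdev

tests_run = [2, 5, 8, 12, 20, 35, 50]            # tests per account (hypothetical)
conv_rate = [3.1, 2.4, 5.8, 2.9, 4.1, 3.0, 3.6]  # conversion rate %, same accounts

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

r = pearson(tests_run, conv_rate)
print(f"Pearson r = {r:.2f}")  # a value near 0 means test count alone predicts little
```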


Slide 24

Via Factors That Drive How Quickly You Can Run New Online Tests


Slide 25

A conversion decision is a complex process, shaped by trust, word of mouth, likability, design, associations, amount of pain, CTAs, UX, effort required, process, historical experiences, social proof, copywriting, timing, discovery path, branding, and price.


Slide 26

How do we know where our conversion problems lie?


Slide 27

Ask Smart Questions to the Right People:
Potential customers who didn’t buy: professional, demographic, & psychographic characteristics. What objections did you have to buying? What would have made you overcome them?
Those who tried/bought but didn’t love it: professional, demographic, & psychographic characteristics. What objections did you have; how did you overcome them? What would have made you stay/love the product?
Customers who bought & loved it: professional, demographic, & psychographic characteristics. What objections did you overcome; how? What do you love most? Can we share?


Slide 28

We can start by targeting the right kinds of customers. Trying to please everyone is a recipe for disaster.


Slide 29

Our tests should be focused on overcoming the objections of the people who best match our customer profiles.


Slide 30

Testing button colors


Slide 31

Testing headlines, copy, visuals, & form fields


Slide 32

Designing for how customers think about their problems & your solution


Slide 33

THIS!


Slide 34

Security: Does telling users we encrypt data scare them?


Slide 35

Could this actually HURT conversion? (Via Visual Website Optimizer)


Slide 36

Via Visual Website Optimizer


Slide 37

A/B Test Results (via Visual Website Optimizer): they found that the variation without the secure icon produced an over 400% improvement in conversions compared to the version with the image. [Note: results ARE statistically significant]
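For anyone replicating a result like this, a two-proportion z-test is one standard way to check whether a conversion difference between two variants is statistically significant. The visitor and conversion counts below are hypothetical stand-ins, not VWO’s actual data.

```python
# Minimal sketch: two-proportion z-test for an A/B conversion result.
# Counts are hypothetical stand-ins, not the numbers from the VWO case study.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-tailed, via the normal CDF
    return z, p_value

z, p = two_proportion_z(conv_a=12, n_a=2000, conv_b=61, n_b=2000)
print(f"z = {z:.2f}, p = {p:.6f}")  # p < 0.05 suggests the lift isn't random noise
```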


Slide 38

We need to remove the security messages on our site ASAP!


Slide 39

We should test this.


Slide 40

Is this the most meaningful test we can perform right now? (I’m not saying it isn’t, just that we should prioritize intelligently)


Slide 41

Via Kayak’s Most Interesting A/B Test


Slide 42

A/B Test Results (via Kayak’s Most Interesting A/B Test): “So we decided to do our own experiment about this and we actually found the opposite: when we removed the messaging, people tended to book less.” - Vinayak Ranade, Director of Engineering for Mobile, KAYAK


Slide 43

The Good Skeptic: “Good thing we tested!” The Great Skeptic: “Good thing we tested!” The Crap Skeptic: “Your evidence is no match for my ignorance!”


Slide 44

Social CTR: What should we expect from sharing our content on social media?


Slide 45

Just find the average social CTRs and then try to match them or do better. No-brainer.


Slide 46

Via Signup.to’s Analysis of CTR on Twitter


Slide 47

Via Signup.to’s Analysis of CTR on Twitter


Slide 48


Slide 49


Slide 50

306/701 = 43.6%... WTF??


Slide 51


Slide 52

Phew! We’re not alone. Via Chartbeat


Slide 53

Assuming social metrics and engagement correlate was a flawed assumption. We need to find a better way to measure and improve social sharing.
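One candidate for “a better way to measure” is computing per-post CTR from clicks and impressions instead of inferring engagement from share counts. A minimal sketch follows; all post data (URLs, counts) is hypothetical.

```python
# Minimal sketch: per-post CTR from clicks and impressions, so engagement is
# measured directly rather than inferred from shares. Data is hypothetical.
posts = [
    {"url": "example.com/a", "shares": 701, "impressions": 52000, "clicks": 306},
    {"url": "example.com/b", "shares": 90,  "impressions": 8100,  "clicks": 410},
]

for post in posts:
    ctr = post["clicks"] / post["impressions"]
    clicks_per_share = post["clicks"] / post["shares"]
    print(f'{post["url"]}: CTR {ctr:.2%}, clicks per share {clicks_per_share:.2f}')
```

As the hypothetical numbers show, a heavily shared post can still earn fewer clicks per share, which is exactly why share counts alone mislead.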


Slide 54


Slide 55


Slide 56


Slide 57

OK. We can create some benchmarks based on these numbers and their averages, then work to improve them over time.


Slide 58

That is an insane amount of variability!


Slide 59

There are other factors at work here. We need to understand them before we can create smart metrics or useful expectations.


Slide 60

Timing, source, audience affinity, formatting, network-created limitations to visibility, and brand all shape a post’s reach, traffic, & engagement.


Slide 61

Let’s start by examining the data and impacts of timing.


Slide 62

Via Facebook Insights


Slide 63

Via Followerwonk


Slide 64

Via Google Analytics


Slide 65

There’s a lot of nuance, but we can certainly see how messages sent at certain times reach different sizes and populations of our audience.


Slide 66

Comparing a tweet or share sent at 9am Pacific against tweets and shares sent at 11pm Pacific will give us misleading data.


Slide 67

But we now know three things:
#1 – When our audience is online
#2 – Sharing just once is suboptimal
#3 – To be a great skeptic (and marketer), we should attempt to understand each of these inputs with similar rigor
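For input #1, the underlying analysis can be as simple as bucketing historical engagement by posting hour, roughly what tools like Followerwonk and Facebook Insights report. The records below are hypothetical analytics exports; only the bucketing approach is the point.

```python
# Minimal sketch: bucket engagement by posting hour to see when an audience
# actually responds. The (hour, clicks) records are hypothetical exports.
from collections import defaultdict

records = [(9, 120), (9, 95), (11, 40), (14, 210), (14, 180), (23, 12)]

clicks_by_hour = defaultdict(list)
for hour, clicks in records:
    clicks_by_hour[hour].append(clicks)

for hour in sorted(clicks_by_hour):
    vals = clicks_by_hour[hour]
    print(f"{hour:02d}:00 Pacific -- avg {sum(vals) / len(vals):.0f} clicks across {len(vals)} posts")
```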


Slide 68

Share Buttons: Do they work? Can we make them more effective?


Slide 69

After relentless testing, OKTrends found that the following share buttons worked best:


Slide 70


Slide 71


Slide 72

OKTrends found that removing all but a single button (the “like” on Facebook) had the most positive effect.


Slide 73

And that waiting until the visitor had scrolled to the bottom of the article produced the highest number of actions.


Slide 74

We should remove all our social sharing buttons and replace them with a single slide-over social CTA for Facebook likes!


Slide 75

Buzzfeed has also done a tremendous amount of social button testing & optimization…


Slide 76

And sometimes they do this…


Slide 77

And sometimes this…


Slide 78

Is Buzzfeed still in testing mode?


Slide 79

Nope. They’ve found it’s best to show different buttons based on both the type of content and how you reached the site.


Slide 80

OK… Well, then let’s do that… Do it now!


Slide 81

Testing a small number of the most impactful social button changes should produce enough evidence to give us a direction to pursue.


Slide 82

Buzzfeed & OKTrends share several unique qualities:
They have huge amounts of social traffic
Social shares are integral to their business model
The content they create is optimized for social sharing


Slide 83

Unless we also fit a number of these criteria, I have to ask again: Is this the most meaningful test we can perform right now?


Slide 84

BTW – it is true that testing social buttons can coincide with a lot of other tests (since it’s on content vs. the funnel), but dev resources and marketing bandwidth probably are not infinite.


Slide 85

Anchor Text: Does it still work better than standard link text?


Slide 86

Psh. Anchor text links obviously work. Otherwise Google wouldn’t be penalizing all these sites for getting them.


Slide 87

It has been a while since we’ve seen a public test of anchor text. And there’s no way to know for sure how powerful it still is.


Slide 88

Testing in Google is very, very hard. There are so many confounding variables that we’d have to choose our criteria carefully and repeat the test multiple times to feel confident of any result.


Slide 89

Test Conditions:
1) Three-word, informational keyword phrase with relatively light competition and stable rankings
2) We selected two results (“A” and “B”), ranking #13 (“A”) and #20 (“B”) in logged-out, non-personalized results
3) We pointed links from 20 pages on 20 unique, high-DA, high-trust, off-topic sites at both “A” and “B”


Slide 90

A) We pointed 20 links from 20 domains at this result (ranking #13) with anchor text exactly matching the query phrase
B) We pointed 20 links from the same 20 pages as “A” to this URL (ranking #20) with anchor text that did not contain any words in the query


Slide 91

After 20 days, all of the links had been indexed by Google. “A” and “B” both moved up 4 positions. None of the other results moved more than 2 positions.


Slide 92

See? Told you it works.


Slide 93

While both results moved up the same number of positions, it’s almost certainly the case that the move from #13 to #9 was against more serious challengers, and thus anchor text would seem to make a difference. That said, I’d want to repeat this a few times.
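One sanity check behind this reasoning is comparing how far the test URLs moved against how much the untouched results in the same SERP drifted over the test window. The target deltas below are the ones reported above; the control deltas are illustrative values consistent with “none of the other results moved more than 2 positions.”

```python
# Minimal sketch: did the test URLs move more than background drift in the SERP?
# Target deltas come from the test above; control deltas are illustrative.
targets = {"A (exact-match anchors)": 4, "B (non-matching anchors)": 4}
control_deltas = [0, 1, -1, 2, 0, -2, 1, 0]   # untouched results (positive = moved up)

noise = max(abs(d) for d in control_deltas)   # largest background drift: 2 positions
for name, delta in targets.items():
    verdict = "exceeds background drift" if abs(delta) > noise else "within normal drift"
    print(f"{name}: moved {delta:+d} positions -- {verdict}")
```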


Slide 94

Princess Bubblegum and I are in agreement. We should do the test at least 2-3 more times, keeping as many variables as possible the same.


Slide 95

Early Results from a Second Test:
1) Three-word, informational keyword phrase with relatively light competition and stable rankings
2) We selected two results (“A” and “B”), ranking #20 (“A”) and #14 (“B”) in logged-out, non-personalized results
3) We pointed links from 20 pages on 20 unique, high-DA, high-trust, off-topic sites at both “A” and “B”


Slide 96

A) We pointed 20 links from the same pages/domains at this result (ranking #20) with anchor text exactly matching the query phrase
B) We pointed 20 links from 20 domains to this URL (ranking #14) with anchor text that did not contain any words in the query


Slide 97

After 16 days, all of the links had been indexed by Google. “A” moved up 19 positions to #1! “B” moved up 5 positions to #9. None of the other results moved more than 2 positions.


Slide 98

The Good Skeptic: “Good thing we tested!” The Great Skeptic: “This is looking more conclusive, but we should run at least one more test.” The Crap Skeptic: “Anchor text = rankings. Stick a fork in it!”


Slide 99

Google+: Does it influence Google’s non-personalized search rankings?


Slide 100

Google+ is just too damn high. (Good discussion about Google+ correlations in this post.)


Slide 101

From a comment Matt Cutts left on the blog post: “Most of the initial discussion on this thread seemed to take from the blog post the idea that more Google +1s led to higher web ranking. I wanted to preemptively tackle that perception.”


Slide 102

To me, that’s Google working really hard to NOT say “we don’t use any data from Google+ (directly or indirectly) at all in our ranking algorithms.” I would be very surprised if they said that.


Slide 103

Google explicitly SAID +1s don’t affect rankings. You think they’d lie so blatantly? As if.


Slide 104

The correlations are surprisingly high for something with no connection. There have been several tests showing no result, but if all it takes is a Google+ post, let’s do it!


Slide 105

First, remember how hard it is to prove causality with a public test like this. And second, don’t let anything but consistent, repeatable, provable results sway your opinion.


Slide 106


Slide 107

At 10:50am, the test URL ranked #26 in logged-out, non-personalized, non-geo-biased, Google US results.


Slide 108

42 minutes later, after ~30 shares, 40 +1s, and several other G+ accounts posting the link, the target moved up to position #23.


Slide 109

48 hours later, after 100 shares of the post, 95 +1s, and tons of additional posts, the result was back down to #25.
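The whole test boils down to a handful of (time, rank) snapshots, which makes it easy to log and replay. The observations below are the ones reported on the preceding slides; the logging structure itself is just a sketch.

```python
# Minimal sketch: log rank snapshots for the test URL and report movement
# relative to the baseline. Observations are from the slides above.
observations = [
    ("10:50am",   26),  # baseline: logged-out, non-personalized Google US results
    ("+42 min",   23),  # after ~30 shares, 40 +1s, and several G+ posts
    ("+48 hours", 25),  # after ~100 shares, 95 +1s, and many more posts
]

baseline = observations[0][1]
for label, rank in observations:
    print(f"{label:>10}: ranked #{rank} ({baseline - rank:+d} positions vs. baseline)")
```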


Slide 110

At least we proved one thing – the Google+ community is awesome. Nearly 50 people shared the URL in their own posts on G+!


Slide 111

Many G+ users’ personalized results, however, were clearly affected.


Slide 112

Something very strange is happening in relation to the test URL in my personalized results, though. It’s actually ranking LOWER than in non-personalized results.


Slide 113

Could Google be donking up the test? Sadly, it’s impossible to know.


Slide 114

GASP!!! The posts did move the result up, then someone from Google must have seen it and is messing with you!!!


Slide 115

Sigh… It’s possible that Jenny’s right, but impossible to prove. We don’t know for sure what caused the initial movement, nor can we say what’s causing the weird personalized results.


Slide 116

More testing is needed, but doing it without any potential monkey wrenches is going to be a big challenge. That said, remember this:


Slide 117

Phew! We’re not alone. Via Chartbeat


Slide 118

If I were Google, I wouldn’t use Google+ activity by itself to rank anything, but I would connect G+ to my other data sources and potentially increase a page’s rankings if many pieces of data told a story of engagement & value for visitors.


Slide 119

Ready to Be Your Own Skeptic?


Slide 120

Rand Fishkin, Wizard of Moz | @randfish | rand@moz.com bit.ly/mozskeptics

