• Archives

  • Archive for the ‘Uncategorized’ Category

    What Happens When SEO and CRO Conflict?

    Thursday, January 10th, 2019

    Posted by willcritchlow

    Much has been written and spoken about the interplay of SEO and CRO, and there are many reasons why, in theory, both ought to be working towards a shared goal. Whether it’s the simple pragmatism of increasing the total number of conversions, or higher-minded ideals such as Google seeking to reward the best user experiences, we have many things that should bring us together.

    In practice, though, it’s rarely that simple or that unified. How much effort do practitioners on each side put in to ensure that they are working towards the shared goal of the greatest number of conversions?

    Asking around, I’ve found that many SEOs do worry about their changes hurting conversion rates, but few actively mitigate that risk. Interestingly, my conversations with CRO experts show that they, too, often worry about SEO work negatively impacting conversion rates.

    Neither side weights as heavily the risk that conversion-oriented changes could hurt organic search performance, but our experience shows that both risks are real.

    So how should we mitigate these risks? How should we work together?

    But first, some evidence

    There are certainly some SEO-centric changes that carry a very low risk of hurting conversion rates for visitors from other channels. Changing meta information, for example, is largely invisible to users on the page, so arguably that is pure SEO:

    And then on the flip side, there are clearly CRO changes that don’t have any impact on your organic search performance. Anything you do on non-indexed pages, for example, can’t change your rankings. Think about work done within a checkout process or within a login area. Google simply isn’t seeing those changes:
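    To make “Google simply isn’t seeing those changes” concrete: pages inside a checkout or login area are typically kept out of the index with a robots meta tag, so a quick indexability check tells you whether a change can even touch rankings. The sketch below is illustrative (the `is_indexable` helper is my own, not part of any Moz tool):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_indexable(html: str) -> bool:
    """True unless the page carries a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" not in parser.directives

checkout = '<meta name="robots" content="noindex, nofollow">'
landing = '<html><head><title>Product page</title></head></html>'
print(is_indexable(checkout))  # False – pure CRO territory
print(is_indexable(landing))   # True – changes here sit in the SEO/CRO overlap
```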

    But everything else has a potential impact on both, and our experience has been showing us that the theoretical risk is absolutely real. We have definitely seen SEO changes that have changed conversion rates, and have experience of major CRO-centered changes that have had dramatic impacts on search performance (but more on that later). The point is, there’s a ton of stuff in the intersection of both SEO and CRO:

    Throughout this post, I’ll talk about our experiences and the work we’ve done that has shown impacts in various directions, from conversion rate-centric changes that altered search performance and vice versa. How are we seeing all this?

    Well, testing has been a central part of conversion rate work essentially since the field began, and we’ve been doing a lot of work in recent years on SEO A/B testing as well. At our recent London conference, we announced that we have been building out new features in our testing platform to enable what we are calling full funnel testing which looks simultaneously at the impact of a single change on conversion rates, and on search performance:

    If you’re interested in the technical details of how we do the testing, you can read more about the setup of a full funnel test here. (Thanks to my colleagues Craig Bradford and Tom Anthony for concepts and diagrams that appear throughout this post).

    But what I really want to talk about today is the mixed objectives of CRO and SEO, and what happens if you fail to look closely at the impact of both together. First: some pure CRO.

    An example CRO scenario: The business impact of conversion rate testing

    In the example that follows, we look at the impact on an example business of a series of conversion rate tests conducted throughout a year, and see the revenue uplift we might expect as a result of rolling out winning tests and turning off null and negative ones. We compare the revenue we might achieve with the revenue we would have expected without testing. The example is a little simplified, but it serves to illustrate the point.

    We start on a high with a winning test in our first month:

    After starting on a high, our example continues through a bad stretch – a null test (no confident result in either direction) followed by three losers. We turn off each of these four, so none of them has any impact on future months’ revenue:

    Let’s continue something similar through the end of the year. Over the course of this example year, we see three months with winning tests, and of course we roll out only the ones that come with uplifts:

    By the end of this year, even though more tests have failed than succeeded, you have proven some serious value to this small business and moved monthly revenue up significantly, taking annual revenue for the year to over £1.1m (from a £900k starting point):

    Is this the full picture, though?

    What happens when we add in the impact of these changes on organic search performance? Well, let’s look at the same example financials with a couple more lines showing the SEO impact. That first positive CRO test? Negative for search performance:

    If you weren’t testing the SEO impact, and only focused on the conversion uplift, you’d have rolled this one out. Carrying on, we see that the next (null) conversion rate test should have been rolled out because it was a win for search performance:

    Continuing on through the rest of the year, we see that the actual picture (if we make decisions of whether or not to roll out changes based on the CRO testing) looks like this when we add in all the impacts:

    So you remember how we thought we had turned an expected £900k of revenue into over £1.1m? Well, it turns out we’ve added less than £18k in reality and the revenue chart looks like the red line:

    Let’s make some more sensible decisions, considering the SEO impact

    Back to the beginning of the year once more, but this time, imagine that we actually tested both the conversion rate and search performance impact and rolled out our tests when they were net winners. This time we see that while a conversion-focused team would have rolled out the first test:

    We would not:

    Conversely, we would have rolled out the second test because it was a net positive even though the pure CRO view had it neutral / inconclusive:

    When we zoom out on that approach to the full year, we see a very different picture to either of the previous views. By rolling out only the changes that are net positive considering their impact on search and conversion rate, we avoid some significant drops in performance, and get the chance to roll out a couple of additional uplifts that would have been missed by conversion rate changes alone:

    The upshot: a +45% uplift for the year, ending with monthly revenue up 73% – avoiding the false hope of the pure conversion-centric view, and delivering real business impact:

    Now, of course, these are simplified examples. In the real world, we would need to look at impacts per channel, and might consider rolling out tests that appeared non-negative rather than waiting for a statistically significant positive. I asked CRO expert Stephen Pavlovich from conversion.com for his view on this, and he said:

    Most of the time, we want to see if making a change will improve performance. If we change our product page layout, will the order conversion rate increase? If we show more relevant product recommendations, will the Average Order Value go up?

    But it’s also possible that we will run an AB test not to improve performance, but instead to minimize risk. Before we launch our website redesign, will it lower the order conversion rate? Before we put our prices up, what will the impact be on sales?

    In either case, there may be a desire to deploy the new variation – even if the AB test wasn’t significant.


    If the business supports the website redesign, it can still be launched even without a significant impact on orders – it may have had significant financial and emotional investment from the business, be a better fit for the brand, or get better traction with partners (even if it doesn’t move the needle in on-site conversion rate). Likewise, if the price increase didn’t have a positive/negative effect on sales, it can still be launched.

    Most importantly, we wouldn’t just throw away a winning SEO test that reduced conversion rate or a winning conversion rate test that negatively impacted search performance. Both of these tests would have come from underlying hypotheses, and by reaching significance, would have taught us something. We would take that knowledge and take it back as input into the next test in order to try to capture the good part without the associated downside.

    All of those details, though, don’t change the underlying calculus that this is an important process, and one that I believe we are going to need to do more and more.

    The future for effective, accountable SEO

    There are two big reasons that I believe that the kind of approach I have outlined above is going to be increasingly important for the future of effective, accountable SEO:

    1. We’re going to need to do more testing generally

    I talked in a recent Whiteboard Friday about the surprising results we are seeing from testing, and the increasing need to test against the Google black box:

    I don’t see this trend reversing any time soon. The more ML there is in the algorithm, and the more non-linear it all becomes, the less effective best practices will be, and the more common it will be to see surprising effects. My colleague Dom Woodman talked about this at our recent SearchLove London conference in his talk A Year of SEO Split Testing Changed How I Thought SEO Worked:

    2. User signals are going to grow in importance

    The trend towards Google using more and more real and implied user satisfaction and task-completion metrics means that conversion-centric tests and hypotheses are going to have an increasing impact on search performance. (If you haven’t yet read this fascinating CNBC article that goes behind the scenes on the search quality process at Google, I highly recommend it.) Hopefully there is an additional opportunity here: in theory, winning tests will sync up more and more – what’s good for users will actually be what’s good for search – but the methodology I’ve outlined above is the only way I can come up with to tell for sure.

    I love talking about all of this, so if you have any questions, feel free to drop into the comments.

    Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

    Marketing Lessons Learned from 16 Years of Building Moz – Whiteboard Friday

    Friday, April 20th, 2018

    Posted by randfish

    The lessons Rand has learned from building and growing Moz are almost old enough to drive. From marketing flywheels versus growth hacks, to product launch timing, to knowing your audience intimately, Rand shares his best advice from a decade and a half of marketing Moz in today’s edition of Whiteboard Friday.


    Video Transcription

    Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we are going to chat about some of the big lessons learned for me personally building this company, building Moz over the last 16, 17 years.

    Back in February, I left the company full-time. I’m still the Chairman of the Board and contribute in some ways, including an occasional Whiteboard Friday here and there. But what I wanted to do as part of this book that I’ve written, that’s just coming out April 24th, Lost and Founder, is talk about some of the elements in there, maybe even give you a sneak peek.

    If you’re thinking, “Well, what are the two or three chapters that are super relevant to me?” let me try and walk you through a little bit of what I feel like I’ve taken away and what I’m going to change going forward, especially stuff that’s applicable to those of us in web marketing, in SEO, and in broader marketing.

    Marketing flywheels > growth hacks

    First off, marketing flywheels, in my experience, almost always beat growth hacks. I know that growth hacks are trendy in the last few years, especially in the startup and technology worlds. There’s been this sort of search for the next big growth hack that’s going to transform our business. But I’ve got to be honest with you. Not just here at Moz, but in all of the companies that I’ve had experience with as a marketer, this tends to be what that looks like when it’s implemented.

    So folks will find a hack. They’ll find some trick that works for a little while, and it results in this type of a spike in their traffic, their conversions, their success metrics of whatever kind. So they’ve discovered a way to game Facebook or they found this new black hat trick or they found this great conversion device. Whatever it is, it’s short term and short lasting. Why is this? It tends to be because of something Andrew Chen calls – and I’ll use his euphemism here – it’s called the “Law of Shitty Click-through Rates,” which essentially says that over time, as people get experienced with a sort of marketing trend, they become immune to its effects.


    You can see this in anything that sort of tries to hack at consciousness or take advantage of psychological biases. So you get this pattern of hack, hack, hack, hack, and then none of the hacks you’re doing work anymore. Even if you have a tremendously successful one, even if this is six months in length, it tends to be the case that, over time, those diminish and decline.

    Conversely, a marketing flywheel is something that you build that generates inertia and energy, such that each effort and piece of energy that you put into it helps it spin faster and faster, and it carries through. It takes less energy to turn it around again and again in the future after you’ve got it up and spinning. This is how a lot of great marketing works. You build a brand. You build your audience. They come to you. They help it amplify. They bring more and more people back. In the web marketing world, this works really well too.


    So most of you are familiar with Moz’s flywheel, but I’ll try to give it a rough explanation here. We start down here with content ideas that we get from spending lots of time with SEOs. We do keyword research and we optimize these posts – look at Whiteboard Friday itself.

    What do we do with Whiteboard Friday? You’re watching this video, but you’ll also see the transcript below, along with the podcast version from SoundCloud, so that you can listen rather than watch if you can only do audio for some reason. Each of these little images has been cut out and placed into the text below, so that someone searching Google Images might find them and find their way to Whiteboard Friday. A few months after it goes up here, hosted with Wistia on Moz, it will be put up on YouTube.com so that people can find it there.

    So we’ve done all these sorts of things to optimize these posts. We publish them, and then we earn amplification through all the channels that we have – email, social media, certainly search engines are a big one for us. Then we grow our reach for next time.

    Early in the days, early in Moz’s history, when I was first publishing, I was writing every blog post myself for many, many years. This was tremendously difficult. We weren’t getting much reach. Now, it’s an engine that turns on its own. So each time we do it, we earn more SEO ranking ability, more links, more other positive ranking signals. The next time we publish content, it has an even better chance of doing well. So Moz’s flywheel keeps spinning, keeps getting faster and faster, and it’s easier and easier. Each time I film Whiteboard Friday, I’m a little more experienced. I’ve gotten a little better at it.

    Flywheels come in many different forms

    Flywheels come in a lot of forms. It’s not just the classic content and SEO one that we’re describing here, although I know many of you who watch Whiteboard Friday probably use something similar. Press and PR is a big one that many folks use. I know companies that are built primarily on event marketing, and they have that same flywheel going for them. Folks have found these in advertising, in influencer-focused marketing, and in community and user-generated content. All of these are ways to build flywheels.

    Find friction in your flywheels

    If and when you find friction in your flywheel, like I did back in my early days, that’s when a hack is really helpful. If you can get a hack going to grow reach for next time, for example, in my early days, this was all about doing outreach to folks in the SEO space who were already influential, getting them to pay attention and help amplify Moz’s content. That was the hack that I needed. Essentially, it was a combination of the Beginner’s Guide to SEO and the Search Ranking Factors document, which I’ve described here. But that really helped grow reach for next time and made this flywheel start spinning in the way that we wanted. So I would urge you to favor flywheels over hacks.

    Marketing an MVP is hard

    Second one, marketing an MVP kind of sucks. It’s just awful. Great products are rarely minimum viable products. The MVP is a wonderful way to build. I really, really like what Eric Ries has done with that movement, where he’s taken this concept of build the smallest possible thing you can that still solves the user’s problem, the customer’s problem and launch that so that you can learn and iterate from it.

    I just have one complaint, which is if you do that publicly, if you launch your MVP publicly and you’re already a brand that’s well known, you really hurt your reputation. No one ever thinks this. No one ever thinks, “Gosh, you know, Moz launched their first version of new tool X. It’s pretty terrible, but I can see how, with a few years of work, it’s going to be an amazing product. I really believe in them.” No one thinks that way.

    What do you think? You think, “Moz launched this product. Why did they launch it? It’s kind of terrible. Are they going downhill? Do they suck now? Maybe I should trust their other tools less.” That’s how most people think when it comes to an MVP, and that’s why it’s so dangerous.


    So I made this silly chart here. But if the quality goes from crap to best in class and the amplification worthiness goes from zero to viral, it tends to be the case that most MVPs are launching way down here, when they’re barely good enough and thus have almost no amplification potential and really can’t do much for your marketing other than harm it.

    If you instead build that MVP internally, test with your beta group, and wait until it gets all the way up to this quality level of, “Wow, that’s really good,” and lots of people who are using it say, “Gosh, I couldn’t live without this. I want to share it with my friends. I want to tell everyone about this. Is it okay to tell people yet?” – maybe it’s starting to leak – now you’re up here. Now your launch can really do something. We have seen exactly that happen many, many times here at Moz, both with MVPs we launched early and with products we sat on until they were ready. I talk about some of these in the book.

    MVPs, great to test internally with a private group. They’re also fine if you’re really early stage and no one has heard of you. But MVPs can seriously drag down reputation and perception of a brand’s quality and equity, which is why I generally recommend against them, especially for marketing.

    Living the lives of your customer/audience is a startup + marketing cheat code

    Last, but not least, living the lives of your customers or your audience is a cheat code. It is a marketing and startup cheat code. One of the best things that I have ever done is to say, “You know what? I am not going to sequester myself in my office dreaming up this great thing I think we should build or I think that we should do. Instead, I’m going to spend real time with our customers.”


    So you might remember, at the end of 2013, I did this crazy project with my friend, Wil Reynolds, who runs Seer Interactive. They’re an SEO agency based here in the United States, in Philadelphia and San Diego. They do a lot more than SEO. Wil and I traded houses. We traded lives. We traded email accounts. I can’t tell you how weird it is answering somebody’s email, replying to Wil’s mom and being like, “Oh, Mrs. Reynolds, this is actually Rand. Your son, Wil, is answering my email off in Seattle and living in my apartment.”


    That experience was transformational for me, especially after having gone through the pain of building something that I had conceptualized myself but hadn’t validated and hadn’t even come up with the idea from real problems that real people were facing. I had come up with it based on what I thought could grow the company. I seriously dislike ideas that come from that perspective now.

    So since then, I just try not to assume. I try not to assume that I know what people want. When we film a Whiteboard Friday, it is almost always on a topic that came from someone I have met and talked to – over email, over Twitter, or in person at an event or a conference. They’ve said, “I’m struggling with this.” I go, “I can make a Whiteboard Friday to help them with that.” That’s where these content ideas come from.

    When I spend time with people doing their jobs, I learn. I was just in San Diego a little while ago, meeting with a couple of agencies down there: spending time in their offices, showing off a new links tool, getting all their feedback, seeing what they do with Open Site Explorer and Ahrefs and Majestic, doing their work with them, and actually experiencing their pain points. I think this right here is the product and marketing cheat code. If you spend time with your audience experiencing their pain points, then the copy you write, what you design, where you place it, who you try to get to influence and amplify it, and how you serve them – whether that’s through content, advertising, or events – will all improve if you live the lives of your customers and their influencers.


    All right, everyone. Hope you’ve enjoyed this edition of Whiteboard Friday. If you have feedback on this or if you’ve read the book and checked that out and you liked it or didn’t like it, please, I would love to hear from you. I look forward to your comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

    Video transcription by Speechpad.com


    Apex NC Kitchen Remodel by Asbury Remodeling and Construction

    Saturday, April 16th, 2016

    Check out this awesome kitchen remodel project from Asbury Remodeling

    http://www.asburyremodeling.com/ – 919-904-4548

    Watch this video on YouTube here – https://www.youtube.com/watch?v=6Os6Nl62bM8

    Asbury Remodeling & Construction, LLC

    1002 Towhee Drive

    Apex, NC 27502


    Check Out This Incredible Raleigh Bungalow Remodel and Addition

    Monday, January 25th, 2016

    1920 Raleigh Bungalow Remodel and Addition. Client comments:

    “Very, very good job.”

    “I’ve worked with a lot of contractors in my life and I’d have to say that, hands down, this is the best group I’ve ever worked with.”

    “Personable, paid a lot of attention to details.”

    “I like how Damon gets excited about the little things.”

    “I appreciate the attention to detail. I appreciate the friendliness. I really like that I was kept in the loop about decisions and choices.”

    “They went out of their way to make me feel like I was part of the process and that’s rare.”

    “I am very happy with the work that we got. I couldn’t have asked for a better experience.”

    Asbury Remodeling & Construction, LLC

    1002 Towhee Drive

    Apex, NC 27502

    919-904-4548

    http://www.asburyremodeling.com/

    https://www.youtube.com/watch?v=oP4XMEqHz2U

    https://www.youtube.com/user/AsburyRemodeling

    Moz’s Acquisition of SERPscape, Russ Jones Joining Our Team, and a Sneak Peek at a New Tool

    Friday, August 28th, 2015

    Posted by randfish

    Today, it’s my pleasure to announce some exciting news. First, if you haven’t already seen it via his blog post, I’m thrilled to welcome Russ Jones, a longtime community member and great contributor to the SEO world, to Moz. He’ll be joining our team as a Principal Search Scientist, joining the likes of Dr. Pete, Jay Leary, and myself as a high-level individual contributor on research and development projects.

    If you’re not familiar with Mr. Jones’ work, let me embarrass my new coworker for a minute. Russ:

    • Was Angular’s CTO after having held a number of roles with the company (previously known as Virante)
    • Is the creator of not just SERPscape, but the keyword data API, Grepwords, too (which Moz isn’t acquiring—Russ will continue operating that service independently)
    • Runs a great Twitter profile sharing observations & posts about some of the most interesting, hardcore-nerdy stuff in SEO
    • Operates The Google Cache, a superb blog about SEO that’s long been on my personal must-read list
    • Contributes regularly to the Moz blog through excellent posts and comments
    • Was, most recently, the author of this superb post on Moz comparing link indices (you can bet we’re going to ask for his help to improve Mozscape)
    • And, perhaps most impressively, replies to emails almost as fast as I do 🙂

    Russ joins the team in concert with Moz’s acquisition of a dataset and tool he built called SERPscape. SERPscape contains data on 40,000,000 US search results and includes an API capable of querying loads of interesting data about what appears in those results (e.g. the relative presence of a given domain, keywords that particular pages rank for, search rankings by industry, and more). For now, SERPscape is remaining separate from the Moz toolset, but over time, we’ll be integrating it with some cool new projects currently underway (more on that below).

    I’m also excited to share a little bit of a sneak preview of a project that I’ve been working on at Moz that we’ve taken to calling “Keyword Explorer.” Russ, in his new role, will be helping out with that, and SERPscape’s data and APIs will be part of that work, too.

    In Q1 of this year, I pitched our executive team and product strategy folks for permission to work on Keyword Explorer and, after some struggles (welcome to bigger company life and not being CEO, Rand!), got approval to tackle what I think remains one of the most frustrating parts of SEO: effective, scalable, strategically-informed keyword research. Some of the problems Russ, I, and the entire Keyword Explorer team hope to solve include:

    • Getting more accurate estimates around relative keyword volumes when doing research outside AdWords
    • Having critical metrics like Difficulty, Volume, Opportunity, and Business Value included alongside our keywords as we’re selecting and prioritizing them
    • A tool that lets us build lists of keywords, compare lists against one another, and upload sets of keywords for data and metrics collection
    • A single place to research keyword suggestions, uncover keyword metrics (like Difficulty, Opportunity, and Volume), and select keywords for lists that can be directly used for prioritization and tactical targeting
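    As a thought experiment, those metrics could be folded into a single prioritization score along these lines (purely hypothetical; this is not how Keyword Explorer actually scores keywords, and the 0–100 metric scales and example values are my own assumptions):

```python
def priority_score(volume: float, difficulty: float,
                   opportunity: float, business_value: float) -> float:
    """Toy score: reward volume, opportunity and value; penalize difficulty.
    All inputs assumed on a 0-100 scale."""
    return (volume * opportunity * business_value) / max(difficulty, 1)

# Hypothetical candidate keywords: (volume, difficulty, opportunity, value).
candidates = {
    "seo software": (60, 80, 40, 90),
    "keyword research tool": (45, 55, 60, 85),
}
ranked = sorted(candidates, key=lambda kw: -priority_score(*candidates[kw]))
print(ranked)  # with these inputs, "keyword research tool" ranks first
```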

    You can see some of this early work in Dr. Pete’s KW Opportunity model, which debuted at Mozcon, in our existing Keyword Difficulty & SERP Analysis tool (an early inspiration for this next step), and in a few visuals below:

    BTW: Please don’t hold the final product to any of these; they’re not actual shots of the tool, but rather design comps. What’s eventually released almost certainly won’t match these exactly, and we’re still working on features, functionality, and data. We’re also not announcing a release date yet. That said, if you’re especially passionate about Keyword Explorer, want to see more, and don’t mind giving us some feedback, feel free to email me (rand at moz dot com), and I’ll have more to share privately in the near future.

    But, new tools aren’t the only place Russ will be contributing. As he noted in his post, he’s especially passionate about research that helps the entire SEO field advance. His passion is contagious, and I hope it infects our entire team and community. After all, a huge part of Moz’s mission is to help make SEO more transparent and accessible to everyone. With Russ’ addition to the team, I’m confident we’ll be able to make even greater strides in that direction.

    Please join me in welcoming him and SERPscape to Moz!
