SEO is dead, SMO is king!

“SMO” – social media optimization – is the new SEO. Think back five years, when you’d try to explain SEO to clients and no one would get it. Now SMO is in its place, facing the same challenge. With the integration of social and search (Facebook + Bing), SMO has risen to the top.

Over the past five years, Web publishing has been so heavily dominated by search engine optimization (SEO) that, to many publishing executives, the right keywords have become far more important than their sites’ actual content or audience. But this movement toward SEO has been dangerous: it has pulled publishers’ eyes off their most important job, creating great content, and onto the false goals of keywords, hacks, paid links, and technical engineering that their audience doesn’t know or care about.

Even venerable publishers like Forbes have traded in their leadership legacy to chase the Huffington Post pufferfish strategy of filling up Google’s database with more posts, more frequency, and more low-cost content, while stalwarts like Time Inc. (NYSE: TWX) are still chasing SEO basics like getting keywords into their URLs.

But the recent announcement of the Facebook/Bing partnership to integrate social and search results clearly marks the beginning of the end of SEO, and the smartest digital publishers will drop everything to rethink their distribution strategy entirely.

With the rise of Facebook, we’ve entered a new era of digital media: personalized discovery. The balance of power is shifting: already, Wetpaint and other publishers are seeing more of their audience arrive from Facebook than from search.

Search was critical when answers to questions were scarce. Google (NSDQ: GOOG) can find an answer to almost any keyword query from among the zillions of pages on the web. But at a time when such answers are abundant, it’s far more valuable to find the best content for me – and increasingly, find it before I’ve even asked for it. The sort algorithm that works best for that is more correlated to who’s doing the asking than how they would phrase the ask.
For that level of personalized results, no algorithm can keep up without deep knowledge of its users. Advantage: Facebook.

The encouraging implication is that the audience values content, not keywords. And Facebook sides with the audience. So it’s time to christen a new era of social-media optimization, or “SMO.” The era of SMO liberates publishers from the exercise of tricks, hacks and keywords. Instead, the big opportunity is once again creating and refining the most appealing content possible.

Imagine that.

SMO recognizes that Facebook already has the best position to introduce content to users. Already, audiences are using Facebook as the news interface to their favorite sources (both media titles and their friends) in a way that Google News hasn’t cracked the code on; products like Flipboard that take this to the next level are captivating.

As Facebook takes its immense database of “Likes” and pivots it to inform search results, there’s no question that it will have a huge advantage in delivering a better result set for almost every user. It simply knows more.

SMO strategy means appealing to the audience, not an intermediary; knowing what drives interest; and activating people’s desire to consume and share. Sure, there is buzz among many publishers around Facebook logins and likes, and the traffic bumps that come with them. But SMO offers far more than that. It’s about creating a positive feedback loop, where users are rewarded for both consuming and distributing content. The key is to develop virality in media like that of Zynga games and Groupon offers. Beyond, of course, creating great content and experiences that are worth sharing, publishers need to then reward their audiences with the full range of possibilities, including prestige, access, exclusive content and enhanced experiences.

For those who are still working on implementing search strategies: if you haven’t turned your focus to SMO, you will be left behind as the allure of gaming search engines fades into the past.

Brands Hiding on Google

YORK, Pa. — If a consumer types a brand name into the Google search box, a home-page link should — and likely will — appear as one of the top listings.

But does the same thing happen when typing in a generic keyword relevant to that business? Say, “home repair” for Home Depot or “gifts” for Harry & David? That depends on how well they’re optimized for Google. And in the case of those two examples, Home Depot and Harry & David website links don’t even make it to the first page of Google, according to a recent study by Covario that evaluated the search-engine optimization health of 100 branded websites.

30 SEO Problems & Tools

#1 – Generating XML Sitemap Files

The Problem: XML Sitemap files can be challenging to build, particularly as sites scale over a few hundred or few thousand URLs. SEOs need tools to build these, as they can substantively add to a site’s indexation and potential to earn search traffic.

Tools to Solve It: GSiteCrawler, Google Sitemap Generator

#2 – Tracking the Virality of Blog/Feed Content

The Problem: Even experienced bloggers have trouble predicting which posts will “go wide” and which will fall flat. To improve your track record, you need historical data to help show you where and how your posts are performing in the wild world of social media. What’s needed is a cloud-based tracking tool that can sync up with the Twitters, Facebooks, Diggs, Reddits, Stumbleupons & Delicious’ of the web to provide these metrics in an easy-to-use, historical view.

Tools to Solve It: PostRank Analytics

#3 – Comparing the Relative Traffic Levels of Multiple Sites

The Problem: We all want to know not only how we’re doing with web traffic, but how it compares to the competition. Free services like Alexa have well-documented accuracy problems, and paid services like Hitwise, Comscore & Nielsen cost an arm and a leg (and even then don’t perform particularly well with sites in the sub-million-visits/month range).

Tools to Solve It: Quantcast, Google Trends for Websites

Google Trends for Websites allows you to plug in domains and see traffic levels. Much like AdWords Keyword Tool, the numbers themselves seem to run high, but the comparison often looks much better. Google Trends has become the only traffic estimator I trust – still only as far as I could throw a Google Mini, but better than nothing.

#4 – Seeing Pages the Way Search Engines Do

The Problem: Every engineering & development team builds web pages in unique ways. This is great for making the Internet an innovative place, but it can make for nightmares when optimizing for search engines. As professional SEOs, we need to be able to see pages, whether in development environments or live on the web, the same way the engines do.

Tools to Solve It: SEO-Browser, Google Cached Snapshot, New Mozbar

SEO-Browser is a great way to get a quick sense of what the engines can see as they crawl your site’s pages and links. The world of engines may seem a bit drab, but it can also save your hide in the event that you’ve put out code or pages that engines can’t properly parse.

#5 – Identifying Crawl Errors

The Problem: Discovering problems on a site like 302 redirects (that should be 301s), pages that are blocked by robots.txt (here’s why that’s a bad idea), missing title tags, duplicate/similar content, 40x and 50x errors, etc. is a task no human can efficiently perform. We need the help of robots – automated crawlers who can dig through a site, find the issues and notify us.

Tools to Solve It: GSiteCrawler, Xenu, Google Webmaster Tools
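The triage step these crawlers automate can be sketched in a few lines. The following assumes a crawler has already collected (url, status code, title) tuples; the sample data is made up for illustration, and a real tool would of course check many more conditions (duplicate content, robots.txt blocks, and so on).

```python
# Bucket crawl results into the problem categories described above.
def triage(results):
    """results: iterable of (url, http_status, title). Returns (url, issue) pairs."""
    issues = []
    for url, status, title in results:
        if status == 302:
            issues.append((url, "302 redirect that should probably be a 301"))
        elif 400 <= status < 500:
            issues.append((url, "4xx client error"))
        elif status >= 500:
            issues.append((url, "5xx server error"))
        if not title:
            issues.append((url, "missing title tag"))
    return issues

# Hypothetical crawl output.
sample = [
    ("/home", 200, "Home"),
    ("/old-page", 302, "Old Page"),
    ("/broken", 404, ""),
]
problems = triage(sample)
```

Running this over a full crawl gives you the same kind of punch list the tools above produce, just without the crawling itself.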

#6 – Determining if Links to Your Site Have Been Lost

The Problem: Sites don’t always do a great job maintaining their pages and links (according to our data, 75% of the web disappears in 6 months). Many times, these vanishing pages and links are of great interest to SEOs, who want to know whether their link acquisition and campaigning efforts are being maintained. But how do you confirm if the links to your site that were built last month are still around today?

Tools to Solve It: Virante’s Link Atrophy Diagnosis

#7 – Finding 404 Errors on a Site (without GG WM Tools) and Creating 301s

The Problem: Google’s Webmaster Tools are great for spotting 404s, but the data can be, at times, unwieldy (as when thousands of pages are 404ing, but only a few of them really matter) and it’s only available if you can get access to the Webmaster Tools account (which can stymie plenty of SEOs in the marketing department or from external consultancies). We need a tool to help spot those important, highly linked-to 404s and turn them into 301s.

Tools to Solve It: Virante’s PageRank Recovery Tool
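Once you know which 404s matter, the fix itself is mechanical: map each dead URL to its best replacement and emit server redirect rules. This sketch generates Apache `Redirect 301` directives for an .htaccess file; the URL mapping is hypothetical, and other servers (nginx, IIS) have equivalent syntax.

```python
# Turn a mapping of highly linked-to 404 paths into Apache 301 rules.
def redirect_rules(mapping):
    """mapping: old_path -> new_path. Returns .htaccess Redirect 301 lines."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

# Hypothetical dead pages and their replacements.
rules = redirect_rules({
    "/old-post": "/blog/new-post",
    "/2009/widgets": "/products/widgets",
})
```

Each 301 passes most of the old page’s link value to the destination, which is exactly the “PageRank recovery” the tool’s name alludes to.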

Natural Link Love

(1) Don’t Take Yourself for Granted

A while back, I heard George Wright speak about how he came up with the idea for the “Will It Blend?” videos. The short version is that he was touring the production facility when he came across a bunch of QA engineers running crazy things through Blendtec blenders. He was amazed by what he saw, but they took it for granted (blending two-by-fours was their job, after all). I love this story, because it’s so applicable to any business. There’s something about your product or service that is amazing, but because you see it every day, you take it for granted. Put down your mission statement and PowerPoint slides and see your product through your customers’ eyes. If you can’t, go find a fresh perspective.

You have a story worth telling, even if you don’t know it. If you think your industry is too “boring” for link-bait, then you’re not trying hard enough. As my Dad likes to say, only boring people get bored.

(2) Be Careful Who You Love

Low-quality links are attractive because they seem easy, but are they really? Let’s look at some hypothetical times to build one link based on common tactics:

  • Directory Submissions – 15-30 minutes
  • Article Marketing – 15-60 minutes
  • Email Link Requests – 60-90 minutes

You might balk at that last one – finding and emailing one prospect should only take a couple of minutes, right? Ok, but what’s your conversion rate on those emails, maybe 1-2%? Let’s say it takes you 1 minute/email – you’re talking about 50-100 minutes to get one link back (I rounded down to be generous).

So, what would it take to build 800 links, even low-quality ones? At a very generous estimate of 15 minutes/link, you’re talking about 200 hours of work. Even counting research and testing on my target audience, I’d estimate that my checklist blog post took about 40 hours. My last e-book took 30 hours to research, write, and do layout. Do low-quality links still seem like a bargain?
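The back-of-the-envelope math above is easy to verify; the numbers below simply restate the assumptions from the text (1-2% email conversion, 1 minute per email, 15 minutes per low-quality link).

```python
# Cost of one link via email outreach, at 1 minute per email.
emails_per_link_best = 100 // 2    # 2% conversion: 50 emails (50 minutes) per link
emails_per_link_worst = 100 // 1   # 1% conversion: 100 emails (100 minutes) per link

# Cost of 800 low-quality links at a generous 15 minutes each.
minutes_for_800_links = 800 * 15   # 12,000 minutes
hours_for_800_links = minutes_for_800_links // 60   # 200 hours
```

Against 200 hours of grinding, the 40-hour blog post and 30-hour e-book look like the bargain.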

(3) Natural Link-Love Is Real

Low-quality links are superficial. What you get in return for them is a tiny bit of SEO value, driving people to content that usually isn’t strong enough to get any love on its own. Building strong content that attracts natural links does more than build SEO value. It builds a real audience and actual, in-person relationships.

In 2010, I’ve calculated that roughly 65% of my revenue can be traced back to either blogging or social media. Great content gets the attention of like-minded people and builds your brand. It almost magically makes every piece of content that comes after it stronger.

The flipside of this equation is that it takes real relationships to drive natural links. Take 50% of the time you spend building low-quality links and spend it participating – get to know the communities, blogs, and linkerati in your industry niche. Give back to those communities, and when the time comes that you have something really outstanding to share, you’ll already have an audience for it.