It’s November, which means it’s time for my annual SEO Playbook for the coming year. Thanks to Google’s Hummingbird, Internet marketers and search engine optimization professionals are thinking a lot about content and authority, as well as what this new algorithm means for keyword relevance and link building. At the same time, SEOs seem to have a difficult time figuring out what works for Bing. This year, I’m making my playbook a two-parter. Today, I’ll discuss Hummingbird and content strategy. Next month, I’ll dive into social, local, mobile, international and the other search verticals. Let’s dig in.
Thanks to Google itself and a lot of very smart researchers, we know a great deal about Google’s search algorithm. However, Google reveals only what it wants search marketers to know and keeps the rest confidential. In reality, everything that happens between submitting a query and seeing the search results is a black box.
What We Know Comes From Three Sources
1. What Google Tells Us
Google wants to be genuinely helpful. It provides plenty of honest, valuable information that is critical to effective SEO. Google’s vision aligns well with good marketing practices, so it’s worth paying attention to its advice. Another reason to listen: where Google once preached best practices only to get trounced by manipulative optimization techniques, the search engine has become adept at identifying and punishing websites that spam, manipulate ranking authority, publish low-quality content or otherwise breach Google’s terms of service.

On the flip side, and at the risk of sounding a bit tin hat, assume that anytime a Google spokesperson reveals something about the algorithm or suggests a best practice, it’s done with an agenda in mind. Google wants to promote the company’s vision of providing certain types of search results and preventing others. While creating an algorithm that yields the best possible results is crucial, Google uses its own form of propaganda to convince companies to create the types of content it wants to feature (in terms of topics, writing quality and crawlability). Spokespeople are trained and well armed with what they will and will not say. It’s the perfect example of a company whose employees walk into a room with individual ideas and opinions, then walk out parroting only the agreed-upon message.
2. Experience And Observation
Much of what we know about improving search engine rankings comes from experience and observation. This is why I refer to SEO as a craft: part science and part art. As SEOs, we depend on each other’s positive and negative experiences to discover trends and understand their significance. For example, understanding what types of content earn the most links or social mentions is less about rigorous testing and more about learning from personal and shared experience. In the past, much observation was aimed at beating Google’s organic search algorithm, which gave rise to techniques like link wheels, content networks and content spinning. Going forward, observation will be less about trickery and more about best practices. People are looking for answers to questions such as what types of topics work best for B2B or B2C sites, how to make product pages “rank-worthy,” or how to track visitors without hurting SEO.
3. Correlation Studies
Correlation tests are fascinating. Researchers scour Google’s statements about search, along with SEOs’ observations, for whatever they can quantify. They’ll go through HTML, social media sites, content patterns and more looking for anything they can isolate and measure. They organize all the possible signals, gather measurements on hundreds or thousands of data points, compare them to ranking results or site traffic and then publish the results. Even with the obvious caveat that correlation does not guarantee causation, it’s easy to understand why correlation study insights are popular and valuable.
Let’s look at Hummingbird, Google’s new algorithm. Although it affected 90% of search results, Google implemented Hummingbird — an entire replacement of the algorithm — a month before notifying us, and no one noticed. One reason SEOs may not have noticed is that the biggest changes happened to long-tail queries. The long tail makes up the bulk of search queries, yet each individual query is low frequency. Due to low search volume, plus Google’s (not provided) referral string, it’s easy for changes in results, on a site-by-site basis, to fly under the radar. This meshes with what Google says about its new emphasis on matching the meanings of phrases with concepts rather than just matching the individual words in a query to documents. I think the real reason people didn’t notice Hummingbird coming online is that while it is a totally new algorithm, it mostly matches the old one. When we refer to the Google search algorithm, we really mean a collection of formulas. Every year, Google makes hundreds of changes to the algorithm. Imagine these piling up year after year: some rewrite old sections, some are entirely new additions. As changes build up, the entire collection becomes more complex and less optimized.
It makes sense that Google would reorganize all the pieces and combine many formulas to optimize, organize and clear the path for future improvements. Let’s call it housekeeping or spring cleaning. That’s conjecture on my part, but I don’t think it’s a big stretch, especially considering Google makes no algorithm updates without rigorous testing. Another goal of Hummingbird is to make it easier to add and test future updates to the algorithm. In any environment, one can only test a limited number of things before it becomes too messy to evaluate the results. I’m sure Google has a long list of improvements it wants to make, and Hummingbird is designed with that list in mind. Don’t be surprised if Hummingbird helps Google make future changes more quickly and in greater numbers. So while Hummingbird adds some new capabilities — apparently, in how well it identifies and handles context — I expect it’s mostly reorganization, optimization and laying the groundwork for future improvements.
Taking Advantage of Today’s Search Engine Algorithms
From an SEO’s point of view, we’re interested in content and authority.
When it comes to content, using concepts, making each concept stand out and employing a four-point content strategy are key.
Use Phrases That Match How People Search
The low-hanging fruit is to go through your current content and make sure it uses phrases that match how people think and search. For example, instead of Centurylink – Quick Bill Pay, you might try Pay Your Centurylink Bill Online. (Centurylink actually has a help page for Pay Your Bill Online; but you get the idea.) A lot of sites, especially enterprise websites, use shorthand that emphasizes keywords in titles and headlines rather than natural phrasing. Test more complete titles to see if they increase traffic. Abbreviated phrasing, popular among websites, may become a hindrance as search engines get better at processing natural language.
Make Every Piece Stand On Its Own
Focus each piece of content on a specific concept. A popular SEO content strategy is to write a tent pole page for a key word or phrase, then publish additional content that links to the tent pole page with optimized anchor text. While this can still be effective, be certain each page of supporting content focuses on its own unique concept. If you simply write about the same topic five different ways and link them all to the tent pole article in hopes of making it rank, you risk making it difficult for search engines to determine the most relevant document. Using project management as an example, you may have a tent pole page for Project Management Best Practices. While it may be impossible for supporting pages to avoid all overlap, they should have clear separation and stand well on their own. You might include articles like Agile Project Management Basics and Waterfall Project Management Basics.
Don’t support Project Management Best Practices with articles like Project Management 101 or Project Management Basics, as these are essentially the same thing. Before, these separate pages could each target a different keyword, but Google now knows, or is getting better at knowing, when pages cover the same concept. As search engines move away from words and phrases to concepts, revisiting the same topic again and again with different keywords becomes less effective. In fact, one has to wonder if this type of strategy will trigger future versions of Panda, since it’s not exactly a good user experience.
Employ A Four-Point Content Strategy
When you think of organic search traffic as a secondary benefit created by good content and popularity, it’s easy to see how a four-point content strategy, with each area serving its own objective, makes sense. While it would be nice for one size to fit all, there may be a fifth or even sixth content area that makes sense for your business. Many Web businesses perform well with the first three areas, but too many sites lack Education, Information and Resources. These are the areas where sites can target keywords and earn authority while growing awareness and trust. When properly executed, this feeds both SEO and the top of the marketing funnel, which brings us to authority.
Links continue to reign supreme as the ultimate SEO authority signal. Over the last three years, both search engines and SEO professionals have had a lot to say about social media authority signals. While social is important and the search engines are moving in this direction, it has failed to achieve its promise, and Google’s spokespeople seem to have retreated from their initial enthusiasm. It appears things aren’t progressing as well as the search engines would like. I continue to advocate a strong social media presence on the major sites (Facebook, Google+, LinkedIn, Pinterest, Twitter and YouTube) that make sense for your business. Participate on smaller networks and niche sites, too, if you can reach target audiences with them. However, do not substitute social media for link building. Keep reaching out to content authors and publishers; ask them to reference and link to your articles and content. Generate a steady stream of link-worthy content.
Social Media => Audience Building
Relationship Building & Asking => Link Building
Don’t assume tweets and Facebook likes will result in links. Most social media participants are not content publishers; depending on your industry or niche, that may even include the social media influencers. If the only time your business gets links is when the news media covers your latest product release, should you be so lucky, you will lose the authority battle to your competitors. You may have heard that Google prefers brands and assigns authority based on brand mentions in social media. I’m sure Google keeps a vast database of brand names and has a method to identify brands emerging into popularity. But if you do not get a steady stream of tweets every time you publish a blog article or press release, chances are good your brand is too small or too weak to earn significant ranking authority this way. I’d categorize this as nice to know, then get on with earning readers and links.
Make sure you put an intentional link-building program in place.
Architecture & URLs
At this point, I expect my recommendations for Web architecture to stay the same into the future. Use a simple, logical architecture: one that looks like an organization chart. Unless yours is a massive enterprise or media site, all pages should be accessible within four clicks of the homepage. The same goes for URLs; I anticipate no changes. Keep them simple and readable for people. Minimize or eliminate parameters; if you do use them, keep them in a consistent order.
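The four-click guideline is easy to audit programmatically. Here’s a minimal sketch in Python: the site map below is a made-up example, and `click_depths` is a hypothetical helper (not part of any SEO tool), but the approach — a breadth-first search over your internal-link graph — is the standard way to measure click depth from the homepage:

```python
from collections import deque

def click_depths(links, home="/"):
    """Breadth-first search over an internal-link graph.

    Returns each reachable page's click depth from the homepage.
    `links` maps a page URL to the list of pages it links to.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest click path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site shaped like an organization chart
site = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widgets"],
    "/blog": ["/blog/2014/seo-playbook"],
}
depths = click_depths(site)
too_deep = [page for page, depth in depths.items() if depth > 4]
```

Run this over a crawl of your own site; any page landing in `too_deep` is a candidate for better internal linking.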
Last summer, Google made a point of saying not to worry about duplicate content. However, Google is not the only search engine out there, so ignore canonicalization issues at your peril. Continue to make sure each page has one unique URL, or use canonical directives to indicate the correct URL. If you do use parameters, check your Webmaster Tools settings in both Google and Bing, and keep those parameters in the same order in all your links.
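Consistent parameter order is easiest to enforce at link-generation time. As a sketch, here’s a small Python normalizer built on the standard library’s `urllib.parse`. The fixed parameter order and the dropping of tracking parameters are illustrative choices for a hypothetical site, not anything Google or Bing prescribes:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical: the only parameters this site considers meaningful,
# in the one order we want every internal link to use.
PARAM_ORDER = ["category", "color", "size"]

def canonicalize(url):
    """Lowercase the host, drop unknown/tracking parameters,
    strip the fragment, and re-emit query parameters in PARAM_ORDER."""
    parts = urlsplit(url)
    params = dict(parse_qsl(parts.query))
    kept = [(key, params[key]) for key in PARAM_ORDER if key in params]
    return urlunsplit(
        (parts.scheme, parts.netloc.lower(), parts.path, urlencode(kept), "")
    )

clean = canonicalize("https://Example.com/shoes?size=9&utm_source=feed&color=red")
```

Routing every internal link through one helper like this keeps parameter order identical everywhere, so crawlers see a single URL per page instead of several permutations.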
Source : searchengineland.com/2014-seo-playbook-part-1-hummingbird-175860