Local SEO: 5 Advanced Tactics for Greater Visibility

Like all other areas of SEO, local SEO is an ever-evolving entity. The last few years in particular have seen huge changes in the tools and techniques businesses need to use to rank well for location-specific searches.

Part of the problem is the advice and discussion on the topic. In 2014 in particular, most advice and discussion on local SEO centered on Google Places for Business and Google+ Local.

Now, I’m not saying that these aren’t important – that would be stupid, right? What I’m saying is that there are many other elements that seem to have been given more importance since the “Pigeon” update last year.

So, where should SEOs and business owners be focusing their efforts? Are there still “quick wins” to be had? Here are my five tips for helping you, or your clients, get more local visibility in 2015.
1. Think Visibility, Not Rankings

So much of the discussion around local rankings has centered on position rather than visibility. If your listing and website rank at the top, you'll get the majority of the traffic, right? Not necessarily.

Visibility is about controlling and dominating search real estate. If your local listing ranks top, with your website just below the fold, that might not be enough. What if your competitors are ranking with images on a visually led search and using AdWords to secure above-the-fold real estate with locally relevant extensions, reviews, and social integrations? Even if they rank below your primary Web properties, they will have far more visibility, garner more trust, get more visitors, and probably convert more.

2. Google Places for Business and Google+ Local

Having said all that, you shouldn't ignore Google's suggested channels for ranking your business. They still dominate the primary real estate in any locally oriented search. So, how do you ensure your listing is presented as often as possible, and in the best possible way?

In the past, it’s all been about reviews and citations. That much hasn’t changed. What is worth considering is the general shift toward ranking signals that can’t be gamed. This means it’s a very reasonable assumption that reviews will soon get more weight than citations.

Reviews are notoriously difficult to get. However, Google is making it far easier with every new integration it introduces between its various services. Take advantage of your audience on any Google-owned property, whether it's a Google+ community, your YouTube followers, or your Gmail contact list. If you ask people to leave a review when they're already logged into their Google account, it's literally a couple of clicks. Make it seamless and they are more likely to take two minutes to help you out.

3. Get the Basics Right

Though the standard of local SEO is generally increasing, there are still so many businesses that aren’t getting the basics right. One of the most common failings I spot is the omission of a custom meta description.

This is another area where businesses are throwing away chances to attract new visitors by focusing on their rank rather than on how they are perceived in the SERPs. Ranking number one only gets you in the shop window. If your meta description is just an automatically extracted chunk of your page's opening paragraph, and your competitors at number two and three have carefully crafted their snippets to make visitors want to click, you're going to lose out.
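
If you want to audit this at scale, a few lines of Python will flag the obvious offenders. This is a hedged sketch, not part of the original article: the URLs are hypothetical, and the 155-character limit is only an approximation of Google's display cutoff:

```python
import requests
from bs4 import BeautifulSoup

def meta_description(url):
    """Return a page's meta description, or None if it has none."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    return tag.get("content", "").strip() if tag else None

# Hypothetical pages to audit
for url in ["https://www.example.com/", "https://www.example.com/services/"]:
    desc = meta_description(url)
    if not desc:
        print(f"{url}: missing custom meta description")
    elif len(desc) > 155:  # rough limit; Google truncates by pixels, not characters
        print(f"{url}: description may be truncated ({len(desc)} chars)")
```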

Another area where businesses fail to get the most out of their ranking and listing is by ignoring the importance of schema markup. How often have you clicked on a SERP listing because it has five yellow stars next to it? Well, your customers are the same. Get these small things right and you'll see an immediate surge in local search traffic, even without your rankings improving at all.
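
Those yellow stars typically come from review markup. Here's a hedged sketch of what that markup can look like, emitted as schema.org JSON-LD from Python (the business name and rating values are hypothetical; microdata is an equally valid format):

```python
import json

# Hypothetical local business and rating values
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "reviewCount": "132",
    },
}

# Paste the output into the page's <head> to expose the rating to search engines
print('<script type="application/ld+json">\n%s\n</script>' % json.dumps(markup, indent=2))
```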

4. Citations

Yes, citations are still important. Even though they can be easily gamed, they still represent a trustworthy indicator of where a business is located. If you've got 200 citations that all tell Google you're based at the same address, in the same format, Google is going to trust that information and reflect it in its SERPs.

There are now lots of different tools for identifying and maintaining your local citations on the Web, the most recent being Moz Local. These let you manage all of your online business directory listings from one place: they will push your business details to all relevant citation sources and make sure they're always presented in exactly the same format as your Google listing.
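
The point about identical formatting can be made concrete. Here's a small illustrative sketch (mine, not from any of these tools) that normalizes name, address, and phone (NAP) data so that two differently formatted citations can be compared:

```python
import re

def _clean(text):
    """Lowercase, drop punctuation, and collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[.,]", "", text.lower())).strip()

def normalize_nap(name, address, phone):
    """Normalize a citation so pure formatting differences don't hide a mismatch."""
    return (_clean(name), _clean(address), re.sub(r"\D", "", phone))

# Hypothetical listings in two different formats
google_listing = normalize_nap("Example Plumbing Co.", "123 Main St, Springfield", "(555) 010-0199")
citation = normalize_nap("example plumbing co", "123  Main St Springfield", "555-010-0199")
print("consistent" if google_listing == citation else "mismatch -- fix this citation")
```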

5. Do Your Other Properties Rank?

As I’ve already mentioned, it’s best to approach local SEO with the aim of increasing your visibility in your key search results, rather than focusing on the rank of one or two properties. So, which other properties should you be focusing on in order to achieve this? Well, this depends on your search terms.

As you know, the perceived relevance of particular properties will vary depending on the search term itself. If it's a visually led search, you will probably get images ranking highly. If video is particularly relevant, you'll see videos high up in the SERP. One of the best ways to work this out is simply to carry out searches using your key terms and take note of what shows up. This tells you which types of media do well, and also what your competitors are doing, so you can focus your efforts on the exact properties Google deems relevant to your search. Do their Yelp listings sit below their website and Places listing? What about their social properties? Are they getting good visibility? If so, which ones?

If you can focus on developing these properties, making sure you have as much information on them as possible, you will start to see broader visibility for your search terms. Another, often ignored, way to increase the visibility of these other properties is to do everything you can to help them rank. Though this seems counterintuitive (you should focus your SEO on your own website, right?), linking to these other properties wherever possible will help increase their perceived authority and rank.

Your Google properties and company website should always receive the majority of your attention. However, by thinking more broadly about the SERPs, you're likely to increase both your visibility and your perceived trust, not just with Google, but with potential customers, too.

Source: SEW

How to Align Google Webmaster Tools US and Web Report with Google Analytics

Understanding what people are searching organically is, and has always been, a foundational aspect to forming an SEO strategy. To be without keywords is to be without intent. Google took our keywords away in Analytics and gave us search query data within Google Webmaster Tools in exchange, so it is important to understand how we can leverage this.

Historically, aligning these reports with Analytics has seemed an impossible task, leading many SEOs to believe Google's keyword data was worthless. The most prevalent issues were:

  • Numbers were rounded (in January this was changed to provide exact numbers).
  • It only provided information for the top 2,000 queries per day (if your site is under this threshold, you theoretically should have comprehensive data; if your site is over, verifying additional sub-folders can help gain more search query data).
  • It was not clear how best to match up filters to line up reports.

This article focuses on what we believe to be the biggest issue, the last bullet point above: aligning the filters of both data sources.

Tools used:

Google Webmaster Tools Search Query Report
Google Analytics Custom Reports (Import Custom reports for U.S. and Web or U.S. and Mobile here).

Step 1 – Set Up Google Webmaster Tool Report

From the Site Dashboard in Google Webmaster Tools, navigate to Search Traffic -> Search Queries. From there, here’s a quick summary of what to do next.

  • Specify date range.
  • Filter 1 – Search: Web
  • Filter 2 – Location: US

Specify Date Range

For the purpose of this article we’ll choose the last full week of data available, which is July 27 through August 2.

Add Filters

Add the filters to limit the results to only those from Web search in the United States, and click Apply. These are the changes that should be reflected:

  • Search: Web
  • Location: United States
  • Traffic: All

Step 2 – Set Up Google Analytics Report

The next step is to create a custom report in Google Analytics to match the filters specified in Google Webmaster Tools. If you choose to create your own custom report rather than importing the one linked above, follow this configuration:

  • (Include) Source / Medium (exact): google / organic
  • (Include) Country (exact): United States
  • (Exclude) Mobile (exact): Yes
  • (Exclude) Source / Medium (regex): image|video

Once the custom report has been created, be SURE to specify the same date range.

Repeat for Mobile

It’s possible to align organic reports for mobile visitors as well. Using this same method, create the U.S. Mobile custom report (import here) for Google Analytics and, generally, apply the following filters:

GWT Filters:

  • Search: Mobile
  • Location: United States
  • Traffic: All

GA Custom Report:

  • (Include) Country (exact): United States
  • (Include) Mobile (exact): Yes
  • (Include) Source / Medium (exact): google / organic
  • (Exclude) Source / Medium (regex): image|video
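
If you'd rather pull the Google Analytics side programmatically than click through the UI, here's a rough Python sketch using the legacy Core Reporting API v3; the view ID and key file are hypothetical, and the filter strings simply mirror the include/exclude rules above:

```python
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient.discovery import build

# Hypothetical service-account key file; the scope is Analytics read-only.
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    "key.json", ["https://www.googleapis.com/auth/analytics.readonly"])
service = build("analytics", "v3", credentials=credentials)

def us_google_organic(device_filter, start, end):
    """Sessions by landing page, filtered to mirror the custom reports above."""
    return service.data().ga().get(
        ids="ga:12345678",  # hypothetical GA view ID
        start_date=start,
        end_date=end,
        metrics="ga:sessions",
        dimensions="ga:landingPagePath",
        filters=("ga:sourceMedium==google / organic;"  # include: Google organic
                 "ga:country==United States;"          # include: U.S. only
                 "ga:sourceMedium!~image|video;"       # exclude: image/video search
                 + device_filter),
    ).execute()

web = us_google_organic("ga:deviceCategory!=mobile", "2014-07-27", "2014-08-02")
mobile = us_google_organic("ga:deviceCategory==mobile", "2014-07-27", "2014-08-02")
```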

Findings and Got-Yas

We’ve created a document that can be used as a template to keep track of your findings.

After following this process for a handful of eligible sites (those with fewer than 2,000 search queries a day), we found that they typically lined up surprisingly well — within 10 percent or so of a 100 percent match. Comparing the Top Pages report in Google Webmaster Tools Search Query and the GA custom report can help show just how closely the numbers are lining up.

Misattribution of image and video search, and landing pages incorrectly showing up as organic, were two very common findings. To help further align these reports, be sure to address the following two items:

  1. Properly segment Sessions from Image and Video search (AJ Kohn has a good write-up on this and Google talks about it here).
  2. Compare the Top Pages report in Google Webmaster Tools to the results of the custom report we created in this article. Often landing pages get misattributed to organic.
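
Once you have both exports, lining the numbers up is mechanical. Here's a small pandas sketch (file and column names are hypothetical) that merges the two reports by page and surfaces the worst mismatches first:

```python
import pandas as pd

# Hypothetical CSV exports: GWT Top Pages (page, clicks) and the GA custom report (page, sessions)
gwt = pd.read_csv("gwt_top_pages.csv")
ga = pd.read_csv("ga_custom_report.csv")

merged = gwt.merge(ga, on="page", how="outer").fillna(0)
denominator = merged[["clicks", "sessions"]].max(axis=1).replace(0, 1)
merged["pct_diff"] = (merged["clicks"] - merged["sessions"]).abs() / denominator * 100

# Pages with the largest discrepancy are the best place to hunt for misattribution
print(merged.sort_values("pct_diff", ascending=False).head(10))
```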

Conclusion

More than proving that Google Webmaster Tools search query data can be lined up with Google Analytics, this article is meant to provide a starting point for taking advantage of the data we do have to identify the intent of customers coming from organic search. Knowing how this data maps to Analytics is a step toward treating it as a legitimate data source.

If you have the time to test this method out, please let us know what your findings are! Together perhaps we can take full advantage of what this data may have to offer.

Author: Ben Goodsell

HTTPS Sites Secure Ranking Boosts in Google

“We want to convince you that all communications should be secure by default.”

Those were the words uttered by Webmaster Trends Analyst Pierre Far at the Google I/O event this summer, when he and a Google colleague talked “HTTPS everywhere.” And this week, Google Search is taking a very convincing stance on the matter: HTTPS is now a ranking signal in its algorithm.

From Google’s announcement:

Over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We’ve seen positive results, so we’re starting to use HTTPS as a ranking signal. For now it’s only a very lightweight signal — affecting fewer than 1% of global queries and carrying less weight than other signals such as high-quality content — while we give webmasters time to switch to HTTPS. But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.

On Google+, Webmaster Trends Analyst John Mueller answered questions from the community, like, “What if you have an informational site – does it apply to you, too?”

Mueller said this:

Some webmasters say they have “just a content site,” like a blog, and that doesn’t need to be secured. That misses out two immediate benefits you get as a site owner:

1.     Data integrity: only by serving securely can you guarantee that someone is not altering how your content is received by your users. How many times have you accessed a site on an open network or from a hotel and got unexpected ads? This is a very visible manifestation of the issue, but it can be much more subtle.

2.    Authentication: How can users trust that the site is really the one it says it is? Imagine you’re a content site that gives financial or medical advice. If I operated such a site, I’d really want to tell my readers that the advice they’re reading is genuinely mine and not someone else pretending to be me.

On top of these, your users get obvious (and not-so-obvious) benefits.

Moving a site from HTTP to HTTPS could have technical problems if not implemented carefully. Google gives tips on how to handle the move here.

And, in its help files, it also talks about best practices for setting up HTTPS, which include helping the search engines see the site as secure by following these tips (more details exist on the help page itself):

> Redirect your users and search engines to the HTTPS page or resource with server-side 301 HTTP redirects.
> Use relative URLs for resources that reside on the same secure domain.
> Use protocol relative URLs for all other domains or update your site links to link directly to the HTTPS resource.
> Use a web server that supports HTTP Strict Transport Security (HSTS) and make sure it’s enabled.
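
To make the first and last tips concrete, here's a minimal sketch of how an application might enforce both; this is my illustration rather than Google's guidance, and in production the redirect and HSTS header usually live in the web server or load balancer configuration:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Behind a proxy/load balancer, the original scheme arrives in X-Forwarded-Proto.
    if request.headers.get("X-Forwarded-Proto", request.scheme) != "https":
        return redirect(request.url.replace("http://", "https://", 1), code=301)

@app.after_request
def add_hsts(response):
    # HSTS tells browsers to use HTTPS for all future visits to this host.
    response.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return response

@app.route("/")
def index():
    return "Served securely."
```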

If you have questions or concerns, Google is directing people to the Webmaster Help Forums. For example, this search for “HTTPS” in the forums pulls up several conversations already happening on the matter. The announcement said that in the coming weeks, Google would be publishing detailed best practices on this issue.

Author: Jessica Lee

The decay and fall of guest blogging for SEO

Okay, I’m calling it: if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it’s become a more and more spammy practice, and if you’re doing a lot of guest blogging then you’re hanging out with really bad company.

Back in the day, guest blogging used to be a respectable thing, much like getting a coveted, respected author to write the introduction of your book. It's not that way any more. Here's an example of an unsolicited spam email that I recently received:

My name is XXXXXXX XXXXXXXX and I work as a content marketer for a high end digital marketing agency in [a city halfway around the world]. I have been promoting high quality content in select niches for our clients.

    We are always on the lookout for professional, high class sites to further promote our clients and when I came across your blog I was very impressed with the fan following that you have established.I [sic] would love to speak to you regarding the possibility of posting some guest articles on your blog. Should you be open to the idea, we can consider making suitable contribution, befitting to high standard of services that your blog offers to larger audience.

    On my part, I assure you a high quality article that is-
    – 100% original
    – Well written
    – Relevant to your audience and
    – Exclusive to you

    We can also explore including internal links to related articles across your site to help keep your readers engaged with other content on your blog.
    All I ask in return is a dofollow link or two in the article body that will be relevant to your audience and the article. We understand that you will want to approve the article, and I can assure you that we work with a team of highly talented writers, so we can guarantee that the article would be insightful and professionally written. We aim to write content that will benefit your loyal readers. We are also happy to write on any topic, you suggest for us.

If you ignore the bad spacing and read the parts that I bolded, someone sent me a spam email offering money to get links that pass PageRank. That’s a clear violation of Google’s quality guidelines. Moreover, we’ve been seeing more and more reports of “guest blogging” that are really “paying for PageRank” or worse, “we’ll insert some spammy links on your blog without you realizing it.”

Ultimately, this is why we can’t have nice things in the SEO space: a trend starts out as authentic. Then more and more people pile on until only the barest trace of legitimate behavior remains. We’ve reached the point in the downward spiral where people are hawking “guest post outsourcing” and writing articles about “how to automate guest blogging.”

So stick a fork in it: guest blogging is done; it’s just gotten too spammy. In general I wouldn’t recommend accepting a guest blog post unless you are willing to vouch for someone personally or know them well. Likewise, I wouldn’t recommend relying on guest posting, guest blogging sites, or guest blogging SEO as a linkbuilding strategy.

For historical reference, I’ll list a few videos and links to trace the decline of guest articles. Even back in 2012, I tried to draw a distinction between high-quality guest posts vs. spammier guest blogs:

Unfortunately, a lot of people didn’t seem to hear me say to steer away from low-quality guest blog posting, so I did a follow-up video to warn folks away from spammy guest articles:

In mid-2013, John Mueller gave spot-on advice about nofollowing links in guest blog posts. I think by mid-2013, a ton of people saw the clear trend toward guest blogging being overused by a bunch of low-quality, spammy sites.

Then a few months ago, I took a question about how to be a guest blogger without it looking like paying for links (even the question is a clue that guest blog posting has been getting spammier and spammier). I tried to find a sliver of daylight to talk about high-quality guest blog posts, but if you read the transcript you’ll notice that I ended up spending most of the time talking about low-quality/spam guest posting and guest articles.

And then in this video that we posted last month, even the question itself predicted that Google would take stronger action and asked about “guest blogging as spam”:

So there you have it: the decay of a once-authentic way to reach people. Given how spammy it’s become, I’d expect Google’s webspam team to take a pretty dim view of guest blogging going forward.

Added: It seems like most people are getting the spirit of what I was trying to say, but I’ll add a bit more context. I’m not trying to throw the baby out with the bath water. There are still many good reasons to do some guest blogging (exposure, branding, increased reach, community, etc.). Those reasons existed way before Google and they’ll continue into the future. And there are absolutely some fantastic, high-quality guest bloggers out there. I changed the title of this post to make it more clear that I’m talking about guest blogging for search engine optimization (SEO) purposes.

I’m also not talking about multi-author blogs. High-quality multi-author blogs like Boing Boing have been around since the beginning of the web, and they can be compelling, wonderful, and useful.

I just want to highlight that a bunch of low-quality or spam sites have latched on to “guest blogging” as their link-building strategy, and we see a lot more spammy attempts to do guest blogging. Because of that, I’d recommend skepticism (or at least caution) when someone reaches out and offers you a guest blog article.

Originally Posted by: Matt Cutts

Google’s Matt Cutts: We Don’t Use Twitter Or Facebook Social Signals To Rank Pages

Google’s head of search spam, Matt Cutts, released a video today answering the question, “are Facebook and Twitter signals part of the ranking algorithm?” The short answer was no.

Matt said that Google does not give any special treatment to Facebook or Twitter pages. They are, in fact, currently treated like any other page, according to Matt Cutts.

Matt then addressed whether Google does special crawling or indexing for these sites, such as indexing the number of likes or tweets a specific page has. Matt said Google does not do that right now. Why not?

They did at one point, and they were blocked. I believe Matt was referring to Google's real-time search deal with Twitter expiring. Matt explained that they put a lot of engineering time into it, and then they were blocked, and that work and effort was no longer useful. For Google to put more engineering time into this only to be blocked again just doesn't pay.

Another reason: Google is worried about crawling identity information at one point, having that information change, and not seeing the update until much later. Having outdated information can be harmful to some people.

However, Matt did add that he sees Google crawling, indexing, and understanding more about identities on the web in the long term. He used our Danny Sullivan as an example: when Danny writes a story here, the site is authoritative, so it ranks well. But if Danny posts a comment on a forum or on Twitter, it would be useful for Google to know that an authority posted on a specific site, and thus that post should carry more ranking weight in Google.

While Google doesn’t do this now, we know they are indeed working on a solution for this.

Here is the video:

Source: SearchEngineLand

21 Best FREE SEO Tools for On-Page Optimization

Google’s official position on webmaster best practices really hasn’t changed much over the years. What has changed is the search engine’s ability to enforce these guidelines through improved algorithms. The implementation of Panda, Penguin, and Hummingbird has had a profound impact on the SEO landscape.

Google’s Matt Cutts has remarked that no one should be surprised when a website that hasn’t followed the guidelines is penalized. What Cutts overlooks or chooses to ignore is something that I have dubbed the “Google Paradox”. I suspect the Google paradox is the root cause for one black hat forum member expressing his desire to “punch Matt Cutts in the face” (per Cutts’ Pubcon 2013 keynote).

In order to reach the top of the SERPs and stay there in 2013 and beyond, your website must deserve to be there. It needs to be the best in class. It must offer the best user experience in that niche. Fortunately, there are a number of free tools that can help you achieve that goal.

Keyword Research

Developing the right list of keywords remains a staple of SEO, even in 2013. Because keyword selection has such a profound impact on the overall performance of a website, the selection process shouldn't rely on a single tool.

1. Wordstream

The Wordstream Free Keyword Tool offers thousands of keyword ideas from a huge database of more than a trillion unique searches. This tool outperforms some of the paid alternatives in the market.

2. Keyword Eye Basic

Keyword Eye Basic is a visual keyword suggestion tool that works particularly well for brainstorming sessions.

3. YouTube Keyword Tool
Julie Joyce recently described in detail just how great the YouTube Keyword Tool is for keyword research for all kinds of content, not just video.

4. Übersuggest
Übersuggest utilizes the “Suggest” data from Google and others. A terrific tool for developing long-tail phrases.

Content Tools

Everyone knows by now that Panda ushered in a new era for content quality. No longer will cheap, spun, or duplicate content frequently find its way to the top of the SERPs. These tools will help you with your content development.

5. Anchor Text Over Optimization Tool
The Anchor Text Over Optimization Tool reports on the diversity of your anchor text. Words or phrases that are potentially over-optimized are highlighted for manual review.

6. Convert Word Documents to Clean HTML
Convert Word Documents to Clean HTML is a free converter tool for documents created in Microsoft Word, Writer, and other word processing software.

7. Copyscape
Copyscape is a free plagiarism checker. The software lets you enter a URL to detect duplicate content and to verify that your content is original.

Technical Tools

Don’t let the title scare you away. These are technical tools designed to be used by the non-technical among us.

8. Xenu’s Link Sleuth
Xenu link sleuthXenu’s Link Sleuth is a PC based spidering software that checks websites for broken links. It performs validation of text links, frames, images, local image maps, and backgrounds.

9. Robots.txt Generator
Robots.txt Generator is a freeware utility that makes the creation of robots.txt files a breeze.

10. Robots.txt Checker
Robots.txt Checker is a “validator” that will analyze the syntax of a robots.txt file to verify the format is valid.
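
If you prefer to script this kind of check, Python's standard library ships a robots.txt parser that does the same job; a minimal sketch with a hypothetical site:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # hypothetical site
rp.read()  # fetch and parse the file

# Will Googlebot be allowed to crawl this URL under the parsed rules?
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))
```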

11. URI Valet
Use the URI Valet Header Checker to view the total number of objects (HTTP requests), time to download, object details, and document internal and external links, and to verify the server headers for each.

12. Title and Description Optimization Tool
This title and description optimization tool gives insight as to what the top ranking competitors are using for titles and descriptions. The best competitor intelligence tool of its type.

13. Image SEO Tool
With Image SEO Tool, simply input a URL and this tool checks image name, alt attribute, and dimensions. Alerts are given if a potential problem is discovered.

14. Schema Creator
The Schema Creator tool is the easiest way to get started creating HTML with schema.org microdata.

15. Google Snippet Preview
The snippet that appears in Google Search is usually taken from your title tag and meta description. The Google Snippet Preview tool helps visualize and optimize what is displayed to searchers.
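
A bare-bones approximation of what such a preview tool does (my sketch; the character limits are rough, since Google actually truncates snippets by pixel width):

```python
def preview_snippet(title, description, title_limit=60, desc_limit=155):
    """Print a rough approximation of how a SERP snippet will be clipped."""
    def clip(text, limit):
        return text if len(text) <= limit else text[:limit - 1].rstrip() + "…"
    print(clip(title, title_limit))
    print(clip(description, desc_limit))

preview_snippet(
    "21 Best FREE SEO Tools for On-Page Optimization | Example Blog",
    "A rundown of free tools for keyword research, content, and technical SEO audits.",
)
```
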
16. Structured Data Testing Tool
The Structured Data Testing Tool verifies Schema.org or any other type of structured data markup.

17. XML Sitemap Generators
XML Sitemap Generators create “web-type” XML sitemap and URL-list files (some may also support other formats).

18. XML Sitemap Inspector
XML Sitemap Inspector validates your sitemap (XML or gzipped), repairs errors, and pings all search engines.

19. Pingdom Website Speed Tool
Pingdom’s Website Speed Tool tests the load time of a page, analyzes it, and finds bottlenecks.

20. Fiddler
Fiddler is a web debugging tool which logs all HTTP(S) traffic between your computer and the Internet.

21. Microsoft Free SEO Toolkit
Now that you have run each of these tools over your website, it's time to run a full-blown SEO audit, complete with detailed reports. Fortunately, Microsoft has created a Free SEO Toolkit which does just that.

What are some of your favorite tools for on page optimization?

Source: Searchenginewatch

Google Webmaster Tools Now Highlights Security Issues

Last week at Pubcon, Google's Matt Cutts said during his keynote that Google was working on the next generation of hacked-site detection. Today, Google announced some Webmaster Tools updates in the way it communicates with webmasters.

There is a brand new section within Google Webmaster Tools offering a portal for "Security Issues," which not only alerts webmasters when there is a security issue or evidence that a site has been hacked, but also gives more detailed information on the nature of the issue.

The new Security Issues area will list sites and pages that Google believes have been hacked with malware or spam. It will include specific URLs, including the problem code snippets from the site if relevant, and the date Google last detected the issue. The date will be especially helpful if a site is attempting to clean up from a mass hacked-site problem and needs to check whether the problem really has been fixed.

Google will also detail the type of malware, such as whether it was a website template injection, a SQL injection, or a malware code injection. If it was a spam issue, Google will include sample URLs that contain the spam, flagged as a content injection.

Having this level of detail about the type of issue will go a long way toward helping webmasters, particularly those on the novice side, determine how the issue happened and what they need to look at in order to fix it. Some webmasters with a WordPress blog might not be able to tell the difference between a SQL injection and an injection coming from a template or plugin.

When available, webmasters can then click specific issues to get even more details, such as the exact code snippets Google has detected, as well as recommended actions on how to fix the spam or malware issue. It also reminds users to fetch a page as Googlebot if they aren't able to see the spammy content when they look at the page, in case it has been hidden through CSS.
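
A crude way to approximate that check from your own machine is to fetch the page twice with different user agents; this is only a sketch with a hypothetical URL, and Fetch as Google remains the authoritative option (dynamic pages can also differ between fetches for legitimate reasons):

```python
import requests

url = "https://www.example.com/"  # hypothetical page to inspect

as_browser = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
as_googlebot = requests.get(url, headers={
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
}, timeout=10).text

if as_browser != as_googlebot:
    print("Content differs for Googlebot -- inspect for injected or cloaked spam.")
```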

Lastly, it is easier for webmasters to request a review once they have cleaned up any spam or malware issues. On the same Security Issues page, there is a button where webmasters can easily request a review.

This is especially helpful because sites with malware carry a "This site may harm your computer" warning in the search results, telling potential visitors that Google advises against visiting the site. So after the webmaster checks a box confirming the issues have been fixed, the review request can be submitted.

This new change to Webmaster Tools will definitely be helpful to webmasters who have faced fixing a site after it has been hacked, especially those who might not be as tech-savvy as others, and it should make it easier for a site to recover in Google.

Source: searchenginewatch.com

Google Hummingbird & The Keyword: What You Need To Know To Stay Ahead

On September 26, Google told participants at its 15th anniversary event it had a new algorithm impacting more than 90 percent of searches worldwide. They called it “Hummingbird.” Google’s Amit Singhal later said it was perhaps the largest change to the algorithm since he joined the company back in 2001.

This information made some marketers nervous, but at PubCon last week, Google’s Matt Cutts reminded the audience that the algorithm had been up and running for more than a month before it was announced, and no one even noticed.

Hummingbird allows the Google search engine to better do its job through an improvement in semantic search. As conversational search becomes the norm, Hummingbird lends understanding to the intent and contextual meaning of terms used in a query.

It seems that with Hummingbird, Google can now better answer those longer-tail queries even if a page is not optimized for them. So some pages may have a better chance of being found for certain queries now.

We saw clues this was coming. In fact, back in May, Google announced conversational search across devices plus improvements to the Knowledge Graph.

In the announcement, Singhal painted a picture of the future of search. “People communicate with each other by conversation, not by typing keywords — and we’ve been hard at work to make Google understand and answer your questions more like people do.”

Keywords Still Central To SEO
More recently, the loss of keyword data caused by Google's move toward 100 percent secure search punctuated the fact that, at the same time as Google was getting better at search, it was asking SEOs to move away from a strictly keyword-based approach.

So the question is: should SEOs be worrying about their strategy? And the answer is no — at least, not if they’ve been staying on the leading edge of SEO.

Why Google Hummingbird Means Business As Usual For Many SEOs
Google’s algorithm continues to be a complex mix of factors that weigh the relevancy of a page for a query. That hasn’t changed.

While some people may be panicking that their SEO strategy needs to be revamped, if you’ve been progressing with the natural evolution of SEO, there’s nothing to worry about. You’re on the right track.

Taking what we know about how Google is trying to improve its search results, here is just a sample of some of the things that continue to matter:

  • Mobile SEO: Undoubtedly, conversational search is driven in part by the way people search when on their mobile devices — so, mobile optimization is going to continue to be critical.
  • Structured Data Markup: Providing search engines with as much information as possible about your page content helps them do their job better. Structured data can also improve click-through rates in the search results when displayed in rich snippets.
  • Google+: Google’s social network is essential in helping to identify your online brand, connecting it with concepts and serving your content in the Google results.
  • Links: Google may not want SEOs obsessing over PageRank data, but that doesn’t mean links are irrelevant. Links help Google put concepts together on the Web; they also send strong signals to Google about the credibility of your page.
  • Keyword Optimization & Content Creation: Nowadays, it seems there is a lot of debate over the usefulness of focusing on keywords. But keywords are not dead. Quality content is crucial, and that includes at least some level of keyword optimization.

Start With The User, Execute With Content, Measure By Page

SEO now requires a keener understanding of your audience. It doesn’t start or end with keywords; rather, it starts with the user and an understanding of what your user wants.

Users: What Matters To Them & How Can You Help?
Your content may have four or five different types of users, who are searching for the answer to a query. Understanding what’s being served to which user and catering to those important segments with a good user experience on your site is key.

Currently, personas are talked about more than ever in the search marketing world. Traditional marketers have long used this model to better understand their product or service user. This depth of understanding is important as you think about the topics your users are interested in and how you can be a solution for them with your content.

Keyword research still guides us to the topics people in our audience are searching for; but, our responsibility as marketers is to go beyond that data. That means having the most useful, most engaging, best quality page for a query – with the appropriate keywords on the page.

And although keyword optimization often happens best when a topic is thoughtfully written, and has enough depth to include many variations of a concept, optimizing your page for specific queries still reinforces the topic of the page.

If you haven’t spent much effort gathering qualitative data about your users, now is a good time to start. Surveys, monitoring conversations on social and talking face-to-face with your customers will help you build those personas to better understand what matters to them, so you can execute with content. But more on that in another post.

The Page: How Is It Performing?
At BrightEdge, we've been arming our customers with ways to measure content performance at the page level since before Google's secure search was launched in full. This was not only in anticipation of the change, but also a way to help businesses better understand the metrics that matter.

Post-Hummingbird and post-secure search is all about measuring the content, not the keyword. Start measuring what pages are generating the most value for you, and what types of content are generating the greatest ROI.

If you have content that ranks well, but isn’t driving traffic or engagement on your site, it’s not doing a good job of satisfying your users. You want to think about metrics like overall traffic to a page, conversion rate and so on.

Then, you can begin to look at groups of pages on your site that best perform on a traffic and revenue level, depending on your goals. In the old paradigm, SEOs may have used a “more content is better” approach. But now, it’s relevancy, credibility, timeliness and quality over quantity.

Once you have a picture of page performance on your site overall, you can then begin to make decisions about where you want to focus time and resources on your website.

Hummingbird Speeds Us Into The Future
Hummingbird is a great move for search results and could be a great way for websites to gain more visibility if they focus on the user and the content first.

It may actually be a relief for some SEOs to know that with Hummingbird and some of the other changes we’ve seen Google putting out, it’s a clear message that site owners should stop obsessing over keywords only and start focusing on creating a great experience.

Today, instead of asking "How do I rank for this query?" think: "How do I best answer the questions my users have?"

Source & Author: Jim Yu is the founder and CEO of BrightEdge, the leading enterprise SEO Platform. He combines in-depth expertise in developing and marketing large on-demand software platforms with hands-on experience in advanced SEO practices.

SEO Strategies That Can Hurt Your Website Rankings

There is so much information available regarding the practice of search engine optimization (SEO) that it can be difficult at times to sort out what to do and what not to do. There are so-called experts selling their tips and "advice" online, and unfortunately many of the methods they teach can actually prevent your website from ranking well.

When managing your own SEO efforts, and when trying to make heads or tails of these so-called SEO experts, business owners should know not only which tactics are effective but also which can land a site in big trouble if used. The content on your site should be relevant to your business and to your customers, and you shouldn't try any sneaky maneuvers to game Google's algorithms.

Here, I've outlined four bad, yet common, SEO practices that could damage your website's search rankings.

Mistake No. 1: Buying links.
There are thousands of sites and services that will try to convince you to pay to get droves of other sites linking to yours. Fiverr, for example, is a common site where people buy links, because it can cost as little as $5 for hundreds of links pointing back to your website. But what are the sites that will be linking to yours? Will they be related to your industry? Will they be reputable?

While getting trusted and well-known sites to link back to yours is good for SEO, Google can de-index your site from its search results if it discovers that you are paying for backlinks. Translation: your company’s site won’t appear in any search results. Don’t pay for links and don’t participate in networks that sell links and help you distribute your articles to different sites. It’s not good for business.

Mistake No. 2: Publishing irrelevant content.
When creating content for your website you want only what is relevant to your business, industry and customers. For instance, if you’re a plumber, you don’t need to talk about hotels on your site. In Google’s eyes, a plumbing site that has content about hotels can be confusing. You want your site seen by the search engines as credible in your market, not someone else’s.

Google values the user experience, and if you're providing that to your customers, Google will see it and should reward your site for it. So when writing content for your business site, keep the needs of your customers in mind as much as your business goals. For example, if you own a pet store you most likely want to sell puppies, and want your keywords around "I want to buy a puppy." But someone who is buying a puppy will likely want to know everything about that puppy: what type of food it will need and what type of shots it will have to get, to name just a couple of topics.

Think of every aspect of the puppy’s life as it relates to your customer and create useful content around that. This can help you get ranked for key terms while also showing Google that your site is an authority on puppies and that yours is the right business for people who want to buy a puppy.

Mistake No. 3: Spam comments.
Some business owners decide to pay services to spam sites around the internet with comments that include a link back to their website. While the idea is to spread links to your site across other sites that are relevant to your business, your brand can be damaged when customers or potential customers see those links associated with spammy, poorly written comments.

The same goes for bad comments on your own website. Don't approve all comments on your site, only the ones that bring real value to your customers. This helps you maintain the value and integrity of the comments on your website.

Mistake No. 4: Overloading on anchor text links.
An anchor text link is a specific keyword or phrase in the text on your site that is hyperlinking to a website URL. An example of this would be “awesome entrepreneur website” where it’s linking to Entrepreneur.com.

While keyword links can be good for SEO, you have to be careful when using them. Webmasters used to be able to build up thousands of anchor text links with specific keywords in them and that would get their website to rank for those keywords. Those days are long gone.

In order to build the best ranking, your keyword ratio needs to appear normal. That means if you have 10,000 keyword links for one phrase you want to rank for, and none for any other phrase on your site, it can look strange in Google's eyes. You should have keyword links for a variety of keywords. If you want to rank for "I want to buy a puppy," you should build keyword anchor text links like:

  • I want to buy a puppy
  • Owning a puppy
  • What shots for puppies
  • Tips to purchase a puppy
  • Popular dogs
  • Cute breeds of dogs
  • Dogs that are good with kids
  • Potty training new puppies
  • How much do pups cost
  • Smartest puppies to buy

The above keyword phrases all have to do with things customers may want to know when they are searching for “I want to buy a puppy.” But they are all saying different things. Make sure not to build too many links to just one of them or it will not look normal to people. Build keyword links for people and user intent, not for search engines.
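
If you want to sanity-check whether your own anchor text distribution "appears normal," a quick tally over a backlink export will show how heavily any one phrase is weighted. A sketch (the anchor list and the 30 percent threshold are purely illustrative):

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink export
anchors = (
    ["I want to buy a puppy"] * 40
    + ["owning a puppy"] * 12
    + ["potty training new puppies"] * 8
    + ["example.com"] * 25
)

counts = Counter(anchors)
total = len(anchors)
for text, n in counts.most_common():
    share = 100.0 * n / total
    flag = "  <-- heavily weighted; consider diversifying" if share > 30 else ""
    print("%5.1f%%  %s%s" % (share, text, flag))
```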

Source: Entrepreneur