Search engine optimization (SEO) is one of the most important components of a brand’s digital marketing strategy. With it, you can increase targeted website traffic from Google, Yahoo and Bing, then convert that traffic into sales. One of the challenges with SEO, however, is staying ahead of all the changes and best practices that evolve in the world of search. Even some of the top optimizers out there have bad habits, which can be difficult to break.
You can, however, optimize your own command of SEO by breaking these six bad SEO habits.
1. Over-Optimizing Anchor Text
There was a time when search engine optimizers could improve their campaigns by optimizing their links. Known as anchor text optimization, this old tactic involved making the visible, clickable link text exactly match the keywords that website owners wanted to rank for.
So, if you wanted to rank for the term “best men’s boots,” the anchor text would be “best men’s boots,” linked to your shoe website. While this was effective for a time, Google cracked down on the practice. One of the easiest mistakes to make now is getting penalized for over-optimized anchor links.
That’s not to say you can’t still use anchor text. You just have to use safe anchors like variations of the keyword (“best men’s casual shoes”), plural/singular versions of the keyword (“best men’s boot”), naked URLs (“www.bestmensboots.com”), or long phrases (“Having a quality pair of boots is important for a man’s outfit and comfort”).
2. Getting 302 and 301 Redirects Mixed Up
There are times when a webmaster must send visitors to a different page because a URL changed, say after moving to a new domain or renaming an internal page. The two most common types of redirects are 301 and 302.
Understanding the difference between the two is important for search engine optimizers. Confusing them can put your rankings in danger, because search engines will have a difficult time passing authority to your new URL. Here are brief descriptions:
- 301 Moved Permanently: A 301 redirect is used when you move your site to a new domain, want to merge two different websites, or when people can access your site via multiple URLs. Because a 301 redirect passes 90-99% of the link juice (the ranking authority that incoming links confer on a page), it’s the preferred option for SEO.
- 302 “Found” or “Moved Temporarily”: A 302 redirect is an HTTP status meaning that the page has moved temporarily. Because search engines treat the move as temporary, a 302 passes little or no link juice to the new URL, which is why this type of redirect isn’t recommended for permanent moves. Besides, how often would you really just be temporarily moving a website?
Some webmasters use a 302 redirect because it can be easier to create, and many older CMSs issue 302s by default. That being said, when you have to move a website and keep its link juice, your best bet is a 301 redirect. That way, you preserve SEO credibility while making sure your website is ready for customers.
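If your site runs on Apache, setting up a permanent redirect can be as simple as a line or two in your .htaccess file. This is a sketch with placeholder paths and domains; adapt it to your own URLs, and consult your host or developer if you’re on nginx or another server:

```apache
# Permanently redirect a single moved page to its new URL
Redirect 301 /old-page https://www.example.com/new-page

# Or send an entire old domain to a new one (requires mod_rewrite)
RewriteEngine On
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```

Swapping `301` for `302` in the first line is all it takes to issue a temporary redirect instead, which is exactly why the two are so easy to mix up.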
3. Robots.txt Disallow vs. Meta NoIndex
At some point, you’ve probably come across technical terms like “robots.txt” and “noindex.” For SEO purposes, “crawling” refers to how a search engine’s bots discover and fetch your webpages. “Indexing” refers to the practice of adding that content to a single, searchable index so it can appear in results. While allowing a search engine to crawl your site is usually useful, there may be pages that you prefer to keep out of search results.
Robots.txt and noindex are two mechanisms for controlling how search engines crawl and index your site. But what exactly is the difference between them?
Creating a Disallow File
The robots disallow file is a simple text file, placed at the root of your site, that tells search engine robots not to crawl it. It can include as little as two lines of text:
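Here is what that minimal file looks like. The `*` matches every crawler, and the bare `/` covers the entire site:

```
User-agent: *
Disallow: /
```

To block only part of the site, replace the `/` with a specific path, e.g. `Disallow: /private/`.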
Once a robot comes to your site, it will be informed that it should not be there, and it will not crawl your site. You can also tell robots not to crawl specific pages or folders if you like. Refer to robotstxt.org for more information about targeting specific search engines in your disallow rules.
Be aware that even if Google doesn’t crawl your site when you use a Disallow rule in robots.txt, it may still index your webpage(s). In other words, even though Google won’t have a crawled copy of your website on its servers, it can still list the page’s bare URL in its search results.
Meta NoIndex HTML Command
This is a meta tag placed in the head section of your webpages. Unlike robots.txt, this tag tells search engine crawlers not to include the page in their results. To use this command, place this piece of HTML on each of the webpages you want excluded:
<meta name="robots" content="noindex" />
The advantage of using Meta Noindex is that if you have pages that you do not want to rank in Google, this will help clear them out of the results. Be aware, however, that such pages must not also be blocked in your website’s robots.txt: if Google can’t crawl a page, it will never see the noindex tag on it.
If you have any questions about which one to use, or if both can be used with each other, consult your web hosting service or trusted developer.
4. Keyword Stuffing
Keyword stuffing is another trick that search engine optimizers once used to game search engines. Those days, however, are long gone, as Google has since cracked down on sites that overload a page with keywords, phrases, or numbers in order to manipulate its ranking. Because Google believes that keyword stuffing doesn’t enhance your visitor’s website experience, it will penalize your site by lowering its ranking.
Furthermore, Google looks less at keywords nowadays, and more at topical matter and keyword modifiers. For example, if you have a page that you want to rank for “loans,” using topically relevant terms like “financing,” “lending” or “cash flow” will help more than overloading your page with “loans.”
Don’t try to fool the search engine bots. You won’t win. Instead of going into keyword overload, focus on creating valuable content where keywords are used appropriately and in context.
5. Purchasing Links
In 2012, Google released the Penguin algorithm, which sent a major message that search engine optimizers need to stop purchasing links. Doing so increases the possibility of getting penalized by search engines, which results in a lower ranking. Why, you might wonder? Because good SEO relies on earning links, not paying for them.
When you don’t earn links, you’re not helping your brand become an authority or trusted figure. Those paid links may drive in some traffic, but are they really going to help your conversion rates or boost your online clout? Additionally, purchasing links can get pricey. We’re talking hundreds and even thousands of dollars depending on the size of your site.
Don’t waste your time and money on buying links. Earn links by promoting quality content across multiple channels and reaching out to influencers.
6. Publishing Weak Content
Last, but certainly not least, you can no longer produce subpar content. Back in the day, there were plenty of websites that churned out content purely to achieve a higher ranking on a search engine like Google. Today, thanks to updates like Google’s Panda, you can’t get away with pushing content that doesn’t provide any value. Google now expects sites to deliver quality content on a consistent basis.
Simply put, if a site has no readers, no author and poor content, who would want to read it anyway? And, for your purposes, why should Google rank it? That doesn’t mean that you have to produce more content. It means that instead of publishing ten so-so articles, you should focus on one or two pieces of excellent content.
Remember, you are proving to your audience—and search engines—that you’re an authority. Once you adopt the best practices and break the bad habits in your SEO efforts, you’ll start seeing results. So go out there and improve your online visibility, build quality links and give your business a boost.