Common Misconceptions About SEO

Jul 20, 2011

In the business of SEO (search engine optimization), nothing can be taken for granted. Every year new techniques and methods are introduced, new tools become available, and several algorithmic updates take place. This uncertain, fast-changing environment gives birth to myths and misconceptions about what SEO can and cannot do for a website.

From time to time a few of those myths are confirmed, while most are debunked. Some (such as the effect of social media on SEO) are so popular that many web marketers accept them as true and cling to them. With return on investment being the main concern, website owners often disregard the myths altogether and simply look for a top SEO agency that can help them generate income from their online business. Hopefully we can clear things up. If you are going to invest your hard-earned money in SEO, you should at least know the basics: what you are getting into, how SEO can help, and what you can expect in return.

Below are some of the most common misconceptions regarding SEO, along with explanations of why they are nothing more than misconceptions.
  1. Meta Keywords Help The Rankings
    The keywords meta tag was important for early search engines that did not have the computing power to analyze and store the entire page. Since then, search engines have evolved and are able to extract the important keywords of a page without using the META keywords tag. Another reason search engines stopped using this tag is that many people were stuffing it with irrelevant terms. Google has made it clear many times in the past that it does not use meta keywords at all, so this tag will not help you improve your rankings.
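    For reference, this is the tag in question (the values below are placeholders):

      <meta name="keywords" content="shoes, buy shoes, cheap shoes">

    You can safely leave it out of your pages without any impact on your Google rankings.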
  2. PageRank Is Useless/Irrelevant
    Over the last couple of years, more and more SEOs have started to question whether PageRank affects SEO at all, mainly because it does not appear to be highly correlated with high rankings. PageRank is one signal: a metric that measures the quality/authority of a page, and it affects indexing. PageRank should be neither worshipped nor ignored.
  3. Submit Every Page On Google & Bing
    Submitting every page of your website to Google and Bing via their submission forms will neither speed up indexing nor improve your rankings. If you want to reduce indexing time, add links from high traffic/authority pages, use XML and HTML sitemaps (see the example below) and improve your internal link structure. Submitting all your pages one by one will neither help nor hurt your rankings.
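    If you go the sitemap route, a minimal XML sitemap looks like the following (the URL and date are placeholders):

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.example.com/</loc>
          <lastmod>2011-07-20</lastmod>
        </url>
      </urlset>

    Reference it from your robots.txt with a "Sitemap:" line or submit it once through the Webmaster Tools of each search engine.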
  4. PageRank Is Everything
    For years, several SEO professionals considered PageRank the most important factor affecting the search results. In many cases, some of them confused the real PageRank values with the ones displayed on the toolbar, and they focused primarily on increasing it in order to improve their rankings. Nevertheless, as we mentioned in a previous article, PageRank is not the only signal that Google uses. It is just one of the metrics, and in some types of search it carries very little weight (news search, local search, real-time search, etc.).
  5. Keep A High Keyword Density
    Several SEO professionals suggest that keeping a high keyword density for the main keywords of a page helps the rankings. They treat this as an important rule/target and, as a result, try to stuff these words into the text. By doing so, they produce really unattractive SEO copy that not only fails to help their rankings but also irritates the readers of their websites.
    What one should do instead is use different combinations of the main keywords in the text. This increases the odds of ranking for similar terms or combinations without affecting the quality of the text. Note that this technique will increase the keyword density of the important terms in a natural way; nevertheless, its primary target is not to increase the density but to incorporate into the text the most common keyword combinations that users are likely to search for. A more accurate metric for determining whether a given keyword is optimized is the KeywordRank, which takes into account several parameters such as the position of the keyword, its usage, its relevancy and more. You can check the KeywordRank of your targeted terms by using the Keyword Analyzer tool.
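    To put the metric itself in perspective, keyword density is simple arithmetic: the number of occurrences of a term divided by the total number of words. For example, a 300-word page that uses a phrase 6 times has a density of 6/300 = 2%; forcing it up to 30 occurrences (10%) produces exactly the kind of unreadable copy described above.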
  6. Link All Pages To All Pages
    Some people suggest that by linking all pages to all pages you can improve indexing or rankings, so they use excessive secondary menus or footer links to achieve this. Nevertheless, by doing so you dramatically increase the number of outgoing links per page and you do not pass enough PageRank to the important webpages of your site. Typically, websites should use a tree-like structure (sketched below) that enables them to focus on the most important pages. More information on this topic can be found in the article "Link Structure: Analyzing the most important methods".
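    A minimal sketch of such a tree-like structure (the page names are hypothetical):

      Homepage
        |-- Category A
        |     |-- Product A1
        |     `-- Product A2
        `-- Category B
              |-- Product B1
              `-- Product B2

    Each page links to its parent, its children and perhaps a few related siblings, instead of linking to every other page on the site.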
  7. HTML Validation Helps SEO
    Lots of webmasters used to think that validating their HTML code improves their SEO campaigns. Fortunately or unfortunately, this is not true. HTML validation does not affect search engine rankings and is not used as a signal. Nevertheless, if your HTML code is so broken that parts of the page do not appear in the browser, search engines might have problems extracting your text. Thus keep in mind that producing valid HTML code is good practice, but in general minor mistakes will not hurt your SEO.
  8. Nofollowed Links Never Help
    Google typically says that it drops from its link graph all the links that are marked with nofollow and thus they do not carry any weight. Nevertheless, not all of those links are irrelevant for SEO. For example, even though Twitter and Facebook links are nofollowed, Google and Bing have stated that they use this data as a signal. So it is fair to say that not all nofollowed links are irrelevant for SEO, and that the major search engines might in some cases consider them during their analysis.
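    For reference, a nofollowed link simply carries the rel attribute (the URL is a placeholder):

      <a href="http://www.example.com/" rel="nofollow">Example site</a>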
  9. Nofollowing Links Improves The PageRank Distribution
    In the past, by using the rel=nofollow attribute, we could manipulate the PageRank distribution of a website and perform PageRank sculpting. Nevertheless, an algorithmic update by Google changed the way rel=nofollow operates: the share of PageRank that would have flowed through the nofollowed links now evaporates instead of being redistributed to the followed ones (see the illustration below). Thus, as we discussed in the article "The PageRank sculpting techniques and the nofollow issue", the rel=nofollow attribute leads to the evaporation of link juice. If you want to retain control over your PageRank and avoid the evaporation, you can use the PageRank sculpting technique that we have proposed in the past.
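    A rough illustration with simplified numbers (ignoring the damping factor): if a page has 10 links and 5 of them are nofollowed, each followed link used to receive 1/5 of the page's PageRank; after the change, each of the 10 links still accounts for 1/10, so the followed links receive 5/10 in total and the remaining 5/10 simply evaporates.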
  10. All Links Carry The Same Weight
    In the original PageRank formula that was published by Page and Brin (shown below), all the links inside a webpage carried the same amount of weight. Nevertheless, this has changed over the years, and all the major search engines now take into account not only the position of the link on the page, but also its relevancy and other characteristics that affect the CTR (font size, color, etc.). As a result, footer links do not carry as much weight as links that appear at the top of the page or inside the main text.
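    For reference, the simplified formula from the original paper splits a page's weight evenly across its outgoing links:

      PR(u) = (1 - d) + d * ( PR(v1)/L(v1) + ... + PR(vn)/L(vn) )

    where d is the damping factor (typically 0.85), v1..vn are the pages linking to u, and L(v) is the number of outgoing links on page v. Modern ranking, as noted above, no longer treats every link equally.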
  11. Duplicate Content Leads To Bans
    Several people suggest that having a lot of duplicate content on a website can lead to bans. Fortunately, this is not true. Duplicate content can cause serious problems: it can affect the number of pages that get indexed, the PageRank distribution within the website and, consequently, the rankings; nevertheless, Google will not ban your website for it.
  12. The Page Title/Description Will Certainly Appear On Snippet
    Several webmasters believe that the titles or META descriptions they use on their pages are always the ones that will appear in the snippet of the search engine results. This is not always true, since search engines can replace the snippet title and description with something more relevant to the user's query. In other cases, search engines can even use text that does not exist on the landing page; usually this text has been retrieved from external sources such as the DMOZ directory or the anchor text of incoming links.
    [Screenshot: a Web SEO Analytics search engine result]
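    Both the title and the description are declared in the head of the page; a typical example (the values are placeholders):

      <head>
        <title>Common Misconceptions About SEO</title>
        <meta name="description" content="A short summary that search engines may, or may not, display as the snippet.">
      </head>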

  13. Low Quality Links Can Help The Rankings
    Major search engines use several methods to detect paid or low quality links, and they exclude them from their link graphs. The recent Panda update (or Farmer update) made it even clearer that acquiring links from low quality websites or link farms that contain a lot of duplicate or scraped content will not help you achieve high rankings.
  14. Pages Blocked With Robots.txt Will Not Appear In SERPs
    Another common mistake that many SEOs make is using robots.txt to ensure that a particular page will not appear in the SERPs. Nevertheless, such a page can still appear in the search results if it is linked from other pages. As we discussed in the article "The robots.txt, META-robots & rel=nofollow and their impact on SEO", the proper way to ensure that a page will not appear in the search results is to use the "noindex" meta-robots directive (see the examples below).
    [Screenshot: pages blocked in robots.txt still appearing in the search results]
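    To make the difference concrete (the path below is a placeholder): blocking a page in robots.txt only prevents crawling, so its URL can still show up if other pages link to it:

      User-agent: *
      Disallow: /private-page.html

    The meta-robots directive, placed in the head of the page itself, keeps it out of the results; note that the crawler must be allowed to fetch the page in order to see the tag:

      <meta name="robots" content="noindex">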

  15. Low Quality Links Can Hurt The Rankings
    Several SEOs have stated in the past that low quality links coming from link farms can actually hurt the SEO campaign of a website. If this were true, people would be able to harm their competitors' websites simply by pointing low quality links at them. Fortunately, Google will not ban a website for receiving low quality links. Nevertheless, in extremely rare cases Google has taken measures against websites that systematically tried to manipulate the search results by artificially inflating the number of their backlinks.
  16. Robots.txt Can Help Solve Duplicate Content Issues
    The robots.txt file can be used to prevent search engines from crawling particular pages or segments of a website, so some SEOs have tried to use it as a way to reduce the amount of duplicate content on their websites. Nevertheless, by blocking these pages you only prevent Google from crawling them; you do not fix the link structure that causes the problem. Since the problem remains unsolved, the negative effects on the rankings continue to exist.
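    As a side note, a remedy that the major search engines introduced in 2009 for exactly this situation is the rel=canonical tag, which points the duplicates to the preferred version of the page (the URL is a placeholder):

      <link rel="canonical" href="http://www.example.com/preferred-page" />

    It consolidates the indexing signals on one URL instead of merely hiding the duplicates from the crawler.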
  17. SEO Requires A Long Time To Return Positive Results
    SEO is neither a process that delivers results overnight nor a one-time activity; achieving good results requires time and effort. Nevertheless, positive results can become visible relatively fast. Of course, a new website will not immediately achieve good rankings for highly competitive terms, but it should be able to rank for the more targeted, long-tail keywords.
  18. SEO Is A Spammy/Unethical Technique
    SEO is an online marketing technique/process that helps websites increase their organic traffic, their exposure and their sales. To achieve this, SEO professionals focus not only on the technical characteristics of the website but also on the content, the design and external factors. SEO is a marketing tool just like advertising; if you consider SEO unethical, you should feel the same about advertising.
  19. SEO Is Dead
    Every year a major update takes place in the search engine business, and several bloggers or journalists suggest that SEO is dead. Nevertheless, as you can see, SEO is alive and kicking, and it is constantly evolving along with the search engines. Certainly the techniques have changed a lot: new tools and methods become available while others fall out of use. SEO is a relatively new form of marketing, and it will exist for as long as search engines exist.
  20. SEO Is All About Google
    Google might still be the market leader in search, but we should not forget that Bing and Yahoo hold more than 30% of the total market. Search engine optimization does not focus on optimizing websites only for Google; it aims at increasing the organic traffic from all the search engines and at developing websites that are attractive to both users and search engines. Some methods may work better for Google, but a solid SEO campaign should be effective across all the major search engines.
  Conclusion

    This article has covered some basic SEO misconceptions. There are many more in the industry, and over time they should be cleared up as the field continues to mature and evolve. In short, start basing your SEO activities on fact, not fiction, and you will see that SEO really can be your best friend; just make sure it is done correctly!

    Never be fully satisfied with your SEO work, as there will always be something that can be improved or that needs to be done. Don't be afraid to come up with a solid SEO plan and then execute it. Oh, and then there are always those pesky algorithm changes. Everything changes, so learn to adapt and capture your traffic before your visitors go elsewhere!
