Thursday, November 26, 2015

What is Search Engine Optimization?

Search engine optimization (SEO) is the process of affecting the visibility of a website or a web page in a search engine's unpaid results - often referred to as "natural," "organic," or "earned" results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search,[1] news search and industry-specific vertical search engines.

As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. – Wikipedia


How do search engines work, and how can they boost your business?

Showing up on search engines is one of the most critical ways to increase website traffic and expose your content, product or service to people who might be interested in what you have to offer. This means that you'll want to practice a little SEO (search engine optimization).

Most of the major search engines utilize an algorithm to determine where a website ranks. The criteria are different for every engine, but all engines share several commonalities. It all boils down to the type and amount of content provided on a given website, the level of optimization done on the site, and the popularity of the website (link popularity/PageRank).

Method 1 of 4: Taking Advantage of Google

1. Use Keywords. Google Keyword Planner, a tool within Google AdWords, allows you to track keywords and find keyword suggestions. Browse the tool, get familiar with how it works, and then use it to your advantage. Find keywords that will help you maximize your website's viewership.

2. Use Trends. Google Trends tells you how searches in a subject change over time. You can use it to predict spikes and slumps as well as to know when you should update and change pages for the season or switch to using different keywords. You can look at and compare several different terms at a time.

3. Add yourself to Google. Google rewards Google+ users and businesses registered on Google Maps. Take advantage and join, as Google is by far the most popular search engine.

Method 2 of 4: Creating Your Content



1. Have quality content. Quality content, in other words lots of original, error-free text organized well on a modern-looking website, is what matters most in terms of SEO that you can control. Hiring a professional website designer can help with the cosmetics and get you taken seriously by your visitors. You'll also want to be sure that you're not misleading visitors: they should get what was advertised when they read the site's description.

2. Create original content. Quality also means originality. Not only should each page of your site have content different from every other page, but you'll also be docked for copying the content of others. Make your text original!

3. Incorporate appropriate images. Quality images, tagged with good keywords, can also help your rankings with search engines.

4. Use keywords. Find the most relevant and most searched keywords that relate to the content you provide, then add those keywords to your site text. Use each one a few times within the page in a way that relates to the rest of the text and reads naturally. Going over the top with the word-drops, or pairing keywords with irrelevant content, will get you punished in the rankings, however.

5. Target niche keywords with low competition. This involves at least a little bit of figuring out what makes your business unique. Maybe you're not just a clothing designer, but a geeky clothing designer; maybe you're not just an auto shop, but an auto shop in Seattle. Use Google AdWords to check how competitive your keywords are before deciding on them, and be sure that the keywords have at least some searches. You will want to try broader keywords too.

6. Have a site map. Create a site map that tells people where everything is on your site. Only about 1% of visitors will click through to it, but it does wonders for those who know what site maps are for, and search engines will like it as well.
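Alongside the human-readable page, you can also give search engines a machine-readable XML sitemap. A minimal sketch looks like the following; example.com and the URLs are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; only <loc> is required -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-11-26</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>
```

You can then submit the sitemap file to Google through Search Console (formerly Webmaster Tools) so the engine knows where to find it.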

Method 3 of 4: Creating Your Code

1. Choose a good domain name. A keyword as the first word in a domain name will boost your traffic a little. A country TLD (top-level domain) will boost your rankings locally but hurt you internationally, so use one with caution. Avoid dated domain naming techniques like replacing words with numbers. Being on a subdomain (like something.tumblr.com) will also hurt you.

Keywords in your own subpages and subdomains also help. Your subpages especially should always have a descriptive title.

2. Use descriptions and meta tags. The description is a tagged part of your page's code that describes the content on the page. Having one at all will help your rankings, and having one that contains good keywords will help even more. If your site uses the same tags on every page, you are not helping search engines figure out the subject or relevance of your individual pages. Regarding meta tags, there are two very important fields:


  • Title tag – arguably the most important SEO tag for any website. Google supports approx. 60 characters in the title, while Yahoo allows for up to 110. It is important to target the most critical keywords in the title, and every page should have a unique one.
  • Meta description tag – these were once important for ranking but are no longer. Some engines display the description you define, while others do not; some do read the tag and use its content in the ranking process, but Google, MSN and Yahoo give it little to no weight.
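As a sketch, both tags sit in the page's head; the store name and wording below are made-up examples, not a template you must copy:

```html
<head>
  <!-- Title tag: unique per page, key phrase near the front, roughly 60 characters -->
  <title>Handmade Leather Wallets | Example Store</title>
  <!-- Meta description: little ranking weight, but often shown as the search snippet -->
  <meta name="description" content="Browse handmade leather wallets, crafted to order and shipped worldwide.">
</head>
```

Even though the description carries little ranking weight, a well-written one can improve how often searchers click your listing.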



3. Use headers. Headers are similar to descriptions, and the same rules apply: having one at all helps, and having one with keywords is even better. Use them!
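In practice this means using HTML heading tags rather than just bold text. A minimal illustration (the wording is hypothetical):

```html
<!-- One h1 describing the whole page, h2s for its sections -->
<h1>Caring for a Leather Wallet</h1>
<h2>Cleaning</h2>
<p>...</p>
<h2>Conditioning</h2>
<p>...</p>
```

Search engines treat heading tags as signals of what the page and each section are about, so keywords placed there carry more weight than the same words buried in a paragraph.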

4. Keep it simple. Keep the structure, navigation and URLs of your site simple enough for search engines to follow. Remember that search engines cannot parse navigation built with Flash or JavaScript, so stay close to standard HTML for navigation. URLs with dynamic parameters (?, &, session IDs) usually do not perform well when it comes to search engine rankings.
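For example (example.com and the paths are hypothetical), compare a parameter-heavy URL with a cleaner, static-looking alternative for the same page:

```text
Harder to rank:  http://www.example.com/view.php?cat=7&id=123&sid=9a3f
Friendlier:      http://www.example.com/wallets/handmade-leather/
```

The second form is easier for crawlers to follow, avoids the session-ID trap of indexing the same page many times, and even puts a keyword in the URL.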

Method 4 of 4: Making Connections

1. Create quality backlinks. A backlink is another website linking to your page, and it works in your favor when that site gets more traffic than yours does. The best types of "link building" are directory registration, text link advertising, and press release distribution, but you can also build links through link exchanges, cross promotion, or guest blogging on a relevant blog.
Try to offer valuable information or tools so that other people are motivated to link to your site. This will increase the chances of natural backlinks.

2. Do NOT spam. Spamming comment sections and other website areas will get you docked severely by Google and other search engines, or removed entirely. Do not spam people to build backlinks for yourself. Search engines will also punish you if your name is attached to spam complaints or if you operate your website anonymously.

3. Do that social media thing. Right now, social media shares and likes are among the activities most rewarded by Google and other search engines, especially for subjects that are currently relevant. Create social media accounts with the major sites and update them regularly. Avoid being spammy: don't just post ads; post pictures of customers, events you attend relating to your business, and other content your fans might enjoy.

4. Update your site regularly. Most search engines reward websites that see regular, or at least recent, updates.

How does each algorithm update affect your site?

2015


  • November 19, 2015 – Many webmasters noticed ranking changes at this time although it is not obvious whether this was a Panda related change. Glenn Gabe noticed that many sites with quality issues were affected.
  • November 2, 2015 – Possible Panda tremor. I personally saw several sites with Panda issues either increase or decrease on this date. However, other SEOs who do Panda work did not notice changes. This “new Panda” may make it difficult to determine when Panda related changes are happening.
  • November 1, 2015 – Google’s app interstitial penalty goes live. Mobile friendly pages that show a large interstitial app will no longer get the mobile friendly label. This could cause a drop in mobile traffic.



  • October 14/15, 2015 – Possible Panda movement – Some wondered if they were seeing more “zombie” traffic being reported. (Zombie traffic is useless non-legitimate traffic that Google can occasionally send). However, several sites reported Panda improvements or hits on this date. Still, many sites that are awaiting Panda recovery have not seen any movement at all.
  • October 7, 2015 (approx) – Possible unannounced algo shift. While nothing official was announced on this day, many webmasters saw a change in rankings. Some speculated that this was Google testing Penguin. Others thought it was a continuation of the long rollout of Panda that started in July. Personally, I saw some small increases in a couple of sites that had both Penguin and Panda issues.
  • October 5, 2015 (approx) – Google announced that they have released a new algorithm to deal with how they display hacked sites in the search engine results. This change affects 5% of queries, which is a fairly big thing. If you noticed a drop on or around this date, you may want to investigate for evidence of hacking. (Quick tip…try doing Google searches like this – site:yoursite.com viagra | cialis | ugg | kors – to see if there are pages on your site that are hosting hacked content.)
  • October 1-14 (approx) – Do you run a Wix website? For some reason Google has had trouble crawling these and they have been dropping out of Google’s index. Google became aware of the problem around October 13 and is working to fix it. This change also affected other websites that use AJAX crawling with an escaped fragment.



  • September 16, 2015 – Possible Panda tremor. Glenn Gabe noticed several sites seeing significant Panda hits/recoveries on this date. Still, most sites awaiting Panda recovery saw nothing.
  • September 16, 2015 (approx) – iOS 9 was released along with a feature to allow users to block ads. It appears that this feature also blocks scripts that track analytics. If you see a drop in Google Analytics traffic around this time, compare your mobile traffic to desktop to see if there is a big difference. It may be that you’re not actually seeing a traffic drop, but that visits from iPhones and iPads are no longer being tracked.
  • September 15, 2015 – Google image referral bug – This bug affected images outside of the US. Sites that see a lot of referrals from Google images may have a temporary drop on this day. It was fixed as of September 16.
  • September 9, 2015 – Possible Panda tremor. I didn’t personally see much Panda action on this day, but Glenn Gabe noticed some sites seeing Panda related movement.
  • September 2, 2015 – Possible Panda tremor.



  • August 31, 2015 – Possible Panda action – Panda 4.2 started rolling out in mid July but very few people reported changes. Around August 31, a few sites noticed some Panda related movement. For example, seroundtable.com which had seen an increase in July saw a slight decrease on August 31. Still, most SEOs who work with Panda hit sites saw very little movement on either of these dates.
  • August 6, 2015 – There were big changes in how Google displays local listings. It appears that in most places where Google would previously display 7 local listings, they are now only displaying 3. This is not just a US change, but appears to be international. While some sites will see a drop in traffic because they were previously in the lower half of the 7-pack, others may see an increase in traffic because Google is no longer displaying the full address which may cause more people to click through to the website.
  • August 1, 2015 (approximate date) – Unannounced algo update. Many people saw dramatic changes in rankings. I had several Penguin hit clients who saw significant improvements at this time, but Gary Illyes from Google told me that this was not Penguin related. It appears that most people who saw changes at this time had those changes revert a few days later.



  • July 28, 2015 – Google shuts down unverified Google My Business Listings.
  • Late July 2015 – Many webmasters are reporting that their images are not being seen on Google searches and that the number of images indexed is decreasing. If this happened to you, check to see if you have a manual action in Google Search Console. It appears that there was a large number of image mismatch manual actions sent out recently.
  • Starting July 17, 2015 and continuing to roll out, possibly for months – Panda 4.2. Google announced that they started a slow rollout of Panda after a 10 month wait. They stated that this rollout could take months to complete. As of July 27, there had been a couple of reported cases of slight upticks, but no big recoveries.
  • July 14, 2015 – Possible Phantom / Quality Update tremor.



  • June 28, 2015 – Possible Phantom / Quality Update tremor.
  • June 17, 2015 – There was a big change noticed on all of the SERP trackers, but it was not Panda or Penguin. Google confirmed that a change happened but said it was one of the many changes they make on a regular basis. Dr. Pete from Moz speculated that the change had something to do with Wikipedia switching to https, but this was denied by Gary Illyes from Google. There is a good chance that this was a Phantom / Quality Update tremor.
  • June 8, 2015 – Possible Phantom / Quality Update tremor.
  • Sometime in June, 2015 – Google app indexing is available to all app publishers now. This can change the rankings somewhat. For example, app indexing has caused some Etsy sellers to rank higher.



  • May 27, 2015 – Possible Phantom / Quality Update tremor.
  • May 9, 2015 – Many sites given manual actions for thin content.



  • Last week of April: Possible changes to the Google news algorithm.
  • April 29, 2015 – May 1, 2015  – The Quality Update, also known as Phantom 2. This was a fairly large update, which was quite similar to Panda. The focus appears to be on on-site quality, but Google has not given much instruction on recovery other than a vague, “Improve your quality”.
  • April 25, 2015 (date approximate) – Possible unannounced algorithm change. Many people saw ranking changes and the tools like Mozcast and Algoroo reported changes that appeared to be unrelated to the mobile friendly algorithm.
  • April 22, 2015 (or possibly starting within the following week) – The mobile friendly algorithm was launched. This algorithm gives a boost to sites that Google deems mobile friendly. It only affects mobile rankings and not desktop.
  • Sometime in April (probably): Google likely launched a change to the doorway page algorithm that punished sites with doorway pages. However, no one in the SEO world really noticed much change so it’s hard to say exactly when this happened.



  • March 25, 2015 – Possible unannounced algorithm change. Some people suspected that this was a test of Panda. John Mueller from Google commented that these changes and those seen around March 18-20 were likely just normal algorithm fluctuations.
  • March 18-20, 2015 (dates approximate) – Possible unannounced algorithm change. Many people saw ranking changes, but this was not Panda or Penguin.
  • March 10, 2015 – Google now showing drink recipes in the knowledge snippets. This might affect some recipe sites.
  • February 19, 2015 – A bug causes “This site may be hacked” messages to appear for a large number of sites, resulting in massive temporary traffic drops.
  • February 4, 2015 – Possible unannounced algorithm update. This was not Panda or Penguin. Glenn Gabe noticed that many of the sites that were affected by this change had quality issues, however.
  • February 4, 2015 – Google launched a mortgage calculator. This won’t affect traffic for most sites.
  • January 26, 2015 – Possible unannounced algorithm change. This likely was not Panda or Penguin.
  • January 16, 2015 – The Knowledge Graph displays ticket links. This won’t affect most people’s traffic, but could have big changes if you are a site selling show tickets.

2014

  • December 22, 2014 (+ a few days for rollout) – The Pigeon algorithm, affecting local search results, rolled out to English-speaking countries outside of the US such as Canada, Australia and the UK.
  • December 5-6, 2014 – Penguin fluctuations.
  • December 2, 2014 – Possible Penguin refresh. This was likely a continuation of Penguin 3.0 that started on October 17. Many sites saw further hits or increases on December 2, but most of these appeared to be short lived.
  • November 27, 2014 – Confirmed refresh of Penguin. This was a continuation of Penguin 3.0 that started on October 17. This was also Thanksgiving weekend, so changes in traffic around this time can be difficult to interpret.
  • November 24, 2014 – A large Polish link network was taken down by Google.
  • November 14, 2014 – Google drops the carousel for local results. This likely didn’t result in much traffic fluctuation, but could possibly have caused some changes for some small businesses.
  • November 13, 2014 – Possible unannounced algo update. There is a possibility that this was a mild refresh of Penguin. However, I did not see much change in sites I monitor.
  • November 10-11, 2014 – Possible unannounced algo update. This likely was not Panda or Penguin, but many people saw ranking changes.
  • October 24, 2014 – Unannounced Panda refresh. This was not announced by Google but most of us who do Panda work noticed dramatic changes in some Panda hit sites at this time. 6 months later when we were wondering why Panda was no longer refreshing, John Mueller confirmed that the last refresh was at the end of October.
  • October 22, 2014 – Possible tweaking of the Penguin algorithm.
  • October 21, 2014 – Google Pirate Update – Aimed at illegal torrent sites and other sites with large amounts of pirated content.
  • October 17, 2014 – Penguin 3.0 refresh. After waiting an entire year, Penguin finally refreshed on this date. This was not an update but rather a refresh, meaning nothing changed in how Google makes Penguin calculations; they just re-ran the algorithm. This was a rolling refresh with no fixed end, though most sites that saw changes saw them on October 17-18, October 22, November 27, or December 2, 5 or 6. It was supposed to be a refresh that would allow people who had done cleanup work to recover, but many sites that should have recovered did not.
  • October 12, 2014 – Probable Panda tremor – Many sites that were previously affected by Panda 4.1 in late September saw an increase or decrease in traffic at this time.
  • October 9, 2014 – Possible testing of the Penguin algorithm. This is not confirmed, but I had many people who had sites previously affected by Penguin contact me to say that rankings were up. In most cases these gains disappeared within 24 hours. If you saw a short spike in traffic on this day, it may be that your rankings were part of Google's testing of Penguin.
  • October 4-6, 2014 (dates approximate) – Possible unannounced update – Many people saw dramatic changes in rankings at this time. Most likely this was a continuation of the slow rollout of Panda.
  • September 29, 2014 – Possible Panda Tremor – People who do a lot of work with Panda sites saw a lot of up and down movement on sites on this day. Some had seen no movement with the initial rollout of Panda 4.1 on September 25.
  • September 25, 2014 (and continuing off and on for 1-2 weeks) – Panda 4.1. This was a slow rollout that apparently took a week or more and may have started a few days prior to September 25.
  • September 21-23, 2014 (dates approximate) – Google gives out many manual penalties to private blog network sites. If you used PBN links to rank, you may have seen a drop at this time.
  • September 16, 2014 – Google penalizes several Turkish news sites. This is not likely to affect most sites’ traffic.
  • September 5, 2014 – Probable unannounced Panda refresh. Google did not announce a refresh of Panda on this date, but many people who do work with Panda hit sites noticed significant changes.
  • August 28, 2014 – Authorship completely removed.
  • August 18, 2014 – Google took action against a European and a German link network.
  • August 6, 2014 – Google announces that HTTPS is possibly a ranking signal. (It turned out to be an extremely small signal, if anything.)  If you migrated to HTTPS and did not do things properly you could see a drop in rankings around the time of your migration.
  • July 24, 2014 – The Pigeon Algorithm rolls out affecting local search results in the US only.
  • July 16, 2014 – Google disables discussion search. This is not likely to affect many sites, but if you run a forum you could possibly see a decrease in traffic at this time.
  • July 5, 2014 – Possible Unannounced Update
  • June 28, 2014 – Possible Unannounced Update – May have been due to a drop in Authorship photos.
  • June 24, 2014 – Possible Mild Unannounced Update – I debated even including this one as not many sites appeared to be affected but it is possible that Google made some significant algorithmic changes on this date.
  • June 21, 2014 – Possible Mild Unannounced Update – The tools that track changes in the SERPS all noticed something was going on, but there was not a lot of chatter on search engine forums about changes.
  • June 12, 2014 – Payday Loans 3 – This update to the Payday Loans algorithm now targets spammy queries as well as spammy sites.
  • May 28, 2014 (approx) – Possible unannounced update. Many people thought that this was a Penguin refresh, but Google denied it. Personally, I do not think that this was related to Penguin.
  • May 20, 2014 – Panda 4.0  – This was a large Panda update that affected many sites. Quite a few sites that were previously hit by Panda saw very dramatic increases in traffic. This was a whole new version of Panda with a new architecture that was apparently much more lenient. My note: Many sites that saw changes with Panda 4.0 started to see changes a few days before the official announcement of Panda 4.0.
  • May 16, 2014 (approx) – Payday Loans 2 – Google made the announcement on May 20 that they had released a new version of this spam algorithm that targets spam in competitive niches. You don’t have to be a Payday Loans site to be affected. This caused a lot of confusion as Payday Loans was updated a couple of days before a massive Panda update (Panda 4.0).
  • May 2 & May 7, 2014 – Possible unannounced update. However, this seemed to revert on May 12. Many big brands saw rankings change around this time.
  • April 20-23, 2014 (approx) – Possible Panda refresh.
  • April 18, 2014 – Google penalizes blog network, Post Joint.
  • April 18, 2014 – Many song lyrics sites affected again.
  • April 14, 2014 – Possible unannounced update. May have been a mild Panda refresh.
  • April 8, 2014 (and previously) – Google took action on seven Japanese link networks.
  • April 5, 2014 – Many song lyrics sites demoted.
  • March 24, 2014 – Possible unannounced update – could be a Panda refresh.
  • March 20, 2014 (approx) – Google took action on a Greek link network.
  • March 19, 2014 – My Blog Guest gets penalized. This is a network that many used to find sites on which to post guest posts. Many sites received inbound and outbound unnatural link penalties at this time.
  • March 14, 2014 – Google penalizes link networks in Germany, Italy and Spain.
  • March 5, 2014 (approx) – Google started publishing sponsored knowledge graph posts. This is not an algo update, but could affect your traffic if a competitor starts appearing above you because they’ve paid for inclusion.
  • February 13-16, 2014 (approx) – Unannounced update. There is a good chance this was a Panda refresh.
  • February 6, 2014 – The Page Layout Algorithm refreshes. This refresh did not seem to affect many sites.
  • February 6, 2014 – Google penalizes several German link networks
  • January 29, 2014 – Google takes action against Buzzea, a French link network.
  • January 21, 2014 (date approximate) – Google drops discussion filter from search – This may affect traffic to a small degree for forum sites.
  • January 11, 2014 – Possible Panda refresh
  • January 8-9, 2014 – Unannounced update

2013

  • December 19, 2013 – Authorship removed.
  • December 17, 2013 – Possible unannounced algorithm change. All of the SERP trackers showed lots of change on this date but this was not Panda or Penguin.
  • December 13, 2013 – Google takes down link network Backlinks.com
  • December 6, 2013 – Google takes down a large link network, Anglo Rank, as well as some other unnamed link networks.
  • December 4, 2013 – Google adds car info to the knowledge graph.  This can definitely affect search traffic for sites in the automobile industry.
  • November 27-29, 2013 (dates approximate) – Possible unannounced update. Barry Schwartz reported that a lot of sites were reporting traffic losses, but it was possible that this was just low traffic due to the Thanksgiving holiday.
  • November 14, 2013 – Unannounced update – Google did not announce an update, but a LOT of websites had their traffic adversely affected on this day.
  • November 1-5, 2013 (dates approximate) – Possible unannounced update.
  • October 16, 2013 – Possible unannounced update.  It was speculated that this was related to Penguin, but we have no confirmation of this.
  • October 4, 2013 – Penguin 2.1 update – This Penguin update affected a lot of sites.
  • September 30, 2013 – Knowledge graph adds musician carousel – This probably doesn’t affect most sites, but certainly could affect the traffic to music sites.
  • September 29, 2013 – The knowledge graph gets filters – This probably didn’t have a big impact on most sites, but could possibly affect traffic for some.
  • September 26, 2013 – Google announced the launch of Hummingbird.  HOWEVER, they announced that Hummingbird had actually launched about a month previously.  Most SEOs were not aware of a major change, although there were some unexplained updates (see below) that could explain this launch.
  • September 20, 2013 – Large Russian link network, Ghost Rank (and possibly others) shut down.
  • September 12, 2013 – Unannounced update – May have been Hummingbird
  • September 4, 2013 – Unannounced update – May have been Hummingbird
  • August 20, 2013 – Unannounced update – May have been Hummingbird
  • August 6, 2013 – Google announces In Depth Articles.  This likely did not have a drastic effect on the traffic of most sites.  But, if you notice a change on this date, it may be that some of your articles were either pushed up or down in the SERPS depending on whether Google thought they were worthy of the “In Depth Article” label.
  • August 1, 2013 – Google Analytics Reporting bug – For many, but not all Google analytics users, GA just stopped reporting traffic.  I’m not sure if this has been corrected or not.  But, if you see a sharp dip on Aug 1, it’s probably just a reporting error.
  • July 26-29, 2013 (dates approximate) – Possible Unannounced update.  The general consensus was that this probably was not a Panda or Penguin update.  It is even debatable whether anything actually happened.
  • July 19, 2013 – The Knowledge Graph expanded significantly.
  • July 12-18, 2013 (dates approximate) - Confirmed Panda update. This was the first of the new “softer” Panda updates.
  • June 27, 2013 (and surrounding dates) – “Multi-Week” Algorithm update. There were many ranking fluctuations during the last week in June.
  • June 19, 2013 – Possible unannounced update.  However, if you read the comments, many people thought this was a Google Analytics bug.
  • June 14, 2013 – Google changes the image carousel in image search. This *might* affect image search queries.
  • June 11, 2013 – Payday Loans algorithm rolls out.  This algorithm doesn’t just affect payday loans sites, but any site that has the potential for super spammy SERPS.
  • June 5, 2013 – Unannounced update
  • June 5, 2013 – Knowledge Graph adds nutrition info.  This won’t affect most sites, but certainly can cause a drop in rankings for calorie counting/fitness/recipe sites.
  • May 22, 2013 – Penguin 2.0
  • May 21, 2013 – Domain Crowding update – designed to create diversity in the SERPS.
  • May 15, 2013 – Ghost link network deindexed and Text Link Ads (TLA) targeted.
  • May 4-May 9, 2013 –  Widespread unknown update. Matt Cutts confirmed it wasn’t Penguin.  Could be Panda?  Could be something else? Glenn Gabe dubbed this the Phantom update.
  • April 6, 2013 – Potential unannounced update – although, if you read the comments, it sounds like perhaps there was a temporary glitch with Google Analytics.
  • March 27, 2013 – Google Updates its Quality Guidelines for News – “If a site mixes news content with affiliate, promotional, advertorial, or marketing materials (for your company or another party), we strongly recommend that you separate non-news content on a different host or directory, block it from being crawled with robots.txt, or create a Google News Sitemap for your news articles only. Otherwise, if we learn of promotional content mixed with news content, we may exclude your entire publication from Google News.”
  • March 15, 2013 – Google announces no more regular Panda updates – From this point on, Google will no longer be announcing Panda updates.  They will gradually roll out with the regular algorithm changes.
  • March 14, 2013 – Panda refresh? – Google said that they would release Panda that weekend.  No official date was given, but most SEOs believe it was around March 14, possibly March 13.
  • March 7, 2013 – Google Penalizes SAPE links – A Russian link network, SAPE links is penalized so that links coming from this network are likely worthless.
  • February 22, 2013 – News websites penalized – Google reduced the PageRank of many UK news websites to zero because they had been selling links in the form of advertorials.
  • January 31, 2013 – Google changes Google+ (Maps) algorithm with respect to false reviews. – It is debatable whether this will cause a loss in rankings, but Google did say that they would be removing reviews that they felt were fake.
  • January 23, 2013 – Google changes how they present images in image search – While this was not an algorithm change, it could result in fewer visitors to a site.  Now, image search shows images directly on Google.  A user needs to click a link to go directly to your site.
  • January 22, 2013 – Panda Refresh
  • January 17, 2013 – Unofficial update? – Although no update was announced, many webmasters noticed significant changes in traffic around this date.

2012

  • December 21, 2012 – Panda Refresh
  • December 13, 2012 – Safe Search Changes – Google updated the way that Safe Search works so that it is now more strict on explicit content.
  • December 10, 2012 – Unannounced update? – No official update was announced, but many webmasters were complaining about traffic changes at this time.
  • December 4, 2012 – Google expands the Knowledge graph.
  • November 21, 2012 – Panda Refresh
  • November 15, 2012 (approx) – Google image update? – Many webmasters noticed that they were seeing fewer referrals from Google images.
  • November 5, 2012 – Panda Refresh
  • October 23, 2012 – Scraper update in Japan – Google Japan makes changes to its algorithm to combat scraper sites.
  • October 9, 2012 – Page Layout Update #2 – Google made changes to the page layout update which penalizes sites that have poor content above the fold.
  • October 5, 2012 – Penguin update.
  • October 1, 2012 – Possible Image Search Update.
  • September 28, 2012 – EMD update – This update demoted low quality exact match domains.
  • September 27, 2012 – Panda update – This was a major update that affected a lot of sites.
  • September 18, 2012 – Panda refresh
  • September 14, 2012 – Diversification update – This was a small update intended to diversify search results.
  • August 28, 2012 – Unknown possible update – Many webmasters complained of traffic losses but no update was announced.
  • August 20, 2012 – Minor Panda Refresh
  • August 14-20, 2012 (approximately) – Google began occasionally showing users only 7 search results on the first page.
  • August 16, 2012 – Possible unknown update. Many webmasters were noticing traffic drops.
  • August 10, 2012 – DMCA takedown update.  Sites can be penalized for having too many DMCA takedown requests.
  • July 24, 2012 – Panda Refresh
  • July 19, 2012 – A large number of unnatural-link manual actions were sent out.
  • July 3, 2012 – Image search stopped working for Internet Explorer users.  (Fixed July 16)
  • June 25, 2012 – Panda Refresh
  • June 8, 2012 – Panda Refresh
  • June 1, 2012 – Google shopping changes to a paid model with more prominent results for sites that pay for inclusion.
  • May 25, 2012 – Penguin Update
  • May 16, 2012 – Knowledge Graph rolls out
  • May 15, 2012 (or previous to this) – Google deindexes many free directories.
  • May 4, 2012 – Google announces changes to the news real time updates.
  • April 27, 2012 – Panda update (minor)
  • April 24, 2012 – First Penguin Update
  • April 19, 2012 – Panda update
  • April 16, 2012 – Parked Domain bug – A mistake caused Google to treat some sites as parked domains.
  • March 23, 2012 – Panda refresh
  • March 19, 2012 – BMR blog network deindexed

How to understand your search results?

Google strives to make it easy to find whatever you’re seeking, whether it’s a web page, a news article, a definition, or something to buy. After you enter a query, Google returns a results list ordered by what it considers the items’ relevance to your query, listing the best match first. (Sponsored links may appear above and to the right of the search results.) This part of Google Guide describes what appears on a results page and how to evaluate what you find so you’ll be better able to determine if a page includes the information you’re seeking or links to it.

How Google Works

If you aren’t interested in learning how Google creates the index and the database of documents that it accesses when processing a query, skip this description. I adapted the following overview from Chris Sherman and Gary Price’s wonderful description of How Search Engines Work in Chapter 2 of The Invisible Web (CyberAge Books, 2001).

Google runs on a distributed network of thousands of low-cost computers and can therefore carry out fast parallel processing. Parallel processing is a method of computation in which many calculations can be performed simultaneously, significantly speeding up data processing. Google has three distinct parts:

Googlebot, a web crawler that finds and fetches web pages.
The indexer that sorts every word on every page and stores the resulting index of words in a huge database.
The query processor, which compares your search query to the index and recommends the documents that it considers most relevant.
Let’s take a closer look at each part.

1. Googlebot, Google’s Web Crawler

Googlebot is Google’s web crawling robot, which finds and retrieves pages on the web and hands them off to the Google indexer. It’s easy to imagine Googlebot as a little spider scurrying across the strands of cyberspace, but in reality Googlebot doesn’t traverse the web at all. It functions much like your web browser, by sending a request to a web server for a web page, downloading the entire page, then handing it off to Google’s indexer.

Googlebot consists of many computers requesting and fetching pages much more quickly than you can with your web browser. In fact, Googlebot can request thousands of different pages simultaneously. To avoid overwhelming web servers, or crowding out requests from human users, Googlebot deliberately makes requests of each individual web server more slowly than it’s capable of doing.

Googlebot finds pages in two ways: through an add URL form, www.google.com/addurl.html, and through finding links by crawling the web.

Unfortunately, spammers figured out how to create automated bots that bombarded the add URL form with millions of URLs pointing to commercial propaganda. Google rejects URLs submitted through its Add URL form that it suspects are trying to deceive users with tactics such as hidden text or links on a page, pages stuffed with irrelevant words, cloaking (aka bait and switch), sneaky redirects, doorway pages, domains or sub-domains with substantially similar content, automated queries to Google, and links to bad neighbors. So the Add URL form now also has a test: it displays some squiggly letters designed to fool automated “letter-guessers” and asks you to enter the letters you see, something like an eye-chart test to stop spambots.

When Googlebot fetches a page, it culls all the links appearing on the page and adds them to a queue for subsequent crawling. Googlebot tends to encounter little spam because most web authors link only to what they believe are high-quality pages. By harvesting links from every page it encounters, Googlebot can quickly build a list of links that can cover broad reaches of the web. This technique, known as deep crawling, also allows Googlebot to probe deep within individual sites. Because of their massive scale, deep crawls can reach almost every page on the web. Because the web is vast, this can take some time, so some pages may be crawled only once a month.

Although its function is simple, Googlebot must be programmed to handle several challenges. First, since Googlebot sends out simultaneous requests for thousands of pages, the queue of “visit soon” URLs must be constantly examined and compared with URLs already in Google’s index. Duplicates in the queue must be eliminated to prevent Googlebot from fetching the same page again. Googlebot must determine how often to revisit a page. On the one hand, it’s a waste of resources to re-index an unchanged page. On the other hand, Google wants to re-index changed pages to deliver up-to-date results.
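
The queue-plus-deduplication logic described above can be sketched in a few lines of Python. The in-memory "web" below stands in for real HTTP fetches and is purely illustrative:

```python
from collections import deque

# A toy "web": URL -> links found on that page (stand-in for real fetches).
TOY_WEB = {
    "a.html": ["b.html", "c.html"],
    "b.html": ["a.html", "c.html"],
    "c.html": ["d.html"],
    "d.html": [],
}

def crawl(seed):
    """Breadth-first crawl: harvest links from each fetched page and
    queue them, skipping URLs already seen so no page is fetched twice."""
    frontier = deque([seed])  # the "visit soon" queue
    visited = set()           # URLs already fetched (the duplicate check)
    order = []
    while frontier:
        url = frontier.popleft()
        if url in visited:
            continue          # duplicate eliminated from the queue
        visited.add(url)
        order.append(url)
        frontier.extend(TOY_WEB.get(url, []))
    return order

print(crawl("a.html"))  # ['a.html', 'b.html', 'c.html', 'd.html']
```

A production crawler would add politeness delays per server and revisit scheduling, but the frontier queue and visited set are the core of the dedup problem the text describes.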

To keep the index current, Google continuously recrawls popular, frequently changing web pages at a rate roughly proportional to how often the pages change. Such crawls keep an index current and are known as fresh crawls. Newspaper pages are downloaded daily; pages with stock quotes are downloaded much more frequently. Of course, fresh crawls return fewer pages than the deep crawl. The combination of the two types of crawls allows Google to both make efficient use of its resources and keep its index reasonably current.

2. Google’s Indexer

Googlebot gives the indexer the full text of the pages it finds. These pages are stored in Google’s index database. This index is sorted alphabetically by search term, with each index entry storing a list of documents in which the term appears and the location within the text where it occurs. This data structure allows rapid access to documents that contain user query terms.

To improve search performance, Google ignores (doesn’t index) common words called stop words (such as the, is, on, or, of, how, why, as well as certain single digits and single letters). Stop words are so common that they do little to narrow a search, and therefore they can safely be discarded. The indexer also ignores some punctuation and multiple spaces, as well as converting all letters to lowercase, to improve Google’s performance.
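
A toy version of such an inverted index, with stop-word removal and lowercasing, might look like this (illustrative only; Google's actual index is vastly larger and more elaborate):

```python
# Words too common to narrow a search; a tiny sample list.
STOP_WORDS = {"the", "is", "on", "or", "of", "how", "why", "a"}

def build_index(pages):
    """Build an inverted index: term -> {document: [word positions]}.
    Lowercases everything and drops stop words before indexing."""
    index = {}
    for doc, text in pages.items():
        for pos, word in enumerate(text.lower().split()):
            if word in STOP_WORDS:
                continue  # stop words are safely discarded
            index.setdefault(word, {}).setdefault(doc, []).append(pos)
    return index

pages = {
    "p1": "The cat is on the mat",
    "p2": "How the mat was made",
}
index = build_index(pages)
print(index["mat"])  # {'p1': [5], 'p2': [2]}
```

Storing positions (not just document IDs) is what later enables the proximity and phrase matching described below in the query processor section.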

3. Google’s Query Processor

The query processor has several parts, including the user interface (search box), the “engine” that evaluates queries and matches them to relevant documents, and the results formatter.

PageRank is Google’s system for ranking web pages. A page with a higher PageRank is deemed more important and is more likely to be listed above a page with a lower PageRank.

Google considers over a hundred factors in computing a PageRank and determining which documents are most relevant to a query, including the popularity of the page, the position and size of the search terms within the page, and the proximity of the search terms to one another on the page. A patent application discusses other factors that Google considers when ranking a page. Visit SEOmoz.org’s report for an interpretation of the concepts and the practical applications contained in Google’s patent application.
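
Google guards its exact formula, but the originally published PageRank calculation is public. A minimal power-iteration sketch over a hypothetical three-page site:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Classic PageRank by power iteration over a link graph
    (page -> list of pages it links to)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages  # dangling pages share rank evenly
            for target in targets:
                new[target] += damping * rank[page] / len(targets)
        rank = new
    return rank

# Toy three-page site: every page links back to "home".
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)  # "home" ends up with the highest score
```

The page everything links to ("home") accumulates the most rank, which is the intuition behind treating inbound links as votes of importance.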

Google also applies machine-learning techniques to improve its performance automatically by learning relationships and associations within the stored data. For example, the spelling-correcting system uses such techniques to figure out likely alternative spellings. Google closely guards the formulas it uses to calculate relevance; they’re tweaked to improve quality and performance, and to outwit the latest devious techniques used by spammers.

Indexing the full text of the web allows Google to go beyond simply matching single search terms. Google gives more priority to pages that have search terms near each other and in the same order as the query. Google can also match multi-word phrases and sentences. Since Google indexes HTML code in addition to the text on the page, users can restrict searches on the basis of where query words appear, e.g., in the title, in the URL, in the body, and in links to the page, options offered by Google’s Advanced Search Form and Using Search Operators (Advanced Operators).
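
The proximity-and-order idea can be illustrated with a toy phrase matcher (not Google's implementation, just the principle of rewarding terms that appear adjacent and in query order):

```python
def phrase_positions(text, phrase):
    """Return the word offsets where the words of `phrase` appear
    consecutively and in order in `text` (a toy phrase matcher)."""
    words = text.lower().split()
    target = phrase.lower().split()
    return [i for i in range(len(words) - len(target) + 1)
            if words[i:i + len(target)] == target]

print(phrase_positions(
    "best chocolate cake recipe for chocolate cake",
    "chocolate cake"))  # [1, 5]
```

A ranking function could score pages higher the more such exact-order matches they contain, compared with pages where the same words appear far apart.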




What SEO is not?


While SEO can be defined by what it IS: the practice of making changes to your website in order for the search engines to find and deliver targeted visitors to the website and building links to establish relevance and search engine trust, it can also be defined by what it IS NOT:

SEM
Those new to the online marketing scene often use the terms SEM (search engine marketing) and SEO (search engine optimization) interchangeably. SEO technically falls under the SEM umbrella, but they are not the same. The term SEM more often refers to paid search engine marketing techniques like PPC (pay per click). SEO is not considered a paid technique because it technically doesn’t cost anything: no money is exchanged between the website owner and the search engine to rank a page in organic search. However, it does take a lot of time and effort, which is essentially a cost, and since many businesses don’t have the time or the knowledge, they will pay an SEO company to do it for them.

A Quick Fix
SEO isn’t like other marketing strategies, such as running a promotion that lasts for a specific amount of time. SEO is ongoing and always changing. It begins with research and optimizing your website. The next step is link building. A site should build links naturally over time. The search engines view these inbound links as signs of trust. You can’t optimize your site, build 20 links, and expect to see results. It just doesn’t work like that. It takes a long time to see an improvement in your search engine ranking and traffic from the search engines. However, it’s a marketing strategy that works as long as it’s done correctly. People are searching online for everything. It’s a problem if you don’t have a search presence. An SEO strategy may take time, but it is time well spent and the payoff is worth it in the end.

Dead
The search engines are getting stricter and smarter, and search is becoming a lot more personalized. This has led some to believe that “SEO is dead”. Old-time SEO tactics like link exchanging and cloaking are dead, but that doesn’t mean that SEO as a whole is dead; it’s only evolving. SEO is becoming a lot more integrated with social media. The search engines understand that users trust the opinions of the people in their network, so they have integrated “social signals” into their search algorithms. Bing has Facebook data and Google has Google+ data. SEO is becoming “social SEO” and there is more of a focus on getting your content shared via social media +1’s, Likes, and links. As long as people are searching in Google and Bing, SEO will be an important part of an online marketing strategy.

Why SEO and ranking well matters?


In today’s competitive market SEO is more important than ever. Search engines serve millions of users per day looking for answers to their questions or for solutions to their problems. If you have a web site, blog or online store, SEO can help your business grow and meet its objectives.

Search engine optimization is essential because:

The majority of search engine users choose one of the top 5 suggestions on the results page, so to take advantage of this and gain visitors to your web site (or customers to your on-line store) you need to rank as high as possible.

  • SEO is not only about search engines but good SEO practices improve the user experience and usability of a web site.
  • Users trust search engines, and having a presence in the top positions for the keywords they search for increases a web site’s trust.
  • SEO is also good for the social promotion of your web site. People who find your web site by searching Google or Yahoo are more likely to promote it on Facebook, Twitter, Google+ or other social media channels.
  • SEO is also important for the smooth running of a big web site. Web sites with more than one author can benefit from SEO in a direct and indirect way. Their direct benefit is increase in search engine traffic and their indirect benefit is having a common framework (checklists) to use before publishing content on the site.
  • SEO can put you ahead of the competition. If two web sites are selling the same thing, the search engine optimized web site is more likely to have more customers and make more sales.


Some of SEO is Dead
Many tactics that have fallen under the SEO umbrella can safely be considered dead, either because they don’t work anymore, never worked, or still work but are in violation of Google’s guidelines. I’m not going to spend time discussing why they don’t work or are risky, because that’s not what this article is about, and there has been plenty written about the topic.

Just so we’re on the same page, here are some examples of basic SEO tactics that aren’t worth your time:


  • Keyword stuffing and hiding
  • Buying mass links, directory links
  • Duplicating websites (or categories) on different domains
  • Content spinning, automatic content
  • Optimizing purely for “ranking” outcomes


What SEO is Today
SEO at its core is the art and science of making high quality content easier to find on search engines. The key point is ‘quality content’ that helps customers answer questions that lead to a purchase or some other business outcome. Most of Google’s algorithm updates are intended to reward good content and punish spam. While it may not always feel like it, most of Google’s best practices for SEO are really on your side; you just need to learn and master them.

Here are some SEO tactics that are alive and well:


  • Keywords that support customer targeting
  • SEO copywriting and on-page optimization
  • Link attraction
  • Internal link optimization
  • Technical SEO (anything designed to make your site more accessible to search engines)
  • Optimizing for engagement and conversions

Quality Content is Good, Optimized Content is Best
If search engines are just trying to reward high quality content by making it more findable, isn’t it enough to just create great content and call it a day? Unfortunately, no.

While search engines are getting much smarter, more efficient, and overall better at ‘screening’ content, they still pale in comparison to people’s inherent ability to pick out the nuances and meaning of content. So it’s important to send the right signals to search engines and make those signals as easy to understand as possible.

Content quality comes down to relevance for customers and there’s no better way to target customer interests than through keywords. Every search begins with someone typing keywords into a search box, and ends with them clicking on one of the sites listed in the search results. If your site doesn’t include the keywords or closely related phrases on web pages, in meta-data, or inbound link anchor text, you’re not giving the search engines (or buyers) the information they need to understand your site’s relevance for that search query.
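
As a rough illustration (not a real SEO tool), a script can check where a keyword appears on a page. The regexes here are simplifications and would not survive arbitrary real-world HTML; the page content is invented:

```python
import re

def keyword_signals(html, keyword):
    """Rough check of where a keyword appears on a page:
    in the <title>, the meta description, and the visible body text."""
    kw = keyword.lower()
    text = html.lower()
    title = re.search(r"<title>(.*?)</title>", text, re.S)
    meta = re.search(r'<meta name="description" content="(.*?)"', text)
    body_text = re.sub(r"<[^>]+>", " ", text)  # crudely strip all tags
    return {
        "title": bool(title and kw in title.group(1)),
        "meta_description": bool(meta and kw in meta.group(1)),
        "body": kw in body_text,
    }

page = """<html><head><title>Handmade Leather Wallets</title>
<meta name="description" content="Shop handmade leather wallets.">
</head><body><h1>Leather wallets, handmade to order</h1></body></html>"""

signals = keyword_signals(page, "leather wallets")
print(signals)  # {'title': True, 'meta_description': True, 'body': True}
```

If any of these signals came back False for a page's target keyword, that would flag exactly the gap the paragraph above describes.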

Optimization of on-page copy and meta elements can have positive effects on search traffic and rankings, in particular for sites that are strong in most other aspects. For example, I have been working with a client in the software industry who has a well designed site that is technically sound, has useful and compelling content, and a strong back-link profile.

However, competitor keyword research and customer targeting analysis indicated that the keywords most relevant to their audience at the consideration and purchase stages of the buying cycle weren’t being effectively targeted (i.e. they appeared rarely or not at all in on-page copy, meta elements, or cross-linking).

Within 3 months of implementation of basic on-page content optimization, we achieved a 320% increase in organic search traffic, a 15% decrease in average bounce rate and page one rankings in the major search engines for nearly all of our identified target keywords. Better visibility for what customers are actually looking for leads to more traffic and sales.

Links Still Matter
While Google’s recent announcement about the decreased importance of links is significant, it is far too soon to write off quality links altogether. Crawling links is an important way for search engines to discover content, thus the more links pointing to your site (from relevant, quality sources), the more opportunities the search engines have to find your content.

Don’t fall into the trap of treating links as more important than quality content, or of believing that enough links pointing towards bad content can somehow make it good. This is the definition of misguided effort, as great content will not only attract quality links on its own (with help from effective promotion and social media shares), but is far more likely to increase visitor engagement when it’s found, and result in those all-important conversions.

Social shares are as important as links from other web pages, so ensure your content creation efforts include promotion through social networks. Grow your networks regularly to increase the audience reach of the optimized content you’re promoting. Google+, Facebook and Twitter are must-haves for any content promotion through social media. Just make sure you’re promoting plenty of other useful content, not just your own.

Increasingly, it has become important to not only acquire quality links, but to monitor and potentially remove low quality links, especially if you have received an unnatural link warning from Google. Regular monitoring and auditing of your site’s link profile is a good preventative measure, as bad links often have a cumulative effect, and can be very difficult to clean up once they become a clear problem.

Recently, a preliminary audit of a new client’s site indicated the prevalence of several nasty kinds of links, including paid site-wide links, and several thousand links from blog networks and link farms. Given the severity of the problem, we prioritized an extensive inbound link audit and disavowal initiative to ensure the quality content being published would not be negatively affected by previous SEO link building efforts.
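
As a sketch of the disavowal step, the snippet below builds a disavow-style file from hypothetical audit data. The "domain:" line format follows Google's disavow file convention, but the sites and categories here are entirely invented:

```python
from urllib.parse import urlparse

# Hypothetical audit output: inbound link URL -> assessed category.
inbound_links = [
    ("http://goodblog.example/post", "editorial"),
    ("http://linkfarm1.example/page", "link farm"),
    ("http://paid-sitewide.example/footer", "paid sitewide"),
    ("http://news.example/story", "editorial"),
]

BAD_CATEGORIES = {"link farm", "paid sitewide", "blog network"}

# Google's disavow file format accepts whole domains as "domain:" lines.
bad_domains = sorted({urlparse(url).netloc
                      for url, category in inbound_links
                      if category in BAD_CATEGORIES})
disavow_file = "\n".join("domain:" + d for d in bad_domains)
print(disavow_file)
```

The real work, of course, is in the audit itself: classifying each link's source, which no short script can automate reliably.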

Technical Problems can Prevent Search Engines (and People) from Finding and Engaging with Your Content

As fast as things change in SEO, the chances that search engine algorithms will start to penalize sites for functioning well from a technical standpoint are slim, and humans are no different. How many times have you wished a site would load slower?

The importance of optimizing your site so that your pages load fast, your content is easily accessible and your navigation is intuitive cannot be overstated. People will leave a site and never return if they get confused or have to wait too long, and search engines will too.

This is one area in particular to keep a close eye on, as small technical issues can have widespread and severe effects on your site’s search engine friendliness. Many companies with large sites that employ digital marketing agencies with strong SEO skills receive their value many times over just from ongoing technical optimization.

For example, unintentionally blocking pages or a whole site from being indexed in search engines via robots.txt is not only an SEO killer but very easy to do. Often development teams will temporarily block parts of a site when making updates, and unfortunately neglect to restore the robots.txt file afterwards.
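
Python's standard library can reproduce this failure mode directly. The robots.txt contents below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt a development team might leave behind after an update,
# accidentally blocking the entire site (hypothetical contents).
broken = RobotFileParser()
broken.parse("""\
User-agent: *
Disallow: /
""".splitlines())

# The restored file blocks only a staging area (also hypothetical).
fixed = RobotFileParser()
fixed.parse("""\
User-agent: *
Disallow: /staging/
""".splitlines())

print(broken.can_fetch("Googlebot", "https://example.com/products/"))  # False
print(fixed.can_fetch("Googlebot", "https://example.com/products/"))   # True
```

A single "Disallow: /" line is all it takes to lock every crawler out of every page, which is why a post-deploy robots.txt check is cheap insurance.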

As site updates can often introduce indexation and other technical website problems, it’s a good idea to include a step for an external team to check for any problems following a major update, as well as on an ongoing basis.

Modern SEO is Alive and Well
By definition, SEO is about an ongoing effort to improve the performance of your website content to be found both by search engines and customers using search engines. What better time is there for your useful content to be found than at the exact moment your customers need it? That’s the value search engine optimization brings to the online marketing mix. As long as people use search engines to find information and businesses have content they want potential customers to see, SEO will be important. I don’t see that changing anytime soon.

SEO basics – the fundamentals of any successful SEO strategy

For beginners to SEO, the above definition may sound complicated, so in simpler terms Search Engine Optimization is a way to improve your web site so that it appears closer to the top positions in the search results of Google, Yahoo, Bing or other search engines.

When you perform a search on Google (or any other search engine), the order in which the results are displayed is based on complex algorithms. These algorithms take a number of factors into account to decide which web site (or blog) should be shown in first place, second place, etc.

Optimizing your web site for search engines will give you an advantage over non-optimized sites and increase your chances of ranking higher.

What are the main stages of the Search Engine Optimization process?

As I mentioned above, SEO is not a static process but rather a framework with rules and processes. For simplicity, though, SEO can be broken down into 2 main stages:

On-site SEO: What rules to apply on your site to make it search engine friendly

and

Off-site SEO: How to promote your web site or blog so that it can rank better in search results.

On-site SEO

In my search engine optimization tips for beginners article I have explained with examples the 15 most important rules for on-site SEO. These are simple tweaks you can make to your web site to increase your search engine visibility. If followed correctly, these 15 SEO tips will also increase the usability and credibility of your web site or blog.

In addition to the above guidelines, the structure of a web site is also very important for SEO purposes. In my article on the importance of web site structure for an SEO-optimized web site you can read about the essential components of an optimized web site: which pages should not be missing from your web site, what makes a high-quality site, and why navigation and URL structure are vital for good SEO.

If you seriously take these 2 factors into account, i.e. web site structure and the SEO tips, that’s all you need to do to help search engines trust your web site. There is no need to spend more time than necessary on on-site SEO, nor should you over-optimize your web site or blog, because that can sometimes (under certain conditions) generate the opposite results.

Off-site SEO

Besides the changes you can do to your web site (on-site SEO) so that it ranks higher in the SERPs, the other way to improve your web site’s ranking position is by using off-site SEO techniques.

Off-site SEO is generally known as link building, but I prefer to use the term web site promotion since proper promotion of a web site involves many more methods and techniques than building links.

In general, search engines are trying to find the most important pages of the web and show those first when a user enters a search query. One of the factors to determine the position a web page will appear in the results is the number of incoming links.

Incoming links are a signal of trust and, depending on where they come from, they can greatly affect your ranking position (either positively if the links come from well-known and trusted sites, or negatively if they are paid links, article directories, link farms etc.).

What can you do to get more links?

That’s a very good question and I am sure that if you search the Internet for that phrase you will get hundreds of different answers. In my opinion, and this is what I will try to explain in this web site, you should forget about building links and concentrate on creating good quality content for your web site.

Good content will get you natural links which in turn will give you good rankings and traffic. If you try to buy links or get them the easy way (read this: Guest posting for links), you may have a temporary success and then see your web site disappearing from the top pages after the next Google update.

What is the difference between SEO and Internet marketing?

Some people often ask me “Is SEO the same as Internet Marketing?” The simplest answer I can give is that SEO is one of the tools available in your Internet Marketing arsenal. It is not Internet Marketing as such but it can be part of your overall Internet Marketing campaign which normally includes other things like social media promotion, content strategy etc.

Good content is still the most important success factor with or without SEO

Before closing this introduction to search engine optimization, you must have it very clear in your mind that SEO cannot help you if you don’t have good content.

In other words, if you try to SEO a web site that lacks good content, your chances of succeeding (in the long term) are minimal. On the other hand, a web site with good content can do well with or without SEO; SEO will just give it an extra boost.

SEO is a must for every web property

To sum it up, search engine optimization or SEO is a way to optimize your web site so that search engines understand it better and give you higher rankings. It is important since a good SEO approach can drive more traffic to your web site, blog or on-line store, helping you gain more customers, make more sales and fulfill your business purpose.
