Staying Current With The Search Engines


Welcome back! This is lesson twelve of our Internet Income course, designed to enable you to start and run a profitable online business. In previous lessons, course author George Little has broken down important principles in plain English to explain how to succeed in today's fluctuating global market. In this course, he continues to reveal tips, real-world advice, and in-depth, step-by-step instructions on setting up your Internet-based business. Read the 11th lesson here.

Recall the five goals of Internet traffic building we stated in Lesson 3:
  1. Utilizing effective branding,
  2. Obtaining good publicity, including links to your site from popular pages,
  3. Obtaining an effective search engine presence,
  4. Utilizing and maintaining flow in the placement of your Internet ads, and
  5. Maintaining an effective social media presence.
Starting in Lesson 4, we have been providing a brief overview of each of these five goals. (We will cover each in much more detail later in the course.) We are now on goal number 3 in our overview, "obtaining an effective search engine presence."

Recall also that in Lesson 10, we outlined the three main principles of Search Engine Optimization (SEO) as follows:
  1. Make your site as informative and/or entertaining as possible.
  2. Choose effective keywords and then properly integrate those keywords into your meta tags and your content.
  3. Stay current on recent announcements from the search engines.
We are now on the third of these three main principles of SEO. Thus, in this lesson, we will discuss how to stay current with recent announcements from the search engines.
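As a quick illustration of the second principle, keywords have traditionally been integrated into meta tags in a page's head section. A minimal sketch (the page title and keyword values here are invented examples; note that Google has long stated it ignores the keywords meta tag for ranking, while the title and description tags still matter):

```html
<head>
  <!-- The title and description are still read by search engines -->
  <title>Handmade Leather Wallets | Example Shop</title>
  <meta name="description" content="Handmade leather wallets, belts, and bags, crafted to order.">
  <!-- Central to early SEO, but largely ignored by Google today -->
  <meta name="keywords" content="leather wallets, handmade wallets, custom leather goods">
</head>
```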

Why is it Important to Stay Current with the Search Engines?

How your site fares in the search results for searches related to your site's subject matter is crucial to your success. That is, your success depends upon how well you accommodate the search engines' expectations for your site. If you displease the search engines, your site will be penalized. If you please them, your site will show up in relevant searches and you will gain qualified traffic.

We focus our attention on the most popular search engines. They control most of what people see on the Internet. This sounds bad until you realize that, for the most part, the popular search engines today obtained that power fairly--by doing a better job of delivering relevant results to searchers than their competitors. The power of a particular search engine, such as Google, Bing, or Yahoo, comes from the fact that people use those search engines more than they use the others. (FYI, the 15 most popular search engines, in order of popularity, as of May 2015 are: Google, Bing, Yahoo, Ask, AOL Search, Wow, WebCrawler, MyWebSearch, Infospace, Info, DuckDuckGo, Blekko, Contenko, Dogpile, and Alhea.)

Google, which gets more visitors per month than Bing, Yahoo, Ask, and AOL Search combined, is by far the most important search engine. Plus, optimization that works on Google usually works on the others as well. Thus, most optimization experts today concentrate almost exclusively on pleasing Google.

In order to please a search engine, you have to know what it expects from you. This changes over time as the search engines evolve. Google exercises its control through its ranking algorithms--computer code that determines what shows up where in response to searches on Google for a particular word or phrase. No one (outside of Google) knows for certain what code Google uses and exactly how it works. Google does give us clues from time to time, however, and by studying search results we can form theories as to how the code works at any given time. But it is a moving target, as Google adjusts the algorithms from time to time. The adjustments are made in an attempt to keep the search results relevant and to eliminate "black hat" SEO tactics.

History of Google Search Algorithms

The Google search engine was launched in 1997. The first named update to the Google search ranking algorithm did not come until February 2003. It was called "Boston."

The next algorithm update after Boston came in April 2003 and was called "Cassandra." Cassandra came down hard on massive linking from co-owned domains and on hidden text, both of which had been widely practiced in the early years of SEO in an attempt to manipulate the search engine results. The following month, the "Dominic" update made Google even more picky about which backlinks it counted. In June of 2003, "Esmeralda" made further changes that no one could really ascertain from the outside. In November 2003, an update called "Florida" began to penalize keyword stuffing.

"Austin," in January 2004, expanded on the Florida update by applying it to meta tags, and made other changes to refine the identification of deceptive on-page tactics. "Brandy," in February 2004, began to use semantics and synonyms to refine keyword analysis and incorporated the concept of "neighborhoods" to better analyze linking patterns.

In January 2005, Google gave some control to Webmasters by recognizing the "noindex" and "nofollow" attributes. With the "Allegra" update in February 2005, the first real popular awareness arose that Google was quite serious about penalizing certain linking strategies. In May 2005, "Bourbon" made some technical changes regarding duplicate content and the importance given to the presence or absence of "www" in a URL. In June 2005, in response to growing awareness and frustration on the part of SEO experts, Google gave still more control to Webmasters by accepting XML sitemaps submitted through Webmaster Tools, bypassing traditional HTML sitemaps. In the same month, Google began to personalize search results. No longer did every searcher see the same rankings; each individual received results based on personal factors and search history. Many Webmasters and SEO experts saw this as another intentional roadblock to understanding how the algorithms worked. In October 2005, Google integrated local maps data into the search index. Also in October 2005, "Jagger" created penalties for reciprocal links, link farms, and paid links. In December of 2005, with the "Big Daddy" update, Google changed how certain technical matters (like canonicalization and redirects) were treated.
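The Webmaster controls mentioned above can be sketched as follows (the URLs are placeholders). The robots meta tag asks engines not to index a page, rel="nofollow" asks them not to pass ranking credit through one specific link, and an XML sitemap lists the URLs you want crawled:

```html
<!-- In a page's <head>: ask engines not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- On an individual link: do not pass ranking credit to the target -->
<a href="https://example.com/untrusted-page" rel="nofollow">a link</a>
```

A minimal XML sitemap, per the sitemaps.org protocol, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-05-01</lastmod>
  </url>
</urlset>
```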

A few other update names floated around, but the next major change came in August 2008, when Google began providing "suggestions" as you typed your search.

In February 2009, the "Vince" update seemed to help major brands. Also in February 2009, Google provided a great tool that allowed Web developers to clean up their site's organization without losing ranking. (The "rel=canonical" tag allowed site developers to tell the Google index the preferred location for a page, so that the ranking for the old link could be transferred to the new link when sites were moved to new servers or reorganized.) In December 2009, Google provided faster and better integration of social media postings into its results.
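The canonical tag described above is a single line in the head section of the duplicate or old page, pointing at the preferred URL (the address here is a placeholder):

```html
<!-- Placed on the duplicate page; tells the index which URL should receive the ranking -->
<link rel="canonical" href="https://example.com/products/widgets">
```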

In 2010, "Google Places" provided more local advertising options and provided greater integration of "places pages" with local results. In May 2010, the "May Day" update had a significant negative impact on "long-tail traffic." ("Long-tail keywords" are search phrases containing several words. With a long search phrase one can be more specific about what is being sought or offered. While appropriately using long keyword phrases is a good thing in the eyes of the search engines, attempting to manipulate them to corner traffic to be switched to other offerings is considered bad--and was penalized by "May Day.") In June 2010, Google implemented a new indexing system, called "Caffeine," which is able to index an extremely large number of links much faster. In August 2010, the "Brand Update" allowed specific URLs to appear more often in the results.

In September 2010, with "Google Instant," Google started displaying results even before the search was submitted, which affected the ultimate results of a search effort, favoring established sites. In November 2010, Google placed emphasis on landing page quality by providing a magnified preview on mouse-over of a search engine result listing.

In December 2010, an e-commerce site located in New York City got a great deal of attention by publicly bragging about how bad reviews from customers raised its ranking in the search engines. Google reacted immediately with an algorithm update to sort positive reviews from negative reviews.

In January 2011, Google began in earnest to promote a good user experience for those using its search engine. Penalizing "black hat" SEO tactics became a major priority. Google studied particular sites that were successfully manipulating search results and incorporated better ways to prevent their tactics from working. As a result, starting in February 2011, the updates became too frequent and numerous to name individually. For the most part, they fell under the umbrella name "Panda." The main goal of all the Panda updates is to improve the user's search experience, primarily by preventing low-quality, low-information sites from ranking high in the search results. If you have ever searched for local information and been led to national sites that have no local information for your area (which you only discovered after drilling down for several minutes, being exposed to numerous irrelevant ads in the process), you can understand why it was necessary for Google to devote so much effort to this task. That negative search experience, which I have unfortunately encountered several times, creates user dissatisfaction with the search engine. To keep its place at the top of the Internet, Google had to provide better results for its users.

Another major focus of the original release of Panda was to eliminate "scrapers" from the search results. "Scrapers" are sites that republish the content of other sites, and there are degrees of scraping. Some sites flat out steal copyrighted content and republish it as their own; these are plainly illegal. Other sites republish the content with a link to the original buried at the bottom. Still others do not republish the content in its entirety, but give snippets and then link to the original content. These snippet sites are not necessarily illegal or even bad--if they help to organize information by collecting snippets of, and links to, widespread content dealing with the same subject. They are frustrating to searchers, though, if they merely add extra pages and extra ads that must be negotiated to arrive at the original content sought. Sorting out the useful sites that aggregate information from the manipulative and frustrating sites (which ultimately serve only to hide information behind multiple advertisements) is a daunting task for Google.

"Penguin," another update name, first appeared in April 2012 and carries on the Panda objectives with a more specific focus: discovering "black hat" practices at work and eliminating their effectiveness. Through specific form pages, Penguin allows feedback from users when they discover offending sites or believe a good site has been wrongly penalized.

In early 2013, there were more Panda updates, an update dubbed "Phantom," a Domain Crowding update, and another Penguin update. In June 2013, Google announced a "Payday Loan" update, designed to penalize certain sites that manipulated the search results. In August 2013, Google announced "Hummingbird" which enhances Google's semantic search capabilities.

In February 2014, the "Page Layout" update penalized sites that included too many ads above the fold. In August 2014, the "HTTPS/SSL" update gave a slight advantage to secure sites.

In February 2015, Google announced "Mobilegeddon," which decreased the rankings, in searches made on mobile devices, of sites that were not mobile friendly. This update was significant in that Google announced the change in advance and encouraged Webmasters to prepare for it. In the past, Google had not announced all of the changes that have been detected and rarely, if ever, did so in advance. The difference, some believe, is that Google remains silent when it is trying to detect and penalize "black hat" SEO tactics--at least until well after the fact--but is much more open when it comes to changes that focus on improving search in general. In any event, Google has not announced, or even acknowledged, many of the updates it has made. These updates have been discovered, revealed, and named by independent sources that closely monitor search results.
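One of the basic requirements Google's mobile-friendly guidance names is a layout that adapts to small screens, and that starts with the viewport meta tag in the page's head. A minimal sketch:

```html
<!-- Tells mobile browsers to render at the device's width
     rather than a zoomed-out desktop width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without this tag, mobile browsers typically render the page at a desktop width and shrink it, which is one of the signals a site is not mobile friendly.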

Google does have much to say to Webmasters about how to get good search results, but it usually speaks in general terms. The main theme of what Google has to say is that you should focus on the quality and usefulness of your content and not seek tricks or strategies to improve your search results through manipulation.

How to Keep Up

To be a successful Internet marketer, it is important to keep up with the latest announcements from Google. The Google search algorithms are adjusted frequently, and Google's efforts to insulate itself from manipulation and always provide relevant results are ongoing. There is no one particular site on which to read all the announcements and information Google chooses to provide; generally, important things Google wants you to know will be mentioned on one or more of its official sites, so check those regularly. If you want to read what the independent experts have to say about what's going on with the search engines, periodically check out the popular tech sites, or just do a Google search for search engine news.

In our next lesson, we will introduce you to the concept of "flow." You may read the next lesson here: Use and Maintain Flow on Your Website or Blog.

By George Little, Panhandle On-Line, Inc. For more information on the Internet Income Course and other works and courses by George Little, see
