
Tuesday 31 May 2016

File Naming Style

One of the simplest methods to improve your search engine optimization is to look at the way you name your files. Before writing this tutorial, we did a lot of research on file names and found that search engines like Google give significant weight to file names. You should think about what you want to put in your web page and then give the page a relevant file name.
Just try typing any keyword into the Google search engine and you will find file names highlighted with the keyword you entered. This proves that your file name should contain appropriate keywords.

File Naming Style

  • The filename should preferably be short and descriptive.
  • It is always good to use the same keywords in the filename as in the page title.
  • Do not use generic filenames such as service.htm or job.htm. Use the actual service name in your file name, such as computer-repairing.htm.
  • Do not use more than 3-4 words in file names.
  • Separate the keywords with hyphens rather than underscores.
  • Try to use 2 keywords if possible.

File Name Example

Listed below are some filenames that would be ideal from the users' point of view as well as for SEO.
slazenger-brand-balls.html
wimbledon-brand-balls.html
wilson-brand-balls.html
Notice that the keywords are separated by hyphens rather than underscores. Google treats hyphens as word separators, so it reads a good filename as follows:
seo-relevant-filename is read as "seo relevant filename" (good)
Filenames with underscores are not a good option, because the words run together:
seo_relevant_filename is read as "seorelevantfilename" (not good)

File Extension

Note that .html, .htm, .php, and other extensions do NOTHING for your visitors; they are simply a means of offloading some of the work of properly configuring your web server onto your visitors. In effect, you are asking your site visitors to tell your web server HOW to produce the page, not just which page to serve.
Many webmasters think it is a good idea to use filenames without an extension. It may help you, but not a whole lot.

URL Sub-Directory Name

From a search engine optimization point of view, the URL sub-directory name hardly matters. You can try typing any keyword into any search engine, and you are unlikely to find a sub-directory name matching your keywords. From the user's point of view, however, you should keep sub-directory names short.

Guru Mantra

Keep the following points in mind before naming your files:
  • Keep the web page filename short, simple, descriptive, and relevant to the page content.
  • Try to use a maximum of 3-4 keywords in your filename, and these keywords should appear in your web page title as well.
  • Separate all keywords with hyphens rather than underscores.
  • Keep your sub-directory names as short as possible.
  • Restrict the file size to less than 101K; historically, Google truncated almost everything above that limit.
The website design and layout give the first impression of your site. Some sites are too fancy, and regular net surfers reach them and leave without making a single click.
Search engines are very smart, but they are software, not human beings who can seek out content of interest. If you make your site too complicated, the search engine will not be able to parse your content properly, indexing will not be efficient, and the result is a low rank.
The actual page content should have a keyword density of about 10% and should weigh in at about 200 words; at that length, a 10% density works out to roughly 20 occurrences of your keyword. But there are as many opinions about this as there are SEO experts: some say keyword density should be 5%, and some say 20%. You can go with 10%, which is good enough.
Here are a few guidelines that you should keep in mind while designing a web page.
  • You should have more text content than HTML elements.
  • No frames. They are the enemies of search engines, and search engines are enemies of frames.
  • No ads if possible, because most ads use JavaScript, which is not advised.
  • No inline JavaScript. If you need JavaScript, call it from an external file rather than dumping the code in the HTML file (see the sketch after this list). JavaScript drop-down menus prevent spiders from crawling beyond your homepage; if you use them, be sure to include text links at the bottom of the page.
  • Do not put anything on the page that does not fit its topic.
  • No unnecessary directories. Keep your files as close to the root as possible.
  • No fancy stuff (Flash, splash screens, animated GIFs, rollovers, etc.) unless absolutely necessary.
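
As a minimal sketch of the JavaScript guideline above (the file name menu.js and the link targets are invented for illustration), an externally loaded script plus crawlable text links might look like this:

<script src="menu.js"></script>
<!-- drop-down menu markup goes here; spiders may not follow its links -->
<!-- plain text links at the bottom of the page act as a crawlable fallback -->
<p>
<a href="computer-repairing.htm">Computer Repairing</a> |
<a href="services.htm">Services</a> |
<a href="contact.htm">Contact</a>
</p>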

SEO techniques are classified into two broad categories

White Hat SEO - Techniques that search engines recommend as part of a good design.

Black Hat SEO - Techniques that search engines do not approve and attempt to minimize the effect of. These techniques are also known as spamdexing.

White Hat SEO
An SEO tactic is considered White Hat if it has the following features:

It conforms to the search engine's guidelines.

It does not involve any deception.

It ensures that the content a search engine indexes, and subsequently ranks, is the same content a user will see.

It ensures that web page content was created for the users, and not just for the search engines.

It ensures good quality of the web pages.

It ensures availability of useful content on the web pages.

Always follow a White Hat SEO tactic and do not try to fool your site visitors. Be honest and you will definitely get something more.

Black Hat or Spamdexing
An SEO tactic is considered Black Hat or Spamdexing if it has the following features:

Attempting ranking improvements that are disapproved by the search engines and/or involve deception.

Redirecting users from a page that is built for search engines to one that is more human friendly.

Redirecting users to a page that was different from the page the search engine ranked.

Serving one version of a page to search engine spiders/bots and another version to human visitors. This SEO tactic is called cloaking.

Using hidden or invisible text, for example by matching the text color to the page background color, using a tiny font size, or hiding it within the HTML code, such as "noframes" sections.
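
For illustration only, hidden text is typically produced with markup like this sketch (an anti-example; the text and colors are invented, and using anything like it risks a penalty):

<!-- white text on a white background: invisible to visitors, visible to crawlers -->
<p style="color:#ffffff; background-color:#ffffff;">cheap flights cheap flights cheap flights</p>
<!-- text shrunk to an unreadable size -->
<p style="font-size:1px;">cheap flights cheap hotels</p>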

Repeating keywords in the metatags, and using keywords that are unrelated to the website content. This is called metatag stuffing.

Calculated placement of keywords within a page to raise the keyword count, variety, and density of the page. This is called keyword stuffing.

Creating low-quality web pages that contain very little content but are instead stuffed with very similar keywords and phrases. These pages are called Doorway or Gateway Pages.

Hosting mirror websites, that is, multiple websites with conceptually similar content but different URLs.

Creating a rogue copy of a popular website which shows contents similar to the original to a web crawler, but redirects web surfers to unrelated or malicious websites. This is called page hijacking.

Always stay away from the Black Hat tactics above when trying to improve your site's rank. Search engines are smart enough to identify all of these properties, and ultimately you are not going to gain anything.


Choosing a Domain Name

When you start thinking of doing business through the Internet, the first thing you think about is your website's domain name. Before you choose a domain name, you should consider the following:

Who would be your target audience?

What do you intend to sell to them? Is it a tangible item or just text content?

What will make your business idea unique or different from everything else that is already available in the market?

Many people think it is important to have keywords in a domain. Keywords in the domain name are indeed often important, but this can usually be done while keeping the domain name short, memorable, and free of hyphens.

Using keywords in your domain name gives you a strong competitive advantage. Having your keywords in your domain name can increase click-through rates on search engine listings and paid ads, and makes it easier to attract keyword-rich, descriptive inbound links.

Avoid buying long and confusing domain names. Many people separate the words in their domain names with hyphens. In the past, the domain name itself was a significant ranking factor, but search engines now have more advanced features, and it is no longer very significant.

Keep your domain name to two or three words that are easy to memorize. Some of the most notable websites do a great job of branding by creating their own word. A few examples are eBay, Yahoo!, Expedia, Slashdot, Fark, Wikipedia, and Google.

You should be able to say it over the telephone once, and the other person should know how to spell it, and they should be able to guess what you sell.

Guru Mantra
Finally, you should be able to answer the following questions:

Why do you want to build your website?

Why should people buy from your site and not from another site?

What makes you different from others?

Who is your target audience, and what do you intend to sell?

List 5 to 10 websites that you think are amazing. Now think why they are amazing.

Create 5 different domain names. Make at least 1 of them funny. Tell them to half a dozen people and see which ones are the most memorable. You will get more honest feedback if the people do not know you well.

Buy a domain name that is catchy, memorable, and relevant to your business.

SEO Tutorial

Search Engine Optimization (SEO) is the activity of optimizing web pages or whole sites in order to make them search engine friendly, thus getting higher positions in search results.

This tutorial explains simple SEO techniques to improve the visibility of your web pages for different search engines, especially for Google, Yahoo, and Bing.

This tutorial has been prepared for beginners to help them understand the simple but effective SEO characteristics.

Prerequisites
We assume you are aware of simple web technologies such as HTML, XHTML, and style sheets (CSS). If you have already developed a website, that is an added advantage and will help you understand the SEO concepts explained in this tutorial.

SEO stands for Search Engine Optimization. SEO is all about optimizing a website for search engines. SEO is a technique for:

designing and developing a website to rank well in search engine results.

improving the volume and quality of traffic to a website from search engines.

marketing by understanding how search algorithms work, and what human visitors might search.

SEO is a subset of search engine marketing. SEO is also referred to as SEO copywriting, because most of the techniques used to promote sites in search engines deal with text.

If you plan to do some basic SEO, it is essential that you understand how search engines work.

How Do Search Engines Work?
Search engines perform several activities in order to deliver search results.

Crawling - The process of fetching all the web pages linked to a website. This task is performed by software called a crawler or a spider (Googlebot, in the case of Google).

Indexing - The process of creating an index for all the fetched web pages and keeping them in a giant database from which they can later be retrieved. Essentially, indexing is identifying the words and expressions that best describe a page and assigning the page to particular keywords.

Processing - When a search request comes in, the search engine processes it, i.e., it compares the search string in the request with the indexed pages in the database.

Calculating Relevancy - It is likely that more than one page contains the search string, so the search engine starts calculating the relevancy of each page in its index to the search string.

Retrieving Results - The last step in search engine activities is retrieving the best-matched results. Basically, it is nothing more than displaying them in the browser.

Search engines such as Google and Yahoo! often update their relevancy algorithms dozens of times per month. When you see changes in your rankings, it is due to an algorithmic shift or something else outside your control.

Although the basic principle of operation of all search engines is the same, the minor differences between their relevancy algorithms lead to major changes in results relevancy.

What is SEO Copywriting?
SEO Copywriting is the technique of writing viewable text on a web page in such a way that it reads well for the surfer, and also targets specific search terms. Its purpose is to rank highly in the search engines for the targeted search terms.

Along with viewable text, SEO copywriting usually optimizes other on-page elements for the targeted search terms. These include the Title, Description, Keywords tags, headings, and alternative text.
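
As a rough sketch of those elements (the topic, keyword, and file name are invented), a page targeting the term "computer repairing" might carry:

<head>
<title>Computer Repairing Services</title>
<meta name="description" content="Fast, affordable computer repairing services for homes and small offices.">
<meta name="keywords" content="computer repairing, laptop repair">
</head>
<body>
<h1>Computer Repairing Services</h1>
<img src="computer-repairing.jpg" alt="technician repairing a desktop computer">
</body>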

The idea behind SEO copywriting is that search engines want genuine content pages and not additional pages often called "doorway pages" that are created for the sole purpose of achieving high rankings.

What is Search Engine Rank?
When you search for any keyword using a search engine, it displays thousands of results found in its database. A page's rank is measured by the position of the web page in the search engine results. If a search engine puts your web page in the first position, your web page's rank is number 1, and it is assumed to be the page with the highest rank.

SEO is the process of designing and developing a website to attain a high rank in search engine results.

What is On-Page and Off-page SEO?
Conceptually, there are two ways of optimization:

On-Page SEO - This includes providing good content, good keyword selection, putting keywords in the correct places, giving an appropriate title to every page, etc.

Off-Page SEO - This includes link building, increasing link popularity by submitting your site to open directories and search engines, link exchange, etc.

Wednesday 11 May 2016

On Page Factors

The way your page is optimised has the most profound effect on its rankings. Here are the page optimization factors that can affect its search visibility:

  1. Keyword in title tag. The title meta tag is one of the strongest relevancy signals for a search engine. The tag itself is meant to give an accurate description of the page's content. Search engines use it to display the main title of a search result. Including a keyword in it indicates to the search engine what to rank the page for (see the sketch after this list).
    Ideally, the keyword should be placed at the start of the title tag. Pages optimized this way tend to rank better than those with the keyword closer to the end of the title tag.
  2. Keyword in description tag. The importance of the meta description tag is often discussed in SEO circles. It is nonetheless still a relevancy signal, and it is also crucial for gaining user clicks from search results pages. Including a keyword in it makes it more relevant to a search engine and to a searcher.
  3. Keyword in H1 tag. The H1 tag is yet another relevance factor, serving as a description of the page's content. In spite of an ongoing discussion about its importance, it is still good practice to include your keyword in a unique H1 tag on a page.
  4. Using keywords in the page's copy. Until not long ago, stuffing your page with keywords was a surefire way to increase its rankings for a particular keyword. That's not the case anymore. Using the keyword in the copy still sends a relevancy signal of what the content is about; how you place it, however, has changed drastically.
  5. Length of the content. These days searchers want to be educated and won't be satisfied with basic information. Google therefore looks for authoritative and informative content to rank first. And it's common sense that the longer your content is, the greater the chance that you can cover more aspects of your topic. Don't be shy of writing long but highly useful copy.
  6. Duplicate content. Not all factors influence your rankings in a positive way. Having similar content across various pages of your site can actually hurt your rankings. Avoid duplicating content and write original copy for each page.
  7. Canonical tag. Sometimes, however, having two URLs with similar content is unavoidable. One way to prevent this from becoming a duplicate content issue is to use a canonical tag on your site. This tag does one simple thing: it tells Google that one URL is the equivalent of another, clearly stating that in spite of two pages having the same content, they are in fact one.
  8. Image optimization. It's not only text that can be optimized on a page, but other media too. Images, for instance, can send the search engine relevancy signals through their alt text, caption, and description.
  9. Content updates. The Google algorithm prefers freshly updated content. That does not mean you have to edit your pages all the time. For commercial pages, such as product descriptions, Google recognizes that they are not as time-sensitive as blog posts covering recent events. It is wise, however, to have some strategy to update certain types of content once every 12 months or so.
  10. Outbound links. Linking to authoritative pages sends trust signals to the search engine. Think of it this way: the only reason you would send a user to another site is if you wanted them to learn more about the subject. This can be a huge trust factor for Google. Too many outbound links, however, can greatly diminish the page's PageRank, hurting its search visibility. Outbound links can affect your rankings, so use them in moderation.
  11. Internal links. Interlinking pages on your site can pass their strength between them.
  12. Keyword in URL. Including the keyword in the URL slug (that's the bit that appears after the ".com/" part of the URL) is said to send another relevancy signal to Google.
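
To make a few of these factors concrete, here is a hedged sketch (the domain, URL, and keyword are invented): the keyword leads the title tag, a canonical tag points duplicate URLs at the preferred one, and the keyword appears in the URL slug:

<!-- page served at http://example.com/tennis-balls/ (keyword in the URL slug) -->
<head>
<title>Tennis Balls: Brands, Prices, and a Buying Guide</title>
<meta name="description" content="Compare Slazenger, Wilson, and Wimbledon brand tennis balls.">
<!-- any variant URL carrying the same content declares this URL as the original -->
<link rel="canonical" href="http://example.com/tennis-balls/">
</head>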

Tuesday 10 May 2016

SEO Tips

To optimize your whole site for search engines, you’ll need to follow these basic tips:

1. Make the website about one thing.

It can be about other stuff, too, but choose one primary topic that is most essential to your message.
This step is important, so you may want to do a little keyword research before choosing a topic.

2. Mention keywords where they matter most.

Include your “one thing” in the site title, domain name, description, tagline, keywords, blog categories, page titles, and page content.
If you’re on WordPress, you can change a lot of this in the General Settings or through a plugin like All in One SEO Pack (which I use).

3. Link to internal pages on your site.

A lot of content management systems automatically do this, but if yours doesn’t, you’ll want to be intentional about linking to your most important pages directly from your homepage and cross-linking them with each other.

4. Use a permalink structure that includes keywords.

Some sites have “ugly” permalink structures that use numbers to identify pages.
Don’t do this. It’s bad for SEO and just doesn’t look good.
Use a URL structure that includes text, and make sure you include keywords in your URLs.
So instead of having a page’s URL be this:
http://yoursite.com/?p=12
It should look more like this:
http://yoursite.com/coolpage/
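
On WordPress, for instance, this is a settings change rather than code: under Settings > Permalinks, the "Post name" option uses the structure tag below, so the post title becomes the URL slug.

/%postname%/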

5. Remove anything that slows down your website.

Page load times are important, so get rid of any non-essentials that bog down your website.
These may include music players, large images, Flash graphics, and unnecessary plugins.

6. Use keywords in your images.

Include words that reflect your site topic in the image title, description, and alt attributes.
Also, re-title the file name if it doesn’t reflect your main keywords (e.g. writing-tips.jpg instead of d1234.jpg).
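
As a small sketch (the file name and wording are invented), this tip applied to a single image looks like:

<img src="writing-tips.jpg" alt="ten writing tips for new bloggers" title="Writing tips">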

7. Link to other websites with relevant content.

You can do this by including a blogroll, link list, or resources page on your website.
Of course, do it sparingly, as each outbound link is a “vote” for another site. However, if you do it well and people click your links, this tells search engines you are a trusted authority on your particular topic.

8. Update your website frequently.

Sites with dynamic content often rank higher than those with static content. That’s why blogs and directories (like Wikipedia) do so well on search engines. They are constantly being updated with new content.

9. Make sure your website is indexed in search engines.

A lot of search engines will automatically find and index your content, but don’t count on it.
You want to be sure engines like Google, Bing, and Yahoo are crawling your site, so that people are finding you online. (You can add them directly, if they’re not.)
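
One common way to do this is to submit an XML sitemap through each engine's webmaster tools. A minimal sitemap, reusing the example URL from tip 4, looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://yoursite.com/coolpage/</loc>
<lastmod>2016-05-10</lastmod>
</url>
</urlset>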

10. Have other websites link to you.

This is really, really important when it comes to SEO. The bummer is that it's not something you can necessarily control. Other than creating excellent content, the only thing you can do is ask (which occasionally works).
My counsel is to spend the time you would trying to convince somebody to link to you on just writing great content. And, start guest posting on other blogs.
Regardless of what you do, know that inbound links are essential to SEO.

What’s a “search algorithm?”
That’s a technical term for what you can think of as a recipe that Google uses to sort through the billions of web pages and other information it has, in order to return what it believes are the best answers.
What’s “Hummingbird?”
It’s the name of the new search algorithm that Google is using, one that Google says should return better results.
So that “PageRank” algorithm is dead?
No. PageRank is one of over 200 major “ingredients” that go into the Hummingbird recipe. Hummingbird looks at PageRank — how important links to a page are deemed to be — along with other factors like whether Google believes a page is of good quality, the words used on it and many other things (see our Periodic Table Of SEO Success Factors for a better sense of some of these).
Why is it called Hummingbird?
Google told us the name comes from being "precise and fast."
When did Hummingbird start? Today?
Google started using Hummingbird about a month ago, it said. Google only announced the change today.
What does it mean that Hummingbird is now being used?
Think of a car built in the 1950s. It might have a great engine, but it might also be an engine that lacks things like fuel injection or be unable to use unleaded fuel. When Google switched to Hummingbird, it’s as if it dropped the old engine out of a car and put in a new one. It also did this so quickly that no one really noticed the switch.
When’s the last time Google replaced its algorithm this way?
Google struggled to recall when any type of major change like this last happened. In 2010, the “Caffeine Update” was a huge change. But that was also a change mostly meant to help Google better gather information (indexing) rather than sorting through the information. Google search chief Amit Singhal told me that perhaps 2001, when he first joined the company, was the last time the algorithm was so dramatically rewritten.
What about all these Penguin, Panda and other “updates” — haven’t those been changes to the algorithm?
Panda, Penguin, and other updates were changes to parts of the old algorithm, but not an entire replacement of the whole. Think of it again like an engine. Those things were as if the engine received a new oil filter or had an improved pump put in. Hummingbird is a brand-new engine, though it continues to use some of the same parts as the old one, like Penguin and Panda.
The new engine is using old parts?
Yes. And no. Some of the parts are perfectly good, so there was no reason to toss them out. Other parts are constantly being replaced. In general, Hummingbird — Google says — is a new engine built on both existing and new parts, organized in a way to especially serve the search demands of today, rather than one created for the needs of ten years ago, with the technologies back then.
What type of “new” search activity does Hummingbird help?
"Conversational search" is one of the biggest examples Google gave. People, when speaking searches, may find it more useful to have a conversation.
“What’s the closest place to buy the iPhone 5s to my home?” A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
In particular, Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.
I thought Google did this conversational search stuff already!
It does (see Google’s Impressive “Conversational Search” Goes Live On Chrome), but it had only been doing it really within its Knowledge Graph answers. Hummingbird is designed to apply the meaning technology to billions of pages from across the web, in addition to Knowledge Graph facts, which may bring back better results.
Does it really work? Any before-and-afters?
We don’t know. There’s no way to do a “before-and-after” ourselves, now. Pretty much, we only have Google’s word that Hummingbird is improving things. However, Google did offer some before-and-after examples of its own, that it says shows Hummingbird improvements.
A search for "acid reflux prescription" used to list a lot of drugs (such as this, Google said), which might not necessarily be the best way to treat the disease. Now, Google says, results have information about treatment in general, including whether you even need drugs, such as this as one of the listings.
A search for "pay your bills through citizens bank and trust bank" used to bring up the home page for Citizens Bank, but now should return the specific page about paying bills.
A search for “pizza hut calories per slice” used to list an answer like this, Google said, but not one from Pizza Hut. Now, it lists this answer directly from Pizza Hut itself, Google says.
Could it be making Google worse?
Almost certainly not. While we can’t say that Google’s gotten better, we do know that Hummingbird — if it has indeed been used for the past month — hasn’t sparked any wave of consumers complaining that Google’s results suddenly got bad. People complain when things get worse; they generally don’t notice when things improve.
Does this mean SEO is dead?
No, SEO is not yet again dead. In fact, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Guidance remains the same, it says: have original, high-quality content. Signals that have been important in the past remain important; Hummingbird just allows Google to process them in new and hopefully better ways.
Does this mean I’m going to lose traffic from Google?
If you haven’t in the past month, well, you came through Hummingbird unscathed. After all, it went live about a month ago. If you were going to have problems with it, you would have known by now.
By and large, there’s been no major outcry among publishers that they’ve lost rankings. This seems to support Google saying this is very much a query-by-query effect, one that may improve particular searches — particularly complex ones — rather than something that hits “head” terms that can, in turn, cause major traffic shifts.
But I did lose traffic!
Perhaps it was due to Hummingbird, but Google stressed that it could also be due to some of the other parts of its algorithm, which are always being changed, tweaked or improved. There’s no way to know.
How do you know all this stuff?
Google shared some of it at its press event today, and then I talked with two of Google’s top search execs, Amit Singhal and Ben Gomes, after the event for more details. I also hope to do a more formal look at the changes from those conversations in the near future. But for now, hopefully you’ve found this quick FAQ based on those conversations to be helpful.
By the way, another term for the “meaning” connections that Hummingbird does is “entity search,” and we have an entire panel on that at our SMX East search marketing show in New York City, next week. The Coming “Entity Search” Revolution session is part of an entire “Semantic Search” track that also gets into ways search engines are discovering meanings behind words. Learn more about the track and the entire show on the agenda page.
Postscript: See our follow-up story, Google’s Hummingbird Takes Flight: SEOs Give Insight On Google’s New Algorithm.


Google Algorithm Change History

2016 Updates

Unnamed Major Update — May 10, 2016

MozCast and other Google weather trackers showed a historically rare week-long pattern of algorithm activity, including a 97-degree spike. Google would not confirm this update, and no explanation is currently available.

2015 Updates

2014 Updates

Penguin Everflux — December 10, 2014

A Google representative said that Penguin had shifted to continuous updates, moving away from infrequent, major updates. While the exact timeline was unclear, this claim seemed to fit ongoing flux after Penguin 3.0 (including unconfirmed claims of a Penguin 3.1).

Penguin 3.0 — October 17, 2014

More than a year after the previous Penguin update (2.1), Google launched a Penguin refresh. This update appeared to be smaller than expected (<1% of US/English queries affected) and was probably data-only (not a new Penguin algorithm). The timing of the update was unclear, especially internationally, and Google claimed it was spread out over "weeks".

HTTPS/SSL Update — August 6, 2014

After months of speculation, Google announced that they would be giving preference to secure sites, and that adding encryption would provide a "lightweight" rankings boost. They stressed that this boost would start out small, but implied it might increase if the change proved to be positive.

Payday Loan 3.0 — June 12, 2014

Less than a month after the Payday Loan 2.0 anti-spam update, Google launched another major iteration. Official statements suggested that 2.0 targeted specific sites, while 3.0 targeted spammy queries.

2013 Updates

Hummingbird — August 20, 2013

Announced on September 26th, Google suggested that the "Hummingbird" update rolled out about a month earlier. Our best guess ties it to a MozCast spike on August 20th and many reports of flux from August 20-22. Hummingbird has been compared to Caffeine, and seems to be a core algorithm update that may power changes to semantic search and the Knowledge Graph for months to come.

Unnamed Update — July 26, 2013

MozCast tracked a large Friday spike (105° F), with other sources showing significant activity over the weekend. Google has not confirmed this update.
MozCast Update (Google+)

Knowledge Graph Expansion — July 19, 2013

Seemingly overnight, queries with Knowledge Graph (KG) entries expanded by more than half (+50.4%) across the MozCast data set, with more than a quarter of all searches showing some kind of KG entry.

Panda Recovery — July 18, 2013

Google confirmed a Panda update, but it was unclear whether this was one of the 10-day rolling updates or something new. The implication was that this was algorithmic and may have "softened" some previous Panda penalties.

Multi-Week Update — June 27, 2013

Google's Matt Cutts tweeted a reply suggesting a "multi-week" algorithm update between roughly June 12th and "the week after July 4th". The nature of the update was unclear, but there was massive rankings volatility during that time period, peaking on June 27th (according to MozCast data). It appears that Google may have been testing some changes that were later rolled back.

Domain Crowding — May 21, 2013

Google released an update to control domain crowding/diversity deep in the SERPs (pages 2+). The timing was unclear, but it seemed to roll out just prior to Penguin 2.0 in the US and possibly the same day internationally.

2012 Updates

Panda #23 — December 21, 2012

Right before the Christmas holiday, Google rolled out another Panda update. They officially called it a "refresh", impacting 1.3% of English queries. This was a slightly higher impact than Pandas #21 and #22.

August/September 65-Pack — October 4, 2012

Google published their monthly (bi-monthly?) list of search highlights. The 65 updates for August and September included 7-result SERPs, Knowledge Graph expansion, updates to how "page quality" is calculated, and changes to how local results are determined.

Panda 3.9 (#17) — July 24, 2012

A month after Panda 3.8, Google rolled out a new Panda update. Rankings fluctuated for 5-6 days, although no single day was high enough to stand out. Google claimed ~1% of queries were impacted.

Penguin 1.1 (#2) — May 25, 2012

Google rolled out its first targeted data update after the "Penguin" algorithm update. This confirmed that Penguin data was being processed outside of the main search index, much like Panda data.

Panda 3.6 (#14) — April 27, 2012

Barely a week after Panda 3.5, Google rolled out yet another Panda data update. The implications of this update were unclear, and it seemed that the impact was relatively small.

Penguin — April 24, 2012

After weeks of speculation about an "Over-optimization penalty", Google finally rolled out the "Webspam Update", which was soon after dubbed "Penguin." Penguin adjusted a number of spam factors, including keyword stuffing, and impacted an estimated 3.1% of English queries.

Panda 3.5 (#13) — April 19, 2012

In the middle of a busy week for the algorithm, Google quietly rolled out a Panda data update. A mix of changes made the impact difficult to measure, but this appears to have been a fairly routine update with minimal impact.

Panda 3.4 (#12) — March 23, 2012

Google announced another Panda update, this time via Twitter as the update was rolling out. Their public statements estimated that Panda 3.4 impacted about 1.6% of search results.

Search Quality Video — March 12, 2012

This wasn't an algorithm update, but Google published a rare peek into a search quality meeting. For anyone interested in the algorithm, the video provides a lot of context to both Google's process and their priorities. It's also a chance to see Amit Singhal in action.

February 40-Pack (2) — February 27, 2012

Google published a second set of "search quality highlights" at the end of the month, claiming more than 40 changes in February. Notable changes included multiple image-search updates, multiple freshness updates (including phasing out 2 old bits of the algorithm), and a Panda update.

Panda 3.3 (#11) — February 27, 2012

Google rolled out another post-"flux" Panda update, which appeared to be relatively minor. This came just 3 days after the 1-year anniversary of Panda, an unprecedented lifespan for a named update.

2011 Updates

Panda 3.1 (#9) — November 18, 2011

After Panda 2.5, Google entered a period of "Panda Flux" where updates started to happen more frequently and were relatively minor. Some industry analysts called the 11/18 update 3.1, even though there was no official 3.0. For the purposes of this history, we will discontinue numbering Panda updates except for very high-impact changes.

516 Algo Updates — September 21, 2011

This wasn't an update, but it was an amazing revelation. Google CEO Eric Schmidt told Congress that Google made 516 updates in 2010. The real shocker? They tested over 13,000 updates.

Google+ — June 28, 2011

After a number of social media failures, Google launched a serious attack on Facebook with Google+. Google+ revolved around circles for sharing content, and was tightly integrated into products like Gmail. Early adopters were quick to jump on board, and within 2 weeks Google+ reached 10M users.

Panda/Farmer — February 23, 2011

A major algorithm update hit sites hard, affecting up to 12% of search results (a number that came directly from Google). Panda seemed to crack down on thin content, content farms, sites with high ad-to-content ratios, and a number of other quality issues. Panda rolled out over at least a couple of months, hitting Europe in April 2011.

Attribution Update — January 28, 2011

In response to high-profile spam cases, Google rolled out an update to help better sort out content attribution and stop scrapers. According to Matt Cutts, this affected about 2% of queries. It was a clear precursor to the Panda updates.
Latest Google Algorithm change (Search News Central)

2010 Updates

Instant Previews — November 2010

A magnifying glass icon appeared on Google search results, allowing search visitors to quickly view a preview of landing pages directly from SERPs. This signaled a renewed focus for Google on landing page quality, design, and usability.

Brand Update — August 2010

Although not a traditional algorithm update, Google started allowing the same domain to appear multiple times on a SERP. Previously, domains were limited to 1-2 listings, or 1 listing with indented results.

Google Places — April 2010

Although "Places" pages were rolled out in September of 2009, they were originally only a part of Google Maps. The official launch of Google Places re-branded the Local Business Center, integrated Places pages more closely with local search results, and added a number of features, including new local advertising options.

2009 Updates

Real-time Search — December 2009

This time, real-time search was for real: Twitter feeds, Google News, newly indexed content, and a number of other sources were integrated into a real-time feed on some SERPs. Sources continued to expand over time, including social media.

2008 Updates

Google Suggest — August 2008

In a major change to their logo-and-a-box home page, Google introduced Suggest, displaying suggested searches in a dropdown below the search box as visitors typed their queries. Suggest would later go on to power Google Instant.

Dewey — April 2008

A large-scale shuffle seemed to occur at the end of March and into early April, but the specifics were unclear. Some suspected Google was pushing its own internal properties, including Google Books, but the evidence of that was limited.
Google's Cutts Asking for Feedback on March/April '08 Update (SERoundtable)

2007 Updates

Universal Search — May 2007

While not your typical algorithm update, Google integrated traditional search results with News, Video, Images, Local, and other verticals, dramatically changing their format. The old 10-listing SERP was officially dead. Long live the old 10-listing SERP.

2006 Updates

False Alarm — December 2006

There were stirrings about an update in December, along with some reports of major ranking changes in November, but Google reported no major changes.

Supplemental Update — November 2006

Throughout 2006, Google seemed to make changes to the supplemental index and how filtered pages were treated. They claimed in late 2006 that supplemental was not a penalty (even if it sometimes felt that way).

2005 Updates

Big Daddy — December 2005

Technically, Big Daddy was an infrastructure update (like the more recent "Caffeine"), and it rolled out over a few months, wrapping up in March of 2006. Big Daddy changed the way Google handled URL canonicalization, redirects (301/302) and other technical issues.
Indexing timeline (MattCutts.com)

Google Local/Maps — October 2005

After launching the Local Business Center in March 2005 and encouraging businesses to update their information, Google merged its Maps data into the LBC, in a move that would eventually drive a number of changes in local SEO.

Jagger — October 2005

Google released a series of updates, mostly targeted at low-quality links, including reciprocal links, link farms, and paid links. Jagger rolled out in at least 3 stages, from roughly September to November of 2005, with the greatest impact occurring in October.

XML Sitemaps — June 2005

Google allowed webmasters to submit XML sitemaps via Webmaster Tools, bypassing traditional HTML sitemaps, and giving SEOs direct (albeit minor) influence over crawling and indexation.

Personalized Search — June 2005

Unlike previous attempts at personalization, which required custom settings and profiles, the 2005 roll-out of personalized search tapped directly into users' search histories to automatically adjust results. Although the impact was small at first, Google would go on to use search history for many applications.

Bourbon — May 2005

"GoogleGuy" (likely Matt Cutts) announced that Google was rolling out "something like 3.5 changes in search quality." No one was sure what 0.5 of a change was, but Webmaster World members speculated that Bourbon changed how duplicate content and non-canonical (www vs. non-www) URLs were treated.
Google Update "Bourbon" (Battelle Media)

Allegra — February 2005

Webmasters witnessed ranking changes, but the specifics of the update were unclear. Some thought Allegra affected the "sandbox" while others believed that LSI had been tweaked. Additionally, some speculated that Google was beginning to penalize suspicious links.

Nofollow — January 2005

To combat spam and control outbound link quality, Google, Yahoo, and Microsoft collectively introduced the "nofollow" attribute. Nofollow helps clean up unvouched-for links, including spammy blog comments. While not a traditional algorithm update, this change gradually had a significant impact on the link graph.
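
For reference, a nofollowed link is ordinary anchor markup with one extra attribute (the URL is a placeholder):

<a href="http://example.com/" rel="nofollow">Example</a>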

2004 Updates

Google IPO — August 2004

Although obviously not an algorithm update, this was a major event in Google's history: Google sold 19M shares, raised $1.67B in capital, and set their market value at over $20B. By January 2005, Google's share price had more than doubled.

Brandy — February 2004

Google rolled out a variety of changes, including a massive index expansion, Latent Semantic Indexing (LSI), increased attention to anchor text relevance, and the concept of link "neighborhoods." LSI expanded Google's ability to understand synonyms and took keyword analysis to the next level.

2003 Updates

Florida — November 2003

This was the update that put updates (and probably the SEO industry) on the map. Many sites lost ranking, and business owners were furious. Florida sounded the death knell for low-value late 90s SEO tactics, like keyword stuffing, and made the game a whole lot more interesting.

Supplemental Index — September 2003

In order to index more documents without sacrificing performance, Google split off some results into the "supplemental" index. The perils of having results go supplemental became a hotly debated SEO topic, until the index was later reintegrated.

Esmeralda — June 2003

This marked the last of the regular monthly Google updates, as a more continuous update process began to emerge. The "Google Dance" was replaced with "Everflux". Esmeralda probably heralded some major infrastructure changes at Google.

Dominic — May 2003

While many changes were observed in May, the exact nature of Dominic was unclear. Google bots "Freshbot" and "Deepcrawler" scoured the web, and many sites reported bounces. The way Google counted or reported backlinks seemed to change dramatically.

Cassandra — April 2003

Google cracked down on some basic link-quality issues, such as massive linking from co-owned domains. Cassandra also came down hard on hidden text and hidden links.

Boston — February 2003

Announced at SES Boston, this was the first named Google update. Originally, Google aimed at a major monthly update, so the first few updates were a combination of algorithm changes and major index refreshes (the so-called "Google Dance"). As updates became more frequent, the monthly idea quickly died.

2002 Updates

1st Documented Update — September 2002

Before "Boston" (the first named update), there was a major shuffle in the Fall of 2002. The details are unclear, but this appeared to be more than the monthly Google Dance and PageRank update. As one webmaster said of Google: "they move the toilet mid stream".

2000 Updates

Google Toolbar — December 2000

Guaranteeing SEO arguments for years to come, Google launched their browser toolbar, and with it, Toolbar PageRank (TBPR). As soon as webmasters started watching TBPR, the Google Dance began.