Search Engine History

Then There Were Three…
The early birds (Infoseek, AltaVista, WebCrawler, and Yahoo) began offering search results back in 1994, and most have not survived the test of time.
According to NetMarketShare, as of September 2017, Google (created in 1996) leads the global search engine market share (desktop/laptop) with 78.05%. The next closest competitors are:
- Baidu: 9.82%
- Bing: 6.69%
- Yahoo: 2.81%
- Yandex: 1.44%
- Ask: 0.64%
- DuckDuckGo: 0.26%
- Naver: 0.07%
- Seznam: 0.06%
- AOL: 0.05%
Looking only at mobile search, Google leads with 75.256%, followed by Baidu with 21.49% and Yahoo with 0.93%.
Modern SEO Skillset
READ: I recommend you bookmark this page and read through the content when you have time to digest everything Rand mentions, and download the whiteboard image for future reference. Also check out the source code and examine the markup used by one of the leading sources of SEO information on the internet.
There’s no doubt that SEO challenges change on an almost daily basis… And today’s SEO assignments are typically more integrated with other aspects of online marketing than ever before. Here is a glimpse of the technical, creative and strategic thinking that is required for you to be effective in this field.
Here are a few of the topics and skills you should focus on learning while taking this course:


READ: Curious what enterprise (500+ employees) level SEO is like? Check out the latest Moz survey, “State of Enterprise SEO 2017: Overworked SEOs Need Direction” and read the insights from 240 SEO specialists. Through a series of 29 questions, you will discover the difficulties and successes these professionals experience.
Internet Marketing Terms
Now that you’ve got an idea of the business, let’s get a little more familiar with some of the terms. If you’re new to Internet marketing, be prepared for a new language – the industry has its own vocabulary.
Here is a listing of some of the more common terms to get you started (For those who already know, consider this a brief refresher!):
Search Engine Marketing or SEM
As the name implies, SEM involves marketing services or products via search engines. SEM is divided into two main pillars: SEO and PPC. SEO stands for Search Engine Optimization, and it is the practice of optimizing websites to make their pages appear in the organic search results. PPC stands for Pay-Per-Click, and it is the practice of purchasing clicks from search engines. The clicks come from sponsored listings in the search results.
Backlink also called ‘inlink’ or ‘link’
A backlink is a hyperlink on another website pointing back to your own website. Backlinks are important for SEO because they directly affect the PageRank of any web page, influencing its search rankings.
PageRank® or PR
PageRank is an algorithm that Google used to estimate the relative importance of pages around the web. The basic idea behind the algorithm is that a link from page A to page B can be seen as a vote of trust from page A to page B. The higher the number of links (weighted by their value) to a page, therefore, the higher the probability that the page is important.
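The idea above can be sketched as a short power-iteration loop. This is a simplified illustration on a made-up three-page link graph, not Google’s actual implementation (the 0.85 damping factor comes from the original PageRank paper):

```python
# Simplified PageRank sketch on a hypothetical link graph.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}       # start with equal rank
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page linking to p passes on a share of its own rank,
            # split evenly across its outgoing links.
            incoming = sum(ranks[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        ranks = new
    return ranks

# Hypothetical graph: A links to B and C, B links to C, C links to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))  # page C, with two incoming "votes", ranks highest
```

Note how C ends up with the most rank even though B receives a full link from A: the weight of a vote depends on the voter’s own rank and how many links it casts.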
Linkbait
Linkbait is a piece of web content published on a website or blog with the goal of attracting as many backlinks as possible (in order to improve one’s search rankings). Usually it’s a written piece, but it can also be a video, a picture, a quiz or anything else. A classic example of linkbait is the “Top 10” lists that tend to become popular on social bookmarking sites.
Link farm
A link farm is a group of websites where every website links to every other website, with the purpose of artificially increasing the PageRank of all the sites in the farm. This practice was effective in the early days of search engines, but today it is seen as a spamming technique (and thus can get you penalized).
Anchor text
The anchor text of a backlink is the text that is clickable on the web page. Having keyword rich anchor text helps with SEO because Google will associate these keywords with the content of your website. If you have a weight loss blog, for instance, it would help your search rankings if some of your backlinks had “weight loss” as their anchor texts.
NoFollow
The nofollow is a link attribute used by website owners to signal to Google that they don’t endorse the website they are linking to. This can happen either when the link is created by the users themselves (e.g., blog comments), or when the link was paid for (e.g., sponsors and advertisers). When Google sees the nofollow attribute it will basically not count that link for the PageRank and search algorithms.
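As an illustration, a crawler can spot the nofollow attribute with Python’s standard HTML parser; the sample markup below is hypothetical:

```python
from html.parser import HTMLParser

# Minimal sketch: list a page's links and whether each carries rel="nofollow".
class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            # rel can hold several space-separated tokens, e.g. "nofollow sponsored"
            rel = (attrs.get("rel") or "").lower().split()
            self.links.append((attrs.get("href"), "nofollow" in rel))

html = '''
<p><a href="https://example.com/partner">Editorial link</a></p>
<p><a href="https://example.com/ad" rel="nofollow sponsored">Paid link</a></p>
'''
parser = LinkAudit()
parser.feed(html)
for href, nofollow in parser.links:
    print(href, "-> nofollow" if nofollow else "-> followed")
```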
Link Sculpting
By using the nofollow attribute strategically webmasters were able to channel the flow of PageRank within their websites, thus increasing the search rankings of desired pages. This practice is no longer effective as Google recently changed how it handles the nofollow attribute.
Title Tag
The title tag is literally the title of a web page, and it’s one of the most important factors inside Google’s search algorithm. Ideally your title tag should be unique and contain the main keywords of your page. You can see the title tag of any web page in the browser’s title bar or tab while visiting it.
Meta Tags
Like the title tag, meta tags are used to give search engines more information regarding the content of your pages. The meta tags are placed inside the HEAD section of your HTML code, and thus are not visible to human visitors.
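As a quick illustration, a script can pull the title and meta description straight out of the HEAD section; the sample page below is made up:

```python
from html.parser import HTMLParser

# Sketch: extract the <title> and meta description from a page's HEAD.
class HeadExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = """<html><head>
<title>Weight Loss Tips | Example Blog</title>
<meta name="description" content="Practical, evidence-based weight loss advice.">
</head><body>...</body></html>"""

parser = HeadExtractor()
parser.feed(page)
print(parser.title)        # Weight Loss Tips | Example Blog
print(parser.description)  # Practical, evidence-based weight loss advice.
```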
Search Algorithm
Google’s search algorithm is used to find the most relevant web pages for any search query. The algorithm considers over 200 factors (according to Google itself), including the PageRank value, the title tag, the meta tags, the content of the website, the age of the domain and so on.
SERP or Search Engine Results Page
It’s basically the page you’ll get when you search for a specific keyword on Google or on other search engines. The amount of search traffic your website will receive depends on the rankings it will have inside the SERPs.
Sandbox
Google basically has a separate index, the sandbox, where it places all newly discovered websites. When websites are on the sandbox, they won’t appear in the search results for normal search queries. Once Google verifies that the website is legitimate, it will move it out of the sandbox and into the main index.
Keyword Density
To find the keyword density of any particular page you just need to divide the number of times that keyword is used by the total number of words in the page. Keyword density used to be an important SEO factor, as the early algorithms placed a heavy emphasis on it. This is not the case anymore.
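The calculation above translates directly into a few lines of Python (the sample sentence is hypothetical):

```python
import re

# Keyword density = occurrences of the keyword / total words on the page.
def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

text = "SEO basics: good SEO content beats keyword stuffing every time."
print(f"{keyword_density(text, 'seo'):.1%}")  # 2 of 10 words -> 20.0%
```

(This simple version handles single-word keywords only; a phrase like “weight loss” would need a sliding-window count.)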
Keyword Stuffing
Since keyword density was an important factor on the early search algorithms, webmasters started to game the system by artificially inflating the keyword density inside their websites. This is called keyword stuffing. These days this practice won’t help you, and it can also get you penalized.
Cloaking
This technique involves making the same web page show different content to search engines and to human visitors. The purpose is to get the page ranked for specific keywords, and then use the incoming traffic to promote unrelated products or services. This practice is considered spamming and can get you penalized (if not banned) on most search engines.
Web Crawler also called ‘search bot’ or ‘spider’
A web crawler is a computer program that browses the web on behalf of search engines, trying to discover new links and new pages. This is the first step in the indexing process.
Duplicate Content
Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. You should avoid having duplicate content on your website because it can get you penalized.
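A rough way to spot “appreciably similar” text is a sequence-similarity ratio. The snippets and thresholds below are purely illustrative, not anything Google publishes:

```python
from difflib import SequenceMatcher

# Sketch: score how similar two blocks of page text are (0.0 to 1.0).
def similarity(a, b):
    return SequenceMatcher(None, a, b).ratio()

page_a = "Our store ships widgets worldwide with free returns."
page_b = "Our store ships widgets worldwide with free returns!"
page_c = "A field guide to migratory birds of North America."

print(round(similarity(page_a, page_b), 2))  # near-duplicate, close to 1.0
print(round(similarity(page_a, page_c), 2))  # unrelated, much lower
```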
Canonical URL
Canonicalization is a process for converting data that has more than one possible representation into a “standard” canonical representation. A canonical URL, therefore, is the standard URL for accessing a specific page within your website. For instance, the canonical version of your domain might be http://www.domain.com instead of http://domain.com.
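In practice, sites often normalize incoming URL variants to one canonical form. Here is a minimal sketch, assuming illustrative rules (force https, add www, drop fragments and trailing slashes) that each site would choose for itself:

```python
from urllib.parse import urlsplit, urlunsplit

# Sketch: collapse common URL variations to a single canonical form.
def canonicalize(url):
    scheme, netloc, path, query, _ = urlsplit(url)  # discard the #fragment
    netloc = netloc.lower()
    if not netloc.startswith("www."):
        netloc = "www." + netloc
    path = path.rstrip("/") or "/"                  # drop trailing slash
    return urlunsplit(("https", netloc, path, query, ""))

for u in ("http://domain.com/page/", "https://WWW.domain.com/page#top"):
    print(canonicalize(u))  # both -> https://www.domain.com/page
```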
Robots.txt
This is nothing more than a text file, placed in the root of the domain, that is used to inform search bots about the structure of the website. For instance, via the robots.txt file it’s possible to block specific search robots and to restrict access to specific folders or sections of the website.
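Here is a hypothetical robots.txt, interpreted with Python’s standard-library parser the way a well-behaved crawler would:

```python
from urllib import robotparser

# A made-up robots.txt: all bots are kept out of /admin/ and /checkout/,
# and "BadBot" is blocked from the entire site.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /checkout/

User-agent: BadBot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/"))     # False
print(rp.can_fetch("BadBot", "https://example.com/blog/post"))     # False
```

Note that robots.txt is advisory: compliant crawlers honor it, but it is not an access-control mechanism.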
Google Penalties
Panda Penalty
The Google Panda algorithm update was released in February 2011 and focuses on duplicate, plagiarized or thin content, user-generated spam, and keyword stuffing. The Panda update assigns a “quality score” to web pages and this score is then used as a ranking factor.
This was one of the first major updates to specifically penalize websites for on-page factors. As of January 2016, Panda has been included as part of the core ranking algorithm and is designed to prevent low-quality (“shallow”) content web pages from showing up in search results. Panda was also granted a patent in 2014.
Penguin Penalty
The Google Penguin algorithm update was released in April 2012 and focuses on spammy or irrelevant links, and links with over-optimized anchor text (unnatural link profiles: link quality, link velocity, and link diversity). The Penguin update’s objective is to down-rank sites whose links it deems manipulative.
As of September 2016, Penguin has been included as part of Google’s core algorithm and unlike Panda, it works in real time.
Hummingbird Penalty
The Google Hummingbird algorithm update was released in August 2013 and focuses on keyword stuffing and low-quality content. The Hummingbird update helps Google better interpret search queries and provide results that match searcher intent (as opposed to the individual terms within the query).
While web page keywords continue to be an important ranking factor – Hummingbird makes it possible for a web page to rank for a search query even if it doesn’t contain the exact words the searcher entered. This is achieved with the help of natural language processing that relies on latent semantic indexing, co-occurring terms and synonyms.
Pigeon Penalty
The Google Pigeon algorithm update was released in July 2014 and focuses on poor on-page and off-page SEO. Pigeon affects those searches in which the user’s location plays an important part. The update created closer ties between the local algorithm and the core algorithm: traditional SEO factors are now used to rank local results.
Mobile Penalty
The Google Mobile algorithm update was released in April 2015 and focuses on poor mobile usability and a website’s lack of a mobile version. The Mobile Update (aka Mobilegeddon) ensures that mobile-friendly pages rank at the top of mobile search, while pages not optimized for mobile are filtered out from the results.
When Mobilegeddon arrived, there were some shakeups in the rankings, but only the worst mobile-offenders suffered any traffic losses.
RankBrain Penalty
The Google RankBrain algorithm update was released in October 2015 and focuses on lack of query-specific relevance, shallow content, and poor UX. RankBrain is part of Google’s Hummingbird algorithm: a machine learning system that helps Google understand the meaning behind queries and serve best-matching search results in response to those queries. Google considers RankBrain its third most important ranking factor.
Possum Penalty
The Google Possum update was released in September 2016 and focuses on competition within a target area. The Possum update ensured that local results vary more depending on the searcher’s location (i.e. the closer you are to a business’s physical address, the more likely you will see that business listing within the local search results). Possum also resulted in greater variety among results ranking for very similar queries, like “dentist denver” and “dentist denver co.”
Interestingly, Possum also gave a boost to businesses located outside the physical city area.
Fred Penalty
The Google Fred update was released in March 2017 and focuses on affiliate-heavy or ad-centered content. The Fred update targets websites that violate Google’s webmaster guidelines. The majority of affected sites are blogs with low-quality posts that appear to be created mostly for the purpose of generating ad revenue.