While the companies behind search engines, and the way those companies determine your website's ranking in their results, have changed quite a bit over the years, consumers increasingly rely on search to research and purchase products and services.
Because SEO has changed so much over the years, we created this simplified History of SEO to make it easy to understand where it started, where it's going, and what that means for your business.
First, we need to start by understanding what search is all about and what exactly a search engine is.
How does search work?
A search algorithm is any algorithm that retrieves information stored within a database. The appropriate search algorithm for a particular application depends on the data structure being searched.
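For instance, a linear scan works on any list, while a binary search requires sorted data but narrows the search space much faster. A minimal Python sketch of that trade-off (illustrative data, not tied to any real search engine):

```python
from bisect import bisect_left

def linear_search(items, target):
    """Works on any list, O(n): checks each element in turn."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """Requires a sorted list, O(log n): halves the search space each step."""
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

data = [3, 7, 11, 19, 23]          # already sorted, so both approaches apply
print(linear_search(data, 19))     # 3
print(binary_search(data, 19))     # 3
```

The data structure dictates the choice: binary search is only valid because the list is kept sorted.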
While search technically comprises both the discovery of your product and service through search engines as well as the internal search capabilities of your own website (if applicable), this article will focus on how your customers find your business using a search engine.
What is a search engine?
A search engine is comprised of three parts: a web crawler, an index, and a ranking algorithm. Google's web crawler, known as Googlebot, goes from webpage to webpage to understand and record what's there in Google's index.
Google records the information to their index by saving a copy of each webpage to their servers, housed in warehouse-sized data centers throughout the United States, and subsequently analyzing the documents to understand what is on each page. When a Google user types in a search, Google then uses their ranking algorithm, which includes PageRank, to determine which webpages are most relevant to the user's search.
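To illustrate the indexing idea (a toy sketch with made-up pages, not Google's actual implementation), an index can be modeled as an inverted index that maps each word to the set of pages containing it; a query is then answered by intersecting those sets:

```python
from collections import defaultdict

# Toy model of indexing: each "crawled" page is analyzed and its words
# are recorded in an inverted index mapping word -> set of page URLs.
pages = {
    "example.com/coffee": "fresh roasted coffee beans delivered monthly",
    "example.com/tea":    "loose leaf tea and fresh herbal blends",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return pages containing every word of the query."""
    results = [index.get(w, set()) for w in query.split()]
    return set.intersection(*results) if results else set()

print(search("fresh coffee"))  # {'example.com/coffee'}
```

Real engines layer ranking on top of retrieval: first find the candidate pages, then order them by relevance.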
Because Googlebot is a computer, albeit a supercomputer, it is still restricted by its own processing bandwidth. Due to the enormous amount of data available on the web, Google triages its web-crawling efforts, with popular pages on large websites receiving far more "crawls" than a page for a small business.
If your company has a relatively small volume of web traffic, when you upload new content you should expect it to take a few days to a few weeks for Googlebot to crawl your website and index the results.
What do people search for?
Typically, searches can be categorized as informational, transactional, or navigational. People search for more information about your offerings, your customer reviews, product comparisons, your competitors, and sometimes even your promotions.
In the same way you’re able to identify your common sales objections, you can predict what information your prospects need to move forward to the next step.
What is SEO?
SEO, or search engine optimization, is the process of making your website search-engine friendly. The purpose is to increase the likelihood that Google will feature your business at the top of its search results pages for searches that would bring you qualified leads.
To do this, you need to appear first in the search results for the most popular keywords that demonstrate purchase intent for your products or services. Increasing your rank on purchase-intent keywords that generate a high volume of monthly search traffic is a proven tactic for driving revenue.
How does SEO work?
To keep things simple, we’ll offer an introductory explanation of how SEO works.
SEO falls into three primary categories: technical SEO, on-site SEO, and off-site SEO. Each is extremely important in determining your overall ranking.
Technical SEO involves optimizing the URL structure, metadata, structured data, redirect structure, and site performance, as well as fixing any crawl errors Google has come across.
On-site SEO involves optimizing page titles, header tags, content, internal linking structure, and UX. Off-site SEO involves building and managing backlinks, along with other organic link-building tactics.
Your technical, on-site, and off-site SEO strategies all depend on your overall SEO strategy, which is built by identifying opportunities to improve your ranking for existing search intents (keywords) or to rank for new ones.
Keep in mind, this is a very basic explanation; Google currently uses over 200 factors in determining your website's ranking for search results and is constantly updating and improving their ranking system! Why do they do this? To consistently achieve their goal:
Google’s ongoing goal: Iteration #1
When Google Search launched in 1998, they were the first search engine to incorporate "authority" into relevancy rankings.
Authority is determined by the number and quality of the links back to your site, or backlinks, that can be found around the internet.
The theory is that websites referred to by the most sites, and by the most important sites, are probably of more value than websites no one has ever heard of.
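This theory is the heart of the original PageRank algorithm. A toy Python sketch of the iteration, using a made-up four-page web (a simplified model, not Google's production system; 0.85 is the damping factor from the original PageRank paper):

```python
# page -> pages it links out to (hypothetical four-page web)
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}   # start with equal scores

for _ in range(50):  # power iteration until the scores settle
    new = {}
    for p in pages:
        # Each page passes its score to the pages it links to,
        # split evenly among its outgoing links.
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - 0.85) / len(pages) + 0.85 * incoming
    rank = new

# "c" collects links from a, b, and d, so it earns the highest authority.
print(max(rank, key=rank.get))  # c
```

Pages with many (and well-ranked) backlinks bubble to the top, exactly as the theory above predicts.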
Although by far the industry leader in search, Google knew their product was far from delivering an ideal experience and continued to update their search algorithm. Because technology, user needs, and user expectations change over time, Google's algorithm is now improved multiple times a day.
When these changes are major and need to be acted on by the SEO community, Google traditionally announces an "algorithm update" prior to making the change so that business owners can keep their rankings intact, and potentially even improve them.
The first of Google’s major algorithm updates, titled “Florida”, took place in November 2003.
This was the first time business owners and SEO specialists were made aware of the importance of proactively anticipating Google's algorithm updates: many sites received penalties for practices designed to game the algorithm, such as keyword stuffing, instead of focusing on delivering the experience the searcher was looking for.
In 2005, Google took the first steps in personalizing their users' experience by launching personalized search, which used an individual's browsing history to make their future search results more relevant.
That same year, Google teamed up with MSN and Yahoo to create the nofollow attribute, designed to decrease the amount of spam links and illegitimate comments found on many websites at the time.
Google also released the "Caffeine" update, which was designed to speed up Googlebot's crawling, expand their index of webpages, and allow pages to be indexed and ranked simultaneously, letting Google blend other results, such as news articles and tweets, into search pages.
Google’s ongoing goal: Iteration #2
In 2007, Google launched "Universal Search", which integrated the traditional HTML website listings found on search engine results pages with features like Video, News, and Local results.
These features have a dramatic impact on the layout of the search results page, with the Google-generated features taking up much of the "above-the-fold", or "above-the-scroll", content.
Because these features provide meaningful assistance based on the intent of the searcher's query, users began using them almost immediately.
In 2008, "Google Suggest" was launched, adding relevant suggestions for completing the user's search based on what others frequently searched for.
In 2010, Google further improved their search experience by showing users results in real time as they typed in their query.
Also in 2010, Google added new signals, called "social signals", to their algorithm to account for the increasing amount of time people spend on social networks.
These networks essentially serve as a proxy for the sentiment users of your products or services might express in a real-world word-of-mouth conversation about your offerings or your brand.
In 2011, Google teamed up with Yahoo and Microsoft again to create Schema.org, a unified vocabulary for "marking up" the HTML on your website so that its content could be better understood by Google.
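Schema.org markup is commonly embedded as JSON-LD inside a script tag of type "application/ld+json". A minimal sketch generating such markup for a hypothetical local business (the business details are invented; the field names follow the Schema.org LocalBusiness vocabulary):

```python
import json

# Hypothetical business data expressed with Schema.org LocalBusiness fields.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Coffee Roasters",
    "url": "https://www.example.com",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
    },
}

# This JSON is what gets embedded in the page's <script> tag.
print(json.dumps(markup, indent=2))
```

Marking pages up this way gives crawlers unambiguous facts (name, address, phone) instead of forcing them to infer those details from free text.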
Google used this structured data to improve the quality of the features in Universal Search. Companies that optimized their online marketing for Universal Search achieved excellent returns, while most companies whose listings merely appeared on pages with Universal Search features saw their traffic decrease as searchers gravitated toward the "above-the-scroll" content.
Also in 2011, Google rolled out the major algorithm update Panda, which penalized websites for producing large volumes of low-quality content for the sole purpose of driving traffic, as well as for serving too many ads relative to the amount of content on the page.
Panda has undergone at least 28 minor updates since then as Google continues to refine their ability to serve content that users find valuable.
This meant businesses that wanted to drive organic search traffic needed to focus on producing content the searcher actually found meaningful, rather than posting short, keyword-stuffed content relevant to the business's activities but not the searcher's intent.
In 2012, Google began to reward high-quality sites that implemented their recommended SEO guidelines through the algorithm update Penguin.
Additionally, Penguin penalized sites containing hyperlinks identified by Google as spam, eliminating the ability for businesses to boost their rankings by purchasing links.
In 2013, with the Hummingbird update, Google announced that the algorithm could now parse the intent behind a query, rather than just the language itself.
With these major updates, Google had transformed their algorithm into a much more complicated, associative, and integrated model.
Google's updates have accelerated in recent years due to the explosion in popularity of mobile search and mobile apps.
In 2014, Google released the “Pigeon” update, which was designed to make the local results that appear at the top of the search engine result page more accurate, more useful, and more relevant. This was the first signal that Google would be placing increasing emphasis on their Knowledge Graph and Local Results in the future.
Additionally, Google announced their "HTTPS Everywhere" initiative, which called for all websites to begin using HTTPS. Following the announcement, Google penalized the rankings of sites still using HTTP and displayed a "website not secure" warning when a non-HTTPS website was visited.
2015 saw what became infamously known as “Mobilegeddon”, a Google update that prioritized mobile-friendly sites in all search results.
Businesses without a mobile-friendly, or "responsive", website saw their traffic decrease significantly.
Additionally, in 2015 Google announced that RankBrain, a machine-learning component of its ranking algorithm, had been implemented. RankBrain allows Google to increase the number of factors it considers, as well as the level of personalization it can deliver, faster and more accurately than ever before.
Despite the rapid changes taking place in the realm of search, Google has continued to focus on their key objective: to deliver the right content at the right time to every user!
In 2016, Google released the unofficially named "Possum" update, which diversified local search results and extended the feature to many types of businesses that were previously unable to take advantage of it.
Additionally, Possum included additional penalization for sites determined to be creating spam content.
In 2017, Google began penalizing the search rankings of sites that used aggressive pop-up advertisements that damage the mobile user's experience.
Additionally, an update unofficially titled "Fred" punished websites whose backlinks were low-quality, as well as websites that prioritized driving visitors toward monetization regardless of the experience that created.
Google also announced the importance of entities, intent, and salience in your keyword strategies, along with tools to check that your content is compliant. Updates are no longer always named, due to the continuously learning nature of RankBrain.
Google’s ongoing goal: Iteration #3
We predict that the future of SEO will be shaped by Google's heavy investment in machine learning and the growing popularity of voice search. Google's automatically generated "knowledge graph" is increasingly prevalent in search results and dominates searcher attention, reducing the number of organic search results on the first page to as few as four.
This means it's increasingly important for your business to have Google recommend you when a potential customer expresses purchase intent through search. Google's goal will remain the same; however, the key to ranking well in relevant search results is changing.
Google can deliver a significant step up in the experience they provide their users by utilizing AI and machine learning; however, this requires a dramatic shift in the way many websites are optimized.
Having the appropriate keywords is no longer enough: Google now looks for entities and intent, determining the salience of each and the relevance to the user's query in real time.
In addition to the growth of mobile search, which now comprises the majority of Google's 3 billion daily searches, many mobile users are taking advantage of Google's knowledge graph, which is powered by RankBrain, to find the answers they need.
As mobile search continues to grow, the amount of attention and clicks Google’s knowledge graph gets continues to grow.
Google's "knowledge graph" takes many forms; however, the original Knowledge Graph panel is still shown for many informational queries.
For queries that contain a question, Google's RankBrain serves the single answer it determines to best answer that question at the top of the page.
Besides making results more relevant and easier for users to find, why is Google getting into AI and machine learning? Because information is continually being created at an exponential pace.
And the variety of mediums information is being created in has rapidly grown.
The variety and availability of content have grown so significantly that they create the opportunity for Google to build a more complete, and more accurate, picture of the importance of an individual piece of content.
Additionally, user expectations and preferences about the medium of content they wish to consume continue to change. Voice search is poised to add new intricacies, and new competitors, to the search market.
So how can AI manage these changes to create a better experience for search users?
By applying a system of algorithms called deep learning to their dataset, Google is able to constantly iterate and improve their algorithm to provide the most contextually relevant results. The first component of this learning is identifying the entity, or subject, that the content is about.
An entity is not confined to a website the way a keyword is. An entity, such as your brand, could have associated knowledge from a variety of sources, including videos, audio, apps, or third-party shopping platforms. After the entity has been established, RankBrain identifies the intent of the content in relation to the entities.
This is done because the type of content that is most relevant to serve depends on the intent of the searcher. The layered intent model allows Google to build more context around that intent.
So what happens when multiple entities are identified?
RankBrain decides how important each entity is to the main idea; this weighting is called salience.
RankBrain uses its understanding of entities, intents, and salience to understand the content, context, and sentiment of text in a variety of mediums.
With an understanding of what the content is, what the purpose is, and what the most important parts are, RankBrain is able to deliver exactly what the user is searching for, conveniently displayed in Google's knowledge graph.
Entity-first indexing is based on how well developed the content surrounding your target entities is.
Much like the associative nature of the human brain, entity-first indexing gives preference to sites that can provide meaningful assistance to the searcher, whether that be more information following their initial question or a purchase process.
Entity-first indexing is changing SEO as we know it! Fortunately, while many sites require additional investment to optimize their keywords to entities, this optimization will allow you to drive more qualified traffic to your digital properties in the future.
1. Provide Google with all the information they'll accept.
2. Create (or optimize) non-website content focused on your target entities.
3. Test your text content for entity classification and understanding using Google's free Natural Language API.
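For the last step, the Cloud Natural Language API's `documents:analyzeEntities` method accepts a JSON request body like the one below and returns each entity it finds along with a salience score. This sketch only builds the payload (the sample text is hypothetical, and the API key or OAuth credentials needed to actually send it are omitted):

```python
import json

# Request body for the `documents:analyzeEntities` endpoint of the
# Cloud Natural Language API. The response lists entities with their
# type and salience, which you can compare against your target entities.
payload = {
    "document": {
        "type": "PLAIN_TEXT",
        "content": "Example Coffee Roasters sells fresh beans in Springfield.",
    },
    "encodingType": "UTF8",
}

print(json.dumps(payload, indent=2))
```

If your brand or product does not come back as a high-salience entity for your own copy, the content likely needs restructuring around that entity.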
Why does Google bother with all of this?
Like most companies: profitability. The more relevant and personalized Google can make the experience they deliver, the more consumers will favor their product over other search engines.
This increases Google's market share in Search, which ultimately gives Google the pricing power and data they need to ensure their longevity and grow into new markets.
Why do most searches happen on Google?
All search engines use unique, highly guarded algorithms to determine the most relevant results to display to the user. Relevancy is the most important factor for web search engines.
As of 2018, 70% of the world's searches, or about 3 billion searches each day, were being conducted on Google.
Google's unique and ever-improving algorithm has made it one of the most popular search engines of all time. Other search engines have had a difficult time matching the relevancy algorithm Google created by examining factors such as social media, inbound links, and fresh content. Google began judging sites by authority: a website's authority, or trustworthiness, was determined by how many other websites were linking to it, and how reliable those linking sites were.
Google's dominance can be explained through the simple fact that their search product solved a need for the market of search users: more relevant results. By creating an algorithm that served better results to their users, their market share for search continued to grow, which gave them more and more data to refine their algorithm.
Interestingly, Google is not the oldest player in the search market; rather, they benefited from what's been called "the last mover's advantage": the idea that the last company to market has the most opportunity to learn from its competitors' mistakes, and therefore the best chance to create a product that truly satisfies the market's needs.
From improvements in web crawlers and categorizing and indexing the web, to introducing new protocols such as robots.txt so that webmasters have control over what web pages get crawled, to the introduction of voice search, the development of search engines has been the culmination of multiple search technologies that developed from different search engines.
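As an example of the webmaster control mentioned above, a minimal robots.txt (with hypothetical paths) placed at a site's root might look like:

```
User-agent: *          # rules apply to all crawlers, Googlebot included
Disallow: /admin/      # keep the (hypothetical) admin area out of crawls
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Crawlers that honor the protocol fetch this file first and skip the disallowed paths.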
Some smaller search engines offer features not available with Google, such as not storing any private or tracking information; these are the most viable alternatives to using Google.