search engine optimization

Microsoft Bing search engine pictured on a monitor in the Bing Experience Lounge during an event introducing a new AI-powered Microsoft Bing and Edge at Microsoft in Redmond, Washington, on February 7, 2023.

search engine optimization (SEO), practice of increasing both the quality and quantity of “organic” (unpaid) traffic to a website by improving the site’s position on search engines’ results pages.

Search engines use “bots” (data-collecting programs) to hunt the Web for pages. Information about these pages is then “cached” (copied) into large indexes for the engines’ users to search. Because countless Web pages now exist for every kind of content, pages compete with one another to be among the first results that search engines’ algorithms retrieve from these indexes. On the dominant search engine, Google, more than half of clicks go to the first three results on the first page. Thus, a high position on a search engine results page (SERP) can be crucial for a company, charity, or other entity that requires online visibility to operate. (The acronym SERP is also often used to refer to a site’s position on a results page.)
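The crawl-and-index cycle described above can be sketched in miniature. The following Python fragment is an illustration only, not how any real search engine works: the starting URL is a placeholder, and the plain dictionary standing in for an index bears no resemblance to the distributed systems that actual engines use. It simply fetches one page, records which words appear on it, and collects the links a bot would visit next.

    # A minimal sketch of one crawl-and-index step (illustrative only).
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkAndTextParser(HTMLParser):
        """Collects the hyperlinks and text on a single page."""
        def __init__(self):
            super().__init__()
            self.links = []
            self.words = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":  # record each outbound hyperlink
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

        def handle_data(self, data):
            self.words.extend(data.split())  # record the page's words

    def crawl(url, index):
        """Fetch one page, 'cache' its words in the index, return its links."""
        with urlopen(url) as response:
            parser = LinkAndTextParser()
            parser.feed(response.read().decode("utf-8", errors="replace"))
        for word in parser.words:
            # Map each keyword back to the URL where it appears, so a
            # search for that word can retrieve the page later.
            index.setdefault(word.lower(), set()).add(url)
        return parser.links  # pages for the bot to visit next

    index = {}
    frontier = crawl("https://example.com", index)  # placeholder start page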

There are many ways to improve a website’s SERP, some of which the SEO industry broadly approves of and some of which it denounces as abuse. “White hat” methods are generally recommended because they enhance a search engine’s accuracy and therefore its users’ experiences. “Gray hat” methods increase traffic in ways that do not help users but are not so detrimental as to be widely denounced. “Black hat” methods raise a site’s ranking at the expense of the search engine’s usefulness. Sites discovered to be using black hat SEO may be penalized or even expelled from a search engine’s index.

The most basic SEO method is to place the most important keywords related to a page’s subject matter in its metadata, that is, within the page’s title tag (how the page’s title appears in search engine results) and its meta description (a brief summary of the page’s content). Updating a page regularly suggests to the search engine algorithm that the page is timely, another point in its favour. Updating the page with content that includes important keywords is also helpful. Perhaps less intuitive, but still within white hat expectations, is cross-linking a site’s pages in order to increase the number of inbound links each page receives, which suggests to search engine bots that the page is more authoritative; this credibility further improves the pages’ SERPs.
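The practice is easy to picture in HTML. The snippet below is a hypothetical example; the title and description are invented, but they show where a page’s most important keywords would be placed:

    <head>
      <!-- Title tag: the headline a search engine displays for the page -->
      <title>Search Engine Optimization (SEO): How Rankings Work</title>
      <!-- Meta description: the brief summary shown beneath that headline -->
      <meta name="description"
            content="Search engine optimization (SEO) is the practice of
                     increasing organic traffic to a website by improving
                     its position in search results.">
    </head>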

Website administrators can also improve their SERPs by dissuading bots from looking at some of the content on their sites (e.g., pages still under construction). To inform a bot that a certain file or directory is off-limits, an administrator adds its name to the site’s “robots.txt” file. As its name implies, the robots.txt file is a document addressed to any bot that finds the site, essentially instructing the program on the rules of engagement. For example, the robots.txt file for this site includes the URL of the sitemap, a file that lists all the URLs on the site as well as information about them. The file also says “User-agent: *,” which means that any search engine bot is allowed to look at the site, and “Disallow: /search,” which means that bots are not allowed to look at pages generated by internal search results. Alternatively, a page can be exempted from indexing through a meta tag (a descriptor of the page’s content that is invisible to visitors), generally written as <meta name="robots" content="noindex">.
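Put together, a minimal robots.txt file combining the directives described above might read as follows (the sitemap URL is a placeholder):

    # Addressed to every bot; "*" matches any user agent
    User-agent: *
    # Bots may not crawl pages generated by internal search results
    Disallow: /search
    # Where the site's full list of URLs can be found
    Sitemap: https://www.example.com/sitemap.xml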

Black hat SEO tactics generally work by abusing search engine logic. For instance, because search engines note the presence of relevant keywords on pages, one black hat technique commonly used in the past was “keyword stuffing”: filling the page with those keywords, however unnaturally they flowed. To be stealthy, a black hat might even place superfluous keywords or links in an out-of-the-way part of the page where users would not read them and then render them invisible by colouring the text to match the page’s background. Search engines have since adapted to this technique and now penalize pages that use keyword stuffing.
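In HTML terms, the hidden-text trick could be as crude as the following fragment, in which invented filler keywords are coloured to match a white background; modern search engines detect and penalize exactly this pattern:

    <body style="background-color: #ffffff">
      <!-- Keyword stuffing hidden by matching the background colour -->
      <p style="color: #ffffff">
        cheap flights best cheap flights deals cheap flights tickets
      </p>
    </body>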

The earliest use of the phrase “search engine optimization” is still debated, although the most popular claims date to the mid- to late 1990s. The work of courting search engines’ attention, however, was already well established by that time: in the unregulated landscape of SEO’s early years, Web developers were trading and selling links, often en masse, to fool search engines into raising their sites’ SERPs.

Because of Google’s early domination of the search engine market (more than 90 percent of Web searches have gone through Google since 2010), changes to how the company’s algorithm produces results have long amounted to sea changes in the SEO industry. Indeed, the history of SEO since the turn of the 21st century is effectively the story of its practitioners’ responses to changes in the Google search engine algorithm.

Google’s first major update, the “Florida” update, is a prime example of the giant’s importance. By changing the rules for how results were ranked on November 16, 2003, just before the holiday shopping season, Google inadvertently hurt many small retailers. Some of the affected companies were “false positives,” penalized as if they had used the black hat tactics Google wanted to suppress when in fact they had not. The technology giant promised not to repeat the mistake.

Adam Volle