A Brief SEO Overview

What should words like ‘Panda’, ‘Hummingbird’, ‘Hilltop’, and ‘EdgeRank’ mean to those wishing to grow their business? This article aims to provide a basic understanding of these terms and their effects on profit margins.



Search Engine Optimization, or SEO, is a key focus among many organizations wishing to increase quality online traffic. The phrase ‘quality traffic’ refers to users who will purchase the good or service being sold or interact with the site in the intended manner. SEO is a continuous process whose goal is to provide the best possible web experience for human users, as judged by crawlers: scripts that gather information for search engines so they can rank sites against a variety of keywords. Thousands of search engines exist; however, this post will focus on the most popular one, which handles roughly 70% of all searches in the United States: Google.

The ranking algorithms of the large search engines, particularly Google, are proprietary, and the exact variables involved and their associated weights are unknown. Still, a fair amount of information about how they generally work is public, and further insight can be gathered empirically, through trial and error. The most important piece of public knowledge, however, is the search engines’ overarching goal: to rank sites according to how much of an authority in their field users will perceive them to be.

What is known is that four major factors affect site placement:

  1. Content
  2. Inbound and outbound links
  3. Social Graph Optimization
  4. On-site factors


Content

It has been said that ‘content is the currency of the Internet’. Content implementation thus carries the greatest weight in rankings. In the past, one merely had to place a handful of keywords near the top of the page to benefit from the search engines’ content algorithms. Soon, web developers were placing duplicate, barely relevant, and sometimes hidden content on pages in a contest to reap the most gain for their own clients. None of these strategies actually enhanced the user’s experience, and some actively degraded it. Enter Panda: Google’s algorithm update for boosting original, relevant content and penalizing implementations that merely sought a workaround to the top of the search results instead of providing more value to the user.

Along the same lines, Google realized that many people’s queries, especially those dictated via the microphone on mobile phones, take the form of a question. Taking this context into consideration furthered the decline of keyword stuffing. Thus Hummingbird was born: a radical step in redefining the role of the search engine.

Inbound and Outbound Links

The original PageRank algorithm upon which Google was founded determined the authority of a site based upon its inbound and outbound links. If a site had X links pointing to it, then X+1 links would surely earn it a higher rank. Soon, web developers looking to exploit the system set up link farms pointing at a site from shell sites, among other black-hat means. Using its big-data capabilities combined with cutting-edge machine learning, Google began penalizing these link farms, sometimes even removing them from the index altogether. Nowadays, with algorithms such as Hilltop, links coming from recognized expert sites boost a site’s ranking, while those coming from shell sites and farms decrease it.
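The core idea behind the original PageRank paper can be sketched in a few lines: each page’s score flows out along its links, and a damping factor models a surfer who occasionally jumps to a random page. The toy graph and parameter values below are illustrative only; Google’s production ranking is proprietary and far more complex.

```python
def pagerank(links, d=0.85, iterations=50):
    """Power-iteration sketch of classic PageRank.

    links: dict mapping page -> list of pages it links to.
    d: damping factor (probability the surfer follows a link).
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}   # random-jump share
        for page, outgoing in links.items():
            if outgoing:
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += d * share    # rank flows along links
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new[p] += d * rank[page] / n
        rank = new
    return rank

# Hypothetical three-page web: C is linked to by both A and B,
# so it ends up with the highest authority.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
```

Note how the scores form a probability distribution (they sum to 1), which is why adding genuine inbound links from well-ranked pages raises a site’s share.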

Social Graph Optimization

The third factor in a site’s ranking is Social Graph Optimization, or SGO. A large portion of communication now takes place over social media via Liking, Sharing, Tweeting, et cetera. It should come as no surprise, then, that the activity a site receives on platforms such as Facebook and Twitter contributes toward its ranking. The amount of attention a site receives over these media is a function of the number of people to whom its posts are served. This, in turn, is a direct function of algorithms such as Facebook’s EdgeRank, which determine how often, and in whose feeds, to surface the content. Once again, developers and advertisers soon began creating fake accounts from which to Like and Share content in hopes of boosting exposure. The social media platforms are quite intelligent, however, and like the search engines they implemented methods for distinguishing real accounts from fake ones, penalizing activity determined to come from the latter.

On-Site Factors

The final factor contributing to SEO is on-site performance. A site that loads slowly or serves content in a less-than-ideal manner is a pain for the user, so search engines began factoring this into their ranking algorithms. From this, techniques such as caching statically served content and minifying source code became standard for sites wishing to earn a higher ranking. Another standard that arose is responsiveness by default: a site’s stylesheets should include media queries for any screen size, from mobile phones to large-screen Android TVs. Also, be sure your site serves a sitemap and a valid robots.txt for the crawlers.
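As a concrete example, a minimal robots.txt that welcomes crawlers and points them at the sitemap might look like the following (the domain and the disallowed path are placeholders):

```
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap` line tells crawlers where to find a machine-readable list of the site’s pages, so new or updated content is discovered sooner.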


A key takeaway of this article should be that there are no ‘hacks’ for reaching the top of the search results immediately. Quality content and user experience are what raise rankings. While illicit means of obtaining a better position exist, they are (almost) always caught and penalized, so any short-term gains turn into expensive long-term liabilities. SEO is an ongoing battle against competitors in the same sector, won through continuous long-term strategy and adherence to the rules set out by the search engines.

INTP LLC Now Has A Blog

Welcome to the INTP LLC blog! We will be posting our musings here, particularly regarding general overviews of technical concepts and their relation to business profit margins.

Be sure to check back often, and if you like or learn anything new, please feel free to share it.