Google sends out automated programs called robots, spiders, or bots. These bots navigate the web, moving from one link to the next as they find them on web pages. A copy of each page the bots come across is stored in the main index that Google uses to retrieve search results. Googlebot scans all the text elements on a webpage and uses this information to determine what the page is about.
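To make that crawl-and-index loop concrete, here is a minimal sketch in Python. It is not Google's crawler; the tiny in-memory "web" of made-up URLs stands in for real page fetches, and the parser simply collects visible text and outgoing links.

```python
from html.parser import HTMLParser

# A tiny in-memory "web": URL -> HTML. These pages and URLs are invented,
# purely to illustrate the crawl-and-index loop described above.
WEB = {
    "http://example.com/": '<title>Home</title><p>Blue widgets</p>'
                           '<a href="http://example.com/about">About</a>',
    "http://example.com/about": '<title>About</title><p>We sell blue widgets.</p>',
}

class PageParser(HTMLParser):
    """Collects the visible text and outgoing links of one page."""
    def __init__(self):
        super().__init__()
        self.text, self.links = [], []
    def handle_data(self, data):
        self.text.append(data.strip())
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

def crawl(start_url):
    index, queue, seen = {}, [start_url], set()
    while queue:
        url = queue.pop(0)
        if url in seen or url not in WEB:
            continue
        seen.add(url)
        parser = PageParser()
        parser.feed(WEB[url])
        index[url] = " ".join(t for t in parser.text if t)  # store a copy of the page text
        queue.extend(parser.links)                           # follow links to newly found pages
    return index

print(crawl("http://example.com/"))
```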

When a person "Google's" a word, or phrase, Google then scans through its index and finds all the pages that contain that word or phrase. Google often finds hundreds of thousands or even millions of results. Next Google determines a ranking by relevance. This is a very complex task, but here is the essence of it…

Where, and how often, a particular keyword or phrase occurs on a page adds to its relevance. Placement of the keyword in the domain name, the page URL (the full address of that page – www.domainname.com/thekeywordinpagename.html), the page title, heading tags, linked text, and bolded, italicized, or underlined text, as well as how often the word or phrase appears in the body text (keyword density), all play a role in determining the page's overall relevance.
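One way to picture this is as a weighted score across those placements. The weights below are invented for illustration only; Google does not publish its actual values.

```python
def relevance_score(keyword, url, title, headings, body):
    """Toy relevance score. The weights are made up to illustrate the idea
    that placement matters, not to reflect Google's real formula."""
    kw = keyword.lower()
    score = 0.0
    if kw in url.lower():
        score += 3.0            # keyword in the domain name or page URL
    if kw in title.lower():
        score += 5.0            # keyword in the page title
    if any(kw in h.lower() for h in headings):
        score += 2.0            # keyword in a heading tag
    words = body.lower().split()
    density = words.count(kw) / max(len(words), 1)
    score += 10.0 * density     # keyword density in the body text
    return score

print(relevance_score(
    "widgets",
    url="http://example.com/widgets.html",
    title="Blue Widgets",
    headings=["Our widgets"],
    body="We sell blue widgets and ship widgets worldwide.",
))
```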

In addition to the text on the web pages, the navigation structure, programming, and page layout also affect how the spiders crawl through your site. Some programming and navigation can stop the bots dead in their tracks, leaving them unable to find certain (oftentimes very important) content. Bots are also unable to read text that is contained in graphics or Flash animation.

Each of the elements mentioned above can be manipulated, or "optimized", to gain better Google rankings. But this optimization must be done properly. Google penalizes websites for "over-optimization". For example, if a word or phrase is "stuffed" into the text too many times, creating an unnaturally high density, the page could be penalized or, worse, banned from the index altogether.
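As a rough illustration of what "unnaturally high density" means, here is a small check. The threshold is hypothetical; Google does not publish a cut-off.

```python
def keyword_density(keyword, body):
    words = body.lower().split()
    return words.count(keyword.lower()) / max(len(words), 1)

# Hypothetical threshold purely for illustration; there is no published value.
STUFFING_THRESHOLD = 0.05   # flag anything above roughly 5% density

natural = "We sell blue widgets and ship them worldwide."
stuffed = "widgets widgets buy widgets cheap widgets best widgets widgets"

for text in (natural, stuffed):
    d = keyword_density("widgets", text)
    verdict = "looks stuffed" if d > STUFFING_THRESHOLD else "looks natural"
    print(f"{d:.2%} -> {verdict}")
```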

That covers the on-page elements Google evaluates to determine the ranking of any particular webpage for any given search query. Google also uses off-page elements. This refers to linking. A link from one website to another counts as a vote. A site that attracts more links from other sites is deemed by Google to be more important and will often rank higher than similar sites with fewer sites linking to them.
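In its simplest reading, that vote idea is just counting inbound links. A minimal sketch with made-up URLs:

```python
from collections import Counter

# Hypothetical links: (source page, target page). Each one is treated as a vote.
links = [
    ("http://blog.example.com",  "http://shop.example.com"),
    ("http://news.example.com",  "http://shop.example.com"),
    ("http://forum.example.com", "http://other.example.com"),
]

votes = Counter(target for _source, target in links)
# The shop page has two inbound "votes", the other page has one.
for page, count in votes.most_common():
    print(page, count)
```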

Here again, this is more complex than it might appear. Not all links are created equal. Links from some sites carry much more weight, or authority, than links from other sites, and when these authority sites link to a site, they pass more "link weight" to the site they are linking to. To make things a bit more confusing, certain poor linking strategies (like link farms) can even harm your rankings. You'll need the right kind of links from the right places to get the best results.
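The published idea behind "votes from authoritative sites count more" is PageRank-style scoring, where a page's vote is weighted by its own score. The sketch below is a simplified version of that calculation on a toy link graph; Google's real ranking system is far more involved.

```python
# A toy link graph: page -> pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def simple_pagerank(links, damping=0.85, iterations=20):
    """Simplified PageRank-style scoring: a link is a vote, and votes from
    pages that themselves have high scores carry more weight."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

for page, score in sorted(simple_pagerank(links).items(), key=lambda x: -x[1]):
    print(page, round(score, 3))   # page C gathers the most "link weight"
```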

Google also reads the text within a link; this is called the anchor text (the usually blue, underlined text you click on). When the anchor text is descriptive of the page it links to, Google takes that into consideration.
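A simple way to picture this is crediting the anchor text to the page being linked to, so the target page can be associated with words it may not even contain itself. The links and URLs below are hypothetical.

```python
from collections import defaultdict

# Hypothetical links: (source page, target page, anchor text).
links = [
    ("http://blog.example.com", "http://shop.example.com", "blue widgets"),
    ("http://news.example.com", "http://shop.example.com", "widget store"),
]

# Credit each anchor text to the page being linked TO.
anchor_index = defaultdict(list)
for source, target, anchor in links:
    anchor_index[target].append(anchor)

print(anchor_index["http://shop.example.com"])
# ['blue widgets', 'widget store']
```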

Other factors that Google considers include the age of the domain name, the country in which the web server hosting the domain is located, and others.

Google also uses something called Latent Semantic Indexing (LSI). Latent Semantic Indexing takes into account the semantics of language: multiple terms that mean the same thing, as well as words and terms that tend to occur together in content on a particular topic. Using a mix of different variations on your main keywords, such as plurals and "-ing" forms, as well as other related terms, therefore helps Google understand what your page is about.
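Real LSI is built on singular value decomposition of a large term-document matrix, but the underlying intuition of "terms that occur together point to the same topic" can be shown with a simple co-occurrence count over a few made-up documents.

```python
from collections import Counter
from itertools import combinations

# Toy documents about one topic. This only counts co-occurrences; it is an
# illustration of the intuition, not an implementation of LSI itself.
docs = [
    "fresh coffee beans and espresso brewing",
    "espresso grinder settings for coffee beans",
    "brewing coffee with a french press",
]

co_occurs = Counter()
for doc in docs:
    words = set(doc.split())
    for pair in combinations(sorted(words), 2):
        co_occurs[pair] += 1

# Terms that repeatedly appear alongside "coffee" hint at the same topic.
related_to_coffee = [pair for pair, count in co_occurs.items()
                     if "coffee" in pair and count > 1]
print(related_to_coffee)
```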