Search engine optimization (SEO)

[Figure: a link graph of websites. Assume each circle is a website and the arrows are links from one website to another. Users can click a link within website F to go to website B, but not vice versa. Search engines begin by assuming each website has an equal chance of being chosen; crawlers then examine which websites link to which and infer that websites with more incoming links contain the information users need.]


Last Update

  • 16 July 2017

Introduction

See also Google scholar | Grey data ("hard to find" data) | Microsoft Academic Search | Open access | Search engines

Search engine optimization (SEO) is an activity, a domain, even a discipline within Internet marketing in which the volume or quality of traffic to web sites from search engines is strategically improved, gamed, or manipulated. Generally this improvement is accomplished through "natural" or unpaid ("organic" or "algorithmic") search results, as opposed to search engine marketing (SEM), which relies on paid inclusion. Typically, the earlier (or higher) a site appears in search results or listings, the more visitors it will receive. SEO may target different kinds of searching, including image search, local search, video search and industry-specific vertical search. This is what gives a web site its digital presence.
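
The link-counting behaviour described in the figure caption above is the intuition behind link-based ranking schemes such as Google's original PageRank algorithm. The sketch below is a minimal illustration only; the toy link graph, the damping factor of 0.85 and the fixed iteration count are assumptions, not details drawn from this article.

```python
# Minimal PageRank-style power iteration over a toy link graph.
# The graph, the damping factor and the iteration count are illustrative assumptions.

links = {
    "A": ["B", "C"],    # page A links out to B and C
    "B": ["C"],
    "C": ["A"],
    "F": ["B"],         # F links to B, but B does not link back to F
}

pages = sorted(links)
rank = {p: 1.0 / len(pages) for p in pages}    # start every page with an equal chance
damping = 0.85

for _ in range(50):                            # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)     # a page splits its score among its outlinks
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")              # more incoming links tends to mean a higher score
```

In this toy graph, page F has no incoming links, so it converges to the lowest score, which is the behaviour the figure caption describes.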

Getting indexed

All of the leading search engines, including Google, Bing and Yahoo!, employ crawlers to locate webpages for their algorithmic search results. Pages that are linked from other indexed pages do not need to be submitted because they are found automatically. Some search engines operate a paid submission service that guarantees crawling, either for a flat fee or on a cost-per-click basis. These programs usually guarantee inclusion in the database, but do not guarantee a specific ranking within search results. Two major directories, the Yahoo! Directory and the Open Directory Project, require manual submission and human editorial review. Google offers webmaster tools through which an XML feed (a sitemap) can be created and submitted for free to ensure that all pages are found, especially those that are not normally discoverable by automatically following links.
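
The free XML feed mentioned above refers to the XML sitemap format defined by the sitemaps.org protocol. Below is a minimal sketch of generating such a file with Python's standard library; the example.com URLs, the dates and the output file name are hypothetical placeholders, not taken from this article.

```python
# Write a minimal sitemap.xml using only the Python standard library.
# The URLs, dates and file name are hypothetical placeholders.
import xml.etree.ElementTree as ET

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

for loc, lastmod in [
    ("https://www.example.com/", "2017-07-16"),
    ("https://www.example.com/about", "2017-06-01"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc          # canonical address of the page
    ET.SubElement(url, "lastmod").text = lastmod  # optional last-modified date

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The finished file is then submitted through the search engine's webmaster tools so that pages which are not reachable by following links can still be crawled.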

Search engine crawlers consider a number of different factors when crawling a web site, and not every page is indexed. A page's distance in clicks from the root directory of the site may also factor into whether or not it is crawled.
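
One way to picture the click-distance factor is a breadth-first walk over a site's internal links that counts how many clicks separate each page from the homepage. The link structure below is a hypothetical illustration, not a real crawl.

```python
# Breadth-first search over a site's internal links to measure click depth.
# The link structure here is a hypothetical illustration.
from collections import deque

internal_links = {
    "/":             ["/about", "/services"],
    "/about":        ["/team"],
    "/services":     ["/services/seo"],
    "/team":         [],
    "/services/seo": ["/contact"],
    "/contact":      [],
}

depth = {"/": 0}                      # the homepage is zero clicks away
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for linked in internal_links.get(page, []):
        if linked not in depth:       # record only the shortest path found
            depth[linked] = depth[page] + 1
            queue.append(linked)

for page, clicks in sorted(depth.items(), key=lambda kv: kv[1]):
    print(f"{clicks} clicks: {page}")
```

Pages that sit many clicks from the root, such as /contact in this example, are the ones a crawler may skip.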
