
What is SEO? SEO is the practice of modifying a website to increase its ranking in the organic (as opposed to paid), crawler-based listings of search engines.

----------------------------------------------------------------------------------------------------------
Types of Search Engines

(1) Crawler-Based Search Engines
Crawler-based search engines use automated software programs to survey and categorise web pages. The programs the search engines use to access your web pages are called spiders, crawlers, robots or bots. A spider finds a web page, downloads it and analyses the information presented on it; this is a seamless process. The page is then added to the search engine's database. When a user performs a search, the search engine checks its database of web pages for the keywords the user searched on and presents a list of link results, ordered by how closely (as judged by the engine) each page matches what the user wants to find online. Crawler-based search engines constantly search the Internet for new web pages and update their database with these new or altered pages.
Examples of crawler-based search engines:
Google (www.google.com)
Ask Jeeves (www.ask.com)

(2) Directories
A directory uses human editors who decide which category a site belongs to; they place websites within specific categories in the directory's database. The human editors comprehensively check the website and rank it, based on the information they find, using a pre-defined set of rules. There are two major directories at the time of writing:

Yahoo Directory (www.yahoo.com)
Open Directory (www.dmoz.org)
Note: Since late 2002, Yahoo has provided search results using crawler-based technology as well as its own directory.

(3) Hybrid Search Engines
Hybrid search engines use a combination of both crawler-based results and directory results. More and more search engines are moving to a hybrid model.
Examples of hybrid search engines:
Yahoo (www.yahoo.com)
Google (www.google.com)

(4) Meta Search Engines
Meta search engines take the results from all of the other search engines and combine them into one large listing.
Examples of meta search engines:
Metacrawler (www.metacrawler.com)
Dogpile (www.dogpile.com)
----------------------------------------------------------------------------------------------------------
Meta Definition: The <meta> element provides meta-information about your page, such as descriptions and keywords for search engines and refresh rates.

Meta Element
Meta elements are HTML or XHTML elements used to provide structured metadata about a web page. Such elements must be placed as tags in the head section of an HTML or XHTML document. Meta elements can be used to specify the page description, keywords and any other metadata not provided through the other head elements and attributes. The meta element has four valid attributes: content, http-equiv, name and scheme. Of these, only content is a required attribute.

An Example of the Use of the Meta Element
In one form, meta elements can specify HTTP headers which should be sent before the actual content when the HTML page is served from web server to client. For example:

<meta http-equiv="Content-Type" content="text/html" />

This specifies that the page should be served with an HTTP header called 'Content-Type' that has a value 'text/html'. This is a typical use of the meta element, which specifies the document type so a client (browser or otherwise) knows what content type to render.

In the general form, a meta element specifies name and associated content attributes describing aspects of the HTML page. For example:

<meta name="keywords" content="wikipedia,encyclopedia" />

In this example, the meta element identifies itself as containing the 'keywords' relevant to the document: Wikipedia and encyclopedia. Meta tags can also be used to indicate the location a business serves:

<meta name="zipcode" content="45212,45208,45218,etc." />

In this example, geographical information is given according to zip codes.
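Putting these pieces together, a head section combining several meta elements might look like the following sketch (the title, description and keyword values are invented placeholders, not values from the text):

```html
<head>
  <!-- HTTP header equivalent: tells the client what content type to render -->
  <meta http-equiv="Content-Type" content="text/html" />
  <title>Example Page</title>
  <!-- Description some search engines display in their result listings -->
  <meta name="description" content="A short, precise summary of this page's content." />
  <!-- Keywords relevant to the document -->
  <meta name="keywords" content="example,sample,demo" />
</head>
```

Note that all of these elements must appear inside the head section, before any body content.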

Major search engine robots are more likely to quantify such extant factors as the volume of incoming links from related websites, quantity and quality of content, technical precision of source code, spelling, functional vs. broken hyperlinks, volume and consistency of searches and/or viewer traffic, time within the website, page views, revisits, click-throughs, technical user-features, uniqueness, redundancy, relevance, advertising revenue yield, freshness, geography, language and other intrinsic characteristics.
----------------------------------------------------------------------------------------------------------
<!DOCTYPE> Definition
The <!DOCTYPE> declaration is the very first thing in your document, before the <html> tag. This declaration tells the browser which HTML or XHTML specification the document uses. HTML 4.01 specifies three document types: Strict, Transitional and Frameset.

HTML Strict DTD

Use this when you want clean markup, free of presentational clutter, together with Cascading Style Sheets (CSS).

HTML Transitional DTD
The Transitional DTD includes presentational attributes and elements that the W3C expects to move to style sheets. Use this when you need HTML's presentational features because your readers' browsers don't support Cascading Style Sheets (CSS).

Frameset DTD
The Frameset DTD should be used for documents with frames. It is equal to the Transitional DTD, except that the frameset element replaces the body element.
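For reference, the standard HTML 4.01 declarations for the three document types described above are:

```html
<!-- Strict -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">

<!-- Transitional -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
  "http://www.w3.org/TR/html4/loose.dtd">

<!-- Frameset -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Frameset//EN"
  "http://www.w3.org/TR/html4/frameset.dtd">
```

Exactly one of these goes on the first line of the document, before the <html> tag.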
----------------------------------------------------------------------------------------------------------------------------------------

What is Spamdexing?
Spamdexing is defined by search engines as any improper referencing technique or method, such as adding keywords that have nothing to do with the page and hiding them from visitors. Spamdexing is likened to spam because it is seen as deception, which is contrary to the interest of Internet users. Common spamdexing techniques include:
Inserting keywords that are the same colour as the page's background (invisible keywords)
Adding keywords that have nothing to do with the page to the meta tags
Repeating keywords (keyword stuffing)
Webpage hijacking (pagejacking)
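As an illustration of the "invisible keywords" technique listed above (shown as an example of what search engines penalize, not as a recommendation), white text on a white background is readable by crawlers but not by visitors; the keyword text here is invented:

```html
<body bgcolor="#ffffff">
  <!-- Same colour as the background: invisible to visitors,
       but indexed by crawlers - classic spamdexing -->
  <font color="#ffffff">cheap flights cheap flights cheap flights</font>
</body>
```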

----------------------------------------------------------------------------------------------------------
What is Cloaking?
Cloaking is a technique banned by search engines (i.e. it should be avoided) that consists of generating different HTML content depending on whether it is intended for a visitor or for a search engine. It is possible to detect search engine robots through the User-Agent field of the HTTP requests they send, and to show them different content that includes extra keywords not shown to visitors.

Nevertheless, if this technique is detected by a search engine (which is easy for them to do), the website runs the risk of no longer being indexed, or even of being blacklisted (banished) for several months.
----------------------------------------------------------------------------------------------------------
Presentation of the robots.txt File
robots.txt is a text file that contains commands for search engine indexing robots, specifying the pages that can and cannot be indexed. When a search engine explores a website, it starts by looking for the robots.txt file at the root of the site.
robots.txt File Format
The robots.txt file is an ASCII file found at the root of the site. It can contain the following commands:
User-agent: used to specify the robot that is subject to the following orders. The value * means "all search engines"
Disallow: used to identify the pages to be excluded from indexing. Each page or path to be excluded must be on a separate line and must start with /. The value / alone means "all of the website's pages"
----------------------------------------------------------------------------------------------------------
Promoting a Website
Promoting a website consists in making it known publicly through several channels in order to, depending on the case, improve traffic, gain a reputation, attract prospective customers or develop sales numbers. "Web marketing" (also called cybermarketing or netmarketing) is any campaign that improves a website's visibility by using the Internet as a marketing channel, as opposed to "traditional marketing". Because web marketing and traditional marketing are not necessarily exclusive of each other, a well-articulated online advertising campaign combined with a traditional offline campaign will have even more impact.
-----------------------------------------------------------------------------------------------------------
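To illustrate the robots.txt format described above, a minimal file placed at the root of the site could allow all robots everywhere except one directory (the /private/ path is an invented example):

```
User-agent: *
Disallow: /private/
```

The * value applies the rule to every robot; each excluded path gets its own Disallow line starting with /.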

What is Web Positioning?
"Web positioning" generally means all the techniques used to improve website visibility:
referencing, which consists of introducing the website into search tools by filling out the search tools' forms
positioning, which consists of positioning the website, or specific pages of it, on the first results page for certain keywords
ranking, whose goal is similar to that of positioning but for more elaborate phrases; part of the work is to identify these requests
----------------------------------------------------------------------------------------------------------
Improving Web Positioning
There are some design techniques that can be used to position webpages more effectively:
original and attractive content
an aptly chosen title
a fitting URL
a body text that can be read by search engines
META tags that precisely describe the page's content
well thought out links

Webpage Content
Page Title
The title must describe the webpage's content as precisely as possible (in under 7 words and 60 characters). The title is all the more important because it will appear in the user's favorites as well as in the search history.
Page URL
Body of the Page
Meta Tags
Hypertext Links
ALT Attributes for Images
A website's images are opaque to search engines, i.e. search engines are not capable of indexing image content. Therefore, it is a good idea to place an ALT attribute on every image that describes its content. The ALT attribute is also of utmost importance for the blind, who browse the Internet with a Braille terminal.
Web Positioning Hinges on the Page
The items that search engines index are webpages. Therefore, when designing a webpage, the above-mentioned advice must be taken into account when structuring each page. Most webmasters remember to correctly index their website's home page but neglect the other pages, even though it is the other pages that contain the most interesting information. It is therefore absolutely imperative to choose an appropriate title, URL, meta tags, etc. for every page of a website.
----------------------------------------------------------------------------------------------------------
Measuring and Qualifying Website Traffic
Every webmaster's goal is to increase traffic to his or her website, i.e. to increase the number of visits every day. Therefore, it is essential to have indicators that, on the one hand, facilitate the measurement of how website traffic is evolving (which is called both "audience monitoring" and "website metering") and, on the other hand, identify the audience in order to provide content that is closer to what the website's visitors want. Generally there are thought to be two types of studies:
Site-centric measurement
User-centric measurement, performed mostly with a panel of users

Measuring and qualifying a website's traffic are two methods for measuring a website's effectiveness, in order to permanently improve its quality.
How to Measure a Website's Traffic
There are three solutions for measuring a website's traffic:
Exploiting the web server's logs (log files) by using a specific tool. This involves choosing a tool capable of analysing web server log files and creating a control panel containing the website's main traffic indicators

Developing an ad hoc statistics system. It is possible to store visitor information each time a visitor loads a page, in order to use that information at a later date. For websites with high traffic, this type of mechanism can load the processor heavily and fill the disk, especially if the collected data are stored in a database management system
Using a "traffic measurement" service. This system consists of inserting a "marker" or "tag" on each page so that the traffic measuring service can collect the data on a server. The advantage of this type of service is that it conserves material resources, because all of the processing is done on a remote server. What is more, the company offering the service is responsible for upgrading the indicators and control panels so as to stay constantly in sync with the evolution of Internet access technology and web browsers. However, the statistics gathered this way will not necessarily be exhaustive, because:
some users stop loading pages before the tag code is downloaded
intermediate proxy servers are likely to prevent the page from loading
security infrastructures, and firewalls in particular, can block information from being uploaded
Hits
One webpage may be made up of a certain number of files: image files, style sheets, JavaScript files, etc. Thus, a "hit" is a file loaded by a browser. If a webpage containing three images is loaded, that equals four hits (the page itself plus the three images).
----------------------------------------------------------------------------------------------------------
Monitoring Referencing and Position
How a website is referenced, and how its position evolves over time, should be tracked. Regularly monitoring referencing and position is recommended, so as to verify that a website still has a good position and, if necessary, to take corrective measures towards improving it.
The "link:" command supported by many search engines allows you to determine the number of websites pointing to a given address ("backlinks"). Thanks to this system, it is easy to create a gauge of a website's popularity.
Gauge of the number of referenced pages, by search engine:
Google: "site:www.commentcamarche.net"
Altavista: "host:www.commentcamarche.net"

Voilà: "url:commentcamarche.net"
AlltheWeb: "domain:commentcamarche.net"
Gauge of the number of links pointing to a website, by search engine:
Altavista: "link:commentcamarche.net"
Google: "link:www.commentcamarche.net"
Hotbot, Lycos, Yahoo: "linkdomain:commentcamarche.net"
Voilà: "anchor:commentcamarche.net"
-----------------------------------------------------------------------------------------------------------
