Search Engine Optimization articles.

Discover the proposed substitute for Google PageRank: TrustRank

For some months now there has been speculation that Google might replace the PageRank algorithm with a new one that would filter out all search engine spam, or at least try to neutralize it. Discover in this article everything that is known so far about the new algorithm, whose name is TrustRank.

Spam is the kind of technique used to make the SGAE website appear in first position when you search Google for "thieves" ("ladrones").

The new Google algorithm would prevent such practices.

The purpose of PageRank is to assign a numerical value to web pages according to the number of times they are recommended by other pages, weighted by the PageRank those pages have in turn. That is, it establishes the importance of a website. Its logic is: if web pages link to another page, they are recommending it; and if it is recommended, it must be important in the subject area of the first website. A recommendation coming from a page that is itself highly recommended is worth more than a recommendation coming from a page that hardly anyone recommends.

Google wants the top positions of its search results to contain pages with real relevance, recommended by other pages that in turn also have relevance. To determine PageRank, Google analyzes the number of links coming from other web pages and the PageRank of those pages. TrustRank starts from the same base, but instead of assessing the importance of a recommendation based on the PageRank of the page that makes it, it does so from a set of web pages that have been deemed important by humans rather than by algorithms.

Web pages that humans have determined to be important are considered "seed sites", and their links are assigned a value. It is that value that will be transmitted across the network.

To illustrate with an example: suppose we have a seed site "A". "A" transmits a TrustRank value of 100 to all the websites it links to. These pages, in turn, pass a TrustRank of 99 to all the websites they link to. And the latter pass a TrustRank of 98 to the pages they link to.

To mitigate the degradation of TrustRank as pages move away from the seed sites, the algorithm includes a correction that takes into account the number of degrees between the seed site and the page receiving TrustRank, without completely cancelling the effect of the distance from the seed.
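
To see the mechanics in code, here is a minimal Python sketch of the hop-by-hop propagation just described, under the article's simplified model (a fixed starting score that loses one point per hop). The actual TrustRank paper by Gyöngyi, Garcia-Molina and Pedersen formulates this as a biased PageRank with a damping factor, so treat this only as an illustration of the idea:

```python
# Toy sketch of the propagation described above: pages linked from the seed
# receive a score of 100, and each further hop away loses one point. The
# real TrustRank paper uses a biased PageRank instead of this fixed decay.
def propagate_trust(links, seed, base=100, decay=1):
    trust = {}                            # page -> score received
    frontier = list(links.get(seed, []))  # pages one hop from the seed
    seen = {seed}
    score = base
    while frontier and score > 0:
        next_frontier = []
        for page in frontier:
            if page not in seen:
                seen.add(page)
                trust[page] = score
                next_frontier.extend(links.get(page, []))
        frontier = next_frontier
        score -= decay                    # one hop farther from the seed
    return trust

# The article's example: seed "A" passes 100, pages it links to pass 99...
links = {"A": ["B"], "B": ["C"], "C": ["D"]}
print(propagate_trust(links, "A"))        # {'B': 100, 'C': 99, 'D': 98}
```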

The idea of TrustRank looks good, but there are certain issues that must be considered:

Which sites will be the seeds?

Will it be possible to perform reverse spam?

Sarcastic comments aside, and thinking about the not-too-distant future when TrustRank is working, it occurs to me that the same people who play at making the SGAE appear when you search for the word "thieves" might play at sabotaging web pages: linking to them mercilessly from their spam pages, thereby subtracting TrustRank and keeping them from appearing at the top of the search engines.

When will we have the TrustRank algorithm built into Google?

No idea... nobody agrees on this. When we least expect it, Google will issue a statement informing us that it has already been implemented. What is clear is that it will notify the press and the Internet: it will certainly be a major qualitative improvement in search results, so Google will make people aware of it. I doubt Google's communications department would throw away an opportunity like this.

Further information for anyone wishing to broaden their knowledge:

Link to the Stanford University paper on TrustRank: http://dbpubs.stanford.edu:8090/pub/2004-17

Search Engine Optimization course (which will no doubt have to change the day TrustRank is implemented, but which already includes the new indexing system with the Google Sitemap Generator): Online Search Engine Optimization Course. The course is free.

Discover the indexing of the future: Google Sitemaps

Google proposes what will be the new way to index web pages.
Search engines like Google and Yahoo use spiders to gather information from the web pages published on the Internet. Once they have the information, they process it so that, when a user visits their site and searches for a term or a phrase, they can quickly sort the results according to a specific algorithm.

Search engine spiders regularly visit the websites published on the Internet and automatically update the information about their content.

Until now, spiders entered the root directory of a domain, looked for the robots.txt file to confirm that the site wanted to be indexed, and then proceeded to visit all the links found on the website, thus recording the content of its pages.

Google Sitemaps will revolutionize this way of indexing web pages.

It is not just that Google now reads more carefully the site maps that people include in their web pages... it is nothing like that... it is a radically new way of indexing the content of pages. Google proposes that we create an XML sitemap following certain specifications, which will give its spiders all the information and allow them access to URLs that until now may have been hidden for various reasons beyond the webmasters' control.

Google wants to access the content of web pages in the easiest and most efficient way possible. Indexing as it stands today is already far more efficient than the manual submission we had in the old days (who doesn't remember going to a search engine and entering by hand the description of our site, the keywords we wanted to be found for, and the site URL... but that is Internet prehistory), and what Google now presents to us is much better still.

It all comes down to making a special sitemap available to the spiders.

To create this sitemap, it is enough to install an application on our server (there are versions for all operating systems) that creates a site map in a specific format. The application Google proposes can generate the map from the URL of the website, from the directories of the website, or from the server logs (ideal for dynamic pages).
Once we have the sitemap built according to Google's specifications, we can register it in Google Sitemaps. Automatically, and in less than four hours, Google will have indexed it.
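
As a concrete illustration, here is a minimal Python sketch of the kind of XML document such a generator produces. The namespace shown is the one later standardized at sitemaps.org (Google's original protocol used its own schema URL, so check the specification linked at the end of this article), and the URLs and field values are invented:

```python
from datetime import date
from xml.sax.saxutils import escape

# Builds a minimal sitemap: one <url> entry per page, with the optional
# <lastmod>, <changefreq> and <priority> fields the specification describes.
def build_sitemap(urls):
    entries = []
    for url in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
            "    <changefreq>daily</changefreq>\n"
            "    <priority>0.5</priority>\n"
            "  </url>"
        )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "\n".join(entries) + "\n</urlset>")

print(build_sitemap(["http://www.example.com/",
                     "http://www.example.com/contact.html"]))
```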

Google allows webmasters to set up a cron job that generates a new map every hour (for sites with heavy content turnover) and submits it automatically to Google Sitemaps. This way the spiders learn about newly created pages immediately, and those pages can be incorporated into the index.
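
A sketch of what that hourly resubmission step might look like, assuming a sitemap already published at a hypothetical URL. Google documented a ping endpoint of roughly this form for Sitemaps; verify the exact URL against the protocol documentation linked below:

```python
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "http://www.example.com/sitemap.xml"   # hypothetical location

def ping_google(sitemap_url):
    # Endpoint form taken from Google's Sitemaps documentation of the time;
    # treat it as an assumption to check against the protocol docs.
    endpoint = ("http://www.google.com/webmasters/sitemaps/ping?sitemap="
                + quote(sitemap_url, safe=""))
    with urlopen(endpoint) as response:
        return response.status            # 200 means the ping was accepted

if __name__ == "__main__":
    # Example hourly crontab entry (regenerate the map, then ping):
    # 0 * * * * /usr/bin/python3 /path/to/resubmit_sitemap.py
    print(ping_google(SITEMAP_URL))
```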

Advantages of this application:

No matter how badly structured your website's paths are for the spiders... with a site map created by the Sitemap Generator, Google's spiders will always find the URLs of all your pages.

Another great advantage is how quickly the content of the entire site is indexed: in less than 4 hours, the spiders will have visited up to 50,000 links on our website. For websites with more URLs, Google recommends creating several sitemaps and a sitemap index.

Disadvantages of this application:

It requires some programming knowledge, so either ISPs offer this service as added value for their customers, or many websites will not have it and will continue to be indexed by the ordinary spiders.

The sitemaps already available on most web pages are not compatible with Google's format: Google wants an XML document with specific characteristics.

With this project, Google is undoubtedly seeking to improve the indexing of web pages and to include in its indexes pages that until now were lost in a sea of links within our sites.

Google has created the Sitemap Generator and the express indexing service and offers them completely free... it will be interesting to see Yahoo's reaction to this, because Yahoo offers a paid fast-indexing service at $49, $20 or $10 depending on the number of URLs we want indexed on an accelerated basis.

We do not yet have firsthand results regarding the effectiveness of indexing through Google Sitemaps. Once we have installed the new sitemap on various websites and are ready to compare the increase in the number of indexed pages and in the frequency of spider visits, we will write a new article reporting the results. See you then.

Later note: a few months have passed since we wrote this article. The results have been very good. An entire new website is indexed in less than 24 hours. It is ideal for when a new site goes online: it can be indexed all at once, without having to wait months and months for Google's spiders to read its entire contents.

Additional information:

URL with information about Google sitemap:
https://www.google.com/webmasters/sitemaps/docs/en/about.html

URL with specifications about Google sitemap:
https://www.google.com/webmasters/sitemaps/docs/en/protocol.html

Discover the tools that will make your life easier as an SEO

This article describes some of the tools that are very useful when working on the positioning of your web page in Google and the other search engines.
In the last article we described the 8 key factors for beating your adversary in Google or, in other words, the factors you must take into account when benchmarking your web page (your positioning with respect to other sites) so you know what to improve in order to overtake them. Today we will talk about the tools that will help you carry out that benchmark.

How to find out how many pages you have indexed in the main search engines

We already noted that the command to type into the search boxes of the main engines is:
site:nombredeweb.com
(Substituting "nombredeweb" with the name of the website you want to analyze, and without a space between "site:" and the URL.)

But there is a tool that lets you do this in several search engines simultaneously and that also keeps a history of your positioning so you can see your evolution over the months. Completely free: Marketleap. Marketleap is probably one of the best websites/tools for knowing your positioning.

At the link we mention, just enter the URL of your web page (plus up to 3 more sites), type in the security code, and in a moment you will know how many pages you have indexed in 6 search engines, including Google, Yahoo and MSN. When it shows you the results, at the top right it will offer to display the history of your queries so you can track your evolution.

If you find that you have fewer pages indexed than you actually have published on the web, it means the search engine spiders are not reaching all of your pages.

How to find out the keyword density in the text of a web page

Several web pages offer this service. If the page is not too heavy, Ranks NL gives us a very good analysis, completely free. It also shows in green the words with an adequate density and in red those whose density is dangerous and risks triggering a penalty in the search engines that police abuses.

How to find out how many web pages link to yours

As we noted in the previous article, the command to run in a search engine is:
link:www.nombredeweb.com

But once again, Marketleap has a tool that makes the job easier and runs this query simultaneously in several search engines; in addition, if we wish, it compares us with websites in our sector (unfortunately it only has American sites catalogued, so this last feature is interesting only as a curiosity).

Also at the top right you will find a link to your history (obviously, if this is the first time you query your site in Marketleap there will be no history... but there will be next time).

How to find out what position your web page occupies for certain searches

Monitoring your position for the words you consider key for people to find your business is very important. Many tools exist to help with this task. You will find most of them on Softonic, in the "Monitores Posición en Buscadores" (search engine position monitors) section.

My favourite tool is Free Monitor for Google 2.0. It is 100% free and, frankly, very useful: you give it a list of words or phrases and it tells you what position your web page occupies (or your competitors' pages, depending on what you are analysing). You can also export the results and include them in any document. The only thing to bear in mind is that it queries google.com, not google.es, so if we want to check whether what Google Monitor tells us is true, we must run the query on google.com and count the number of pages ahead of ours.

Around the web you will find programs that compare the position you occupy in several search engines for certain words. My favourite is ThumbShots, which displays the results graphically and, in addition to your site, shows a hundred more results.

How to find out the PageRank of a web page

The easiest way is to have the Google Toolbar installed, but there are many websites that also offer this information completely free and, in addition, let us check PageRank.

Interesting links:

Link to the PowerPoint presentation that explains how a search engine works on the inside, which can help you if you find that Google has fewer URLs of your website indexed than it should.

8 key factors to beat your opponent in Google

No one can claim to know the algorithm Google uses to sort search results, but it is relatively easy to investigate which factors are involved and to what extent they affect it. Moreover, you will find plenty of literature on the subject on the net, and you can keep extending your knowledge if the topic fascinates you.

This article presents 8 key factors that will help you understand why other sites rank above yours, and how you can overtake them.

1. Decide which words you are going to concentrate your efforts on

You cannot fight for many words at once, so concentrate your efforts on about 10 words or phrases that you think your target audience might search for in Google.

Begin the analysis that will lead you to success by making a list of the top 5 websites that appear in the first results when searching for those 10 words.

Browse through those 5 pages. Pay special attention to discovering which words they are targeting.

2. Find out where the words you want to fight for are located

Look carefully at where they place their keywords.

Google gives more importance to words located in certain parts of a web page. The most important part is the URL (the address of your website); next is the <title> tag; then the headers <h1>, <h2> and <h3>; then the words that form links to other pages; and from there the importance keeps diminishing, although it is always higher than that of plain text if the words are in bold or italics, form part of an alt attribute (alternative text on images), etc.
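
As a quick way to audit where your own keywords sit, here is a small standard-library Python sketch that collects the text found in the weight-carrying spots listed above (title, headers, anchor text, bold/italics, image alt attributes). The sample HTML is invented:

```python
from html.parser import HTMLParser

# Collects the text inside the tags the article says carry extra weight,
# plus alt attributes on images, so you can check where your keywords sit.
class KeywordFields(HTMLParser):
    WATCHED = {"title", "h1", "h2", "h3", "a", "b", "strong", "i", "em"}

    def __init__(self):
        super().__init__()
        self.fields = {}     # tag name -> list of text fragments found in it
        self.stack = []

    def handle_starttag(self, tag, attrs):
        if tag in self.WATCHED:
            self.stack.append(tag)
        elif tag == "img":   # alternative text on images counts too
            alt = dict(attrs).get("alt")
            if alt:
                self.fields.setdefault("img/alt", []).append(alt)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack and data.strip():
            self.fields.setdefault(self.stack[-1], []).append(data.strip())

parser = KeywordFields()
parser.feed('<title>Wooden office tables</title><h1>Wooden tables</h1>'
            '<a href="/shop">buy wooden tables</a><img src="t.jpg" alt="wooden table">')
print(parser.fields)
# {'title': ['Wooden office tables'], 'h1': ['Wooden tables'],
#  'a': ['buy wooden tables'], 'img/alt': ['wooden table']}
```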

3. Find out what keyword density they have

Keep in mind a few things:

Google (and the other search engines) work by density, not by absolute values. So if your URL or your title has 50 characters and 9 of them coincide with what the user searched for, the value of your URL or title is 9/50. So try not to include superfluous text, or URLs with umpteen digits corresponding to a user session or something similar.

Also bear in mind that since March 2004 Google works by characters, not by words; hence the preceding paragraph says "characters" rather than "words". Until March 2004, if your title was "mesas de madera para oficina" (wooden office tables) and the user searched for "mesas de madera" (wooden tables), the value of your title was 3/5 (in Spanish, prepositions were not filtered out and counted as words). That is no longer the case: now it goes character by character. Thus, if someone searches for a derivative of a word, a plural, or a conjugated verb, pages containing something similar are also included in the search results.
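
To make the arithmetic concrete, here is a tiny sketch of the character-based density described above, reproducing the 9-characters-out-of-50 example. The formula is the article's simplified model, not Google's actual scoring:

```python
# Character-based density as the article describes it: the characters
# matching the user's query divided by the total characters in the field
# (title, URL, header...). A simplified illustrative model.
def char_density(field, query):
    field_l, query_l = field.lower(), query.lower()
    matched = len(query_l) if query_l in field_l else 0
    return matched / len(field_l) if field_l else 0.0

# A 50-character title in which the 9 characters of the query match: 9/50.
title = "mesas de madera para oficinas y hogar en Barcelona"
print(len(title), char_density(title, "de madera"))   # 50 0.18
```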

When you discover where they place their words, look at the density with which they appear. On your own page, make those words denser than on theirs. You can do this by including the word more times, or by including fewer words unrelated to that search. The point is to raise the density and beat theirs in each of the parts where the word appears.

Careful not to go overboard... Google penalizes pages with suspiciously high densities. You can reach 100% density in the title and the URL without anything happening. But if a page has one word repeated 100 times, everywhere, in bold and in links, with no other text, you can be sure it will be expelled from Google. So, moderation.

Also, remember that your website has to be read by your users/customers... it is essential that the text is aimed at them, not at search engine effectiveness.

4. Find out how many pages their websites have

The more pages you have indexed in Google, the more likely you are to take part in the fight for certain words. There are also indications that Google ranks better those websites that contain a large number of pages in which the search term appears.

So, on the one hand, include the words you want to rank for in as many pages as possible. On the other, try to grow your website to about 200 pages or more.

But once again, find out what your competitors do and include it in the table you started at the beginning of this study.

To find out how many pages a website has indexed in Google, simply type in the search box:

site:www.nombredelaweb.com

(Careful: do not include a space between site: and the URL)

To find out how many indexed pages contain a particular word or string of words, simply type in the search box:

site:www.nombredelaweb.com "word or phrase"

This will give you the number of pages on www.nombredelaweb.com that contain the phrase "word or phrase".

5. Check the number of links pointing to their pages

The algorithm that produces PageRank (cultural note: PageRank means "Larry Page's rank", not "page ranking") is made up of many other algorithms and is quite complicated to understand. But there are some basic features that can easily be applied to your website.

PageRank is influenced by, among other things, the number of links pointing to a website, the density of those links on the source page, and the PageRank of the source page.

So this point 5 will focus on the first of the factors affecting PageRank: the number of links.

Again, note down the number of pages that link to each of the 5 competing websites you are analysing in your list.

To find the number of links to a page, simply type in the search engine box:

link:www.nombredelaweb.com

Since March 2004, Google has given less value to links that come from pages with an IP similar to yours, so there is no point in cheating: Google knows.

A few months ago we wrote an article about the Hilltop algorithm, which Google uses to calculate and filter the PageRank of sites: Hilltop

6. Analyze what kind of websites link to your competitors

In all likelihood you will not be able to include in your list the PageRank of every page that links to your competitors', but it is important to see what kind of websites they are, what PageRank they have, how many other websites they link to, and what words they use to link to your competitors.

The higher the PageRank of a page that links to you, the more points that link earns you. So look for pages with a high PR and get them to link to you.

To conclude this point, do not forget that in Google and the other search engines everything works by density, so if a page contains 100 links to other websites, the value of the link that comes to you is 1/100. So forget about link farms. Get links from pages that have few links and a high PageRank.
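
To see this dilution in numbers, here is a textbook-style power-iteration sketch of PageRank: each page splits its score among its outbound links, so a link from a page with 100 links carries 1/100 of that page's contribution. The damping factor and the toy graph are standard illustrative choices, not Google's actual parameters:

```python
# Textbook power-iteration sketch of the PageRank idea described above.
def pagerank(links, damping=0.85, iterations=50):
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for source, targets in links.items():
            if targets:
                # a link's value is the source's rank divided by its outlinks
                share = damping * rank[source] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: "hub" sprays 100 links around; "focused" links only to "you",
# so its single link is worth roughly 100 times more than each hub link.
graph = {"hub": [f"page{i}" for i in range(99)] + ["you"],
         "focused": ["you"]}
print(sorted(pagerank(graph).items(), key=lambda kv: -kv[1])[:3])
```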

7. Find out which words your competitors' inbound links use

If the searched word forms part of a third-party link to your website, you get a bonus in points (to put it some way). So if you make wooden office tables, make sure the pages that link to yours use the phrase "wooden tables" to link to you, instead of www.minombredeempresa.com

Obviously, you cannot always control which words third-party websites use to link to yours... but whenever you can, remember this point 7 and the bonus it earns you if you pull it off!

8. Write down the PageRank of your competitors' pages

Do not forget to include a column in your study indicating the PageRank of your competitors' websites. It will help you understand why they are in the top positions.

Remember that to increase your PageRank you must, above all, increase the number of pages that link to yours. So if your PageRank is below 4, get to work obtaining links. Climbing above 4 is quite difficult unless you run a specific campaign for this purpose, well designed and with a good strategy.

So far we have described the 8 key factors that will help you gain positions in Google. But when I carry out this kind of benchmark, I usually include three more columns in the list: our competitors' position in the Alexa ranking. Not that Alexa influences Google, but it is good to know where they stand in terms of unique visits, page views per user, and overall ranking. You will find these three figures by looking up your competitors on Alexa.com.

I hope these 8 factors have been helpful. This article aims to give guidance to people who want to know the exact position of their web pages compared with their competitors'. It is not intended as an in-depth manual on how Google works.

To view the presentation we use when we give talks about how search engines work, you can download it here: Slides

For more information about search engines: Free Search Engine Optimization Course

By the way, if you have questions or want us to expand on some specific point, we will be happy to assist you.

Discover how A9 works: the final version of the search engine created by Amazon

How A9 works, what kind of algorithms it is based on, why it is called A9, who devised it, and everything we have learned about this new search engine that will have to measure itself against Yahoo, Google and the new MSN that Microsoft has in beta. Let's go see it!

A9's entry into the search engine market opens an interesting period in which the war to monopolize user searches takes on a new dimension.

With the new version Amazon put up yesterday, we will review the topics covered in our April functional analysis to find the differences, see whether they have corrected the weaknesses shown back then, and discover the innovations this search engine presents.

Personalization of the service:

When you enter A9 as an Amazon member, its cookie recognizes you and greets you with a "Hello Montserrat" that leaves me flabbergasted and, from the impact, I cannot help thinking of "Welcome Professor Falken. Would you like a game of chess?" :-)

Apart from the name, it shows me the search box and the history of all my searches on A9, in case I want to consult previous results (and I do not know whether they have programmed it, but it could also serve to learn which of the offered results you chose, so they can rank them better the next time you search for something... we will find out in time).

About the database used by A9:

A9 definitely uses Google's database rather than Alexa's (remember that Alexa was bought by Amazon in 2000 and that Alexa has scanned more websites than Google... but Google keeps its database clean and, every 6 months, deletes the web pages its spiders have been unable to find during that period... Alexa does not).

In the April analysis we noted that it used the Google database but only a portion of it, not the whole thing. Now we can confirm that it uses all of it: searches of the type site:www.solocine.com return (approximately) the same number of results in both engines.

About A9's sorting algorithm

It is Google's, without a doubt.

It shows some variation in the order of the results, but I think that is because both Google and A9 apply filters to the results without your knowledge, rather than because the algorithm itself differs. For example, depending on the language configuration you have, Google offers different results when you search in Castilian... even if you insist that you want no filters applied...

It is a shame they have not chosen to use their own algorithm and compete with Google to see who offers the highest-quality searches. If they use the same database... it would have been very easy for A9 to use the Alexa ranking instead of PageRank to determine the relevance of a page and thus influence the result-sorting algorithm. But it seems clear that it has chosen to ally itself with Google rather than compete against it.

About advertising on A9

It uses Google's AdWords and sponsored links, served directly from Google's own machines (you can see it in the redirection URL of the ads).

What is A9's value proposition? How does it differ from the rest?

From what we are seeing at the moment, A9 is basically a Google with a different look & feel. Let's see how it differs:

  • A9 offers image search results alongside web results, and even alongside searches within the texts of Amazon's books. It is a convenient feature that makes it easier to decide whether a page interests you or not.
  • Most of the site's functions work by "drag & drop". It is the new trend in usability for end-user applications: everything is dragged and dropped where you want it to act or be saved.
  • Favorites service (bookmarks): if you drag the URL of a site appearing in a result into your bookmarks, it is automatically saved there so you can consult it any other day.
  • It offers 4 skins and 3 different font sizes: if you want to see A9 in purple with letters a myopic person can read without glasses, A9 allows it.
  • It offers Alexa's "Site Info" in its results: the results returned for a given search are accompanied by a small "site info" icon that, as on Alexa, activates a layer with information about the page (position in the Alexa ranking, links to the page, download speed, etc.).

I do not think Udi Manber is very satisfied with the new A9. Manber is a specialist in algorithms, former "chief of algorithms" at Amazon, former "chief scientist" at Yahoo and former professor of computer science at the University of Arizona... I do not see him as someone who would settle for bringing to market a Google with a few things touched up on the surface... I do not know for how long the agreement with Google was signed, nor whether money is involved (apart from AdWords, which benefits both: Google's AdWords are also on Amazon).

Time will tell... but I hope A9 ends up being the chrysalis of something better that awaits us in the near future... or maybe it will die trying... we'll see.

As a curiosity: Udi Manber is the man behind the name A9, which refers to the 9 letters of the word "algorithm".

By the way... the A9 URL is www.a9.com, if you want to play around and spot the differences from Google :-)

What is the Hilltop algorithm?

Since March 2004, Google has given less value to links that come from pages with an IP similar to yours, so there is no point cheating to raise your PageRank and thus improve your positioning: Google knows.

The filter that implements this change in PageRank is called the Hilltop algorithm.

Google implemented this change in its algorithm to neutralize a trick that some SEO-savvy webmasters had been using ever since PageRank became operational: creating endless small websites, hosted on their own ISP, that link to their main website.

Large corporations have also abused the fact that a large number of inbound links improves positioning... without going any further, at SoloStocks we have footer links to all the websites of Grupo Intercom... and therefore on every page of Softonic (to cite one of our companies) there is also a link to SoloStocks. Since Softonic has over 500,000 pages indexed in Google, my site receives 500,000 external links. This seems great to me... but it is not 100% fair from the point of view of an independent webmaster who runs a website with great content that never ranks above mine... until Google implemented Hilltop and neutralized the effect of those links.

So, as I said... to improve your website's positioning there is no need to waste time creating links from websites hosted on the same IP as yours... because Google now looks at the IP that inbound links come from, and has greatly lowered the weight of those with an IP similar to yours.
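
As an illustration of the kind of check being described, this sketch groups the sources of your inbound links by class-C subnet (the first three octets of the IP) and discounts those in the same neighbourhood as your own server. The discount factor, the IPs, and the class-C granularity are all assumptions made for the example:

```python
from collections import defaultdict

# Discounts inbound links whose source IP shares your class-C subnet,
# the kind of "similar IP" filter described above. Values are invented.
def weight_backlinks(backlinks, own_ip, discount=0.1):
    """backlinks: iterable of (source_ip, link_value) pairs."""
    own_subnet = own_ip.rsplit(".", 1)[0]
    totals = defaultdict(float)
    for ip, value in backlinks:
        same_neighbourhood = ip.rsplit(".", 1)[0] == own_subnet
        totals[ip] += value * (discount if same_neighbourhood else 1.0)
    return dict(totals)

links_in = [("212.1.5.10", 1.0),   # hosted next to you: heavily discounted
            ("212.1.5.99", 1.0),
            ("80.34.2.7", 1.0)]    # independent host: full value
print(weight_backlinks(links_in, own_ip="212.1.5.20"))
```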

Effects of the Google Dance of September/October 2004

Unlike the Google Dance of March 2004, this September we all expected the new PageRank and a cleanup of the database, with the consequent de-indexing of all the pages its spiders had been unable to find since the last great cleansing, carried out in March... but it did not happen.

Ok, ok ... let's start at the beginning ...

What is a Google Dance?

They are the changes that occur in the Google algorithm from time to time, causing the results in the top positions to change places and "dance".

September's dance fell short

In September 2004, Google limited itself to publishing results as it does every month, without modifying the PR (at least outwardly, since we cannot be sure that the PageRank displayed on the Google Toolbar is really the one Google uses in its result-sorting calculations), and it showed only some variation in the results... But October came, and with it, the new PageRank.

Since when had Google not recalculated PageRank?

The PageRank of web pages had not been recalculated massively since mid-June.

Specifically, according to rumors, it had not been recalculated since the Check Sum algorithm (http://google.dirson.com/noticias.new/0569/) began running online.

Changes in the calculation of PageRank

We commented in a March article that, after the Florida Update, Google had included in the PageRank algorithm a filter to discriminate against websites of large corporations, or of a single owner, that traded links with the sole purpose of raising their PageRank. This filter appears to remain active.

This filter is a complex algorithm in itself, and we explained it in the article:

What is the Hilltop algorithm?

But let's see what Google has been doing these last 3 months:

August 25: big moves in the order of results

At first the moves were attributed to a Google Dance, but after a few weeks the affected websites recovered their previous positions, so everything points to tests of the algorithm.

September 23: new moves

New results begin to be served, built from all the material Google's spiders collected up to August 30 (except the home pages of websites, which Google updates every two or three days). Serious doubts arise about whether the toolbar shows the PageRank Google really uses in its calculations... the belief is that the data shown in the toolbar has not been updated, but the data used in the calculations has.

October 7: new PageRank assignments begin

From October 7, some pages with zero PageRank began to show PageRank in the Google Toolbar. We have been able to confirm this through the appearance of PageRank in the toolbar for pages created during July, August and September that until then showed zero.

Also, on the PageRank Watch site we can see some sites whose PageRank has been assigned or modified since that day.

Some new features in Google searches

Searches in the pages of scanned books

We knew that through its work with Amazon (A9, Amazon's search engine, runs on Google), Google was able to look inside the books Amazon sells. Now, from Google itself, if you want to find results that appear inside a book, you can run the following query:

book + a search term (for example: book shakespeare)

This will show a first result with an icon indicating that the words you searched for appear inside a book. In fact, the search runs against http://print.google.com, the database of books scanned by Google.

The books belong to several online bookstores, not just Amazon.

In all likelihood, over the course of this year we will discover more things about the Google Dance of September/October 2004...

AskJeeves will not adopt the paid-inclusion model

After the recent announcement of Yahoo's new paid-inclusion model, the search engine AskJeeves wants to distance itself from this line of action, announcing that it will not accept paid-inclusion models in its database.

After a year and a half, Ask Jeeves' paid program Index Connect has come to an end. This type of service, which Google has never agreed with, consists of a company paying a fee so that its pages are indexed more frequently in a given search engine.

The problem with this tactic, as AskJeeves has now made public, is that it reduces the quality of the results obtained in a search: for certain search terms, you get too many paid commercial results that are often irrelevant to the person searching.

Right now Google is the search engine that completely opposes this type of payment for better indexing. AskJeeves joins this policy in favor of better search results, while Yahoo has recently launched a payment system of this kind, assuring that it will maintain a wall between these results and the free ones so that users are not affected.