Search Engine Optimization articles.

How is HTTPS indexed?

HTTPS indexing is one of those mysteries that makes SEO life more interesting. While we know that most search engines can index it, hardly anyone knows how to get it done in the shortest possible time.

What is HTTPS?

HTTPS is the secure version of the HTTP protocol. The difference between them is that the former transmits data encrypted, while the latter transmits it unencrypted.

HTTPS uses Secure Sockets Layer (SSL) encryption to send information.

The decoding of the information depends on the remote server and the browser used by the user.

It is mainly used by banks, online stores, and any service that requires sending personal data or passwords.

How does HTTPS work?

Contrary to what many people think, HTTPS does not block access to information; it only encrypts it while it is transmitted. Hence the content of a web page that uses the HTTPS protocol can be read by search engine spiders. What cannot be read is the content sent from the browser to the server, for example, the login and password for access to a private area of the website.

The standard port for this protocol is 443.

How do we know HTTPS is actually indexed?

Google has indexed HTTPS since early 2002, and the other search engines have gradually adapted their technology to index it as well.

The last search engine to do so was MSN, which managed it in June 2006.

If we search for "https://www." or inurl:https in the major search engines, we will find HTTPS pages indexed in them.

How can we get our HTTPS pages indexed?

In principle, our HTTPS pages can be indexed naturally, but since this protocol transmits information much more slowly, spiders sometimes fail to download the pages within the time limit they have set and end up not indexing them. This is the main problem we will run into, and we can address it by reducing the download time of those pages.

How can we accelerate the indexing of HTTPS?

There are two techniques:

  1. Google Sitemap: include our HTTPS pages in our sitemap (we mean the Google Sitemap XML file, not the sitemap meant for human visitors) and register it in Google Sitemaps; a minimal example follows below.
  2. Guerrilla: spread links all over the Internet pointing to our HTTPS pages, so that the spiders indexing the pages where those links live also come into the HTTPS part of our site.
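
As an illustration of the first technique, such a sitemap could look like this. This is a minimal sketch following Google's published protocol: the path and dates are placeholders, and the schema is the 0.84 version Google documented at the time.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
      <url>
        <loc>https://www.nombredelapagina.com/zona-segura/</loc>
        <lastmod>2006-10-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>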

How can we keep our HTTPS pages from being indexed?

It is not as easy as it looks. Listing our HTTPS pages in our regular robots.txt does not work: each port requires its own robots.txt, so we create one robots.txt for our HTTP pages and another for our HTTPS pages. In other words, we also serve a file called

https://www.nombredelapagina.com/robots.txt
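
For example, to keep all spiders out of the entire HTTPS side, the robots.txt served on port 443 could contain this minimal sketch:

    User-agent: *
    Disallow: /

while the robots.txt served on port 80 would allow everything:

    User-agent: *
    Disallow: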

If you need help indexing or de-indexing your HTTPS pages, please contact us. We will be delighted to assist you.

Additional information:

MSN blog post about indexing - an article where they explain that MSN is starting to index HTTPS
http://blogs.msdn.com/livesearch/archive/2006/06/28/649980.aspx

Information about how to keep Google from indexing HTTPS pages:
http://www.google.es/support/webmasters/bin/answer.py?answer=35302

More information about Google Sitemaps:
Google SiteMaps
http://www.geamarketing.com/articulos/Descubre_indexacion_futuro_Google_SiteMap.php

Free online search engine optimization course: Search Engine Positioning Course
http://www.geamarketing.com/posicionamiento_buscadores.php

Google PageRank update in September 2006

On September 28, Google last updated the PageRank shown in its toolbar. Many websites have seen theirs go up, or down...

We have already mentioned before that the PageRank the toolbar shows us is on a logarithmic scale of 10. That is, it is easy to climb from 0 to 1 or from 2 to 3, but it is very hard to climb from 5 to 6, and even harder from 6 to 7. The PageRank Google actually uses in its calculations, on the other hand, is much more precise and uses a much larger scale, although we do not know how large. Google maintains total secrecy in this regard.

In other words, the PageRank in the Google toolbar does not show the PageRank computed in real time (which is the one Google uses in its ranking algorithm) but rather the latest export. During 2006 there were 4 PR updates: in February, in April, in July, and this last one at the end of September. That is, on 4 occasions during 2006 Google has taken its PageRank, distributed it on a base-10 scale, and exported it to the servers that feed the Google toolbars.

This means that, in principle, even if someone sees a PageRank change in their Google toolbar, their results in Google should not be affected, since Google has already been using the updated PageRank for some time. So whether it has gone up or down, October's results, as far as PageRank is concerned, will be the same ones the page had in September.
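
As a sketch of why each step up gets harder: if we assume, purely hypothetically (Google keeps the real scale secret), that each toolbar point represents roughly ten times more internal PageRank, the conversion would look like this:

    def toolbar_pr(score: float) -> int:
        """Map a hypothetical internal PageRank score to the 0-10 toolbar
        scale, assuming each toolbar point costs 10x the previous one."""
        pr = 0
        while score >= 10 and pr < 10:
            score /= 10
            pr += 1
        return pr

    # Reaching toolbar 6 takes 100,000 times the score needed for toolbar 1.
    for score in (5, 50, 5_000, 5_000_000):
        print(score, "->", toolbar_pr(score))  # 0, 1, 3, 6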

We know that pages that came online within the past 3 months have not been assigned a PageRank yet. Our own website is an example: it has been online since August 29 and still shows no PageRank in the toolbar. You will find many cases like ours in the forums that discuss PageRank. Hopefully the next update will assign us one.
The next export is scheduled for early January. We'll see what it brings...

Links of interest:

Information about what PageRank is and what it does:
What is PageRank

Free online search engine optimization course:
Search Engine Optimization Course

Matt Cutts' blog, with some questions and answers about this latest update:
Google PageRank Update

Google Trends - a trend analyzer that can help you improve your AdWords campaigns

For several months now, Google Labs has offered a new tool for exploring the search trends of a term, or for comparing the search trends of two or more terms.

www.google.com/trends

As Google notes, this tool is in beta, so its results are not 100% reliable; even so, the information it provides is useful and you can get good mileage out of it.

How can we use Google Trends for digital marketing?

Obviously, beyond the pleasure of satisfying our curiosity, Google Trends has other uses. The most prominent is to supplement the Google AdWords keyword selection tool.

Comparing two or more terms before buying AdWords

Google AdWords already offers a similar tool:
https://adwords.google.es/select/KeywordToolExternal

If we look for keywords related to "tourism in Barcelona" in the Google AdWords tool and select the "global search volume trends" option, we obtain the data shown in this image:

This information is useful for choosing the keywords we want our ad to show for, but if we run this same search on Google Trends, comparing "tourism in Barcelona" with "barcelona hotels", the result it gives us is more complete and allows us to improve the campaign much more.

The Google AdWords tool shows us the keywords related to the initial word, but Google Trends shows us how searches for those terms have evolved, along with information about where those searches originate. It can even break them down by city. And if we select a specific region, it breaks them down by the cities of that region. This can help us a great deal in building our campaign and segmenting it correctly.

For terms in other languages, the information on regional searches is even more important. For example, if we type the search in English, "Tourism in Barcelona", and select the trend by regions, we will see the searches for this term by country.

Although I do think the results of this search are affected by the fact that it is a beta... from my experience in tourism marketing it does not seem logical that most of the Google searches for this term are made from Ireland... so I believe Google is not counting the searches from all the local Google sites around the world. Even so, the information it offers is useful and helps round out what we obtain from Google AdWords... and if it is useful now, we can hope that in the near future this tool will leave beta and become even more useful.

BMW's website expelled from Google... could it happen to you?

It is the story of the week: BMW's German website has been expelled from Google.de for search engine spamming.

It was Matt Cutts' blog that revealed the expulsion. Matt is a Google employee who writes one of the best SEO blogs on the net. Obviously, Matt reveals nothing that Google does not want revealed, but at least the information he provides is always first-hand and comes straight from the source.

Let's see what happened ...

A few weeks ago Matt commented that Google would get much tougher on search engine spam and that between February and March it would change the way it indexes sites in order to fight it. The algorithm will not change; rather, its spiders will look for spam and report it for removal.

The spam problem is becoming a nightmare for the major search engines, and BMW is not an isolated case. Many webmasters think they can fool Google and the other search engines by using hidden keywords or camouflaging text in their code.

Many times, browsing Google's results, you find pages that do not seem to be ranked where they deserve... and not because they are too good, rather the reverse. You might wonder how such a shabby page with such poor content can sit in first position for a search with more than five hundred thousand results. Look closely at its code and you will find the reason. The BMW case also involved hidden code; we can no longer see it except in the image Cutts shows us, but there are still many pages practicing this kind of spam that Google has not yet detected and expelled.

Consider an example where you can still see the hidden code:

www.todoalarmas.com

If we Google "home alarm" we find 996,000 results, and this page comes first. If you visit it, you will see no apparent reason for it to hold that position. But view its source code and you will discover why: a text of more than 3,000 words hidden inside a "noscript" tag.

Note: you will not see the code by clicking the right mouse button and choosing "view source" (they have already taken care to block that), but you will see it if you go to the top menu bar and click: View >> Source Code.
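
Schematically, the trick looks something like this (an illustrative fragment, not the actual code of that page):

    <noscript>
      <!-- thousands of keyword-stuffed words, invisible to most visitors
           but readable by search engine spiders -->
      alarmas hogar alarma casa home alarm cheap alarms ...
    </noscript>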

We'll see how long they last ...

By watching whether or not they disappear from Google, we can also tell when Google has switched on its antispam indexing system.

...And BMW? BMW has already apologized to Google, and Google has put it back on the list of sites to be indexed, so in the next update its pages will appear again. But it takes time (read: months) to re-index an entire website with all its pages. (Unless they use Google's "site map" service to do it, which I do not know whether BMW will... we'll see.)

The moral of all this: do not try to fool Google. Focus on building good pages with interesting content that other websites will want to recommend (that is what builds PageRank). Make a digital marketing plan and stick to it.

Moral number 2 would be: search engines really do have a decisive influence on the success or failure of web pages... otherwise BMW, and many other websites, would not risk expulsion over something like this.

Additional information:

Article where we explain what search engine spam is, and Google's possible move against it by adding the Trust Rank algorithm to refine PageRank:
Discover the likely substitute for Google PageRank: TRUST RANK

Article where we explain what Google's "site map" service is and how it works: Discover the indexing of the future: Google SiteMap

Text camouflaged by BMW:
http://www.mattcutts.com/blog/ramping-up-on-international-webspam/

Free search engine optimization course that will not get you expelled: Online Search Engine Optimization Course

Discover the likely substitute for Google PageRank: TRUST RANK

For some months now there has been speculation that Google might replace the PageRank algorithm with a new one that would filter out all search engine spam, or at least try to neutralize it. Discover in this article everything that is known so far about the new algorithm, whose name is Trust Rank.

This is the technique used so that when you search for "thieves" on Google, the website of the SGAE appears in first position.

The new Google algorithm would prevent such practices.

The purpose of PageRank is to assign a numerical value to web pages according to how many other pages recommend them, and according to the PageRank those recommending pages have in turn. That is, it establishes the importance of a website. Its logic is: if one web page links to another, it is recommending it; and if it recommends it, it must consider it important within the subject area of the first website. A recommendation coming from a page that is itself highly recommended is worth more than a recommendation coming from a page that hardly anyone recommends.

Google wants the top positions of its results pages to be occupied by pages with real relevance, recommended by other pages that are relevant in turn. To determine PageRank, Google analyzes the number of links coming in from other web pages and the PageRank of those pages. Trust Rank starts from the same basis, but instead of judging the importance of a recommendation by the PageRank of the page making it, it does so from a set of web pages that have been deemed important by humans rather than by algorithms.

Web pages that humans have determined to be important are considered "seed webs", and their links are assigned a value. It is that value that is then transmitted across the network.

To illustrate with an example: suppose we have a seed web "A". "A" transmits a Trust Rank value of 100 to all the websites it links to. These pages, in turn, pass a Trust Rank of 99 to all the websites they link to. And the latter pass a Trust Rank of 98 to the pages they link to.

To mitigate the degradation of Trust Rank with distance from the seed sites, the algorithm includes a correction that takes into account the number of hops between the seed web and the web receiving Trust Rank, without completely cancelling out the effect of that distance.
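
A minimal sketch of the propagation just described, mirroring the toy 100/99/98 decay of the example rather than the damped propagation in the actual Stanford paper:

    from collections import deque

    def trust_rank(links: dict[str, list[str]], seed: str) -> dict[str, int]:
        """Toy Trust Rank: 100 at the seed page, minus 1 per hop away."""
        scores = {seed: 100}
        queue = deque([seed])
        while queue:
            page = queue.popleft()
            for target in links.get(page, []):
                if target not in scores:  # keep the shortest distance found
                    scores[target] = scores[page] - 1
                    queue.append(target)
        return scores

    links = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
    print(trust_rank(links, "A"))  # {'A': 100, 'B': 99, 'C': 99, 'D': 98}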

The idea of Trust Rank looks good, but there are certain issues that must be considered:

Who will be the seed webs?

Could reverse spam be practiced?

Speaking of mischief, and thinking of the not-too-distant future when Trust Rank is up and running, it occurs to me that the same people who play at making the SGAE come up for the search "thieves" might play at sabotaging web pages: linking to them mercilessly from their spam pages, subtracting Trust Rank from them, and keeping them from the top of the search engines.

When will we have the Trust Rank algorithm built into Google?

No idea... nobody agrees on this one. When we least expect it, Google will put out a statement informing us that it is already implemented. What is clear is that it will notify the press and the Internet: it will surely mean a major qualitative improvement in search results, so Google will make sure people know about it. I doubt Google's communications department would throw away an opportunity like this.

Further information for anyone wishing to broaden their knowledge:

Link to document Stanford University that deals with the Trust Rank: http://dbpubs.stanford.edu:8090/pub/2004-17

Search engine optimization course (which we will no doubt have to revise the day Trust Rank is implemented, but which already covers the new indexing system with the Google Sitemap Generator): Online Search Engine Optimization Course. The course is free.

Discover the indexing of the future: Google SiteMap

Google proposes what will be the new way of indexing web pages.
Search engines like Google and Yahoo use spiders to gather information from the web pages published on the Internet. Once they have that information, they process it so that, when a user visits their site and queries a term or phrase, they can quickly sort the search results according to a specific algorithm.

The search engine spiders regularly visit websites that are published on the Internet and automatically update information about their content.

Until now, spiders entered the root directory of a domain, looked for the robots.txt file to confirm that the site wanted to be indexed, and then proceeded to follow all the links found on the website, recording the content of each page.

Google SiteMaps will revolutionize this form of indexing the web pages.

It is not just that Google now reads the site maps people include in their web pages more carefully... it is nothing like that... it is a radically new way of indexing the content of pages. Google proposes that we create an XML sitemap following certain specifications, which will give its spiders complete information and grant them access to URLs that until now may have remained hidden for various reasons beyond the webmasters' control.

Google wants to access the content of web pages in the easiest and most efficient way possible. Indexing as it stands today is already far more efficient than the manual submissions of old (who doesn't remember going to a search engine and typing in by hand our site's description, the keywords we wanted to be found by, and the site URL... but that is Internet prehistory), yet what Google is presenting now is much better still.

It all comes down to making a special sitemap available to the spiders.

To create this sitemap, it is enough to install an application on our server (there are versions for all operating systems) that builds a site map in a specific format. The application Google proposes can generate the map from the URLs of the website, from the website's directories, or from the server logs (ideal for dynamic pages).
Once we have the sitemap built according to Google's specifications, we register it in Google Sitemaps. Automatically, and in less than four hours, Google will have indexed it.

Google lets webmasters set up a cron job that generates a new map every hour (for sites whose content is renewed constantly) and submits the map to Google Sitemaps automatically. This way the spiders learn about newly created pages immediately, and those pages can be incorporated into the index.
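
For example, a crontab entry like the following would rebuild the map at the top of every hour (the paths and script name here are hypothetical; adapt them to wherever you installed the generator):

    # rebuild the sitemap every hour, on the hour
    0 * * * * python /var/www/tools/sitemap_gen.py --config=/var/www/tools/config.xml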

Advantages of this application:

No matter how badly your website's paths are laid out for the spiders... with a site map created by the Sitemap Generator, Google's spiders will always find the URLs of all your pages.

Another great advantage is how quickly the content of the entire site is indexed. In less than 4 hours, the spiders will have visited up to 50,000 links on our website. For websites with more URLs, Google recommends creating several sitemaps and an index of sitemaps.

Disadvantages of this application:

It requires some programming knowledge, so either ISPs will offer this service as added value to their customers, or many websites will go without it and continue to be indexed by the ordinary spiders.

The sitemaps already available on most websites are not compatible with Google's format. Google wants an XML document with specific characteristics.

With this project, Google is undoubtedly looking to improve the indexing of web pages and to capture in its indexes pages that until now were lost in a sea of links within our sites.

Google has created the Sitemap Generator and the express indexing service and offers them completely free... it will be interesting to see Yahoo's reaction, because Yahoo charges $49, $20 or $10 for its fast indexing service, depending on the number of URLs we want indexed on an accelerated basis.

We do not yet have firsthand results on the effectiveness of indexing through Google Sitemaps. Once we have installed the new sitemap on various websites and can compare the increase in the number of indexed pages and in the frequency of spider visits, we will write a new article reporting the results. See you then.

Later note: a few months have passed since we wrote this article. The results have been very good. An entire new website gets indexed in less than 24 hours. It is ideal for when a new site goes online: it can be indexed in one go, without waiting months and months for the Google spiders to read its entire contents.

Additional information:

URL with information about Google sitemap:
https://www.google.com/webmasters/sitemaps/docs/en/about.html

URL with specifications about Google sitemap:
https://www.google.com/webmasters/sitemaps/docs/en/protocol.html

Discover the tools that will make your life as an SEO easier

This article describes some very useful tools for working on the positioning of your website in Google and in other search engines.
In the last article we described the 8 key factors to outrank your adversary on Google; in other words, the factors you should take into account when benchmarking your website (your position compared to other websites) so as to know what you must improve in order to overcome them. Today we will talk about the tools that will help you perform that benchmark.

How to know how many pages you have indexed in the main search engines

We have already indicated that the command to type into the search boxes of the main search engines is:
site:nombredeweb.com
(substituting "nombredeweb" with the name of the website you want to analyze, and without putting a space between "site:" and the URL).

But there is a tool that lets you do it simultaneously in several search engines and that also keeps a history of your positioning so you can watch your evolution over the months. Completely free: Marketleap. It is probably one of the best websites/tools for checking your positioning.

In the link we indicate, you only need to enter the URL of your web page (and up to 3 more websites) and the security code, and in a moment you will know how many pages you have indexed in 6 search engines, including Google, Yahoo and MSN. When it shows the results, in the upper right it will offer to display the history of your queries so you can track your evolution.

If you find that you have fewer indexed pages than you actually have on the web, it means that search engine spiders do not enter all your pages.

How to know the density of keywords in the text of a web page

There are several websites that offer this service. If the page is not too heavy, Ranks NL offers a very good analysis, completely free of charge. It also shows in green the words with an adequate density, and in red those whose density is dangerously high and risks sanctions from the search engines that police such abuse.

How to know how many web pages link to yours

As we indicated in the previous article, the command to run in a search engine is:
link:www.nombredeweb.com

But once again Marketleap has a tool that makes our work easier, running this query in several search engines simultaneously, and also, if we wish, comparing us with websites in our sector (unfortunately it only has North American websites cataloged, so the latter is interesting only as a curiosity).

Also in the upper right part you will find a link to your history (obviously, if it is the first time you consult your website in Marketleap, there will be no history... but there will be the next time).

How to know what position your web page occupies for certain searches

Monitoring your position for the words you consider key for people to find your business is very important. There are many tools to help with this task. You will find most of them in Softonic, in the "Search Engine Position Monitors" section.

My preferred tool is Free Monitor for Google 2.0. It's 100% free, and frankly, it's very useful: you give it a list of words or phrases, and it tells you where your website (or your competitors', depending on what you're looking at) ranks. The results can also be exported and included in any document. The only thing to keep in mind is that it queries google.com and not google.es, so if we want to verify that what Google Monitor tells us is true, we must run the query on google.com and count the number of pages ahead of ours.

On the net we can find programs that compare the position we occupy in various search engines for given words. My favorite is ThumbShots, which shows the results graphically and, alongside your website, displays a hundred more results.

How to know the PageRank of a web page

The easiest way is to have the Google toolbar installed, but many websites also offer this information completely free of charge and let us check the PageRank of any page.

Interesting links:

Link to a PowerPoint presentation that explains how a search engine works inside, which can help you if you find that Google has fewer URLs of your web page indexed than it should.

8 key factors to outrank your adversary on Google

No one can claim to know the algorithm Google uses to sort search results, but it is relatively easy to investigate which factors play a part in it and to what extent they affect it. Moreover, you will find plenty of literature on the subject on the net, and you can keep extending your knowledge if the topic fascinates you.

This article presents 8 key factors that will help you know why other sites are above yours, and how you can overcome them.

1. Decide which words you are going to concentrate your efforts on

You cannot fight for many words at once, so concentrate your efforts on about 10 words or phrases that you think your target audience may type into Google.

Begin the analysis that will lead you to success by making a list of the 5 websites that appear in the top results when searching for those 10 words.

Browse through those 5 pages. Pay special attention to discovering which words they are targeting.

2. Find out where the words you want to fight for are located

Look carefully at where they are placing keywords.

Google gives more importance to words located in certain parts of a web page. The most important part is the URL (the address of the page); next comes the <title> tag; then the headers <h1>, <h2> and <h3>; then the words that form links to other pages; and from there the importance keeps decreasing, although it always stays higher than plain text when the words are in bold or italics, form part of an alt attribute (the alternative text on images), and so on.
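
Schematically, for a page fighting for "wooden tables", those placements look like this (an illustrative skeleton with placeholder names):

    <!-- URL: https://www.example.com/wooden-tables/ (keyword in the address) -->
    <html>
      <head>
        <title>Wooden tables</title>                 <!-- the <title> tag -->
      </head>
      <body>
        <h1>Wooden tables for the office</h1>        <!-- headers h1 to h3 -->
        <p><a href="/wooden-tables/prices.html">wooden tables: prices</a></p>
        <p>Plain text, with <b>wooden tables</b> in bold where natural.</p>
        <img src="table.jpg" alt="wooden tables">    <!-- alt text on images -->
      </body>
    </html>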

3. Find out what keyword density they have

Keep in mind a few things:

Google (and the other search engines) work by density, not by absolute values. So if your URL or your title has 50 characters and 9 of them match what the user searches for, the value of your URL or title is 9/50. Try, therefore, not to include superfluous text, or URLs with endless numbers belonging to a user session or the like.

Also consider that since March 2004 Google works by characters, not by words; hence the preceding paragraph says "characters" rather than "words". Until March 2004, if your title was "wooden office tables" and the user searched for "wooden tables", the value of your title was 3/5 (in Spanish, prepositions were not filtered out and counted as words). That is no longer the case: it now goes letter by letter. Thus, if someone searches for a derivative of a word, a plural, or a conjugated verb, pages containing something similar are also included in the search results.

Once you discover where they have placed the words, look at how dense they are. On your own page, make them denser than on theirs. You can do this by including the word more times, or by including fewer words unrelated to that search. The point is to raise the density and beat theirs in every part where the word appears.
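
As a rough illustration of this character-based density idea (a sketch only; Google's real computation is not public, and it also matches derived forms):

    def char_density(text: str, query: str) -> float:
        """Toy density: matched characters divided by total characters.
        Only checks for a literal substring; real engines are laxer."""
        matched = len(query) if query.lower() in text.lower() else 0
        return matched / len(text) if text else 0.0

    # A tight title scores far higher than one padded with other words:
    print(char_density("Wooden tables", "wooden tables"))                  # 1.0
    print(char_density("Wooden tables and chairs shop", "wooden tables"))  # ~0.45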

Careful not to go overboard... Google penalizes pages with suspiciously high densities. You can reach 100% density in the title and the URL without anything happening. But a page where you put one word repeated 100 times, everywhere, in bold and in links, with no other text, will assuredly be expelled from Google. So: moderation.

Also, remember that your website has to be read by your users/customers... it is essential that the text be aimed at them, not at search engine effectiveness.

4. Find out how many pages their websites have

The more pages you have indexed in Google, the more of them can take part in the fight for certain words. There are also signs that Google ranks better those websites containing a large number of pages that include the search term.

So, on the one hand, include the words you want to rank for in as many pages as possible. On the other, try to have a website of about 200 pages or more.

But once again, find out what your competitors do and add it to the table you started at the beginning of this study.

To find out how many pages are indexed in Google, simply type in the search engine box:

site:www.nombredelaweb.com

(Careful not to include a space between site: and the URL)

To find out how many indexed pages contain a particular word or string of words, simply type in the search engine box:

site:www.nombredelaweb.com "word or phrase"

This will give you the number of pages containing the phrase "word or phrase" on the website www.nombredelaweb.com

5. Check the number of links pointing to your pages

The algorithm that produces PageRank (cultural note: PageRank refers to "Larry Page's rank", not to "page ranking") is made up of many other algorithms and is quite complicated to understand. But it has some basic features that you can easily apply to your website.

PageRank is influenced by, above all, the number of links pointing to a website, the density of those links on the source page, and the PageRank of the source page.

So this point 5 will focus on the first of the factors affecting PageRank: the number of links.

Again, note down the number of pages linking to each of the 5 competing websites you are analyzing on your list.

To find the number of links to a page, simply type in the search engine box:

link:www.nombredelaweb.com

Since March 2004, Google has given less value to links coming from pages whose IP is similar to yours, so there is no point in cheating: Google knows.

A few months ago we wrote an article about the Hilltop algorithm, which Google uses to calculate and filter sites' PageRank: HillTop

6. Analyze what kind of websites link to your competitors

In all likelihood you will not be able to include in your list the PageRank of every page linking to your competitors', but it is important to see what kind of websites they are, what PageRank they have, how many other websites they link to, and what words they use to link to your competitors.

The higher the PageRank of a page linking to you, the more points that link earns you. So look for pages with a high PR and try to get links from them.

To conclude this point, do not forget that in Google and the other search engines everything works by density, so if a page sends out 100 links to other websites, the value of the link that reaches you is 1/100. So forget about link farms: get links to your site from pages with few links and a high PageRank.
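
As a toy sketch of that dilution, using the damping factor from the published PageRank paper and treating PR values as raw scores purely for illustration:

    DAMPING = 0.85  # damping factor from the original PageRank paper

    def link_value(source_pr: float, source_outlinks: int) -> float:
        """Toy estimate of what one link passes on: the source page's
        PageRank diluted across all of its outbound links."""
        return DAMPING * source_pr / source_outlinks

    # A link from a modest page with 10 links beats one from a
    # "stronger" page that sprays out 500 links:
    print(link_value(4.0, 10))   # 0.34
    print(link_value(7.0, 500))  # 0.0119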

7. Find out what words the links to your competitors' websites use

If the searched word forms part of a third-party link to your website, you get a bonus in points (to put it one way). So if your business is wooden office tables, make sure the pages that link to yours use the phrase "wooden tables" to do so, instead of www.minombredeempresa.com

Obviously, you cannot always control which words third-party websites use to link to yours... but whenever you can, remember this point 7 and the bonus it earns you if you pull it off!

8. Write down what PageRank your competitors' pages have

Do not forget to include a column in your study indicating the PageRank of your competitors' websites. This will help you understand why they occupy the top positions.

Remember that to increase your PageRank you must, above all, increase the number of pages linking to yours. So if your PageRank is below 4, get to work on obtaining links. If it is above 4, climbing further is quite difficult unless you run a specific campaign for the purpose, well designed and with a good strategy.

Up to this point we have described the 8 key factors that will help you gain positions in Google. But when I perform this kind of benchmark, I usually include three more columns in the list: our competitors' standing in the Alexa ranking. Not that Alexa influences Google, but it is good to know where they stand in unique visits, in page views per user, and in the overall ranking. You will find these three figures by looking your competitors up on Alexa.com.

I hope these 8 factors have been helpful. This article aims to give guidance to people who want to know the exact position of their web pages compared with their competitors'. It does not pretend to be an in-depth manual on how Google works.

To view the presentation we use in our lectures on how search engines work, you can download it here: Lecture slides

For more information about search engines: Free Search Engine Optimization Course

By the way, if you have questions or want to go deeper into any particular point, we will be happy to assist you.

Discover how A9 works: the final version of the search engine created by Amazon

How A9 works, what kind of algorithms it is based on, why it is called A9, who devised it, and everything we have learned about this new search engine, which will have to measure itself against Yahoo, Google and the new MSN that Microsoft has in beta. Let's take a look!

A9's entry into the search engine market opens an interesting period: the war to monopolize users' searches takes on a new dimension.

With the new version Amazon put up yesterday, we will go back over the topics covered in our earlier functional analysis to spot the differences, see whether they have corrected the weaknesses shown in April, and discover the innovations this search engine presents.

Search personalization service:

When you enter A9 and you are an Amazon member, the very same cookie recognizes you and greets you with a "Hello Montserrat" that leaves me flabbergasted and, from the shock, I cannot help thinking of "Welcome professor Falken, would you like a game of chess?" :-)

Apart from my name, it shows me the search box and the history of all my searches on A9, in case I want to go back to some earlier results (and, though I do not know whether they have programmed it, it could also serve to learn which of the offered results you chose, so as to offer you better ones the next time you search for something... we will find out in time).

About the database used by A9:

A9 definitely uses Google's database rather than Alexa's (remember that Alexa was bought by Amazon back in 1999 and that Alexa has crawled more websites than Google... but Google keeps its database clean, deleting every 6 months the web pages its spiders have been unable to reach during that period... Alexa does not).

In our April analysis we commented that it used the Google database, though not in full, only a portion of it. Now we can confirm that it uses the whole thing: searches of the type site:www.solocine.com get (approximately) the same number of results in both search engines.

About A9's sorting algorithm

It is Google's, without a doubt.

It shows some variation in the order of the results, but I think that is due to filters both Google and A9 apply to the results without your knowing it, rather than to the algorithm itself. For example, depending on the language configuration you have in Google, a search for results in Spanish returns different results... even if you insist that you want no filters applied...

It is a shame they have not chosen to use an algorithm of their own and compete with Google on search quality. Even using the same database... A9 could very easily have used the Alexa ranking instead of PageRank to determine a page's relevance and thereby influence the result-sorting algorithm. But it seems clear it has chosen to ally itself with Google rather than compete against it.

About advertising on A9

The system uses Google AdWords and Google's sponsored links, served directly from Google's own machines (you can see it in the redirection URLs of the ads).

What is A9's value proposition? How does it differ from the rest?

From what we are seeing so far, A9 is basically a Google with a different look & feel. Let's see how it differs:

  • A9 offers image results alongside web search results, and even alongside searches within the texts of Amazon's books. It is a convenient feature that makes it easier to tell whether a page interests you or not.
  • Most of the site's functions work by "drag & drop". It is the new trend in end-user application usability: everything is dragged and dropped where you want it to act or to be saved.
  • Favorites service (Bookmarks): if you drag the URL of a website appearing in a result over to the bookmarks, it is automatically saved there so you can consult it any other day.
  • It offers 4 skins and 3 different font sizes: if you want to see A9 in purple, with letters a short-sighted person can read without glasses, A9 allows it.
  • It offers Alexa's "Site Info" in its results: the results returned for a search carry a small "site info" icon. It works as it does in Alexa, opening a layer with information about the page (its position in the Alexa ranking, links to the page, download speed, etc.).

I do not think Udi Manber is very satisfied with the new A9. Manber is an algorithms specialist, former "chief algorithms officer" at Amazon, former "chief scientist" at Yahoo and a former computer science professor at the University of Arizona... I do not see him as someone content with putting on the market a Google with a few things touched up on the surface... I do not know for how long the agreement with Google was signed, nor whether money is involved (apart from the AdWords, which benefit both; Google's AdWords are also on Amazon).

Time will tell... but I hope A9 ends up being the chrysalis of something better that awaits us in the near future... or maybe it will die trying... we'll see.

As a curiosity: Udi Manber is the man behind the name A9, which refers to the 9 letters of the English word "algorithm".

By the way... the A9 URL is www.a9.com, in case you want to play around and spot the differences from Google :-)

What is the Hilltop algorithm?

Since March 2004, Google has given less value to links coming from pages whose IP is similar to yours, so there is no point in cheating to raise your PageRank and thereby improve your positioning: Google knows.

The filter behind this change to PageRank is called the Hilltop algorithm.

Google has implemented this change in its algorithm to neutralize a trick that some SEO-savvy webmasters have been using ever since PageRank became operational: creating endless small websites, hosted on their own ISP, that link to their main website.

Large corporations have also abused the fact that a large number of inbound links improves positioning... without looking any further, at SoloStocks we have footer links to all the Intercom Group websites... so on every Softonic page (to name one of our companies) there is also a link to SoloStocks. Since Softonic has over 500,000 pages indexed in Google, my site receives 500,000 external links. That seems great to me... but it is not 100% fair from the point of view of an independent webmaster who runs a website with great content yet never ranks above mine... until Google implemented Hilltop and neutralized the effect of those links.

So as I said... to improve your website's positioning there is no need to waste time getting links from websites hosted on the same IP as yours... because Google now looks at the IP that inbound links come from and has greatly lowered the weight of those coming from an IP similar to yours.