How is HTTPS indexed?

HTTPS indexing is one of those mysteries that makes SEO more interesting. While we know it is possible to get HTTPS pages indexed in most search engines, hardly anyone knows how to achieve it in the shortest possible time.

What is HTTPS?

HTTPS is the secure version of the HTTP protocol. The difference between the two is that the former transmits data encrypted, while the latter transmits it unencrypted.

HTTPS uses Secure Sockets Layer (SSL) encryption to send information.

Decoding the information depends on the remote server and on the browser the user employs.

It is mainly used by banks, online stores, and any service that requires sending personal data or passwords.

How does HTTPS work?

Contrary to what many people think, HTTPS does not prevent access to information; it only encrypts it while it is being transmitted. Hence the content of a web page that uses the HTTPS protocol can be read by search engine spiders. What cannot be read is the content sent from the browser to the server, for example the login and password used to access a private area of the website.

The standard port for this protocol is 443.

How do we know HTTPS is actually indexed?

Google has indexed HTTPS since early 2002, and other search engines have gradually adapted their technology to index it as well.

The last search engine to do so was MSN, which achieved it in June 2006.

If we search for "https://www." or inurl:https in the major search engines, we find HTTPS pages indexed in them.

How can we get our HTTPS pages indexed?

In principle, our HTTPS pages can be indexed naturally, but since this protocol transmits information more slowly, spiders sometimes fail to download the pages within the time limit they have established and do not index them. This is the main problem we may run into, and we solve it by trying to reduce the download time of these pages.

How can we accelerate the indexing of HTTPS?

There are two techniques:

  1. Google Sitemap: include our HTTPS pages in our sitemap (meaning the Google Sitemap, not the sitemap intended for human visitors) and register it in Google Sitemaps.
  2. Guerrilla: spread links across the Internet that point to our HTTPS pages, so that the spiders indexing the pages where those links appear also follow them into the HTTPS part of our site.
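For technique 1, a minimal Google Sitemap file listing HTTPS URLs could look like this (the domain and dates are placeholders; the namespace reflects the Google Sitemaps format of the time):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <!-- HTTPS pages we want the spiders to find and index -->
  <url>
    <loc>https://www.example.com/login.php</loc>
    <lastmod>2006-09-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/private/index.php</loc>
    <lastmod>2006-09-15</lastmod>
  </url>
</urlset>
```

Once the file is in place, it is registered through the Google Sitemaps account for the site.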

What do we need for our HTTPS pages to be indexable?

It is not as easy as it looks. Simply listing our HTTPS pages in our existing robots.txt is not enough: each port requires its own robots.txt, so we create one robots.txt for our HTTP pages and another for our HTTPS pages. In other words, we also need a file at

https://www.nombredelapagina.com/robots.txt
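A minimal robots.txt for the HTTPS side, allowing all spiders in, could be as simple as this (the comment paths are examples):

```
# robots.txt served at https://www.example.com/robots.txt
# (a separate file from the one served over plain http)
User-agent: *
Disallow:
```

An empty Disallow line means nothing is blocked, so the spiders are free to crawl the whole HTTPS section.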

If you need help indexing or de-indexing your HTTPS pages, please contact us. We will be delighted to assist you.

Additional information:

MSN blog post about indexing - article where they explain that MSN is starting to index HTTPS
http://blogs.msdn.com/livesearch/archive/2006/06/28/649980.aspx

Information from Google about why it may not index HTTPS:
http://www.google.es/support/webmasters/bin/answer.py?answer=35302

More information about Google Sitemaps:
SiteMaps de Google
http://www.geamarketing.com/articulos/Descubre_indexacion_futuro_Google_SiteMap.php

Free online search engine optimization course: Search engine positioning course
http://www.geamarketing.com/posicionamiento_buscadores.php

Google PageRank update in September 2006

On September 28 Google last updated the PageRank shown in its toolbar. Many websites have seen theirs rise, or fall...

As we have mentioned before, the PageRank shown in the toolbar is part of a logarithmic scale of 10. That is, it is easy to go from 0 to 1 or from 2 to 3, but it is very hard to go from 5 to 6, and even harder to go from 6 to 7. The PageRank Google uses for its calculations, on the other hand, is much more precise and uses a much larger scale, although we do not know how large: Google is completely hermetic on this point.
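As an illustration of what a base-10 logarithmic toolbar scale means, here is a small sketch; the internal values and the base are invented for the example, since Google has never published either:

```python
import math

def toolbar_pagerank(internal_pr, base=10):
    """Map a hypothetical internal PageRank value to a 0-10 toolbar band
    on a logarithmic scale. Both the base and the internal values are
    assumptions for illustration; Google does not disclose them."""
    if internal_pr < 1:
        return 0
    return min(10, int(math.log(internal_pr, base)))

# Each toolbar step requires roughly 10x the internal score,
# which is why going from 5 to 6 is so much harder than from 0 to 1:
print(toolbar_pagerank(5))        # 0
print(toolbar_pagerank(500))      # 2
print(toolbar_pagerank(500_000))  # 5
```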

In other words, the PageRank in the Google toolbar does not show the PageRank calculated in real time (which is what Google uses in its ranking algorithm), but rather the last export. During 2006 there were four PR updates: in February, April, July, and this last one at the end of September. That is, on four occasions during 2006 Google has taken its PageRank, distributed it on a base-10 scale, and exported it to the servers that feed the Google toolbars.

This means that, in principle, even if someone sees a PageRank change in their Google toolbar, their results in Google should not be affected, since Google has already been using the updated PageRank for some time. So whether it has gone up or down, October's results, as far as PageRank is concerned, will be the same ones the page had in September.

We know that pages that have gone online in the past 3 months have no PageRank yet. Our own website is an example: it has been online since 29 August and still has no PageRank assigned in the toolbar. Cases like ours can be found in many forums that discuss PageRank. Hopefully it will be assigned in the next update.
The next export is scheduled for early January. We'll see what it brings...

Links of interest:

Information about what PageRank is and how it works:
What is page rank

Free online search engine optimization course:
Search Engine Optimization Course

Matt Cutts' blog, with some questions and answers about this latest update:
Google PageRank Update

Google Trends - a trend analyzer that can help you improve your AdWords campaigns

For several months now, Google has been offering through Google Labs a new tool to see the search trend for a term, or to compare the search trends of two or more terms.

www.google.com/trends

As Google notes, this tool is in beta, so its results are not 100% reliable. Even so, the information it provides is useful and we can make good use of it.

How can we use Google Trends in Digital Marketing?

It is obvious that, beyond satisfying our curiosity, Google Trends has other uses. The most prominent is to complement the keyword selection tool in Google AdWords.

Comparing two or more terms before hiring AdWords

Google AdWords already offers a similar tool:
https://adwords.google.es/select/KeywordToolExternal

If we look for keywords related to "tourism in Barcelona" in the Google AdWords tool and select the "global search volume trends" option, we obtain the data shown in this image:

This information is useful for choosing the keywords we want our ad to show for, but if we perform this same search in Google Trends, comparing "tourism in Barcelona" with "barcelona hotels", the result is more complete and allows us to improve the campaign much further.

The Google AdWords tool shows us the keywords related to the initial word, but Google Trends shows us the evolution of searches for those terms, together with information about where those searches originate. It can even break them down by city, and if we select a specific region, it shows them by cities within that region. This can help us a great deal in creating our campaign and segmenting it correctly.

For terms in other languages, the information on regional searches is even more important. For example, if we type the search in English, "Tourism in Barcelona", and select the trend by region, we will see the searches for this term broken down by country.

The results of this search do, I think, suffer from the fact that it is a beta... from my experience in tourism marketing, it does not seem logical that most of the Google searches for this term are made from Ireland... so I believe Google is not taking into account all the searches of all the local Googles around the world. Still, the information it offers is useful and complements what we obtain from Google AdWords... and if it is useful now, we can expect this tool to leave beta in the near future and become even more useful.

The BMW website expelled from Google... could it happen to you?

It is the story of the week: the German BMW website has been expelled from Google.de for search engine spamming.

It was Matt Cutts' blog that revealed this expulsion. Matt is a Google employee who writes one of the best SEO blogs on the net. Obviously, Matt does not reveal anything Google does not want him to, but at least the information he provides is always first hand and comes directly from the source.

Let's see what happened ...

A few weeks ago Matt commented that Google would get much tougher on search engine spam and that between February and March it would change the way it indexes sites in order to combat it. It will not change the algorithm, but its spiders will look for spam and report it for removal.

The spam problem is becoming a nightmare for the major search engines, and the BMW case is not an isolated one. Many webmasters think they can fool Google and other search engines by using hidden keywords or camouflaging text in their code.

Many times, browsing Google's result pages, you find pages that are not positioned on merit... You might wonder how such a "seedy" page with such poor content can hold first position for a search with more than five hundred thousand results. If you look carefully at its code, you find the reason. The BMW case was also hidden code; we can no longer see it except in the image Cutts shows us, but there are still many pages practicing this kind of spam that Google has not yet detected and expelled.

Let's look at an example where the hidden code can still be seen:

www.todoalarmas.com

If we search Google for "home alarm" we find 996,000 results, and this page comes first. If you visit it, you will see no apparent reason for it to hold that position. But if you view its source code, you will discover why it ranks first: a text of more than 3,000 words hidden inside a "noscript" tag.

Note: you will not see the code by right-clicking and choosing to view the source (they have already blocked that), but you will see it if you go to the top menu bar and click View >> Source Code.
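The technique described above looks roughly like this hypothetical fragment (do not imitate it: it is exactly the kind of hidden text that gets sites expelled from the index):

```html
<body>
  <!-- Normal, visible page content -->
  <h1>Home alarms</h1>
  <p>Our catalogue of alarm systems...</p>

  <!-- Keyword-stuffed text invisible to visitors with JavaScript
       enabled, but readable by search engine spiders. This is the
       spam pattern the article describes. -->
  <noscript>
    home alarm home alarms cheap alarm security alarm burglar alarm
    alarm installation alarm systems ... (and so on, for 3,000 words)
  </noscript>
</body>
```

Since almost every visitor has JavaScript enabled, the noscript block is effectively invisible text aimed only at the spiders.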

We'll see how long they last ...

By watching whether or not this page disappears from Google, we can also tell when Google has activated its antispam indexing system.

... And BMW? BMW has already apologized to Google, and Google has put it back on the list of sites to be indexed, so in the next update its pages will be indexed again. But it takes time (read: months) to re-index an entire website with all its pages (unless they use Google's "site map" to do so, which I do not know if BMW will... we'll see).

The moral of all this is: do not try to fool Google. Focus on building good pages with interesting content so that other websites recommend you (this is what builds up PageRank). Make a Digital Marketing plan and stick to it.

Moral number 2 would be: search engines really do decisively influence the success or failure of web pages... otherwise BMW, and many other websites, would not risk expulsion over such a matter.

Additional information:

Article where we explain what search engine spam is, and Google's possible response to it: incorporating the TrustRank algorithm to refine PageRank:
Find out what could replace Google's PageRank: the TRUST RANK

Article where we explain what Google's "site map" service is and how it works: Discover the indexing of the future: Google SiteMap

Text camouflaged by BMW:
http://www.mattcutts.com/blog/ramping-up-on-international-webspam/

Free search engine optimization course that will not get you expelled: Online Course Search Engine Optimization

Find out what the conversion rate of visitors arriving through a search engine is

A study by WebSideStory reveals visit-to-customer conversion rates for visits that come from search engines.

Most marketers agree that the visit-to-customer conversion ratio is one of the metrics to track monthly and to try to improve day after day, through site optimization and campaign optimization actions. But once we have that ratio, we lack a way to compare it with other sites, to see whether we are above or below the average.
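As a reminder, the metric under discussion is simply customers divided by visits, expressed as a percentage. A minimal sketch, with invented numbers:

```python
def conversion_ratio(customers, visits):
    """Visit-to-customer conversion ratio, as a percentage."""
    if visits == 0:
        return 0.0
    return 100.0 * customers / visits

# Invented example: 617 buyers out of 10,000 visits arriving from a
# search engine gives the kind of figure the study reports for AOL.
print(round(conversion_ratio(617, 10_000), 2))  # 6.17
```

Computed monthly per traffic source, this is the number the study lets us benchmark against the sector average.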

Last week, WebSideStory published a study on average visit-to-customer conversion for the four major search engines.

The study was conducted from statistics collected on B2C (business-to-consumer) e-commerce websites that use WebSideStory's HBX as their analytics software.

The study analyzes data from millions of users who visit these pages, and includes data both on traffic from organic positioning and on paid keyword traffic. In total, the sites analyzed add up to more than $3 billion in annual sales.

The study shows that during the month of January it was the AOL search engine that achieved the highest customer conversions on e-commerce sites (6.17%), followed by MSN (6.03%), Yahoo (4.07%) and Google (3.83%).

One possible explanation for the fact that users of generalist portals convert at a higher rate than those of a portal 100% focused on search could be that portals which include other content and services besides the search engine attract a type of user who is more predisposed to buy. In contrast, portals like Google, 100% focused on search, attract more people who are looking for information and have less purchase intent.

The study shows that the four big search engines deliver much higher conversion ratios than the rest: the average conversion during January 2006 across all search engines was 1.97%. This contrasts with the 2.30% average of the last three months of 2005 (although it is normal for the figure to have dropped, since the latter includes the 2005 Christmas campaign).

Another consideration about this study is that the conversion ratios of the e-commerce sites analyzed are surely above the sector average. This is because using HBX as their web analytics system allows them to optimize their sites to maximize visit-to-customer conversion:

  • On the one hand, sites using HBX have already eliminated the search campaigns that do not perform financially.
  • On the other hand, this kind of software lets them know step by step what users do on their websites. They can, for example, find out at which parts of the forms users abandon their purchase, which lets them modify the forms so that the maximum number of users reaches the end of the purchase.

Still, the study is interesting and allows us to place the figures obtained on our own website, or on our clients' websites, in a much larger context, and to know whether we should work to raise our ratio or congratulate ourselves for being above average.

A final consideration about the study: it was conducted mainly on United States e-commerce websites, which is why AOL's ratio is so high. It would be interesting to have this same study with figures referring to e-commerce in Spain... the question is: would we see Terra appear among the search engines with the highest conversion?

The data:

Average conversion ratios for the four major search engines on e-commerce sites, January 2006:

AOL: 6.17%
MSN: 6.03%
Yahoo: 4.07%
Google: 3.83%
Average for all search engines: 1.97%

More information about the study at Internet News and at WebSideStory.