Google has proposed what may become the new way of indexing web pages.
Search engines like Google and Yahoo use spiders to gather information from the web pages published on the Internet. Once they have that information, they process it so that, when a user visits their site and searches for a term or phrase, the results can be sorted quickly according to a specific algorithm.
The spiders regularly revisit published websites and automatically update the information held about their content.
Until now, a spider would go to a domain's root directory, look for the robots.txt file to confirm that the site wanted to be indexed, and then visit every link it found on the website, recording the content of each page.
Google Sitemaps is set to revolutionize this way of indexing web pages.
It is not just that Google will now read more carefully the site maps people include on their web pages... it is nothing like that... it is a radically new way of indexing the content of pages. Google proposes that we create an XML sitemap following certain specifications, which will give its spiders all the information they need and will grant them access to URLs that until now may have remained hidden for various reasons beyond the webmasters' control.
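To give an idea of what such a file looks like, here is a minimal sketch that builds a sitemap in the sitemaps.org XML format (the `<urlset>`/`<url>`/`<loc>` structure of the sitemap protocol); the example URLs and dates are hypothetical:

```python
# Sketch: build a minimal XML sitemap in the sitemaps.org format.
# The namespace and tag names follow the sitemap protocol; the
# example URLs and lastmod dates are hypothetical.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return an XML sitemap (as a string) listing (loc, lastmod) pairs."""
    ET.register_namespace("", NS)  # serialize without a namespace prefix
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("http://www.example.com/", "2005-06-01"),
    ("http://www.example.com/about.html", "2005-05-20"),
])
print(sitemap)
```

Each `<url>` entry can also carry optional hints such as change frequency and relative priority, so the spiders know where to spend their attention.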
Google wants to access the content of web pages in the easiest and most efficient way possible. Indexing as it works today is already far more efficient than the manual submissions of old (who doesn't remember going to a search engine and typing in, by hand, a description of your site, the keywords you wanted to be found under, and the site's URL? But that is internet prehistory), and what Google now presents is better still.
The whole idea is to put a special sitemap at the spiders' disposal.
To create this sitemap, it is enough to install an application on our server (versions exist for all operating systems) that builds a site map in a specific format. The application Google proposes can generate the map from a list of the site's URLs, from the website's directories, or from the server logs (ideal for dynamic pages).
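The directory-based approach is easy to picture: walk the web server's document root and turn each file path into a URL. A rough sketch of that idea, with a hypothetical document root and base URL (a real generator would also handle URL lists and log files):

```python
# Sketch: derive sitemap URLs from files under a web server's document
# root, the way a directory-based generator might. The base URL and
# file layout are hypothetical.
import os
import tempfile

def urls_from_docroot(docroot, base_url):
    """Map every .html file under docroot to a URL on the site."""
    urls = []
    for dirpath, _dirs, files in os.walk(docroot):
        for name in sorted(files):
            if name.endswith(".html"):
                rel = os.path.relpath(os.path.join(dirpath, name), docroot)
                urls.append(base_url + "/" + rel.replace(os.sep, "/"))
    return urls

# Demonstrate with a throwaway document root.
root = tempfile.mkdtemp()
open(os.path.join(root, "index.html"), "w").close()
os.mkdir(os.path.join(root, "blog"))
open(os.path.join(root, "blog", "post1.html"), "w").close()

urls = urls_from_docroot(root, "http://www.example.com")
print(urls)
```

Log-based generation works the other way around: instead of guessing URLs from files, it collects the URLs that visitors actually requested, which is why it suits dynamically generated pages.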
Once we have the sitemap built according to Google's specifications, we can register it with Google Sitemaps. Automatically, and in less than four hours, Google will have indexed it.
Google allows webmasters to set up a cron job that regenerates the map as often as every hour (for sites whose content changes constantly) and automatically resubmits it to Google Sitemaps. That way, the spiders learn immediately about newly created pages, which can then be incorporated into the index.
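The resubmission itself is just a notification: the cron job fetches a "ping" URL that tells Google where the fresh sitemap lives. A sketch of building that URL, assuming the ping endpoint Google has documented for sitemap submission (verify the exact address against Google's current documentation before relying on it):

```python
# Sketch: build the notification URL a cron job would fetch after
# regenerating the sitemap. The ping endpoint is an assumption based
# on Google's documented sitemap-submission URL; the sitemap location
# is hypothetical.
import urllib.parse

PING_ENDPOINT = "http://www.google.com/webmasters/sitemaps/ping"

def ping_url(sitemap_url):
    """Return the ping URL for a given sitemap location."""
    query = urllib.parse.urlencode({"sitemap": sitemap_url})
    return PING_ENDPOINT + "?" + query

url = ping_url("http://www.example.com/sitemap.xml")
print(url)
# The cron job would then simply fetch this URL
# (e.g. with urllib.request.urlopen or wget).
```

Since the notification is a plain HTTP GET, the same line works from a shell script scheduled in crontab.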
Advantages of this application:
No matter how badly structured your website's link paths are for the spiders... with a site map created by the Sitemap Generator, Google's spiders will always find the URLs of all your pages.
Another great advantage is how quickly the entire site's content is indexed. In less than four hours, the spiders can visit up to 50,000 links on our website. For websites with more URLs, Google recommends splitting them across several sitemaps and providing a sitemap index.
Disadvantages of this application:
It requires some programming knowledge, so either ISPs will offer this service as added value for their customers, or many websites will go without it and will have to keep being indexed by the ordinary spiders.
The sitemaps already available on most web pages are not compatible with Google's format: Google wants an XML document that follows certain specifications.
With this project, Google is undoubtedly seeking to improve the indexing of web pages and to bring into its indexes pages that until now were lost in a sea of links within our sites.
Google has created the Sitemap Generator and this express indexing service and offers them completely free... it will be interesting to see Yahoo's reaction, since Yahoo charges $49, $20, or $10 for its fast-indexing service, depending on the number of URLs to be indexed on an accelerated basis.
We do not yet have firsthand results on the effectiveness of indexing through Google Sitemaps. Once we have installed the new sitemap on several websites and are ready to compare the increase in the number of indexed pages and the frequency of spider visits, we will write a new article reporting the results. See you then.
Later note: A few months have passed since we wrote this article. The results have been very good. An entire new website gets indexed in less than 24 hours. It is ideal for when a new site first goes online: it can be indexed all at once, without having to wait months and months for Google's spiders to read its entire contents.
URL with information about Google sitemap:
URL with specifications about Google sitemap:
Specialist in Digital Marketing Strategy and Digitization Processes. I help businesses and territories develop through the use of the Internet in their business strategies.
I have been giving conferences, training, and consulting for over 20 years. Program Director of the Degree in Digital Entrepreneurship (Digital Business) at La Salle - URL.
My digital DNA comes from Intercom (creators of Infojobs, Softonic, eMagister, and Solostocks, among other companies). I created my first website in 1994 and started my first online business in 1998.
I am at your disposal for whatever you need. Let's talk.