
What is a favicon and what is it for?

Every time I give a conference or a course, I look through the websites of the attendees. To do this, the event organizers send me a list of their URLs, which I open in the same browser in different tabs. Unfortunately, once the pages are open and I can see all the favicons, I notice that a large share of them are the default favicons of the CMS used to build the site (usually Joomla or WordPress), not a corporate favicon, which is what they should be.

So here is this short article explaining what a favicon is and what it is for, to see if this way I can get people to give it the importance it deserves.

What is a favicon

It is a small icon (usually 16 × 16 or 32 × 32 pixels) used to identify a website in browser tabs, in a favorites list, or anywhere else a small identifier is needed.

On this blog you are reading, the favicon is an "MP" on a fuchsia background, which you can see next to the URL or on the page tab in your browser.

What is a favicon for

It is a symbol that conveys corporate image, and it is used to:

  • Transmit our brand image when someone browses our website and their gaze lands on the browser's URL area.
  • Make our page easily identifiable when someone has multiple tabs open in their browser.
  • Make our page easily identifiable when someone saves it in their favorites.

This is a favicon.

How can we create a favicon for our website

There are many programs that can help you create one. This is the one I use: a favicon creator.

My advice is to design your favicon at 32 × 32 pixels and make it resemble your logo. You can create favicons from images... maybe that can help, but drawing one pixel by pixel is also easy. Another tip: before designing your favicon, look at other companies' favicons in Google Images. This can inspire yours and help ensure you do not inadvertently use a favicon that someone else has already created.

Once you have created it, save it with the name favicon.ico and upload it via FTP to the root directory of your website. Some content management systems, such as WordPress, have a section in the admin panel where you indicate the exact URL of the favicon... but it works even if you indicate nothing there, as long as the favicon is in your root directory. Some WordPress templates also include a field where you indicate the URL of your favicon image, and that's it.

To make your favicon appear as soon as possible, you can declare its existence in the source code of your web page, using the following line of code:

<link type="image/x-icon" href="favicon.ico" rel="shortcut icon" />

Even if you do not use this line of code, the browser will find the favicon on its own at some point. In Firefox and Chrome it appears almost immediately; in Internet Explorer it takes a little longer.
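Browsers also accept PNG favicons declared with rel="icon"; here is a sketch of both declarations side by side (the file names are placeholders for your own files):

```html
<!-- ICO in the site root: the classic, most compatible declaration -->
<link rel="shortcut icon" type="image/x-icon" href="/favicon.ico" />
<!-- PNG alternative; the sizes attribute tells the browser which version this is -->
<link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png" />
```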

That's all.
I hope everything goes well for you... (and when you have a moment, please make sure your website has its own favicon ;-)

Talk soon.

We are giving away a website to the hotel that needs it most

Yes, that's right. We are.

The idea was not ours; it came from our customers, who in these times of crisis try to avoid paying the cost of programming a website and attempt to convince us to conceptualize and program their page on account of future profits, or simply as a favor.

But our websites are not cheap. In creating each one, three teams participate: a marketing team (which conceptualizes, guides the customer, and produces the texts), a programming team (which makes the site functional and technically robust), and a design team (which gives it that special touch that makes it unique and pleasing to the user's eye).

So we cannot compete with "el_primo_de_tu_vecino_que_hace_webs" ("your neighbor's cousin who makes websites"), but of course, the result our pages deliver has nothing to do with the "cousin's" either.

This means our customers want our pages, but at the same time, with times being what they are, it is sometimes a bit complicated for them to allocate the necessary resources. Hence they try to convince us to lower our prices, or even to program for free.

The funniest, and most instructive, part has been the arguments some customers put forward to convince us that they deserved the website without having to pay for it. So at one of those "Friday afternoon" meetings, where the whole team gets together and reviews client by client how each project is going, we thought it would be a good idea to create a contest of arguments for convincing us not to charge for a website, and to give the page away to whoever was both in need and witty in their arguments.

We are sure to learn a lot from this contest... so the cost of programming and laying out the page will actually go toward staff training.

If you think you can convince us, you could get the website for your hotel for free. At least one of the participants will get it (booking engine included), and it could be you.

Here are the contest rules and registration form: "We are giving away a website to the hotel that needs it most".

Talk soon.

Warm regards.

AJAX, a technique to use sparingly

This article explains what AJAX is, when to use it, and what its contraindications are. We also show how to overcome some of those contraindications.

What is AJAX?

AJAX stands for Asynchronous JavaScript And XML; that is, the combination of JavaScript and XML working asynchronously.

It is a technique developed for interactive web applications, consisting of three existing technologies working together effectively.

These technologies are:

  1. (X)HTML and CSS (Cascading Style Sheets), to structure and present the information on the website.
  2. JavaScript, used for dynamic interaction with the data.
  3. XML, used for interaction with the web server. It is not strictly necessary to use XML in AJAX applications; plain text files, for example, can also store the information.

Like DHTML, AJAX is not itself an independent web technology, but a term that encompasses the three aforementioned technologies.

What is Ajax for?

Ajax is used to make changes to a web page on the user's side without having to reload the whole page.

For example, on a web page the user requests some information offered on that same page (such as a product description), clicks the link, and the requested information appears on the same page without it being loaded again.

Displaying the data entirely with classic HTML consumes significant bandwidth, since the whole page must be loaded again just to show the changes. An AJAX application, by contrast, is much faster and consumes less bandwidth.

The JavaScript used in an AJAX application is a dynamic language, able to make changes to a web page without reloading it. AJAX ensures that only the necessary information is requested and processed, using SOAP or another web services language loosely based on XML.

Hence, at a technical level, three advantages are obtained: a much lower load time, bandwidth savings for the user, and a much lighter load on the server where the website is hosted.
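As a minimal sketch of this request-only-what-you-need pattern, here is the classic XMLHttpRequest wiring (the function name and the usage URL are made up for the example; the XHR object is passed in because older Internet Explorer versions used ActiveXObject("Microsoft.XMLHTTP") instead):

```javascript
// Wire up an asynchronous GET and hand the response text to a callback.
function fetchFragment(xhr, url, onDone) {
  xhr.onreadystatechange = function () {
    // readyState 4 = request finished; status 200 = OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      onDone(xhr.responseText);
    }
  };
  xhr.open("GET", url, true); // true = asynchronous
  xhr.send(null);
}

// In a browser you would call it like this, updating only part of the page:
// fetchFragment(new XMLHttpRequest(), "description.txt", function (text) {
//   document.getElementById("description").innerHTML = text;
// });
```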

AJAX problems

Problems with search engine indexing:

AJAX is used by Google, Yahoo, Amazon, and many other search engines, portals, and content creators, but its use is not as general and massive as some think. Google, for example, which encourages webmasters to use AJAX in their programming, uses it itself in Gmail, Google Suggest, and Google Maps, but not on absolutely all of its pages.

The problem with AJAX is that the content displayed inside an AJAX application is not indexed by search engines. This is because the search engine spiders are not able to interact with the AJAX application and trigger the command that displays the content.

Hence, it is a bad idea, for example, to create a list with the names of our products and build an AJAX application so that, when a product name is clicked, its description and photograph are displayed to the right of the list. If we do this, the product descriptions and their images will not be indexed by Google or any other search engine.

It is not all bad news, though: certain ways of working with AJAX do get indexed, for example, toggling whether content is shown by using positive and negative margins. It is simply something to bear in mind when programming: whether the spiders will be able to get through or not.
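The margin technique mentioned above can be sketched like this (class names, the element id, and the function name are made up for the example): the content stays in the markup, so spiders can read it, and is merely pushed off-screen until the user asks for it.

```html
<style>
  /* Content is present in the page (indexable) but pushed off-screen */
  .hidden-description  { margin-left: -9999px; }
  /* Restoring the margin reveals it without any server round trip */
  .visible-description { margin-left: 0; }
</style>

<div id="desc" class="hidden-description">Product description...</div>

<script>
  function showDescription() {
    document.getElementById("desc").className = "visible-description";
  }
</script>
```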

Accessibility problems:

If we start from the premise that our website should always be accessible to all types of browsers and users, and should at least meet the W3C's Level A standard (http://www.w3.org), we find that most scripts that improve the appearance and interactivity of a website have accessibility issues. AJAX has them too.

As we saw at the beginning of this article, using AJAX involves using JavaScript, and some browsers do not support this type of programming. Although, as we shall see, this is solvable.

But keep in mind that a large part of the AJAX applications found in the libraries that exist on the Internet have not corrected this problem and therefore do not meet the W3C standards (at the end of this article we provide links to code libraries and to articles dealing with accessibility and AJAX).

AJAX, to be used sparingly

As we saw in the previous sections, although AJAX applications bring dynamism, interactivity, and reduced bandwidth to a website, they also have drawbacks at the search engine indexing level and at the accessibility level. Therefore, we must consider and neutralize the following:

  1. If we use AJAX on our websites, we must be aware that the content displayed inside the AJAX application will not be indexed by search engines. To remedy this, we can duplicate that content and make it accessible to the spiders through a sitemap or through links in the footer of the website.
  2. If we use AJAX to make our website interactive, we must keep in mind that it will not meet Level A accessibility unless we use code libraries endorsed by the W3C or provide a way to browse the site without JavaScript.
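One common way to keep a no-JavaScript path open is to make every AJAX trigger a regular link that still works when scripting is unavailable; a hypothetical sketch (loadDescription and the URL are placeholders):

```html
<!-- With JavaScript, loadDescription() updates the page in place and
     "return false" cancels the navigation; without JavaScript the browser
     simply follows the href to a plain HTML page with the same content. -->
<a href="product-description.html"
   onclick="loadDescription('product-description.html'); return false;">
  See the description
</a>
```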

Related links

New information on Google indexing AJAX (March 2010): http://code.google.com/intl/es/web/ajaxcrawling/

Examples of Web pages that use AJAX and AJAX code libraries for use by webmasters:
http://ajaxpatterns.org/Ajax_Examples

Articles explaining how to write AJAX code that complies with W3C Level A accessibility:
http://www.maxkiesler.com/

List of common accessibility errors:
http://www.w3.org/TR/WCAG20-SCRIPT-TECHS/#N11799

How does https get indexed?

Https indexing is one of those mysteries that makes SEO life more interesting. While we know it is possible to get https pages indexed in most search engines, hardly anyone knows how to do it in the shortest possible time.

What is https?

Https is the secure version of the http protocol. The difference between the two is that the former transmits data encrypted, while the latter transmits it unencrypted.

Https uses encryption based on Secure Sockets Layer (SSL) to send information.

The decoding of the information depends on the remote server and the browser used by the user.

It is mainly used by banks, online stores, and any service that requires sending personal data or passwords.

How does https work?

Contrary to what many people think, https does not prevent access to information; it only encrypts it in transit. Hence, the content of a web page that uses the https protocol can be read by search engine spiders. What cannot be read is the content sent from the website to its server, for example, the login and password for access to a private area of the website.

The standard port for this protocol is 443.

How do we know https actually gets indexed?

Google has indexed https since early 2002, and other search engines have gradually adapted their technology to index https as well.

The last search engine to do so was MSN, which managed it in June 2006.

If we search for "https://www." or for inurl:https in the major search engines, we will find https pages indexed in them.

How can we get our https pages indexed?

In principle, our https pages can be indexed naturally, but since this protocol transmits information much more slowly, spiders sometimes fail to download the pages within the time limit they have established and do not index them. This is the main problem we can encounter, and we solve it by trying to reduce the download time of these pages.

How can we accelerate the indexing of https?

There are two techniques:

  1. Google Sitemap: include our https pages in our sitemap (we mean the Google sitemap, not the sitemap for humans) and register it in Google Sitemaps.
  2. Guerrilla: spread links all over the Internet pointing to our https pages, so that the spiders indexing the pages that carry those links also enter the https part of our site.

How can we prevent our https pages from being indexed?

It is not as easy as it looks. It is no use listing the https pages in our existing robots.txt. Each port requires its own robots.txt, so we must create one robots.txt for our http pages and another for our https pages. In other words, we must also have a file called

https://www.nombredelapagina.com/robots.txt
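That https robots.txt is also where we can shut the spiders out entirely; a minimal sketch of its contents:

```
# Served only on the https port (443)
User-agent: *
Disallow: /
```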

If you need help indexing or de-indexing your https pages, please contact us. We will be delighted to assist you.

Additional information:

MSN blog post about indexing, where they explain that MSN is starting to index https:
http://blogs.msdn.com/livesearch/archive/2006/06/28/649980.aspx

Information from Google about preventing https pages from being indexed:
http://www.google.es/support/webmasters/bin/answer.py?answer=35302

More information about Google Sitemaps:
SiteMaps de Google
http://www.geamarketing.com/articulos/Descubre_indexacion_futuro_Google_SiteMap.php

Free online search engine optimization course: Search engine positioning course
http://www.geamarketing.com/posicionamiento_buscadores.php

Discover the indexing of the future: Google SiteMap

Google proposes what will be the new way to index web pages.
Search engines like Google and Yahoo use spiders to gather information from the web pages published on the Internet. Once they have that information, they process it so that, when a user visits their site and searches for a term or phrase, they can quickly sort the search results based on a specific algorithm.

The search engine spiders regularly visit websites that are published on the Internet and automatically update information about their content.

Until now, spiders entered the root directory of a domain, looked for the robots.txt file to check that the site wanted to be indexed, and then proceeded to visit all the links found on the website, thereby recording the content of the pages.

Google SiteMaps will revolutionize this form of indexing the web pages.

It is not just that Google will now read more carefully the site maps that people include in their web pages... it is nothing like that... it is a radically new way of indexing the content of pages. Google proposes that we create a sitemap in XML following certain specifications, which will give its spiders all the information they need and allow them access to URLs that until now may have been hidden for various reasons beyond the webmasters' control.

Google wants to access the content of web pages in the easiest and most efficient way. The current way of indexing pages is already far more efficient than the manual submissions of the old days (who doesn't remember going to a search engine and typing in by hand the description of our site, the keywords we wanted to be found by, and the site URL... but that is Internet prehistory), and what Google now presents is better still.

It all comes down to making a special sitemap available to the spiders.

To create this sitemap, it is enough to install an application on our server (there are versions for all operating systems) that builds a site map in a specific format. The application Google proposes can generate the map from the URLs of the website, from the directories of the website, or from the server logs (ideal for dynamic pages).
Once we have the sitemap built according to Google's specifications, we can register it in Google Sitemaps. Automatically, and in less than four hours, Google will have indexed it.
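For reference, a minimal sitemap in the XML format Google expects might look like this (the URL and date are placeholders; we assume the sitemaps.org 0.9 namespace used by Google Sitemaps):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-06-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```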

Google allows webmasters to create a cron job that generates a new map every hour (for sites with lots of content renewal) and submits it to Google Sitemaps automatically. This way the spiders will immediately know about newly created pages, and those pages can be incorporated into the index.
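That hourly regeneration could be sketched as a crontab entry like this (the script path, the site URL, and the use of wget for the ping are assumptions for the example):

```
# Every hour: rebuild the sitemap, then ping Google Sitemaps so it re-reads it
0 * * * * /usr/local/bin/generate_sitemap && wget -q -O /dev/null "http://www.google.com/webmasters/sitemaps/ping?sitemap=http://www.example.com/sitemap.xml"
```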

Advantages of this application:

No matter how badly structured your website's paths are for spiders... with a site map created by the Sitemap Generator, Google's spiders will always find the URLs of all your pages.

Another great advantage is the quick indexing of the entire site's content. In less than 4 hours, the spiders will have visited up to 50,000 links on our website. For websites with more URLs, Google recommends several sitemaps and a sitemap index.

Disadvantages of this application:

It requires some programming knowledge, so either ISPs offer this service as added value for their customers, or many websites will not have it and will remain indexed only by the ordinary spiders.

The sitemaps already available on most web pages are not compatible with Google's format. Google wants an XML document with specific specifications.

With this project, Google is undoubtedly seeking to improve the indexing of web pages and to include in its indexes pages that until now were lost in a sea of links within our sites.

Google has created the Sitemap Generator and this express indexing service and offers them completely free... It will be interesting to see Yahoo's reaction, because Yahoo charges $49, $20, or $10 for its fast-indexing service, depending on the number of URLs we want indexed on an accelerated basis.

We do not yet have firsthand results on the effectiveness of indexing through Google Sitemap. Once we have installed the new sitemap on several websites and can compare the increase in the number of indexed pages and in the frequency of spider visits, we will write a new article reporting the results. See you then.

Later note: a few months have passed since we wrote this article. The results have been very good. A brand-new website gets indexed in less than 24 hours. It is ideal for when a new site goes live: it can be indexed all at once, without having to wait months and months for Google's spiders to read its entire contents.

Additional information:

URL with information about Google sitemap:
https://www.google.com/webmasters/sitemaps/docs/en/about.html

URL with specifications about Google sitemap:
https://www.google.com/webmasters/sitemaps/docs/en/protocol.html

Google opens free access to its AdWords APIs

Some time ago there were rumors of a possible opening up of the APIs that control Google AdWords. Until now the rumors were denied by the Internet giant, but yesterday things changed.

It is now possible to request access to the APIs.

But what are the Google AdWords APIs? Basically, they are programming interfaces, until now modifiable by no one but Google, that allow advertisers and other companies to drive AdWords from their own software.

On the one hand it is a bit disappointing, given the expectations the rumor had raised, since the APIs do not add new features to AdWords. With the APIs you can do what advertisers can already do with Google's control panel. In essence, what we are talking about is a greater degree of customization: the ability to fine-tune any of these three aspects:

  • Campaign management.
  • Campaign reports.
  • Traffic estimate.

To prevent abuse and spam, the AdWords APIs will be associated with a maximum number of operations per month for each advertiser. Each of these operation quotas is calculated individually based on the existing accounts.

With this move Google seeks to achieve two things. The first is to give large advertising companies APIs that allow them to expand freely and enter markets inaccessible until now. The second is to give advertisers a little more control.

Some time ago, Amazon managed to reach unthinkable places, with incredible functionality, by opening up certain APIs and allowing programmers all over the world to query the Amazon catalog from almost any device. As a consequence, today there are truly original and powerful ways for users to interact with Amazon. One example is something that is already a reality in some Asian countries: a person stands in front of a shoe store and wants to check whether a pair of shoes is expensive. They scan the barcode with their mobile phone (a feature some phones in Japan and Korea already include), the product is compared with its price on Amazon, and they get advantages and discounts if they buy it online.

In Google's case the opening of the APIs is not as broad, but it is an indisputable first step toward opening new markets and consolidating its dominant position.

If the expected evolution occurs, it is more than likely that in months or even weeks we will see the first results of this new flexibility in AdWords.

Links of interest:

General page on the AdWords APIs

Google's information page for requesting access to the APIs

AdWords API blog

AdWords API discussion forums