Do you know about the changes to the algorithm that manages Google Adwords?

Until now, the position in which your ad appeared depended on how much money you were willing to pay for it. Since early December, that is no longer true.

Google has introduced a "quality factor" into the algorithm that orders the results and ultimately decides which ads appear first, second, third, and so on. It has done so to prevent people who do not actually offer certain products or content from buying the corresponding keywords and driving traffic to their websites as if they did (for example, by buying keywords that correspond to a competitor's products).

While some SEOs are beginning to fear that managing ads through Adwords will end up as opaque as the ordering of search results, we share Google's view that the change is good both for advertisers and for the people who click on the ads.

What is the "quality factor"?

As always with things that work well, there are people who abuse them to get more out of them than everyone else. This has been happening with Google Adwords, and Google now wants to remedy it.

Google wants to make sure that the page you are taken to once you have clicked on one of the ads actually deals with what you were looking for.

So from now on, if the content of a page has nothing to do with what the user was searching for when Google showed the ad, that page's ad will come out in last position.

How do you reach the top ad positions without having to pay more for them?

The answer is clear: maximize the number of times the words you bought ads for appear in the text of the page the ad points to.

And, of course, make sure the page your ad leads to (the landing page) is different for each word. You can set this up easily from the Adwords menu.

For example:

A user searches Google for: "fireproof safe"
You have a company that manufactures security products and sells fireproof safes, and you have bought Adwords ads so that when someone searches for "alarm", "safe", "safety deposit box", etc., your ad appears on the right-hand side of Google.

Until now, the best advice for this case was to run two ads: one for alarms and one for safes. That makes users more likely to click on one of your ads, because the ad talks about exactly what they are looking for.

  • "Alarm" -> alarm annunciation
  • "Safe" -> announcement of safes

With the change in the algorithm, in addition to selecting an ad for each type of search, you have to select the page each ad will lead to.

It is no good redirecting everything to your home page (as the Adwords system does by default), because your site surely has a page specific to that product, and on that page the search term will have more relevance (it will appear many more times).

This is also good for you, because a user who lands directly on a page containing the content he wanted will be far more willing to fill out a form requesting information about the product, or to send an email asking about it.

If you do not have a specific page for each type of product matching the keywords you have bought on Google, you should create them, or your ad will never appear in the top positions.

Tips to get the most out of your Google Adwords:

  1. Buy only the words your target audience will actually search for on Google.
  2. Create landing pages containing those keywords.
  3. Test more than one ad and invest in the one with the highest conversion rate.
  4. Concentrate your budget on the words that get you the most conversions.
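The second tip can even be checked mechanically. The sketch below (the page text and keyword list are invented examples) counts how often each purchased keyword appears on a landing page, which gives a crude signal of whether the page matches the ad:

```python
# Hypothetical sketch: count case-insensitive occurrences of each purchased
# keyword in a landing page's text, to gauge whether the page matches the ad.
import re

def keyword_counts(page_text, keywords):
    """Return {keyword: number of occurrences in the page text}."""
    text = page_text.lower()
    return {kw: len(re.findall(re.escape(kw.lower()), text)) for kw in keywords}

page = """Our fireproof safes protect your documents.
Every fireproof safe is tested at 1000 degrees.
Browse the full range of safes below."""

counts = keyword_counts(page, ["fireproof safe", "alarm"])
# "alarm" never appears here, so this page should not be the
# landing page for the alarm ad.
print(counts)
```

A keyword that scores zero on its landing page is exactly the mismatch the new quality factor is meant to penalize.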

A study shows that women and men surf the Internet following different patterns

The Eyetracking Media Spain study, carried out by Alt64 in collaboration with the AIMC, concludes that 61% of Internet users look at online advertising.
To determine the navigation pattern of Spanish users and the effectiveness of Internet advertising, their gaze was analyzed using Eyetracking technology.

Alt64 Digital, the company that distributes Eye Tracking technology in Spain, in collaboration with the Asociación para la Investigación de Medios de Comunicación (AIMC), has carried out, for the first time in Spain, a study that follows the eyes of Internet users as they read online media: the study is called Eyetracking Media Spain.

This is the first time a study of this scale has been carried out in this country using Eye Tracking (gaze-tracking) technology on the websites and online media with the largest audiences. There is a precedent in the United States that marked a before and after in the design of web pages, and of American online media in particular: the Eyetrack Study, which reached its third edition in 2004.

To produce Eyetracking Media Spain, the six online media with the widest audiences whose content is openly visible and requires no registration to read were analyzed: ABC, El Periódico, El Mundo, La Vanguardia, La Verdad and La Razón.

The analysis of the navigation patterns of the users tested shows that women and men read the online press differently. While men read in a zigzag and skip many paragraphs, women read from top to bottom and, in most cases, read the headlines and lead paragraphs of news items in full. Both men and women scan the pages looking for key words or phrases and, based on them, decide whether or not to read the articles.

The results show, among other things, that 61% of the users tested look at the advertising on the websites. And in an interview conducted after the navigation session, the study found that 52.9% of them remembered the advertised brand.

The study also shows that cramming a page with advertising does not make users see more of it. Instead, a small amount of advertising, placed in certain formats such as a skyscraper on the outer edge or a banner at the top, is able to capture the user's gaze and get the ad seen.

Besides studying navigation patterns, the study answers questions such as: Are ads really looked at? Where on a web page are ads seen best? Which ad formats are the most effective? Which ads were looked at most by the users who took part in the study? Can the effectiveness of online advertising be measured? Which brands were remembered?

A person's gaze is a good indicator of what is drawing their attention, and with Eyetrack equipment it is possible to know, automatically and with high precision (less than 0.5% error), for example, what path the gaze follows, how long the user stops on a particular point or area, and whether the user is reading or merely scanning a text.

The data are captured by a small camera built into the monitor the user browses on. The path of the gaze is recorded on video and processed by software so that the sample's routes, fixation times, reading patterns, etc. can be analyzed.

At the level of sample analysis, it is possible to obtain aggregated information with dark areas (areas nobody looked at) and hot spots (areas most users looked at), and to turn them into heat maps of the web pages or of any other medium being studied.
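As a rough sketch of how such heat maps can be built, the snippet below aggregates gaze fixations into a grid of counts. The coordinates, page size and grid resolution are invented for illustration; real Eyetracking software is far more sophisticated:

```python
# Hypothetical sketch: aggregate (x, y) gaze fixations into a grid of counts.
# Cells with high counts are "hot spots"; cells left at zero are "dark areas".

def heat_map(fixations, page_w, page_h, cols=4, rows=4):
    """Return a rows x cols grid counting fixations per page cell."""
    grid = [[0] * cols for _ in range(rows)]
    for x, y in fixations:
        c = min(int(x * cols / page_w), cols - 1)  # clamp to the last column
        r = min(int(y * rows / page_h), rows - 1)  # clamp to the last row
        grid[r][c] += 1
    return grid

# Invented sample: three fixations near the top of the page (small y)
# and one lower down; the top row of the grid comes out "hot".
fixations = [(100, 20), (120, 30), (380, 25), (200, 500)]
print(heat_map(fixations, page_w=800, page_h=600))
```

Overlaying such a grid on a screenshot of the page is essentially what a heat map visualization does.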

The complete study can be downloaded from the AIMC website and from the Alt64 website.

Yes, a good web analytics program can earn you money...

How? Very easily: if you knew exactly which websites the users who end up buying on your site come from, and exactly where the users who buy products in a particular category come from, wouldn't it be much easier to attract more users like them and thus increase your revenue?

But unfortunately, there are hundreds of statistics packages on the software market, and few give you this kind of information.

Most of them offer data that is interesting from a technical point of view, or from a basic statistics point of view, but not from the point of view of the marketing professional.

The truth is that in most companies, web statistics are still in the hands of the technical department rather than the marketing department. While control of the website has been passing from technicians to marketers over recent years, statistics are only now taking their first steps in that move between departments. Hence, the vast majority of the programs that analyze them were created by technical staff and are designed to be read by technical staff.

This does not mean that a marketer cannot understand them, far from it; what happens is that the data these programs provide are geared to the knowledge needs of technical staff rather than marketing staff.

Think about your statistics program... Does it clearly show you the navigation path most of your users follow? Does it show you the referrers of the users who buy something on your site? Probably not... it only shows you the referrers of your visits, without distinguishing between customers and visits. Does it tell you on which fields your users abandon your forms? There is no end of information you should be able to work with, but that your program does not give you.

On 29 September, a Web Analytics seminar was held in Barcelona, dedicated to explaining the needs of marketing professionals and to seeing how HBX, a statistics program 100% oriented to marketing, could help them.

During the seminar, some of the attendees raised their concerns about analytics programs. One, for example, was how to handle data from multiple websites (among the attendees were several people from large corporations that have dozens of sites and want to aggregate the data from all of them). Other attendees expressed a preference for programs that analyze tags (small snippets of code inserted into a web page) over those that analyze logs (server records), although they feared that installing the tags on their pages would be complicated. Nothing could be further from the truth, as the seminar's speakers showed us.

Back to the title of this article: how to earn money with a web statistics program. The secret is to know your customer as well as you possibly can, so you can build their profile and make it easier to find many more like them. For that you need a good analytics program. One that deals with customers, not visits.

For this, nothing beats a program that can connect to our CRM. The new generation of analytics programs already do. HBX connects automatically to programs such as SalesForce, and if the company uses a different CRM, a bridge application can be created to connect the two programs. Moreover, with cookie-based identification it is possible to follow a customer's entire life cycle, even before they formally become a customer, from the very moment they enter our website.
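The "customers, not visits" idea can be illustrated with a tiny sketch. The data below (visitor ids, referrers and the set of buyers) are invented, and plain Python structures stand in for what an analytics tool and a CRM would provide:

```python
# Hypothetical sketch: join web visits with CRM purchase records to report
# the referrers of buyers only, rather than of all visits.
from collections import Counter

visits = [  # (visitor_id, referrer) pairs -- invented sample data
    ("u1", "google.com"), ("u2", "newsletter"), ("u3", "google.com"),
    ("u4", "banner-campaign"), ("u5", "newsletter"),
]
buyers = {"u2", "u5"}  # visitor ids that purchased, as reported by the CRM

def buyer_referrers(visits, buyers):
    """Count referrers, but only for visits that converted into customers."""
    return Counter(ref for visitor, ref in visits if visitor in buyers)

# In this made-up sample, both buyers came via the newsletter, even though
# google.com sent the most raw visits -- exactly the distinction a
# visits-only statistics program cannot make.
print(buyer_referrers(visits, buyers))
```

This is, in miniature, what connecting the analytics program to the CRM makes possible.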

Web statistics are definitely turning into something powerful as a marketing tool. Don't miss the opportunity to get to know them better. Find out which system is installed on your website and learn to use it!

Discover the 3 reasons why Microsoft is entering the search-engine ad market

Microsoft announced last week that it has developed its own system of text ads for MSN Search results. According to Microsoft, this system is much better than those of its rivals Google and Yahoo, since it will serve ads based on the user's gender, age and location.

There are three clear reasons why Microsoft is making this move:

  1. For Microsoft, Google is becoming a real threat. Thanks to the revenue it gets from Adwords, Google is funding many free-software initiatives and applications that have little to do with Windows and that compete directly with Microsoft products, such as Gmail vs. Hotmail.
  2. It is obvious that the market for text ads included in search results is a great source of income. Google is living proof: it has announced a net profit of 342.8 million dollars for the second quarter of 2005.
  3. In a way, for some years MSN has been forced to share revenue with Yahoo, one of its main competitors. Until now, the ads MSN offers on its website have been served by Overture, a company that has belonged to Yahoo since March 2003, so Yahoo gets a commission on every MSN sale.

According to Microsoft, the service it offers will be much more attractive to advertisers than Google's and Yahoo's, because it will provide segmentation by gender, age, location, the time the ads are displayed, and other parameters Microsoft knows about its users.

Microsoft has arrived late to the world of search engines, but it looks set on doing things thoroughly, step by step and without taking risks.


The first move came in the summer of 2003, when it launched its newly programmed spider, MSN Bot, to scan the entire web, while its portals were still using the Inktomi search engine (owned by Yahoo since that same year, and for which Yahoo paid 235 million dollars). In 2003 Yahoo earned 5.3 million dollars from MSN for the use of its search engine.

In mid-2004 Microsoft launched the beta version of its own search engine, and at the end of 2004 it stopped using Inktomi for good and began offering its own search results. Since then it has been fighting for a position among the best search portals. MSN Search's best asset, though, is not its result-ranking algorithm (as it is in Google's case) but the fact that many Windows users do not know how to change their browser's home page, or how to change the search engine MS Explorer ships with by default. So it is no surprise that MSN is the world's number two website by traffic (Yahoo is number one, MSN number two and Google number three).

After getting its own search engine, the next logical step is for Microsoft itself to exploit the economic potential of search engines, a potential Microsoft failed to see until Google and Yahoo began presenting positive financial results year after year.

It seems that the first Microsoft sites to test this new ad system will be MSN Singapore and MSN France. It will then spread to other countries.

We will be watching when this happens, to analyze the segmentation, its acceptance, and the subsequent expansion of MSN's network of advertisers.


Alexa Ranks
World ranking of websites by number of visits and page views.

Discover the substitute for Google's PageRank: the TRUST RANK

For some months there has been speculation that Google may replace the PageRank algorithm with a new one that would filter out all search-engine spam, or at least try to neutralize it. Discover in this article everything that is known so far about the new algorithm, whose name is Trust Rank.

This is the technique that was used so that, when you search for "thieves" on Google, the SGAE's website appears in first position.


The new Google algorithm would prevent such practices.

The purpose of PageRank is to assign each web page a numerical value according to the number of pages that recommend it and, in turn, according to the PageRank those pages have. That is, it establishes the importance of the website. The logic is: if web pages link to another page, they are recommending it; and if it is recommended, it must be important in the subject area of the first website. A recommendation that comes from a page which is itself highly recommended is worth more than a recommendation that comes from a page hardly anyone recommends.

Google wants the top positions of its search results to hold pages that have a certain relevance and that are recommended by other pages which, in turn, are also relevant. To determine PageRank, Google analyzes the number of links coming from other web pages, together with those pages' PageRank. The Trust Rank starts from the same base, but instead of assessing the importance of a recommendation from the PageRank of the page making it, it does so from a set of web pages that have been judged important by humans rather than by algorithms.

The web pages that humans have determined to be important are considered "seed websites", and their links are assigned a value. It is that value that is transmitted across the network.

To illustrate with an example: suppose we have a seed website "A". "A" transmits a Trust Rank of 100 to all the websites it links to. These pages, in turn, pass a Trust Rank of 99 on to all the websites they link to. And the latter pass a Trust Rank of 98 on to the pages they link to.

To mitigate the degradation of Trust Rank with distance from the seed sites, the algorithm includes a correction that takes into account the number of hops between the seed website and the website receiving Trust Rank, without completely cancelling out the effect of that distance.
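The per-hop decay described above can be sketched in a few lines of code. This only illustrates the article's simplified 100/99/98 example (the link graph is invented, and the real algorithm propagates trust in a damped, PageRank-like fashion rather than subtracting one point per hop):

```python
# Sketch of the simplified Trust Rank example: a seed page passes 100 to
# the pages it links to, they pass 99, the next level 98, and so on.
from collections import deque

def propagate_trust(links, seed, seed_value=100):
    """Breadth-first trust propagation: value drops by 1 per hop from the seed."""
    trust = {seed: seed_value}
    queue = deque([seed])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in trust:  # keep the shortest-path (highest) value
                trust[target] = trust[page] - 1
                queue.append(target)
    return trust

# Invented link graph: A links to B and C, both link to D, D links to E.
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(propagate_trust(links, "A"))
```

Note how a page keeps the value of its shortest path to the seed: being two hops away through two different routes does not raise it.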

The idea behind the Trust Rank looks good, but there are certain issues that must be considered:


Who will the seed websites be?

Will people start doing reverse spam?


So, half in jest and thinking about the not-too-distant future when the Trust Rank will be working, it occurs to me that perhaps the same people who arranged for the SGAE to appear when you search for the word "thieves" will play at sabotaging web pages: linking to them mercilessly from their spam pages, thereby subtracting Trust Rank from them and keeping them from appearing at the top of the search engines.

When will we have the Trust Rank algorithm built into Google?

No idea... nobody agrees on this. When we least expect it, Google will issue a statement informing us that it has already been implemented. What is clear is that it will notify the press and the Internet: it will certainly mean a major qualitative improvement in search results, so Google will make sure people know about it. I doubt Google's communications department would throw away an opportunity like this.

Further information for anyone wishing to broaden their knowledge:

Link to the Stanford University paper on the Trust Rank:

Search engine optimization course (which will no doubt have to change the day the Trust Rank is implemented, but which already covers the new indexing system with the Google Sitemap Generator): Online Search Engine Optimization Course. The course is free.

Discover the indexing of the future: Google SiteMap

Google proposes what will be the new way to index web pages.
Search engines like Google and Yahoo use spiders to gather information from the web pages published on the Internet. Once they have the information, they process it so that, when a user visits their site and searches for a term or phrase, the results can be sorted quickly according to a specific algorithm.

Search engine spiders regularly visit the websites published on the Internet and automatically update the information they hold about their content.

Until now, spiders entered a domain's root directory, looked for the robots.txt file to check that the site wanted to be indexed, and then visited every link found on the website, recording the content of its pages as they went.

Google SiteMaps will revolutionize this form of indexing the web pages.

It is not just that Google will now read more carefully the site maps people include on their web pages... it is nothing like that... it is a radically new way of indexing the content of pages. Google proposes that we create a sitemap in XML, following certain specifications, which will give its spiders all the information they need and grant them access to URLs that until now may have been hidden for various reasons beyond the webmasters' control.

Google wants to access the content of web pages in the easiest and most efficient way possible. Even though the current way of indexing pages is already far more efficient than the old manual submissions (who doesn't remember going to a search engine and typing in, by hand, our site's description, the keywords we wanted to be found by, and the site's URL... but that is Internet prehistory), what Google is presenting now is much better.

Everything comes down to making a special sitemap available to the spiders.

To create this sitemap, it is enough to install an application on our server (there are versions for all operating systems) that generates a site map in a certain format. The application Google proposes can generate the map from the website's URL, from the website's directories, or from the server logs (ideal for dynamic pages).
Once we have the sitemap built according to Google's specifications, we can register it with Google Sitemaps. Automatically, and in less than four hours, Google will have indexed it.
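As a rough illustration of the format, here is a minimal sketch (with invented URLs and dates) of how such an XML sitemap could be generated. The element names follow the sitemap protocol (urlset, url, loc, lastmod), but consult Google's specifications for the full details:

```python
# Hypothetical sketch: build a minimal sitemap XML document for a list of
# (url, last-modified) pairs. URLs and dates below are invented examples.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string for the given (url, lastmod) pairs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc        # the page's address
        ET.SubElement(url, "lastmod").text = lastmod  # last modification date
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([("http://www.example.com/", "2005-06-04"),
                     ("http://www.example.com/product.html", "2005-06-03")])
print(xml)
```

A file like this, placed on the server and registered with Google Sitemaps, is what lets the spiders discover every URL directly instead of crawling link by link.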

Google allows webmasters to set up a cron job that generates a new map as often as every hour (for sites whose content changes constantly) and submits the map to Google Sitemaps automatically. This way, the spiders learn of newly created pages immediately and can incorporate them into the index.

Advantages of this application:

No matter how badly your site's internal links are laid out for the spiders... with a site map created by the Sitemap Generator, Google's spiders will always find the URLs of all your pages.

Another great advantage is the speed with which the entire site's content is indexed. In less than 4 hours, the spiders will have visited up to 50,000 links on our website. For websites with more URLs, Google recommends creating several sitemaps and an index of sitemaps.

Disadvantages of this application:

It requires some programming knowledge, so either ISPs offer this service as added value for their customers, or many websites will go without it and will continue to be indexed by the ordinary spiders.

The site maps already available on most web pages are not compatible with Google's format. Google wants an XML document with specific elements.

With this project, Google is undoubtedly seeking to improve the indexing of web pages and to bring into its indexes pages that until now were lost in a sea of links within our sites.

Google has created the Sitemap Generator and this express indexing service, and offers them completely free... it will be interesting to see Yahoo's reaction, because Yahoo charges $49, $20 or $10 for its fast-indexing service, depending on the number of URLs we want indexed on an accelerated basis.

For the moment we have no firsthand results on the effectiveness of indexing through Google Sitemaps. Once we have installed the new sitemap on several websites and are ready to compare the increase in the number of indexed pages and in the frequency of spider visits, we will write a new article reporting the results. See you then.

Later note: a few months have passed since we wrote this article. The results have been very good. An entire new website is indexed in less than 24 hours. It is ideal for when a new site goes online: the whole site can be indexed at once, without waiting months and months for Google's spiders to read all its content.

Additional information:

URL with information about Google sitemap:

URL with specifications about Google sitemap:

Learn how the Internet has affected Porter's 5 Forces

The Internet has changed some of the assumptions Michael Porter started from in 1980, when he published the 5 Forces that determine the success or failure of a sector or a company.
Michael Porter
Over recent years, Porter has been modifying and clarifying the 5 Forces. In this article we will see how they have evolved to adapt to new times.

Michael Porter will be in Spain on 12 and 13 May, lecturing at Expomanagement Madrid 2005. It will be an excellent opportunity to hear firsthand his views on the changes the Internet is producing in our economy, and on how the forces are moving at the moment.

What are Porter's 5 Forces?

In 1980 Michael Porter developed this method of analysis in order to discover which factors determine the profitability of a sector and of its companies. For Porter, there are 5 distinct forces that mark the success or failure of a sector or a company:

  1. The rivalry between competitors
  2. The threat of new entrants into the market
  3. The threat of products that substitute ours
  4. The bargaining power of buyers
  5. The bargaining power of suppliers

When is the analysis of Porter's 5 Forces used?

  1. When you want to develop a competitive advantage over your rivals.
  2. When you want to better understand the dynamics that influence your industry and/or your position within it.
  3. When you are analyzing your strategic position and looking for initiatives that are disruptive and improve it.

But the Internet has changed some of the assumptions Porter started from in 1980 for each of the 5 factors. Porter himself, over the years, has been modifying them and adapting them to today's world.

In this article we will go through the 5 forces and see how the Internet has changed the rules of the game, and what we should take into account if we want to keep using this type of analysis.

1. The rivalry between competitors:

Porter focuses his recommendations on developing differences between our products and our competitors', to avoid falling into price competition, a strategy that ultimately ends up hurting the profitability of both companies.

  • But the Internet has made real cost reductions possible in companies whose costs are tied to communication, to receiving information, or to arranging transactions. So a large share of the companies that have known how to exploit the advantages of the Internet and of the technology associated with it end up being able to offer lower prices and, therefore, competing on price in their market. Faced with the same product at two different prices, customer loyalty only covers the small price difference a customer will tolerate before abandoning us for the competition. The Internet allows rivalry to be about price without this leading to a war in which there is no winner.
  • The Internet has also brought onto the market many products that were previously intended only for a local market; so even if our product was unique in our market, identical products now appear next to ours... and once again we end up competing on price.
  • The relationship between competitors has changed radically with the globalization of markets. Local clusters specialized in producing a particular product or service make the relationship between competing companies collaborative, with shared goals: jointly developing technologies and research that raise the productivity and innovation of the companies taking part in the cluster. Silicon Valley and Hollywood are the most famous clusters, but there are hundreds of thousands of local clusters that have radically changed the relationship between competitors.

There is more information on the effect of local clusters on the relationship between competitors in various articles published by Michael Porter himself. One of them is cited at the end of this article.

2. The threat of new entrants:

The threat of new players entering our market is higher when barriers to entry are low, when the companies already in the market are unwilling to fight the newcomers, and when a new player has high expectations of profit if it enters. Porter therefore advocates raising the barriers to entry to a market. His recommendations are as follows:

  1. Take advantage of economies of scale to lower costs.
  2. Create differentiated products and patent them.
  3. Develop the company's brand image, to make it harder for customers to switch brands.
  4. Close off access to distribution channels.
  5. Obtain restrictions on new players, dictated by government institutions.


  • This model is valid for static markets. The Internet has fostered a multitude of dynamic markets in which Porter's recommendations cannot be applied. The consolidation of the dot-com companies that survived the 2000/2001 crash has changed business models and value chains. The dot-coms have destroyed links in the chain and created new competitive scenarios in which they have been killer applications for many services that, until then, were offered only by the offline world (for example: online job boards, classified-ad websites, online auctions, etc.).
  • Network externalities, on the other hand, lead to the creation of natural monopolies, because they generate positive feedback processes in which each new user of a service makes it more valuable for the next user.

3. The threat of the development of substitutes:

Porter considers one product a substitute for another only if it replaces a product from an industry different from yours. For example, the price of aluminum beverage cans is based on fluctuations in the price of glass bottles and plastic bottles. They are substitute forms of packaging, but they are not rivals from within the aluminum packaging industry.

  • Technology increasingly enables the creation of new businesses that until now were unthinkable. The radical technological changes we are living through make any kind of prediction or prior analysis on this point impossible. Think, for example, of the bandwidth market: we have connections via telephone cable, via satellite, via the power grid, etc., all of which appeared and were deployed within a relatively short span of time. It is hard to foresee and counteract the effects of this kind of product. The user will switch as soon as they perceive that the new product costs less, or when it gives them new functionality.
  • The Internet also enables other ways of meeting needs and functions, creating new and previously unimaginable substitutes.

4. The bargaining power of buyers:

For Porter, this threat must be neutralized with a strategy designed for that purpose.

  • The truth is that, thanks to the Internet, customers have more and more power, although seen from the point of view of the traditional company this is not exactly positive:
  1. The Internet increases the information available about products and about the reality of the market.
  2. It increases customers' bargaining power, because it provides more direct channels and eliminates links in the distribution of products.
  3. It provides an unbeatable framework for consumers to band together and lobby against certain companies when they are dissatisfied.
  4. To better understand this point, I recommend reading the summary of Philip Kotler's talk at the World Marketing and Sales Forum (Barcelona 2004): The 10 principles of "new Marketing".

5. The bargaining power of suppliers:

Porter focuses his analysis of this point on the observation that suppliers' power depends on how important they are (think of suppliers with a captive market, e.g. Telefónica, Microsoft, etc.).

  • The current trend is to treat suppliers as business partners and to share with them the ultimate goal of meeting our customers' needs. The customer-supplier relationship is changing.

Although everything noted in the comments on the 5 forces suggests that the Internet has forced companies to compete on price, the Internet has also enabled some companies to achieve great success with their differentiation strategies. These are companies that are strong in:

  • Scientific research.
  • Product development teams with talent and creativity.
  • Sales teams with great communication skills and awareness of the needs of a changing market.
  • Brand image that conveys innovation and quality.

But the risks associated with differentiation have also been magnified by Internet:

  • Imitation of our products by others who have not invested in R&D.
  • The changing and unpredictable tastes of customers.

So, to round off the article: the Internet has left Porter's 5 Forces analysis still valid today, but it is much more complex than it used to be, with many more variables to consider.

Related information

Article by Michael Porter on Local Clusters.
Harvard Business Review: Local Clusters

Summary of Philip Kotler's conference about the new Marketing and the increased power of consumers. Philip Kotler: the 10 principles of "New Marketing"

Information about Expomanagement, Madrid 2005, where Michael Porter will lecture about the results-oriented strategy:

  • The results-oriented strategy
  • The economic logic of higher returns
  • How to recognize what kind of business your company competes
  • The importance of abandoning destructive competition and adopting strategic competition
  • What are the five keys to an effective strategy
  • Why most companies have no strategy
  • How to find the right strategy
  • How to successfully communicate and implement a strategy: essential steps

Huygens maps Titan's surface

An impressive video in which we can see how the Huygens probe, after leaving the Cassini spacecraft, descends toward the surface of Titan, mapping it as it goes.

I doubt the music and the special effects come from the probe... but the way it maps Titan, and watching the image gain resolution as the probe approaches, is impressive.

Why does free content increase revenue? (OGame)

We all know that giving away value for free is one of the best weapons for succeeding online. If you can offer a good portion of your service for free, it ends up having a positive impact on your business results.

Something that those of us who have spent years building Internet businesses know for certain is not always easy to explain to third parties. In talks and classes I see skeptical faces whenever we discuss this topic. People ask: how can I make money giving things away for free? Or they think that if you give away part of your content, or whatever it may be, you spoil your users, or that you are devaluing your image... nothing could be further from the truth. But how do you explain it so they understand?

The explanation is simple: the viral marketing generated by something free is far greater than what a paid product can generate. So if you are able to convert the users who arrive at your site into customers, you have won the game.

For a few days now, I have had new arguments with which to convince my students...


If there is something I love, it's games... and of all games, strategy games. Since early January 2005 I have been playing OGame, a strategy game framed within the genre of turn-based massively multiplayer games. It is played through the web, with no software installation required. A typical colonize-the-planets game (a note for those over 30: it's like VGA Planets with turns, but without sending emails). It's free. It's addictive.

Right now there are more than 7,000 of us playing in the same game.

I have been analyzing OGame's growth since I joined it. Can anyone guess how fast the number of players grows each day?

The number of players grows by 6% a day. Daily!!

Can you imagine a business growing geometrically at 6% per day?
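To get a feel for what a steady 6% compound daily rate means, here is a minimal sketch. The starting base of 1,000 players and the time horizons are illustrative assumptions, not figures from the article:

```python
# Compound daily growth: a 6% daily rate multiplies the player base by 1.06 each day.
# The base of 1,000 players is a hypothetical figure chosen for illustration.
def players_after(days, base=1000, daily_rate=0.06):
    """Project the player count after `days` of steady compound growth."""
    return base * (1 + daily_rate) ** days

for days in (7, 30, 90):
    print(f"day {days:3d}: ~{players_after(days):,.0f} players")
# 1.06**30 is about 5.7, so the base roughly sextuples every month.
```

At that pace the compounding quickly dwarfs anything a one-off marketing campaign could bring in, which is the point the next paragraphs make.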

Well... it is certainly hard to imagine. And in this case it would be cheating to call OGame a business, at least as far as the Spanish version is concerned. The German and English versions are another matter.

The game's administrators (Germans) have so far ignored the Spanish market. The business model for the other versions is as follows:

Business model: selling advertising to the German and English communities + a paid version of the game with no advertising and extra features that make organizing your strategies easier.

Advertising does exist in the Spanish version, but it is minimal, poorly targeted, delivered via pop-ups (which you can skip entirely if you want)... and, worst of all, it is all in German. Hence my assertion that the Germans have decided not to exploit the Spanish market for now.

But back to what interests us in this article: how do you prove to the unbelievers that giving something away for free ends up increasing your revenue?

Consider the growth of the game:

The secret is in the network structure behind the game. It is not strictly a Metcalfe network (to learn what a Metcalfe network is, read: Analysis of "Why eBay (and possibly Google) open their source code to developers"). If we analyze its growth we will see that it does not follow the power curve typical of Metcalfe networks (growth = number of users ^ 2). Nor does it follow the growth curve of a group network (growth = number of channels ^ users). Rather, it is a kind of mixture of both, which ends up producing a graph similar to the adoption curve of a new technology, Gartner's hype curve (see attached image). That is, first it grows very strongly (in January we reached 15% daily growth), then it drops sharply, and it finally stabilizes a bit above the minimum it fell to (the 6% daily rate we are at now).
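The three growth patterns named above can be contrasted in a toy sketch. All the concrete numbers below (base, rates, durations) are illustrative assumptions, not data from the game:

```python
# Three growth patterns discussed in the text (all numbers illustrative):
def metcalfe(users):
    """Metcalfe network: growth proportional to users squared."""
    return users ** 2

def group_network(channels, users):
    """Group network: growth = channels raised to the number of users."""
    return channels ** users

def ogame_like(base, daily_rates):
    """OGame-style growth: compound a sequence of daily rates
    (a spike, a sharp drop, then a plateau, as the hype curve suggests)."""
    total = base
    for rate in daily_rates:
        total *= 1 + rate
    return total

# Two weeks at the 15% January peak, a week of near-stagnation, then the 6% plateau.
rates = [0.15] * 14 + [0.01] * 7 + [0.06] * 30
print(metcalfe(100))          # 10000
print(group_network(2, 10))   # 1024
print(round(ogame_like(100, rates)))
```

The point of the mixture: the compound-rate curve rises and flattens in stages, instead of exploding uniformly the way the two pure network formulas do.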

OGame's viral marketing also works like a network of contacts... small explosions as the recommendation reaches small groups of friends or communities. That is why it ends up showing a curve that combines group growth with the growth of Metcalfe networks.

In any case... a steady growth of 6% per day is no small thing. That is a lot of new visitors to a web page every day, many more than we could attract with any marketing campaign, search engine indexing, Adwords advertising... or anything else we might devise to increase visits to our website.

Again, I repeat: the secret lies in knowing how to convert those visits into customers. And viral marketing and free content have nothing to do with that part; they have already fulfilled their mission of bringing users to the page. From there on, everything depends on our ability as business managers.

How do we convert visitors into customers?

That will have to be the subject of another article. But so as not to leave readers unsatisfied, I will mention that there are several models:

  • Online games: offer a paid version of the game, with more functionality and without micro-annoyances such as advertising. This is the model chosen by OGame's German managers for their other communities.
  • Content-selling sites: offer expanded, higher-quality content, but by the same authors, for a reasonable price.
  • Online marketplaces and classifieds: offer a package of value-added services to complement the services all users receive for free. We know from interviews that some users would be willing to pay for these: advertising for their products, more visibility, extra features, etc.

There are more models, but as I said before, the subject deserves an article of its own. The important thing to remember is that there is no better way to attract users to your website than offering free content that generates viral marketing at hyperluminal speed ;-)

Related links:


Article about Metcalfe networks: Metcalfe networks

PowerPoint presentation about Metcalfe networks: Metcalfe networks

Video of the Huygens probe landing on Titan

Simulation of the Huygens probe's landing on Titan, after a seven-year journey through the Solar System aboard the Cassini spacecraft.

ESA's Huygens probe successfully passed through the atmosphere of Titan, Saturn's largest moon, and landed 'safely' on its surface. This is the simulation of the landing that ESA prepared to illustrate the event.