The most important reason why SEO is necessary is that it makes your website more useful for both users and search engines. Search engines cannot yet see a web page the way a human does, so SEO helps them understand what each page is about and whether or not it is useful for users.
Let’s look at an example to make this clearer:
Imagine an e-commerce site dedicated to selling children’s books. For the term “coloring pictures” there are about 673,000 searches per month. Assuming that the first result in a Google search gets 22% of the clicks (CTR = 22%), that would be about 148,000 visits per month.
Now, how much are those 148,000 visits worth? If the average cost per click for that term is €0.20, we are talking about more than €29,000/month. And that is only in Spain. If the business targets several countries, consider that 1.4 billion searches are made worldwide every hour. Of those searches, 70% of clicks go to organic results, and 75% of users never reach the second page. Taken together, that means a great many clicks per month for the first result.
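The arithmetic behind these figures is simple enough to check directly. The sketch below just reproduces the estimate from the text; the search volume, CTR, and cost per click are the article’s illustrative assumptions, not measured data.

```python
# Rough traffic-value estimate using the article's illustrative figures.
monthly_searches = 673_000   # searches/month for "coloring pictures" (assumed)
first_result_ctr = 0.22      # assumed CTR of the #1 organic result
avg_cpc_eur = 0.20           # assumed average cost per click, in euros

monthly_visits = monthly_searches * first_result_ctr
monthly_value_eur = monthly_visits * avg_cpc_eur

print(f"Estimated visits/month: {monthly_visits:,.0f}")   # ~148,060
print(f"Estimated value/month: €{monthly_value_eur:,.0f}")  # ~€29,612
```

Rounding down gives the "148,000 visits" and "more than €29,000/month" figures quoted above.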
SEO is the best way for users to find you through searches for which your website is relevant. These users are already looking for what you offer; the best way to reach them is through a search engine.
1. How do search engines work?
How a search engine works can be summarized in two steps: crawling and indexing.
A search engine crawls the web using programs called bots. These move through pages by following links, which is why a good link structure is so important. Just as any user would when browsing, they pass from one link to the next and send data about those pages back to the search engine’s servers.
The crawling process begins with a list of web addresses from previous crawls, plus sitemaps provided by websites. Once the bots access these sites, they look for links to other pages to visit. Bots are especially drawn to new sites and to changes in existing ones.
It is the bots themselves that decide which pages to visit, how often, and for how long they will crawl each site; that is why fast loading times and up-to-date content are important.
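The link-following behavior described above can be sketched as a simple breadth-first traversal over a link graph. This is only a toy illustration, not how real search-engine bots work: the hard-coded link graph stands in for actually fetching pages and extracting their links.

```python
from collections import deque

# Toy link graph: page -> pages it links to (stands in for fetched HTML).
LINK_GRAPH = {
    "home": ["books", "contact"],
    "books": ["coloring-books", "storybooks"],
    "coloring-books": ["home"],  # links can loop back to already-seen pages
    "storybooks": [],
    "contact": [],
}

def crawl(seed: str) -> list[str]:
    """Visit every page reachable from the seed, following each link once."""
    visited: set[str] = set()
    queue = deque([seed])
    order = []
    while queue:
        page = queue.popleft()
        if page in visited:
            continue  # don't crawl the same page twice
        visited.add(page)
        order.append(page)
        queue.extend(LINK_GRAPH.get(page, []))
    return order

print(crawl("home"))
# → ['home', 'books', 'contact', 'coloring-books', 'storybooks']
```

Notice that a page with no incoming links would never be discovered this way, which is exactly why link structure (and sitemaps) matter for crawling.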
It is very common that on a website you need to keep some pages or certain content from being crawled so that they do not appear in search results. For this, you can tell search engine bots not to crawl certain pages through the “robots.txt” file.
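As an illustration, a minimal robots.txt placed at the root of the site might look like this; the domain and paths here are hypothetical examples, not recommendations for any particular site:

```
# Applies to all bots
User-agent: *
# Don't crawl the shopping cart or internal search results
Disallow: /cart/
Disallow: /search
# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow only asks bots not to crawl those paths; it is a convention that well-behaved crawlers follow, not an access control mechanism.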
Once a bot has crawled a website and collected the necessary information, the pages are included in an index, where they are ordered according to their content, their authority, and their relevance. This way, when we submit a query, it is much easier for the search engine to show us the results most related to it.
At first, search engines ranked pages by the number of times a word was repeated: given a search, they looked the terms up in their index to find which pages contained them, and ranked highest the page that repeated them most often. Today they are far more sophisticated and base their indexes on hundreds of different factors, such as the date of publication and whether the page contains images, videos, animations, or microformats. They now give much more priority to the quality of the content.
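The early, frequency-based approach described above can be sketched with a tiny inverted index. This is a toy for illustration only, assuming a made-up three-page collection; it is not any real engine’s implementation.

```python
from collections import Counter

# Toy document collection: page -> its text.
PAGES = {
    "page-a": "coloring pictures for kids coloring books",
    "page-b": "storybooks and picture books for children",
    "page-c": "coloring pictures coloring pictures coloring pictures",
}

# Build an inverted index: term -> Counter of {page: term frequency}.
index: dict[str, Counter] = {}
for page, text in PAGES.items():
    for term, freq in Counter(text.split()).items():
        index.setdefault(term, Counter())[page] = freq

def search(term: str) -> list[tuple[str, int]]:
    """Rank pages containing the term by raw frequency, highest first."""
    return index.get(term, Counter()).most_common()

print(search("coloring"))
# → [('page-c', 3), ('page-a', 2)]
```

The flaw is obvious from the output: page-c “wins” simply by repeating the term, which is exactly the keyword stuffing that pushed search engines toward the hundreds of quality-based factors mentioned above.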
Once the pages are crawled and indexed, it is the algorithm’s turn: algorithms are the computer processes that decide which pages appear earlier or later in the search results. After a search is made, the algorithms check the indexes to determine the most relevant pages, taking hundreds of ranking factors into account. And all of this happens in a matter of milliseconds.