Topics: Google algorithm, traffic, Black Hat SEO, backlinking, link building, Google penalties, White Hat SEO


In web development work, search engine rankings are a factor that is often neglected. This material aims to lay the foundations of web positioning (the correct use of keywords), analyze the keys to establishing good web positioning, and identify how to achieve it.

In addition, we will identify the keys to getting the most out of the Internet, using it as a marketing channel and maintaining an online strategy.

We will start by analyzing the main concepts, the foundation on which any positioning plan rests.

Web positioning, or search engine optimization, is the process that helps us improve the visibility of a page or website within the main search engines' results.

To achieve greater visibility, positioning work makes changes to the structure and content of a web page. The technicians who carry out these tasks are called search engine optimizers (SEOs).

The better the position of a page, the more visible it will be. This is achieved by using appropriate keywords: words related to our business or that provide added value.

Positioning, therefore, helps a site or web page become better known and more visited by the people it targets, such as potential clients.

We must bear in mind that there are two types of search engines:

Directories: in this type of search engine, the pages are ordered by categories, that is, by themes.

Search engines: in this case, what is indexed is the content of each URL, not the theme of the page. The pages of a site are indexed in the search engine, and when a user performs a search by entering a word found on any of the pages that make up the site, that page is shown as a result. Our website can also appear in a search engine simply because our URL is included in a third party's page and the search engine shows it as a result.

But why optimize a website? Because it is a basic strategy of any online marketing plan. The positioning technique must be applied whenever there is a need to attract traffic to the web.

How important is positioning?


There are a large number of people in the world who use the computer, mobile or tablet to search for specific information on the web.

Search engine marketing refers to advertising actions carried out with the search engine as the advertising medium: Google, Bing, Yahoo! and others. The Internet, therefore, is used as a quick way to search for information; globalization and technology have taken over much of the information-finding role once played by libraries.

As previously said, search engines are programs located within a website or web page which, when keywords are entered, operate on the search engine's own database, collecting all the pages that contain information related to what is being searched for.

Currently, for any company, one of the key factors for its projection is to analyze the positioning of its brand in the market and the needs of potential clients.

So much so that even if a web page has a quality design, it will be of no use if the page cannot be found by Internet users. Thus, the most important part of a site is its positioning.


In general, Internet users assume that the pages that appear in the first positions of the ranking are the best in their sector. This is one of the reasons why it is important for our website to be among the top results: appearing there also benefits the company's image.

This becomes even more evident in light of several studies indicating that most users look for information about products and/or services through Internet search engines, visiting mainly the results that appear on the first page.

Being among the first positions will increase a website's traffic considerably. This means our site gets more visits from potential customers, consumers, etc. Thus, depending on the type of business and the characteristics of potential consumers, a concrete plan must be established to direct traffic until we achieve the objectives we have set for our website.

Thus, when Internet users search for a product or service, they do so through search engines. It is known that visitors coming from search engines convert into buyers at twice the rate of visitors from other sources, and around 4% of search engine traffic turns into loyal customers. Companies should therefore set good positioning as a main objective in order to improve within their sector of commerce.


Goals

Know the functions of web positioning.
Know how search engines work.
Acquire skills to create an SEO campaign.
Differentiate the key performance indicators of SEM.


Search engines and directories

The first thing we have to take into account is whether Google knows that a page or website exists, since if we take actions to position ourselves but the page in question is not indexed, the work done is useless.

A quick trick to find out whether a site is indexed, and how many of its pages are, is to enter the following text in the Google search engine:

site:sitename.domain

Replace "sitename" with the name of the site in question and "domain" with ".es", ".com" or any other extension to see whether we are present in this search engine.
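To run this kind of check automatically, or to spot what keeps a page out of the index in the first place, a small script can help. Below is a minimal sketch in Python (standard library only) that fetches a page and looks for a "noindex" directive in its robots meta tag or X-Robots-Tag header, two common reasons a page never gets indexed. The URL is a placeholder, and a real audit would also check robots.txt and canonical tags.

    # Minimal sketch: check a page for on-page signals that block indexing.
    from html.parser import HTMLParser
    from urllib.request import Request, urlopen

    class RobotsMetaParser(HTMLParser):
        """Collects the content of any <meta name="robots"> tag."""
        def __init__(self):
            super().__init__()
            self.directives = []

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "meta" and a.get("name", "").lower() == "robots":
                self.directives.append(a.get("content", "").lower())

    def indexing_signals(url):
        req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urlopen(req, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
            header = resp.headers.get("X-Robots-Tag", "") or ""
        parser = RobotsMetaParser()
        parser.feed(html)
        blocked = any("noindex" in d for d in parser.directives + [header.lower()])
        return parser.directives, header, blocked

    # "https://example.com/" is a placeholder; use your own URL.
    metas, header, blocked = indexing_signals("https://example.com/")
    print("meta robots:", metas, "| X-Robots-Tag:", header)
    print("blocked from indexing:", blocked)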

Search engines use robots, also known as crawlers, trackers, bots or spiders, which are responsible for crawling all the content on our website in order to analyze, extract and add that information to the search engines. In this way, search engines manage to index our website and display it in search results.
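To make the crawling idea concrete, here is a minimal spider sketch in Python (standard library only). It follows links breadth-first within a single domain while keeping a set of visited URLs, which is the essence of what a search engine robot does; a real crawler would also respect robots.txt, crawl delays and many other rules. The starting URL is a placeholder.

    # Minimal crawler sketch: discovers pages by following links,
    # breadth-first, within one domain.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        def __init__(self, base):
            super().__init__()
            self.base, self.links = base, []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(urljoin(self.base, href))

    def crawl(start, max_pages=20):
        domain = urlparse(start).netloc
        seen, queue = {start}, deque([start])
        while queue and len(seen) <= max_pages:
            url = queue.popleft()
            try:
                html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
            except Exception:
                continue  # unreachable pages are simply skipped
            print("crawled:", url)
            extractor = LinkExtractor(url)
            extractor.feed(html)
            for link in extractor.links:
                # Stay on the same domain, the usual scope of a site crawl.
                if urlparse(link).netloc == domain and link not in seen:
                    seen.add(link)
                    queue.append(link)

    crawl("https://example.com/")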

When users carry out searches on the Internet and enter keywords, search engines such as Google return the results to the user, ranking them hierarchically according to their relevance to the query.

To achieve good positioning it is important to know the factors that the different search engines take into account. But first of all, we must know what matters to search engine robots. Robots or crawlers traverse the web to create indexes, following the links that connect one web page to another. Therefore, it is essential that a web page can be linked from other pages so that crawlers can discover it.


When a user enters keywords in the search engine, the robot does not crawl each page looking for those keywords; instead, it looks the words up in the indexes created from previous crawling, analysis and indexing. Therefore, pages that are not indexed in the search engines will not appear in the search results, and the user will not be able to access their contents.

A search engine is a computer program designed to search different digital file formats, such as web pages, text documents, video or music files, images and so on, whether on a hard drive, a local server or the Internet, offering the search results as a group of links that lead to the files or the requested information.

More generally, the term search engine is assigned to the web pages in charge of locating links related to the information that has been added with the use of phrases or keywords.

Search engines store data about a large number of pages using special programs that crawl content on the web. These programs are known as robots, spiders, crawlers or intelligent agents.

These programs search and collect information, locating different pages for their content or following their hyperlinks. The information is stored in an index.


Nobody really knows exactly how Google's algorithm works except its creators. The most widespread and supported theory is that of PageRank, which numerically assigns relevance and importance to the web pages that appear in the results list.

PageRank ranges from 0 to 10 and is a score assigned to a web page indexed by Google; it establishes which page should appear before another for certain searches made by the user. It is based on the number of inbound links a website receives in relation to a keyword.

However, PageRank is not the only factor that Google takes into account since it would be insufficient to offer the content that the user is really looking for. For each search made by users, there are millions of web pages with useful information for them. The objective of Google is to offer the user the most relevant information for what they are looking for and in the fastest way.
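The published PageRank formula can still be illustrated with a toy example. The sketch below implements the classic power-iteration calculation with the usual damping factor of 0.85 over a hypothetical four-page site. Google's real scores (and the old 0-10 toolbar scale) are not public, so this is a conceptual illustration only.

    # Toy PageRank sketch using the published formula with damping d = 0.85:
    # PR(p) = (1 - d) / N + d * sum(PR(q) / outdegree(q) for q linking to p)
    def pagerank(links, d=0.85, iterations=50):
        pages = list(links)
        n = len(pages)
        pr = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            pr = {
                p: (1 - d) / n + d * sum(
                    pr[q] / len(links[q]) for q in pages if links[q] and p in links[q]
                )
                for p in pages
            }
        return pr

    # Hypothetical four-page site: each key lists the pages it links to.
    graph = {
        "home":  ["about", "blog"],
        "about": ["home"],
        "blog":  ["home", "post"],
        "post":  ["home", "blog"],
    }
    for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")

As expected, the home page, which receives a link from every other page, ends up with the highest score.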

The Google algorithm is the search engine's way of positioning the millions of web pages that may be related to certain search terms in higher and lower positions, to give the most exact answer to what the user is looking for at that moment.

There are many clues that help Google guess what you are really looking for through its search engine. These clues include the terms on the website, your region, the freshness of the content, and PageRank. As mentioned previously, the factors that affect the Google algorithm change constantly and very quickly: the algorithm is updated about 500 times a year, which is equivalent to one change every seventeen and a half hours.

Sometimes the changes produced in the algorithm are very small and sometimes, however, they are so profound that they change the overall operation of the algorithm. Therefore, SEO work is an arduous task that must be constant and lasts over time, as we must always be adapting to the changes produced in order to optimize the positioning of our website.



The Google Panda


Google Panda is one of the most important Google algorithm updates. This update was introduced on February 23, 2011 and is primarily focused on content and quality. Quality content is king for Google's algorithm.

Google's objective is to offer the user relevant, quality content and a positive search experience; Google Panda reinforces this intention and penalizes all low-quality content. Content can be considered low quality for different reasons:

It is very short.
It is not well written for the user and contains spelling mistakes.
It does not contain descriptive headlines about the topic at hand.
It contains duplicate content.
It does not include metadata or meta descriptions.
Its images do not include the ALT attribute.
It does not contain an appropriate number of keywords.

The objective of Google Panda is, therefore, to eliminate the results that were well positioned but did not contribute anything to the user.
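Several of the quality signals listed above can be checked mechanically. The following Python sketch audits a page for a few of them: very short content, a missing meta description, images without the ALT attribute and missing headlines. The thresholds and the file name are illustrative assumptions, not Google's actual rules.

    # Heuristic content audit sketch inspired by the list above.
    from html.parser import HTMLParser

    class ContentAudit(HTMLParser):
        def __init__(self):
            super().__init__()
            self.words = 0  # rough word count (includes all text nodes)
            self.has_meta_description = False
            self.images_missing_alt = 0
            self.headline_count = 0

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "meta" and a.get("name", "").lower() == "description":
                self.has_meta_description = True
            if tag == "img" and not a.get("alt"):
                self.images_missing_alt += 1
            if tag in ("h1", "h2", "h3"):
                self.headline_count += 1

        def handle_data(self, data):
            self.words += len(data.split())

    audit = ContentAudit()
    audit.feed(open("page.html", encoding="utf-8").read())  # hypothetical file
    if audit.words < 300:  # illustrative threshold
        print("warning: very short content")
    if not audit.has_meta_description:
        print("warning: no meta description")
    if audit.images_missing_alt:
        print(f"warning: {audit.images_missing_alt} image(s) without ALT")
    if audit.headline_count == 0:
        print("warning: no descriptive headlines")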

What should I keep in mind with Google Panda?

With Google Panda we must take into account the following recommendations that will directly affect the SEO of our website:

Do not copy the competition's content. Copying even a single product description will negatively affect our SEO, as we will be penalized.

Do not duplicate content. We often generate duplicate content, intentionally or not. If the same content appears at multiple URLs on our website, it may become grounds for a penalty. It will also harm our positioning, since Google will not be able to detect which of those pages is the most relevant.

When writing content, think about our users and do not obsess over Google. Thinking of our users will help us create the best content for them. This means avoiding spelling mistakes and the keyword stuffing technique, a Black Hat SEO technique consisting of excessively repeating keywords within the same text, which is frequently penalized by Google.

Enrich your content. Make the content as attractive as possible to your users. Include text, images, infographics, videos and more. This will make the content valuable to the user and, therefore, increase your traffic.

Define, establish and plan a strategy. We should not launch into content creation without a plan; publishing more does not mean better positioning. We must create a quality content strategy and offer users what they really need.


The Google Penguin


Google Penguin is an update released by Google on April 24, 2012. This update to Google's algorithm focuses mainly on spam and link quality. While Google Panda pursued the quality of content, Penguin pursues the quality of links and the elimination of spam.

Thanks to this update, the algorithm manages to detect low-quality links: links that are bought, that come from article networks or directories, and that do not occur naturally. The best way to keep our SEO from being penalized is to generate quality links passively, that is, to let our quality content attract those quality links.

As we can see, there is a close relationship between Google Panda and Google Penguin: if we manage to generate quality content, it will attract quality links, and our web positioning will benefit.

The Google Hummingbird


Unlike Google Panda or Google Penguin, Google Hummingbird is not an update but a completely new algorithm.

It was launched on August 20, 2013 and focuses mainly on semantic search and the knowledge graph, representing a global renewal of the Google algorithm. While Google Panda pursued the quality of content and Google Penguin the quality of links, Google Hummingbird works on:

 Semantic search: It consists of Google being able to identify what we are really looking for, even if the terms we use are ambiguous or insufficient. For example, if we search for "Granada", Google must know if we mean the Spanish or Nicaraguan city.
 Knowledge graph: Thanks to it, Google searches for connections between different concepts to provide more information to the user in their searches. This is reflected in the expanded information box or carousel at the top. In this way, the user has access to more complete information about those terms that he is searching for.


What is PageRank for?


Google uses this indicator to determine the importance or relevance of a website. To calculate it, it takes into account above all the link structure: the greater the number of inbound links to a given URL, the higher its PageRank. But there is something very important to keep in mind: the links pointing to a website must be of quality. If a quality website points to ours, Google understands that the quality website is endorsing the content it points to because that content is relevant. Therefore, Google takes into account the importance or relevance of the page that "votes" by generating a link to a specific website.

Our objective? To get quality, highly relevant websites to point to our page so that we benefit from their relevance, increase our PageRank and improve our positioning. In the same way, as our PageRank increases, we will transfer part of our relevance when we create links pointing to other pages, improving their PageRank.

It is important to understand that PageRank will function as one more variable that the Google algorithm will use to position the page, but it will not be the only variable to take into account for SEO.

TrustRank concept:


TrustRank is a system introduced by Google with the aim of measuring the trustworthiness of a given website and avoiding spam; that is, it helps detect sites with low credibility and lowers their positioning in the search results, preventing users from clicking on content considered spam. While PageRank focuses on the relevance of a web page, TrustRank focuses on trust.

In this way, Google creates two categories: trusted websites and websites considered spam. The former are characterized by receiving links from other trusted pages, and spam sites by receiving links from other spam sites. Therefore, it is vital that the links to our website come from trusted pages in order to increase our TrustRank (another of the variables the Google algorithm uses to position us in its search engine). On the contrary, if the links we receive come from pages Google considers spam, Google will consider our site spam as well and will penalize our positioning.

Ultimately, the quality of inbound links prevails over their quantity; ideally, you will have many quality inbound links. In this way, we will improve the authority of our website and, consequently, its positioning. Hence the importance of discarding link-buying strategies and betting on a link baiting strategy: earning quality links thanks to the relevant, quality content on our website.


Website. General considerations

A dynamic website can have frequent changes in its information. When the web server receives a request for a specific page, the page is automatically generated by software in response to that request.

This opens up a wide range of possibilities, among others showing the current state of a dialogue between users, monitoring a changing situation, or providing information personalized in some way to the requirements of the individual user.

There are many varieties of websites, each specializing in a particular type of content or use, which can be arbitrarily classified in many ways. Some of these classifications can be:

Archive site

It is characterized by storing information belonging to websites of the past, or about the website itself.

Weblog site

Also known as a blog, it is a website that allows the publication of stories, posts or articles.

Company site

A website where the company shows who it is and offers its services.

E-commerce site

A website that allows B2C, B2B or C2C sales.

Virtual community site

A website that allows interaction and communication between its users, who share common interests.

Database site

A website responsible for storing and retrieving data.

Development site

A website where web development and constant updating take place. The information focuses on design, the web and software.

Directory site

A site organized as links to other web pages, arranged by categories.

Download site

It allows uploading and downloading of documents, files, apps, music and so on.

Game site

Created for users to participate and establish playful relationships with other users.

Information site

A website intended to inform the user with up-to-date content.

Spam site

These sites have no relevance and are used to deceive search engines. They are harmful to users, who enter them by mistake, while administrators profit from the advertising inserted on the site.


Relevance of the results

Search engines are made up of several parts, which we will explain below:

1. Spider or Crawler


It is software whose function is to track changes on the Internet. Web pages are traversed through the different links found on the pages that make up each website.

The spider's function is to visit each and every link it finds, visiting each of the sites on the Internet, extracting the content and discarding the underlying HTML.

This content is then added and processed by the different search engines.

2. Index


The index is used to store and classify content so that it can be displayed more quickly. This content is found by the spider across the different websites. Search engines keep their index as up to date as possible, refining their crawling and indexing processes so as to maintain a current image of each website.
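The core idea can be sketched in a few lines of Python: an inverted index maps each word to the set of pages that contain it, so a query can be answered from stored data instead of re-crawling the web. Real indexes also store word positions, frequencies and much more.

    # Minimal inverted index sketch: word -> set of page URLs.
    from collections import defaultdict

    index = defaultdict(set)

    def add_page(url, text):
        for word in text.lower().split():
            index[word].add(url)

    def search(query):
        # Simple AND search: pages must contain every query word.
        words = query.lower().split()
        return set.intersection(*(index[w] for w in words)) if words else set()

    add_page("https://example.com/", "web positioning improves visibility")
    add_page("https://example.com/blog", "quality content improves positioning")
    print(search("positioning improves"))  # both pages match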

3. Relevance system


When any type of search is carried out, the search engine presents the results ordered by a concept called relevance. Relevance is calculated using a series of algorithms that take into account a multitude of factors, and these differ for each search engine.



Qualified traffic


In the previous point we saw the components of a search engine and the filters they use to sort content and show results to the user in a certain order. Now we must know which aspects of our website to take care of so that any page of our site appears above another in the results shown by any type of search engine.

In addition, there is a very important aspect to consider: qualified traffic. Are we sure that the visits we receive on our website are of high quality? What is considered quality traffic? Here are the keys that must be taken into account when analyzing the web traffic we receive:

Quantity vs Quality

receiving thousands of visits will not help us at all if those visits are not qualified. Visits must convert, that is, achieve the objective we have set (buy online, download a form, request information, share an article ...).

Natural traffic

this will come from the SEO work carried out and, as a general rule, it is usually more qualified, although it is a slower process and requires time and effort.

Paid traffic

although it is usually traffic directed to convert, it must be taken into account that it requires an investment, for example through Google Adwords campaigns. In addition, effort must be invested in creating optimized campaigns so that visitors convert and we do not lose money.

Email marketing campaigns

 thanks to these campaigns, communication can be very well segmented by objectives and the needs of each client, increasing the possibility of conversion. Of course, you must work on the planning and optimization of these campaigns to avoid ending up marked as spam. A clear example of this type of campaign can be remarketing.

Social traffic

this traffic will come from social networks such as Facebook, Instagram, Youtube, Twitter, LinkedIn ... it is highly qualified traffic, valued by the Google algorithm.

Creating a blog

the traffic that comes from a blog is highly qualified, since in many cases visitors will be subscribers interested in what the company offers.

Affiliate marketing

these inbound links to the website will come from other pages that share interests similar to ours, so there is high potential for users to end up converting and achieving our goal.

Directories

registering in different directories can help attract users who are searching for things related to what we offer.

Participation in forums

when the user seeks the opinion of other people about certain products or services, they usually go to forums. Being able to solve doubts on these platforms can be a great opportunity to attract highly interested traffic to our website.




Black Hat SEO


We define Black Hat SEO as the set of actions or attempts to improve the positioning of a website through techniques that go against the guidelines set by the Google algorithm and that are considered unethical.

Carrying out these strategies is not recommended at all for webmasters, since they put all the SEO effort made at risk, in addition to being almost certainly penalized by Google, damaging our positioning.

Google's goal is to offer the best possible experience to users, so if your goal is to improve your web positioning without thinking about the user, you are lost. The success of good SEO lies in building value for the user.

Important updates to the Google algorithm, such as Google Panda and Google Penguin, target precisely these practices. Their objective is that we create quality content for users and that the traffic we receive is earned naturally thanks to that content.

In this way, Google tries to pursue and penalize low-quality content, the purchase of artificial inbound links, and spam. Many of these aspects are directly related to some of the most used Black Hat SEO techniques:

Cloaking.
Invisible text.
Domain duplication.
Spam in forums.
Spam Keywords.

Cloaking

From the English "to cloak" (to hide), this refers to the technique of showing one web page to the robot that search engines use to index pages, and a different page to human visitors.

The most common method is inserting text containing the website's keywords under an image or photograph. The photograph literally covers the text: the visitor only sees the image, while the search engine robot does not see the image but does see the text underneath, counting a greater number of keywords and boosting the page's positioning.

It is a relatively easy technique to detect; it provides a quick but very short-lived benefit and usually ends with all of the site's pages being removed from the search engine's results.
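A basic cloaking check can be sketched by requesting the same URL twice, once with a normal browser User-Agent and once with a Googlebot-style one, and comparing the responses. A difference is only a hint, since dynamic pages also vary between requests; the URL below is a placeholder.

    # Sketch of a naive cloaking check: compare what a browser and a
    # crawler receive for the same URL.
    import hashlib
    from urllib.request import Request, urlopen

    AGENTS = {
        "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "crawler": "Googlebot/2.1 (+http://www.google.com/bot.html)",
    }

    def fingerprint(url, user_agent):
        req = Request(url, headers={"User-Agent": user_agent})
        return hashlib.sha256(urlopen(req, timeout=5).read()).hexdigest()

    url = "https://example.com/"
    hashes = {name: fingerprint(url, ua) for name, ua in AGENTS.items()}
    print("possible cloaking" if len(set(hashes.values())) > 1 else "same content served")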

Invisible text

Simpler than the previous technique, this one is based on writing text in the same color as the page background, with the aim of increasing keyword density.

It is a bit harder to detect, but it only influences the text content and cannot be applied to links, link juice, etc.

Domain duplication

Because web domains are cheap, some companies buy dozens of them related to their keywords and build different websites to take over the search engine results. Search engines can detect whether the domains have duplicate content, whether they are hosted on the same server, and even whether they are registered to the same company, so this technique is also relatively easy to detect.

SPAM in forums


To generate inbound links to a website, some services regularly visit dozens of forums and comment on the messages that appear there, adding their own address in the footer or in the web field.

Generally, the forum or blog uses the user name as the link text, so writing a comment under the user name "Furniture in Albacete" in a decoration forum, pointing to the address www.mueblespepito.com, transfers the importance of the decoration site to the Furniture Pepito website and indicates to search engines that the site deals with furniture in Albacete.

Since the technique is very simple, two systems have been adopted to reduce the volume of spam generated. The first is to add the rel="nofollow" attribute to comment links, so that they do not drain importance from the site that hosts them; the other is to add a captcha, a small program that asks the user a question an automated system cannot answer, so that computer programs cannot be used to send spam.

Use of forced links or spam keywords


Creating pages with dozens of links that contain keywords as the link text is another Black Hat SEO strategy, and it causes many problems for search engines, since it forces robots to visit all the generated links, with the consequent waste of time.

These pages saturate the content with links of the type:

Barcelona apartments
flats in Barcelona
houses in Barcelona
penthouses in Barcelona
garages in Barcelona
garages Barcelona
houses Barcelona

Hundreds of links are created with the keywords, producing unreadable pages that are penalized by search engines and easy to locate because they contain more than the 50-60 links recommended for a web page (although Google reads up to 100 links per page).



Backlinking

A backlink is a link that comes from a different page and points to our own website or blog. Two types of external links stand out:

Links "Dofolow" or "Follow"

A link that transmits authority to the page it points to, that is, it passes what is known in SEO jargon as "link juice". With these links, relationships are created with other web pages and it is easier for search engines to identify the subject matter of the content.

Links "Nofolow"

These do not pass authority, but they do transmit web traffic, and they are still positive for SEO since they make for a more natural set of backlinks.
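A crawler distinguishes the two types simply by reading the rel attribute of each link. The following Python sketch classifies a page's links as "follow" or "nofollow" on that basis.

    # Sketch: classify links as follow / nofollow via the rel attribute.
    from html.parser import HTMLParser

    class LinkTypeParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.follow, self.nofollow = [], []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            a = dict(attrs)
            href = a.get("href")
            if not href:
                return
            rel = (a.get("rel") or "").lower()
            (self.nofollow if "nofollow" in rel else self.follow).append(href)

    parser = LinkTypeParser()
    parser.feed('<a href="/a">x</a> <a rel="nofollow" href="/b">y</a>')
    print("follow:", parser.follow)      # ['/a']
    print("nofollow:", parser.nofollow)  # ['/b']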

Backlinking, therefore, consists of obtaining links that point to our website from other reference pages, which increases the site's credibility in the eyes of the Internet giant and gives it a greater presence in the search engine.

Doing backlinking correctly, and thus obtaining optimal results, requires time and effort. Before starting this technique, the following should be done:

Locate reference websites for the specific sector; this is why it is important to calculate each site's PageRank.
Content is the most important thing, so to position the page it is important to ensure there is relevant content with useful information.

Quality links should be created that refer to the pages in question. Some tips for this are the following:

Find the most important directories in the sector.
Build relationships with specialized media in the business niche and with bloggers.
If you have a blog, you can do interviews.
Leave comments, participate in active forums and groups.

Keyword stuffing

Keyword stuffing is one of the oldest Black Hat SEO techniques. It consists of repeating the relevant keyword excessively, either in the content or in the links, with the aim of improving the website's positioning for specific keywords. This increases keyword density and is penalized by search engines as spam.

In the short term, a post could be positioned, although in the long term the use of this technique would be uncovered since no value is added to that content.

Although this technique still exists, it is no longer used in such an exaggerated way. Keywords are stuffed into many different places, for example:

Meta title: use of repeated keywords in titles.
Content: if abused, inconsistencies in the text and meaningless keywords are generated.
Source code: keywords are placed in the part of the page not visible to the user, using font size zero or white text on a white background, so that the user does not perceive them but crawlers do.

Search engines in general, especially Google, recommend not exaggerating the use of keywords, since these practices are penalized and, therefore, cause a deterioration in the positioning of the website in search results. In addition, the user experience also worsens because it is more difficult to read the texts.

The aim is to create content that offers relevant information and added value to the user; keyword abuse has no positive effect on SEO, so it is preferable to produce higher-quality content.
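Keyword density, the metric that keyword stuffing inflates, is easy to compute. The sketch below counts how often a keyword appears relative to the total word count; the 3% warning threshold is an illustrative assumption, since search engines publish no exact figure.

    # Keyword density sketch: share of total words taken by one keyword.
    import re

    def keyword_density(text, keyword):
        words = re.findall(r"[a-záéíóúñ']+", text.lower())
        hits = sum(1 for w in words if w == keyword.lower())
        return hits / len(words) if words else 0.0

    text = ("Barcelona apartments, flats in Barcelona, houses in Barcelona, "
            "penthouses in Barcelona: Barcelona has it all.")
    density = keyword_density(text, "barcelona")
    print(f"density: {density:.1%}")
    if density > 0.03:  # illustrative threshold
        print("warning: possible keyword stuffing")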


Tiered link building


Tiered link building is another of the most widespread and powerful SEO positioning techniques. It is understood as a pyramid of links, that is, the construction of links by levels, diversifying backlinks into different strata in an attempt to enhance the flow of link juice.

Applying this technique amounts to developing a comprehensive marketing strategy, with actions carried out at every level. When starting out, it is best to build a pyramidal structure organized into three "tiers", that is, three levels.

Level 1

Links that will point directly to the website itself are placed here. These links typically use free blogs and blogging platforms, as well as newspaper and university blog platforms. The content created must be of high quality.

Level 2

Quality here is less relevant than quantity. Links will come from blogs, comments on blogs, forums, articles in directories, bookmarks and the like. These strategies are usually done this way because opting for high quality and length would take too much time.

Level 3

This level can be spammed, because only quantity counts. The links can be exactly the same as in tiers 1 and 2; quality is not important here.

Correct planning of anchor text (generic, contextual, domain, URL, keyword) must be taken into account. Special attention should also be paid to not always linking to the home page: vary the link structure and also direct links to articles.

The link pyramid does not have to be symmetrical; it is even advisable that it is not. You can create social bookmarks pointing to blog A of tier 1 and use blog B to create comments on websites. The links must be indexed, otherwise all the work done will be worthless.



Micro-niches

A micro-niche consists of creating a blog dedicated exclusively to a single keyword, that is, a word or short phrase that is searched for heavily in search engines.

The main objective of micro-niches is to find a profitable keyword, that is, a keyword for which AdSense advertisers pay a lot per click, with little competition and a minimum volume of searches.

It cannot be guaranteed with absolute certainty that a micro-niche will be profitable. There are different ways to monetize a micro-niche, but it should be clear that the only way to make money is by getting traffic.

Affiliate Marketing

It is a type of performance marketing in which a company or business rewards one or more affiliates for each visitor obtained thanks to the marketing actions carried out by the affiliate.

This structure has four main players: the merchant, the affiliate network (which contains the different affiliate offers), the affiliate and the customer. An affiliate management agency can also be involved.

The methodology followed in this technique is based on the main disciplines of online marketing, such as organic positioning or SEO, SEM, e-mail marketing ...

The key to creating an affiliate marketing program is to create a network of contacts with whom you can reach agreements and achieve goals.

Affiliates need text links and audiovisual content, such as banners, to which tracking codes are added to identify the origin of visits. Once the strategy is launched, elements such as web design, usability, content and forms have to be optimized. Affiliate marketing, therefore, requires a lot of involvement, since for the pieces to fit together, everyone must actively participate.

Duplicate content

One of the worst techniques is duplicating content at all costs, without any logic, in order to repeat our keywords over and over again. Even worse is copying existing content from other pages. All of this can lead to severe SEO penalties from Google.
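Near-duplicate content can be detected with simple text fingerprints. The sketch below compares two texts by the Jaccard overlap of their word 3-grams ("shingles"), a common duplicate-detection signal; the 0.5 threshold is an illustrative assumption.

    # Duplicate-content sketch: Jaccard similarity of word 3-grams.
    def shingles(text, k=3):
        words = text.lower().split()
        return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

    def similarity(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    original = "quality content attracts quality links to our website"
    copied = "quality content attracts quality links to our websites"
    score = similarity(original, copied)
    print(f"similarity: {score:.0%}")
    if score > 0.5:
        print("warning: likely duplicate content")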


Link building

It is a very important factor in the natural positioning of a website, since the more quality links point to a page, the more authority it has, positioning it better in the different search engines.

The main purpose of Link Building is to improve the positioning through the backlinks (inbound links) to the website itself, although it is also a good strategy to get more quality visits to the page.

Link building, therefore, is the artificial generation of links. The main key is to obtain links artificially while simulating that they arose naturally, for example by requesting the inclusion of the website in a directory. For this strategy it is essential not to forget the quality of the sites you interact with, and to participate only in those related to the theme of your website.

Among the techniques to take into account in link building, the following stand out:

Solve doubts in forums.
Share on social networks.
Comment on other blogs in the sector.
Write as Guest Blogger.
Share articles in news aggregators.
Send press releases to the media and blogs.
Create relationships with other bloggers in the sector.
Create an RSS feed.
Write testimonials and give opinion.
Create internal links on the site.
Register the brand in industry directories.
Conduct brand interviews.
Look for links that point to the competition.

Below is a list of link building mistakes that occur regularly but should be avoided:

Send all the links to the home page.
Always use the same anchor text.
Position only one keyword.
Make all the links the first month and then nothing.
Position the links in the footer.
Position the links on the blogroll.
Using too many links in the same article.
Thinking that quantity is better than quality.
Do not analyze the page from which a link is obtained.
Use blog links on other topics.
Not knowing what the competition is doing.
Use reciprocal links.
Use links on the same IP.
Get only "dofollow" links.


Link Baiting

Link baiting is any idea or system put into practice with the aim of being linked from various sites in order to improve the positioning and relevance of the web. The incoming links are obtained in a 100% natural way.

To get backlinks this way, you have to create content that is useful to readers and easy to share on social networks. This content must therefore be characterized fundamentally by its quality and usefulness, and it should be promoted through blog directories, specialized pages and the different social networks, so that users outside the page recommend it to others, whether acquaintances or professionals in the sector, thus earning links to the website.

Link baiting is therefore about creating content so attractive that every website wants to share it. Although it is the most highly regarded technique, it has quite a few drawbacks, the main one being that nobody can guarantee that the content created will actually be shared.

It requires certain skills to develop new, interesting content that can add value to readers who are interested in the website.

Among the actions that will help Link Baiting are using original videos or creating infographics that are visually attractive.



Google penalties and solutions


Whenever there is a noticeable loss of traffic on a website, an SEO's first thought is that it may be a Google penalty. But we must be aware that penalties are not the only thing that can affect the traffic a website receives: search trends can change, products or services can be seasonal, the competition can improve its position ...

Therefore, the first thing we must do to determine whether we have been penalized by Google is to analyze the previous possibilities and rule them out. If none of them explains the sudden loss of web traffic, then we can consider a Google penalty.

What can lead us to think that it is a penalty?

Our domain disappears from the search engine's index.
  Our domain does not appear when we search for it by name.
  Our organic traffic from Google is significantly reduced.
  Some pages disappear from the search results.

These are some of the symptoms indicating that we may have been penalized by Google. Such a penalty may be due to various actions we have carried out, such as:

Carrying out negative SEO actions.
Creating artificial links, buying and selling links.
Overloading pages with keywords in order to increase their relevance.
Copying or duplicating content.
Using hidden texts.
Showing different content to the search engine and to the user.

All these actions, among many others, can be penalized by Google, and consequently, our SEO can be negatively affected. The two types of sanctions that Google contemplates are:

Algorithmic penalty

When the algorithm detects that you violate Google's guidelines, it automatically applies a penalty to your website. No notification about the penalty is received, although it is easier for an SEO professional to resolve.

Manual penalty

Either Google detects an alarm signal or someone sends a spam report. Google then reviews the website and, if the report is confirmed, applies a manual sanction. In this case, a notification is received through the Search Console tool, but the penalty is much more complicated and costly for the SEO professional to recover from.

To know whether our website is indexed, tools are needed, such as:

The Google search engine.
Google Analytics.
Google Search Console.



Indexing difficulties can become a major obstacle to web positioning work. Because of this, it is very important to know how to anticipate the situation, so that we can act with firmness and determination as far in advance as possible.

If a problem of this type is suspected or detected, we must take actions that help solve it, such as:

Make use of robots file tools.
Send updates to the sitemap so that Google takes them into account in its crawling tasks (see the sketch after this list).
Perform navigation optimizations and fix broken links.
It is advisable to remove any type of irrelevant content.
Make use of SEO techniques in the images.
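As an illustration of the second point, the following Python sketch generates a minimal sitemap.xml from a list of URLs and last-modified dates (both placeholders); the resulting file is what you would then submit to Google, for example through Search Console.

    # Minimal sitemap generator sketch (URLs and dates are placeholders).
    from xml.etree.ElementTree import Element, SubElement, ElementTree

    def build_sitemap(urls, path="sitemap.xml"):
        urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for loc, lastmod in urls:
            url = SubElement(urlset, "url")
            SubElement(url, "loc").text = loc
            SubElement(url, "lastmod").text = lastmod
        ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

    build_sitemap([
        ("https://example.com/", "2024-01-15"),
        ("https://example.com/blog", "2024-01-10"),
    ])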

Anyone who hires a company or person that practices Black Hat SEO should be aware of the risks involved, even though the usual intention is to obtain quick results.

These risks tend to be ignored when no importance is attached to the project sinking, that is, when the aim is to monetize a brand quickly rather than sustain it over time.

Among the risks involved in the use of Black Hat SEO, the following stand out:

A penalty that lowers the site's positioning (it can be temporary or permanent).

A penalty in which the site is removed from the rankings entirely (this can also be temporary or permanent).

Typically, when repeat offenses occur, the following penalty options are considered:

Cancellation of the account.
Elimination of advertisements on own pages.
Elimination of advertisements on other pages that are not your own.

This positioning strategy is an attack against the rules established by search engines for the positioning of a site and uses somewhat dangerous techniques that are normally prohibited.

How to act if we are penalized by Google Panda?

If we have been penalized by Google Panda, the problem is clearly related to the content of our website, especially its quality. Bearing in mind what Google Panda considers low-quality content, we must take the following measures to solve the problem:

Eliminate duplicate content both on our website and outside it. Eliminate copied content and take action against those who steal your content (scrapers).
Reduce advertising.
Create and publish original high-quality content for the user.
When we include outbound links in our content, make sure that they do not point to low quality sites.
Do not target our website only to search engines (avoid over-optimization).

How to act if we are penalized by Google Penguin?

Penalties from Google Penguin are directly related to artificial inbound links to our website. Therefore, the following actions must be avoided:

Creating manual links excessively.
Using automatic link building programs.
Compulsively exchanging links.
Buying links.
Carrying out excessive guest blogging actions.

One basic way to detect these artificial links is to observe that almost all of them share the same link text, originate from low-quality sites, account for a high proportion of the link profile, or use exact keywords and over-optimized texts.



White Hat SEO


On the other side is so-called White Hat SEO: a set of techniques for positioning a website that comply with the guidelines stipulated by the different search engines and are considered ethically correct. Its purpose is to make a page more relevant and notable to search engines.

It refers, therefore, to the tactics, techniques and strategies that can be used to position a site better in search engines while always respecting their terms and conditions, that is, in a totally legitimate way and as set out in the guidelines.

It is any practice that improves a site's position on a search engine results page (the SERPs) without compromising the integrity of the website and while respecting the search engines' terms and conditions of service. These are tactics that stay within the limits defined by search engines, mainly Google.

1. Regarding the content

It should offer content primarily intended to resolve any doubts the user may have. Publishing original, useful content is the best way to earn links from other pages; this technique is known as link baiting.

2. Regarding the design

The design must be polished, adapted to any device, and must improve the user experience. Having a responsive website can be essential for positioning in mobile search results.

3. Regarding the code

It is essential to have well-structured HTML code, since the clearer the pages are to search engines, the better the results. Optimizing crawling will help increase the relevance of the web pages in search engines.
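One aspect of well-structured code, the heading hierarchy, can be checked automatically. The sketch below flags a missing or duplicated h1 and skipped heading levels; the sample HTML is a placeholder.

    # Heading structure checker sketch.
    from html.parser import HTMLParser

    class HeadingChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.levels = []

        def handle_starttag(self, tag, attrs):
            if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
                self.levels.append(int(tag[1]))

    checker = HeadingChecker()
    checker.feed("<h1>SEO</h1><h2>On-page</h2><h4>ALT text</h4>")
    h1s = checker.levels.count(1)
    if h1s != 1:
        print(f"warning: found {h1s} <h1> tags, expected exactly 1")
    for prev, cur in zip(checker.levels, checker.levels[1:]):
        if cur - prev > 1:
            print(f"warning: heading level jumps from h{prev} to h{cur}")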


Learn more:

The Long Tail, competition keywords, Google Analytics

