As is already known, the authority of a web page depends on the number of pages that link to it, the authority of those linking pages, the degree of thematic similarity between the linking site and the linked site, and other factors related to the content surrounding the link.

How Google interprets a link

When the Google robot crawls a website, it pays attention to the links it finds, follows each one to the page it points to, and analyzes that page as well. In this way, the more links a page receives, the more likely it is to rank first.
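As an illustration of that crawling process, here is a minimal breadth-first crawler sketch in Python using only the standard library: it fetches a page, extracts the links it finds, and follows each one in turn. The seed URL and page limit are arbitrary choices for the example, not anything Google documents.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(seed, max_pages=10):
    """Breadth-first crawl: fetch a page, extract its links, then
    queue each newly discovered page to be fetched in turn."""
    seen, queue, fetched = {seed}, deque([seed]), 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except OSError:
            continue  # unreachable pages are simply skipped
        fetched += 1
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen

# crawl("https://example.com") returns every URL discovered.
```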

It is important to know that numerous factors determine the value of a link; among the hypotheses about what can influence it are the following:

Relevance: a link from a page that deals with a topic similar to one's own provides more value than one from a page about a completely different topic. Thus, if the most reputable page in the field of fashion links to a fashion blog, that link will be far more valuable than one from a generalist newspaper, even if the newspaper has more authority.

Trust: Google selects certain websites of maximum trust (governments, universities, institutions, etc.) as the starting points from which its robots begin exploring. These pages carry a maximum trust value. Google follows the links on these pages and discovers new websites; the greater the number of intermediate websites between the most trusted sites and your own, the less trust Google will assign to it (a sketch of this idea follows the list).

Authority: a website to which hundreds of others link passes more value through a link than one to which only ten link, assuming the average quality of each link is the same in both cases. Therefore, a link from a page with authority is much more important than one from a page without it.
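To make the trust idea concrete, here is a minimal sketch in Python that models trust as breadth-first distance from a set of seed sites, decaying at each hop. The seed list, the decay factor of 0.5 and the toy link graph are invented for the illustration; Google's real trust computation is not public.

```python
from collections import deque

def trust_scores(graph, seeds, decay=0.5):
    """Assign each site a trust value that shrinks by `decay`
    with every hop away from the nearest trusted seed site."""
    trust = {site: 1.0 for site in seeds}  # seeds get maximum trust
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for neighbour in graph.get(site, []):
            propagated = trust[site] * decay
            if propagated > trust.get(neighbour, 0.0):
                trust[neighbour] = propagated
                queue.append(neighbour)
    return trust

# Toy link graph: each site lists the sites it links to.
graph = {
    "university.edu": ["journal.com"],
    "journal.com": ["blog.com"],
    "blog.com": ["shop.com"],
}
print(trust_scores(graph, seeds=["university.edu"]))
# {'university.edu': 1.0, 'journal.com': 0.5,
#  'blog.com': 0.25, 'shop.com': 0.125}
```

Each intermediate site between the seed and your own halves the trust that arrives, which matches the intuition in the list above.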


Links are not usually born from forced attempts to obtain them; the pages with the largest link profiles do not usually acquire them actively, but rather users simply link to them.

Link generation is considered passive when a page obtains links without any deliberate effort to get them. The main reasons why a website gets linked are the following:

It is relevant: this happens with very large, well-known brands in certain niches.

It is an authority: if a site is a reference within its industry, it receives links from people who cite it to complement their own content.

It provides value: users place links in articles, news pieces or portals to add value or complementary information. If a site offers something that complements other people's content or contributes new information, such as news, users will link to it.

Partners and collaborators: when several people collaborate, they tend to link to each other because this can improve the SEO of both domains. However, this technique should not be forced, since link exchange is penalized by Google.



Social bookmarks or Bookmarking

Social bookmarks, or bookmarking services, are social media that make it possible to store and share links, as well as other files. They are an interesting tool for link-building strategies and an ideal place to share information.

Some of the most important social bookmarking services are:

- Google Bookmarks.

- Digg.com.

- Diigo.

- Reddit.com.


Link exchange

Link exchange consists of two web pages exchanging links. It is a way for sites to promote each other, taking advantage of the importance Google gives to links. Link exchange is also known as reciprocal linking; it was one of the most common SEO practices a few years ago, although nowadays its use is more limited because Google penalizes this type of practice.

Google's goal is to provide the best results in its search engine. One of the parameters Google weighs most heavily when ranking content is the number of links it receives, as well as their quality. Google assumes that if someone links to a page it is because the page is interesting. In a link exchange, however, users link to you not because the content is interesting but because they have promised to return the favor, which goes against Google's quality guidelines.
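Because reciprocal linking has such a simple formal signature, two pages that each link to the other, it is easy to detect in a link graph. The sketch below illustrates the idea on a toy graph in Python; it says nothing about how Google actually detects exchanges, which is not public.

```python
def reciprocal_pairs(graph):
    """Return every pair of sites (a, b) that link to each other."""
    pairs = set()
    for a, targets in graph.items():
        for b in targets:
            if a in graph.get(b, []) and a < b:  # a < b avoids duplicates
                pairs.add((a, b))
    return pairs

# Toy graph: site-a and site-b exchange links; site-c only receives one.
graph = {
    "site-a.com": ["site-b.com", "site-c.com"],
    "site-b.com": ["site-a.com"],
}
print(reciprocal_pairs(graph))  # {('site-a.com', 'site-b.com')}
```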



These days, Google has become very tough on link exchange, and although the practice has grown more sophisticated in various ways, Google actively pursues it with each new algorithm update.

Although inbound links are generally good, in many cases there are pages we do not want linking to us because of their reputation or inappropriate content. In these cases little can be done beyond sending a request to the owner of the page, so it is worth analyzing whether we really want to receive this type of link or not. Google Search Console also offers a disavow tool for telling Google to ignore specific inbound links.
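When the owner does not respond, the disavow file accepted by Google Search Console is the practical recourse: a plain text list in which each line is either a full URL or a `domain:` rule, with `#` for comments. The sketch below writes one in Python; the domains and URL shown are hypothetical examples.

```python
# Hypothetical bad links we no longer want counted against the site.
bad_domains = ["spammy-directory.example", "link-farm.example"]
bad_urls = ["https://forum.example/thread/123"]

with open("disavow.txt", "w") as f:
    f.write("# Links disavowed after a manual review\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")  # disavows every link from the domain
    for url in bad_urls:
        f.write(f"{url}\n")            # disavows one specific URL
```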

The opposite can also happen: we may ask a website to link to us. When making such a request, it is necessary to take into account:

Relevance: the more relevant the linking page, the more weight the link will carry.

Theme: if the link comes from a website thematically related to ours, it will be more important. Remember that Google is increasingly thought to understand the semantic meaning of websites.

Link age: the age of a link adds relevance. The longer the linking page and the link itself have been published, the more important the link is.

Country of the website: a page located in our own country will carry more weight than a foreign one. Secondly, a website in our language will carry more weight than one that is not. For example, for a Spanish site, a link from a Spanish-American website will be worth more than one from Finland, since the language is the same.

Regarding the characteristics of the link itself, we have to take into account:

Naturalness: essential for a quality link. If we repeat the same anchor text several times, it signals that we are doing it on purpose and can be interpreted as an attempt at black hat SEO techniques.

Link density: everything depends on how many links the linking page contains. If the source page has 20 outbound links, its authority is divided by 20 across each of those links (see the sketch after this list).

Depth: refers to the level at which the source link sits. A link on the home page, a first-level page or a sidebar is more important than one on a page that is not directly reachable from the navigation menu.

Location on the page: in general, robots read pages from the top, so a link placed near the top will be more important than one placed in the footer.
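The division described above is the core of the classic PageRank model: a page splits the authority it passes on evenly among its outbound links, after a damping factor. Below is a minimal sketch with made-up authority values; the damping factor of 0.85 is the one from the original PageRank paper, and real scores are not public.

```python
def equity_per_link(page_authority, outbound_links, damping=0.85):
    """Authority passed through each link: the page's authority is
    damped and then divided evenly among its outbound links."""
    return page_authority * damping / outbound_links

# A page with authority 1.0 and 20 outbound links passes far less
# through each link than a page with the same authority and 2 links.
print(equity_per_link(1.0, 20))  # 0.0425
print(equity_per_link(1.0, 2))   # 0.425
```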

As we saw previously, there are several types of URL (domain, subdomain, directory, etc.), which can sit on free or paid hosting services. Links received from root domains hosted on private hosting carry greater weight than the rest.




Black Hat SEO

Anyone who hires a company or person to do Black Hat SEO should be aware of the risks involved, even though the usual aim is quick results.

These risks tend to matter less when the survival of the project is not important, that is, when there is no need to monetize a brand over the long term.

Among the risks involved in using Black Hat SEO, the following stand out:

- Penalty that lowers the site's ranking (it can be temporary or permanent).
- Penalty that removes the site from the rankings entirely (it can likewise be temporary or permanent).

Typically, when repeat offenses occur, the following penalty options are considered:

- Cancellation of the account.
- Removal of advertisements on our own pages.
- Removal of advertisements on pages that are not our own.

This positioning strategy violates the rules that search engines establish for ranking a site and relies on risky, commonly prohibited techniques.

Some examples of penalizable links are:

Artificial links to the website: Google has detected a pattern of artificial, deceptive or manipulative links directed to the website. Purchasing links or participating in link schemes to manipulate PageRank is a violation of Google's webmaster guidelines. As a result, Google has carried out a manual anti-spam action on the website.

Artificial links from the website: Google has detected a pattern of artificial, deceptive or manipulative links leaving the website. Purchasing links or participating in link schemes to manipulate PageRank is a violation of Google's webmaster guidelines. Google has applied a manual anti-spam action to the affected parts of the website. Actions that affect the entire site appear under site-wide matches; actions that affect only part of the website or some inbound links appear under partial matches.




How to measure the quality of a link?


Some years ago it was possible to check Google PageRank to learn the authority of the page hosting a link. This is no longer possible, since Google has stopped publishing PageRank updates, although it continues to calculate and update the metric internally. There are several alternatives to PageRank:

- MozRank: runs from 0 to 10 on a logarithmic scale (what that implies is sketched after this list). Whereas Google updated PageRank only every few months or once a year, Moz updates MozRank every few days. It depends only on the number of links received and the PageRank of the pages that link to you.

- Domain Authority: a value between 0 and 100 on a logarithmic scale that estimates how likely a given domain is to rank high in Google. It is like the previous metric, except that it reflects the strength of the domain as a whole, not just one particular page of it.

- Page Authority: a value between 0 and 100 on a logarithmic scale that estimates how likely a given page is to appear at the top of Google. It depends on many factors, and new ones are regularly added so that the value predicts a page's Google ranking ever more accurately.
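A practical consequence of the logarithmic scales above is that each additional point costs multiplicatively more links. The sketch below illustrates this with an assumed base of 10; the real Moz formulas are proprietary, so the function and numbers are purely illustrative.

```python
import math

def illustrative_score(link_count, base=10):
    """Map a raw link count to a 0-10 score that grows logarithmically:
    each extra score point requires roughly `base` times more links."""
    return min(10.0, math.log(1 + link_count, base))

for links in (10, 100, 1_000, 10_000):
    print(links, round(illustrative_score(links), 2))
# 10 -> 1.04, 100 -> 2.0, 1000 -> 3.0, 10000 -> 4.0
```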

Site Map

A sitemap is a tool that tells search engines whether the individual pages on your site are new or have been updated.

A sitemap can also provide three additional content items for each page (an example generating them is sketched after this list):

- The priority of a URL relative to the other pages on the site.

- The last time a URL was updated.

- How often a URL is updated.
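As a sketch of how those three items look in practice, the following Python snippet builds a standard sitemap using only the standard library. The URLs, dates and values are invented for the example; the tag names and the namespace come from the sitemaps.org protocol.

```python
import xml.etree.ElementTree as ET

# Hypothetical pages: (URL, last modification date, change frequency, priority)
pages = [
    ("https://example.com/", "2024-01-01", "daily", "1.0"),
    ("https://example.com/blog/", "2024-01-01", "weekly", "0.8"),
]

urlset = ET.Element("urlset",
                    xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc                # page address
    ET.SubElement(url, "lastmod").text = lastmod        # last update
    ET.SubElement(url, "changefreq").text = changefreq  # update frequency
    ET.SubElement(url, "priority").text = priority      # relative priority

ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8", xml_declaration=True)
```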

Sitemaps help ensure that all new and hard-to-find pages are indexed quickly and comprehensively, and they allow particularly hard-to-find websites to be crawled efficiently by search engines.

A sitemap is a list of the pages on a website that search engines and users can access; it is also used to plan a website's design so that its pages are organized hierarchically. This can favor positioning because it makes the pages easier to find.

It is essential to have a sitemap, especially for a very large website, so that both search engines and users can reach all the content on the site.

Sitemaps can be provided to search engines as an XML file listing all the URLs on the site, which makes the information easy to process.
