Optimizing a website's search engine ranking (SEO) is work that is both essential and laborious. There is no escaping it: it is the sine qua non for your website to be visible in search engines (especially Google). To boost their SEO quickly, some people do not hesitate to use unsolicited techniques, some of them outright prohibited by Google. These bad practices are called “Black Hat SEO”, as opposed to the so-called “White Hat” SEO practices. Black Hat SEO can be effective and profitable in the short term, but the risk is great that Google will spot you and sanction you sooner or later. In the medium and long term, Black Hat SEO is therefore counterproductive. Sanctions range from losing positions in the SERPs to outright blacklisting (your site no longer appears in Google at all). To guard against these risks, it is important to know these deprecated and prohibited techniques. Here are 8 of the most common Black Hat SEO techniques that can destroy your Google ranking.
Cloaking

Cloaking is a technique that presents search engines with content different from what users see. On any website, there are two types of visitors: Internet users (humans) and search engine indexing robots. These robots (Google Bots) “visit” your pages in order to index them in the search engine. The general idea of cloaking is to present robots with an overoptimized page (stuffed with keywords, in particular) and visitors with a “normal”, classic, readable page. Cloaking also makes it possible to display different pages from one user to another, depending on their location or device (desktop or mobile). Even though the URL of the page is the same, robots and Internet users access two different pages. Technically speaking, cloaking relies on the HTTP User-Agent header to identify the type of visitor. This technique, like all the others we will see, is of course forbidden.
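To make the mechanism concrete, here is a minimal, purely illustrative sketch of how a cloaking server decides which page to serve based on the User-Agent header. The function names and the list of crawler signatures are assumptions for illustration; this is shown only to explain the technique, not as something to deploy.

```python
# Illustrative sketch of the cloaking mechanism: the server inspects the
# HTTP User-Agent header and serves different content to crawlers and humans.
# (Shown only to explain the technique -- doing this violates Google's rules.)

# A few User-Agent substrings commonly associated with search-engine crawlers.
CRAWLER_SIGNATURES = ("Googlebot", "Bingbot", "DuckDuckBot")

def is_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like an indexing robot."""
    return any(sig.lower() in user_agent.lower() for sig in CRAWLER_SIGNATURES)

def serve_page(user_agent: str) -> str:
    """Pick which version of the page to return (the heart of cloaking)."""
    if is_crawler(user_agent):
        return "keyword-stuffed page for robots"
    return "normal, readable page for humans"
```

Note that this is also why cloaking is risky: Google can crawl with a browser-like User-Agent and compare the two responses.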
To learn more about the subject, you can watch the video by Matt Cutts in which he explains how cloaking works.
The Satellite Pages
A satellite page is a page designed solely to improve the visibility of a website. It is a page overoptimized for a particular keyword, for a specific query. It is of no interest to visitors but can generate traffic from search engines. Generally, a redirection technique is used to prevent visitors from landing on the satellite page. In this way, the page that appears in the SERPs (result pages) is the satellite page, while the landing page is the page intended for “human” visitors. A landing page can thus increase its traffic by means of several satellite pages redirecting towards it. Cloaking and satellite pages are two very close techniques.
Sometimes we also talk about satellite websites. The principle is the same: you create several SEO-overoptimized websites whose sole function is to generate traffic. On these satellite sites, you place links to your main site. The satellite sites, uninteresting for Internet users, thus boost the traffic of the main website.
Hidden Text

Hidden text refers to all content that does not appear to users but is visible to search engine indexing robots. Most hidden texts are ultra-optimized for SEO, with content that is very poor or even unreadable. The extreme case is a plain list of keywords. Example: “find New York lawyer, find San Diego lawyer, etc.”.
There are several techniques for producing hidden content:
- Put the text to hide in the same color as the background of the site (white text on a white background, for example). The content thus literally blends into the background. This is the simplest and oldest technique. The development of CSS and JavaScript has opened the way to other, more complex techniques.
- Place text behind an image, thanks to CSS.
- Place text off-screen, thanks to CSS.
- Set the text in font size 0, another fairly basic solution.
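The tricks above all leave traces in the page's styles, which is what makes them detectable. As a rough sketch (the pattern list is a simplified assumption; real crawlers also evaluate computed styles, not just inline CSS), a detector could flag inline styles that match the techniques listed:

```python
import re

# Simplified sketch of a hidden-text detector: flag inline CSS that matches
# the hiding tricks listed above (same-color text on background, text pushed
# far off screen, zero font size). Real detection is far more sophisticated.
HIDING_PATTERNS = [
    r"color:\s*#?fff(fff)?\b.*background(-color)?:\s*#?fff(fff)?\b",  # white on white
    r"text-indent:\s*-\d{3,}px",   # text pushed far off the screen
    r"font-size:\s*0",             # zero-size (invisible) text
]

def looks_hidden(style: str) -> bool:
    """Return True if an inline CSS style string matches a hiding pattern."""
    return any(re.search(p, style, re.IGNORECASE) for p in HIDING_PATTERNS)
```

The point of the sketch is that each hiding technique corresponds to a mechanical signature, which is why this family of tricks is easy for search engines to spot.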
Create Duplicate Content
Duplicate content is the practice of copying and pasting content from page A to page B. The duplication can be partial (if you copy and paste a paragraph, for example) or total. It can be external (pages A and B belong to two different sites) or internal (two pages of the same site). In the first case, it is clearly plagiarism, theft of content. In the second, it can be a way to save time.
Duplicate content is not permitted by Google. One could even say it is heavily punished. It is a practice to avoid at all costs, especially since duplicate content is very easy to detect compared to other techniques. Obviously, content theft can, in addition to degrading your SEO, lead to legal action by the victims of plagiarism.
Of course, it is impossible to avoid duplicate content entirely, when you quote a source for example. The idea to remember is that duplicate content must represent a very small part of your content (less than 10%) to avoid the risk of sanctions. Technically, it is also possible to generate duplicate content without meaning to. For example, if a page of your site can be accessed via two different URLs, that creates duplicate content: for Google, each URL must correspond to distinct content. It may be helpful in some cases to hire an SEO specialist to make sure these kinds of technical issues do not exist, and to resolve them if necessary.
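Duplicate content is easy to detect because text overlap can be measured mechanically. A minimal sketch of one common approach, comparing the sets of overlapping word sequences (shingles) of two texts with a Jaccard similarity score, might look like this (function names and the shingle size are illustrative assumptions):

```python
def shingles(text: str, n: int = 3) -> set:
    """Break a text into its set of overlapping n-word sequences (shingles)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets: 0.0 (no overlap) to 1.0."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A score near 1.0 means the two pages are near-duplicates; a score near 0.0 means they share essentially no phrasing. This is why even partial copy-pasting is straightforward for a search engine to flag.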
Make sure to define a unique Title tag and a unique Meta Description tag for each page of your site. Duplicate content does not only concern the body text, but all the HTML tags.
Keyword Stuffing

SEO means keywords. To rank for a given query in Google, your site's content must use the keywords that correspond to that query. A keyword, for example “find a lawyer in New York”, must be positioned at all levels of a page so that the page has a chance of ranking well in Google's results for that query. At all levels means: in the title, in the meta description, in the H1, in the introduction, in one or more H2s (subheadings). Based on this observation (the key role of keywords), some sites tend to overdo it and stuff their content with keywords. This is the definition of keyword stuffing. A typical example: displaying the keyword 5 or 6 times in the meta description, or more than 50 times in a 500-word text. The worst method is to line up keywords separated by commas. These practices degrade the quality of the content and therefore the user experience, but they can also be spotted by Google, at least since the Penguin update of the Mountain View company's algorithm. Many sites have already been penalized because of this bad SEO practice.
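The ratio mentioned above (50 occurrences in a 500-word text) is simply a keyword density of 10%, and density is trivial to compute, which is one reason stuffing is so easy to flag. A minimal sketch (the function name is illustrative, and any threshold for what counts as “stuffed” is an assumption that varies by engine and era):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the text's words taken up by occurrences of the keyword.

    Handles multi-word keywords by matching them as word sequences.
    """
    words = text.lower().split()
    kw_words = keyword.lower().split()
    if not words or not kw_words:
        return 0.0
    hits = sum(
        1 for i in range(len(words) - len(kw_words) + 1)
        if words[i:i + len(kw_words)] == kw_words
    )
    return hits * len(kw_words) / len(words)
```

A page consisting of nothing but the repeated keyword scores 1.0; a natural sentence mentioning it once scores a few percent at most.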
Black Hat SEO practices can be combined with one another. Here, concretely, we see that the great advantage of hidden text is that it allows a site to do keyword stuffing without degrading the reader's experience.
Content Farms

Content farms have understood that content is one of the main levers of SEO, to the point of abusing it. A content farm, or “content factory”, is an editorial site that publishes hundreds or even thousands of low-quality articles for the sole purpose of generating traffic through the SERPs and the advertising revenue that goes with it. A content farm will tend to target every topic that can generate traffic. Content farms have a demand-driven approach (what topics are people searching for?) rather than a supply-driven one. Some have implemented algorithms to detect hot topics at any moment in order to be as responsive as possible. Obviously, only topics that can generate a large audience are covered. Other farms have set up automatic content generators, thanks to which it is possible to produce articles without duplicate content in a minimum of time.
The editorial quality of articles created by content farms is generally very poor. To limit production costs, content farms use armies of trainees, volunteer editors or offshore freelancers. Google, again, decided several years ago to clean up the SERPs and demote content farms. Some have even been blacklisted. Understandably so: content farms degrade the quality and relevance of Google's search results, and it is the Californian giant's main argument, the relevance of the SERPs, that is at stake here.
The Exchange of Links
Link building is a key element of SEO, as we already explained in our article on the Fundamentals of Link Building. To create backlinks, the exchange of links is a widely used practice. The idea is simple: site A places a link to site B, and in exchange site B places a link to site A. The links can be placed inside articles, but also in the footer, in a sidebar, etc. It is win-win, especially if both sites have similar themes. Except that, for Google, a link is only relevant if it is natural, that is, created solely because of the relevance of the linked page and not under an agreement. The practice of exchanging links is now sanctioned. We do not advise you to abuse this technique. If you still want to do it, avoid 1/ industrializing the practice and 2/ placing the links obtained through exchanges in the static parts of your site (the footer, the sidebars, etc.). Instead, put these links in editorial content (blog articles, for example).
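One reason industrialized link exchanges are risky is that reciprocal linking is a mechanical pattern: given a map of who links to whom, mutually-linked pairs of sites fall out of a simple graph check. A hypothetical sketch (the data shape and function name are illustrative assumptions, not a description of Google's actual system):

```python
def reciprocal_pairs(links: dict) -> set:
    """Given {site: set of sites it links to}, return the pairs (a, b)
    where a links to b AND b links to a -- the reciprocal-link pattern."""
    pairs = set()
    for site, targets in links.items():
        for target in targets:
            if site in links.get(target, set()):
                pairs.add(tuple(sorted((site, target))))
    return pairs
```

At the scale of a web crawl, this kind of pattern is exactly what makes systematic link exchanges stand out compared to naturally earned links.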