Search Engine Optimization
Search engine optimization (SEO), also called organic or natural referencing, is a set of techniques designed to optimize the visibility of a web page in search engine results pages (SERPs). These techniques seek to improve how indexing bots understand the content of one or all of a site's pages, and to increase the site's organic traffic, which generally improves its visibility and, for a company, its revenue.
The goal is to improve a webpage's position in search results pages. A site's ranking is considered good when it appears on the first page of search results, among the first ten organic answers to a search for keywords that match its topic, as distinct from sponsored links and paid advertisements.
2 SEO Challenges
3 SEO techniques
3.1 Technical Optimization
3.1.1 Content Optimization
3.1.2 Form Optimization
3.1.2.1 Meta tags
3.1.2.2 Link Analysis
3.1.2.3 Speed Optimization
3.2 Strategic Optimization (Marketing)
3.2.1 Keyword Selection
3.3.1 White Hat and Black Hat Techniques 
3.3.2 Black Hat Techniques
3.3.3 Local References
3.4 Position and Index Monitoring
3.5 Towards SXO
4 Fighting Abusive SEO
5 Website Promotion Criteria
6 SEO as a Profession
7 SEO competition
8 Notes and References
9 See also
9.1 Related Articles
9.2 External links
Search engine optimization began with the advent of the first directories and search engines in 1994. SEO first focused on engines such as AltaVista and directories such as Yahoo!, before concentrating on Google and its PageRank algorithm and becoming more professional.
Search engine optimization represents a challenge at several levels:
Make a site visible in a sustainable way;
Achieve good rankings;
Capture qualified search engine traffic;
Build a brand image for Internet users.
Appearing on a search engine's first page guarantees a significant volume of traffic: about two-thirds of users click on one of the first-page results, and almost none of them look beyond the third.
Because of this enormous potential traffic, webmasters want to appear at the top of the list, even if it means cheating. This has led to the use of spamdexing techniques designed to deceive engine bots and distort results; the engines have responded by changing their ranking algorithms to penalize such sites. A search engine's purpose is to answer user queries as relevantly as possible, so ranking algorithms are constantly improved to thwart fraudsters and keep returning the most useful results to web users.
The main optimization advice Google gives webmasters is to design a site that is pleasant and useful for users, rather than built solely for crawlers [3]. The best way to get a well-ranked website is to offer real added value to the user.
Search engine optimization is a difficult area to quantify in terms of results, because the algorithms change constantly. Many SEO specialists try to probe how search engines work using reverse-engineering techniques. These experts have identified 50 to 100 relevance factors (out of more than 300 in total, according to SEO experts' estimates) and attempt to correlate them with placement on results pages. However, these correlations do not always reflect true causality, and the number of factors, their weights, their interactions and their evolution make the calculations too complex for an accurate understanding of how the algorithms work. A few levers nonetheless make a difference in SEO:
Website optimization levers

Internal levers:
Implementing a custom tree structure;
Internal linking of pages;
Content considered “fresh”.

External levers:
Number of inbound links;
Quality of inbound links;
Social media signals.
Organic SEO is a set of techniques designed to make webpages appear in a search engine's non-commercial results. The information contained in webpages largely determines the quality of their organic ranking.
Optimization activity falls into three main categories: technical optimization, strategic optimization, and reputation-building.
Technical optimization brings together the internal levers and relates to changes made directly on the site. These are usually the first steps taken in an SEO effort, because they are direct changes. Technical optimization makes the site easier for search engines to index: it simplifies the bots' task and sends them clear signals that allow optimal indexing. It concerns substance (content) as much as form.
As far as web content is concerned, the trend is toward the semantic search promoted by Google, since this search mode suits the way web users phrase questions, especially voice searches made from a phone.
Form optimization
Meta tags can play several roles: specifying the page title and providing a summary of its content, and informing bots of certain site details such as the language used, the document type, the character encoding, the page author, and so on. However, they play no role in improving ranking, at least not on Google [5]. They can also automatically redirect browsers to another page, or forbid or allow indexing of the page by engines. A meta tag (which follows HTML syntax) looks like this; it is placed between the <head> and </head> tags at the beginning of the page:
<meta name="robots" content="index, follow">
This meta tag asks robots to follow the links on the page and to index it. However, this directive is the default behavior, which makes the tag unnecessary. The tag is used far more often as a counter-directive, telling bots not to index a page or follow its links (for example when the webmaster cannot guarantee the reliability of outbound links, when links are paid, or when they concern user-submitted links) [6]:
<meta name="robots" content="noindex, nofollow">
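As an illustration, here is a minimal sketch of how a crawler might read this directive, using Python's standard-library html.parser; the class name and sample markup are invented for the example:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]

html = '<head><meta name="robots" content="noindex, nofollow"></head>'
parser = RobotsMetaParser()
parser.feed(html)
print(parser.directives)  # ['noindex', 'nofollow']
```

A real crawler would of course also honor robots.txt and the X-Robots-Tag HTTP header, but the principle is the same: the directives are read before the page is indexed.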
Search engines read the content of links and the URLs of a site's pages as text. While it is true that a site's SEO quality begins with the quality and relevance of its content for web users, everything counts in its optimization, including the paths and URLs of each page.
Because the engines have access to the semantics of links, the keywords they contain carry significant weight. For example, if the domain domain.com hosts a page about mp3 files, the URL domain.com/mp3.html is preferable to domain.com/page1.html. Web crawlers analyze sites much as an Internet user does, so semantics matter and URLs should reflect the content of their pages. URLs should also be kept as short as possible.
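A common way to produce such keyword-bearing URL segments is "slugifying" the page title. Here is a minimal sketch in Python (the slugify helper is hypothetical, not a standard API):

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a short, keyword-bearing URL segment."""
    # Strip accents so the slug stays plain ASCII.
    ascii_title = (
        unicodedata.normalize("NFKD", title)
        .encode("ascii", "ignore")
        .decode()
    )
    # Lowercase, replace runs of non-alphanumerics with hyphens, trim edges.
    return re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")

print(slugify("Free MP3 Files & Downloads"))  # free-mp3-files-downloads
```

A page titled "Free MP3 Files & Downloads" would then live at domain.com/free-mp3-files-downloads.html rather than domain.com/page1.html.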
Google stated in 2010 [7] that site speed is a criterion taken into account in ranking desktop results. Since July 2018 this has also been the case for mobile [8][9].
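A first rough check of page speed can be done by simply timing a fetch; this is a sketch using only Python's standard library (the function name is invented, and real audits rely on dedicated tools such as Google's PageSpeed Insights rather than a single wall-clock measurement):

```python
import time
import urllib.request

def load_time(url: str, timeout: float = 10.0) -> float:
    """Rough wall-clock time to download a page body, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include the body transfer in the measurement
    return time.perf_counter() - start

# Example (requires network access):
# print(f"{load_time('https://example.com'):.2f}s")
```

Note that this measures only raw transfer time; it ignores rendering, scripts and images, which is why engines use richer metrics.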
Strategic Optimization (Marketing)
Strategic content optimization starts with identifying customer segments and understanding their online behavior. With these behaviors in mind, it becomes easier to create new content and pages targeted at specific web users, which will attract their attention and generate more traffic. Search engines favor unique, high-quality content that offers relevant information; they will therefore rank these sites better than sites without targeted content.
Before starting organic search optimization, the relevant keywords must be identified and selected. Different targeting strategies are possible: it can be worthwhile to target both generic queries (TV, home sales, cheap flights, etc.), which are competitive but bring significant traffic, and long-tail queries, which bring less traffic individually. A long-tail strategy therefore requires targeting a large volume of keywords.
It is pointless to position yourself on a keyword that is not directly related to the site's activity: the likelihood of ranking is low, because engines can detect the mismatch between the targeted keywords and the site's content. Keyword selection can rely on several tools (including Google AdWords or SEMrush), free or paid, and search engines' bid estimates are also a good indicator. The keywords should then be classified by several criteria:
Traffic volume potential;
Estimated conversion rate;
The competitive nature of the keyword.
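The three criteria above can be combined into a simple priority score. The following sketch is purely illustrative: the keyword figures and the scoring formula are invented, not taken from any real tool:

```python
# Hypothetical keyword data: (keyword, monthly searches, est. conversion rate, competition 0-1)
keywords = [
    ("cheap flights",                  120000, 0.010, 0.95),
    ("cheap flights to lisbon in may",    900, 0.045, 0.30),
    ("flight comparison tool",           8000, 0.020, 0.60),
]

def score(volume: int, conversion: float, competition: float) -> float:
    """Naive priority score: expected conversions discounted by competition."""
    return volume * conversion * (1 - competition)

# Rank keywords from most to least attractive under this model.
ranked = sorted(keywords, key=lambda k: score(*k[1:]), reverse=True)
for kw, *_ in ranked:
    print(kw)
```

Under these invented numbers, the moderately competitive "flight comparison tool" edges out the huge but fiercely contested generic query, and the long-tail query comes last individually, which is why long-tail strategies target many such keywords at once.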
For optimal content, it is important to target the right keywords, those that reflect web users' queries, and then to adjust the editorial line around this lexicon so that the keywords appear in the URLs as well as in the content.
Reputation is an external lever that should not be neglected. For users to trust a website, it must have a reputation they have heard of; the more familiar people are with the site and its reputation, the more likely they are to buy its products.
White Hat and Black Hat Techniques [10]
SEO is practiced in various ways, commonly qualified as “white hat” and “black hat”. White hat SEO rests on creating quality content that is relevant to users. Black hat SEO uses every available means, even at the risk of seeing the site removed from the search engine's index.
Some white hat techniques include optimizing pages with relevant keywords and “link building”, i.e. creating quality content on blogs or in various directories to obtain backlinks. On the white hat side, Google's published SEO starter guide is a good source of information. Some black hat techniques include buying links in bulk or hiding text, either by giving it the same color as the background or by making it transparent. However, Google no longer takes these “hidden” links into account, and even penalizes them if they have not been removed.
Webmasters cannot, however, be categorized so neatly. While most of them follow “white hat” rules for the bulk of their work, it is common to see some “black hat” methods used as a complement. “Grey hats”, positioned between white and black, optimize SEO effectiveness with moderately risky techniques. The difficulty of this approach lies in limiting “black hat” processes just enough to stay indexed by the search engines, while extracting maximum benefit from them. Unlike the “black hat”, the “grey hat” thus minimizes risk-taking while achieving better SEO performance.
Black hat techniques
Here are some techniques considered black hat:
Hidden texts or links;
Discrepancy between page content and description;
Duplicating a website or page (duplicate content).
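The first technique in the list above, text hidden by matching the background color, is simple enough to sketch a naive detector for. This is an illustrative toy, not how engines actually work; real detection also considers opacity, font size, off-screen positioning and computed CSS:

```python
def is_hidden_text(text_color: str, background_color: str) -> bool:
    """Flag the classic black-hat trick of text colored like its background."""
    # Naive check: identical color values (case-insensitive) mean invisible text.
    return text_color.strip().lower() == background_color.strip().lower()

print(is_hidden_text("#FFFFFF", "#ffffff"))  # True: white text on white
print(is_hidden_text("#000000", "#ffffff"))  # False: normal black on white
```

Even this crude rule shows why the trick is easy for engines to penalize automatically.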
Registering a business on Google My Business lets Google know that it operates at the specified physical location, giving it a better chance of appearing in search results and on Google Maps [14].
Position and index monitoring
Monitoring the performance and ranking of their keywords is crucial to measuring the results of SEO strategies.
Index monitoring checks that the engine still knows the page concerned. It is now barely useful to submit sites to engines, and pointless to resubmit them once they are already indexed.
Position monitoring tracks ranking for the keywords of interest, in order to guide further optimization.
Listing in thematic and general directories often needs to be maintained to preserve the popularity acquired, or even to improve it by increasing the number and quality of inbound links to the site.
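Position monitoring boils down to recording a keyword's rank over time and watching the trend. A minimal sketch, with invented sample data (the keyword, dates and positions are hypothetical):

```python
from datetime import date

# Hypothetical position log: {keyword: [(check date, rank in results), ...]}
history = {
    "mp3 files": [
        (date(2018, 6, 1), 14),
        (date(2018, 7, 1), 9),
        (date(2018, 8, 1), 6),
    ],
}

def trend(keyword: str) -> int:
    """Positive value = the page climbed the results between first and last check."""
    positions = [rank for _, rank in history[keyword]]
    return positions[0] - positions[-1]

print(trend("mp3 files"))  # 8: moved up eight places since the first check
```

In practice, such logs come from rank-tracking tools or the search engines' own consoles rather than from manual checks.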
SXO (Search eXperience Optimization) improves the quality of content to meet user expectations through semantic networking. It responds to changes in users' search habits, in particular the rising use of voice search. SXO is thus based on user experience (UX) to improve a site's visibility.
Fighting abusive SEO
Main article: spamdexing.
SEO is a profession evolving in a complex environment. Its constantly changing techniques incorporate considerations from marketing, community relations, trend monitoring, and computing and technical skills. The commercial goals that SEOs pursue can, however, run counter to the financial interests of search engines, in particular Google's.
Google has built an arsenal of techniques to penalize optimizations that do not respect its quality guidelines (link selling, hidden text, etc.); these are known as “penalties” [15]. Penalties can be of two types:
Algorithmic penalty: the penalty follows an update of the algorithm and is applied automatically;
Manual penalty: the penalty is triggered by a manual action from a member of Google's webspam team, usually after a black hat practice is detected.
Two recent Google updates specifically target abusive SEO [16]:
Panda update: penalizes sites with little added value or too many advertisements;
Penguin update: fights abusive external links.
Website Promotion Criteria
More than 200 parameters are used in the ranking algorithms. According to a survey of 140 SEO specialists, some criteria help propel a page into the first organic results, while others trigger heavy penalties from Google [17].
Main criteria that positively affect location:
The relevance of the linking pages;
The anchor text of backlinks;
The relevance of the page;
The page’s confidence index;
Domain name authority;
The PageRank of the page;
The quality of the links to this page.
SEO as a profession
According to the results of the International Observatory of Internet Professions, search engine optimization appears to be one of the most sought-after skills. While this seems obvious for a webmaster or web designer, it may at first glance seem less of a priority for other profiles, such as a database specialist; yet the rigor with which the latter organizes the database can yield significant gains in site optimization. Indeed, according to some studies, a highlighted and proven SEO skill is a real asset during a job search [18].
The SEO profession, still young, does not yet have a standardized curriculum, so much so that the position is reached today via multiple training paths and careers. Three types of profile coexist: the “technical” SEO (most often from a computing or engineering background), the “editorial” SEO (no dedicated curriculum, but requiring a feel for writing) and the “marketing” SEO (most often from business, communications or marketing studies).
A study by SEO CAMP found that in 2016, 70% of SEOs were men. 32% of SEOs were 26-30 years old, 53% had less than five years' experience (a sign that the profession is young) and 39% worked in Île-de-France. 70% were on fixed-term contracts, with an average gross annual salary of €35,877 [19].
SEO contests allow practitioners to compare their results. The goal of these contests is to reach first place on Google for a given keyword. These keywords are usually invented carefully so as not to interfere with real keywords used in other contexts. Examples of keywords from previous competitions: pig food, greedy wizard, chococo, tiger osmosis, cobrow-oout, zortoj, golomlit, sentimancho, black hat, organic movement, indiborg, digmasbord, etc.
These contests are a good way for SEOs to sharpen their skills and measure themselves against others. They also reveal differences between the expected behavior of a search engine for a given word and its actual behavior. In some contests, the organizers kept the chosen keyword secret during the competition, to avoid interference during tests and analyses.
Rémi Bachelet, “SEO course” [Archive], École Centrale de Lille, February 17, 2011, under a Creative Commons license.
Sylvain Nox, Website Souvenir and Announces: Beginner Level, 26 p.
“Online Chat with Webmaster Help Group Guides” [Archive], Google, March 28, 2008, Transcript [Archive].
“What are META tags for?” [Archive], at alsacreations.com, November 15, 2012 (accessed July 6, 2018).
Meta Tags Guide [Archive], WebRankInfo
“Use the rel =” nofollow “feature for specific links – Search Console Help” [Archive], at support.google.com (accessed July 5, 2018).
“Using the Site Speed in Internet Search Ranking,” Google’s Official Webmaster Blog, April 9, 2010 (read online [Archive], accessed February 6, 2018).
“Page Load Speed as a Criterion for Mobile Search Location,” Google’s Official Webmaster Blog, January 26, 2018 (read online [Archive], accessed February 6, 2018).
“Speed Test on Google Site Google’s recommendation is to be less than three seconds from downloading a full webpage” [Archive]
“Natural Reference – A Complete Practical Guide to Website Promotion” [Archive], in CommentCaMarche (accessed July 6, 2018).
“Google Getting Started Guide – Search Engine Optimization” [Archive], Google, 2011.
“[Guide] How to Know and Remove Bad Backlinks,” Creapulse, October 4, 2016 (read online [Archive], accessed July 5, 2018).
“BMW awarded Google ‘death penalty’,” BBC News, 2006 (read online [Archive]).
(he) LISA BARONE, “Google openly publishes SEOs as criminals” [Archive], Outspoken Media Marketing Agency, September 6, 2009.
“SEO: 13 Google Algorithms and Their Updates to Know” [Archive], at HubSpot, June 12, 2018 (accessed July 6, 2018).
“The Most Important Criteria According to 140 SEOs” [Archive], in JournalDuNet, June 21, 2017 (accessed July 6, 2018).
“International Observatory for Internet Professions” [Archive].
“SEOs in France in 2016” [Archive], SEO CAMP study, relayed by the JDN.
See also
Related articles:
Social Media Optimization (SMO)