Search Engine Optimization (SEO)
Search Engine Optimization, commonly known as SEO, is a set of practices and strategies aimed at improving the visibility and ranking of a website in search engine results pages (SERPs).
SEO is an ongoing process, as search engines continually update their algorithms, which means some of the content below may become outdated.
Search engines use crawlers to index websites. Crawlers are bots that inspect each page and follow links to navigate the website.
A few crawler-based tools we can use to check a website:
- siteliner (250 pages max)
- drlinkcheck (broken links)
- snyk.io (vulnerabilities)
Basic Practices
URL/robots.txt
➡️ Tell which files robots shouldn't access, and link your sitemap.
More at robots-txt.com. Test your robots.txt using Google Search Console. You can find examples here.
# allows everyone
User-agent: *
Disallow:
# Link to the sitemap (optional)
Sitemap: URL/sitemap.xml
Canonical URLs
It could be that two URLs lead to the same page, for instance https://example.com and https://example.com/index.php. You need to inform search engines by setting the same canonical URL for both.
header("Link: <https://example.com/index.php>; rel=\"canonical\"");
<link rel="canonical" href="https://example.com/index.php" />
Titles
- You must use exactly one "h1" heading
- The title's length should be between 55 and 65 characters
<title>MainTopic - title1, title2, title3</title>
<title>MainTopic - title1, title2 | Organization</title>
Descriptions
You should write a good and unique description for each page. Search results usually display around 110 characters of it on mobile, and around 130 on a computer.
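For reference, a minimal sketch of a meta description tag; the content text is a placeholder:
<!-- Unique per page, kept around 110-130 characters -->
<meta name="description" content="MainTopic - a short, unique summary of what this page covers." />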
URL/sitemap.xml
➡️ Ensure the search engine crawlers properly index your website. Provide additional information on resources such as videos/images... The format is an XML file; you can use one of these to generate it:
- https://www.xml-sitemaps.com/ (max. 500 pages)
- https://www.sitemapgenie.com/ (no limit, or it seems so)
- You can find some tools on GitHub
After creating a sitemap, you have to submit it to search engines, as explained here.
Example
<?xml version="1.0" encoding="UTF-8" ?>
<urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
https://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd">
<url>
<loc>https://blog.quentinra.dev/</loc>
<lastmod>2021-08-27T18:11:59+02:00</lastmod>
<priority>1.00</priority>
</url>
</urlset>
Structured Data
I'm unsure if this helps with ranking, but it allows search engines, if they want to, to display rich results. For instance, when we search for a name, we often see the Wikipedia page on the right.
You can look for websites that generate structured data, such as this one, to learn the syntax. Note that "/" may need to be escaped as "\/", as in the example below.
Example
<script type="application/ld+json">
{
"@context": "https:\/\/schema.org",
"@type": "Article",
"mainEntityOfPage": {
"@type": "WebPage",
"@id": "https:\/\/example.com\/"
},
"headline": "...",
"description": "...",
"author": {
"@type": "Organization",
"name": "XXX",
"url": "https:\/\/example.com\/"
},
"publisher": {
"@type": "Organization",
"name": "XXX",
"logo": {
"@type": "ImageObject",
"url": "https:\/\/example.com\/assets\/icon64.png",
"width": "32",
"height": "32"
}
}
}
</script>
Performance Optimization
Content delivery network (CDN)
A CDN caches resources on servers close to the user, reducing the loading time of static resources (CSS, images, videos, audio...). A usage sketch follows the list below.
- jsDelivr (free, works with GitHub)
- cdnjs (free, but not decentralized)
- gitcdn (same as jsDelivr, but slower)
- unpkg.com
- keycdn.com
- raw.githack.com
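A minimal sketch of loading assets through jsDelivr; the package, version, repository, and file names are placeholder assumptions:
<!-- Load an npm package from the jsDelivr CDN (package and version are examples) -->
<script src="https://cdn.jsdelivr.net/npm/lodash@4.17.21/lodash.min.js"></script>
<!-- Load a file from a GitHub repository through jsDelivr (user/repo/path are examples) -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/user/repo@main/dist/style.min.css" />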
Images
Clients must download every image, which slows down page loading.
- Compress images
- Use .webp instead of .jpg/.png (a sketch follows this list)
- See imagekit
- Cache images (see article)
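A minimal sketch of serving a compressed .webp image with a .jpg fallback and lazy loading; file names and dimensions are placeholders:
<!-- Browsers that support WebP use the <source>; others fall back to the <img> -->
<picture>
  <source srcset="/assets/photo.webp" type="image/webp" />
  <img src="/assets/photo.jpg" alt="A short description of the image" loading="lazy" width="640" height="480" />
</picture>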
Tools
- pagespeed (test page loading time)
To-do
Stuff that I found, but haven't read or used yet.
- biq.cloud (see your website ranking for a keyword, account required)
- woorank
- webpagetest
- website-checker
- sitechecker
- protege
- similarweb