Sat Nov 28 2020·3 min read
We'll cover some of the very basics of SEO. No black magic tricks that will rank you on the first page, no promises, just some of the things that might matter when our sites are being ranked.
Let's start with the foundation. Always update the document title - it serves as a short description of the page. When our site pops up in a Google search, the title of the document is rendered in a large font, serving as a link to our page.
<title>Your Document title</title>
Everything has to be connected via links; no search bot is going to attempt to guess the URIs on your website. If we have a /blog endpoint available but there's no link to it, it will not get indexed.
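A plain anchor somewhere on the site is enough to make the endpoint discoverable (the /blog path and link text are just an illustration):
<a href="/blog">Read our blog</a>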
Search bots don't use machine learning to guess what our image represents (at the time of writing). For that reason we should always set the alt attribute on our images:
<img src="/path/to/image.png" alt="white cat" />
If we have content on our web page that is only accessible after the user fills in a form, chances are it won't get indexed. The same goes for content that is only accessible to registered users, so we have to be mindful when restricting content.
Try to have a domain that represents what your business does - what people would search for in a search engine. For example, if you sell bikes, bikes.com would be a good pick. However, most good domain names are already taken, so try to at least include the information in the URL path - example.com/bikes.
The meta description tag provides a concise summary of our web page. It should be one to two sentences long, and it appears underneath the blue clickable link on a search engine results page.
<meta name="description" content="This is meta description text. This will often show up in search results." />
When our pages get parsed by robots, some tags are perceived as more important than others. Heading tags get ranked depending on their level, so always put the few most important headings in the higher-level tags (h1, then h2, and so on).
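A minimal sketch of such a hierarchy, reusing the bike shop example (the heading text is a placeholder):
<h1>Bikes for sale</h1>
<h2>Mountain bikes</h2>
<h3>Hardtail models</h3>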
How often our keywords appear in the body content also matters. That's why, a few years ago, so many people spammed words in hidden HTML tags, hoping they would get picked up by bots.
However, those days are long gone and search bots are not that easy to trick nowadays. For that reason the best thing we can do is to have more related content at a single endpoint. What I mean by that is: if you have, for example, a long blog post, don't split it into multiple parts; publish it as is and you will have more related keywords at a single endpoint.
Another thing worth setting up is the robots.txt file. Search bots read robots.txt to select the endpoints they should target, so you can disallow a crawler from some of your pages if you don't want them to get indexed.
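A minimal robots.txt placed at the root of the site - the /admin path here is just a hypothetical example of something we might want to keep out of the index:
User-agent: *
Disallow: /admin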
Finally, there's the sitemap. The sitemap lists the URLs for a site, and you can write it yourself or generate it using a script.
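A minimal sitemap.xml with a single entry, where example.com stands in for your own domain:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog</loc>
  </url>
</urlset>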