What is Technical SEO?
Technical SEO refers to optimizing a website from a technical point of view. Unlike off-page SEO or on-page SEO, most of the work in technical SEO deals with search engines, the website's technical issues, bots, code, and so on.
The main purpose of implementing technical SEO is to prepare a website that search engine robots can easily understand and review, one with a correct page structure where users can easily find their way around.
Why Technical SEO?
Technical SEO carries less weight in people's minds than it did a few years ago. Why? Because content management systems like WordPress now handle half (or maybe more than half) of technical SEO automatically, which makes webmasters think it is not really important. But if we go back to basics, technical SEO is one of the first steps every SEO expert should take when optimizing a website. To help you understand why, let me give you an example from my own experience:
Two years ago, a friend of mine decided to close his physical store and move his business online. After launching the site and taking the initial steps, he hired a writer and started publishing good, interesting content on his website. About 3 months later, he called me one day and asked me to check the website and find out why it was not showing up in Google's results.
Honestly, it did not take much research! After a 5-minute look, I told him that his website had been hidden from search engine bots for all those months. The cause was a single line of code among the site's files; handling exactly this kind of code is part of technical SEO. This short story is a good example of how this type of SEO can affect the success, or even failure, of your website.
Technical SEO training
So let's get to the main part of the article: how should we learn technical SEO, what does it consist of, and how can we implement it on our website?
1. Site speed
You may be interested to know that when Google decides whether to show a page of your website among the first results, one important factor it weighs is that page's speed compared to competitors' pages. But why speed? To be honest, Google pays a lot of attention to its users, and users do not like waiting for your pages to open, whether on mobile or desktop.
To put some numbers on it: each page of your website should open within 1 to 5 seconds at most. So, as the first step of technical SEO, start optimizing your website's speed.
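As a rough sketch (not a full performance audit, which tools like Google PageSpeed Insights handle better), you can time a page fetch yourself; the URL below is a placeholder:

```python
# Minimal sketch: helpers to time a page fetch and judge it against
# the 1-5 second guideline above. The URL below is a placeholder.
import time
import urllib.request

def classify_load_time(seconds: float, limit: float = 5.0) -> str:
    """Classify a measured load time against a simple threshold."""
    return "OK" if seconds <= limit else "too slow"

def measure_response_time(url: str) -> float:
    """Return seconds taken to fetch the page body once.
    Note: this measures server response time only; full-render
    tools give a more complete picture."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()
    return time.perf_counter() - start

# Example usage (requires network):
#   t = measure_response_time("https://example.com")
#   print(f"Loaded in {t:.2f}s -> {classify_load_time(t)}")
```

Keep in mind this only measures how fast the server responds, not how fast the page renders for a real visitor.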
2. Optimized structure for robots
Google and other search engines use a series of bots, known as crawlers or spiders, to view the pages of your website. The important thing is that these crawlers have limited ability to explore web pages on their own; basically, you have to help the bots see the site and its pages.
Crawlers mainly use links to find new pages. This means that if I link to the SEO training page from this article, then in addition to pointing the reader to that page, I have also created a way for bots to reach and index it. (Indexing means saving the page in Google's database so it can be displayed in the results.)
Given this example, your website should have a standard internal linking structure. In short, every page of your website should have at least one internal link pointing to it; otherwise, crawlers may crawl the site incompletely.
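As a small illustration (the page path here is a hypothetical example), an internal link is just an ordinary anchor pointing at another page of the same site; crawlers follow the `href` to discover it:

```html
<!-- An internal link inside an article body: crawlers follow the
     href to discover and index the linked page. -->
<p>
  To learn the basics, see our
  <a href="/seo-training/">SEO training guide</a>.
</p>
```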
3. robots.txt file
As far as I know, robots do not understand human language, so you cannot simply tell them: "Dear robot, please do not index this page!"
But you can speak to them in a language they do understand. The robots.txt text file, uploaded to the root of your web host, contains a series of directives that communicate with search engine bots. Why? Basically, this file gives you some control over how crawlers behave on your website. For example, you can use the robots file to ask them to stay away from pages such as your site's payment page.
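A minimal robots.txt sketch (the paths are hypothetical examples); it lives at the root of the site, e.g. `https://example.com/robots.txt`. Note that blocking crawling does not guarantee a page stays out of the index; a `noindex` meta tag is the reliable way to do that:

```
# robots.txt - directives for all crawlers
User-agent: *
# Ask crawlers to stay out of the payment pages
Disallow: /checkout/
Disallow: /payment/
```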
Remember the example I gave about my friend? The problem was that, by default, WordPress had placed a directive in the robots file so that bots would not crawl the unfinished website, and it was the website designer's job to remove it once the project was complete. Of course, removing or adding it is not difficult: in the WordPress admin, go to Settings, open the Reading page, and uncheck the option you see below:
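For reference, the "hidden website" situation in the story above usually comes down to a single blocking directive like this in robots.txt:

```
User-agent: *
Disallow: /
```

WordPress's "Discourage search engines from indexing this site" setting also adds an equivalent `noindex` robots meta tag to each page, which keeps the whole site out of the results until it is removed.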
4. 404 pages and broken links
If I decided to delete the SEO article you are reading, the next time you tried to open it you would land on a 404 page. In essence, 404 pages are pages whose content has been removed. Google's bots dislike these pages, and their presence on your site weakens your SEO. What is the solution? Use a 301 redirect: you can redirect bots and users who land on these deleted pages to related pages (often the site's home page).
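A minimal sketch of a 301 redirect, assuming an Apache server and hypothetical paths (the same thing can be done with a WordPress redirect plugin or nginx configuration):

```
# .htaccess - permanently redirect a deleted article to a related page
Redirect 301 /old-seo-article/ /seo-training/
```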
5. Avoid duplicate content
Duplicate content is a problem that online stores face particularly often. Take a store that sells 5 versions of the LG K8 mobile phone, differing only in colour: all 5 pages have essentially the same content, product description, and details. Google does not overlook duplicate content on your website, and it will lower your SEO.
But what is the solution? How can a store list similar products on its website? The solution Google provides is the canonical tag. With this tag, you tell Google which of the duplicate pages on your site is the original version to index and which should not appear in the results.
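A sketch of the canonical tag, using hypothetical product URLs: each colour variant's `<head>` points to the one page you want indexed:

```html
<!-- Placed in the <head> of /lg-k8-blue/, /lg-k8-red/, etc. -->
<link rel="canonical" href="https://example.com/lg-k8/">
</head>-free fragment: only the link element above is required -->
```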
6. Site security!
Google is extremely sensitive to website security. Basically, if your website is hacked and Google finds out, you should expect a drop in rankings shortly afterwards. Beyond the automatic ranking drop, Google also warns users that your website has been hacked so that they avoid entering it, which creates negative branding for you. So I suggest you take action to strengthen your site's security.
7. Sitemap
A sitemap, like the robots file, is a text file, in XML format; but its function is different. A sitemap is basically a file that contains links to all the pages of your website, from article pages to category pages. But why?
Because search engine bots automatically check your site's sitemap. By examining the links inside it, they can find and review your pages more easily. So by creating a sitemap, you make it easier for bots to crawl your site.
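A minimal sitemap.xml sketch (the URLs and date are placeholders); it is usually uploaded to the site root and submitted in Google Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/seo-training/</loc>
  </url>
</urlset>
```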
8. Structured data
Normally, Google uses a set of algorithms and predefined factors to judge how good your website's content is. For example, by examining keyword density, the presence of related words, images, and so on, it estimates how much your page deserves to rank in the first results. But it generally cannot recognize what the page itself is actually about.
This is where a set of markup called structured data comes in; it helps Google recognize the content of the page. If you add structured data to your website, Google will understand your content better, and in return for the effort, besides increasing your credibility, it also changes how your site is displayed in the results. For example, look at the image below:
Displaying prices, star ratings, and similar details in Google's results is one of the advantages of using structured data. Implementing this markup on the site is best done with the help of an SEO expert.
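A hedged sketch of what such markup can look like, using schema.org's Product type in JSON-LD (the product name, price, and rating values are made up for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "LG K8",
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.3",
    "reviewCount": "27"
  }
}
</script>
```

Markup like this is what enables the price and star snippets shown in the image above.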
9. Structure the URL of the pages
Another technical SEO factor is the URL itself. In fact, Google cares about how clean and short your page URLs are. Not excessively, but it is still something worth optimizing. When creating page URLs, pay attention to the following:
- Do not use meaningless characters in the URL
- Use the page's English keyword in the URL
- Keep the URL as short as is practical
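To make the checklist concrete, here is a hypothetical before-and-after:

```
Bad:  https://example.com/?p=12345&ref=x7gq
Good: https://example.com/seo-training/
```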
10. Breadcrumb links
Breadcrumb links are the links, or path, shown at the top or bottom of a web page that tell users exactly where they are right now. For example, look at the image below to better understand what I mean:
As you can see, the breadcrumb is made up of several internal links. So first, it helps build an internal linking structure on your site, and second, it improves the user experience of navigating your site.
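A minimal breadcrumb sketch in HTML (the page names are hypothetical); each crumb is an ordinary internal link, with the current page shown as plain text:

```html
<nav aria-label="Breadcrumb">
  <a href="/">Home</a> &raquo;
  <a href="/blog/">Blog</a> &raquo;
  <span>Technical SEO</span>
</nav>
```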
11. Responsive template and mobile-friendly websites
A responsive website template is one that adapts to the user's screen size. For example, when you visit a site on a computer, you see it differently than when you open it on a mobile phone or tablet. That is the template's responsiveness at work: the layout adjusts to your screen size so that you can see all of its components in the best possible way.
At the moment, Google is very sensitive to how responsive websites are. So if you are still using an old template that does not fit the user's screen, be sure to consider a website redesign project with Kiuloper.
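Responsiveness is implemented in the template itself; a minimal sketch uses the viewport meta tag plus a CSS media query (the class name and 768px breakpoint are arbitrary examples):

```html
<!-- In the <head>: tell mobile browsers to use the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { width: 30%; float: left; }
  /* Below the example 768px breakpoint, stack the sidebar full-width */
  @media (max-width: 768px) {
    .sidebar { width: 100%; float: none; }
  }
</style>
```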