Website URL Structure and SEO Benefits – Part I
What is the best structure for website URLs? Is it possible to please users and search engines at the same time? There are a number of best practices that can make URLs both SEO-friendly and user-friendly.
Let’s take a look at this dynamic.
Use a Single Domain with Subfolders
Subfolders are generally preferred to subdomains from a search-ranking perspective. SEO improvements have been reported simply from moving content out of a subdomain into a subfolder, and doing the opposite (shifting content from a subfolder to a subdomain) has been observed to cost search traffic. A subdomain does not automatically inherit SEO juice from the root domain, so previous work put into building Domain Authority is largely lost on a new subdomain. This is because Google tends to index subdomains almost as though they were separate domains.
For the majority of cases, it is better to put website content under one domain and organize it into logical subfolders. A third level (a subdomain) should be used only if absolutely necessary.
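To make the difference concrete, here is a small sketch using Python's urllib.parse and hypothetical example.com URLs; it simply shows that subdomain content lives on a different host, while subfolder content only differs in its path.

```python
from urllib.parse import urlparse

# Hypothetical URLs for the same blog content.
subdomain_url = "https://blog.example.com/seo-tips"       # content on a subdomain
subfolder_url = "https://www.example.com/blog/seo-tips"   # content in a subfolder

for url in (subdomain_url, subfolder_url):
    parts = urlparse(url)
    # netloc is the host; path is everything after it.
    print(f"{url}\n  host: {parts.netloc}\n  path: {parts.path}")

# The subdomain version sits on a different host (blog.example.com), which is
# why it can be treated almost like a separate site; the subfolder version
# stays on the root host and only changes the path.
```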
Easily Read by Humans
Accessibility has always been part of SEO, and it matters even more today as search engines try to interpret signals such as user experience, engagement and readability. A URL does not have to be absolutely clean and flawless, but it should contain keywords and terms that make sense from a user's viewpoint. If a URL is easy for people to read, it will get a thumbs up from search engines as well.
Keywords in URLs
There are several reasons to put well-crafted keywords in URLs (a small slug-building sketch follows this list).
- When a bare URL is pasted as a link, the keywords in the URL become its anchor text.
- Keywords in the URL immediately identify a company or brand and, further down the path, give valuable clues about the page the user is trying to reach.
- URLs are shown in search results, and some research suggests they are among the most important elements users look at when deciding what to click.
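As an illustration, here is a minimal slug-building sketch in plain Python (the function name and the sample title are hypothetical, not taken from any particular CMS):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, hyphen-separated, keyword-bearing slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse anything non-alphanumeric into a hyphen
    return slug.strip("-")

print(slugify("10 Best Running Shoes for Winter!"))
# -> 10-best-running-shoes-for-winter
```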
Use Short URLs
Short URLs are easier to copy, paste and share on social media, but hitting an exact length is not critical. Staying below about 60 characters is great; if a URL exceeds 100 characters, consider a rewrite. Even a 100+ character URL is not a problem for search engines, which can interpret long URLs without difficulty, but for humans shorter URLs are always more readable and preferable.
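As a quick illustration of those rough thresholds (they are guidelines from this article, not hard limits), a short Python check might look like this:

```python
def url_length_advice(url: str) -> str:
    """Classify a URL against the rough length guidelines discussed above."""
    n = len(url)
    if n <= 60:
        return f"{n} characters: fine"
    if n <= 100:
        return f"{n} characters: acceptable, but shorter would be friendlier"
    return f"{n} characters: consider a rewrite"

print(url_length_advice("https://www.example.com/blog/seo-tips"))
```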
Limit Redirects
Server redirects are often employed when webmasters fix mistaken resource locations or, for example, move content from subdomains to subfolders. Search engines can manage multiple redirects quite well, but don't overdo it: Googlebot follows only a limited number of chained redirects (commonly cited as five), so it is advisable not to go beyond that number, and if possible to keep automatic redirects below three. Also consider that some browsers, especially on smartphones, may not handle long redirect chains properly.
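To see how long a redirect chain actually is, the hops can be counted with a short script. A sketch, assuming the third-party requests library is installed and using a hypothetical URL:

```python
import requests  # third-party: pip install requests

def count_redirects(url: str) -> int:
    """Fetch a URL and return how many redirect hops were followed to reach it."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds one entry per intermediate redirect response.
    for hop in response.history:
        print(hop.status_code, hop.url)
    print("final:", response.status_code, response.url)
    return len(response.history)

hops = count_redirects("http://example.com/old-page")  # hypothetical URL
if hops > 2:
    print(f"Warning: {hops} redirects in this chain; point links at the final URL instead.")
```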
Case Sensitivity
On Linux and Unix servers, URL paths are typically case sensitive, so a URL containing capital letters may be treated as a different resource from its all-lowercase version and end up generating a “404 Not Found.” The best advice is to always use lowercase in website URLs, or to automatically redirect uppercase variants to the lowercase version.
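One common way to enforce this is a server-side rule that permanently redirects mixed-case paths to their lowercase form. A minimal sketch, assuming a Flask application (the framework is an arbitrary choice for illustration):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_lowercase_path():
    # If the requested path contains uppercase letters, issue a permanent
    # (301) redirect to the all-lowercase version of the same path.
    path = request.path
    if path != path.lower():
        query = request.query_string.decode()
        target = path.lower() + ("?" + query if query else "")
        return redirect(target, code=301)

@app.route("/blog/seo-tips")
def seo_tips():
    return "SEO tips page"
```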
Avoid the Hash
A hash or pound sign (#) within a URL is commonly used to take visitors to a specific location on a page, and sometimes to tag traffic sources. In most cases, however, using # in website URLs is not recommended: the fragment after the # is handled by the browser, is not sent to the server, and is generally ignored by search engines. Sites like Amazon and Twitter have seen big benefits from simplifying URLs that previously relied on a hash.
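The reason the fragment is invisible to crawlers is easy to demonstrate: everything after the # is split off on the client side and never becomes part of the request. A small sketch with Python's urllib.parse and a hypothetical URL:

```python
from urllib.parse import urldefrag, urlparse

url = "https://www.example.com/product/123#customer-reviews"  # hypothetical URL

parts = urlparse(url)
print(parts.path)      # /product/123      -- the part the server and crawlers see
print(parts.fragment)  # customer-reviews  -- handled only by the browser

# urldefrag() returns the URL with the fragment stripped, i.e. what is
# actually requested from the server.
clean_url, fragment = urldefrag(url)
print(clean_url)       # https://www.example.com/product/123
```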