6 Tips From A WP Expert on How to Create an SEO-Friendly URL

To some, URLs are just a way to show where a web page lives on the web. However, if optimized for search engines, URLs can play a significant role in your organic traffic and overall online performance.

Unless you are an SEO professional or have a dedicated inbound marketing team (or a partner such as the WordPress development agency DevriX) to help you, read on to find out what to focus on when optimizing your URLs.

 

Create Good-looking URLs

For your web page to rank well on Google, its URL should look as appealing to users as possible. Although this is not a ranking factor per se, it stands to reason that the more people click on a SERP result, the further up it will eventually be positioned.

This notion goes back as far as 2010, when Matt Cutts, then Head of Google’s webspam team, explained that long, multi-hyphenated URL slugs may look spammy and put users off. Instead, he recommended using a simple path-name format.

In addition, according to Google’s Advanced SEO documentation, a URL structure should be kept as simple as possible. Omit all unnecessary parameters and separate words with hyphens rather than running them together. Also, use a robots.txt file to prevent crawlers from accessing near-identical URLs that carry duplicate content and would otherwise produce a redundant pile of similar search results for a single query.

Your priority when tailoring your URL structure should be providing value to your users. Do not try to squeeze as many keywords as you can into the slug. Not only is this hardly a ranking factor any more, but the full URL is not even visible on mobile or in SERPs.

Instead, concentrate your efforts on matching content with relevant category names. Many sites use breadcrumb navigation, so aligning URLs with those categories can take your SEO efforts up a notch.
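
For illustration, compare a parameter-heavy or keyword-stuffed address with a short, category-based one (the domain and slugs below are made up):

    https://example.com/index.php?id=1432&cat=7
    https://example.com/best-seo-tips-seo-url-seo-guide-2023.html
    https://example.com/blog/seo/url-structure/

The first two are hard to read and say little about the page; the last one mirrors the site’s category hierarchy and reads almost like a breadcrumb trail.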

 

Use An SSL Certificate


Since 2014, Google has treated the HTTPS (Hypertext Transfer Protocol Secure) protocol as a ranking signal. In fact, it could be the one thing that makes or breaks your website’s SEO performance, as the lack of HTTPS opens up the web to all kinds of dark and dodgy websites. SSL certificates on WordPress can be installed for free, or can cost up to $200 a year depending on the extras and duration.

When you switch to HTTPS, make sure you get rid of the unencrypted versions of your web pages. Duplicate content confuses both your visitors and search engine crawlers, so it doesn’t make sense to have several versions of the same content indexed.
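
If your site runs on Apache, one common way to retire the HTTP versions is a site-wide 301 redirect in the .htaccess file. This is only a minimal sketch, assuming Apache with mod_rewrite enabled and a single domain; many hosts and WordPress SSL plugins handle this step for you:

    # .htaccess: send every HTTP request to its HTTPS equivalent with a 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]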

However, if you own just a small website that doesn’t collect any sensitive data beyond a name and an email address, you’d better think twice before investing in an SSL certificate, especially if your site has been running like this for a while. Google ranks HTTP pages as well, so chances are you’d put yourself through all the trouble for hardly any noticeable gain.

In fact, if you weigh up the cost of a quality Extended Validation (EV) SSL certificate against the time the migration will take, it may not even be worth it. Migrating means manually crawling all pages of your website, updating all links (including social media ones) and canonical tags, setting up 301 redirects, and updating the HTTPS version of your site in the robots.txt file and in all the tools you use (GA, GSC, CDN, etc.), so you’d be in for a treat!

 

The URL Should Describe The Page Content


Although this sounds like a rule of thumb, a lot of website owners ignore it and push their main keywords into URL slugs regardless of the context and purpose of the respective pages.

Indeed, the URL and title of a web page should reflect the main concept it covers, so that it’s easier for users and search engine crawlers to understand what it is about. As mentioned above, keyword stuffing is neither pretty nor effective, and it can get you penalized by the Panda update.

Having a keyword in your URL still matters, however. URL slugs shouldn’t be dynamically generated or contain too many words. The shorter, the better: users are more likely to click on and share a simply structured URL because they assume it will give them exactly the information they are looking for. Make the slug readable, relevant and targeted.

A high clickthrough rate (CTR) signals to search engines that your page is relevant to the searches users make. As a result, your rankings for that term will improve, and with them traffic, sessions and, hopefully, conversions.

 

Be Consistent With URL Structure

A consistent URL structure matters for SEO, and it matters for user experience, too. Always shoot for static URLs, as they do not include parameters and are easy to digest. Google automatically ignores some parameters, such as UTM tags, while others, like session IDs and system parameters, can spawn multiple variations of the same page that search engines crawl individually despite the identical content.
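
As a hypothetical illustration, all of the following could resolve to the same content, yet each parameterized variation may be crawled as a separate URL, while the static version stays stable:

    https://example.com/shop/shoes/?sessionid=8f3a21        (session ID)
    https://example.com/shop/shoes/?utm_source=newsletter   (tracking tag, usually ignored)
    https://example.com/shop/shoes/?sort=price&page=1       (system parameters)
    https://example.com/shop/shoes/                         (static URL)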

Especially if you don’t have breadcrumb navigation, a clear and concise URL structure will help Google’s spiders and users find their way around your website. That way you can “hold their hand” and show them the path they took to get where they are, making the site problem-free to navigate.

In addition, if you’ve only ever had dynamic URLs on your website, don’t be disheartened. If there’s anything Google hates, it’s drastic, unnatural changes. Parameter URLs get indexed as well and appear in SERPs, although perhaps not with the same CTR as their static counterparts.

However, if you only want to remove unnecessary or potentially harmful parameters, or if you are migrating to a new website but don’t want to lose all the traffic and link juice you’ve worked hard for, then you can go ahead and change the URL structure.

 

Tackle Bad/Broken URLs Regularly


As a website owner or editor, you should always be wary of bad URLs that damage your rankings. You can use various tools to monitor broken links on your WordPress website – Google Search Console, Semrush, Ahrefs, etc. If search engines index duplicate or unoptimized URLs from your site, it is a penalty waiting to happen. So make sure you install a plugin that flags bad URLs so you can deal with them (Broken Link Checker and WordPress Broken Link Manager, to name but a few).

Once you’ve identified the broken URLs on your website, here are some steps to make sure they don’t damage your SEO results:

 

Robots.txt file

Disallowing certain URLs in your site’s robots.txt file (which you can test in Google Search Console) tells Google which URLs it should not crawl, and it is a good method for keeping duplicate or unimportant pages out of search. Nevertheless, if you want to completely block access to these pages, you should know the limitations of robots.txt directives.

Mind you, Google and other search engines can still index such pages and show them in the SERPs if other resources link to them; however, they will show up without a meta description. Also, make sure you get familiar with the syntax other search engines use: Googlebot obeys robots.txt instructions, but other crawlers may interpret them differently or not read them at all.
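
A minimal robots.txt sketch along these lines might look as follows (the paths and parameter are hypothetical; adjust them to the sections you actually want to keep out of the crawl):

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /*?sessionid=
    Disallow: /search/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/sitemap.xml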

 

Removals Tool

The GSC removals tool, or Removals report, is a way to temporarily hide a page from Google results. If you want to remove content permanently, you need to take additional steps. In this report, removal requests are divided into three categories.

Temporary removals are used when you want to hide specific URLs on your site quickly; a request lasts about six months and also clears the cached copy of the page. Outdated content removals are public removal requests submitted by users. Lastly, SafeSearch filtering requests are used mostly to restrict inappropriate content, whether for supervision in the workplace or at home with children.

 

301 Redirects

If you ever change a URL for whatever reason, make sure you inform Google that the content now lives elsewhere. Broken links happen either when you’ve altered your URL structure or when one of your outbound links points to a page that no longer exists on another website. Either way, that’s where 301 redirects come into play. When a crawler or a user hits the old URL, a 301 automatically redirects them to the new location, so you don’t lose the traffic you’ve gained on the previous URL.
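
On an Apache server, a single moved page can be redirected with one line in .htaccess (the paths below are hypothetical); most WordPress redirect plugins will generate equivalent rules for you:

    # .htaccess: permanently redirect the old slug to the new one
    Redirect 301 /old-page/ https://example.com/new-page/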

It’s a good idea to monitor broken links on your website regularly. Otherwise, search engines will index both the 404 error page and the new, optimized page separately, and you’ll lose out on traffic, link juice, CTR and, ultimately, conversions. If you have a WP website, work alongside a WordPress agency that offers maintenance plans tailored to your needs.

 

Password-protection

If you have pages with sensitive information on your WordPress site that you don’t want available to the public, the surefire way to hide their URLs from search bots is to either use the default visibility settings in the WP dashboard or install a password-protection plugin.

Remember, this method is only necessary if you host confidential data, or on test sites that you only want your team to access. It is also an option for large websites with many different sections where users can contribute content.

 

Noindex Meta Tag / Header

Unlike disallowed pages, which may still appear in SERPs, the noindex directive keeps a page out of search results entirely. This is a good method if you’d like to control the indexability of certain URLs but don’t have root access to your server.

You can apply the noindex directive either by adding it as a meta tag in the <head> section of the page, or by returning it in an HTTP response header (a good option for non-HTML resources like PDFs, videos and images). However, keep in mind that if you’ve already disallowed a URL in the robots.txt file and try to noindex it as well, crawlers won’t be able to access the noindex tag and thus might still index the page.
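
As a sketch, the same directive can be expressed both ways (the header example assumes Apache with mod_headers enabled, and the file pattern is hypothetical):

    <!-- In the <head> of the page -->
    <meta name="robots" content="noindex">

    # In .htaccess: X-Robots-Tag response header for all PDFs
    <FilesMatch "\.pdf$">
        Header set X-Robots-Tag "noindex"
    </FilesMatch>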

 

Minimize Click Depth


Click (page) depth is a ranking factor even more important than URL structure. Simply put, it is the number of clicks it takes to reach a particular page from the Home page. Google considers URLs that are just a few clicks away from the Home page important; the further away they are, the less priority search engines grant them. Make sure the service (money) pages on your site are as close to the main page as possible.

There are ways to improve click depth without completely transforming your website and URL structure. Flattening your site’s navigation hierarchy with subcategory links is a great way to help users and crawlers reach a page directly from your Home page.

Another way is to place links to different sections on the Home page and money pages, but be careful: too many links will eventually disperse the traffic of the most important pages of your website. You can also use breadcrumbs as secondary navigation. If your site is built on WordPress, you can add breadcrumbs automatically with a plugin – good options include Yoast SEO, All In One SEO and Breadcrumb NavXT.
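
If you go with Yoast SEO, the plugin documents a small template function for printing the breadcrumb trail; here is a minimal sketch, assuming the plugin is active and your theme lets you edit the relevant template file (e.g. single.php or header.php):

    <?php
    // Print the Yoast breadcrumb trail, wrapped in a paragraph, if the plugin is active
    if ( function_exists( 'yoast_breadcrumb' ) ) {
        yoast_breadcrumb( '<p id="breadcrumbs">', '</p>' );
    }
    ?>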

 

Final Words

Although URL structure may not seem like a major ranking factor for SEO, it can certainly affect your SERP positions and organic traffic. As long as you keep URLs short and clear, consistent with the page title and content, shareable and easy to crawl, monitor them for broken links and make them as easy to reach from the Home page as possible, you’ve done your part. Of course, there are many other factors affecting the SEO performance of a website.
