Now, I know what you’re all thinking! Aren’t those SEO guys the ones who keep sending us big lists of stuff to do that seem pretty pointless? In my experience, a lot of web developers think of SEO guys as a bunch of crazy folk whose sole purpose in life is to make their lives difficult.
To be honest, they’re about half right. I’ve been working with SEO agencies for about 10 years now, so I’ve got a good feel for the sorts of things they’re going to ask you to do. There’s a lot you can do up front so that the SEO agency can handle the majority of those things themselves, without hassling your web dev team! Oftentimes it’s silly things like adding scripts to pages, or changing meta tags, which aren’t particularly time consuming, but if you keep getting requests to do them, it breaks up your work, and it’s annoying. Work smart, and modify your back office so that the SEO guys can do most of the work themselves, and you’ll save yourself a ton of time and effort over the lifetime of the site.
Without getting into stuff like CRO (conversion rate optimization), the types of things that SEO agencies usually want fall into several basic categories.
Scripts
One of the biggies for SEO agencies is the ability to set things like Google Analytics, Google Tag Manager and whatever eyeball-tracking or phone-number-switching script is currently hot. The scripts they’re likely to want usually fall into two categories: global scripts (e.g. Google Analytics) and page-level scripts (e.g. conversion scripts). These scripts usually need to go in one of three places: the HEAD of the page, just after the opening BODY tag, or just before the closing BODY tag.
This is easily done by having a global SEO settings node (or you could put the fields on the home page) with three text fields, one for each script location, and rendering those on every page. Next, add an SEO tab to your page DocTypes with the same three fields so scripts can be set at a page level too (compositions are great for this sort of thing)!
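To make that concrete, here’s a minimal Razor sketch of the master template wiring (Umbraco 8-style). The seoSettings content type alias and the three field aliases are made up for the example, so swap in whatever your setup uses:

```cshtml
@* Renders global scripts (from the settings node) followed by page-level ones. *@
@inherits Umbraco.Web.Mvc.UmbracoViewPage
@using System.Linq
@using Umbraco.Web
@{
    // Hypothetical global settings node, assumed to sit directly under the root.
    var settings = Model.Root().Children
        .FirstOrDefault(x => x.ContentType.Alias == "seoSettings");

    // Concatenate the global value and the page-level value for a given field.
    Func<string, IHtmlString> scripts = alias => Html.Raw(
        (settings?.Value<string>(alias) ?? "") + (Model.Value<string>(alias) ?? ""));
}
<!DOCTYPE html>
<html>
<head>
    @scripts("headScripts")
</head>
<body>
    @scripts("bodyStartScripts")
    @RenderBody()
    @scripts("bodyEndScripts")
</body>
</html>
```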
Once that’s done, you don’t have to worry about the incessant emails/calls demanding that you add script X to whatever page.
Meta Tags
Give your site the ability to set various meta tags, and you’ll keep the SEO guys happy! At a bare minimum, your SEO tab should allow editors to set the following:
- Page title tag
- Meta description
- Meta keywords
- A canonical override to another page
- The ability to include a noindex/nofollow tag
There are a few packages that will do most of these for you if you wish, or you can roll them yourself; they’re all pretty straightforward!
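If you do roll your own, here’s a hedged Umbraco 8-style Razor sketch of the head markup; every property alias in it (pageTitle, metaDescription, metaKeywords, canonicalUrl, hideFromSearchEngines) is hypothetical, so rename to taste:

```cshtml
@inherits Umbraco.Web.Mvc.UmbracoViewPage
@using Umbraco.Web
@using Umbraco.Core.Models.PublishedContent
@{
    // Fall back to sensible defaults when the editor leaves fields empty.
    var title = Model.Value<string>("pageTitle");
    if (string.IsNullOrWhiteSpace(title)) { title = Model.Name; }
    var description = Model.Value<string>("metaDescription");
    var keywords = Model.Value<string>("metaKeywords");
    // The canonical override is a content picker pointing at another page.
    var canonical = Model.Value<IPublishedContent>("canonicalUrl") ?? Model;
}
<title>@title</title>
@if (!string.IsNullOrWhiteSpace(description))
{
    <meta name="description" content="@description" />
}
@if (!string.IsNullOrWhiteSpace(keywords))
{
    <meta name="keywords" content="@keywords" />
}
@* Canonical URLs should be absolute (the Url(mode:) overload is Umbraco 8.1+). *@
<link rel="canonical" href="@canonical.Url(mode: UrlMode.Absolute)" />
@if (Model.Value<bool>("hideFromSearchEngines"))
{
    <meta name="robots" content="noindex, nofollow" />
}
```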
Here are a couple of packages that can help with meta tags:
- https://our.umbraco.org/projects/backoffice-extensions/seo-metadata-for-umbraco/ - written by a client of mine (who also happens to be an SEO agency)
- https://our.umbraco.org/projects/backoffice-extensions/robots-meta-tag-property-editor/ - lets you set a custom robots tag for the page
- https://our.umbraco.org/projects/website-utilities/seo-checker/ - a paid-for SEO plugin, more on this later!
Redirects
Imagine that you are building a new version of a website, and the structure of the site has changed. If you don’t set up redirects for the old URLs, any inbound links that use them will 404, and that’ll torpedo your SEO rankings. So you should always have a way of handling this. The same goes for when you move, delete, or rename a page. Ideally you should have a way of tracking these changes and setting up redirects.
You can do this manually, using either UrlRewriting.Net or the IIS URL Rewrite module, but that means a developer has to change the rules every time something is renamed or deleted in the CMS.
There is an awesome package for this, the 301 URL Tracker. It allows you to manage the redirects in the back office UI. Very handy if you want to let the SEO agency be in charge of the redirects.
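If you’d rather hand-roll something simple instead (to be clear, this is not how the URL Tracker works), one approach is an Umbraco 8-style IContentFinder that matches requests against a hypothetical "legacyUrls" textarea (one old URL per line) on each page. It scans the whole tree per miss, so it’s only sensible for smallish sites:

```csharp
using System.Linq;
using Umbraco.Core.Composing;
using Umbraco.Web;
using Umbraco.Web.Routing;

public class LegacyUrlContentFinder : IContentFinder
{
    public bool TryFindContent(PublishedRequest request)
    {
        var path = request.Uri.AbsolutePath.TrimEnd('/');

        // Find a page whose (hypothetical) "legacyUrls" field lists this path.
        var match = request.UmbracoContext.Content.GetAtRoot()
            .SelectMany(root => root.DescendantsOrSelf())
            .FirstOrDefault(page => (page.Value<string>("legacyUrls") ?? string.Empty)
                .Split('\n')
                .Select(u => u.Trim().TrimEnd('/'))
                .Contains(path));

        if (match == null) return false;

        // Permanently redirect the old URL to the page's current address.
        request.SetRedirectPermanent(match.Url);
        return true;
    }
}

// Register it after the default finders, so it only runs on would-be 404s.
public class LegacyUrlComposer : IUserComposer
{
    public void Compose(Composition composition)
    {
        composition.ContentFinders().Append<LegacyUrlContentFinder>();
    }
}
```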
XML Sitemap
Your site should emit an XML sitemap following the Sitemaps spec. At its most basic, this should be a list of all of the pages of the site that you want search engines to see. Don’t forget to exclude things like thank-you pages and protected content!
The reason for having this is that the sitemap can be used by spiders to find pages that may not be directly accessible via the navigation (assuming they want to be found), and you can use it in conjunction with things like Webmaster Tools and other SEO tools.
For bonus points, you could add the ability to set the sitemap priority for pages to the SEO tab, and possibly a checkbox to exclude the page from the sitemap.
As for building the sitemap itself, I often do that with a route-hijacked controller that spits out the markup for the sitemap.
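Here’s a rough Umbraco 8-style sketch of that idea; the xmlSitemap document type alias and the hideFromSitemap checkbox are made up for the example, and real code should also XML-escape the URLs and honour protected pages:

```csharp
using System.Text;
using System.Web.Mvc;
using Umbraco.Core.Models.PublishedContent;
using Umbraco.Web;
using Umbraco.Web.Models;
using Umbraco.Web.Mvc;

// Route-hijacks requests for pages of the (hypothetical) "xmlSitemap" doc type.
public class XmlSitemapController : RenderMvcController
{
    public override ActionResult Index(ContentModel model)
    {
        var sb = new StringBuilder();
        sb.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
        sb.Append("<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">");
        AppendNode(sb, model.Content.Root());
        sb.Append("</urlset>");
        return Content(sb.ToString(), "text/xml", Encoding.UTF8);
    }

    private static void AppendNode(StringBuilder sb, IPublishedContent node)
    {
        // Skip pages the editors have ticked to exclude, but still walk children.
        if (!node.Value<bool>("hideFromSitemap"))
        {
            // The Url(mode:) overload is Umbraco 8.1+; older versions differ.
            sb.AppendFormat("<url><loc>{0}</loc></url>", node.Url(mode: UrlMode.Absolute));
        }
        foreach (var child in node.Children)
        {
            AppendNode(sb, child);
        }
    }
}
```

Tick the (hypothetical) hideFromSitemap box on the sitemap page itself, so it doesn’t list itself; the priority field from the bonus-points suggestion would slot into the same loop.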
The accepted standard location for the sitemap is /sitemap.xml, but you can customize the URL, as long as you link to it in the site’s robots.txt file.
Robots.txt
Your site should have a robots.txt file in the root of the site, saying what is and isn’t allowed. If you are emitting an XML sitemap, you should include a link to it in the file too, e.g.
Sitemap: http://www.yourdomain.com/yoursitemapurl
There is a handy package that will allow you to edit the robots.txt file in the back office if you want, called Robots.txt Editor. I can’t guarantee that it works in the very latest versions of Umbraco though.
Canonicalize Your URLs
Google can be funny about your URLs. It’s quite common to see the same page available on multiple URLs. E.g.
- http://yourdomain.com/test
- http://www.yourdomain.com/test
- http://www.yourdomain.com/test/
- http://www.yourdomain.com/TEST/
Even though they’re all the same page, Google will treat them as unique URLs. So it’s a good idea to standardize your URLs across the board and 301 (permanently) redirect the others to the main one. In general, you should:
- pick either www or no www and redirect the one you aren’t using to the one you are
- lowercase your URLs
- remove superfluous /default.aspx (or .html or whatever) from the end of URLs
- pick either trailing slash at the end of links or no trailing slash, and enforce that rule on your URLs
To accomplish this you can either use the UrlRewriting.Net component that ships with Umbraco, or you can use the URL Rewrite module for IIS (my preferred method, as it’s much faster). Detailed instructions on how to do this are worthy of an article in themselves, but you can read up on the basics here.
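For flavour, here’s a hedged sketch of enforcing the same rules in code with a classic ASP.NET IHttpModule instead of rewrite rules (an alternative to the IIS module, not a drop-in for it); the canonical host is a placeholder, and it assumes extensionless page URLs:

```csharp
using System;
using System.Web;

// Enforces one canonical form per URL: a single host, lowercase paths and no
// trailing slash, 301-redirecting everything else. Register it in web.config
// under <system.webServer><modules>.
public class CanonicalUrlModule : IHttpModule
{
    private const string CanonicalHost = "www.yourdomain.com"; // placeholder

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var url = app.Request.Url;
            var path = url.AbsolutePath;

            // Leave static assets and the back office alone.
            if (path.Contains(".") ||
                path.StartsWith("/umbraco", StringComparison.OrdinalIgnoreCase))
            {
                return;
            }

            // Lowercase and strip the trailing slash (the "no slash" choice).
            var canonicalPath = path.ToLowerInvariant().TrimEnd('/');
            if (canonicalPath.Length == 0) canonicalPath = "/";

            if (!url.Host.Equals(CanonicalHost, StringComparison.OrdinalIgnoreCase) ||
                canonicalPath != path)
            {
                app.Response.RedirectPermanent(
                    url.Scheme + "://" + CanonicalHost + canonicalPath + url.Query);
            }
        };
    }

    public void Dispose() { }
}
```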
SEO Checker
In addition to the packages that I’ve mentioned already, there is a commercial package called SEO Checker that you can buy for your site. It performs a fair few of the things I’ve mentioned, and also contains additional useful functionality, like the ability to run SEO checks on pages, and check for broken inbound links etc. It’s not super pricey, and adds a lot of functionality that SEO agencies will actually use.
In Conclusion
Making your site usable by SEO agencies doesn’t have to be a massive pain, and with a bit of planning you can have a site that handles most of the things an SEO agency is likely to throw at you without too much trouble. You can wave goodbye to your developers getting upset at the endless barrage of requests from the client’s SEO agency!