1. Review Your Robots.txt File; Assess Your Meta Robots Tagging
If you have a robots.txt file on your site, check it by visiting yourdomain.com/robots.txt. You may be surprised to find you are withholding pages, folders, or images from search engines that could be driving traffic to your site.
Additionally, run a site scan with a tool such as Screaming Frog to see whether any pages on your site are excluded via a meta robots tag. Both of these are very quick fixes if you do find issues.
Unknowingly tagged pages or stray robots.txt entries are usually the work of a developer who forgot to remove the designations when a new page went live, or of a previous site administrator who deemed the quality content unimportant for the masses.
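To make the spot check concrete, here is a minimal sketch using Python's standard-library urllib.robotparser; the robots.txt contents and paths below are hypothetical stand-ins for your own file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- substitute your own file here.
robots_txt = """\
User-agent: *
Disallow: /images/
Disallow: /blog/archive/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Spot-check a few URLs you expect search engines to crawl.
for path in ("/", "/blog/archive/2012/", "/images/logo.png"):
    allowed = rp.can_fetch("*", path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```

If a page you expect to rank shows up as BLOCKED, that is your quick fix.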
2. Review Your Site Organic CTR by Page; Revise the Worst Page’s Title Element and Meta Description
This is both a conversion optimization tip and an SEO tip. The new world of SEO is heavily focused on the message you send, whether to search engines or to users.
Google provides click-through rate data on landing pages and keywords in your Google Analytics account. You don’t think they provide this data out of the kindness of their heart, do you? They are interested in sites whose search result listings are enticing and relevant to web users.
While you may have many landing pages with atrocious click-through rates, identifying the worst one or a few allows you to revise them in a short span of time into listings users actually want to click. Simply visit your Google Analytics account and navigate to Traffic Sources > Search Engine Optimization > Landing Pages. You can also run the same review through the Keyword dimension of this report.
Ultimately, you are improving your site in the eyes of the search engines, and you may retain some extra visitors at the same time.
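If you prefer to triage outside the Google Analytics interface, a sketch like the following (with made-up impression and click figures) sorts landing pages so the worst click-through rates surface first:

```python
# Hypothetical landing-page data, as exported from Google Analytics'
# Traffic Sources > Search Engine Optimization > Landing Pages report.
pages = [
    {"page": "/services/", "impressions": 12000, "clicks": 180},
    {"page": "/blog/seo-tips/", "impressions": 8000, "clicks": 560},
    {"page": "/contact/", "impressions": 3000, "clicks": 240},
]

for row in pages:
    row["ctr"] = row["clicks"] / row["impressions"]

# Worst CTR first -- these are the pages whose title elements and
# meta descriptions deserve the next 15 minutes of your time.
worst_first = sorted(pages, key=lambda r: r["ctr"])
for row in worst_first:
    print(f"{row['page']}: {row['ctr']:.1%}")
```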
3. Assess Canonicalization of Your Domain
It only takes a moment to rid yourself of one of the most common forms of duplicate content and link value dilution.
Do your site pages exist at both www.example.com and example.com? If so, create a permanent 301 redirect pointing all non-www pages to their www counterparts (or vice versa; just pick one version and stick with it).
Search engines don’t want to see two versions of your content. Consolidating the inbound link equity of both versions into one page also helps, since not everyone links to the www version of your pages.
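On Apache, one common way to implement this is a mod_rewrite rule in .htaccess; this is a sketch assuming example.com is your domain and mod_rewrite is enabled (Nginx and IIS have their own equivalents):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```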
4. Review Your Most Frequently Linked Pages on Your Site
Through the use of a tool such as Open Site Explorer you can gain information on the server status of your most-linked content. You may find that a page that went viral last year and earned a ton of links has since been deleted from the server and now returns a 404. You may also see that a heavily linked page is sitting behind a temporary 302 redirect that should be a permanent 301.
Finding a few of these can result in a few quick redirects to help boost the link value on the domain.
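A sketch of the triage logic, assuming you have already exported your most-linked URLs and fetched each one's HTTP status code (the URLs and codes below are hypothetical):

```python
# Map an HTTP status code to a recommended action for a heavily linked page.
def triage(status):
    if status == 200:
        return "OK -- nothing to do"
    if status == 301:
        return "Permanent redirect -- link equity passes"
    if status in (302, 307):
        return "Temporary redirect -- change to a 301"
    if status == 404:
        return "Broken -- add a 301 to a relevant live page"
    return "Investigate"

# Hypothetical crawl results for your most-linked pages.
linked_pages = {"/viral-post/": 404, "/old-tools/": 302, "/guide/": 200}
for url, status in linked_pages.items():
    print(f"{url} ({status}): {triage(status)}")
```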
5. Review Your Site for Duplicate Title Elements
Do a quick check of duplicate title elements in Google Webmaster Tools. This can indicate duplicate pages, keyword cannibalization, and bad title element structure.
Checking this Google property feature can quickly show you these issues and give insight into whether you need to spend the next 15 minutes writing unique title elements, creating redirects, or thinking about which of the multiple pages should include a certain keyword term.
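The same duplicate check can be scripted against a crawl export; here is a minimal sketch with hypothetical page titles:

```python
from collections import defaultdict

# Hypothetical page-to-title data, as exported from Google Webmaster
# Tools' HTML suggestions or a Screaming Frog crawl.
titles = {
    "/widgets/": "Blue Widgets | Example Co",
    "/widgets/blue/": "Blue Widgets | Example Co",
    "/about/": "About Us | Example Co",
}

# Group pages by title; any title with more than one page is a duplicate.
by_title = defaultdict(list)
for page, title in titles.items():
    by_title[title].append(page)

duplicates = {t: pages for t, pages in by_title.items() if len(pages) > 1}
for title, pages in duplicates.items():
    print(f"Duplicate title {title!r} on: {', '.join(sorted(pages))}")
```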
6. Find Your Most Authoritative Links; Request an Anchor Text Change
I see it all the time: sites with links from very authoritative domains anchored on the text "Click Here," "Buy," or "Learn More." It drives me nuts!
Not all of your anchor text needs to be keyword-rich, but it helps to identify your strongest links, reach out to those sites, and request a text modification to a non-branded or partially branded variation. You can assess your anchor text by linking-site authority with tools such as Open Site Explorer and Majestic SEO.
7. Review Your Link Targets in Your Site Navigation and Any Other Sitewide Links
By reviewing the links in your main, footer, breadcrumb, and any other supporting navigation, you can quickly spot duplicate content issues caused by pesky default pages (e.g., /index.html). These default pages should be 301-redirected to the canonical URL, and the navigation links themselves should be revised to target the canonical URL. These revisions clean up many, many internal linking deficiencies across your site.
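A sketch of the link-cleanup step: normalizing navigation hrefs so they target the canonical URL instead of a default document (the filename list is an assumption; adjust it to your server's defaults):

```python
# Default-document filenames that create duplicate URLs (assumed list).
DEFAULT_DOCS = ("index.html", "index.htm", "index.php", "default.aspx")

def canonicalize(href):
    """Strip a trailing default-document filename from a link target."""
    for doc in DEFAULT_DOCS:
        if href.endswith("/" + doc):
            return href[: -len(doc)]
    return href

print(canonicalize("/products/index.html"))  # -> /products/
print(canonicalize("/about/"))               # -> /about/ (unchanged)
```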
8. Verify Your Local Listings
As web users become more localized in their searching behavior, it becomes imperative that your off-site listings are owned by you. It doesn’t take long to claim your listings and show search engines that you have control over your external profiles.
Another reason this is a must: claimed listings are also believed to be a local algorithm ranking factor. Look to establish verification with other web profiles on sites such as Yelp down the road. If you need help with your local listings, MarketYourCorp.com offers an amazing local listing submission service covering 32 different local listing services.