Project Description
Kinaxis
With worldwide offices serving a broad range of global customers in a variety of industries, Kinaxis helps companies run their supply chains.
The company was founded in 1984 and is headquartered in Ottawa, Canada. Its product, the RapidResponse platform, was first introduced in 1995 and is one of the fastest-growing and most innovative supply chain planning systems of record in the marketplace today.
Goal
Improve their overall website structure and solve their SEO issues.
SERVICES
Search Engine Optimization Audit – Main Website & Blog.
Organic Search Performance Report.
CATEGORIES
Supply Chain Manager
Main Website – SEO Audit & Performance
July 11th, 2019
Executive Summary
Kinaxis has a good base to build on
- Site size and pre-existing search engine optimization (SEO) at Kinaxis.com are in better shape than most sites
- That said, to gain on its competitors Kinaxis needs to increase its SEO footprint by focusing on indexation and site crawlability
Opportunities for improvement
- A number of areas can be improved to ensure that kinaxis.com works as hard as possible to capture free, high-quality organic traffic
- This is particularly true when it comes to technical SEO issues on kinaxis.com
Summary of Recommendations
SEO Audit Top Recommendations
Technical updates required to:
- Fix broken pages/pages returning 4xx errors.
- Update the robots.txt directive.
- Deindex thin pages that are irrelevant to searchers.
- Determine whether the current 302 redirects are truly temporary.
On-page & off-page SEO updates required to:
- Add a keyword-rich H1 tag to the homepage.
- Add structured data across pages.
- Disavow spammy backlinks.
Table of Contents
Technical SEO:
- Robots.txt Directive
- Unsupported Hreflang Links
- Thin Pages
- Broken Pages (4xx Errors)
- Canonicalization
- HTTP Pages
- Optimize Images
- Non-301 Redirects
- Minify JS, CSS, HTML
- Add/Fix Structured Data
- Homepage H1 Tag
- Spammy Backlinks
Site Architecture:
- Navigation
- Permanent Links
- Page Titles & URLs
UX/UI:
- Search Icons
- Double Logos on Mobile
- Social Icons
Technical SEO:
Robots.txt directive
Sitemap is missing from robots.txt file
A sitemap is an XML file that lists the webpages on your site; it may also contain additional metadata about each URL. Because search bots fetch robots.txt first when crawling a site, it is highly beneficial to reference the sitemap.xml file there: the sitemap lists every page on the site, so pages can be crawled and indexed more quickly.
* Each subdomain with a robots.txt file should reference its own sitemap in that file.
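For illustration, a minimal robots.txt that references its sitemap might look like this (the sitemap URL shown is an assumption, not taken from the live site):

```txt
User-agent: *
Disallow:

Sitemap: https://www.kinaxis.com/sitemap.xml
```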
Unsupported Hreflang Links
- 373 pages across the site have unsupported Hreflang links
Hreflang attributes are not supported inside normal anchor tags; they need to be placed on <link> elements in the page's <head>.
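For example, the unsupported and supported forms look like this (URLs are placeholders):

```html
<!-- Unsupported: hreflang on an anchor tag -->
<a href="https://www.example.com/fr/" hreflang="fr">Français</a>

<!-- Supported: <link> elements inside <head> -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
```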
Thin Pages
- Consider removing 'thin pages' from Google's index or adding content to them
Pages with little content can be classified as 'thin content' and may not be indexed, or may devalue your site's overall quality. Best practice is to populate Google's index with high-quality results, which, in most cases, means providing in-depth, valuable content that users will find useful.
Here’s an example of a “thin page”:
Page Analysis
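As a rough illustration of how such pages can be flagged at scale, the sketch below counts visible words and marks any page under an arbitrary threshold as thin. The threshold and function names are our own illustration, not an official Google cutoff:

```python
import re

THIN_WORD_THRESHOLD = 250  # illustrative cutoff, not an official number


def visible_word_count(html: str) -> int:
    """Very rough word count: drop script/style blocks, strip tags, count words."""
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(re.findall(r"\b\w+\b", text))


def is_thin(html: str) -> bool:
    """Flag a page whose visible text falls below the word threshold."""
    return visible_word_count(html) < THIN_WORD_THRESHOLD


# A near-empty page is flagged as thin
print(is_thin("<html><body><h1>Tag</h1><p>One short line.</p></body></html>"))  # → True
```

In practice this check would run over a crawl export rather than hand-fed strings, but the decision rule is the same.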
Broken Pages (4xx Errors)
- URLs that return a 4xx status code are considered broken (there are 420 of these)
These are often caused because a page has been removed but is still linked, or because the linked URL is incorrect. The solution is to update the links to point to an alternative target or remove the link if there is no suitable alternative. If the linked page has been deleted, it could be 301 redirected to an appropriate alternative.
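If the site (or a subdomain) runs on Apache, a deleted page can be permanently redirected with a one-line rule like this (the paths are hypothetical):

```txt
# .htaccess (Apache mod_alias) — redirect a removed page to its closest replacement
Redirect 301 /resources/old-whitepaper /resources/new-whitepaper
```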
Page Analysis: Broken Pages (4xx Errors) = 420
Canonicalization
- At this time, there are 544 pages across the site missing a valid canonical tag
A canonical tag (aka “rel canonical”) is a way of telling search engines that a specific URL represents the master copy of a page. Using the canonical tag prevents problems caused by identical or “duplicate” content appearing on multiple URLs. Practically speaking, the canonical tag tells search engines which version of a URL you want to appear in search results.
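A canonical tag is a single <link> element in the page's <head>; for example (URL is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/supply-chain-planning/" />
```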
Page Analysis
HTTP Pages
Pages using the non-secure HTTP scheme.
Pages using the HTTP protocol are at a disadvantage when competing against secure pages in Google Search, and have been marked as Not Secure in the Chrome browser since July 2018. The majority of the pages still loading over HTTP are on a subdomain.
HTTP & HTTPS
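One common fix, assuming the affected subdomain is served by nginx, is a server-level redirect from HTTP to HTTPS (the server name is a placeholder):

```txt
# nginx — redirect all HTTP traffic to HTTPS
server {
    listen 80;
    server_name subdomain.example.com;
    return 301 https://$host$request_uri;
}
```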
Optimize Images
Images across pages could use optimization & compression.
Image optimization improves page load speed, boosts a website's SEO ranking, and improves the user experience.
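Beyond compressing the files themselves, markup can help the browser fetch appropriately sized images and defer offscreen ones; a sketch using standard HTML attributes (file names are placeholders):

```html
<!-- Let the browser pick the smallest suitable file and lazy-load it -->
<img src="diagram-800.jpg"
     srcset="diagram-400.jpg 400w, diagram-800.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     width="800" height="450"
     loading="lazy"
     alt="Supply chain planning diagram">
```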
Non-301 Redirects
- Identify if the current 302 redirects are truly temporary or not.
URLs redirecting with a 302 status tell search engines that the redirect is temporary, so search engines will not immediately drop the redirecting URL in favour of the redirect target.
Page Analysis: Non-301 Redirects = 34
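Where a redirect turns out to be permanent, the fix is a one-word change in the server rule; for example, on Apache (paths are hypothetical):

```txt
# Before: temporary redirect, target does not inherit ranking signals promptly
Redirect 302 /events/old-webinar /events

# After: permanent redirect for a page that is not coming back
Redirect 301 /events/old-webinar /events
```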
Minify JS, CSS, HTML
- JavaScript, CSS and HTML files are not minified
By compacting HTML, CSS, and JS code, including any inline JavaScript and CSS the pages contain, the site can save many bytes of data and speed up download, parsing, and execution time.
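Minification is normally handled by build tools, but the idea can be sketched in a few lines. The naive CSS minifier below is our own illustration, not production code; it strips comments and collapses whitespace:

```python
import re


def minify_css(css: str) -> str:
    """Naive CSS minifier sketch: drop comments, collapse and trim whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove /* comments */
    css = re.sub(r"\s+", " ", css)                     # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)       # trim space around punctuation
    return css.strip()


original = """
/* button styles */
.btn {
    color: #fff;
    background: #0057b8;
}
"""
print(minify_css(original))  # → .btn{color:#fff;background:#0057b8;}
```

A real build pipeline would use a dedicated minifier (and handle edge cases like strings and calc() expressions), but the byte savings come from exactly this kind of compaction.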
Add/Fix Structured Data
- Pages across site could utilize more structured data.
Although structured data is not a direct ranking factor, adding it allows you to tell search engines exactly what kind of content they will find on your website. Doing so gives search engines a better understanding of that content and can sometimes result in your page being displayed as a rich result.
Rich Snippet Example
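For example, Organization markup in JSON-LD (schema.org vocabulary) could be added to the homepage; the logo URL below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Kinaxis",
  "url": "https://www.kinaxis.com/",
  "logo": "https://www.kinaxis.com/logo.png"
}
</script>
```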
Homepage H1 Tag
- H1 tag on the homepage is mistargeted.
HTML header tags are an important way of signaling to search engines the main content topics of your page, and subsequently the keywords it should rank for or would like to target. In this case, the homepage's main H1 is "Know sooner. Act faster.", which does not highlight what Kinaxis does. It would be better to use a keyword phrase such as "Supply Chain Management" (or whatever the primary target keyword is) as the H1 tag.
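Concretely, the change is as simple as (the replacement phrase should be whatever keyword research identifies as primary):

```html
<!-- Current: slogan with no target keyword -->
<h1>Know sooner. Act faster.</h1>

<!-- Suggested: lead with the primary target keyword -->
<h1>Supply Chain Management</h1>
```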
Spammy Backlinks
- Consider disavowing spammy/toxic backlinks.
Google works very hard to make sure that actions on third-party sites do not negatively affect a website. In some circumstances, incoming links can affect Google’s opinion of a page or site and result in poor rankings or even a manual action penalty.
Domain by Toxic Score
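Disavowals are submitted as a plain-text file through Google Search Console's disavow tool. The format accepts full URLs or `domain:` entries, one per line, with `#` comments; the domains below are placeholders:

```txt
# disavow.txt — uploaded via Google Search Console
domain:spammy-links.example
domain:toxic-directory.example
https://bad-site.example/specific-page.html
```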
Recent Work
BRANDING • UX/UI DESIGN • WEB DEVELOPMENT • CONTENT CREATION • SEO • CONVERSION RATE OPTIMIZATION • DATA ANALYSIS • DASHBOARD BUILD OUTS & REPORTING/TRAFFIC • PPC • MARKETING • SOCIAL MEDIA
“We are beyond pleased with the SEO technical work that OWF has done for us. We have learned so much and continue to work with them to make sure we stay on top of our industry”