SEO AUDIT
www.cedar-rose.com
Ross Britten
4. Other Technical Considerations
Language Targeting
Schema and Structured Data
5. Content
Thin and Duplicate Content
Blog Content
6. Google Analytics
Incorrect GA Implementation
Event Tracking
Ecommerce Tracking
Prioritised Actions
Introduction
This technical SEO audit is intended to highlight areas on www.cedar-rose.com that are having a negative effect on visibility in organic search engine results pages (SERPs). By
fixing these issues, you will greatly improve how search engine bots crawl and index your
site, meaning it will be easier for them to find your content and display it in search results
pages. This will help boost your organic traffic.
I have broken these areas down into six main categories: On-Page and Hygiene, Crawlability, Technical, Other Technical Considerations, Content, and Google Analytics. For each of the issues identified, I have provided a recommendation in keeping with current SEO best practice.
To prioritise these recommendations and to ensure improved organic visibility, I have added
the following legend to each of the tasks:
A breakdown of these ratings is below. They reflect both the impact I expect the changes to
have and how much time and effort I anticipate the fixes will require.
Priority: LOW | MED | HIGH
Difficulty: LOW | MED | HIGH
1. On-Page and Hygiene
Title Tags
Title tags are used to signal to search engines what a page is about. They are an important on-page ranking factor. As such, we should optimise pages by using H1 tags to mark up page titles.
As a general rule, an H1 tag should be unique to its page, contain the keyword you want to target, and be limited to one per page.
Missing H1 Tags
Priority: LOW | Difficulty: LOW
The homepage is currently missing an H1 tag. This is common for homepages as they usually
have a logo in place of a title.
An H1 tag on the homepage helps ensure that the homepage appears in search results when someone searches for your company’s name.
Cedar Rose currently ranks in position one for the search term ‘cedar rose’ without an H1; however, I would recommend including an H1 on the homepage, as this will help maintain that visibility as part of SEO best practice.
Duplicate H1 Tags
For example, someone searching for ‘Cedar Rose bankruptcy check’ will see either of the following pages, as they both contain the same H1 tag ‘Bankruptcy Check’:
• http://www.cedar-rose.com/Product/Detail/399
• http://www.cedar-rose.com/Product/Detail/400
This creates confusion, as search engines do not know which page to display. Instead of choosing one, search engines will usually demote both pages, either pushing them down search results pages or not displaying them at all for the search term ‘Cedar Rose bankruptcy check’. This results in less traffic to these pages.
Duplicate H1 tags on this site stem from issues such as URL parameters. I have addressed each of these issues separately in this audit, as they also have wider SEO implications. Please refer to the URL Structure and Taxonomy section to address these issues and fix the duplicate H1 tags.
H2 – H6 Tags
Priority: MED | Difficulty: LOW
As well as the H1 tag, H2 – H6 tags can be used to help break up content in the form of
subheadings. In HTML coding, heading tags from H1 to H6 form a top-down hierarchy. The
most important heading should be marked up as an H1, and subsequent subheadings should
be marked up using H2, H3, H4 tags and so on. These subsequent subheadings can be used to
target secondary keywords.
This helps create association around a particular subject matter. For example, by creating a
page with the H1 ‘Bankruptcy Checks in Algeria’, and then creating subheadings with further
information, such as H2 ‘What is the bankruptcy process in Algeria?’, you can create more
association between your page and bankruptcy in Algeria. This means that you are more
likely to appear for the search term ‘bankruptcy in Algeria’.
There can be multiple H2 – H6 tags on the same page, as long as they follow a logical, nested hierarchy.
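To illustrate, the Algeria example above could be marked up as follows (a sketch only; the second subheading and placeholder copy are illustrative, not taken from the live site):

```html
<h1>Bankruptcy Checks in Algeria</h1>
<p>…introductory copy targeting the primary keyword…</p>

<h2>What is the bankruptcy process in Algeria?</h2>
<p>…supporting copy targeting the secondary keyword…</p>

<h2>How long does a bankruptcy check take?</h2>
<p>…further supporting copy…</p>
```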
Canonical Tags
Canonical tags are used to help search engines understand the relationship between similar
pages or variations of the same page. This is to make sure search engines display the correct
version in SERPs.
At present, there are no canonical tags on any page on www.cedar-rose.com. Each unique
page should have a self-referencing canonical tag as follows:
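For instance, a self-referencing canonical tag on the homepage would look like this, placed in the page’s `<head>`:

```html
<link rel="canonical" href="http://www.cedar-rose.com/" />
```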
Variations of the same page should have canonicals referencing the main page.
For example, the following two pages are localised versions of the same page
http://www.cedar-rose.com/product/detail/284
• http://www.cedar-rose.com/product/detail/284?countryId=49
• http://www.cedar-rose.com/product/detail/284?countryId=48
However, search engines will see these as three separate pages. This is having a negative impact on how search engines crawl and index the site. It will also make search engines think that this is duplicate content, which can lead to a penalty and removal from Google’s index.
These kinds of pages should reference the original page to avoid duplication and
cannibalisation in SERPs, i.e. both localised versions should have the following canonical
tag:
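That is, both parameter versions would carry the following tag in their `<head>`, pointing back to the master page:

```html
<link rel="canonical" href="http://www.cedar-rose.com/product/detail/284" />
```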
Recommendation: Add self-referencing canonical tags to all unique pages and add
canonical tags to variation pages referencing the original page.
Page Titles
Page titles are the blue links that appear in search results for your pages:
While page titles are not a direct ranking factor for organic search, they are the first thing a
user sees about your site and do influence user click-through rates (CTR) which can increase
organic traffic. As such, they should be both user and SEO-friendly to maximise CTR.
Current best practice for page titles adheres to the following guideline:
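The original guideline graphic has not survived extraction; a common formulation is ‘Primary Keyword | Brand’, kept under roughly 60 characters so it is not truncated in SERPs. As an illustrative sketch for a product page:

```html
<title>Bankruptcy Check in Algeria | Cedar Rose</title>
```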
At present, page titles on www.cedar-rose.com do not follow this guideline. There are currently 2,885 pages with the same page title ‘Cedar Rose’. This does not give the user any information about these pages, which makes them less likely to click on them.
Recommendation: Optimise page titles to current SEO best practice using the above guideline
Meta Descriptions
Priority: HIGH | Difficulty: LOW
Similar to page titles, meta descriptions are not direct ranking factors but can have a
positive effect on organic click-through rates.
For example, the following meta description is pulled directly from the content on this page:
This auto-generated meta description gives very little information about this particular
service, which means fewer people are likely to click on this link.
Best practice meta descriptions should:
• Contain a call-to-action
• Contain the target keyword
• Contain the brand name
• Be 150 to 160 characters long
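As a sketch, a hand-written meta description for the Bankruptcy Check page might look like this (the copy is illustrative only and would need to be written for each page):

```html
<meta name="description" content="Order a Cedar Rose bankruptcy check to verify the financial standing of companies and individuals across the Middle East. Get your report today.">
```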
Recommendation: Create unique meta descriptions for all html pages following SEO best
practice guidelines.
Alt Text
Search engines are currently unable to fully understand or contextualise images. As such, they rely on alt text to understand what an image portrays. Alt text should be a brief description of what the image contains, including relevant keywords or answers to search queries. This helps search engines crawl and index images, and also helps target specific keywords so that the page can rank for them.
By including alt text, www.cedar-rose.com can target secondary keywords on a page to help
increase visibility for those keywords.
For example, on the homepage you could include secondary keywords such as ‘Credit reports
in United Arab Emirates’ for the below image to help create association between your site
and credit reports in UAE. This will help increase visibility of your site when someone
searches for credit reports in UAE.
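In markup, this would look something like the following (the file name is hypothetical; the alt value is the suggested secondary keyword):

```html
<img src="/images/uae-credit-reports.jpg"
     alt="Credit reports in United Arab Emirates">
```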
Recommendation: Include alt text on every image to target secondary keywords and help
boost visibility for those terms
2. Crawlability
CSS Delivery
Multiple CSS files can have a negative impact on site load times. Page speed has become an increasingly important ranking factor, particularly on mobile devices, with faster sites favoured over slower sites in organic SERPs.
www.cedar-rose.com currently has four separate CSS files. Each time a user visits a page, a
browser must call each of these files before displaying the page. This can slow down how
long it takes to load a page.
Best practice is to combine CSS files into one main file so that browsers only have to spend
time calling and serving one resource.
SOURCE: VARVY.COM
Recommendation: Combine CSS files into one CSS file. Inline small amounts of CSS that are needed to render the page directly into the HTML.
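In practice, that means replacing the four separate stylesheet references with a single combined file, optionally inlining the small amount of critical above-the-fold CSS (file names are illustrative):

```html
<head>
  <style>
    /* small amount of critical above-the-fold CSS inlined here */
  </style>
  <link rel="stylesheet" href="/css/main.min.css">
</head>
```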
Render-Blocking CSS
Priority: MED | Difficulty: HIGH
Render blocking CSS is where a page cannot load until all the CSS has been called and
served. This slows down page load times and can have a negative impact on organic
visibility. This is especially true for mobile searches.
By default, CSS is naturally render blocking as a user-friendly page cannot load without CSS.
However, there are measures to reduce the amount of time it takes a browser to render CSS.
Unless specified, browsers will call multiple resources, such as html, CSS and JavaScript, at
the same time. A page will not fully render until all these resources have been processed.
This can lead to slow load times. Often, these resources do not all need to be called at the
same time.
This can be avoided by optimising a page’s critical render path. That is to say, prioritising
resources that are required to load content on a page above-the-fold, and deferring the rest.
That way, the user, and the search bot, see the content they came for but are not hindered by
scripts running in the background or below the fold.
SOURCE: VARVY.COM
We can optimise the critical render path to concentrate on above-the-fold content by ensuring
we deliver resources in a logical order, i.e., html first, then CSS, then JS:
SOURCE: VARVY.COM
Labelling CSS
Incorrectly labelled CSS files can result in browsers unnecessarily calling all CSS files at the same time, even those that are not needed. This can be avoided by labelling CSS files correctly so that a browser only renders the CSS files it needs.
Small CSS can be included in the HTML of a page to prioritise above-the-fold content. This
is useful because it reduces the number of external CSS files a browser has to call, speeding
up page load times.
Recommendation: Optimise critical render path by labelling CSS files correctly, combining
multiple CSS files into one, and in-lining above-fold CSS.
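Labelling here means using the `media` attribute so the browser only blocks rendering on stylesheets that apply to the current context; other files are still downloaded, but without blocking the initial render (file names are illustrative):

```html
<!-- Needed to render the page: blocks rendering -->
<link rel="stylesheet" href="/css/main.min.css">

<!-- Only applies when printing: does not block the initial render -->
<link rel="stylesheet" href="/css/print.css" media="print">

<!-- Only applies on wider viewports -->
<link rel="stylesheet" href="/css/widescreen.css" media="(min-width: 60em)">
```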
Minify CSS
Priority: MED | Difficulty: LOW
Minifying CSS refers to compressing CSS files. While the above measures will improve page
load times, minifying CSS files can save many bytes of data and speed up download and
parse times.
JavaScript Delivery
Similar to render-blocking CSS, JavaScript can block browsers from loading pages until all the JS files have been called and parsed. These files are often not needed to display user-focused content.
As with render blocking CSS, this can be combatted by optimising the critical render path for
above-the-fold content.
Multiple JavaScript Files
Priority: MED | Difficulty: HIGH
There are currently 12 JavaScript files on site. At the moment, browsers have to call and
parse each of these files separately. Combining all JS files into one can reduce the amount of
time it takes to load a webpage.
Minify JavaScript
Priority: MED | Difficulty: MED
As with CSS, minifying JavaScript files can save bytes of data and speed up download and parse times.
Deferring JavaScript
Priority: MED | Difficulty: HIGH
A lot of JavaScript is not essential to display content above-the-fold, yet browsers will call JS
files and parse them at the same time as HTML and CSS files. This slows down page load
times.
Non-essential JavaScript, that is to say JavaScript that is not needed to display content above
the fold, can be deferred until after the rest of the page has loaded.
Recommendation: Allow HTML and CSS to be parsed first, then run an internal script to call external JavaScript once the content has fully loaded.
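The simplest built-in mechanism for this is the `defer` attribute, which downloads the script in parallel but only executes it once the HTML has been parsed (the file path is hypothetical):

```html
<!-- Blocks HTML parsing while it downloads and executes -->
<script src="/scripts/main.js"></script>

<!-- Downloads in parallel; executes only after the HTML has been parsed -->
<script src="/scripts/main.js" defer></script>
```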
3. Technical
URL Structure and Taxonomy
Duplicate URLs
Priority: HIGH | Difficulty: LOW
URLs are case sensitive. For example, the following page has two URLs:
• http://www.cedar-rose.com/Registration
• http://www.cedar-rose.com/registration
Search engines will see this as two separate pages and will crawl and index them separately.
This can lead to duplicate content penalties. It can also waste our crawl budget, whereby
Google wastes resource crawling these two identical pages rather than crawling other pages
on the site. This can prevent some pages from being indexed.
Recommendation: Use a regex rule to 301 redirect all upper case URLs to their lower case
counterparts.
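On an Apache server this could be done with mod_rewrite, for example (a sketch; the exact rule depends on your server stack — on IIS/ASP.NET the equivalent would be a rewrite rule with a ToLower transform in web.config):

```apache
# httpd.conf / vhost config (RewriteMap is not available in .htaccess)
RewriteEngine On
RewriteMap lowercase int:tolower

# If the requested path contains any uppercase letters...
RewriteCond $1 [A-Z]
# ...301 redirect to the lowercased version
RewriteRule ^/?(.*)$ /${lowercase:$1} [R=301,L]
```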
Parameter URLs
Priority: HIGH | Difficulty: LOW
There are 2,817 URLs with added parameters. For example, the following URLs are all generated from a master URL template and carry query parameters:
• http://www.cedar-rose.com/CompanyList?&page=42
• http://www.cedar-rose.com/product/detail/284?countryId=157
• http://www.cedar-rose.com/Product/DownloadProductPdf?ProductId=412
Search engines see these as individual pages rather than variations of a master page. This can
have a negative impact on your crawl rate as search bots have to crawl each page individually
to analyse their content. This means that search engines have to crawl 2,800+ nearly identical
URLs each time they visit your site. The result is that search bots reduce how many times
they visit your site as it is too resource heavy. In addition, as the content is very similar,
search engines may also penalise you for duplicate content. This can lead to your site being
removed from search results.
However, we should look at a longer-term fix to address the URL structure of these pages to be more user- and SEO-friendly. Please find these recommendations below.
Company List URLs
Priority: HIGH | Difficulty: MED
I understand that this is a directory of all your contacts in your database for people to search
for specific companies. However, this offers very little benefit to users in organic search.
Most of these pages are not being crawled or indexed by Google as they are seen as
duplicates with very little content to benefit the user (refer to Parameter URLs above).
Those that are indexed also appear to be generic pages with very little information about the companies in question. See image below:
As these pages are so similar, they have no discerning information to distinguish them from
one another. They also do not have specified page titles or meta descriptions so Google auto-
generates these instead. (Please refer to User and SEO-friendly Page Titles and Meta
Descriptions). This results in very low click-through rates as users are unlikely to click on
these links.
However, there is a lot of search opportunity around people searching for lists of companies
in particular markets. For example, people do search for ‘companies in Bahrain’. We could
boost traffic by restructuring the Company List pages to rank for these kinds of searches,
rather than for individual companies.
Recommendation: Update the Company List pages into the following hierarchical
structure:
1. www.cedar-rose.com/company-register/
2. www.cedar-rose.com/company-register/companies-in-bahrain/
3. www.cedar-rose.com/company-register/companies-in-bahrain/page2
4. www.cedar-rose.com/company-register/companies-in-bahrain/page3 etc.
Explanation:
1. More people search for ‘company register’ than ‘company list’. Change the name of
this page from Company List to Company Register. This page would contain links to
company registers for each of the markets you operate in, i.e. a link to a list of
companies in Bahrain, Saudi Arabia, Algeria etc.
a. www.cedar-rose.com/company-register/companies-in-bahrain/
b. www.cedar-rose.com/company-register/companies-in-saudi-arabia/ etc.
2. Each market would then have its own list of companies in its own subdirectory
3. Use rel=prev/next navigation to navigate to subpages: page 2, 3 and so on.
4. Add a canonical tag to all subpages, e.g. page 2, 3 and so on, referencing the original
page, in this case www.cedar-rose.com/company-register/companies-in-bahrain/
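Putting points 3 and 4 together, page 2 of the Bahrain register would carry markup along these lines in its `<head>` (URLs follow the proposed structure above):

```html
<link rel="canonical" href="http://www.cedar-rose.com/company-register/companies-in-bahrain/">
<link rel="prev" href="http://www.cedar-rose.com/company-register/companies-in-bahrain/">
<link rel="next" href="http://www.cedar-rose.com/company-register/companies-in-bahrain/page3">
```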
Product URLs
Priority: HIGH | Difficulty: MED
Similar to the Company List pages, product pages have very little distinguishable information
in the URL. They are also very hard to navigate to both from search results pages and
internally.
To make these pages easier for users and search engines to understand, I would introduce a hierarchical structure to these URLs.
• www.cedar-rose.com/solutions/business-credit-report
You could then break this structure further to target credit report searches for specific
markets, i.e. ‘company credit reports in Algeria’ or ‘credit reports Saudi Arabia’
• www.cedar-rose.com/solutions/business-credit-report/business-credit-reports-algeria
• www.cedar-rose.com/solutions/business-credit-report/business-credit-reports-saudi-
arabia
This format can be rolled out across all your products and markets to maximise organic
traffic.
Please note, this will require further keyword research to ascertain a URL naming
convention based on search volumes across all products and markets
Solutions Navigation
I would also recommend creating the page www.cedar-rose.com/solutions to help with page
hierarchy and crawl efficiency. At the moment, your solutions are available through a drop
down menu. See below image:
This kind of navigation is hard for search engines to crawl and index, as they cannot understand the hierarchy of these pages. I would recommend creating a standalone page at www.cedar-rose.com/solutions, as this will help Google understand that these links are all the services you offer.
This would work similarly to, and replace, www.cedar-rose.com/products, but I would not
advise a filter function on this page as search engines need to be able to access these links.
I would also recommend working with a UX specialist to make sure this navigation works
for users and not just search bots.
Core Products
I would also recommend creating individual pages for your core products:
At the moment, these are only available via drop-down menus and do not have standalone pages of their own. You would improve visibility for these search terms if they had their own pages.
Each of these pages could then link to the individual products within each category. You
could then structure these into a hierarchical structure for each market, e.g.
www.cedar-rose.com/solutions/business-credit-report/business-credit-report-algeria
(see Product URLs above).
There is no need to remove your current drop-down navigation for Solutions, as this can
work in conjunction.
At the moment, there are two links for ‘Due Diligence Report’, ‘Bankruptcy Check’, ‘Court
Records Check’ etc. in the drop-down Solutions navigation (see above image). One is for
companies, the other is for individuals. This is difficult for search engines to understand as it
doesn’t sit within a URL structure and creates conflict.
As these two links have the same anchor text, but link to different product pages, you are
creating confusion for crawlers.
Rather than having two different navigations for these, I would direct users to one product
page. For example:
• www.cedar-rose.com/solutions/bankruptcy-check
The user can then choose whether this is for a company or an individual when ordering the
product. This can also work for market level product pages:
• www.cedar-rose.com/solutions/bankruptcy-check/bankruptcy-check-algeria
This solution would also help consolidate organic searches around a single product page,
rather than having one product page for companies and one for individuals, which can cause
confusion for both users and search engines.
Empty Subdirectories
Priority: HIGH | Difficulty: LOW
Empty subdirectories are folders within a URL that return a 404 error. For example, the parent directories of the product pages both return 404s:
• http://www.cedar-rose.com/product
• http://www.cedar-rose.com/product/detail
These empty subdirectories are causing crawling issues as search engine bots will expect to
see accessible content in these directories. As there is no content, or internal linking, in these
subdirectories, search bots will often cease their crawl here and not continue to crawl all the
remaining subdirectories.
• http://www.cedar-rose.com/product/detail/399 << Unlikely to get crawled
This can prevent search engines from indexing your product pages. It also places them lower in your site’s hierarchy, as search engines see them as unimportant pages.
Homepage Cannibalisation
Priority: HIGH | Difficulty: LOW
SEO cannibalisation is where two or more pages compete for the same keywords or search
queries. This can lead to duplicate content, in which case these pages may be penalised and
drop in rank, or it can result in reduced visibility where both pages are demoted as search
engines are confused as to which page to display.
This is currently what is happening to the homepage. The following pages are all indexed,
duplicate versions of each other:
• http://www.cedar-rose.com/Index
• http://www.cedar-rose.com/
There is also a similar issue involving pages with the following format as highlighted above
in Duplicate URLs:
• http://www.cedar-rose.com/about
• http://www.cedar-rose.com/About
• http://www.cedar-rose.com/Product/Detail/402
• http://www.cedar-rose.com/product/detail/402
Recommendation: 301 redirect duplicate pages to the main page. Add canonical tags if a duplicate page needs to stay live. Write a regex rule to redirect uppercase URLs to lower case.
Internal Linking
Priority: MED | Difficulty: MED
Internal linking is an important ranking factor for search engines. It helps them identify the most important pages on a website. This is usually the homepage.
At present, www.cedar-rose.com has many pages with more internal links than the homepage. This could mean the site is not optimising crawl budget. When search bots
crawl a website, they find and navigate from what they think is the most important page
downwards in a hierarchical manner. If we have too many internal links pointing to a page
that is not that important, search engines will think that this page is the most important and
will crawl it more frequently. This could result in more important pages, such as the
homepage or product pages, being crawled less often. This could decrease their overall
visibility.
Ideally, we would like internal linking to be in a hierarchical order with the homepage, About
page, Contact page and Solutions pages at the top, and then subsequent product pages in
order of importance.
I have used two different sets of data to calculate internal linking. The first looks at how many internal URLs a single page has pointing at it:
The second takes into account the importance of those pages based on how many internal
links they have and gives them each an internal page rank. This is how search bots will
interpret the hierarchy among these pages.
In both sets of data, the contact page has a disproportionately high internal page rank. This
means that there are more links pointing to this page than any other. Search engines will
therefore see this as the most important page and will crawl it more frequently than the other
pages. It will also prioritise any subdirectories that sit under /Contact/.
From an SEO perspective, it would make more sense to make the homepage the most
crawled page. This will ensure all subdirectories are crawled.
Log File Analysis
Uncrawled Pages
Priority: MED | Difficulty: MED
There are currently 246 pages that should be accessible to search bots, but have not been
crawled or indexed in the last 30 days. These are all Company List pages.
Google has seen these pages as low priority and will therefore crawl them infrequently or not at all. This is likely due to crawl budget: Google will assign a set number of URLs to crawl in one go, prioritised by internal page rank. It will see low-priority pages such as these as a waste of resource and will not crawl them.
Ideally, all accessible pages should be crawled by search bots. This helps with indexing the
whole site, and encourages search bots to revisit the site frequently. We can improve crawl
rate by creating a clear URL structure that helps search bots navigate the site efficiently. This
can be achieved by working through the recommendations suggested in URL Structure and
Taxonomy above.
Fortunately, despite the internal page rank issues cited above, Googlebot still seems to be
crawling the homepage the most often. However, internal page rank is having an impact on
how often other pages are crawled.
The following pages are the top 20 most crawled pages over the last 30 days:
http://www.cedar-rose.com/scripts/bootstrap/css/bootstrap.css (20 crawls)
http://www.cedar-rose.com/scripts/nprogress/nprogress.css (20 crawls)
http://www.cedar-rose.com/CompanyList?&page=2405 (18 crawls)

Wasting Crawl Budget
There seems to be a big disparity between the number of times the homepage is crawled and
the number of times other pages are crawled. For example, the homepage has been crawled
169 times in the last month, but the Contact page has only been crawled 29 times. None of
the other top level pages, such as About Us, FAQ or any of the product pages are in the top
20.
This is because Googlebot is placing too much emphasis on the Company List pages. As
there are 2,754 Company List URLs, Google is spending a lot of its crawl budget to try and
crawl and index all these pages. This will reduce how many times product pages are crawled.
In addition, the vast majority of Company List pages have only been crawled once in the last
30 days. This is indicative of a low crawl rate. Ideally, pages should be crawled several times
a week to ensure organic visibility in search results pages.
Recommendation: Update URL structure and Taxonomy to create page hierarchy and
boost crawl rate of priority pages and across the domain.
This will be having a negative impact on crawl rate, as Google is wasting resources trying to find these pages.
Recommendation: 301 redirect these pages to the homepage (see accompanying excel doc)
XML Sitemap
Priority: HIGH | Difficulty: LOW
An XML sitemap should contain all the URLs on the site that you want search bots to crawl and index. Sitemaps should also be updated regularly to make sure the URLs are up to date.
Recommendation: Most CMS will have plugins that allow you to auto-generate sitemaps
on a regular basis. I would recommend creating an XML sitemap that includes every html
page on the site.
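For reference, a minimal sitemap following the sitemaps.org protocol looks like this (URLs shown are examples only; every indexable page would get its own entry):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.cedar-rose.com/</loc>
  </url>
  <url>
    <loc>http://www.cedar-rose.com/about</loc>
  </url>
  <!-- one <url> entry per indexable HTML page -->
</urlset>
```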
Robots.txt
Priority: LOW | Difficulty: LOW
It is best practice to include a reference to your XML sitemap in your robots.txt file to help
search engines locate your sitemap. E.g.:
Sitemap: http://www.cedar-rose.com/sitemap.xml
Location Declaration
Priority: HIGH | Difficulty: LOW
Domains that target specific markets or countries can declare their target location in Google
Search Console. For global domains, or domains that target more than one country, I would
not recommend using this feature in Google Search Console as you are telling Google to
restrict visibility of the domain to that particular market only.
Recommendation: Select ‘Unlisted’ to ensure you target a global audience and do not limit
visibility to one country. Please note, you cannot select market regions such as MENA,
EMEA, etc. For this it is best to target a global audience.
Non-HTTPS Content
Priority: HIGH | Difficulty: HIGH
Site security is becoming an increasingly important ranking factor for search engines, with HTTPS pages favoured over their HTTP counterparts. Google also tends to index HTTPS before HTTP where both are available.
At the moment, www.cedar-rose.com is served over HTTP only. This means that Google is more likely to favour other, HTTPS-protected sites when ranking your domain.
Recommendation: Migrate the site to HTTPS, 301 redirecting all HTTP URLs to their HTTPS counterparts.
Site Speed
Priority: MED | Difficulty: HIGH
Site speed is becoming an increasingly important ranking factor, with Google favouring faster sites over slower ones. This is especially true on mobile devices.
Desktop
The desktop version of www.cedar-rose.com currently scores 48/100 for speed. This will be
having a negative impact on rank as well as user experience. (Data from Google Page Speed
Insights tool).
Mobile
The mobile version of www.cedar-rose.com currently scores 46/100 for speed. This will be
having more of a negative effect as mobile algorithms take page speed into account when
displaying search results. Faster sites will tend to have higher organic rank. (Data from
Google Page Speed Insights tool).
Recommendation: Improve site speed by implementing the CSS and JavaScript delivery recommendations in the Crawlability section above.
Backlink Analysis
Priority: HIGH | Difficulty: MED
Backlinks are links from other domains that reference your domain. Google uses these
backlinks to work out how important your website is in relation to other websites. In general,
the more backlinks a domain has, the more important it is.
It is now best practice to periodically audit your domain’s backlink profile to make sure there
are no low quality or spammy backlinks pointing to your site.
At the moment, there are 5,222 backlinks from other domains pointing to www.cedar-
rose.com.
4. Other Technical Considerations
Language Targeting
Priority: MED | Difficulty: MED
The site is predominantly optimised for English language speakers. This limits potential traffic to people searching for your services in English. I would strongly recommend creating an Arabic language version of the site to capture search traffic in this language.
At the moment, the Company List does target searches for these companies in Arabic.
However, as these Company List pages have several crawl issues, they are currently having
little impact on driving traffic. By combining Arabic and English companies in one Company
List section, you are also creating confusion for Google’s search bots as they will use
different search bots for different languages.
If resource does not allow for translation of the whole domain, I would recommend
prioritising a separate Arabic version of the Company List section. For example:
English Content:
• www.cedar-rose.com/company-register
• www.cedar-rose.com/company-register/companies-in-bahrain
Arabic Content:
• www.cedar-rose.com/ar/company-register
• www.cedar-rose.com/ar/company-register/companies-in-bahrain
Separating this out would help with crawling and indexing the Arabic content. It will also create a more consistent user experience, and you should be able to increase visibility for company searches in Arabic.
Please note, I would also recommend using the hreflang attribute if creating content in
different languages so that search engines better understand the relationship between these
pages.
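As a sketch, each English page would reference its Arabic counterpart and vice versa (the /ar/ URL structure follows the proposal above and is an assumption):

```html
<link rel="alternate" hreflang="en" href="http://www.cedar-rose.com/company-register/companies-in-bahrain" />
<link rel="alternate" hreflang="ar" href="http://www.cedar-rose.com/ar/company-register/companies-in-bahrain" />
```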
Schema and Structured Data
Schema.org (often called Schema) is a specific vocabulary of tags (or microdata) that you can add to your HTML to improve the way your pages are represented in SERPs. It is a way for webmasters to provide the information search engines need to understand the context of your content and return the best possible search results.
Although schema is not a ranking factor, marking-up data can have a positive effect on
organic visibility and CTR.
For example, reviews and ratings can be marked up using schema to create star ratings in
SERPs, which can encourage more click-throughs. Other content worth marking up includes:
• Organisation information
• Blog posts
• Products and services
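As a minimal sketch, the Organisation information could be marked up with a JSON-LD block in the site-wide page template. The values shown here, including the logo path, are placeholders to be replaced with the real details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Cedar Rose",
  "url": "https://www.cedar-rose.com",
  "logo": "https://www.cedar-rose.com/logo.png"
}
</script>
```

Markup of this kind can be validated with Google's structured data testing tools before deployment.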
5. Content
Thin and Duplicate Content
Priority: HIGH | Difficulty: HIGH
Thin and duplicate content can result in penalties from search engines. This could lead to
demoted visibility in organic search.
Thin content refers to pages that offer very little useful information to the user, such as:
http://www.cedar-rose.com/product/detail/284?countryId=90
This kind of content offers a poor user experience. Google takes this into account when
crawling the site. If there is too much content like this, it can penalise the site and remove
these pages from its index.
Similarly, duplicate content can also lead to penalties. Duplicate content occurs when more
than one page has identical or near-identical content. As well as offering a poor user
experience, it can be interpreted by search bots as an attempt to game Google's algorithms by
creating lots of similar pages to improve organic visibility.
In addition to this, search bots will find it difficult to distinguish between pages with similar
content. This can lead to a loss of visibility in search results for these pages.
As an example, the following pages all have duplicate and thin content:
• http://www.cedar-rose.com/product/detail/284?countryId=90
• http://www.cedar-rose.com/product/detail/288
• http://www.cedar-rose.com/product/detail/284?countryId=49
• http://www.cedar-rose.com/product/detail/402
This affects all Company List and Product pages and leaves them open to penalty.
Recommendation: To avoid penalties, create rich content that is beneficial to the user and
explains each product in detail.
Blog Content
Priority: HIGH | Difficulty: MED
At the time of this audit, there is no long-tail content on the domain. Long-tail content can be
used to target new traffic and increase overall organic visibility. Regularly publishing content
also encourages search engines to revisit your domain to crawl and index new content. One
of the best ways to target long-tail keywords is through a regular blog.
By creating long-tail content strategies, I have seen clients’ organic traffic increase by up to
150% year-on-year. I would highly recommend using content as a strategy to boost traffic to
your domain.
Recommendation: Create a blog on the domain to target long-tail keywords based on
keyword research and search volume. Separate keyword research would be required to
identify traffic opportunities and create a content strategy.
6. Google Analytics
Incorrect GA Implementation
Priority: HIGH | Difficulty: LOW
It appears that Google Analytics has not been correctly implemented on the domain. As such,
there is very little data around traffic, users, navigation and site usage that can help inform
future web strategies.
Without correct GA implementation, it also makes it difficult to measure ROI from SEO or
UX strategies you have invested in. I would highly recommend fixing GA implementation
before investing any further in SEO or UX to make sure you can accurately measure ROI for
these strategies.
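For reference, a standard gtag.js implementation is placed in the head of every page on the domain. The property ID below is a placeholder to be swapped for your own GA property ID:

```html
<!-- Global site tag (gtag.js) - placed in the <head> of every page -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-X"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXX-X'); // replace with your GA property ID
</script>
```

Once this snippet is firing on every page, traffic, user and navigation data will begin populating the GA property.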
Event Tracking
Priority: HIGH | Difficulty: MED
There is also no event tracking enabled in your GA setup. Event tracking allows us to
measure how users interact with the site by assigning ‘events’ to certain interactions. For
example, it would be useful to know how many users click on particular links, add products
to the cart, download certain content etc. This can help provide valuable data and insights for
future digital strategies such as Conversion Rate Optimisation.
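As a hedged example of what this looks like in practice (the element ID, category and label below are hypothetical, to be replaced with the interactions you want to measure):

```html
<script>
  // Hypothetical example: send a GA event when a brochure download link is clicked
  document.querySelector('#brochure-download').addEventListener('click', function () {
    gtag('event', 'download', {
      'event_category': 'Content',
      'event_label': 'Company Brochure'
    });
  });
</script>
```

Events recorded this way appear in GA's Behaviour reports and can also be used as goals for conversion measurement.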
Ecommerce Tracking
Priority: HIGH | Difficulty: MED
Similarly, there is no ecommerce data available in your GA setup. This makes it difficult to
assign an accurate ROI for any digital strategies you have invested in.
Ecommerce tracking allows you to measure how much revenue you generate on particular
pages, which products perform better than others, and which channels (such as organic
traffic, PPC traffic, social traffic) perform the best. This enables you to direct resources
towards the most appropriate strategies.
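Once ecommerce tracking is enabled in the GA property, a completed purchase can be reported from the order confirmation page with something like the following (the transaction ID, product ID, name and values are placeholders):

```html
<script>
  // Hypothetical purchase event fired on the order confirmation page
  gtag('event', 'purchase', {
    'transaction_id': 'T12345',
    'value': 120.00,
    'currency': 'USD',
    'items': [{
      'id': 'CR-REPORT-284',        // placeholder product ID
      'name': 'Company Credit Report',
      'price': 120.00,
      'quantity': 1
    }]
  });
</script>
```

With this in place, GA can attribute revenue to individual pages, products and traffic channels.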
I am happy to provide a quote and summary of what would be needed to set this up
correctly.
Prioritised Actions
I would recommend tackling the above issues in the following order: