SEOzone 2015 - Mark Thomas - 5 Actionable Technical SEO Tips
TRANSCRIPT
How valuable was Search in 2015?
@SearchMATH
Survey (Turkey):

Have you ever intentionally browsed products at a store, but decided to buy them online?
Yes: 68% | No: 32%

Have you ever intentionally browsed products online, but decided to buy them in-store?
Yes: 70% | No: 30%
“Web-rooming” is just as important as
“showrooming”
56% say the first thing they do when researching a purchase is to go to a search engine.
Today I’m going to look at “ON-THE-PAGE FACTORS”, AKA “TECHNICAL SEO”.

ON-THE-PAGE FACTORS:
§ CONTENT
§ ARCHITECTURE
§ HTML
Why does TECHNICAL SEO matter?
Ok… nice article
But how healthy is their ON-THE-PAGE SEO?
Let’s test the User Experience (UX)
But do 404s hurt my site?
YES
[Chart: Website ‘Health’ impact on Commercial Performance. X-axis: Website Health (1-10); Y-axis: Commercial Performance (Low to High).]
What do the world’s top SEO
experts say?
"Even a basic understanding of what to look for in technical SEO can get you
far…
…So many people today focus too heavily on off-page SEO, but if a site is technically flawed, it won't matter how many links you have or how good your
content is.”
Isolate & Dominate Your Base Metric:
1. Organic revenue
2. Visits compared to last month
3. Visits compared year-over-year
http://searchengineland.com/win-battle-proving-seos-value-201328
“The biggest problems we have are tech problems. Often webmasters [are] trying to be too clever and give confusing signals…

…Send clear, consistent and obvious signals.”

John Mueller, Webmaster Trends Analyst, Google
http://managinggreatness.com/2014/01/27/john-mueller-closing-keynote-at-smx-israel/
GOOGLE RECOMMENDS CRAWLING SEPARATELY

In a Google Webmaster Hangout on 16th October, John Mueller recommended running a separate crawl to help identify and resolve technical issues that could be causing problems and delays when Google tries to crawl your site.
“We get kind of lost crawling all of these unnecessary URLs and we might not be able to crawl your new updated content.”
I’m going to show you a case study to remind us of some of the power at our disposal.
Let’s begin…
Where do I start with a technical review?
Jon suggests we “CRAWL”
TIP #1
Independent web crawler software imitates Googlebot to produce rich reports detailing opportunities to achieve perfect website architecture.
“CRAWL” = DISCOVERY
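A crawl is, at heart, breadth-first link discovery: fetch a page, collect its links, queue anything new. Here is a minimal sketch using only Python’s standard library; the three-page `SITE` dict and the `fetch` callback are hypothetical stand-ins for real HTTP GETs, not part of any crawler product mentioned in this talk:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urldefrag, urljoin

class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag fed to it."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch):
    """Breadth-first discovery of every URL reachable from start_url.
    `fetch(url)` returns the page's HTML; in a real crawl it would be
    an HTTP GET that honours robots.txt and a crawl delay."""
    seen = {start_url}
    frontier = deque([start_url])
    while frontier:
        url = frontier.popleft()
        extractor = LinkExtractor()
        extractor.feed(fetch(url))
        for href in extractor.links:
            # Resolve relative links and drop #fragments so each
            # page is counted once.
            absolute, _fragment = urldefrag(urljoin(url, href))
            if absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
    return seen

# Hypothetical three-page site, served from memory for illustration.
SITE = {
    "https://example.com/": '<a href="/a">A</a> <a href="/b">B</a>',
    "https://example.com/a": '<a href="/">home</a>',
    "https://example.com/b": '<a href="/a#reviews">A again</a>',
}

pages = crawl("https://example.com/", lambda url: SITE.get(url, ""))
print(sorted(pages))
```

The size of `pages` is exactly the number the next slides say too many web managers don’t know: how many pages the site actually has.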
An embarrassing truth for too many web managers…
They don’t even know how many pages they have on their website
Let alone how many issues are lurking below the surface
Establish a clear picture of your website and
then consider how it fits into the URL universe
Recommend 5 actions from your audit:
• 3 Quick-wins
• 2 Long-term wins
TIP #2
What’s the story with these rel=canonical links?
“Including a rel=canonical link in your webpage is a strong hint to search engines about your preferred version to index among duplicate pages on the web.”
“rel=canonical can be a bit tricky because it’s not very obvious when there’s a misconfiguration.”
www.mavi.com
rel=canonical to a Disallowed URL
Audit your canonical tags quarterly
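One way to catch the misconfiguration shown above (a rel=canonical pointing at a URL that robots.txt disallows, so search engines are told to prefer a page they may never crawl) is to extract each page’s canonical and test it against your robots rules. A minimal sketch with Python’s standard library; the rules, HTML, and URLs below are made-up examples, not Mavi’s actual configuration:

```python
from html.parser import HTMLParser
from urllib import robotparser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            d = dict(attrs)
            if d.get("rel") == "canonical":
                self.canonical = d.get("href")

def audit_canonical(html, robots):
    """Flag a missing canonical, or one that robots.txt disallows."""
    finder = CanonicalFinder()
    finder.feed(html)
    issues = []
    if finder.canonical is None:
        issues.append("missing rel=canonical")
    elif not robots.can_fetch("*", finder.canonical):
        issues.append("rel=canonical points to a disallowed URL")
    return issues

# Hypothetical rules and page, mirroring the slide's misconfiguration.
robots = robotparser.RobotFileParser()
robots.parse(["User-agent: *", "Disallow: /private/"])
html = '<link rel="canonical" href="https://example.com/private/page">'
issues = audit_canonical(html, robots)
print(issues)
```

Run over a full crawl’s worth of pages each quarter, this turns the “audit your canonical tags quarterly” advice into a mechanical check rather than a manual one.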
TIP #3
Your robots.txt file holds the power over crawlers. Monitor it for free with Robotto.org.
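Python’s standard library ships a robots.txt parser, so alongside a monitoring service you can check locally what a given crawler is allowed to fetch. The rules and URLs below are hypothetical:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Hypothetical rules; in practice call
# rp.set_url("https://yoursite.com/robots.txt") and rp.read().
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /checkout/",
    "",
    "User-agent: *",
    "Disallow: /",
])

# Googlebot matches its own group, so only /checkout/ is off-limits.
ok = rp.can_fetch("Googlebot", "https://example.com/products/jeans")
blocked = rp.can_fetch("Googlebot", "https://example.com/checkout/cart")
print(ok, blocked)
```

Checks like this are handy before deploying robots.txt changes, since a one-character mistake can block an entire site.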
TIP #4
So, how can I fix this issue?
Adding this directive to the robots.txt is quicker, cleaner, and easier to manage than getting a meta noindex added to specific pages.
But ultimately the meta canonical needs amending to…
DeepCrawl already supports the robots.txt noindex directive: check which pages are being noindexed in your report via Indexation > Non-Indexable Pages > Noindex Pages.
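As an illustration only, a robots.txt noindex rule looked like this; the `/filter/` path is a hypothetical example, and note this directive was only ever unofficially honoured by Google (and was formally dropped in 2019), so it should not be relied on as the sole de-indexing mechanism:

```
User-agent: *
# Unofficial directive: hint that URLs under this path
# should be dropped from the index.
Noindex: /filter/
```

Ultimately, as the slide says, the canonical itself still needs amending; the robots.txt rule is the quick win while that fix is scheduled.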
Constantly optimise your architecture to improve crawl efficiency.
TIP #5
So, remember:
1) Crawl your website
2) 5 actions per audit
3) Audit your canonicals
4) Try robotto.org for FREE
5) Optimise your crawl efficiency
Thank you! Please say “hi” @SearchMATH