Why does my Conversion Rate Suck? Craig Sullivan, Senior Optimisation Consultant


DESCRIPTION

Craig Sullivan, Senior Optimisation Consultant, covers the top 10 reasons why your conversion rate might suck. Packed with actionable tips and resources, this presentation is for anyone wanting to improve their conversion optimisation. Craig covers common problems and topic areas such as issues with Google Analytics setup, inputs, tools, testing and testing cycles, product cycles, photo UX, how to analyse statistics and data, segmentation, and multi-channel optimisation. The resource pack also includes a maturity model, crowd-sourced UX, collaborative tools, testing tools for CRO & QA, a Belron methodology example, and CRO and testing resources.

TRANSCRIPT

Why does my Conversion Rate Suck?

17th Oct 2013


Insight - Inputs: #FAIL

• Competitor copying
• Guessing
• Dice rolling
• An article the CEO read
• Competitor change
• Panic
• Ego
• Opinion
• Cherished notions
• Marketing whims
• Cosmic rays
• Not ‘on brand’ enough
• IT inflexibility
• Internal company needs
• Some dumbass consultant
• Shiny feature blindness
• Knee-jerk reactions

Insight - Inputs

• Segmentation
• Surveys
• Sales and Call Centre
• Session replay
• Social analytics
• Customer contact
• Eye tracking
• Usability testing
• Forms analytics
• Search analytics
• Voice of Customer
• Market research
• A/B and MVT testing
• Big & unstructured data
• Web analytics
• Competitor evals
• Customer services


• Invest continually in Analytics instrumentation, tools & people

• Use an Agile, iterative, Cross-silo, One team project culture

• Prefer collaborative tools to having lots of meetings

• Prioritise development based on numbers and insight

• Practice real continuous product improvement, not SLED

• Source photos and copy that support persuasion and utility

• Have cross channel, cross device design, testing and QA

• Segment their data for valuable insights, every test or change

• Continually try to reduce cycle (iteration) time in their process

• Blend ‘long’ design, continuous improvement AND split tests

• Make optimisation the engine of change, not the slave of ego

• See the Maturity Model in the resource pack

• Belron – Ed Colley

• Dell – Nazli Yuzak

• Shop Direct – Paul Postance (now with EE)

• Expedia – Oliver Paton

• Schuh – Stuart McMillan

• TSR Group – Pete Taylor

• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann

• Gov.uk – Adam Bailin (now with the BBC)

Read the gov.uk principles: www.gov.uk/designprinciples

And my personal favourites of 2013 – Airbnb and Expensify


Photograph your receipts and have them scanned.

Forward expenses for import via email.

Saves lots of time and error-prone manual entry.

Is there a way to fix this then?


Email: sullivac@gmail.com

Twitter: @OptimiseOrDie

LinkedIn: linkd.in/pvrg14

More reading. Download the slides! Questions…

RESOURCE PACK

• Maturity model

• Crowdsourced UX

• Collaborative tools

• Testing tools for CRO & QA

• Belron methodology example

• CRO and testing resources

1 - Maturity Model

(Dimensions: Culture, Process, Mission, Testing focus, Analytics focus, Insight methods)

Level 1 - Starter Level
• Culture: Ad hoc, local heroes, ‘chaotic good’
• Process: Outline process, small team
• Mission: Get buy-in
• Testing focus: Guessing, A/B testing, low hanging fruit
• Analytics focus: Bounce rates, big volume landing pages
• Insight methods: Basic tools, analytics, surveys, Contact Centre, low budget usability

Level 2 - Early maturity
• Culture: Dedicated team
• Mission: Prove ROI
• Testing focus: + Multivariate, session replay, volume opportunities, no segments
• Analytics focus: + Funnel analysis, low converting & high loss pages
• Insight methods: + Regular usability testing/research, prototyping, session replay, onsite feedback

Level 3 - Serious testing
• Culture: Cross-silo team
• Process: Well developed
• Mission: Scale the testing
• Testing focus: + Systematic tests, funnel optimisation, call tracking, some segments, micro testing
• Analytics focus: + Funnel fixes, forms analytics, channel switches; still a single channel picture
• Insight methods: + User Centered Design, layered feedback, mini product tests

Level 4 - Core business value
• Culture: Ninja team
• Process: Streamlined
• Mission: Mine value
• Testing focus: + Cross channel testing, integrated CRO and analytics, segmentation
• Analytics focus: + Offline integration
• Insight methods: + Customer sat scores tied to UX, rapid iterative testing and design

Level 5 - You rock, awesomely
• Culture: Testing in the DNA
• Process: Company wide
• Mission: Continual improvement
• Testing focus: + Spread tool use, dynamic adaptive targeting, machine learning, realtime
• Analytics focus: + Multichannel funnels, cross channel synergy
• Insight methods: + All channel view of customer, driving offline using online, all promotion driven by testing

2 - UX Crowd tools

Remote UX tools (P=Panel, S=Site recruited, B=Both)

Usertesting (B) www.usertesting.com

Userlytics (B) www.userlytics.com

Userzoom (S) www.userzoom.com

Intuition HQ (S) www.intuitionhq.com

Mechanical turk (S) www.mechanicalturk.com

Loop11 (S) www.loop11.com

Open Hallway (S) www.openhallway.com

What Users Do (P) www.whatusersdo.com

Feedback army (P) www.feedbackarmy.com

User feel (P) www.userfeel.com

Ethnio (For Recruiting) www.ethnio.com

Feedback on Prototypes / Mockups

Pidoco www.pidoco.com

Verify from Zurb www.verifyapp.com

Five second test www.fivesecondtest.com

Conceptshare www.conceptshare.com

Usabilla www.usabilla.com


3 - Collaborative Tools


3.1 - Join.me

3.2 - Pivotal Tracker

3.3 - Trello

3.4 - Basecamp

3.5 - Google Docs and Automation

• Lots of people don’t know this

• Serious time is getting wasted on pulling and preparing data

• Use the Google API to roll your own reports straight into Big G

• Google Analytics + API + Google Docs integration = A BETTER LIFE!

• Hack your way to having more productive weeks

• Learn how to do this to make completely custom reports
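To make that automation concrete, here is a minimal sketch, assuming the (legacy, 2013-era) Core Reporting API v3 with a pre-obtained OAuth2 access token; the token, view ID, metrics and output filename are illustrative placeholders, not from the deck.

```python
# Minimal sketch: pull a report from the (legacy) Google Analytics
# Core Reporting API v3 and dump it to CSV for a spreadsheet import.
# ACCESS_TOKEN and PROFILE_ID are placeholders -- you'd obtain the
# token via OAuth2 and look up the view (profile) ID in GA admin.
import csv
import requests

ACCESS_TOKEN = "ya29.your-oauth2-token"   # placeholder
PROFILE_ID = "ga:12345678"                # placeholder GA view ID

resp = requests.get(
    "https://www.googleapis.com/analytics/v3/data/ga",
    params={
        "ids": PROFILE_ID,
        "start-date": "30daysAgo",
        "end-date": "yesterday",
        "metrics": "ga:sessions,ga:transactions",
        "dimensions": "ga:deviceCategory",
    },
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
)
resp.raise_for_status()
report = resp.json()

# Header row + data rows, ready to drop straight into a Google Sheet.
with open("conversion_by_device.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([h["name"] for h in report["columnHeaders"]])
    writer.writerows(report.get("rows", []))
```

From there, a scheduled job (or Apps Script on the Sheets side) turns this into the completely custom, always-fresh reports the slide describes.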

3.6 - Cloud Collaboration: LucidChart

3.7 - Cloud Collaboration: Webnotes

3.8 - Cloud Collaboration: Protonotes

3.9 - Cloud Collaboration: Conceptshare

4 - QA and Testing tools

Email testing: www.litmus.com, www.returnpath.com, www.lyris.com

Browser testing: www.crossbrowsertesting.com, www.cloudtesting.com, www.multibrowserviewer.com, www.saucelabs.com

Mobile devices: www.perfectomobile.com, www.deviceanywhere.com, www.mobilexweb.com/emulators, www.opendevicelab.com

5 - Methodologies - Lean UX

“The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles.”

Positive:
– Lightweight and very fast methods
– Realtime or rapid improvements
– Documentation light, value high
– Low on wastage and frippery
– Fast time to market, then optimise
– Allows you to pivot into new areas

Negative:
– Often needs user test feedback to steer the development, as data alone is not enough
– Bosses distrust stuff where the outcome isn’t known

5 - Agile UX / UCD / Collaborative Design

“An integration of User Experience Design and Agile* Software Development Methodologies” (*Sometimes)

Cycle: Research → Concept → Wireframe → Prototype → Test → Analyse

Positive:
– User centric
– Goals met substantially
– Rapid time to market (especially when using Agile iterations)

Negative:
– Without quant data, user goals can drive the show – missing the business sweet spot
– Some people find it hard to integrate with siloed teams
– Doesn’t work with waterfall IMHO


5 - Lean Conversion Optimisation

“A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation.”

Positive:
– A blend of several techniques
– Multiple sources of Qual and Quant data aids triangulation
– CRO analytics focus drives unearned value inside all products

Negative:
– Needs a one-team approach with a strong PM who is a polymath (Commercial, Analytics, UX, Technical)
– Only works if your teams can take the pace – you might be surprised though!

5 - Lean CRO

Cycle: Inspection → Immersion → Identify → Triage & Triangulate → Outcome Streams → Instrument → Measure → Learn

5 - Triage and Triangulation

• Starts with the analytics data

• Then UX and user journey walkthrough from SERPS -> key paths

• Then back to analytics data for a whole range of reports:

• Segmented reporting, Traffic sources, Device viewport and browser, Platform (tablet, mobile, desktop) and many more

• We use other tools or insight sources to help form hypotheses

• We triangulate with other data where possible

• We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk)

• A simple quadrant shows the value clusters

• We then WORK the highest and easiest scores by…

• Turning every opportunity spotted into an OUTCOME

“This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift.”
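As a rough illustration of that quadrant, here is a hypothetical sketch; the 1-5 scales, the threshold of 3 and the bucket labels are my assumptions, not Craig’s actual scoring scheme.

```python
# Illustrative sketch of the uplift-vs-difficulty quadrant described
# above. The 1-5 scoring scale and the threshold of 3 are assumptions.
from dataclasses import dataclass

@dataclass
class Opportunity:
    name: str
    uplift: int      # estimated uplift, 1 (low) to 5 (high)
    difficulty: int  # time/resource/complexity/risk, 1 (easy) to 5 (hard)

def quadrant(opp: Opportunity) -> str:
    high_uplift = opp.uplift >= 3
    easy = opp.difficulty < 3
    if high_uplift and easy:
        return "WORK FIRST"   # highest and easiest: do these now
    if high_uplift:
        return "PLAN"         # valuable but hard: schedule properly
    if easy:
        return "QUICK WIN"    # cheap micro-opportunities
    return "PARK"             # low value, high effort

opps = [
    Opportunity("Fix mobile checkout button", uplift=5, difficulty=1),
    Opportunity("Rebuild basket page", uplift=4, difficulty=5),
    Opportunity("Tweak footer links", uplift=1, difficulty=1),
]
# Sort so the highest-uplift, lowest-difficulty items surface first.
for o in sorted(opps, key=lambda o: o.difficulty - o.uplift):
    print(f"{quadrant(o):10s} {o.name}")
```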

5 - The Bucket Methodology

“Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost.”

Test: If there is an obvious opportunity to shift behaviour, expose insight or increase conversion, this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.

Instrument: If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling in the analytics configuration. We instrument both structurally and for insight in the pain points we’ve found.

Hypothesise: This is where we’ve found a page, widget or process that’s just not working well but we don’t see a clear single solution. Since we need to really shift the behaviour at this crux point, we’ll brainstorm hypotheses. Driven by evidence and data, we’ll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.

Just Do It: JFDI is the bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Items in here require low effort or are micro-opportunities to increase conversion, and should be fixed.

Investigate: You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
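A hypothetical sketch of how those five buckets might be encoded in a triage tool; the decision rules below merely paraphrase the descriptions above, and in practice the assignment is a human judgment call.

```python
# Hypothetical sketch: stream triaged issues into the five buckets
# named above. The boolean flags are paraphrases of the prose rules.
from enum import Enum

class Bucket(Enum):
    TEST = "Test"
    INSTRUMENT = "Instrument"
    HYPOTHESISE = "Hypothesise"
    JFDI = "Just Do It"
    INVESTIGATE = "Investigate"

def assign_bucket(issue: dict) -> Bucket:
    """Paraphrased decision rules -- real triage is a judgment call."""
    if issue.get("analytics_gap"):    # reporting can't see the problem yet
        return Bucket.INSTRUMENT
    if issue.get("needs_more_info"):  # can't triangulate the problem yet
        return Bucket.INVESTIGATE
    if issue.get("obvious_fix"):      # no-brainer change, low effort
        return Bucket.JFDI
    if issue.get("clear_solution"):   # traffic + leakage + one candidate fix
        return Bucket.TEST
    return Bucket.HYPOTHESISE         # broken, but no single clear answer

print(assign_bucket({"obvious_fix": True}))  # Bucket.JFDI
```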

5 - Belron example - Funnel replacement

1. Final prototype
2. Usability issues left
3. Final changes
4. Release build
5. Legal review kickoff
6. Cust services review kickoff
7. Marketing review
8. Test plan
9. Signoff (Legal, Mktng, CCC)
10. Instrument analytics
11. Instrument Contact Centre
12. Offline tagging
13. QA testing
14. End-End testing
15. Launch 90/10%
16. Monitor
17. Launch 80/20%
18. Monitor < 1 week
19. Launch 50/50%
20. Go live 100%
21. Analytics review
22. Washup and actions
23. New hypotheses
24. New test design
25. Rinse and Repeat!
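The staged 90/10 → 80/20 → 50/50 → 100% launch amounts to a ramped traffic split with a monitoring gate between steps. A minimal sketch, assuming a deterministic hash-based split (not Belron’s actual implementation):

```python
# Hypothetical sketch of the ramped launch above: the new funnel takes
# 10%, then 20%, then 50% of traffic, with monitoring between steps,
# before going live at 100%.
import hashlib

RAMP_STEPS = [0.10, 0.20, 0.50, 1.00]  # share of traffic on the new funnel

def route(user_id: str, new_share: float) -> str:
    # Deterministic bucketing: hash the user id into [0, 1) so a visitor
    # keeps their variant, and new-funnel users stay in as the ramp grows.
    h = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return "new_funnel" if (h % 10_000) / 10_000 < new_share else "old_funnel"

print(route("visitor-42", RAMP_STEPS[0]))  # routing at the 90/10 step
```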

6 - CRO and Testing resources

• 101 Landing page tips : slidesha.re/8OnBRh

• 544 Optimisation tips : bit.ly/8mkWOB

• 108 Optimisation tips : bit.ly/3Z6GrP

• 32 CRO tips : bit.ly/4BZjcW

• 57 CRO books : bit.ly/dDjDRJ

• CRO article list : bit.ly/nEUgui

• Smashing Mag article : bit.ly/8X2fLk

END SLIDES

Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck.

If it was useful to you – email me or tweet me and tell me why – I’d be DELIGHTED to hear!

Regards,

Craig.
