From Tool Selection to Measurement: 6 Steps to eLearning Authoring Success
2020-02-11



L&D departments come in all shapes and sizes, and the way they approach eLearning authoring displays a similar level of diversity. Some small teams are out there carving out a place for great learning content in an organization new to eLearning. Others are just one among many such departments in a huge global organization, with highly specified courses to create and plenty of competition for budgets and resources. Then there’s every size of business in between, each with unique challenges and perspectives.

Introduction: The Whole Journey

No matter the experience level or organizational makeup, there are some common issues we see businesses tackling time and again. In this ebook, we’ve called on six members of our experienced team to offer advice for six key stages on your journey with a new learning authoring tool.

This best-practice guide isn’t just aimed at those of you taking your first steps into eLearning authoring. By revisiting the typical purchasing, introduction, and first creative steps with a new eLearning tool, experienced teams can also take a step back and re-examine some common issues.

This ebook covers the following milestones in the eLearning authoring journey:

Defining Needs: Gomo’s Managing Director Gavin Beddow kicks things off with some advice on the groundwork every learning department should lay before purchasing a new learning tool.

Working to Your Budget: Huw Edwards, our Business Development Executive, discusses some common challenges of working within a restrictive learning budget. He also examines some false economies and advises on how to build content that helps secure budget increases.

Introducing a New Tool: It’s only natural for staff to remain attached to legacy tools and processes. Gomo’s Customer Success Manager, Simon Waldram, offers tips for quickly and efficiently transitioning to new platforms.

Content Creation: With a new tool selected and the team ready to use it, it’s time to start creating some great learning content. But before you start, read Business Development Executive David Mewborn’s tips for creating high-quality, effective learning content.

Review and Testing: Pratibha Shah, Gomo’s Test Lead and Product Support, offers some pointers on how to put new and existing learning content through its paces.

Measurement and Tracking: Creating and launching content is just the beginning. Adam Fox, Head of Development at Gomo, talks tracking, the potential of xAPI and how to use data to refine your courses (without getting carried away).


Contents

Defining Your Needs

Working to Your Budget

Introducing a New Tool

Content Creation

Review and Testing

Measurement and Tracking

Continuing on the Path to eLearning Authoring Success



1: Defining Your Needs
By Gavin Beddow, Managing Director

When you’re in the market for a new software tool, it can be difficult to define your needs and to prioritize accordingly. The eLearning authoring tool market is no exception: one L&D department’s essential feature may well be an unnecessary sideshow for another. In this chapter, we look at four key ideas to help you define your needs: your audience, output, ecosystem, and partnership.


Defining Your Audience

You can only begin to understand your learning needs by understanding your audience. Ultimately, the goal of all L&D activity is to serve an organization’s employees with engaging learning content that results in more efficient and effective working practices, as well as personal development. If your technology choices aren’t aligned to who your employees are and how they work, they’re poor technology choices.

Some of your audience’s requirements will be self-evident, while others require a little more digging to uncover. An organization with teams based in New York, London and Mumbai will understand that they require platforms that can account for needs in each location. You may have to explore existing usage data or use questionnaires to find out more about device usage or access habits.

Some questions you should be asking about your audience include:

• Where are your learners based? If you have multiple offices in multiple countries, you’ll need a platform that makes international hosting and/or distribution of courses easy.

• What languages do your learners speak? Both the nature of and the number of languages spoken by your audience have technical implications. You may need to accommodate right-to-left-reading languages such as Arabic. If you want to deliver the same training in multiple languages, a multilingual solution will save you significant time and effort.

• What devices do your learners typically use to access learning? The days of the desktop-only work habit are a distant memory. Mobile should at least be an option for most learners—and content should be designed to take advantage of its strengths and avoid its weaknesses.

• What are your learners’ typical working patterns? The importance of mobile and bandwidth-conscious versions of your content will increase if your learners are constantly on the road or working in remote areas.

• Do any of your learners have special access requirements? If any of your learners use assistive technology such as screen readers to access content, your tool choices will have to account for this (HTML5 compliance should have you covered). Having control over design aspects such as colors and interaction types can also help you avoid issues experienced by individuals with certain color vision deficiencies or motor impairments.



Remember that your learners aren’t the only audience for the tool. The L&D team creating your courses will be spending by far the most time with the tool, and their skillset will help define the features that you need. Mapping existing skillsets to available tools is a great place to start. In addition to the questions above, consider:

• What is your content pipeline? If your team is just a lone SME inputting material into an LMS, you may be able to get away with using a simple desktop tool. If you need multiple people working on, designing, and editing a course, you will need a robust cloud-based solution that makes team collaboration easy.

• How many licenses do you need? Ideally, everyone in your L&D team and everyone who regularly supports that team (e.g. the design department, regularly contributing SMEs, compliance, etc.) will have access. However, in the early stages of onboarding a tool, it can be best to start smaller than this (see the chapter on ‘Working to Your Budget’ below).

• Do you have access to advanced visual design skills? An authoring tool with a good range of pre-made themes will help teams create visually appealing courses if you don’t have a design resource. The tool should give the flexibility to add your brand colors and logo. If you need something more advanced, engage a vendor that will create a custom theme to your exact specifications.

• At what level are your team’s learning design skills? Is your team always making informed design decisions that drive greater learner engagement? Does it have experience using learner data to understand and improve course engagement? If so, a tool with plenty of design options to tweak and xAPI reporting won’t be wasted.

However, we find that organizations that get the best results don’t just have a high level of understanding of their team’s experience level. They also have a vision for where they want to take the team and are invested in finding a tool and vendor relationship that moves them in that direction.

For example, alongside the usual onboarding training material, such an organization may also want training in learning design skills. Use the new tool as a springboard to new skillsets by creating modules on topics such as ‘building content for mobile’ or ‘how to maximize engagement’.



Defining Your Output

Once your audience of learners and creators is defined, the next step is to decide what you want the learning to look like. This isn’t to say that you need a vivid picture of every aspect of the output. What matters is that you have a view on what you want, and an opinion on the evolving design process.

Particularly successful clients may bring examples of content they like, or will be strongly drawn to specific vendor examples. A custom theme isn’t always essential, but opting for one will guarantee no compromises.

Some other things to have in mind when defining your output include:

• Areas dictated by the audience (See ‘Defining Your Audience’, above): the need for mobile-friendly design and several foreign language versions for example. What your L&D team can achieve may dictate where you need to go as well.

• What media types are needed? Support for common image and video standards is becoming more widespread, but it still matters how they’re delivered. Look out for things like:

• Video captioning, full-screening, easy-to-use player interface

• Support for flexibility when displaying images (for instance, an easy-to-use gallery feature)

• Support for high-resolution video and image formats

• Support for specialist media types such as Virtual Reality or Augmented Reality

• Quick retrieval and upload of all media types

• What kind of learning do you want to deliver? Is your eLearning the primary training material or a complement to other learning formats? If the latter, will you deliver it before the main training, or after (as a means of assessment or just a knowledge refresher)?

• What is the mode of assessment? Do you need to check understanding as you go (for example, by asking a quick knowledge-check question at the end of a section or topic) or with a final assessment? If learners are allowed to attempt the course multiple times, do you require randomized questions to stop them from cheating?

• What do you need to measure? You need xAPI if you want to know more than the course’s completion, pass, and fail rates. Ideally, you should be checking for internalization of your learning, while identifying barriers to understanding and using data to remove those barriers.

Having a strong sense of the kinds of courses you want to create isn’t just helpful for finding an appropriate tool; it prevents scope-creep and a general loss of focus in your project. Remember to be led by the need to create quality relevant content, not by whatever platform gives you the most bells and whistles. To this end, we recommend storyboarding the flow of a typical piece of content you want to deliver.
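On the measurement point above: an xAPI statement is, at its simplest, a small JSON document describing who did what. As a rough illustration (not a feature of any particular authoring tool), the Python sketch below shows the basic shape of one; the learner name, verb, and course IDs are invented for this example, and the exact fields your tool emits will vary.

```python
import json

# A minimal xAPI statement follows an "actor / verb / object" shape.
# All names and IDs below are illustrative placeholders.
statement = {
    "actor": {
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/product-launch-101",
        "definition": {"name": {"en-US": "Product Launch 101"}},
    },
    # "result" is where xAPI goes beyond completion/pass/fail:
    # scores, success flags, durations, and so on.
    "result": {
        "completion": True,
        "success": True,
        "score": {"scaled": 0.85},
    },
}

# Statements are serialized to JSON and sent to a learning record store (LRS).
payload = json.dumps(statement)
```

Because each statement carries a result as well as a verb, reporting can move beyond “who finished the course” to questions like “which topics do learners score worst on”.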



Defining Your Learning Ecosystem

With your audience and your goals defined, you should be starting to form a shortlist of tools that could work for you. A further consideration to help narrow this list down is your learning ecosystem—that is, the places where learning content currently resides and how the tool you plan to add fits in.

Start by mapping out any learning systems you already have, such as your LMS, learning record store (LRS), learning experience platform (LXP), authoring tools, etc. Are there certain technologies and APIs that are necessary for all of these systems to communicate? Vendors should be able to tell you whether a way of bringing everything together already exists, or whether they would commit to engineering one.

When looking at your full ecosystem, it’s worth looking beyond eLearning specifically. On the one hand, are learners enthusiastic users of other non-learning systems, and could these be used in any way? On the other hand, what learning is happening offline? This could be in the classroom or informally between colleagues—could tools be used to capture this?

Once mapped, conduct an honest assessment of your learning ecosystem, identifying the pros and cons. Make sure that the tool choices you make go some way to addressing the limitations of your current system—without losing the benefits.

Conclusion: Defining Your Partnership

Mapping the needs of your audience, your requirements based on your content goals, and the state of your existing learning ecosystem will let you know everything you need from an authoring tool. Determine whether these points are core requirements or nice-to-haves, and search for tools accordingly.

There is, however, another important matter to consider as you embark on your search: what you want from the provider of the tool. At the very least, vendors should work with you and be honest about what a tool can and cannot do. It’s unlikely that you’ll have perfect clarity on every aspect of your audience, output, and ecosystem—a good partner will offer their expertise to help fill in the remaining gaps. They’ll understand your vision and work with you to achieve it while having the foresight to challenge aspects that don’t seem right.

If a tool you’re interested in has a free trial, use the needs you’ve mapped to look into some key areas: try to build a course by using your existing content. This will give you a rough idea of what it takes to make something brand compliant. Use the time to test-drive the partnership too: attend a demo webinar, ask questions about a feature you’re interested in or not yet sure how to use.

With an audience, output, ecosystem, and partnership defined, you’ll better understand your needs and set yourself up for success. Of course, the journey has only just begun—in the next chapter Huw will talk through what needs to happen when you come up against the realities of working to a budget.



2: Working to Your Budget
By Huw Edwards, Business Development Executive

Every L&D professional finds themselves working with a smaller-than-ideal budget at some point. Indeed, when the time comes to introduce a new learning tool, the budget available may seem a very poor match for the long list of needs you’ve defined. However, don’t despair: working within the confines of a small budget may well be the best thing that happens to your project.

In this chapter, we take a look at how to prove the worth of a tool and push for budget increases—and how all organizations can best spend their budget to minimize wastage and scale effectively.


The Benefits of Starting Small

A reassuring message for all small businesses and teams shopping for tools on a small budget: even for the big guys, starting small may actually be the best option. While a little wiggle-room is always welcome, from time to time we’re reminded that going all in can be quite risky—and can result in projects with encouraging starts stalling (or worse).

If you’ve fallen head over heels for a learning tool, it can be tempting to show your commitment and go all in. Why stop at just L&D? Why not give every learner a license? After all, everyone is a potential SME. Unfortunately, if you go too big too quickly, you’re liable to find that:

• Licenses go unused as the majority lack the time to work with and learn the tools

• At such an early stage, those championing the tool will have too little knowledge spread too thinly over too many people

• The onboarding training you set up will quickly be forgotten without time to put it into practice

Unless you’re in the incredibly fortunate position of being able to carve out tool-learning time for a large group, starting small and aiming to grow from there should be your main tactic. Focus on building a core group—for instance, the L&D team—that drives use and promotion of the tool to the wider business. Get them to create content pieces that impress early, and prioritize purchasing training and custom design work over licenses in order to achieve this goal.

Theme-related work that focuses on creating a brand-style template is particularly cost-effective when taking this growth-focused approach. Get the corporate branding handled before course contributors get their hands on the tool to avoid having to unpick and unify the design of published courses. Your design tool choice will have prioritized ease of use for SMEs—but elements such as themes will require your input to ensure that time isn’t wasted.

Common Cost-Saving Measures

Having economized on the number of licenses with a plan for future growth, you’re probably looking for more ways of scaling back and cutting costs. As we’ve touched on, starting small has the side effect of focusing efforts. Start big and it’s difficult to build conventions and a body of consistent work that forms the basis for future courses.

A key economy for any growing team is re-use. From the beginning, you’ll want your core team to keep track of what you’ve created so that you can avoid duplicating effort. Keep track of not only course names and overall focus, but also chapters and smaller topics, as well as examples of good or unique layouts. Effective learning can be transplanted to other courses where it’s relevant—though some tools make duplicating material easier than others.



Along similar lines, it’s common for L&D teams to create master template courses, especially for topics that are regularly reiterated. For example, if you create courses to teach learners about new products, it’s useful to have a template that can be used as the basis for all future product launches. This could have placeholders for product pictures, descriptions, knowledge-checking questions and so on.

Expand the philosophy of re-use to the media used in your courses. Don’t pay more for a tool just because it offers a large image library. Check what you already have in-house: find out what images and video the company owns, and whether there are any existing subscriptions to premium image libraries. These sources have the added advantage of being pre-approved by brand stakeholders. Free sources and platforms can fill in the remaining gaps.

False Economies and Other Traps to Beware Of

While working to your budget can mean getting creative with what you spend time and money on, there are two areas that we believe are liable to cost you more the less you spend on them: training and theme work.

As far as possible, don’t skimp on training. You need your core team up and running as quickly as possible with everything they need to know to do their work. Talk to your vendor and find a training solution that gets you everything you need to know (and also explore options for post-training support).

Theme work is a less clear-cut false economy. Theme libraries can be used to get some great results and present a quick and easy way to get your first courses up and running. However, even for the smallest customers, custom themes always remain a worthwhile investment.

Stakeholders want to see early successes, and courses that deliver visually have a better chance of securing extra budget for further projects. A custom theme will also help them conceptualize what it’s going to take to get subsequent projects up and running. With a fantastic looking template already in place, it’ll be easy to secure money for new content. Without a template in place early, you may find it impossible to get budget to improve visuals—and reduced scope for new courses to boot.

Finally, one big budget trap to be wary of is failing to budget for time for staff to learn how to use the tool. Any training commitment needs to be balanced against time to put the learning into practice. If staff don’t get this time within a decent timeframe of receiving their training, you risk having to pay to re-train them. In extreme cases, it can bring the tool onboarding project crashing down entirely.

Growing Your Budget

We’ve already extolled the virtues of early success—it secures needed buy-in and gets the wider business thinking about how it can use the proven tool. So what’s the formula for early success?

We recommend choosing a project that is of an achievable scale and that is hyper-relevant to the business. Your pilot project should not be something that could be bought off the shelf (such as compliance or GDPR training). Instead, focus on something that’s custom to the business, perhaps training for a specific product. This will give you space to experiment and try different features of the tool while working towards your business’ goals and KPIs.

The result can act as a template and helps teach what’s possible. Complete this pilot project as quickly as possible without sacrificing quality, and take learnings from the process to apply to future projects. Once you have other departments interested in participating, using the tool becomes an easily managed, incremental cost.



Conclusion

We’re often asked “how long will it take to create an hour’s worth of eLearning?” Unfortunately, there’s no easy answer: it depends on the density of the training and the materials you have at your disposal. If you have 10 minutes of great training material from an SME recorded on video, you’re already one-sixth of the way there after a few mouse clicks. If you want to build something more interactive, production could take several days of effort.

Ultimately, working to your budget is a matter of balance. By investing in a content creation tool, you want to push your learning content beyond the static resources you can create in a simple presentation package. However, you have to avoid getting too carried away with interactivity. You’ll need to wow your learners and peers, but know where to find those high-value, low-cost resources that add variety and substance.

Understand upfront what is possible with the tool and storyboard your project to avoid going too far off-track. Combined with a small team approach, supported by ample training and great visual design, you’ll be in a great position to not only work to your budget but to work on increasing that budget.



3: Introducing a New Tool
By Simon Waldram, Customer Success Manager

It’s only human nature to be attached to old methods and tools. We’re always aware of the time we’ve already invested in learning the journeys, layouts, and processes that have become comfortable to us. So when a new piece of learning software is introduced, people can be slow to embrace it.

The benefits of moving to a new piece of software are likely to be obvious to the team who went through the process of purchasing it. Nevertheless, you still have to be prepared to do some work to demonstrate those benefits to the wider business. In this chapter, we discuss six steps that will help you create the best possible first impression, and keep your content creators and learners from falling back into old habits.


Step 1: Ensure Your Needs Have Been Mapped

As Gavin talks about in chapter 1, mapping your needs—dictated by your audience, your output, and your ecosystem—is an important first step to discovering the right tool. This foundational work is also important when it comes to introducing it. Share your mapping insights with your provider, and expect them to do their own work to understand your situation and requirements. Particularly look out for and test:

• Vendor understanding of your audience and use-case

• Agreement on what final outputs will look like

• A shared understanding of your requirements. If you have multiple vendors, work collaboratively to establish channels of engagement. Getting everyone on the same page creates a much better experience for everybody.

Unmapped needs can be disastrous when introducing new software. They lead to unmanaged expectations of how things will work, frustrations with the tool and unnecessary time pressures. These kinds of negative first impressions are difficult to turn around.

Step 2: Have a Champion for the Tool

Your champion is an expert in the software who, alongside their normal job at your organization, works to improve and support positive behavior through the onboarding period and beyond. The champion becomes a point of contact on both sides of the arrangement. They’re both the first point of contact for staff who need support with the software, and the point of liaison for the vendor to provide further information.

A product champion is typically:

• An account owner

• Someone with a higher level of technical ability

• Good at communicating technical findings to others

• A manager or supervisor of a team.

From their position, a champion can cascade knowledge throughout the team. Over time, they can also work to create further champions by passing on knowledge.

Taking ownership of support for the simple and intermediate functions of the tool in-house in this way is a very efficient way to work. If users have an issue, they can go to the champion (or champions) for an answer. If the champion doesn’t have an answer, they can escalate the issue (and potentially catalog several new problems) to the provider’s own support channels.

Obviously, if you’re a single license user, you become the champion by default. But it’s worth bearing the above in mind if and when usage of the software expands.

Step 3: Account for Staff’s Attachment to Old Software

It’s absolutely possible to support staff who are attached to legacy software—the key word here is ‘support’. In collaboration with your vendor, you should create a custom onboarding process for your learners that’s sensitive to how staff currently interact with their existing solution. In particular, ensure that:

• All tasks can be brought forward and mapped to the new software so that users don’t feel uncomfortable

• Your users feel they’re going to be able to achieve or exceed what they were doing before.

Of course, if the tool you’re trying to onboard has similar processes that will be immediately familiar to your users, that’s a distinct advantage. However, don’t prioritize this similarity over finding a solution that allows you to achieve new and improved outputs.




Step 4: Showcase the Benefits of the New Tool

Ensuring learners understand how to replicate old processes in the new tool is valuable, but you shouldn’t spend too much time dwelling on the old. After all, you have great reasons for switching to the new solution—it’s now time to demonstrate these to your audience.

Improving technology and changes in the workforce are often a driving force in switching to a new piece of software. For instance, we have seen (and continue to see) many organizations moving towards mobile-friendly solutions and away from desktop-only environments. Don’t just wait for your audience to stumble across the new features you’re focused on providing. Demonstrate the difference and make them a core element of how you communicate the need for change.

Focus on showing them something they couldn’t achieve before: with a good grasp of how they used the old solution, new and improved methods should be evident. With help from your vendor, show them what they can do and help them visualize the end result. Teach your audience how to do it, so that they internalize positive behaviors and create new habits.

Step 5: Give Everyone Enough Time to Learn the Tool

Staff need time to learn new software. It’s critical that they get a good balance between training and practical application. We find that new habits take around two to three weeks of repeated use to become ingrained, so check in with your users two to three weeks into their relationship with the tool to ensure they haven’t reverted to old methods.

Every team is subject to time pressures, whether they’re the result of poor planning and scheduling, or unforeseen and sudden compliance or workforce changes. These aren’t always preventable, but their effects can be minimized. At a certain level, vendors should understand what your future developments are, and help you to understand what innovations are coming so that you can stay on top of things. Your expectations should be managed—be wary of vendors that promise you’ll be able to achieve everything you can dream up.

Step 6: Measure the Success of Your Introduction

Properly measuring adoption rates isn’t just a simple matter of checking the new tool’s usage metrics. These stats won’t reveal the nuances of how people are using the software: whether the behaviors that you’ve taught have been internalized and are making your users’ lives easier. Nor will they guarantee that your users aren’t begrudgingly using the new solution while continuing to largely achieve things with old methods.

You need to bring multiple methods to bear if you want to capture all feedback meaningfully. In collaboration with your vendor, you should:

• Talk to end-users about their experiences with the tool

• Encourage sharing of courses created in the tool

• Promote discussion of how certain elements are achieved

• Distribute anonymous feedback forms to allow for unfiltered viewpoints

Conclusion

Introducing a new tool is all about reinforcing positive behavior, creating new habits and ensuring that staff feel supported throughout the process. Confidence in what they’re doing enables new habits to stick, making it unlikely that the team will lapse back into old methods. This requires continuous support, especially during onboarding.

If you find yourself in a situation where things are already going wrong, it’s never too late to try to put things back on track. Work with your vendor to re-establish a point of contact, build the relationship, and ensure that the lines of communication are open. Empower your champion and start advocating for and supporting the software.

If you aren’t aware of the roadmap or of industry trends, start a discussion with your vendor and see how you can get that information. You also should expect access to a robust support process—not just a written knowledge base, but help through video and other formats.

Ultimately, your vendor is there to work with you—they should be as invested in the success of your onboarding as you are. If they’re not playing along, it may be time to start seriously questioning why.

Learning designers now have a huge variety of assets and options when creating their courses. Every project must perform a balancing act between engagement and relevance: the temptation to stuff a course full of innovative assets has to be weighed against the possibility that you’ll make something too distracting to be worthwhile. In this chapter, we take a look at some content expectations, then at how you can deliver on them through your design choices and tool selection.

4: Content/Creation
By David Mewborn, Business Development Executive

What Do Learners Expect From Learning Content in 2020?

In many ways, our expectations for learning content are set by the content we consume in our everyday lives. After all, authoring tools have largely moved towards being HTML5-based or at least supporting it as an output method.

As HTML5 is the markup language that underpins almost all web-based content, everything that’s possible in web content is possible in eLearning. And perhaps on some level, it’s expected. These expectations are felt in two key areas: aesthetics and accessibility.

Modern Web Aesthetics

People want content that has the design sensibilities of popular web pages and services built to modern web standards. High-resolution imagery and high-definition video are a given at this point—something that can become a problem if you’re still uploading entire courses as SCORM packages. The use of full-screen ‘hero’ visuals (usually accompanied by only a small amount of text) in website design is an example that isn’t entirely out of place in learning content.

Image selection is a precise art. Too often, an image is placed simply to take up space—at the very least it needs to be relevant. Ideally, it should tell a story, reflect the target audience and clarify the content it accompanies. If the image isn’t delivering a message, consider whether you need it at all.

Achieving modern web visuals isn’t just about using detailed assets, either. The correct feel can also be helped along with more subtle elements such as transitions or parallax scrolling. Structural choices such as infinite scroll or galleries of ‘cards’ (mid-sized images accompanied by bite-sized copy) are also useful tools.

Users expect varied and engaging visual effects while simultaneously disliking clutter and distraction. This dislike is only heightened when the user is supposed to be learning. They’re used to seeing precise spacing around elements, and a sensible amount of white space that keeps text from running too far across their widescreen displays.

Modern Web Accessibility

Today’s learners expect to be able to access a website from any device. Whether you achieve this via responsive design or dedicated mobile apps, you should be delivering the same content and features in a way that plays to the strengths of each device type.

This approach supports different work habits—learning content can be worked through by staff who are always on the move or those who don’t have a dedicated desktop. HTML5 also provides the necessary hooks for assistive technologies. Screen readers (commonly used by users with visual impairments) rely on correctly formatted metadata to understand pages properly.

Fundamentals That Shouldn’t Be Overlooked

Beyond how modern media shapes expectations, your workforce will have seen enough eLearning in its time to have a sense of what is effective. We recommend a particular focus on your content being:

• Relevant and concise: The main thing a learner wants from a piece of content is for it to not be a total waste of time. And even if it’s useful, they don’t want it to take up more of their time than absolutely necessary. Refine your content to ensure its message is as effective as it can be in as little time as necessary. If working with video, have a tight script for your SMEs to learn, and don’t be afraid to cut dead air and pleasantries.

• Clear in purpose: When content is transparent about its purpose, the learner can be in no doubt about its relevance. Learning objectives should be clear to the user from the beginning. Keep content straightforward and focused on your learning objectives.

• Visually on-brand: It’s sometimes easy to view branding as basically pedantic. Does it really matter if content meant for internal use has a subheading in the wrong shade of tertiary teal? We would argue, yes. Learners are more attuned to what their branded content looks like than they realize—and when things look off, it feels less official and important.

• Interactive: While full-blown VR experiences and learning games aren’t appropriate for every learning project, there should be some degree of interactivity in your courses. Knowledge checks are a sensible inclusion: periodically test whether your learners have taken in what they’ve read so far. They provide variety and when paired with xAPI tracking, they’ll be another content effectiveness datapoint. Furthermore, incorrect answers are an opportunity to offer suggestions for content to review.

How Can You Make the Process of Content Creation Easier & More Efficient?

Thankfully for designers, tools haven’t stood still while end-user expectations have evolved. They can do a lot of the heavy lifting towards the objectives above. Any HTML5 authoring tool worth its salt will manage a lot of aesthetic and accessibility elements for you. This means only a single authoring effort for a fully responsive or adaptive HTML5 course—largely letting you concentrate on the content. You won’t have to understand how to code responsive layouts, or flick between separate versions of your course.

Tools also have an important part to play in facilitating collaboration. If your authoring tool is cloud-based, you’ll already understand how useful it can be to have multiple authors and reviewers working on the same course simultaneously.

On a tight deadline, completing, sharing and reviewing sequential drafts and merging different sections can take the momentum out of the process. Cloud-based tools ensure that things don’t have to be so strictly linear. An SME who’s traveling or in high demand can drop their content in whenever and wherever they find the time, without disrupting a production sequence.

Tools exist to stop you from having to continually reinvent the wheel. A library of preconfigured assets such as screen templates can act as a shortcut to modern web aesthetics without you having to spend time worrying about layout. You should set up your own reusable assets too: build reusable on-brand themes that allow you to simply insert content into pre-formatted templates.

Conclusion: Ensuring Your Content Continues to Get Results

These content pointers should help set you up for eLearning success. Given that our expectations are shaped by tech in other aspects of our lives, you could boil a lot of this down to common sense. That said, it’s always worth remembering that common sense is just as subject to change as tech.

Adam will go into measurement at length in a later chapter, but the topic really should be considered inseparable from content creation. Naturally, we advise that you use analytics to spot trends within a project that point to screens that aren’t delivering. Look at time on page, knowledge check failures, pass/fail rates, and ideally, actual improvement in job performance post-learning.

However, you should also keep an eye on your project-to-project trend. If engagement is slowly dropping off, you may need to rethink whether your aesthetics and accessibility choices are still reflective of audience expectations. One thing you can definitely say about learner expectations is that they’re definitely going to continue to change.

Why do we need to review the courses that we create? The answer is surely obvious: to make sure there are no errors or omissions that affect the quality of our training. Nonetheless, and though it’s rarely by design, there’s no shortage of errors that make it through review and testing. In this chapter, we take a look at some common oversights and best practices that will help you tighten up your approach.

5: Review and Testing
By Pratibha Shah, Test Lead and Product Support

Look Out for These Common Testing Oversights:

Failure to Test Different End-User Environments

From spurious error messages to courses just plain failing to load, your end-users’ environment can throw up unanticipated errors that result in a bad experience. However, it’s not just errors that can throw things off. You need to test whether different user environments result in different and potentially inferior experiences. Consider how the following could have an effect:

• Device: In 2020, you can expect content to be viewed on all kinds of devices: desktops, laptops, tablets, iOS phones, Android phones and beyond. Modern content creation tools will use responsive design to handle your content on all of these devices. But you still need to check what the output actually looks like. Don’t just settle with ‘it works’ either—tweak the design to ensure that you’re getting the best user experience on each device.

• Browser: These days, people have largely settled on a favored browser: Chrome, Safari, Firefox, Edge, or something else. This makes it easy to miss the fact that browsers can handle even standards-compliant content in subtly different ways. You therefore need to test your content in all popular browsers to catch these errors—don’t just assume it will work because it works in the one you’re authoring in! And remember, there are multiple browsers in use on mobile as well.

• Operating System: On mobile, you should ideally test both Android and iOS devices. Meanwhile, a good chunk of business machines run Windows, but you probably only need to walk through a design department to find someone on macOS. And chances are good that your technical teams are running some flavor of Linux.

• LMS: We often see people reviewing via a test LMS or in their tool’s preview, but forgetting to test on the final LMS they’ll deliver the content on. Just as browsers sometimes treat web standards in subtly different ways, SCORM and xAPI can be implemented slightly differently on different LMSs. For example, SCORM imposes character limits on certain fields. One LMS may give you an error that clearly flags the character issue, while another returns a confusing general message.
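Those character limits are defined by the SCORM data model itself; in SCORM 1.2, for instance, cmi.suspend_data is capped at 4,096 characters and cmi.core.lesson_location at 255. A pre-flight check in your publishing process can catch over-length values before an LMS surfaces a confusing error. A minimal sketch, assuming you can extract the values your course will write:

```python
# Sketch: validate SCORM 1.2 field lengths before packaging a course.
# The limits below follow the SCORM 1.2 data model; individual LMSs may
# enforce them differently, so treat this as a pre-flight check only.

SCORM_12_LIMITS = {
    "cmi.suspend_data": 4096,
    "cmi.core.lesson_location": 255,
    "cmi.comments": 4096,
}

def check_scorm_fields(fields: dict) -> list:
    """Return human-readable warnings for over-length field values."""
    warnings = []
    for name, value in fields.items():
        limit = SCORM_12_LIMITS.get(name)
        if limit is not None and len(value) > limit:
            warnings.append(
                f"{name}: {len(value)} chars exceeds SCORM 1.2 limit of {limit}"
            )
    return warnings

# Flags the over-length field before the LMS has a chance to complain.
print(check_scorm_fields({"cmi.suspend_data": "x" * 5000}))
```

Running a check like this against your own LMS’s documented limits is cheaper than debugging a vague runtime error after launch.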

Creators need to be aware of the differences that different user environments create and check whether that still matches the intent of the content. Ask: does it still achieve the desired learning outcome? And is it more or less effective in certain formats?

What should be clear from this list is that it would be impractical to test for every permutation of device, browser, operating system, and LMS that will be used to access a course. The important thing is to make sure that you have a set of test devices that are representative of your end-user group. Use existing user data and/or survey your learners to find out what you need.

Failure to Test Multiple User Journeys

A common mistake that organizations make when testing is to individually check each screen of content. Once these all look correct, they’ll congratulate themselves on a job well done. However, this doesn’t account for all the different journeys and paths through the content that could lead to any one screen.

If you’re using any form of question logic, you’re often sending learners down diverging paths depending on how they answer. It’s important to test each and every one of these paths to ensure that every learner gets the full experience you intend for them to have. Testing only a narrow selection of user journeys (or none at all) can leave issues to surface at very late stages of the course creation process.
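One way to make “test every path” tractable is to model the branching logic as a simple graph and enumerate the routes before assigning them to testers. A hypothetical sketch, with illustrative screen names rather than output from any real authoring tool:

```python
# Sketch: enumerate every path through branching course content so each
# one can be assigned to a tester. The screen names and branch map are
# hypothetical; a real map would come from your authoring tool's logic.

COURSE = {
    "intro": ["question1"],
    "question1": ["remedial", "advanced"],  # wrong vs. right answer
    "remedial": ["question1"],              # loop back and retry
    "advanced": ["summary"],
    "summary": [],                          # end screen
}

def enumerate_paths(graph, start, max_revisits=1):
    """Depth-first walk; limited revisits keep retry loops finite."""
    paths = []
    def walk(node, path):
        path = path + [node]
        if not graph[node]:          # no outgoing branches: end screen
            paths.append(path)
            return
        for nxt in graph[node]:
            if path.count(nxt) <= max_revisits:
                walk(nxt, path)
    walk(start, [])
    return paths

for p in enumerate_paths(COURSE, "intro"):
    print(" -> ".join(p))
```

Even for a small course this tends to surface more routes than a screen-by-screen review would suggest, which is exactly why journey testing matters.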

Non-Technical Aspects That Are Surprisingly Easy to Forget

Faced with the complexity of the systems and learning designs that require review, it’s not uncommon for teams to gloss over one or more aspects of the actual content. While your robust device testing policy notices that a screen doesn’t load properly on mobile, you can still miss an obvious spelling mistake in the second paragraph.

Therefore, remember to get someone knowledgeable to review all aspects of your language use, including:

• Spelling and grammar

• Tone of voice

• UK and US English (and dialectal differences in other languages)

• Company style guidelines

Company style guidelines are one area that regularly catches teams out. Projects can be ready to launch, and then brand compliance will step in and throw a spanner in the works. In some scenarios, a project may encompass 50 or more courses, all created at the same time. If the theme used across all of these courses is not brand-compliant, every course will need to be corrected.

In some platforms (including Gomo) you would at least be able to update a master theme file to roll out the correction. However, even this is dependent on you building the courses to use a unified theme. Furthermore, you still need to test that the change has happened on all courses—and that there are no obscure issues that have happened as a result.

If you’re writing content on technical topics that you’re not an expert on, you would expect to have an SME checking the accuracy of your work. However, SMEs still need to be peer-reviewed. While your SMEs are (hopefully) less likely to misunderstand topics, they can still unintentionally be the source of inaccurate training and questioning.

Peer review will catch errors that non-SME review won’t, and a second SME may be able to suggest improvements the first hasn’t considered. Your tool selection can make this process easier and more useful—if the tool has the ability to suggest edits and leave comments against pages, for instance.

What is Review and Testing Best Practice?

Test From Prototyping Onwards

We’re often asked when it’s best to start testing. While the snappy answer is “as early as possible”, let’s temper that a little. You’re probably going to be wasting time if you start testing before you have your first substantial prototype built. So, get to a point where you’re happy to review, and make it clear which elements simply aren’t yet complete.

Good prototyping practice is to first build a basic skeleton for the main functionality of your course. Use all the features, layouts and display conditions that you expect to include. Through testing this prototype, you’ll have some idea of how (and if) it works before you commit resources to the full course.

Test Group Tips

How many people you need for your test group should be guided in part by the size of your end-user group. Sometimes just two people may be enough. There may be occasions where you need seven or eight people to run through the course, if you have a large target audience with particularly diverse device and browser requirements.

It’s important to use people who haven’t been involved in building the course. Creators can never see all of their own mistakes—you get so used to the courses you’re building that the mistakes become the expected output. You’ll always benefit from having a fresh pair of eyes looking at your work.

Furthermore, you need test groups that actually represent the end-user and their mentality in order to replicate end-user behavior. This can also include ensuring that you test properly for certain accessibility needs and ensuring compatibility with assistive technologies.

In the later stages of your course creation process, conduct a field test: give the course to a small sample of end-users. This will allow you to gain insight into how the course is perceived, how it could work better, and whether it makes sense.

Staging and Review Tools

At enterprise level, we see some very thorough technical testing processes. One great example that more organizations should implement is the use of a staging environment. Well in advance of the planned launch, L&D releases the course to a staging environment, pending approval and release to the live system. This process catches problems before it’s too late, allowing time for the organization to seek assistance and answers from vendors.

Staging in this way may require time and budget that smaller teams and individuals don’t have access to. This reflects a reality where L&D groups of this size are more likely to struggle to test thoroughly before launch, often due to a lack of time or, sometimes, understanding. The end result is that they’re more prone to post-launch issues.

This is a tough area, but one way that smaller teams can compensate for these constraints is to let your content creation platform do some of the heavy lifting. Robust review tools help you to catch some of these issues sooner, even before going into production.

Conclusion

Hopefully, these common oversights and best practice tips we’ve cataloged through the years have highlighted areas you haven’t considered. If not, we’re happy to have given you peace of mind that you’re already on the right track.

There can be a lot to consider when it comes to review and testing, but there is one guiding principle that underpins all of the advice here: test your courses with the end-user in mind, creating as representative a mix of environments and journeys as possible. Ensure that the experience you’ve built is experienced in the way you want it to be experienced. If you do that, you can’t go wrong.

While we all understand the importance of tracking content performance, there are plenty of factors stopping L&D departments from using the available technology to its full extent. Whether through limited systems or a reliance on old structures, organizations are often stuck reporting little more than pass, fail and completion events. We could be achieving so much more with learning measurement. In this chapter, we offer a few tips for taking your approach to the next level.

6: Measurement and Tracking
By Adam Fox, Head of Development

How to Use Measurement and Tracking to Inform Future Content

Your learning analytics approach will vary on a case-by-case basis, but there are some common principles. Primarily, you’re looking for the weaker areas of your content and determining how you can improve those areas either in live content or in the content you build in the future. Key things to look for include:

• Failure rates: Have learners failed specific questions or sections of the course? Look for rates significantly above the average for other sections—some questions are just naturally harder than others. Peer-review, rewrite, and restructure material to get a better result.

• Time on page: Authors should have some idea of how long a piece of content takes to read and absorb. If time on page is low, learners may be skipping it. If time on page is high, learners may be bored, or alternatively, struggling to understand the concepts or language used. Experiment with rewrites and restructures.

• Progress through a course: Do learners get halfway through a topic, then skip it, only to fail the quiz later? This can happen when learners are initially presented with too much material they already know, while genuinely new information is buried several screens in. You could restructure the course accordingly.

• Device data: Are mobile users taking longer to complete courses, or taking less time on a page? This may indicate an issue with how the course is being delivered on different devices.

• Before/after scores: If you’re running pre- and post-training tests, you’re obviously looking for improvement in before/after scores. If there’s no improvement, something in your approach isn’t working. Either the material itself needs review, or you should consider offering extra training that reinforces the same points.
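As an illustration of the first point above, per-question failure rates can be compared against the course average to flag outliers automatically. A sketch with hypothetical attempt counts; real figures would come from your LRS or LMS reports:

```python
# Sketch: flag questions whose failure rate is well above the course
# average. The attempt data is hypothetical; real numbers would come
# from your LRS or LMS reporting exports.

ATTEMPTS = {
    "q1": {"fails": 2, "total": 50},
    "q2": {"fails": 30, "total": 50},  # outlier: 60% failure rate
    "q3": {"fails": 5, "total": 50},
}

def flag_outliers(attempts, threshold=1.5):
    """Return questions whose failure rate exceeds threshold x the mean."""
    rates = {q: d["fails"] / d["total"] for q, d in attempts.items()}
    mean = sum(rates.values()) / len(rates)
    return sorted(q for q, r in rates.items() if r > threshold * mean)

print(flag_outliers(ATTEMPTS))  # ['q2']
```

The threshold is a judgment call: too low and you chase noise, too high and genuinely weak content slips through, so tune it against a course you already know well.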

xAPI can go beyond the parameters listed above to provide some very fine-grained insights. For example, for a multiple-choice question with multiple correct answers, xAPI will report each choice individually. It can also tell you things like how many times a question was attempted, or how much time was spent on a course.

The granular tracking discussed above focuses on leveraging capabilities of the xAPI tracking standard. However, there are other tracking standards such as SCORM 1.2 and 2004 that, while not quite as new or comprehensive as xAPI, are widely used, particularly in compliance tracking.

The various SCORM standards have traditionally emphasized the completion and success status of an entire course. Although SCORM does offer the ability to track smaller-scale interactions, its tracking model is very rigid and not adaptable to different types of learning. It isn’t as well suited to offering especially in-depth analysis of how the content is performing. As a learning author, it’s imperative that measurement helps you understand which areas work well and what needs improving so you can continually improve your learning materials.

It’s worth remembering that xAPI isn’t just about improving eLearning content. Digital courses are often used as part of a blended learning approach, and learners’ successes and failures reflect on the effectiveness of classroom content too.

The Benefits of Advanced Tracking Extend Beyond Measurement

SCORM is LMS-dependent, whereas xAPI content isn’t subject to this technical constraint. Instead, tracking statements are recorded in a Learning Record Store (LRS). The LRS can be completely detached from the learning, which can be hosted anywhere you like. This could be on a website, or on a company intranet, for example.

This isn’t to say that xAPI makes traditional Learning Management Systems defunct. They still provide some very important features, such as access control and learning paths. Because of this, standards such as cmi5 define a common packaging format that allows xAPI content to function inside the LMS.

In this way, xAPI doesn’t just allow you to change the depth at which you can track the content, but it also allows you to distribute the content in a different way. This can help you deal with bandwidth issues, or host the content more locally to the users taking your courses. This has obvious benefits for international organizations testing employees abroad.

Tips for Avoiding Analysis Paralysis

Of course, the extra granularity that xAPI allows is a double-edged sword. Its Actor-Verb-Object structure (“Adam-Answered-Question 4”) allows for a huge number of different combinations and comparisons between data points. While you can filter specific verbs to look at certain groups—such as all fails, all passes and all ‘experiences’—there has to be a purpose for doing so.
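The Actor-Verb-Object structure maps directly onto JSON. Below is a minimal sketch of the “Adam-Answered-Question 4” statement, together with the kind of verb filtering described above. The email address and object ID are illustrative, though the “answered” verb URI is one of the commonly used ADL vocabulary entries:

```python
# Sketch: build and filter xAPI-style statements. The actor, verb, and
# object values mirror the "Adam-Answered-Question 4" example; the mbox
# and object ID are illustrative rather than from a real course.

statement = {
    "actor": {"name": "Adam", "mbox": "mailto:adam@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/answered",
             "display": {"en-US": "answered"}},
    "object": {"id": "http://example.com/course/question-4",
               "definition": {"name": {"en-US": "Question 4"}}},
}

def filter_by_verb(statements, verb_id):
    """Keep only statements whose verb matches, e.g. all 'failed' events."""
    return [s for s in statements if s["verb"]["id"] == verb_id]

matches = filter_by_verb([statement],
                         "http://adlnet.gov/expapi/verbs/answered")
print(len(matches))  # 1
```

Because verbs are identified by URI rather than by display text, filters like this stay reliable even when content comes from multiple tools.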

If you have a learning analytics platform (such as Watershed), you can avoid going too far off on a tangent by keeping to the recommended dashboards. Such platforms also make the process of combining various xAPI statements effortless, meaning that you waste less time exploring dead ends.

Troubleshooting Tips

Troubleshooting is a complex area, and the most serious issues should be discussed with all relevant vendors in order to uncover the root cause. However, we do see a couple of basic things that commonly trip teams up.

When setting up your xAPI tracking, you may be asked to provide endpoint details (for example, your username/password). It seems obvious, but it’s important to double-check that these details are correct and that they’re sending the information you expect. Sometimes details change; other times an unchecked typo ruins everyone’s day. If you’re devoting adequate time to testing, this is something that should be weeded out early, ideally in the prototyping stage.
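One quick way to smoke-test those details is to request a single statement from the LRS: a 401 or 403 response points straight at a credentials problem. A sketch using only the Python standard library; the endpoint URL shown is hypothetical, and the version header is the one the xAPI specification requires:

```python
# Sketch: smoke-test LRS credentials by preparing a request for a single
# statement. A 401/403 response usually means the endpoint details
# (username/password) are wrong. The endpoint URL is hypothetical.
import base64
import urllib.request

def build_request(endpoint, username, password):
    """Prepare a GET for one statement, with Basic auth and the
    X-Experience-API-Version header the xAPI spec requires."""
    creds = base64.b64encode(f"{username}:{password}".encode()).decode()
    req = urllib.request.Request(endpoint.rstrip("/") + "/statements?limit=1")
    req.add_header("Authorization", "Basic " + creds)
    req.add_header("X-Experience-API-Version", "1.0.3")
    return req

req = build_request("https://lrs.example.com/xapi/", "user", "secret")
print(req.full_url)
# To actually run the check, send with urllib.request.urlopen(req):
# a 200 response means the credentials work; 401/403 means they don't.
```

Building this into a prototype-stage checklist means a changed password or typo is caught in minutes rather than after launch.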

Though it isn’t strictly an implementation problem, one very common tracking issue to be aware of is caused by dropped connections. Because courses are run locally on a user’s machine, poor connections can sometimes result in tracking statements never returning to the server. Furthermore, the learner won’t necessarily notice that anything is up—content can be cached locally so it appears to work.

Therefore, make sure you’re aware of whether your user is subject to an intermittent connection during your troubleshooting process. There’s no technical fix you can deploy for this—unless your organization is responsible for providing the internet connection itself. Ideally, authoring tools should make the user aware of the issue, for example, with a ‘lost connection’ pop-up. This should also inform the user that course progress may not be recorded.
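Tool vendors typically handle this internally, but the usual mitigation is an “outbox” pattern: statements are queued locally, and anything unacknowledged is retried once the connection returns. A simplified sketch of the idea, not any vendor’s actual implementation:

```python
# Sketch: an "outbox" pattern for tracking statements on a flaky
# connection. send() is a stand-in for a real network call; nothing
# here reflects any specific vendor's implementation.

class StatementOutbox:
    def __init__(self, send):
        self.send = send          # callable returning True on success
        self.pending = []

    def record(self, statement):
        """Queue first, then try to flush; nothing is lost on failure."""
        self.pending.append(statement)
        self.flush()

    def flush(self):
        still_pending = []
        for stmt in self.pending:
            try:
                if not self.send(stmt):
                    still_pending.append(stmt)
            except OSError:       # connection dropped mid-send
                still_pending.append(stmt)
        self.pending = still_pending

online = [False]                  # toggle to simulate a lost connection
outbox = StatementOutbox(lambda s: online[0])
outbox.record({"verb": "completed"})
print(len(outbox.pending))        # 1: offline, statement kept for later
online[0] = True
outbox.flush()
print(len(outbox.pending))        # 0: retried once the connection returned
```

Queue-then-send ordering is the key design choice: a statement is only removed once the server acknowledges it, so a dropped connection delays reporting rather than silently losing it.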

Conclusion

While they’re the subject of the final chapter in our guide to eLearning authoring success, measurement and tracking are their own beginning. Without them, you’ll never quite know whether you’ve successfully introduced your new platforms, or whether your content choices have properly hit the mark. Don’t neglect them. They are your most important tools for developing, improving, and supporting your learning content going forward.

Continuing on the Path to eLearning Authoring Success

We hope that you find this ebook helpful as you plan your next steps with the eLearning authoring software you choose. Our team has drawn the tips here from their own experiences, and we’re confident that they’ll help you realize your vision for the courses you’re creating. Our goal was to create an ebook that would be helpful for different department sizes at different stages of their journey with an eLearning tool, and it should therefore stay relevant as your department grows and rises to new challenges.

It’s also true that these are topics we’ll all inevitably revisit again and again. Our needs and our budgets are always changing. Even if you stay loyal to a single authoring tool for many years, it’s always a new tool for someone new to the team. Content creation, review and measurement feed into each other with the goal of continuous improvement.

These best practice tips wouldn’t be possible without the many queries and questions that Gomo clients have shared with us through the years. We aim to continue to provide insights through our website and will continue to shape Gomo and our own knowledge based on the challenges we help solve. Please get in touch with our sales team or your account team if you have an eLearning authoring issue you’re currently working through.

About Gomo

The Gomo learning suite provides multi-award-winning products that allow you to create, deliver, update, and track beautiful multi-device eLearning. With Gomo Authoring, you can create truly responsive and adaptive HTML5 content that looks perfect on all devices, including desktops, tablets, and smartphones. With Gomo Delivery and Analytics, you can get content into the hands of learners instantly via websites, direct link, the Gomo LMS wrapper, social media, and more—all with full xAPI analytics.

With an ever-growing client base including the BBC, British Airways, BT, Centrica, General Electric, HSBC, L’Oréal, Royal Mail Group, Shell, Sony, Squarespace, TDK, Vodafone, Weetabix, WhatsApp, the World Health Organization and many more, Gomo is quickly becoming the established choice for global organizations seeking collaborative, future-proof and responsive HTML5 multi-device eLearning.

Gomo is part of Learning Technologies Group plc’s award-winning group of specialist learning technology businesses.

For more, visit gomolearning.com

© 2020 Gomo Learning