
Results of the Reporting Tool Proof of Concept Usability Evaluation

10.20.2015

Usability Results Report

Table of Contents

Table of Contents
Revision History
1. Executive Summary
2. Introduction
2.1 Purpose
3. Overview
3.1 Participants
3.2 Participant Details
3.3 Usability Session Description
3.4 Additional Notes
3.5 Special Thanks
3.6 User Feedback Summary
3.7 Design Suggestions
3.7.1 Dashboard View Customization
3.7.2 Most Useful Dashboard Panes
3.7.3 Interactive Dashboard Panes
3.7.4 Locating The Dashboard And Reports
3.7.5 Report Types
3.7.6 Score Type Pyramid Is Confusing
3.7.7 Import Into Excel
3.7.8 Miscellaneous

Revision History

Name | Date | Reason For Changes | Version
MShawala | 10.20.15 | New. | 0.1


Executive Summary

Overall, the initial feedback on the Dashboard and associated content panes was solid. Two users, after a quick initial look, immediately proclaimed that their upper management would love to use this view to obtain a quick snapshot of what is going on with their background screening activities. Most users appreciated the colorful, 3D nature of each pane, and the details presented across all six panes seemed to make sense to most participants.

Further, one user at a global company noted that crisp, colorful graphics accented by representative data will be key in sharing screening information across the business. Additionally, this presentation will aid in quickly breaking down any language barrier issues with his Japanese counterparts ("pictures and data don't lie").

However, most users were expecting a more responsive, interactive experience with the Dashboard (and the associated reports). Some were looking for the ability to customize the Dashboard output (for example, by user, subaccount, county, or geographic region) to meet their own specific needs. Other users were looking to obtain additional data about a particular element portrayed on a Dashboard pane (for example, scores in a status of Not Scored, or TAT outside of three days), in many cases looking to launch the Report Explorer or another medium. This same interactive nature was also expected on at least one of the reports (Score Summary). Most users identified closely with their own custom reports that we already provide, and expected them to be included in our reporting solution. Although pie charts, bar graphs, and line graphs were well recognized, the pyramid chart was not as well received.

The response on the default reports was mixed; of the three presented (Account Order Activity, Search Statistics Summary, and Score Summary), the Search Statistics Summary report seemed somewhat troublesome and could benefit from additional customer feedback to refine it. For these three reports, most users indicated usage between weekly and monthly, and looked for data refresh between real-time and weekly.

While interacting with our reporting solution, our users also identified a number of items (for example, address spelling errors, repair calendar overlays/underlays, maintain start/end date combinations, ensure consistency across reports, etc.). These items appear to be of a nature that can be quickly and easily addressed by the IT Development Team, while also providing a more consistent, enriching user experience.

Overall, many users eagerly await the release of this functionality, and also felt it would be widely used across their organizations. For additional details about the participants, their feedback on the scenarios, and related design suggestions for the initial release, please see the remainder of this report.

Introduction

Purpose

The purpose of this document is to briefly summarize the user feedback that was received from select Asurint customers on the Asurint Reporting Tool (ART) proof of concept that was presented in an interactive usability session in the Abrams Room of the Hyatt Regency Hotel on October 1st, 2015 at the Asurint User Conference.

Based on that customer feedback, this document also presents the suggested design solutions and other recommendations related to the ART user interface for consideration by the IT Development Team and the Executive Team.

Overview

The following sections describe the process we used to obtain feedback from our customers during the Asurint Reporting Tool Proof of Concept Usability Evaluation.

Participants

There were approximately 33 attendees from 24 different clients at the 2015 Asurint User Conference. Of those 33 attendees, the following 11 (from nine different clients) participated in the Asurint Reporting Tool Proof of Concept Usability Evaluation:

Participant | Company | Title
Jackie Rosales | Centerline | Field Compliance Manager
Amanda Allen | LKQ | Human Resources Manager
Peggy Theriot | Jo-Ann Stores | Supervisor of Asset Protection Support
Brenda Forepaugh | Jo-Ann Stores | Asset Protection
Paul Stithem | Mattress Firm | Fleet Manager
Casey Kirk | Honda | Talent Management
Rhonda Pantila | Doherty Employment | B & D, UI Manager
Frank Long | Elwood | Risk Operations Manager
Nick Seger | Elwood | Vice President of Operations
Tina Stickdorn | iForce | Office Services Specialist
Toni North | Safeway | Project Manager, Retail Talent Acquisition

These participants spanned the following four industries:

Transportation (1)

Staffing (4)

Automotive (2)

Retail (4)

Participant Details

The following table summarizes our participants' experience with Asurint tools/services, experience with competitors' tools/services, weekly time spent with Asurint products, and weekly time spent on Asurint reports:

Participant | Asurint Experience (Years) | Competitor Experience (Years) | Time Spent Each Week - Asurint (Hours) | Time Spent Each Week - Asurint Reports (Hours)
Toni North | 3 | 2 | 25 | 2*
Amanda Allen | 2 | 10 | 5 | 0
Peggy Theriot | 2 | 15 | 1 | 1*
Brenda Forepaugh | 3 | 1 | 30 | 20
Paul Stithem | .5 | 0 | 5 | 0
Casey Kirk | 2 | 0 | 2 | 1
Rhonda Pantila | 1 | 17 | 35 | 2*
Frank Long | 5 | 10 | 10 | 1
Nick Seger | 5 | 10 | 10 | 1
Tina Stickdorn | 2 | 2 | 5 | 0
Jackie Rosales | 2 | 10 | 2 | 0
Total | 27.5 | 77 | 130 | 28
Average | 2.5 | 11 | 11.8 | 2.5

* These numbers are approximate based on discussion with participants.

Usability Session Description

In the Abrams Room at the Hyatt Regency Cleveland, two computers, monitors, and keyboards were set up with hard-wired internet access on opposite sides of a boardroom table (monitors facing the outside walls). On each computer, a participant was asked to access the Asurint site (in a development environment - https://aws-dev.asurint.com/ART/) and log in using their standard credentials. Once logged in, the Reporting Dashboard POC was displayed, reflecting their own data. Two facilitators - Greg Lee and Myron Shawala - presented two scenarios and related questions. All answers were manually captured on a second copy of the scenarios by the facilitators.

A copy of the scenarios is located on SharePoint at:

http://sps/ucd/Shared%20Documents/Asurint%20Reporting%20POC/Asurint_Criminal_Reports_UsabilityBASE.docx

Additional Notes

Additional notes that were captured by KJ Newman in the companion ART presentation by Chris Sullivan are located on SharePoint at:

http://sps/ucd/Shared%20Documents/Asurint%20Reporting%20POC/2015%20User%20Conference%20Notes.docx

Special Thanks

We want to extend special thanks to the following individuals, whose insights and assistance were greatly appreciated during the entire design, preparation, and execution of the usability process:

Rich Golias

Mike Nowak

Alex Kubat

George Dolan (Hyatt)

Grace Johnson (Hyatt)

Jessica Sneed

Darlene Zwolinski

Greg Lee

Len Livshin

Chris Sullivan

Steve Palek

User Feedback Summary

The following table summarizes the feedback received from our customers during the Asurint Reporting Tool Proof of Concept Usability Evaluation as captured by Greg Lee and Myron Shawala.

Participant: Jackie Rosales

Company: Centerline

Asurint Team: Myron Shawala

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

First impressions of the Dashboard: "Cool."

My TAT is normally longer (over 3 days); however, this is not shown today (TAT for Criminal Products).

The Not Scored category in Score Types is confusing. How do I get more info about these? Bring up those orders in Report Explorer so I can see them.

Was able to easily identify all default data elements. Appreciated Product Searches - it provided insight into the business.

Liked the Hit Rate chart. Really liked the Order Volume.

Easily identified TAT.

Easily identified number of searches completed. However, would like to see it by month and by year.

Easily identified order volume in April. However, requested a monthly breakdown by product.

Graphics are nice, but she is more focused on the data. Line graphs may be more appropriate in some instances.

Suggestions:

Can this be interactive? Click on September in Order Volume - provide a more detailed breakdown by product.

In Product Searches, click on MVRs - get the number of violations, those with suspended licenses, etc.

Order Volume - make it a bar graph colored by product (shade differently); include percentages.

Scenario 2 - Individual Report Types Feedback

Expected to click Report Explorer to get to reports.

Score Summary - no

Search Statistics - valuable

Account Order - yes

Would reference these reports weekly.

Data-driven individual - really liked the content of the Search Statistics report. Makes sense to her.

Where is the Not Scored that I saw on the Dashboard? Discrepancy between the Dashboard and the report.

What are Primary Orders and Alias Orders?

Code Reference 1 values are mixed - those entered by a person vs. those provided through the Ten Street integration. Can we make them consistent? (110110/111 Colton Ontario)

Selected Score Summary as second report choice.

Base it by user instead? Show by worst criminals? Poorest drivers?

Wants to click on Not Scored - bring up those orders in Report Explorer so she can see them.

Data freshness - update weekly.

Additional Feedback

In Report Explorer, click on Name to sort. Once you click on a subject name, then show everything on that driver.

E-Verify through Ten Street?

Participant: Tina Stickdorn

Company: iForce

Asurint Team: Myron Shawala

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

Initially started looking in the Report Explorer tab.

First impressions of the Dashboard: Colorful and easy to read.

Recognized all report types easily.

Is this interactive? Product Searches - click the drug records, list all ## - another screen where only these reports are shown.

Hit Rates - within each product, rank the results. Most - embezzlement, then espionage, etc.

TAT - click on the section over 3 days, display the Report Explorer to see why they are late. Sort by area to see what area and why.

Order Volume - click September, break it down by branch number.

Identified TAT - makes sense. What is taking over 3 days? Why are these over 3 days? I need more information.

Missed the number of searches completed.

Easily identified order volume in April.

Graphical depiction is good. The Score Type pyramid is confusing - the numbers are the key information. For Score Types, use a bar graph or pie chart.

Suggestions:

See highlighted above.

Scenario 2 - Individual Report Types Feedback

These are all general reports - they are OK.

Would reference these reports monthly.

Search Statistics report - the left side is useful to me: Applicant Summary and Order Scores. Also like the detail of Hit Rate (top right).

Selected Score Summary.

Allows for targeted training. For items not scored, which users and/or branches are slowest to score or not scoring at all? I can then work with them to improve.

Also likes the Order Activity report. Pulls it all together for me.

Data freshness - update in real time; instant. This is very important to me.

Suggestions:

I need an over-3-day TAT report. I would check it every morning. Click a button (Expedite) - send an email to Asurint for more detailed info on what is going on.

Additional Feedback

I am a more visual navigator. I understand the tree and links, but can I get an icon instead? Like my phone.

Participant: Nick Seger/Frank Long

Company: Elwood

Asurint Team: Greg Lee

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

First impressions of the Dashboard: Likes the visuals.

Increase the size of the headers to help differentiate.

Did not like the Score Types pyramid. Too many similar colors and slices - hard to tell the difference. For this presentation, the fewer slices, the better.

No answer to question 2.

Quickly identified TAT.

Easily identified number of searches completed. Good, simple, solid information.

Easily answered order volume in April.

Pyramid should be a bar graph due to the number of elements.

Suggestions:

Keyed in on the Order Volume bar graph. There, show the dollars spent by product. Show me details by month (quarterly is way too long). Would also like to see order volume by branch and geography to gauge costs in those areas.

For criminal products, show the number of felony vs. misdemeanor offenses. For felony, show the top 5 offenses (thefts, DUI, molestation, etc.).

Scenario 2 - Individual Report Types Feedback

Search Statistics - overall, not useful. However, TAT for packages (upper right) is something we'd use.

Score Summary - good. Show it geographically.

Data freshness - did not respond.

Additional Feedback

Need to be able to dive into the Dashboard. Select by user? By county? Refine what I see.

Participant: Rhonda Pantila

Company: Doherty Employment

Asurint Team: Greg Lee

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

Recognized most report types easily.

Product Searches - very good.

Hit Rates - very useful. Drill-down to subaccounts and regions would be very nice. Provide a drop-down to select?

Easily identified order volume in April.

Suggestions:

Printing of the Dashboard would be nice.

Scenario 2 - Individual Report Types Feedback

Would reference these reports weekly, some monthly.

Search Statistics report - Staffing will want to customize this report; needs more package info (percentage of clients that have Package ZZZ).

Score Summary - are aliases included? Aliases vs. primary - show counts for each.

Account Order Activity - ATS system upload to bill their clients (they add their price). Needs Name, SSN, Client Comp, and total price.

Additional Feedback

Provided a couple of additional user interface suggestions:

On the Asurint site (Workplace > Packages), I want a keyboard shortcut so that I can simply press a letter to select specific package names.

The IQ Report Wizard has many UI challenges. For example, there are no cursor defaults.

When entering an employment request on the site, and I'm halfway through entering it (need to get data, alternate task, etc.), there is currently no way to save my progress/pause and then submit it later.

Participant: Casey Kirk

Company: Honda

Asurint Team: Myron Shawala

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

Initially started looking in the Report Explorer tab.

First impressions of the Dashboard: Likes the graphical depiction. Colorful, 3D. Very useful at a glance.

Was able to easily identify all default data - identified it as canned info. KPI for background check activity. Key info: searches and scores. Hit Rate and Score Types most useful to him.

Quickly and easily identified TAT.

Easily identified number of searches completed.

Easily identified order volume in April.

Information is displayed in a pleasing fashion; good. As a Japanese-based company, this is key - pics and data do not lie. This helps us with communications across the ocean.

Suggestions:

Can we get TAT on Education verifications? This is a bottleneck for us now.

Scenario 2 - Individual Report Types Feedback

These canned report types are acceptable.

Would reference reports monthly at minimum, weekly at most.

First user. Error when opening Search Statistics.

Account Order is "dangerously good." Shows cost, date submitted - again, dangerously good.

Data freshness - need it in real time. We're manufacturing - HUGE. Ideally, daily. Realistically, weekly.

Will import reports into Excel.

Suggestions:

I have limited experience with background checks. I like to keep ahead of things - keep my team focused on fulfilling the employment requisitions.

I want to consult with Asurint to form best practices. Help me to select the best reports for me, and then add those reports to the tree so I can easily reference them.

Would like a report that details failed scores and Adverse Action letters sent. If there are 12 fails and 10 letters have been sent, what's going on with the other two? I need to know this information.

Additional Feedback

Report usage is slight now. Will increase tenfold in November.

Hates reporting just to report - it must hold value. If you say "Who cares about this?", then what's the point?

Really likes our interactive chat feature. Prompt, efficient. Captured his conversation and emailed it back to him. Has shared his positive experience with Jessica Sneed.

Participant: Paul Stithem

Company: Mattress Firm

Asurint Team: Myron Shawala

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

Wants to click each element and dive in for more details.

Overall likes the bar graphs and pie charts; not a fan of the triangle.

Order Volume - show seasonal trends.

TAT needs more detail. What about over 3 days? Under 1 day?

TAT - got it right away.

Product Search - what does verification mean?

Verification Volume YTD - we don't do this; why is it blank? Just don't show it for us.

Show me a new element that details pass/fail for drug screens. % fail? % no-show?

I asked for a TAT-in-Florida report one time; it took 3 weeks to get it.

Displayed in a pleasing fashion. Visuals are good.

Score Types - for Not Scored, hover over to see the details/numbers. Click Not Scored (23), immediately take me to the Report Explorer listing only those that are not yet scored. Need some feedback. Need more detail on scores.

Suggestions:

Three pie charts are repetitive. I want to present dashboard elements in a graphical style that I select (bar graphs, line graphs, etc.).

Scenario 2 - Individual Report Types Feedback

Tree navigation makes sense; click and display the selected report.

Useful?:

Score Summary - yes

Search Statistics - no

Account Order - no

This report is very busy. I know who I am - remove the customer column.

Score Summary - monthly, maybe weekly

Search Statistics - monthly

Account Order - monthly

Search Statistics - clean it up for me. If something is not used or useful, get rid of it (Alias Orders - get rid of it).

Score Summary - wanted it to be interactive. Click on the 23 not scored, immediately take me to the Report Explorer listing only those that are not yet scored. Need some feedback.

Data freshness - daily.

Suggestions:

When I specify a Start Date/End Date combo at the top and then navigate to one of the other reports, I want those date settings to hold. Do not reset. Maybe add an "apply date range" check box to the top view.

Revise breadcrumbs to reflect what report I am in: Home > ART > Score Summary.

Scoring Summary - he went in, and rows were in order A (logged in as Paul). Exited, selected all users in the upper right corner, and then came back to the Dashboard. Rows were in order B. Why did the row order change?

Selected the calendar function on Score Summary - the calendar overlaid the screen below, and he could not read the data beneath it. Then went back to the Dashboard and clicked the calendar - it was an underlay and partially hidden. Should be consistent - neither behavior is a good user experience.

Additional Feedback

I would like to configure my Dashboard. Move individual elements around in an order that makes sense to me. And save it so that when I come back in, those changes are maintained.

Participant: Brenda Forepaugh

Company: Jo-Ann Stores

Asurint Team: Myron Shawala

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

She is the "Scoring Queen."

Mentioned multiple times that she is comfortable in her role and the tasks that she completes daily.

First impressions of the Dashboard: She likes it.

Does not work much with reporting now. However, likes to see Order Volume - most important. Verification Volume is not important.

Was able to easily identify all default data presented.

Easily identified TAT.

Easily identified number of searches completed.

Easily identified order volume in April.

Liked the bar charts and line graphs - the most useful representation to her.

Excited to see this in the field. Would be utilized by others at Jo-Ann's.

Scenario 2 - Individual Report Types Feedback

Expected to click Report Explorer to get to reports.

Search Statistics - reminds her of an existing report we provide. Useful in this form, but would require customization. Could do that with export to Excel.

Score Summary - like to see what I, and others, have done in terms of scoring.

Account Order - a lot of these elements are also on her daily report. However, too much; couldn't work from this. Doesn't care about a lot of this info.

Data freshness - would reference these reports daily at 8 AM, especially the Score Summary.

Additional Feedback

I would like a customized report that shows the status of every order: Review (most important), Fail, etc.

Include core reports and my reports in the tree - separate buckets for each.

Participant: Peggy Theriot

Company: Jo-Ann Stores

Asurint Team: Greg Lee

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

Our VP will love this graphical presentation!

Likes the TAT and Order Volume elements.

The pyramid presentation of Score Types is solid. (She was the only one who dug it.)

Quickly identified the number of criminal searches.

Had difficulty finding order volume in April. Tried using the calendars at the top instead.

Doesn't currently look at hit rate for criminal products, but will when they convert to Data + 1.

Wants to see the number of Adverse Actions and Pre-Adverse Actions that have occurred.

Drill down into scoring to see who isn't scored - show me more detail.

Show searches by packages and TAT for those packages.

Costing information (by package)

Hit rate by package (and then see the cost of that package)

TAT by package (and then see the cost of that package)

Scenario 2 - Individual Report Types Feedback

Couldn't find individual reports - had to be helped to locate the tree.

Would reference these reports monthly. Possibly more frequently over time.

Search Statistics report - doesn't want to see applicant aliases. "Average" is spelled wrong. Likes the Criminal product detail (bottom left).

Remove all products that I do not use (upper right).

Selected Score Summary. I want to drill down for more detail - tell me the order date.

Please add all of my current scheduled reports to the tree.

Participant: Amanda Allen

Company: LKQ

Asurint Team: Myron Shawala

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

Our Senior VP loves graphs and charts - he will really like this!

Pie charts are a good medium. Labels make the content of each element clear.

Was able to easily identify all default data.

TAT on criminal - the time and associated percentages were valuable. Everything was less than four days - excellent.

Easily identified number of searches completed.

Easily identified order volume in April.

Graphics are pleasing, eye-catching.

Suggestions:

Would like to see data based on subaccount. Provide a dropdown to view?

Scenario 2 - Individual Report Types Feedback

Expected to click Report Explorer to get to reports.

Score Summary - weekly

Search Statistics - monthly

Account Order - monthly

Search Statistics - show hit rate by region (mid right). Also Search Category by region.

Selected Score Summary. Would use this data to identify training gaps and initiate training. Score more; score faster.

Data freshness - real time.

Really liked the export to Excel. Took a report and began hacking it up immediately.

Suggestions:

Show my reports in the tree. Would like to know the reason for failures month to month.

Participant: Toni North

Company: Safeway

Asurint Team: Myron Shawala

Date and Time: 11.01.2015

Scenario 1 - Reporting Dashboard (Home) Feedback

Why are there three pie charts? I want to select the element type best suited to the info presented (as selected by me).

Product Searches - good

Score Types - confusing. Would like to see it as a percentage.

TAT - good.

Hit Rate - good.

TAT - clearly marked.

Order Volume - this display matches what she does now. Double the graph to show year over year. Confused by the monthly view; compare year over year. Change the title to Order Volume.

Graphics are solid. Must explore color consistency - must differentiate the big ones (see below). Also show percentages on the score types.

Our internal terminology is somewhat different from what is represented here.

Suggestions:

In each of the different elements, always make the largest segment the same color. If I look at Product Searches and the largest percentage is blue, and then look at TAT and the largest segment is red, I get confused.

On the Dashboard, I want to pull up Product Search, TAT, and Hit Rate by region - most important to me (by region). Common to others.

Show a map of the US on the Dashboard. Allow me to navigate in for detail.

Scenario 2 - Individual Report Types Feedback

To me, these report types are not as important.

Our weekly Adjudication Report is important - we refer to it most often. Create monthly reports based on this data. Duplicate SSN is also of value. Provide these for me in the tree.

Search Statistics - monthly

Score Summary - monthly

Account Order - monthly. Show me TAT by county (CA TAT has been wrong for years - talk to Patrick).

Search Statistics report - interesting. Top left - compare to the Adjudication Report. The order count is not accurate. We need to count them only once - see KJ notes.

Wants to click on a link (for example, primary order hit rate) to get more detail.

Likes being able to specify a start date and end date. Also the page-through.

Suggestions:

Move the View Report button closer to the start/end dates.

Design Suggestions

The following sections describe some of the key results identified during our usability sessions, along with associated design suggestions and/or changes.

Dashboard View Customization

During or upon completion of the first scenario, a large number of our users began to request Dashboard customizations based on their own unique, individual needs. By default, the Dashboard reflects data for the currently logged-in user across the following six panes: Product Searches, Score Types, Turnaround Time for Criminal Products, Hit Rate for Criminal Products, Order Volume YTD, and Verification Volume YTD. However, two users wanted to drill into this Dashboard data based on a specific user within their organization. Those same users also mentioned their desire to focus the presentation on an individual county. Along the same lines, another user suggested that we refine the Dashboard presentation by geographic region. And still another user requested to view the Dashboard panes based on subaccounts within their organization.

To support this request, we suggest the ability to edit the display of each of the individual default panes through the integration of an Edit button in the upper right corner of each pane. For example:

When the Edit button is clicked, an Edit dialog is then displayed with a rules configuration interface. For example:

From this proposed user interface, our users can add or remove rules to further refine the presentation in the selected pane based on their own needs. When complete, they simply click OK and those rules are applied to the current pane's display. If they do not make any edits to the selected pane, then the default display remains based on their individual login.
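The rules described above amount to a list of field/value restrictions applied on top of a pane's default data. The following is a minimal sketch of that idea only; PaneFilter, apply_filters, and the order field names are hypothetical illustrations, not part of the ART design:

```python
# Sketch of per-pane filter rules for the proposed Edit dialog.
# All names (PaneFilter, the order fields) are hypothetical.
from dataclasses import dataclass

@dataclass
class PaneFilter:
    field: str   # e.g. "user", "subaccount", "county", "region"
    value: str   # value the pane should be restricted to

def apply_filters(orders, filters):
    """Return only the orders matching every configured rule; with no
    rules, the pane keeps its default (logged-in user) data."""
    return [o for o in orders
            if all(o.get(f.field) == f.value for f in filters)]

orders = [
    {"user": "jrosales", "county": "Cuyahoga", "score": "Pass"},
    {"user": "jrosales", "county": "Franklin", "score": "Not Scored"},
    {"user": "aallen",   "county": "Cuyahoga", "score": "Fail"},
]
rules = [PaneFilter("county", "Cuyahoga")]
print(len(apply_filters(orders, rules)))  # 2
```

Because an empty rule list matches everything, a pane with no edits falls back to its full default data set, matching the behavior described above.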

Most Useful Dashboard Panes

When reviewing the default set of six panes that we provided, the highest number of users (four) identified the Order Volume YTD details as the most valuable. Next, three users found the Turnaround Time for Criminal Products, Hit Rate for Criminal Products, and Score Types information somewhat valuable. Finally, one user stated that the Product Searches information had value, while none of the participants immediately identified the Verification Volume YTD as important to them.

Considering our clients' responses on information value, we suggest ordering the most useful panes across the top of the set, from left to right. That said, we recommend the following re-ordering of the panes compared to the order originally presented:

Top Three (from left to right)

Order Volume YTD

Turnaround Time for Criminal Products

Hit Rate for Criminal Products

Bottom Three (from left to right)

Score Types

Product Searches

Verification Volume YTD

For example:

Interactive Dashboard Panes

Although most users found some level of value in the Dashboard panes, it became clear early on that our users expected something more from each of the panes - the ability to click or hover over elements for more information and details. The following table summarizes some of our customer responses in this area:

Dashboard Pane | Comment
Score Types | For Not Scored scores, click to display only those not scored in the Report Explorer.
Order Volume YTD | If September shows 1200 orders, click the bar to show me how those 1200 orders were split across products (900 Criminal, 100 Verifications, etc.). Revise to separate the bar by color and section to reflect products across all of those orders. Include percentages in the sections.
Product Searches | Click the MVR chunk - provide details on number of violations, suspended licenses.
Product Searches | Click the drug records in Product Searches - show me the numbers, and then more info on these only.
Hit Rate for Criminal Products | Within each chunk of the pie, list how they broke down (of this percentage, 25 were offense A, 67 were offense B, etc.).
TAT | Click on those criminal products over 3 days and take me to the Report Explorer. What is taking over 3 days? And why? (Multiple users said this.)
Order Volume YTD | Show dollars spent by product. Show me these details by month. Show me by branch to gauge costs in select areas.
Hit Rate for Criminal Products | Show Felony vs. Misdemeanor. For Felony, show the top five offenses found.
Order Volume YTD | Show seasonally. Need a way to diffuse this information.
TAT | What about over 3 days or under 1 day? Show more information.
Score Types | Hover over for more data. Click on Not Scored and go to the Report Explorer for only those types - need more detail on scores.
Score Types | Wanted to drill down into individual scoring elements and see who isn't scored. Show me more detail.
 | Show me costs by package. Hit rate by package (with costs). TAT by package (with costs).
Hit Rate for Criminal Products | By package (with costs).
TAT | By package (with costs).
Score Types | Would like to know the reason for fails - show me.
Score Types | Show as a percentage instead.
Order Volume YTD | Show year over year instead.

Interactivity was requested most often for the Score Types pane. The pyramid presentation of this information was confusing to many of the users (see section 3.7.6), which may have contributed to the request for additional data. However, those who wanted more information were focused on displaying more details related to the Not Scored and Fail categories; in particular, at least two users wanted to click a specific score type and then launch the Report Explorer to see more complete information on orders with only that score type.

In this instance, we suggest launching the Report Explorer and displaying all of the selected scores at the top of the order list. Further, an alternate solution may be to allow users to hover their cursor over the specific score, and then display key order details. For example, if there are three orders in the Fail state, when the user hovers over the number 3, then the following is displayed above the Score Types pane:

The second most commonly selected pane for feedback was the Order Volume pane. However, the suggestions on this pane were very diverse (see above). We suggest additional research and discussion with customers to determine the best course of action in this area.

The next most requested panes for response were the Hit Rate and TAT panes. Of the three users requesting more information on TAT, two users wanted to know more information about those orders that exceeded three days. For these users, we could implement a similar response as described above for Score Types: click on the slice that shows over three days, and then launch the Report Explorer and display only those orders that are over three days TAT. Additionally, we could also explore the display of a hover over (see above for the Score Types pane).
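The drill-down behavior described above amounts to a simple filter over the order list. The sketch below illustrates the idea; the order records and the tat_days field are hypothetical placeholders, not the actual Reporting Tool data model:

```python
# Hypothetical sketch of the TAT drill-down: clicking the "over 3 days"
# slice filters the order list to only those orders exceeding the
# threshold, sorted worst-first for display in the Report Explorer.

def orders_over_tat(orders, threshold_days=3):
    """Return orders whose turnaround time exceeds threshold_days,
    sorted from longest to shortest TAT."""
    over = [o for o in orders if o["tat_days"] > threshold_days]
    return sorted(over, key=lambda o: o["tat_days"], reverse=True)

orders = [
    {"id": "A-101", "tat_days": 1},
    {"id": "A-102", "tat_days": 5},
    {"id": "A-103", "tat_days": 4},
]
print(orders_over_tat(orders))  # A-102 (5 days) first, then A-103 (4 days)
```

The same filter-and-sort shape would apply to any slice the user clicks; only the predicate changes.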

For the Hit Rate pane, two of three users requested additional information about the type of offenses found within each product type. In this instance, we may also want to consider a hover-over to display the offenses within each segment, listing them in priority order (largest to smallest). For example:

Locating The Dashboard And Reports

Of the 11 users who contributed to our test, four initially looked for the Dashboard and/or the individual reports in the Report Explorer tab. However, after exploring on their own and/or receiving limited assistance from the facilitators on the true location, each user became immediately comfortable with the Asurint Reporting Tool tab itself, as well as the associated report tree. The remaining seven users had little difficulty in locating the Dashboard and/or the associated reports. Further, most of the users indicated that the chosen report types behaved as expected when selected (the chosen report type was displayed).

It is the suggestion of this team to leave the Asurint Reporting Tool tab and report tree as is since most users were comfortable with the current orientation. However, we suggest including the New designator on the Asurint Reporting Tool tab (similar to what we did for the Task Manager tab when it was initially released). Additionally, we also recommend that the new functionality be covered fully in all training modules for our new users.

Report Types

The response to our three default reports (Search Statistics Summary, Score Summary, and Account Order Activity) was mixed. Most users felt these reports were canned; the overall content was adequate, but some of it may not be very useful. A few participants (2) explicitly did not find any value in these reports. The following sections provide additional detail on each report type.

Score Summary Report

The Score Summary Report got high marks from at least three users, as they indicated that the content of this report would help them identify training gaps for their teams (for example, pointing out users who weren't keeping up with their scoring, identifying users who should not be scoring at all, or determining which branches are behind in their scoring). With this data in hand, they could quickly provide targeted training to the team members who required it.

Further, similar to the Score Types pane in the Dashboard, two users wanted to click a number under a score type in the table, and then display the respective orders in the Report Explorer for that score type.

With the focus on scoring in the Dashboard, we recommend maintaining this report as one of the default set provided in the first release. Additionally, we should move this report to the top of the general report tree.

Search Statistics Summary Report

The Search Statistics Summary Report got the most diverse marks. Although very busy and a mishmash of data, the report had portions that some users found useful, while others indicated that it was not useful to them at all (see details in the User Feedback Summary, section 3.6). Further, some users asked that the layout and content be cleaned up and, where they deemed something of little value, wanted it removed entirely. One user also felt that the content would have to be customized for individual users to ensure its usefulness.

We suggest working with our users to further refine the important elements of this report to ensure that value is provided. Of the three initially identified, we also recommend placing this one at the bottom of the general report tree.

Account Order Activity Report

The Account Order Activity Report fared fairly well and was liked by some of the users who responded; we believe this is because it was the cleanest and most clearly formatted of the three reports shown during the test. A couple of users indicated that this report helped them pull together all of the results we provide; another user was very happy with the content, calling it dangerously good.

However, two users had the opposite response: one indicated that the report was very busy and the customer column was not necessary. The second user felt this report held too much information, much of it data that they didn't care about.

With the split in usefulness of this report, we suggest leaving it in the default set for now. Once released, if we get negative feedback, then we should work with our customers to identify an alternate, replacement report.

Custom Reports

Overall, most participants were fairly accepting of the three canned reports that we provided: Search Statistics Summary, Score Summary, and Account Order Activity. However, after reviewing the content of these reports, many users immediately identified with one or more of the existing reports that we already provide to them. To that end, four users were hoping to be able to add those custom reports to the report tree so they could reference them quickly, as needed.

One user also expressed that they were new to the reporting area of Asurint and wanted to work directly with us to provide the best reporting experience for them, including the creation of multiple custom reports that they could use to improve their interaction with Asurint.

We recommend that each user's report tree be populated with all of their custom reports (by default). In addition, since our users seem to find their custom reports more valuable than some of the standard set of reports provided, we also suggest re-formatting the report tree to have three tiers: Home (Dashboard), My Reports, and General Reports. For example:

Frequency Of Report Usage

Of the three canned reports provided, the frequency with which each report would be used was also mixed. Three users responded monthly, while four other users responded monthly (with some weekly). A single user indicated that these reports would be used weekly.

Expected Frequency Of Data Refresh

Three users expected the data represented on both the Dashboard and in the individual canned reports to be updated in real-time, while three additional users expected the data elements to be updated on a daily basis.

Outside of the canned reports, two of those users added that they would check the data of their custom reports in the early morning to start their days. Further, other users were a little more relaxed in their data refresh expectations, looking to have the data refreshed on a weekly basis only.

At this time, we recommend updating the data on a daily basis, after hours, so that users have updated information when they arrive in the office the following morning.

Score Type Pyramid Is Confusing

Going into the test, we wanted to display a number of different graphical elements available to us from Microsoft Reporting Services 2012 - these included bar graphs, pie charts, line graphs, the pyramid, and other elements. By displaying a number of different options, we were hoping to identify those elements that were most meaningful and comfortable to our users.

Of those presented, the pyramid took the most abuse. Although one user liked this presentation of the information, four others were confused by the pyramid. Of those users, two indicated that the level of confusion increased when the maximum number of score types was depicted; the color differentiation was poor and the line callouts were hard to decipher. Two users requested that we provide this information by alternate means (possibly a bar graph) and focus on the numbers, which are the most important.

Our recommendation is to focus on more standard graphical elements (such as bar graphs, line charts, and pie charts) in the initial release since our users were most comfortable with them. Additionally, when we migrate to Microsoft Reporting Services 2016, there may be new and additional graphical elements that we can consider for inclusion.

Import Into Excel

Of the users we spoke to, three readily recognized the importance of saving their report data to an alternate format (for example, Excel) to customize the content for their needs and readily distribute it in their organization. During our test, one user quickly opened one of the default reports, exported it to Excel, and began tweaking it to meet her needs in real-time.

This is a default function of Microsoft Reporting Services 2012 and should be maintained.

Miscellaneous

During customer interaction with the Dashboard and the individual reports in our proof of concept, the following simple, solid suggestions were presented. From our perspective, these suggestions could be implemented with very little effort and provide a high level of value.

Address Spelling Errors

While combing through the Search Statistics Summary Report, one user noticed that the word Average was misspelled as Avrerage.

At this time, we suggest that all spelling errors be fixed in this report, and across all of our customer reports as well, to provide a professional-looking data presentation to our user base.

Set Breadcrumbs To Match Selection

When you click the Asurint Reporting Tool in the left navigation pane on the Asurint site, the breadcrumbs reflect Home > ART. In this case, we should match the actual user selection and reflect Home > Asurint Reporting Tool. Further, when a user selects one of the default reports, we should add that default report name to the breadcrumb (for example, Home > Asurint Reporting Tool > Score Summary).
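The suggested behavior reduces to joining the user's actual selection path into a trail. A minimal sketch (the function name and path representation are our own, not the site's implementation):

```python
def breadcrumb(path):
    """Build a breadcrumb trail that mirrors the user's actual selection,
    rather than an abbreviation such as 'ART'."""
    return " > ".join(path)

print(breadcrumb(["Home", "Asurint Reporting Tool"]))
# Home > Asurint Reporting Tool
print(breadcrumb(["Home", "Asurint Reporting Tool", "Score Summary"]))
# Home > Asurint Reporting Tool > Score Summary
```

Because the trail is derived directly from the selection path, the breadcrumb can never drift out of sync with what the user actually clicked.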

Fix Calendar Underlays/Overlays

A couple of users noticed some discrepancies when initiating the calendar functions within the Dashboard or when one of the individual reports was displayed. For example, from the Dashboard, if you click the calendar icon next to the Start Date, then the actual calendar is partially hidden by the first pane in the set:

Additionally, if you click the calendar icon next to the Scoring Start Date when the Score Summary Report is displayed, the opposite happens: the entire calendar is displayed on top of the report and impedes your ability to view the report content below it. For example:

Our suggestion is to ensure that the calendar is displayed on top of the Dashboard or individual report, without allowing the content beneath it to bleed through the calendar itself.

Start Date/End Date Combination Maintained

During our sessions, one user was viewing the Scoring Report and entered a start date and end date pair to refine the data presentation. After viewing the results, he then navigated to one of the remaining reports. His expectation was that the start and end date pair he previously entered would be maintained; however, when the second report was displayed, the previously entered start date and end date were lost.

Our suggestion is to add an Apply date range check box at the top of each report form. When selected, any date values entered for a single report or Dashboard are then maintained when navigating to any other form. For example:
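The behavior we are suggesting can be sketched as a small piece of session state: when the box is checked, the last-entered date pair is carried into every report the user navigates to; when unchecked, each report falls back to its own defaults. The class and method names below are hypothetical illustrations, not the actual application code:

```python
# Hypothetical sketch of the "Apply date range" check box behavior.

class ReportSession:
    def __init__(self):
        self.apply_date_range = False   # state of the check box
        self.start_date = None
        self.end_date = None

    def set_dates(self, start, end):
        """Record the date pair the user entered on any report form."""
        self.start_date, self.end_date = start, end

    def dates_for(self, report_defaults):
        """Return the (start, end) pair to show on the next report:
        the carried-over pair if the box is checked, else the
        report's own defaults."""
        if self.apply_date_range and self.start_date and self.end_date:
            return (self.start_date, self.end_date)
        return report_defaults

session = ReportSession()
session.set_dates("2015-09-01", "2015-09-30")
session.apply_date_range = True
print(session.dates_for(("2015-10-01", "2015-10-20")))
# ('2015-09-01', '2015-09-30')  -- the entered pair is maintained
```

Unchecking the box restores each report's own default range without discarding the remembered pair, so the user can toggle back at any time.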

Select Custom Chart Type

During the Dashboard scenario, one user began musing out loud about the different chart types depicted on the Dashboard. To him, some chart types were pleasant and useful, while others were not. To that end, he requested that we provide a means to select the chart type.

In addition to the Edit functionality for each pane that was discussed earlier in this section (Edit dialog), we may want to include an additional parameter:

Display - Chart type

Operator is =

Value = bar chart, line graph, etc.

For example:

Move The View Report Button Closer To The Start/End Dates

At least one user was perplexed about how to update the individual report content based on the start and end dates entered. Once guided to the View Report button, they suggested that it was too far removed from the date interaction controls, making it difficult to make the association.

We suggest moving the View Report button closer to the date controls so that users feel a distinct correlation to the dates being entered. For example:

Additionally, we may also want to consider allowing users to press the Enter key after inputting the End Date to update the display.

Report Consistency

During the course of the day, one of our users, as well as both of the facilitators, noticed a vast inconsistency amongst the three core reports displayed in scenario 2. For example, each report included start and end date parameters, but their names were each subtly different:

Report Name                 Date Name 1          Date Name 2

Search Statistics Summary   startDate            endDate

Score Summary               Scoring Start Date   Scoring End Date

Account Order Activity      StartDate            EndDate

Of the three reports, only the Account Order Activity Report included a title. However, the included title (Customer Order Activity) did not match the report name in the reports tree.

Another area of concern was the report headings. In the Search Statistics Summary, the header rows were bold, but of mixed font sizes (some one size larger than, and some equal to, the report content). In the Score Summary, the header rows were bold and of the same font size as the content rows. Similarly, in the Account Order Activity report, the header rows were bold and of the same font size as the report content; additionally, the header rows were shaded light blue, making it the only report with that feature.

In this instance, we suggest including a bold report name at the top of each report (centered, two font sizes larger than the report text) and ensuring that the report title matches the name of the report listed in the reports tree. Additionally, we suggest the use of a bold table header (one font size larger than the content) with a light gray or silver shade to help differentiate the header information from the content. We also recommend the consistent use of Start Date and End Date across all reports. Finally, we suggest displaying the report content in one font size smaller than the header row, in a normal (non-bold) font.

For example:

Remove Not Applicable Dashboard Panes

When viewing the Dashboard, one user immediately recognized that the Verifications Volume YTD pane did not apply to him. Although we included a message in that pane indicating that it was not available, he suggested that we instead remove that pane since it had no use for him.

We agree; we recommend that if one of the default panes does not apply to a particular user, we remove that pane and display only those panes that apply.

Use Common Colors Across the Dashboard Panes

As a possible improvement, one participant suggested that we use common colors across each of the panes to represent similar levels of data (for example, the largest number of Product Searches and the largest Hit Rate for Criminal Products would be displayed in the same shade of blue, the second largest in both panes in red, etc.).

To support the quick look and review of our screening activities, we recommend incorporating the color match across the panes.

Increase Dashboard Header Size

Although one participant shared that they really liked the individual pane headers, another participant recommended that we increase the size of the pane headers to further set each pane off from the next.

To further the quick recognition of the Dashboard panes, we recommend implementing this suggestion as well.