Moneyball for Litigation and Compliance

TRANSCRIPT

Page 1: Moneyball for Litigation and Compliance

877.557.4273

catalystsecure.com

Moneyball for Litigation and Compliance

WEBINAR

Adam Tondryk Mark Noel

Speakers

Page 2: Moneyball for Litigation and Compliance

Speakers

A former litigation associate, Adam is an expert at helping review teams use Catalyst’s advanced technology to achieve greater efficiency and savings. He has 10 years’ experience managing a large variety of document reviews (including overseas) and developing best practices for quality assurance and review project management protocols. He regularly assists clients with designing document review workflows and is adept at forming rapid-response teams to staff any size and type of review.

Adam Tondryk, Managing Director, Managed Review, Catalyst

Mark specializes in helping clients use technology-assisted review, advanced analytics, and custom workflows to handle complex and large-scale litigations. He also works with Catalyst’s research and development group on new litigation technology tools. Before joining Catalyst, Mark was an intellectual property litigator with Latham & Watkins LLP, co-founder of an e-discovery software startup, and a research scientist at Dartmouth College’s Interactive Media Laboratory.

Mark Noel, Managing Director, Professional Services, Catalyst

Page 3: Moneyball for Litigation and Compliance

Today’s Agenda

• What we mean by “moneyball”

• Work through the checklist

• Practical examples along the way

• Questions taken at any time

Page 4: Moneyball for Litigation and Compliance

How Does 1 Reviewer Do the Work of 48?

Reviewers typically code documents at a pace of 50 per hour. In a review of 723,537 documents, how many reviewers would you need to finish in five days?

TAR 1.0 vs. TAR 2.0: Using early generations of technology assisted review (TAR 1.0), you would need 48 reviewers. Using TAR 2.0 with continuous active learning (CAL), you would need only one reviewer, and you'd achieve superior results!

Get the details: 877.557.4273 | catalystsecure.com
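A quick back-of-the-envelope check of the arithmetic behind those figures (a minimal sketch; the 8-hour review day and the assumption that TAR reduces how many documents need eyes-on review are ours, not stated on the slide):

```python
# Reviewer capacity math behind the infographic above.
# Assumptions (ours): an 8-hour review day; TAR reduces the number of
# documents that require eyes-on human review.

DOCS_PER_HOUR = 50
HOURS_PER_DAY = 8
DAYS = 5

docs_per_reviewer = DOCS_PER_HOUR * HOURS_PER_DAY * DAYS  # 2,000 docs per reviewer in 5 days

# Implied eyes-on review populations behind the slide's reviewer counts:
for label, reviewers in [("TAR 1.0", 48), ("TAR 2.0 / CAL", 1)]:
    print(f"{label}: {reviewers} reviewer(s) cover {reviewers * docs_per_reviewer:,} documents in {DAYS} days")

# TAR 1.0: 48 reviewer(s) cover 96,000 documents in 5 days
# TAR 2.0 / CAL: 1 reviewer(s) cover 2,000 documents in 5 days
# Neither approach hand-reviews all 723,537 documents; the savings come from
# how few documents the technology leaves for human review.
```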

Page 5: Moneyball for Litigation and Compliance
Page 6: Moneyball for Litigation and Compliance
Page 7: Moneyball for Litigation and Compliance
Page 8: Moneyball for Litigation and Compliance

Tracking Cost and Review Progress

• Select the metrics you want to track during the project.

• Set up your daily reporting for those metrics

Page 9: Moneyball for Litigation and Compliance

Reporting Example 1

Page 10: Moneyball for Litigation and Compliance

Reporting Example 2

Page 11: Moneyball for Litigation and Compliance

Reporting Example 3

Page 12: Moneyball for Litigation and Compliance

Reporting Example 4

Page 13: Moneyball for Litigation and Compliance

Reporting Example 5

Page 14: Moneyball for Litigation and Compliance

Reporting Example 6

Page 15: Moneyball for Litigation and Compliance

Pre-Review Planning

• Be clear about your objectives, and remember the three broad categories of review tasks:

• Where are you trying to reasonably classify documents?

• Where are you trying to completely protect some data from inadvertent disclosure?

• Where are you trying to learn something from the documents’ contents?

Page 16: Moneyball for Litigation and Compliance

Pre-Review Planning

• Calculate the number of reviewers needed to meet deadlines based on a range of review rates, and ensure that quality control time and project management time are included (a staffing sketch follows this list)

• Consider factors that can impact review rate
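A simple way to run that staffing calculation across a range of review rates (a sketch; the review population, deadline, and the 10% allowance for quality control and project management are illustrative assumptions, not figures from the webinar):

```python
import math

# Staffing estimate across a range of review rates (illustrative numbers).
REVIEW_POPULATION = 250_000   # documents to review (hypothetical)
DEADLINE_DAYS = 10            # working days until the deadline (hypothetical)
HOURS_PER_DAY = 8
QC_PM_OVERHEAD = 0.10         # extra time reserved for QC and project management

for docs_per_hour in (40, 50, 60, 75):
    hours_needed = REVIEW_POPULATION / docs_per_hour * (1 + QC_PM_OVERHEAD)
    reviewers = math.ceil(hours_needed / (DEADLINE_DAYS * HOURS_PER_DAY))
    print(f"{docs_per_hour} docs/hour -> {reviewers} reviewers")
```

Running a range of rates rather than a single rate shows how sensitive headcount is to reviewer speed, which is why planning against a range is recommended.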

Page 17: Moneyball for Litigation and Compliance

Pre-Review Culling and Organizing

• Cull to your review population

• Organize your review population

Page 18: Moneyball for Litigation and Compliance

The 1st Problem with Keyword Search

• (((master settlement agreement OR msa) AND NOT (medical savings account OR metropolitan standard area)) OR s. 1415 OR (ets AND NOT educational testing service) OR (liggett AND NOT sharon a. liggett) OR atco OR lorillard OR (pmi AND NOT presidential management intern) OR pm usa OR rjr OR (b&w AND NOT photo*) OR phillip morris OR batco OR ftc test method OR star scientific OR vector group OR joe camel OR (marlboro AND NOT upper marlboro)) AND NOT (tobacco* OR cigarette* OR smoking OR tar OR nicotine OR smokeless OR synar amendment OR philip morris OR r.j. reynolds OR ("brown and williamson") OR("brown & williamson") OR bat industries OR liggett group)

Jason R. Baron, Through A Lawyer’s Lens: Measuring Performance in Conducting Large Scale Searches Against Heterogeneous Data Sets in Satisfaction of Litigation Requirements, University of Pennsylvania Workshop, (October 26, 2006)

It can become overly complex

Page 19: Moneyball for Litigation and Compliance

The 2nd Problem with Keyword Search: Generally Poor Results

Recall: the percentage of relevant documents from the Collection Set that are in the Review Set.

Precision: the percentage of documents in the Review Set that are truly relevant (the balance of the Review Set is junk).
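In code, those two definitions look like this (a minimal sketch with made-up counts):

```python
# Recall and precision for a hypothetical keyword-search review set.
relevant_in_collection = 10_000   # truly relevant documents in the collection set
relevant_in_review_set = 2_000    # of those, how many the search actually pulled in
review_set_size = 8_000           # total documents the search pulled in

recall = relevant_in_review_set / relevant_in_collection   # 0.20 -> 20% of relevant docs found
precision = relevant_in_review_set / review_set_size       # 0.25 -> 25% of the review set is relevant

print(f"Recall: {recall:.0%}  Precision: {precision:.0%}")
```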

Page 20: Moneyball for Litigation and Compliance

The 2nd Problem with Keyword Search: Generally Poor Results

• Attorneys worked with experienced paralegals to develop search terms. Upon finishing, they estimated that they had retrieved at least three quarters (75%) of all relevant documents.

• What they actually retrieved: roughly 20%.

Blair & Maron, An Evaluation of Retrieval Effectiveness for a Full-Text Document-Retrieval System (1985).

Page 21: Moneyball for Litigation and Compliance
Page 22: Moneyball for Litigation and Compliance

Review Best Practices

• Keep a decision log

• Use email threading

• Organize documents with technology assisted review (TAR, or “predictive coding”), or with near-duplicate clustering so that reviewers see similar documents at nearly the same time
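One common way near-duplicate grouping works is to compare documents' text vectors and cluster anything above a similarity threshold. A minimal sketch, assuming scikit-learn is available; the TF-IDF/cosine approach and the 0.8 threshold are illustrative, not a description of any particular platform's feature:

```python
# Group near-duplicate documents so reviewers see similar texts together.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "Please find the attached master settlement agreement for review.",
    "Please find attached the master settlement agreement for your review.",
    "Lunch is moved to noon on Friday.",
]

tfidf = TfidfVectorizer().fit_transform(docs)   # one vector per document
sim = cosine_similarity(tfidf)                  # pairwise similarity matrix

THRESHOLD = 0.8
groups, assigned = [], set()
for i in range(len(docs)):
    if i in assigned:
        continue
    group = [j for j in range(len(docs)) if sim[i, j] >= THRESHOLD]
    assigned.update(group)
    groups.append(group)

print(groups)  # e.g. [[0, 1], [2]]: the two near-duplicate emails land in one group
```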

Page 23: Moneyball for Litigation and Compliance

Prioritized Review

[Chart: responsiveness plotted against the order of review, with responsive documents concentrated early and a stopping point marked]

Page 24: Moneyball for Litigation and Compliance

Review Best Practices

• Consider where to review at a family level and where to review at a document level

• Do not mix several concepts into one tag or code, as this will seriously impact your ability to use analytics, TAR, or advanced quality control checking farther down the line

Page 25: Moneyball for Litigation and Compliance

Separate Concepts Means Separate Coding

[2 x 2 grid: Responsive = YES / NO crossed with Privileged = YES / NO]

Page 26: Moneyball for Litigation and Compliance

Separate Concepts Means Separate Coding

[2 x 2 grid: Responsive = YES / NO crossed with Produceable = YES / NO]
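The practical payoff of keeping concepts in separate fields is that every cell of those grids stays countable and filterable downstream. A small sketch (the field names are hypothetical, not Catalyst's schema):

```python
# Separate coding fields keep every combination distinct, unlike a single
# combined tag such as "Responsive - Not Privileged".
from collections import Counter

coding = [
    {"doc_id": 1, "responsive": True,  "privileged": False},
    {"doc_id": 2, "responsive": True,  "privileged": True},
    {"doc_id": 3, "responsive": False, "privileged": False},
    {"doc_id": 4, "responsive": True,  "privileged": False},
]

# Cross-tab of the 2 x 2 grid shown above.
grid = Counter((d["responsive"], d["privileged"]) for d in coding)
for (resp, priv), n in sorted(grid.items()):
    print(f"Responsive={resp}, Privileged={priv}: {n}")

# Because the concepts are separate, the produceable set is a simple filter:
produceable = [d["doc_id"] for d in coding if d["responsive"] and not d["privileged"]]
print("Produceable:", produceable)  # [1, 4]
```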

Page 27: Moneyball for Litigation and Compliance
Page 28: Moneyball for Litigation and Compliance

Quality Control

• Employ individual sample QC at the outset

• Track errors to track reviewer performance

• Track error types

• Perform random sample QC throughout review to identify other issues

Page 29: Moneyball for Litigation and Compliance

Quality Control Metrics

• Overturn rate in a validation sample

• Overturn yield:

[Chart: overturn yield across successive validation samples: 61%, 49%, 41%, 31%, 17%, 19%, 14%, 17%, 15%]
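One common way these metrics are computed from QC samples (a sketch; the counts are made up, and exact definitions of "overturn yield" vary from project to project):

```python
# Overturn rate per successive QC sample (illustrative counts).
samples = [
    {"sample": 1, "reviewed": 200, "overturned": 122},
    {"sample": 2, "reviewed": 200, "overturned": 98},
    {"sample": 3, "reviewed": 200, "overturned": 30},
]

for s in samples:
    rate = s["overturned"] / s["reviewed"]
    print(f"Sample {s['sample']}: overturn rate {rate:.0%}")

# Sample 1: overturn rate 61%
# Sample 2: overturn rate 49%
# Sample 3: overturn rate 15%
# A rate that falls over successive samples, as in the chart above, indicates
# that reviewer coding is converging on the review standard.
```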

Page 30: Moneyball for Litigation and Compliance

Additional Value Adds

• Your review company should be highlighting additional generic privilege terms. For example, we recently found a batch of privileged documents that were coded as not privileged by counsel. Quick confirmation with counsel enabled us to correct the coding errors.

Page 31: Moneyball for Litigation and Compliance

Additional Value Adds

• Your review company should be finding new attorneys, confirming them with you and/or counsel, and adding the new attorney names to privilege highlighting. It is a very rare review in which new attorneys aren't discovered by the review team.

• Your review company should encourage questions from the team and capture the Q/A results on a decision log

Page 32: Moneyball for Litigation and Compliance
Page 33: Moneyball for Litigation and Compliance

Leveraging Technology

• Use technology assisted review (“predictive coding”)

• Automate privilege logs using technology (such as dropdowns for each portion of the privilege description)

• Remember that many questions about a large document collection can be quickly answered by reviewing a small random sample and calculating some simple statistics
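For example, estimating the richness (prevalence) of a collection from a random sample, with a rough confidence interval. A minimal sketch using a normal-approximation interval; the sample counts and collection size are made up:

```python
import math

# Estimate collection richness from a small random sample (illustrative counts).
sample_size = 1_500          # randomly sampled documents reviewed
responsive_in_sample = 120   # of those, coded responsive

p_hat = responsive_in_sample / sample_size                 # point estimate of richness
z = 1.96                                                   # ~95% confidence
margin = z * math.sqrt(p_hat * (1 - p_hat) / sample_size)

print(f"Estimated richness: {p_hat:.1%} +/- {margin:.1%}")  # 8.0% +/- 1.4%

collection_size = 1_000_000  # hypothetical collection
low = (p_hat - margin) * collection_size
high = (p_hat + margin) * collection_size
print(f"Implied responsive documents: {low:,.0f} to {high:,.0f}")
```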

Page 34: Moneyball for Litigation and Compliance

Prioritized Review

[Chart: responsiveness plotted against the order of review, with responsive documents concentrated early and a stopping point marked]

Page 35: Moneyball for Litigation and Compliance

Two-Tailed Privilege Review

[Diagram: privileged documents split between Priv Logging at one end and 2nd Pass Priv review at the other]

Page 36: Moneyball for Litigation and Compliance

Active Quality Control


Page 37: Moneyball for Litigation and Compliance

Leveraging Technology

• Use technology assisted review (“predictive coding”)

• Automate privilege logs using technology (such as dropdowns for each portion of the privilege description)

• Remember that many questions about a large document collection can be quickly answered by reviewing a small random sample and calculating some simple statistics

Page 38: Moneyball for Litigation and Compliance
Page 39: Moneyball for Litigation and Compliance
Page 40: Moneyball for Litigation and Compliance

How Does 1 Reviewer Do the Work of 48?

Reviewers typically code documents at a pace of 50 per hour. In a review of 723,537 documents, how many reviewers would you need to finish in five days?

TAR 1.0 vs. TAR 2.0: Using early generations of technology assisted review (TAR 1.0), you would need 48 reviewers. Using TAR 2.0 with continuous active learning (CAL), you would need only one reviewer, and you'd achieve superior results!

Get the details: 877.557.4273 | catalystsecure.com

Page 41: Moneyball for Litigation and Compliance
Page 42: Moneyball for Litigation and Compliance

Questions & Discussion

You may use the chat feature at any time to ask questions

Mark [email protected]

Adam [email protected]

Page 43: Moneyball for Litigation and Compliance

CLE Webinar with Bloomberg Big Law Business: Advanced Techniques for Managing Digital Discovery

Nov. 16th | 1 p.m. Eastern Time

Mark Noel, Catalyst

Kim Leffert, Mayer Brown

COMING SOON

Hon. David J. Waxse, U.S. District Court for the District of Kansas

Page 44: Moneyball for Litigation and Compliance

We hope you’ve enjoyed this discussion. Thank You!

A former litigation associate, Adam is an expert at helping review teams use Catalyst’s advanced technology to achieve greater efficiency and savings. He has 10 years’ experience managing a large variety of document reviews (including overseas) and developing best practices for quality assurance and review project management protocols. He regularly assists clients with designing document review workflows and is adept at forming rapid-response teams to staff any size and type of review.

Adam Tondryk, Managing Director, Managed Review, Catalyst

Mark specializes in helping clients use technology-assisted review, advanced analytics, and custom workflows to handle complex and large-scale litigations. He also works with Catalyst’s research and development group on new litigation technology tools. Before joining Catalyst, Mark was an intellectual property litigator with Latham & Watkins LLP, co-founder of an e-discovery software startup, and a research scientist at Dartmouth College’s Interactive Media Laboratory.

Mark Noel, Managing Director, Professional Services, Catalyst

[email protected]

[email protected]