Anonymizing Health Data

Posted on 28-Jan-2015


Category: Health & Medicine


DESCRIPTION

Slide deck from the O'Reilly webcast on the "Anonymizing Health Data" book

TRANSCRIPT

Anonymizing Health Data Webcast

Case Studies and Methods to Get You Started

Khaled El Emam & Luk Arbuckle


Part 1 of Webcast: Intro and Methodology

Part 2 of Webcast: A Look at Our Case Studies

Part 3 of Webcast: Questions and Answers


Part 1 of Webcast: Intro and Methodology


To Anonymize or not to Anonymize

Consent needs to be informed.

Not all health care providers are willing to share their patients' PHI.

Anonymization allows for the sharing of health information.

Compelling financial case: a breach costs ~$200 per patient.

Privacy-protective behaviors by patients.

Masking Standards

First name, last name, SSN.

Distortion of data—no analytics.

Removing a whole field.

Creating pseudonyms.

Replacing actual values with random ones.
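These masking operations are easy to express in code. A minimal Python sketch of the three techniques listed above (removing a field, replacing values with random ones, and creating pseudonyms with a keyed hash) follows; the field names and key are invented for illustration, not taken from the webcast.

```python
import hashlib
import hmac
import random

SECRET_KEY = b"store-and-rotate-this-key-securely"  # hypothetical key, not from the webcast


def drop_field(record: dict, field: str) -> dict:
    """Removing a whole field."""
    return {k: v for k, v in record.items() if k != field}


def randomize_field(record: dict, field: str, pool: list) -> dict:
    """Replacing actual values with random ones (destroys the analytic value of the field)."""
    return {**record, field: random.choice(pool)}


def pseudonymize(value: str) -> str:
    """Creating pseudonyms: a keyed hash maps the same SSN to the same token every time."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]


record = {"first_name": "Jane", "last_name": "Doe", "ssn": "123-45-6789"}
masked = drop_field(record, "last_name")
masked = randomize_field(masked, "first_name", ["Alex", "Sam", "Pat"])
masked["ssn"] = pseudonymize(masked["ssn"])
print(masked)
```

A keyed hash is used rather than a plain hash so pseudonyms cannot be regenerated by someone who does not hold the key.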

De-identification Standards

Age, sex, race, address, income.

Minimal distortion of data—for analytics.

Safe Harbor in HIPAA Privacy Rule.

Heuristics, or rules of thumb.

Statistical method in HIPAA Privacy Rule.

What's "Actual Knowledge"? (Safe Harbor, HIPAA Privacy Rule)

Info, alone or in combo, that could identify an individual.

Has to be specific to the data set—not theoretical.

Example: occupation = Mayor of Gotham.

De-identification Myths

Myth: It's possible to re-identify most, if not all, data.

Using robust methods, evidence suggests the risk can be very small.

Myth: Genomic sequences are not identifiable, or are easy to re-identify.

In some cases they can be re-identified, and they are difficult to de-identify using our methods.

A Risk-based De-identification Methodology

The risk of re-identification can be quantified.

The Goldilocks principle: balancing privacy with data utility.

The re-identification risk needs to be very small.

De-identification involves a mix of technical, contractual, and other measures.


Steps in the De-identification Methodology

Step 1: Select Direct and Indirect Identifiers

Step 2: Setting the Threshold

Step 3: Examining Plausible Attacks

Step 4: De-identifying the Data

Step 5: Documenting the Process


Step 1: Select Direct and Indirect Identifiers

Direct identifiers: name, telephone number, health insurance card number, medical record number.

Indirect identifiers, or quasi-identifiers: sex, date of birth, ethnicity, locations, event dates, medical codes.
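As a concrete illustration of Step 1, the lists above can be recorded as a small classification that the later steps consume. The layout below is just one possible convention, not something the webcast prescribes.

```python
# A sketch of how a Step 1 decision might be recorded for a data set.
IDENTIFIERS = {
    "direct": [
        "name",
        "telephone_number",
        "health_insurance_card_number",
        "medical_record_number",
    ],
    "quasi": [
        "sex",
        "date_of_birth",
        "ethnicity",
        "location",
        "event_dates",
        "medical_codes",
    ],
}


def split_columns(columns: list) -> tuple:
    """Direct identifiers get masked; quasi-identifiers go through de-identification (Step 4)."""
    direct = [c for c in columns if c in IDENTIFIERS["direct"]]
    quasi = [c for c in columns if c in IDENTIFIERS["quasi"]]
    return direct, quasi
```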

Step 2: Setting the Threshold

Maximum acceptable risk for sharing data.

Needs to be quantitative and defensible.

Is the data going to be in the public domain?

What would be the extent of the invasion of privacy if the data were shared?

Step 3: Examining Plausible Attacks

Recipient deliberately attempts to re-identify the data.

Recipient inadvertently re-identifies the data. ("Holy smokes, I know her!")

Data breach at the recipient's site ("data gone wild").

Adversary launches a demonstration attack on the data.

Step 4: De-identifying the Data

Generalization: reducing the precision of a field. Dates converted to month/year, or year.

Suppression: replacing a cell with NULL. Example: a unique 55-year-old female in a birth registry.

Sub-sampling: releasing a simple random sample. Example: 50% of the data set instead of all data.
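Each of these three operations has a direct analogue in code. The sketch below uses pandas with invented column names, purely to make the techniques concrete.

```python
import pandas as pd

df = pd.DataFrame({
    "date_of_birth": pd.to_datetime(["1958-03-14", "1990-07-02", "1984-11-23"]),
    "age": [55, 23, 29],
    "sex": ["F", "F", "M"],
})

# Generalization: reduce date of birth to month/year (could go further, to year only).
df["birth_month"] = df["date_of_birth"].dt.to_period("M").astype(str)

# Suppression: replace a revealing cell with NULL, e.g. the unique 55-year-old.
df.loc[df["age"] == 55, "age"] = None

# Sub-sampling: release a simple random sample (here 50% of the rows).
release = df.drop(columns=["date_of_birth"]).sample(frac=0.5, random_state=42)
print(release)
```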

Step 5: Documenting the Process

Process documentation—a methodology text.

Results documentation—data set, risk thresholds, assumptions, evidence of low risk.

Measuring Risk Under Plausible Attacks

T1: Deliberate Attempt
Pr(re-id, attempt) = Pr(attempt) × Pr(re-id | attempt)

T2: Inadvertent Attempt ("Holy smokes, I know her!")
Pr(re-id, acquaintance) = Pr(acquaintance) × Pr(re-id | acquaintance)

T3: Data Breach ("data gone wild")
Pr(re-id, breach) = Pr(breach) × Pr(re-id | breach)

T4: Public Data (demonstration attack)
Pr(re-id), based on the data set only
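Because every attack follows the same decomposition, the overall check can be written once. The sketch below simply encodes the formulas above; the numeric inputs are placeholders (the conditional risks are invented, and the context probabilities echo figures that appear later in the deck), to be replaced with values estimated for a specific release.

```python
def reid_risk(pr_context: float, pr_reid_given_context: float) -> float:
    """Pr(re-id, T) = Pr(T) x Pr(re-id | T) for attacks T1-T3.

    For T4 (public data) there is no mitigating context, so pass pr_context=1.0
    and use the maximum risk measured on the data set itself.
    """
    return pr_context * pr_reid_given_context


THRESHOLD = 0.1  # placeholder threshold

attacks = {
    "T1 deliberate": reid_risk(0.4, 0.20),
    "T2 acquaintance": reid_risk(0.87, 0.10),
    "T3 breach": reid_risk(0.27, 0.10),
    "T4 public": reid_risk(1.0, 0.09),
}

for name, risk in attacks.items():
    status = "ok" if risk <= THRESHOLD else "exceeds threshold"
    print(f"{name}: {risk:.3f} ({status})")
```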

Choosing Thresholds

Many precedents going back multiple decades.

Recommended by regulators. All based on maximum risk, though.


Part 2 of Webcast: A Look at Our Case Studies


Cross Sectional Data: Research Registries

Better Outcomes Registry & Network (BORN) of Ontario

140,000 births per year.

Cross-sectional—mothers not traced over time.

Process of getting de-identified data from a research registry.

Researcher Ronnie wants data!

919,710 records from 2005-2011.

Choosing Thresholds

Average risk threshold of 0.1 for Researcher Ronnie (and the data he specifically requested).

0.05 if there were highly sensitive variables (congenital anomalies, mental health problems).

Measuring Risk Under Plausible Attacks

T1: Deliberate Attempt
Low motives and capacity; low mitigating controls.
Pr(attempt) = 0.4

T2: Inadvertent Attempt ("Holy smokes, I know her!")
119,785 births out of 4,478,500 women (= 0.027)
Pr(acquaintance) = 1 − (1 − 0.027)^(150/2) = 0.87

T3: Data Breach ("data gone wild")
Based on historical data: Pr(breach) = 0.27

T4: Public Data (demonstration attack)

Overall risk: Pr(re-id, T) = Pr(T) × Pr(re-id | T) ≤ 0.1
For the acquaintance attack: Pr(re-id, acquaintance) = 0.87 × Pr(re-id | acquaintance) ≤ 0.1
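The T2 arithmetic above can be reproduced in a few lines, and the threshold can be inverted to see how much the data itself must be de-identified. This is only a worked check of the slide's numbers; the reading of 150/2 as "roughly half of a person's ~150 acquaintances are women, and only women appear in a birth registry" is our gloss on the formula.

```python
# Chance that a randomly chosen Ontario woman appears in the registry extract.
p_in_registry = 119_785 / 4_478_500          # ~0.027, as on the slide

# Exponent from the slide: 150 acquaintances, divided by 2 (presumably the female half).
n_female_acquaintances = 150 / 2

# Pr(acquaintance) = 1 - (1 - p)^(150/2)
pr_acquaintance = 1 - (1 - p_in_registry) ** n_female_acquaintances
print(round(pr_acquaintance, 2))             # 0.87

# Overall risk constraint: Pr(acquaintance) * Pr(re-id | acquaintance) <= 0.1,
# so the de-identified data must satisfy:
max_conditional_risk = 0.1 / pr_acquaintance
print(round(max_conditional_risk, 3))        # ~0.115
```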

De-identifying the Data Set

Meeting Thresholds: k-anonymity

MDOB in 1-yy; BDOB in wk/yy; MPC of 1 char.

MDOB in 10-yy; BDOB in qtr/yy; MPC of 3 chars.

MDOB in 10-yy; BDOB in mm/yy; MPC of 3 chars.
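One way to compare candidate schemes like these is to measure the k-anonymity (and average equivalence-class size) they produce over the quasi-identifiers. In the sketch below the abbreviations are taken to mean mother's date of birth (MDOB), baby's date of birth (BDOB) and maternal postal code (MPC); that reading, and the toy pandas layout, are assumptions for illustration rather than something stated on the slide.

```python
import pandas as pd


def k_anonymity(df: pd.DataFrame, quasi_identifiers: list) -> int:
    """Smallest equivalence-class size over the quasi-identifiers (the k in k-anonymity)."""
    return int(df.groupby(quasi_identifiers).size().min())


def average_class_size(df: pd.DataFrame, quasi_identifiers: list) -> float:
    """Mean equivalence-class size; the average re-identification risk is its reciprocal."""
    return float(df.groupby(quasi_identifiers).size().mean())


# Toy data already generalized to one candidate scheme:
# MDOB in 10-yy, BDOB in mm/yy, MPC of 3 chars.
births = pd.DataFrame({
    "mdob_decade": ["1980s", "1980s", "1970s", "1980s"],
    "bdob_month": ["2009-03", "2009-03", "2009-04", "2009-03"],
    "mpc_3char": ["K1A", "K1A", "M5V", "K1A"],
})

qis = ["mdob_decade", "bdob_month", "mpc_3char"]
print(k_anonymity(births, qis))          # 1: the 1970s record is unique
print(average_class_size(births, qis))   # 2.0
```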

Year on Year: Re-using Risk Analyses

In 2006 Researcher Ronnie asks for 2005; in 2007 for 2006; in 2008 for 2007; in 2009 for 2008; in 2010 for 2009 (each new request replaces the previous year's extract, which is deleted).

Can we use the same de-identification scheme every year?

BORN data pertains to very stable populations.

No dramatic changes in the number or characteristics of births from 2005-2010.

Revisit the de-identification scheme every 18 to 24 months.

Revisit if any new quasi-identifiers are added or changed.

Longitudinal Discharge Abstract Data: State Inpatient Databases

Linking a patient's records over time.

Longitudinal data needs to be de-identified differently.

Meeting Thresholds: k-anonymity?

De-identifying Under Complete Knowledge
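The slides under these two headings are diagrams, but the idea can be sketched: with longitudinal data, an adversary assumed to have complete knowledge of a patient's visits effectively sees the whole visit history as one quasi-identifier, so equivalence classes are formed over patient-level profiles rather than single records. The grouping below is a hypothetical illustration of that idea, not the book's exact algorithm.

```python
import pandas as pd

visits = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3],
    "birth_decade": ["1950s", "1950s", "1950s", "1950s", "1960s"],
    "admit_year": [2009, 2010, 2009, 2010, 2009],
    "los_weeks": [1, 2, 1, 2, 3],
})

# Collapse each patient's visits into a single profile (the complete-knowledge view).
visits = visits.sort_values(["patient_id", "admit_year"])
visits["visit"] = list(zip(visits["birth_decade"], visits["admit_year"], visits["los_weeks"]))
profiles = visits.groupby("patient_id")["visit"].apply(tuple)

# k under complete knowledge: fewest patients sharing an identical full profile.
print(profiles.value_counts().min())   # patients 1 and 2 match, patient 3 is unique -> 1
```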

State Inpatient Database (SID) of California

Researcher Ronnie wants public data!

Measuring Risk Under Plausible Attacks

T1: Deliberate Attempt
T2: Inadvertent Attempt ("Holy smokes, I know her!")
T3: Data Breach ("data gone wild")
T4: Public Data (demonstration attack)
Pr(re-id) ≤ 0.09 (maximum risk)

De-identifying the Data Set

BirthYear in 5-yy (cut at 1910-); AdmissionYear unchanged; DaysSinceLastService in 28-dd (cut at 7-, 182+); LengthOfStay same as DaysSinceLastService.
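The scheme above boils down to a handful of binning rules. The sketch below is one possible rendering with invented function names; the cut points (the 1910 floor, the 7-day and 182-day caps, and 28-day bins) come from the slide, but the exact boundary conventions are assumptions.

```python
def generalize_birth_year(year: int) -> str:
    """BirthYear in 5-year bins, bottom-coded at 1910."""
    if year < 1910:
        return "pre-1910"
    start = year - (year - 1910) % 5
    return f"{start}-{start + 4}"


def generalize_days(days: int) -> str:
    """DaysSinceLastService in 28-day bins, cut at 7- and 182+ (LengthOfStay uses the same rule)."""
    if days < 7:          # boundary handling assumed
        return "7-"
    if days >= 182:
        return "182+"
    start = days - (days - 7) % 28
    return f"{start}-{min(start + 27, 181)}"


print(generalize_birth_year(1973))   # 1970-1974
print(generalize_days(45))           # 35-62
```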

Connected Variables

QI to QI: similar QI? Same generalization and suppression.

QI to non-QI: non-QI is revealing? Same suppression, so both are removed.

Other Issues Regarding Longitudinal Data

Date shifting—maintaining order of records.

Long tails—truncation of records.

Adversary power—assumption of knowledge.
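Date shifting lends itself to a short illustration: shift all of a patient's dates by a single random offset so that intervals and record order are preserved. A minimal sketch, in which the ±30-day window and field names are assumptions:

```python
import random
from datetime import date, timedelta


def shift_dates(visits: list, max_shift_days: int = 30, seed=None) -> list:
    """Shift every date for one patient by the same random offset, preserving order and gaps."""
    rng = random.Random(seed)
    offset = timedelta(days=rng.randint(-max_shift_days, max_shift_days))
    return [{**v, "admit_date": v["admit_date"] + offset} for v in visits]


patient_visits = [
    {"visit": 1, "admit_date": date(2009, 3, 1)},
    {"visit": 2, "admit_date": date(2009, 3, 15)},
]
print(shift_dates(patient_visits, seed=7))   # both dates move together; the 14-day gap is kept
```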

Other Concerns to Think About

Free-form text—anonymization.

Geospatial information—aggregation and geoproxy risk.

Medical codes—generalization, suppression, shuffling (yes, as in cards).

Secure linking—linking data through encryption before anonymization.
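For the secure-linking point, one common building block is for each data holder to compute a keyed hash of the linking identifier under a shared secret, so records can be joined on tokens without exposing the raw identifier. The sketch below shows only that building block; the key handling and field names are assumptions, and production protocols are considerably more involved.

```python
import hashlib
import hmac

SHARED_KEY = b"key-agreed-between-the-two-sites"  # hypothetical; exchanged out of band


def linking_token(health_card_number: str) -> str:
    """Keyed hash of the identifier; the same input yields the same token at both sites."""
    return hmac.new(SHARED_KEY, health_card_number.encode(), hashlib.sha256).hexdigest()


site_a = {linking_token("1234-567-890"): {"diagnosis": "I10"}}
site_b = {linking_token("1234-567-890"): {"lab_result": 7.2}}

# Join on tokens, never on the raw health card numbers.
linked = {t: {**site_a[t], **site_b[t]} for t in site_a.keys() & site_b.keys()}
print(linked)
```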

Part 3 of Webcast: Questions and Answers


More Comments or Questions: Contact us!


Khaled El Emam: kelemam@privacyanalytics.ca

Luk Arbuckle: larbuckle@privacyanalytics.ca
