
Page 1: Validity and Reliability

Validity, Reliability & Rigor

Validity and Reliability

BASIC CONCEPT

“The critiquer of research, when reading research studies & reports, must assess the reliability and validity of the instruments used in the study to determine the soundness of those selections in relation to the constructs under investigation”

LoBiondo-Wood & Haber, 1990

i.e. surveys
Types of question
Statistics used to find out whether it is valid and reliable
Sample

Piece of equipment

Page 2: Validity and Reliability

Go simple first! Make sure all the tools are reliable and valid.

Validity refers to whether a study instrument accurately measures what it is supposed to measure. Does it do what it says on the tin?

Reliability is a necessary but not sufficient condition of validity.

A tool might be good for some things but not for everything.

Validity may be concerned with studies in general (internal & external) or with the measurements used.

Internal validity – instruments used, bias. External validity – generalisability.

Page 3: Validity and Reliability

Validity

"Validity is the best available approximation to the truth of a given proposition, inference or conclusion" (Trochim, 1999, p.29).

Best guess. Control things as much as possible. Inferences – but are they valid?

What daily conclusions or inferences do we make in our everyday life?

e.g. the sun will rise… this course is going to get harder…

Gut feelings

But there are biases….

Internal Validity

Page 4: Validity and Reliability

Internal validity is affected by flaws within the study itself, such as not controlling some of the major variables (a design problem) or problems with the research instrument (a data collection problem). Types of questions. Methods. Gender of researcher.

Can't always see the questionnaire.

Relying on the researcher to report things properly.

Are biased from the start.

Look at interpretation. Are people being truthful? What do you think about what they want to know…?

Did the person pilot it? To see if it works or not…

Page 5: Validity and Reliability

Tailor things to cover what you want…

Then you do it big

Piloting is a good way of trying to combat problems with internal validity…

Ethical issues. Use of non-participant observation.

Data collection. Response rate…

Why are you getting x%? Are the respondents different?

Why are the non-respondents different…?

Page 6: Validity and Reliability

Here are some factors which affect internal validity:

Subject variability
Size of subject population
Time given for the data collection or experimental treatment
History
Attrition (http://en.wikipedia.org/wiki/Attrition_%28medicine%2C_epidemiology%29) – drop-out rate
Maturation
Instrument/task sensitivity

External Validity

is the extent to which you can generalize your findings to a larger group or other contexts. If your research lacks external validity, the findings cannot be applied to contexts other than the one in which you carried out your research. For example, if the subjects are all males from one ethnic group, your findings might not apply to females or other ethnic groups.

Page 7: Validity and Reliability

It can be a learning experience. Can look at specific groups.

Can get a lot of interesting things… if something is small, that doesn't mean it is not a valid piece of work… it can give you rich pickings…

Here are seven important factors that affect external validity:

Population characteristics (subjects) – who is funding it? They are going to need certain results… Skewing! They like the word "skewed"!

Interaction of subject selection and research

Descriptive explicitness of the independent variable – is it too narrow or too broad…?

Page 8: Validity and Reliability

The effect of the research environment

Researcher bias

Hawthorne effect (http://en.wikipedia.org/wiki/Hawthorne_effect)

What can we do?

Ensuring anonymity

Participant observation

The longer you are there… the more you become part of the normal setting…

Longitudinal aspect

Triangulation….

How is the methodology going to affect the sample…?

Watch

Question

Compare…

Researcher or experimenter effects

Page 9: Validity and Reliability

Can implement different effects. More covert…

Data collection methodology. The effect of time – things change… you can…

Validity: think of target practice… (simulated in the sketch below)

1 – does it mean anything? 2 – reliable: clustered but not in the right place means it is not valid…

3 – valid and reliable… i.e. what "THEY" want…
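To make the target-practice analogy concrete, here is a minimal sketch (illustrative numbers only, not from the slides) that simulates the three cases: scattered shots, a tight cluster off the bullseye (reliable but not valid), and a tight cluster on the bullseye (reliable and valid).

```python
# Minimal sketch of the target-practice analogy with simulated "shots".
# All numbers are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
bullseye = np.array([0.0, 0.0])

# 1 - wide scatter around the wrong spot: neither reliable nor valid
scattered = rng.normal(loc=[3.0, 3.0], scale=2.0, size=(50, 2))
# 2 - tight cluster, but not on the bullseye: reliable, not valid
reliable_only = rng.normal(loc=[3.0, 3.0], scale=0.2, size=(50, 2))
# 3 - tight cluster on the bullseye: reliable and valid
reliable_and_valid = rng.normal(loc=[0.0, 0.0], scale=0.2, size=(50, 2))

def describe(name, shots):
    spread = shots.std(axis=0).mean()                     # low spread -> reliable
    bias = np.linalg.norm(shots.mean(axis=0) - bullseye)  # low bias   -> valid
    print(f"{name}: spread = {spread:.2f}, distance from bullseye = {bias:.2f}")

describe("1 - scattered", scattered)
describe("2 - reliable only", reliable_only)
describe("3 - reliable and valid", reliable_and_valid)
```

Low spread corresponds to reliability (consistent results); a small distance from the bullseye corresponds to validity (measuring the right thing).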

Validity of Measures

(i) Content Validity: how far the measure covers all dimensions of a concept. Face validity is the simplest form.

Page 10: Validity and Reliability

Is it right? Is it useful?

(ii) Criterion Validity: may be

a) concurrent – using an existing measure to validate the new. Triangulation may be used: old and new ways of doing things, to see if they are the same or similar – a high coefficient.

b) predictive – predicting (like the Waterlow score) the risk of something happening… Does the tool do what it says it is going to do… We want to prevent the risk… (a statistical sketch of both forms follows below)
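As a rough illustration of how these two forms of criterion validity are often checked statistically, here is a minimal sketch with made-up data (the variable names and figures are hypothetical, not taken from the slides): concurrent validity as the correlation between the new tool and an established measure, and predictive validity as the association between the score and a later outcome.

```python
# Minimal sketch of checking criterion validity with correlation coefficients.
import numpy as np
from scipy.stats import pearsonr, pointbiserialr

rng = np.random.default_rng(1)

# a) Concurrent validity: score each participant with the established measure
#    and the new tool, then correlate. A high coefficient suggests the new
#    tool measures much the same thing as the old one.
established = rng.normal(50, 10, size=100)
new_tool = established + rng.normal(0, 5, size=100)   # new tool tracks the old one
r, p = pearsonr(established, new_tool)
print(f"Concurrent validity: r = {r:.2f} (p = {p:.3f})")

# b) Predictive validity: does the score predict a later outcome, e.g. a risk
#    score predicting whether the event actually happened?
event_happened = (new_tool + rng.normal(0, 5, size=100)) > 55
r_pb, p_pb = pointbiserialr(event_happened.astype(float), new_tool)
print(f"Predictive validity: point-biserial r = {r_pb:.2f} (p = {p_pb:.3f})")
```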

Page 11: Validity and Reliability

(iii) Construct Validity: the extent to which a test measures a theoretical construct or trait – testing the theory…

Common factors that contribute to errors

Situational contaminants
Response set biases
Transitory personal factors
Administration variations
Measuring instrument clarity
Item sampling
Instrument format

Polit & Hungler 1991

Page 12: Validity and Reliability

Reliability is concerned with the extent to which a measure gives consistent results. Stability. Homogeneity. Equivalence.

Asks the question - how much error is acceptable?

There will be error…..

Random error makes a study unreliable… so you cannot predict.

Constant error: a clock 10 minutes early… i.e. a constant error.

Because the error is constant, it does not automatically make the study invalid… (see the sketch below)

Need to allow for that in the results…….
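A minimal sketch of that distinction (made-up numbers, not from the slides): random error shows up as unpredictable spread, while a constant error, such as the clock that is always 10 minutes early, shifts every reading by the same amount and can be allowed for in the results.

```python
# Minimal sketch: random error vs constant (systematic) error.
import numpy as np

rng = np.random.default_rng(2)
true_value = 60.0                  # the quantity we are trying to measure
n = 1000

# Random error: readings scatter unpredictably around the true value.
random_err = true_value + rng.normal(0, 10, size=n)

# Constant error: every reading is 10 units too low (plus a little noise).
constant_err = (true_value - 10) + rng.normal(0, 1, size=n)

print(f"random error:   mean {random_err.mean():.1f}, spread {random_err.std():.1f}")
print(f"constant error: mean {constant_err.mean():.1f}, spread {constant_err.std():.1f}")

# Because the constant error is known and stable, we can allow for it:
corrected = constant_err + 10
print(f"corrected:      mean {corrected.mean():.1f}")
```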

Page 13: Validity and Reliability

Reliable error…

Testing Reliability

Is the result consistent…? Need to understand the concepts of what the test is trying to do…

What is the test trying to do…?

There are thousands of tests, but you don't need to understand them all…

What you need to do: get back to the principles.

Look up the tests at the time…

You can see if the paper is good or not…

Page 14: Validity and Reliability

Stability: test-retest. Repeating & comparing results to get a reliability coefficient.

Homogeneity: testing for internal consistency.

Equivalence: parallel or alternate form & inter-rater reliability tests (typical statistics for each are sketched below).
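Here is a minimal sketch (illustrative, made-up data; the helper functions are written out by hand rather than taken from a named library) of the statistics often used for each form of reliability: a test-retest correlation for stability, Cronbach's alpha for homogeneity, and Cohen's kappa for inter-rater equivalence.

```python
# Minimal sketch of common reliability statistics on made-up data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)

# Stability (test-retest): give the same instrument twice and correlate.
test = rng.normal(50, 10, size=40)
retest = test + rng.normal(0, 4, size=40)
r, _ = pearsonr(test, retest)
print(f"Test-retest reliability coefficient: r = {r:.2f}")

# Homogeneity (internal consistency): Cronbach's alpha over item scores.
def cronbach_alpha(items):
    """items: array of shape (respondents, items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

latent = rng.normal(0, 1, size=(40, 1))
item_scores = latent + rng.normal(0, 0.7, size=(40, 5))  # 5 items, one construct
print(f"Cronbach's alpha: {cronbach_alpha(item_scores):.2f}")

# Equivalence (inter-rater reliability): Cohen's kappa between two raters.
def cohens_kappa(r1, r2):
    r1, r2 = np.asarray(r1), np.asarray(r2)
    categories = np.union1d(r1, r2)
    observed = np.mean(r1 == r2)
    expected = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
    return (observed - expected) / (1 - expected)

rater_a = rng.integers(0, 2, size=40)
agree = rng.random(40) < 0.85                 # raters agree ~85% of the time
rater_b = np.where(agree, rater_a, 1 - rater_a)
print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")
```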

Rigor of Qualitative Research: "Trustworthiness." Do you trust the researcher… to carry things out in a credible manner…

Can you make these decisions…?

Research carried out in a credible manner. Work must be dependable, comprehensive.

Strategies for Achieving Trustworthiness/Credibility in Qualitative Research

Page 15: Validity and Reliability

Prolonged engagement with and observation of informants

Triangulation (multiple sources of data)

Peer debriefing (colleagues) – discussing things, getting feedback…

Negative case analysis (to include commonalities as well as variabilities) – positive things… if you look at the negatives… you get out quite a lot of information… it helps to improve the validity of positive data…

Referential adequacy (theoretical sampling)

What, who, where, theoretically.

Research books will tell you what is a good sample size…

Look at methodology… Not just what is being tested….

Member checks (research participants/informants)

Tape/notes….

Put it together…

Page 16: Validity and Reliability

Go back to the participant and say… "is this what you meant?"

Employing an auditor

In qualitative research, put things into themes.

Would an auditor make the same decisions as you…?

Thick description (to reflect complexities in the data). Make sure you have it all, not missing things. Tape recording.

Eye contact: may miss a lot of information.

Make sure you get all the information.

Prevention of premature foreclosure on the data. Make sure you reach that saturation point… Need to meet more…

Not enough…..

Page 17: Validity and Reliability

Maintaining a journal to enhance self-reflection

Record biases… how you are feeling… researchers will keep a diary and try and ease the process… to acknowledge biases… They can explore this in their journals…

Prolonged engagement with and observation of informants

Need to develop a trusting relationship with research participants

Need to observe and interact in various contexts over time

Need to get a deep and complex understanding of the phenomenon under study

Triangulation: multiple methods of data collection (interviews – individual and group, observation, literature, archives)

Multiple investigators

Multiple contexts/situations

Peer Debriefing

Page 18: Validity and Reliability

Share data with colleagues (those who are experts in the field of study and those who are not)

May ask peers to code a few transcripts. May ask peers to listen to the analysis you are in the process of developing – ask for feedback.

Negative Case Analysis: there are no "outliers" in qualitative research. Embrace all the variabilities. Learn from the "negative" cases – what explains why this case, this person, is different from the others? This leads to a more complex, dense, thick analysis.

Member Checks: going back to the informants to see if the analysis/interpretation makes sense to them, reflects their experiences. May go back to the actual participants or to other informants whom you have not previously interviewed, or both.

Employing an Auditor: an outside person who can verify the steps you went through in arriving at your data analysis/interpretation.

Page 19: Validity and Reliability

Verify the logic of your chronology of the research process – able to outline the steps

Verify that a systematic process was undertaken

Maintaining a Journal to Enhance Self-Reflection

Keep track of your own ideas, responses, “biases” in order to try as best as you can to separate your responses from the responses of the participants

Acknowledge your own biases, “locate yourself in the data”

Continue to be self-reflective, though not a navel-gazer!

Prevention of Premature Closure on the Data

Continue data collection and analysis until “theoretical saturation” is reached

Provide evidence of theoretical saturation.

Generate questions for further study – indicating what areas have not been answered yet.

Page 20: Validity and Reliability

Criteria for Evaluating Trustworthiness/Credibility in Qualitative Research

Evidence of systematically formulating “provisional hypotheses” and the data to support them (and interview questions becoming more focused)

Evidence of having reached “theoretical saturation” and data to support this saturation (do not prematurely foreclose on the data)

Empirical data must be presented throughout the presentation of results

Summary of Major Points

Evaluating and critiquing qualitative research (establishing "validity and reliability") is based on the paradigm from which qualitative methods have been developed.

Page 21: Validity and Reliability

Need to evaluate validity and reliability in terms of the concepts of trustworthiness and credibility

Specific techniques for enhancing trustworthiness and credibility

Specific criteria upon which to evaluate trustworthiness and credibility