
Page 1: Response Processes in SJT Performance: The Role of General and Specific Knowledge

RESPONSE PROCESSES IN SJT PERFORMANCE:
THE ROLE OF GENERAL AND SPECIFIC KNOWLEDGE

James A. Grand, Matthew T. Allen, Kenneth Pearlman

27th Annual Conference of the Society for Industrial & Organizational Psychology
April 27, 2012

The views, opinions, and/or findings contained in this presentation are solely those of the authors and should not be construed as an official Department of the Army or Department of Defense position, policy, or decision, unless so designated by other documentation.

Page 2: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Presentation Overview

• Turning a critical eye towards SJT construct validity & its assumptions
• What do SJTs measure?
  – An alternative take on an old question
  – A response process model for SJT performance
• Predictions of the response process model
  – Empirical support?
• Implications for interpretation & validity of SJTs

Page 3: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Acknowledgement

• The authors would like to thank the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) for allowing the use of their data for this research
• More information about the study can be found in:
  Knapp, D. J., McCloy, R. A., & Heffner, T. S. (Eds.) (2004). Validation of measures designed to maximize 21st-century Army NCO performance (TR 1145). Arlington, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.

Page 4: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Validity Evidence for SJTs

• Established evidence of criterion validity between SJTs and job performance (cf. Chan & Schmitt, 2002; Clevenger et al., 2001; McDaniel et al., 2001)
  – Estimates in the low .20s (corrected validities near the mid-.30s) (Chan & Schmitt, 2005)
• Agreement on construct validity is less certain...

First-order Constructs
• Multiple, distinguishable dimensions
• Specific a priori subscales
• Oswald et al. (2004): career orientation, perseverance, multicultural appreciation

Second-order Constructs
• Singular, high-level dimension
• Broad focal target
• Lievens et al. (2005): interpersonal skills
• Mumford et al. (2008): team role knowledge

“Practical Intelligence”
• Tacit knowledge or “common sense”
• Everyday reasoning
• Sternberg et al. (2002): “practical know-how”
• Chan & Schmitt (2005): contextual knowledge
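For reference, the “corrected” validities above typically reflect an adjustment of the observed correlation for unreliability in the job performance criterion (the cited meta-analyses may also correct for range restriction). A minimal sketch of that standard attenuation correction, which is not taken from the slides themselves, is:

```latex
% Correction for attenuation due to criterion unreliability:
% \hat{\rho}_{xy} is the estimated operational validity, r_{xy} the observed
% predictor-criterion correlation, and r_{yy} the criterion reliability.
\[
  \hat{\rho}_{xy} = \frac{r_{xy}}{\sqrt{r_{yy}}}
\]
% With observed validities in the low .20s, plausible levels of criterion
% unreliability move the corrected estimates toward the .30s reported above.
```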

Page 5: Response Processes in SJT Performance: The Role of General and Specific Knowledge

A Test in Name Only

• Virtually all perspectives approach and treat SJT measurement in a manner consistent with Classical Test Theory
• SJTs are NOT tests! (at least in the traditional sense of the word)
  – Low-fidelity simulations (Motowidlo et al., 1990)
  – Measurement methods capable of capturing a variety of constructs (Chan & Schmitt, 2005; McDaniel & Nguyen, 2001)

X_ij = T_ij + E_ij   (Observed Score = True Score + Error)

r_kk' = k·r / [1 + (k − 1)·r]   (Spearman-Brown formula for the reliability of a k-part composite)

Page 6: Response Processes in SJT Performance: The Role of General and Specific Knowledge


So What Do SJTs Measure?

“SJT performance clearly involves cognitive processes. [...] Addressing basic questions about these underlying cognitive processes and eventually understanding them could provide the key to explicating constructs measured by SJTs”

(Chan & Schmitt, 2005)

“So far, there does not exist any theory about how people answer SJTs or about what makes some SJT items more or less difficult than others.”

(p. 1044, Lievens & Sackett, 2007)

Page 7: Response Processes in SJT Performance: The Role of General and Specific Knowledge

So What Do SJTs Measure?

• Rather than conceptualize SJTs as though they measure a static construct or “true score,” SJTs capture the sophistication of a respondent’s reasoning process
• By their nature, SJTs capture the similarity between respondent reasoning and the reasoning implied by keyed responses

Page 8: Response Processes in SJT Performance: The Role of General and Specific Knowledge

A Response Process Model of SJT Performance

• Generic dual-process accounts of human reasoning, judgment, and decision-making (Evans, 2008)

System 1
• Implicit, intuitive, and automatic reasoning
• Decisions guided by general heuristics, which are informed by domain experiences
• High capacity, low effort processing

System 2
• Systematic, rational, and analytic reasoning
• Decisions guided by controlled, rule-based evaluations and conscious reflection
• Low capacity, high effort processing
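To make the two systems concrete, below is a toy sketch of how a dual-process account of SJT responding might be caricatured in code. This is an illustration only, not the authors' model: the slides do not specify a computational implementation, and the item, option labels, heuristic scores, and consequence values are all hypothetical.

```python
# Toy illustration of a dual-process account of SJT responding.
# Hypothetical data: each response option carries a "heuristic appeal" score
# (how well it matches a general belief such as "be thorough and conscientious")
# and a set of situation-specific consequences that must be reasoned through.

from dataclasses import dataclass, field

@dataclass
class Option:
    label: str
    heuristic_appeal: float                             # System 1: gut-level match to general beliefs
    consequences: list = field(default_factory=list)    # System 2: outcomes to "play out"

def system1_most_effective(options):
    """Fast, low-effort choice: pick the option with the highest intuitive appeal."""
    return max(options, key=lambda o: o.heuristic_appeal)

def system2_least_effective(options, evaluate):
    """Slow, high-effort choice: explicitly evaluate each option's consequences."""
    return min(options, key=lambda o: sum(evaluate(c) for c in o.consequences))

# Hypothetical leadership item with three response options
item = [
    Option("Confront the soldier publicly", 0.2, ["undermines trust", "escalates conflict"]),
    Option("Document and counsel privately", 0.9, ["preserves trust", "corrects behavior"]),
    Option("Ignore the issue", 0.4, ["problem persists"]),
]

# A crude consequence-evaluation function standing in for deliberate reasoning
evaluate = lambda c: -1.0 if c in {"undermines trust", "escalates conflict", "problem persists"} else 1.0

print("Most effective (System 1):", system1_most_effective(item).label)
print("Least effective (System 2):", system2_least_effective(item, evaluate).label)
```

The key contrast is that the System 1 path consults a single stored appeal value, whereas the System 2 path must enumerate and evaluate the consequences of every option.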

Page 9: Response Processes in SJT Performance: The Role of General and Specific Knowledge

A Response Process Model of SJT Performance

• Dual-process accounts have been applied in a variety of perceptual, reasoning, and decision-making tasks (see Evans, 2008)
  – Extensions of the dual-process model serve as the foundation for much of the judgment & decision-making literature (e.g., Gigerenzer et al., 1999; Kahneman & Frederick, 2002, 2005)

Central Tenets of Dual-Process Models
Because of limits on our cognitive capacity and information processing...
• System 1 reasoning is the primary determinant of judgment/decision-making in most situations
• System 2 reasoning is typically engaged to evaluate the quality of decisions or in attempts to consciously contrast alternatives

Page 10: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Dual-Process Accounts as a Response Process Model of SJT Performance

• Two predictions based on the dual-process account relative to SJT performance:
  – Beliefs about the general effectiveness of various behaviors, dispositions, or approaches serve as a baseline heuristic for reasoning across many situations (cf. Motowidlo et al., 2006; Motowidlo & Beier, 2010)
    » “It is good to be thorough and conscientious in one’s work.”
  – Domain experience/knowledge leads to the development of more conditional, refined, and nuanced heuristics (Hunt, 1994; Klein, 1998; Phillips et al., 2004)
    » “It is good to be thorough and conscientious in one’s work, but you can generally skimp on Task X and still do just fine.”
  – Thus, generalized heuristics/beliefs/temperaments become less predictive of SJT performance as experience increases

Prediction 1: As job experience increases, the predictive validity of domain-general heuristics on SJT performance will decrease

Page 11: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Dual-Process Accounts as a Response Process Model of SJT Performance

• Two predictions based on the dual-process account relative to SJT performance:
  – Common for respondents to identify/rate the most and least effective/preferred SJT response options (McDaniel & Nguyen, 2001)
  – Identifying the most effective option should engage System 1 reasoning
    » Select the most reasonable option based on an intuitive heuristic; less effortful processing
  – Identifying the least/less effective option should engage System 2 reasoning
    » “Play out”/evaluate consequences of the remaining options; more effortful processing
  – Thus, identifying the least/less effective option is more g-loaded than identifying the most/more effective option

Prediction 2: Cognitive ability will be more strongly related to assessment of less preferred SJT options than more preferred options

Page 12: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Methods & Measurement

• Concurrent validation study on predictors of current and future expected job performance of Army NCOs (n = 1,838) (Knapp et al., 2004)
  – Primarily interested in predicting leadership performance/potential
  – Sample:

    Rank   N     Relevant Experience
    E4     433   Little to no leadership experience
    E5     864   First opportunities for leadership experience, have received some leadership training
    E6     541   More leadership experience across a greater variety of situations

Page 13: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Methods & Measurement

• Domain-general heuristic measures
  – Differential attractiveness: individuals who more strongly endorse a trait/quality perceive behaviors which reflect that trait/quality as more effective (Motowidlo et al., 2006; Motowidlo & Beier, 2010)
  – Temperament inventories
    » Assessment of Individual Motivation (AIM): multidimensional 38-item forced-choice measure (α ≈ .60 all scales)
    » Biographical Information Questionnaire (BIQ): multidimensional 156-item self-report biodata questionnaire (α ≈ .70 all scales)
• General cognitive aptitude (ASVAB)
• 40-item SJT on leadership/interpersonal skills (Knapp et al., 2002)
  – 5 response alternatives; SMEs rated all options
  – Respondents chose most & least effective options
    » Responses recoded to SME ratings
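As a rough illustration of the recoding step described above, the sketch below converts each respondent's most/least effective choices into the SME effectiveness ratings of the chosen options. The column names, rating scale, and data layout are assumptions for illustration; the slides do not document the actual scoring procedure or code.

```python
import pandas as pd

# Hypothetical SME effectiveness ratings for one item's five response options
sme_ratings = pd.DataFrame({
    "item_id": [1, 1, 1, 1, 1],
    "option":  ["A", "B", "C", "D", "E"],
    "sme_rating": [5.8, 3.1, 4.4, 2.0, 6.2],
})

# Hypothetical respondent choices: the option each person picked as most
# and least effective for each item
responses = pd.DataFrame({
    "respondent": [101, 102],
    "item_id":    [1, 1],
    "most_choice":  ["E", "A"],
    "least_choice": ["D", "B"],
})

def recode_to_sme(data, sme_ratings, choice_col, score_col):
    """Replace a chosen option letter with the SME rating of that option."""
    return data.merge(
        sme_ratings.rename(columns={"option": choice_col, "sme_rating": score_col}),
        on=["item_id", choice_col], how="left",
    )

scored = recode_to_sme(responses, sme_ratings, "most_choice", "most_effective_rating")
scored = recode_to_sme(scored, sme_ratings, "least_choice", "least_effective_rating")

# Item-level values could then be averaged within respondent to form the
# "most effective" and "least effective" SJT response ratings used in the analyses.
print(scored[["respondent", "most_effective_rating", "least_effective_rating"]])
```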

Page 14: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Empirical Examination of Predictions from Dual Process Model

Prediction 1: As job experience increases, the predictive validity of domain-general heuristics on SJT performance will decrease

[Figure: Left panel plots the correlation between AIM Dependability and the Most Effective Response Rating for E4, E5, and E6 respondents (y-axis 0.00 to 0.45). Right panel plots mean Most Effective Response Ratings (y-axis 4.75 to 5.45) for Low vs. High Dependability respondents at E4 and E5.]

Regression Summary
• Main effect of temperament
• Main effect of experience
• Significant interaction
  – Relationship stronger for less experienced
• Results consistent across all scales & SJT scores
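A minimal sketch of how such a moderated regression could be specified is shown below, assuming a flat analysis file with one row per respondent. The file name, variable names, and use of statsmodels are illustrative assumptions rather than the authors' analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per respondent with a temperament scale
# score (e.g., AIM Dependability), rank as a proxy for experience, and the
# SJT "most effective" response rating as the outcome.
df = pd.read_csv("sjt_scores.csv")          # assumed file name and layout
df["experience"] = df["rank"].map({"E4": 0, "E5": 1, "E6": 2})

# Moderated regression: temperament, experience, and their interaction.
# A weaker temperament slope at higher experience levels (a negative
# interaction term) is the pattern Prediction 1 anticipates.
model = smf.ols("most_effective_rating ~ dependability * experience", data=df).fit()
print(model.summary())
```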

Page 15: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Empirical Examination of Predictions from Dual Process Model

Prediction 1: As job experience increases, the predictive validity of domain-general heuristics on SJT performance will decrease

[Figure: Correlations between each temperament scale and SJT response ratings, plotted separately for E4 and E6 respondents. AIM scales: Leadership, Physical Conditioning, Agreeableness, Work Orientation, Adjustment, Dependability. BIQ scales: Interpersonal Skill, Leadership, Openness, Tolerance for Ambiguity, Social Maturity, Social Perceptiveness.]

Page 16: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Empirical Examination of Predictions from Dual Process Model

Prediction 2: Cognitive ability will be more strongly related to assessment of less preferred SJT options than more preferred options

[Figure: Correlation between general cognitive aptitude and the SJT response rating (y-axis 0 to 0.4), plotted separately for Most Effective and Least Effective ratings across E4, E5, and E6 respondents.]
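A minimal sketch of how these correlations could be computed within each rank group is shown below; the file name and column names are assumptions, and the slides do not state how (or whether) the most/least effective correlations were formally compared.

```python
import pandas as pd

# Hypothetical analysis file: ASVAB-based general cognitive aptitude plus the
# recoded "most effective" and "least effective" SJT response ratings per person.
df = pd.read_csv("sjt_scores.csv")          # assumed file name and layout

# Prediction 2 expects the aptitude correlation to be larger for the least
# effective ratings than for the most effective ratings, within each rank.
for rank, grp in df.groupby("rank"):
    r_most = grp["cognitive_aptitude"].corr(grp["most_effective_rating"])
    r_least = grp["cognitive_aptitude"].corr(grp["least_effective_rating"])
    print(f"{rank}: r(g, most) = {r_most:.2f}, r(g, least) = {r_least:.2f}")
```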

Page 17: Response Processes in SJT Performance: The Role of General and Specific Knowledge

Conclusions & Implications

• Most research on SJT measurement, development, and validity has been largely atheoretical (but see Motowidlo & Beier, 2010)
  – Dual-process account appears to be a reasonable response process model
  – Currently working on a more explicit empirical examination (see also Foldes et al., 2010)
• What does having a response process model buy us?
  – SJT construct validity: constructs vs. reasoning
    » Could label it “practical intelligence,” but even that depends on...
  – Interpretation of SJT performance
    » Who is selected as the “experts” holds significant importance
    » Extent to which respondents reason/process information in a manner similar to “experts”
  – Response elicitation affects SJT interpretation
    » Most likely option/ratings = more heavily influenced by heuristic reasoning
    » Least likely option/ratings = more heavily influenced by cognitive reasoning

Page 18: Response Processes in SJT Performance: The Role of General and Specific Knowledge

RESPONSE PROCESSES IN SJT PERFORMANCE:
THE ROLE OF GENERAL AND SPECIFIC KNOWLEDGE

James A. Grand, Matthew T. Allen, Kenneth Pearlman

27th Annual Conference of the Society for Industrial & Organizational Psychology
April 27, 2012

The views, opinions, and/or findings contained in this presentation are solely those of the authors and should not be construed as an official Department of the Army or Department of Defense position, policy, or decision, unless so designated by other documentation.