MSU PSY 255 - Selection

PSY 255, 1st Edition, Lecture 10

Outline of Last Lecture
I. Types of Predictors

Outline of Current Lecture
II. Recruitment
III. Selection
IV. Validation Study

Current Lecture

Recruitment
- Attracting and encouraging applications for employment
  o Sets the upper limit on the quality of applicants that can be selected
  o Good for reducing adverse impact and diversifying the workforce
  o Realistic job previews (RJPs)
  o Formal (newspaper ads, campus visits, internet, on-site) vs. informal (word-of-mouth)
- Selection is sometimes used to discourage and reduce the number of applicants!

Selection
- Begins with job analysis
  o Job specifications tell us which KSAs to select on
  o We can then develop or use off-the-shelf measures of those KSAs
- Selection battery
  o A set of predictors and tests
  o Yields better predictions than using only one predictor
  o Ideal: predictors that relate strongly to the criterion but weakly to each other (see the first sketch after this outline)
- Selection and validation: once we have chosen our predictors, we need to evaluate them
  o Are they valid, and do they predict job performance?
  o Two strategies:
    - Validation study
      - Content
      - Criterion-related (predictive vs. concurrent)
    - Validity generalization

Validation Study
- Content validity
  o Is the predictor content representative of job-related KSAs?
- Steps:
  o Job analysis to identify the required job tasks and KSAs
  o SMEs rank the KSAs by importance
  o Choose predictors that measure the critical KSAs
  o SMEs evaluate the predictor-KSA overlap
- A subjective, judgmental method that does NOT provide a validity coefficient
  o Doesn't require criterion or predictor data
  o Rudder v. D.C. (1995): content validity is adequate for validation
- Criterion-related validity
  o Establishes the strength of the relationship between predictors and job performance
  o A quantitative method that gives a validity coefficient (see the second sketch below)
  o Two designs: predictive and concurrent
- Predictive validity
  o Longitudinal design: predictor data are collected at T1 and criterion data at T2
  o Steps:
    - Job analysis tells us the required KSAs
    - Choose predictors that measure those KSAs
    - Administer the predictors to job applicants
    - Hire applicants based on information other than the predictors we are trying to validate
- Concurrent validity
  o Predictor and criterion data are collected simultaneously from incumbents
- Cross-validation
  o Validity shrinkage: the validity of the predictors will probably be lower in a different sample of applicants or incumbents
  o Thus, we cross-validate to assess the amount of shrinkage and double-check the validity
  o Essentially, we compare how much the coefficient of determination changes from sample 1 (R²₁) to sample 2 (R²₂) (see the third sketch below)
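
The three short sketches below are supplements, not part of the professor's lecture; they use Python with invented data and hypothetical test names purely to illustrate the statistics mentioned in the outline. This first sketch shows why a battery of predictors that each relate to the criterion but only weakly to each other out-predicts a single predictor: the second test adds explained variance (R²) that the first does not already capture.

    # Illustration only: hypothetical tests, simulated scores.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200

    # Two predictors constructed to be nearly uncorrelated with each other.
    cognitive_test = rng.normal(size=n)
    integrity_test = rng.normal(size=n)

    # Simulated job-performance criterion that both predictors feed into.
    performance = (0.5 * cognitive_test + 0.4 * integrity_test
                   + rng.normal(scale=0.7, size=n))

    def r_squared(X, y):
        # R^2 from an ordinary least-squares fit of y on X (with intercept).
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1 - resid.var() / y.var()

    battery = np.column_stack([cognitive_test, integrity_test])
    print("R^2, cognitive test alone:", round(r_squared(cognitive_test, performance), 3))
    print("R^2, two-test battery:    ", round(r_squared(battery, performance), 3))

With data built this way, the battery's R² should come out noticeably higher than the single test's, which is the "relate strongly to the criterion, weakly to each other" ideal from the outline.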

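The second sketch treats the criterion-related validity coefficient in a predictive design: predictor scores are collected from applicants at T1, performance ratings from the same people later at T2, and the validity coefficient is simply the Pearson correlation between the two. The ten applicants below are invented.

    # Illustration only: ten invented applicants.
    import numpy as np

    test_scores_t1 = np.array([52, 61, 47, 70, 58, 66, 43, 75, 55, 63], dtype=float)  # at hire (T1)
    performance_t2 = np.array([3.1, 3.8, 2.9, 4.2, 3.4, 3.9, 2.6, 4.5, 3.3, 3.7])     # ratings later (T2)

    validity_coefficient = np.corrcoef(test_scores_t1, performance_t2)[0, 1]
    print(f"validity coefficient r = {validity_coefficient:.2f}")

A concurrent design would compute the same correlation, but with both sets of scores gathered from current incumbents at the same time.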

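The third sketch illustrates cross-validation and validity shrinkage, again with simulated data and generic variable names: regression weights are estimated in sample 1 and then applied unchanged to sample 2, and the coefficient of determination is compared across the samples (R²₁ vs. R²₂). The in-sample value will usually exceed the cross-sample value; that drop is the shrinkage.

    # Illustration only: simulated samples, generic predictor names.
    import numpy as np

    rng = np.random.default_rng(1)
    true_weights = np.array([0.4, 0.3, 0.0, 0.0, 0.0])  # only 2 of 5 predictors actually matter

    def simulate_sample(n):
        predictors = rng.normal(size=(n, len(true_weights)))
        criterion = predictors @ true_weights + rng.normal(scale=0.8, size=n)
        return predictors, criterion

    def r_squared(X, y, beta):
        # R^2 when the weights in beta are used to predict y from X.
        resid = y - np.column_stack([np.ones(len(y)), X]) @ beta
        return 1 - resid.var() / y.var()

    X1, y1 = simulate_sample(40)    # sample 1: where the weights are estimated
    X2, y2 = simulate_sample(200)   # sample 2: the cross-validation sample

    # Estimate regression weights (with intercept) in sample 1 only.
    beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y1)), X1]), y1, rcond=None)

    r2_sample1 = r_squared(X1, y1, beta)
    r2_sample2 = r_squared(X2, y2, beta)
    print(f"R^2 sample 1 = {r2_sample1:.3f}, R^2 sample 2 = {r2_sample2:.3f}, "
          f"shrinkage = {r2_sample1 - r2_sample2:.3f}")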