
Global Selection Systems

Case Studies & Lessons Learned

© 2008 Hogan Assessment Systems

Overview of Session
Survey of Issues in Global Selection
Jarrett Shalhoop, Hogan Assessment Systems

Validation of the Shell Recruitment Process


Thi Bui, Royal Dutch Shell

Global Assessment Development & Implementation


Kelly Kaminski, Starwood Hotels & Resorts

Applying Assessments Across Cultures: A Case Study


Amie Lawrence, Select International

Q&A Session

Global Selection Issues


Perspective
Transporting an assessment outside of the US and using it for selection/assessment

General Issues
Language/Translation
Assessment Norms
Validation (logistical and psychometric)
Technology

Language/Translation
Three approaches to translation (van de Vijver & Leung, 1997)
Application: literal translation
Adaptation: altering as necessary
Assembly: developing a parallel assessment

Back Translation
Forward Translation


Assessment Norms
3-Component Model of Error (Meyer & Foster, 2008)
Sample Differences: absolute sample size, relative sample size, sample composition
Translation Differences: translation quality, lack of congruous words, cultural relevance, strength of item wording
Cultural Differences: response styles, reference group effects, true cultural differences


Validation
Logistical
Coordinating research globally is a challenge

Psychometric
Culture as a moderator of the relationship to performance (see the regression sketch below)
Manifestation of characteristics
Cultural response biases
Willingness to provide accurate information
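
The moderator question above can be examined with a moderated regression: an assessment-by-region interaction term indicates whether the score-performance relationship differs across cultures. The slides do not describe a specific analysis; the sketch below is a minimal illustration with hypothetical file and column names (score, performance, region).

```python
# Minimal sketch: culture/region as a moderator of the score-performance relationship.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("validation_sample.csv")  # assumed: one row per hire

# The score x region interaction tests whether validity differs by region.
model = smf.ols("performance ~ score * C(region)", data=df).fit()
print(model.summary())
```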


Technology
Compatibility
Front end: web-based assessment, light assessment portal
Back end: HRIS integrations, HR-XML compliance

Intellectual Property
Protection of IP


Global Selection Validation Study Results

Human Resources Learning

Copyright: Shell International Ltd 2005

About Shell
Active in over 130 countries
Employs 108,000 people worldwide
What we do: Exploration & Production, Downstream, Gas & Power, Trading, Renewables & Hydrogen, Global Solutions

Global Recruitment Process


Online application form via www.shell.com/careers
Applicants who pass screening are invited for an interview
Candidates who are successful at the first interview are invited to take part in a formal assessment: a chance for us to assess their suitability and for the candidates to get a real insight into what working life at Shell would be like
Graduates can choose one of three routes to Shell for their assessment (Shell Recruitment Day, Gourami, or Assessed Internship), while experienced professionals take part in a structured interview

Background and Objectives


Validation study of the Graduate and Experienced Hire (XP) selection processes, using data from a convenience sample of 2005 and 2006 hires with readily available performance scores and selection data
This study represents the first in a series of ongoing annual validation work
It is the first global validation study examining the relationship between selection processes and work performance (1st-year Individual Performance Factor, IPF) across the Shell Group
Systematic effort to better understand the effectiveness of the selection processes, including the competence framework, the assessments, the assessment exercises, and how each of these functions in the regions
A major objective of this initial study was to better understand how data are currently being collected and stored

Graduate Assessments
Campus Interviews are functioning differently across the regions
Evidence supports the effectiveness of the SRD
Assessed Internships are not functioning optimally
Evidence supports the effectiveness of the Gourami assessments

Graduate Competence Framework


The Capacity competence dimension is effective in predicting work performance; however, regional differences are present
The Achievement competence dimension is effective in predicting work performance
Relationship dimension scores are not currently predictive of work performance

Graduate Assessment Exercises


The Campus Interview shows a positive but not statistically significant relationship with work performance
The Campus Interview is more positively correlated with work performance in the Americas
SRD exercises show positive but not significant relationships with work performance
Gourami exercises show positive correlations with work performance
Assessed Internship exercises show negative correlations with work performance

Graduate Assessment Recommendations


Expand the range of outcome measures
First-year IPF was used as a proxy measure for work performance in this initial study
This metric does not cover the full range of success outcomes
Recommend using additional criteria, such as manager ratings

Improve data collection and archiving


Automate and integrate data collection as much as possible
Collect data on applications that do not progress fully through the selection system, to track flow percentages and pass rates at successive assessment steps

Improve measures of Relationship competence on Campus Interview and SRD


The Relationship dimension is a valuable one (as evidenced by the Gourami dimension correlations). Shell Learning will recommend improvements to the Campus Interview and SRD measures.

Improve predictive validity of Assessed Internship


Shell Learning will conduct an examination of the training and process factors that may be inhibiting these measures.

XPA Assessments
Experienced Hire assessment exercises show weak (or slightly negative) correlations with IPF in the 1st year after hire
Functional scores vary in relation to work performance by exercise
Project Discussion functional scores are more indicative of future work performance than functional scores from the Professional Interview
It is possible that the Project Discussion is used as the final assessment, with the Structured Interview and Professional Interview acting as precursor or preliminary hurdles

XPA Competence Framework


The XP Competence framework dimensions show weak (and often weak negative) correlations with work performance.

XPA Recommendations
In 2005 and 2006 the XPA was not globally mandated, so there may have been a diversity of practice in implementation.
More rigorous and standardized training and implementation are essential to improving the performance of these assessments.

Tips & Hints


Set client expectations: timing, data collection, results
Use the first study to understand data collection practices and issues
Establish a partnership with the Recruitment Department to identify the data needed
Utilize manager rating forms in addition to other performance indicators
Have as few focal contact points as possible to collect data (one per region)

Assessment Around the World: A Case Study from Starwood Hotels


Kelly A. Kaminski & Monica A. Hemingway
Starwood Hotels & Resorts Worldwide, Inc.
23rd Annual SIOP Conference April 2008, San Francisco
© 2008 Starwood Hotels & Resorts Worldwide, Inc.

Starwood: A Global Company


World-class hotel and spa brands
890 managed & franchised properties in over 100 countries
Approximately 400 hotels managed by Starwood; we oversee the HR systems
145,000 associates
Conduct business in as many as 40 languages

With a Global Assessment Program


Assessment      Sales and Marketing          Call Center                 Guest Service
Usage           400 hotels globally          US, Canada, and Ireland     US, UK, China
Versions        Varies by level and division 2 versions, local norms     Local norms
Languages       9                            English only                14

Our Approach to Job Analysis


Think globally, act globally: involve experts around the world
Sift through thousands of job titles
Job observations and SME focus groups
One branded core competency model
+ One guest service training program
+ One performance management system
= A common understanding of the jobs

Jobs Vary Across Divisions: An Example for Sales Managers


Standard Title    Comparison    Task Overlap   Task r   Comp Overlap   Comp r
Sales Manager     NA vs AP         76.9%        .75        95.2%        .92
Sales Manager     NA vs EAME       68.8%        .86        90.9%        .84
Sales Manager     NA vs LA         68.8%        .82        95.2%        .85
Sales Manager     AP vs EAME       75.0%        .84        95.5%        .82
Sales Manager     AP vs LA         75.0%        .75        90.9%        .81
Sales Manager     EAME vs LA      100.0%        .82        95.5%        .88
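
As an illustration of how figures like those in the table above might be produced, the sketch below computes a task-overlap percentage and a correlation between importance ratings on the tasks two regions share. The task names, ratings, and the exact overlap formula are assumptions; the slides do not specify the computation.

```python
# Minimal sketch: task overlap and rating correlation for two regional job analyses.
# Task names and importance ratings are hypothetical.
import numpy as np

na = {"prospect accounts": 4.2, "negotiate rates": 4.8, "conduct site tours": 3.9,
      "manage group blocks": 3.5, "forecast revenue": 3.0}
ap = {"prospect accounts": 4.0, "negotiate rates": 4.6, "conduct site tours": 4.2,
      "manage group blocks": 2.9, "coordinate banquets": 3.8}

shared = sorted(set(na) & set(ap))
overlap = len(shared) / len(set(na) | set(ap))  # one possible overlap index

# Correlation of importance ratings across the shared tasks
r = np.corrcoef([na[t] for t in shared], [ap[t] for t in shared])[0, 1]
print(f"Task overlap: {overlap:.1%}   Task r: {r:.2f}")
```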

Rank Ordering Guest Service Competencies in North America and Latin America
Importance Ranking (1-25)    North America    Latin America
Problem Solving                    16                1
Attendance                          4               15
Coping with Stress                  3               13
Persistence                        23                8
Multitasking                        8               18
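
One way to summarize how much these two rankings diverge (not something the slide reports) is a rank correlation over the competencies shown; the sketch below uses the five ranks from the table above.

```python
# Minimal sketch: rank correlation between the NA and LA importance rankings above.
from scipy.stats import spearmanr

# Problem Solving, Attendance, Coping with Stress, Persistence, Multitasking
north_america = [16, 4, 3, 23, 8]
latin_america = [1, 15, 13, 8, 18]

rho, p = spearmanr(north_america, latin_america)
print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")
```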

Translation
Translate and back-translate
Involve I/Os who are native speakers of the language
Use in-country reviewers to capture local dialects
Translate before validation
Have your validation sample provide feedback about the quality of the translations

How Many Languages Do You Need? Our Sales Leaders Said Just a Few
Language distribution (pie chart, N = 281): English 57%, German 13%, Chinese 10%, Spanish (LA) 9%, French 5%, Japanese 3%, Bahasa 2%, Spanish (EU) 1%, Italian 0%

Getting Technology To Speak Your Language


Starwoodhotels.com: English, German, Italian, Thai, Malay, French, Chinese, Spanish, Japanese
ATS recruiter site & job postings: Polish, Dutch, Spanish (LA), Portuguese, Korean, French (CA)
Assessment authoring: English

Call Center Candidates in North America and Ireland Respond Differently


32 of 53 items had statistically significant differences in response patterns between Ireland and US applicants.

"How would you describe yourself?"    Ireland    USA
Dependable                              25%      78%
Clever                                  44%       9%
Happy                                   33%      11%
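
One way to flag items like the one above is a contingency-table test of the response distributions in the two countries. The sketch below is illustrative only: the counts are hypothetical (the slide reports percentages), and the slide does not say which statistical test was actually used.

```python
# Minimal sketch: testing one item's response distribution across two applicant groups.
from scipy.stats import chi2_contingency

#           Dependable  Clever  Happy
counts = [[ 25,  44,  33],   # Ireland (hypothetical n = 102)
          [156,  18,  22]]   # USA     (hypothetical n = 196)

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.1f}, df = {dof}, p = {p:.3g}")
```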

Implementation and Delivery


Legal requirements and data collection/retention policies vary
Difficulty integrating assessments with e-recruit platforms
Must allow candidates to choose a language
Your norms or mine? Does a candidate test in their home country or the country where the job resides?

In Conclusion
Think globally, act globally to avoid problems and get buy-in
Same job title does not mean same job
Same language does not mean same responses
Translate early, but scope the need first
Good luck with the technology

23rd Annual SIOP Conference April 10-12, 2008 San Francisco

Applying Assessments Across Cultures: A Case Study

Amie Lawrence, Ph.D., Lance Andrews, & Matthew O'Connell, Ph.D.

Our Case Study

GLOBAL AUTOMOTIVE MANUFACTURER


Needed a standardized entry-level assessment process in 4 countries: US, Canada, Mexico, UK
At all locations, applicants were primarily local
Web-based assessment: personality, SJT, & cognitive ability
Administered in group proctored settings
Part of a multiple-hurdle selection process

Issue #1: Candidate Privacy

CANDIDATE PRIVACY

Other countries have laws limiting the collection and/or transfer of candidate personal information
WHAT DID WE DO?

Did not collect demographic data from any candidates outside the US
Obtained approval from all UK candidates regarding their personal data

Issue #1: Candidate Privacy

LESSONS LEARNED

Know where candidate data are stored and where they will be transferred
Involve the client's legal department in the planning stages of the project to ensure that these issues are addressed

Issue #2: Applicant Tracking

APPLICANT TRACKING
Unique identifier
Database configuration

WHAT DID WE DO?
Created a unique identifier from telephone number and month and day of birth (see the sketch below)
Reviewed all fields in the system to align with the relevant country
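
Below is a minimal sketch of one way to build such an identifier from a phone number plus month and day of birth. The hashing step and field handling are assumptions; the slides only state which fields were combined.

```python
# Minimal sketch: candidate identifier from phone number + month/day of birth.
# Hashing is an added safeguard not mentioned in the slides.
import hashlib
import re

def candidate_id(phone: str, birth_month: int, birth_day: int) -> str:
    digits = re.sub(r"\D", "", phone)              # strip formatting, keep digits only
    raw = f"{digits}-{birth_month:02d}{birth_day:02d}"
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

print(candidate_id("+44 20 7946 0018", 3, 14))
```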

Issue #2: Applicant Tracking

LESSONS LEARNED
Discuss these issues early in the project life cycle
Verify every field that a candidate completes throughout the selection process

Issue #3: Assessment Content

ASSESSMENT CONTENT
The American version of the assessment was not appropriate for use in any of the other countries

WHAT DID WE DO?
Translated the assessment into Spanish
Reviewed assessment content for phrasing and spelling changes
Programmed the changes

Issue #3: Assessment Content

LESSONS LEARNED
Changes may be needed even in other English-speaking countries
Account for the time and resources needed for this step

Issue #4: Assessment Norms

ASSESSMENT NORMS
No local norms available
No concurrent validation study planned

WHAT DID WE DO?
Delayed report availability
Conducted a norm analysis

Issue #4: Assessment Norms

LESSONS LEARNED
Local norms were necessary
Cultural & regional differences were identified

Issue #4: Assessment Norms

EVIDENCE FOR LOCAL NORMS
Web-based assessment measurements
6 personality scales
6 cognitive measures (SJT, IP, cognitive ability)

Assessment differences by country
Assessment differences by item type

Personality Difference Scores

Mean Difference Scores as Compared to US Norms

Scale                 Canada    Mexico     UK
Teamwork               -.13      +.58     -.05
Conscientiousness      -.07      +.02     -.44
Locus of Control       -.08      +.87     -.65
Positive Attitude      -.03      +.05     -.49
Attn to Detail         -.11     +1.11     -.46
Safety                 -.02     +1.31     -.15

Cognitive Difference Scores

Mean Difference Scores as Compared to US Norms

Measure               Canada    Mexico     UK
IP: Gauge              +.43      -.76     -.05
IP: Count              +.32      -.57     +.06
SJT: Teamwork          +.04     -1.56     -.47
SJT: Safety            +.09     -1.25     -.03
Qualitative PS         +.15     -1.45     -.44
Quantitative PS        +.48     -1.33     +.03
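
A sketch of how mean difference scores like those in the two tables above could be computed against US norms appears below. The slides do not state the metric; this assumes scale scores are expressed in US-applicant standard-deviation units, and the file and column names are hypothetical.

```python
# Minimal sketch: mean difference scores for one country relative to US norms.
# Assumes one column per scale; standardizing against the US sample is an assumption.
import pandas as pd

us = pd.read_csv("us_applicants.csv")
mexico = pd.read_csv("mexico_applicants.csv")

scales = ["Teamwork", "Conscientiousness", "Locus of Control",
          "Positive Attitude", "Attn to Detail", "Safety"]

diff = (mexico[scales].mean() - us[scales].mean()) / us[scales].std()
print(diff.round(2))   # positive = Mexico applicants score above the US norm on that scale
```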

Summary of Differences

Applicants in Mexico scored higher than applicants in the US on all 6 personality measures
Applicants in Mexico scored lower than applicants in the US on all cognitive measures, with the largest differences on the SJT and traditional cognitive ability measures
Applicants in the UK scored lower than applicants in the US on all 6 personality measures
Applicants in Canada were most similar to applicants in the US

Future Research Questions

What is the impact on validity of local vs. global norms?
What is causing the score differences? Are they true score differences or cultural variance?
Are all item types developed in and for US companies transferable across cultures?
What are the best processes and procedures for ensuring that assessments can be confidently applied across cultures?

About Select International

Founded in 1993, Select International is a leading provider of selection and development solutions for Global 2000 companies
From entry-level to executive, organizations identify, hire, and retain top talent using Select International's cutting-edge products, systems, and recruitment process outsourcing services
Headquartered in Pittsburgh, PA, Select maintains offices in Dallas, San Diego, Toronto, London, and South Africa
