When I started in Human Resources, the Uniform Guidelines on Employee Selection Procedures (UGESP, 1978) had just been adopted by the Civil Service Commission, the Department of Labor, the Equal Employment Opportunity Commission and the Department of Justice. These guidelines spelled out the requirements for demonstrating that selection procedures had content validity, criterion-related validity and/or construct validity.

The adoption of the Guidelines was followed by a wave of class-action lawsuits, filed primarily by the Department of Justice against public safety agencies, alleging illegal discrimination in hiring procedures and failure to demonstrate the validity of the instruments being used for selection. This created a demand for individuals with test development and validation experience who could assist public sector agencies in developing and validating new selection instruments or, as some agencies chose at the time, in opting out of written exams altogether.

The hiring of new “Personnel Analysts” focused on individuals with backgrounds in research, statistics and testing. Training for new analysts focused on test writing and validation, which had at its heart the development of job analysis procedures. Job analyses intended to serve as the basis for content-valid selection procedures had to be designed to withstand the rigorous scrutiny of the Department of Justice; as the chief writer of the Guidelines admitted to me, the Guidelines were designed to tilt the scales in the Department's favor when it came to litigation. In addition, while the Guidelines contained a section outlining in detail the requirements for demonstrating content validity, the Department of Justice attorneys responsible for litigating most of these cases had a distinct bias toward criterion-related validity and held content validity in low regard. That is, even if an agency followed the Guidelines to demonstrate that its selection instruments had been developed to “build in” their validity (content validity), the DOJ tended to pick the agency's analyses and studies apart if it did not also demonstrate a statistical correlation between test performance and a job-related criterion such as job performance (criterion-related validity).

Over time, and across different administrations with different funding priorities, the big class-action lawsuits diminished, and so did the focus on the technical work involved in human resources. Many people outside the field of test development do not like tests and testing, and many biases against testing developed naturally among people who at one time or another felt used or abused by a test. Giving in to these trends, many agencies did away with traditional testing in favor of a more “touchy-feely” approach. However, well-developed written exams remain our most effective tool for selecting employees, with statistical correlations between test performance and job performance higher than those of any other selection procedure. In addition, the UGESP have not gone away. They remain in effect today, even though the hiring and training of new analysts no longer typically focuses on selecting individuals whose skill sets include the ability to conduct job analyses or to write and validate tests.

That does not mean that the need for valid selection instruments has disappeared. There is still a need and demand for well-developed and appropriately validated tests, particularly written exams, to help select public safety personnel (e.g., police officers and firefighters). The “do it yourself” attitude that led agencies to create their own job analysis procedures, write their own tests and conduct their own validation studies is no longer as prevalent as it was in the eighties. Fortunately, unlike the conditions under which I started my career, there are now many good test publishers and consulting firms with good products available to Human Resources shops that don't have the time, money or expertise to create their own materials. Even I determined toward the end of my career that there are many benefits in using test publishers and consultants to help develop selection procedures and conduct job analyses. Once I saw the advantages of tests developed with larger numbers of subject matter experts and test takers than were available to me, I realized that the skill set and training for new analysts did not have to focus on creating materials from scratch. Rather, the focus could shift to a demonstrated ability and willingness to learn enough to recognize good commercial products and vendors.

In that regard, individuals working in HR who are tasked with selection activities should understand that a thorough job analysis is the foundation for most of the “technical work” performed in human resources. Job analyses are typically time consuming, labor intensive and fraught with potential pitfalls that only individuals with experience conducting them can avoid. Therefore, many agencies take shortcuts in completing them or avoid them altogether. This is particularly true when the HR staff is small and/or lacks the expertise to design a job analysis system or conduct a thorough job analysis. However, agencies that understand the importance of using good written exams and other validated selection procedures can school themselves sufficiently to recognize the difference between good and bad work. In particular, IPMA-HR has one of the most thorough and effective job analysis procedures in use today.

Essentially, there is nothing mystical about a good job analysis. As indicated before, the UGESP as well as SIOP's Principles for the Validation and Use of Personnel Selection Procedures spell out the requirements. However, using them to create your own system is a bit like following plans to build your own log home; fortunately, you do not have to reinvent the wheel. In simple terms, a good job analysis dissects a particular job and focuses on all the details that build the job from the ground up.

While a good job analysis provides such detailed information about a job that it can be used for a multitude of purposes, for the most part job analyses are used to develop content-valid selection instruments, particularly written exams. Our focus here will be a simplified overview. The next article will provide more detail on how the IPMA-HR system works, and the third article in this series will look at some of the other uses for the information obtained by conducting a thorough job analysis.

The construct of content validity holds that the content of the test is job related and that the test is therefore valid for the target job. That being the case, validation of the selection instrument must include demonstrating a link between what the test measures and what is required to do the work. The simplest form of content validity would take a particular job, look at the tasks performed on that job and then ask job candidates to perform those tasks. For instance, applicants for a mechanic's job could be asked to tune up an engine. There would be no question about the content validity of this test, since it would be an actual work simulation. However, this type of testing is rarely practical: it is time consuming, labor intensive, slow and very costly. That being the case, HR shops often take a step back from asking candidates to perform job tasks and instead ask them to take a written test that measures the knowledge, skills and abilities necessary to perform those tasks. The idea is that a person who possesses a sufficient level of job-related knowledge, skills and abilities should be able to perform the job.

This is the foundation for a job analysis, since it must identify the tasks performed on the job and, from there, the knowledge, skills and abilities necessary to perform those tasks. Further, the test developed from the job analysis must be shown to measure the correct knowledge, skills, abilities and personal characteristics (KSAPs). This means that the linkages between the tasks, the KSAPs and the test must be demonstrated in the job analysis, which is typically done through a series of ratings.

Commonly, subject matter experts are asked to brainstorm a comprehensive list of tasks performed in the job and then use that list to identify every knowledge, skill, ability or personal characteristic necessary to perform those tasks. This process depends both on developing comprehensive lists and on having qualified subject matter experts (SMEs) write the task and KSAP statements correctly. The next step involves SMEs rating the tasks and KSAPs. Task ratings usually use scales that focus on how important the task is, how often it is performed and when an incumbent must be able to perform it. KSAP ratings are similar, covering importance, time spent using the KSAP, difficulty level and when the KSAP is required.
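For readers who think in data terms, the sketch below shows one way SME ratings could be captured and summarized. It is a minimal illustration only; the field names, the 1-to-5 scale anchors and the majority rule for “needed at entry” are assumptions made for the example, not scales prescribed by the UGESP or the IPMA-HR system.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical rating record: one SME's ratings for one task or KSAP statement.
# Scale anchors (1-5) and field names are illustrative assumptions.
@dataclass
class Rating:
    statement_id: str      # e.g., "T-012" for a task, "K-007" for a KSAP
    importance: int        # 1 = unimportant ... 5 = critical
    frequency: int         # 1 = rarely performed ... 5 = performed constantly
    needed_at_entry: bool  # must an incumbent have it on day one?

def summarize(ratings):
    """Average the SME panel's ratings for a single statement."""
    return {
        "statement_id": ratings[0].statement_id,
        "mean_importance": mean(r.importance for r in ratings),
        "mean_frequency": mean(r.frequency for r in ratings),
        # Treat a statement as "needed at entry" only if most SMEs agree.
        "needed_at_entry": sum(r.needed_at_entry for r in ratings) > len(ratings) / 2,
    }

# Example: three SMEs rate a hypothetical task statement.
panel = [
    Rating("T-012", importance=5, frequency=3, needed_at_entry=True),
    Rating("T-012", importance=4, frequency=2, needed_at_entry=True),
    Rating("T-012", importance=5, frequency=3, needed_at_entry=True),
]
print(summarize(panel))
```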

The next step requires linking tasks and KSAPs. There are several ways this is done, depending upon the system used, but it involves identifying all the KSAPs required to perform a specific task. For example, the task of making an arrest for a police officer could require knowledge of the law, knowledge of criminal behavior, knowledge of arrest procedures, the ability to subdue suspects, skill in handcuffing a suspect, skill in the use of lethal and non-lethal force, command presence, the ability to communicate verbally, and so on.
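As a simple illustration of what a linkage record might look like, the sketch below uses a plain mapping from tasks to the KSAPs they require. The identifiers and statements are hypothetical, and real systems often capture linkages as a rated matrix rather than a simple list.

```python
# Hypothetical task-to-KSAP linkage table for the arrest example above.
task_ksap_links = {
    "T-001 Make an arrest": [
        "K-010 Knowledge of the law",
        "K-011 Knowledge of arrest procedures",
        "A-020 Ability to subdue suspects",
        "S-030 Skill in handcuffing a suspect",
        "A-021 Ability to communicate verbally",
    ],
    "T-002 Write an incident report": [
        "A-021 Ability to communicate verbally",
        "K-012 Knowledge of report-writing procedures",
    ],
}

def tasks_supported_by(ksap, links):
    """List the tasks to which a given KSAP is linked."""
    return [task for task, ksaps in links.items() if ksap in ksaps]

# A KSAP linked to no task is a red flag in the later screening step.
print(tasks_supported_by("A-021 Ability to communicate verbally", task_ksap_links))
```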

Once the task and KSAP lists are complete and the linkages have been made, the next step involves an inclusion versus exclusion process. Ultimately, the job analysis system should bring you to the point where you can identify the KSAPs that are important for job performance, used or required a reasonable percentage of the time, required at entry into the position and linked to at least one important task. The KSAPs that survive this analysis and meet the established criteria can be used for testing.

Typically, the surviving KSAPs are used to create an exam plan outline that groups each KSAP under the category or type of test that is the most effective tool for measuring it. This process aligns the KSAPs under the most appropriate selection tools and helps identify all the selection tools needed. That is, the KSAPs best measured by an oral exam would be listed under that exam, and the need to measure them would help justify including an oral exam. Through this process, steps in the selection process such as a written exam, oral board exam, physical fitness test, psychological exam and background investigation are identified, and the related KSAPs are aligned under them. The weights assigned in the original rating process also help determine the weight each instrument carries in a candidate's final overall score, as well as the weights of subsections within an exam. For example, the written test might contain fifteen items on reading comprehension and ten items on English grammar, spelling and punctuation, while contributing thirty percent of a candidate's final rating, based on the original ratings for the KSAPs covered by those areas.
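The screening and weighting logic described above can be made concrete with a short sketch. The cutoff values, field names and the idea of weighting each instrument by the summed importance of the KSAPs it measures are assumptions made for illustration; an actual exam plan would use the criteria and weighting rules documented in the agency's own job analysis system.

```python
from dataclasses import dataclass

# Hypothetical KSAP record after the SME panel's ratings have been averaged.
@dataclass
class KsapSummary:
    ksap_id: str
    mean_importance: float        # 1-5 panel average
    pct_time_used: float          # share of work time the KSAP is used (0-1)
    needed_at_entry: bool
    linked_to_important_task: bool
    best_measured_by: str         # e.g., "written exam", "oral board exam"

def passes_screen(k, min_importance=3.5, min_pct_time=0.05):
    """Apply the inclusion/exclusion criteria (cutoffs here are illustrative)."""
    return (k.mean_importance >= min_importance
            and k.pct_time_used >= min_pct_time
            and k.needed_at_entry
            and k.linked_to_important_task)

def instrument_weights(ksaps):
    """Weight each selection instrument by the importance of the surviving KSAPs it measures."""
    totals = {}
    for k in ksaps:
        if passes_screen(k):
            totals[k.best_measured_by] = totals.get(k.best_measured_by, 0.0) + k.mean_importance
    grand_total = sum(totals.values())
    return {tool: round(share / grand_total, 2) for tool, share in totals.items()}

# Example: two surviving KSAPs fall under the written exam, one under the oral board.
plan = instrument_weights([
    KsapSummary("K-001 Reading comprehension", 4.5, 0.30, True, True, "written exam"),
    KsapSummary("K-002 English grammar and spelling", 3.8, 0.20, True, True, "written exam"),
    KsapSummary("A-003 Oral communication", 4.2, 0.40, True, True, "oral board exam"),
])
print(plan)  # {'written exam': 0.66, 'oral board exam': 0.34}
```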

That’s job analysis in a nutshell. The information provided here would not prepare anyone to go out and perform their own job analysis. It should, however, give readers an idea of what is involved in a job analysis, an understanding of the Task → KSAP → Test linkage, and an appreciation of the importance of accurate ratings and of having the process conducted by trained experts.

The next article will provide a more detailed look at the process and help readers further understand how it works.


References

Equal Employment Opportunity Commission, Civil Service Commission, Department of Labor, and Department of Justice. (1978). Uniform guidelines on employee selection procedures. Federal Register, 43(166), 38290-38315.

Society for Industrial and Organizational Psychology, Inc. (2003). Principles for the validation and use of personnel selection procedures (4th ed.). College Park, MD: Author. http://www.siop.org/_principles/principles.pdf