Participants Needed for Office of Fire Marshal Study

IPMA-HR is currently seeking participants for a nationwide Office of Fire Marshal/Office of Fire Prevention study. This study is the first step in the development of a new Fire Marshal Test. The first part of the study involves surveying current members of Offices of Fire Prevention to learn about the important duties and demands of their jobs.

Examples of applicable positions include, but are not limited to: Fire Marshal, Fire/Arson Investigator, Premise Officer, Deputy Fire Marshal, Fire Prevention Officer, Fire Inspector, and Code Enforcement Officer.

If you or someone you know holds one of the above positions or a similar position, we would greatly value your input in developing our newest test series.

Participants will be entered into a raffle with a $500 prize! Participating agencies will also receive a 15% discount toward a future IPMA-HR assessment product purchase. Interested parties can use the following link to participate:

https://www.surveymonkey.com/r/FireMarshal

Questions can be emailed to our Research Associate, Julia Hind-Smith, at jsmith@ipma-hr.org.

Job Analysis – What’s New?

What’s new in job analysis? A cynic might reply, “very little.” However, such a conclusion would make for a very short blog post and, more importantly, would not be accurate. Despite the foundational nature of job analysis, there have been some recent developments worth sharing.

Consensus on Recommended Practices

Although it is still true that the Uniform Guidelines and the courts show no preference for any specific method of job analysis, a professional consensus has begun to emerge around recommended practices, driven in part by regulatory agencies' pressure for documentation. The associated principles can be expressed as follows. A job analysis should:

  • Be task-based. Despite the continued mention of worker-oriented approaches, including the emergence of competency models, the job description should be task-oriented, including detailed listings of tasks and the associated knowledge, skills, abilities, and personal characteristics (KSAPs).
  • Identify linkages. The identification and measurement of linkages between tasks and KSAPs are critical. When job analysis is used in test development, it is equally important to establish linkages between the KSAPs and the test content.
  • Utilize interviews and focus groups. The appropriate use of interviews or focus groups remains important in obtaining job information from incumbents and supervisors.
  • Incorporate questionnaires. Where practical (practicality being primarily a function of the number of incumbents and the quality of the information obtained from the interviews), questionnaires should be used to gather quantitative ratings of tasks, KSAPs, and linkages. The collected data can then be subjected to statistical analysis; a minimal sketch of this kind of analysis follows this list. Technological developments, including the widespread availability of easy-to-use online survey software, have made it much simpler and more cost-effective to create and distribute job analysis instruments. In designing surveys, practitioners should be aware of the now-ubiquitous nature of smartphones. Large matrices of the type so frequently used to collect job ratings do not translate well to small screens. As a result, analysts must be creative in designing surveys when incumbents will be responding on mobile devices, including tablets and smartphones.
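To make the questionnaire and linkage steps concrete, here is a minimal sketch of how collected ratings might be summarized. The tasks, KSAPs, rating scales, and the 3.0 retention cutoff are all invented for illustration; they are assumptions, not IPMA-HR prescriptions.

```python
from statistics import mean

# Hypothetical incumbent importance ratings (1 = not important, 5 = critical);
# tasks and raters are invented for illustration.
task_ratings = {
    "Inspect buildings for code compliance": [5, 4, 5, 4],
    "Prepare arson investigation reports":   [4, 5, 4, 3],
    "Schedule office supply orders":         [2, 1, 2, 2],
}

# Linkage ratings: how strongly a KSAP supports a retained task
# (0 = not needed, 3 = essential), one rating per subject-matter expert.
linkage_ratings = {
    ("Inspect buildings for code compliance", "Knowledge of fire codes"): [3, 3, 2],
    ("Prepare arson investigation reports",   "Written communication"):   [3, 2, 3],
}

CUTOFF = 3.0  # illustrative: retain tasks rated at least "important" on average

retained = {t: mean(r) for t, r in task_ratings.items() if mean(r) >= CUTOFF}
for task, avg in sorted(retained.items(), key=lambda kv: -kv[1]):
    print(f"retain: {task}  (mean importance {avg:.1f})")

for (task, ksap), ratings in linkage_ratings.items():
    print(f"link: {ksap} -> {task}  (mean linkage {mean(ratings):.1f})")
```

In practice the scales and cutoffs would come from the analyst's own survey design, and the same aggregation extends to KSAP importance ratings and KSAP-to-test-content linkages.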


Making Use of a Job Analysis Outside of Test Development

This is the third article in a three-part series on job analysis. We have covered the fundamentals of job analysis, and we have reviewed a report prepared by IPMA-HR to illustrate the role of a Human Resources Analyst in evaluating the work of test developers and consultants. As an HR professional, you may not be personally responsible for creating job analysis procedures, writing tests, and conducting validation studies, but you need to know how they are done so you can play this key evaluative role for your agency. Even if you hire a test developer or consultant, you may be asked to assist in the process, and understanding how job analyses are done will prove valuable to you in that role as well.

In the first article I stressed that a thorough job analysis is the foundation for most of the technical work performed in Human Resources. As we have already seen, a job analysis is critical for developing content valid selection instruments, which should be the heart of your recruiting and selection program. As if that were not sufficient reason for conducting job analyses, the information obtained from thoroughly analyzing the jobs in your agency can also support your training program, your classification and compensation program, your performance evaluation program, and disciplinary action and remediation, and it can serve as a basis for transportability studies, as discussed in the last article.

Utilizing a Job Analysis to Create Content Valid Selection Instruments

As indicated in the first article in this series, a thorough job analysis should be the foundation for most of the technical work performed in Human Resources. We also discussed that while analysts in the field today may not need to design their own job analysis systems and create written exams from the results, they should understand the process and be able to recognize whether products and vendors meet professional standards and can stand up to court scrutiny.

Our focus, as suggested above, will be the use of job analyses to create content valid selection instruments; other uses of job analysis results will be discussed in the next article. In addition, it is important to stress that the Uniform Guidelines on Employee Selection Procedures (UGESP, 1978), along with the Society for Industrial and Organizational Psychology (SIOP) Principles for the Validation and Use of Personnel Selection Procedures (Principles, 2003), are still the guiding documents for determining the adequacy of content validation procedures. It is also important to note that the UGESP (1978) do not apply only to written exams, but to all selection instruments: oral exams, physical fitness tests, background investigations, and one-on-one hiring interviews.
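One concrete way to document content validity, following the linkage logic discussed earlier, is a test plan that maps every exam item back to a KSAP from the job analysis and flags gaps in both directions. A minimal sketch; the item numbers and KSAP names are invented for illustration, not drawn from any actual IPMA-HR test plan.

```python
# Hypothetical test plan: each exam item is linked back to a KSAP
# identified in the job analysis (names invented for illustration).
test_plan = {
    1: "Knowledge of fire codes",
    2: "Knowledge of fire codes",
    3: "Written communication",
    4: "Knowledge of investigation procedures",
}

job_analysis_ksaps = {
    "Knowledge of fire codes",
    "Knowledge of investigation procedures",
    "Written communication",
    "Oral communication",
}

# Every item should trace to a documented KSAP...
unlinked = {i for i, k in test_plan.items() if k not in job_analysis_ksaps}
# ...and any KSAP left untested should be noted and justified in the report.
untested = job_analysis_ksaps - set(test_plan.values())

print("items with no KSAP linkage:", unlinked or "none")
print("KSAPs not covered by the exam:", untested or "none")
```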

There Is Nothing Mystical About A Good Job Analysis

When I started in Human Resources, the Uniform Guidelines on Employee Selection Procedures (UGESP, 1978) had just been adopted by the Civil Service Commission, the Department of Labor, the Equal Employment Opportunity Commission, and the Department of Justice. These guidelines spelled out the requirements for demonstrating that selection procedures had content validity, criterion-related validity, and/or construct validity.

The adoption of the Guidelines was followed by a wave of class-action lawsuits, filed primarily by the Department of Justice against public safety agencies, alleging illegal discrimination in hiring procedures and failure to demonstrate the validity of the instruments being used for selection. This created a demand for individuals with test development and validation experience to assist public sector agencies in developing and validating new selection instruments, or, as some agencies chose at the time, in opting out of written exams altogether.

The hiring of new “Personnel Analysts” focused on individuals with backgrounds in research, statistics, and testing. Training for new analysts focused on test writing and validation, which had at its heart the development of job analysis procedures. Job analyses intended to serve as the basis for content valid selection procedures had to be designed to withstand the rigorous scrutiny of the Department of Justice, whose Guidelines, as their chief writer admitted to me, were designed to tip the scales in the Department's favor when it came to litigation. In addition, while the Guidelines contained a section outlining the requirements for demonstrating content validity in detail, the Department of Justice team of attorneys responsible for litigating most of its cases had a distinct bias toward criterion-related validity and held content validity in low regard. That is, even if an agency followed the Guidelines to demonstrate that its selection instruments were developed to “build in” their validity (content validity), the DOJ would tend to pick its analyses and studies apart unless the agency also demonstrated a statistical correlation between test performance and a job-related criterion, such as job performance (criterion-related validity).
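The criterion-related evidence the DOJ favored boils down to a validity coefficient: the correlation between applicants' test scores and a later measure of job performance. A minimal sketch in Python; the scores below are invented for illustration and carry no claim about real exam data.

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical data: written-exam scores and later supervisor performance
# ratings for the same ten hires (invented for illustration).
test_scores = [72, 85, 90, 64, 78, 88, 70, 95, 60, 82]
performance = [3.1, 4.0, 4.2, 2.8, 3.5, 4.1, 3.0, 4.6, 2.5, 3.8]

r = pearson_r(test_scores, performance)
print(f"validity coefficient r = {r:.2f}")  # higher r = stronger criterion evidence
```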