Court Decisions Affecting Test Development and Usage: Albemarle Paper Co. v. Moody (1975)

Test developers and test users alike need to be informed of the legal and professional guidelines that govern employment testing. Why? Stay tuned: at the end of this series (read part one), I will present specific cases in which agencies violated these guidelines, along with what it cost them.

Albemarle Paper Co. v. Moody is a case that perfectly illustrates the cost of cutting corners.

When I think of the lesson learned in this case, it takes me back to the day I attempted to cut corners on my math homework. My calculus teacher had suspected for a while that most people in the class were not actually working through the homework calculations but were simply using a calculator. One day, I guess he got fed up, and he announced while handing out the nightly homework that we must “show your work.” Despite his instruction, me being the busy teenager that I was (joking), I attempted to cut a corner and finish my math homework faster by using the calculator instead of showing my work. The next morning, when the teacher came around to collect homework, he explicitly stated that anyone who didn’t show their work this time would get a zero.

“We’ll see about that,” I thought. I quickly tried working out the problems before my teacher arrived at my desk to collect my homework. I was sweating and, perhaps unsurprisingly, I did not finish in time. My teacher caught what I was doing, and I received a much-deserved zero. Needless to say, I’ve avoided cutting corners ever since.

The case of Albemarle Paper Co. v. Moody is similar in that it involves cutting corners. Just as cutting corners did not turn out well for me (or my math grade), cutting corners definitely did not turn out well for Albemarle Paper Co.

Albemarle Paper Co. v. Moody (1975)

Prior to the passage of the Civil Rights Act of 1964, Albemarle Paper Co. (Roanoke Rapids, NC) explicitly excluded Black employees from high-skilled and higher-paying positions. This is similar to the case of Griggs v. Duke Power Co. (1971), in which Black employees were restricted to the lower-paying departments. With the passage of the Civil Rights Act of 1964, this practice became illegal, and Albemarle Paper Co. began permitting Black employees to transfer into high-skilled positions or lines of work if they passed the Wonderlic Test and another general intelligence test, the Beta Examination.

Albemarle employees who were already in a given line of work or high-skilled department were not required to take or pass the exams to retain their jobs or to qualify for promotions within the same department. Although Albemarle no longer explicitly excluded Black employees from high-skilled, higher-paying positions, Black employees still could not secure these positions or promotions because they were unable to pass the exams at the cut score, which was set using national norms.

Approximately 10 years after the Civil Rights Act was signed into law, several current and former African American Albemarle Paper Co. employees filed suit against the company, arguing that the testing procedure resulted in disparate impact against minorities.

Throwback to Griggs v. Duke Power Co.

If you’ve learned anything from the case of Griggs v. Duke Power Co. (1971), you should already be thinking, “I hope Albemarle Paper Co. did a job analysis to support the job-relatedness of the test. The law states that you cannot test for qualifications that are not job-related.”

Well, I think Albemarle Paper Co. was thinking that, too. Four months before the trial, the company paid a professional to validate the exams. This expert lumped positions together based on their level in the career progression lines; however, the expert did not attempt to analyze the individual jobs in terms of the specific skills they require. Essentially, the study attempted to show that the exams could be used across multiple positions, as they were already being used, without analyzing the specific skills needed for each position.

The Decision: The Court ruled the validation method unacceptable and condemned the company’s decision to validate the exams only after it was sued. The Court also explicitly stated, “A test may be used in jobs other than those for which it has been professionally validated only if there are no significant differences between the studied and unstudied jobs.” Again, the expert hired by Albemarle Paper Co. did not investigate whether the necessary skills differed among job groups and therefore could not provide sufficient evidence that the exams were suitable for use across jobs.

The Implications: This case re-emphasized the importance of employment examinations being job-related, a requirement first mandated in Griggs v. Duke Power Co. (1971). It also underscored the need for test developers and test users to conduct or coordinate job analysis and validation studies that support job-relatedness. Test users are strongly encouraged to read the technical report that accompanies each exam, along with the results of the job analysis the exam is based on; these will help determine whether the exam assesses skills that are necessary for a particular position in your agency.

Test developers should be careful to link the developed examination to the required skills of the position and to document this process thoroughly (see job analysis for more information).
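
To make that documentation concrete, here is a minimal sketch, in Python, of what a linkage record might look like. Every item ID, KSAP, and task name is invented for illustration; this is one possible shape for the documentation, not a prescribed format.

```python
# Hypothetical linkage records tying each test item to a KSAP and to
# the job analysis tasks that established that KSAP. All IDs and
# names below are invented for illustration.
item_linkages = [
    {"item": "Q01", "ksap": "Knowledge of paper machine operation",
     "tasks": ["Operate winder", "Monitor pulp flow"]},
    {"item": "Q02", "ksap": "Ability to read gauges and meters",
     "tasks": ["Monitor pulp flow"]},
    {"item": "Q03", "ksap": "General vocabulary", "tasks": []},
]

# Items whose KSAP traces back to no documented task are exactly the
# kind of unsupported content the Albemarle validation study left open.
orphans = [rec["item"] for rec in item_linkages if not rec["tasks"]]
print("Items lacking documented task support:", orphans or "none")
```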

Author’s Note: The case of Albemarle Paper Co. v. Moody resulted in several other decisions affecting back pay awards. Those decisions were not discussed in this post; for more information, please see the references below.

References

  • Albemarle Paper Co. v. Moody, 422 U.S. 405 (1975)
  • Civil Rights Act of 1964, 42 U.S.C.
  • Griggs v. Duke Power Co., 401 U.S. 424 (1971)

Court Decisions Affecting Test Development and Usage: Griggs v. Duke Power Co. (1971)

Happy Black History Month! As you know, every February is reserved as Black History Month, a time for Americans to remember and reflect on important people and events in African American history. As employment test developers and test users, we can especially benefit from reflecting on the civil rights movement, which culminated in the Civil Rights Act of 1964. After all, the Civil Rights Act of 1964 and subsequent litigation have set strict legal and professional guidelines regarding tests and adverse impact that must be adhered to with care by employment test developers and test users alike.

Today, we’ll review the case of Griggs v. Duke Power Co. (1971), a major court decision that affects test development and usage.

Griggs v. Duke Power Co. (1971)

Duke Power Co. was a public utility company serving the general public in North and South Carolina. The company was challenged for putting in place a policy that inhibited African Americans from transferring out of the labor department (the department with the lowest-paying jobs in the company) into other positions.

Duke Power Co. imposed the following qualifications for placement in any section of the company apart from the lowest-paying labor department:

  1. Candidates must possess a high school education (a requirement instituted in 1955).
  2. Candidates must achieve satisfactory scores on the Wonderlic Personnel Test and the Bennett Mechanical Comprehension Test (a requirement added in 1965, on the day Title VII took effect).

Approximately six years after the Civil Rights Act was signed into law, Willie Griggs and several fellow African American Duke Power Co. employees filed a class action suit against the company over its transfer policy. Griggs and his co-workers argued that the policy unfairly discriminated against African American employees and therefore violated Title VII of the Civil Rights Act of 1964.

Cognitive ability and aptitude tests like the Wonderlic and Bennett exams are known to have adverse impact against protected groups. Additionally, in the 1970s, only about 30 percent of African Americans had completed four years of high school (Educational Attainment by Race and Hispanic Origin, n.d.).
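
As context for how adverse impact is typically assessed: the Uniform Guidelines (1978) later operationalized the concept with the “four-fifths rule,” under which a protected group’s selection rate that is less than 80 percent of the highest group’s rate is generally regarded as evidence of adverse impact. Below is a minimal sketch of that check in Python; the applicant and pass counts are invented for illustration, not figures from the case.

```python
def adverse_impact_ratio(passed_focal, applied_focal,
                         passed_ref, applied_ref):
    """Compare the focal group's selection rate to the reference
    group's. Ratios below 0.80 suggest adverse impact under the
    Uniform Guidelines' four-fifths rule of thumb."""
    focal_rate = passed_focal / applied_focal
    reference_rate = passed_ref / applied_ref
    return focal_rate / reference_rate

# Hypothetical counts for illustration only.
ratio = adverse_impact_ratio(passed_focal=12, applied_focal=60,
                             passed_ref=45, applied_ref=90)
print(f"Impact ratio: {ratio:.2f}")  # 0.40, well below 0.80
```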

The Decision: The Court ruled that Duke Power Company’s transfer policy did in fact unlawfully discriminate against African Americans. It is important to note that the Court did not base this decision on adverse impact alone: although the exams used were professionally developed, the Court ruled that “neither [test] was directed or intended to measure the ability to learn to perform a particular job or category of jobs.” In other words, neither exam was shown to be related to any particular position in the company. The Court thus set the prerequisite for all selection procedures: they must be job-related.

The Implications: Unsurprisingly, most public sector agencies today use job-specific cognitive or knowledge-based tests instead of general cognitive ability tests such as the Wonderlic (Sproule, 2009). Test users benefit from job-specific tests, and if a test from a test development company such as IPMA-HR is administered, it is important to ensure that the test assesses competencies needed to perform the job effectively at your agency.

A good starting point for this is to compare the job analysis your agency conducted for the position to the job analysis conducted by the test development company. For more information on job analysis or gauging the job-relatedness of a stock test within your agency, please feel free to email me or comment on this post.
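
One rough way to run that comparison is to check how much of your agency’s KSAP list the vendor’s job analysis actually covers. Here is a minimal sketch; all KSAP names are invented for illustration, and a real comparison would involve careful judgment about whether differently worded KSAPs are truly equivalent.

```python
# Hypothetical KSAP lists from the agency's and the vendor's analyses.
agency_ksaps = {"report writing", "map reading", "use of force law"}
vendor_ksaps = {"report writing", "map reading", "customer service"}

shared = agency_ksaps & vendor_ksaps
uncovered = agency_ksaps - vendor_ksaps  # agency needs the test misses
extra = vendor_ksaps - agency_ksaps      # tested content the job may not need

coverage = len(shared) / len(agency_ksaps)
print(f"Coverage: {coverage:.0%}")
print(f"Not covered by the test: {uncovered}")
print(f"Tested but not in the agency's job analysis: {extra}")
```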

For test developers, the job-relatedness component should be embedded in the test design. IPMA-HR’s Assessment Services Department starts each test development project with a national job analysis as the first step in providing support for job-relatedness. Please check out our website for information on current test development opportunities.

Sources

Civil Rights Act of 1964, 42 U.S.C.

Educational Attainment by Race and Hispanic Origin. (n.d.). Retrieved from https://www.census.gov/prod/99pubs/99statab/sec04.pdf

Griggs v. Duke Power Co., 401 U.S. 424 (1971)

Sproule, C.F. (2009). Rationale and Research Evidence Supporting the Use of Content Validation in Personnel Assessment: A monograph of the International Personnel Assessment Council. International Personnel Assessment Council (IPAC).


Calling All Police Departments: Send in your Police Officer and Police Lieutenant Job Descriptions

IPMA-HR’s Assessment Services Department is developing new entry-level Police Officer and promotional Police Lieutenant exams. The first step in the test development process is the development of a job analysis questionnaire to determine the most important tasks, duties and qualifications needed to perform successfully as a police officer or lieutenant. To compile a list of duties and qualifications for use in the job analysis questionnaire, we are requesting that police departments submit any job descriptions for the ranks of police officer and police lieutenant. All submitted materials will remain completely confidential. Please forward job descriptions to yrandall@ipma-hr.org. Thank you!


Office of Fire Prevention/Fire Marshal Employees – Your Insights are Requested!

IPMA-HR is currently seeking participants for a nationwide Office of Fire Marshal/Office of Fire Prevention study. This study is the first step in the development of a new series of tests for positions within the Fire Marshal’s Office/Office of Fire Prevention.  In this phase, job incumbents are needed to complete a survey to identify the most important tasks and knowledge, skills, abilities, and personal characteristics (KSAPs) required to perform their job successfully. The questionnaire takes approximately 30 minutes to complete.

Examples of applicable positions include: 

  • Fire Marshal
  • Deputy Fire Marshal
  • Assistant Fire Marshal
  • Fire Inspector
  • Fire/Arson Investigator
  • Fire Prevention Officer
  • Code Enforcement Officer

If you or someone you know holds one of the above positions or a similar position, we would greatly value your input in developing our newest test series.

Please Note: If your position has a different title than the positions listed above, but is similar in nature, we would still like to hear from you!

Participants will be entered into a raffle with a $500 prize! Participating agencies will also receive a 15% discount toward a future IPMA-HR assessment product purchase. Interested parties can learn more about the project and complete an interest form using the following link:

https://www.ipma-hr.org/assessment-services/about-test-development/test-development-opportunities/office-of-fire-marshal-fire-prevention-project

If you would like to participate or have additional questions, please email our Research Associate, Julia Hind-Smith, at jsmith@ipma-hr.org.


Participants Needed for Office of Fire Marshal Study

IPMA-HR is currently seeking participants for a nationwide Office of Fire Marshal/Office of Fire Prevention study. This study is the first step in the development of a new Fire Marshal Test. The first part of the study involves surveying current members of the Office of Fire Prevention to learn about the important duties and demands of their job.

Examples of applicable positions include, but are not limited to: Fire Marshal, Fire/Arson Investigator, Premise Officer, Deputy Fire Marshal, Fire Prevention Officer, Fire Inspector, and Code Enforcement Officer.

If you or someone you know holds one of the above positions or a similar position, we would greatly value your input in developing our newest test series.

Participants will be entered into a raffle with a $500 prize! Participating agencies will also receive a 15% discount toward a future IPMA-HR assessment product purchase. Interested parties can use the following link to participate:

Fire Marshal Study

Questions can be emailed to our Research Associate, Julia Hind-Smith, at jsmith@ipma-hr.org.


Job Analysis – What’s New?

What’s new in job analysis? A cynic might reply, “very little.” However, such a conclusion would lead to a very short blog and, more importantly, would not be accurate. Despite the foundational nature of job analysis, there have been some recent developments worth sharing.

Consensus on Recommended Practices

Although it is still true that neither the Uniform Guidelines nor the courts show a preference for any specific method of job analysis, a professional consensus has begun to emerge around recommended practices, driven in part by regulatory agencies’ demands for documentation. The associated principles can be expressed as follows. A job analysis should:

  • Be task-based. Despite continued mention of worker-oriented approaches, including the emergence of competency models, the job description should be task-oriented, including detailed listings of tasks and associated knowledge, skills, abilities, and personal characteristics (KSAPs).
  • Identify linkages. The identification and measurement of linkages between tasks and KSAPs is critical.  When job analysis is used in test development, it is equally important to establish linkages between the KSAPs and the test content.
  • Utilize interviews and focus groups. The appropriate use of interviews or focus groups remains important in obtaining job information from incumbents and supervisors.
  • Incorporate questionnaires. Where practical, with practicality primarily a function of the number of incumbents and the quality of the information obtained from the interviews, questionnaires should be used to gather quantitative ratings of tasks, KSAPs, and linkages. The collected data can then be subjected to statistical analysis (a minimal sketch follows this list). Technological developments, including the widespread availability of easy-to-use online survey software, have made it much simpler and more cost-effective to create and distribute job analysis instruments. In designing surveys, practitioners should be aware of the now ubiquitous nature of smartphones: large matrices of the type so frequently used to collect job ratings do not translate well to small screens, so analysts must be creative in designing surveys when incumbents will respond using mobile devices, including tablets and smartphones.
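
Here is a minimal sketch, in Python, of the kind of statistical summary those questionnaire ratings feed into. The tasks, ratings, and retention cutoff are invented for illustration; real studies set retention criteria through professional judgment, not a single hard-coded threshold.

```python
import statistics

# Hypothetical incumbent ratings of task importance on a 1-5 scale.
task_ratings = {
    "Conduct building inspections": [5, 4, 5, 5, 4],
    "Prepare incident reports":     [4, 4, 3, 5, 4],
    "Order office supplies":        [2, 1, 2, 1, 2],
}

CUTOFF = 3.0  # illustrative minimum mean importance to retain a task

for task, ratings in task_ratings.items():
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)  # high sd flags rater disagreement
    status = "retain" if mean >= CUTOFF else "drop"
    print(f"{task}: mean={mean:.2f}, sd={sd:.2f} -> {status}")
```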



Making Use of a Job Analysis Outside of Test Development

This is the third article in a three-part series on job analysis. We have covered the fundamentals of job analysis, and we have reviewed a report prepared by IPMA-HR to illustrate the role of a Human Resources Analyst in evaluating the work of test developers and consultants. In particular, it is important to recognize that, as an HR professional, you may not be personally responsible for creating job analysis procedures, writing tests and conducting validation studies, but knowing how they are done allows you to play this key role for your agency. Even if you hire a test developer or consultant, you may be asked to assist in the process, and understanding how job analyses are done will prove valuable to you in that role as well.

In the first article, I stressed that a thorough job analysis is the foundation for most of the technical work performed in Human Resources. As we have already seen, a job analysis is critical for developing content valid selection instruments, which should be the heart of your recruiting and selection program. As if that were not sufficient reason for conducting job analyses, the information obtained from thoroughly analyzing the jobs in your agency can also support your training program, your classification and compensation program, your performance evaluation program, and disciplinary action and remediation, and it can serve as the basis for transportability studies, as discussed in the previous article.


Utilizing a Job Analysis to Create Content Valid Selection Instruments

As indicated in the first article in this series, a thorough job analysis should be the foundation for most of the technical work performed in Human Resources. We also discussed that while analysts in the field today may not necessarily need to be able to design their own job analysis systems and create written exams from the results, they should have an understanding of the process and the ability to recognize whether or not products and vendors meet professional standards and can stand up to court scrutiny.

Our focus, as suggested above, will be the use of job analyses to create content valid selection instruments; other uses of job analysis results will be discussed in the next article. In addition, it is important to stress that the Uniform Guidelines on Employee Selection Procedures (UGESP, 1978), along with the Society for Industrial and Organizational Psychology (SIOP) Principles for the Validation and Use of Personnel Selection Procedures (Principles, 2003), are still the guiding documents for determining the adequacy of content validation procedures (see the references at the end of this post). It is also important to note that the UGESP (1978) do not apply only to written exams, but to all selection instruments, including oral exams, physical fitness tests, background investigations and one-on-one hiring interviews.


There Is Nothing Mystical About A Good Job Analysis

When I started in Human Resources, the Uniform Guidelines on Employee Selection Procedures (UGESP, 1978) had just been adopted by the Civil Service Commission, the Department of Labor, the Equal Employment Opportunity Commission and the Department of Justice. These guidelines spelled out the requirements for demonstrating that selection procedures had content validity, criterion-related validity and/or construct validity.

The adoption of the Guidelines was followed by a wave of class action lawsuits, filed primarily by the Department of Justice against public safety agencies, alleging illegal discrimination in hiring procedures and failure to demonstrate the validity of the instruments being used for selection. This created a demand for individuals with test development and validation experience to assist public sector agencies in developing and validating new selection instruments; some agencies instead chose at the time to opt out of using written exams altogether.

The hiring of new “Personnel Analysts” focused on individuals with backgrounds in research, statistics and testing. Training for new analysts focused on test writing and validation, which had at its heart the development of job analysis procedures. Job analyses that were intended to serve as the basis for developing content valid selection procedures had to be designed to withstand the rigorous scrutiny of the Department of Justice. (Indeed, the chief writer of the Guidelines admitted to me that they were designed to tip the tables in the government’s favor when it came to litigation.) In addition, while the Guidelines contained a section outlining in detail the requirements for demonstrating content validity, the Department of Justice team of attorneys responsible for litigating most of its cases had a distinct bias toward criterion-related validity and held content validity in low regard. That is, even if an agency followed the Guidelines to demonstrate that its selection instruments were developed to “build in” their validity (content validity), the DOJ would tend to pick its analyses and studies apart if the agency did not also demonstrate a statistical correlation between test performance and a job-related criterion, such as job performance (criterion-related validity).
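
Since criterion-related validity ultimately comes down to that correlation, here is a minimal sketch of the underlying computation. The paired scores are hypothetical, and a real validation study would also have to address sample size, range restriction, criterion reliability, and statistical significance.

```python
def pearson_r(xs, ys):
    """Pearson correlation: here, the criterion-related validity
    coefficient between test scores and job performance ratings."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical paired data: written exam scores and supervisor
# performance ratings for the same incumbents.
test_scores = [72, 85, 90, 64, 78, 88, 70, 95]
performance = [3.1, 4.0, 4.2, 2.8, 3.5, 4.1, 3.0, 4.6]

print(f"Validity coefficient r = {pearson_r(test_scores, performance):.2f}")
```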
