IPMA-HR’s Assessment Services Department Launches New Electronic Scoring Service

IPMA-HR’s Assessment Services Department piloted a new electronic scoring service in December 2019. The goal of the pilot was to determine whether the new service should permanently replace the old one. The results were overwhelmingly positive, with test security agreement signers applauding the following features:

Electronic Scoring Instructions
  • Eliminates Shipping Costs. The new electronic scoring service eliminates shipping costs associated with mailing the answer sheets – you can now upload them electronically!
  • Results in 1 Business Day. Now that you do not have to mail answer sheets, you can enjoy a significantly shorter turnaround time to getting your results. Results are delivered within one business day of receiving the answer sheets.
  • Easy-to-Read Reports. The new electronic scoring service provides easy-to-interpret score and test reports and assists with IPMA-HR’s collection of national test response data.

Utilizing the new electronic scoring service also makes it easier for your agency to participate in future criterion-related validation studies conducted by the testing experts at IPMA-HR.

Although we gave the electronic scoring service a major upgrade, we kept the price the same: $50.00 + $0.50 per answer sheet. Be sure to ask your assessment services representative about adding electronic scoring to your next test order!

January 24, 2020 | Test Administration, Test Scores

Want to see the best videos ever?

They may not be of a baby panda sneezing, but they’re still pretty awesome! Take a few minutes to check out our new “everything you wanted to know about IPMA-HR’s assessment products in two minutes” video and our new “everything you wanted to know about IPMA-HR’s fire service products in just over a minute” video. You won’t regret it. We promise, it’s better than “CATS.”

https://www.youtube.com/watch?v=U7Ygha7elbQ&t=7s

https://www.youtube.com/watch?v=yPUuGX8e9kg

November 14, 2019 | Announcements, Assessment, Products & Services, Public Safety Testing, Public Safety Tests, Resources

The 2019-2020 Catalog is Available!

Woohoo! The new Assessment Services catalog is available online for viewing, or download your very own copy! What’s new? SO MUCH! I can’t even tell you how cool it is. You really should see for yourself.

Well, okay, here’s a sneak peek: new tests (paging Public Works …), new services (I can take my test from where?!), new and updated publications (setting that passpoint like a pro now!) and new ways to communicate with you (your awesome sense of humor is now available on social media?!). Oh, my!

Check it out! Tell your friends! Tell your dog! Tell your friends and your dog. It’s that awesome.

June 17, 2019 | Announcements, Products & Services, Public Safety Testing, Public Safety Tests, Test Administration

Exciting News: New Answer Sheets and Data Collection Method

Attention hand-scoring customers: You will notice new and improved candidate answer sheets and scoring stencils in your test orders starting June 5, 2019! The most notable change to the answer sheets, besides minor layout improvements, is the collection of demographic information directly on the answer sheets. The Assessment Services Department recently made these changes to facilitate test response data collection and to make the submission process easier for customers. Gone are the days of having to physically mail your test response data back to IPMA-HR (unless you want to). You can now conveniently scan, upload, and send your completed and ungraded candidate answer sheets securely online to IPMA-HR’s Assessment Services Department!

Why share scores from your test administration with the Assessment Services Department?

Have you ever called the Assessment Services Department and asked for guidance on setting cut scores or for national data on adverse impact? As a member of IPMA-HR’s Assessment Services Department, I can personally attest that we receive these calls all the time! I am always happy to be able to share national data with customers to help guide their decisions regarding setting cut scores, investigating/addressing adverse impact, and handling item challenges.

As test developers and test users, we have a legal responsibility to make fair assessments and selection decisions. IPMA-HR’s Assessment Services Department is dedicated to helping our customers make fair decisions. To assist with this process, we routinely collect test score and demographic data for each of our tests from our customers. Collecting this information allows us to run adverse impact analyses on our exams; the results are then provided to our test users in the form of a Test Response Data Report, which includes the score frequency distribution and adverse impact data. It is our hope that these reports can help inform our users’ selection decisions. However, we would not be able to compile and share them without your help! These reports exist thanks to the agencies that choose to voluntarily submit their candidate data from each test administration. All data is held completely confidential and reported only in the form of group statistics.
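To make the group-statistics idea concrete, here is a minimal sketch of the two summaries such a report contains: a score frequency distribution and per-group pass rates. The candidate records, group labels, and cut score below are invented for illustration and are not IPMA-HR data or recommendations.

```python
from collections import Counter

# Hypothetical candidate records: (score, demographic group).
# These values are invented for illustration only.
candidates = [(88, "A"), (92, "B"), (75, "A"), (75, "B"), (60, "A"), (81, "B")]
CUT_SCORE = 70  # example passpoint, not an IPMA-HR recommendation

# Score frequency distribution: how many candidates earned each score.
freq = Counter(score for score, _ in candidates)

# Group pass rates: fraction of each group at or above the cut score,
# reported only as group statistics (individual records are never published).
groups = Counter(g for _, g in candidates)
passes = Counter(g for score, g in candidates if score >= CUT_SCORE)
pass_rates = {g: passes[g] / groups[g] for g in groups}

print(sorted(freq.items()))  # → [(60, 1), (75, 2), (81, 1), (88, 1), (92, 1)]
print(pass_rates)
```

Nothing here identifies an individual candidate, which is the point: only the aggregate distribution and group rates leave the agency.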

How can I submit data from my agency’s test administration?

Be on the lookout for the new Test Response Data Submission forms in your test orders, which contain important instructions for submitting your data either electronically or by mail. As always, if you have any questions, please feel free to contact the Assessment Services Department: assessment@ipma-hr.org or 1-800-381-TEST (8378). Thank you!


June 10, 2019 | Adverse Impact, Housekeeping

Public Safety Voices | Sheriff Joel Merry

The opioid epidemic and succession planning are my two top concerns right now.

In Sagadahoc County, the opioid epidemic is of real concern. There are a lot of issues related to it that require a great deal of our time, from the number of calls we take regarding overdoses, to investigators working on trafficking cases, dealing with the number of people in our jails who are addicted and getting folks into treatment and recovery – it’s a lot for any department.

Sagadahoc County Sheriff Joel A. Merry | Image: Bangor Daily News

Succession planning is my top human resources issue. Retirement is the main reason we’re losing people, though we recently lost two transport deputies to the private sector. On the patrol side, we’ve had a very stable workforce, but we do have some pending retirements. This concerns me due to what my fellow law enforcement administrators are going through with recruiting. It’s a real concern. When I started my career in law enforcement 35 years ago, it was so competitive that I didn’t get hired on my first two attempts.

We’re answering the call with life-saving aid, aggressive tactics, education and recovery.

Sagadahoc County was one of the first agencies in Maine to start carrying Narcan. We did this because we serve a lot of small rural communities where EMS can be 15-30 minutes away. Having a deputy with an AED and Narcan can save a life.

The other thing we’ve done is get more aggressive on the law enforcement side trying to eradicate the traffickers and educate the public. We’ve trained patrol in interdiction strategies and we work closely with MDEA (Maine Drug Enforcement Administration) on public awareness, as well as some diversion tactics.

The tactic I’m most proud of is one where we connect people with a recovery coach and group counseling. Our programs deputy carries a caseload of 7-15 folks who are required to check in every night and meet with him face-to-face once a week. They are also subject to random drug testing and need to be employed or looking. We want to hold them accountable. It’s another level of probation and provides additional support to the probation office to help keep them on the right track.

Thinking ahead, we’re providing leadership training, adding specializations, and performing youth outreach.

Everyone who applies for a promotional position gets to attend a leadership training program through Granite State Police Career Counseling. It consists of a one-day leadership course and a three-day course on supervision, teaching them what supervision means within an agency and how their role will change as a supervisor.

Sagadahoc County Sheriff’s Dept. attending a You Matter event at Woolwich Central School. The officers greet students as they arrive.

We’re also adding in some specializations to build skills and to help keep our deputies interested in this agency. To that end, we recently restarted a K-9 program, and we have another deputy who is specializing in accident reconstruction.

We have done some youth outreach, including Project ALERT, which is similar to DARE. Prevention work is something I would like to do more of. I’ve toyed with the idea of a visitation program such as deputies stopping by schools to say hello and have lunch with the kids.

9/11 was a defining moment in my career.

One of the defining moments of my career came when I was a lieutenant with the Bath Police Department. My chief at the time was away attending the FBI academy, which corresponded with 9/11. A lot of things were fast moving. There were so many unknowns: are we a target, are we next?

The USS Zumwalt at Bath Iron Works. Image: Bangor Daily News

In Bath we have Bath Iron Works, which is a major U.S. shipyard and producer of naval ships. We had a lot of protocols around that – we had to provide guards 24/7 to protect the military assets. We were working very closely with neighboring law enforcement departments and built strong partnerships during that time.

Working with other agencies in both the private and public sector, I had to learn a lot of communication skills very quickly and make sure information was being shared — that I was communicating with all stakeholders. I had to focus. It provided me with insight into what leadership needs to be: As a leader, you have to be thinking about the now and what happens tomorrow at the same time.

We care.

What is the one thing I’d want our community to know about law enforcement? We care. We really do care about the health and well-being of our community.

Our communities are a great place to live, work and play, and as members of law enforcement, we work hard to keep them as safe as possible so people can live without fear and enjoy their lives.

-Sheriff Joel Merry, Sagadahoc County Sheriff’s Office, Maine

Court Decisions Affecting Test Development and Usage: Ricci v. DeStefano (2009)

At last, we have reached the end of the employment law series. It is important to note that the four cases discussed in this series are in no way exhaustive of all the legal cases that exist with implications for test development and usage. They are simply the major and most often referenced cases.

As test users, if you ever find yourself asking, “Am I allowed to do this?” before, during or after a test administration, chances are there are legal cases and/or professional guidelines that can answer your question. If you’re still feeling uneasy, try asking the test developer or your legal team for advice. The most common questions I personally receive are:

Although this exam was designed for X position, could I also use it for a similar Y position?

Where should I set my cut score?

How should I determine who passes the exam?

Test validation can be a confusing topic, and it is my hope that this series of posts has clarified the purpose and importance of test validation for test users and developers. I like to think of test validation as the process of gathering evidence that your test is job-related. What does “job-related” mean and how do you gather this evidence? This is where the court cases and legal/professional guidelines really come in handy. For example, the case of Washington v. Davis (1976) involved debate over the strategy used to provide support for job-relatedness; reviewing the Court’s decision in that case can help provide insight into what is likely to be accepted in court. Other useful resources are listed at the end of this post, but for now, let’s get started with Ricci v. DeStefano (2009).

Ricci v. DeStefano, 557 U.S. 557 (2009)

New Haven, Connecticut’s Fire Department required all firefighters seeking promotion to captain or lieutenant to pass a written examination that was developed and validated by a test development company. The Black and Hispanic/Latino firefighters received significantly lower scores on the exams than the White candidates, and none of the minority test takers would have been eligible for promotion based on their scores. The minority firefighters threatened to sue the city of New Haven if it used the results to make promotional decisions and demanded that the city invalidate the test on the basis of adverse impact. Fearing litigation, the city complied: the test results were discarded, and no candidates were promoted based on them.

The White firefighters who would have received promotions based on their passing scores objected to the city’s decision to invalidate the results, arguing that it was unfair because the exam had been properly validated and administered. Their case eventually reached the U.S. Supreme Court.

The Decision: The U.S. Supreme Court ruled that although the exam results showed disparate impact against Black and Hispanic/Latino firefighters, it was unlawful for the city of New Haven to discard the results, because the exam had been properly validated and administered.

The Implications: Needless to say, this court decision was controversial and perhaps confusing, so let’s try to clear some things up. First, the Court made it clear that adverse impact alone does not indicate that a test is invalid or that it should not be used for hiring or promotional purposes. This is important because “adverse impact” and “disparate impact” are terms that cause many test users to shudder due to their negative connotation. Refusing to use the results of an examination that has sufficient evidence to support its use, simply out of fear of adverse impact litigation, is therefore unwise; instead, additional steps need to be taken.

The Court also made clear that when adverse impact is found, the burden of proof shifts to the test user (the city of New Haven) to defend the use of the test as a “business necessity.” That is, the city of New Haven needed to prove that they needed to use the test to be able to properly determine who is qualified and eligible for promotion. Once business necessity has been proven, the city then needs to prove that they investigated alternate means to make the promotional decision that could have achieved the same result without adverse impact. If no alternate means (or alternate test) were found that could achieve the same result without adverse impact, then the Court is likely to rule in favor of the test user.

Test developers and test users should attempt to conduct adverse impact analyses when data permits. Test developers should make this information available to test users, and test users would also benefit from submitting their candidate data to the test development company. These analyses will help to first determine if adverse impact has historically been indicated. If your agency will be using an examination that is known to have adverse impact, be sure to maintain the evidence you need to prove that using the exam is a business necessity (job analysis information, technical reports, etc.). Test users are also encouraged to investigate alternate testing means before making their final decision on which test to use; doing so also provides support for using your chosen exam.
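The basic screening computation behind these analyses is the four-fifths (80%) rule from the Uniform Guidelines on Employee Selection Procedures (1978), which compares each group’s selection rate to the highest group’s rate. A minimal sketch follows; the group names and counts are hypothetical example data, and the 0.80 threshold is a screening heuristic, not legal proof of discrimination.

```python
# Illustrative four-fifths (80%) rule check from the Uniform Guidelines (1978).
# Group names and applicant/passer counts below are hypothetical examples.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants who passed / were selected."""
    return selected / applicants

def adverse_impact_ratios(rates: dict[str, float]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate.
    A ratio below 0.80 flags potential adverse impact for follow-up."""
    highest = max(rates.values())
    return {group: rate / highest for group, rate in rates.items()}

# Hypothetical administration: (passers, applicants) by group.
data = {"Group A": (48, 100), "Group B": (30, 100)}
rates = {g: selection_rate(p, n) for g, (p, n) in data.items()}
ratios = adverse_impact_ratios(rates)

for group, ratio in ratios.items():
    flag = "below 4/5ths threshold" if ratio < 0.80 else "ok"
    print(f"{group}: rate={rates[group]:.2f}, ratio={ratio:.2f} ({flag})")
```

In this invented example, Group B’s rate (0.30) is only 62.5% of Group A’s rate (0.48), so the check would flag the administration for the kind of business-necessity and alternate-means review discussed above.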

References

Ricci v. DeStefano, 557 U.S. 557 (2009)

Other useful resources

SIOP’s Principles for the Validation and Use of Selection Procedures (2019)

APA, AERA, and NCME’s Standards for Educational and Psychological Testing (2014)

Uniform Guidelines on Employee Selection Procedures (1978)

Adverse Impact: Implications for Organizational Staffing and High Stakes Selection (2010) (Edited by James L. Outtz)

March 27, 2019 | Adverse Impact

Court Decisions Affecting Test Development and Usage: Washington v. Davis (1976)

It’s Friday, and I’m sitting at my desk singing “Woahhh, we’re halfway there. Woahhh! Livin’ On a Prayer.” Why? Because we are halfway through the employment law series! In the first two posts, we reviewed the cases of Griggs v. Duke Power Co. (1971) and Albemarle Paper Co. v. Moody (1975) and brushed up on the important decisions that affect test development and usage. For the final two posts of this series, we will review the cases of Washington v. Davis (1976) and Ricci v. DeStefano (2009). Let’s get started!

Washington v. Davis, 426 U.S. 229 (1976)

The Metropolitan Police Department of the District of Columbia (MPD) required all applicants seeking entry into the police academy to pass an exam intended to assess cognitive traits such as verbal ability, vocabulary and reading comprehension. At their set cut score, approximately four times as many Black applicants failed the test compared to Whites. African American test-takers who failed the exam at the set cut score then sought judgment that the exam unlawfully discriminated against Black applicants.

The case first went to two lower courts for verdict. The District Court initially ruled that the exam was “directly related to a determination of whether the applicant possesses sufficient skills requisite to the demands of the curriculum a recruit must master at the police academy.” In other words, the District Court ruled that the exam had been validated and shown to be related to “the demands a recruit must master at the police academy.”

However, the DC Court of Appeals then ruled the opposite: “the adverse impact was sufficient to establish a constitutional violation, absent proof by petitioners that the test was an adequate measure of job performance in addition to being an indication of probable success in the training program.” In other words, the DC Court of Appeals ruled that it was not enough to show that the exam was related to the demands of the police academy, but also to the job itself.

The Decision: The Supreme Court made the final decision that the District Court was correct and the DC Court of Appeals erred. The test “was directly related to the requirements of the police training program and … a positive relationship between the test and training-course performance was sufficient to validate the former, wholly aside from its possible relationship to actual performance as a police officer.”

The Implications: In a nutshell, the conflict between the District Court and the Court of Appeals concerned the appropriateness of the validation strategy used. When candidates challenge hiring decisions based on disparate impact, the courts usually review the test development and validation strategy with great scrutiny to investigate the job-relatedness of the exam.

MPD attempted to provide support for the validity or job-relatedness of the exam by presenting the significant relationship found between exam scores and police academy performance. The Court of Appeals challenged this validation strategy as inappropriate on its own due to the criterion measure of “police academy performance.” Instead, the Court of Appeals argued that a relationship needed to be established between exam scores and on-the-job performance. Ultimately, the Supreme Court ruled that the validation strategy was appropriate and the relationship between exam scores and police academy performance was sufficient to provide support for job-relatedness.

The implications for test developers are huge! The criteria used in validation studies are not limited to job performance. You do not absolutely have to establish a relationship between the exam and job performance for it to be valid or constitutionally acceptable. This is great news for test developers because adequate measures of job performance are not always available, and not being limited to this sole criterion measure allows many other avenues for investigating the relationship between test scores and aspects of the job.

Test users should be sure to read the test’s technical report for details on any studies investigating the relationship between the exam and aspects of the job. Look at the type of job aspects used in the studies (e.g., job performance, academy scores, training performance) and how they were measured (e.g., supervisory ratings of performance, attendance records, negative customer service reports) to see if you measure the same job aspects in your agency. This information can help you gauge the appropriateness of using the exam at your agency.

References

Washington v. Davis, 426 U.S. 229 (1976)

March 8, 2019 | Adverse Impact | 1 Comment

Court Decisions Affecting Test Development and Usage: Albemarle Paper Co. v. Moody (1975)

Test developers and test users alike need to be informed of the legal and professional guidelines that govern employment testing. Why? Stay tuned: at the end of this series (read part one), I will present specific cases where agencies violated these guidelines and what it cost them.

Albemarle Paper Co. v. Moody is a case that perfectly illustrates the cost of cutting corners.

When I think of the lesson learned in this case, it takes me back to the day I attempted to cut corners on my math homework. My Calculus teacher had suspected for a while that most people in the class were not actually attempting the homework calculations and simply using a calculator. One day, I guess he got fed up and made an announcement when handing out the nightly homework that we must “show your work.” Despite his instruction, me being the busy teenager that I was (joking), I attempted to cut a corner to finish my math homework faster by using the calculator instead of showing my work. The next morning when the teacher came around to collect homework, he explicitly stated that anyone who didn’t show their work this time would get a zero.

“We’ll see about that,” I thought. I quickly tried working out the problems before my teacher arrived at my desk to collect my homework. I was sweating and, perhaps unsurprisingly, I did not finish in time. My teacher caught what I was doing, and I received a much-deserved zero. Needless to say, I’ve avoided cutting corners ever since.

The case of Albemarle Paper Co. v. Moody is similar in that it involves cutting corners. Just as cutting corners did not turn out well for me (or my math grade), cutting corners definitely did not turn out well for Albemarle Paper Co.

Albemarle Paper Co. v. Moody (1975)

Prior to the passage of the Civil Rights Act of 1964, Albemarle Paper Co. (Roanoke Rapids, NC) explicitly excluded Black employees from high-skilled and higher-paying positions. This is similar to the case of Griggs v. Duke Power Co. (1971) in that Black employees were restricted to the lower-paying departments. Of course, with the passage of the Civil Rights Act of 1964, this practice became illegal, and Albemarle Paper Co. permitted Black employees to transfer to the high-skilled positions or lines of work if they passed the Wonderlic Test and another intelligence test called the Beta Examination.

Albemarle employees who were already in a certain line of work or a high-skilled department were not required to take or pass the exam to retain their jobs or qualify for promotions within the same department. Despite Albemarle removing the exclusion of Black employees from high-skilled, higher-paying positions, Black employees were still not able to secure these promotions or positions due to their not being able to pass the exams at the set cut score, which was based on national norms.

Approximately 10 years after the Civil Rights Act was signed into law, several present and former African American Albemarle Paper Co. employees filed a suit against the company on the basis that the testing procedure resulted in disparate impact against minorities.

Throwback to Griggs v. Duke Power Co.

If you’ve learned anything from the case of Griggs v. Duke Power Co. (1971), you should already be thinking, “I hope Albemarle Paper Co. did a job analysis to support the job-relatedness of the test. The law states that you cannot test for qualifications that are not job-related.”

Well, I think Albemarle Paper Co. was thinking that, too. Four months before the trial, the company paid a professional to validate the exams. This person attempted to lump positions together based on their level in the career progression lines but did not attempt to analyze the individual jobs in terms of the specific skills they require. Essentially, the company attempted to show that the exam could be used across multiple positions, as it was being used, without analyzing the specific skills needed for each position.

The Decision: The Court ruled the validation method unacceptable and condemned the company’s decision to validate the exam only after it was being sued. The Court also explicitly stated, “A test may be used in jobs other than those for which it has been professionally validated only if there are no significant differences between the studied and unstudied jobs.” Again, the professional hired by Albemarle Paper Co. to validate the exams did not investigate whether necessary skills differed among job groups, and therefore did not have sufficient evidence that the exam was suitable for use across jobs.

The Implications: This case re-emphasized the importance of employment examinations being job-related, a requirement first mandated in Griggs v. Duke Power Co. (1971). This case also emphasizes the importance for test developers and test users to conduct/coordinate job analysis and validation studies to provide support for job-relatedness. Test users are strongly encouraged to read the technical report that accompanies each exam and the results of the job analysis the exam is based on, which will help determine if the exam assesses skills that are necessary for a particular position in your agency.

Test developers should be careful to link the developed examination to the required skills of the position and to document this process thoroughly (see job analysis for more information).

Author’s Note: The case of Albemarle v. Moody resulted in several other decisions that affect back pay awards. Those decisions were not discussed in this post, but for more information, please see the references below. 

References

  • Albemarle v. Moody, 422 U.S. 405 (1975)
  • Civil Rights Act of 1964, 42 U.S.C.
  • Griggs v. Duke Power Co., 401 U.S. 424 (1971)
February 25, 2019 | Adverse Impact, Job Analysis

Court Decisions Affecting Test Development and Usage: Griggs v. Duke Power Co. (1971)

Happy Black History Month! As you know, every February is reserved as Black History Month to encourage Americans to remember and reflect on important people and events in African American history. As employment test developers and test users, we would especially benefit from reflecting on the civil rights movement, which culminated in the Civil Rights Act of 1964. After all, the Civil Rights Act of 1964 and subsequent litigation have set strict legal and professional guidelines regarding tests and adverse impact that must be adhered to with care by employment test developers and test users alike.

Today, we’ll review the case of Griggs v. Duke Power Co. (1971), a major court decision that affects test development and usage.

Griggs v. Duke Power Co. (1971)

Duke Power Co. was a public utility company serving the general public in the Carolinas. The company was challenged for a policy that inhibited African Americans from transferring out of the labor department, the department with the lowest-paying jobs in the company, into other positions.

In 1955, Duke Power Co. imposed the following qualifications for placement in any section of the company, apart from the lowest-paying labor department:

  1. Candidates must possess a high school education.
  2. Candidates must achieve satisfactory scores on the Wonderlic Personnel Test and the Bennett Mechanical Comprehension Test.

Approximately six years after the Civil Rights Act was signed into law, Willie Griggs and several fellow African American Duke Power Co. employees filed a class action suit against the company for their transfer policy. Griggs and his co-workers argued that the policy unfairly discriminated against African American employees and, therefore, was a violation of Title VII of the Civil Rights Act of 1964.

Cognitive ability and aptitude tests like the Wonderlic and Bennett exams are known to have adverse impact against protected groups. Additionally, in the 1970s, only about 30 percent of African Americans had completed four years of high school (Educational Attainment by Race and Hispanic Origin, n.d.).

The Decision: The Court ruled that Duke Power Co.’s transfer policy did in fact unlawfully discriminate against African Americans. Importantly, the Court made this decision not simply because the transfer policy had adverse impact, but because the requirements were not shown to be job-related. Although the exams used were professionally developed, the Court ruled that “neither [test] was directed or intended to measure the ability to learn to perform a particular job or category of jobs.” In other words, neither exam was shown to be related to any particular position in the company. The Court thus set the prerequisite for all selection procedures: They must be job-related.

The Implications: Unsurprisingly, most public sector agencies these days use cognitive or knowledge-based tests that are job-specific instead of general cognitive ability tests such as the Wonderlic (Sproule, 2009). Test users benefit from using job-specific tests and, if a test from a test development company such as IPMA-HR is administered, it is important to ensure that the test assesses competencies important for effectively performing the job specifically at your agency.

A good starting point for this is to compare the job analysis your agency conducted for the position to the job analysis conducted by the test development company. For more information on job analysis or gauging the job-relatedness of a stock test within your agency, please feel free to email me or comment on this post.

For test developers, the job-relatedness component should be embedded in the test design. IPMA-HR’s Assessment Services Department starts each test development project with a national job analysis as the first step in providing support for job-relatedness. Please check out our website for information on current test development opportunities.

Sources

Civil Rights Act of 1964, 42 U.S.C.

Educational Attainment by Race and Hispanic Origin. (n.d.). Retrieved from https://www.census.gov/prod/99pubs/99statab/sec04.pdf

Griggs v. Duke Power Co., 401 U.S. 424 (1971)

Sproule, C.F. (2009). Rationale and Research Evidence Supporting the Use of Content Validation in Personnel Assessment: A Monograph of the International Personnel Assessment Council. International Personnel Assessment Council (IPAC).

February 12, 2019 | Adverse Impact, Job Analysis, Public Safety Testing | 2 Comments