Test Score Posting Policies in the Public Sector

My blog this month deals with what I believe is a complex question, one that requires deft consideration of the demands of multiple stakeholders and careful weighing of legal and ethical issues. I am speaking of how a public sector jurisdiction makes decisions regarding the posting of scores, both during and after the completion of an assessment or selection project.

As human resource and assessment professionals, we have to resolve the conflict among equally important values: transparency of feedback to test takers, the privacy rights and expectations of confidentiality held by job or promotional candidates, and the public’s right to know, along with the media’s right to information. Deciding how and what type of information to post can seem like a judgment worthy of Solomon, as the human resource professional must reconcile:

  • the public’s right to information, including possible public record laws;
  • the candidate’s desire for feedback and test score information; and
  • the right of the candidate to privacy and the candidate’s expectation that their scores will be handled in a confidential and sensitive manner.

In my opinion, one of the complicating factors is that the release and posting of test score information involves many considerations beyond simple psychometric and assessment issues. Some of the questions that need to be asked and answered include:

  • Are there federal or state laws that govern the release of public sector employment test information, as well as public records in general?
  • Are there local Civil Service regulations or rules?
  • Does the union contract specify how test results will be issued?
  • Are there past, relevant court decisions?
  • How have we done it in the past? What are the existing precedents?
  • What precedent, if any, do we want to create for future tests?

My own experience has been that every jurisdiction tends to make decisions regarding the posting and release of a candidate’s test and score information differently, even within a specific geographic area such as Northeast Ohio. I know of some cities that publicly post all the test score information for each candidate, while similar nearby cities post only the final rankings of the test takers.

If at this point you are starting to mumble to yourself, “I fear that Doverspike is not going to give us a simple answer in this blog,” you are correct. However, I am going to share with you some data from a survey conducted by IPMA-HR Assessment Services.

The Distributing Test Scores Survey

The survey was completed in November of 2013. There were 9 questions and a little over 100 respondents. I will present each question and let you first try to guess the response percentages; so please do not cheat ahead. Read each question and offer your own estimate of the results. I will offer my own guess, or what I would expect in terms of a response. Then I will provide the response percentages and offer additional commentary. If you can think of better rationales for the response distributions than I can, please feel free to share them with me by email.

Hopefully, we can all learn from the shared experiences of others; although, from my perspective, one of the most revealing features of the survey was the inconsistency between responses to seemingly related or overlapping questions. Still, I believe the contradictory nature of the responses can help to inform our discussion, and I will try to offer some clarifying comments.

To recap, this month’s blog deals with a question that is frequently directed to IPMA-HR Assessment staff: how, and in what form, should information or feedback on test scores or results be provided?

  1. Does your agency include information on test scoring procedures in the announcement for the job?

I will be honest and state that I would have expected most agencies to say yes, or at least somewhere around 75%. What was your guess? Well, the actual distribution was 36% “Yes” and 64% “No.” So, most agencies do not provide information on test scoring. Perhaps I was wrong because of how respondents interpreted “test scoring.” Or perhaps respondents used a narrow interpretation of the term “announcement” and instead publicize such information in packets handed out after the application process. In any case, I was surprised by this initial result, as I would think at least general information on test scoring would be contained in the job posting or announcement.

  2. Does your agency provide test scores to entry-level candidates?

This is an area where I would expect a deep divide between the public and private sectors. For private companies, the answer to this question would probably be about 80–90% “No.” For the public sector, our sample of interest, I expected most of the agencies to respond in the affirmative. In fact, 85% said “Yes,” which I have to admit was probably even higher than I expected. So, agencies do provide some type of notification of the results to entry-level candidates, although there is still the question of what type of score information is provided and how exactly it is provided. The questions that follow provide some clarification on this issue.

  3. Does your agency provide test scores to promotional candidates?

Again, I would expect very different responses from private organizations, but for the public sector I would expect a very high positive response, approaching 100%. Given that, I was surprised that the “Yes” rate was only 86%. As with Question 1, the response percentage may fall below 100% because of the interpretation of “test scores.” Comments received indicated that some agencies gave out rankings or pass-fail results but not scores; my guess is that some agencies regarded rankings and pass-fail notifications as scores, while other organizations defined test scores more narrowly as an actual numeric score.

  4. Does your agency send a letter or email to individual candidates with their test score?

Simple math tells me that, given the answers to Questions 2 and 3 above, I would expect about 85% of the respondents to send a letter or email, one or the other. But surprise, surprise, 42% do not. A letter was sent by 34%; an email was sent by 35%. This gives us a total of 69%, leaving 16% unexplained. So, why the inconsistency or discrepancy? If not by email or letter, how are results being communicated to the remaining 16% of candidates? This result is hard to explain because it clashes with the responses to Questions 2 and 3, but a strong possibility is that these agencies communicate the information in another manner, by phone or in person, which does not create a public record or document trail. In part, this may be to maintain the confidentiality of the candidate’s score. For successful entry-level candidates, any specific test feedback could be offered during an interview. For promotional candidates, the test scores could be given out in person, during a meeting with supervisors or managers. In a number of cases, agencies report scoring the test in front of the candidate and providing the test taker with immediate feedback on the test score.
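For readers who want the arithmetic spelled out, here is a quick back-of-the-envelope check. It is only a rough sketch, and it assumes the letter and email percentages do not overlap, which the survey does not tell us.

```
# Back-of-the-envelope check of the percentages discussed above.
# Assumption (not confirmed by the survey): the "letter" and "email"
# groups do not overlap, so their percentages can simply be added.
expected_yes = 85          # % saying they provide scores (Questions 2 and 3)
letter, email = 34, 35     # % sending a letter; % sending an email

written = letter + email              # 69% reached in writing, if no overlap
unexplained = expected_yes - written  # 16 percentage points unaccounted for

print(f"Written notification: {written}%; unexplained gap: {unexplained} points")
```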

  5. Does your agency post a listing of candidate scores for entry-level tests?

I would have expected more than half the agencies to respond in the affirmative. From my perspective, the results were quite unexpected: the overwhelming majority, 85% of the respondents, said “No.” Now, perhaps “post” was interpreted in a narrow sense, such as posting on a public bulletin board. Still, the response percentages are surprising. It seems clear that there is a tendency to keep scores private and confidential, probably to protect the candidates.

  6. Does your agency post a listing of candidate scores for promotional tests?

Again, I would have expected a majority of agencies to say “Yes,” though at a lower rate than for entry-level tests. Maintaining my record so far, I was wrong: seventy-two percent (72%) said “No.” So, we are beginning to see a pattern in which agencies provide test scores, but it is not clear how they are provided. For promotional test results, it would appear that, in the interest of protecting applicant and incumbent privacy, any communication of scores is done in a relatively personal fashion, or that there is limited posting of scores.

  7. Does your agency explain test scoring procedures the day of the test?

I do not think there are any surprises here, as 76% responded that they did.

  8. Does your agency inform candidates of their overall candidate selection process result and rank or standing?

The way I would interpret this question, I would think the number of “Yes” responses would be higher than for Questions 2 and 3, yet it was lower, with 75% indicating they informed candidates of the overall results. The tricky part of this question may be that it is double-barreled, referring to both the overall result, which I would take to mean score, and the ranking. One of the comments was particularly interesting, and I repeat it in part here: “… (entry-level candidates) either given an offer letter or a letter advising them they were not selected. Promotional candidates are offered a review session to go over test scores and assessment results.” Based on other comments, I am going to venture a guess that the answer depends on how many positions are open and whether the list will be kept open for an extended period of time. In those cases where the agency is filling only one slot, it may feel that any ranking is irrelevant.

Summary

Looking at the overall results, what seems clear is that agencies do provide information on test performance, but that the type of data communicated varies greatly. Based on selected comments, it appears that test scores are sometimes provided verbally rather than in writing; in the case of promotional candidates, by the supervisor or by human resources. An interesting aspect was that a number of agencies score at least some tests immediately and give the test takers the result before they leave the administration site.

So, policies toward the distribution of ranking and test score information are diverse. As I indicated at the beginning of this blog, my experience has been that the particular policy is based on a host of factors, but that even very similar agencies or cities may have very different rules.

I think the general rules that everyone could agree on would be:

  • Whatever your policy, document it and be consistent.
  • Inform candidates of the policy; most agencies seem to do so.
  • Regardless of what information you decide to distribute, provide the relevant data in an easy-to-understand form.
  • If possible, avoid posting failing scores, so as to protect the privacy of the candidates.

Before we finish this blog, a related question is, “How do agencies distribute scores when a test consists of multiple hurdles?” Take, for example, a battery that consists of a knowledge test, followed by a work sample, and finishing with a panel interview. What type of results does the agency provide and distribute? My short answer would be that, similar to policies for overall scores, the procedures for a battery with a set of pass/fail hurdles will be driven by a host of factors and, thus, there will be a lot of variation. Obviously, candidates must know or be told in some fashion whether they are moving on to the next step. It seems likely that in many or most cases they receive limited feedback on their exact rank or scores, and the tendency is to distribute material in a manner that protects the privacy of the candidates. For further information, see the four-part series of articles by Bob Burd on Successive Hurdles, Test Weighting, and Certification Rules.
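To make the successive-hurdles idea concrete, here is a minimal sketch of how such a battery might report only whether a candidate advances, rather than exact scores or ranks. The stage names and cutoffs are purely hypothetical, chosen for illustration; this is not any agency’s actual procedure.

```
# Minimal sketch of a successive-hurdles battery (hypothetical stages and cutoffs).
# Each stage is pass/fail; the candidate is told only whether he or she advances,
# not the exact score or rank.
HURDLES = [
    ("knowledge test", 70),    # minimum passing score, hypothetical
    ("work sample", 75),
    ("panel interview", 80),
]

def battery_feedback(scores: dict[str, float]) -> list[str]:
    """Return the feedback a candidate would receive at each completed stage."""
    feedback = []
    for stage, cutoff in HURDLES:
        if scores.get(stage, 0) >= cutoff:
            feedback.append(f"{stage}: you advance to the next step")
        else:
            feedback.append(f"{stage}: you did not advance")
            break  # failing any hurdle ends the process
    return feedback

# Example: a candidate who passes the knowledge test but not the work sample.
print(battery_feedback({"knowledge test": 82, "work sample": 71}))
```

The point of the sketch is simply that each hurdle yields a pass/fail decision, so an agency can tell candidates whether they advance without releasing the underlying scores, which matches the privacy-protective tendency described above.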

As far as I know, no real research base or professional guidelines exist to inform our decisions regarding the distribution and posting of scores. Ultimately, the decision most likely will be outside the control of Human Resources anyway, or, as Bob Burd indicated in Part 4 of his series, “Unfortunately, experience has shown that local laws, statutes and/or civil service rules that provide the blue print for how HR work is to be done are many times in conflict with exam development and validation procedures.” However, our hope is that this blog offers you some guidance in terms of the typical practices of other agencies.

Note: We will be conducting a survey in the next few weeks to gather information from you on how you handle the distribution of scores when a test consists of multiple hurdles. Look for the survey in the next edition of the ASR.


About Dennis Doverspike

Dennis Doverspike, Ph.D., ABPP, is President of Doverspike Consulting LLC. He is certified as a specialist in Industrial-Organizational Psychology and in Organizational and Business Consulting Psychology by the American Board of Professional Psychology (ABPP), serves on the Board of the American Board of Organizational and Business Consulting Psychology, and is a licensed psychologist in the State of Ohio. Dr. Doverspike has over forty years of experience working with consulting firms and with public and private sector organizations. He is the author of 3 books and over 150 other professional publications. Dennis Doverspike received his Ph.D. in Psychology in 1983 from the University of Akron.
