Update to Promotional Test Reading List
From this point forward, we’ll announce updates to our reading lists for promotional tests as they happen.
In the meantime, however, we’d like to let you know that the reading list for the PDET 101 Police Detective test was updated in February to reflect a new edition of one of the publications.
You can submit a request to receive a reading list on our website.
You’ll find that we’ve also added a table that will show you when the reading list for each test was last updated. As of today, that table looks like this:
Type | Test Name | Last Updated
---|---|---
Police | PSUP 301/302/303 | Jan 2012
Police | PL 301 | Jan 2012
Police | PDET 101 | Feb 2012
Fire | FCO 101-EM/102-EM | Mar 2011
Fire | FCO 103/104 | Mar 2011
Corrections | CF-FLS 102 | Feb 2011
ECC | ECC-FLS 102 | Oct 2010
We’ll be announcing future updates here on the Assessment Services Review, so be sure to subscribe to Instant Updates on the ASR in the sidebar to the right.
Announcing Updates to the PSUP Series Police Supervisor Tests
As a result of ongoing feedback and our commitment to making sure our tests are the best they can be, we’ve made some minor updates to the PSUP series tests. Because of these changes, the names of the tests have been updated as follows:
Previous Test | Also Known As | New Test
---|---|---
PSUP 201 | PSUP 1.2 | PSUP 301 |
PSUP 202 | PSUP 2.2 | PSUP 302 |
PSUP 203 | PSUP 3.2 | PSUP 303 |
All links to the 200 series tests will be updated to point to the 300 series tests once they are available.
Changes to each test are as follows:
- PSUP 301 updates four questions from PSUP 201.
- PSUP 302 updates three questions from PSUP 202.
- PSUP 303 updates two questions from PSUP 203.
It is important to note that all replacement questions were written to assess the same content areas as the original questions and are supported by the books on the current reading list.
2012 Test Product Catalogs Coming Soon
It’s hard to believe that it has been two years since we last printed our test products and services catalog. We decided at that time to switch to a two-year catalog cycle, and quite a lot has changed since then!
Besides covering the latest assessment product information, the catalog is no longer a single publication. Instead, we’ve split it into a separate publication for each of the five services we support:
- Police
- Fire
- Corrections
- Emergency Communications Center
- Non-Public Safety (Administrative/Clerical)
By having separate catalogs, we can better align our products with your specific needs.
It Pays to Take A Look: Item Analysis Part 3
In the two previous articles, we looked at the statistical and technical aspects of item analysis. Individual test developers will view the statistical computations and their value differently based upon their knowledge of statistics and their understanding of how those computations are applied. However, a test developer or test user with even a rudimentary understanding of item analysis can still make accurate decisions regarding the effectiveness of test items and, therefore, of written exams. As we emphasized previously, IPMA-HR conducts item analyses on potential test items during its test development process and maintains item analysis data from successive administrations of all exams. These practices ensure that only items that perform well continue to be used, and they reflect a standard that all test developers should employ. Also note that, for our discussion, we will focus on typical four-response multiple-choice items and true-false items.
Effective use of item analysis information for item and test revision is where science meets art. This process of “cleaning up” test items and tests requires using the basic information from an item analysis, analyzing applicant response data effectively, and applying what is known about writing good test items. There is an extensive body of scholarly work on item writing and response theory, and the effective practitioner should take the time to review some of it before writing or “repairing” test items. It should also be noted that even though this article focuses on test developers, the information can be extremely valuable for those who purchase or lease tests, since it can help them evaluate the quality of the tests they are considering.
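To illustrate how this kind of review might be organized in practice, here is a minimal sketch in Python that flags items for a closer look based on their difficulty and discrimination statistics. The thresholds, field names and data are our own illustrative assumptions, not IPMA-HR standards, and any real review should weigh the statistics alongside the content of each item.

```python
# Illustrative sketch: flagging items for review based on item analysis data.
# The thresholds and values below are arbitrary examples, not IPMA-HR standards.
items = [
    {"id": "Q1", "difficulty": 0.92, "discrimination": 0.35},
    {"id": "Q2", "difficulty": 0.45, "discrimination": 0.05},
    {"id": "Q3", "difficulty": 0.20, "discrimination": 0.40},
]

def needs_review(item, min_difficulty=0.25, max_difficulty=0.90, min_discrimination=0.20):
    """Return True if the item's statistics suggest it should be revisited."""
    too_easy_or_hard = not (min_difficulty <= item["difficulty"] <= max_difficulty)
    weak_discrimination = item["discrimination"] < min_discrimination
    return too_easy_or_hard or weak_discrimination

for item in items:
    if needs_review(item):
        print(f"{item['id']}: flag for review")
```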
Everything You Wanted to Know About Assessment Centers
The Assessment Center Educational Materials are a comprehensive guide to the complicated process of administering an assessment center in your organization. Whether you’re using an in-house assessment center system or one developed for you by an outside organization, the ACEM is an invaluable tool for making sure your administrators, assessors and candidates are informed, prepared and aware of what to expect during the process.
Starting immediately, the ACEM is also $50 less! We’ve lowered the price to $249, and you can order it online today.
It Pays to Take A Look: Item Analysis Part 2
In the previous article, we began our discussion of the valuable information available through conducting an item analysis, focusing on the two most readily available pieces of information. First, of course, is the Difficulty Index. Just as the name implies, this index is an indicator of the difficulty of the item. It is expressed as a percentage and reflects the number of candidates who answered the item correctly out of the total number of candidates who responded to it. That is, if nine out of ten respondents answered an item correctly, the index would be .9, or 90%. From this illustration, we can also see that the index actually has an inverse relationship with the difficulty of the item: the higher the index (or percentage), the easier the item.
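To make the arithmetic above concrete, here is a minimal sketch in Python of how such a difficulty index could be computed. The function and variable names are ours, purely for illustration, and are not part of any IPMA-HR tool.

```python
def difficulty_index(num_correct, num_responses):
    """Proportion of candidates who answered the item correctly."""
    if num_responses == 0:
        raise ValueError("No responses recorded for this item.")
    return num_correct / num_responses

# The example from the article: nine of ten respondents answered correctly.
p = difficulty_index(9, 10)
print(f"Difficulty index: {p:.2f} ({p:.0%})")  # Difficulty index: 0.90 (90%)
```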
The second index we discussed was the Item Discrimination Index. Essentially, this index reflects how the candidates who performed best on the test responded to a specific item compared with how the candidates who performed worst on the test responded to that same item. The top 27% of test performers and the bottom 27% of test performers are used to calculate the Discrimination Index, and it is expressed as a proportion or percentage relating the number in the top group that answered the item correctly to the number in the bottom group that answered the item correctly.
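For readers who want to see the calculation spelled out, here is a rough sketch in Python. It uses one common formulation of the discrimination index (the difference between the upper and lower groups’ correct counts, divided by the group size), which may differ in detail from the computation used on a given exam; the candidate data shown are hypothetical.

```python
def discrimination_index(scores_and_answers, group_fraction=0.27):
    """
    Estimate an item discrimination index for a single item.

    scores_and_answers: list of (total_test_score, item_correct) tuples,
    one per candidate, where item_correct is True/False for this item.
    Uses the top and bottom `group_fraction` of candidates by total score
    and returns (upper_correct - lower_correct) / group_size.
    """
    ranked = sorted(scores_and_answers, key=lambda pair: pair[0], reverse=True)
    group_size = max(1, int(len(ranked) * group_fraction))
    upper = ranked[:group_size]
    lower = ranked[-group_size:]
    upper_correct = sum(1 for _, correct in upper if correct)
    lower_correct = sum(1 for _, correct in lower if correct)
    return (upper_correct - lower_correct) / group_size

# Hypothetical data: 10 candidates as (total score, answered this item correctly).
candidates = [(95, True), (90, True), (88, True), (84, True), (80, False),
              (75, True), (70, False), (65, False), (60, False), (50, False)]
print(round(discrimination_index(candidates), 2))
```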