BARS Workshop Series Part 3: Interpreting Results and Analysis of Behaviorally Anchored Ratings Scales

  • 11 Jun 2025
  • 12:00 PM - 2:00 PM
  • Virtual and Columbus, Ohio

Registration

  • Select if you plan to attend in person in Columbus, OH
  • Select if you plan to attend via Zoom link

Register


The BARS rating technique is an incredibly versatile tool, but many of us rarely receive in-depth education or experience with it. Through this three-part series, Liz will share the knowledge and experience she has gained using Behaviorally Anchored Rating Scales (BARS) and analyzing the results. You will learn: 1) how to create BARS, 2) how to train evaluators in the use of BARS, and 3) how to analyze and use the results.

In the final workshop in this series, we will review how to interpret your BARS findings through six learning objectives:

1. Discuss tabulation of raw BARS scores

2. Learn when standardization of scores is most appropriate

3. Learn the z-score calculations used to obtain reported scores

4. Discuss inter-rater reliability and learn how to calculate it using the Spearman-Brown formula

5. Review means and standard deviations by board and by assessor

6. Discuss how to interpret results
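As a preview of objectives 3 and 4, the standard formulas can be sketched in a few lines of Python. This is a minimal illustration, not the workshop's own materials: it assumes sample-based standard deviations for the z-scores and a single-rater correlation `r` stepped up to `k` raters with the Spearman-Brown formula.

```python
import statistics

def z_scores(raw):
    """Standardize raw BARS scores to mean 0, SD 1 (sample SD)."""
    mean = statistics.mean(raw)
    sd = statistics.stdev(raw)
    return [(x - mean) / sd for x in raw]

def spearman_brown(r, k=2):
    """Step up a single-rater reliability r to the reliability of k raters."""
    return k * r / (1 + (k - 1) * r)

# Example: raw scores from one board, and a hypothetical
# single-rater correlation of 0.60 between two assessors.
print(z_scores([1, 2, 3]))      # standardized scores
print(spearman_brown(0.60, 2))  # two-rater reliability
```

With a single-rater correlation of 0.60, the two-rater Spearman-Brown reliability works out to 0.75, which is why panels of multiple assessors are typically more reliable than any one rater alone.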

Liz Reed is the Executive Assistant Director for the Columbus Civil Service Commission. She leads the teams responsible for the development, administration, and scoring of all Civil Service exams. Her teams are committed to fair selection processes that give applicants the opportunity to demonstrate their skills and perform at their best.

Liz is committed to data-driven approaches to selection and is bold in implementing innovative practices within an appeals-driven culture. She demonstrates leadership that achieves goals through the performance of dynamic, diverse teams, encouraging them to perform at exceptional levels as they produce high-quality simulations, structured oral boards, writing samples, role-play exercises, in-baskets, multiple-choice exams, and written work samples. In her 30+ year career, she has created or supervised the development and implementation of Behaviorally Anchored Rating Scales for many applications and has analyzed the statistical results of their use.

In 2004, her team was awarded the IPMAAC (now IPAC) Innovations in Testing Award for its work developing in-house video exams. She has served in leadership roles within IPAC, including President, and partnered with professionals from Akron University, Bowling Green State University, the City of Dayton, the Ohio State University, the State of Ohio, and others to create IPAC's first regional section, the Great Lakes Employment Assessment Network (GLEAN).

We want to continue what makes IPAC successful and unique. IPAC is a forum for assessment professionals who share practices, research, and training to meet applied needs in employment assessment. The level of friendliness and openness among our members is unmatched in our field. We are a premier organization where research meets reality, and we embrace the breadth of our membership, which includes directors, managers, staffing specialists, recruiters, psychologists, analysts, attorneys, consultants, faculty, and students from the private and public sectors, academia, and research.
