2018 IPAC Conference

July 29 - August 1, 2018
Alexandria, VA

Old Town, New Assessments


2018 IPAC Pre-Conference Workshops

On Sunday, July 29, before the conference gets underway, IPAC will host six hands-on workshops on a diverse selection of timely topics. These workshops are outside the 2018 Conference Program and offer attendees additional exposure to topics they may be interested in. Three workshops will be offered in the morning (8 AM - Noon), and three will be offered in the afternoon (1 PM - 5 PM). Attendees can register for a morning and an afternoon workshop of their choice.

SHRM Credit: Each workshop has been approved for 3.5 professional development credits.

Developing Training and Experience Questionnaires with Customized Rating Scales

Training and Experience Questionnaires (T&Es), also known as Occupational Questionnaires or Assessment Questionnaires, are commonly used assessment tools in the employee selection process. These questionnaires measure applicants’ previous experience, training, and/or education relevant to the target position and typically use a generic, or previously developed, rating scale that can be applied across a wide variety of organizations, positions, and competencies. This session will discuss best practices in developing T&E items using customized rating scales.

The presenters will discuss various item formats and types that can be used on T&E questionnaires and provide guidance on developing rating scales customized based on job analysis information and subject matter expert input. Discussion topics will include: 1) multiple choice – single response and multiple choice – multiple response item formats; 2) task-based, competency-based, and knowledge-based item types; and 3) processes for establishing the scoring protocol at the item level.

The session will help audience members develop T&E questionnaires with customized rating scales and provide the opportunity to practice applying the information discussed through hands-on exercises.

Joyce Wentz, M.S.

Joyce Wentz is a Program Analyst specializing in Data Analytics for the Automated Systems Management Group (ASMG) at the United States Office of Personnel Management (OPM). She provides direction and support for the development of reports and reporting capabilities to provide organizations with the information necessary to support data-driven decision making in the areas of staffing recruitment, employee selection, and new hire onboarding. Her areas of expertise include job analysis, selection assessment development, and data analysis and visualizations.

Joyce has a Master of Science in Industrial/Organizational Psychology and a Bachelor of Science in Psychology from Missouri State University. She has 9 years of job analysis and selection assessment development experience and has developed and delivered training to numerous organizations on the development of training and experience questionnaires.

Kimberly Lepore, M.A.

Kim Lepore is a Personnel Research Psychologist at the U.S. Office of Personnel Management (OPM), where she works on the Selection and Promotion Assessment team in OPM’s Human Resources Solutions organization. Her work at OPM consists mostly of selection and assessment development projects for various federal agencies. Kim has developed customized training and experience questionnaires for a number of federal agencies and also has experience delivering agency trainings on customized training and experience questionnaires.

Kim holds Master of Arts and Bachelor of Arts degrees in Psychology and is currently A.B.D. in the Industrial/Organizational Psychology Ph.D. program at the University at Albany. Prior to joining OPM in 2016, Kim interned at a number of organizations working on test development and validation projects.


Developing Structured Interviews

This workshop will provide participants with a road-tested process to develop structured interviews. Interviews are among the most widely used methods of assessing candidates for a position, and research has demonstrated that adding structure can increase their reliability and validity.

Participants will learn techniques to select interview competencies, facilitate critical incident panels, transform critical incidents into questions, and develop scoring criteria. Interactive exercises will allow participants to put techniques into practice. Decision points, logistical issues, rater training considerations, and barriers to success will be discussed.

Participants will come away from the session with an understanding of methods used to develop structured interviews as well as templates and tools for implementing sound structured interview development processes in their workplace.

Kristen Pryor, M.S.

Kristen Pryor is an Associate Principal Consultant at DCI Consulting. She previously worked for the Office of Personnel Management and as a federal contractor. Kristen has over 10 years of experience developing, implementing, and evaluating human capital systems in the private and public sectors, with a focus on selection processes. Her work has included conducting job analyses, developing and validating a variety of pre-employment and promotional assessments, and conducting complex data analyses to evaluate the performance of human capital systems. A substantial portion of Kristen’s current work is focused on the evaluation and improvement of selection systems in the private sector. She has presented at professional and scientific conferences and has published work in the Handbook of Employee Selection and in Industrial and Organizational Psychology: Perspectives on Science and Practice. Kristen received her M.S. in Industrial and Organizational Psychology from the University of Central Florida.

Emily Steinau, M.A.

Emily Steinau is a Personnel Research Psychologist with the Personnel Research Assessment Division within U.S. Customs and Border Protection (CBP). In her role at CBP, Emily conducts job analyses, develops assessments, facilitates training, designs surveys, and analyzes data. Recently, she led the development of a structured interview question bank for the selection of supervisory agents. Prior to joining CBP, Emily worked as a contractor in the federal and local government space. She has over 10 years of applied experience developing, implementing, and evaluating selection processes with a focus on public safety positions. Emily earned her M.A. in Industrial-Organizational Psychology from Xavier University.

Laura Fields, Ph.D.

Laura Fields has over 19 years of experience in employee selection, as both an external and internal consultant. Currently, Laura is a Senior Selection Consultant at Wells Fargo where she provides guidance to recruiters and hiring managers on employee selection strategy and practices. In her role, Laura mitigates risk, enhances candidate experience, and optimizes employee selection decision-making. Laura manages several high-volume online selection assessment programs as well as an enterprise-wide interview guide content management system. Recently, she led an effort to develop and deploy an interviewer training program. Laura earned her M.S. in Industrial-Organizational Psychology from Radford University, and her Ph.D. in Industrial-Organizational Psychology from Capella University.


Barrier Analysis: How do you really get that done?

Barrier analysis, the research and investigation of policies, practices, or procedures that may present a barrier or impediment to equal employment opportunity, is required by EEO law and regulation for federal agencies and other entities. The process is complex both from a research standpoint and in light of different constituency viewpoints. This session will discuss key decisions and experiences leading these efforts in different organizational contexts.

The panel will address specific questions of interest and provide practical, experience-based advice. Discussion topics will include: prioritizing projects across issues; identifying and reviewing triggers; using data from different sources and research methods; integrating data into a clear story; presenting action recommendations; and implementing resulting change.

The session will help audience members develop their barrier analysis program, continue to proactively advance equal employment opportunity, present research information for maximum influence, and evaluate their program for continuous improvement.

Martha E. Hennen, Ph.D. 

Martha is a Management and Program Analyst working in the Office of Equal Employment Opportunity of a federal financial regulator headquartered in Washington, DC. Dr. Hennen leads research to analyze and understand the aspects of workplace experience for different employee groups that may facilitate or impede equal employment opportunity. She has held positions at the Consumer Financial Protection Bureau, where she led the talent analytics function, and at the USPS, where she was responsible for assessment and measurement at all levels of the organization. Dr. Hennen’s research interests cover several employee measurement areas including recruitment and selection, equal employment opportunity, diversity and inclusion, employee engagement, leadership assessment, performance management, and talent analytics. Dr. Hennen is a licensed Applied Psychologist and a member of the Society for Industrial and Organizational Psychology and of the American Psychological Association. She holds a Master's degree and Ph.D. in Industrial/Organizational Psychology from the University of Connecticut.

Maurice B. Champagne, Ph.D. 

Maurice is a Senior Data Analyst working on contract through FM Talent with the Securities and Exchange Commission’s Office of Equal Employment Opportunity. Dr. Champagne provides technical direction and leadership for the analysis work of the Office’s Barrier Analysis function. He holds a doctoral degree in Public Policy from George Mason University. Dr. Champagne’s research interests focus on the use of quantitative methods to understand social network data, in particular understanding and supporting public policy with data gathered from social media.

Brittany M. Dian, M.S. 

Brittany is a Consultant at DCI Consulting Group, Inc., a human resources risk management firm located in Washington, D.C., that specializes in the complex issues of Equal Employment Opportunity compliance. Specifically, DCI uses quantitative, statistically driven methods to uncover potential impediments to equal employment opportunity and provides expertise in interpreting and prioritizing potential areas of workplace discrimination. Brittany has worked directly on a multi-level barrier analysis and benchmarking project in which extensive research and adverse impact analytics were conducted over the course of a year to uncover potential impediments to EEO for a federal client. Brittany’s areas of expertise include equal employment opportunity, diversity analytics relating specifically to hiring, promotions, and terminations, compensation equity studies, and affirmative action plan development and implementation. Brittany holds a Master’s degree in Industrial/Organizational Psychology from the Florida Institute of Technology.


Designing and Evaluating Assessment Games and Gamification for Selection

Gamified and game-based assessments are increasingly appearing as selection methods in modern organizations, yet many practitioners are unaware of how to create or even evaluate such assessments. In this workshop, participants will learn the language of this new area of assessment and gain the skills of a gameful designer. Workshop contents include: a discussion of what gamification actually is and how it differs from game-based assessment; interactive demonstrations of current vendor offerings in the game-based assessment space, with discussion of how they differ both functionally and psychometrically; hands-on experience gamifying assessments; and hands-on experience prototyping assessment games, along with a discussion of the challenges faced when identifying and contracting game design firms to turn a prototype into a reality.

Richard N. Landers, Ph.D.

Richard N. Landers holds the John P. Campbell Distinguished Professorship in Industrial and Organizational Psychology as an Associate Professor at the University of Minnesota. His research program concerns the use of innovative technologies in assessment, employee selection, adult learning, and research methods, with his work appearing in Journal of Applied Psychology, Industrial and Organizational Psychology: Perspectives on Science and Practice, Computers in Human Behavior, Simulation & Gaming, Social Science Computer Review, and Psychological Methods, among others. His research and writing have been featured in Forbes, Business Insider, Science News, Popular Science, Maclean’s, and the Chronicle of Higher Education, among others. He currently serves as Associate Editor of Simulation & Gaming and the International Journal of Gaming and Computer-Mediated Simulations. In 2016, he was awarded a Certificate of Recognition for his research on big data presented to the Society for Industrial and Organizational Psychology, and in 2015 he was Old Dominion University’s nominee for the State Council of Higher Education in Virginia’s Outstanding Faculty Award in the “Rising Star” category.


Developing and Evaluating a Training Program

This workshop will walk participants through the process of developing a training program based on major frameworks and concepts from industrial and organizational (I-O) psychology. A training program is designed to teach a newly hired employee how to do the job or to provide an existing (experienced) employee with developmental opportunities. Generally, developing a training program consists of four phases: (a) conducting a needs assessment/gap analysis, (b) translating those needs/gaps into training objectives, (c) developing the training design/content, and (d) developing a plan for training evaluation. In this workshop, the presenters will discuss the steps taken to complete each phase, the materials used during each phase, and the results relevant to the goal(s) of each phase.

The content of this workshop will be based predominantly on seminal resources in I-O psychology (e.g., Goldstein & Ford, 2002; Kirkpatrick, 1976; Knowles, Holton, & Swanson, 1998; Kraiger, Ford, & Salas, 1993, to name a few), although the presenters will also introduce and discuss recent training-related research and case studies. Presenters will share knowledge acquired and observations from their own experiences and encourage participants to do the same. Participants will be given several opportunities throughout the workshop to demonstrate what they have learned.

Note - This workshop is intended for a general audience (e.g., students) and HR professionals who have beginner- to intermediate-level experience with the design, delivery, or management of training programs in organizations. Additionally, bringing a laptop to the workshop will be helpful, but not essential.

Bharati B. Belwalkar, Ph.D.

Bharati B. Belwalkar is a Personnel Administrator at the Civil Service Department of the City of New Orleans. In her current role at the City, she manages the employee testing and training divisions and has re-engineered some of its selection and training practices. Bharati earned her Ph.D. in industrial and organizational (I-O) psychology from Louisiana Tech University. She previously worked for the City of Jacksonville and Aon Hewitt, specifically in the areas of personnel selection, assessment, and data analytics.

James De Leon, M.A.

James De León is a Doctoral Candidate in Industrial and Organizational (I-O) Psychology at Louisiana Tech University. He is also a Sr. Consultant at APTMetrics, a global human resource consultancy. He has experience in the areas of statistical analysis, pay equity, litigation support, job analysis, assessment and selection, competency modeling, performance management, and organizational surveys for Fortune® 100 clients across a broad range of industries.


Establishing a Comprehensive Human Capital Program using Competency Models

Think of competencies as the building blocks of a workforce. Competency modeling can be used to determine the unique combination of building blocks that a workforce needs for the organization to be successful. With the proper design, organizations can use their competency models to recruit and select highly qualified talent, identify and close skill gaps, guide employee learning and development, plan future workforce needs, manage performance, and more.

In this workshop, human capital consultants and competency experts will provide step-by-step guidance and tips on efficiently creating and validating competencies and on integrating them into all aspects of the human capital lifecycle, including selection and assessment, performance management, training and development, and succession and workforce planning. Techniques to assess workforce and employee competency gaps, set priorities to support decision making on the investment of resources, and identify the focus of human capital program initiatives will also be discussed.

Participants will come away from the session with an understanding of the competency modeling process, how competency models provide the foundation for all aspects of the human capital lifecycle, and how to apply competency models in practice to establish consistent and comprehensive human capital initiatives.

Marni Falcone, M.A.

Marni Falcone is a Senior Consultant at FMP Consulting and has more than 10 years of experience working on and leading human capital projects for the public and non-profit sectors. Ms. Falcone joined FMP Consulting in June 2012. Her areas of expertise include competency modeling, applied training and development, employee selection, job/competency analysis, competency gap analysis, workforce analysis, assessment center development, validation, and administration, adverse impact analysis, personnel selection, and career development. Ms. Falcone has led several projects with Federal agencies such as the Federal Emergency Management Agency (FEMA), Natural Resources Conservation Service (NRCS), Bureau of Primary Health Care (BPHC), and the Center for Public Safety Excellence (CPSE), conducting workforce analyses, job analyses, competency modeling, gap analyses, and training needs analyses and development. These efforts provided clear expectations and structure around organizational roles and responsibilities, streamlined hiring timelines, assessed competency gaps and developed training to fill organizational needs, and assessed the organizational effectiveness of human capital initiatives and processes. Ms. Falcone received her M.A. in Industrial/Organizational Psychology from George Mason University and her B.A. in Psychology and Criminology/Criminal Justice from the University of Maryland, College Park.

Rob Calderón, Ph.D.

Rob Calderón is a Managing Consultant at FMP Consulting and has more than 20 years of experience working on and leading applied personnel research projects in both the private and public sectors. His primary areas of expertise include job and occupational analysis and competency modeling; the design and evaluation of personnel selection and classification systems; workforce analysis and planning; training needs analysis, development, implementation, and evaluation; survey design; and quantitative/qualitative data analysis. Dr. Calderón has led competency-related efforts across a wide variety of organizations, including the Centers for Disease Control and Prevention (CDC), Natural Resources Conservation Service (NRCS), Department of Veterans Affairs (VA), Office of the Comptroller of the Currency (OCC), Federal Emergency Management Agency (FEMA), Defense Contract Management Agency (DCMA), Center for Veterinary Medicine (CVM), Society for Human Resource Management (SHRM), Universal Service Administrative Company (USAC), and the Overseas Private Investment Corporation (OPIC). He has also taught graduate-level courses in research methods, statistics, and performance management as an Adjunct Professor at both George Washington University and George Mason University. Dr. Calderón received his Master’s degree and Ph.D. in Industrial/Organizational Psychology from The Ohio State University and Bachelor’s degrees in Statistics and Psychology from Northwestern University.

