2016 IPAC Conference Keynote Speakers
GREG HURTZ, Ph.D.
Test Security Systems: Using Statistical and Psychometric Models for Prevention and Detection of Test Fraud
Test security is a serious issue for high-stakes testing. Individuals may engage in fraudulent testing behaviors to improve their own scores and testing outcomes, or to profit by gathering information that can be sold to others. Increased use of computerized testing can potentially exacerbate the problem (e.g., with unproctored internet testing) but can also provide opportunities to combat it (e.g., through technology-driven test delivery and response data collection strategies). In this presentation I will discuss test security strategies involving the use of statistical and psychometric methods to automate the generation of unique test forms for candidates, and data forensics methods that can help detect aberrant patterns consistent with test fraud in response data. I will discuss my recent research in both areas using a combination of simulated and real-world data.
Bio: Greg Hurtz is a Professor in the Psychology Department at California State University, Sacramento, where he has been teaching and supervising research in statistical methods, industrial psychology, and psychological testing and measurement at both the undergraduate and graduate levels since 2002.
Since the early 1990s he has regularly carried out project work as a consultant in the areas of statistical analysis, psychometric methods, work analysis, employee selection testing, and employee training evaluation for public and private sector organizations and consultancies. His current consulting efforts are focused exclusively on his role as a Senior Psychometrician at PSI Services LLC, where he has spent the past three years engaged in research and development activities advancing item response theory applications, automated test assembly methods, and data forensics methods for test security analysis. His strongest research interests involve applications of latent trait (Rasch and IRT) models and Monte Carlo methods to the evaluation of psychometric and statistical analysis practices.
He has given over 50 presentations at scientific and professional conferences and meetings, has written 2 book chapters, and has published 12 articles in respected journals such as Organizational Research Methods, Journal of Applied Psychology, Applied Psychological Measurement, and Educational and Psychological Measurement. He served as an associate editor on the recently published Handbook of Work Analysis: Methods, Systems, Applications and Science of Work Measurement in Organizations. Greg earned his B.A. and M.A. in Psychology from California State University, Sacramento, followed by his Ph.D. in Industrial-Organizational Psychology at the University at Albany, State University of New York.
DENIZ ONES, Ph.D.
Employee Screening for Higher Stakes Occupations: Measurement and Nomological Network of Maladaptive Personality
For employees in some occupations, freedom from psychopathology is essential. Typically, individuals in these occupations have greater responsibility for public safety and the well-being of others. Such occupations include pilots, law enforcement personnel, and nuclear power plant operators, among others. Those in high-stakes jobs may also be responsible for the welfare of vulnerable populations (e.g., childcare workers, medical professionals). In other cases, the reputations and financial welfare of organizations may depend on a few critical decision makers (e.g., CEOs). In this presentation, I will focus on recent scientific and applied developments in employee screening for higher stakes occupations. In particular, with the release of the DSM-5, screening for so-called dark side personality traits and psychopathology is being transformed. I will discuss how a better understanding of the structure and spectrum of maladaptive personality measurement can help in better personnel decision making. Personality constructs range between maladaptive positive and negative extremes, with the middle normal range representing typical (i.e., “normal” or adaptive) traits. Both the adaptive and maladaptive personality construct spaces are characterized by hierarchy (including a general factor of personality, meta-traits, Big Five factors, Big Five Aspects, and personality facets), lack of simple structure (resulting in compound traits indicating more than one personality domain), and bipolarity. Implications for maladaptive personality assessments and employee screening will be discussed.
Bio: Deniz S. Ones is the Hellervik Professor of Industrial Psychology and a Distinguished McKnight Professor at the University of Minnesota, where she also directs the Industrial-Organizational Psychology program. She received her Ph.D. from the University of Iowa in 1993 under the mentorship of Frank Schmidt. Her research, published in more than 175 articles and book chapters, focuses on staffing, employee selection, and the measurement of personality, integrity, and cognitive ability, and has been cited over 12,500 times in the scientific literature. She has received numerous awards for her work in these areas, among them the 1994 Wallace Best Dissertation and the 1998 McCormick Early Career Distinguished Scientific Contributions Awards from the Society for Industrial and Organizational Psychology, the 2003 Cattell Early Career Award from the Society for Multivariate Experimental Psychology, and the 2012 Lifetime Professional Career Contributions and Service to Testing Award from the Association for Test Publishers. She is a Fellow of Divisions 5 (Evaluation, Measurement, and Statistics) and 14 (SIOP) of the American Psychological Association as well as a Fellow of the Association for Psychological Science. She served as co-editor in chief of the International Journal of Selection and Assessment (2001-2006) as well as on editorial boards of multiple prominent scientific journals. Dr. Ones is also the past Chair of APA's Committee on Psychological Tests and Assessments. In her applied work, she focuses on helping organizations design and implement valid and fair staffing and selection systems.
DAN PUTKA, Ph.D.
Sifting for Truth in the Big Data Morass: Benefiting from Big Data Methods with Your Small Assessment Data
For the past several years, Big Data has been recognized as one of the top workplace trends. Despite the mystique surrounding Big Data, there is still little published work accessible to HR practitioners and psychologists describing how Big Data ideas and methods can readily be leveraged in practice. As a result, several myths regarding these methods have persisted in our field, slowing their adoption. In this presentation, I will introduce and debunk several Big Data related myths as they pertain to problems typically encountered in applied assessment and prediction work. Myths related to Big Data methods being irrelevant to smaller data sets, producing results that are too hard to convey to decision makers, producing overly optimistic results (i.e., capitalizing on chance), and lacking theoretical value will be addressed. A concrete example based on a biodata measure and job performance outcome will be used to illustrate key points.
Bio: Dan J. Putka is a Principal Staff Scientist at the Human Resources Research Organization (HumRRO) in Alexandria, Virginia. He has over 15 years of experience helping private and public sector organizations innovate in the areas of talent acquisition and human capital analytics. Dr. Putka has helped numerous organizations design, develop, and evaluate assessments to (a) enhance their hiring and promotion processes, and (b) guide individuals to career and job opportunities that fit them well. He has also conducted several large-scale analytics and evaluation projects to (a) identify precursors of employee engagement and turnover, (b) evaluate and refine personnel selection systems and hiring processes, and (c) identify job critical competencies and organization-wide competency gaps.
Complementing his client-centered work, Dr. Putka has maintained an active presence in the industrial-organizational (I-O) psychology scientific community. He has delivered over 50 presentations and invited workshops at national conferences, published over 20 book chapters and articles in peer-reviewed journals, and serves on the editorial board of five scientific journals.
Dr. Putka is a past-president of the Personnel Testing Council of Metropolitan Washington (PTC-MW), a fellow of APA and three of its divisions (5, 14, and 19), and was the 2015 recipient of IPAC’s Stephen E. Bemis Memorial Award.
RYAN ROSS, M.A.
High Potential Identification – Are You Doing It Wrong?
Leadership is usually defined in terms of a person’s status in an organization. If a person has a title, he or she must have leadership skills. Right? Wrong. The business landscape is shifting at an ever-increasing rate. People represent the difference between organizational success and failure, and thus the stakes of correctly identifying and developing the next generation of leaders could not be higher. The leadership pipeline needs to be populated with high potentials (HIPOs), those who can successfully lead high-performing teams. Despite guidance from the academic and business literatures, some organizations still base these important decisions on politically fraught processes, or confuse successful emergence with effective leadership. When it comes to desired leadership outcomes, emergence does not necessarily equal effectiveness, and accurately identifying top talent must involve science and data, which is what we as a profession are supposed to do. The question is: are we doing it right? This session will explore that question and provide five areas to examine in your own organization.
Bio: Ryan has more than 20 years of experience across a wide range of industries. He has worked in numerous practices at Hogan over the last 14 years, including the selection and development practices, and has worked with Hogan’s strategic alliances and partners around the world. Ryan has developed and implemented large-scale, multi-level selection programs domestically and internationally, and has consulted with organizations on selecting people into new jobs and on the use of personality-based and future-oriented job analysis. Ryan also has extensive experience validating and defending the use of personality assessments in the pre-employment context.
Ryan’s experience also encompasses leadership development, talent management, and succession planning projects. Considered an expert on leadership derailment and the use of assessments to help identify potential points of failure in current and future leaders, Ryan frequently speaks at conferences and invited sessions on the topic of Strategic Self Awareness. Practically, Ryan has experience integrating Hogan’s tools into various development programs at all levels of the organization, including the integration of data into larger development and succession planning processes.
Ryan received a Master's in Clinical Psychology from Baylor University and a Master's in Industrial/Organizational Psychology from the University of Tulsa.
HARRY BRULL
Assessment as a Driver of Performance – It’s Doable; Why Aren’t We Doing It?
Most of us think about assessment as the measurement of individual capabilities for decisions about selection and promotion – and maybe development.
This presentation has two distinct parts: The first is a description of a project using assessment principles as a foundation for purposes quite different from, and potentially more impactful than, employee selection. The second is a survey of IPAC members and their IPMA-HR counterparts inquiring about the “state of the art” when it comes to assessing performance at the organizational or departmental level. Of particular interest is the experience of public entities in measuring performance outcomes (assessing performance) and the forces impeding attempts to do so.
The original impetus for the project was a request by a city of 50,000 to build a “Pay for Performance” system. The final result, still in place 17 years later, is far more than that. It is a tool by which individuals, departments, and the city as a whole measure their performance and design strategies to improve.
The survey and presentation will describe the process and trace its evolution over time. Contrary to expectations, other public agencies have not, it appears, embraced comparable efforts. With the assistance of the survey results, we will consider why not.
Bio: Until October 27, 2015, Harry Brull was Senior Vice President, Public Sector Services for PDI Ninth House (formerly Personnel Decisions International, now Korn Ferry), an I/O psychology consulting organization with 34 offices in North America, Europe, South America, and Asia. He is currently a Senior Partner of BCG Consulting Group.
He joined PDI in 1978. Prior to that, he served as a probation officer, elementary school teacher, drug counselor, therapist, and general contractor. He was a charter member of the Minnesota Employment Law Council, where he was the only non-attorney. During his tenure at PDI he consulted with major corporations, public agencies, and non-profit organizations, and designed and implemented more than 3,000 selection and promotion processes.
He has taught at Cornell University, the University of Minnesota, the Minnesota School of Professional Psychology, and the Southern Police Institute, and taught I/O psychology at St. Olaf and Macalester Colleges. He was president of the International Personnel Management Association Assessment Council (IPMAAC) and the recipient of the 2002 Bemis Memorial Award and the 2007 Clyde Linley Service Award. He currently serves as board secretary of the League of American Bicyclists and board chair of KHEN Community Radio.
In his spare time, he enjoys long-distance bicycling, competitive volleyball, scuba diving, and good liquor. His excuse is that he's originally from New York and left-handed.