Authors: Hernandez, Ivan; Nie, Weiwen
Date accessioned: 2023-01-05
Date available: 2023-01-05
Date issued: 2022
URI: http://hdl.handle.net/10919/113042

Abstract: We propose a framework for integrating various modern natural language processing (NLP) models to assist researchers with developing valid psychological scales. Transformer-based deep neural networks offer state-of-the-art performance on various natural language tasks. This project adapts the transformer model GPT-2 to learn the structure of personality items and generate the largest openly available pool of personality items, consisting of one million new items. We then use that artificial intelligence-based item pool (AI-IP) to provide a subset of potential scale items for measuring a desired construct. To better recommend construct-related items, we train a paired BERT-based classification model to predict the observed correlation between personality items using only their text. We also demonstrate how zero-shot models can help balance desired content domains within the scale. In combination with the AI-IP, these models narrow the large item pool to the items most correlated with a set of initial items. We demonstrate the ability of this multimodel framework to develop longer, cohesive scales from a small set of construct-relevant items. We found that AI-assisted scales had reliability, validity, and fit equivalent to scales developed and optimized through traditional methods. By leveraging neural networks’ ability to generate text relevant to a given topic and infer semantic similarity, this project demonstrates how to support creative and open-ended elements of the scale development process, increasing the likelihood that one’s initial scale is valid and minimizing the need to modify and revalidate the scale.

Format: application/pdf
Language: en
Rights: Creative Commons Attribution 4.0 International
Subjects: Artificial intelligence; Big data; Machine learning; Personality; Personality assessment; Technology
Title: The AI-IP: Minimizing the guesswork of personality scale item development through artificial intelligence
Type: Article - Refereed
Journal: Personnel Psychology
DOI: https://doi.org/10.1111/peps.12543
Year: 2022
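
Note: To make the pipeline described in the abstract concrete, the minimal Python sketch below (using the Hugging Face transformers library) shows how a GPT-2 language model can generate candidate item text and how a zero-shot classifier can tag each candidate with a content domain. The base checkpoint, prompt stem, decoding settings, and domain labels here are illustrative assumptions, not the configuration used in the published AI-IP, which fine-tuned GPT-2 on existing personality items and built its own item pool and correlation model.

from transformers import GPT2LMHeadModel, GPT2Tokenizer, pipeline

# Illustrative sketch only: the off-the-shelf "gpt2" checkpoint stands in for
# the fine-tuned item-generation model described in the article.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Seed generation with a first-person stem typical of personality items (assumed prompt).
prompt = "I enjoy"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=20,
    do_sample=True,              # sample rather than greedy-decode, for item variety
    top_p=0.95,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,
)
candidate_items = [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# Zero-shot classification assigns each candidate item to a content domain,
# which can help balance domain coverage when assembling a scale.
domain_classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
domains = ["extraversion", "conscientiousness", "agreeableness", "neuroticism", "openness"]
for item in candidate_items:
    result = domain_classifier(item, candidate_labels=domains)
    print(item, "->", result["labels"][0])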