Mixture Modeling for Discipline-Based Education Researchers (MM4DBERs) is a training program funded by the National Science Foundation (Award Number 2224786) that provides online training and mentoring to support the application of mixture modeling to participants' own research questions. The training is open to Discipline-Based Education Researchers (DBERs) actively conducting research on critical questions in STEM education around diversity, equity, and inclusion (DEI). Mixture modeling is a powerful tool for uncovering heterogeneity in student experiences, offering a richer understanding of students typically underrepresented in STEM fields. Although mixture modeling is not yet common among DBER scholars, it is a well-established tool with great potential to enrich the field's understanding of DEI-related issues in STEM education.

  • Two cohorts of 10 participants will be selected (Cohort 1: 2023-2024; Cohort 2: 2024-2025)

  • Applications for Cohort 2 are due by March 1, 2024. Participants will be notified of their selection by April 1, 2024, and must commit to participating by the end of April. Training for Cohort 2 will start in May 2024.

  • Each participant will receive a $2,150 stipend to support their research (e.g., for purchasing software or other needed research support)

  • Training will consist of 10 days of virtual instruction (synchronous and asynchronous) covering:

    • R/RStudio and the MplusAutomation R package

    • Fundamentals of data science for reproducible science

    • Latent class analysis and latent profile analysis

    • Latent transition analysis and growth mixture modeling

    • Covariates and distal outcomes in mixture modeling

    • Writing and communicating mixture modeling results

  • Mentoring will include monthly one-on-one meetings with project staff throughout the year to discuss each participant's analysis plans, along with opportunities to share findings with other cohort members.

  • Ongoing methodological and data analysis support