
Published on Friday, November 10, 2023

NAEP Process Data Researchers Create Winning Data Visualization

Ruhan Circi and Juanita Hicks, researchers at the NAEP Center for Process Data, won the 2023 Educational Measurement: Issues and Practice (EM:IP) Cover Graphic/Data Visualization Competition. Their visualization, which compares traditional data and process data measurements of the percentage of students who did not reach an item in a NAEP assessment, was featured on the cover of the most recent EM:IP issue, published in September.

EM:IP is a quarterly journal from the National Council on Measurement in Education (NCME) focusing on educational measurement through test scores and other modes of assessment. According to NCME, the primary purpose of the journal is “to promote a better understanding of educational measurement and to encourage reasoned debate on current issues of practical importance to educators and the public.” The journal serves as a means for NCME members to communicate among themselves and with the public about issues of assessment.

The EM:IP Cover Graphic/Data Visualization Competition began in 2014 with a call for NCME members to submit original data visualizations that make educational assessment data accessible or illustrate key statistical and psychometric concepts. The EM:IP editorial board selects winning submissions based on originality, visual appeal, storytelling, relevance to the field, and thoughtful analysis, and winners are featured on the covers of new issues. Circi and Hicks have won this competition before; another of their award-winning visualizations appeared on the cover of the spring 2021 issue. To peruse past winners, check out the EM:IP Cover Gallery.

Circi and Hicks’ winning visualization appears on the cover of EM:IP volume 42, issue 3, released in the fall of 2023. The researchers provided the following description to contextualize the graphic:

This graph explores the differences between the percentage of students not reaching items in an assessment block from two different sources: traditional data and process data.

Descriptively, terms like “not reached” and “missing response” are aligned with “not presented” or “not attempted,” which can mean examinees did not see the item and are thus labeled “not reached.” But what if examinees did see the item? Process data can offer some solutions to these questions and potentially help to improve current item labeling rules that are rooted in paper-and-pencil tests.

When there are no process data records (e.g., student/system actions, response time) for an item, it indicates that the item was indeed not reached. Therefore, it is relatively easy to look at the percentage of students who truly do not reach an item. How does this compare to results from traditional labeling rules for not reached?
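
To make that comparison concrete, here is a minimal sketch of the two calculations on a small, made-up dataset. The column names (label, n_actions, response_time) and the labeling rule are hypothetical illustrations, not NAEP's actual coding:

```python
import pandas as pd

# Illustrative records for one item; all column names and codes are hypothetical.
df = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "label": ["not_reached", "not_reached", "answered", "omitted", "not_reached"],
    "n_actions": [0, 3, 5, 1, 0],                  # actions recorded in the process data log
    "response_time": [0.0, 12.4, 30.1, 8.2, 0.0],  # seconds spent on the item
})

# Traditional rule: count students whose assigned label marks the item as "not reached".
pct_traditional = (df["label"] == "not_reached").mean() * 100

# Process-data rule: treat the item as truly not reached only when the log shows
# no recorded actions and no response time for that student.
truly_not_reached = (df["n_actions"] == 0) & (df["response_time"] == 0)
pct_process = truly_not_reached.mean() * 100

print(f"Not reached, traditional label: {pct_traditional:.1f}%")
print(f"Not reached, process data:      {pct_process:.1f}%")
```

The gap between the two percentages is the kind of difference the cover graphic visualizes: students flagged as "not reached" by traditional rules who nevertheless left traces in the process data.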

To learn more about EM:IP, check out the journal overview here. Stay up to date on process data news and potential training and internship opportunities from these leading NAEP research experts by signing up for our mailing list.


Plugged In News

The Summer 2024 NAEP Data Training Workshop - Applications Open

04-12-2024

Applications are now open for the summer 2024 NAEP Data Training Workshop! This workshop is for quantitative researchers with strong statistical skills who are interested in conducting data analyses using NAEP data. For the first time, participants in this year's training will get an introduction to COVID data collections. Learn more here!

EdSurvey e-book now available!

02-14-2022

Analyzing NCES Data Using EdSurvey: A User's Guide is now available online here for input from the research community. Check it out and give the team your feedback.
