
Published on Friday, August 18, 2023

NAEP Validity Studies Panel Publishes White Paper: Improving Equitable Measurement and Reporting in NAEP

In June 2023, NAEP Validity Studies (NVS) panelist Dr. Gerunda B. Hughes of Howard University published a white paper examining NAEP's history and equity issues and offering recommendations for improving "equitable measurement and reporting in NAEP."

NVS is an independent panel of experts contracted by the National Center for Education Statistics (NCES) to study the validity of all aspects of the National Assessment of Educational Progress (NAEP), including assessment development, data collection and analysis, and reporting and data use. The panel’s work is important as technological innovations, such as digitally based assessments, push testing forward and generate new considerations for monitoring the validity of the NAEP assessment. This white paper is one of many produced by NVS panelists over more than 20 years of research on issues of NAEP validity. Additional detail on the role of the panel and a complete list of reports produced by the NVS panel can be found here.

The most recent NVS white paper, Dr. Gerunda B. Hughes’s Improving Equitable Measurement and Reporting in NAEP, examines the history of NAEP and its validity, fairness, and equity. The historical background begins with an overview of “the NAEP Law,” or the National Assessment of Educational Progress Authorization Act of 2002, which mandates the administration of NAEP and sets out goals including collecting “representative data on a national and regional basis” in a “valid and reliable manner” without excluding “special student groups.” Hughes lays out the purpose of NAEP according to the NAEP law, which is to provide a “fair and accurate measurement of student academic achievement.”

Hughes goes on to describe some of the "assumptions, policies, and practices" undergirding assessments like NAEP that researchers have challenged over the years, such as standardization or "assumptions of construct equivalence across culturally different test takers." Hughes surveys literature addressing these and other elements and their impact on test validity, and examines the meaning and practical implications of terms like validity, fairness, and equity as they relate to assessments generally and to NAEP specifically.

Hughes ultimately makes suggestions for the improvement of equitable measurement and reporting in NAEP, including recommendations on how to implement equitable features, potential paths for future research, and an examination of who would benefit (directly or indirectly) from proposed changes to increase equity in NAEP. Hughes suggests, among other things, analyzing the performance of demographic subgroups against what is known about the subgroup overall so as to facilitate fair and meaningful comparisons with other subgroups; comparing “NAEP subject area content framework objectives and assessments” with their counterparts in state and district assessments to assess differences in content and practice; and providing options and choice in passages when measuring reading comprehension, allowing students to choose a passage that engages them. To read more of Hughes’s recommendations and get more detail on their context, history, related research questions, and more, check out the white paper here.

