What assessment actually reveals about learning — and who it leaves out.
Digital assessment accessibility, NAEP process-data methodology, online and hybrid learning outcomes, and inclusive curriculum design for learners often treated as edge cases. Our education research uses click-stream data to see what proficiency tests never show.
Accessibility features aren't a checklist. They're data about what learning actually requires.
Assessment accessibility is usually treated as a compliance box: does the test have text-to-speech, magnification, color contrast, breaks, and extended time? Our NAEP Process Data project asks a sharper question. When those features are available, who actually uses them? How long do students engage with items when they do? And do those engagement patterns predict different learning outcomes for students with disabilities?
The answers reshape how we evaluate whether a test is fair. Uptake rates differ sharply across features, learner groups, and item types — and the process-data signal predicts outcomes that proficiency scores obscure.
NAEP project
Read the paper
"Accessibility features are not a checklist. They are data about what learning actually requires — and who the current test is failing to measure."
— RISEI education principle
Assessment, rethought.
What shifts when we treat accessibility as measurement data rather than a compliance checkbox.
Accessibility as a checklist
- Features listed in the test documentation
- One-size-fits-all default settings
- Pass/fail audit for availability
- No insight into who uses what
- Score gaps treated as proficiency gaps
Accessibility as data
- Click-stream capture of feature usage
- Differentiated design by learner group
- Engagement patterns analyzed per item
- Process data predicts outcomes
- Test design changes based on findings
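The "click-stream capture of feature usage" idea can be made concrete with a minimal sketch. The event schema below (student, group, item, feature, seconds) is entirely hypothetical and is not the actual NAEP log format; it only illustrates how uptake rates by learner group fall out of raw event data.

```python
from collections import defaultdict

# Hypothetical click-stream events: (student_id, group, item_id, feature, seconds).
# Field names and values are illustrative, not a real NAEP schema.
events = [
    ("s1", "SWD", "item1", "text_to_speech", 42),
    ("s1", "SWD", "item2", "text_to_speech", 31),
    ("s2", "SWD", "item1", "magnification", 15),
    ("s3", "non-SWD", "item1", None, 0),   # feature available but unused
    ("s4", "non-SWD", "item2", "extended_time", 120),
]

def uptake_by_group(events):
    """Fraction of students in each group who used any accessibility feature."""
    used, seen = defaultdict(set), defaultdict(set)
    for student, group, _item, feature, _secs in events:
        seen[group].add(student)
        if feature is not None:
            used[group].add(student)
    return {g: len(used[g]) / len(seen[g]) for g in seen}

print(uptake_by_group(events))  # → {'SWD': 1.0, 'non-SWD': 0.5}
```

The same aggregation extends naturally to per-feature and per-item breakdowns, which is where the differentiated-design questions begin.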
Four active fronts in education research.
Each thread is a live stream of work — from digital assessment to inclusive curriculum — anchored in NAEP data and state education agency partnerships.
Digital Assessment Accessibility
Using process data from NAEP digital math assessments to analyze how accessibility features — text-to-speech, magnification, contrast, breaks, extended time — are used by students with disabilities, and how that usage predicts performance. IES Innovation Grant.
Differentiated Design
Why one-size-fits-all defaults disadvantage some learners — and what differentiated accessibility design looks like in digital assessment.
Online & Hybrid Learning
Outcomes from online and hybrid learning for students with disabilities in under-resourced contexts — including digital access, purchasing power, and geographic dimensions. Built on data assembled during and after pandemic-era instruction.
Curriculum Innovation
New curriculum models for inclusive classrooms — from K–12 into workforce apprenticeship. How our apprenticeship curricula, deployed in Virginia EPIC and Maine P2P, translate learning-innovation principles into real pathways.
NAEP Process Data & Accessibility.
An IES Innovation Grant investigating how accessibility features in digital assessments affect math performance for students with disabilities — and what the findings imply for next-generation test design.
Open project page →
Process data as measurement infrastructure
The project's core contribution is methodological: moving NAEP analysis from proficiency scores alone to click-stream process data that reveal how students with disabilities engage with items.
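One basic quantity this kind of process-data analysis rests on is time on task, reconstructed from timestamped enter/exit events per item. The sketch below uses a hypothetical event log (the real NAEP process logs are richer and differently structured) to show how engagement durations per student and item can be recovered from click-stream data.

```python
from collections import defaultdict

# Hypothetical process events: (student, item, action, timestamp_in_seconds).
# The schema is illustrative only; real assessment logs differ.
log = [
    ("s1", "item1", "enter", 0),
    ("s1", "item1", "exit", 95),
    ("s1", "item2", "enter", 100),
    ("s1", "item2", "exit", 130),
    ("s2", "item1", "enter", 0),
    ("s2", "item1", "exit", 20),
]

def time_on_task(log):
    """Seconds each student spent on each item, summed over enter/exit pairs."""
    open_at, totals = {}, defaultdict(int)
    for student, item, action, t in sorted(log, key=lambda e: e[3]):
        key = (student, item)
        if action == "enter":
            open_at[key] = t
        elif action == "exit" and key in open_at:
            totals[key] += t - open_at.pop(key)
    return dict(totals)

print(time_on_task(log))
```

Durations like these, joined with feature-usage events and item outcomes, are the raw material for the engagement patterns the project analyzes.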
Published from this area.
Federal, philanthropic, academic.
Scoping an education innovation grant?
RISEI serves as named evaluator on IES, NSF EDU, Spencer Foundation, and state education agency grants — especially those involving digital assessment, accessibility research, inclusive curriculum, or AI-enabled education tools. Bring us in at the proposal stage.
Start a partnership →