Johnson and Duran encouraged the use of cognitive laboratories as a means for determining whether a lack of access skills impedes measurement of target skills. In cognitive laboratories, students work one-on-one with an administrator and answer test questions by thinking out loud. The administrator observes and records the thought process students use in arriving at their answers.
Cognitive labs would allow researchers to compare how students with various disabilities react to the questions under different accommodations and to study further what constitutes appropriate accommodations.

Further Research on the Performance of English-Language Learners

Duran commented that better understanding of the achievement of English-language learners depends on improvements in access to appropriate assessment accommodations for these students. He called for additional work to develop ways to evaluate the English proficiency of nonnative English speakers, a particularly urgent issue in light of the recently passed legislation. He also encouraged researchers to examine the relationships between performance on achievement tests and relevant background variables, such as length of residence in the U.S., years of exposure to instruction in English, English-language proficiency levels, the characteristics of the school curriculum, the availability of first- and second-language resources, and other factors that interact to create different patterns of performance on assessments.
Malouf raised questions about what rate of participation should be expected with NAEP. The presentations and his own examination of NAEP publications indicate that inclusion rates rarely climb much above 70 percent of students with disabilities and are usually lower. He wondered what the basis might be for judging whether this rate of inclusion was high enough, asking, “Should our expectations be based on technical limits, or should they be based on other considerations?” Malouf called for reconsideration of what it means to “take part meaningfully” in the nation’s educational system, and he urged NAEP’s sponsors to determine ways that all students can participate.
The discussants revisited the issue of providing disaggregated results. Goertz reminded participants that states are required to report these comparisons on their state tests. NAEP’s sponsors have yet to specify their plans for using data from the national or state NAEP programs to report on the performance of students with disabilities compared to that of nondisabled students and the performance of English-language learners compared to that of native speakers. Johnson maintained that it is inevitable that there will be strong pressure on NAEP to report disaggregated results for students with disabilities and for English-language learners. Although at this time sample sizes are not large enough to allow reliable reporting at the disaggregated level, NAEP’s future plans for combining state and national samples may produce large enough samples to allow for disaggregation of various groups of students with disabilities. Johnson foresees that when this happens, NAEP will not be able to withstand the pressure to report disaggregated results.
Additional Research Is Needed

Malouf also recommended that additional research be conducted on the effects of accommodations on NAEP scores. He finds that the IRT (item response theory) and DIF (differential item functioning) analyses discussed by Mazzeo are broad in focus and treat accommodations as a single factor, sometimes even combining students with disabilities and English-language learners into a single population. Malouf suggested that NAEP researchers find ways to increase sample sizes to allow study of the effects of specific accommodations and to conduct more fine-grained analyses of accommodations and NAEP.