Capti Assess Research & Development of ReadBasix

ETS ReadBasix, previously known as RISE or SARA, is the most researched diagnostic assessment on the market, built on over two decades of research by a team of distinguished reading scientists, assessment researchers, and reading intervention practitioners at ETS and the SERP Institute. To fully appreciate the depth and academic importance of this product, we briefly review the history of ReadBasix and cite relevant research.

History of R&D

  • 2023 ETS® ReadBasix™ is linked to MetaMetrics® Lexile® reading measure

    In Fall 2022, Capti, ETS, and MetaMetrics conducted a Lexile alignment study with almost 4,000 students in grades 3-12 across 18 schools around the country. The students completed both ETS ReadBasix and MetaMetrics Lexile assessments, and the resulting data enabled alignment between ReadBasix and Lexile. In Spring 2023, full support for reporting Lexile measures based on students' performance on 3 of the 6 ReadBasix subtests (sentence processing, reading efficiency, and reading comprehension) was rolled out in Capti Assess.

  • 2020 Capti® Assess with ETS® ReadBasix™ is released for wide adoption

    In 2020, Capti became the official ETS distributor and made Capti Assess with ETS ReadBasix available for sale. Since its release, Capti Assess with ETS ReadBasix has been adopted by numerous school districts across the U.S., and ETS and Capti continuously improve the capabilities of the assessment.

  • 2018 Alpha version of Capti® Assess with ETS® ReadBasix™ released, testing in schools started

    In 2018, Capti partnered with ETS to bring ReadBasix to market. Capti integrated ReadBasix into its Capti Assess platform and iteratively tested and improved the product.

  • 2016 National norming study in grades 3-12

    In 2016, the team of reading scientists, assessment researchers, and reading intervention practitioners at ETS and the SERP Institute performed a national norming study in grades 3-12, after which the RISE (Reading Inventory and Scholastic Evaluation) evolved into its current form under the name ETS ReadBasix.

  • 2012 Field testing expansion with administrations at a large school district in Maryland

    In 2012, field tests of RISE (Reading Inventory and Scholastic Evaluation) expanded to include a large district in Maryland. The tests allowed ETS and the SERP Institute to refine RISE based on user feedback and analysis of the data.

  • 2010 $115M Department of Education IES Reading for Understanding (RfU) Initiative

    In 2010, ETS was awarded an assessment grant under the Reading for Understanding Initiative funded by IES at the U.S. Department of Education. This funding allowed the assessment to be expanded to cover additional grade levels, from Grade 3 in elementary school through high school.

  • 2007 One of the first large scale administrations in a large Massachusetts school district

    In 2007, one of the first large-scale administrations of the RISE (Reading Inventory and Scholastic Evaluation) occurred in a school district in Massachusetts. This allowed RISE to be field tested for the first time with students in entire middle schools.

  • 2004 Dr. Sabatini began working on RISE at ETS in collaboration with the SERP Institute

    In 2004, Dr. John Sabatini at the Educational Testing Service (ETS) began creating the RISE (Reading Inventory and Scholastic Evaluation) assessment, the predecessor to ETS ReadBasix. The work grew out of a collaboration with the Strategic Education Research Partnership (SERP) Institute, a group that works closely with school districts across the U.S. School districts in Massachusetts had noticed that many of their middle school students were arriving in 6th grade with weak reading skills, but the schools were not equipped to identify students' exact reading skill weaknesses, or what to do about them. RISE was initially designed specifically for middle school students, to give schools the information they needed to help struggling readers. The project was funded by grants from SERP, Carnegie, and Lila Wallace.

Technical Reports by ETS

  • SARA Reading Components Tests, RISE forms: Technical adequacy and test design, 3rd edition (Research Report No. RR-19-36). Princeton, NJ: Educational Testing Service. Sabatini, J., Weeks, J., O’Reilly, T., Bruce, K., Steinberg, J., & Chao, S.-F. (2019).

    This is the third and most recent edition of the technical report for the ReadBasix (SARA / RISE) assessment battery. It expands on the first and second editions by featuring a national sample of students in grades 3-12 (the first edition covered grades 6-8; the second, grades 5-10). The report includes a theoretical overview of the battery, which contains a subtest for each foundational skill (word recognition and decoding, vocabulary, morphology, sentence processing, and reading efficiency) as well as for basic reading comprehension. It also presents psychometric analyses, an item response theory scaling study, an evaluation of multidimensionality, validity evidence, and an evaluation of differential item functioning for gender and race/ethnicity.

  • SARA Reading Components Tests, RISE Forms: Technical Adequacy and Test Design, 2nd Edition (ETS RR-15-32). Princeton, NJ: Educational Testing Service. Sabatini, J., Bruce, K., Steinberg, J., & Weeks, J. (2015).

    The second edition of the technical report on the ReadBasix (SARA / RISE) assessment battery expands on the first by featuring grades 5-10 (the original covered grades 6-8). The report includes analyses for each subtest (word recognition and decoding, vocabulary, morphology, sentence processing, reading efficiency, and basic reading comprehension), psychometric analyses of parallel forms of each subtest, results of item response theory scaling studies for each subtest across the entire grade span, and an evaluation of differential item functioning for gender and race/ethnicity.

  • SARA Reading Components Tests, RISE Form: Test Design and Technical Adequacy (ETS RR-13-08). Princeton, NJ: Educational Testing Service. Sabatini, J., Bruce, K., & Steinberg, J. (2013).

    This is the first technical report on the ReadBasix assessment (SARA / RISE). ReadBasix was originally designed for struggling readers in middle school because teachers within a large, urban district wanted more information about why their students were struggling to read. The battery of assessments includes a subtest for each foundational skill: word recognition and decoding, vocabulary, morphology, sentence processing, and reading efficiency, as well as for basic reading comprehension. This report details the research base that supports the design and development of the reading skills components battery, and describes a pilot study with students in grades 6-8.

Relevant Research Papers by ETS

  • A tale of two tests: The role of topic and general academic knowledge in traditional versus contemporary scenario-based reading. Learning and Instruction, 73, 101462. Wang, Z., O’Reilly, T., Sabatini, J., McCarthy, K., & McNamara, D. (2021).

    This article presents research suggesting high school students’ academic knowledge is highly predictive of traditional comprehension assessments, which require identifying information and drawing inferences from single texts, but less so for scenario-based assessments, which call for integrating, evaluating, and applying information across multiple sources. Within the study, shortened versions of three ReadBasix subtests (vocabulary, morphology, and sentence processing) all strongly predicted academic knowledge (r’s .43–.57) and reading comprehension on both a traditional comprehension test (r’s .56–.57) and a scenario-based comprehension test (r’s .50–.54). The strength of the relation between ReadBasix and either comprehension test was comparable to the relation between the two comprehension tests (r = .57). Results demonstrated that ReadBasix subtests are valid indicators of students’ academic achievement, single-text comprehension, and scenario-based multiple-text comprehension.

  • When slower is faster: Time spent decoding novel words predicts better decoding and faster growth. Scientific Studies of Reading. Wang, Z., Sabatini, J., & O'Reilly, T. (2019).

    This article presents research from two studies that compared poor and normal decoders’ processing times on real words, pseudo-homophones, and nonwords (Study 1), and evaluated how a processing time difference is associated with rates of decoding development (Study 2). The results suggest that poor decoders spend more time recognizing real words and pseudo-homophones, but less time on non-words, whereas normal decoders spend more time decoding non-words. The researchers concluded that poor decoders may be trapped in a vicious cycle where poor decoding skill combined with less time spent attempting to decode novel words interferes with decoding development.

  • Decoding and reading comprehension: A test of the decoding threshold hypothesis. Journal of Educational Psychology, 111(3), 387-401. Wang, Z., Sabatini, J., O’Reilly, T., & Weeks, J. (2019).

    This article presents research from two studies that examined the relation between decoding and reading comprehension with middle and high school students. Using prominent reading theories as a basis, the authors propose the Decoding Threshold Hypothesis, which suggests the relation between decoding and reading comprehension can only be reliably observed above a certain decoding threshold. Study 1 tested the Decoding Threshold Hypothesis: researchers found a reliable decoding threshold value below which there was no relation between decoding and reading comprehension, and above which the two measures showed a positive linear relation. Study 2 presented a longitudinal analysis of reading comprehension growth as a function of initial decoding status. Results showed that scoring below the decoding threshold was associated with stagnant growth in reading comprehension, while scoring above it was associated with accelerating reading comprehension growth from grade to grade.
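    The Decoding Threshold Hypothesis can be pictured as a piecewise relation: flat below the threshold, positive and linear above it. The sketch below is purely illustrative (it is not ETS code, and the threshold, baseline, and slope values are hypothetical placeholders, not values estimated in the study).

```python
# Illustrative sketch only (not ETS code): the Decoding Threshold Hypothesis
# posits no relation between decoding and comprehension below a threshold,
# and a positive linear relation above it. All numbers are hypothetical.

def predicted_comprehension(decoding_score, threshold=225.0,
                            baseline=240.0, slope=0.5):
    """Piecewise model: flat below the threshold, linear above it."""
    if decoding_score <= threshold:
        # Below the threshold, decoding does not predict comprehension.
        return baseline
    return baseline + slope * (decoding_score - threshold)

print(predicted_comprehension(200.0))  # flat region: 240.0
print(predicted_comprehension(250.0))  # linear region: 252.5
```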

  • Middle school reading assessment: Measuring what matters under an RTI framework. Reading Psychology Special Issue: Response to Intervention, 33 (1-2), 162-189. O’Reilly, T., Sabatini, J., Bruce, K., Pillarisetti, S., & McCormick, C. (2012).

    This article describes an early conception of ReadBasix designed to measure six component and integrated reading skills, and examines the assessment's fit within an RTI framework. Aligning ReadBasix with research in cognitive science, reading, and learning allowed researchers to create an assessment that can help identify weaknesses in each of the six foundational skills. Additionally, the battery was found to be more predictive for students who were struggling readers. Using the information provided by the assessment's results, educators can make more informed decisions about who needs help, what help is needed, and whether the instructional support is effective.

Research Papers on ReadBasix by ETS

  • Engineering a 21st Century reading comprehension assessment system utilizing scenario-based assessment techniques. International Journal of Testing. Sabatini, J., O’Reilly, T., Weeks, J., & Wang, Z. (2019).

    This article presents a developmentally sensitive reading comprehension assessment grounded in a scenario-based assessment paradigm, which was designed to meet the evolving construct of reading comprehension. Evidence for the concurrent validity of ReadBasix is included. The authors found the ReadBasix comprehension subtest to be correlated with external measures of reading comprehension, specifically the Gates-MacGinitie reading test and the scenario-based assessment. The correlation between the ReadBasix comprehension subtest and the scenario-based assessment of reading comprehension is important because the scenario-based assessment requires higher level comprehension constructs and shows that higher level constructs are related to foundational comprehension as measured by ReadBasix.

  • How do people read the passages during a reading comprehension test? The effect of reading purpose on text processing behavior. Educational Assessment. O’Reilly, T., Feng, G., Sabatini, J., Wang, Z., & Gorin, J. (2018).

    This research study examined the effect of reading purpose on participants’ reading behaviors using eye-tracking technologies. Proficient undergraduate students read four passages; two required participants to write a summary, and two required answering multiple choice questions. Results indicated that more time was spent constructing a coherent mental model of text content (deep comprehension) when the purpose for reading included a written summary as compared to only answering multiple choice questions. This study provided evidence for content validity of the ReadBasix assessment because reading relevant parts of passages facilitated answering comprehension questions.

  • How individual differences interact with task demands in text processing. Scientific Studies of Reading, 21(2), 165-178. Wang, Z., Sabatini, J., O’Reilly, T., & Feng, G. (2017).

    This research study investigated how individual differences interacted with task requirements, utilizing eye-tracking technologies to measure undergraduate students’ reading efficiency. Researchers found that participants spent more time reading when the task required a written summary than when the task required only answering multiple choice questions. The time spent reading benefitted students who had relatively low reading efficiency, as they were able to answer the multiple choice questions more efficiently after writing a summary. The results provide evidence of structural validity for ReadBasix by showing convergence among reading comprehension, fluency, and summary writing measures.

  • Integrating scenario-based and component reading skill measures to understand the reading behavior of struggling readers. Learning Disabilities Research & Practice, 29(1), 36-43. Sabatini, J., O’Reilly, T., Halderman, L., & Bruce, K. (2014).

    This study presents data from two measures that were designed to provide a more holistic picture of reading comprehension. The measures include the Reading Inventory and Scholastic Evaluation (RISE), now known as ReadBasix, and the Global, Integrated Scenario-Based Assessment (GISA), now known as ReadAuthentix in the Capti Assess suite of assessments. The results show that each subtest on ReadBasix predicted unique variance on ReadAuthentix. Further, the study supports measuring foundational reading skills (the five component subtests of ReadBasix) when assessing reading comprehension, because weaknesses in lower-level foundational skills may impede comprehension.

Reports on ReadBasix Administration by Other Research Labs

  • Linking the ReadBasix™ Assessment with the Lexile® Framework for Reading. Linking Study Report. Redacted. Prepared by MetaMetrics for ETS under License Agreement, signed August 1, 2022. March 2023 (Updated April 2023).

    The primary purpose of this study was to link the ReadBasix Sentence Processing, Reading Efficiency, and Reading Comprehension subtests to the Lexile Framework for Reading. ReadBasix subtest scale scores can now be used to match students with appropriately leveled text, leveraging tools such as the Lexile “Find A Book,” and to answer questions related to standards, test score interpretation, and test validation. A predictive function was constructed to transform ReadBasix Sentence Processing, Reading Efficiency, and Reading Comprehension subtest scale scores into Lexile reading measures. The regression approach allows a profile of ReadBasix scores to be combined to predict a single Lexile reading measure, rather than requiring a separate function for each subtest.
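    The regression approach described above amounts to a single function that weights the three subtest scale scores together. The sketch below is purely illustrative: the intercept and per-subtest weights are hypothetical placeholders, not the coefficients estimated in the linking study.

```python
# Illustrative sketch only: the linking study fit one predictive (regression)
# function from three ReadBasix subtest scale scores to a Lexile measure.
# The intercept and weights below are hypothetical placeholders.

def predict_lexile(sentence_processing, reading_efficiency, reading_comprehension):
    intercept = -500.0          # hypothetical
    weights = (2.0, 1.5, 3.0)   # hypothetical per-subtest weights
    scores = (sentence_processing, reading_efficiency, reading_comprehension)
    return intercept + sum(w * s for w, s in zip(weights, scores))

# One function combines the full profile of subtest scores, rather than
# requiring a separate conversion for each subtest.
print(predict_lexile(200.0, 200.0, 200.0))  # 800.0
```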

  • Exploring thresholds in the foundational skills for reading and comprehension outcomes in the context of postsecondary readers. Journal of Learning Disabilities, April 2022. Magliano, J. P., Talwar, A., Feller, D. P., Wang, Z., O’Reilly, T., & Sabatini, J. (2022).

    This article presents evidence suggesting potential thresholds in foundational reading skills that may limit college students’ reading comprehension on both close and applied literacy tasks. It extends the work of Wang, Sabatini, O’Reilly, and Weeks (2019), which found that students’ growth in reading comprehension was conditional on their decoding scores, by exploring whether thresholds in foundational skills may limit reading comprehension for college students. The study included students who were determined to be underprepared for college and assigned to developmental literacy programs, as well as students determined to be prepared for college. The findings suggest that there are thresholds in foundational reading skills (decoding/word recognition, morphological knowledge, and sentence processing) with implications for students’ inclination to engage in the reading comprehension strategies of paraphrasing, bridging, and elaborating (all higher-level literacy tasks). Students who fell below the thresholds employed these reading strategies less than those who scored above them. These findings are important because they highlight problems with foundational reading skills that may persist into college.

  • Explanatory modeling in science through text-based investigation: Testing the efficacy of the Project READI intervention approach. American Educational Research Journal, 56, 1148–1216. Goldman, S. R., Greenleaf, C., Yukhymenko-Lescroart, M., Brown, W., Ko, M. L. M., Emig, J. M., George, M. A., Wallace, P., Blaum, D., & Britt, M. A. (2019).

    This article shares research on READI, a reading intervention designed to increase students’ reading comprehension. The Reading Inventory and Scholastic Evaluation (RISE), also known as ReadBasix, was used as the pretest, and the Global, Integrated Scenario-Based Assessment (GISA), now known as ReadAuthentix, was used as the posttest. Both ReadBasix and ReadAuthentix are part of the Capti Assess suite of assessments. Ninth-graders’ performance on the comprehension measures suggests that the skills measured by ReadBasix are related to the deep comprehension required by ReadAuthentix.

  • Engaging struggling adolescent readers to improve reading skills. Reading Research Quarterly, 52, 357–382. Kim, J. S., Hemphill, L., Troyer, M., Thomson, J. M., Jones, S. M., LaRusso, M. D., & Donovan, S. (2017).

    This article shares research on the Strategic Adolescent Reading Intervention (STARI), a supplemental reading program based on peer- and discussion-based instruction that supports word-reading skills, fluency, vocabulary, and comprehension. ReadBasix (formerly known as RISE) was used to measure the success of the intervention based on students’ scores. The results from 6th to 8th grade students indicate that the skills assessed by ReadBasix can be improved through targeted reading interventions such as STARI.

  • Examining general and specific factors in the dimensionality of oral language and reading in 4th–10th grades. Journal of Educational Psychology, 107, 884–899. Foorman, B. R., Koon, S., Petscher, Y., Mitchell, A., & Truckenmiller, A. (2015).

    This research article presents evidence that a large proportion of the variance in reading comprehension can be attributed to oral language, specifically lexical knowledge. The findings differ from the Simple View of Reading proposed by Gough and Tunmer (1986), which suggests that decoding and language comprehension together account for reading comprehension. The study also provides evidence for the concurrent validity of ReadBasix, as the component subtests were predictive of reading comprehension. ReadBasix subtests, specifically vocabulary and morphology, correlated with the Gates-MacGinitie reading test.

The R&D of Capti Assess with ETS ReadBasix was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305F100005 to the Educational Testing Service (ETS) as part of the Reading for Understanding Research (RfU) Initiative, as well as through the Small Business Innovation Research (SBIR) program contracts 91990019C0024, 91990021C0029, and 91990022C0042 to Charmtech Labs LLC. The opinions expressed are those of the authors and do not represent views of the Institute or the U.S. Department of Education.
