Dynamic Indicators of Basic Early Literacy Skills (DIBELS) has been a part of my teaching career since I arrived in Arizona and entered my current school district in 2016. I have prior experience with this assessment at different grade levels, both as a classroom teacher and as a DIBELS administrator. This reading assessment not only informs the creation of small groups in the classroom but also drives decisions under Arizona's Move On When Reading (MOWR) policy for specific students. Let's take a look at this assessment and see how appropriate it is in the elementary school setting.
History and Research
This assessment, developed at the University of Oregon, is a "set of procedures and measures for assessing the acquisition of literacy skills." Each measure is timed for one minute and is used to "detect risk and monitor the development of early literacy and early reading skills" in students. The University of Oregon has updated DIBELS over the years, modifying it to accommodate students in various states and districts. The assessment was revised in 2018 to include reading measures through eighth grade, and that version is named DIBELS 8th Edition. The assessment is given three times per school year: at the beginning of the year (BOY), middle of the year (MOY), and end of the year (EOY).
A study conducted by Dawn Sheldon Johnson (1996) found that DIBELS data had high reliability for students moving from Kindergarten to First Grade. This held only for specific measures, such as Letter Naming Fluency (LNF), Picture Naming Fluency (PNF), and Phonemic Segmentation Fluency (PSF). The measures administered to a group of Kindergarteners, and again a year later when they were First Graders, proved to be "valid predictors of future performance on pre-reading and reading skills in first grade" (Johnson, 1996, p. 57). Others are more critical: early childhood expert Samuel J. Meisels says DIBELS has "very, very weak validity," and the University of Arizona's Kenneth S. Goodman stated that DIBELS "is an absurd set of silly little one-minute tests that never get close to measuring what reading is really about - making sense of print" (Strauss, 2007).
Role of Standardization
There are specific measures and benchmarks students must obtain at each grade level. Below is the revised table of measures for this assessment, as of 2018.
"Parent Guide to DIBELS Assessment" by University of Oregon Center on Teaching and Learning
Looking at the table, students in Kindergarten begin with the basics of identifying letter names and first sounds. Eventually they progress to associating those sounds in CVC (consonant-vowel-consonant) words by listening to a spoken word and identifying all of its sounds. By the end of the school year, they must blend sounds together to either read a whole nonsense word fluently or recite each of its sounds separately. In later grades, students work up to skills such as oral reading fluency and comprehension in the Maze measure. Standardization comes from the requirement that students achieve a specific benchmark goal at each point of the school year. For example, below are the benchmark goals for a third grade student on the Oral Reading Fluency (ORF) and Maze measures.
Third Grade Benchmark Goals
| Year | ORF | Accuracy | Retell | Retell Quality | Maze |
|------|-----|----------|--------|----------------|------|
| BOY  | 70  | 95%      | 20     | 2              | 8    |
| MOY  | 86  | 96%      | 26     | 2              | 11   |
| EOY  | 100 | 97%      | 30     | 3              | 19   |
Students are placed into one of three categories: Intensive, Strategic, and At or Above Benchmark. Students in the Intensive or Strategic categories receive a Move On When Reading letter twice a year; it indicates which category they are in and what interventions are in place to make sure they get the support they need. Interventions include computerized reading intervention, reading intervention services, tutoring, and Tier 2/Tier 3 intervention small groups. Students categorized as Intensive or Strategic are also progress monitored: Intensive students are assessed biweekly and Strategic students monthly. Students At or Above Benchmark are assessed once every quarter.
My Recommendation
Overall, DIBELS seems to be a valid and reliable assessment, and it drives many of the data decisions in my school district and classroom. However, the data it provides can sometimes be skewed. For example, my coworker had a student, Student L, who performed below expectations. The assessment did not accurately reflect her reading skills because, at the time of testing, a member of Student L's family had died. She attended his funeral and returned to school still grieving. Because of these external factors, she took the assessment but did not perform to the best of her ability, and she was placed in Strategic when, at the BOY, she had been At or Above Benchmark. This is the only assessment of its kind I have been exposed to since arriving in Arizona, so I have no other assessment to compare it to.
Digital Contexts
DIBELS itself is not digitized; the only digital component is entering and recording the data from the DIBELS testing booklets in Acadience. In Acadience, reports for specific classrooms can be printed to review the DIBELS data. This is a great tool for creating small groups around specific measures such as fluency, accuracy, retell and comprehension, and other components.
Final Thoughts
Overall, this is the only reading assessment I have been exposed to or know in depth. I had to be trained in it as both a Second Grade and a Kindergarten teacher, since Kindergarten has different measures than Second Grade. The assessment shows student growth, and it shows when students are stagnant and not growing. It can also be a test that increases anxiety for students: being timed for one minute to read as fast as you can, then recalling what you remember in a coherent and understandable way, can be anxiety-inducing. Nonetheless, DIBELS is a good assessment that I know my school district will continue to use. We use this data for everything we do in the classroom when it comes to reading; there is not another assessment I could see our school district implementing.
As I wrap up this review, I have some questions to consider. I wonder what the University of Oregon considered when it decided that the oral reading measure should be one minute and Maze should be three minutes. I also wonder how this assessment could be administered electronically. For our state testing this year, there was a component in which third graders had to record themselves reading on the testing platform. It was very similar to the in-person task; the only factor that changed was that students spoke into headphones with microphones, and the testing platform recorded their one-minute response.
References
Johnson, D. S. (1996). Assessment for the prevention of early reading problems: Utility of dynamic indicators of basic early literacy skills for predicting future reading performance (Order No. 9706743). Available from ProQuest Dissertations & Theses Global. (304288581). Retrieved from http://ezproxy.msu.edu/login?url=https://www.proquest.com/dissertations-theses/assessment-prevention-early-reading-problems/docview/304288581/se-2?accountid=12598
Strauss, V. (2007, March 26). DIBELS test: A question of validity [Final edition]. The Washington Post. Retrieved from http://ezproxy.msu.edu/login?url=https://www.proquest.com/newspapers/dibels-test-question-validity/docview/410083328/se-2?accountid=12598
University of Oregon. (2022). Official DIBELS home page. DIBELS: Dynamic Indicators of Basic Early Literacy Skills. https://dibels.uoregon.edu/