What’s all the Nonsense about Teaching Nonsense Words?


It's been nearly 10 years now that our schools have been using the DIBELS Next (Dynamic Indicators of Basic Early Literacy Skills) as a school-wide reading assessment.  DIBELS was first introduced as part of Reading First and fit nicely into our new Response to Intervention initiative.  The DIBELS Next is a valid and reliable screener that can identify students at-risk for reading difficulties.  The idea is that if we can identify those students early, we can put interventions in place to actually prevent the reading problem!  Personally, I love this assessment.  It does exactly what it is supposed to do.  In the ensuing years, however, misunderstandings of the assessment have led to inappropriate use of the data as well as inappropriate instructional practices.  One of my issues surrounds the use of the Nonsense Word Fluency assessment.

What is the Nonsense Word Fluency Assessment?

Nonsense Word Fluency measures a student's ability to say the sounds for individual letters (applying the alphabetic principle) and then blend those sounds together to read words.  There is a large body of evidence that supports the use of pseudowords (nonsense words) for assessment purposes.  According to research, "pseudoword decoding is the best single predictor of word identification for poor and normal readers" and is the "most reliable indicator of reading disabilities" (Rathvon, 2004; Stanovich, 2000).  The assessment really is that powerful, and when you administer it, you glean a lot of information about the child's mastery of the alphabetic principle as well as his/her ability to blend sounds into words.  On the DIBELS Next NWF assessment, the student is given a page of "nonsense words" (pud, dak) and essentially asked to read them.  Some students read the whole words (/pud/), others say the individual sounds (/p/ /u/ /d/), and some use onset-rime (/p/ /ud/).  The assessment lasts one minute, and the assessor records sound errors as well as whether and how the student blended the sounds.  A score is recorded (for example, 22 correct letter sounds/3 whole words read) and then compared to a set standard for the student's grade.  Using this score, the teacher can determine whether the student's ability falls within the benchmark (doing fine), strategic (requires some additional instruction), or intensive (significantly at-risk) range.
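
For the data-minded, that last comparison step is nothing fancy.  Here is a minimal sketch, in Python and purely for illustration, of how a cls score gets sorted into one of the three ranges.  The benchmark goal of 43 cls is the winter first-grade goal discussed later in this post; the 30 cls cut point between "strategic" and "intensive" is a made-up placeholder, not an official DIBELS Next cut score.

```python
# Illustrative only: how a score might be compared against grade-level goals.
# 43 cls is the winter first-grade benchmark goal cited in this post;
# the 30 cls "intensive" cut point is a hypothetical placeholder,
# NOT an official DIBELS Next cut score.
def classify_nwf(cls_score, benchmark_goal=43, intensive_cut=30):
    """Return the support range suggested by a Nonsense Word Fluency score."""
    if cls_score >= benchmark_goal:
        return "benchmark (doing fine)"
    if cls_score >= intensive_cut:
        return "strategic (some additional instruction)"
    return "intensive (significantly at-risk)"

print(classify_nwf(22))   # -> intensive (significantly at-risk)
print(classify_nwf(45))   # -> benchmark (doing fine)
```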

 

Why is the NWF assessment timed?

All of the DIBELS assessments are timed.  This is because we want to know not only whether a student is accurate with sound-symbol correspondence, but also whether the student has learned the letters and sounds to automaticity.  In other words, are they so solid with their letters and sounds that they can say the sounds quickly and easily?  Students may be somewhat accurate with their letters and sounds, but if they are unsure and slow, their ability to read text and decode unknown words efficiently will be affected.

Purpose of the Nonsense Word Fluency Assessment

The presenter at my training many years ago said something to the effect of "nonsense words are used more for assessment and not as an instructional target."  In other words, the ultimate goal is not for students to read make-believe words.  The purpose of the assessment is two-fold.  First, we want to know if the child knows the most common sounds for the letters and, second, we want to know if the child can blend those sounds together to form words.  Real words cannot be used because there would be no way of knowing whether the child is recognizing the word by sight; real words would not isolate the skills we wish to assess.  It is important to think of the student's performance on the NWF as an "indicator" of the child's understanding of the alphabetic principle as well as his/her ability to blend sounds into words.  The DIBELS Next is a screening assessment, and its measures are predictors of later reading performance.  Below-benchmark performance on the NWF assessment is an indicator that the student does not have mastery of the alphabetic principle and/or is not yet proficient at blending.

Use of Nonsense Word in Instruction

Let's use the medical model to help us understand the use of nonsense words in instruction.  Last year I went to my family physician for a physical.  A series of routine screening tests was performed.  Based on the results of those tests, it was determined that I had high cholesterol.  Now, I don't "work" on my cholesterol number; I work on the factors that contribute to a higher-than-desired cholesterol (diet, exercise).  The same holds true with the use of nonsense words.  Students don't necessarily need to "work" on nonsense words.  They need to work on the skills necessary for quick and accurate decoding of unknown words (the alphabetic principle, blending).  It is always helpful to analyze a student's performance on this measure.  Analyzing the errors, as well as whether and how the student blended words, helps when planning intervention.  Here's a planning sheet to get you started:

[Image: preview of the NWF planning sheet]

Click the following link to download this 2-page freebie: NWF Planning Form

So with this in mind, here are a few more thoughts and ideas:

  1. It is important to know that students who are automatic and accurate with their letters and sounds and who can blend sounds together will do well on the DIBELS Next NWF assessment even if nonsense words were never used during your instruction.  Be sure your students are solid with their sound-symbol correspondence.  Whether or not you use nonsense words in your instruction, be sure to provide explicit and systematic instruction in blending.
  2. Your NWF data can be used to quickly identify sound errors as well as identify where your student falls on the word-blending continuum.  You will want to use this data (as well as other data available to you) to group students into skill-based groups.  Refer to the following chart.
  3. The progress monitoring component of the DIBELS Next NWF assessment is extremely valuable.  Be sure to progress monitor your students to confirm that they are making progress, and change your instruction based on that ongoing assessment.
  4. Some teachers choose to use nonsense words during instruction and others do not.  If you are using nonsense words, just be sure that you are not over-using them.  The vast majority (really, the vast majority) of your phonics instruction should use real words and words in context.  Personally, I find that using both nonsense and real words while teaching the skill of blending during directed small-group instruction is helpful: you can informally assess during instruction, provide more opportunities to practice a specific skill, and move at a faster rate of presentation.  See the Using Blending Boards blog post.
  5. I would caution against the use of nonsense words during independent literacy centers.  If a student can read a nonsense word as a whole word, there really is no need for him/her to practice reading nonsense words in a center; these students should be reading words in connected text.  Reading nonsense words is not the best use of their instructional time.  If a student is inaccurate or is not blending, he/she needs direct teaching of the skill with immediate, corrective feedback.  Placing a nonsense word fluency activity in a center for a student who is making errors runs the risk that the student will practice letters and sounds incorrectly.  Students who are not blending need explicit, scaffolded instruction on this skill, and that cannot occur in an independent center.


Dolch 220 Sight Word Assessment


The Dolch 220 sight words make up between 50 and 70 percent of the words we encounter in text.  Most of these words cannot be sounded out, so students need to be taught to recognize them instantly in order to become fluent readers.  Because recognizing these words is so important during reading, it is important that we use a variety of activities to teach, practice, and memorize sight words.

It all begins with assessment.  Before you begin teaching sight words, it is important to know which words your student(s) already know.  When you download the Sight Word Assessment and Progress Monitoring file, you'll receive the assessment, student recording forms, and progress monitoring charts.  You'll want to periodically re-assess your student(s) to be sure they are making progress.  Students absolutely LOVE coloring in their own graph.


Click HERE to download the Dolch Sight Word Assessment and Progress Monitoring materials FREE from my TpT store.

To assess your students, you can use the assessment materials contained in the file above, or you can use the Dolch 220 Sight Word Flashcards.  The flashcards can also be used for drill and practice activities.  I color-coded the flashcards according to the lists to help with organization.


Click HERE to download the Dolch Sight Word Flashcards from my TpT store

Especially when working with struggling readers, it is important to engage parents in helping their child learn to read.  Helping their child learn sight words is one way parents can play a role on the road to fluent reading.  You may want to download this parent handout on Learning Sight Words.  It contains ideas for fun ways to practice words.


Click the following link to download the FREE parent handout on learning sight words: Learning Sight Words

Using a multi-sensory activity to introduce a sight word increases the likelihood that the student will remember the word.  We first introduce a word using a multi-sensory activity and then use drill and practice activities.  Of course, it is very important that the student recognizes the word in text.  The student needs to see the word multiple times in text before it is learned to automaticity, so be sure to read, read, read (and read some more).


Click HERE to download the Dolch Sight Words Multi-Sensory Templates from my TpT store


Making the Most of the DIBELS Next Nonsense Word Fluency Data

My good friend Jen teaches a college-level reading assessment class and asked if I could talk a bit about aligning intervention with data.  Of course, I was thrilled she asked.  I decided to take one grade level and focus on the assessments typically administered in the winter.  All of our schools use the DIBELS Next as a school-wide screening assessment, so I wanted to take that assessment, analyze individual student performance, and see what information we can glean from it and what types of "digging deeper" assessments and intervention decisions we can make.

If your school administers the DIBELS Next and you subscribe to the online data system, you'll receive a printout of your students' performance.  The distribution report and the class list report will "categorize" your students as "benchmark," "strategic," or "intensive."  Knowing which category your students fall into is helpful in a way, but you really need to analyze their test booklets to get the most out of your data and to decide if and what type of intervention is needed.  The goal for a first grader in the winter on the NWF assessment is 43 correct letter sounds (cls) and 8 whole words read (wwr).  You may have two students who each scored 17 cls on the assessment but who have two very different needs in terms of intervention.  Let's take a look at a few students.

Ray

[Image: Ray's NWF scoring booklet]

Ray's NWF results show that he is attempting to blend, but he is very inaccurate.  Take a look at the types of errors he is making.  He is confusing his short vowel sounds and reversing "b" and "d".  He is also confusing the following sounds: g/j, f/v, and s/z.  These are similar-sounding phonemes; the f/v and s/z pairs have the same placement of the lips and teeth and differ only in voicing.  Ray may not be hearing the differences between the sounds.  Because of the large number of errors, I would "dig deeper" with an assessment of his letters and sounds.  In terms of intervention, I would incorporate the use of phonics phones and mirrors when I work with him during small-group instruction.


Click HERE to access a video on how to make and use a phonics phone.

David

Now let's take a look at David's performance.  David is not blending yet; he is saying the words sound by sound.  He is not yet consistent with his vowel sounds and is demonstrating b/d reversals.  David is not automatic with sound-symbol relationships.  Because there are numerous errors, a letter-sound assessment is in order.  Intervention will begin with the letters and sounds David has not mastered.

[Image: David's NWF scoring booklet]

David's intervention plan should include vowel discrimination activities for /a/, /e/, and /i/.  David would also benefit from the use of a blending board during intervention.  This will help not only with blending but also with sound-symbol automaticity.


Click HERE to download the free printable to make your own vowel sticks.

Breanna

Breanna is accurate with her sound-symbol correspondence, but her score of 34 cls falls below what is expected for the winter of first grade.  Breanna is not blending automatically.  When we listen to Breanna read text, she is sounding out words sound by sound, so her reading is slow and laborious.

[Image: Breanna's NWF scoring booklet]

Breanna’s intervention should focus on sound-symbol automaticity and blending.  Quick sound drills and the use of the blending board will be helpful.


Click HERE to access the blog post which contains a video on how to use a blending board, how to make your own and a link to download the free blending board cards.

Ashley

Ashley is very accurate with sound-symbol relationships.  She quickly says the sounds and easily goes back to blend the word.  Her score of 89 cls places her above benchmark, but her score of 0 wwr falls within the intensive range.  Ashley's performance on the Rigby Running Records places her reading skills at the beginning second-grade level.  She is also reading at a rate of 56 wcpm on the DIBELS ORF assessment (benchmark: 23 wcpm).

[Image: Ashley's NWF scoring booklet]

It's important to remember that the DIBELS Next is a screening assessment.  You must look at all of the student's data to determine whether the student is at-risk for reading difficulties.  Because Ashley's running records and ORF results fall above grade level, she is not having difficulty decoding or blending words in text; therefore, Ashley would not need a skills-based intervention.

One of the books that I find very helpful in interpreting data and structuring intervention based on data is Susan Hall's book I've DIBEL'd, Now What?  The book provides step-by-step instructions for analyzing data, much like what I did above with the NWF data.  It also provides a framework for grouping students as well as different formats for delivering the interventions.


It's important to remember that the purpose of assessment is to guide instruction and intervention.  Testing just to "test," or filing the data away in a filing cabinet, is not helpful.  Teachers who have the most success in accelerating student achievement are those who understand the purpose of assessment and who can use the data to structure both whole-class instruction and targeted intervention for struggling students.


DIBELS Next and the Benchmark Goal Controversy

I have to say this fall started in absolute chaos.  All of our elementary buildings use the DIBELS Next universal screener to identify students who may be at-risk for reading difficulties.  Shortly after our first few days of school, we received Part II: DIBELS Next Benchmark Goals from the University of Oregon Center on Teaching and Learning (CTL), in which the benchmark goals for each assessment dramatically increased.  Let me give you an example.  In the fall of first grade, using last year's benchmark goals, we would expect a child to achieve 27 correct letter sounds (cls) and 1 whole word read (wwr) on the Nonsense Word Fluency assessment.  Using the new "recommended goals" proposed by the CTL, we would now expect 42 cls and 7 wwr.  Oh my gosh, that's a dramatic increase in expectations!  This increase in goals held true for every grade level and every subtest.  After our schools received and read the document, panic set in and the phone calls started flying!

Through all the ensuing chaos over the next few days, one thing became evident early on: the CTL and the authors of the DIBELS Next are not one and the same.  I'm not sure why we thought they were, but shortly after reading Part II, we stumbled upon a document from the authors (aka the Dynamic Measurement Group, or DMG) in which they adamantly disagreed with the CTL's recommended goals.  Since that time, they've made a DIBELS Next Benchmark Goals video explaining their position.  Okay, I'm one of those visual people who survived graduate school by drawing everything out, and I felt that this issue, for sure, needed a cartoon so that I could wrap my head around what was going on.

Click the following link if you'd like a copy of the cartoon: DIBELS Next Benchmark Goals Explained

After taking a few collective deep breaths, we decided to hold an "Emergency DIBELS Next Meeting."  Present at this meeting were Reading Specialists from our local districts, Teacher Consultants, Psychologists, and an administrator.  The room was filled.  Each of us read the CTL Part I and Part II documents as well as the DMG document.  Papers were highlighted and notes scribbled in the margins.  We were not going to leave the room until we came to a consensus as to which goals to use for this school year.  A pros/cons list was developed for moving to the CTL recommended goals.  In a nutshell, we decided to continue using the DMG goals for this school year for the following reasons:

[Image: our pros/cons list and the reasons for staying with the DMG goals this year]

Following our meeting, we came across a document developed by Michigan's Integrated Behavior and Learning Support Initiative (MiBLSi).  Although they did not make a formal recommendation as to which goals to use, they developed a pretty nice chart to guide schools in their decisions.  Sorry I couldn't give you a direct link to the document; if you click on the above link and type "DIBELS Decision-Making Considerations" into their search box, the document will come right up.

So, for this school year, we are using the DMG goals and will continue to use the UO computer system to manage the data.  That meant I needed to update the DIBELS Next parent handouts.  If this is what you will be doing this year, feel free to download these handouts.

Click the following link to download the handouts: DIBELS Next Parent Handouts K-6 DMG goals for the CTL computer system

If you are not using the CTL computer system to manage your data (maybe you are scoring by hand or using one of the three other data management systems), the original DIBELS Next Parent Handouts, which can be downloaded for free, are still available on my Teachers Pay Teachers store.

As a group, we may decide to continue with the DMG benchmark goals in the years to come, or we may transition to the CTL goals.  We need to do our own careful analysis.  I have to say that I am so lucky to be working with such a large committee of intelligent, hard-working, and skilled professionals who understand the value of collaboration.  Whatever challenge presents itself, we'll figure it out together!


Four Types of Reading Assessments

Assessment and intervention are the heart and soul of Response to Intervention (RtI).  Before beginning either a school-wide program or an intervention plan for a particular student, it is critically important to have assessment data.  There are four types of reading assessments that make up a comprehensive K-3 reading assessment plan.  Each type of assessment is important in its own right and provides valuable information to school teams in the RtI process.  So, take a look at these types of assessments.  How comprehensive is your K-3 reading assessment plan?

1.  Screening-  The purpose of a screening assessment is to identify students who are at-risk for reading difficulties.  Identifying early on the students who are likely to struggle with learning to read is important because we can then develop intervention plans that, hopefully, PREVENT a lifelong reading deficit.

The DIBELS Next and the Aimsweb assessments are the two most commonly administered screening assessments in schools.  Our schools utilize the DIBELS Next assessments.   Students are screened three times a year using the particular assessments designed for each grade level.  For example, students in 2nd-6th grade read 3 grade level passages for one minute.  The number of words correct per minute (wcpm) is calculated and compared to an expected level of performance.   The assessments are easy to administer and take no more than 8 minutes per student.  When the data is entered into the computer-based system, a variety of charts/graphs are provided for school-based problem solving teams to analyze.  Below is a sample of a class list report that a teacher will receive after each benchmark period.
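
A quick aside for the data-curious: the wcpm number itself is simple arithmetic.  Here is a minimal sketch (the passage counts below are made up, and I am assuming the usual DIBELS Next convention of reporting the median of the three benchmark passages):

```python
from statistics import median

# Illustrative only: words correct per minute for a one-minute passage read.
# The passage numbers below are made up.
def wcpm(words_attempted, errors, seconds=60):
    """Words read correctly, scaled to a per-minute rate."""
    return (words_attempted - errors) * 60 / seconds

passage_scores = [wcpm(58, 4), wcpm(61, 2), wcpm(55, 6)]  # three one-minute passages
benchmark_score = median(passage_scores)                  # the middle score is reported
print(round(benchmark_score), "wcpm")                     # compare to the expected level
```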

2.  Progress Monitoring-  The purpose of progress monitoring is to track student performance during an instructional period.  Once a student is identified as at-risk for reading difficulties, an intervention plan is developed.  Every week or every other week, the student is assessed with a progress monitoring probe (usually a one-minute assessment).  The purpose of the assessment is to determine whether the student is making progress when provided with the additional support.  Below is a sample progress monitoring chart measuring oral reading fluency for a 2nd grade student.

Both DIBELS Next and Aimsweb offer progress monitoring assessments.

3.  Diagnostic-  Diagnostic assessments provide the teacher with more in-depth information about the student's skills.  They can range from standardized assessments to teacher-made classroom assessments.  The Quick Phonics Screener and the Primary Spelling Inventory are two assessments that we use to help us target specific deficits in the area of phonics.  We also find that a running records assessment is helpful when designing interventions.  Several schools use the Rigby Running Records while others use the Diagnostic Reading Assessment.  In Michigan, we use the Michigan Literacy Progress Profile to assess letters/sounds and specific phonemic awareness skills.

4.  Outcomes-  Outcome assessments are typically administered once a year.  These assessments are usually referred to as "high stakes" assessments, and the data is used to evaluate curriculum design, implementation, and teachers' efforts over the course of a school year.  Outcome assessments provide standard scores and percentiles so that the problem-solving team (and parents) can compare a particular student's performance to peers across the nation as well as peers within the district.  The two outcome assessments used in my schools are the Gates-MacGinitie for kindergarten and the Iowa Test of Basic Skills.  Although these assessments provide valuable information, they are costly and time-consuming to administer.

Click the following link to download this poster: 4 Types of Reading Assessments

Be sure to check back as I will be exploring each type of assessment in future blog postings.  This post corresponds with Principle 8 of the 8 Core Principles of Response to Intervention.


Assessing and Progress Monitoring Sight Words

One of the many benefits of my job is that I have the opportunity to travel to many different elementary buildings.  I know you may be sitting there thinking, "I'd hate that," but it's actually tons of fun.  Each building has its own unique personality, so every day of the week brings a brand new experience.  When I first started my job traveling among the 13 elementary buildings in our ISD, it was plainly obvious that each building had its own measure of what it considered to be "normal."  In fact, the idea of what was considered "normal" could (and often did) vary from teacher to teacher within the same building.  We were quite often faced with a situation where a teacher would bring a child to the Child Study Team wanting to refer the child for special education services, yet the child would have been completely within the "normal" range in the classroom next door.  This was quite a dilemma.

Within an RtI model, one of the first steps is to identify students at-risk for learning difficulties.  With such varying views of "normal," we felt we had to pull together and develop common assessments as well as expected performance levels at each grade.  Of course we use our universal screener (DIBELS Next); however, in making such important decisions about who receives intervention, we felt we needed additional assessment data.  We pulled together all of the Reading Specialists from our districts and came up with common assessments and cutoff points that we all agreed to (fortunately, one of our districts already had a good model developed).  The Reading Specialists then took this back to the teachers in their buildings.  We all now have a common vision of what is "normal" at each grade level.

Giving a sight word assessment was common in every school, but we were not all giving the same assessment (some used the Dolch sight words, others used their own lists), and we certainly were not giving it at the same time of year or with the same materials.  Now, beginning in mid-kindergarten and continuing through the end of second grade, all students are assessed on the Dolch 220 sight words three times a year.  An added bonus is that the vast majority of teachers are using the same form!  In fact, all RtI tracking and assessment forms are standardized across our schools.  When students move between schools, the forms go with them, and intervention and progress monitoring begin immediately.

Below is the assessment that we use for sight words.  Feel free to download this file; it can be found on my Teachers Pay Teachers store as a FREE download.  It contains the assessment materials, student recording forms, and progress monitoring graphs for the Dolch 220 sight words.  I've also included specific directions on how to administer the assessment.  The agreed-upon targets for each grade level's assessment periods (fall, winter, spring) are contained in the file.

This is the student recording form.  The Dolch 220 sight words are divided into 9 lists.  When you administer the assessment, you show the student the word and place a "+" or "-" in the appropriate square.  A correct response is recorded when the student quickly names the word; if the student sounds the word out or takes longer than 3 seconds, a "-" is recorded.  The student recording form allows for multiple administrations of the assessment.

In an RtI model, monitoring the student's progress during intervention is a key component.  Several options for charting progress are included.  I have the students color in the graph after each assessment so they can see how many words they are learning.  They LOVE this!  I literally have students beg to be tested.
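
If you happen to keep the recording forms in a spreadsheet, the number the student colors in on the graph is simply a count of the "+" marks.  A minimal sketch, with made-up marks:

```python
# Illustrative only: tally the "+" marks from a recording form and report
# progress toward all 220 Dolch words. The marks below are made up.
marks = {"the": "+", "of": "+", "and": "-", "a": "+", "to": "-"}  # ...and so on

words_known = sum(1 for mark in marks.values() if mark == "+")
total_words = 220
print(f"{words_known} of {total_words} Dolch words known "
      f"({words_known / total_words:.0%} of the list)")
```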

Click HERE to download the free Dolch sight word assessment.

This post correlates with Principles VI and VIII of the 8 Core Principles of Response to Intervention.
