Wednesday, March 2, 2016

AV#144 - Fulfilling the Colorado READ Act: a steep climb ahead

March 2, 2016


“By now Lewis and Clark were growing ever more anxious to catch sight of the Rockies, the mountain barrier they knew they would have to cross. In the last week of May, Lewis saw the mountains for the first time. He was filled with joy, immediately tempered by a realization of the challenge that lay ahead."   http://www.nationalgeographic.com/lewisandclark/journey_leg_7.html

Colorado Read Act – Fact Sheet
Supporting K-3 Literacy Development
Achieving reading competency by the end of third grade is a critical milestone for every student and predicts ongoing educational success. If a student enters fourth grade without achieving reading competency, he or she is significantly more likely to fall behind in all subject areas beginning in fourth grade and later grades. Early literacy development is … one of Colorado’s top education priorities.   (http://www.cde.state.co.us/)
We cheered the passage of the READ Act in 2012 and continue to hope it will achieve a key purpose: to see more students achieve “reading competency by the end of third grade.”  But after three years of data, I trust no one believes we are anywhere close to seeing all Colorado students enter 4th grade reading at grade level.  Last spring’s PARCC results for 3rd graders make it only more clear that, as READ Act advocates know full well, the new law only addressed a fraction of the tens of thousands of K-3 students who struggle with reading.

No news there: we knew the problem was bigger than the READ Act could tackle. The law was designed, to use the polysyllabic mouthful, to serve K-3 boys and girls determined to have a Significant Reading Deficiency (SRD).  But are we willing to admit what a steep climb we have ahead of us if we are to accomplish the READ Act’s ultimate goal?

I know this: we do not help matters by overstating our “success”—if success it is—in our implementation of the READ Act. 

Last July I thought it was important to say – addressing the other end of the K-12 system – that our “higher graduation rates” are probably inflated because we have no consistent measure across our 178 districts of what it means to qualify as a high school graduate.[1]  I make the same point here regarding the number of K-3 struggling readers in Colorado’s schools.  READ Act figures are not the full story. 
Our first PARCC results for third graders suggest that perhaps 40% of Colorado students entered 4th grade this past August well below the proficient level in English Language Arts – and over 60% were not yet proficient (see page 5).  Quite a different message than one might take from the first external report on the Colorado READ Act, “An Evaluation of Implementation and Outcomes after Year One,”[2] released last summer.  It was hard to get past the Executive Summary and not think—WOW! GOOD NEWS!
     “The results of this research are immensely positive, and we are thrilled to see that the policy’s implementation has had a significant impact on the lives of thousands of Colorado students.” Scott Laband, President Colorado Succeeds
    “The READ Act is making an incredibly positive impact in the lives of thousands of Colorado kids after just one year…. The majority of Colorado schools reduced the number of students with an SRD (significant reading deficiency). Many schools have seen dramatic reductions in the overall number of students with an SRD…. Statewide, the number of students with an SRD was reduced from 16% in 2013 to 14% in 2014, resulting in nearly 5,000 fewer students with a significant reading deficiency.”   Executive Summary, page 6                                    (Bold mine)

The headlines that followed the report’s release cheered as well:
·         Early literacy effort having impact, advocacy group reports  (Chalkbeat Colorado)
·         Colorado’s READ Act has a Positive Impact, Report Shows (Colorado Children’s Campaign)

And yet a close look at the study (henceforth referred to as the CRA report) makes one wonder.  Such a big change (5,000 fewer students) in only one year?  Only 14% of our K-3 students having great difficulty?

I examine the CRA report here, in part because we have new information since it was written: Year 3 data (2014-15) on the number of students identified as SRD eligible, as well as last year’s 3rd grade PARCC scores, and in part because it is imperative that we be honest about the challenge, and about how we evaluate progress.
**

First, note that the CRA report only provided data on 2012-13 and 2013-14.  Not a word on 2014-15.  Both Chalkbeat Colorado[3] and the Colorado Children’s Campaign[4] mistakenly reported that this report gave 2014-15 data. It did not, even though it was published after the 2014-15 school year had ended.  What we have since learned about 2014-15 is that many districts identified more K-3 students as SRD last year than in 2013-14, which raises the question: does success mean fewer students judged to be SRD? What if schools and districts that identify and serve a greater number of our struggling K-3 readers are actually doing more to fulfill the purpose of the READ Act?

Can we have multiple definitions of SRD - and still claim we are reducing the number of SRD students?

READ Act*
Year | Eligible Students | % of K-3 students
2012-13 | 42,479 | 16.5%
2013-14 | 37,506 | 14.4%
2014-15 | 36,420 | 13.8%
*Figures made available to me by CDE
My first point: how can “The Colorado READ Act – An Evaluation of Implementation and Outcomes after Year One” claim such success in reducing the number of students classified as Significantly Reading Deficient from one year to the next, when no common definition or assessment was used in 2013 and 2014, or from district to district?
I start here: there was no common test used statewide to determine which students were classified as SRD.  There were three possible tests in year one (2012-13) and seven in year two (2013-14), as the report itself makes clear.[5]  The report also acknowledges “limitations” in what to make of the data.[6] So that statement should read:                                                                                     
…the number of students IDENTIFIED BY SCHOOLS USING DIFFERENT TESTING OPTIONS was reduced from 16% to 14% …..
The Executive Summary also asserts that “after only one year of implementation, we uncovered some encouraging results”:
The majority of Colorado schools reduced the number of students with an SRD.
But again, to be accurate, that too-generous statement should read:
The majority of Colorado schools reduced the number of students IDENTIFIED BY SCHOOLS USING DIFFERENT TESTING OPTIONS with an SRD.

The big picture - 2012-13 to 2013-14

The CRA report is packed with interesting data from individual districts and schools, with breakdowns showing the LOWER number and percentage of students identified as SRD in several categories in 2014 versus 2013 (including “Schools that significantly reduced the percentage of ELL students with an SRD,” and the same for African-American/Black students, Hispanic/Latino students, and more).  But I find it strange that a 30-page report did not present the larger story. In the fall of 2014, based on data the Colorado Department of Education had already released[7], I put this together.

Read Act Funding Formula - 2012-13 to 2013-14*

District | 2012-13 | 2013-14 | Change
STATE | 41,942 | 36,993 | -4,949
DPS | 6,940 | 5,172 | -1,768
Jefferson County | 3,267 | 2,386 | -881
St. Vrain | 1,635 | 1,186 | -449
Cherry Creek | 1,914 | 1,553 | -361
Adams 14 | 685 | 420 | -265
Adams 50 - Westminster | 1,139 | 890 | -249
Aurora Public Schools | 3,867 | 3,637 | -230
Adams 12 | 2,530 | 2,355 | -175
*Numbers here do not always match numbers from CDE’s Office of Literacy, which is the source of most figures quoted in this newsletter.

“Specifically, the purposes of this evaluation study were to (1) determine if the READ Act successfully reduced the number of students with a significant reading deficiency (SRD) after its first full year….”   
      From Executive Summary of the CRA evaluation
The CRA report highlights such reductions and states: “these districts are the top performers.” Really?

I sent those numbers to CDE staff and met with them in November 2014 to ask several questions: Why the big drop in Denver, Jeffco, and several other districts? Was CDE confident DPS and Jeffco and other districts across the state were using the same measurements one year to the next?  Is it possible DPS, Jeffco, St. Vrain, etc. could use different criteria to identify students with a Significant Reading Deficiency? Could DPS honestly claim greater success in meeting the needs of its lowest readers than districts where the number identified stayed about the same, or grew, as in those below? Again, how do we define success?

Read Act Funding Formula

District | 2012-13 | 2013-14 | Change
Douglas County | 1,651 | 1,649 | -2
Poudre R-1 | 781 | 910 | +129
Harrison | 531 | 692 | +161
Colorado Springs | 1,301 | 1,466 | +165
Mesa County Valley 51 | 852 | 1,130 | +278

With no one consistent measure across the state[8] of what it meant to be identified as a student who was SRD, what did we really know about whether we were truly making progress?

TCAP & PARCC results invite the question: What percentage of our 3rd graders struggle to read?
Along with these SRD numbers, we now have at least two other ways of looking at the skills of 3rd graders in Colorado (2014 TCAP; 2015 PARCC) to gauge the number of 3rd graders who are, to use a deliberately vague term for the moment, struggling readers.  So let’s zero in on that grade.  Of course I realize that neither the 3rd grade TCAP nor the PARCC tests are a perfect match with the assessments approved by the READ Act.  CDE’s 2014 “Annual Report on the Implementation of the Colorado READ Act” (co-written by Patti Montgomery, at that time the Executive Director of the Office of Literacy at CDE) spoke directly to the difference between TCAP scores and the number of students identified as SRD.[9]

Nevertheless, TCAP and PARCC scores tell us something useful about how many 3rd graders are not proficient readers—and, given the various categories below proficient, how many are really struggling.  Do these scores raise legitimate questions about the SRD numbers? I believe they do. (If I am wrong, feel free to ignore what follows.)

SRD 2014: The CRA report tells us that in 2013-14, out of 64,382 3rd graders assessed, 11,220 (17%) were identified as SRD.  First grade (18%) and second grade (16%) had a similar percentage.[10] 

2014 – 3rd grade - TCAP – 28% not proficient  -  vs.  -  17% SRD

TCAP data from 2014 included a breakdown of the students in the state and in each district who scored in the lowest two categories as Not Proficient in reading: Unsatisfactory and Partially Proficient.  That number and percentage increased from 2013 to 2014. 
TCAP - 3rd grade reading | 2012-13 | 2013-14 | Change
Unsatisfactory | 6,047 - 10% | 6,447 - 10% | +400
Partially Proficient | 10,585 - 17% | 11,326 - 18% | +741
TOTAL | 16,632 - 27% | 17,773 - 28% | +1,141 (+1 percentage point)

Again, I know: TCAP and SRD are entirely different assessments. I realize it is unlikely that all Partially Proficient 3rd grade readers would be identified as significantly deficient readers.  I merely point out that the number not proficient on TCAP increased from 2013 to 2014 (16,632 to 17,773), while the number identified as SRD decreased over those same years (12,241 to 11,220) (CRA report, p. 10).
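To make that divergence concrete, here is a minimal sketch (Python, with illustrative variable names of my own choosing) that computes the year-over-year change in each measure from the figures quoted above.

```python
# Figures quoted above: 3rd graders not proficient on TCAP (CDE)
# and 3rd graders identified as SRD (CRA report, p. 10).
tcap_not_proficient = {"2012-13": 16_632, "2013-14": 17_773}
srd_third_grade     = {"2012-13": 12_241, "2013-14": 11_220}

def change(counts):
    """Raw change from 2012-13 to 2013-14."""
    return counts["2013-14"] - counts["2012-13"]

print(f"TCAP not proficient: {change(tcap_not_proficient):+,}")  # +1,141
print(f"Identified as SRD:   {change(srd_third_grade):+,}")      # -1,021
```

The two measures move in opposite directions over the same two years, which is the whole point of the comparison.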

                    
  3rd grade - 2014 TCAP Reading: 71% proficient   vs.   2015 PARCC English: 38.2% meeting expectations

The READ report’s Executive Summary used the 2014 TCAP results when it framed the problem this way:
Colorado’s most recent third grade reading results show that literacy continues to be an area of dire need of improvement. Third grade reading results went down statewide with just over 71% of students scoring proficient or above.                                                                            (bold mine)

But now that we have 2015 PARCC scores, we can update that statement—revealing an even greater crisis for early literacy in our state.  That “over 71%” proficient in reading, above, becomes “just over 38%” meeting expectations in English.  Again, I understand that PARCC provides us with only one score—combining results for reading and writing—so yes, an inexact comparison.  But telling?  As so few third graders opted out of the PARCC assessments last spring[11], we cannot use that as an excuse to dismiss these scores (as some do for PARCC results for 11th graders). 

Most frightening—if we believe these numbers—is to see that in several districts, less than 20% met expectations: Aurora-18.4%, Adams 50-16.8%, and Adams 14-14.3%.
                                                                                                                                                                              
Now see how the number of students scoring in the lowest two categories in English on PARCC matches up with the number identified as SRD in 2014-15. (NOTE: On PARCC, another 23.2% scored in a third category for students not at grade level: Approaching Expectations. So in all, 61.8% of our 3rd graders (23.2 plus 38.6, below) were not meeting expectations; 38.2% were meeting expectations.)

SRD 2015: CDE gave me the 2014-15 results, which are fairly consistent with the previous year: out of 66,117 3rd graders tested, 10,639 (16.1%) were identified as SRD.  In first grade (16.9%) and second grade (15.4%), not much different.

2015 - 3rd grade - PARCC English: 38.6% well below meeting expectations  -   vs.  -  16.1% SRD

PARCC results – grade 3: % in lowest two categories – 7 districts below state average – vs. SRD

(First three data columns: PARCC – English Language Arts – 2014-15. Last column: SRD – 2014-15.)
District | Did not yet meet expectations | Partially met expectations | TOTAL % in bottom 2 categories[12] | SRD – 2014-15
STATE of Colorado | 19.6 | 19.0 | 38.6 | 16.1%
Adams 12 | 23.9 | 20.0 | 43.9 | 21.1%
Pueblo City | 20.3 | 26.3 | 46.6 | 14.5%
Greeley | 25.9 | 21.4 | 47.3 | 24.8%
DPS | 27.8 | 20.2 | 48.0 | 23.8%
Westminster (Adams 50) | 33.7 | 27.5 | 61.2 | 35.5%
Aurora | 40.7 | 23.1 | 63.8 | 35.2%
Adams 14 | 39.9 | 26.1 | 66.0 | 38.9%

Perhaps the first column, Did not yet meet expectations, captures most of the 3rd graders found by the various READ Act assessments used last year to be significantly reading deficient (last column); note the similar percentages. But if PARCC gives us a valid assessment of English skills, used across the state, it reveals the enormous number of students we might call struggling readers.  Just look at the percentage of students who only Partially met expectations.   Add that percentage in and we find that in a district like DPS, nearly 50% of 3rd graders score in the bottom two categories on English Language Arts.  In Aurora and Adams 14, nearly two-thirds score that low.  To restate: the READ Act only identifies a fraction of our youngest students struggling to read. My best guess on that number: 40% of our 265,000 K-3 students, which is over 100,000 students.
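For readers who want the arithmetic behind that guess, here is a minimal sketch. It simply sums the statewide PARCC percentages quoted above and then, as a rough assumption of mine rather than anything in the CDE data, applies a 40% rate to the roughly 265,000 K-3 students.

```python
# Statewide 3rd grade PARCC English Language Arts results, 2014-15 (percent of students).
did_not_yet_meet = 19.6
partially_met    = 19.0
approaching      = 23.2

bottom_two         = did_not_yet_meet + partially_met   # 38.6% in the lowest two categories
below_expectations = bottom_two + approaching           # 61.8% below "Met Expectations"

# Rough, illustrative estimate only: apply ~40% (the bottom-two share, rounded up)
# to the roughly 265,000 K-3 students cited above.
k3_enrollment = 265_000
estimated_struggling_readers = 0.40 * k3_enrollment

print(f"Lowest two categories: {bottom_two:.1f}%")
print(f"Below Met Expectations: {below_expectations:.1f}%")
print(f"Rough estimate of struggling K-3 readers: {estimated_struggling_readers:,.0f}")  # 106,000
```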

2014-15 SRD numbers: in several districts an increase (and what if this is GOOD news?)

The CRA report covered only 2012-13 and 2013-14; since it was produced, we now have Year 3 (2014-15) data, specifically the number of students eligible as SRD when the 2015-16 school year began.  Two points: 1) It shows how much the numbers bounce from year 1 to year 2 to year 3, again raising questions about the consistency and reliability of the assessments being used.  2) It, too, challenges the assumption in the report that success equals a reduction in the number of SRD students.

That assumption would cheer significant reductions in K-3 students identified as SRD in districts such as those below.  But when numbers bounce up and down this drastically, isn’t there good reason to doubt the consistency and validity of the process for determining which students are SRD? (A quick calculation after the table makes the swings explicit.)


District | 2012-13 | 2013-14 | 2014-15 | Changes over time
STATE | 42,479 | 37,506 | 36,420 | 6,059 fewer SRD students over 2 yrs
Mesa County Valley 51 | 852 | 1,130 | 599 | cut # by 531 (nearly 50%) in one year!
St. Vrain | 1,635 | 1,186 | 743 | cut # by 892 (over 50%) from 2012-13
Thompson R2-J | 674 | 581 | 359 | cut # by 315 (about 47%) from 2012-13
DPS | 6,940 | 5,172 | 5,027 | cut # by 1,913 (over 27%) from 2012-13
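One way to see how much these counts bounce is to compute the percent change from year to year for each district in the table above; the following minimal sketch does just that, with the district list and counts taken from the table.

```python
# SRD counts from the table above (2012-13, 2013-14, 2014-15).
srd_counts = {
    "STATE":                 (42_479, 37_506, 36_420),
    "Mesa County Valley 51": (   852,  1_130,    599),
    "St. Vrain":             ( 1_635,  1_186,    743),
    "Thompson R2-J":         (   674,    581,    359),
    "DPS":                   ( 6_940,  5_172,  5_027),
}

for district, (y1, y2, y3) in srd_counts.items():
    swing_1 = (y2 - y1) / y1 * 100   # percent change, 2012-13 to 2013-14
    swing_2 = (y3 - y2) / y2 * 100   # percent change, 2013-14 to 2014-15
    print(f"{district:24s} {swing_1:+6.1f}%  then  {swing_2:+6.1f}%")
```

Mesa County Valley 51, for example, rises about 33% one year and falls about 47% the next, which is exactly the kind of swing that makes the figures hard to trust.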

That assumption would also suggest that the many districts, like the 10 below, that identified more significantly deficient readers in 2015 than in 2014 are doing a poor job of implementing the READ Act.[13]

Read Act Funding Formula - 2013-14 to 2014-15*

District | 2013-14 | 2014-15 | Change from previous year
STATE | 36,993 | 35,974 | -1,019
Jefferson County | 2,386 (10%) | 2,799 (11.4%) | +413
Adams County 14 | 420 (17.9%) | 591 (25.1%) | +171
Pueblo 60 | 809 (13.7%) | 952 (15.9%) | +143
Greeley | 1,247 (18.3%) | 1,383 (20.1%) | +136
Poudre | 910 (10.7%) | 972 (11%) | +62
Colorado Springs | 1,466 (16.4%) | 1,524 (17.3%) | +58
Adams 12 | 2,355 (18.4%) | 2,405 (20%) | +50
Fountain | 427 (14.4%) | 470 (16.7%) | +43
Montrose County RE-1J | 251 (15.4%) | 290 (17.6%) | +39
Littleton 6 | 304 (7.2%) | 340 (7.9%) | +36
*Numbers here do not always match numbers from CDE’s Office of Literacy, which is the source of most figures quoted in this newsletter.

Another View, a contrary view, might be that districts are more successful when they identify a higher percentage of their struggling readers, for it means a greater number of boys and girls who are not reading at grade level will benefit from the funds ($32 million this year) and the services made available through the READ Act.


The CRA report highlighted “success” in one district, Westminster (Adams 50), and several schools

Finally, the report offered a closer study of one school district, Westminster (pages 22-23), and four elementary schools (pages 116-21)—“recognized for their success in reducing the number of students with an SRD.”  A 2015 update—now with PARCC assessments in front of us—suggests the CRA report was cheering too soon.

According to the CRA report, Westminster was chosen “because it had reduced the percentage of students identified as having an SRD by eight percentage points from 2012-13 to 2013-14.”  The report included an interview with Mat Aubuchon, Director of Early Childhood Education, who was asked about his “biggest celebration.”
Aubuchon: The drop in the number of students identified as having an SRD; increased use of data in the elementary schools; principals starting to hold K-2 teachers accountable.

Westminster (Adams 50) - 28% K-3 SRD (2014)

 | 2012-13 | 2013-14 | Change from 2012-13 to 2013-14
# of K-3 students | 3,153 | 3,146 |
# of K-3 students with an SRD | 1,139 | 890 | 249 fewer students
% of K-3 students with an SRD | 36% | 28.5% | 8 percentage points lower
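A quick sketch of the arithmetic behind that table, recomputing the percentages from the enrollment and SRD counts shown above:

```python
# Westminster (Adams 50) K-3 counts from the table above.
k3_2013, srd_2013 = 3_153, 1_139
k3_2014, srd_2014 = 3_146,   890

pct_2013 = srd_2013 / k3_2013 * 100   # about 36.1%
pct_2014 = srd_2014 / k3_2014 * 100   # about 28.3%

print(f"2012-13: {pct_2013:.1f}% of K-3 students with an SRD")
print(f"2013-14: {pct_2014:.1f}% of K-3 students with an SRD")
print(f"Change:  {srd_2014 - srd_2013:+} students, "
      f"{pct_2014 - pct_2013:+.1f} percentage points")
```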

We now have 2015 SRD figures suggesting Westminster is suddenly not a “top performer”; in fact, like the 10 districts listed above, its SRD numbers went up last year.

2012-13 | 2013-14 | 2014-15
1,139 - 36% | 890 - 28.5% | 910 - 29.1%
Number of students and % from https://www.cde.state.co.us/cdefinance/readactperpupilfunding-0, CRA report, and emails to me from CDE.

Hey, some will say, the overall numbers are down from 2012-13.  Can’t we at least celebrate that? 

But I cannot.  For as I have demonstrated, the SRD figures only reveal a portion of the K-3 students in Colorado who struggle to read at grade level.  Westminster is a good example.

In 2014, we can compare the SRD percentage with TCAP figures for third grade in Westminster.
SRD – 3rd grade – 33.6%[14] - about 232 students
TCAP – 3rd grade – 43.7% not proficient (Of 690 3rd graders taking TCAP, 16.23% scored Unsatisfactory; 27.39% scored Partially Proficient.  A total of 301 3rd graders were not proficient in 2014.) (http://www.cde.state.co.us/assessment/coassess-dataandresults)
In 2015, we can compare the SRD percentage with PARCC results for third graders in Westminster.
SRD – 3rd grade – 35.5%[15] - about 255 students
PARCC – 3rd grade – 61.2% of 3rd graders (440 students) fell well short of Meeting Expectations in English.

2014-15 – PARCC – 3rd grade (Westminster)
# students tested | # Did Not Yet Meet Expectations | % Did Not Yet Meet Expectations | # Partially Met Expectations | % Partially Met Expectations | # and % in two lowest categories
719 | 242 | 33.7 | 198 | 27.5 | 440 students - 61.2% of 3rd graders Did Not Yet Meet or Partially Met Expectations
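To line the two measures up for Westminster, here is a minimal sketch comparing the 3rd grade PARCC bottom-two count with the SRD count; the 255-student SRD figure is the approximation quoted above, not an official count.

```python
# Westminster (Adams 50), 3rd grade, 2014-15, from the table above.
parcc_tested     = 719
did_not_yet_meet = 242
partially_met    = 198
srd_identified   = 255   # approximate count, from the 35.5% SRD figure quoted above

bottom_two     = did_not_yet_meet + partially_met   # 440 students
bottom_two_pct = bottom_two / parcc_tested * 100    # about 61.2%

print(f"PARCC, lowest two categories: {bottom_two} students ({bottom_two_pct:.1f}%)")
print(f"Identified as SRD:            {srd_identified} students (approx.)")
print(f"Struggling readers beyond the SRD count: roughly {bottom_two - srd_identified}")
```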

2014-15 update – Cole Arts and Sciences Academy - PARCC

The CRA report also provided a picture of the work in several elementary schools, including Cole Arts and Sciences Academy in Denver.  We read of a remarkable decrease in K-3 students identified at Cole as SRD between 2013 and 2014: 137 students (45%) down to 65 (23%), a drop of 22 percentage points.  An inspiring story. To hear about Jessica Jackson and her team makes one eager to say: KEEP UP THE GOOD WORK!

And yet … one more inconvenient truth.  OK, not a truth—just test scores.  But telling?  Troubling? 
Here are the grim PARCC numbers at Cole Arts and Sciences Academy:

93.1% of 3rd graders scored below Met Expectations on PARCC – English (2015)
2014-15 – PARCC – 3rd grade (Cole Arts and Sciences Academy)
# 3rd grade students | Did Not Yet Meet Expectations (% / #) | Partially Met Expectations (% / #) | Approached Expectations (% / #) | Met Expectations (% / #)
58 | 36.2% (21) | 27.6% (16) | 29.3% (17) | 6.7% (4)

These numbers are not put in front of you, and especially not in front of the Cole teachers and the many like them who work so hard to help the boys and girls in their care learn to read well in their first few years in public school, to discourage anyone.
They are here only to insist that we not misstate the challenge, that we not exaggerate our "success" to date or understate the tough trek before us.
Climb every mountain
“… we proceeded on to the top of the dividing ridge from which I discovered immense ranges of high mountains still to the West of us with their tops partially covered with snow."  
                                Meriwether Lewis, August 1805, http://www.nps.gov/nr/travel/lewisandclark/lem.htm

Lewis and Clark and their team headed west over 200 years ago with an ambitious goal: to reach the Pacific.  When they saw the Rockies, it must have dawned on them too: a steep climb ahead.

We cheer the goal: to see our students reading at grade level as they start 4th grade.  But no illusions, please. It won’t be easy.

Another View is a newsletter by Peter Huidekoper.  Comments are welcome. 303-757-1225 - peterhdkpr@gmail.com



[3] “The program rolled out in the 2013-14 school year, so results from 2014-15 provided data for comparison.”  Bold mine
[4]  “The report’s authors compared assessment data from the 2013-14 school year (the first of implementation) with 2014-15 data….” http://www.coloradokids.org/colorados-read-act-has-a-positive-impact-report-shows/       Bold mine
[5] “In the first year of implementation, schools were required to use one of three tests previously approved for the Colorado Basic Literacy Act. The options available included the Developmental Reading Assessment 2nd Edition (DRA), DIBELS (either DIBELS Next or DIBELS 6th Edition), or the Phonological Awareness Literacy Screening .…  In 2013, the State Board of Education approved seven different testing options, from which schools can choose to administer” (p. 7).
[6] “The assessment results provided by CDE were not always easily interpreted and we recognize the statistical and psychometric limitations associated with analyzing data from a single year of implementation. Some districts showed a significant and unexplained discrepancy in the number of students in a particular category from one year to the next…” (p. 10).
[8] I am not calling for one test. I am simply stating that different tests make it necessary to be cautious, at the very least, about comparisons of SRD numbers from one year, or one district, to the next.  Especially, of course, when 2012-13 preceded the distribution of any READ Act funds!  I heard concerns about the validity of SRD numbers from several K-1 teachers when I visited an elementary school last month.  They expressed no fundamental problem with the READ Act, just concern about the amount of time spent documenting and updating plans: still trying to develop “an effective process to put that into effect” in our school; “not clear” in many cases on identifying the SRD students; “seems like a lot of gray area”; “there is so much room for error,” even with Scantron, which provides more consistency; “I wouldn’t let different people progress monitor,” as it was leading to such a range of scores; “I think DIBELS are really good tests, but we just need to administer them consistently.”

[9]“Students in third grade in the 2012—2013 school year were assessed with both the Transitional Colorado Assessment Program (TCAP) test and one of the State Board approved READ Act interim assessments in the spring of 2013. The READ Act interim assessments identified a higher proportion of struggling readers than the TCAP (19% and 10%, respectively). While this may seem incongruous, it is important to remember that the interim assessments measure only the critical early literacy indicators that are most predictive of future reading success and therefore are not comprehensive in nature….” (http://www.cde.state.co.us/coloradoliteracy/readactannuallegislativebrief2014final)
from READ Act Report - 2013-14

Grade level | # of K-3 students READ Act tested | # of K-3 students with an SRD | % of students with an SRD
All K-3 | 261,343 | 37,506 | 14%
½ Day Kindergarten | 17,822 | 1,210 | 7%
Full Day Kindergarten | 47,411 | 2,921 | 6%
1st Grade | 66,309 | 11,619 | 18%
2nd Grade | 65,419 | 10,536 | 16%
3rd Grade | 64,382 | 11,220 | 17%

[12] I highlight the middle column above to emphasize the huge percentage of students scoring in the lowest two categories. These columns do not even include that third category of students also falling short: Approaching Expectations, statewide, another 23.2% of third graders.

[13] Consider this example, and tell me if this school is succeeding, or failing, to implement the READ Act:
From the school’s Unified Improvement Plan for 2013-14 -Target – Read Act
·         Number of SRD students will lower to 5%. (DIBELS)
A year later, from the school’s UIP for 2014-15 – (A look back) - Performance on Target
·         The goal was not met as the percent of SRD students increased to 13% at End of Year assessment.
QUESTIONS A SCHOOL MIGHT ASK:  What if we are conscientious in assessing our new kindergarten and first grade students and identify a large percentage of them as Significantly Reading Deficient?  Isn’t this what we should be doing?  But then, even if we help a good many of our returning SRD students improve to the next level, our overall K-3 SRD numbers might rise.  If this is implementing the READ Act well, isn’t that “success”?
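To make that scenario concrete, here is a hypothetical sketch; every number in it is invented for illustration and corresponds to no real school.

```python
# Hypothetical school; all numbers invented for illustration only.
k3_students = 400

# Year 1: 30 students identified as SRD (7.5%).
srd_year1 = 30

# Year 2: 20 of those 30 returning students improve off the list,
# but more thorough screening of incoming K-1 students identifies 42 more.
remaining_from_year1 = srd_year1 - 20
newly_identified     = 42

srd_year2  = remaining_from_year1 + newly_identified
rate_year2 = srd_year2 / k3_students * 100

print(f"Year 2 SRD count: {srd_year2} of {k3_students} ({rate_year2:.0f}%)")
# The count rises from 30 to 52 (7.5% to 13%) even though most returning SRD
# students improved - arguably good implementation, not failure.
```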

[14](email from CDE to me)
[15](email from CDE to me)
