Monday, March 21, 2016

AV#145 - Teacher Leadership & Collaboration: DPS develops a better way to evaluate and support teachers

March 22, 2016


“This is an enormous paradigm shift from the traditional way we’ve done school. We’re still learning
and there are bumps along the road. But it’s been extraordinarily positive so far.”
Superintendent Tom Boasberg (“DPS to expand teacher leadership program,” Chalkbeat Colorado, 2/9/16)

Teacher Leadership & Collaboration Model
# of schools with Team Leads keeps growing

2013-14:  14
2014-15:  40
2015-16:  72
2016-17:  110 (projection)
It is not, I hope, a sign of the Apocalypse to learn that the largest school district in Colorado has adopted policies on teacher evaluation and support that address concerns raised in Another View, over several years, about Senate Bill 191, the Educator Effectiveness law: AV#62, #68, #74B, #84, #113 (see page 4).  I, for one, applaud Denver Public Schools for deciding that we do a better job of supporting teachers through peer review than by expecting principals to “evaluate” 40-50 members of the faculty. 

If my criticism and warnings now seem justified, for at least one school district, I take no pride in this.  All I ever said was based on my experience, especially of the benefits of colleagues, not administrators, observing my classes and talking with me about what was and was not working well.  If DPS wants to call this change “innovative,” fine.  I simply note it is similar to what I witnessed in two private schools in the 1970’s and 1980’s.  The point is: this development deserves our attention.

The state of teaching in Colorado – shortage, turnover, frustration

First, stepping back a bit–the big picture. Stories and reports this year continue to raise alarms about the state of the teaching profession in Colorado—and across the country.  In our state we hear of the teacher shortage as a “crisis” in rural communities; fewer grads earning a teaching degree from Colorado universities; fewer applications for Teach for America; and the on-going challenge of how best to recruit and keep teachers of color.  Here in Douglas County, we read of Ponderosa High students protesting the high turnover of their teachers.  One reason teachers leave, students gather, is the district’s teacher evaluation policy.  Courtney Smith, president of the Douglas County Federation, points to this as a cause of the higher turnover across the district as well, telling The Denver Post that “… teacher morale has never been lower. She counts the teacher evaluation system — which she said was mostly about ‘uploading evidence’ rather than true assessment of teaching skills — among the chief problems” (http://www.denverpost.com/news/ci_29615913/douglas-county-students-walk-out-protest-teacher-turnover).

Implied in all this is the larger question as to whether state (and federal) teacher evaluation policies designed to improve learning have had the unexpected (or was it?) result of burdening teachers—and principals—in ways that actually do more damage than good.  In particular, due to their impact on (both a principal’s and a teacher’s) time—and on that key intangible: trust. (Did SB 191 aim to judge, or to support?)

Two of the most influential voices in the country on teacher evaluation suggest they also see ways in which the effort might have been—and still could be—implemented in ways more helpful to teachers.

    When we focus on ratings, how much do teachers—and students—benefit?
We learn that in Jefferson County in 2014-15, 98% of teachers were rated effective or highly effective. Which says what—exactly? Is that what we wanted SB 191 to do?
Vicki Phillips recently stepped down after eight years heading education grant-making at the Bill and Melinda Gates Foundation.  Under her leadership, the Gates Foundation played a significant role in seeing 33 states adopt teacher evaluation systems based on the Foundation’s “Measures of Effective Teaching” (MET) study (https://www.edsurge.com/news/2015-10-27-vicki-phillips-to-leave-gates-foundation).

Asked about the foundation’s biggest successes and missteps during her tenure, she answered:
"One of the things I am most proud of in this job is the way we have worked to put teachers in the center of everything….”  That said, Phillips said the foundation does have a bit of a mea culpa when it comes to teacher evaluation. “In the best of all worlds, everyone would have loved it if [the MET study] had come out in time to inform all the changes and policies around teacher evaluation, so people didn't jump too quickly and overemphasize one component over another....  And as that happened and other things happened, people would think the Gates Foundation is only about evaluation of teachers, when we were, all along, about meaningful improvement and actionable feedback.” (Bold mine)  (http://blogs.edweek.org/edweek/teacherbeat/2015/10/gates_vicki_phillips_announces.html)
                                                                       
U.S. Secretary of Education John King – and formerly Commissioner of Education in New York, where he saw the battle lines drawn on teacher evaluation–went a step further in his remarks this past January:
Rethink Teacher-Evaluation Systems if They're Not Working, John King Says  -  by Alyson Klein

The Every Student Succeeds Act presents states, districts, and educators with a chance for a "fresh start" and "much needed do-over" on the very testy issue of teacher evaluation through student outcomes, acting U.S. Secretary of Education John King said at a town hall meeting for teachers ….

"I'll start by being frank, if maybe also obvious, and say this conversation hasn't always gone well.  A discussion that began with shared interests and shared values—the importance of learning and growth for all our children—ended up with a lot of teachers feeling attacked and blamed. Teachers were not always adequately engaged by policymakers in the development of new systems. And when they disagreed with evaluation systems, it appeared to pit them against those who they cherished most—their students. That was no one's desire." He said states should be prepared to rethink their evaluation systems if they're not really helping teachers get better. (Bold mine)  (http://blogs.edweek.org/edweek/campaign-k-12/2016/01/john_king_if_teacher_evaluatio.html?cmp=eml-enl-eu-news3)

What educators say of the old paradigm

“Administrators who already wear several hats find themselves trying to carve out more time to observe teachers in the classroom and score them against the CDE's rubric for teaching practices.  ‘Now we've got to be in the classroom a lot more and actually help teachers, coach the teachers in how they can provide the quality instruction we want,’ said Centennial Superintendent Brian Crowther. ‘So right now, that's the overwhelming piece.’" (http://www.denverpost.com/news/ci_24847810/districts-roll-out-new-colorado-teacher-evaluations-during)

Nicole Veltze, the principal of North High, said that the new role (of teacher leaders) was helping. “As a principal, having to manage 70 teachers is unrealistic if I’m really trying to improve their practice. It’s done a lot to create ownership for professional learning and built relationships among teachers.”

The rethinking done by DPS—focusing on feedback and support for teachers, by the men and women who teach in their buildings, often in their subject—might be exactly what King hopes to see.
Team Leads in DPS  
“Support support support” for 6-7 colleagues

Superintendent Tom Boasberg has been a persuasive advocate for this change—in part due to his understanding of leadership. In January 2014 he told The Denver Post: “So long as schools are structured where one principal is responsible for coaching, supporting and evaluating 30 or 40 people, any system in the world is not going to work” (http://www.denverpost.com/news/ci_24847810/districts-roll-out-new-colorado-teacher-evaluations-during).  In September 2015 he told an A Plus Denver audience that, in a “knowledge intensive” workplace, “this model is broken…. In other sectors we see managers develop six to seven people.” 

Which is exactly what Denver Public Schools asks its Team Leads to do, thereby getting at my fundamental problem with SB 191 as I understood the bill: expecting principals to spend a much greater percentage of their time observing and evaluating their teachers, when – in my experience – school leaders often have little expertise in good classroom instruction, while the school itself has a host of teachers better suited to helping their colleagues grow. 

Denver’s Teacher Leadership program began in 2013-14. Now in its third year, there are nearly 250 Team Leads in more than 70 DPS schools.  They stay classroom teachers half of the time, “and the rest of the day (are) coaching, engaging in planning sessions and providing feedback for a small team of educators”—usually six to seven teachers (http://www.denverpost.com/news/ci_25129131/denver-public-schools-expands-teacher-leadership-program).  Laney Shaler, Associate Director for New Educator Development at DPS, anticipates 2016-17 will see another huge growth: about 400 Team Leads in 110 schools.  

Boasberg’s goal is to have Team Leads in every district school by the fall of 2018.

“Both teachers and principals say teacher-leaders, who teach some classes while taking on additional responsibilities, offer support to and play a bridging role between administrators and teachers. ‘It’s not always easy to go to the principal or assistant principals, so I like that I’ve been able to take on that role. I can really stand up for what teachers need so students can achieve and be successful,’ said Mandy Israel, a high school history teacher who is in her second year as a team lead—one of the new hybrid roles for teachers—at Kunsmiller Creative Arts Academy." http://co.chalkbeat.org/2015/02/09/dps-to-expand-teacher-leadership-program/#.VuHdAZwrLIU
                                                                                                                                    
In this new structure in DPS, principals are still ultimately responsible for the evaluation of the teaching staff in a way that fulfills the goals of the Educator Effectiveness legislation. They still observe classrooms.  But, as Shaler puts it, the Team Leads are the ones giving teachers “on a weekly basis … high quality feedback and support.”

In the process, this change also advances two goals:
1) It develops leadership skills in these teacher-leaders, perhaps encouraging them to become school leaders themselves.
2) It gives exceptional teachers who want to keep teaching—but who are also eager to share what they have learned in their teaching career and support colleagues, especially those in their first few years in the classroom—another avenue to grow, without taking them out of the classroom altogether.

It addresses another key goal: higher retention. Shaler is deeply troubled to see over 25% of Denver teachers leave after 1-2 years.  Today, less than half of DPS teachers, she says, have been in the district over five years. She hopes that more effective support in these first few years can significantly improve teacher retention in DPS.  Absolutely critical, I am sure you will agree.  Schools want to hire and invest in terrific young teachers who will find the job do-able and fulfilling—and stay a while.  I love what Jim Shelton, former Deputy Secretary at the U.S. Department of Education, told a Hot Lunch crowd this past January regarding the time we spend on teacher evaluation: “Support support support needs to be the focus.”

Posting/Defining an Impossible Job Description (and we wonder why they can’t succeed?)

This is more than an academic matter; what we ask our principals to do, and not do, reveals a lot about our understanding of how good schools work. I close with a pointed comment on Aurora’s failed efforts to improve Aurora Central High.  Spring 2013: APS hires a new principal for ACHS.  Spring 2015: APS hires a new principal.  Spring 2016: any guess?  Yes, APS plans to hire again, and posts a job description* where we see, among the principal’s “duties and responsibilities,” at a school with close to 90 teachers:   
-“Hire, supervise, and evaluate all staff.” (Estimated to involve 14% of his/her time.)
-“SUPERVISION/TECHNICAL RESPONSIBILITY: Directly supervises all school personnel; may delegate some supervisory responsibilities to Administrative Team. … Responsibilities include interviewing, hiring and training employees; … planning, assigning and directing work; appraising performance….” (Bold mine)                          

  

Concerns expressed in past newsletters—Principal as chief evaluator? Why not more peer review?

From AV#62 – So “teacher evaluation” is broken – but is it worth fixing?    -   Dec 12, 2009
Besides, how many principals have taught our grade, our subject, and really know the dynamics of this particular eighth grade group we are struggling with as well as our colleagues do, those men and women teaching many or all of the same kids?  It is natural, then, that we turn to our fellow teachers for advice and affirmation, not to the too-busy principal who lives in another part of the campus or building, and in reality, who inhabits a different world.
And I would never fault the principal for being in that different place! It’s the world of major disciplinary issues and unhappy parents, of budgets, hiring, fire drills—and countless personnel issues that don’t even begin to touch on good instruction.  Along with guiding the school towards its larger goals, fulfilling its mission… no, I do not expect my principal to have a good handle on what is and is not working well in my classes.  But several of my colleagues do.
… In six years (teaching in two private schools) no school head ever visited my classes. It was tremendously helpful, though, to have the academic dean come in and observe—Jack was still teaching, he had twenty years of teaching experience on me, and we had co-taught an AP English class together.  It was equally valuable to have Donna, the chair of the English Department, visit and take notes.  It felt less like a judgment by an outsider and more like a much appreciated review by a friend.  I looked forward to the conversations that followed. I taught WITH these people every day, on one level we were peers, and I knew they understood the challenge of engaging the group of students they saw in the room that morning.  Yes, let’s explore the possibilities of peer review.
From AV#68 – A skeptic on SB 191 takes a closer look    -      Sept. 26, 2010
Principals as chief evaluator? I hope not. Allow flexibility on who does the evaluation.
Legislation that expects the current generation of administrators—who often found their way to these positions in spite of their lack of “instructional leadership”—to suddenly be trained well enough to offer sound evaluations is unrealistic.  I suppose in a perfect world, where principals and school leaders have a rare insight into good classroom management and teaching techniques across a wide range of grades – K-5 in many schools, K-8 in some, 9-12 in most high schools (and just consider the diversity of classroom subjects a principal might be asked to “evaluate”—physics and Shakespeare, calculus and studio art, economics and band, technology and dance)–well, if such folks exist, God Bless them and more power to them.  But for mere mortals it’s probably not going to happen.
From AV#113 – Uncomfortable Questions     -        May 7, 2014
Many acknowledge that a large percentage of principals were not hired to be, first and foremost, instructional leaders, and that—prior to the passage of SB 191—many were not well trained in how best to evaluate teachers.  Do teachers believe their principals are now well prepared to handle the evaluations? What concerns do they express about the capacity of their school leaders to handle this more substantial (and potentially high-stakes) role regarding these evaluations?
In some professions employees are evaluated by a senior colleague who has similar responsibilities.  Do teachers compare how they are evaluated by people in positions who do not do their jobs—often principals who do not teach—with how people in other professions are evaluated, and feel the evaluation system in education is placed in the wrong hands?
If teachers could determine who would be the men and women whose evaluations and recommendations for improvement would be most meaningful to them, who would it be, and why?  (Colleagues, department chairs, peers from other schools teaching the same age/subject?)  Are those people conducting the evaluations today?  Does SB 191 allow the flexibility so that those who can be most helpful to a teacher in terms of improving instruction are conducting the evaluations?


Wednesday, March 2, 2016

AV#144 - Fulfilling the Colorado READ Act: a steep climb ahead

March 2, 2016


“By now Lewis and Clark were growing ever more anxious to catch sight of the Rockies, the mountain barrier they knew they would have to cross. In the last week of May, Lewis saw the mountains for the first time. He was filled with joy, immediately tempered by a realization of the challenge that lay ahead."   http://www.nationalgeographic.com/lewisandclark/journey_leg_7.html

Colorado Read Act – Fact Sheet
Supporting K-3 Literacy Development
Achieving reading competency by the end of third grade is a critical milestone for every student and predicts ongoing educational success. If a student enters fourth grade without achieving reading competency, he or she is significantly more likely to fall behind in all subject areas beginning in fourth grade and later grades. Early literacy development is … one of Colorado’s top education priorities.   (http://www.cde.state.co.us/)
We cheered the passage of the READ Act in 2012 and continue to hope it will achieve a key purpose: to see more students achieve “reading competency by the end of third grade.”  But after three years of data, I trust no one believes we are anywhere close to seeing all Colorado students enter 4th grade reading at grade level.  Last spring’s PARCC results for 3rd graders make it only more clear that—as READ Act advocates know full well—the new law only addressed a fraction of the tens of thousands of K-3 students who struggle with reading.  

No news there: we knew the problem is bigger than the READ Act could tackle—the law was designed, to use the polysyllabic mouthful, to serve K-3 boys and girls determined to have a Significant Reading Deficiency (SRD).  But are we willing to admit what a steep climb we have ahead of us, if we are to accomplish the READ Act’s ultimate goal?

I know this: we do not help matters by overstating our “success”—if success it is—in our implementation of the READ Act. 

Last July I thought it was important to say – addressing the other end of the K-12 system – that our “higher graduation rates” are probably inflated because we have no consistent measure across our 178 districts of what it means to qualify as a high school graduate.[1]  I make the same point here regarding the number of K-3 struggling readers in Colorado’s schools.  READ Act figures are not the full story. 
Our first PARCC results for third graders suggest that perhaps 40% of Colorado students entered 4th grade this past August well below the proficient level in English Language Arts – and over 60% were not yet proficient (see page 5).  Quite a different message than one might take from the first external report on the Colorado READ Act, “An Evaluation of Implementation and Outcomes after Year One,”[2] released last summer.  It was hard to get past the Executive Summary and not think—WOW! GOOD NEWS!
     “The results of this research are immensely positive, and we are thrilled to see that the policy’s implementation has had a significant impact on the lives of thousands of Colorado students.” Scott Laband, President Colorado Succeeds
    “The READ Act is making an incredibly positive impact in the lives of thousands of Colorado kids after just one year…. The majority of Colorado schools reduced the number of students with an SRD (significant reading deficiency). Many schools have seen dramatic reductions in the overall number of students with an SRD…. Statewide, the number of students with an SRD was reduced from 16% in 2013 to 14% in 2014, resulting in nearly 5,000 fewer students with a significant reading deficiency.”   Executive Summary, page 6                                    (Bold mine)

The headlines that followed the report’s release cheered as well:
·         Early literacy effort having impact, advocacy group reports  (Chalkbeat Colorado)
·         Colorado’s READ Act has a Positive Impact, Report Shows (Colorado Children’s Campaign)

And yet a close look at the study (henceforth referred to as the CRA report) makes one wonder.  Such a big change—5,000 fewer students—in only one year?  Only 14% of our K-3 students having great difficulty?   

I examine the CRA report here, in part because we have new information since it was written: Year 3 data (2014-15) on the number of students eligible as SRD, as well as last year’s 3rd grade PARCC scores.  Also because it is imperative that we be honest about the challenge—and about evaluating progress. 
**

First, note that the CRA report only provided data on 2012-13 and 2013-14.  Not a word on 2014-15.  Both Chalkbeat Colorado[3] and the Colorado Children’s Campaign[4] mistakenly reported that this report gave 2014-15 data. It did not, even though it was published after the 2014-15 school year had ended.  What we have since learned about 2014-15 is that many districts increased the number of K-3 students identified as SRD last year compared to 2013-14. Which raises the question: does success mean fewer students judged to be SRD? What if schools and districts that identify and serve a greater number of our struggling K-3 readers are actually doing more to fulfill the purpose of the READ Act?

Can we have multiple definitions of SRD - and still claim we are reducing the number of SRD students?

READ Act*

Year       Eligible Students    % of K-3 students
2012-13    42,479               16.5%
2013-14    37,506               14.4%
2014-15    36,420               13.8%

*Figures made available to me by CDE
My first point: how can “The Colorado READ Act – An Evaluation of Implementation and Outcomes after Year One” claim such success in reducing the number of students classified as Significantly Reading Deficient from one year to the next, when no common definition or assessment was used from 2013 to 2014, or from district to district?
                                              
I start here: there was no common test used statewide to determine which students were classified as SRD.  There were three possible tests in year one (2012-13) and seven in year two (2013-14), as the report itself makes clear.[5]  The report also acknowledges “limitations” in what to make of the data.[6] So that statement should read:                                                                                     
…the number of students IDENTIFIED BY SCHOOLS USING DIFFERENT TESTING OPTIONS was reduced from 16% to 14% …..
The Executive Summary also asserts that “after only one year of implementation, we uncovered some encouraging results”:
The majority of Colorado schools reduced the number of students with an SRD.
But again, to be accurate, that too-generous statement should read:
The majority of Colorado schools reduced the number of students IDENTIFIED BY SCHOOLS USING DIFFERENT TESTING OPTIONS with an SRD.

The big picture - 2012-13 to 2013-14

The CRA report is packed with interesting data from individual districts and schools, with breakdowns showing the LOWER number and percentage of students identified as SRD in several categories in 2014 versus 2013. (Including: “Schools that significantly reduced the percentage of ELL students with an SRD,” of African-American/Black students, of Hispanic/Latino students, and much more.)  But I find it strange that a 30-page report did not present the larger story. In the fall of 2014, based on data the Colorado Department of Education had already released[7], I put the following together.

Read Act Funding Formula  -  2012-13 to 2013-14*

                          2012-13    2013-14    change
STATE                      41,942     36,993    -4,949
DPS                         6,940      5,172    -1,768
Jefferson County            3,267      2,386      -881
St. Vrain                   1,635      1,186      -449
Cherry Creek                1,914      1,553      -361
Adams 14                      685        420      -265
Adams 50 - Westminster      1,139        890      -249
Aurora Public Schools       3,867      3,637      -230
Adams 12                    2,530      2,355      -175

           *Numbers here do not always match numbers from CDE’s Office of Literacy, which is the source of most figures quoted in this newsletter.

“Specifically, the purposes of this evaluation study were to (1) determine if the READ Act successfully reduced the number of students with a significant reading deficiency (SRD) after its first full year….”   
      From Executive Summary of the CRA evaluation
The CRA report highlights such reductions and states: “these districts are the top performers.” Really?

I sent those numbers to and met with CDE staff in November 2014 to ask several questions: Why the big drop in Denver, Jeffco, and several other districts? Was CDE confident DPS and Jeffco and other districts across the state were using the same measurements one year to the next?  Is it possible DPS, Jeffco, St. Vrain, etc. could use different criteria to identify students with a Significant Reading Deficiency? Could DPS honestly claim greater success in meeting the needs of its lowest readers than districts where the number identified stayed about the same, or grew, as in those below? Again, how do we define success?

Read Act Funding Formula

                          2012-13    2013-14    change
Douglas County              1,651      1,649        -2
Poudre R-1                    781        910      +129
Harrison                      531        692      +161
Colorado Springs            1,301      1,466      +165
Mesa County Valley 51         852      1,130      +278

With no one consistent measure across the state[8] of what it meant to be identified as a student who was SRD, what did we really know about whether we were truly making progress?

TCAP & PARCC results invite the question: What percentage of our 3rd graders struggle to read?
Along with these SRD numbers, we now have at least two other ways of looking at the skills of 3rd graders in Colorado (2014-TCAP; 2015-PARCC), to gauge the number of 3rd graders who are–I will try a deliberately vague term for the moment–struggling readers.  So let’s zero in on that grade.  Of course I realize that neither the 3rd grade TCAP nor PARCC tests are a perfect match with assessments approved by the READ Act.  CDE’s 2014 “Annual Report on the Implementation of the Colorado READ Act” (co-written by Patti Montgomery, at that time the Executive Director of the Office of Literacy at CDE) spoke directly to the difference between TCAP scores and the numbers identified as SRD.[9]

Nevertheless, TCAP and PARCC scores tell us something useful about how many 3rd graders are not proficient readers—and, given the various categories below proficient, how many are really struggling.  Do these scores raise legitimate questions about the SRD numbers? I believe they do. (If I am wrong, feel free to ignore what follows.)

SRD 2014: The CRA report tells us that in 2013-14, out of 64,382 3rd graders assessed, 11,220 (17%) were identified as SRD.  First grade (18%) and second grade (16%) had a similar percentage.[10] 

2014 – 3rd grade - TCAP – 28% not proficient  -  vs.  -  17% SRD

TCAP data from 2014 included a breakdown of the students in the state and in each district who scored in the lowest two categories as Not Proficient in reading: Unsatisfactory and Partially Proficient.  That number and percentage increased from 2013 to 2014. 
TCAP  -  3rd grade reading

                        2012-13         2013-14         change
Unsatisfactory           6,047 (10%)     6,447 (10%)      +400
Partially Proficient    10,585 (17%)    11,326 (18%)      +741
TOTAL                   16,632 (27%)    17,773 (28%)    +1,141 (+1%)

Again, I know: TCAP and SRD are entirely different assessments. I realize it is unlikely that all Partially Proficient 3rd grade readers would be identified as significantly deficient readers.  I merely point out that the number not proficient on TCAP increased from 2013 to 2014 (16,632 to 17,773), while the number identified as SRD decreased over those same years (12,241 to 11,220) (CRA report, p. 10).
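The divergence described above is simple arithmetic. As a sanity check, here is a minimal sketch using only the figures quoted in this newsletter (the variable names are mine, not CDE's):

```python
# 3rd grade figures as quoted above: TCAP "not proficient" counts
# rose from 2013 to 2014, while SRD-identified counts fell.
tcap_not_proficient = {"2012-13": 16_632, "2013-14": 17_773}
srd_identified = {"2012-13": 12_241, "2013-14": 11_220}  # CRA report, p. 10

tcap_change = tcap_not_proficient["2013-14"] - tcap_not_proficient["2012-13"]
srd_change = srd_identified["2013-14"] - srd_identified["2012-13"]

print(f"TCAP not-proficient change: {tcap_change:+,}")  # +1,141
print(f"SRD-identified change:      {srd_change:+,}")   # -1,021
```

The two measures move in opposite directions, which is the heart of the question raised here.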

                    
  3rd grade - 2014 TCAP Reading: 71% proficient   vs.   2015 PARCC English: 38.2% meeting expectations

The READ report’s Executive Summary used the 2014 TCAP results when it framed the problem this way:
Colorado’s most recent third grade reading results show that literacy continues to be an area of dire need of improvement. Third grade reading results went down statewide with just over 71% of students scoring proficient or above.                                                                            (bold mine)

But now that we have 2015 PARCC scores, we can update that statement—revealing an even greater crisis for early literacy in our state.  That “over 71%” proficient in reading, above, becomes “just over 38%” meeting expectations in English.  Again, I understand that PARCC provides us with only one score—combining results for reading and writing—so yes, an inexact comparison.  But telling?  As so few third graders opted out of the PARCC assessments last spring[11], we cannot use that as an excuse to dismiss these scores (as some do for PARCC results for 11th graders). 

Most frightening—if we believe these numbers—is to see that in several districts, less than 20% met expectations: Aurora-18.4%, Adams 50-16.8%, and Adams 14-14.3%.
                                                                                                                                                                              
Now see how the number of students scoring in the lowest two categories in English on PARCC matches up with the number identified as SRD in 2014-15. (NOTE: On PARCC, 23.2% scored in a third category for students not at grade level: Approaching Expectations. So in all, 61.8% of our 3rd graders (23.2 plus 38.6, below) were not meeting expectations; 38.2% were meeting expectations.)

SRD 2015: CDE gave me the 2014-15 results, which are fairly consistent with the previous year: out of 66,117 3rd graders tested, 10,639 (16.1%) were identified as SRD.  In first grade (16.9%) and second grade (15.4%), not much different.

2015 - 3rd grade - PARCC English: 38.6% well below meeting expectations  -   vs.  -  16.1% SRD  

PARCC results – grade 3: % in lowest two categories – 7 districts below state average – vs. SRD

                          PARCC – English Language Arts – 2014-15                 SRD – 2014-15
                          Did not yet meet    Partially met    TOTAL % in bottom
                          expectations        expectations     2 categories[12]
STATE of Colorado         19.6                19.0             38.6               16.1%
Adams 12                  23.9                20.0             43.9               21.1%
Pueblo City               20.3                26.3             46.6               14.5%
Greeley                   25.9                21.4             47.3               24.8%
DPS                       27.8                20.2             48.0               23.8%
Westminster (Adams 50)    33.7                27.5             61.2               35.5%
Aurora                    40.7                23.1             63.8               35.2%
Adams 14                  39.9                26.1             66.0               38.9%

Perhaps the first column, Did not yet meet expectations, captures most 3rd graders found by the various READ Act assessments used last year to be significantly reading deficient (last column). Note the similar percentages. But if PARCC gives us a valid assessment of English skills—used across the state—it reveals the enormous number of students we might call struggling readers.  Just look at the percentage of students who only Partially met expectations.   Add that percentage in and we find that in a district like DPS, nearly 50% of the 3rd grade students score in the bottom two categories on English Language Arts.  In Aurora and Adams 14, nearly two-thirds score that low.  To restate: the READ Act only identifies a fraction of our youngest students struggling to read. My best guess on that number: 40% of our 265,000 K-3 students—over 100,000 students.
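The "best guess" above is back-of-envelope arithmetic, sketched here so readers can adjust the inputs themselves (both the 265,000 K-3 enrollment figure and the 40% rate are this newsletter's estimates, not CDE data):

```python
k3_enrollment = 265_000   # statewide K-3 enrollment, as estimated in the text
struggling_rate = 0.40    # rough share of struggling readers suggested by PARCC

struggling_estimate = int(k3_enrollment * struggling_rate)
print(f"Estimated struggling K-3 readers: {struggling_estimate:,}")  # 106,000
```

That estimate lands comfortably over 100,000—roughly triple the 36,420 students the READ Act identified as SRD in 2014-15.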

2014-15 SRD numbers: in several districts an increase (and what if this is GOOD news?)

Since the CRA report about 2012-13 and 2013-14 was produced, we now have Year 3 (2014-15) data—specifically, the number of students identified as SRD when the 2015-16 school year began.  Two points: 1) The numbers bounce around enough from year 1 to year 2 to year 3 to again raise questions about the consistency and reliability of the assessments being used.  2) The new data, too, challenge the assumption in the report that success equals a reduction in the number of SRD students.  

That assumption would cheer significant reductions in K-3 students identified as SRD in districts such as those below.  But when numbers bounce up and down this drastically, isn’t there good reason to doubt the consistency and validity of determining which students are SRD? 


                        2012-13   2013-14   2014-15   Changes over time
STATE                   42,479    37,506    36,420    6,059 fewer SRD students over 2 yrs
Mesa County Valley 51   852       1,130     599       cut # by 531 - nearly 50% - in one year!
St. Vrain               1,635     1,186     743       cut # by 892 - over 50% - from 2012-13
Thompson R2-J           674       581       359       cut # by 315 - over 47% - from 2012-13
DPS                     6,940     5,172     5,027     cut # by 1,913 - over 27% - from 2012-13

That assumption would also suggest that the many districts—like the 10 below, which identified more significantly deficient readers in 2015 than in 2014—are doing a poor job of implementing the READ Act.[13]

READ Act Funding Formula - 2013-14 to 2014-15*

                        2013-14          2014-15          Change from previous year
STATE                   36,993           35,974           -1,019
Jefferson County        2,386 – 10%      2,799 – 11.4%    +413
Adams County 14         420 – 17.9%      591 – 25.1%      +171
Pueblo 60               809 – 13.7%      952 – 15.9%      +143
Greeley                 1,247 – 18.3%    1,383 – 20.1%    +136
Poudre                  910 – 10.7%      972 – 11%        +62
Colorado Springs        1,466 – 16.4%    1,524 – 17.3%    +58
Adams 12                2,355 – 18.4%    2,405 – 20%      +50
Fountain                427 – 14.4%      470 – 16.7%      +43
Montrose County RE-1J   251 – 15.4%      290 – 17.6%      +39
Littleton 6             304 – 7.2%       340 – 7.9%       +36
           *Numbers here do not always match numbers from CDE’s Office of Literacy, which is the source of most figures quoted in this newsletter.

Another view – a contrary view – might be that districts are more successful when they identify a higher percentage of their struggling readers.  For it means a greater number of boys and girls who are not reading at grade level will benefit from the funds ($32 million this year) and the services made available through the READ Act.


The CRA report highlighted “success” in one district, Westminster (Adams 50), and several schools

Finally, the report offered a closer study of one school district, Westminster (pages 22-23), and four elementary schools (pages 116-21)—“recognized for their success in reducing the number of students with an SRD.”  A 2015 update—now with PARCC assessments in front of us—suggests the CRA report was cheering too soon.

According to the CRA report, Westminster was chosen “because it had reduced the percentage of students identified as having an SRD by eight percentage points from 2012-13 to 2013-14.”  The report included an interview with Mat Aubuchon, Director of Early Childhood Education, who was asked about his “biggest celebration.”
Aubuchon: The drop in the number of students identified as having an SRD; increased use of data in the elementary schools; principals starting to hold K-2 teachers accountable.

Westminster – 28.5% K-3 SRD (2014)

                                  2012-13   2013-14
# of K-3 students                 3,153     3,146
# of K-3 students with an SRD     1,139     890
% of students with an SRD         36%       28.5%
Change from 2012-13 to 2013-14: 249 fewer students with an SRD – 8 percentage points lower

We now have 2015 SRD figures suggesting Westminster is suddenly not a “top performer”; in fact, like the 10 districts listed on the previous page, its SRD numbers went up last year.

2012-13        2013-14        2014-15
1,139 – 36%    890 – 28.5%    910 – 29.1%
Number of students and % from https://www.cde.state.co.us/cdefinance/readactperpupilfunding-0, CRA report, and emails to me from CDE.

Hey, some will say, the overall numbers are down from 2012-13.  Can’t we at least celebrate that? 

But I cannot.  For as I have demonstrated, the SRD figures reveal only a portion of the K-3 students in Colorado who struggle to read at grade level.  Westminster is a good example.

In 2014, we can compare the SRD percentage with TCAP figures for third grade in Westminster.
SRD – 3rd grade – 33.6%[14] - about 232 students
TCAP – 3rd grade – 43.7% not proficient (Of 690 3rd graders taking TCAP, 16.23% scored Unsatisfactory; 27.39% scored Partially Proficient.  A total of 301 3rd graders were not proficient in 2014.) (http://www.cde.state.co.us/assessment/coassess-dataandresults)
In 2015, we can compare the SRD percentage with PARCC results for third graders in Westminster.
SRD – 3rd grade – 35.5%[15] - about 255 students
PARCC – 3rd grade – 61.2% of the 3rd graders – 440 students – fell well short of Meeting Expectations in English. 

2014-15 – PARCC – 3rd grade – Westminster

# of 3rd grade students          719
Did Not Yet Meet Expectations    242 (33.7%)
Partially Met Expectations       198 (27.5%)
Two lowest categories combined   440 (61.2%) of 3rd graders Did Not Meet or Partially Met Expectations
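The Westminster figures above check out arithmetically; a short Python sketch, again using only the counts quoted in this newsletter:

```python
# Westminster 2014-15 PARCC 3rd grade counts (from the table above)
total_students = 719
did_not_yet_meet = 242
partially_met = 198

bottom_two = did_not_yet_meet + partially_met            # 440 students
bottom_two_pct = round(100 * bottom_two / total_students, 1)
print(bottom_two, bottom_two_pct)  # 440 students, 61.2% of 3rd graders
```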

2014-15 update – Cole Arts and Sciences Academy - PARCC

The CRA report also provided a picture of the work in several elementary schools, including Cole Arts and Sciences Academy in Denver.  We read of a remarkable decrease in K-3 students identified at Cole as SRD between 2013 and 2014: from 137 students (45%) down to 65 (23%) – a drop of 22 percentage points.  An inspiring story. To hear about Jessica Jackson and her team is to make one eager to say: KEEP UP THE GOOD WORK!

And yet … one more inconvenient truth.  OK, not a truth—just test scores.  But telling?  Troubling? 
Here are the grim PARCC numbers at Cole Arts and Sciences Academy:

93.1% of 3rd graders scored below Met Expectations on PARCC – English (2015): Did Not Yet Meet, Partially Met, or Approached Expectations
2014-15 – PARCC – 3rd grade – Cole

# of 3rd grade students          58
Did Not Yet Meet Expectations    21 (36.2%)
Partially Met Expectations       16 (27.6%)
Approached Expectations          17 (29.3%)
Met Expectations                 4 (6.7%)
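To be precise about where the 93.1% comes from: it is the sum of the three categories below Met Expectations, not just the bottom two. A quick Python check on the Cole counts quoted above:

```python
# Cole Arts and Sciences Academy, 2014-15 PARCC 3rd grade (from the table above)
total_students = 58
below_met = {
    "Did Not Yet Meet Expectations": 21,
    "Partially Met Expectations": 16,
    "Approached Expectations": 17,
}

below_met_count = sum(below_met.values())                # 54 students
below_met_pct = round(100 * below_met_count / total_students, 1)
print(below_met_count, below_met_pct)  # 54 students, 93.1%
```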

These numbers are not put in front of you—or especially in front of the Cole teachers, and so many like them who work so hard to help the boys and girls in their care learn to read well in their first few years in public school—to discourage anyone.
They are here simply to insist that we not misstate the challenge, that we not exaggerate our "success" to date—or the tough trek before us. 
Climb every mountain
“… we proceeded on to the top of the dividing ridge from which I discovered immense ranges of high mountains still to the West of us with their tops partially covered with snow."  
                                Meriwether Lewis, August 1805, http://www.nps.gov/nr/travel/lewisandclark/lem.htm

Lewis and Clark and team headed west over 200 years ago, with an ambitious goal—to reach the Pacific.  When they saw the Rockies, it must have dawned on them too: a steep climb ahead.

We cheer the goal: to see our students reading at grade level as they start 4th grade.  But no illusions, please. It won’t be easy.

Another View is a newsletter by Peter Huidekoper.  Comments are welcome. 303-757-1225 - peterhdkpr@gmail.com



[3] “The program rolled out in the 2013-14 school year, so results from 2014-15 provided data for comparison.”  Bold mine
[4]  “The report’s authors compared assessment data from the 2013-14 school year (the first of implementation) with 2014-15 data….” http://www.coloradokids.org/colorados-read-act-has-a-positive-impact-report-shows/       Bold mine
[5] “In the first year of implementation, schools were required to use one of three tests previously approved for the Colorado Basic Literacy Act. The options available included the Developmental Reading Assessment 2nd Edition (DRA), DIBELS (either DIBELS Next or DIBELS 6th Edition), or the Phonological Awareness Literacy Screening .…  In 2013, the State Board of Education approved seven different testing options, from which schools can choose to administer” (p. 7).
[6] “The assessment results provided by CDE were not always easily interpreted and we recognize the statistical and psychometric limitations associated with analyzing data from a single year of implementation. Some districts showed a significant and unexplained discrepancy in the number of students in a particular category from one year to the next…” (p. 10).
[8] I am not calling for one test. I am simply stating that different tests make it necessary to be cautious, at the very least, about comparing SRD numbers from one year, or one district, to the next.  Especially, of course, when 2012-13 preceded the distribution of any READ Act funds!  I heard concerns about the validity of SRD numbers from several K-1 teachers when I visited an elementary school last month.  They expressed no fundamental problem with the READ Act—just concern about the amount of time spent documenting and updating plans: still trying to develop “an effective process to put that into effect” in our school; “not clear” in many cases on identifying the SRD students; “seems like a lot of gray area”; “there is so much room for error,” even with Scantron—which provides more consistency; “I wouldn’t let different people progress monitor,” as it was leading to such a range of scores; “I think DIBELS are really good tests, but we just need to administer them consistently.”

[9]“Students in third grade in the 2012—2013 school year were assessed with both the Transitional Colorado Assessment Program (TCAP) test and one of the State Board approved READ Act interim assessments in the spring of 2013. The READ Act interim assessments identified a higher proportion of struggling readers than the TCAP (19% and 10%, respectively). While this may seem incongruous, it is important to remember that the interim assessments measure only the critical early literacy indicators that are most predictive of future reading success and therefore are not comprehensive in nature….” (http://www.cde.state.co.us/coloradoliteracy/readactannuallegislativebrief2014final)
from READ Act Report - 2013-14

                        # of K-3 students    # of K-3        % of students
                        READ Act tested      students SRD    SRD
All K-3                 261,343              37,506          14%
½ Day Kindergarten      17,822               1,210           7%
Full Day Kindergarten   47,411               2,921           6%
1st Grade               66,309               11,619          18%
2nd Grade               65,419               10,536          16%
3rd Grade               64,382               11,220          17%

[12] I highlight the middle column above to emphasize the huge percentage of students scoring in the lowest two categories. These columns do not even include that third category of students also falling short: Approaching Expectations, statewide, another 23.2% of third graders.

[13] Consider this example, and tell me if this school is succeeding, or failing, to implement the READ Act:
From the school’s Unified Improvement Plan for 2013-14 -Target – Read Act
·         Number of SRD students will lower to 5%. (DIBELS)
A year later, from the school’s UIP for 2014-15 – (A look back) - Performance on Target
·         The goal was not met as the percent of SRD students increased to 13% at End of Year assessment.
QUESTIONS A SCHOOL MIGHT ASK:  What if we are conscientious in assessing our new kindergarten and first grade students and identify a large percentage of them as Significantly Reading Deficient?  Isn’t this what we should be doing?  But then, even if we help a good many of our returning SRD students improve to the next level, our overall K-3 SRD numbers might rise.  If this is implementing the READ Act well, isn’t that “success”?

[14](email from CDE to me)
[15](email from CDE to me)