Negotiating Release

January 26, 2015 by
Ollie and Tobey

My husband and I have two dogs. Ollie is a springer spaniel; Tobey is a rather unfortunate cross between a Yorkshire terrier and a miniature husky. We live near water and enjoy spending time in the lake when the temperatures rise.

Ollie took to the water immediately. In no time she figured out swimming, and she could be counted on to paddle leisurely until we were ready to leave. Not the case for Tobey. He surprised us with his reluctance to put a paw in the water.

Tobey did eventually learn to swim and now enjoys a quick lap or two, but it involved a process. We had to introduce him gradually, making sure he had the skills and confidence to move from the beach area to deeper water.

I mention my dogs because they serve as an example of how we make assumptions. I assumed that all dogs instinctively knew how to swim. After all, they enjoyed going down to the beach with us. As teachers, we are tempted to make the same assumption: because our students like to use technology, surely they know how to use it effectively.

A tenet of proficiency-based teaching and learning is that students will determine how and when they will “show what they know.” This implies that students will be asked to direct their own learning. We make ourselves available as facilitators, flip our classrooms, determine pacing guides, and do less direct instruction. Time, not mastery, is now the variable. (more…)

Helping HELP: Paul Leather’s Testimony on Assessments and Accountability

January 21, 2015 by
Paul Leather

Earlier today, Paul Leather, Deputy Commissioner at the NH Department of Education, testified at the Senate HELP Committee’s full committee hearing on “Fixing No Child Left Behind: Testing and Accountability” about improving assessments and accountability systems. His testimony is provided below.

- – -

Chairman Alexander, Senator Murray, and Members of the Committee, thank you for inviting me to testify about testing and accountability in the Elementary and Secondary Education Act.

I am Paul Leather, Deputy Commissioner of Education of the NH Department of Education.

In NH, we are working to explore what the next generation of assessments might look like, beyond an end-of-the-year test.

We have coordinated with the Council of Chief State School Officers on its Priorities for ESEA Reauthorization. These Priorities contain three important ingredients that are in line with the work we are doing:

  • First, they would continue to support annual assessments of student performance to ensure that every parent receives, at least once a year, the information they need on how their child is performing.
  • Second, they would allow states to base students’ annual determinations on a single standardized test or on the combined results from a coherent system of assessments.
  • Third, they give states the space to continue to innovate on assessment and accountability systems, which is especially important when periods of authorization can last 10 years or longer. (more…)

In Search of the Goldilocks Scale

January 15, 2015 by

Too hot? Too cold? Just right!

We have learned a lot over the past five years as our district has implemented a competency-based model of grading and assessing. Competency-based grading and assessment requires a significant shift in the way we think about assessment: its purpose and its meaning. Our school, Memorial School in Newton, NH, and our district, the Sanborn Regional School District, moved to this model five years ago. We continue to learn more about what assessment of students truly means as our overall understanding of assessment practices (our assessment literacy) increases.

When we moved to this model of grading and assessment, our elementary teachers made a wholesale change to grading with a four-point rubric. There would be no 100-point number scale, and there would be consistency across grade levels, both horizontally and vertically. The grade scale rubrics we used would identify the expectations at each level. Our learning curve was steep as we created the rubrics, but we found that our learning was not going to stop there. It continues to this day.

Our first year, we identified our rubric indicators as E (Exceeding), M (Meeting), IP (Inconsistent Progress), and LP (Limited Progress). The chart below reflects this first attempt at our rubric scale. The first roadblock came after the first progress report was distributed. As an educational staff, we read IP as the descriptor outlined it: inconsistent progress. A student was able to demonstrate competency, but on an inconsistent basis. Many parents provided feedback that the word “inconsistent” just “felt negative.” We decided that “In Progress” was also an accurate indicator, and parents agreed. We made the change to the level immediately while keeping the performance descriptor the same. (more…)

Tackling Work Study Practices in a Competency-Based Educational System

December 9, 2014 by

Responsive Classroom

Last year, teams of teachers within our district, the Sanborn Regional School District in New Hampshire, became deeply involved in building Quality Performance Assessments. These assessments are designed to truly assess a student’s competency, or transfer of learning. Our teachers have worked incredibly hard at building high-quality, engaging assessments. Their overall assessment literacy, and the learning that has occurred throughout these processes, has been significant. However, it has also raised additional questions.

The most recent questions have had to do with Work Study Practices (also referred to as work study habits or dispositions/behaviors). The State of New Hampshire defines the four work study practices as Communication, Creativity, Collaboration, and Self-Direction. For the past six years, our district’s elementary schools have identified the Responsive Classroom CARES (Cooperation, Assertion, Responsibility, Empathy, and Self-regulation) as the behaviors we will assess in each student. These fit well with the work study practices the State has identified. Within each performance assessment, teachers identify one specific behavior to be assessed. For example, if a performance assessment lends itself to assessing cooperation/collaboration, teachers include it in the scoring with its own rubric indicators, separate from the assessment of academic competencies. (more…)

The Case for Performance Assessments in a Standards-Based Grading System

December 5, 2014 by
Louis F. DeLoreto

If only measuring how well students meet academic standards in the classroom were as easy as it is in the performing arts or athletics. Concerts and games are authentic performance assessments. They give students the opportunity to demonstrate their skill levels and grasp of concepts before an audience. Observers can see and hear the results and judge the level of performance using their knowledge of the criteria commonly used to determine proficiency. If only we, the audience, could see how well a student performs on authentic challenges in the classroom the way we do at an orchestra concert or a basketball game.

The principle of demonstrating performance on an academic standard is the same as in the performing arts and athletic arenas. The “audience” wants to see what the student is being asked to do and to be able to understand how they did. However, the traditional classroom performance assessment is not as readily identifiable as the complexity of a musical piece or the competitive level of an opposing team. Therefore, the degree to which the student grasps an academic standard in a classroom is difficult for counselors, administrators, and parents to see and understand in today’s traditional high school assessment systems. (more…)

First Stop of the Magical Mastery Tour: Bronx International High School

December 4, 2014 by

BxIHS

This article is part of a series of case studies of schools in New York City. For the full story, start with my overview of the Magical Mastery Tour and the three biggest takeaways. You can also read the report on Carroll Gardens School for Innovation.

Inspiring. I know no other word to describe the students and staff at Bronx International High School (BxIHS). Having arrived from all around the world, the 400+ BxIHS students come to the school with hope, drive, curiosity, creativity…and little or no English.

Designed as a high school to serve new immigrants, BxIHS “accepts students who score at or below the 20th percentile on the Language Assessment Battery (LAB-R) and have been in the United States fewer than four years.” Students enter with a wide range of academic experiences behind them, some having spent little or no time in a formal education setting.

Regardless of background, the two things all the students share are a desire to learn English and a desire to complete high school. Staff members, many of whom were English language learners at one time in their own lives, work collaboratively and joyfully in an “outcomes” approach to ensure that students reach proficiency in language/literacy, content, and skills. (more…)

The Role of Assessment Instruments in a Competency-Based System

November 5, 2014 by

No matter how you approach it, you cannot mitigate the massive change agent that is competency-based education. It does not leave much room for “old school” notions of teaching and learning. It does not tolerate anything less than a committed belief that all students can achieve at high levels.

It certainly demands a philosophical and ideological shift in thinking about “best practice” in education.

When I first embarked on this journey, I prepared myself for these shifts as they pertained to my practice. How can I become more student-centered? What does that look like? How will I know if my students are ready?

The question I never asked: How will I assess it and grade it? (more…)

Is There Enough Time for Learning?

November 4, 2014 by
Oliver Grenham

Because of the growing number of mass-administered tests required under state and/or federal law, an increasing and unsustainable demand is being placed on students’ time in school. In recent years, these increases in mandated testing have affected students in Colorado at every grade level, from kindergarten through twelfth grade.

While student assessment is vital to learning, excessive testing is not, particularly in the way it is handled today. The quantity and quality of instructional time is what matters most for productive learning to occur.

Our experience in Adams County School District 50 has shown that a mass administration of the same test to students of the same age at the same time does not promote learning. In fact, it penalizes students, their teachers, and their schools. An overemphasis on testing significantly reduces the quantity and quality of time that could be better utilized in closing the achievement gap: something our data shows we are successfully doing.

The Teaching Learning Cycle in a Competency-Based System

We all know that teaching and learning take place in the classroom. As educators, we refer to this cyclic process as the Teaching Learning Cycle.

Teaching Learning Cycle (more…)

Reflections after Two Years of Performance Assessment Cohorts in New Hampshire

October 22, 2014 by

Originally posted on September 22, 2014 for the Center for Assessment’s Reidy Interactive Lecture Series.

Let’s now return to the question posed in an earlier post: what have we learned about the possibility of sparking systemic implementation of performance assessment? These reflections come from the NH Performance Assessment for Competency Education (PACE) districts, as well as from recent check-ins with team leads who participated in the 2012 and 2013 Performance Assessment Network Cohorts. Half of these team leads reported that the work has been brought back to the rest of the school, with teachers outside the group that attended the institutes now using performance assessments; in other schools, Quality Performance Assessment (QPA) implementation has remained limited to the teachers who attended.

A strong, coherent vision helps people see the big picture

Administrators need to understand the big picture first and then set up the enabling conditions for the implementation to happen and the work to be sustainable. Participating in the 5-day training helps administrators develop their own instructional leadership and understanding of performance assessment. As one team leader noted, “[we] need administration to attend sessions, to show the seriousness and importance of this work, and get a solid team of committed individuals.” A recent post by a PACE district elementary principal illustrates how one district has integrated the training into their vision.

It takes time and effective structures to create a collaborative professional culture

A collaborative culture enables educators to use QPA protocols to engage in quality design, analysis, and instructional decision-making. PACE districts and 11 of the other administrators reported having Common Planning Time (CPT) built into their schedules. About half of those administrators said that the CPT was being used to specifically develop the QPA work. Two other schools that didn’t have CPT had time for the QPA group to meet to advance the work on their own. Structures provide the space, but the CPT must be used effectively. As one teacher at a PACE district school noted, “If we hadn’t done all work in the past becoming PLCs [professional learning communities], setting goals for our teams and norms, having expectations of our teammates then we wouldn’t be where we are. We couldn’t sit at a table and talk about what happens here.”

(more…)

The Power of Deep Discussions around Student Work

October 21, 2014 by
Laurie Gagnon

Originally posted on September 15, 2014 for the Center for Assessment’s Reidy Interactive Lecture Series.

During the first week of August, thirteen educators from five states gathered for a three-day scoring institute as part of the Innovation Lab Network’s Performance Assessment project. The goals of the institute included attaining reliable scoring on the performance assessment the teachers had field tested in spring 2014 and informing the design of the emerging national task bank and accompanying resources to support implementation of tasks.

I had the privilege of co-facilitating the English Language Arts group. As we discussed the rubric and the annotated anchor work samples, and practiced scoring student work, the group gained a common understanding of the elements of the rubric and a level of confidence about how to apply them to student work. In the course of the three days several themes emerged that underscore some guiding principles for implementing performance assessment.

(more…)
