PARCC in Massachusetts

One teacher's attempt to connect with other educators across Massachusetts and beyond and to provide relevant, up-to-date, and sometimes helpful information on next generation assessments, Common Core State Standards implementation, and college- and career-ready initiatives

What should districts (states) do with PARCC?

After listening to and reflecting on the PARCC assessments with other teachers in Chicago, I believe the new tests hold a number of promises worth considering.  I will focus in depth on the possibilities of technology-delivered assessment in a later post, but the fact that these tests are planned to be delivered via technology, and that they form an assessment system, really forces districts (states) to think about what they should do with PARCC.  First, however, I want to address a few misconceptions.  (NOTE:  This post assumes that assessment is not the end toward which education is directed, but that it is a component.  In addition, given that federal and state accountability exams are a component of U.S. education policy, this is not intended as a critique of assessment; rather, it asks what should be done with these assessments given that they exist.)

Misconceptions

The PARCC assessment is sometimes dismissed as just another test, or is seen as a testing regimen that encroaches on classroom time and significantly expands the time spent on testing.  Neither claim is, as far as I can tell, true.

The first belief or suspicion, that these assessments will be more of the same, will be addressed in due time as prototype items and pilots are run during the 2012-2013 school year.  In the meantime, it is worthwhile to look around, imagine what these assessments might be like, and truly start to think about what technology-enhanced assessment might mean.  Here is a list of places to go (sorry ELA folks, but my eyes are usually looking for math stuff!) where you can see what others have done with technology and assessment.  These tests will be different, and not solely because of the technology, but a look in this direction is enough to start thinking more deeply about how different they will be (more on this in a later post, when I reflect on specific technology enhancements built into the assessment using UDL principles, as well as enhancements that are optional for ELL students and students with disabilities).

1.  ONPAR (Obtaining Necessary Parity through Academic Rigor) from the University of Wisconsin.

2.  CBAL (Cognitively Based Assessment of, for, and as Learning) from Educational Testing Service (ETS).

3.  Research conducted at the Shell Centre for Mathematical Education at the University of Nottingham.

The second misconception is about time spent on assessment.  The true part is that PARCC is developing a comprehensive system of five assessments; the misconception wrongly assumes that schools, districts, and states must implement them all.  States that choose to adopt PARCC and replace their existing statewide assessments in the coming years will only be required to administer two components for accountability purposes.  These two assessments, the performance-based assessment (PBA) and the end-of-year assessment (EOY), will be “free” to districts, just as the MCAS in Massachusetts is a state-funded exam with no cost to districts.  (The speaking and listening assessment is also required, but it is not used for accountability and can be given at any point in the year.)  These two assessments do not in and of themselves represent a drastic increase in testing time, though they may represent some increase.  They reflect the belief that high-stakes assessment should contain a broad set of assessment experiences that go beyond multiple-choice testing.

The interesting question is whether a particular school or district should choose to adopt the entire assessment system.

The Potential Power of PARCC as an Assessment System

The image above shows the entire assessment system with all of its components.  Typically, districts and individual schools have cobbled together an assessment system of their own, including their state assessments.  Individual teachers, schools, or districts may have a diagnostic assessment to serve as an early indicator that informs instruction, supports, and professional development.  Teachers may also have developed or purchased interim assessments to monitor student performance and to inform instruction.  Lastly, schools and districts hope these early assessments also provide information that prepares students and faculty for the state assessments.  While not universal, most districts have some assessments beyond curriculum-embedded assessments that inform what they do.

So, what is the potential power of the PARCC assessment system as a whole?  Below is a list, in no particular order, of what this system promises to offer districts and schools (states):

1.  Alignment.

Districts usually have multiple assessment products, some of which claim to be aligned to their accountability exams.  In this case, the non-accountability components are directly aligned with the assessments used for accountability.  The experience and information from the diagnostic and mid-year exams give students, families, and teachers a clear sense of how a student is progressing toward end-of-year mastery, as well as of the student’s particular growth from year to year.  This internal alignment provides districts, schools, teachers, and parents a set of assessment tools that clearly align with end-of-year goals while also making it possible to inform instruction, adjust programming, reallocate resources, and modify professional development during the school year.

2.  Timeliness.

The push to deliver assessments on computers has a variety of rationales, but one relevant to districts is time.  The turnaround time on state assessments is usually measured in months and does little to help inform teacher practice, support interventions for struggling students, or drive decisions about what is and is not working.  A technology-delivered diagnostic assessment near the beginning of the year (with an estimated two-week turnaround) and timely scoring of the mid-year performance-based assessment (not two weeks, but returned in time to still plan for the summative exams) could be hugely valuable for schools that want timely data on which to act, plan, and respond to student needs.

3.  Quality.

Quality assessments that are reliable and valid, as well as solidly aligned to the standards, are crucial if they are to be useful in any way.  So, will these tests be high quality?  Time will obviously tell, but there is reason to believe that they will be.  Here are a few reasons, in no particular order.  (1)  PARCC is made up of 23 states, all of which are involved in vetting and overseeing the development of these tests for accountability purposes (remember, these states already do this individually for their state exams).  All of these member states and their education department personnel are working hard to make these tests higher quality than their current assessments.  (2) These tests also bring higher education into the vetting process.  The higher education community wants these tests to be of such quality that students will be ready for college-level coursework.  (3) Vendor contracts for assessment delivery and development will be awarded based on the quality of the items produced, and a multitude of vendors will be involved in the process.  Vendors are currently competing for their future contracts.  (4) Given the scope and stakes of the project, leading specialists on English Language Learners and students with disabilities are actively informing the development of items to ensure that all students can be successful within this assessment system (more on accommodations and UDL in a later post).  (5) Lastly, PARCC has committed in its design blueprint to assessments that measure the full range of student performance in order to provide meaningful feedback on high- and low-performing students.  All of these parameters point toward these exams being of high quality.  We will have to see!

4.  Sharing.

Within Schools and Districts

Within schools and within districts, this whole system would give you grade-by-grade (course-by-course in high school) diagnostic exams around which to share data and plan action on best practices, supports, interventions, and professional development.  Some districts and schools already do this, but given the alignment, timeliness, and quality this assessment system promises, it could be a powerful tool for learning what is working in your school and determining which interventions are proving successful.

Across Schools and Districts

Given the broad range of districts that will potentially be part of PARCC, it is conceivable that districts and schools (states) could learn from each other and start sharing what is working as judged by these assessments.  This would obviously be a whole new world to many districts and schools, but it is conceivable that districts or schools within the same state, or in different states, could collaborate and learn from each other if they are getting information from the same assessment system.  This sort of sharing was limited with a single state assessment that occurred at the end of the year and provided feedback months later, but a more comprehensive system that isn’t solely linked to accountability could provide an easier model for sharing lessons learned with other schools and districts.

So, what should districts (states) do?  Keep watching.  See if these assessments are coherent, timely, and of high quality.  If they are, the patchwork of assessments that schools and districts typically employ may give way to a really powerful replacement.  The net result would be little increase in overall testing time, higher quality assessments, and assessment tools that are meaningful and useful for teachers and districts.  In the meantime, if you are against assessment altogether, keep writing and speaking out; but if you think it has a meaningful role in your schools and districts, now is the time to be vocal about what we (you) may want from PARCC.


5 comments on “What should districts (states) do with PARCC?”

  1. Lori DiGisi
    August 11, 2012

    Thank you Darrin! This is a great summary of the direction PARCC is going in and possible advantages. I think a question that people have, particularly ELA folks, is how will open responses or essays be scored? By machines or humans?

    • dgburris
      August 12, 2012

      @Lori-I don’t think scoring decisions will be entirely known until pilot information from the Spring of 2013 is analyzed. My sense is that, given the explicit distinction between the machine-scorable exams (EOY/diagnostic) and the performance assessment components (mid-year and PBA), human scoring will be involved at some level for these exams, but the question is still one of degree. We should place it near the front of our questions for February. If I hear/read/see anything that can say more, I will let you know!

  2. Renee Moore
    August 12, 2012

    One of my concerns at this point is that it appears that, compared to the type of assessments the other consortium is developing, PARCC still is just highly digitized more of the same approach to testing. Have you done or seen a comparison/contrast of the two?

  3. dgburris
    August 14, 2012

    @Renee Have you seen the Aspen Institute brief comparing them in a global way? http://www.acsa.org/MainMenuCategories/ProfessionalLearning/ccr/Assessment20.aspx

    As for items, I do think there will be a range of item types, but did you see some of the items in this link: http://www.mathshell.org/papers/dpthesis/appendix.htm. These were items developed at Nottingham in 2006, and I do think that clicking through these assessments demonstrates more than digitized paper; however, I will wait for prototype items to appear (I’ll blog about it as soon as I see a “real” PARCC item) before saying how different the new assessments will be.

    What makes you think that “PARCC still is just highly digitized more of the same”?

    • Renee @TeachMoore
      August 14, 2012

      Thanks for the links and additional information. Most of what I had seen and heard so far focused on the fact that the tests would be delivered digitally, but little or nothing on how the content of the test items would differ from what we’ve always had–hence my concern. I’ll look at what you’ve shared, and look forward to more of your analyses.



The opinions expressed in each post are my own and are not a reflection of the Massachusetts Department of Elementary and Secondary Education, PARCC, or any other body.
