Diagnostic Assessments

InTASC Standard 6: The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher’s and learner’s decision making.

Introduction

In order to successfully close the gap between where a student is and where he or she needs to go, a teacher must answer three questions: where is the student going, where is the student now, and where to next (Heritage, 2010). Each student comes to our classroom with an individual culture, set of values, beliefs, and body of prior academic knowledge. Each of these aspects of a student can and should be leveraged to enhance the learning environment for all. Prior academic knowledge is particularly important because, while each course comes with a built-in common finish line (the standards), each student begins at his or her own starting line. We, as educators, need to assess our students' prior academic knowledge so we can fill in gaps, effectively relate past learning to new ideas, and scaffold appropriately. It would be impossible to effectively guide our students along their educational journeys if we did not know where each one is starting. Diagnostic assessments are the tool we can use to find each student's starting line.

Atomic Theory Diagnostic

In my classroom, I give students a diagnostic at the start of every unit and often during the unit before presenting new complex information. Below you will find several student samples of the ‘atomic theory’ diagnostic given at the start of our unit dedicated to the Periodic Table. 

In chapter 2 of Instructional Planning & Delivery, two types of diagnostic questions are identified: readiness questions and pre-test questions. Readiness questions test a student's prerequisite knowledge, while pre-test questions test a student's mastery of grade-level learning (Teach for America, 2011). Each type of question is valuable, and I include a mix of both on the start-of-unit diagnostics in my class. In the samples below, questions one and two are designed to assess readiness, and questions three through five are designed to assess understanding of high school chemistry content.

Student samples of diagnostic assessment

The diagnostic shown on the left is also scaffolded, increasing in difficulty as you go down the page. It begins with a fairly simple question that students should have learned in previous classes according to the Next Generation Science Standards (NGSS) for middle school: the charge of each of the three subatomic particles. Then a common-knowledge question is asked: how do magnets and attraction work? Both of these concepts need to be mastered before the structure of the atom can be understood. While these topics were covered in previous classes, many students have a tendency to memorize facts for an assessment and then immediately forget them (Marzano, 2006). It is therefore important to check for retention of these previously learned facts.

 

Next, several pre-test questions of increasing difficulty are included. The first asks students to draw a specific model of an atom, which requires them to be familiar with the scientist and his experiment as well as with the subatomic particles. Finally, a content question about the periodic table is included; this question requires the greatest amount of content knowledge to answer correctly.

 

Then, I included a question to assess students' attitudes toward the class and the topic. I asked students how they think the class could be improved as a way to gauge their interest. Their answers allow me to modify the unit, make the class more enjoyable, and increase engagement.

 

Finally, students were given space to write everything they have ever heard about the topics we are about to cover in class. This space allows me to check for misconceptions and for general knowledge students may hold about atoms, molecules, elements, and the periodic table.

Why Is There No Feedback Provided?

In the video clip “Diagnostic Assessment,” it is stated that diagnostic assessments are done not for assessment purposes but for instructional purposes (Laureate Education, Inc., 2012). These assessments can tell you where to begin and also let you know “how far you’ll need to travel” (Teach for America, 2011, p. 28). While feedback is a crucial component of most assessments, I do not return diagnostic assessments to students and thus do not mark them with extensive feedback. Students likely do not know some or most of the information on a diagnostic; ideally, they will know the answers to the readiness questions, but they will likely not know the answers to the pre-test questions. So as not to discourage students, I do not return diagnostic assessments. Instead, I collect them and use them in my planning for the unit.

How Do I Use This Data?


Kagan inspired table chart for collaborative grouping


Recreation of class seating chart with students' names replaced with diagnostic scores

The small learning community in which I teach, the SMART academy, places a strong emphasis on the Kagan method for collaboration, and all new teachers are required to attend a multi-day professional development workshop on these techniques. I use the data about students' prior knowledge collected from diagnostic assessments to make purposeful table groups and partner pairings. Kagan structures for collaborative grouping place four different achievement "levels" of student in each group: a 'high,' a 'medium-high,' a 'medium-low,' and a 'low' student. Having all four levels at each table allows for maximum support. Students with higher levels of prior knowledge can support students who are struggling to access the content, and the 'high' student benefits as well, because explaining a concept to a partner is one of the most effective ways to learn it and retain it in long-term memory. Conversely, the 'low' students have someone at their table they can ask for assistance if I am personally assisting a different group. This structure maximizes learning and holds all students accountable to their partners.

 

To tailor the Kagan method to my classroom environment, each student has an 'ionically bonded' and a 'covalently bonded' partner. This serves the additional purpose of allowing quick and easy transitions from whole-class instruction to partner discussions and other collaborative efforts. This system would not work without effective diagnostics to place students into the appropriate learning teams.

For the “Atomic Theory Diagnostic” included above, there were 15 points possible. When I give a diagnostic to my students, I grade each one and organize them by score from lowest to highest. I then sort the assessments into four stacks: low, medium-low, medium-high, and high scores. Once these four stacks have been made, I consider student relationships and dynamics and select one student from each level to create collaborative table groups of four. During this process, I can also give students with IEPs and English language learners preferential seating close to the board or the front of the room based on their specific needs. The document on the left shows an example seating chart for one of my classes; it is a recreation in which I have replaced students' names with their scores on the diagnostic to show the organization system. Once this chart has been created, I further shuffle the students around each table so that it is not evident which student falls into which of the four achievement levels. These assessment- and data-driven seating charts and collaborative teams are one method I use to allow assessments to guide both my students' and my own decision-making.
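The sorting-and-stacking process above is essentially an algorithm, and it can be sketched in a few lines of Python. This is only an illustration of the mechanical step; the student names and scores below are invented, and the real process also weighs relationships, IEPs, and language needs by hand.

```python
def make_table_groups(scores):
    """scores: dict mapping student name -> diagnostic score (out of 15).
    Returns table groups of four, one student from each achievement level."""
    ranked = sorted(scores, key=scores.get)  # lowest to highest score
    quarter = len(ranked) // 4
    # The four stacks: low, medium-low, medium-high, high.
    low = ranked[:quarter]
    medium_low = ranked[quarter:2 * quarter]
    medium_high = ranked[2 * quarter:3 * quarter]
    high = ranked[3 * quarter:]
    # Each table group draws one student from each stack.
    return [list(table) for table in zip(low, medium_low, medium_high, high)]

# Hypothetical class of eight students with diagnostic scores:
sample = {"A": 3, "B": 5, "C": 6, "D": 7, "E": 9, "F": 10, "G": 12, "H": 14}
print(make_table_groups(sample))  # → [['A', 'C', 'E', 'G'], ['B', 'D', 'F', 'H']]
```

In practice the groups produced this way are only a starting point; as described above, students are then adjusted for classroom dynamics and shuffled within each table so the achievement levels are not visible.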

I also use diagnostic assessment data to guide my decision-making about the order of instruction and the material we need to cover in class. As previously stated, several of the questions on the diagnostic above are readiness questions designed to test prerequisite knowledge. For example, students should have covered the basic properties of subatomic particles in their middle school science classes. However, many students did not correctly answer the first question, which asked for the charge of each of the three particles. Originally, I had planned to skip this topic, operating under the assumption that students already knew the content. It became clear, however, that many students would benefit from a review of this foundational material. I used this gap in knowledge, identified by the diagnostic assessment data, to guide my decision-making, and I modified the first lesson of the unit. The slides on the left were added to the beginning of the first lesson as a review of the basic properties of subatomic particles.

Finally, at the end of the unit, I pass back students’ start-of-unit diagnostic with their end-of-unit written summative assessments. This allows students to engage in their own growth and see what they have learned and the progress they have made during the unit.

Modified lesson presentation based on diagnostic data

Reflection

When I first started teaching, diagnostic assessments were drastically underused in my classroom. I gave one to students during the first week of my first year to assess foundational math and science skills; the average score was two out of 30. These scores discouraged me, and for a while I stopped implementing this form of assessment entirely. Reflecting on that original diagnostic, however, I now see that it was poorly written, did not take the needs of my actual students into account, and did not assess the knowledge I had intended it to assess. In order to successfully close the gap between where a student is and where he or she needs to go, a teacher must answer three questions: where is the student going, where is the student now, and where to next (Heritage, 2010). By not using diagnostic assessments during my first semester in the classroom, I was unintentionally ignoring one of these three key aspects of an effective education. This was a disservice to my students, and I have done everything in my power to remedy this past mistake and to grow in this area. I now strive to write and implement diagnostic assessments that authentically check both readiness and pre-test knowledge, and to use the resulting data to monitor student progress, engage my students in their growth, and guide decision-making in my classroom.

References:

 

Heritage, M. (2010). Formative assessment: Making it happen in the classroom. Thousand Oaks, CA: Corwin.

Laureate Education, Inc. (Executive Producer). (2012). Diagnostic assessment. Baltimore, MD: Author.

Marzano, R. (2006). Classroom assessment & grading that work. Alexandria, VA: Association for Supervision and Curriculum Development.

 

Teach For America. (2011). Instructional planning & delivery. Retrieved from http://www.teachingasleadership.org/sites/default/files/Related-Readings/IPD_2011.pdf

©2018 by Alex Gergely Portfolio.
