Exploring Trends in Engagement: Grade Levels by School Type

“Brutally simple” might be one way to describe the Wellington Engagement Index: after all, the base data point is an entry listing the student, the class, and the love it and challenge coordinates. While even this simple data can give us insight into a single student’s experience, it becomes even more interesting in aggregate. What trends might be exposed by thousands, or hundreds of thousands, of data points on student engagement? What questions might we raise?

Let’s explore this by looking at three different examples from the 2019–2020 school year. To give you an idea of the breadth of this data:

From June 2019 to June 2020, we collected data from 18,771 students in grades 5 through 12. These students attended 61 schools, including public schools, charter schools, and private schools, located in Washington State, Connecticut, and everywhere in between (a few from Spain, too!). Their experiences are summarized in a total of 304,402 dots, each one representing a single student’s reaction to their experience in a single class.

In our data, we describe the combination of a positive love it score and a positive challenge score as being engaged. This is the experience we seek to provide for every student, and where we believe the best learning happens. Students who are challenged but do not love their work, what we refer to as the grind, may learn a lot, but may be so turned off from the subject that they never wish to touch it again. When students love their class experience but are unchallenged, they may be having a good time without learning well. We call this being entertained. Finally, if students neither love their experience nor feel challenged by it, we call the experience bored: students are mentally checked out from the classroom.
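The quadrant rule above can be sketched in a few lines of code. This is a minimal illustration, not the actual Wellington Engagement Index implementation; the function name and the treatment of zero scores (grouped with negative here) are assumptions.

```python
def quadrant(love_it: float, challenge: float) -> str:
    """Classify a single dot into an engagement quadrant.

    Positive love it + positive challenge -> engaged
    Negative love it + positive challenge -> grind
    Positive love it + negative challenge -> entertained
    Negative love it + negative challenge -> bored
    (Zero is grouped with negative here; an assumption.)
    """
    if challenge > 0:
        return "engaged" if love_it > 0 else "grind"
    return "entertained" if love_it > 0 else "bored"

# Example: a student who loves the class and feels challenged
print(quadrant(love_it=0.8, challenge=0.5))  # engaged
```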

Now that we share an understanding of the year’s data, we can start analyzing data in more specific terms:

First, we will look at dots in independent schools (members of the National Association of Independent Schools), aggregated by dot quadrant and by grade level. This subset of the data represents 106,917 data points. (As a reminder: positive love it and positive challenge is engaged; positive challenge with negative love it is grind; negative challenge with positive love it is entertained; both negative is bored.)
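The aggregation described here, counting dots per quadrant within each grade and converting to percentages, can be sketched as follows. The sample dots are invented for illustration; only the shape of the computation reflects the text.

```python
from collections import Counter

# Hypothetical sample of (grade, quadrant) pairs; real data
# would hold one entry per dot (106,917 in this subset).
dots = [
    (6, "engaged"), (6, "bored"), (7, "engaged"),
    (7, "grind"), (7, "engaged"), (8, "bored"),
]

# Count dots per (grade, quadrant), and total dots per grade.
counts = Counter(dots)
totals = Counter(grade for grade, _ in dots)

# Report each quadrant as a percentage of its grade's dots.
for (grade, quad), n in sorted(counts.items()):
    pct = 100 * n / totals[grade]
    print(f"grade {grade}: {pct:.0f}% {quad}")
```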

Of note here is the shift that occurs in the middle grades, from 6 to 8. Here we see both the lowest engaged percentages and the highest bored percentages. This may seem relatively obvious: these are formative years in which students are experiencing numerous changes to their bodies and minds, so of course they will express lower love of their classroom experiences. This, the reasoning goes, is a condition that exists everywhere.

To this I would first ask: if we know that students are going through changes that may result in lower engagement in school, why have we not already taken steps to improve engagement here? School experiences in these independent schools clearly do not do enough to engage students during this strange time in their lives. Furthermore, if this is an issue that should exist everywhere, then we would also see the same trend in public schools, right?

Wrong! In fact, in the public school sample of 134,062 dots, we actually see 7th and 8th grades having the highest engagement of all grades, with the lowest engagement in 6th and 9th grades. Perhaps this low engagement is due to 6th and 9th grades often being the “transition years” for students, when they enter a new school building and face all of the changes that come with it: schedules, settings, social circles, et cetera.

It is of course important to note that my hypotheses are purely that — hypotheses. This data shows us differences in student engagement, but it does not tell us why. Similarly, the data does not tell us why the trends in engagement are different for public and independent schools — only that there are differing trends.

What is apparent in this data is that deeply interesting questions can be found, even when the question we ask students is so simple. What drives these differences could be quite profound and take quite some time to uncover, but what is important is that we now have a question to drive us forward.
