School Board Meeting 10/12/2021

The main focus of the board meeting on 10/12 was a discussion of student achievement data and what data we want to use to evaluate our schools. I submitted a written public comment and also spoke at the board meeting on this issue. I do have a few comments on the presentation of the SOL results, but I want to first highlight the excellent discussions during the Q&A portions of the meeting, particularly about what data we want to use and what its purpose is. Dr. Anderson (no relation) brought up the utility of having a more longitudinal dataset that would provide a better view of how individual students are learning. I wholeheartedly support looking into this kind of data. If I recall correctly, the February learning analysis using STAR data (the assessment tests the schools use) was designed to look at students in a longitudinal format. Not only would this get us away from looking at pass rates (which provide only a blunt signal of performance), but it would also give indicators of academic growth (I sketch what I mean just after the list below). Other points of interest from the 3 Q&A sections include:

  1. Ensuring the measures we use take into account differential participation. With whatever data we use for evaluation, we need to be aware of which students are included and which students are not.

  2. Looking at what data to include that would highlight the career-oriented (rather than academic-oriented) students. None of my immediate family graduated from college, and all work or have worked in some sort of trade (be it construction, hair styling, or mechanical repair), so I really appreciated this issue being raised. It may be helpful to survey students on what they think of the opportunities provided. It may also be useful to have a post-graduation survey to understand employment opportunities or trade school participation.

  3. Discussion of the value of deep learning that occurs during many extra-curricular activities. Ensuring that students are actively invited to these opportunities (rather than passively notified of them) can help alleviate participation disparities in extra-curricular activities.
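
To give a concrete sense of the longitudinal idea Dr. Anderson raised, here is a minimal sketch of per-student growth. The data, scores, and column names below are made up for illustration; this is not the actual STAR export format or any FCCPS data, just the shape of the analysis.

```python
import pandas as pd

# Made-up student-level assessment records: one row per student per test window.
# (Column names are placeholders, not the real STAR schema.)
scores = pd.DataFrame({
    "student_id":   [1, 1, 2, 2, 3, 3],
    "window":       ["fall", "spring"] * 3,
    "scaled_score": [480, 510, 430, 445, 520, 560],
})

# Pivot to one row per student so growth is a simple column difference.
wide = scores.pivot(index="student_id", columns="window", values="scaled_score")
wide["growth"] = wide["spring"] - wide["fall"]

print(wide)                     # per-student growth, not just a pass/fail flag
print(wide["growth"].median())  # one possible cohort-level growth summary
```

The point is that the unit of analysis becomes the individual student, so growth can be summarized directly instead of being inferred from pass rates of different cohorts in different years.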

I agree with the many comments noting that SOLs should not be used as the single or even most important data point. We should take a much more holistic view of our children’s education, so the following discussion should not be understood as defending an “only SOL” view. My view is that if we have useful data, we should use and present it as best we can (while also understanding its limitations), and it should be presented with the goal of showing the board and community where students’ academic preparation currently stands. Last year was tough, and it is okay to acknowledge how big of a struggle it was.

As was mentioned by some board members, we must be careful in what data we use for evaluation because participation might be subject to self-selection by students. SOLs provide a glimpse of the academic preparation of the student body without that self-selection, making them less biased than many of the other data sources mentioned in the presentation. This is the biggest benefit of looking at SOL data.

With all that being said, my 2 main concerns (shared by some board members) with the SOL analysis were 1) the lack of comparisons and 2) the lack of disaggregation by demographics. Regarding 1: while comparisons of FCCPS schools to comparable schools can be useful for some things, I am not too concerned with comparisons to peer groups. It is the lack of multiple years of comparison that makes it hard to understand the 2020-2021 SOLs (or any of the other data presented). Regarding 2: the FCC school board and community have a deep commitment to equity. While one purpose of the SOLs is to identify students who need extra resources, another function those data serve is giving the board and community some indication of whether the commitment to closing gaps is being fulfilled. That’s what makes the absence of demographic data so noticeable. Additionally, one thing missing from the slides about the non-SOL data sources was the potential equity issue those sources raise due to student self-selection. I think the board did an excellent job of ensuring that topic was discussed and highlighted throughout the discussion.

I understand why the schools presented the data as they did (particularly because they thought the various data were not comparable). The SOL participation rate for FCCPS was 96 percent (down only 3 percent from normal), while the state participation rate was around 77 percent, and there were changes in the rules for getting verified credits and retaking SOLs that applied to end-of-course (high school) SOLs. These seem to be the main caveats with the data, but I (as a statistician) do not see them as reasons not to perform a full data analysis, particularly for grades 3-8 (as I mentioned in my previous post on the SOL data). Indeed, Arlington County provided a breakdown of its 2020-2021 data in August, with 3 years of results for all students and by grade, as well as for English learners, students with disabilities, and economically disadvantaged students.

My ideal presentation would have been: a brief overview of the data; the results in historical context; a discussion of any concerning results (like the 62 percent pass rate for 6th grade math); a look at the various demographic groups, noting anything concerning (like the 13 percentage point decrease in math pass rates for Hispanic students or the 14 point decrease for students with disabilities); any bright spots or caveats highlighted, preferably written on the relevant slide; an update on interventions planned for the students; and then a discussion of alternative data sources.
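
For what it is worth, here is a rough sketch of the kind of tabulation I have in mind: pass rates by year alongside the same rates disaggregated by subgroup. The records and column names are invented for illustration (not the VDOE or FCCPS export schema), and the subgroup flags stand in for whatever demographic fields the real data carry.

```python
import pandas as pd

# Invented student-level math SOL records; columns are placeholders, not the
# actual VDOE/FCCPS schema.
sol = pd.DataFrame({
    "year":     [2019, 2019, 2019, 2021, 2021, 2021],
    "grade":    [6, 6, 6, 6, 6, 6],
    "hispanic": [True, False, False, True, False, False],
    "swd":      [False, True, False, False, True, False],
    "passed":   [True, True, True, False, False, True],
})

# Overall pass rate by year and grade: the historical-context view.
overall = sol.groupby(["year", "grade"])["passed"].mean().mul(100)

# Disaggregated pass rates for two subgroups: the equity check.
hispanic = sol[sol["hispanic"]].groupby("year")["passed"].mean().mul(100)
swd = sol[sol["swd"]].groupby("year")["passed"].mean().mul(100)

print(overall)
print(hispanic)  # the year-over-year change here is the "13 point" style comparison
print(swd)
```

Once the data are in this shape, both the multi-year comparison and the demographic breakdown are one-line group-bys, which is part of why I do not see the caveats above as a reason to skip the analysis.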

I found the discussion of data sources for evaluating the schools excellent. It is an important conversation to have as the board takes up the development of the strategic plan. We should remember that data analysis serves two functions: 1) helping the schools identify individual students who need extra help and 2) providing the board (in its oversight role) with the means of evaluating whether the goals in the strategic plan are being met. The discussion last night showed thoughtful creativity in how to ensure FCCPS uses data that best performs those functions, and I am excited about the future possibilities for FCCPS.
