I hope this bit of humor helps as we all navigate some very turbulent times.
1. ITEMS
March Meetings:
March 10 - Shared Staff Meeting (I do not know the agenda for this meeting yet. Many moving parts make developing an agenda a bit more complicated. A draft agenda will be sent out no later than Tuesday morning. I will be asking to meet briefly with Social Studies and Science staff from both CBMS and HUMS to discuss a bit more about curriculum alignment - that is what I know right now.)
March 17 - Exhibition Team for HS - Middle School Staff meeting (we need to select spring conference dates)
March 20 - Inservice - Building Based
March 24 - Team/Collaboration/Individual Time
March 26 - LT/DH Meeting (Thurs) - updated
March 31 - Dept Mtg
2. Middle School Assembly - Harwood Auditorium - will be held on March 19th - ELO 2
3. I will be meeting with all 7th grade students during ELO 2 (I believe) in the 7th grade team area. This will allow an open discussion about concerns/questions regarding the merger (or not).
4. This is an interesting article:
A Study of Unconscious Bias
In this article in Educational Researcher, Yasemin Copur-Gencturk and Ian Thacker (University of Southern California/Los Angeles), Joseph Cimpian (New York University), and Sarah Theule Lubienski (Indiana University/Bloomington) describe an experiment they conducted on implicit bias. Teachers recruited for the study were told that the researchers were in the final stage of selecting items for an assessment that would capture the features of middle-school students’ knowledge and skills and accurately predict their mathematical growth. Teachers were asked to look at students’ handwritten solutions to the same math problems and were told that their anonymous feedback would help finalize the best items for the assessment.
To generate the students’ work, the researchers looked at a number of released NAEP extended-response math test items and chose a set that seemed likely to prompt a range of student responses. They had a group of middle-school students answer the questions and chose three of the items based on the range of responses they elicited. The researchers then chose two correct, two partially correct, and two incorrect student responses to each question, creating a set of 18 solutions. Here’s one example (with the student’s actual handwritten response in red):
Question #1:
The growing number pattern below follows a rule.
3, 4, 6, 9, 13, …
(a) Explain the rule.
[Student work, labeled “Connor”: a column subtraction showing 13 - 9 = 4, with the written explanation “your adding 1 every time. 1 + 3 = 4 + 2 = 6 + 3 = 9 + 4 = 12”]
When selecting the final items, the researchers made sure that the handwriting and language used did not align with potential gender- or ethnicity-related stereotypes.
The researchers then randomly assigned a student name to each of the solutions, inserting the name in handwritten form by each item, choosing names associated with black, white, and Hispanic girls and boys:
Black girls: Lakisha, Shanice, Tanisha; white girls: Emily, Katie, Molly; Hispanic girls: Bianca, Esmeralda, Rosalie
Black boys: Tyrone, DeShawn, Trevon; white boys: Connor, Ethan, Todd; Hispanic boys: Alejandro, Diego, José
Students’ names were evenly distributed among correct, partially correct, and incorrect responses.
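If it helps to picture that counterbalancing, here is a rough Python sketch of one way names could be spread evenly across correctness levels. It is purely illustrative, not the researchers’ actual procedure; the group labels and randomization steps are my own assumptions, and only the names come from the list above.

import random

# Six name groups (race x gender), three names each, taken from the article.
name_groups = {
    "Black girls":    ["Lakisha", "Shanice", "Tanisha"],
    "White girls":    ["Emily", "Katie", "Molly"],
    "Hispanic girls": ["Bianca", "Esmeralda", "Rosalie"],
    "Black boys":     ["Tyrone", "DeShawn", "Trevon"],
    "White boys":     ["Connor", "Ethan", "Todd"],
    "Hispanic boys":  ["Alejandro", "Diego", "José"],
}

qualities = ["correct", "partially correct", "incorrect"]

# Within each group, send one name to each correctness level, so every
# group appears once at every level (one reading of "evenly distributed").
pool = {q: [] for q in qualities}
for group, names in name_groups.items():
    shuffled = names[:]
    random.shuffle(shuffled)
    for quality, name in zip(qualities, shuffled):
        pool[quality].append(name)

# The 18 solutions: 3 items, each with 2 responses per correctness level.
for quality in qualities:
    random.shuffle(pool[quality])
    slots = [(item, copy) for item in (1, 2, 3) for copy in ("a", "b")]
    for (item, copy), name in zip(slots, pool[quality]):
        print(f"Item {item}, {quality} response {copy}: {name}")

Each run pairs every group of names once with each level of correctness, which is the kind of balance the study description calls for.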
Participating teachers were asked to rate each student’s answer based on correctness (on a 10-point scale from “absolutely nothing correct” to “fully mathematically sound”). Teachers were then asked to assess each student’s mathematical ability.
The researchers looked at teachers’ responses with two questions in mind: Did teachers’ evaluations of the correctness of students’ solutions and of students’ mathematical ability vary by gender and race? And did teachers’ background characteristics predict biases? Here are the findings: