Monday, July 31, 2017

Time for an AP Chemistry Story Slam

In 2015, I (somewhat last minute) found out I would be teaching AP Chemistry. I was pretty excited for the challenge but not feeling super prepared. I signed up for the only AP Summer Institute I could find that still had openings. It turned out to be just a four-day problem-solving session and was not super useful. The main message I left the institute with was, "don't expect any of your kids to pass the test the first year."

And I thought, "Yeah, okay, but that doesn't apply to me because 1) I work really hard and 2) I am going to be following the curriculum and pacing of a veteran AP Chemistry teacher who would be teaching the other section that year!" Admittedly, this is a pretty unique experience.

As the 2015-2016 year went on, we moved through the material really quickly and had over a month before the test for review and practice tests. At this point, I wasn't sure about my students' understanding, and throughout the year I had encountered a lot of classroom management issues that I didn't expect from AP students. Still, when the test results came out that July, I was surprised that not a single one of my students had passed. I was disappointed--but not in them.

Last year, I had a lot more confidence in the material. I incorporated design projects, inquiry, more practice, and more feedback, and we slowed down! At the end of the year I knew students would pass. Even if the test is really hard, when students understand the material they should do well, right? Maybe some would even get 4s. They deserved it. I deserved it.

Then July came. Once again, there was not a single passing score in my class. Admittedly, the other class only had one student pass. Despite all the confidence I had in May, I found myself questioning: which is a better indicator of what my students know? That impossible test? Or my own observations and assessments? I even began to wonder: do I have enough experience to be trusted to properly prepare and assess my students?


I know that when it comes to a high-stakes, 3+ hour test, there is more at play than just whether or not students understand the material. Looking ahead to the 2017-2018 year, I know that I could teach more to the test, do less inquiry, and teach more test-taking strategies. I could do those things, but I would have to sacrifice some of what I know to be best practice for helping students understand chemistry. Last year, I regularly gave students upwards of 30 minutes to work on free-response questions in groups. They asked each other questions, argued about their answers, and eventually reached conclusions I knew they understood. This is science. On the test, they are expected to answer those same kinds of questions (alone) in about 10 minutes. If I change these practices, maybe more students will pass and maybe not, but am I really helping them beyond the test?

Wednesday, March 8, 2017

A Day Without a Woman

Today is International Women's Day. You may or may not be aware of the Day Without a Woman movement, where many women are striking, marching, protesting, conversing, boycotting, and/or wearing red today to stand up for equal rights for women.

A colleague of mine in KSTF, Laura Wang, created an amazing STEM lesson plan to use on this day, with tons of ideas to spark conversations in STEM classes. I decided to try out some of it in my Chemistry and AP Chemistry classes today. While a lot of the conversation was productive, I also faced some challenges I wasn't expecting.

My hope going into the lesson was to illuminate a few key facts (using data):

  • Girls take as many science and math classes in high school as boys. This suggests that girls are at least as interested in math and science as boys.
  • Girls have higher GPAs in math and science than boys in high school. I also wanted to suggest that one reason for this might be that teachers may be more likely to offer help to girls, and boys may be less likely to ask for or accept help from teachers.
  • Boys outperform girls on AP science and math tests. I was hoping to tie this into the stereotype threat talked about in the article.
  • Girls are less likely to obtain degrees in STEM fields, especially in engineering.

I tried to make it clear to students that any gender disparities (in any subject, favoring boys or girls) were worth talking about, but that we were focusing on girls because of the day and on STEM because of the subject of our class. While I was able to have these conversations without problems in a few classes (other than the occasional "This is stupid, I don't want to do this"), one class had a particularly negative response.

I started the lesson by explaining the Unity Principles from the Women's March and how they related to the Day Without a Woman movement, and I think this is where a few students stopped listening and started forming their own conclusions about the rest of the lesson. Because I had mentioned a march/protest, I must be a hyper-feminist who wants women to take over the STEM workforce and push all men out. At least, that is what I have to assume one student inferred, because at the point when I had only given them the data and asked what the graphs showed, he refused to answer the questions and said, "This goes against everything I believe in."

What?

You can't "believe in" data. We hadn't even begun to interpret the data. I asked the student to expand on what he meant, but he wouldn't (or couldn't).

As we started to brainstorm possible reasons for the gaps, other students unleashed frustrating remarks as well. "Women have less degrees because they want to be stay-at-home moms." This statement itself is not inherently a problem. Being a stay-at-home mom is an extremely admirable job, and certainly this is why some women don't get degrees in STEM. But when I explained that this factor alone didn't account for the entire gap, students still insisted they couldn't suggest possible solutions without having that graph "fixed" to exclude stay-at-home moms. I wasn't sure where to go from there. Students started having their own side conversations. One student remarked that the graphs didn't account for other genders...a great point, but I'm certain it was only brought up to try to derail the conversation. Another student said we shouldn't even be talking about this in school because it is too political.

Again, what?

I still asked that class to take the article home and discuss it with an adult woman in their life for homework. I am hopeful that discussion will help illuminate for those students why this issue is important. I am worried, though, that some of the comments I heard today represent a larger problem society is facing regarding the tendency of those with privilege to lash out when others try to push for equity.

I'd be interested to hear others' responses in the comments.

Monday, January 16, 2017

Writing to vent outrage

I guess I knew at some point I would start blogging about education. I didn't think I had time to do it this year, let alone right now. When someone is outraged enough, venting can become writing...so here I am.

Anyone who is involved in education in Ohio is aware of the "Graduation Crisis" looming on the horizon due to new standardized tests tied to graduation (~40% of juniors are not on track to graduate). While I was fully aware that many on the State Board of Education were either unaware of the problem or unwilling to admit it existed, I did not believe they truly thought that we cared more about our students getting a diploma than learning. That is, until I read this blog post, which included that exact sentiment from State School Board President Tom Gunlock.

So, I quit eating dinner and wrote to him as well.

An open letter to Tom Gunlock, President of the State Board of Education in Ohio:

Mr. Gunlock,
I was recently directed by my principal to read your response to Matthew Jablonski’s email of concern about the “Graduation Crisis.” I was dismayed at, and quite frankly offended by, your comment that asking for the graduation requirements to be changed means we are not interested in educating our students. More than anything, I would love to educate my students, but I frequently find my passion for education and my ability to teach to be hindered by testing.
I can see how from an outside perspective, these tests may seem reasonable to you. After all, you have been told that these tests are designed to determine whether or not a student has achieved a 10th grade education before they graduate. The problem, as Mr. Jablonski tried to explain in his email, is that the skills employers want are not the skills being tested. As a teacher, I am therefore forced to choose every day whether I should prepare my students to answer questions like those that will be on the test (even though we have been given very little information on that) or to prepare them to handle novel situations in the real world.
As an Algebra 1 teacher, I will use the practice test items available for that subject to illustrate my point (see the Algebra practice test scoring guide). The most obvious problem with these items is that they are scored as either 0 or 1 point, with nothing in between. None of the students' process is rewarded, and in fact, none of the problems require students to share their thinking in any way. If the goal is to prove that students are ready for college or career, these questions do a poor job of it. Fewer than half of the questions (2, 4, 9, 10, 11, 12, 14, 16) even attempt to link the math practices tested to real-world scenarios; the rest ask students to interpret an obscure equation or graph.
I was particularly distressed by the scoring on Question 7 (page 72 of the document). The question clearly has 3 parts (graph the first line correctly, graph the second line correctly, place the star in the correct area), yet a student who gets 2 of the 3 parts correct gets the same score as a student who doesn't even attempt the problem—ZERO. Assuming that at least last year's tests were scored in a similar way, many students' scores are not even close to reflecting their ability to perform algebraic calculations.
This brings me to my next point. Conrad Wolfram, a director of the world-renowned computational software company Wolfram Research, is an outspoken critic of the traditional way of teaching math—as a set of rules and calculations. He proposes that there are four stages of mathematics in action:

1. Posing a question
2. Going from the real world to a mathematical model
3. Performing a calculation
4. Going from the model back to the real world, to see if the original question was answered.

In the 1970s, computational skills were very important to employers, ranking 2nd on the Fortune 500 list of most valued skills. Since then, computers and calculators have taken over almost all of that work, and that skill has fallen to the bottom of the list. Yet calculation is the only one of the four stages above that is assessed by the Algebra EOC. If you want to know whether students are leaving high school able to do the other three stages of mathematics, a standardized test is not the answer. Real, high-level thinking cannot be measured in a standardized way. There is also an incredible amount of research suggesting that students' brains grow the most when they make mistakes, yet these tests punish students who are anything but perfect (0 or 1). If you don't believe me, or would like to read more, I encourage you to look at any of the work done by Jo Boaler, a world-renowned professor of mathematics education at Stanford University.
I understand that the proposed purpose of these tests was to raise the level of rigor for students in order to solve a problem you outlined in your email—that students are not prepared for the workforce. As Mr. Jablonski and many other educators and administrators have tried to point out, these tests are a false solution, as they test some of the skills least important to employers.
Quite frankly, these tests are only deemed necessary because there is an unfounded lack of trust in teachers to provide a quality education. As Mr. Jablonski explained, there is no federal requirement to have a test tied to graduation. If in fact you are concerned that teachers are not providing a quality education, then I have many suggestions for how you could explore that further. One of the most powerful things you or anyone on the State Board of Education could do is come into our classrooms. I invite you to visit mine any day. If it is a matter of deciding where to invest money, I suggest you spend it improving and adding to teacher preparation programs and teacher professional development grounded in research-based practices. Instead, we are asked to spend our professional development time learning how to properly prepare students for and administer a test.
I hope you will understand that I am writing this email out of concern for my students and for education in general. I am not the type of person you suggested in your email. I have never tried to cut corners when it comes to education. I graduated at the top of my class in high school and in college. I was chosen to speak at my graduation from Michigan State University. I am a teaching fellow in the Knowles Science Teaching Foundation. I think about how I can improve my teaching all the time. In fact, I spent my entire day off today reading about teaching and revamping my lesson plans for this week (for at least the tenth time). I read Mr. Jablonski's post while I was eating dinner and literally stopped mid-meal to write to you. I am a passionate teacher who cares about my students and their future. I hope that voices like mine are important to you as the President of the State Board of Education.

Sincerely,

Beverly Stuckwisch
Licking Valley High School Chemistry and Math Teacher
Knowles Science Teaching Fellow