Wednesday, May 07, 2008

Board Benchmarks - Passed Over Again

UPDATE: The board chairman has once again passed over my request to serve on a board sub-group. The members serving on this group are Pedersen, Cook, and Bailey.

----------------------------------------------------------------------

At the recent work session the board reviewed the benchmarks approved in 2005 (see below). It was decided to remove Benchmark #6. The remaining benchmarks will be reviewed, reworded, and updated by a yet-to-be-determined sub-committee. I have requested to serve on this committee. Please provide your feedback.

Benchmark #1
"Charles County Public Schools SAT participation rate and average scores will increase annually for each school. Participation is optional. (Approved October 11, 2005)

Benchmark #2
"Charles County Public Schools will rank in the top ten of the 22 school systems according to the Washington Post Index Formula for Advanced Placement. At least 81 percent of AP course enrollment will take the AP exam. (2004 CCPS is 81 percent) The average score will increase. (2004 CCPS average score is 2.24)" (Approved October 11, 2005)

Benchmark #3
"Charles County Public School graduates receiving Scholars recognition at graduation will increase annually by at least 1 percent at each school." (Approved October 11, 2005)

Benchmark #4
"The Superintendent shall continue to hire and retain highly qualified (trained/expertise in the class subject), competent, and dynamic teachers. The percentage of classes taught by highly qualified teachers will increase annually. (2004=51 percent or 3,484 of 6,838 classes.) The percentage of teachers receiving a Highly Effective or Outstanding on their evaluation in the area of Teaching Power will increase annually." (Approved October 11, 2005)

Benchmark #5
"The Superintendent shall continue to implement the standard for student behavior as contained in the Student Code of Conduct. In addition, no school in the Charles County Public School system will be classified as persistently dangerous by the state standard. (Approved October 11, 2005)

Benchmark #6 (REMOVED)
"The Superintendent shall continue to aggressively seek ways to decrease the percentage of student enrollment above core capacity in each Charles County public school annually."

Benchmark #7
"The Superintendent shall create and administer to parents a school satisfaction survey, including but not limited to family involvement, communication, and overall satisfaction, upon completion of the state survey. The results of this survey will improve annually. (See Maryland Parent Advisory Council Preliminary Recommendations)" (Approved October 11, 2005)

Benchmark #8
"Charles County Public Schools will exceed the High School Assessment state averages, in order to achieve Maryland State Department of Education graduation mandates. The classes involved are Algebra I, Biology, Government, and English II." (Approved October 11, 2005)

Benchmark #9
"Charles County Public School system will comply with the federally mandated No Child Left Behind (NCLB) requirements by exceeding the state performance in Adequate Yearly Progress (AYP) in the 20 mandatory reporting areas and in the number of schools making AYP. No school will be in program improvement." (Approved October 11, 2005)

Benchmark #10
"Charles County Public Schools will increase the number of courses utilizing technology support as a major component for the instructional program and increase the number of courses offered in distance learning annually." (Approved October 11, 2005)

11 comments:

Anonymous said...

BENCHMARK 1 – SAT
· This benchmark has two parts
o Increase in participation rate (listed by school)
o Increase in average scores (listed by school)
· Why are they STILL using the number of participants instead of the percentage of participants? This is most likely a technique to soften the blow that fewer and fewer students are taking the SAT. Take Lackey, for example. It looks like only 4 more students took the SAT in 2007. If enrollment went down, they might be able to boast an increased % of kids taking the test. If enrollment stayed the same, the % would stay the same. If enrollment went up, they could actually see a decrease in the % of kids participating (see the sketch at the end of this comment). Because this is a growing county, you must report percentages. Your statisticians should know this. Ask them why they are reporting in this fashion. Don’t let them tell you that the %’s are on the next page; those %’s have nothing to do with the %’s you need to look at here.
· The table shows 5 schools and 4 categories (reading, math, writing, total). That’s 20 markers. Of the 20 markers, 13 of them (65%) went down, wow! Two of them (10%) stayed the same, and 5 of them (25%) went up. Draw your own conclusions.
· School by school comparison
o Lackey - # taking increased (so what, need %), reading stayed the same, everything else went down
o La Plata – down across the board. That’s bad! Fewer people taking the test usually raises the average score.
o McDonough – # taking and math decreased, reading and writing increased, and total score stayed the same
o Stone - # taking, reading, and writing decreased; math and total score increased
o Westlake – down across the board except for a slight increase in reading
o County – down across the board except for reading, which stayed the same
o Maryland – down across the board except for # participating, which went up by 1,327. That is a 2.9% increase in the NUMBER (not to be confused with true percent participation).
o National – down across the board except for # participating, which went up by 28,787. That would be a 1.9% increase from the previous year’s number, but remember – not knowing the total number of seniors, there’s no way you or I can figure out whether the PERCENTAGE of participation went up or down. In other words, I can tell you the NUMBER went up by 1.9%, but that’s not the same as saying the PERCENT PARTICIPATION went up by 1.9%.
· Why are they comparing only two years’ worth of data? You NEED to see a TREND in order for you – supposedly “accountable” board members – to draw logical conclusions and take appropriate actions! Trends cannot be determined from two years’ worth of data, and past data is available; they showed you multiple-year trends for the HQ teacher benchmark, the scholars benchmark, and the AP benchmark, so reporting multiple years on one chart can be done (see page 10). Why not construct the table so that they add a new year to last year’s table and keep 10-12 years on the chart at a time? Twelve years is essentially the span of a child’s formal education, so parents and students could track the system’s progress from the time a child enters school to the time he graduates. Are your employees incompetent, or are they just hoping you’re too busy or brain-dead to remember all these numbers from year to year? My gut tells me the TREND is NOT pretty! I would suggest the board decide on a range of years (5-20) and require that all benchmarks and future reports carry that many years’ worth of data. Keep the reporting uniform, too: if Venn diagrams are used one year, your staff should NOT switch to pie or bar graphs to present the same concept the next time around. Information should be presented to board members so that it is easy to read and complete. It should not require this much homework on your part! Your job is to digest the material and make governance decisions; your time should not be spent trying to fill in the gaps where staff has dropped the ball or obfuscated information.
· The benchmark specifically references the participation RATE. A rate is a ratio (miles per hour, dollars per gallon, books per person, test takers per graduating senior) and is most accurately reflected as a %. A percentage must be used because the denominator of the equation (graduating seniors) fluctuates yearly.
· That pie chart is a bunch of gobbledygook! Why did they get rid of the Venn diagram? So you couldn’t easily compare last year to this year?
· Touts that 89% of the class of 2007 had post-secondary plans aside from entering the workforce. This appears to falsely insinuate that 89% were college-bound. Why are so few of your seniors taking the SAT/ACT? (Yeah, yeah, I know the CSM loophole factors in, but still…)
· Page 2 – What is the difference between “Other Post-Secondary Plans” (11%) and “Other” (2%)? My guess is it’s starting college in the spring vs. getting married/joining the Army/etc. I’d venture to say the vast majority of kids who start college in the spring do so because they are wait-listed by the college, not because they really planned it that way. And to have close to 11% wait-listed is very sad.
· I know that CSM administers its own entrance test, so I’m assuming that is the “CSM” (23%) part of the pie. If students don’t take the SAT or don’t receive a certain score on a subset of the SAT, CSM requires the applicant to take the CSM test. But what is the difference between “SAT, CSM” (4%) and plain old “SAT” (16%)? Same goes for “ACT, CSM” vs. plain old “ACT”, and “SAT, ACT, CSM” vs. plain old “SAT, ACT”. I’m guessing that the kids who took the “real” college admission tests did better than those the guidance counselors “hid” by encouraging CSM testing. Counselors sometimes do this after they review a kid’s PSAT score.
· Since 23% – close to a quarter! – of your kids take the CSM test, why aren’t you tracking those scores to monitor your success, or lack thereof?
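
For anyone who wants to run the count-versus-rate arithmetic from the bullets above, here is a minimal Python sketch. Every figure in it is invented for illustration; the report does not give the senior headcounts you would need to compute the real rates.

def participation_rate(test_takers, seniors):
    # Rate = test takers per graduating senior, expressed as a percent.
    return 100.0 * test_takers / seniors

# Hypothetical growing school: 4 more kids took the test, Lackey-style,
# but the senior class grew even faster.
rate_y1 = participation_rate(200, 400)      # 50.0%
rate_y2 = participation_rate(204, 450)      # ~45.3%

count_change = 100.0 * (204 - 200) / 200    # the COUNT rose 2.0%...
print(f"count: +{count_change:.1f}%, rate: {rate_y1:.1f}% -> {rate_y2:.1f}%")
# ...but the RATE fell from 50.0% to 45.3%.

The same arithmetic applies to the Maryland and national rows: a 1.9% rise in the number of takers tells you nothing about the rate unless you also know how much the senior class grew.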

Anonymous said...

BENCHMARK 2 – AP
· This benchmark has 3 parts
o Rank in the top ten of the 22 school systems reviewed by the Wa Po
o At least 81% of AP course enrollment will take the AP test
o The average score will increase
· SPELLING - Benchmarks is spelled wrong in the title (extra a). Correct it before it becomes an official document of the Board of EDUCATION.
· Part 1 - Ranking
o They are correct: the formula has changed. The old benchmark was #10 out of 22 systems, i.e., the top 45%. With 185 schools now ranked, the top 45% works out to a ranking of 83. Therefore Stone, Westlake, and Lackey (more than half of our schools) did not meet the equivalent of the benchmark.
o Remind your colleagues that the Wa Po formula is flawed because it is based on the number of kids sitting in the class, not on whether they passed the test or the class. You are doing your students a great disservice if the majority of them (mostly studious kids to begin with) do not have a sufficient grasp of the material to pass the test. The student might be better off enrolling in CSM; if they pass the class, they are guaranteed the credit.
· Part 2 – Enrollment
o WOW! Look here on page 10. A grid that shows 7 years’ worth of AP testing. Why couldn’t the same be done for the SAT? :)
o But, before you go patting them on the back too enthusiastically, ask them whether these numbers mean that every child took the test for every class he sat in, OR whether a child enrolled in several AP classes who took just one of the tests counts as fully tested. Personally I think it is the latter, because of the terminology they use, “Percent of Enrollment Tested.” Example – all children enroll in 3 AP classes, but each child takes only 1 of the AP exams. If calculated honestly, that “enrollment tested” figure should be 33%. If data diddling is occurring, the “enrollment tested” figure would be 100%, because 100% of the kids enrolled in the AP PROGRAM (not the classes) were tested. So, what formula is your statistician using to determine the % tested? (See the sketch at the end of this comment for the two possibilities. What is the definition of “is”? :) )
o Now, if you really have the political intestinal fortitude, ask what the system plans to do to bring up the % of white kids enrolling in AP classes. You could even say “minority kids,” since whites are a minority now in CCPS, right? (I think I read somewhere that CC is now a majority-minority school system.) Caucasians have “only” a 185% increase versus every other category, which is nearly 1.5 to 3 times that rate. After all, if the shoe were on an African American foot, wouldn’t that be a valid question?
· Part 3 – Average scores
o Do you think it’s odd that this part of the benchmark was not addressed? “Average score in 2004 was 2.24.” This report is for 2007!!!! What happened to the 7-year grid they showed for % participation?
o “The average score will increase.” If you really want accountability, it should be reworded to say, “The average scores for each subject in each school and for the county as a whole will increase annually.” EACC will shoot that one down faster than an Air Force fighter jet would shoot down an Iraqi missile over the White House! But the board is either accountable or it’s not, and based on campaign promises...
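
To make the “Percent of Enrollment Tested” ambiguity and the ranking math concrete, here is a quick Python sketch. The 185-school count and the top-45% cutoff come from Part 1 above; every student number is made up for illustration.

# Part 1: the old benchmark (top 10 of 22 systems) is roughly the top 45%.
# With 185 schools now ranked, the equivalent cutoff is:
cutoff = round(0.45 * 185)   # = 83, so 83rd place or better

# Part 2: two readings of "Percent of Enrollment Tested".
# Hypothetical: 100 students, each enrolled in 3 AP classes, each taking 1 exam.
students = 100
class_enrollments = students * 3    # 300 AP seats
exams_taken = students * 1          # 100 exams

per_seat = 100.0 * exams_taken / class_enrollments  # 33.3% -- the honest figure
per_student = 100.0 * exams_taken / students        # 100.0% -- the flattering figure

print(f"cutoff: {cutoff}, per seat: {per_seat:.1f}%, per student: {per_student:.0f}%")

Both formulas are defensible in isolation, which is exactly why the board should pin down which one the statistician is using before patting anyone on the back.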

Anonymous said...

BENCHMARK 3 – SCHOLARS PROGRAM
· The benchmark specifically requires an annual one PERCENT increase for each school. I’m beginning to think that the people submitting this report consider the board members too incompetent to know the difference between percentages and head counts. MAKE THEM DO THE MATH SO YOU GUYS CAN DO YOUR PART. It’s obvious that the benchmark was not met. That’s not a crime in itself; the crime is the way they obscure the information.
· Notes
o Long term
· La Plata’s and Stone’s NUMBERS have increased a fair amount since 2001 (but what about their enrollment? That’s why you NEED to see %’s).
· Lackey’s NUMBERS have increased significantly – good for them! My guess is that their %’s have too.
· McDonough and Westlake are essentially no better off than they were in 2001. How much time and $ have you invested in this program over the past 7 years? Your efforts may not be cost-effective.
o Short term
· McDonough and Stone took significant nosedives from ’06 to ’07 (and it isn’t because the kids were transferred to North Point. There were, and still are, no 12th graders at NP, and the scholars benchmark is specifically a 12th grade benchmark.)

Anonymous said...

BENCHMARK 4 – TEACHER RETENTION AND HIRING OF HIGHLY QUALIFIED TEACHERS
· There are two requirements for this benchmark
o % of classes taught by highly qualified teachers to increase annually
o % of teachers receiving a Highly Effective or Outstanding on their evaluation in the area of Teaching Power will increase annually
· Part 1 - # classes w/ HQ teachers
o They have given you 4 years’ worth of data! Significant improvement every year – GREAT!
o However note the title of their % column. They are only looking at core academic subject classes. Apparently your special education, foreign language, gym, music, art, technology, etc. students are not worthy of HQ teachers since your system has chosen not to monitor/report those figures to you. According to what I’ve read about NCLB, I’m pretty sure ALL teachers must be highly qualified. Why should a foreign language teacher be off the hot seat when in fact foreign language is a graduation requirement? Same goes for technology and arts credits.
o THE BENCHMARK DOES NOT SPECIFY CORE ACADEMIC SUBJECT TEACHERS – IT SAYS, “the percentage of classes” not “the percentage of core classes.” They have presented you with a misleading chart.
· Part 2 – teacher evaluation
o This part was never even addressed! Ask them why they failed to address the second part of the benchmark. It’s not as if you’re asking them to publish a teacher’s individual evaluation. You are asking for an overall statistic in order to get the flavor of a specific aspect you deem important for the teachers you employ.

Anonymous said...

BENCHMARK 5 – STUDENT BEHAVIOR
· This benchmark was never addressed. Ask why.
· There are two requirements for this benchmark
o Superintendent to continue to implement the standard listed in the Student Code of Conduct
o No school will be classified as persistently dangerous as defined by state standard
· The first part of the benchmark is somewhat subjective. How can you really measure it aside from the gut feelings you have after attending discipline hearings? I’m not saying throw this part out; it’s just hard to measure objectively.
· The second part is definitely measurable, and the information is easily obtainable. Why in the world did they not give you that information? A quick call to MSDE might yield it. Hopefully what you find out will not give them heartburn when you disclose it on camera.

Anonymous said...

BENCHMARK 7 – PARENT SURVEY
· This benchmark was never addressed. Ask why.
· Two requirements for this benchmark
o Create and administer to parents a school satisfaction survey
o Results will improve annually

Anonymous said...

BENCHMARK 8 – HSA (page 7)
· Great! It looks like this benchmark was met... for 2007. What happened to the other years? To reach logical conclusions and take appropriate actions, you need to see trends! If you’re steering a ship back onto course, you don’t look at where you were 5 feet ago; you look at the entire picture.

Anonymous said...

BENCHMARK 9 – AYP
· There are 4 requirements for this benchmark
o Compliance with NCLB
o Exceed state performance in AYP percentage in 20 reporting areas
o Exceed state performance in # of schools making AYP
o No school will be in program improvement
· Part 1 - compliance
o This was never addressed. Ask why.
· Part 2 – 20 reporting areas
o See page 4, last bullet. HOW is this measurable to YOU? Where’s the data to support that CLAIM?
o Page 4, bullet 1. And that percentage would be... [drumroll]? You guys are just supposed to sit there like bobbleheads and swallow these claims?! Where is the data that turns these claims into facts?
o See page 6. I’m assuming these are the “20 reporting areas.” WHERE ARE THE NUMBERS? How does one exceed “met”? The benchmark requires CCPS to EXCEED.
o The benchmark requires CCPS to exceed state performance in AYP PERCENTAGE... Do you see one % sign on this page?!
o This is only for 2007 – Where are the TRENDS??? This is even worse than the SAT table.
· Part 3 - # schools making AYP
o This part was never addressed
o Truthfully, this part of the benchmark seems a little weird to me, maybe because I don’t know the original intent. The state has hundreds of schools; CC has about 35. CCPS would never be able to exceed the state in the NUMBER of schools that make AYP. BUT, if you were to look at the PERCENTAGE of schools making overall AYP (slightly different from the benchmark’s second bullet, where it’s broken down into 20 areas), you might get data something like, “75% of the Maryland schools met AYP and 86% of CCPS schools met AYP.” I really don’t know. The benchmark wording is confusing. Might want to tweak the wording on this one.
· Part 4 – school improvement
o This was addressed on page 4, second bullet, but only for 2007. You need to see trends! Can’t say that enough.
· PAGE 5
o AMO is not even addressed in the benchmark. Why are they fluffing up the report with this information when they have purposefully left out so many other benchmarks?
o Their claims are great, but where’s the data to turn the claims into facts? Don’t let them tell you MSDE only reports met or unmet. That’s baloney! They have to meet some numerical score in order to qualify as “met.” Show me the numbers!

Anonymous said...

BENCHMARK 10 – TECHNOLOGY
· This benchmark was never addressed. Ask why.
· Two parts to this benchmark
o Increase the # of courses utilizing technology support as a major component for the instructional program
o Increase the # of courses offered in distance learning annually
· I’m betting the first part has been met, if only because my kids are constantly commenting on and complaining about the number of videos they have to watch in school. Imagine that, kids complaining about watching videos! It does speak to the necessity of human interaction, though.
· The second part should be very easy to determine. I’d bet the # is 0. Don’t let them tell you they have it set up so our kids can “talk” with kids in Europe or soldiers in Afghanistan. That’s not a course; it’s a classroom experience. This benchmark is talking about distance learning the same way a college might provide distance learning credit. Are there any distance learning programs listed in your course of studies book?

Anonymous said...

These benchmarks are BS that paves the path for another several years of failure.

I hear of so many stinking teachers that lecture from PowerPoint, encourage absolutely no interactivity from the students, then hand out worksheets until the sky falls. Writing on an overhead is absolutely no replacement for standing in front of a class, working ten or fifteen problems, then asking the students to come to the board to work the problems.

Whether or not someone is qualified in the State Board of Education's eyes is not going to bring home the bacon when it comes to a student's ability to understand the material.

These nincompoops have got to grow the balls to allow professionals from the fields of mathematics and other hard sciences in to observe these classes.
Then,
1) Write up observations on where the teachers are lacking:
a) Discipline?
b) Lack of knowledge of the subject matter?
c) Note whether the teacher has no idea how to provide a motivational atmosphere for young minds and cultivate them, especially in the higher-level classes, to turn these kids on to the hard sciences.
d) Make sure the teacher is sticking with the AP curriculum and its pace in order to give the students a fighting chance to earn a 4 or 5 on the AP test, ensure that each school is in lockstep with the curriculum, and confirm that the teacher knows the material well enough not to go searching for solutions to student-prompted questions.

Go back to teaching grammar in elementary school, and make correct spelling and correct usage of grammar a hard requirement throughout high school.

We have supervisors over at CCBOE who are either hiding under their desks, burying their heads in the sand, or doing something else that keeps them from getting into these classrooms and making changes.

It's a very sick atmosphere when a parent can't go in to visit their child's classes except for two lousy forty-five-minute classes per quarter, under the auspices of an administrator.

These teachers and administrators have to start taking parental input seriously, and demand that the teachers of the hard sciences and mathematics engage the students for a large percentage of the class.

This does not mean doing three lousy warm-ups. This does not mean handing out worksheets and sitting at the teacher's desk, instructing the students to approach the teacher "with any questions".

This is not teaching.

Anonymous said...

Passed over for that committee? I watched that meeting, and you were the ONLY board member who offered to serve on it. If I'm not mistaken, you said you were not assigned to any committees. Based on the conversation, I'm guessing the chairman determines committee positions? From what I can surmise, your chairman is either afraid of your insight or feels threatened by intelligent younger women who are not of the puppet gender. What rational reason could he give for not assigning you to a committee you, and only you, volunteered to serve on? None of the other board members appeared to be prepared for that part of the meeting, let alone interested in serving on the committee. It was painful (but amusing) to watch. Is he trying to force other board members to do more homework before meetings? There was lots of confusion amongst the entire board.

I guess your chairman was looking for puppets, and you don't seem to meet that criterion. Ms. Peterson does, though. She was more than happy, if I recall properly, to punt the entire responsibility for revising the benchmarks onto staff. Does she not understand that these are benchmarks the BOARD sets for the system? Let the administration develop a plan and write the regs to meet those benchmarks. A board member who considers it "too painful" to write BOARD benchmarks should resign from the board. That is her job, for goodness' sake! To do it her way is like asking the fox to write the safety regs for the hen coop.

I'll be much more careful with my votes next time around. What a disappointment. Better luck next time. More family time for you, though.