Monday, December 15, 2008

Another fun map

Here's another fun map, showing how many students travel 10+ miles to various magnet programs. Larger boxes mean more students.

I'm not sure how useful this one is; it's just a fun one to look at.

Where do the 10+mi riders live?

I got an early holiday present on Thursday: a response to my open records request to HISD for data on the magnet students riding 10+ miles to their programs. I was pleasantly surprised; I'd not received an acknowledgment, and was about to send a "snail mail" follow-up to my original email request.

Part of what I found out can be seen graphically here at Geocommons.com, a web site that lets you upload geo-coded data and see it plotted on a Google (or Yahoo or Open Street Maps) map. You can have several "overlays" if you like, each corresponding to a data set. The data on that map shows how many students travel 10+ miles from each zip code - it's their home zip code, not the destination. I also received a long list of every school and the number of 10+ mile riders, and I got a small table explaining the racial breakdown of the same group:

Race | Percentage of 10+mi riders | Percentage of District
White | 10% | 8%
African American | 48% | 28%
Hispanic | 34% | 60%
Asian | 8% | 3%
Native American | <1% | <1%

In my request, I asked for income level bands, which they don't have; but I did get the response that 84% of the 10+ riders qualify for free/reduced lunches (this was also reported in the Houston Chronicle today). I also asked for the same distribution information for the magnet program as a whole, and for the district; I got the district racial breakdown from a different source, which also pointed out that district-wide, 79% of the students qualify for free or reduced lunches.
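
To put those percentages side by side, here's a quick Python sketch of my own (not anything from HISD) that computes how over- or under-represented each group is among the 10+ mile riders relative to the district as a whole; the Native American figures are omitted because both were reported only as "<1%":

```python
# Compare the 10+ mile rider breakdown to the district-wide breakdown.
# Figures are the percentages from HISD's response, quoted above.
riders   = {"White": 10, "African American": 48, "Hispanic": 34, "Asian": 8}
district = {"White": 8,  "African American": 28, "Hispanic": 60, "Asian": 3}

for group in riders:
    ratio = riders[group] / district[group]
    print(f"{group}: {riders[group]}% of riders vs. {district[group]}% of district "
          f"(about {ratio:.1f}x their district share)")
```

By that rough measure, African American and Asian students are noticeably over-represented among the long-distance riders, and Hispanic students under-represented; that seems worth keeping in mind when weighing who would bear the cost of a service cut.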

I worry that reducing or eliminating services for the 10+mi riders will make magnet attendance difficult for 84% of them (2822 students). They are likely to have the least flexibility in their schedules or access to transportation.

Resources:

Feeding the Hungry

At First UU Church Houston a few years ago, our Parents' Group started a project to create meal packs for the hungry in Houston. We were not trying in a broad way to feed the homeless; instead, we were trying to give congregants an alternative to giving money to street solicitors. In this way, we thought people might find it easier to be more generous in the moment.

With the packs themselves, we were looking to accomplish a few goals:

  1. Packs should comprise non-perishable items, so they can be stored at home and carried in the car or on your bike, ready to give to people in need;
  2. The nutrition should be balanced, with a good amount of protein;
  3. There should be plenty of liquid, important especially in the summer;
  4. Items should be edible even with bad gums or teeth (soft foods);
  5. The packs should be inexpensive and provided at-cost to purchasers so they feel comfortable buying many and handing them out.

With those in mind, we managed to create the following packs for about $3 each (by purchasing items in bulk):

  • A quart sized "zip" locking bag
  • A paper napkin and plastic spoon or spork
  • A postcard with a map of Houston and locations/phone numbers of aid agencies
  • A liter/quart of water
  • A juice box
  • A tuna-fish based lunch pack (the most expensive item, but the most nutritious)
  • A pudding
  • An applesauce
  • A pack of Cheez-Its or another snack
  • A pack of cheese or peanut-butter stuffed crackers
  • A granola bar

We managed to fit almost 2,000 calories of long-lasting food with a good mix of nutrition into each pack, while providing water and some fun snacks. The packs were easy to shop for, simple to assemble, and satisfying to give out. Because we were able to recoup the cost of the packs each time, the project was self-sustaining after an initial "investment" from the congregation's social action group. By taking pre-orders, we were even able to expand production at various times.

I'm proud that my daughter's Girl Scout troop did a similar project this year at Thanksgiving. Because of fund-raising rules, they were not able to recoup the costs from outside the troop; still, they considered it a valuable project and invested their collective dues into making five packs for each girl.

This is not a project which tries to address the causes of homelessness or hunger; it's not a program which will help people get off the streets. Those are additional, sustained efforts which need to take place as well. But as we know, structural change doesn't happen overnight; in the meantime, we can make these small efforts while we work on the large problems.

Wednesday, December 10, 2008

Houston ISD December Board Meeting

Hi Folk(s?)

Prior to today, there was a widespread assumption that the Houston ISD Board would address the issue of transportation for Magnet students at their December meeting. However, when you look at the agenda published for the meeting, there's no mention of that issue as a topic for discussion or decision.

It's been reported that the decision will happen at the January meeting; but, as always, read the agenda first.

That said, there is a group of parents who will be attending the meeting to demonstrate their position on the issue. The plan is to meet at 6:00pm at

HISD Administrative Headquarters
Hattie Mae White Building - Board Auditorium
4400 W 18th St, Houston, TX 77092-8501
(Hwy 290 @ Loop 610)

Monday, December 8, 2008

Houston ISD ASPIRE: Good idea, bad implementation?

Houston ISD has implemented a new evaluation system called ASPIRE, developed in collaboration with Battelle for Kids, based on the SAS Educational Value-Added Assessment System (EVAAS). This is a new initiative to track student progress year-over-year (longitudinally), instead of comparing this year's third grade class to next year's and last year's (a cross-sectional study). The idea behind the new evaluation is that if you track the same population over time then you can see how different teachers improve their progress year after year. It's a nice idea, but as far as I can tell, this implementation has at least two major flaws. I'd appreciate comments from educators and statisticians either confirming or rebutting these observations; I'm neither, and I like to hear from experts.

Before I outline what I consider to be ASPIRE's weaknesses, I'd like to go on record as a supporter of the concept, at least in theory. It's clear that the old method of measuring a teacher's "performance" year after year, with a changing population each year, is unfair to the teacher because it does not control for possibly wide swings in the class's demographics. If one year a teacher has an eager student body, and the next, one that arrives lacking skills or focus, the teacher's "performance" will vary. The exit scores for each of those classes will differ; one year he or she will look like a success, and the next, possibly a failure.

The implementation of ASPIRE is an attempt to measure a class's incoming and outgoing level of achievement, and determine what if any effect the teacher has on the students. On first sight, that seems reasonable; however, the implementation at HISD falls short for the following reasons.

The first potential flaw in the system is that schools with an advanced student population will show little to no year-over-year improvement. This is an effect of the tests chosen for the ASPIRE metrics: the Texas Assessment of Knowledge and Skills (TAKS), and a normed test (Stanford or Aprenda). Neither of these tests can differentiate among students in the 99th percentile, and that effect may extend to a larger population (perhaps down to the 95th percentile). If a child stays in the 99th percentile year after year, or bounces around within the 95th-99th percentile band, they show no (or negative!) progress according to those tests. It's not fair to penalize that child's (or that class's) teachers because the test can't measure student progress at that level. The same goes for the TAKS: it's designed to measure a very basic level of subject mastery; for schools with advanced students, that mastery happens as a matter of course, and possibly even early in the school year. What happens in the classroom beyond that (a deeper investigation of the subject, a broader survey of related topics) is not measured by ASPIRE.
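
As a toy illustration of that ceiling effect, here's a purely hypothetical sketch (the scores are invented, and this is not how EVAAS actually computes growth): a student who bounces around near the test's ceiling shows essentially zero, or even negative, measured growth.

```python
# Hypothetical student at the test's ceiling: reported percentiles can't
# go above 99, so they just bounce around in the 95th-99th band each year.
percentiles = [99, 97, 99, 96, 98]   # invented year-by-year reported scores

changes = [later - earlier for earlier, later in zip(percentiles, percentiles[1:])]
print("year-over-year changes:", changes)                          # [-2, 2, -3, 2]
print("average measured 'growth':", sum(changes) / len(changes))   # -0.25
```

The student may be learning a great deal, but the instrument simply has no room left to show it.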

When HISD implements "performance-based pay" on top of such systems, they genuinely intend to reward the teachers who take struggling students and help them reach new levels of accomplishment during the school year. However, they run the risk of leaving behind the teachers who are teaching average or advanced students, and that's not fair. By publishing this "value-add" data on the web, they give the misleading impression that schools with more accomplished students have flat or failing performance.

Let me be clear: this is not a problem for all student populations. For students who are not advanced, an improvement year after year would be meaningful if it were measured correctly.

That brings me to the second potential flaw: the data may not be reflecting what ASPIRE needs to measure. The first input is TAKS performance, a measure of mastery in various subject areas. This is probably a good indicator when used for reading (comprehension) and mathematics, which are measured every year; those are subjects where each year builds upon the student's previous knowledge, and an improvement may signal a significant change in understanding. The other areas measured (science and writing) are less obviously incremental, and aren't tested every year; and other parts of the curriculum (history, art, music, foreign language, health, for example) aren't measured at all.

The other test used as input is either the Stanford or the Aprenda (for Spanish-speaking students). Unfortunately for this effort, these are nationally normed tests, which are essentially useless for measuring student progress. Very briefly, a norm-referenced test is one in which the student is assigned a rank against their peers that year; the questions are not assessing subject-matter knowledge, but are instead chosen as differentiators between students. To see the effect of the first characteristic (the peer ranking), just think of how a student will score differently depending on which other kids take the test; the same student could be in the 70th, 80th, or 90th percentile depending on who else is taking it. Clearly, this is not simply measuring achievement; while a large part of how well a student does on the test depends on their knowledge, a significant factor is the set of other students, over which they have no control.
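
Here's a small, purely illustrative sketch of that point (the cohorts and the score are invented): the same raw score lands at a very different percentile depending on who else takes the test that year.

```python
def percentile_rank(score, cohort):
    """Percent of the cohort scoring at or below the given score."""
    return 100 * sum(1 for s in cohort if s <= score) / len(cohort)

student_score = 75
weaker_cohort   = [50, 55, 60, 62, 65, 68, 70, 72, 74, 80]   # mostly lower scores
stronger_cohort = [70, 74, 76, 78, 80, 82, 85, 88, 90, 95]   # mostly higher scores

print(percentile_rank(student_score, weaker_cohort))    # 90.0
print(percentile_rank(student_score, stronger_cohort))  # 20.0
```

Nothing about the student's knowledge changed between the two runs; only the comparison group did.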

The second problem with normed tests is more subtle. The questions on the test are not chosen to assess what a student knows; instead, they're effectively chosen for how "tricky" they are, so they expose a difference between sets of students. The purpose of a normed test is to rank all the test-takers along a continuum of scores; you can't do that if there are a large number of questions on the test that everyone gets right or wrong. On the TAKS, which is a criterion-referenced assessment and is measuring comprehension and mastery, it's OK for everyone to get all the questions right; that means that all the students in Texas have mastered that subject that year. The normed tests are not serving that same purpose; such a result on the Stanford or Aprenda would be a serious failure.

The final issue with ASPIRE involves the more general debate about whether these standardized tests actually provide a relevant measure of student accomplishment and accurately reflect the effects of good or poor teachers (as opposed to a good or bad curriculum, inappropriately homogenized pedagogical methods, external factors such as days lost to weather, etc.). You clearly cannot improve a system such as public education without being able to measure it; however, there's a valid debate over whether we know how to describe and measure the effects of a successful education. Until we get to that point, I'm supportive of attempts to assess educational effectiveness, and skeptical of punishing or rewarding teachers simply by using the results of those potentially ineffective efforts.

The idea of tracking each student population's progress longitudinally (year-over-year) and measuring their improvement is a good one; however, I'm disappointed that HISD and Battelle seem to have gotten the implementation wrong with ASPIRE. I can't tell if they use the TAKS and Stanford/Aprenda metrics simply because that's what they have at hand (and they don't want to change the tests, or add new ones), or if it's just because they fundamentally don't understand how poorly the tests measure what they're trying to track. Perhaps ASPIRE will get better over time; it may also be that my analysis above is flawed in one or many ways. If I'm way off base, I'd love the reassurance of being proven wrong.

More portal trouble at HISD

Frankly, I'm embarrassed. Houston ISD does many things right, but their web technology developers seem to be unable to create an online application which works for everyone. I wonder if it's because they exclusively use Microsoft tools for development and deployment?

We received a flyer in the mail recently which describes the Houston ASPIRE effort and points parents to their web link. Your mileage may vary; it will likely work for you if you use MS Internet Explorer or are on an MS Windows system. I have access to neither, so I can't verify that setup. It sure doesn't work for me in my Firefox-based browser on my Linux system.

Never mind of course that the whole implementation of ASPIRE has its own set of weaknesses. I'll address that idea in a later post. I'm just frustrated that HISD only seems to be making a half-hearted effort to engage parents and the public online.

Tuesday, December 2, 2008

HISD Magnet Statistics

I've heard some opinions from Mary Nesbitt (HISD District Advisory Committee Member), Maggie Solomon, Judy Long, Ann Blackwood (HISD Parent Engagement Committee Members) and Ed Klein (Greater Houston Partnership) floating around in emails and on the Viewpoints page of a local weekly. The gist of their argument is that by making changes to magnet transportation, HISD is potentially removing large amounts of funding from magnet programs. While they may be correct in the worst case, I believe that their argument is incomplete; and that's perhaps intentional, as a complete analysis would diminish the alarm they seem to profess in their writing.

Let me be clear. I am opposed to HISD making transportation changes for the 2009-2010 year for magnet students, but for different reasons, which I outlined in an earlier post. What I'm objecting to here is the use of statistical information to try to provoke an extreme reaction, which, in my opinion, makes it more difficult to enter this discussion on a collaborative basis. Let me explain where I disagree with the conclusions in the email/article.

First, the facts. They are, unfortunately, not simple to corroborate. The authors claim a student is worth $3,446.00 in funding to the school they attend; let's accept that at face value. The authors also seem to know exactly how many students at various schools must travel 10+ miles to attend; I have so far been unable to get this information from HISD, so I'll assume they have direct access to information I can't corroborate.

Next, the analysis. Given the number of 10+ mile travelers in each magnet program, the authors make a straightforward calculation on how much funding the school would lose if those riders were unable to attend; they multiply the number of students by $3446 and report the result. For example, Lamar HS, with 230 affected students, would potentially lose $792,580.00; Westside HS, with about 400 affected students, would lose $1,374,954.00. Those are big numbers, and would cause a big disruption; the program would have to curtail staff and possibly faculty to meet the shortfall. Overall, the nearly 3400 affected students would represent over $11,500,000.00 in funding at stake.
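
Their arithmetic is easy to reproduce; here's a quick sketch using the per-student figure and the rider counts they cite (the numbers are theirs, not mine):

```python
# Reproduce the authors' worst-case calculation:
# dollars at stake = (number of 10+ mile riders) x (funding per student).
PER_STUDENT = 3446  # dollars per student, per the authors' claim

for school, riders in [("Lamar HS", 230), ("Westside HS", 399)]:
    print(f"{school}: {riders} riders -> ${riders * PER_STUDENT:,}")

print(f"District-wide (~3,400 riders): about ${3400 * PER_STUDENT:,}")
```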

But it's only half the story. Their analysis assumes that the magnet programs would not be able to find new students to attend, which is almost certainly not the case. I called Lanier MS (163 affected students) and found out that each year 600 applicants qualify for 235 spots. If students living 10+ miles away were unable to attend, those students would be affected directly (which I would argue is the worse outcome), but the program at Lanier would likely be affected very little. Johnston MS (206 affected students) is in a similar situation; it receives on the order of 1,000 applicants for 250 spots. TH Rogers MS receives several hundred applicants for 75 open spots. Carnegie HS (141 students affected) has several hundred applicants for 150 spots. Westside HS, with a huge 399 students potentially affected, admits 120-200 per year out of around 400 qualified applicants. Again, a transportation disruption could have a huge effect on the students, but may or may not cause the programs to see the financial reductions the authors describe.

I think there are already plenty of reasons to ask HISD to postpone or amend any changes to transportation for magnet students. I just think this reported analysis is sloppy, and should be disregarded.

Don't Change Magnet Transportation in 2009-2010

HISD is considering making changes to the transportation made available to students in its successful magnet programs. Unfortunately, these changes seem to be part of a terribly compressed timetable between proposal, discussion, and decision. Very few pertinent facts and analyses are being presented to students and parents by HISD. Finally, HISD is presenting these changes as an opportunity to reduce expenses, but is not giving a commitment on how those savings may be applied elsewhere. For these reasons, I think HISD should not try at this time to make changes to transportation; instead, they should consider starting a discussion and deliberation process which will frame these changes as part of a plan to support, strengthen, and extend the magnet programs. To put this into a bigger context, they should also discuss the expense reductions available in other parts of the budget, and should plan for implementation no earlier than the 2010-2011 school year.

The timetable. It seems a bad idea to rush through a decision process which has the potential to negatively affect almost 3,400 students directly, and possibly 11,000 students overall (the total population of the magnet programs). These students and their families, who are among other things citizens and taxpayers, deserve more consideration than they seem to be receiving during this process. Surely these stakeholders should receive more than an opportunity to "be heard"; the process should be more collaborative and consultative, and should be slowed down so each group participating can receive good information and make informed decisions.

The information. There has been a lot of information (data?) flying around, but not enough of it identified as coming directly from HISD, or available from their web site. Faced with a proposal which affects such a large community, HISD should seriously consider creating a web site with at least the proposals and an impact analysis. Better still would be community-oriented tools, starting with a feedback form so people can contact HISD to ask for more information and/or discuss how the proposed change might impact them.

I would like to see, for example: How many students will be affected at each magnet program? What are the demographics of the affected students (aggregate age, economic, geographic, and ethnic/racial information)? How does that compare to the populations of the whole magnet program and the district? Is any sub-group going to be impacted disproportionately? What provisions will be available for families which cannot adjust to the new regime?

The bigger context. I have not heard enough about how these changes fit into the plan for either the magnet programs in general, or the overall HISD budget at large. If these proposed economies would be applied back to the magnet schools and programs, where will they go? Will existing programs get expanded? Failing programs get more resources? New programs established? In what proportion? And is that a firm commitment?

If this is instead about cost reductions, it's not clear that saving $8 - $9 million in an overall annual budget of $1.6 billion is going to have much effect. If the issue here really is a concern that falling property values will lead to lower taxes and a smaller HISD budget, then where are the significant reductions going to come from? How many students will those changes affect? Given the magnitude of the possible savings and the number of students involved, is this proposal reasonable?
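
For scale, a quick back-of-the-envelope check of those figures:

```python
# Proposed transportation savings as a share of the overall HISD budget.
savings_low, savings_high = 8_000_000, 9_000_000   # dollars per year (proposed)
budget = 1_600_000_000                              # approximate annual budget

print(f"{100 * savings_low / budget:.2f}% to {100 * savings_high / budget:.2f}%")
# -> roughly 0.50% to 0.56% of the budget
```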

Perhaps students, teachers, families, and staff might feel better about participating in this process if HISD were to commit to its magnet programs, to promise that any changes would affect the smallest possible number of students and families, and that families which demonstrably can't cope with the changes will not be left behind or forced to exit their chosen programs. It appears that HISD needs to go a long way to establish trust with its constituents, and should consider how to take advantage of every possible opportunity to do so.

Monday, December 1, 2008

HISD Transportation Meetings

Dear readers: please don't forget these upcoming transportation meetings, as well as the regularly scheduled Board meeting at 5pm at the District headquarters:

  • Districtwide Magnet Transportation Community Meetings Schedule
    Date | Time | Location
    December 2 | 6:30 p.m. | Waltrip High School
    December 4 | 6:30 p.m. | Chávez High School
  • Regional Magnet Transportation Community Meetings Schedule
    Date | Time | Location | Region
    December 3 | 6:30 p.m. | Madison High School | South
    December 3 | 6:30 p.m. | DeBakey High School | Central
    December 8 | 6:00 p.m. | Austin High School | East
    December 9 | 6:00 p.m. | Davis High School | North

The Law, and Lori Drew

There's a case now before Judge Wu of the US District Court for the Central District of California which you may have heard of: United States v. Lori Drew. Lori Drew is the Missouri woman suspected of contributing to the suicide of a 13-year-old Missouri girl by creating an online account for a fictitious 16-year-old boy ("Josh Evans") and using that account to befriend and ultimately "dump" the girl. I think the basic facts (she created the account, she communicated with the 13-year-old, the conversations are archived and available) are not in dispute. The question being decided is whether her conduct violated the Computer Fraud and Abuse Act (CFAA), and if so, whether she should be punished as a federal felon.

I don't think there's any question that Lori Drew acted maliciously and irresponsibly, especially for an adult. What she did is not morally defensible in any way; if and how she should be punished for her behaviour is a valid topic for debate, and should be a matter for the law to decide. In this case, however, it seems the Government decided the existing criminal statutes were inadequate: her actions may be protected as free speech, and may have been difficult to establish as a primary factor in the girl's suicide. The prosecutor therefore chose to bring a case under the CFAA, characterizing her actions as contrary to MySpace's Terms of Service, and therefore an act of computer fraud and abuse.

While it's tempting to applaud the use of this unrelated statute to punish Ms. Drew, the consequences of expanding the reach of the CFAA will be troublesome. In effect, this case would imply that any time you violate the Terms of Service for an online site, you run the risk of being prosecuted as a federal felon. Think about that for a minute; how many online services do you use? Think of MySpace, Facebook, LinkedIn, Google Search, Google Docs, Yahoo, Windows Live, etc. Do you read the whole contract you're "signing" which grants you access to the site? Do you keep up with changes to the Terms of Service for each of the sites?

While this amicus brief from the Electronic Frontier Foundation (EFF) is 46 pages long, it's a good (hair-raising?) read. One of the points made early on is that these Terms of Service documents include provisions both important and trivial. Claims under the CFAA would cover both, thereby giving the operators of online sites the power to create federal felony offenses on the fly, so to speak. For example, Facebook and Ning want to keep for themselves the ability to show ads on their sites; if you figure out how to make an online "widget" or application which shows your ads to their members, they will accuse you of violating their terms of service. The reasonable consequence for that should be to deny you access to their systems; however, if this decision stands, they also will have the opportunity to get a federal prosecutor to charge you under the CFAA as well. That's unreasonable.