When you’re planning a trip, do you ever rely on airline on-time performance statistics to choose one carrier over another? Maybe you shouldn’t.
Consider how on-time performance is defined: A flight is considered on time if it arrives within 15 minutes of its scheduled arrival time. But here’s the catch: Who determines just how long a given flight is supposed to take? The airlines do.
We’ve all seen stories about how airlines with a lot of late arrivals relative to their competitors can resort to a very simple fix: adjust the schedule so that the same flight is allotted more time to get there and still counts as on time. This practice is known as schedule padding.
And now an interesting analysis by Travel Weekly sheds more light on the inaccuracy of on-time statistics. It notes that in setting their expected gate-to-gate travel times, different airlines have different priorities that can affect those estimates. So even if two airlines achieve the very same travel time, one flight could be on time while the other could be considered late.
Using data from FlightStats, the publication noted, for example, that on the Atlanta-Chicago O’Hare route, American Airlines and Spirit Airlines flights both managed exactly the same average gate-to-gate time of 129 minutes. And yet American showed a 78 percent on-time performance rate, while Spirit’s was only 40 percent.
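To see how identical actual flight times can yield opposite on-time results, here is a minimal sketch. Only the 129-minute average gate-to-gate time and the 15-minute grace period come from the article; the two scheduled block times below are hypothetical, chosen to illustrate a padded schedule versus a tight one:

```python
def is_on_time(scheduled_block_min: int, actual_block_min: int, grace_min: int = 15) -> bool:
    """A flight counts as on time if it arrives within the grace period of its scheduled time."""
    return actual_block_min <= scheduled_block_min + grace_min

actual = 129  # average gate-to-gate time on Atlanta-O'Hare, per FlightStats

# Hypothetical scheduled block times: one padded, one tight.
padded_schedule = 140
tight_schedule = 110

print(is_on_time(padded_schedule, actual))  # True: 129 <= 140 + 15
print(is_on_time(tight_schedule, actual))   # False: 129 > 110 + 15
```

The same 129-minute flight is "on time" against the padded schedule and "late" against the tight one, which is all the statistic measures.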
Low-cost carriers face more pressure to keep estimated flight times short, the article noted; for one thing, their labor costs are based on estimated rather than actual travel times. For another, scheduling shorter travel times lets them squeeze in more flights per aircraft.
Just one more reason why on-time performance numbers can be misleading at best.
Do you consider on-time performance stats when making airline choices? Please leave your comments below.