The Dept. of Transportation recently released its internal audit of SafeStat, the database used to measure the relative safety fitness of interstate commercial carriers with respect to crashes, drivers, vehicles and safety management.
The audit had been requested by the House Transportation and Infrastructure Highways and Transit subcommittee in part to determine whether the SafeStat system identifies high-risk carriers with any degree of reliability.
Some of the findings, such as geographic disparities in the consistency and timeliness of reporting procedures, have been widely reported in the trade press. Other findings have been largely ignored, including the fact that carriers have not been reliable in reporting census information such as the number of vehicles and drivers they have.
The report noted that 42% of the 644,000 active motor carriers had not updated their census information — as they're mandated to do by the Motor Carrier Safety Improvement Act of 1999. For example, 11% to 15% of active carriers reported either zero power units or zero drivers. Even worse, 15,136 of these “zero-driver” fleets had at least one inspection on record for the October 2001-September 2002 time period. Something doesn't add up.
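The scale of the census-update gap is easy to check from the figures above (the percentages are the audit's; the computation here is just the implied arithmetic):

```python
# Rough scale of the census-update problem described in the audit.
active_carriers = 644_000
share_not_updated = 0.42  # 42% had not updated census information

stale = int(active_carriers * share_not_updated)
print(f"{stale:,} active carriers with stale census data")
# prints "270,480 active carriers with stale census data"
```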
DOT is concerned because this kind of reporting error could result in failure to identify a carrier with a deficient safety record. Under the current SafeStat methodology, when the number of power units is reported as zero, accident involvement ratios are not calculated. This is particularly disturbing because "number of power units" was used to calculate the Accident SEA (Safety Evaluation Area) for 74% of the highest-risk carriers (Category A) in the January 2003 ranking.
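The consequence of a zero-power-unit census entry can be sketched as follows. The actual SafeStat accident-involvement formula is more involved than a simple crashes-per-power-unit ratio, and the carrier records here are invented for illustration; only the zero-divisor behavior is the point:

```python
# Simplified sketch: a carrier that reports zero power units produces
# no accident involvement ratio, so it drops out of the risk ranking
# even if it has crashes and inspections on record.

def accident_involvement_ratio(crashes, power_units):
    """Crashes per power unit; None when the ratio can't be computed."""
    if power_units <= 0:
        return None  # carrier is effectively skipped by the ranking
    return crashes / power_units

carriers = [
    {"name": "Carrier A", "crashes": 4, "power_units": 20},
    {"name": "Carrier B", "crashes": 6, "power_units": 0},  # bad census data
]

for c in carriers:
    ratio = accident_involvement_ratio(c["crashes"], c["power_units"])
    if ratio is None:
        print(f"{c['name']}: not ranked (zero power units reported)")
    else:
        print(f"{c['name']}: ratio {ratio:.2f}")
```

Carrier B, despite having more crashes than Carrier A, never receives a score under this scheme.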
The audit confirmed the existence of geographic reporting disparities and inconsistencies in reporting crash and inspection information. For example, since Georgia and Florida are particularly slow in reporting crash data, high-risk carriers operating in the Southeast might be able to avoid identification altogether.
In addition, the average time it took different states to upload inspection results varied from 32 to 35 days during 1999-2002. That average is misleading, however: the audit classified the statistical variation in "inspection reporting delay" as "extreme."
As a result of the audit, DOT has recommended changes that it hopes will make the data more consistent, as well as more effective in identifying high-risk carriers.
One suggestion is a new disclaimer on the site highlighting states that are not timely or consistent in their reporting of crash and inspection data.
Another suggestion involves using statistical analysis to optimize high-risk carrier identification. Such an approach could be accomplished rather quickly by using simulation modeling and/or factor analysis techniques to arrive at a best-case ranking algorithm.
Changes aimed at improving the timeliness, consistency and accuracy of the data itself, both carrier census information and state crash and inspection data, will be more difficult to implement. Since carriers and the states are both responsible for the problems, both should be held accountable for making improvements.
After all, both groups have benefited from the public disclosure of SafeStat data, which began in December 1999. Recordable crash rates nationwide dropped from 0.823 per million vehicle miles traveled (VMT) in 1999 to 0.716 in 2002. This reduction can be attributed at least in part to use of SafeStat data to benchmark a carrier's safety performance.
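The size of the improvement implied by those two rates works out as follows (the rates are from the text; the percentage is just the implied arithmetic):

```python
# Relative reduction in recordable crash rates, 1999 to 2002.
rate_1999 = 0.823  # crashes per million vehicle miles traveled
rate_2002 = 0.716

reduction = (rate_1999 - rate_2002) / rate_1999
print(f"Relative reduction: {reduction:.1%}")
# prints "Relative reduction: 13.0%"
```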
Continued benchmarking pressures and an improved ranking system should yield similar improvements in motor carrier safety.
Jim York is the manager of Zurich Service Corp.'s Risk Engineering Transportation Team, based in Schaumburg, IL.