A counterintuitive way to think about regulation


A recent report from the AAA Foundation for Traffic Safety finds that drivers using advanced safety systems such as adaptive cruise control, lane-keeping assistance, and automatic braking are nearly twice as likely to drive distracted as those in less-equipped cars. This should remind us that human beings are not as predictable as we like to believe.

We humans think. We adapt to new circumstances and respond to incentives. And that means efforts to reduce risk by regulation must take account of how the human mind can respond to an otherwise safer environment.

The AAA-reported phenomenon relates to something called “risk compensation,” which was first described for auto safety by economist Sam Peltzman in 1975. His research focused on outcomes associated with early government requirements for improved brakes, crashworthiness, and seatbelts. Peltzman was surprised when his statistical studies showed that driver behavior became riskier in safer automobiles. While driver fatalities declined as safety equipment was added, fatalities rose among pedestrians, bicyclists, and motorcyclists.

Later researchers were unable to confirm Peltzman’s findings in full. They, too, found evidence of risk compensation — but not enough to offset the gains delivered by additional safety equipment.

A thought experiment helps us understand these findings. Imagine that you are driving a car with failing brakes. Obviously, you keep plenty of distance between yourself and the car ahead. You pray that you will not encounter a red light. Eventually, the car rolls to a natural stop, and you call a tow truck and have it taken to a garage. Later, after the repairs are done, you tap the brake pedal, and behold, your car slows down. After trying this several times with good results, you no longer fear closing in on the car ahead of you or speeding up when approaching a controlled intersection.

In other words, you’ve engaged in risk compensation.

Similar results are to be expected in other settings. Tamper-proof lids on pain relief medication, which are difficult for arthritic fingers to manage, may lead frustrated consumers to open the containers and toss the lids. If enough customers do this, it may fully offset the expected benefits of tamper-proof lids and make a product more dangerous instead of less.

Other complications emerge when regulators require lenders to give borrowers complex information when they are applying for or closing a loan, or arranging a funeral — so much information, in fact, that some clients simply stop listening. If enough people tune out, the intended benefits of the required warnings are erased.

Then there are requirements that consumer product producers include detailed warnings about side effects in all advertisements, labels, and package inserts — to such an extent, in fact, that many drug producers meet the requirement by attaching a multipage label to their products. While some might say that any step taken to perfect consumer information is a move in the right direction, research by psychologists suggests otherwise. They report that “information overload” may cause consumers to ration the amount of information they absorb. As the saying goes, “you can lead a horse to water, but you can’t make him drink.”

All of this helps us to better understand the challenges faced by regulators and others who hope to reduce risks in consumer markets — or to (as some might say) create the world in their own image. Before pushing for regulated perfection, we need trial runs that do not involve the full U.S. population, so that we can learn about possible risk compensation, information overload, and other benefit-offsetting behavior.

And on that thought, perhaps we should learn from the 50-state laboratory we live in. The question is not whether we regulate, but how we might regulate better.

Bruce Yandle is a contributor to the Washington Examiner’s Beltway Confidential blog. He is a distinguished adjunct fellow with the Mercatus Center at George Mason University and dean emeritus of the Clemson University College of Business & Behavioral Science. He developed the “Bootleggers and Baptists” political model.
