Were one to poll people in Colorado as to their perceptions of a drunk driving offense, many might describe a scene of a person standing on the side of the road blowing into a hand-held breath testing device. Indeed, the number “.08” (which represents the legal blood-alcohol content limit in most states) has become synonymous with DUI charges.
Yet that image prompts an obvious question: why would law enforcement officials measure one's breath to determine the alcohol content of their blood? While a roadside blood test is plainly infeasible, one might wonder how accurately a breathalyzer measures one's actual degree of intoxication.
How breath testing devices work
Much of the ethanol one ingests while drinking ends up in the bloodstream through a process known as passive diffusion (which allows water-soluble compounds to permeate membrane surfaces). Eventually, that alcohol-laden blood reaches the lungs, where a small portion of the ethanol vaporizes into the air in the alveoli. One then expels that vaporized ethanol when one breathes out.
This process continues as the body metabolizes the consumed ethanol, with the level of alcohol on one's breath remaining in equilibrium with that in the blood. Because breath alcohol stays proportional to blood alcohol, a breath measurement can stand in for a blood test. According to the Alcohol Pharmacology Education Partnership, breath testing devices assume a static blood-to-breath ratio of 2100:1 when generating readings.
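To illustrate the arithmetic behind that assumption, here is a minimal Python sketch. The 2100:1 ratio means one milliliter of blood holds as much ethanol as 2100 milliliters of breath; the sample value of roughly 0.381 micrograms of ethanol per milliliter of breath is a hypothetical figure chosen because it corresponds to the 0.08 legal limit under that assumption.

```python
PARTITION_RATIO = 2100  # assumed blood:breath alcohol ratio

def estimated_bac(breath_ethanol_g_per_ml, ratio=PARTITION_RATIO):
    """Convert measured breath ethanol (grams per mL of breath)
    to an estimated BAC (grams per 100 mL of blood)."""
    return breath_ethanol_g_per_ml * ratio * 100

# Hypothetical breath sample: ~0.381 micrograms of ethanol per mL
breath_sample = 3.81e-7
print(round(estimated_bac(breath_sample), 3))  # → 0.08
```

The device never sees blood at all; it multiplies the breath measurement by the fixed ratio and reports the result as if it were a blood reading.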
Revealing potential breath test inaccuracies
The problem with this assumption is that, in reality, one's actual blood-to-breath ratio after drinking can range from 1500:1 to 3000:1, depending on factors such as age, weight, gender and genetic makeup. This wide variance no doubt contributes to expert estimates, shared by the American Motorists Association, that breath testing devices can have a margin of error as high as 50%.
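The effect of that variance can be sketched numerically. Assuming a device reading of 0.08 produced under the fixed 2100:1 ratio, the drinker's actual BAC depends on their real ratio (the 1500:1 and 3000:1 endpoints come from the range above):

```python
ASSUMED_RATIO = 2100  # ratio the device is calibrated to

def true_bac(device_reading, actual_ratio):
    """Back out the drinker's actual BAC, given a device reading
    computed under the 2100:1 assumption and their real ratio."""
    breath_ethanol = device_reading / (ASSUMED_RATIO * 100)
    return breath_ethanol * actual_ratio * 100

reading = 0.08
for ratio in (1500, 2100, 3000):
    print(ratio, round(true_bac(reading, ratio), 3))
# 1500 → 0.057, 2100 → 0.08, 3000 → 0.114
```

In this sketch, the same breath sample that the device scores at 0.08 could belong to someone whose actual BAC is as low as roughly 0.057 or as high as roughly 0.114, which is the kind of spread the margin-of-error estimate reflects.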