August 6, 2016 - 8:00 pm
Water, we are repeatedly told, will be the “next oil.” In the United States, climate projections predict increasing drought frequency throughout most of the country. Around the world, political and even military conflicts over water scarcity are multiplying. Water is a limited and essential resource, and we are becoming more sensitive to the need to use it wisely.
So it probably seems like good news that the U.S. Interior and Agriculture Departments have pledged almost $50 million in new public investment to improve water efficiency in domestic agriculture. This money will build on the hundreds of millions of dollars that were allocated by federal and state agencies in the past two decades to subsidize the adoption of water-efficient technology and practices in farming.
But there is one problem: The assumption that such spending automatically leads to reduced water use is not grounded in theory or evidence. In fact, improving efficiency may fail to deliver large water savings and even, paradoxically, increase use.
Agriculture is the major consumer of water in the United States, responsible for 80 percent of consumptive use nationwide and 90 percent in some Western states. This consumption takes place against a backdrop of growing water scarcity problems: droughts in the West, “water wars” in the Southeast, and depletion of aquifers nationwide. Promoting efficiency is perceived as a win-win for farmers and the environment.
Here’s the problem: Human behavior is complex. Cutting-edge techniques to increase water efficiency are designed and tested by agricultural scientists and engineers. This research usually focuses on the performance of new technologies and practices, with human behavior held constant. When human behavior enters into the mix, the results may look quite different.
In the press release on the new investments, an Interior Department official said, “there are technologies and management strategies available that conserve water while maintaining yield and production standards.” That’s true. Yet the link between improved efficiency and reductions in water use hinges on the conjecture that farmers will stick to those last six words: keep everything else about their yields and production constant — just use less water.
But like all businesses that innovate, farmers might change their operations to take advantage of the innovation, and these changes may also affect water use.
A study by Cynthia Lin of the University of California at Davis and Lisa Pfeiffer of the National Oceanic and Atmospheric Administration looked at how farmers in the Ogallala aquifer of the Great Plains changed their water use after being subsidized to adopt more efficient irrigation technology — drop nozzles, which deliver water from just above the crop to decrease evaporation.
The Ogallala aquifer is one of the largest in the world, but it is being rapidly depleted, and more than 90 percent of its pumped water goes to agriculture. Unlike previous studies, Lin and Pfeiffer collected water-use data to isolate the causal impacts of the new nozzles.
The results were striking. Rather than finding that farmers who used drop nozzles reduced their water use, they showed that those farms increased their consumption, on average, by about 3 percent. The efficient technology had an effect, but it was the opposite of what was intended.
Why did this happen? When water delivery becomes more efficient, using more water can increase profits. Farmers may also feel less obligated to conserve water because they have done a good deed by adopting the technology. (Psychologists call this a “licensing effect.”) In the Ogallala, some farmers responded to the new technology by increasing how much land they irrigated or switching to more water-intensive crops — from wheat to corn or soy, for example.
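The economic logic behind that rebound can be seen in a back-of-the-envelope sketch. All numbers below are hypothetical, chosen only to illustrate the mechanism: when irrigation becomes more efficient, each unit of pumped water delivers more water to the crop, so the profit-maximizing amount of pumping can actually rise.

```python
# Toy illustration of the rebound effect (all numbers hypothetical).
# A farmer chooses how much water to pump; crop yield has diminishing
# returns in the water the crop actually receives (pumped * efficiency).

def profit(pumped, efficiency, price=300.0, water_cost=40.0):
    effective = pumped * efficiency   # water that reaches the crop
    crop_yield = effective ** 0.5     # diminishing returns to water
    return price * crop_yield - water_cost * pumped

def best_pumped(efficiency):
    # Brute-force search over pumping levels from 0.01 to 20.00
    levels = [i / 100 for i in range(1, 2001)]
    return max(levels, key=lambda w: profit(w, efficiency))

low = best_pumped(0.6)    # less efficient irrigation
high = best_pumped(0.9)   # more efficient (e.g., drop nozzles)
print(low, high)          # optimal pumping is higher at higher efficiency
```

In this toy model, the more efficient technology raises the return on every unit of water pumped, so the profit-maximizing farmer pumps more, not less — the same qualitative pattern Lin and Pfeiffer found in the Ogallala.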
So, how will farmers react to the new federal investments? Unfortunately, few people are asking that question. Western politicians, governments, and other proponents of efficiency seem convinced that efficiency automatically translates to less water consumed.
To its credit, the Agriculture Department has begun to think more carefully about the behavioral assumptions and evidence underlying its programs. In the past few years, it has established three academic centers to incorporate insights from the behavioral sciences and stronger empirical evidence into departmental programs. (Disclosure: I co-direct one of these centers.) These efforts need to be deepened and broadened if we are to understand the impacts of agricultural-environmental interventions.
To generate more credible evidence, we must deliberately design and implement programs with the aim of estimating their real-world effects. The Agriculture and Interior Departments should require that the new investments be tied to credible efforts to measure the actual impacts on water use and farmer behavior. In medicine, where the guiding ethic is to do no harm, new ideas are rigorously tested in the field before being scaled up. A similar culture of rigorous field evaluations could transform agricultural and environmental science.
For too long, environmental programs of all sorts have been advanced in the absence of evidence about their impacts. Spending $50 million to look at how people actually use water-saving technologies and practices would be a big step in turning things around.
Paul Ferraro is the Bloomberg Distinguished Professor in the Carey Business School and the Whiting School of Engineering at Johns Hopkins University, and the co-director of the Center for Behavioral and Experimental Agri-Environmental Research.