ERS Charts of Note
Wednesday, January 19, 2022
The importance of irrigation for the U.S. agricultural sector has evolved significantly over the past century. Irrigated acreage in the country has grown from fewer than 3 million acres in 1890 to more than 58 million acres in 2017. The expansion of irrigated acreage during this period reflects Federal, State, and local investment in irrigation infrastructure to deliver surface water to farms and ranches. Additionally, this expansion is partly due to advancements in well drilling and pumping technologies, which have facilitated growth in groundwater-based irrigated agriculture. Since 1969, the amount of water used per irrigated acre has decreased substantially, from more than 2 acre-feet (1 acre-foot = 325,851 gallons) in 1969 to nearly 1.5 acre-feet by 2018. More efficient water application technologies, particularly the transition from gravity-based to pressurized irrigation systems, have driven this reduction in water use per acre of irrigated land. This chart was drawn from the USDA, Economic Research Service report “Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity,” published December 2021.
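For readers who want to put the per-acre figures above in more familiar units, a minimal sketch of the conversion (all quantities are taken from the text; the per-acre values are approximate):

```python
# Back-of-the-envelope check of the per-acre water-use figures above,
# using the conversion stated in the text: 1 acre-foot = 325,851 gallons.
GALLONS_PER_ACRE_FOOT = 325_851

def acre_feet_to_gallons(acre_feet: float) -> float:
    """Convert a volume in acre-feet to U.S. gallons."""
    return acre_feet * GALLONS_PER_ACRE_FOOT

per_acre_1969 = 2.0   # roughly 2 acre-feet applied per irrigated acre in 1969
per_acre_2018 = 1.5   # roughly 1.5 acre-feet per irrigated acre by 2018

print(f"1969: about {acre_feet_to_gallons(per_acre_1969):,.0f} gallons per acre")
print(f"2018: about {acre_feet_to_gallons(per_acre_2018):,.0f} gallons per acre")
print(f"Decline: about {1 - per_acre_2018 / per_acre_1969:.0%}")
```

At roughly 2 acre-feet per acre, 1969-era application rates worked out to more than 650,000 gallons per irrigated acre.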
Tuesday, January 4, 2022
Regional distribution of U.S. irrigated acreage changed significantly from 1949 to 2017. Trends in irrigated cropping patterns, technological advances, water availability, and changing growing-season weather drove this evolution. The arid Mountain and Pacific regions consistently irrigated the most farmland until 2007, when irrigated acreage in the Northern Plains region surpassed acreage in the Pacific region. Irrigated acreage in the Mountain and Pacific regions remained relatively constant over the nearly 70-year period, despite increasingly limited opportunities for additional water development and increasing competition for water from non-agricultural sectors. The Northern Plains region has experienced the most substantial increase in irrigated acreage, expanding from less than 2 million acres in 1949 to nearly 12 million acres in 2017. The expansion of irrigated acreage in the Northern Plains is related to advances in groundwater pumping technologies, the diffusion of center pivot irrigation application systems, and the region’s abundant aquifer resources. The Southern Plains region experienced similar growth in irrigation until the 1980s, when dwindling groundwater supplies resulted in irrigated acreage declines. The Mississippi Delta and Southeast regions also have expanded irrigated acreage since 1949, reflecting, in part, changing cropping patterns, abundant aquifer water supplies, and producer responsiveness to changing precipitation levels during growing seasons. This chart was drawn from the USDA, Economic Research Service report Trends in U.S. Irrigated Agriculture: Increasing Resilience Under Water Supply Scarcity, published December 2021.
Monday, December 13, 2021
Irrigation organizations use a variety of methods to calculate on-farm water use so they can accurately track water use within their delivery systems. The methods used to calculate on-farm water use partially determine ways organizations can price water deliveries. For example, implementing volumetric water pricing is difficult unless organizations can directly meter on-farm water use. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, about 44 percent of irrigation water delivery organizations use direct metering to calculate on-farm water use, and about 42 percent of organizations use time-of-use estimation to determine water deliveries. The time-of-use method estimates the volume of water delivered based on the duration of deliveries and the characteristics of the conveyance infrastructure. About 17 percent of organizations calculate water deliveries based on self-reporting from irrigated farms and ranches. Many organizations use more than one method to determine on-farm water use. This chart was drawn from the USDA, Economic Research Service report Irrigation Organizations: Water Storage and Delivery Infrastructure, published October 2021.
Monday, November 29, 2021
Water storage infrastructure includes dams and reservoirs that provide a way to store water across seasons and years to meet the demands of irrigators. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, less than 20 percent of water delivery organizations own and manage their own water storage reservoirs. The remaining water delivery organizations rely on natural streamflow or storage infrastructure owned by State or Federal agencies or other irrigation organizations. Large irrigation organizations, defined as those organizations that serve more than 10,000 irrigable acres, are the most likely to own water storage infrastructure. Almost 37 percent of large irrigation organizations have at least one water storage reservoir. Meanwhile, 21 percent of medium organizations and 10 percent of small organizations have at least one reservoir. Storage infrastructure is particularly important in snowpack-dependent basins where the timing of spring runoff does not align with peak irrigation water demand. The role of water storage infrastructure will be critical as snowpack decreases, snowmelt runoff shifts to earlier in the growing season, and water demand increases. This chart can be found in the USDA, Economic Research Service report Irrigation Organizations—Water Storage and Delivery Infrastructure, published October 19, 2021.
Wednesday, November 3, 2021
Irrigation organizations that deliver water to farms and ranches use main and lateral canals, tunnels, and pipelines to transport water from natural waterways, reservoirs, or other infrastructure to irrigated farms and ranches. Transporting water to farms and ranches can result in conveyance losses, or water that is unavailable for irrigation use because of evaporation or seepage. Lining water canals with quasi-impermeable materials, such as concrete or plastic membranes, can reduce conveyance losses as less water is lost to seepage. However, the cost of lining canals may be prohibitively high for many irrigation organizations. According to data collected in the USDA’s 2019 Survey of Irrigation Organizations, almost 76 percent of water delivery organizations cite expense as a reason for leaving conveyance infrastructure unlined. In some scenarios, lining canals may not be feasible or warranted. For example, unlined canals may beneficially recharge aquifers, or soil and geologic attributes may minimize seepage losses. A smaller percentage of organizations cite those as reasons for not lining main and lateral canals. This chart can be found in the USDA, Economic Research Service report, Irrigation Organizations—Water Storage and Delivery Infrastructure, published October 19, 2021.
Monday, September 13, 2021
There are two methods of applying irrigation water to crops: gravity or pressurized irrigation systems. Gravity irrigation systems use on-field furrows, basins, or poly-pipe to advance water across the field surface by gravity alone. Pressurized systems apply water under pressure through pipes or other tubing directly to crops (e.g., sprinkler and micro/drip irrigation systems). Under many field conditions, pressurized irrigation systems use water more efficiently than gravity systems, as less water is lost to evaporation, deep percolation, and field runoff. Over the last 30 years, the number of acres irrigated using pressurized irrigation systems roughly doubled while the acreage irrigated using gravity systems declined substantially in the 17 Western States. In 2018, 72 percent of all irrigated cropland acres (28.96 million acres out of 40.31 million acres of total irrigated area) in the 17 Western States used pressurized irrigation systems, up from 37 percent in 1984. This chart appears in the USDA, Economic Research Service topic page for Irrigation & Water Use, updated August 2021.
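The 2018 share cited above can be verified directly from the acreage figures (both taken from the text):

```python
# Consistency check of the 2018 share cited above: pressurized acres
# out of total irrigated acres in the 17 Western States.
pressurized_acres = 28.96e6   # acres under pressurized systems, 2018 (from the text)
total_acres = 40.31e6         # total irrigated acres, 2018 (from the text)

share = pressurized_acres / total_acres
print(f"Pressurized share, 2018: {share:.0%}")  # ~72 percent, as stated
```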
Monday, November 16, 2020
In 2019, Wisconsin’s production of fluid milk was second only to California’s. According to data from USDA’s National Agricultural Statistics Service, Wisconsin generated 30.6 billion pounds of milk that year, with milk sales totaling $5 billion. In recent years, Wisconsin dairy farms have been exposed to substantial weather volatility characterized by frequent droughts, storms, and temperature extremes (both hot and cold). This has resulted in considerable fluctuations in dairy productivity. Researchers from the Economic Research Service (ERS), among others, found that total factor productivity (TFP), which measures the rate of growth in total output (aggregate milk produced) relative to the rate of growth in total inputs (such as the number of cows, farm labor, feed, and machinery), increased at an average annual rate of 2.16 percent for Wisconsin dairy farms between 1996 and 2012. This increase was primarily driven, at an annual rate of 1.91 percent, by technological progress—such as improved herd genetics, advanced feed formulations, and improvements in milking and feed handling equipment. However, trends in rainfall and temperature variation were responsible for a 0.32 percent annual decline in the productivity of Wisconsin dairy farms during the same period. For example, an average increase in temperature of 1.5 degrees Fahrenheit reduced milk output for the average Wisconsin dairy farm by 20.1 metric tons per year. This is equivalent to reducing the herd size of the average farm by 1.6 cows every year. This chart appears in ERS’s October 2020 Amber Waves finding, “Climatic Trends Dampened Recent Productivity Growth on Wisconsin Dairy Farms.”
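The herd-size equivalence above implies an average per-cow milk yield, which can be backed out with simple division (both inputs are from the text; the result is an inference, not a figure reported by ERS):

```python
# The temperature effect above implies an average per-cow annual milk yield:
# losing 20.1 metric tons of output is said to equal losing 1.6 cows.
output_loss_t = 20.1   # metric tons of milk lost per farm per year (from the text)
cow_equivalent = 1.6   # herd-size equivalent of that loss (from the text)

implied_yield = output_loss_t / cow_equivalent
print(f"Implied yield: about {implied_yield:.1f} metric tons of milk per cow per year")
```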
Wednesday, August 12, 2020
Knowing where natural resource use accumulates is fundamental to understanding what factors influence resource-use decisions. A recent Economic Research Service (ERS) study estimated natural resource use by the U.S. food system in 2007 (2007 data were the latest available with the level of detail needed for the analysis). Farm production was the smallest user of fossil fuels (12 percent of fossil fuel use); households were the largest users (35 percent). Over 40 percent of greenhouse gas emissions in food production were from farms and ranches, followed by households, and then companies that distribute and market food. For forest products, the greatest use occurred during food processing and packaging, with paper-based packaging accounting for most of this use. Farm production was the dominant user of freshwater withdrawals due to irrigation, but slightly over a third of water use by the food system in 2007 occurred after the farm, including in household kitchens (20 percent) and in the energy industry (12 percent). This chart appears in the ERS report, Resource Requirements of Food Demand in the United States, and Amber Waves article, “A Shift to Healthier Diets Likely To Affect Use of Natural Resources,” May 2020.
Thursday, May 28, 2020
Conserving natural resources starts with identifying where they are used. A recent Economic Research Service (ERS) study examined how much of five of the Nation’s natural resources was used in 2007 to feed Americans aged 2 and above. (2007 data were the latest available with the level of detail needed for the analysis.) The researchers looked at the entire U.S. food system, from production of farm inputs—such as fertilizers and feed—through points of consumer purchases in grocery stores and eating-out places to home kitchens. Their estimates show that agricultural land use in the U.S. food system was 25.5 percent of the country’s 2.3 billion acres of total land. Although the study does not account for other food-related land use, such as by forestry and mining industries serving the food system, it does show that about half of agricultural land was dedicated to food production for the U.S. market, while the other half was devoted to nonfood crops, like cotton and corn for producing ethanol, and to export crops, like soybeans. The U.S. food system also accounted for an estimated 28 percent of 2007’s freshwater withdrawals, 11.5 percent of the fossil fuel budget, and 7.2 percent of marketed forest products. Air is a natural resource that is degraded by the addition of greenhouse gases. The food system accounted for an estimated 18.1 percent of U.S. greenhouse gas emissions in 2007. A version of this chart appears in the ERS report, Resource Requirements of Food Demand in the United States, May 2020, and the Amber Waves feature article, “A Shift to Healthier Diets Likely To Affect Use of Natural Resources.”
Wednesday, April 22, 2020
The U.S. Environmental Protection Agency estimated that agriculture and forestry together accounted for 10.5 percent of U.S. greenhouse gas emissions in 2018. This includes carbon dioxide (CO2) emissions associated with agricultural electricity consumption. The greenhouse gases with the largest contributions to rising temperature are CO2, methane (CH4), and nitrous oxide (N2O). Globally, CO2 emissions are the largest contributor to climate change. However, the emissions profile for agriculture differs from that of the economy as a whole. U.S. agriculture emitted 698 million metric tons of carbon-dioxide equivalent in 2018: 12.3 percent as carbon dioxide, 36.2 percent as methane, and 51.3 percent as nitrous oxide. Increases in carbon storage (sinks) offset 11.6 percent of total U.S. greenhouse gas emissions in 2018. Carbon sinks include forest management to increase carbon in forests, increases in tree carbon stocks in settlements, conversion of agricultural to forest land (afforestation), and crop management practices that increase carbon in agricultural soils. This chart updates data that appears in the Economic Research Service data product Ag and Food Statistics: Charting the Essentials.
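The percentage breakdown above can be converted into absolute amounts per gas (all inputs are from the text; results rounded):

```python
# Converting the 2018 agricultural emission shares above into absolute amounts,
# in million metric tons of CO2 equivalent (MMT CO2e). All inputs from the text.
total_mmt = 698.0
shares = {"CO2": 0.123, "CH4": 0.362, "N2O": 0.513}

for gas, share in shares.items():
    print(f"{gas}: about {total_mmt * share:.0f} MMT CO2e")
# The shares sum to 99.8% rather than 100% because of rounding in the source.
```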
Monday, September 9, 2019
U.S. farm output has grown by 170 percent since 1948. Increases in total factor productivity (TFP), measured as total output per unit of total input, accounted for more than 90 percent of that output growth. However, TFP growth rates fluctuate considerably year to year, mostly in response to adverse weather, which can lower productivity estimates. Recent ERS research modeled a future climate-change scenario with an average temperature increase of 2 degrees Celsius (3.6 degrees Fahrenheit) and a 1-inch decrease in average annual precipitation. Results showed that the “TFP gap index”—the difference in total-factor productivity levels between the projected period (2030–40) and the reference period (2000–10)—varies by State. For some States, those climate changes fall within the historically observed range; for others they do not, and those States are projected to experience larger effects. The States experiencing the greatest impacts would include Louisiana and Mississippi in the Delta region; Rhode Island, Delaware, and Connecticut in the Northeast region; Missouri in the Corn Belt region; Florida in the Southeast region; North Dakota in the Northern Plains region; and Oklahoma in the Southern Plains region. This chart appears in the Amber Waves article, “Climate Change Likely to Have Uneven Impacts on Agricultural Productivity,” released August 2019.
Friday, September 30, 2016
In 2010, to help meet water quality goals, the U.S. Environmental Protection Agency (EPA) adopted a limit on the amount of pollutants that the Chesapeake Bay can receive. Nitrogen and phosphorus, in particular, can lead to adverse effects on public health, recreation, and ecosystems when present in excess amounts. The EPA estimates that applications of manure contribute 15 percent of nitrogen and 37 percent of phosphorus loadings to the Bay. Furthermore, ERS estimates that animal feeding operations (AFOs), which raise animals in confinement, account for 88 percent of manure nitrogen and 84 percent of manure phosphorus generation in that watershed. ERS also estimates that about a third of the nitrogen and half of the phosphorus produced at AFOs can be recovered for later use. That amounts to about 234 million pounds of nitrogen and 106 million pounds of phosphorus recovered. These nutrients can then be redistributed regionally to fertilize agricultural land, thereby lessening nutrient run-off problems in the Bay. The remaining nutrients cannot be recovered. Both nitrogen and phosphorus may be lost during collection, storage, and transportation; nitrogen may also volatilize into the atmosphere. This chart is based on the ERS report Comparing Participation in Nutrient Trading by Livestock Operations to Crop Producers in the Chesapeake Bay Watershed, released in September 2016.
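Because both the recovery shares and the recovered amounts appear above, the implied totals of manure nutrients generated at AFOs can be backed out (a rough inference from the rounded figures in the text, not an ERS-reported number):

```python
# Backing out implied totals of manure nutrients generated at AFOs in the
# Chesapeake Bay watershed. All amounts in million pounds, from the text.
recovered_n = 234.0   # about 1/3 of manure nitrogen is recoverable
recovered_p = 106.0   # about 1/2 of manure phosphorus is recoverable

implied_total_n = recovered_n / (1 / 3)
implied_total_p = recovered_p / (1 / 2)
print(f"Implied manure N generated at AFOs: about {implied_total_n:.0f} million lb")
print(f"Implied manure P generated at AFOs: about {implied_total_p:.0f} million lb")
```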
Friday, September 9, 2016
Climate models predict U.S. agriculture will face changes in local patterns of precipitation and temperature over the next century. These climate changes will affect crop yields, crop-water demand, water-supply availability, farmer livelihoods, and consumer welfare. Using projections of temperature and precipitation under nine different scenarios, ERS research projects that climate change will result in a decline in national fieldcrop acreage in 2080 when measured relative to a scenario that assumes continuation of reference climate conditions (precipitation and temperature patterns averaged over 2001-08). Acreage trends show substantial variability across climate change scenarios and regions. When averaged over all climate scenarios, total acreage in the Mountain States, Pacific, and Southern Plains is projected to expand, while acreage in other regions, most notably the Corn Belt and Northern Plains, declines. Over half of all fieldcrop acreage in the U.S. is found in the Corn Belt and Northern Plains, and projected declines in these regions represent 2.1 percent of their combined acreage. Irrigated acreage for all regions is projected to decline, but in some regions increases in dryland acreage offset irrigated acreage losses. The acreage response reflects projected changes in regional irrigation supply as well as differential yield impacts and shifts in relative profitability across crops and production practices under the climate change scenarios. This chart is from the ERS report Climate Change, Water Scarcity, and Adaptation in the U.S. Fieldcrop Sector, November 2015.
Wednesday, May 11, 2016
Agriculture accounted for an estimated 10 percent of U.S. greenhouse gas (GHG) emissions in 2014. In agriculture, crop and livestock activities are important sources of nitrous oxide and methane emissions, notably from fertilizer application, enteric fermentation (a normal digestive process in animals that produces methane), and manure storage and management. GHG emissions from agriculture have increased by approximately 10 percent since 1990. During this time period, total U.S. GHG emissions increased approximately 7 percent. This chart is from the Land and Natural Resources section of ERS’s Ag and Food Statistics: Charting the Essentials data product.
Wednesday, February 17, 2016
ERS research projects that climate change will result in a decline in national fieldcrop acreage over analysis years 2020, 2040, 2060, and 2080, when measured relative to a scenario that assumes continuation of reference climate conditions (precipitation and temperature patterns averaged over 2001-08). Acreage trends are explored for nine climate change scenarios, and substantial variability exists across climate change scenarios and crop sectors. When averaged over all climate scenarios, U.S. acreage in rice, hay, and cotton is projected to expand, while acreage in corn, soybeans, sorghum, wheat, and silage declines. Acreage response varies across crops as a function of the sensitivity of crop yields to changes in precipitation, temperature, and atmospheric carbon dioxide; the resulting changes in relative crop profitability; the coincidence of climatic shifts with geographic patterns of crop production; and variables related to the extent of crop reliance on irrigation. This chart is from the ERS report Climate Change, Water Scarcity, and Adaptation in the U.S. Fieldcrop Sector, November 2015.
Wednesday, January 6, 2016
About 75 percent of irrigated cropland in the United States is located in the 17 westernmost contiguous States, based on USDA’s 2013 Farm and Ranch Irrigation Survey (the most recent available). Between 1984 and 2013, while the amount of irrigated land in the West remained fairly stable (at about 40 million acres) and the amount of water applied was mostly flat (between 70 and 76 million acre-feet per year), the use of more efficient irrigation systems to deliver the water increased. In 1984, 71 percent of Western crop irrigation water was applied using gravity irrigation systems that tend to use water inefficiently. By 2013, operators used gravity systems to apply just 41 percent of water for crop production, while pressure-sprinkler irrigation systems (including drip, low-pressure sprinkler, or low-energy precision application systems), which can apply water more efficiently, accounted for 59 percent of irrigation water use and about 60 percent of irrigated acres. This chart is found in the ERS topic page on Irrigation & Water Use, updated October 2015.
Friday, November 27, 2015
Climate models predict U.S. agriculture will face significant changes in local patterns of precipitation and temperature over the next century. These climate changes will affect crop yields, crop-water demand, water-supply availability, farmer livelihoods, and consumer welfare. Irrigation is an important strategy for adapting to shifting production conditions under climate change. Using projections of temperature and precipitation under nine climate change scenarios for 2020, 2040, 2060, and 2080, ERS analysis finds that on average, irrigated fieldcrop acreage would decline relative to a reference scenario that assumes continuation of reference climate conditions (precipitation and temperature patterns averaged over 2001-08). Before midcentury, the decline in irrigated acreage is largely driven by regional constraints on surface-water availability for irrigation. Beyond midcentury, the decline reflects a combination of increasing surface-water shortages and declining relative profitability of irrigated production. This chart is from the ERS report, Climate Change, Water Scarcity, and Adaptation in the U.S. Fieldcrop Sector, ERR-201, November 2015.
Wednesday, July 8, 2015
Above a temperature threshold, an animal may experience heat stress resulting in changes in its respiration, blood chemistry, hormones, metabolism, and feed intake. Dairy cattle are particularly sensitive to heat stress; high temperatures lower milk output and reduce the percentages of fat, solids, lactose, and protein in milk. In the United States, dairy production is largely concentrated in climates that expose animals to less heat stress. The Temperature Humidity Index (THI) load provides a measure of the amount of heat stress an animal is under. The annual THI load is similar to “cooling degree days,” a concept often used to convey the amount of energy needed to cool a building in the summer. The map shows concentrations of dairy cows in regions with relatively low levels of heat stress: California’s Central Valley, Idaho, Wisconsin, New York, and Pennsylvania. Relatively few dairies are located in the very warm Gulf Coast region (which includes southern Texas, Louisiana, Mississippi, Alabama, and Florida). This map is drawn from Climate Change, Heat Stress, and Dairy Production, ERR-175, September 2014.
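The THI itself is computed from temperature and relative humidity. A minimal sketch using one common formulation (this specific formula and the threshold in the comment are illustrative assumptions, not necessarily those used in the ERS report):

```python
# One common Temperature Humidity Index (THI) formulation for livestock:
#   THI = T - (0.55 - 0.0055 * RH) * (T - 58), with T in degrees F and
#   RH in percent. Treat this as an illustrative assumption; the ERS
#   report may use a different variant.
def thi(temp_f: float, rel_humidity_pct: float) -> float:
    """Temperature Humidity Index at a given temperature and relative humidity."""
    return temp_f - (0.55 - 0.0055 * rel_humidity_pct) * (temp_f - 58.0)

# Dairy cattle are often considered heat-stressed above a THI around 72
# (a commonly cited threshold; it is an assumption here, not from the text).
print(f"THI at 90F and 60% relative humidity: {thi(90.0, 60.0):.1f}")
```

Humid heat raises the index: at the same 90°F, higher relative humidity yields a higher THI and thus greater heat stress, which is why the humid Gulf Coast region hosts relatively few dairies.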
Monday, June 22, 2015
As agriculture adapts to climate change, crop genetic resources can be used to develop new plant varieties that are more tolerant of changing environmental conditions. Crop genetic resources (or germplasm) consist of seeds, plants, or plant parts that can be used in crop breeding, research, or conservation. The public sector plays an important role in collecting, conserving, and distributing crop genetic resources because private-sector incentives for crucial parts of these activities are limited. The U.S. National Plant Germplasm System (NPGS) is the primary network that manages publicly held crop germplasm in the United States. Since 2003, demand for crop genetic resources from the NPGS has increased rapidly even as the NPGS budget has declined in real dollars. By way of comparison, the NPGS budget of approximately $47 million in 2012 was well under one-half of 1 percent of the U.S. seed market (measured as the value of farmers’ purchased seed), which exceeded $20 billion for the same year. This chart updates one found in the June 2015 Amber Waves feature, Crop Genetic Resources May Play an Increasing Role in Agricultural Adaptation to Climate Change.
Monday, June 1, 2015
Every year, agriculture contributes an estimated 60-80 percent of the nitrogen and 49-60 percent of the phosphorus delivered to the Gulf of Mexico. Nitrogen in water can cause rapid and dense growth of algae and aquatic plants, leading to degradation in water quality as found in the hypoxic zone of the Gulf of Mexico, where excess nutrients have depleted oxygen needed to support marine life. Nitrogen removal is one of the many benefits of wetlands. An ERS analysis found that on an annual basis, the amount of nitrogen removed per dollar spent to restore and preserve a new wetland ranged from 0.15 to 34 pounds within the area of study (the Upper Mississippi/Ohio River watershed), or a range of $0.03 to $7.00 per pound of nitrogen removed. Restoring and protecting wetlands in the very productive corn-producing areas of Illinois, Indiana, and Ohio tends to be more cost effective than elsewhere in the study area. The study suggests that if nitrogen reduction were the only environmental goal, these corn-producing areas would be a good place to restore wetlands. Hydrologic conditions in the Upper Mississippi and Ohio River watersheds are unique, so the cost effectiveness of wetlands elsewhere is uncertain. This map is found in the ERS report, Targeting Investments To Cost Effectively Restore and Protect Wetland Ecosystems: Some Economic Insights, ERR-183, February 2015.
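The dollar-per-pound range above is simply the reciprocal of the pounds-per-dollar range, as a quick check shows (inputs from the text; the upper bound matches the text's $7.00 after rounding):

```python
# The cost range per pound of nitrogen removed is the reciprocal of the
# pounds-removed-per-dollar range reported in the text: 0.15 to 34 pounds.
lb_per_dollar_low, lb_per_dollar_high = 0.15, 34.0

cost_high = 1 / lb_per_dollar_low    # least cost-effective wetlands
cost_low = 1 / lb_per_dollar_high    # most cost-effective wetlands
print(f"Cost range: ${cost_low:.2f} to ${cost_high:.2f} per pound of N removed")
```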