ERS Charts of Note
Get the latest charts via email or on our mobile app.
Wednesday, August 24, 2022
Households in rural (nonmetropolitan) persistently poor counties were the least likely to have home internet in 2015-19, with more than 3 in 10 households lacking internet access at home. In comparison, only 2 in 10 households in rural counties that were not persistently poor had no internet access at home. A similar pattern was observed in urban (metropolitan) areas, with 2 in 10 households in persistently poor counties lacking home internet access. Only a little more than 1 in 10 households in urban counties that were not persistently poor had no internet access at home. These data illustrate two major trends. First, rural households were less likely to have internet subscriptions at home than urban households. Second, in persistently poor counties, whether rural or urban, a higher share of households lacked internet access than in counties that were not persistently poor. For households with internet access at home, service was mainly through a subscription, which includes a range of access from dial-up to broadband to cellular data plans. These gaps in at-home internet access and subscriptions suggest that households in persistently poor counties—and more specifically, households in rural persistently poor counties—faced additional barriers to internet adoption. This chart appears in the USDA, Economic Research Service report Rural America at a Glance: 2021 Edition, published in November 2021.
Friday, January 14, 2022
Access to fast internet speeds has been crucial throughout the Nation with the increased online presence of school, work, and shopping due to the Coronavirus (COVID-19) pandemic. In June 2019, the moderate- or high-speed broadband internet needed for high-quality video calls was available in the census blocks of more than 90 percent of U.S. residents. In rural counties, however, only 72 percent of residents had access to those internet speeds. Only 63 percent of rural residents in counties with persistent poverty had moderate- or high-speed broadband available in their census blocks. Counties are considered persistently poor if they have a poverty rate of 20 percent or more for four consecutive U.S. Census measurements dating back to 1980. Among persistently poor rural counties, high availability of moderate- or high-speed internet was clustered in and around eastern Kentucky and southern Texas. Rural persistently poor counties in the Deep South and Southwest had low internet availability, as did rural counties in the lower Great Plains and western Mountain States that were not persistently poor. Rural counties without persistent poverty that had high internet availability were scattered throughout the eastern half of the United States and clustered in the upper Great Plains and eastern Mountain States. This map was first published in the USDA, Economic Research Service report Rural America at a Glance: 2021 Edition.
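The persistent-poverty definition used above is a simple threshold rule. It can be sketched as follows (an illustrative helper with hypothetical rates, not ERS's actual classification code):

```python
def is_persistently_poor(poverty_rates):
    """True if a county's poverty rate was 20 percent or more in each of
    four consecutive Census measurements (1980, 1990, 2000, and a recent
    survey period), per the definition described in the text.

    `poverty_rates` lists the four rates, oldest first.
    Illustrative only; not ERS's actual methodology code.
    """
    return len(poverty_rates) == 4 and all(rate >= 20.0 for rate in poverty_rates)

# A county at or above 20 percent in all four measurements qualifies:
print(is_persistently_poor([24.1, 22.5, 21.0, 20.3]))  # True
# Dipping below 20 percent in any one measurement disqualifies it:
print(is_persistently_poor([24.1, 19.8, 21.0, 20.3]))  # False
```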
Friday, January 7, 2022
The Coronavirus (COVID-19) pandemic affected unemployment rates differently in rural and urban counties. In January 2020, just before the pandemic, the unemployment rates in rural counties, both persistently poor and not, were higher than in urban counties. In addition, the unemployment rates in persistently poor counties were higher than in counties that were not persistently poor (6 percent versus 4.6 percent, respectively, for rural counties). That changed with the pandemic-driven economic downturn. By April 2020, the unemployment rate in persistently poor rural counties had more than doubled to a peak of 12.6 percent. However, in other rural counties the unemployment rate had more than tripled, surpassing the rate in persistently poor rural counties with a peak of 13.7 percent. Similarly, in urban counties the unemployment rate more than tripled in persistently poor counties and quadrupled in other urban counties, surpassing the unemployment rates in rural counties. These changes in the unemployment rate suggest that the employment shock at the start of the pandemic was not as prominent in persistently poor counties as in counties that were not persistently poor, and that it had a larger effect on urban counties than rural counties. These changes possibly reflect differences in dependence on industries that remained in operation, such as meatpacking, and on industries that saw reduced demand, such as retail and hospitality. By June 2020, the unemployment rate in rural counties that were not persistently poor had again fallen below the rate in persistently poor rural counties, while the rate in urban counties that were not persistently poor fell below the rate in persistently poor rural counties by October 2020. As of October 2021, the unemployment rates in rural counties had returned to what they were before the pandemic, but the unemployment rate remained elevated in persistently poor urban counties.
This chart updates data in the USDA, Economic Research Service report Rural America at a Glance: 2021 Edition, published in November 2021.
Thursday, November 18, 2021
The public rollout of Coronavirus (COVID-19) vaccinations in the United States began in mid-December 2020. The early phases of vaccination prioritized frontline health care workers, adults age 65 and over, individuals with high-risk medical conditions, and essential workers. During this period, vaccination rates in metro (urban) and nonmetro (rural) counties increased at about the same rate, and by mid-April 2021, total vaccination rates for the Nation were slightly above 20 percent. After April 19, 2021, when all adults became eligible for vaccination, the share of fully vaccinated residents increased faster in metro counties than in nonmetro counties. In addition, the vaccination rate was consistently lower in persistently poor counties than in counties that were not persistently poor. Persistently poor counties are counties in which 20 percent or more of the population lived at or below the Federal poverty line during four consecutive U.S. census measurements dating to 1980. In July 2021, the highly infectious Delta variant began to spread. Rural persistently poor counties led the Nation in new infections through much of this surge, and the gap in vaccination rates between persistently poor and other counties started to close in mid-August. However, the rural-urban vaccination gap persisted. By early October 2021, the vaccination rate in urban counties had reached 53 percent, while the vaccination rate in rural counties was about 42 percent. This chart is included in the USDA Economic Research Service report Rural America at a Glance: 2021 Edition, published November 17, 2021.
Monday, August 23, 2021
Across all races and ethnicities, the U.S. poverty rate in 2019 was higher in nonmetro (rural) areas, at 15.4 percent, than in metro (urban) areas, at 11.9 percent. Rural Black or African American residents had the highest incidence of poverty in 2019 at 30.7 percent, compared with 20.4 percent for that demographic group in urban areas. Rural American Indians or Alaska Natives had the second highest rate at 29.6 percent, compared with 19.4 percent in urban areas. The poverty rate for White residents, at 13.3 percent in rural areas and 9.7 percent in urban settings, was about half the rate for either Black or American Indian residents. Rural Hispanic residents of any race had the third highest poverty rate at 21.7 percent, compared with 16.9 percent in urban areas. Non-Hispanic White residents had the lowest poverty rates in both rural (12.7 percent) and urban (8.2 percent) areas in 2019. This chart appears in the Economic Research Service topic page for Rural Poverty & Well-Being, updated June 2021.
Wednesday, August 11, 2021
A recent USDA, Economic Research Service study of U.S. poverty identified 310 counties—10 percent of all counties—with high and persistent levels of poverty in 2019. High and persistent poverty counties had poverty rates of 20 percent or more in 1980, 1990, 2000, and on average for 2007-2011 and 2015-2019. Of those 310 counties, 86 percent, or 267 counties, were rural (nonmetro). These rural counties were concentrated in historically poor areas of the Mississippi Delta, Appalachia, the Black Belt, and the southern border regions, as well as on Federal Indian reservations. More than 5 million rural residents, or about 12 percent of the U.S. rural population, lived in counties that had high and persistent poverty rates in 2019. Of those, 1.5 million individuals had incomes below the Federal poverty threshold, accounting for 20 percent of the total rural poor population. Rural residents who identify as Black or African American and American Indian or Alaska Native were particularly vulnerable. Nearly half the rural poor within these groups lived in high and persistent poverty counties in 2019. By comparison, 20 percent of rural poor Hispanics and 12 percent of rural non-Hispanic Whites lived in those counties. This chart appears in the August 2021 Amber Waves finding, Rural Poverty Has Distinct Regional and Racial Patterns.
Friday, May 28, 2021
Early during the Coronavirus (COVID-19) pandemic, U.S. employment fell at rates not seen since the Great Depression, with the greatest declines occurring in metro areas. Before the pandemic, employment growth in metro areas had averaged 1.4 percent per year for the 12 months prior to March 2020, more than twice the rate in nonmetro areas (0.6 percent per year). After March 2020, the situation reversed. In April 2020, metro employment was 15.0 percent below 12 months earlier, while nonmetro employment was 12.2 percent lower. Employment has since largely recovered in both metro and nonmetro areas but remained lower in February 2021 than levels 12 months earlier. The extent to which employment was still depressed in February 2021 was greater in metro areas: Metro employment was 5.7 percent lower in February 2021 than in February 2020, while nonmetro employment was 3.4 percent lower. This chart appears in the Economic Research Service topic page, The COVID-19 Pandemic and Rural America, updated May 2021.
Friday, April 2, 2021
Since the late 1990s, an opioid epidemic has afflicted the U.S. population, particularly those in the prime working ages of 25-54. As a result, the National age-adjusted mortality rate from drug overdoses rose from 6.1 per 100,000 people in 1999 to 21.7 per 100,000 in 2017, then dipped to 20.7 per 100,000 in 2018 and rose back to 21.6 in 2019. Among the prime working age population, the drug overdose mortality rate was 37.8 deaths per 100,000 people in 2019. This rate was exceeded only by cancer (39.2 deaths per 100,000) in 2019 as a major cause of death in this population. ERS researchers, examining the opioid epidemic from 1999 to 2018, observed two distinct phases: a “prescription opioid phase” (1999-2011) and a succeeding “illicit opioid phase” (2011-2018), marked especially by the spread of fentanyl and its analogs. Updated data show the second phase has extended into 2019. Mortality data indicate that in the prescription opioid phase, drug overdose deaths were most prevalent in areas with high rates of physical disability, such as central Appalachia. Rural residents and middle-aged men and women in their 40s and early 50s were most affected, as were Whites and American Indian/Alaskan Natives. Opioid prescriptions ceased driving the epidemic in 2011 as increased regulation and greater awareness of prescription addiction problems took hold. The illicit opioid phase that followed involved primarily heroin and synthetic opioids, such as fentanyl. Fentanyl and its analogs are often used to spike other addictive drugs, including prescription opioids, creating powerful combinations that make existing drug addictions more lethal. During the study period, this second phase was concentrated in the northeastern United States, particularly in areas of employment loss. This phase most often involved urban young adult males, ages 25 to 39. All the racial/ethnic groups studied—Hispanics, Blacks, American Indian/Alaskan Natives, and Whites—were affected.
This chart updates data found in the Economic Research Service report The Opioid Epidemic: A Geography in Two Phases, released April 2021.
Friday, March 19, 2021
During the initial COVID-19 surge between March and June 2020, large urban areas had the highest weekly death rates from the virus in the United States. Those numbers declined as medical professionals learned more about the virus, how to treat it, and how to prevent its spread. As the virus spread from major urban areas to rural areas, the second COVID-19 surge, from July to August 2020, brought more deaths to rural areas. The peak in deaths associated with this surge was smaller because testing was more widespread, the infected population was younger and less vulnerable, and treatments were more effective. However, in early September 2020, COVID-19 death rates in rural areas surpassed those in urban areas. This trend continued into a third, still ongoing, surge that spiked in rural areas during the holiday season and again shortly thereafter. Rural areas have shown higher death rates per 100,000 adults since September in part because they had higher rates of new infections than urban areas, but that is not the whole story. Rural COVID-19 deaths per 100 new infections reported 2 weeks prior (a lag that accounts for the typical time between infection and death) averaged 2.2 in the first 3 weeks of February—35 percent higher than the corresponding urban rate of 1.6 deaths per 100 new infections. The rural population appears to be more vulnerable to serious infection because of the older age of its population, higher rates of underlying medical conditions, lack of health insurance, and greater distance to an intensive care hospital. As of early February, death rates have started decreasing, possibly because of more widespread vaccinations among the most vulnerable populations. This chart updates data found in the February 2021 Amber Waves data feature, “Rural Residents Appear to be More Vulnerable to Serious Infection or Death from COVID-19.”
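The lagged mortality measure described above pairs each period's deaths with the new infections reported two weeks earlier. A minimal sketch of that arithmetic (using hypothetical weekly counts, not the actual ERS data):

```python
def deaths_per_100_lagged_cases(weekly_deaths, weekly_new_cases, lag_weeks=2):
    """Deaths per 100 new infections reported `lag_weeks` earlier, aligning
    each week's deaths with the cases from two weeks before to account for
    the lag between infection and death. Inputs are hypothetical counts."""
    pairs = zip(weekly_deaths[lag_weeks:], weekly_new_cases[:-lag_weeks])
    return [100 * deaths / cases for deaths, cases in pairs]

# With 100 new cases reported every week, the 2 deaths in week 3 are
# matched against the 100 cases reported in week 1, and so on:
print(deaths_per_100_lagged_cases([0, 0, 2, 3], [100, 100, 100, 100]))  # [2.0, 3.0]
```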
Wednesday, February 24, 2021
Higher educational attainment generally is associated with higher median earnings, higher employment rates, and greater workforce opportunity. Among all rural residents who are 25 years old or older, the percentage who had completed a bachelor’s degree or higher rose from 15 percent in 2000 to 21 percent in 2019. In addition, the share of the rural population 25 or older without a high school degree or equivalent dropped from 24 percent in 2000 to 12 percent in 2019. However, ethnic and racial disparities persist in education. Rural Hispanics continued to have the highest share of people without a high school degree in 2019 at 34 percent, despite significant gains in high school and higher educational attainment rates since 2000. Over the same period, Blacks or African Americans had the largest decrease of rural individuals without a high school degree (21 percentage points). This change narrowed the gap between the shares of Blacks or African Americans and Whites who had graduated from high school but had not completed a bachelor’s degree. Nevertheless, the share of rural Blacks or African Americans without a high school degree (20 percent) was nearly double that of Whites (11 percent) in 2019. This chart updates data found in the November 2020 Amber Waves finding, “Racial and Ethnic Disparities in Educational Attainment Persist in Rural America.”
Monday, December 7, 2020
Higher educational attainment is associated with higher median earnings, higher employment rates, and greater workforce opportunity. Among all rural residents who are 25 years old or older, the percentage of those who had completed a bachelor’s degree or higher rose from 15 percent in 2000 to 20 percent in 2018. In addition, the share of the rural population 25 or older without a high school degree or equivalent dropped from 24 percent in 2000 to 13 percent in 2018. Even so, ethnic and racial disparities persist in education. Rural Hispanics continued to have the highest share (35 percent) without a high school degree, despite significant gains in high school and higher educational attainment rates between 2000 and 2018. Over the same period, Blacks or African Americans had the largest decrease (20 percentage points) of rural individuals without a high school degree. This change eliminated the gap between the shares of Blacks or African Americans and Whites who had graduated from high school but had not completed a bachelor’s degree. Nevertheless, the share of rural Blacks or African Americans without a high school degree remained nearly double that of Whites in 2018. This chart appears in the November 2020 Amber Waves finding, “Racial and Ethnic Disparities in Educational Attainment Persist in Rural America.”
Friday, October 2, 2020
COVID-19 has spread to nearly every nation in the world, and to every State and nearly every county in the United States. The virus initially spread most rapidly to large metropolitan areas, and most confirmed cases are still in metro areas with populations of at least 1 million, according to the Economic Research Service’s (ERS) analysis of data from the Johns Hopkins University Center for Systems Science and Engineering. This is consistent with most of the U.S. population living in large metro areas. Even in per capita terms, the prevalence of COVID-19 cases has been greater in metro than in nonmetro areas since the initial appearance of the pandemic in the United States (the first confirmed case was reported on January 20, 2020). As of September 1, cumulative confirmed cases per 100,000 residents reached 1,877 in metro areas, compared with 1,437 cases in nonmetro areas. Although the prevalence of COVID-19 cases remains lower in nonmetro areas, the share of cases in nonmetro areas has grown since late March. The nonmetro share of all confirmed U.S. COVID-19 cases grew from 3.6 percent on April 1 to 11.1 percent on September 1. ERS regularly produces research on rural America, including demographic changes in rural communities and drivers of rural economic performance. This chart appears in the ERS topic page, The COVID-19 Pandemic and Rural America, updated September 2020.
Monday, August 17, 2020
In 2013, rural poverty reached a 30-year peak at 18.4 percent of the rural population. Between 2013 and 2018, the rural poverty rate fell 2.3 percentage points, a decline of about 1 million rural residents in poverty. Rural poverty rates declined for all race/ethnicity groups. The rural Black population showed the largest decline in poverty rates, from 37.3 percent in 2013 to 31.6 percent in 2018. Despite this decrease, Blacks continued to have the highest poverty rate among all rural race/ethnicity groups. While Blacks made up 7.6 percent of the rural population, they accounted for 14.9 percent of the rural poor in 2018. American Indians had the second-highest poverty rate (30.9 percent) among all rural race/ethnicity groups in 2018, 3.5 percentage points lower than in 2013. Hispanics had the lowest poverty rate among rural minority groups (23.8 percent) in 2018, an improvement of 4.4 percentage points from 2013. Whites have historically had a much lower rural poverty rate (14.0 percent in 2018), and their rate fell 1.9 percentage points from 2013 to 2018. However, the majority of the rural poor are White. Whites accounted for 84.8 percent of the overall rural population and 73.4 percent of the rural population in poverty in 2018. This chart updates data that appeared in the November 2018 ERS report, Rural America at a Glance, 2018 Edition.
Monday, July 13, 2020
Between 2014 and 2018, the United States had 316 counties with low levels of educational attainment, meaning 20 percent or more of working-age adults (ages 25-64) living in the county lacked a high school diploma or equivalent. The majority of those counties—about 4 out of 5—were in rural (nonmetro) areas. Low-education rural counties were predominantly in the South (nearly 80 percent, or 208 counties), and the economies of more than one-third of them (116 counties) relied on farming or manufacturing. Nearly half (156 counties) were high poverty counties, with a poverty rate of 20 percent or more, and most of those counties (113 counties) also had persistently high poverty over three or more decades. In addition, almost 60 percent of low-education rural counties were in areas where African Americans alone (70 counties) or Hispanics of any race (115 counties) accounted for 20 percent or more of the total population. In these counties, the low-education rates for African Americans or Hispanics were substantially higher than the corresponding rates for White (non-Hispanic) individuals. This chart appears on the ERS topic page for Rural Education, updated May 2020.
Tuesday, June 16, 2020
In 2018, the United States had 664 high-poverty counties, where 20 percent or more of the population, on average, lived below the Federal poverty level over 2014-18. The majority were rural (78.9 percent, or 524 counties). These high-poverty counties represented about one of every four rural counties, compared with about one of every ten urban counties. Fifteen of the 664 counties were extreme poverty areas, where the poverty rate was 40 percent or greater. The extreme poverty areas were also persistent poverty counties, with poverty rates of at least 20 percent over the past 30 years. In 2018, all of the extreme poverty counties were in rural America. These counties are not evenly distributed, but rather are geographically concentrated and disproportionately located in regions with above-average populations of racial minorities. Several extreme poverty counties, for instance, were in Mississippi, including four counties where there has historically been a high incidence of poverty among the African-American population. Extreme poverty counties were also found in South Dakota, including six counties where Native Americans made up more than 50 percent of the population. This chart appears on the Economic Research Service topic page for Rural Poverty & Well-being, updated February 2020. It is also in the May 2020 Amber Waves article, “Extreme Poverty Counties Found Solely in Rural Areas in 2018.”
Friday, March 13, 2020
People living in poverty tend to be clustered in certain U.S. regions, counties, and neighborhoods, rather than being spread evenly across the Nation. Poverty rates in rural (nonmetro) areas have historically been higher than in urban (metro) areas, and the rural/urban poverty gap is greater in some regions of the country than others. At the regional level, poverty is disproportionately concentrated in the rural South. In 2014-18, the South had an average rural poverty rate of 20.5 percent—nearly 6 percentage points higher than the average rate in the region’s urban areas. An estimated 42.7 percent of the Nation’s rural population and 51.3 percent of the Nation’s rural poor lived in this region between 2014 and 2018. By comparison, 37.1 percent of the urban population and 39.4 percent of the urban poor lived in the South during that period. The poverty gap was smallest in the Midwest and the Northeast—with less than a percentage point difference between rural and urban poverty rates. This chart appears on the Economic Research Service topic page for Rural Poverty & Well-being, updated February 2020.
Tuesday, December 10, 2019
Family type has a significant bearing on poverty. For example, families headed by two adults are likely to have more sources of income than single-adult families—and are therefore less likely to be poor. In 2017, 33.8 percent of rural families headed by a female with no spouse present and 18.5 percent of those headed by a male with no spouse present fell below the poverty threshold. In contrast, 6 percent of rural families with a married couple were poor. On average, 11.6 percent of all rural families were poor. Poverty rates for single-adult families were also higher than average in urban areas in 2017, but overall family poverty rates were higher in rural than in urban areas. This chart appears in the ERS topic page for Rural Poverty & Well-being, updated March 2019. This Chart of Note was originally published May 29, 2019.
Monday, November 4, 2019
U.S. poverty rates differ by age group. In 2017, the difference between rural and urban poverty rates was greatest for children under the age of 5 (26.0 percent in rural areas versus 19.3 percent in urban areas). Federal poverty thresholds vary by household composition. For a family of two adults and one child, the poverty line in 2017 was an annual income of $19,730. Overall, poverty rates for children under age 18 were 22.8 percent in rural areas and 17.7 percent in urban areas. In contrast, the poverty rates for senior adults (age 65 and older) were much closer: 10.1 percent in rural areas and 9.1 percent in urban areas. Working-age adults (ages 18–64) followed the pattern of other age groups, in that they had higher poverty rates in rural areas (16.0 percent) than in urban areas (12.1 percent). Poverty rates do not indicate how long individuals have experienced poverty. Some families cycle into and out of poverty over time, while others are persistently poor. Persistent poverty among children is of particular concern, as the cumulative effects may lead to poor health, limited education, and other negative outcomes. Also, research suggests that the more time a child spends in poverty or living in a high-poverty area, the greater the chance of being poor as an adult. This chart appears in the ERS topic page for Rural Poverty & Well-being, updated March 2019.
Monday, August 12, 2019
The Supplemental Nutrition Assistance Program (SNAP) provided benefits to an average of more than 46 million recipients per month and accounted for 52 percent of USDA’s spending in 2014. That year, SNAP recipients redeemed more than $69 billion worth of benefits. Recent ERS research estimated the effect of SNAP redemptions on county-level employment. During and immediately after the Great Recession (2008–10), each additional $10,000 in SNAP redemptions contributed on average 1.04 additional jobs in rural counties and 0.41 job in urban counties. By contrast, before the recession (2001–07), SNAP redemptions had a much smaller positive effect on employment in rural counties (about 0.25 job per $10,000 in redemptions) and a negative effect in urban counties (a loss of about 0.22 job per $10,000 in redemptions). After the recession (2011–14), SNAP redemptions had a statistically insignificant effect on employment in both rural and urban counties. Per dollar spent, the effect of SNAP redemptions on local employment during the recession was greater than the employment effect of other government transfer payments combined—including Social Security, Medicare, Medicaid, unemployment insurance compensation, and veterans’ benefits—and also greater than the employment effect of total Federal Government spending. SNAP’s relatively large effect on employment during the recession may reflect the fact that, unlike many other government programs, SNAP payments go directly to low-income people, who tend to spend additional income immediately. This chart uses data found in the May 2019 ERS report, The Impacts of Supplemental Nutrition Assistance Program Redemptions on County-Level Employment. Also see the May 2019 article, “SNAP Redemptions Contributed to Employment During the Great Recession” in ERS’s Amber Waves magazine.
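The per-$10,000 multipliers above scale linearly with the amount redeemed. The implied job counts can be sketched with back-of-the-envelope arithmetic (illustrative only, not the ERS econometric model):

```python
def jobs_supported(redemptions_dollars, jobs_per_10k):
    """Estimated county-level jobs supported by SNAP redemptions, given a
    jobs-per-$10,000 multiplier such as those reported in the text.
    Back-of-the-envelope arithmetic, not the ERS estimation model."""
    return jobs_per_10k * redemptions_dollars / 10_000

# During the Great Recession (2008-10), rural counties averaged 1.04 jobs
# per $10,000 redeemed, so $1 million in redemptions implies about 104 jobs:
print(round(jobs_supported(1_000_000, 1.04), 2))  # 104.0
# The urban multiplier of 0.41 implies about 41 jobs for the same amount:
print(round(jobs_supported(1_000_000, 0.41), 2))  # 41.0
```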
Thursday, June 20, 2019
The Supplemental Nutrition Assistance Program (SNAP) is the largest USDA program. During fiscal year 2014, it provided benefits to an average of more than 46 million recipients per month and accounted for 52 percent of USDA’s spending. That year, SNAP recipients redeemed more than $69 billion worth of benefits at SNAP-authorized stores—83 percent of which were located in urban areas and 17 percent in rural areas. Between fiscal years 2000 and 2013, average monthly SNAP participation nearly tripled, while the inflation-adjusted value of benefits paid under the program nearly quadrupled. The growth in program participation and the value of benefits paid were particularly rapid during and immediately after the Great Recession, which officially began in December 2007 and ended in June 2009. However, the recession resulted in high poverty rates well after it officially ended. The increase in program spending between 2009 and 2013 was due in part to rising SNAP participation in response to high levels of poverty during this period. A temporary increase in benefit rates mandated by the American Recovery and Reinvestment Act (ARRA) in early 2009 and other policies to increase access to the program also likely expanded SNAP participation and spending. This chart appears in the May 2019 ERS report Investigating Impacts of SNAP Redemptions on County-Level Employment. Also see the May 2019 article, “SNAP Redemptions Contributed to Employment During the Great Recession” in ERS’s Amber Waves magazine.