
Sunday, April 9, 2023

Economic history of the United States

New York City, the world’s principal financial center and the epicenter of the principal American metropolitan economy
 

The economic history of the United States is about the characteristics of and important developments in the economy of the U.S., from the colonial era to the present. The emphasis is on productivity and economic performance and how the economy was affected by new technologies, changes in the size of economic sectors, and the effects of legislation and government policy.

Colonial economy

Shipping scene in Salem, Massachusetts, a shipping hub, in the 1770s

The colonial economy was characterized by an abundance of land and natural resources and a severe scarcity of labor. This was the opposite of Europe and attracted immigrants despite the high death rate caused by New World diseases. From 1700 to 1774, the output of the thirteen colonies increased 12-fold, giving the colonies an economy about 30% the size of Britain's at the time of independence.

Population growth was responsible for over three-quarters of the economic growth of the British American colonies. The free white population had the highest standard of living in the world. There was very little change in productivity and little in the way of introduction of new goods and services. Under the mercantilist system, Britain put restrictions on the products that could be made in the colonies and put restrictions on trade outside the British Empire. The colonial economy differed significantly from that of most other regions in that land and natural resources were abundant in America but labor was scarce.

Demographics

Initial colonization of North America was extremely difficult and most settlers before 1625 died in their first year. Settlers had to depend on what they could hunt and gather, what they brought with them, and uncertain shipments of food, tools, and supplies until they could build shelters and forts, clear land, and grow enough food, as well as build gristmills, sawmills, ironworks, and blacksmith shops to be self-supporting. They also had to defend themselves against raids from Native Americans. After 1629 population growth was very rapid due to high birth rates (8 children per family versus 4 in Europe) and lower death rates than in Europe, in addition to immigration. The long life expectancy of the colonists was due to the abundant supplies of food and firewood and the low population density that limited the spread of infectious diseases. The death rate from diseases, especially malaria, was higher in the warm, humid Southern Colonies than in the cold New England Colonies.

The higher birth rate was due to better employment opportunities. Many young adults in Europe delayed marriage for financial reasons, and many servants in Europe were not permitted to marry. The population of white settlers grew from an estimated 40,000 in 1650 to 235,000 in 1700. In 1690, there were an estimated 13,000 black slaves. The population grew at an annual rate of over 3% throughout the 18th century, doubling every 25 years or less. By 1775 the population had grown to 2.6 million, of which 2.1 million were white, 540,000 black and 50,000 Native American, giving the colonies about one-third of the population of Britain. The three most populated colonies in 1775 were Virginia, with a 21% share, and Pennsylvania and Massachusetts with 11% each.
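
As a rough check on the doubling claim, a growth rate of about 3 percent per year does imply a doubling time of under 25 years. A minimal sketch in Python, using only the 3 percent figure quoted above and the standard compound-growth doubling-time formula:

    import math

    def doubling_time(annual_growth_rate):
        """Years for a population to double at a constant annual growth rate."""
        return math.log(2) / math.log(1 + annual_growth_rate)

    print(round(doubling_time(0.03), 1))  # about 23.4 years, consistent with "25 years or less"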

The economy

The colonial economy of what would become the United States was pre-industrial, primarily characterized by subsistence farming. Farm households also were engaged in handicraft production, mostly for home consumption, but with some goods sold.

The market economy was based on extracting and processing natural resources and agricultural products for local consumption, such as mining, gristmills and sawmills, and the export of agricultural products. The most important agricultural exports were raw and processed food grains (wheat, Indian corn, rice, bread and flour) and tobacco. Tobacco was a major crop in the Chesapeake Colonies and rice a major crop in South Carolina. Dried and salted fish was also a significant export. North Carolina was the leading producer of naval stores, which included turpentine (used for lamps), rosin (candles and soap), tar (rope and wood preservative) and pitch (ships' hulls). Another export was potash, which was derived from hardwood ashes and was used as a fertilizer and for making soap and glass.

The colonies depended on Britain for many finished goods, partly because laws such as the Navigation Acts of 1660 prohibited making many types of finished goods in the colonies. These laws achieved the intended purpose of creating a trade surplus for Britain. The colonial balance of trade in goods heavily favored Britain; however, American shippers offset roughly half of the goods trade deficit with revenues earned by shipping between ports within the British Empire.

The largest non-agricultural segment was shipbuilding, which accounted for 5 to 20% of total employment. About 45% of American-built ships were sold to foreigners.

Exports and related services accounted for about one-sixth of income in the decade before the revolution. Just before the revolution, tobacco was about a quarter of the value of exports. Also at the time of the revolution the colonies produced about 15% of world iron, although the value of exported iron was small compared to grains and tobacco. The American iron ore deposits mined at that time were not large and were not all of high quality; however, the huge forests provided adequate wood for making charcoal. Wood in Britain was becoming scarce and coke was beginning to be substituted for charcoal; however, coke made inferior iron. Britain encouraged colonial production of pig and bar iron, but banned construction of new colonial iron fabrication shops in 1750; however, the ban was mostly ignored by the colonists.

Settlement was sparse during the colonial period and transportation was severely limited by the lack of improved roads. Towns were located on or near the coasts or navigable inland waterways. Even on improved roads, which were rare during the colonial period, wagon transport was very expensive. The economical distance for transporting low-value agricultural commodities to navigable waterways varied, but was limited to something on the order of 25 miles or less. In the few small cities and among the larger plantations of South Carolina and Virginia, some necessities and virtually all luxuries were imported in return for tobacco, rice, and indigo exports.

By the 18th century, regional patterns of development had become clear: the New England colonies relied on shipbuilding and sailing to generate wealth; plantations (many using slave labor) in Maryland, Virginia, and the Carolinas grew tobacco, rice, and indigo; and the middle colonies of New York, Pennsylvania, New Jersey, and Delaware shipped general crops and furs. Except for slaves, standards of living were even higher than in England itself.

New England

The New England region's economy grew steadily over the entire colonial era, despite the lack of a staple crop that could be exported. All the provinces, and many towns as well, tried to foster economic growth by subsidizing projects that improved the infrastructure, such as roads, bridges, inns and ferries. They gave bounties and subsidies or monopolies to sawmills, grist mills, iron mills, fulling mills (which treated cloth), salt works and glassworks. Most importantly, colonial legislatures set up a legal system that was conducive to business enterprise by resolving disputes, enforcing contracts, and protecting property rights. Hard work and entrepreneurship characterized the region, as the Puritans and Yankees endorsed the "Protestant Ethic", which enjoined men to work hard as part of their divine calling.

The benefits of growth were widely distributed in New England, reaching from merchants to farmers to hired laborers. The rapidly growing population led to shortages of good farm land on which young families could establish themselves; one result was to delay marriage, and another was to move to new lands farther west. In the towns and cities, there was strong entrepreneurship, and a steady increase in the specialization of labor. Wages for men went up steadily before 1775; new occupations were opening for women, including weaving, teaching, and tailoring. The region bordered New France, and in the numerous wars the British poured money in to purchase supplies, build roads and pay colonial soldiers. The coastal ports began to specialize in fishing, international trade and shipbuilding—and after 1780 in whaling. Combined with growing urban markets for farm products, these factors allowed the economy to flourish despite the lack of technological innovation.

The Connecticut economy began with subsistence farming in the 17th century, and developed with greater diversity and an increased focus on production for distant markets, especially the British colonies in the Caribbean. The American Revolution cut off imports from Britain, and stimulated a manufacturing sector that made heavy use of the entrepreneurship and mechanical skills of the people. In the second half of the 18th century, difficulties arose from the shortage of good farmland, periodic money problems, and downward price pressures in the export market. The colonial government from time to time attempted to promote various commodities such as hemp, potash, and lumber as export items to bolster its economy and improve its balance of trade with Great Britain.

Urban centers

Historian Carl Bridenbaugh examined in depth five key cities: Boston (population 16,000 in 1760), Newport, Rhode Island (population 7,500), New York City (population 18,000), Philadelphia (population 23,000), and Charles Town (Charleston, South Carolina) (population 8,000). He argues they grew from small villages to take major leadership roles in promoting trade, land speculation, immigration, and prosperity, and in disseminating the ideas of the Enlightenment, and new methods in medicine and technology. Furthermore, they sponsored a consumer taste for English amenities, developed a distinctly American educational system, and began systems for care of people in need.

On the eve of the Revolution, 95 percent of the American population lived outside the cities—much to the frustration of the British, who captured the cities with their Royal Navy, but lacked the manpower to occupy and subdue the countryside. In explaining the importance of the cities in shaping the American Revolution, Benjamin Carp compares the important role of waterfront workers, taverns, churches, kinship networks, and local politics. Historian Gary B. Nash emphasizes the role of the working class, and their distrust of their social superiors in northern ports. He argues that working class artisans and skilled craftsmen made up a radical element in Philadelphia that took control of the city starting about 1770 and promoted a radical Democratic form of government during the revolution. They held power for a while, and used their control of the local militia to disseminate their ideology to the working class, and to stay in power until the businessmen staged a conservative counterrevolution.

Political environment

Mercantilism: old and new

The colonial economies of the world operated under the economic philosophy of mercantilism, a policy by which countries attempted to run a trade surplus, with their own colonies or other countries, to accumulate gold reserves. Colonies were used as suppliers of raw materials and as markets for manufactured goods while being prohibited from engaging in most types of manufacturing. The colonial powers of England, France, Spain and the Dutch Republic tried to protect their investments in colonial ventures by limiting trade between each other's colonies.

The Spanish Empire clung to old style mercantilism, primarily concerned with enriching the Spanish government by accumulating gold and silver, mainly from mines in their colonies. The Dutch and particularly the British approach was more conducive to private business.

The Navigation Acts, passed by the British Parliament between 1651 and 1673, affected the British American colonies.

Important features of the Navigation Acts included:

  • Foreign vessels were excluded from carrying trade between ports within the British Empire
  • Manufactured goods from Europe to the colonies had to pass through England
  • Enumerated items, which included furs, ship masts, rice, indigo and tobacco, were only allowed to be exported to Great Britain.

Although the Navigation Acts were enforced, they had a negligible effect on commerce and profitability of trade. In 1770 illegal exports and smuggling to the West Indies and Europe were about equal to exports to Britain.

On the eve of independence Britain was in the early stage of the Industrial Revolution, with cottage industries and workshops providing finished goods for export to the colonies. At that time, half of the wrought iron, beaver hats, cordage, nails, linen, silk, and printed cotton produced in Britain were consumed by the British American colonies.

Free enterprise

The domestic economy of the British American colonies enjoyed a great deal of freedom, although some of their freedom was due to lack of enforcement of British regulations on commerce and industry. Adam Smith used the colonies as an example of the benefits of free enterprise. Colonists paid minimal taxes.

Some colonies, such as Virginia, were founded principally as business ventures. England's success at establishing settlements on the North American coastline was due in large part to its use of charter companies. Charter companies were groups of stockholders (usually merchants and wealthy landowners) who sought personal economic gain and, perhaps, wanted also to advance England's national goals. While the private sector financed the companies, the king also provided each project with a charter or grant conferring economic rights as well as political and judicial authority. The colonies did not show profits, however, and the disappointed English investors often turned over their colonial charters to the settlers. The political implications, although not realized at the time, were enormous. The colonists were left to build their own governments and their own economy.

Taxation

The colonial governments had few expenses and taxes were minimal.

Although the colonies provided an export market for finished goods made in Britain or sourced by British merchants and shipped from Britain, the British incurred the expenses of providing protection against piracy by the British Navy and other military expenses. An early tax was the Molasses Act of 1733.

In the 1760s the London government raised small sums by new taxes on the colonies. This occasioned an enormous uproar, from which historians date the origins of the American Revolution. The issue was not the amount of the taxes—they were quite small—but rather the constitutional authority of Parliament versus the colonial assemblies to vote taxes. New taxes included the Sugar Act of 1764, the Stamp Act of 1765 and taxes on tea and other colonial imports. Historians have debated back and forth about the cost imposed by the Navigation Acts, which were less visible and rarely complained about. However, by 1995, the consensus view among economic historians and economists was that the "costs imposed on [American] colonists by the trade restrictions of the Navigation Acts were small."

The American Revolution

Americans in the Thirteen Colonies demanded their rights as Englishmen, as they saw it, to select their own representatives to govern and tax themselves – which Britain refused. The Americans attempted resistance through boycotts of British manufactured items, but the British responded with a rejection of American rights and the Intolerable Acts of 1774. In turn, the Americans launched the American Revolution, resulting in an all-out war against the British and independence for the new United States of America. The British tried to weaken the American economy with a blockade of all ports, but with 90% of the people in farming, and only 10% in cities, the American economy proved resilient and able to support a sustained war, which lasted from 1775 to 1783.

Revolutionary era cartoon showing US sawing off the horn of a cow (symbolizing a break from British commerce) with a distressed Englishman watching as other European powers wait to collect milk. The cartoon represents the commercial status of the US during the Revolution.

The American Revolution (1775–1783) brought a dedication to unalienable rights to "life, liberty, and the pursuit of happiness", which emphasize individual liberty and economic entrepreneurship, and simultaneously a commitment to the political values of liberalism and republicanism, which emphasize natural rights, equality under the law for all citizens, civic virtue and duty, and promotion of the general welfare.

Britain's war against the Americans, French and Spanish cost about £100 million. The Treasury borrowed 40% of the money it needed and raised the rest through an efficient system of taxation. Heavy spending brought France to the verge of bankruptcy and revolution.

Congress and the American states had no end of difficulty financing the war. In 1775 there was at most 12 million dollars in gold in the colonies, not nearly enough to cover existing transactions, let alone finance a major war. The British government made the situation much worse by imposing a tight blockade on every American port, which cut off almost all imports and exports. One partial solution was to rely on volunteer support from militiamen, and donations from patriotic citizens. Another was to delay actual payments, pay soldiers and suppliers in depreciated currency, and promise it would be made good after the war. Indeed, in 1783 the soldiers and officers were given land grants to cover the wages they had earned but had not been paid during the war. Not until 1781, when Robert Morris was named Superintendent of Finance of the United States, did the national government have a strong leader in financial matters. Morris used a French loan in 1782 to set up the private Bank of North America to finance the war. Seeking greater efficiency, Morris reduced the civil list, saved money by using competitive bidding for contracts, tightened accounting procedures, and demanded the federal government's full share of money and supplies from the states.

A one-dollar note issued by the Second Continental Congress in 1775, with the inscription: "ONE DOLLAR. THIS Bill entitles the BEARER to receive ONE SPANISH MILLED DOLLAR, or the Value thereof in Gold or Silver, according to a Resolution of CONGRESS, passed at Philadelphia November 29, 1775." Within border cuts: "Continental Currency" and "The United Colonies". Within circle: "DEPRESSA RESURGIT". Verso: "ONE DOLLAR. PHILADELPHIA: Printed by HALL and SELLERS. 1775."
A one-dollar note issued by the Second Continental Congress in 1775

The Second Continental Congress used four main methods to cover the cost of the war, which cost about 66 million dollars in specie (gold and silver). Congress made two issues of paper money, in 1775–1780, and in 1780–81. The first issue amounted to 242 million dollars. This paper money would supposedly be redeemed for state taxes, but the holders were eventually paid off in 1791 at the rate of one cent on the dollar. By 1780, the paper money was "not worth a Continental", as people said, and a second issue of new currency was attempted. The second issue quickly became nearly worthless—but it was redeemed by the new federal government in 1791 at 100 cents on the dollar. At the same time the states, especially Virginia and the Carolinas, issued over 200 million dollars of their own currency. In effect, the paper money was a hidden tax on the people, and indeed was the only method of taxation that was possible at the time. The skyrocketing inflation was a hardship on the few people who had fixed incomes—but 90 percent of the people were farmers, and were not directly affected by that inflation. Debtors benefited by paying off their debts with depreciated paper. The greatest burden was borne by the soldiers of the Continental Army, whose wages—usually in arrears—declined in value every month, weakening their morale and adding to the hardships suffered by their families.

Starting in 1776, the Congress sought to raise money by loans from wealthy individuals, promising to redeem the bonds after the war. The bonds were in fact redeemed in 1791 at face value, but the scheme raised little money because Americans had little specie, and many of the rich merchants were supporters of the Crown. Starting in 1776, the French secretly supplied the Americans with money, gunpowder and munitions in order to weaken its arch enemy, Great Britain. When the Kingdom of France officially entered the war in 1778, the subsidies continued, and the French government, as well as bankers in Paris and Amsterdam loaned large sums to the American war effort. These loans were repaid in full in the 1790s.

Beginning in 1777, Congress repeatedly asked the states to provide money. But the states had no system of taxation either, and were little help. By 1780 Congress was making requisitions for specific supplies of corn, beef, pork and other necessities—an inefficient system that kept the army barely alive.

The cities played a major role in fomenting the American Revolution, but they were hard hit during the war itself, 1775–83. They lost their main role as oceanic ports, because of the blockade by the Royal Navy. Furthermore, the British occupied the cities, especially New York 1776–83, and the others for briefer periods. During the occupations they were cut off from their hinterland trade and from overland communication. When the British finally departed in 1783, they took out large numbers of wealthy merchants who resumed their business activities elsewhere in the British Empire.

Confederation: 1781–1789

A brief economic recession followed the war, but prosperity returned by 1786. About 60,000 to 80,000 American Loyalists left the U.S. for elsewhere in the British Empire, especially Canada. They took their slaves but left lands and properties behind. Some returned in the mid-1780s, especially to more welcoming states like New York and South Carolina. Economically, the mid-Atlantic states recovered particularly quickly and began manufacturing and processing goods, while New England and the South experienced more uneven recoveries. Trade with Britain resumed, and the volume of British imports after the war matched the volume from before the war, but exports fell precipitously.

John Adams, serving as the minister to Britain, called for a retaliatory tariff in order to force the British to negotiate a commercial treaty, particularly regarding access to Caribbean markets. However, Congress lacked the power to regulate foreign commerce or compel the states to follow a unified trade policy, and Britain proved unwilling to negotiate. While trade with the British did not fully recover, the U.S. expanded trade with France, the Netherlands, Portugal, and other European countries. Despite these good economic conditions, many traders complained of the high duties imposed by each state, which served to restrain interstate trade. Many creditors also suffered from the failure of domestic governments to repay debts incurred during the war. Though the 1780s saw moderate economic growth, many experienced economic anxiety, and Congress received much of the blame for failing to foster a stronger economy. On the positive side, the states gave Congress control of the western lands and an effective system for population expansion was developed. The Northwest Ordinance of 1787 abolished slavery in the area north of the Ohio River and promised statehood when a territory reached a threshold population, as Ohio did in 1803.

The new nation

Chart 1: trends in economic growth, 1700–1850

The Constitution of the United States, adopted in 1787, established that the entire nation was a unified, or common, market, with no internal tariffs or taxes on interstate commerce. The extent of federal power was much debated, with Alexander Hamilton taking a very broad view as the first Secretary of the Treasury during the presidential administration of George Washington. Hamilton successfully argued for the concept of "implied powers", whereby the federal government was authorized by the Constitution to create anything necessary to support its contents, even if not specifically noted in it (building lighthouses, etc.). He succeeded in building strong national credit based on taking over the state debts and bundling them with the old national debt into new securities sold to the wealthy. They in turn now had an interest in keeping the new government solvent. Hamilton funded the debt with tariffs on imported goods and a highly controversial tax on whiskey. Hamilton believed the United States should pursue economic growth through diversified shipping, manufacturing, and banking. He sought and achieved Congressional authority to create the First Bank of the United States in 1791; the charter lasted until 1811.

After the war, the older cities finally restored their economic basis; newer growing cities included Salem, Massachusetts (which opened a new trade with China), New London, Connecticut, and Baltimore, Maryland. Secretary Hamilton set up a national bank in 1791. New local banks began to flourish in all the cities. Merchant entrepreneurship flourished and was a powerful engine of prosperity in the cities.

World peace lasted only a decade, for in 1793 two decades of war between Britain and France and their allies broke out. As the leading neutral trading partner, the United States did business with both sides. France resented it, and the Quasi-War of 1798–99 disrupted trade. Outraged at British impositions on American merchant ships and sailors, the Jefferson and Madison administrations engaged in economic warfare with Britain from 1807 to 1812, and then full-scale warfare from 1812 to 1815. The war cut off imports and encouraged the rise of American manufacturing.

Industry and commerce

Transportation

There were very few roads outside of cities and no canals in the new nation. In 1792 it was reported that the cost of transporting many crops to a seaport was from one-fifth to one-half of their value. The cheapest form of transportation was by water, along the seacoast or on lakes and rivers. In 1816 it was reported that "A ton of goods could be brought 3000 miles from Europe for about $9, but for that same sum it could be moved only 30 miles in this country".
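
Those 1816 figures imply a gap of roughly two orders of magnitude in cost per ton-mile between ocean and overland carriage. A minimal illustrative calculation in Python, using only the numbers quoted above:

    # $9 buys 3,000 miles of ocean carriage but only 30 miles of wagon haulage for a ton of goods
    ocean_cents_per_ton_mile = 900 / 3000   # = 0.3 cents per ton-mile
    wagon_cents_per_ton_mile = 900 / 30     # = 30 cents per ton-mile
    print(wagon_cents_per_ton_mile / ocean_cents_per_ton_mile)  # 100.0, i.e. a hundredfold difference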

Automatic flour mill

In the mid 1780s Oliver Evans invented a fully automatic mill that could process grain with practically no human labor or operator attention. This was a revolutionary development in two ways: 1) it used bucket elevators and conveyor belts, which would eventually revolutionize materials handling, and 2) it used governors, a forerunner of modern automation, for control.

Cotton gin

"The First Cotton Gin" conjectural image from 1869

Cotton was at first a small-scale crop in the South. Cotton farming boomed following the improvement of the cotton gin by Eli Whitney. It was about 50 times more productive at removing the seeds than doing so by hand. Soon, large cotton plantations, based on slave labor, expanded in the richest lands from the Carolinas westward to Texas. The raw cotton was shipped to textile mills in Britain, France and New England.

Mechanized textile manufacturing

Samuel Slater (1768–1835)

In the final decade of the 18th century, England was beginning to enter the rapid growth period of the Industrial Revolution, but the rest of the world was largely devoid of large-scale mechanized industry. Britain prohibited the export of textile machinery and designs and did not allow mechanics with such skills to emigrate. Samuel Slater, who had worked as a mechanic at a cotton spinning operation in England, memorized the design of the machinery. Having heard that there was a demand for his knowledge in the U.S., he disguised himself as a laborer and emigrated. In 1789 Slater began working as a consultant to Almy & Brown in Rhode Island, who were trying to spin cotton on some equipment they had recently purchased. Slater determined that the machinery was not capable of producing good quality yarn and persuaded the owners to let him design new machinery. Slater found no mechanics in the U.S. when he arrived and had great difficulty finding someone to build the machinery. Eventually he located Oziel Wilkinson and his son David to produce iron castings and forgings for the machinery. According to David Wilkinson: "all the turning of the iron for the cotton machinery built by Mr. Slater was done with hand chisels or tools in lathes turned by cranks with hand power". By 1791 Slater had some of the equipment operating. In 1793 Slater and Brown opened a factory in Pawtucket, Rhode Island, which was the first successful water-powered roller spinning cotton factory in the U.S. (see Slater Mill Historic Site). David Wilkinson went on to invent a metalworking lathe, which won him a Congressional prize.

Finance, money and banking

The First Bank of the United States was chartered in 1791. It was designed by Alexander Hamilton and faced strenuous opposition from agrarians led by Thomas Jefferson, who deeply distrusted banks and urban institutions. They closed the Bank in 1811, just when the War of 1812 made it more important than ever for Treasury needs.

Early 19th century

The United States was pre-industrial throughout the first third of the 19th century. Most people lived on farms and produced much of what they consumed. A considerable percentage of the non-farm population was engaged in handling goods for export. The country was an exporter of agricultural products. The U.S. built the best ships in the world.

The textile industry became established in New England, where there was abundant water power. Steam power began being used in factories, but water was the dominant source of industrial power until the Civil War.

The building of roads and canals, the introduction of steamboats and the first railroads were the beginning of a transportation revolution that would accelerate throughout the century.

Political developments

Tariff Rates (France, UK, US)
 
Average Tariff Rates in USA (1821–2016)

The institutional arrangements of the American System were initially formulated by Alexander Hamilton, who proposed the creation of a government-sponsored bank and increased tariffs to encourage industrial development. Following Hamilton's death, the American school of political economy was championed in the antebellum period by Henry Clay and the Whig Party generally.

Specific government programs and policies which gave shape and form to the American School and the American System include the establishment of the Patent Office in 1802; the creation of the Coast and Geodetic Survey in 1807 and other measures to improve river and harbor navigation; the various Army expeditions to the west, beginning with the Lewis and Clark Expedition in 1804 and continuing into the 1870s, almost always under the direction of an officer from the Army Corps of Topographical Engineers, and which provided crucial information for the overland pioneers that followed; the assignment of Army Engineer officers to assist or direct the surveying and construction of the early railroads and canals; the establishment of the First Bank of the United States and Second Bank of the United States as well as various protectionist measures (e.g., the tariff of 1828).

Thomas Jefferson and James Madison opposed a strong central government (and, consequently, most of Hamilton's economic policies), but they could not stop Hamilton, who wielded immense power and political clout in the Washington administration. In 1801, however, Jefferson became president and turned to promoting a more decentralized, agrarian democracy called Jeffersonian democracy. (He based his philosophy on protecting the common man from political and economic tyranny. He particularly praised small farmers as "the most valuable citizens".) However, Jefferson did not change Hamilton's basic policies. As president in 1811 Madison let the bank charter expire, but the War of 1812 proved the need for a national bank and Madison reversed positions. The Second Bank of the United States was established in 1816, with a 20-year charter.

Thomas Jefferson was able to purchase the Louisiana Territory from Napoleon in 1803 for $15 million, with money raised in England. The Louisiana Purchase greatly expanded the size of the United States, adding extremely good farmland, the Mississippi River and the city of New Orleans. The French Revolutionary and Napoleonic Wars from 1793 to 1814 caused the withdrawal of most foreign shipping from the U.S., leaving trade in the Caribbean and Latin America at risk for the seizure of American merchant ships by France and Britain. This led to Jefferson's Embargo Act of 1807, which prohibited most foreign trade. The War of 1812, by cutting off almost all foreign trade, created a home market for goods made in the U.S. (even if they were more expensive), changing an early tendency toward free trade into a protectionism characterized by nationalism and protective tariffs.

States built roads and waterways, such as the Cumberland Pike (1818) and the Erie Canal (1825), opening up markets for western farm products. The Whig Party supported Clay's American System, which proposed to build internal improvements (e.g. roads, canals and harbors), protect industry, and create a strong national bank. The Whig legislation program was blocked at the national level by the Jacksonian Democrats, but similar modernization programs were enacted in most states on a bipartisan basis.

The role of the Federal Government in regulating interstate commerce was firmly established by the landmark Supreme Court ruling in Gibbons v. Ogden, which decided against allowing states to grant exclusive rights to steamboat companies operating between states.

President Andrew Jackson (1829–1837), leader of the new Democratic Party, opposed the Second Bank of the United States, which he believed favored the entrenched interests of the rich. When he was elected for a second term, Jackson blocked the renewal of the bank's charter. Jackson opposed paper money and demanded the government be paid in gold and silver coins. The Panic of 1837 stopped business growth for three years.

Agriculture, commerce and industry

Population growth

Although there was relatively little immigration from Europe, the rapid expansion of settlements to the West and the Louisiana Purchase of 1803 opened up vast frontier lands. The high birth rate and the availability of cheap land caused the rapid expansion of population. The average age was under 20, with children everywhere. The population grew from 5.3 million people in 1800, living on 865,000 square miles of land, to 9.6 million in 1820, on 1,749,000 square miles. By 1840, the population had reached 17,069,000 within the same area.

New Orleans and St. Louis joined the United States and grew rapidly; entirely new cities were begun at Pittsburgh, Marietta, Cincinnati, Louisville, Lexington, Nashville and points west. The coming of the steamboat after 1810 made upstream traffic economical on major rivers, especially the Hudson, Ohio, Mississippi, Illinois, Missouri, Tennessee, and Cumberland rivers. Historian Richard Wade has emphasized the importance of the new cities in the Westward expansion in settlement of the farmlands. They were the transportation centers, and nodes for migration and financing of the westward expansion. The newly opened regions had few roads, but a very good river system in which everything flowed downstream to New Orleans. With the coming of the steamboat after 1815, it became possible to move merchandise imported from the Northeast and from Europe upstream to new settlements. The opening of the Erie Canal made Buffalo the jumping off point for the lake transportation system that made major trading centers in Cleveland, Detroit, and especially Chicago.

Labor shortage

The U.S. economy of the early 19th century was characterized by labor shortages, attributed to the cheapness of land and the high returns on agriculture. All types of labor were in high demand, especially unskilled labor and experienced factory workers. Wages in the U.S. were typically between 30 and 50 percent higher than in Britain. Women factory workers were especially scarce. The elasticity of the labor supply was low, in part because of lack of transportation and low population density. The relative scarcity and high price of labor were an incentive for capital investment, particularly in machinery.

Agriculture

The U.S. economy was primarily agricultural in the early 19th century. Westward expansion plus the building of canals and the introduction of steamboats opened up new areas for agriculture. Much land was cleared and put into growing cotton in the Mississippi valley and in Alabama, and new grain growing areas were brought into production in the Midwest. Eventually this put severe downward pressure on prices, particularly of cotton, first from 1820 to 1823 and again from 1840 to 1843.

Before the Industrial Revolution most cotton was spun and woven near where it was grown, leaving little raw cotton for the international marketplace. World cotton demand experienced strong growth due to mechanized spinning and weaving technologies of the Industrial Revolution. Although cotton was grown in India, China, Egypt, the Middle East and other tropical and subtropical areas, the Americas, particularly the U.S., had sufficient suitable land available to support large scale cotton plantations, which were highly profitable. A strain of cotton seed brought from New Spain to Natchez, Mississippi, in 1806 would become the parent genetic material for over 90% of world cotton production today; it produced bolls that were three to four times faster to pick. The cotton trade, excluding financing, transport and marketing, was 6 percent or less of national income in the 1830s. Cotton became the United States' largest export.

Sugarcane was being grown in Louisiana, where it was refined into granular sugar. Growing and refining sugar required a large amount of capital. Some of the nation's wealthiest people owned sugar plantations, which often had their own sugar mills.

Southern plantations, which grew cotton, sugarcane and tobacco, used African slave labor. Per capita food production did not keep pace with the rapidly expanding urban population and industrial labor force in the Antebellum decades.

Roads

Construction of the first macadamized road in the United States (1823). In the foreground, workers are breaking stones "so as not to exceed 6 ounces [170 g] in weight or to pass a two-inch [5 cm] ring".

There were only a few roads outside of cities at the beginning of the 19th century, but turnpikes were being built. A ton-mile by wagon cost between 30 and 70 cents in 1819. Robert Fulton's estimate for typical wagonage was 32 cents per ton-mile. The cost of transporting wheat or corn to Philadelphia exceeded its value at 218 and 135 miles, respectively. To facilitate westward expansion, in 1801 Thomas Jefferson began work on the Natchez Trace, which was to connect Daniel Boone's Wilderness Road, which ended in Nashville, Tennessee, with the Mississippi River.
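
Combining Fulton's 32-cents-per-ton-mile estimate with the break-even distances above gives rough implied values per ton for the two grains. The sketch below is an illustrative back-calculation from the quoted figures, not data reported in the sources:

    wagon_rate_dollars_per_ton_mile = 0.32          # Fulton's estimate for typical wagonage
    breakeven_miles = {"wheat": 218, "corn": 135}   # distance at which hauling cost equals crop value
    for crop, miles in breakeven_miles.items():
        implied_value_per_ton = wagon_rate_dollars_per_ton_mile * miles
        print(crop, round(implied_value_per_ton))   # wheat ~ $70/ton, corn ~ $43/ton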

Following the Louisiana Purchase, the need for additional roads to the West was recognized by Thomas Jefferson, who authorized the construction of the Cumberland Road in 1806. The Cumberland Road was to connect Cumberland, Maryland, on the Potomac River with Wheeling, (West) Virginia, on the Ohio River, on the other side of the Allegheny Mountains. Mail roads were also built to New Orleans.

The building of roads in the early years of the 19th century greatly lowered transportation costs and was a factor in the deflation of 1819 to 1821, which was one of the most severe in U.S. history.

Some turnpikes were wooden plank roads, which typically cost about $1,500 to $1,800 per mile, but wore out quickly. Macadam roads in New York cost an average of $3,500 per mile, while high-quality roads cost between $5,000 and $10,000 per mile.

Canals

Scene of Lockport on the Erie Canal (W. H. Bartlett 1839)

Because a horse could pull a barge carrying a cargo of over 50 tons, compared to the typical one ton or less hauled by wagon, and a barge needed only a couple of men while each wagon required its own wagoner, water transportation costs were a small fraction of wagonage costs. Canal shipping costs were between two and three cents per ton-mile, compared to 17–20 cents by wagon. The cost of constructing a typical canal was between $20,000 and $30,000 per mile.

Only 100 miles of canals had been built in the U.S. by 1816, and only a few were longer than two miles. The early canals were typically financially successful, such as those carrying coal in the Coal Region of Northeastern Pennsylvania, where canal construction was concentrated until 1820.

The 325-mile Erie Canal, which connected Albany, New York, on the Hudson River with Buffalo, New York, on Lake Erie, began operation in 1825. The cost of moving freight by wagon from Buffalo to New York City in 1817 was 19.2 cents per ton-mile; by the Erie Canal, c. 1857 to 1860, the cost was 0.81 cents per ton-mile. The Erie Canal was a great commercial success and had a large regional economic impact.
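
Taken at face value, those two rates imply a reduction of well over 90 percent in the cost of moving a ton of freight between Buffalo and New York. A rough illustration in Python, using the canal's 325-mile length as an approximation of the haul (the actual wagon route differed):

    distance_miles = 325          # canal length, used here as an approximate haul distance
    wagon_rate_cents = 19.2       # per ton-mile, 1817
    canal_rate_cents = 0.81       # per ton-mile, c. 1857-1860
    wagon_cost_dollars = wagon_rate_cents * distance_miles / 100   # about $62 per ton
    canal_cost_dollars = canal_rate_cents * distance_miles / 100   # about $2.63 per ton
    print(round(wagon_cost_dollars, 2), round(canal_cost_dollars, 2),
          round(wagon_cost_dollars / canal_cost_dollars, 1))       # roughly a 24-fold reduction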

The Delaware and Raritan Canal was also very successful. Also important was the 2.5-mile canal bypassing the falls of the Ohio River at Louisville, which opened in 1830.

The success of some of the early canals led to a canal building boom, during which work began on many canals which would prove to be financially unsuccessful. As the canal boom was underway in the late 1820s, a small number of horse railways were being built. These were quickly followed by the first steam railways in the 1830s.

Steam power

In 1780 the United States had three major steam engines, all of which were used for pumping water: two in mines and one for New York City's water supply. Most power in the U.S. was supplied by water wheels and, after their introduction around 1840, by water turbines. By 1807, when the North River Steamboat (unofficially called the Clermont) first sailed, there were estimated to be fewer than a dozen steam engines operating in the U.S. Steam power did not overtake water power until sometime after 1850.

Oliver Evans began developing a high pressure steam engine that was more practical than the engine developed around the same time by Richard Trevithick in England. The high pressure engine did away with the separate condenser and thus did not require cooling water. It also had a higher power to weight ratio, making it suitable for powering steamboats and locomotives.

Evans produced a few custom steam engines from 1801 to 1806, when he opened the Mars Works iron foundry and factory in Philadelphia, where he produced additional engines. In 1812 he produced a successful Columbian engine at Mars Works. As his business grew and orders increased, Evans and a partner formed the Pittsburgh Steam Engine Company in Pittsburgh, Pennsylvania. Steam engines soon became common in public water supply, sawmills and flour milling, especially in areas with little or no water power.

Mechanical power transmission

In 1828 Paul Moody substituted leather belting for gearing in mills. Leather belting from line shafts was the common way to distribute power from steam engines and water turbines in mills and factories. In the factory boom of the late 19th century it was common for large factories to have many miles of line shafts. Leather belting continued in use until it was displaced by unit drive electric motors in the early decades of the 20th century.

Shipbuilding

Shipbuilding remained a sizable industry. U.S.-built ships were superior in design, required smaller crews and cost between 40 and 60 percent less to build than European ships. The British gained the lead in shipbuilding after they introduced iron-hulled ships in the mid-19th century.

Steamboats and steam ships

Commercial steamboat operations began in 1807 within weeks of the launch of Robert Fulton's North River Steamboat, often referred to as the Clermont.

The first steamboats were powered by Boulton and Watt type low pressure engines, which were very large and heavy in relation to the smaller high pressure engines. In 1807 Robert L. Stevens began operation of the Phoenix, which used a high pressure engine in combination with a low pressure condensing engine. The first steamboats powered only by high pressure were the Aetna and Pennsylvania designed and built by Oliver Evans.

In the winter of 1811 to 1812, the New Orleans became the first steamboat to travel down the Ohio and Mississippi Rivers from Pittsburgh to New Orleans. The commercial feasibility of steamboats on the Mississippi and its tributaries was demonstrated by the Enterprise in 1814.

By the time of Fulton's death in 1815 he operated 21 of the estimated 30 steamboats in the U.S. The number of steamboats steadily grew into the hundreds. There were more steamboats in the Mississippi valley than anywhere else in the world.

Early steamboats took 30 days to travel from New Orleans to Louisville, between one-quarter and one-half the time taken by keelboat. Due to improvements in steamboat technology, by 1830 the time from New Orleans to Louisville had been halved. In 1820 freight rates by keelboat were five cents per ton-mile versus two cents by steamboat, falling to one-half cent per ton-mile by 1830.

The SS Savannah crossed from Savannah to Liverpool in 1819 as the first trans-Atlantic steamship; however, until the development of more efficient engines, trans-ocean ships had to carry more coal than freight. Early trans-ocean steamships were used for passengers and soon some companies began offering regularly scheduled service.

Railroads

Railroads were an English invention, and the first entrepreneurs imported British equipment in the 1830s. By the 1850s the Americans had developed their own technology. The early lines in the 1830s and 1840s were locally funded, and connected nearby cities or connected farms to navigable waterways. They primarily handled freight rather than passengers. The first locomotives were imported from England. One such locomotive was the John Bull, which arrived in 1831. While it was awaiting assembly, Matthias W. Baldwin, who had designed and manufactured a highly successful stationary steam engine, was able to inspect the parts and obtain measurements. Baldwin was already working on an experimental locomotive based on designs shown at the Rainhill Trials in England. Baldwin produced his first locomotive in 1832; he went on to found the Baldwin Locomotive Works, one of the largest locomotive manufacturers. In 1833, when there were still few locomotives in the U.S., three-quarters of them had been made in England. In 1838 there were 346 locomotives, three-fourths of which were made in the U.S.

Ohio had more railroads built in the 1840s than any other state. Ohio's railroads put the canals out of business. A typical mile of railroad cost $30,000 compared to the $20,000 per mile of canal, but a railroad could carry 50 times as much traffic. Railroads appeared at the time of the canal boom, causing its abrupt end, although some canals flourished for an additional half-century.
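
Put another way, the construction-cost and capacity figures just quoted imply that a railroad delivered carrying capacity far more cheaply than a canal. The calculation below is an illustrative sketch from those numbers only, ignoring operating costs and quality of construction:

    rail_cost_per_mile = 30_000      # dollars, typical figure quoted above
    canal_cost_per_mile = 20_000     # dollars
    rail_capacity_multiple = 50      # traffic a railroad could carry relative to a canal
    # construction cost per "canal-equivalent" of carrying capacity
    rail_cost_per_capacity = rail_cost_per_mile / rail_capacity_multiple    # $600
    canal_cost_per_capacity = canal_cost_per_mile / 1                       # $20,000
    print(canal_cost_per_capacity / rail_cost_per_capacity)                 # ~33x in favor of rail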

Manufacturing

Starting with textiles in the 1790s, factories were built to supply a regional and national market. The power came from waterfalls, and most of the factories were built alongside the rivers in rural New England and Upstate New York.

Boston Manufacturing Co., Waltham, Massachusetts

Before 1800, most cloth was made in home workshops, and housewives sewed it into clothing for family use or trade with neighbors. In 1810 the secretary of the treasury estimated that two-thirds of rural household clothing, including hosiery and linen, was produced by households. By the 1820s, housewives bought the cloth at local stores, and continued their sewing chores. The American textile industry was established during the long period of wars from 1793 to 1815, when cheap cloth imports from Britain were unavailable. Samuel Slater had secretly brought the designs of complex textile machinery from Britain, carried in his memory since exporting plans was prohibited, and built new factories in Rhode Island using that knowledge. By the time the Embargo Act of 1807 cut off trade with Britain, there were 15 cotton spinning mills in operation. These were all small operations, typically employing fewer than 50 people, and most used Arkwright water frames powered by small streams. They were all located in southeastern New England.

In 1809 the number of mills had grown to 62, with 25 under construction. To meet increased demand for cloth, several manufacturers resorted to the putting-out system of having the handloom weaving done in homes. The putting-out system was inefficient because of the difficulty of distributing the yarn and collecting the cloth, embezzlement of supplies, lack of supervision and poor quality. To overcome these problems the textile manufacturers began to consolidate work in central workshops where they could supervise operations. Taking this to the next level, in 1815 Francis Cabot Lowell of the Boston Manufacturing Company built the first integrated spinning and weaving factory in the world at Waltham, Massachusetts, using plans for a power loom that he smuggled out of England. This was the largest factory in the U.S., with a workforce of about 300. It was a very efficient, highly profitable mill that, with the aid of the Tariff of 1816, competed effectively with British textiles at a time when many smaller operations were being forced out of business.

The Fall River Manufactory, located on the Quequechan River in Fall River, Massachusetts, was founded in 1813 by Dexter Wheeler and his cousin David Anthony. By 1827 there were 10 cotton mills in the Fall River area, which soon became the country's leading producer of printed cotton cloth.

Beginning with Lowell, Massachusetts, in the 1820s, large-scale factory towns, mostly mill towns, began growing around rapidly expanding manufacturing plants, especially in New England, New York, and New Jersey. Some were established as company towns, where either from idealism or economic exploitation, the same corporation that owned the factory also owned all local worker housing and retail establishments. This paternalistic model experienced significant pushback with the Pullman Strike of 1894, and significantly declined with increasing worker affluence during the Roaring Twenties, development of the automobile that allowed workers to live outside of the factory town, and New Deal policies.

The U.S. began exporting textiles in the 1830s; the Americans specialized in coarse fabrics, while the British exported finer cloth that reached a somewhat different market. Cloth production—mostly cotton but also wool, linen and silk—became the leading American industry. The building of textile machinery became a major driving force in the development of advanced mechanical devices.

The shoe industry began transitioning from production by craftsmen to the factory system, with division of labor.

Low freight rates on return voyages from Europe made imports cheap to ship, offering domestic industries little protection from foreign competition.

Development of interchangeable parts

Standardization and interchangeability have been cited as major contributors to the exceptional growth of the U.S. economy.

The idea of standardization of armaments originated in 1765 with the French Gribeauval system. Honoré Blanc began producing muskets with interchangeable locks in France while Thomas Jefferson was minister to France; Jefferson wrote a letter to John Jay about these developments in 1785. The idea of armament standardization was advocated by Louis de Tousard, who fled the French Revolution and in 1795 joined the U.S. Corps of Artillerists and Engineers, where he taught artillery and engineering. At the suggestion of George Washington, Tousard wrote The American Artillerist's Companion (1809). This manual became a standard textbook for officer training; it stressed the importance of a system of standardized armaments.

Fears of war stemming from the XYZ Affair caused the U.S. to begin offering cash advance contracts for producing small arms to private individuals in 1798. Two notable recipients of these contracts associated with interchangeable parts were Eli Whitney and Simeon North. Although Whitney was not able to make interchangeable parts, he was a proponent of using machinery for gun making; however, he employed only the simplest machines in his factory. North eventually made progress toward some degree of interchangeability and developed special machinery. North's shop used the first known milling machine (c. 1816), a fundamental machine tool.

The experience of the War of 1812 led the War Department to issue a request for contract proposals for firearms with interchangeable parts. Previously, parts from each firearm had to be carefully custom fitted; almost all infantry regiments necessarily included an artificer or armorer who could perform this intricate gunsmithing. The requirement for interchangeable parts forced the development of modern metal-working machine tools, including milling machines, grinders, shapers and planers. The Federal Armories perfected the use of machine tools by developing fixtures to correctly position the parts being machined and jigs to guide the cutting tools over the proper path. Systems of blocks and gauges were also developed to check the accuracy and precision of the machined parts. Developing the manufacturing techniques for making interchangeable parts by the Federal Armories took over two decades; however, the first interchangeable small arms parts were not made to a high degree of precision. It wasn't until the mid-century or later that parts for U.S. rifles could be considered truly interchangeable with a degree of precision. In 1853 when the British Parliamentary Committee on Small Arms questioned gun maker Samuel Colt, and machine tool makers James Nasmyth and Joseph Whitworth, there was still some question about what constituted interchangeability and whether it could be achieved at a reasonable cost.

The machinists' skills were called armory practice and the system eventually became known as the American system of manufacturing. Machinists from the armories eventually spread the technology to other industries, such as clocks and watches, especially in the New England area. It wasn't until late in the 19th century that interchangeable parts became widespread in U.S. manufacturing. Among the items using interchangeable parts were some sewing machine brands and bicycles.

The development of these modern machine tools and machining practices made possible the development of modern industry capable of mass production; however, large scale industrial production did not develop in the U.S. until the late 19th century.

Finance, money and banking

The charter for the First Bank of the United States expired in 1811. Its absence caused serious difficulties for the national government in trying to finance the War of 1812, a problem compounded by the refusal of New England bankers to help out.

President James Madison reversed earlier Jeffersonian opposition to banking, and secured the opening of a new national bank. The Second Bank of the United States was chartered in 1816. Its leading executive was Philadelphia banker Nicholas Biddle. It collapsed in 1836, under heavy attack from President Andrew Jackson during his Bank War.

There were three economic downturns in the early 19th century. The first was the result of the Embargo Act of 1807, which shut off most international shipping and trade during the Napoleonic Wars; the embargo caused a depression in cities and industries dependent on European trade. The other two downturns were depressions accompanied by significant periods of deflation. The first and most severe lasted from 1818 to 1821, when the prices of agricultural commodities fell by almost 50 percent from their 1815 high to the 1821 low and did not recover until the late 1830s, and then only to a significantly lower level. A credit contraction caused by a financial crisis in England drained specie out of the U.S., and the Bank of the United States also contracted its lending. Most damaging was the fall in the price of cotton, the U.S.'s main export. Food crop prices, which had been high because of the famine of 1816 caused by the year without a summer, fell after the return of normal harvests in 1818. Improved transportation, mainly from turnpikes, also significantly lowered transportation costs and contributed to the decline in prices.

The third economic downturn was the depression of the late 1830s to 1843, following the Panic of 1837, when the money supply in the United States contracted by about 34 percent with prices falling by 33 percent. The magnitude of this contraction is matched only by the Great Depression. A fundamental cause of the Panic of 1837 was depletion of Mexican silver mines. Despite the deflation and depression, GDP rose 16 percent from 1839 to 1843, partly because of rapid population growth.

In order to dampen speculation in land, Andrew Jackson signed the executive order known as the Specie Circular in 1836, requiring that sales of government land be paid for in gold and silver. Branch mints at New Orleans; Dahlonega, Georgia; and Charlotte, North Carolina, were authorized by Congress in 1835 and became operational in 1838.

Gold was being withdrawn from the U.S. by England, and silver had also been leaving the country because it was undervalued relative to gold by the Coinage Act of 1834. Canal projects began to fail. The result was the financial Panic of 1837. There was a brief recovery in 1838, but the business cycle did not turn up again until 1843.

Economic historians have explored the high degree of financial and economic instability in the Jacksonian era. For the most part, they follow the conclusions of Peter Temin, who absolved Jackson's policies and blamed international events beyond American control, such as conditions in Mexico, China and Britain. A survey of economic historians in 1995 showed that the vast majority concur with Temin's conclusion that "the inflation and financial crisis of the 1830s had their origin in events largely beyond President Jackson's control and would have taken place whether or not he had acted as he did vis-a-vis the Second Bank of the U.S."

Economics of the War of 1812

The War of 1812 was financed by borrowing, by new issues of private bank notes and by a 15% rise in prices. The government was a very poor manager during the war, with delays in payments and confusion, as the Treasury took in money months after it was scheduled to pay it out. Inexperience, indecision, incompetence, partisanship and confusion were the main hallmarks. The federal government's management system had been designed to minimize the federal role before 1812: the Democratic-Republican Party in power deliberately wanted to downsize the power and roles of the federal government, and when the war began, the Federalist opposition worked hard to sabotage operations. Problems multiplied rapidly in 1812, and all the weaknesses were magnified, especially regarding the Army and the Treasury. There were no serious reforms before the war ended.

In financial matters, the decentralizing ideology of the Republicans meant they wanted the First Bank of the United States to expire in 1811, when its 20-year charter ran out. Its absence made it much more difficult to handle the financing of the war, and caused special problems in moving money from state to state, since state banks were not allowed to operate across state lines. The bureaucracy performed poorly, often missing deadlines. On the positive side, over 120 new state banks were created all over the country, and they issued notes that financed much of the war effort, along with loans raised by Washington. Some key Republicans, especially Secretary of the Treasury Albert Gallatin, realized the need for new taxes, but the Republican Congress was very reluctant and raised only small amounts. Throughout the war, the Federalist Party in Congress, the Federalist-controlled state governments in the Northeast and the Federalist-aligned financial system there were strongly opposed to the war and refused to help finance it. Indeed, Federalists facilitated smuggling across the Canadian border and sent large amounts of gold and silver to Canada, which created serious shortages of specie in the U.S.

Across the two and a half years of the war, 1812–1815, the federal government took in more money than it spent: cash income was $154.0 million against outlays of $119.5 million. Two-thirds of the income was borrowed and had to be paid back in later years; the national debt rose from $56.0 million in 1812 to $127.3 million in 1815. Against a GDP (gross domestic product) of about $925 million in 1815, this was not a large burden for a national population of 8 million people, and the debt was paid off in 1835. A new Second Bank of the United States was set up in 1816, and after that the financial system performed very well, even though there was still a shortage of gold and silver.
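As a rough back-of-the-envelope check on these figures (a sketch only, using the numbers quoted above, not additional data), the war debt remained modest relative to the size of the economy:

    # Rough arithmetic from the figures quoted above (all in millions of dollars).
    cash_in, cash_out = 154.0, 119.5          # federal receipts and outlays, 1812-1815
    debt_1812, debt_1815 = 56.0, 127.3        # national debt before and after the war
    gdp_1815 = 925.0                          # approximate GDP in 1815
    borrowed = cash_in * 2 / 3                # "two-thirds of the income was borrowed"
    print(borrowed)                           # ~102.7, roughly $103 million borrowed
    print(debt_1815 - debt_1812)              # ~71.3, the net rise in the national debt
    print(debt_1815 / gdp_1815)               # ~0.14, a debt of about 14% of GDP in 1815

A debt on the order of 14 percent of GDP helps explain why it could be paid off by 1835.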

U.S. per capita GDP 1810–1815 in constant 2009 dollars

The economy grew every year from 1812 to 1815, despite a large loss of business by East Coast shipping interests. Wartime inflation averaged 4.8% a year. After accounting for inflation, the national economy grew at 3.7% a year and per capita GDP at 2.2% a year. Money that would have been spent on imports, mostly cloth, was diverted to opening new factories, which were profitable since British cloth was not available. This gave a major boost to the industrial revolution, as typified by the Boston Associates. The Boston Manufacturing Company built the first integrated spinning and weaving factory in the world at Waltham, Massachusetts, in 1813.

Middle 19th century

The middle 19th century was a period of transition toward industrialization, particularly in the Northeast, which produced cotton textiles and shoes. The population of the West (generally meaning the area from Ohio west through Wisconsin, Minnesota, Iowa and Missouri, and south to include Kentucky) grew rapidly. The West was primarily a grain- and pork-producing region, with an important machine tool industry developing around Cincinnati, Ohio. The Southern economy was based on plantation agriculture, primarily cotton, tobacco and sugar, produced with slave labor.

The market economy and factory system were not typical before 1850, but developed along transportation routes. Steamboats and railroads, introduced in the early part of the century, became widespread and aided westward expansion. The telegraph was introduced in 1844 and was in widespread use by the mid 1850s.

A machine tool industry developed and machinery became a major industry. Sewing machines began to be manufactured. The shoe industry became mechanized. Horse-drawn reapers were widely introduced, significantly increasing the productivity of farming.

The use of steam engines in manufacturing increased and steam power exceeded water power after the Civil War. Coal replaced wood as the major fuel.

The combination of railroads, the telegraph and machinery and factories began to create an industrial economy.

The longest economic expansion of the United States occurred in the recession-free period between 1841 and 1856. A 2017 study attributes this expansion primarily to "a boom in transportation-goods investment following the discovery of gold in California."

Commerce, industry and agriculture

The depression that began in 1839 ended with an upswing in economic activity in 1843.

Table 1: Sector shares

Year    Employment % (Agriculture / Industry / Services)    Output %, 1860 prices (Agriculture / Industry / Services)
1840    68 / 12 / 20                                        47 / 21 / 31
1850    60 / 17 / 23                                        42 / 29 / 29
1860    56 / 19 / 25                                        38 / 28 / 34
1870    53 / 22 / 25                                        35 / 31 / 34
1880    52 / 23 / 25                                        31 / 32 / 38
1890    43 / 26 / 31                                        22 / 41 / 37
1900    40 / 26 / 33                                        20 / 40 / 39
Source: Joel Mokyr

Railroads

Railroads opened up remote areas and drastically cut the cost of moving freight and passengers. By 1860 long distance bulk rates had fallen by 95%, less than half of which was due to the general fall in prices. This large fall in transportation costs created "a major revolution in domestic commerce."

As transportation improved, new markets continuously opened. Railroads greatly increased the importance of hub cities such as Atlanta, Billings, Chicago, and Dallas.

Railroads were a highly capital-intensive business, with a typical cost of $30,000 per mile, though with a considerable range depending on terrain and other factors. Private capital for railroads during the period from 1830 to 1860 was inadequate. States awarded charters, funding, tax breaks and land grants, and provided some financing; railroads were also allowed banking privileges and lotteries in some states. Private investors provided a small but not insignificant share of railroad capital. A combination of domestic and foreign investment, along with the discovery of gold and a major commitment of America's public and private wealth, enabled the nation to develop a large-scale railroad system, establishing the base for the country's industrialization.

Table 2: Railroad Mileage Increase by Groups of States

                                     1850     1860     1870     1880      1890
New England                         2,507    3,660    4,494    5,982     6,831
Middle States                       3,202    6,705   10,964   15,872    21,536
Southern States                     2,036    8,838   11,192   14,778    29,209
Western States and Territories      1,276   11,400   24,587   52,589    62,394
Pacific States and Territories          —       23    1,677    4,080     9,804
TOTAL NEW TRACK USA                 9,021   30,626   52,914   93,301   129,774
Source: Chauncey M. Depew (ed.), One Hundred Years of American Commerce 1795–1895 p. 111
Pennsylvania oil drilling in 1864, early in the history of the petroleum industry in the United States

Railroad executives invented modern methods for running large-scale business operations, creating a blueprint that all large corporations basically followed. They created career tracks that took 18-year-old boys and turned them into brakemen, conductors and engineers. They were first to encounter managerial complexities, labor union issues, and problems of geographical competition. Due to these radical innovations, the railroad became the first large-scale business enterprise and the model for most large corporations.

Historian Larry Haeg argues from the perspective of the end of the 19th century:

Railroads created virtually every major American industry: coal, oil, gas, steel, lumber, farm equipment, grain, cotton, textile factories, California citrus.
1900 panoramic image of the Chicago slaughter houses

Iron industry

The most important technological innovation in mid-19th-century pig iron production was the adoption of hot blast, which was developed and patented in Scotland in 1828. Hot blast is a method of using heat from the blast furnace exhaust gas to preheat combustion air, saving a considerable amount of fuel. It allowed much higher furnace temperatures and increased the capacity of furnaces.

Hot blast allowed blast furnaces to use anthracite or lower-grade coal; anthracite was difficult to ignite with cold blast. In the 19th century, high-quality metallurgical coking coal deposits of sufficient size for iron making were available only in Great Britain and western Germany, but because less fuel was required per unit of iron, it became possible to use lower-grade coal.

The use of anthracite was rather short lived because the size of blast furnaces increased enormously toward the end of the century, forcing the use of coke, which was more porous and did not impede the upflow of the gases through the furnace. Charcoal would have been crushed by the column of material in tall furnaces. Also, the capacity of furnaces would have eventually exceeded the wood supply, as happened with locomotives.

Iron was used for a wide variety of purposes. In 1860 the largest consumers were makers of numerous types of castings, especially stoves. Of the $32 million of bar, sheet and railroad iron produced, slightly less than half was railroad iron; the value added by stoves was equal to the value added by rails.

Coal displaces wood

Coal replaced wood during the mid-19th century. In 1840 wood was the major fuel while coal production was minor. In 1850 wood still supplied 90% of fuel consumption, and 90% of that was for home heating. By 1880 wood was only 5% of fuel consumption. Cast iron stoves for heating and cooking displaced inefficient fireplaces. Wood was a byproduct of land clearing and was stacked along the banks of rivers for sale to steamboats. By mid-century the forests were being depleted, and steamboats and locomotives were using enough wood to create shortages along their routes; however, railroads, canals and navigable internal waterways were able to bring coal to market at a price far below the cost of wood. Coal sold in Cincinnati for 10 cents per bushel (94 pounds) and in New Orleans for 14 cents.

Charcoal production was very labor- and land-intensive. It was estimated in 1833 that fueling a typical furnace producing 100 tons of pig iron per week on a sustained-yield basis required a timber plantation of 20,000 acres. The cut trees had to be hauled by oxen to the charring site, stacked on end and covered with earth or placed in a kiln, and charred for about a week. Anthracite reduced the labor cost to $2.50 per ton, compared to charcoal at $15.50 per ton.
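A quick calculation from the figures above (a rough sketch, not a precise estimate) shows just how land-hungry and costly charcoal iron was compared with anthracite:

    # Back-of-the-envelope arithmetic from the figures quoted above.
    weekly_output_tons = 100                       # typical furnace output of pig iron per week
    plantation_acres = 20_000                      # sustained-yield timber plantation required
    annual_output_tons = weekly_output_tons * 52
    print(plantation_acres / annual_output_tons)   # ~3.8 acres of timber per ton of iron per year
    charcoal_cost, anthracite_cost = 15.50, 2.50   # labor cost per ton, in dollars, per the text
    print(charcoal_cost - anthracite_cost)         # $13.00 saved per ton by switching to anthracite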

Manufacturing

Manufacturing became well established during the mid-19th century. Labor in the U.S. was expensive, and industry made every effort to economize by using machinery. Woodworking machinery such as circular saws, high-speed lathes, planers, mortising machines and various other machines amazed British visitors, as reported by Joseph Whitworth (see the American system of manufacturing).

In the early 19th century machinery was made mostly of wood with iron parts. By mid-century machines were increasingly made entirely of iron, which allowed them to operate at higher speeds and with higher precision. The demand for machinery created a machine tool industry that designed and manufactured lathes, metal planers, shapers and other precision metal cutting tools.

The shoe industry was the second to be mechanized, beginning in the 1840s. Sewing machines were developed for sewing leather. A leather rolling machine eliminated hand hammering, and was thirty times faster. Blanchard lathes began being used for making shoe lasts (forms) in the 1850s, allowing the manufacture of standard sizes.

By the 1850s much progress had been made in the development of the sewing machine, with a few companies making the machines, based on a number of patents, with no company controlling the right combination of patents to make a superior machine. To prevent damaging lawsuits, in 1856 several important patents were pooled under the Sewing Machine Combination, which licensed the patents for a fixed fee per machine sold.

The sewing machine industry was a beneficiary of machine tools and the manufacturing methods developed at the Federal Armories. By 1860 two sewing machine manufacturers were using interchangeable parts.

The sewing machine increased the productivity of sewing cloth by a factor of 5.

In 1860 the textile industry was the largest manufacturing industry in terms of workers employed (mostly women and children), capital invested and value of goods produced. That year there were 5 million spindles in the U.S.

Steam power

The Treasury Department's steam engine report of 1838 was the most valuable survey of steam power until the 1870 Census. According to the 1838 report there were an estimated 2,000 engines totaling 40,000 hp, of which 64% were used in transportation, mostly in steamboats.

The Corliss steam engine, patented in 1848, was called the most significant development in steam engineering since James Watt. The Corliss engine was more efficient than previous engines and maintained more uniform speed in response to load changes, making it suitable for a wide variety of industrial applications. It was the first steam engine that was suitable for cotton spinning. Previously steam engines for cotton spinning pumped water to a water wheel that powered the machinery.

Steam power greatly expanded during the late 19th century with the rise of large factories, the expanded railroad network and early electric lighting and electric street railways.

Steamboats and ships

The number of steamboats on western rivers in the U.S. grew from 187 in 1830 to 735 in 1860. Total registered tonnage of steam vessels for the U.S. grew from 63,052 in 1830 to 770,641 in 1860.
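Expressed as compound annual growth rates (a rough calculation from the figures above, nothing more), steamboat numbers and steam tonnage grew very quickly over these three decades:

    # Compound annual growth rates implied by the figures quoted above, 1830-1860.
    years = 30
    boats_1830, boats_1860 = 187, 735                     # steamboats on western rivers
    tons_1830, tons_1860 = 63_052, 770_641                # registered steam tonnage, all U.S.
    print((boats_1860 / boats_1830) ** (1 / years) - 1)   # ~0.047, about 4.7% per year
    print((tons_1860 / tons_1830) ** (1 / years) - 1)     # ~0.087, about 8.7% per year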

Until the introduction of iron ships, the U.S. made the best ships in the world. The design of U.S. ships required fewer crew members to operate. U.S.-made ships cost from 40% to 60% as much as European ships, and lasted longer.

The screw propeller was tested on Lake Ontario in 1841 before being used on ocean ships, and propellers began being used on Great Lakes ships in 1845. Propellers caused vibrations, which were a problem for wooden ships. The SS Great Britain, which entered transatlantic service in 1845, was the first iron ship with a screw propeller. Iron ships became common and more efficient multiple-expansion engines were developed. After the introduction of iron ships, Britain became the leading shipbuilding country. The U.S. tried to compete by building wooden clipper ships, which were fast but too narrow to carry economic volumes of low-value freight.

Telegraph

Congress approved funds for a short demonstration telegraph line from Baltimore to Washington, D.C., which was operational in 1844. The telegraph was quickly adopted by the railroad industry, which needed rapid communication to coordinate train schedules, the importance of which had been highlighted by a collision on the Western Railroad in 1841. Railroads also needed to communicate over a vast network in order to keep track of freight and equipment. Consequently, railroads installed telegraph lines along their existing rights-of-way. By 1852 there were 22,000 miles of telegraph lines in the U.S., compared to 10,000 miles of track.

Urbanization

By 1860, on the eve of the Civil War, 16% of the people lived in cities with 2500 or more people and one third of the nation's income came from manufacturing. Urbanized industry was limited primarily to the Northeast; cotton cloth production was the leading industry, with the manufacture of shoes, woolen clothing, and machinery also expanding. Most of the workers in the new factories were immigrants or their children. Between 1845 and 1855, some 300,000 European immigrants arrived annually. Many remained in eastern cities, especially mill towns and mining camps, while those with farm experience and some savings bought farms in the West.

Agriculture

Threshing machine from 1881. Steam engines were also used instead of horses.
 
Adriance reaper, late 19th century

In the antebellum period the U.S. supplied 80% of Britain's cotton imports. Just before the Civil War the value of cotton was 61% of all goods exported from the U.S.

The westward expansion into the highly productive heartland was aided by the new railroads, and both population and grain production in the West expanded dramatically. Increased grain production was able to capitalize on high grain prices caused by poor harvests in Europe during the time of the Great Famine in Ireland. Grain prices also rose during the Crimean War, but when the war ended U.S. exports to Europe fell dramatically, depressing grain prices. Low grain prices were a cause of the Panic of 1857. Cotton and tobacco prices recovered after the panic.

John Deere developed a cast steel plow in 1837 which was lightweight and had a moldboard that efficiently turned over and shed the plowed earth. It was easy for a horse to pull and was well suited to cutting the thick prairie sod of the Midwest. He and his brother Charles founded Deere and Company which continues into the 21st century as the largest maker of tractors, combines, harvesters and other farm implements.

Threshing machines, which were a novelty at the end of the 18th century, began being widely introduced in the 1830s and 1840s. Mechanized threshing required less than half the labor of hand threshing.

Slave labor

In 1860, there were 4.5 million Americans of African descent, of whom 4 million were slaves, valued at $3 billion. They were mainly owned by southern planters of cotton and sugarcane. An estimated 60% of the value of farms in Alabama, Georgia, Louisiana, Mississippi and South Carolina was in slaves, with less than a third in land and buildings.

In the aftermath of the Panic of 1857, which left many northern factory workers unemployed and deprived to the point of causing bread riots, supporters of slavery pointed out that slaves were generally better fed and had better living quarters than many free workers. It is estimated that slaves received 15% more in imputed wages than the free market.

Finance, money and banking

After the expiration of the charter of the Second Bank of the United States, federal revenues were handled by the Independent Treasury beginning in 1846. The Second Bank of the U.S. had also maintained some control over other banks, but in its absence banks were only under state regulation.

One of the main problems with banks was over-issuance of banknotes. These were redeemable in specie (gold or silver) upon presentation to the chief cashier of the bank. When people lost trust in a bank they rushed to redeem its notes, and because banks issued more notes than their specie reserves, the bank couldn't redeem the notes, often causing the bank to fail. In 1860 there were over 8,000 state chartered banks issuing notes. In 1861 the U.S. began issuing United States Notes as legal tender.
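A stylized example of the mechanism (the reserve ratio and run size below are hypothetical values chosen only for illustration, not figures from the text):

    # Hypothetical illustration of why over-issuance of banknotes made banks fragile.
    specie_reserves = 100_000        # gold and silver held by the bank, in dollars (assumed)
    notes_outstanding = 400_000      # banknotes issued against those reserves (assumed 4:1 ratio)
    run_redemption_share = 0.30      # share of notes presented for redemption in a panic (assumed)
    demanded = notes_outstanding * run_redemption_share
    print(demanded, specie_reserves, demanded > specie_reserves)
    # 120000.0 100000 True -- the bank cannot redeem its notes and fails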

Banks began paying interest on deposits and using the proceeds to make short term call loans, mainly to stock brokers.

New York banks created a clearing house association in 1853 in which member banks cleared accounts with other city banks at the close of the week. The clearinghouse association also handled notes from banks in other parts of the country. The association was able to detect banks that were issuing excessive notes because they could not settle.

Panic of 1857

The recovery from the depression that followed the Panic of 1837 began in 1843 and lasted until the Panic of 1857.

The panic was triggered by the failure on August 24 of the well-regarded Ohio Life Insurance and Trust Co.; a manager of its New York branch, one of that city's largest financial institutions, had embezzled funds and made excessive loans. The company's president announced suspension of specie redemption, which triggered a rush to redeem banknotes, causing many banks to fail for lack of specie.

The United States had been running a trade deficit, draining gold out of the country. Because of the tariff revenues, the U.S. Treasury held a considerable amount of gold, which kept it out of circulation. On September 12, the SS Central America, which was carrying $1.5 million in gold from California, sank, contributing to the panic. Secretary of the Treasury Howell Cobb came to the aid of New York mercantile interests by buying back some of the national debt. On September 25 the Bank of Pennsylvania suspended specie payment, starting a nationwide bank run.

The danger of interest bearing deposits became apparent when bankers had to call loans made to stock brokers, many of whom were unable to pay. Banks then had to curtail credit to commercial and industrial customers. Many businesses were unable to pay workers back wages because the banknotes they held were now worthless.

The Crimean War, which had cut off Russian wheat exports, ended in 1856. The war had caused high wheat prices and overexpansion in the U.S., which had been exporting wheat to Europe. Bountiful western harvests in 1857 caused grain prices to fall. Good harvests in England, France and Russia caused collapse in demand for U.S. grains in 1858 and 1859. This caused railroad shipments from the West to fall, which resulted in the bankruptcy of some railroads.

The inability of the West to sell its crops hurt businesses in other regions, such as New England, which manufactured shoes sold in the West. Cotton and tobacco prices fell, but unlike grains, soon recovered.

The panic left many northern wage earners unemployed, most temporarily, but high unemployment lingered for a couple of years.

Immigration surge

Immigration to the U.S. surged following the Great Famine (Ireland). There were about 3 million immigrants during the decade of the 1850s. They were mainly from Germany, Ireland and England.

Civil War economy

Union

The Union economy grew and prospered during the war while fielding a very large army and navy. The Republicans in Washington had a Whiggish vision of an industrial nation, with great cities, efficient factories, productive farms and national banks, all knit together by a modern railroad system and mobilized in wartime by the United States Military Railroad. The South had resisted policies such as tariffs to promote industry and homestead laws to promote farming because they would not benefit the slave economy. With the South gone and Northern Democrats weak, the Republicans enacted their legislation. At the same time they passed new taxes to pay for part of the war and issued large amounts of bonds to pay for most of the rest; economic historians attribute the remainder of the cost of the war to inflation. Congress wrote an elaborate program of economic modernization that had the dual purpose of winning the war and permanently transforming the economy.

Financing the war

In 1860 the Treasury was a small operation that funded the small-scale operations of the government through land sales and customs based on a low tariff. Peacetime revenues were trivial in comparison with the cost of a full-scale war, but the Treasury Department under Secretary Salmon P. Chase showed unusual ingenuity in financing the war without crippling the economy. Many new taxes were imposed, always with a patriotic theme comparing the financial sacrifice to the sacrifices of life and limb. The government paid for supplies in official currency, which encouraged people to sell to the government regardless of their politics. By contrast the Confederacy gave paper promissory notes when it seized property, so that even loyal Confederates would hide their horses and mules rather than sell them for dubious paper. Overall the Northern financial system was highly successful in raising money and turning patriotism into profit, while the Confederate system impoverished its patriots.

The United States needed $3.1 billion to pay for the immense armies and fleets raised to fight the Civil War, over $400 million in 1862 alone. The largest revenue by far came from new excise taxes, a sort of value-added tax, imposed on every sort of manufactured item. Second came much higher tariffs, through several Morrill tariff laws. Third came the nation's first income tax; only the wealthy paid, and it was repealed at war's end.

1862 Greenbacks

Apart from taxes, the second major source of income was government bonds. For the first time bonds in small denominations were sold directly to the people, with publicity and patriotism as key factors, as designed by banker Jay Cooke. State banks lost their power to issue banknotes. Only national banks could do that and Chase made it easy to become a national bank; it involved buying and holding federal bonds and financiers rushed to open these banks. Chase numbered them, so that the first one in each city was the "First National Bank". Third, the government printed paper money called "greenbacks". They led to endless controversy because they caused inflation.

The North's most important war measure was perhaps the creation of a system of national banks that provided a sound currency for the industrial expansion. Even more important, the hundreds of new banks that were allowed to open were required to purchase government bonds. Thereby the nation monetized the potential wealth represented by farms, urban buildings, factories, and businesses, and immediately turned that money over to the Treasury for war needs.

Tariffs

Secretary Salmon P. Chase, though a long-time free-trader, worked with Congressman Justin Morrill to pass a second tariff bill in the summer of 1861, raising rates another 10 points in order to generate more revenue. These subsequent bills were primarily revenue-driven to meet the war's needs, though they enjoyed the support of protectionists such as Henry C. Carey, who again assisted Morrill in the drafting. The Morrill Tariff of 1861 was designed to raise revenue. The tariff act of 1862 served not only to raise revenue but also to encourage the establishment of factories free from British competition by taxing British imports. Furthermore, it protected American factory workers from low-paid European workers, and as a major bonus attracted tens of thousands of those Europeans to immigrate to America for high-wage factory and craftsman jobs.

Customs revenue from tariffs totaled $345 million from 1861 through 1865 or 43% of all federal tax revenue.
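Combining these figures with the $3.1 billion war cost cited above gives a rough sense of how little of the war was paid for by current taxes (a back-of-the-envelope calculation from the numbers already quoted, not an independent estimate):

    # Implied wartime tax revenue from the figures quoted above (millions of dollars).
    customs_revenue = 345.0                  # tariff revenue, 1861-1865
    customs_share_of_taxes = 0.43            # "43% of all federal tax revenue"
    total_tax_revenue = customs_revenue / customs_share_of_taxes
    war_cost = 3_100.0
    print(total_tax_revenue)                 # ~802, roughly $0.8 billion in taxes, 1861-1865
    print(total_tax_revenue / war_cost)      # ~0.26, so taxes covered only about a quarter of the cost

This is consistent with the statement above that bonds covered most of the remainder.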

Land sales and grants

The U.S. government owned vast amounts of good land (mostly from the Louisiana Purchase of 1803 and the Oregon Treaty with Britain in 1846). The challenge was to make the land useful to people and to provide the economic basis for the wealth that would pay off the war debt. Land grants went to railroad construction companies to open up the western plains and link up to California. Together with the free land provided to farmers by the Homestead Law, the low-cost farmland sold from the railroad land grants sped up the expansion of commercial agriculture in the West.

The 1862 Homestead Act opened up the public domain lands. Land grants to the railroads meant they could sell tracts for family farms (80 to 200 acres) at low prices with extended credit. In addition the government sponsored fresh information, scientific methods and the latest techniques through the newly established Department of Agriculture and the Morrill Land Grant College Act.

Agriculture

Agriculture was the largest single industry and it prospered during the war. Prices were high, pulled up by a strong demand from the army and from Britain (which depended on American wheat for a fourth of its food imports). The war acted as a catalyst that encouraged the rapid adoption of horse-drawn machinery and other implements. The rapid spread of recent inventions such as the reaper and mower made the workforce efficient, even as hundreds of thousands of farmers were in the army. Many wives took their place and often consulted by mail on what to do; increasingly they relied on community and extended kin for advice and help.

The Union used hundreds of thousands of animals. The Army had plenty of cash to purchase them from farmers and breeders, but especially in the early months the quality was mixed. Horses were needed for cavalry and artillery; mules pulled the wagons. The supply held up, despite an unprecedented epidemic of glanders, a fatal disease that baffled veterinarians. In the South, the Union Army shot all the horses it did not need to keep them out of Confederate hands. The Treasury started buying cotton during the war for shipment to Europe and northern mills; the sellers were Southern planters who needed the cash, regardless of their patriotism.

Collapse of the South

The wartime devastation of the South was great and poverty ensued; incomes of whites dropped, but income of the former slaves rose. During Reconstruction railroad construction was heavily subsidized (with much corruption), but the region maintained its dependence on cotton. Former slaves became wage laborers, tenant farmers, or sharecroppers. They were joined by many poor whites, as the population grew faster than the economy. As late as 1940 the only significant manufacturing industries were textile mills (mostly in the upland Carolinas) and some steel in Alabama.

The industrial advantages of the North over the South helped secure a Northern victory in the American Civil War (1861–1865). The Northern victory sealed the destiny of the nation and its economic system. The slave-labor system was abolished; sharecropping emerged and replaced slavery to supply the labor needed for cotton production, but cotton prices plunged in the Panic of 1873, leading Southern plantations to decline in profitability. Northern industry, which had expanded rapidly before and during the war, surged ahead. Industrialists came to dominate many aspects of the nation's life, including social and political affairs.

Political developments

From the 1830s to 1860, Congress repeatedly rejected Whig calls for higher tariffs and for the Whigs' broader program of economic nationalism, which included greater government involvement in regulation and the development of infrastructure. President Andrew Jackson, for example, did not renew the charter of the Second Bank of the United States. The tariff was lowered time and again before the Civil War. Proposals to fund massive western railroad projects, or to give free land to homesteaders, were defeated by Southerners afraid these policies would strengthen the North. The Civil War changed everything.

Territorial expansion of the United States to the area of the Lower 48 States was essentially completed with the Texas annexation (1845), the Oregon Treaty (1846), the Mexican cession (1848) and the Gadsden Purchase (1853).

Land grants

Homesteaders in central Nebraska in 1886

The U.S. government owned vast amounts of quality land (mostly from the Louisiana Purchase of 1803 and the Oregon Treaty with Britain in 1846). The challenge was to make the land useful to people and to provide the economic basis for the wealth that would pay off the war debt. The government did this by breaking it up into smaller plots for private ownership, through various federal laws.

Bounty-land warrants were issued to military veterans in the United States from 1775 to 1855. The land grants were used extensively for settlement of pre-Louisiana Purchase lands east of the Mississippi River, including the Ohio Country, the Northwest Territory, and the Platte Purchase in Missouri.

About 180 million acres were granted to railroad construction companies between 1850 and 1871. The Land Grant Act of 1850 provided for 3.75 million acres of land to the states to support railroad projects; by 1857, 21 million acres of public lands had been used for railroads in the Mississippi River valley, and the stage was set for more substantial Congressional subsidies to future railroads.

The Pacific Railroad Acts financed several transcontinental railroads by granting land directly to corporations for the first time. In addition to operating revenues, railroads were able to finance networks crossing vast distances by selling granted property adjacent to the tracks; these would become highly desirable plots for new settlers and businesses because of the easy access to long-distance transportation.

Morrill Land-Grant Acts, starting in 1862, benefited colleges and universities.

Various Homestead Acts distributed land nearly for free in return for improvements such as building a house, farming, or planting trees. Between 1862 and 1934, the federal government granted 1.6 million homesteads and distributed 270,000,000 acres (420,000 sq mi) of federal land for private ownership. This was a total of 10% of all land in the United States. Eligibility for the last such program, in Alaska, ended in 1986. The Land Office made about 100 million acres of direct sales in the western United States from 1850 to 1900, benefiting cattle ranchers and speculators.

The economic and military power of the federal government was used to clear Native Americans from land desired by European-American settlers. Land grants creating the Indian Reservation system were used by the Indian Appropriations Act of 1851 to segregate native tribes, but later acts opened some of that land to white settlement, notably including a land run opening the Unassigned Lands in Oklahoma. The Dawes Act of 1887 pressured Native Americans to assimilate to European-American culture, offering former tribal land to individuals separating from their tribes and putting "surplus" reservation land up for auction. Overall, about half of Indian Reservation land was sold to white Americans by 1906, about 75 million acres.

Education

British Parliamentary Committee members Joseph Whitworth and George Wallis were very impressed by the educational level of workers in the U.S., commenting "so that everybody reads ... and intelligence penetrates through the lowest grades of society." They also remarked that most states had compulsory education laws requiring a minimum of three months per year of schooling for child factory workers.

Civil War

The Union grew rich fighting the war while the Confederate economy was destroyed. The key policy-maker in Congress was Thaddeus Stevens, chairman of the Ways and Means Committee. He took charge of major legislation that funded the war effort and revolutionized the nation's economic policies regarding tariffs, bonds, income and excise taxes, national banks, suppression of money issued by state banks, greenback currency, and western railroad land grants.

Historians have debated whether or not the Civil War sped up the rate of economic growth in the face of destruction throughout the South and the diversion of resources to military supplies and away from civilian goods. In any case the war taught new organizational methods, prioritized engineering skills, and shifted the national attention from politics to business.

$20 banknote with portrait of Secretary of the Treasury Hugh McCulloch

Financial issues of reconstruction

The Civil War had been financed primarily by issuing short-term and long-term bonds and loans, plus inflation caused by printing paper money, plus new taxes. Wholesale prices had more than doubled, and reduction of inflation was a priority for Secretary of the Treasury Hugh McCulloch. A high priority, and by far the most controversial issue, was the currency question. The old paper currency issued by state banks had been withdrawn, and Confederate currency had become worthless. The national banks had issued $207 million in currency, which was backed by gold and silver. The federal treasury had issued $428 million in greenbacks, which were legal tender but not backed by gold or silver. In addition, about $275 million of coin was in circulation. The new administration policy, announced in October, would be to make all the paper convertible into specie, if Congress so voted. The House of Representatives passed the Alley Resolution on December 18, 1865, by a vote of 144 to 6. In the Senate it was a different matter, for the key player was Senator John Sherman, who said that contraction of the currency was not nearly as important as refunding the short-term and long-term national debt. The war had been largely financed by national debt, in addition to taxation and inflation, and the national debt stood at $2.8 billion by October 1865, most of it in short-term and temporary loans.

Wall Street bankers, typified by Jay Cooke, believed that the economy was about to grow rapidly, thanks to the development of agriculture through the Homestead Act, the expansion of railroads (especially rebuilding the devastated Southern railroads and opening the transcontinental line to the West Coast), and especially the flourishing of manufacturing during the war. The gold premium over greenbacks stood at about $145 in greenbacks to $100 in gold, and the optimists thought that the heavy demand for currency in an era of prosperity would return the ratio to 100. A compromise was reached in April 1866 that limited the Treasury to a currency contraction of only $10 million over six months. Meanwhile, the Senate voted to refund the entire national debt, but the House failed to act. By early 1867, postwar prosperity was a reality, and the optimists wanted an end to contraction, which Congress ordered in January 1868. Meanwhile, the Treasury issued new bonds at a lower interest rate to refinance the redemption of short-term debt. While the old state bank notes were disappearing from circulation, new national bank notes, backed by specie, were expanding. By 1868 inflation was minimal.

Late 19th century

Commerce, industry and agriculture

In the last third of the 19th century the United States entered a phase of rapid economic growth which doubled per capita income over the period. By 1895, the United States leaped ahead of Britain for first place in manufacturing output. For the first time, exports of machinery and consumer goods became important. For example, Standard Oil led the way in exporting kerosene; Russia was its main rival in international trade. Singer Corporation led the way in developing a global marketing strategy for its sewing machines.

The greatly expanded railroad network, using inexpensive steel rails produced by new steel making processes, dramatically lowered transportation cost to areas without access to navigable waterways. Low freight rates allowed large manufacturing facilities with great economies of scale. Machinery became a large industry and many types of machines were developed. Businesses were able to operate over wide areas and chain stores arose. Mail order companies started operating. Rural Free Delivery began in the early 1890s, but it was not widely implemented for a decade.

William Sellers & Company in Philadelphia, 1876

Companies created new management systems to carry out their operations on a large scale. Companies integrated processes to eliminate unnecessary steps and to eliminate middlemen.

An explosion of new discoveries and inventions took place, a process called the Second Industrial Revolution. The electric light, telephone, steam turbine, internal combustion engine, automobile, phonograph, typewriter and tabulating machine were some of the many inventions of the period. New processes for making steel and chemicals such as dyes and explosives were invented. The pneumatic tire, improved ball bearings, machine tools and newly developed metal stamping techniques enabled the large scale production of bicycles in the 1890s. Another significant development was the widespread introduction of electric street railways (trams, trolleys or streetcars) in the 1890s.

Improvements in transportation and other technological progress caused prices to fall, especially during the so-called long depression, but the rising amount of gold and silver being mined eventually resulted in mild inflation during the 1890s and beyond.

Table 3: Ten leading U.S. industries by value added (millions of 1914 dollars), ranked within each year

1860: Cotton goods 59; Lumber 54; Boots and shoes 53; Flour and meal 43; Men's clothing 39; Machinery 31; Woolen goods 27; Leather goods 24; Cast iron 23; Printing 20
1880: Machinery 111; Iron and steel 105; Cotton goods 97; Lumber 87; Boots and shoes 82; Men's clothing 78; Flour and meal 64; Woolen goods 60; Printing 58; Liquor 44
1900: Machinery 432; Iron and steel 339; Printing and publishing 313; Lumber 300; Clothing 262; Liquor 224; Cotton goods 196; Masonry and brick 140; General shop construction 131; Meatpacking 124
1920: Machinery 576; Iron and steel 493; Lumber 393; Cotton goods 364; Shipbuilding 349; Automotive 347; General shop construction 328; Printing and publishing 268; Electrical machinery 246; Clothing 239
Source: Joel Mokyr
Steel workers in 1905, Meadville, Pennsylvania

Railroads

Real gross national product per capita of the United States 1869–1918
 
Pork packing in Cincinnati, 1873

Railroads saw their greatest growth in new track added in the last three decades of the 19th century (see Table 2). Railroads also enjoyed high productivity growth during this time, mainly because of the introduction of new processes that made steel inexpensive. Steel rails lasted roughly ten times longer than iron rails. Steel rails, which became heavier as steel prices fell, enabled heavier, more powerful locomotives that could pull longer trains. Rail cars made of steel running on steel rails could also be made longer, with a load-to-car-weight ratio of about 2:1, compared with 1:1 for cars made of iron.

In 1890 David Ames Wells estimated wagon transport at 16 cents per ton-mile compared to railroads at less than one cent per ton-mile.
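To put Wells's estimate in concrete terms (the 1,000-mile haul below is an arbitrary illustrative distance, not a figure from the text):

    # Cost of moving one ton 1,000 miles at the rates quoted above.
    wagon_rate = 0.16        # dollars per ton-mile by wagon
    rail_rate = 0.01         # dollars per ton-mile by rail ("less than one cent")
    distance_miles = 1_000   # illustrative haul
    print(wagon_rate * distance_miles, rail_rate * distance_miles)
    # 160.0 10.0 -- about $160 by wagon versus under $10 by rail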

Railroads competed fiercely for passengers and freight by expanding their routes, too often into increasingly marginal ones. The high capital required for expansion plus the low rates, driven by competition and by what the market would bear, resulted in a large percentage of railroad track in bankruptcy.

A practical refrigerated (ice cooled) railcar was introduced in 1881. This made it possible to ship cattle and hog carcasses, which weighed only 40% as much as live animals. Gustavus Franklin Swift developed an integrated network of cattle procurement, slaughtering, meat-packing and shipping meat to market. Up to that time cattle were driven great distances to railroad shipping points, causing the cattle to lose considerable weight. Swift developed a large business, which grew in size with the entry of several competitors.

Steel

In the last three decades of the 19th century iron and steel became a leading industry, in second place by value added, with machinery in first place. The Bessemer process was the first process for producing steel on a large scale, which it could do at low cost. The first U.S. licensed Bessemer plant began operation in 1865. Bessemer steel was used mostly for rails; because of difficulty in controlling quality and embrittlement with aging, it was not suitable for structural purposes.

The Siemens-Martin process, or open hearth process, produced a suitable grade of structural steel. Open hearth steel displaced wrought iron as a structural material in the 1880s. Open hearth steel began being used in a wide variety of applications including high rise buildings, ships, machinery, pipelines, rails and bridges.

Electric lights and electric street railways

Early electrification was too limited to have a big impact on the late 19th-century economy. Electricity was also very expensive because of the low conversion efficiency of fuel to power, the small scale of power plants and the fact that most utilities offered only nighttime service. Daytime service became common during the early 20th century after the introduction of the AC motor, which tended to be used more during the day, balancing the load. Until that time a large share of power was self-generated by the user, such as a factory, hotel or electric street railway (tram or streetcar).

Electric street railways were introduced in the U.S. in 1888 when Frank J. Sprague designed and built the first practical system, the Richmond Union Passenger Railway in Richmond, Virginia. Electric street railways rapidly spread to cities around the country in the following years.

The early electric street railways typically generated their own power and also operated as electric utilities, which served to even out daily load because the main use of power for lighting was after the peak usage by railways.

Until the early 1880s electricity had been used mainly in telegraphy and electroplating. Efficient dynamos were introduced in the 1870s and began being used to power electric carbon arc lamps after 1879. In 1880 Thomas Edison patented his invention of a long lasting incandescent light bulb and a system for distributing electrical power. In 1882 he opened the Pearl Street Station in Manhattan, which was the first central power station in the U.S.

Using DC placed severe restrictions on the distance power could be transmitted because of line losses. With DC there was no practical way to transform power to high voltages, which would have reduced the current and lowered the transmission losses. Power could be generated safely at up to about 2,000 volts, but that is a dangerous voltage for household use. With alternating current, voltage can be stepped up or down using a transformer, so power can be transmitted at high voltage and delivered at a safe, low one. AC power began being widely introduced in the 1890s.
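The underlying physics can be sketched with a short calculation (the load, line resistance and voltages below are illustrative assumptions, not figures from the text): for a fixed amount of power delivered, line loss is I squared times R, so stepping the voltage up tenfold with a transformer cuts the current tenfold and the loss a hundredfold.

    # Illustration of why high-voltage transmission (practical only with AC transformers) cuts losses.
    def line_loss(power_watts, volts, line_resistance_ohms):
        current = power_watts / volts                # I = P / V for a given delivered power
        return current ** 2 * line_resistance_ohms   # loss = I^2 * R in the transmission line

    P, R = 100_000, 5.0                              # 100 kW load and a 5-ohm line (assumed values)
    print(line_loss(P, 2_000, R))                    # 12500.0 W lost at a 2,000 V DC-era voltage
    print(line_loss(P, 20_000, R))                   # 125.0 W lost after stepping up tenfold with AC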

Communications

Following the failure of the first short lived Transatlantic telegraph cable of 1858, a second, more durable cable was completed in 1865, connecting Nova Scotia to England. By 1890 there was an international telegraph network.

After the invention of the telephone in 1876 additional development work was required to make it commercially viable. The first telephones were for local calls. Long-distance calling came into being in the 1890s, but the technology to make transcontinental calls took until 1915 to be operational.

Automatic telephone switching, which eliminated the need for telephone operators to manually connect local calls on a switchboard, was introduced in 1892; however it did not become widespread for several decades.

Modern business management

Before railroads most businesses were run by a sole proprietor or as a partnership, and the owners typically ran the daily operations. The railroad industry was the first to adopt modern business management practices, in response to the need to operate over vast areas, maintain continuous long-distance communications, manage a complex network, and track trains and freight. Railroads hired professional managers, divided work into various corporate departments, and developed the organization chart.

Another modern business innovation was vertical integration, by which companies expanded to encompass all stages of a business, from producing the raw materials to processing them into saleable products and selling the finished products. Notable examples occurred in the steel and petroleum industries.

Agriculture

A dramatic expansion in farming took place. The number of farms tripled from 2.0 million in 1860 to 6.0 million in 1905. The number of people living on farms grew from about 10 million in 1860 to 22 million in 1880 to 31 million in 1905. The value of farms soared from $8.0 billion in 1860 to $30 billion in 1906.

The federal government issued 160-acre (65 ha) tracts virtually free to settlers under the Homestead Act of 1862. Even larger numbers of settlers purchased lands at very low interest from the new railroads, which were trying to create markets. The railroads advertised heavily in Europe and brought over, at low fares, hundreds of thousands of farmers from Northern Europe.

Despite their remarkable progress and general prosperity, 19th-century U.S. farmers experienced recurring cycles of hardship, caused primarily by falling world prices for cotton and wheat.

Along with the mechanical improvements which greatly increased yield per unit area, the amount of land under cultivation grew rapidly throughout the second half of the century, as the railroads opened up new areas of the West for settlement. Wheat farmers enjoyed abundant output and good years from 1876 to 1881, when bad European harvests kept the world price high; they then suffered from a slump in the 1880s when conditions in Europe improved. The farther west the settlers went, the more dependent they became on the monopolistic railroads to move their goods to market, and the more inclined they were to protest, as in the Populist movement of the 1890s. Wheat farmers blamed local grain elevator owners (who purchased their crop), railroads and eastern bankers for the low prices. Sales of various types of horse-drawn harvesting machines increased dramatically between the Civil War and the end of the century. Harvesting machine improvements included automatic rakers, which eliminated the manual raker and allowed operation by a single man, and combined harvester-binders.

To modernize traditional agriculture, reformers founded the Grange movement in 1867. The Granges focused initially on social activities to counter the isolation most farm families experienced, and women's participation was actively encouraged. Spurred by the Panic of 1873, the Grange soon grew to 20,000 chapters and 1.5 million members. The Granges set up their own marketing systems, stores, processing plants, factories and cooperatives, but most went bankrupt. The movement also enjoyed some political success during the 1870s: a few Midwestern states passed "Granger Laws", limiting railroad and warehouse fees.

Federal land grants helped each state create an agricultural college and a network of extension agents who demonstrated modern techniques to farmers. Wheat and cotton farmers in the 1890s supported the Populist movement, but failed in their demands for free silver and inflation. Instead, the 1896 election committed the nation to the gold standard and a program of sustained industrialization. Farmers in the Midwest and East gave only verbal support to the Populists, focusing on nearby urban markets rather than on the highly fluctuating European markets for wheat and cotton.

Oil, minerals and mining

Oil

In the 1850s an advance in lighting was the use of kerosene lamps with glass chimneys, which produced a good quality light at a relatively affordable price. Kerosene lighting effectively extended the day and made it easier to read at night. An industry developed to produce coal oil, as kerosene was then called. Kerosene was also being distilled from Pennsylvania crude oil by Samuel Kier.

George Bissell paid a visit to Dartmouth College, which he had attended, and saw a sample of "rock oil" from Pennsylvania. Suspecting that the oil might have potential as an illuminant and lubricant, he organized an investor group. In 1853 Bissell's group, which became the Pennsylvania Rock Oil Co., hired Yale chemistry professor Benjamin Silliman, Jr. to perform an analysis of the "rock oil". Silliman's report of April 1855 stated that "rock oil" could yield an excellent illuminating oil; however, there was as yet no economical means of producing it in commercial quantities. Bissell had a chance insight when he saw a picture of the oil derricks used to produce an oil-based patent medicine obtained as a byproduct of a brine well.

Following a shareholder disagreement, Bissell and fellow investor Jonathan Eveleth split with the Pennsylvania Rock Oil Co. and formed Seneca Oil in 1858. Edwin Drake, a shareholder, was hired by the company to drill for oil. The site chosen for the well was on Oil Creek near Titusville, Pennsylvania, where a water well was producing oil. Drake chose to use brine well drilling technology, based on the technique used in China since ancient times that had reached the West in the late 1820s, except that Drake used iron cable, an iron well casing and a steam engine. The Drake Well struck oil at a depth of 69.5 feet on August 27, 1859, starting a drilling boom in the region.

Among the numerous refineries that were started were several along a new rail link to Cleveland, Ohio, where John D. Rockefeller and his partner Maurice Clark owned a grocery produce shipping business. Rockefeller and Clark also got into the refining business, and in 1865 the partners decided to hold a private auction between the two, with Rockefeller the successful bidder. The refining industry was intensely competitive, and by 1869 there was three times the refining capacity needed, a situation which lasted many years, with the number of refineries reaching 6,000.

In 1870 John D. Rockefeller, his brother William Rockefeller, Henry Flagler, Oliver Burr Jennings and silent partner Stephen V. Harkness formed Standard Oil. John D. Rockefeller was the master planner and organizer of the systematic plan to form combinations with or acquire competitors and to enter all phases of the oil industry, from production to transportation, refining and distribution, a concept called vertical integration. Standard Oil sought every possible advantage over its competitors. One method was using Standard's high shipping volume to secure discounts and drawbacks (payments from railroads for transporting competitors' products) from the railroads. By 1879 Standard Oil controlled 90% of U.S. refining capacity. Producers in the Pennsylvania oil region tried to counter Standard Oil's transportation arrangements by building the first long-distance pipeline, the 110-mile Tidewater Pipeline to Williamsport, Pennsylvania, which was on the Reading Railroad. Standard Oil fought back by building four pipelines of its own. Standard continued to monopolize the oil industry in the U.S. until it was broken up by the 1911 U.S. Supreme Court case Standard Oil Co. of New Jersey v. United States.

Efficient gas mantles and electric lighting began eroding the illuminating oil market in the 1880s; however, gasoline, previously a low-value byproduct of refining, found a fast-growing market in automobiles and more than offset the decline of kerosene in the early 20th century.

Coal

Coal was found in abundance in the Appalachian Mountains from Pennsylvania south to Kentucky.

Iron ore

Large iron ore mines opened in the Lake Superior region of the upper Midwest. Steel mills thrived in places where coal and iron ore could be brought together to produce steel. Large copper and silver mines opened, followed by lead mines and cement factories.

Finance, money and banking

A bank run on the Fourth National Bank No. 20 Nassau Street, New York City, October 4, 1873

The period saw a series of recessions. The recession of 1869 resulted from a stock market panic that lowered stock prices 20% and briefly cut wheat prices in half. It was one of the shortest and mildest recessions in American economic history.

The Panic of 1873 set off one of the worst and longest depressions in American history, seriously affecting every aspect of the economy and bringing railroad expansion to a halt. The New York Stock Exchange closed for ten days. Of the country's 364 railroads, 89 went bankrupt; a total of 18,000 businesses failed between 1873 and 1875; and unemployment reached 14% by 1876. The period became known in Britain as the Long Depression. Contemporary economist David Ames Wells argued against calling it a depression in the U.S. (the term then referred to depressed prices) because output rose dramatically.

Politically, the Democrats took control of Congress in 1874, and the election of 1876 was deadlocked.

The end of the Gilded Age coincided with the Panic of 1893, a deep depression that lasted until 1897. Wheat and cotton farmers in the West and South were especially hard hit, and moved toward radicalism. President Grover Cleveland was forced to ask the Wall Street bankers to help keep the Treasury liquid. Agrarian spokesman William Jennings Bryan called for an inflationary policy of using cheap silver to effectively replace expensive gold. Bryan lost in a major political realignment in favor of the conservative pro-gold Republicans in the election of 1896.

Water supply and sewers

Europe had a substantial amount of water supply and sewer infrastructure installed by the mid-1870s. In the United States, by contrast, only 0.3% of urban households had filtered water in 1880, a figure that rose to 1.5% in 1890 and 6.3% in 1900.

Labor unions

Workers in New York in 1871 demand the eight-hour day

The American labor movement began with the first significant labor union, the Knights of Labor, founded in 1869. The Knights collapsed in the 1880s and were displaced by strong international unions that banded together as the American Federation of Labor under Samuel Gompers. Rejecting socialism, the AFL unions negotiated with owners for higher wages and better working conditions. Union growth was slow until 1900, then rose to a peak during World War I.

Political developments

Concern over railroads' unfair practices, such as freight rates favoring certain shippers, led to the Interstate Commerce Act of 1887 which created the nation's first regulatory agency, the Interstate Commerce Commission.

Trade and tariffs

From the Civil War to 1913, the United States had a high tariff regime, as did most countries with the exception of Great Britain, which clung to free trade.

Most Democrats wanted a lower tariff but failed to make much difference. The Republican Party stressed the goal of rapid economic growth, as well as high wage rates for industrial workers. Indeed, the high American wage rates attracted large numbers of skilled European workers, who filled the upper ranks of the American working class. According to Benjamin O. Fordham:

Protectionism had several important consequences for American foreign policy on both economic and security issues. It led to a focus on less developed areas of the world that would not export manufactured goods to the United States instead of on wealthier European markets. It limited the tactics available for promoting American exports, forcing policymakers to seek exclusive bilateral agreements or unilateral concessions from trading partners instead of multilateral arrangements. It inhibited political cooperation with other major powers and implied an aggressive posture toward these states.

Early 20th century

Economic growth and the 1910 break

The period from 1890 to 1910 was one of rapid economic growth, above 4% annually, in part due to rapid population growth. However, a sharp break in the growth rate, to around 2.8%, occurred from 1910 to 1929. Economists are uncertain what combination of supply and demand factors caused the break, but productivity growth was strong, enabling labor cost per unit of output to decline from 1910 to 1929. The growth rate of hours worked fell 57%, compared with a 27% decline in the growth rate of output. It is generally accepted that the new technologies and more efficient business methods permanently shifted the supply and demand relationship for labor, leaving labor in surplus (except during both world wars, when the economy was engaged in wartime production and millions of men served in the armed forces). The technologies that became widespread after 1910, such as electrification, internal combustion powered transportation and mass production, were capital saving. Total non-residential fixed business investment fell after 1910 due to the fall of investment in structures.

Industry, commerce and agriculture

Two of the most transformative technologies of the century were widely introduced during the early decades: electrification, powered by high pressure boilers and steam turbines, and automobiles and trucks, powered by the internal combustion engine.

Chain stores experienced rapid growth.

Standardization was urged by the Department of Commerce for consumer goods such as bedspreads and screws. A simplified standardization program was issued during World War I.

Electrification

Electrification was one of the most important drivers of economic growth in the early 20th century. The redesign of factories around electric power ushered in the period of highest productivity growth in manufacturing. There was large growth in the electric utility industry, and the productivity growth of electric utilities was high as well.

At the turn of the 20th century electricity was used primarily for lighting, and most electric companies did not provide daytime service. Electric motors that were used in daytime, such as the DC motors that powered street railways, helped balance the load, and many street railways generated their own electricity and also operated as electric utilities. The AC motor, developed in the 1890s, was ideal for industrial and commercial power and greatly increased the demand for electricity, particularly during the daytime.

Electrification in the U.S. started in industry around 1900, and by 1930 about 80% of power used in industry was electric. Electric utilities with central generating stations using steam turbines greatly lowered the cost of power, with businesses and houses in cities becoming electrified. In 1900 only 3% of households had electricity, increasing to 30% by 1930. By 1940 almost all urban households had electricity. Electrical appliances such as irons, cooking appliances and washing machines were slowly adopted by households. Household mechanical refrigerators were introduced in 1919 but were in only about 8% of households by 1930, mainly because of their high cost.

The electrical power industry had high productivity growth. Many large central power stations, equipped with high pressure boilers and steam turbine generators, began being built after 1913. These central stations were designed for efficient handling of coal, from the layout of the rail yards to the conveyor systems. They were also much more fuel efficient, lowering the amount of fuel per kilowatt-hour of electricity to a small fraction of what it had been: in 1900 it took about 7 pounds of coal to generate one kilowatt-hour, while by 1960 it took only 0.9 pounds.
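As a rough check on what that improvement implies, the short sketch below converts the two coal rates into approximate thermal efficiencies. The coal heating value used is an assumed typical figure for bituminous coal, not a number from the text.

# Rough thermal-efficiency estimate from coal burned per kilowatt-hour.
# The coal heating value is an assumed typical value, not a figure from the text.
BTU_PER_KWH = 3412        # energy content of one kilowatt-hour, in Btu
COAL_BTU_PER_LB = 12500   # assumed heating value of bituminous coal, Btu per pound

def thermal_efficiency(lb_coal_per_kwh):
    # Fraction of the coal's heat energy delivered as electricity.
    return BTU_PER_KWH / (lb_coal_per_kwh * COAL_BTU_PER_LB)

for year, rate in [(1900, 7.0), (1960, 0.9)]:
    print(year, f"{thermal_efficiency(rate):.1%}")
# Roughly 4% in 1900 versus about 30% in 1960 under these assumptions.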

Manufacturing

Rapid economic growth in the early decades of the 20th century was largely due to productivity growth in manufacturing.

Factory electrification revolutionized manufacturing. Unit drive, which means using a single electric motor for powering a single machine, eliminated line shafts previously used to transmit power from a small number of steam engines or hydraulic turbines. Line shafts created constraints on building arrangement that impeded the efficient flow of materials because they presented traffic barriers and required multi-story buildings for economy. It was not uncommon for large manufacturing sites to have many miles of line shafts. Electric motors were much more economical to operate than steam engines in terms of energy efficiency and operator attention. Electric motors were also lower in capital cost.

Frederick W. Taylor was the best-known pioneer in the field of scientific management in the late 19th century, carefully timing and plotting the functions of various workers and then devising new, more efficient ways for them to do their jobs. Ford Motor Co. used techniques of scientific management, although Henry Ford claimed not to know of Taylor's system. Ford Motor used every practical means to reduce the effort and movement of workers in order to reduce the time involved in making parts, moving parts and assembling parts into automobiles. Ford's factories were electrically powered, and in 1913 Ford introduced the assembly line, a step in the process that became known as mass production. The price of a Ford Model T fell from $900 in 1908–9 to $360 in 1916, even though wages doubled to $5 per day in 1914. Production grew from 13,840 cars in 1909 to 132,702 in 1916. Productivity for this period, measured in output of Model T's per worker, rose 150%.
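A quick calculation with the Model T figures quoted above shows how steep those changes were; the percentages below are derived only from the text's own numbers.

# Derived from the Model T figures quoted above.
price_1908, price_1916 = 900, 360         # dollars
output_1909, output_1916 = 13840, 132702  # cars produced

price_drop = (price_1908 - price_1916) / price_1908
output_growth = output_1916 / output_1909

print(f"Price fell {price_drop:.0%}")            # about 60%
print(f"Output grew {output_growth:.1f}-fold")   # nearly 10-fold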

Ford offered a very generous wage—$5 a day—to his workers, arguing that a mass production enterprise could not survive if average workers could not buy the goods. However, Ford expanded the company's Sociological Department to monitor his workers and ensure that they did not spend their newfound bounty on "vice and cheap thrills".

Electric street railways

Electric street railways developed into a major mode of transportation, and electric inter-urban service connected many cities in the Northeast and Midwest. Electric street railways also carried freight, which was important before trucks came into widespread use. The widespread adoption of the automobile and motor bus halted the expansion of the electric street railways during the 1920s.

Railroads

At the beginning of the 20th century the railroad network had over-expanded, with many miles of unprofitable routes. In 1906 Congress gave the Interstate Commerce Commission the power to regulate freight rates, and the industry was thereafter unable to increase revenue enough to cover rising costs. By 1916, the peak year of track mileage, one-sixth of the nation's railroad trackage was in bankruptcy.

The railroads proved inadequate to handle the increased freight volume created by World War I. There were major traffic jams in the system and critical supplies were experiencing delays. In December 1917 the railroads were taken over by the government and put under control of the United States Railroad Administration (USRA). The USRA ordered 1,930 new standardized steam locomotives and over 100,000 railcars. The USRA's control over the railroads ended in March 1920.

Automobiles and trucks

Harvey Firestone, Thomas Edison, Henry Ford, and Fred Seely in Asheville, North Carolina, 1918

By the dawn of the 20th century, automobiles had begun to replace horse-drawn carriages. Numerous companies were building cars, but car manufacturing was challenging. Consequently, prices were high and production was low. Mass production techniques of the mid 1910s brought down the cost of automobiles and sales grew dramatically. By 1919 automobile registrations were 6.6 million and truck registrations were 898,000.

Replacing horses with cars and trucks eliminated enormous quantities of horse manure and urine from city streets, greatly reducing the labor for street cleaning and also improving sanitation and living conditions. Reducing the number of horses used for transportation freed up between one-sixth and one-quarter of all farm land.

Highway system

In 1900 there were only 200 miles of paved roads outside of cities in the U.S. By the late 1920s automobiles were becoming common, but there were few highways connecting cities. The federal road building program had ended in 1818, leaving road building to the states until the Federal Aid Road Act of 1916. A national highway system was agreed on in 1926; when this numbered-highway program (not to be confused with the later Dwight D. Eisenhower National System of Interstate and Defense Highways) began, there were 23.1 million cars and 3.5 million trucks. The system was nearly complete when the U.S. entered World War II in December 1941.

Water supply and sewers

At the turn of the century approximately one-third of urban households had running water; however, most of it was untreated and carried disease-causing microorganisms. The widespread building of water treatment plants and piping of water to and sewage from urban households occurred in the early decades of the century. The number of urban households supplied with running filtered water increased from 6.3% in 1900 to 25% in 1910 and 42% in 1925. In 1908 the Jersey City Water Works in New Jersey was the first to sterilize water using sodium hypochlorite (chlorine bleach). Chlorination of drinking water became common in urban water supplies by the 1930s and contributed to a sharp reduction in diseases such as hepatitis A, typhoid fever, cholera and dysentery.

Agriculture

Internal combustion powered tractors appeared on farms in the mid-1910s, and farmers began using automobiles and trucks to haul produce. By 1924 tractors and trucks on farms numbered 450,000 and 370,000 respectively. Combined harvester-threshers reduced labor cost 85% compared to using binders and stationary threshers. The number of farms in the U.S. later fell from about 7 million in the 1930s to a little over 2 million in 2000, with the decline most rapid in the 1950s and 1960s. New technology allowed fewer, larger farms to produce more output, and during the 1950s and 1960s many rural residents moved to suburban areas closer to the cities. Smaller farms that could not afford the new machinery were often sold, while larger farms that could invest in it prospered.

Finance, money and banking

A major economic downturn in 1906 ended the expansion that had begun in the late 1890s. This was followed by the Panic of 1907, which was a factor in the establishment of the Federal Reserve System in 1913.

The mild inflation of the 1890s, attributed to the rising gold supply from mining, continued until World War I, at which time inflation rose sharply with wartime shortages including labor shortages. Following the war the rate of inflation fell, but prices remained above the prewar level.

The U.S. economy prospered during World War I, partly due to sales of war goods to Europe. The stock market had its best year in history in 1916. U.S. gold reserves doubled between 1913 and 1918, causing the price level to rise. Interest rates had been held low to minimize interest on war bonds, but after the final war bonds were sold in 1919, the Federal Reserve raised the discount rate from 4% to 6%. Interest rates rose and the money supply contracted. The economy entered the Depression of 1920–21, a sharp but brief deflationary contraction. By 1923, the economy had returned to full employment.

A debt-fueled boom developed following the war. Jerome (1934) gives an unattributed quote about finance conditions that allowed the great industrial expansion of the post World War I period:

Probably never before in this country had such a volume of funds been available at such low rates for such a long period.

There was also a real estate and housing bubble in the 1920s, especially in Florida, which burst in 1925. Alvin Hansen stated that housing construction during the 1920s exceeded population growth by 25%. See also: Florida land boom of the 1920s.

Debt reached unsustainable levels. Speculation in stocks drove prices up to unprecedented valuation levels. The stock market crashed in late October 1929.

Politics and regulation

Systematic regulation of major sectors of the economy and society was a priority of the federal government in this era. In the 19th century the federal government was not heavily involved in the private sector, except in the areas of land sales and transportation. In general, the concept of laissez-faire prevailed. It was a doctrine opposing government interference in the economy except to maintain law and order. This attitude started to change during the latter part of the 19th century, when small business, farm, and labor movements began asking the government to intercede on their behalf.

Fear of monopolies ("trusts") is shown in this attack on Rockefeller's Standard Oil Company.

By 1900, a middle class had developed that was leery of both the business elite and the somewhat radical political movements of farmers and laborers in the Midwest and West. Known as Progressives, these people favored government regulation of business practices to, in their minds, ensure competition and free enterprise. Congress enacted a law regulating railroads in 1887 (the Interstate Commerce Act), and one preventing large firms from controlling a single industry in 1890 (the Sherman Antitrust Act). These laws were not rigorously enforced, however, until the years between 1900 and 1920, when Republican President Theodore Roosevelt (1901–1909), Democrat President Woodrow Wilson (1913–1921), and others sympathetic to the views of the Progressives came to power. Many of today's U.S. regulatory agencies were created during these years, including the Interstate Commerce Commission and the Federal Trade Commission. Ida M. Tarbell wrote a series of articles against the Standard Oil monopoly. The series helped pave the way for the breakup of the monopoly.

Noon hour in a furniture factory. Indianapolis, Indiana, 1908

Muckrakers were journalists who encouraged readers to demand more regulation of business. Upton Sinclair's The Jungle (1906) showed America the horrors of the Chicago Union Stock Yards, a giant complex of meat processing that developed in the 1870s. The federal government responded to Sinclair's book with the new regulatory Food and Drug Administration.

President Wilson in 1913 using tariff, currency, and anti-trust laws to "prime the pump" and get the economy working

When Democrat Woodrow Wilson was elected president with a Democrat-controlled Congress in 1912, he implemented a series of progressive policies. In 1913, the Sixteenth Amendment was ratified, and the income tax was instituted in the United States. Wilson resolved the longstanding debates over tariffs and antitrust, and created the Federal Reserve, a complex business-government partnership that to this day dominates the financial world.

The Pure Food and Drug Act of 1906 was the first of a series of legislation that led to the establishment of the Food and Drug Administration (FDA). Another such act passed the same year was the Federal Meat Inspection Act. The new laws helped the large packers, and hurt small operations that lacked economy of scale or quality controls.

The Sixteenth Amendment to the United States Constitution, which allowed the Federal Government to tax all income, was adopted in 1913. Starting small, the tax shot up in 1917 to pay for the war, then declined in the 1920s.

The Emergency Quota Act (1921) established a quota system for immigrants by country of origin, with the maximum number of annual immigrants from a country limited to 3% of the number of people of that national background living in the U.S. according to the 1910 United States Census. The Immigration Act of 1924 reduced the quota from 3% to 2%, based it on the older 1890 census, and added additional restrictions on certain nationalities.
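The quota arithmetic itself was simple. The sketch below applies the two rates to a purely hypothetical nationality; the census count is illustrative, not historical, and for simplicity the same count is used for both base censuses.

# Illustrative quota calculation; the census count is a made-up example.
census_count = 500000  # hypothetical residents of a given national origin

quota_1921 = int(census_count * 0.03)  # Emergency Quota Act: 3% of the 1910 census
quota_1924 = int(census_count * 0.02)  # Immigration Act of 1924: 2% of the 1890 census

print(quota_1921, quota_1924)  # 15000 7500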

World War I

The World War involved a massive mobilization of money, taxes, and banking resources to pay for the American war effort and, through government-to-government loans, most of the Allied war effort as well.

Roaring Twenties: 1920–1929

People filing tax forms in 1920

Under Republican President Warren G. Harding, who called for normalcy and an end to high wartime taxes, Secretary of the Treasury Andrew Mellon raised the tariff, cut marginal tax rates and used the large surplus to reduce the federal debt by about a third from 1920 to 1930. Secretary of Commerce Herbert Hoover worked to introduce efficiency by regulating business practices. This period of prosperity, along with the culture of the time, was known as the Roaring Twenties. The rapid growth of the automobile industry stimulated industries such as oil, glass, and road-building. Tourism soared, and consumers with cars had a much wider radius for their shopping. Small cities prospered, and large cities had their best decade ever, with a boom in construction of offices, factories and homes. The new electric power industry transformed both business and everyday life. Telephones and electricity spread to the countryside, but farmers never recovered from the wartime bubble in land prices. Millions migrated to nearby cities. However, in October 1929 the stock market crashed and banks began to fail in the Wall Street Crash of 1929.

Quality of life

The early decades of the 20th century were remarkable for the improvements in the quality of life in the U.S. The quality of housing improved, with houses offering better protection against cold, and floor space per occupant increased. Sanitation was greatly improved by the building of water supply and sewage systems, plus the treatment of drinking water by filtration and chlorination. The changeover to internal combustion took horses off the streets and eliminated horse manure and urine and the flies they attracted. Federal regulation of food products and processing, including government inspection of meat processing plants, helped lower the incidence of food-related illness and death.

Infant mortality, which had been declining dramatically in the last quarter of the 19th century, continued to decline.

The workweek, which averaged 53 hours in 1900, continued to decline. The burden of household chores lessened considerably. Hauling water and firewood into the home every day was no longer necessary for an increasing number of households.

Electric light was far less expensive and higher quality than kerosene lamp light. Electric light also eliminated smoke and fumes and reduced the fire hazard.

Welfare capitalism

Beginning in the 1880s, but especially by the 1920s, some large non-union corporations such as Kodak, Sears, and IBM adopted the philosophy of paternalistic welfare capitalism. In this system, workers were treated as important stakeholders alongside owners and customers. In return for loyalty to the company, workers received long-term job security, health care, defined benefit pension plans, and other perks. Welfare capitalism was seen as good for society, but also for the economic interests of the company, as a way to prevent unionization, government regulation, and socialism or Communism, which became a major concern in the 1910s. By the 1980s, the philosophy had declined in popularity in favor of maximizing shareholder value at the expense of workers, and defined contribution plans such as 401(k)s replaced guaranteed pensions.

Great Depression and World War II: 1929–1945

Pre-war industry, commerce, and agriculture

Despite the Great Depression and World War II, the middle decades of the 20th century were among the highest for productivity growth. The research developed through informal cooperation between U.S. industry and academia grew rapidly and by the late 1930s exceeded the size of that taking place in Britain (although the quality of U.S. research was not yet on par with British and German research at the time).

Manufacturing

Productivity growth in manufacturing slowed from the electrification era of the early century, but remained moderate. Automation of factories became widespread during the middle decades as industry invested in newly developed instruments and controls that allowed fewer workers to operate vast factories, refineries and chemical plants.

Great Depression: 1929–1941

Stock exchange trading floor after the 1929 crash
 
"Broke, baby sick, and car trouble!" Dorothea Lange's 1937 photo of Missouri migrants living in a truck in California. Many displaced people moved to California to look for work during the Depression. John Steinbeck depicted the situation in The Grapes of Wrath.

Following the Wall Street Crash of 1929, the worldwide economy plunged into the Great Depression. The U.S. money supply contracted by one-third. The protectionist Smoot–Hawley Tariff Act incited retaliation from Canada, Britain, Germany and other trading partners. In 1932, Congress, worried about the rapidly growing deficit and national debt, raised income tax rates. Economists generally agree that these measures deepened an already serious crisis. By 1932, the unemployment rate was 25%. Conditions were worse in heavy industry, lumbering, export agriculture (cotton, wheat, tobacco), and mining. Conditions were not quite as bad in white-collar sectors and in light manufacturing.

Franklin Delano Roosevelt was elected President in 1932 without a specific program. He relied on a highly eclectic group of advisors who patched together many programs, known as the New Deal.

Table 2: Depression Data

                                          1929    1931    1933    1937    1938    1940
Real Gross National Product (GNP) [1]    101.4    84.3    68.3   103.9   103.7   113.0
Consumer Price Index [2]                 122.5   108.7    92.4   102.7    99.4   100.2
Index of Industrial Production [2]         109      75      69     112      89     126
Money Supply M2 ($ billions)              46.6    42.7    32.2    45.7    49.3    55.2
Exports ($ billions)                      5.24    2.42    1.67    3.35    3.18    4.02
Unemployment (% of civilian workforce)     3.1    16.1    25.2    13.8    16.5    13.9

[1] In 1929 dollars
[2] Index, 1935–39 = 100
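The severity of the contraction can be read directly from the table; the sketch below computes the 1929-to-1933 declines implied by its own figures.

# Percentage changes from 1929 to 1933, taken from the table above.
values_1929 = {"Real GNP": 101.4, "Industrial production": 109,
               "Money supply M2": 46.6, "Exports": 5.24}
values_1933 = {"Real GNP": 68.3, "Industrial production": 69,
               "Money supply M2": 32.2, "Exports": 1.67}

for series in values_1929:
    change = (values_1933[series] - values_1929[series]) / values_1929[series]
    print(f"{series}: {change:.0%}")
# Real GNP about -33%, industrial production -37%, M2 -31%, exports -68%.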

Spending

Government spending increased from 8.0% of GNP under Herbert Hoover in 1932 to 10.2% of GNP in 1936. Franklin D. Roosevelt balanced the "regular" budget, while the emergency budget was funded by debt, which increased from 33.6% of GNP in 1932 to 40.9% in 1936. Deficit spending had been recommended by some economists, most notably the Briton John Maynard Keynes. Roosevelt met Keynes but did not pay attention to his recommendations; after a meeting with Keynes, who kept drawing diagrams, Roosevelt remarked that "He must be a mathematician rather than a political economist". Keynes's prescription was for the government to keep spending heavily so that money stayed in circulation, and some have argued that a full Keynesian program could have greatly shortened, or even prevented, the Depression. Federal spending did rise after Roosevelt took office, but fiscal policy was tightened in other ways: the federal government had doubled income tax rates in 1932, and total government tax revenues as a percentage of GDP shot up from 10.8% in 1929 to 16.6% in 1933. Higher tax rates tended to reduce consumption and aggregate demand. Much sharper increases in government purchases came only after Japan's attack on Pearl Harbor in December 1941 brought the United States into World War II, pushing the economy quickly into an inflationary gap.

Banking crisis

CPI, 1914–2022
M2 money supply increase, year over year

In 1929–33 the economy was destabilized by bank failures. The initial causes were substantial losses in investment banking, followed by bank runs. Bank runs occurred when a large number of customers lost confidence in their deposits (which were not insured) and rushed to withdraw them. Runs destabilized many banks to the point where they faced bankruptcy. Between 1929 and 1933, 40% of all banks (9,490 out of 23,697) failed. Much of the Great Depression's economic damage was caused directly by bank runs.

Hoover had already considered a bank holiday to prevent further bank runs, but rejected the idea because he was afraid of triggering a panic. Roosevelt acted as soon as he took office; he closed all the banks in the country and kept them closed until he could pass new legislation. On March 9, Roosevelt sent to Congress the Emergency Banking Act, drafted in large part by Hoover's top advisors. The act was passed and signed into law the same day. It provided for a system of reopening sound banks under Treasury supervision, with federal loans available if needed. Three-quarters of the banks in the Federal Reserve System reopened within the next three days. Billions of dollars in hoarded currency and gold flowed back into them within a month, thus stabilizing the banking system. By the end of 1933, 4,004 small local banks were permanently closed and merged into larger banks. Their deposits totaled $3.6 billion; depositors lost a total of $540 million and eventually received on average 85 cents on the dollar of their deposits; it is a common myth that they received nothing back. The Glass–Steagall Act limited commercial bank securities activities and affiliations between commercial banks and securities firms in order to curb speculation. It also established the Federal Deposit Insurance Corporation (FDIC), which insured deposits (initially up to $2,500, a limit since raised to $250,000), ending the risk of runs on banks.
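The deposit figures above are consistent with the 85-cents-on-the-dollar recovery, as a quick check shows.

# Consistency check on the deposit-loss figures quoted above.
deposits = 3.6e9  # total deposits in the permanently closed banks, dollars
losses = 540e6    # total depositor losses, dollars

recovered = 1 - losses / deposits
print(f"Average recovery: {recovered:.0%} of deposits")  # 85%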

Unemployment

Unemployment reached 25 percent in the worst days of 1932–33, but it was unevenly distributed. Job losses were less severe among women than men, among workers in nondurable industries (such as food and clothing), in services and sales, and in government jobs. The least-skilled inner-city men had much higher unemployment rates, as did young people who had a hard time getting their first job and men over the age of 45 who, if they lost their job, would seldom find another because employers had their choice of younger men. Millions were hired in the Great Depression, but men with weaker credentials were never hired and fell into a long-term unemployment trap. The migration that had brought millions of farmers and townspeople to the bigger cities in the 1920s suddenly reversed itself, as unemployment made the cities unattractive, and the network of kinfolk and more ample food supplies made it wise for many to go back.

City governments in 1930–31 tried to meet the depression by expanding public works projects, as President Herbert Hoover strongly encouraged. However, tax revenues were plunging, and the cities as well as private relief agencies were totally overwhelmed; by 1931 they were unable to provide significant additional relief. They fell back on the cheapest possible relief, soup kitchens, which provided free meals for anyone who showed up. After 1933 new sales taxes and infusions of federal money helped relieve the fiscal distress of the cities, but the budgets did not fully recover until 1941.

The federal programs launched by Hoover and greatly expanded by President Roosevelt's New Deal used massive construction projects to try to jump-start the economy and solve the unemployment crisis. The alphabet agencies ERA, CCC, FERA, WPA and PWA built and repaired the public infrastructure in dramatic fashion, but did little to foster the recovery of the private sector. FERA, CCC, and especially the WPA focused on providing unskilled jobs for long-term unemployed men.

Relief

The extent to which the spending for relief and public works provided a sufficient stimulus to revive the U.S. economy, or whether it harmed the economy, is also debated. If one defines economic health entirely by gross domestic product, the U.S. had gotten back on track by 1934 and made a full recovery by 1936, but, as Roosevelt said, one-third of the nation was ill-fed, ill-housed and ill-clothed (see Chart 3). GNP was 34% higher in 1936 than in 1932, and 58% higher in 1940, on the eve of war. That is, the economy grew 58% from 1932 to 1940 in eight years of peacetime, and then grew another 56% from 1940 to 1945 in five years of wartime. The unemployment rate fell from 25.2% in 1932 to 13.9% in 1940, when the draft started. During the war the economy operated under so many different conditions that comparison with peacetime is impossible: massive spending, price controls, bond campaigns, controls over raw materials, prohibitions on new housing and new automobiles, rationing, guaranteed cost-plus profits, subsidized wages, and the draft of 12 million soldiers.
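Converting those cumulative growth figures into annual rates is a simple compound-growth calculation using only the percentages quoted above.

# Compound annual growth rates implied by the cumulative GNP growth figures above.
def annual_rate(total_growth, years):
    return (1 + total_growth) ** (1 / years) - 1

print(f"1932-1940: {annual_rate(0.58, 8):.1%} per year")  # about 5.9%
print(f"1940-1945: {annual_rate(0.56, 5):.1%} per year")  # about 9.3%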

Chart 3: GDP annual pattern and long-term trend, 1920–40, in billions of constant dollars

In 1995 economist Robert Whaples stated that measuring the effect of the New Deal remains a thorny issue for economists because it is so difficult to isolate the effects it had on the country. A survey of academic specialists by Whaples showed that, with regard to the statement "The New Deal lengthened and deepened the Great Depression", 27% of economists generally agreed, 22% agreed with provisos, and 51% disagreed. Among professional historians, 6% agreed, 21% agreed with provisos, and 74% disagreed. However, economist Eric Rauchway of the University of California stated that "very few people disapprove of most of the New Deal reforms", which include Social Security, the Securities and Exchange Commission, the Federal Deposit Insurance Corp., and Fannie Mae. Regardless, unemployment peaked at about 25% in 1932–33 and was reduced to 13.9% by 1940.

As Broadus Mitchell summarized, "Most indexes worsened until the summer of 1932, which may be called the low point of the depression economically and psychologically". Economic indicators show the American economy declined until February 1933. After Roosevelt took office, there began a steady, sharp upward recovery that persisted until the brief Recession of 1937–1938 (see graph), after which the indicators continued their upward climb. Thus the Federal Reserve Index of Industrial Production bottomed at 52.8 on July 1, 1932, and was practically unchanged at 54.3 on March 1, 1933; however, by July 1, 1933, it had climbed to 85.5 (with 1935–39 = 100, and for comparison 2005 = 1,342).

New Deal impact

A 2017 review of the published scholarship summarized the findings of researchers as follows:

The studies find that public works and relief spending had state income multipliers of around one, increased consumption activity, attracted internal migration, reduced crime rates, and lowered several types of mortality. The farm programs typically aided large farm owners but eliminated opportunities for share croppers, tenants, and farm workers. The Home Owners' Loan Corporation's purchases and refinancing of troubled mortgages staved off drops in housing prices and home ownership rates at relatively low ex post cost to taxpayers. The Reconstruction Finance Corporation's loans to banks and railroads appear to have had little positive impact, although the banks were aided when the RFC took ownership stakes.

Wartime output and controls: 1940–1945

Women making aluminum shells for the war in 1942

Unemployment dropped to 2%, relief programs largely ended, and the industrial economy grew rapidly to new heights as millions of people moved to new jobs in war centers, and 16 million men and 300,000 women were drafted or volunteered for military service.

All economic sectors grew during the war. Farm output went from an index (by volume) of 106 in 1939 to 128 in 1943. Coal output went from 446 million tons in 1939 to 651 in 1943; oil from 1.3 billion barrels to 1.5 billion. Manufacturing output doubled, from a volume index of 109 in 1939 to 239 in 1943. Railroads strained to move it all to market, going from an output of 13.6 billion loaded car miles in 1939 to 23.3 in 1943.

The War Production Board coordinated the nation's productive capabilities so that military priorities would be met. Converted consumer-products plants filled many military orders. Automakers built tanks and aircraft, for example, making the United States the "arsenal of democracy". In an effort to prevent rising national income and scarce consumer products from causing inflation, the newly created Office of Price Administration rationed and set prices for consumer items ranging from sugar to meat, clothing and gasoline, and otherwise tried to restrain price increases. It also set rent in war centers.

Six million women took jobs in manufacturing and production; most were newly created temporary jobs in munitions. Some were replacing men away in the military. These working women were symbolized by the fictional character of Rosie the Riveter. After the war many women returned to household work as men returned from military service. The nation turned to the suburbs, as a pent-up demand for new housing was finally unleashed.

Household gas, water, electricity, sanitation, heating, refrigeration

By 1940 nearly 100% of urban homes had electricity, 80% had indoor flush toilets, 73% had gas for heating or cooking, 58% had central heating, and 56% had mechanical refrigerators.

Post-World War II prosperity: 1945–1973

Quarterly gross domestic product

The period from the end of World War II to the early 1970s was a golden era of economic growth. $200 billion in war bonds matured, and the G.I. Bill financed a well-educated workforce. The middle class swelled, as did GDP and productivity. This growth was distributed fairly evenly across the economic classes, though not across racial lines, which some attribute to the strength of labor unions in this period; union membership peaked historically in the U.S. during the 1950s, in the midst of this massive economic growth. Much of the growth came from the movement of low-income farm workers into better-paying jobs in the towns and cities, a process largely completed by 1960.

Congress created the Council of Economic Advisers to promote high employment, high profits and low inflation. The Eisenhower administration (1953–1961) supported an activist contracyclical approach that helped to establish Keynesianism as a bipartisan economic policy for the nation. Especially important in formulating the CEA response to recession, through accelerating public works programs, easing credit, and reducing taxes, were Arthur F. Burns and Neil H. Jacoby. "I am now a Keynesian in economics", proclaimed Republican President Richard Nixon in 1971. Although this period brought economic expansion to the country as a whole, it was not recession-proof: the recessions of 1945, 1949, 1953, 1958, and 1960 each saw declines in GDP.

The "Baby Boom" saw a dramatic increase in fertility in the period 1942–1957; it was caused by delayed marriages and childbearing during depression years, a surge in prosperity, a demand for suburban single-family homes (as opposed to inner city apartments) and new optimism about the future. The boom crested about 1957, then slowly declined.

Agriculture

Farm machinery, fertilizer and high yield seed varieties

Ammonia from plants built during World War II to make explosives became available for making fertilizers, leading to a permanent decline in real fertilizer prices. The early 1950s was the peak period for tractor sales in the U.S. as the few remaining horses and mules were phased out. The horsepower of farm machinery underwent a large expansion. A successful cotton picking machine was introduced in 1949. The machine could do the work of 50 men picking by hand.

Research on plant breeding produced varieties of grain crops that could produce high yields with heavy fertilizer input. This resulted in the Green revolution, beginning in the 1940s. By the century's end yields of corn (maize) rose by a factor of over four. Wheat and soybean yields also rose significantly.

Government policies

The New Deal era farm programs were continued into the 1940s and 1950s, with the goal of supporting the prices received by farmers. Typical programs involved farm loans, commodity subsidies, and price supports. The rapid decline in the farm population led to a smaller voice in Congress, so the well-organized Farm Bureau and other lobbyists worked in the 1970s to appeal to urban Congressmen through food stamp programs for the poor. By 2000, the food stamp program was the largest component of the farm bill. In 2010, the Tea Party movement brought in many Republicans committed to cutting all federal subsidies, including those in agriculture. Meanwhile, urban Democrats strongly opposed reductions, pointing to the severe hardships caused by the 2008–10 economic recession. The Agricultural Act of 2014 saw many rural Republican Congressmen voting against the program despite its support from farmers; it passed with urban support.

Aircraft and air transportation industries

Air transport was a major beneficiary of the war. The United States was the leading producer of combat aircraft during World War II and had a large surplus of machine tools and manufacturing facilities for airplanes at the end of the war. There were also experienced airplane manufacturing and maintenance personnel. Additionally, radar had been developed just before the war.

The aircraft industry had the highest productivity growth of any major industry, growing by 8.9% per year in 1929–1966.

During World War II the B-29 program alone hired hundreds of thousands of workers, concentrated them at four major factories, and had a government budget of over $3 billion (equivalent to roughly $44 billion in 2019). The project required the US Army Air Forces to develop unprecedented organizational capabilities, as it involved several major private contractors and labor unions. American aircraft production was the single largest sector of the war economy, costing $45 billion (almost a quarter of the $183 billion spent on war production), employing two million workers, and producing over 125,000 aircraft. Production of selected U.S. military aircraft, 1941–1945: bombers, 49,123; fighters, 63,933; cargo aircraft, 14,710; total, 127,766.
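Two of the aircraft figures above can be checked against each other directly, using only the numbers quoted in the text.

# Aircraft production's share of war spending, and the sum of the type totals.
aircraft_spending = 45    # $ billions
total_war_spending = 183  # $ billions
print(f"Share of war production: {aircraft_spending / total_war_spending:.0%}")  # about 25%

by_type = {"Bombers": 49123, "Fighters": 63933, "Cargo": 14710}
print("Total aircraft:", sum(by_type.values()))  # 127766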

Housing

Very little housing had been built during the Great Depression and World War II, except for emergency quarters near war industries. Overcrowded and inadequate apartments were the common condition. Some suburbs had developed around large cities where there was rail transportation to the jobs downtown, but the real growth in suburbia depended on the availability of automobiles, highways, and inexpensive housing. The population had grown, and the stock of family savings had accumulated the money for down payments, automobiles and appliances. The result was a great housing boom. Whereas an average of 316,000 new non-farm housing units had been constructed annually from the 1930s through 1945, 1,450,000 units were built annually from 1946 through 1955.

The G.I. Bill of Rights guaranteed low-cost loans for veterans, with very low down payments and low interest rates. With 16 million eligible veterans, the opportunity to buy a house was suddenly at hand. In 1947 alone, 540,000 veterans bought one; their average price was $7,300. The construction industry kept prices low by standardization; for example, standardizing sizes for kitchen cabinets, refrigerators and stoves allowed for mass production of kitchen furnishings. Developers purchased empty land just outside the city, installed tract houses based on a handful of designs, and provided streets and utilities, while local public officials raced to build schools. The most famous development was Levittown, on Long Island just east of New York City. It offered a new house for $1,000 down and $70 a month; it featured three bedrooms, a fireplace, a gas range and gas furnace, and a landscaped lot of 75 by 100 feet, all for a total price of $10,000. Veterans could get one with a much lower down payment.

Interstate highway system

Construction of the Interstate Highway System began in 1956 under President Eisenhower. In long-term perspective the interstate highway system was a remarkable success that has done much to sustain Eisenhower's positive reputation. Although there have been objections to the negative impact of clearing neighborhoods in cities, the system has been well received. The railroad system for passengers and freight declined sharply, but trucking expanded dramatically and the cost of shipping and travel fell sharply. Suburbanization became possible, with the rapid growth of easily accessible, larger, cheaper housing than was available in the overcrowded central cities. Tourism dramatically expanded as well, creating demand for more service stations, motels, restaurants and visitor attractions. There was much more long-distance movement to the Sunbelt for winter vacations or for permanent relocation, with convenient access for visits to relatives back home. In rural areas, towns and small cities off the interstate grid lost out, as shoppers followed the interstate and new factories were located near it.

Computer technology

Mainframe business computer systems, built first with vacuum tubes and later with transistors, were introduced in the 1950s and were in widespread use by the 1960s. These computers handled a variety of accounting, billing and payroll applications.

One highly significant application was the Sabre airline reservation system, which first went into operation in 1960. With Sabre, reservations could be placed remotely using teleprinters, and all functions, including ticket printing, were handled automatically, eliminating the manual handling of file cards.

Fiscal policy

Federal taxes on incomes, profits and payrolls had risen to high levels during World War II and had been cut back only slowly; the highest rates for individuals reached the 90% level. Congress cut tax rates in 1964. President Lyndon B. Johnson (1963–69) dreamed of creating a "Great Society", and began many new social programs to that end, such as Medicaid and Medicare.

Military and space spending

After the Cold War began in 1947, and especially after the Korean War began in 1950, the government adopted a strategy of large-scale military spending outlined in NSC 68. Economists have examined how much this "military Keynesianism" stimulated the economy.

President Eisenhower feared that excessive military spending would damage the economy, so he downsized the Army after Korea and shifted priorities to missiles and nuclear weapons (which were much less expensive than army divisions). He also promoted the Interstate Highway system as necessary for national defense, and made space exploration a priority. His successor John F. Kennedy made a manned mission to the moon a national priority. Much of the new spending went to California and the West, a continuation of wartime spending.

An even greater impact came in the South, where it stimulated a modernization of the economy away from cotton towards manufacturing and high technology. For example, there were new, large technologically sophisticated installations at the Atomic Energy Commission's Savannah River Site in South Carolina; the Redstone Arsenal at Huntsville, Alabama; nuclear research facilities at Oak Ridge, Tennessee; and space facilities at Cape Canaveral, Florida, at the Lyndon B. Johnson Space Center in Houston, and at the John C. Stennis Space Center in Mississippi.

The Defense Department financed some of private industry's research and development throughout these decades, most notably ARPANET (which would become the Internet).

Decline of labor unions

The percentage of workers belonging to a union ("union density") in the United States peaked in 1954 at almost 35%, and the total number of union members peaked in 1979 at an estimated 21.0 million. Union membership has continued to decline into the 2010s due to right-to-work laws adopted by many states, globalization undermining higher-wage firms, and increasing political opposition, exemplified by President Ronald Reagan's breaking of the 1981 strike by the Professional Air Traffic Controllers Organization (PATCO).

Late 20th century

The US trade balance (from 1960)
 
U.S. trade balance and trade policies (1895–2015)
 
Number of countries having a banking crisis in each year since 1800, based on This Time is Different: Eight Centuries of Financial Folly, which covers 70 countries. The general upward trend might be attributed to many factors, one of which is a gradual increase in the percentage of people who receive money for their labor. The dramatic feature of this graph is the virtual absence of banking crises during the period of the Bretton Woods agreement, 1945 to 1971. The analysis is similar to Figure 10.1 in Reinhart and Rogoff (2009).

Post industrial (service) economy

Manufacturing employment and nominal value added shares of the economy have been in a steady decline since World War II. In the late 1960s manufacturing's share of both employment and nominal value added was about 26%, falling to about 11% and 12% respectively by the end of the century.

Per-capita steel consumption in the U.S. peaked in 1977, then fell by half before staging a modest recovery to levels well below the peak.

Service sector expansion

The decline in the relative size of manufacturing coincided with a rise in the size of the service sector.

Productivity slowdown

Technological innovations of the final third of the 20th century were significant, but were not as powerful as those of the first two-thirds of the century. Manufacturing productivity growth continued at a somewhat slower rate than in earlier decades, but overall productivity was dragged down by the relative increase in size of the government and service sectors.

Inflation woes: 1970s

The postwar boom ended with a number of events in the early 1970s.

In the late 1960s it was apparent to some that the juggernaut of economic growth was slowing, and the slowdown became visibly apparent in the early 1970s. The United States grew increasingly dependent on oil imports from OPEC after domestic production peaked in 1970, resulting in the oil supply shocks of 1973 and 1979. Stagflation gripped the nation, and the government experimented with wage and price controls under President Nixon. The Bretton Woods Agreement collapsed in 1971–1972, and President Nixon closed the gold window, taking the United States entirely off the gold standard.

President Gerald Ford introduced the slogan "Whip Inflation Now" (WIN). In 1974, productivity shrank by 1.5%, though it soon recovered. In 1976, Jimmy Carter won the presidency. Carter would later take much of the blame for the even more turbulent economic times to come, though some say the circumstances were outside his control. Inflation continued to climb. Productivity growth was small, when not negative. Interest rates remained high, with the prime rate reaching 20% in January 1981; Art Buchwald quipped that 1980 would go down in history as the year when it was cheaper to borrow money from the Mafia than from the local bank.

Unemployment dropped fairly steadily from 1975 to 1979, but then began to rise sharply.

This period also saw the increased rise of the environmental and consumer movements, and the government established new regulations and regulatory agencies such as the Occupational Safety and Health Administration, the Consumer Product Safety Commission, the Nuclear Regulatory Commission, and others.

Deregulation and Reaganomics: 1976–1992

Deregulation gained momentum in the mid-1970s, spurred by slow productivity growth and increasing operation and capital costs in several key sectors. It was not until 1978 that the first meaningful deregulation legislation, the Airline Deregulation Act, was cleared by Congress. Transportation deregulation accelerated in 1980, with the deregulation of railroads and trucking. Deregulation of interstate buses followed in 1982. In addition to transportation deregulation, savings and loan associations and banks were partially deregulated with the Depository Institutions Deregulation and Monetary Control Act in 1980 and the Garn–St. Germain Depository Institutions Act in 1982.

On a broader front, the economy initially recovered at a brisk pace from the 1973–75 recession. Incoming president Jimmy Carter instituted a large fiscal stimulus package in 1977 in order to boost the economy. However, inflation began a steep rise beginning in late 1978, and rose by double digits following the 1979 energy crisis. In order to combat inflation, Carter appointed Paul Volcker to the Federal Reserve, who raised interest rates and caused a sharp recession in the first six months of 1980. In March 1980, Carter introduced his own policies for reducing inflation, and the Federal Reserve brought down interest rates to cooperate with the initiatives.

During the 1980 recession, manufacturing shed 1.1 million jobs, while service industries remained intact. Employment in automotive manufacturing in particular suffered, experiencing a 33% reduction by the end of the recession. Collectively these factors contributed to the election of Ronald Reagan in 1980. The Federal Reserve once again began to raise interest rates in 1981, which plunged the economy back into recession. Unemployment rose to a peak of 10.8% in December 1982, a post-war high.

In 1981, Ronald Reagan introduced Reaganomics: fiscally expansive economic policies that cut marginal federal income tax rates by 25%. Inflation dropped dramatically, from 13.5% annually in 1980 to just 3% annually in 1983, owing to a short recession and Federal Reserve Chairman Paul Volcker's tighter control of the money supply and interest rates. Real GDP began to grow after contracting in 1980 and 1982. The unemployment rate continued to rise, peaking at 10.8% in late 1982, but had dropped to well under 6% by the end of Reagan's presidency in January 1989.

About 20 million jobs were created during Reagan's presidency, 82 percent of them high-paying and long-term. From 1982 to 1987 the Dow Jones Industrial Average gained over 1,900 points, rising from 776 in 1982 to 2,722 in 1987, an increase of roughly 250% (about 3.5 times its 1982 level). An economic boom took place from 1983 until a recession began in 1990. Between 1983 and 1989 the number of people below the poverty line decreased by 3.8 million.

The boom saw the increasing popularity of electronic appliances like computers, cell phones, music players and video games, and credit cards became a symbol of the era. The Reagan tax cuts seemed to work, and Americans were able to shrug off the crash of 1987 by the beginning of 1988. The growth ended in 1990, after seven years of stock market growth and prosperity for the upper and middle classes. The federal debt spawned by his policies nearly tripled, from $930 billion in 1981 to $2.6 trillion in 1988, reaching record levels.

Though debt almost always increased under every president in the latter half of the 20th century, it declined as a percentage of GDP under all presidents after 1950 and prior to Reagan. In addition to the fiscal deficits, the U.S. started to have large trade deficits. Also it was during his second term that the Tax Reform Act of 1986 was passed. Vice President George H. W. Bush was elected to succeed Reagan in 1988. The early Bush Presidency's economic policies were sometimes seen as a continuation of Reagan's policies, but in the early 1990s, Bush went back on a promise and increased taxes in a compromise with Congressional Democrats. He ended his presidency on a moderate note, signing regulatory bills such as the Americans With Disabilities Act, and negotiating the North American Free Trade Agreement. In 1992, Bush and third-party candidate Ross Perot lost to Democrat Bill Clinton.

The advent of deindustrialization in the late 1960s and early 1970s saw income inequality increase dramatically to levels never seen before. But at the same time, most orthodox economists, and most policy makers, pointed to the fact that consumers could buy so many goods, even with the inflation of the 1970s, as evidence that the general shift away from manufacturing and into services was creating widespread prosperity. In 1968, the U.S. Gini coefficient was 0.386. In 2005, the American Gini coefficient had reached 0.469.

Critics of the economic policies favored by Republican and Democratic administrations since the 1960s, particularly those expanding "free trade" and "open markets" (see Neoliberalism), say that these policies, though benefiting trade and lowering the cost of products in the U.S., may have taken their toll on the prosperity of the American middle class. In this period, however, consumers were buying as never before, with so many products and goods available at low cost and in high quantity. Critics argued that this consumer behavior gave a false reading of the health of the economy, because it was being paid for by rapidly rising household debt, covering up the stagnating wages and earnings of most of the workforce.

The rise of globalization: 1990s – late 2000

This graph shows three major stock indices since 1975. Notice the meteoric rise of the stock market in the 1990s, followed by the collapse of the dot-com bubble in 2000 on the tech-heavy NASDAQ.

During the 1990s, government debt increased by 75%, GDP rose by 69%, and the stock market as measured by the S&P 500 grew more than threefold.

From 1994 to 2000 real output increased, inflation was manageable and unemployment dropped to below 5%, resulting in a soaring stock market known as the dot-com boom. The second half of the 1990s was characterized by well-publicized initial public offerings of high-tech and "dot-com" companies. By 2000, however, it was evident that a bubble in stock valuations had formed, and beginning in March 2000 the market gave back some 50% to 75% of the growth of the 1990s.

Executive compensation

Ratio of the average compensation of CEOs at the top 350 firms to that of production workers, 1965–2009. Source: Economic Policy Institute, 2012. Based on data from Wall Street Journal/Mercer and Hay Group, 2010.

In the late 20th century, business executives began to be paid more in stock and less in salary, and their pay grew at a disproportionately greater rate than that of other employees and relative to their own performance. CEO compensation at the top 350 U.S. firms increased by 940.3% from 1978 to 2018, while the typical worker's annual compensation grew just 11.9% over the same period. This disparity has been criticized as unfair, and large payouts after poor performance have been criticized as bad for shareholders.

21st century

The economy worsened in 2001: output increased only 0.3%, unemployment and business failures rose substantially, and the country entered a recession that is often blamed on the September 11 attacks.

An additional factor in the fall of U.S. markets and investor confidence was a series of corporate scandals.

In 2001–2007, the red-hot housing market across the United States fueled a false sense of security regarding the strength of the U.S. economy.

Decline of labor unions

Most unions in America are aligned with one of two larger umbrella organizations: the AFL–CIO created in 1955, and the Change to Win Federation which split from the AFL-CIO in 2005. Both advocate policies and legislation on behalf of workers in the United States and Canada, and take an active role in politics. The AFL–CIO is especially concerned with global trade issues.

Child laborers in an Indiana glass works. Labor unions have an objective interest in combating child labor.

In 2010, the percentage of workers belonging to a union in the United States (or total labor union "density") was 11.4%, compared to 18.3% in Japan, 27.5% in Canada and 70% in Finland.

The most prominent unions are among public sector employees such as teachers, police and other non-managerial or non-executive federal, state, county and municipal employees. Members of unions are disproportionately older, male and residents of the Northeast, the Midwest, and California.

The majority of union members come from the public sector. Nearly 34.8% of public sector employees are union members. In the private sector, just 6.3% of employees are union members—levels not seen since 1932.

Union workers in the private sector average 10–30% higher pay than non-union workers in America after controlling for individual, job, and labor market characteristics.

Great Recession and aftermath (2007–2019)

US Employment to population ratio, 1990–2021

The Great Recession was a sharp decline in the United States' economy. In 2008, a series of related economic disasters hit the American and European financial systems. The bursting of a worldwide bubble in housing set the recession in motion. The end of housing bubbles in California, Florida and Arizona led to the collapse of housing prices and the shrinkage of the construction sector. Millions of mortgages (averaging about $200,000 each) had been bundled into securities called collateralized debt obligations that were resold worldwide. Many banks and hedge funds had borrowed hundreds of billions of dollars to buy these securities, which were now "toxic" because their value was unknown and no one wanted to buy them.

A series of the largest banks in the U.S. and Europe collapsed; some went bankrupt, such as Lehman Brothers with $690 billion in assets; others such as the leading insurance company AIG, the leading bank Citigroup, and the two largest mortgage companies were bailed out by the government. Congress voted $700 billion in bailout money, and the Treasury and Federal Reserve committed trillions of dollars to shoring up the financial system, but the measures did not reverse the declines. Banks drastically tightened their lending policies, despite infusions of federal money. The government for the first time took major ownership positions in the largest banks. The stock market plunged 40%, wiping out tens of trillions of dollars in wealth; housing prices fell 20% nationwide, wiping out trillions more. By late 2008 distress was spreading beyond the financial and housing sectors, especially as the "Big Three" of the automobile industry (General Motors, Ford and Chrysler) were on the verge of bankruptcy, and the retail sector showed major weaknesses. Critics of the $700 billion Troubled Assets Relief Program (TARP) expressed anger that much of the TARP money distributed to banks was seemingly unaccounted for, with banks being secretive about the issue.

President Barack Obama signed the American Recovery and Reinvestment Act of 2009 in February 2009; the bill provided $787 billion in stimulus through a combination of spending and tax cuts. The plan was largely based on the Keynesian theory that government spending should offset the fall in private spending during an economic downturn; otherwise the fall in private spending may perpetuate itself and productive resources, such as the labor hours of the unemployed, will be wasted. Critics claimed that government spending cannot offset a fall in private spending because the government must borrow money from the private sector in order to add money to it. However, most economists do not think such "crowding out" is an issue when interest rates are near zero and the economy is stagnant. Opponents of the stimulus also pointed to problems of possible future inflation and government debt caused by such a large expenditure.

In the U.S., jobs paying between $14 and $21 per hour made up about 60% of those lost during the recession, but such mid-wage jobs comprised only about 27% of jobs gained during the recovery through mid-2012. In contrast, lower-paying jobs constituted about 58% of the jobs regained.

Gig work

In the 21st century, proliferation of smartphones and increasing reliance on outsourced labor led to a class of "gig workers" who perform work such as package delivery for a company like Amazon.com or Walmart, food delivery for a company like DoorDash, or driving for a ridehailing company like Uber or Lyft. This has led to disputes alleging misclassification of employees as independent contractors.

Great Lockdown and aftermath (2019–present)

On September 16, 2019, the Federal Reserve announced that it would begin intervening in the repurchase agreement (repo) markets to provide funds after the overnight lending rate jumped above 8% due to a series of technical factors that limited the supply of available funds. Five months later, the American stock markets suffered one of the sharpest and fastest crashes in modern U.S. history as a result of concerns surrounding the coronavirus pandemic and the Russia-Saudi Arabia oil price war. Before the crash, the unemployment rate stood at 3.6% in late 2019, the lowest in half a century. Although the unemployment rate decreased substantially between 2009 and 2019, income inequality continued to increase. In September 2019, the United States Census Bureau reported that income inequality in the United States had reached its highest level in 50 years, with the Gini index increasing from 48.2 in 2017 to 48.5 in 2018. Despite the low unemployment, the ISM Manufacturing Index dropped below 50% in August 2019 and fell to 48.3% in October of that year, its lowest level since June 2009; it remained below 50% in the months leading up to the crash.

Saturday, April 8, 2023

Hunger in the United States

Members of the United States Navy serving hungry Americans at a soup kitchen in Red Bank, N.J., during a 2011 community service project.

Hunger in the United States of America affects millions of Americans, including some who are middle class, or who are in households where all adults are in work. The United States produces far more food than it needs for domestic consumption—hunger within the U.S. is caused by some Americans having insufficient money to buy food for themselves or their families. Additional causes of hunger and food insecurity include neighborhood deprivation and agricultural policy. Hunger is addressed by a mix of public and private food aid provision. Public interventions include changes to agricultural policy, the construction of supermarkets in underserved neighborhoods, investment in transportation infrastructure, and the development of community gardens. Private aid is provided by food pantries, soup kitchens, food banks, and food rescue organizations.

Historically, the U.S. was a world leader in reducing hunger both domestically and internationally. In the latter half of the twentieth century, other advanced economies in Europe and Asia began to overtake the U.S. in terms of reducing hunger among their own populations. In 2011, a report presented in the New York Times found that among 20 economies recognized as advanced by the International Monetary Fund and for which comparative rankings for food security were available, the U.S. was joint worst. Nonetheless, in March 2013, the Global Food Security Index commissioned by DuPont, ranked the U.S. number one for food affordability and overall food security.

In 2018, about 11.1% of American households were food insecure. Surveys have consistently found much higher levels of food insecurity for students, with a 2019 study finding that over 40% of US undergraduate students experienced food insecurity. Indicators suggested the prevalence of food insecurity for US households approximately doubled during the COVID-19 pandemic, with an especially sharp rise for households with young children.

The Human Rights Measurement Initiative finds that the US is achieving 87.6% of what should be possible at its income level for fulfilling the right to food.

Food insecurity

Food insecurity is defined at the household level as not having adequate food for every household member because of a lack of money or other resources. The step beyond this is very low food security, defined as reporting six or more food-insecure conditions (for households without children) or eight or more (for households with children) on the U.S. Department of Agriculture's Food Security Supplement Survey. Being very low food secure means that members of the household disrupt their food intake for financial reasons.

These conditions include worrying about running out of food, food that was bought not lasting, being unable to afford a balanced diet, adults cutting portion sizes or skipping meals entirely, eating less than they felt they should, being hungry but not eating, unintended weight loss, and repeatedly not eating for whole days, all for financial reasons.

Food insecurity is closely related to poverty, but the two do not fully overlap: not every household below the poverty line is food insecure, and food insecurity also occurs above it. Food insecurity does not exist in isolation; it is just one of the many interacting social determinants of health.

Hunger vs. food insecurity

According to the United States Department of Agriculture (USDA), food insecurity is "a household-level economic and social condition of limited or uncertain access to adequate food." Hunger, on the other hand, is defined as "an individual-level physiological condition that may result from food insecurity." The USDA has also created a language to describe various severities of food insecurity. High food security occurs when there are "no reported indications of food-access problems or limitations." Marginal food security occurs when there are one to two reported indications of "food-access problems or limitations" such as anxiety over food shortages in the household but no observable changes in food intake or dietary patterns. Low food security, previously called food insecurity without hunger, occurs when individuals experience a decrease in the "quality, variety, or desirability of diet" but do not exhibit reduced food intake. Very low food security, previously called food insecurity with hunger, is characterized by "multiple indications of disrupted eating patterns and reduced food intake."
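To make the thresholds above concrete, here is a minimal sketch in R (not the USDA's official scoring code) that classifies a household from the number of survey conditions it affirms, using the cutoffs described in the two preceding paragraphs: zero conditions for high food security, one to two for marginal, six or more (eight or more for households with children) for very low food security, and low food security in between.

  # Minimal sketch, not the USDA's official scoring code. raw_score is the
  # number of food-insecure conditions a household affirms on the Food
  # Security Supplement Survey; cutoffs follow the description in the text.
  classify_food_security <- function(raw_score, has_children = FALSE) {
    very_low_cutoff <- if (has_children) 8 else 6  # 8+ with children, 6+ without
    if (raw_score == 0) {
      "high food security"
    } else if (raw_score <= 2) {
      "marginal food security"
    } else if (raw_score < very_low_cutoff) {
      "low food security"
    } else {
      "very low food security"
    }
  }
  # Example: a household without children affirming 7 conditions
  classify_food_security(7, has_children = FALSE)  # "very low food security"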

Geographic disparities

Rural and urban communities

Infographic about food insecurity in the US

There are distinct differences between how hunger is experienced in rural and urban settings. Rural counties experience high food insecurity rates twice as often as urban counties. It has been reported that approximately 3 million rural households are food insecure, equal to 15 percent of all rural households. This reflects the fact that 7.5 million people in rural regions live below the federal poverty line. This poverty in rural communities is more commonly found in Southern states. The prevalence of food insecurity is highest in principal cities (13.2%), nearly as high in rural areas (12.7%), and lowest in suburban and other metropolitan areas outside principal cities (8.9%). This may reflect poor infrastructure in rural areas and in the downtowns of cities, where jobs may be scarce, or a reliance on modes of transit that come at additional cost.

In addition, rural areas possess fewer grocery stores than urban regions with similar population density and geography, though rural areas do have more supermarkets than similar urban areas. Research has found that rural counties' poverty level and racial composition have no direct, significant association with supermarket access in the area. In urban areas, by contrast, numerous studies have shown that a larger African American population correlates with fewer supermarkets, and the stores that are available require residents to travel longer distances. Despite these differences, both urban and rural areas experience a higher rate of hunger than suburban areas.

Food deserts

Neighborhoods without access to affordable and nutritious food are often referred to as food deserts. Although there is not a national definition for food deserts, most measures take into account the following factors:

  1. Accessibility to sources of healthy food, as measured by distance to a store or by the number of stores in an area.
  2. Individual-level resources that may affect accessibility, such as family income or vehicle availability.
  3. Neighborhood-level indicators of resources, such as the average income of the neighborhood and the availability of public transportation.

These measures look different across geographic regions. In rural areas, an area is considered a food desert if the nearest grocery store is more than 10 miles away. According to Feeding America, rural communities make up 63% of counties in the United States and 87% of counties with the highest rates of overall food insecurity. However, according to a study by the USDA, areas with higher poverty rates are more likely to be food deserts regardless of whether they are rural or urban.

According to the USDA, in 2015, about 19 million people, around 6% of the United States population, lived in a food desert, and 2.1 million households both lived in a food desert and lacked access to a vehicle. However, the definition of a food desert and the number of people living in one are constantly evolving, as they depend on census information.

Regions and states

Regionally, the food insecurity rate is highest in the South (12.0 percent).

States experience different rates of food insecurity, at varying levels of severity. Food insecurity is highest in AL, AR, IN, KY, LA, MS, NC, NM, OH, OK, TX, and WV.

Demographics

Food insecurity by household in America, 2012

Research by the USDA found that 11.1% of American households were food insecure during at least some of 2018, with 4.3% suffering from "very low food security". That amounts to 14.3 million households, or an estimated 37.2 million people, living in food-insecure households in 2018. Of these 37.2 million people, approximately six million were children living in food-insecure households, and around half a million children experienced very low food security. Low food security is defined as modifying eating patterns because of financial difficulty in obtaining food.

A survey conducted for Brookings in late April 2020 found indications that, following the onset of the COVID-19 pandemic, the number of US households experiencing food insecurity had approximately doubled. For households with young children, indicators suggested food insecurity may have reached about 40%, close to four times the prevalence in 2018 and triple the previous peak, which occurred in 2008 during the financial crisis of 2007–08.

Food insecurity disproportionately affects people in Black and Hispanic communities, low-income households, households headed by single women, and immigrant communities.

Children

In 2011 16.7 million children lived in food-insecure households, about 35% more than 2007 levels, though only 1.1% of U.S. children, 845,000, saw reduced food intake or disrupted eating patterns at some point during the year, and most cases were not chronic.

Almost 16 million children lived in food-insecure households in 2012. Schools throughout the country had 21 million children participate in a free or reduced lunch program and 11 million children participate in a free or reduced breakfast program. The extent of American youth facing hunger is clearly shown through the fact that 47% of SNAP (Supplemental Nutrition Assistance Program) participants are under the age of 18. The states with the lowest rate of food insecure children were North Dakota, Minnesota, Virginia, New Hampshire, and Massachusetts as of 2012.

The National School Lunch Program (NSLP) was established under the National School Lunch Act, signed by President Harry Truman in 1946.

In 2018, six million children experienced food insecurity. Feeding America estimates that around one in seven children, or approximately 11 million, experience hunger and do not know where or when they will get their next meal. The wide gap between these sources' figures may be explained by the fact that food insecurity does not encompass all hunger and is only a strong predictor of it. 13.9% of households with children experience food insecurity, rising to 14.3% for households with children under the age of six.

College students

A growing body of literature suggests that food insecurity is an emerging concern among college students. A systematic review of food insecurity among US students in higher education found a prevalence of 43.5%, more than twice the rate reported for United States households nationally. Data have been collected to estimate prevalence both nationally and at specific institutions (two- and four-year colleges). For example, an Oregon university reported that 59% of its students experienced food insecurity, whereas a correlational study conducted at the University of Hawaii at Manoa found that 21–24% of undergraduate students were food insecure or at risk of food insecurity. Data from a large southwestern university show that 32% of college freshmen who lived in residence halls self-reported inconsistent access to food in the past month. According to a 2011 survey of City University of New York (CUNY) undergraduates, about two in five students reported being food insecure.

Elderly

Volunteers preparing meals for Meals on Wheels recipients

Like children, the elderly population of the United States is vulnerable to the negative consequences of hunger. Between 2009 and 2011, the number of seniors facing the threat of hunger rose by 0.9%, to 8.8 million; of these, 1.9 million seniors were actually dealing with hunger at the time. Seniors are particularly vulnerable to hunger and food insecurity largely because of their limited mobility: they are less likely to own a car and drive, and when they live in communities that lack public transportation, it can be quite challenging to access adequate food. Approximately 5.5 million senior citizens face hunger in the United States, a number that has increased by 45% since 2001, and it is projected that more than 8 million senior citizens will be affected by 2050. Senior citizens are at increased risk of food insecurity because many have fixed incomes and must choose between health care and food, and most eligible seniors fail to enroll in food assistance programs such as SNAP. The organization Meals on Wheels reports that Mississippi, New Mexico, Arkansas, and Texas are the states with the highest rates of seniors facing the threat of hunger. Due to food insecurity and hunger, the elderly population experiences negative effects on overall health and mental wellbeing. Not only are they more prone to reporting heart attacks, other cardiac conditions, and asthma, but food-insecure seniors are also 60% more likely to develop depression.

Gender

Households headed by single women experience food insecurity at higher rates than the national average. For households without children, 14.2% of female single-headed households experience food insecurity, compared to 12.5% of male single-headed households. For households with children, 27.8% of female single-headed households experience hunger, compared to 15.9% of male single-headed households.

Race and ethnicity

Minority groups are affected by hunger to a far greater extent than the Caucasian population in the United States. According to research on food insufficiency by race conducted by Washington University in St. Louis, 11.5% of Whites experience food insufficiency, compared to 22.98% of African Americans, 16.67% of American Indians, and 26.66% of Hispanics.

Feeding America reports that 29% of all Hispanic children and 38% of all African American children received emergency food assistance in 2010. White children received such aid at less than half that rate, with 11% receiving assistance. However, Hispanic households are less likely than other ethnic groups to interact with SNAP and to receive assistance from the program.

Black

The same 2018 survey shows a racial disparity in hunger and food insecurity: 21.2% of Black households experienced food insecurity. This becomes alarming when comparing poverty rates for Black and White Americans, since the groups experiencing the most severe poverty also experience the most food insecurity, and about 9% of African Americans live in deep poverty. Furthermore, "The 10 counties with the highest food insecurity rates in the nation are at least 60% African-American. Seven of the ten counties are in Mississippi". This illustrates how socioeconomic status and race intersect to produce the greatest food insecurity.

Hispanic/Latino

Hispanic/Latino populations also experience inequitably high rates of food insecurity in the United States. In 2020, amid the global COVID-19 pandemic, 19% of all Latinos in America were designated as food insecure. In the United States, Hispanic/Latino families experience nearly twice the rate of food insecurity as non-Hispanic/Latino white families, and research repeatedly shows a higher risk of food insecurity in immigrant families and among children of non-citizens.

According to Feeding America, this phenomenon is connected to the following:

  • "Racial prejudice and language, education, and cultural barriers create inequalities that make Latino communities more vulnerable to food insecurity.
  • Due to the coronavirus pandemic, food insecurity among Latinos rose from almost 16% in 2019 to more than 19% in 2020. Latinos were 2.5 times more likely to experience food insecurity than white individuals.
  • Latino workers, especially Latinas, are more likely to be employed in the leisure and hospitality industries that have been devastated by the coronavirus pandemic. Workers in these industries continue to face the highest unemployment rate.
  • According to the Census, Latinos make up 28% of the people living in poverty in the United States. However, Latinos make up only 19% of the total population of the United States."

Another study, published in 2019 by the Journal of Adolescent Health, found that 42% of Hispanic/Latino youth experienced food insecurity; additionally, 10% lived in a very low food secure household. Food insecurity in Hispanic/Latino youth has severe health consequences, including acculturative and economic stress and weakened family support systems.

Native American

While Native American food security went largely unnoticed and under-researched until recent years, more studies are now being conducted, revealing that Native Americans often experience higher rates of food insecurity than any other racial group in the United States. The available studies do not capture the overall picture of Native American households, however, and tend to rely on smaller sample sizes. One study that evaluated food insecurity among White, Asian, Black, Hispanic and Indigenous Americans over the ten-year span 2000–2010 reported that Indigenous people were among the groups at highest risk of lacking access to adequate food, with anywhere from 20% to 30% of households suffering from this type of insecurity. Many factors contribute to the issue, but the biggest are high food costs on or near reservations, lack of access to well-paying jobs, and predisposition to health issues relating to obesity and mental health.

Undocumented immigrants

Agriculture is a major industry in the United States, with California accounting for over 12% of U.S. agricultural cash receipts. Over half of agricultural workers in California are undocumented. Agricultural labor is among the lowest-paid occupations in the U.S. Many undocumented immigrants suffer from food insecurity due to low wages, forcing them to purchase cheap but unhealthy food. Though existing food pantry and food stamp programs help reduce the number of food-insecure individuals, undocumented immigrants are ineligible for social service programs, and studies have found that limited English acts as a barrier to food stamp program participation. Due to a lack of education, higher incarceration rates, and language barriers, undocumented immigrants experience higher rates of food insecurity and hunger than legal citizens. Undocumented immigrants who fear deportation limit their interactions with government agencies and social service programs, increasing their susceptibility to food insecurity.

Food insecurity among undocumented immigrants can in some cases be traced to environmental injustices. Researchers argue that climate change increases food insecurity through drought and floods, and that issues of food security and the food systems of the U.S. must be addressed. An example is the large undocumented communities of California's Central Valley: towns across the valley exhibit some of the highest rates of air, water and pesticide pollution in the state.

Health consequences

Hunger can produce a multitude of health consequences, including mental, emotional, and physical symptoms and signs. A person experiencing hunger may feel tired, cold, or dizzy and have dry, cracked skin and swelling. Signs may include thinning of the face, pale flaky skin, low blood pressure, low pulse, low temperature and cold extremities. Additional signs denoting more extreme cases include vitamin deficiencies, osteomalacia, anemia, muscle tenderness, weakening of the muscular system, loss of sensation in the extremities, heart failure, cracked lips, diarrhea, and dementia. Severe hunger can lead to shrinking of the digestive tract, promote bacterial growth in the intestines, cause deterioration in heart and kidney function, and impair the immune system.

Early development

Children who experience hunger have an increase in both physical and psychological health problems. Hunger can lead to multiple health consequences, including impaired prenatal development, low birth weight, higher frequency of illness, and delays in mental and physical development. This impairment may cause educational problems, which often lead to children being held back a year in school.

Although there is no direct correlation between chronic illnesses and hunger among children, the overall health and development of children decreases with exposure to hunger and food insecurity. Children are more likely to get ill and require a longer recovery period when they do not consume the necessary amount of nutrients. Additionally, children who consume a large amount of highly processed, packaged goods are more likely to develop chronic diseases such as diabetes and cardiovascular disease, because these food items contain a high amount of calories in the form of added sugars and fats. Children experiencing hunger in the first three years of life are more likely to be hospitalized, experience higher rates of anemia and asthma, develop a weakened immune system, and develop chronic illnesses as adults. Hunger in later stages of childhood can cause a delayed onset of puberty, changing the rate of secretion of critically needed hormones.

Mental health and academic performance

With regard to academics, children who experience hunger perform worse in school on both math and reading assessments. Children who consistently start the day with a nutritious breakfast score on average 17.5% higher on standardized math tests than children who regularly miss breakfast. Behavioral issues arise both in the school environment and in the children's ability to interact with peers of the same age, as identified by parental and teacher observations and assessments. Such children are more likely to repeat a grade in elementary school and to experience developmental impairments in areas like language and motor skills.

Hunger takes a psychological toll on youth and negatively affects their mental health. Lack of food contributes to the development of emotional problems and leads children to visit a psychiatrist more often than their sufficiently fed peers. Research shows that hunger plays a role in late-youth and young-adult depression and suicidal ideation; it was identified as a factor in 5.6% of depression and suicidal ideation cases in a Canadian longitudinal study.

Senior health

Food insecurity has a negative impact on seniors' health and nutrition outcomes relative to food-secure seniors. Research has found that seniors experiencing hunger are more susceptible to physical health issues such as high blood pressure, congestive heart failure, coronary heart disease, heart attack, asthma, and oral health problems.

Pregnancy

Food insecurity also has direct health consequences on pregnancy. Food insecurity during pregnancy is associated with gestational weight gain and pregnancy complications, second-trimester anemia, pregnancy-induced hypertension, and gestational diabetes mellitus (GDM), worse systolic blood pressure trends, increased risk of birth defects, and decreased breastfeeding.

Causes

Hunger and food insecurity in the United States are both a symptom and a consequence of a complex combination of factors, including but not limited to poverty, housing insecurity, environmental injustice, unemployment, economic inequality, systemic racism, and national policies and protections. There is no single cause of hunger, and there is much debate over who or what is responsible for its prevalence in the United States.

Hunger and poverty

Researchers most commonly focus on the link between hunger and poverty. The federal poverty level is defined as "the minimum amount of income that a household needs to be able to afford housing, food, and other basic necessities." As of 2020, the federal poverty level for a family of four was $26,200.

Based on her research on poverty, Pennsylvania State University economic geographer Amy Glasmeier claims that when individuals live at, slightly above, or below the poverty line, unexpected expenses contribute to individuals reducing their food intake. Medical emergencies have a significant impact on poor families due to the high cost of medical care and hospital visits. Also, urgent car repairs reduce a family's ability to provide food, since the issue must be addressed in order to allow individuals to travel to and from work. Although income cannot be labeled as the sole cause of hunger, it plays a key role in determining if people possess the means to provide basic needs to themselves and their family.

The loss of a job reflects a core issue that contributes to hunger - employment insecurity. People who live in areas with higher unemployment rates and who have a minimal or very low amount of liquid assets are shown to be more likely to experience hunger or food insecurity. The complex interactions between a person's job status, income and benefits, and the number of dependents they must provide for, influence the impact of hunger on a family. For example, food insecurity often increases with the number of additional children in the household due to the negative impact on wage labor hours and an increase in the household's overall food needs.

Low-income and low access communities

Food deserts

Location plays a critical role in access to affordable and nutritious food. People who live in food deserts are more likely to experience food insecurity because food is harder to obtain based on where they live. Living in low-income, low access communities that are considered food deserts can prevent individuals from easily accessing healthy food markets and grocery stores due to lack of availability. Lack of access to grocery stores often leads to reliance on corner stores and convenience stores for food. These stores usually offer less nutritious foods, which causes dietary issues in food insecure populations, such as high rates of diabetes.

Research has expanded the definition of food access beyond grocery store availability to include store quality, community acceptability, healthy and unhealthy food-marketing practices, product quality, and affordability.

There are several theories that attempt to explain why food deserts form. One theory proposes that the expansion of large chain supermarkets results in the closure of smaller-sized, independent neighborhood grocery stores. Market competition thereby produces a void of healthy food retailers in low-income neighborhoods.

Another theory suggests that in the period between 1970 and 1988, there was increasing economic segregation, with a large proportion of wealthy households moving from inner cities to more suburban areas. As a result, the median income in the inner cities rapidly decreased, causing a substantial proportion of supermarkets in these areas to close. Furthermore, business owners and managers are often discouraged from establishing grocery stores in low-income neighborhoods due to reduced demand for low-skilled workers, low-wage competition from international markets, zoning laws, and inaccurate perceptions about these areas.

Transportation

The vast majority of individuals living in food deserts struggle with transportation to food sources. In urban areas, people living in lower-income communities may be less able to easily and regularly reach grocery stores, which tend to be located far from their homes.

Social inequality

Social inequality is one of the major reasons for nutritional inequalities across America. There is a direct relationship between socioeconomic status and malnutrition: people with poor living conditions, less education, less money, and from disadvantaged neighborhoods are more likely to experience food insecurity and less healthy eating patterns, resulting in higher levels of diet-related health issues. According to the United States Department of Agriculture (USDA) Economic Research Service, 10.2 percent (13.5 million) of US households were food insecure in 2021. Although this number may seem low, Americans of higher socioeconomic status have far better access to healthy, high-quality food than Americans of lower socioeconomic status, and the gap between the two groups continues to widen. This is due to income inequality, the higher cost of healthy food, a lack of proper education on nutritional health, food shortages, and a lack of government action to establish health equity.

Housing and neighborhood deprivation

An additional contributor to hunger and food insecurity in the U.S.A is neighborhood deprivation. According to the Health & Place Journal, neighborhood deprivation is the tendency for low-income, minority neighborhoods to have greater exposure to unhealthy tobacco and alcohol advertisements, a fewer number of pharmacies with fewer medications, and a scarcity of grocery stores offering healthy food options in comparison to small convenience stores and fast-food restaurants.

Studies have shown that within these food deserts there are distinct racial disparities. For example, compared to predominantly white neighborhoods, predominantly black neighborhoods have been reported to have half the number of chain supermarkets available to residents.

Agricultural policy

Another cause of hunger is related to agricultural policy. Due to the heavy subsidization of crops such as corn and soybeans, healthy foods such as fruits and vegetables are produced in lesser abundance and generally cost more than highly processed, packaged goods. Because unhealthful food items are readily available at much lower prices than fruits and vegetables, low-income populations often heavily rely on these foods for sustenance. As a result, the poorest people in the United States are often simultaneously undernourished and overweight or obese. This is because highly processed, packaged goods generally contain high amounts of calories in the form of fat and added sugars yet provide very limited amounts of essential micronutrients. These foods are thus said to provide "empty calories."

No right to food for US citizens

In 2017, the US Mission to International Organizations in Geneva explained,

"Domestically, the United States pursues policies that promote access to food, and it is our objective to achieve a world where everyone has adequate access to food, but we do not treat the right to food as an enforceable obligation."

The US is not a signatory of Article 11 of the International Covenant on Economic, Social, and Cultural Rights, which recognizes "the fundamental right of everyone to be free from hunger," and has been adopted by 158 countries. Activists note that "United States opposition to the right to adequate food and nutrition (RtFN) has endured through Democratic and Republican administrations."

Holding the federal government responsible for ensuring the population is fed has been criticized as "nanny government." The right to food in the US has been criticized as being "associated with un-American and socialist political systems", "too expensive", and as "not the American way, which is self-reliance." Anti-hunger activists have countered that "It makes no political sense for the US to continue to argue that HRF [the human right to food] and other economic rights are “not our culture” when the US pressures other nations to accept and embrace universal civil-political rights that some argue are not their culture."

Olivier De Schutter, former UN Special Rapporteur on the Right to Food, notes that one difficulty in promoting a right to food in the United States is a "constitutional tradition that sees human rights as “negative” rights—rights against government—not “positive” rights that can be used to oblige government to take action to secure people's livelihoods."

The Constitution of the United States "does not contain provisions related to the right to adequate food," according to the FAO.

Franklin D. Roosevelt proposed that a Second Bill of Rights was needed to ensure the right to food. The phrase "freedom from want" in Roosevelt's Four Freedoms has also been considered to encompass a right to food.

A 2009 article in the American Journal of Public Health declared that "Adopting key elements of the human rights framework is the obvious next step in improving human nutrition and well-being."

It characterizes current US domestic policy on hunger as being needs-based rather than rights-based, stating:

"The emphasis on charity for solving food insecurity and hunger is a “needs-based” approach to food. The needs-based approach assumes that people who lack access to food are passive recipients in need of direct assistance. Programs and policy efforts that use this approach tend to provide assistance without expectation of action from the recipient, without obligation and without legal protections."

Because "there is no popularly conceived, comprehensive plan in the U.S. with measurable benchmarks to assess the success or failures of the present approach [to hunger]," it is difficult for the US public to hold "government actors accountable to progressively improving food and nutrition status."

In 2014, the American Bar Association adopted a resolution urging the US government "to make the realization of a human right to adequate food a principal objective of U.S. domestic policy.”

An August 2019 article explains that the Supplemental Nutrition Assistance Program (SNAP, formerly known as the Food Stamp Program) only partially fulfills the criteria set out by a right to food.

Jesse Jackson has stated that it was Martin Luther King's dream that all Americans would have a right to food.

Coronavirus (COVID-19) pandemic and food insecurity

Unemployment

Recent studies suggest that food insecurity in the United States has doubled overall, and tripled among households with children, since the start of the COVID-19 pandemic. Food insecurity rates can be predicted from the national unemployment rate, because food insecurity is measured by both access to food and the ability to afford it. During past economic downturns, food insecurity and food shortages have risen not only during the year of the downturn but for several years afterward.

In 2020, the COVID-19 pandemic and economic volatility spurred mass layoffs and hour reductions, particularly in the transportation, service, leisure, and hospitality industries and among domestic workers. As a result of lost wages, individuals and families working in these industries are increasingly likely to be food and housing insecure.

Racial and gender disparities

Unemployment and food insecurity linked to the COVID-19 pandemic have disproportionately affected people of color and their communities. People of color are employed in many industries susceptible to layoffs, and as a result of the pandemic these workers have experienced high levels of unemployment. People of color are also more likely to work 'essential' jobs, such as grocery store clerk or healthcare work, putting them at higher risk of exposure to the virus, which can lead to loss of hourly wages. According to the U.S. Census Bureau Household Pulse Survey, "among adults living in households where someone experienced losses in employment income, 36% of adults in households with an income of less than $25,000 reported either 'sometimes not having enough to eat' or 'often not having enough to eat' in the past week, compared with just 2.1% of adults in households with an income of $100,000 or more."

Women in particular have been more vulnerable than men to job loss as a result of the COVID-19 pandemic. Women, especially minority women, are overrepresented in the education, healthcare, and hospitality industries. According to the National Women's Law Center, before the pandemic "women held 77% of the jobs in education and health services, but they account for 83% of the jobs lost in those sectors." According to the United States Department of Agriculture, in 2015 over 30% of households with children headed by a single mother were food insecure, and this number is expected to rise as a result of any economic downturn.

Children and school meal programs

The Brookings Institution found that the United States experienced a 65% increase in food insecurity among households with children. For example, in the third week of June 2020, approximately 13.9 million children were living in food-insecure households, 5.6 times as many as in all of 2018 (2.5 million) and 2.7 times the peak of the Great Recession in 2008 (5.1 million).

According to No Kid Hungry and The Hunger Partnership, more than 22 million kids get a free or reduced-price school lunch on an average school day. School closures and transitions to remote learning across the country due to the pandemic caused many schools to fall back on their summer plans for food distribution, requiring families to pick up food at specific times of the day in the neighborhoods with the greatest need. However, many children who qualify for these programs are not receiving meals, often because parents and caretakers cannot pick up meals at the designated times, having returned to work or lacking transportation.

Fighting hunger

Public sector hunger relief

As of 2012, the United States government spent about $50 billion annually on 10 programs, mostly administered by the Center for Nutrition Policy and Promotion, which together deliver food assistance to one in five Americans.

The largest and only universal program is the Supplemental Nutrition Assistance Program, formerly known as the food stamp program. In the 2012 fiscal year, $74.6 billion in food assistance was distributed. As of December 2012, 47.8 million Americans were receiving on average $133.73 per month in food assistance.

Despite efforts to increase uptake, an estimated 15 million eligible Americans are still not using the program. Historically, about 40 million Americans were using the program in 2010, while in 2001, 18 million were claiming food stamps. After cut backs to welfare in the early 1980s and late 1990s, private sector aid had begun to overtake public aid such as food stamps as the fastest growing form of food assistance, although the public sector provided much more aid in terms of volume.

This changed in the early 21st century; the public sector's rate of increase in the amount of food aid dispensed again overtook the private sector's. President George W. Bush's administration undertook bipartisan efforts to increase the reach of the food stamp program, increasing its budget and reducing both the stigma associated with applying for aid and barriers imposed by red tape. Cuts in the food stamp program came into force in November 2013, impacting an estimated 48 million poorer Americans, including 22 million children.

Most other programs are targeted at particular groups of citizens. The largest of these is the School Lunch Program, which in 2010 helped feed 32 million children a day. The second largest is the School Breakfast Program, which fed 16 million children in 2010. The next largest is the Special Supplemental Nutrition Program for Women, Infants and Children, which provided food aid for about 9 million women and children in 2010.

A program that is neither universal nor targeted is the Emergency Food Assistance Program. This is a successor to the Federal Surplus Relief Corporation, which used to distribute surplus farm production directly to poor people; the program now works in partnership with the private sector, delivering the surplus produce to food banks and other civil society agencies.

In 2010, the Obama administration initiated the Healthy Food Financing Initiative (HFFI) as a means of expanding access to healthy foods in low-income communities. With over $400 million in funding from the Department of Health and Human Services, the Department of Agriculture and the Treasury Department, the initiative promoted interventions such as equipping already existing grocery stores and small retailers with more nutritious food options and investing in the development of new healthful food retailers in rural and urban food deserts.

Although many US presidents made efforts to address hunger, between 1969 and September 28, 2022, no administration convened a comparable national effort to resolve food insecurity, which results from many issues, social inequality among them. After more than 50 years, the Biden administration brought together nutrition advocates, anti-hunger groups, food companies, local and state governments, and tribal and territorial communities to lay out a plan to combat hunger in the US. The Washington Post reported that President Biden wants to end hunger by 2030, permanently extend the child tax credit (CTC), raise the federal minimum wage, and expand nutrition assistance programs; the CTC was effective in reducing food insecurity during the COVID-19 pandemic. The president described a five-pillar approach at the conference: Pillar 1, improve food access and affordability; Pillar 2, integrate nutrition and health; Pillar 3, empower all consumers to make and have access to healthy choices; Pillar 4, support physical activity for all; and Pillar 5, enhance nutrition and food security research. The fifth pillar relates most directly to resolving social inequality, which had previously been largely ignored; it focuses on equity and accessibility, improving metrics, data collection, and research for nutrition and food security policy. This announcement and plan aim to mitigate the problem by establishing health equity, ensuring that all people have the opportunity to maintain a healthy diet and reach their full health regardless of color, gender, race, sexual orientation, social status, or neighborhood.

Agricultural policy

Another potential approach to mitigating hunger and food insecurity is modifying agricultural policy. The implementation of policies that reduce the subsidization of crops such as corn and soybeans and increase subsidies for the production of fresh fruits and vegetables would effectively provide low-income populations with greater access to affordable and healthy foods. This method is limited by the fact that the prices of animal-based products, oils, sugar, and related food items have dramatically decreased on the global scale in the past twenty to fifty years. According to the Nutritional Review Journal, a reduction or removal of subsidies for the production of these foods will not appreciably change their lower cost in comparison to healthier options such as fruits and vegetables.

Current agricultural policy favors monocultures and large corporate farming. These are usually not in favor of community food needs. An alternative agricultural policy would turn toward more diversity in crops and allow communities to more locally define their own agricultural and food system policies that are socially, economically, ecologically, and culturally appropriate. This is food sovereignty.

Supermarket construction

Local and state governments can also work to pass legislation that calls for the establishment of healthy food retailers in low-income neighborhoods classified as food deserts. The implementation of such policies can reduce hunger and food insecurity by increasing the availability and variety of healthy food options and providing a convenient means of access. Examples include the Pennsylvania Fresh Food Financing Initiative and the New York City FRESH (Food Retail Expansion to Support Health) program, which promote the construction of supermarkets in low-income neighborhoods by offering a reduction in land or building taxes for a certain period of time and providing grants, loans, and tax exemptions for infrastructure costs. Such policies may be limited by the oligopolistic nature of supermarkets, in which a few large chains hold the large majority of market share and exercise considerable influence over retail locations and prices.

Transportation infrastructure

If it is unfeasible to implement policies aimed at grocery store construction in low-income neighborhoods, local and state governments can instead invest in transportation infrastructure. This would provide residents of low-income neighborhoods with greater access to healthy food options at more remote supermarkets. This strategy may be limited by the fact that low-income populations often face time constraints in managing employment and caring for children and may not have the time to commute to buy healthy foods. Furthermore, this method does not address the issue of neighborhood deprivation, failing to resolve the disparities in access to goods and services across geographical space.

A small group of people planting fruits or vegetables in a communal garden.

Community gardens

Local governments can also mitigate hunger and food insecurity in low-income neighborhoods by establishing community gardens. According to the Encyclopedia of Community, a community garden is "an organized, grassroots initiative whereby a section of land is used to produce food or flowers or both in an urban environment for the personal use or collective benefit of its members." Community gardens are beneficial in that they provide community members with self-reliant methods for acquiring nutritious, affordable food. This contrasts with safety net programs, which may alleviate food insecurity but often foster dependency.

According to the Journal of Applied Geography, community gardens are most successful when they are developed using a bottom-up approach, in which community members are actively engaged from the start of the planning process. This empowers community members by allowing them to take complete ownership over the garden and make decisions about the food they grow. Community gardens are also beneficial because they allow community members to develop a better understanding of the food system, the gardening process, and healthy versus unhealthy foods. Community gardens thereby promote better consumption choices and allow community members to maintain healthier lifestyles.

Despite the many advantages of community gardens, community members may face challenges in regard to accessing and securing land, establishing organization and ownership of the garden, maintaining sufficient resources for gardening activities, and preserving safe soils.

Private sector hunger relief

Volunteers pass out food items from a Feeding America food bank.

The oldest type of formal hunger relief establishment used in the United States is believed to be the almshouse, but these are no longer in existence. In the decades after WWII, a notion spread that hunger had been all but alleviated in Western countries. One man in the U.S., John van Hengel, was frustrated by the lack of attention to food insecurity and in 1967 established the first food bank, in Phoenix, Arizona. Named the St. Mary's Food Bank Alliance, it worked by collecting food that had been thrown away by grocery stores because it was no longer salable but was still perfectly good for human consumption. Around the same time, from 1969 through the 1980s, the Black Panther Party ran a very effective free breakfast program, launched by Bobby Seale in January 1969 at Father Earl A. Neil's St. Augustine's Episcopal Church in West Oakland. In the 21st century, hunger relief agencies run by civil society include:

  • Soup kitchens, along with similar establishments like food kitchens and meal centers, provide hot meals for the hungry and are the second most common type of food aid agency in the U.S. Unlike food pantries, these establishments usually provide only a single meal per visit, but they have the advantage for the end user of generally providing food with no questions asked.
  • Food pantries are the most numerous food aid establishment found within the United States. The food pantry hands out packages of groceries to the hungry. Unlike soup kitchens, they invariably give out enough food for several meals, which is to be consumed off the premises. A related establishment is the food closet, which serves a similar purpose to the food pantry, but will never be a dedicated building. Instead, a food closet will be a room within a larger building like a church or community center. Food closets can be found in rural communities too small to support a food pantry. Food pantries often have procedures to prevent unscrupulous people from taking advantage of them, such as requiring registration. Unlike soup kitchens, food pantry relief helps people make it from one paycheck to another when their funds run low.
  • Food banks are the third most common type of food aid agency. While some give food directly to the hungry, food banks in the U.S. generally perform a warehouse-like function, distributing food to front-line agencies such as food pantries and soup kitchens.
  • Food rescue organizations also perform a warehouse-like function, distributing food to front-line organizations, though they are less common and tend to operate on a smaller scale than food banks do. Whereas food banks may receive supplies from large growers, manufacturers, supermarkets, and the federal government, rescue organizations typically retrieve food from sources such as restaurants and smaller shops and farms.

Together, these civil society food assistance establishments are sometimes called the "Emergency Food Assistance System" (EFAS). In 2010, an estimated 37 million Americans received food from the EFAS. However, the amount of aid it supplies is much less than that provided by the public sector; an estimate made in 2000 suggested that the EFAS was able to give out only about $9.50 worth of food per person per month. According to a comprehensive government survey completed in 2002, about 80% of emergency kitchens and food pantries, over 90% of food banks, and all known food rescue organizations were established in the US after 1981, with much of the growth occurring after 1991.

There are several federal laws in the United States that promote food donation. The Bill Emerson Good Samaritan Food Donation Act encourages individuals to donate food to certain qualified nonprofit organizations and ensures liability protection to donors. Similarly, Internal Revenue Code 170(e)(3) grants tax deductions to businesses in order to encourage them to donate healthy food items to nonprofit organizations that serve low-income populations. Lastly, the U.S. Federal Food Donation Act of 2008 encourages Federal agencies and Federal agency contractors to donate healthy food items to non-profit organizations for redistribution to food insecure individuals. Such policies curb food waste by redirecting nutritious food items to individuals in need.

These policies have also created debate over whether it is sustainable to rely on surplus food for food aid. While some view this as a "win-win" solution because it feeds people while reducing food waste, others argue that it prevents either food waste or food insecurity from being systematically addressed at its root. A stigma also commonly attaches to people who rely on donations to feed themselves: they are unable to shop for food in grocery stores, are told that surplus food is an acceptable way to meet their needs, may not have all of their nutritional needs met, cannot count on consistent amounts and variety of food, and are diverted from a solution that would fully address people's right to food.

Food Justice

Food Justice is a social movement approach to combating hunger. Food Justice seeks to provide greater food access to all communities through the creation of local food systems, such as urban agriculture and farmers markets. Locally based food networks move away from the globalized economy to provide food solutions appropriate to the communities they serve. The Food Justice Movement specifically aims to address the disproportionate levels of food insecurity facing communities of color. Organizations in the movement often aim to reduce the high prevalence of food deserts and the lack of nutritious foods seen in neighborhoods of color.

Race and class play significant roles in the location of food deserts and high food insecurity. Historically, communities of color have been subject to policies and laws that reduce their ability to be self-sufficient in food production. Community members past and present work as farm laborers while their own communities lack power or access in their own food systems. As a result, communities of color are susceptible to economic segregation, and healthy food in their neighborhoods is likely to be more expensive than in wealthier areas. Because of this history of inequality, a growing number of projects aim to enable low-income communities and people of color to create sustainable food systems.

History

Pre-19th century

Early settlers to North America often suffered from hunger, though some were saved from starvation thanks to aid from Native Americans such as Pocahontas.

European colonists attempting to settle in North America during the 16th and early 17th century often faced severe hunger. Compared with South America, readily available food could be hard to come by. Many settlers starved to death, leading to several colonies being abandoned. Other settlers were saved after being supplied with food by Native Americans, with the intercession of Pocahontas being a famous example. It did not take long, however, for colonists to adapt to conditions in the New World, discovering North America to be a place of extraordinary fertility. According to author Peter K. Eisinger, the historian Robert Beverley's portrayal of America as the "Garden of the World" was already a stock image as early as 1705. By the time of the Declaration of Independence in 1776, hunger was already considerably less severe than in Western Europe. Even by 1750, the low prevalence of hunger had helped provide American colonists with an estimated life expectancy of 51 years, compared with 37 in Britain and 26 in France; by 1800, life expectancy had improved to 56 years in the U.S. and 33 years in France, and had dropped to 36 years in Britain. The relative scarcity of hunger in the U.S. was due in part to low population pressure in relation to fertile land, and in part to labor shortages, which prevented able-bodied people from falling into the extreme poverty associated with unemployment.

19th century

Until the early 19th century, even the poorest citizens of the United States were generally protected from hunger by a combination of factors. The ratio of productive land to population was high. Upper-class Americans often still held to the old European ideal of noblesse oblige and made sure their workers had sufficient food. Labor shortages meant the poor could invariably find a position; although until the American Revolution this often involved indentured servitude, it at least protected the poor from the unpredictable nature of wage labor, and paupers were sometimes rewarded with their own plot of land at the end of their period of servitude. Additionally, working-class traditions of looking out for each other were strong.

Social and economic conditions changed substantially in the early 19th century, especially with the market reforms of the 1830s. While overall prosperity increased, productive land became harder to come by and was often available only to those who could afford substantial rates. It became more difficult to make a living either from public lands or from a small farm without substantial capital to buy up-to-date technology. Small farmers were sometimes forced off their lands by economic pressure and became homeless. American society responded by opening numerous almshouses, and some municipal officials began giving out small sums of cash to the poor. Such measures did not fully check the rise in hunger; by 1850, life expectancy in the US had dropped to 43 years, about the same as then prevailed in Western Europe.

The number of hungry and homeless people in the U.S. increased in the 1870s due to industrialization. Though economic developments were hugely beneficial overall, driving America's Gilded Age, they had a negative impact on some of the poorest citizens. As was the case in Europe, many influential Americans believed in classical liberalism and opposed federal intervention to help the hungry, as they thought it could encourage dependency and would disrupt the operation of the free market. The 1870s saw the AICP and the American branch of the Charity Organization Society successfully lobby to end the practice whereby city officials handed out small sums of cash to the poor. Despite this, there were no nationwide restrictions on private efforts to help the hungry, and civil society immediately began to provide alternative aid for the poor, establishing soup kitchens in U.S. cities.

20th century

Following the "rediscovery" of hunger in America during the late 1960s, President Richard Nixon addressed Congress saying: "That hunger and malnutrition should persist in a land such as ours is embarrassing and intolerable.... More is at stake here than the health and well-being of 16 million American citizens.... Something very like the honor of American democracy is at issue."

By the turn of the century, improved economic conditions were helping to reduce hunger for all sections of society, even the poorest. The early 20th century saw a substantial rise in agricultural productivity; while this led to rural unemployment even in the otherwise "roaring" 1920s, it helped lower food prices throughout the United States. During World War I and its aftermath, the U.S. was able to send over 20 million pounds of food to relieve hunger in Europe. The United States has since been a world leader in relieving hunger internationally, although its foreign aid has sometimes been criticized for being poorly targeted and politicized. An early critic who argued against the U.S. on these grounds in the 1940s was Lord Boyd-Orr, the first head of the UN's Food and Agriculture Organization. It can also be said that the U.S. caused hunger internationally through developmental and colonial privatization practices.

The United States' progress in reducing domestic hunger was thrown into reverse by the Great Depression of the 1930s. The existence of hunger within the U.S. became a widely discussed issue due to coverage in the mass media. Both civil society and government responded. Existing soup kitchens and bread lines run by the private sector increased their opening times, and many new ones were established. Government-sponsored relief was one of the main strands of the New Deal launched by President Franklin D. Roosevelt. Some of the government-established alphabet agencies aimed to relieve poverty by raising wages, others by reducing unemployment, as with the Works Progress Administration. The Federal Surplus Relief Corporation aimed to tackle hunger directly by providing poor people with food. By the late 1940s, these various relief efforts, combined with improved economic conditions, had been successful in substantially reducing hunger within the United States.

According to sociology professor Janet Poppendieck, hunger within the US was widely considered to be a solved problem until the mid-1960s. By the mid-sixties, several states had ended the free distribution of federal food surpluses, instead providing an early form of food stamps, which had the benefit of allowing recipients to choose food of their liking, rather than having to accept whatever happened to be in surplus at the time. There was however a minimum charge; some people could not afford the stamps, causing them to suffer severe hunger. One response from American society to the rediscovery of hunger was to step up the support provided by private sector establishments like soup kitchens and meal centers. The food bank, a new form of civil society hunger relief agency, was invented in 1967 by John van Hengel. It was not however until the 1980s that U.S. food banks began to experience rapid growth.

A second response to the "rediscovery" of hunger in the mid-to-late sixties, spurred by Joseph S. Clark's and Robert F. Kennedy's tour of the Mississippi Delta, was the extensive lobbying of politicians to improve welfare. The Hunger lobby, as it was widely called by journalists, was largely successful in achieving its aims, at least in the short term. In 1967 a Senate subcommittee held widely publicized hearings on the issue, and in 1969 President Richard Nixon made an emotive address to Congress where he called for government action to end hunger in the U.S.

In the 1970s, U.S. federal expenditure on hunger relief grew by about 500%, with food stamps distributed free of charge to those in greatest need. According to Poppendieck, welfare was widely considered preferable to grass roots efforts, as the latter could be unreliable, did not give recipients consumer-style choice in the same way as did food stamps, and risked recipients feeling humiliated by having to turn to charity. In the early 1980s, President Ronald Reagan's administration scaled back welfare provision, leading to a rapid rise in activity from grass roots hunger relief agencies.

Poppendieck says that for the first few years after the change, there was vigorous opposition from the political Left, who argued that state welfare was much more suitable for meeting recipients' needs; many found this idea questionable, while others thought it fit the situation well. But in the decades that followed, while never achieving the reduction in hunger that food stamps did in the 1970s, food banks became an accepted part of America's response to hunger.

The USDA Economic Research Service began releasing statistics on household food security in the U.S. in 1985.

In the 1980s under Reagan's administration, the Task Force on Food Assistance formally defined hunger in the US for the first time, describing it as a social phenomenon in which a person lacks the means to obtain sufficient food. This differentiated it from the medical definition of hunger and meant that people could be considered hungry even without physical symptoms. Starting in 1995, a Food Security Supplement was added to the Census to gather data on how many Americans struggle to acquire food, a survey that remains in place to this day. In 2006, a review of USDA hunger measurements led to separate definitions of "food insecure" and "hungry" and created different levels of food insecurity based on data measurements.

In 1996, the Welfare Reform Act was passed, making EBT the mode of delivering benefits to participants in the Food Stamp Program. This act also gave states more control over administering the program, and added limitations to who was eligible for benefits.

Demand for the services of emergency hunger relief agencies increased further in the late 1990s, after the "end of welfare as we know it" with President Clinton's Personal Responsibility and Work Opportunity Act.

21st century

Lauren Bush at a 2011 party promoting her FEED charity, which helps fund the United Nations' efforts to feed children throughout the world

In comparison to other advanced economies, the U.S. had high levels of hunger even during the first few years of the 21st century, due in part to greater inequality and relatively less spending on welfare. As was generally the case across the world, hunger in the U.S. was made worse by the lasting global inflation in the price of food that began in late 2006 and by the financial crisis of 2008. By 2012, about 50 million Americans were food insecure, approximately 1 in 6 of the population, with the proportion of children facing food insecurity even higher at about 1 in 4.

Hunger has increasingly begun to affect even middle-class Americans. According to a 2012 study by the UCLA Center for Health Policy Research, even married couples who both work but have low incomes may now sometimes require emergency food assistance.

In the 1980s and 90s, advocates of small government had been largely successful in depoliticizing hunger, making it hard to launch effective efforts to address the root causes, such as changing government policy to reduce poverty among low earners. In contrast to the 1960s and 70s, the 21st century has seen little significant political lobbying for an end to hunger within America, though by 2012 there had been an increase in efforts by various activists and journalists to raise awareness of the problem. American society has, however, responded to increased hunger by substantially increasing its provision of emergency food aid and related relief, from both the private and public sector, and from the two working together in partnership.

According to a USDA report, 14.3% of American households were food insecure during at least some of 2013, falling to 14% in 2014; the report stated the fall was not statistically significant. The percentage of households experiencing very low food security remained at 5.6% for both 2013 and 2014. In a July 2016 discussion on the importance of private sector engagement with the Sustainable Development Goals, Malcolm Preston, the global sustainability leader at PricewaterhouseCoopers, suggested that unlike the older Millennium Development Goals, the SDGs are applicable to advanced economies because of issues such as hunger in the United States. Preston stated that one in seven Americans struggles with hunger, with food banks in the US now more active than ever.

In the wake of the COVID-19 pandemic, unemployment and food insecurity in the U.S. soared. In 2020, over 60 million people turned to food banks and community programs for help putting food on the table.
