The literature on finding balance in life may advise Americans to work to live rather than live to work. But the fact is that most Americans locate their identity in their work. It’s not an accident that the question most Americans ask when they meet someone new is: “What do you do?”

For many Americans, work provides a sense of purpose, accomplishment and identity. Professional achievements are often seen as a validation of one’s skills, intelligence and worth. Workplaces often serve as important social environments where people form relationships and a sense of community.

No wonder work remains the foundation of Americans’ identity. Work defines who we are. Work has become the core of our self-image and our public image. We are what we do.

Of course, other factors such as religion, family, politics and leisure activities also contribute to personal identity. But they generally do not hold the same central place as people’s work or career.

While religion remains important for many Americans, the overall trend toward secularization has diminished its role as the primary source of identity. Indeed, religious identities are often privatized. Work and religion are often seen as separate spheres of life, with professional success taking precedence in public and social life.

For many, politics, like religion, is seen as part of their private beliefs rather than as a primary public identity, in contrast to the more publicly acknowledged professional identity. Somewhat similarly, leisure activities, including hobbies, tend to take a backseat to career commitments in defining one’s identity.

While family is important, the inward-turning, emotionally bounded structure of the nuclear family and the demands of modern life tend to place career at the forefront. Economic pressures and the need for dual-income households prioritize work over family in terms of identity. Meanwhile, high levels of geographic and social mobility have weakened extended family ties, making individual career achievements more central to personal identity.

When and why did people’s identity become defined in terms of their work? In terms of timing, it appears that the post–World War II era marked a significant turning point, when a series of social, economic and demographic developments intersected. Suburbanization and the decline in distinct ethnic neighborhoods, the growth of middle management and corporate bureaucracies, and a sharp increase in college attendance and mothers in the job market combined to make workplace identities more salient.

Of course, there were other contributors as well. There was a tendency, evident even in the 19th century, to equate one’s personal worth and virtue with professional achievement and economic success. The idea that individuals can achieve success and upward mobility through hard work and entrepreneurship has reinforced the notion that one’s career and work are central to personal identity. The narrative of self-made success places a strong emphasis on individual effort and career accomplishments as the main paths to achieving the American Dream.

We may have moved far from the idealized Protestant work ethic, with its emphasis on hard work, discipline, and frugality as a route to spiritual salvation as well as worldly success. But the idea that work is a moral duty and a pathway to self-worth remains firmly intact.

Professional achievements and job titles often confer social status and respect. Americans often use their jobs to signal their social identity, values and belonging to certain social or professional groups.

Many, and perhaps most, of us find fulfillment and a sense of purpose through our work. It provides a sense of accomplishment and satisfaction, and contributes significantly to our self-esteem and self-image. Recognition for one’s skills and talents is a major source of pride and self-identity. The emphasis on meritocracy further reinforces the belief that hard work and talent are the primary means to achieve success, further linking personal identity to professional accomplishments.

How is it, we must ask, that work became culturally ascendant, and the pursuit of a career achieved a kind of centrality in the American psyche?


Over the past century, a number of prominent figures predicted that the workweek would shrink thanks to advances in technology and productivity and growing affluence.

In his 1930 essay, “Economic Possibilities for Our Grandchildren,” the economist John Maynard Keynes wrote:

“For the first time since his creation man will be faced with his real, his permanent problem—how to use his freedom from pressing economic cares, how to occupy the leisure which science and compound interest will have won for him, to live wisely and agreeably and well.”

Two years later, in his 1932 essay “In Praise of Idleness,” the philosopher Bertrand Russell made a similar prediction:

“Modern technique has made it possible for leisure, within limits, to be not the prerogative of a small privileged class but a right evenly distributed throughout the community … If the ordinary wage-earner worked four hours a day, there would be enough for everybody, and no unemployment—assuming a certain very moderate amount of sensible organization.”

In 1956, then Vice President Richard M. Nixon also predicted a shorter workweek:

“We are on the threshold of a 20-hour work week, and a good thing too. The shorter work week would become one of the great by-products of the Industrial Revolution, and it would mark an important step in our march to a fuller and more abundant life.”

Spoiler alert: Those things didn’t happen. Time stress is one of the defining features of 21st-century adult life, shaping our relations with our intimate partners, our children, relatives and friends.

That these predictions failed to come true demonstrates that the workweek is not determined by technological advances or increases in productivity, but rather by the complex interplay of cultural, economic, political and social factors.

In the face of staunch resistance from business, productivity gains were channeled into higher output and profits rather than shorter workweeks.

While the average workweek in the United States has decreased since the early 20th century, the reduction has not been nearly as substantial as those predictions suggested. The 40-hour workweek, established in the late 1930s, has remained largely unchanged ever since.

At the turn of the 20th century, the typical workweek for many industrial workers was significantly longer than today, often ranging from 48 to 60 hours per week. One of the labor movement’s major goals was to reduce these long hours. A popular labor slogan in the late 19th century was “Eight hours for work, eight hours for sleep, and eight hours for what we will.”

The Fair Labor Standards Act (FLSA) of 1938 was a key piece of legislation that established the 40-hour workweek in the United States, mandating overtime pay for hours worked beyond this threshold. This marked a significant reduction in work hours for many American workers.

During the 1950s and 1960s, the standard workweek for full-time workers stabilized around 40 hours. Since then, the average workweek has not declined. Instead, gains in productivity translated into higher output and wages and benefits rather than shorter hours.

Today, the average workweek for full-time American workers remains at about 39 to 41 hours per week. To be sure, there is considerable variability in work hours depending on the industry, occupation and socioeconomic status. Professionals, managers and salaried employees often work longer hours, while part-time workers and those in lower-wage jobs may work fewer hours but often seek additional work to make ends meet.

Hourly, freelance and gig workers, who make up around 55 percent of the American workforce, tend to have irregular and less predictable work hours. The key effects of unpredictable hours include income variability and less access to employer-provided health insurance, sick leave, vacation time and retirement benefits. Erratic work hours make it challenging to arrange reliable childcare and manage family responsibilities, resulting in increased stress and difficulty in fulfilling both work and family roles. Inconsistent schedules can make it difficult to build strong relationships with colleagues and supervisors, potentially leading to a sense of isolation at work.

The average American adult works many more hours a year than his or her European counterparts. The International Labor Organization reported that in 2002 the average American worked 1,815 hours, about 17 percent more than the comparable figure for France (1,545) and over 25 percent more than for Germany (1,444). I should note that the U.S. figure pales in comparison to that of the average South Korean employee, who worked over 2,400 hours, nearly a third more than in the United States.


Why didn’t work hours decline? Why do many Americans work as much or even more than their counterparts a century ago?

The best answer comes from the Pennsylvania State University historian Gary S. Cross, in a recent book entitled Free Time: The History of an Elusive Ideal. It’s not just that work hours failed to budge; many older leisure activities also fell into decline. These included activities involving self-cultivation, such as hobbies and book reading; socially interactive activities, including churchgoing, participation in bowling leagues and communal singing; and even sharing with others through participation in a collective mass culture.

All of these have given way to leisure activities that are more passive, more individual, more private, and more domestic. Think TV watching and internet surfing.

Professor Cross is the leading authority on the most lasting and influential “-ism” of the twentieth century: not communism, conservatism, fascism, libertarianism, liberalism or socialism, but consumerism. No one has written with such insight into the origins, evolution, nature, meaning and appeal of consumer culture. In Professor Cross’s view, consumerism—the desire to earn in order to consume—helps explain why American workers haven’t lobbied for a shorter workweek.

Over the past three and a half decades, Professor Cross has published at least 14 books dealing with such diverse topics as labor history, the history of technology, leisure, and recreation, as well as children’s toys. Trained in French and German history, he brings a keen comparative perspective to his scholarship.

Professor Cross approaches consumer culture through a critical analytical lens. He has shown, for example, how marketers have exploited cravings for authenticity, personal meaning and nostalgic longings by offering a profusion of things retro, from golden oldies in music to TV reruns, clothing imitative of past fashions, cars such as Mustangs, Jeeps and Chargers, and Broadway shows like Jersey Boys, Ain’t Too Proud, and Tina.

Two of Cross’s earlier books traced the commercialization of childhood, showing how toy manufacturers manipulated and misused children’s developmental needs for independence, their yen for novelty, their proclivity for collecting objects and their desire to be “cool,” while also appealing to parents’ contradictory desires to contribute to their children’s learning and skills development even as they let their kids buy toys that played out various fantasies. Especially fascinating is his discussion of how toys intended to function as educational tools mutated into often violent and sexualized fantasy objects detached from adult culture.

Another book, Men to Boys, is a study of the making of male immaturity. It traces the demise, over the course of the 20th century, of a definition of male adulthood that emphasized maturity, responsibility and adherence to rigid work and family roles.

His latest book, Free Time, is a definitive account of how Americans settled on a 40-hour workweek and why pressure to reduce the workweek is now so muted. This history begins with the Neolithic Revolution and demonstrates how work prior to the Industrial Revolution was unrelenting, but also seasonal, intermittent and intertwined with free time. He examines work breaks and holidays as well as the saturnalian and carnivalesque multiday seasonal festivals that bonded communities and allowed groups to release social tensions.

He then looks at the protracted rise of the middle class and the ways that this transformed work and leisure. He shows how market pressures and new currents in religion and culture attacked older cycles of holidays and festivals. Among the topics he writes about with great insight are the growing affirmation of the work ethic and ambition, and the rise of a bourgeois ethos rejecting blood sports, gaming, and drink and emphasizing diligence, orderliness, prudence, self-control, sobriety and thrift.

Especially impressive is his analysis of the politics of work and leisure during the nineteenth and first half of the twentieth century, when, in stark contrast to today, a variety of movements fought to reduce the workweek. As David Roediger and Philip Foner showed in their 1989 study Our Own Time: A History of American Labor and the Working Day, trade unions and striking workers led the drive to reduce the workday, first to ten hours and then to eight. But in the early 20th century, they received strong support from a new class of professional industrial engineers, work scientists, and influential economists whose primary interest was in promoting efficiency—in part, by removing very young and older workers from the workforce.

At first, increases in leisure time were accompanied by an emphasis on sociability and self-improvement. Working-class men, in particular, spent time in saloons, fraternal lodges, pool halls, brothels, barber shops, music halls and other spaces for male-only recreation. Concerted campaigns by reformers to eliminate many of those sex-segregated, male-only bastions spurred the rise of mixed-sex entertainment venues: city parks, dance halls, amusement parks, vaudeville houses, nickelodeons and, later, movie theaters.

Meanwhile, many middle-class men engaged in various kinds of hobbies, such as woodworking, stamp collecting, car restoration and fishing, while many middle-class women spent time reading, card playing or gardening.

Subsequently, however, Americans, in large numbers, succumbed to the allure of what the author calls “fast capitalism”—the passive, individualized enjoyment of the virtual. In fact, most Americans can’t even imagine an alternative to the ways they currently work and spend their time. Work has become central to adult identities. It’s how many Americans measure their success, and like their Puritan forebears, many feel guilty when not working.


Professor Cross suggests that certain developments are under way that may well challenge the existing status quo and liberate society from fast capitalism, with its emphasis on work and consumption. These include a crisis of caregiving, which will require Americans to devote more attention to the needs of the old, the young and others who need more attention and care than they currently receive; the prospect of technology-enhanced automation that may well increase productivity while reducing labor needs; experiments with four-day workweeks in Japan, Spain and Sweden that challenge the eight-hour-day, 40-hour-workweek paradigm; the pandemic turn toward remote work; and a looming environmental crisis that may well force society to reconsider its current emphasis on economic growth and consumerism.

Perhaps equally important, many consumer goods like cars are increasingly losing their luster as they’re reduced to interchangeable commodities—and growing numbers of people search for paths to happiness that the market can’t provide: membership in a community, meaningful forms of public service, participation in voluntary organizations, and the kinds of face-to-face interpersonal connections that can’t be provided via social media or vicariously through mass culture.

Max Weber had it right: American society has found itself confined in an iron cage that looks an awful lot like a shopping cart. The time may be ripe to break free from that enclosure and find other paths to happiness and fulfillment.

Steven Mintz is professor of history at the University of Texas at Austin and the author, most recently, of The Learning-Centered University: Making College a More Developmental, Transformational, and Equitable Experience.
