Susan worked fourteen years in big-box retail, first selling books and later becoming general manager at a pet-supplies store. The hours were long and inconsistent. Early in the pandemic, her store was often shorthanded. Eventually she quit. “It was very much like leaving an abusive relationship,” she said in an interview with Fast Company. “You have to run away, but it’s so hard to get out.” Susan took a job as an administrative coordinator in a hospital unit where, she says, her hours are better and her employer values her skills.
We are two years into the pandemic’s disruption of our work (to say nothing of its cost in lives or its effect on home, school, or religious life), and many workers, like Susan, have had enough. Burnout is the subject of my new book, and it has become a keyword workers use to describe their pain and frustration. Last fall, a wave of strikes swept across the manufacturing sector. And record numbers of workers, especially in retail, have quit.
It is tempting to attribute this widespread dissatisfaction with work entirely to the pandemic. But in fact, the seeds of the current crisis were sown long ago. And now it is time to reap. For five decades, jobs in the United States have steadily been getting worse. By quitting, striking, and demanding better conditions, workers are asserting not just their economic value on the labor market but their worth as human beings with inherent dignity that employers ought to respect.
The data suggest that the slide began in the early 1970s. In the decades since, jobs have become less rewarding, less secure, and more psychologically intense. During the post–World War II economic boom, wages rose in tandem with workers’ productivity. That changed in 1973. Since then, productivity has continued to rise along the same trajectory, but real wages have flatlined. And the United States is not the only place where workers now enjoy less of the fruit of their labor. Across several rich economies, the share of gross domestic product that workers take home has declined since the 1970s.
At the same time, business doctrines began to favor employing as few full-time, direct employees as possible while contracting out as much work as possible. This trend is meant to focus a company on its “core competencies,” but it also cuts costs. At universities, for example, custodial and food-service workers are typically employed by a third-party staffing firm, not by the university itself. This doctrine produces what the economist David Weil calls a “fissured workplace.” It increases pressure on the remaining core employees, because there are fewer hands to pitch in during busy periods, and it renders contract and temporary workers more precarious. In the increasingly prominent gig sector, a worker might not know if she has a job in the next five minutes, never mind next month or next year.
In addition, since the 1970s the economies of the rich world have shifted from manufacturing to retail and services, increasing work’s psychological and emotional intensity. It is not only the retail worker whose attitude and even facial expressions are the means of production; the customer-service mentality pervades virtually every line of work, from health care to trucking. And as software makes it possible to perform service tasks remotely, the boundary between on and off hours becomes ever less distinct. There is never a time when you could not be working.
In all three respects, US workers’ conditions have come to resemble those that women workers experienced decades ago. The Kelly Girl was an archetype of 1960s office culture; companies assumed she was working only for “pin money,” so she did not need a long-term contract or a living wage. The temp model has since spread throughout the economy and helped fissure the workplace. Likewise, the emotional labor expected of female service workers in the 1970s now extends to virtually everyone. As the journalist Bryce Covert has put it, “We’re all women workers now, and we’re all suffering for it.”
Because gender plays an important role in these changes, we ought to keep it in mind as we consider policies to improve jobs. Even though working conditions were better in crucial ways before 1973, the solution today is not to return to the policies of that era, because those policies often reflected the idea that only men needed to earn a decent paycheck. Yes, a full-time wage ought to be enough for a single earner to support a family, but that must not become an excuse for male dominance at work or at home.
Local governments have recently passed laws requiring employers to give workers more predictable schedules. That trend should go national. Government should also push companies to reduce their dependence on contract labor or, at least, set a standard under which contracted workers are paid more than core staff, in light of their greater precarity. And perhaps consumers can give up their expectation of instant, smiling delivery of anything, anytime.
The specific shape of new policy matters less, ultimately, than the moral ideal behind it: the idea that everyone, before they ever go to work, has an inherent dignity that employers are bound to respect. That idea has rarely prevailed in the era of paid employment, and it has been ignored with particular thoroughness over the past fifty years. If we want to end burnout and win just conditions for all who labor, the concept of human dignity needs to come up in every conversation about work.
Susan, in her story of leaving retail, doesn’t use the word dignity. But I see her as a worker who realized she was worth more as a person than her job said she was. As more workers walk away from poor conditions, as she did, perhaps employers will finally recognize the dignity their employees have always had.