“Companies are doing all they can not to bake in any gains that are difficult to claw back,” Dr. Schneider said. “Workers’ labor market power is so far not yielding durable dividends.”
The changes that make work lower paying, less stable and generally more precarious date back to the 1960s and ’70s, when the labor market evolved in two key ways. First, companies began pushing more work outside the firm — relying increasingly on contractors, temps and franchisees, a practice known as “fissuring.”
Second, many businesses that continued to employ workers directly began hiring them into part-time positions rather than full-time roles, particularly in the retail and hospitality industries.
According to the scholars Chris Tilly of the University of California, Los Angeles, and Françoise Carré of the University of Massachusetts Boston, the initial impetus for the shift to part-time work was the mass entry of women into the work force, including many who preferred part-time positions so they could be home when children returned from school.
Before long, however, employers saw an advantage in hiring part-timers and deliberately added more. “A light bulb went on one day,” Dr. Tilly said. “‘If we’re expanding part-time schedules, we don’t have to offer benefits, we can offer a lower wage rate.’”
By the late 1980s, employers had begun using scheduling software to forecast customer demand and staff accordingly. Having a large share of part-time workers, who could be given more hours when stores got busy and fewer hours when business slowed, helped enable this practice, known as just-in-time scheduling.
But the arrangement subjected workers to fluctuating schedules and unreliable hours, disrupting their personal lives, their sleep, even their children’s brain development.