During most of the 1990s, the United Kingdom was the only major industrialized country that did not impose a minimum wage requirement on employers. The short version of the story is that economists had convinced Margaret Thatcher and other members of the Conservative Party’s leadership that minimum wage laws were bad for workers.
At the time, it was hard to find an economist who had a good word to say about minimum wage laws.
Economists reasoned that companies had only so much money and, if required to pay higher wages, would generally employ fewer workers. A policy intended to help workers would instead result in higher unemployment.
The audience for this kind of theory had once consisted almost entirely of other economists, distinguished by their collective confidence that the complexities of the real world could be meaningfully reduced to a few variables on a chalkboard. But in the second half of the twentieth century, economists who preached the virtues of unfettered markets won the ear of policymakers, first in the United States and then in the United Kingdom and other countries. They said that reducing the government’s role in managing economic conditions would produce faster economic growth, lifting all boats.
The United Kingdom had been one of the first countries to impose minimum wage requirements, beginning in the late nineteenth century. Rather than a statutory minimum wage, the government relied on unions to negotiate minimum wage standards in major industries, and it created “wage councils” to perform the same role for other workers.
This system began to crumble in the late 1960s. As the economy shifted away from mining and manufacturing, unions began to lose their hold. In other nations, as unions declined, the government stepped in as the primary protector of workers, setting a national minimum wage: the Netherlands in 1969, France in 1970, and Spain in 1980. In the UK, however, the unions still had enough power to successfully oppose a national minimum, which they viewed as a threat to their own relevance.
Under Thatcher, the government broke the power of the unions even as it gradually eliminated the wage councils. By 1993, the old system was all but gone – and nothing had taken its place. Thatcher, in her memoirs, celebrated the United Kingdom’s move toward more flexible labour markets. Employers were free to pay as little as they liked.
But free-market economists were wrong about minimum wage laws. A groundbreaking study by the American economists David Card and Alan Krueger, published in 1994, examined the effects of a minimum wage increase in the state of New Jersey and found no evidence of an increase in unemployment. The old theory did not align with reality.
That paper, along with a wave of similar studies that followed, was embraced by the Labour Party, which had decided to back a national minimum wage. After the 1997 elections, Labour established a minimum wage, which took effect in April 1999 despite warnings from Conservative leaders that it “would send unemployment straight back up.”
Two decades later, Labour’s decision to defy the wisdom of economists is widely regarded as a winner.
It is now well established, and accepted by all of the UK’s major political parties, that the government can maintain low unemployment while mandating a minimum level of pay. Prime Minister Boris Johnson campaigned last year on the promise that he would deliver a significant increase in the minimum wage, and his government announced in December that it would raise what is now called the “national living wage” to £8.72 an hour at the beginning of April.
Markets are not states of nature. They are human creations, and by writing better rules we can produce better outcomes.