In our first editorial we took up the question of inequality in America. There we suggested that while the dramatic increase in inequality in American life does call for critical attention, so too does the question of sufficiency—that is, whether our fellow citizens have enough.

This month, Neal Gabler’s cover article for The Atlantic takes on what he calls “financial impotence.” (For a brief audio interview with Gabler, see this piece from NPR.) While macroeconomic trends may be favorable for the American economy as a whole, a recent survey by the Federal Reserve Board reports that 47 percent of respondents would struggle to come up with $400 in an emergency. What this statistic suggests is that the experience of being economically vulnerable is not synonymous with being poor.

Gabler admits that he was hesitant to mention his “financial travails” until he came to realize that he was not alone. In fact:

It was, according to the Fed survey and other surveys, happening to middle-class professionals and even to those in the upper class. It was happening to the soon-to-retire as well as the soon-to-begin. It was happening to college grads as well as high school dropouts. It was happening all around the country, including places where you might least expect to see such problems. I knew I wouldn’t have $400 in an emergency. What I hadn’t known, couldn’t have conceived, was that so many other Americans wouldn’t have the money available to them, either.

What Gabler calls financial impotence, others call “financial fragility, financial insecurity, or financial distress.” But, “whatever you call it,” Gabler says, “the evidence strongly indicates that either a sizeable minority or a slim majority of Americans are on thin ice financially.” This is not, however, merely a liquidity problem—a matter of wealthy people having insufficient access to invested funds. It is a problem of not having such funds at all. “In the 1950s and ’60s,” Gabler writes as a kind of thesis statement, “American economic growth democratized prosperity. In the 2010s, we have managed to democratize financial insecurity.”

Interestingly, the concept of “thrift” appears only once in Gabler’s piece, when he writes:

Not that Americans—or at least those born after World War II—had ever been especially thrifty. The personal savings rate peaked at 13.3 percent in 1971 before falling to 2.6 percent in 2005. As of last year, the figure stood at 5.1 percent, and according to McClary, nearly 30 percent of American adults don’t save any of their income for retirement. When you combine high debt with low savings, what you get is a large swath of the population that can’t afford a financial emergency.

Yet thrift, or the lack thereof, is the key concept for unlocking the source of economic pain for the “silent sufferers” who make up nearly half of the American population. Writing for The Montréal Review in 2012, Joshua J. Yates gives a historical genealogy of thrift that shows it to be not merely a term of self-denial but one frequently and powerfully linked to the concept of thriving. “In one era after another,” Yates writes, “the fundamental purposes of economic life have been defined by prevailing visions of human flourishing—or thriving—accompanied by the exhortation of particular practices and habits of wise use and stewardship—or thrift—that were thought necessary for achieving prevailing visions of human flourishing.”

In cataloguing the various ways in which the language of thrift has been employed, Yates is not suggesting that we can define thrift however we like, or simply appeal to thrift as a way of encouraging restraint. Rather, he argues that sustained attention to the history of “thrift ethics” can reveal much about a culture’s “highest aspirations,” while “acknowledging [its] respective deficits.” (Yates and James Davison Hunter develop this argument in great detail in their 2011 book Thrift and Thriving in America.)

The dominant thrift ethic in our time is a kind of “free-agent thrift.” Yates describes this emergent form of thrift as “oriented to material security, but also to self-actualizing through work, consumption, and social commitment.” There are, of course, benefits and burdens to this form of economic self-understanding, both of which are on display in Gabler’s piece. As Yates has it, “on the positive side, this ethic combines a vision of expressive consumption with a work ethic that privileges authenticity, mobility, and autonomy as much as industrious time management.” Yet, he says, “even in good economic times many Americans are unable to meet the demands of free agency. Such agency is empowering, even exhilarating, if you are one of the meritocratic professionals who can move easily from one job to another in the global economy, but distressing, often painful, if you are not.”

Gabler’s piece points to the deep cultural tensions over the meaning of thrift in our contemporary moment. On the one hand, Gabler (and many others like him) has been willing to risk economic vulnerability in order to pursue a line of work that maximizes his freedom to express himself creatively. He is, after all, a writer—and many within the “creative class” of American society are drawn to the freedom such work affords in the use of one’s time, energy, and gifts. And yet, when this impulse is tied to financial illiteracy and a culture of easy credit, the promised liberation can feel more like chains.