Jeff Bezos Takes Over The World

According to this article from the Wall Street Journal, Amazon CEO Jeff Bezos has created a winning business formula that will soon allow him to bypass all his rivals—Apple, Facebook and Google—and leave them far behind. What is his strategy? His company does not take profits; it breaks even instead.

This article’s analysis of Amazon’s business tactic of not taking profit seems to turn on its head all previous conjectures about the evils of “the bottom line” approach in business (“profit above all else”). It is precisely because it takes no profit that Amazon is outstripping every other company in America, which raises its market valuation, allows it to pay very little in taxes and lets it reduce employment through greater work efficiency.

What would Adam Smith have to say about this?  

By SCOTT GALLOWAY, September 25, 2017 for The Wall Street Journal

Why does Amazon’s ascent matter? Aren’t lower prices and greater efficiencies better for everyone? They are, in all the obvious ways, but that’s not a complete picture. Amazon’s seemingly boundless growth forces us to wrestle with difficult questions about the reasons for its dominance.

For one, Amazon, unlike any other firm its size, has changed the basic compact with financial markets. It has replaced the expectation for profits with a focus on vision and growth, managing its business to break even while investors bid up its stock price.

This radical approach has provided the company with a staggering advantage in free-flowing capital. Google, Facebook, Wal-Mart and most Fortune 500 companies are saddled with expectations of profits. Many firms would be much more innovative if they were given a license to operate without the nuisance of profitability. Amazon has thus had enormous capital on hand to invest in delivery networks, especially the crucial last link for getting goods to the doorsteps of consumers, without having to worry that they don’t yield immediate profits.

Amazon’s strategy of break-even operations also means that it has virtually no profits to tax. Since 2008, Wal-Mart has paid $64 billion in federal income taxes, while Amazon has paid just $1.4 billion. Yet, while paying low taxes, Amazon has added $220 billion in value to the stock held by its shareholders over the past 24 months—equivalent to the entire market capitalization of Wal-Mart.

Something is deeply amiss when a company can ascend to almost a half trillion dollars in market value—becoming the fifth most valuable firm in the world—without paying any meaningful income tax. Does Amazon really owe so little to support public revenue and public needs? If a giant firm pays less than the average 24% in income taxes that the companies of the S&P 500 pay, it logically means that less-successful firms pay more. In this way, Amazon further adds to the winner-take-all tendencies plaguing our economy.

Because Amazon is more efficient than other retailers, it is able to transact the same amount of business with half the employees. If Amazon continues to grow its business by $20 billion a year, the annual toll of lost jobs for merchants, buyers and cashiers will be in the tens of thousands by my calculations. Disruption in the U.S. labor force is nothing new—we have just never dealt with a company that is so ruthless and single-minded about it.

I recently spoke at a conference the day after Jeff Bezos. During his talk, he made the case for a universal guaranteed income for all Americans. It is tempting to admire his progressive values and concern for the public welfare, but there is a dark implication here too. It appears that the most insightful mind in the business world has given up on the notion that our economy, or his firm, can support that pillar of American identity: a well-paying job.

Amazon has brought us many benefits, but we all must recognize that the rise of the One brings with it much more than free two-day delivery. “Alexa, is this a good thing?”

Scott Galloway is a professor of marketing at the NYU Stern School of Business and the author of “The Four: The Hidden DNA of Amazon, Apple, Facebook, and Google,” to be published on Oct. 3 by Portfolio.


Economic Inequality a Constitutional Crisis? – Part Two

A graph from Thomas Piketty’s Capital in the Twenty-First Century shows that inequality today exceeds the excesses that immediately preceded the Great Depression.

By GANESH SITARAMAN, September 17, 2017 for The New York Times

The problem, of course, is that economic inequality has been on the rise for at least the last generation. In 1976 the richest 1 percent of Americans took home about 8.5 percent of our national income. Today they take home more than 20 percent. In major sectors of the economy — banking, airlines, agriculture, pharmaceuticals, telecommunications — economic power is increasingly concentrated in a small number of companies.

While much of the debate has been on the moral or economic consequences of economic inequality, the more fundamental problem is that our constitutional system might not survive in an unequal economy. Campaign contributions, lobbying, the revolving door of industry insiders working in government, interest group influence over regulators and even think tanks — all of these features of our current political system skew policy making to favor the wealthy and entrenched economic interests. “The rich will strive to establish their dominion and enslave the rest,” Gouverneur Morris observed in 1787. “They always did. They always will.” An oligarchy — not a republic — is the inevitable result.


As a republic descends into an oligarchy, the people revolt. Populist revolts are rarely anarchic; they require leadership. Morris predicted that the rich would take advantage of the people’s “passions” and “make these the instruments for oppressing them.” The future Broadway sensation Alexander Hamilton put it more clearly: “Of those men who have overturned the liberties of republics, the greatest number have begun their career by paying an obsequious court to the people: commencing demagogues, and ending tyrants.”

Starting more than a century ago, amid the first Gilded Age, Americans confronted rising inequality, rapid industrial change, a communications and transportation revolution and the emergence of monopolies. Populists and progressives responded by pushing for reforms that would tame the great concentrations of wealth and power that were corrupting government.

On the economic side, they invented antitrust laws and public utilities regulation, established an income tax, and fought for minimum wages. On the political side, they passed campaign finance regulations and amended the Constitution so the people would get to elect senators directly. They did these things because they knew that our republican form of government could not survive in an economically unequal society. As Theodore Roosevelt wrote, “There can be no real political democracy unless there is something approaching an economic democracy.”

For all its resilience and longevity, our Constitution doesn’t have structural checks built into it to prevent oligarchy or populist demagogues. It was written on the assumption that America would remain relatively equal economically. Even the father of the Constitution understood this. Toward the end of his life, Madison worried that the number of Americans who had only the “bare necessities of life” would one day increase. When it did, he concluded, the institutions and laws of the country would need to be adapted, and that task would require “all the wisdom of the wisest patriots.”

With economic inequality rising and the middle class collapsing, the deep question we must ask today is whether our generation has wise patriots who, like the progressives a century ago, will adapt the institutions and laws of our country — and save our republic.

This concludes the two-part article on economic inequality.

Ganesh Sitaraman, a professor at Vanderbilt Law School, is the author of “The Crisis of the Middle-Class Constitution: Why Economic Inequality Threatens Our Republic.”

Economic Inequality a Constitutional Crisis? – Part One

Presto! In today’s Sunday Times appears an article describing exactly the basis for maintaining this blog. Why do we blog almost three times a week about economic inequality? Because if it continues to mount at its present rate, it will destroy our democracy. There are no safeguards in the Constitution to protect us. Our continuing concern is to warn readers that only we ourselves, through our own actions, can avoid empowering the oligarchy we are rushing toward and the revolution that would likely follow.

Please read this article carefully and take it to heart. From now on it is up to us.

By GANESH SITARAMAN, September 17, 2017 for The New York Times

Exactly 230 years ago, on Sept. 17, 1787, a group of men in Philadelphia concluded a summer of sophisticated, impassioned debates about the fate of their fledgling nation. The document that emerged, our Constitution, is often thought of as part of an aristocratic counterrevolution that stands in contrast to the democratic revolution of 1776. But our Constitution has at least one radical feature: It isn’t designed for a society with economic inequality.


There are other things the Constitution wasn’t written for, of course. The founders didn’t foresee America becoming a global superpower. They didn’t plan for the internet or nuclear weapons. And they certainly couldn’t have imagined a former reality television star president. Commentators wring their hands over all of these transformations — though these days, they tend to focus on whether this country’s founding document can survive the current president.

But there is a different, and far more stubborn, risk that our country faces — and which, arguably, led to the TV star turned president in the first place. Our Constitution was not built for a country with so much wealth concentrated at the very top nor for the threats that invariably accompany it: oligarchs and populist demagogues.

From the ancient Greeks to the American founders, statesmen and political philosophers were obsessed with the problem of economic inequality. Unequal societies were subject to constant strife — even revolution. The rich would tyrannize the poor, and the poor would revolt against the rich.

The solution was to build economic class right into the structure of government. In England, for example, the structure of government balanced lords and commoners. In ancient Rome, there was the patrician Senate for the wealthy, and the Tribune of the Plebeians for everyone else. We can think of these as class-warfare constitutions: Each class has a share in governing, and a check on the other. Those checks prevent oligarchy on the one hand and a tyranny founded on populist demagogy on the other.


What is surprising about the design of our Constitution is that it isn’t a class warfare constitution. Our Constitution doesn’t mandate that only the wealthy can become senators, and we don’t have a tribune of the plebs. Our founding charter doesn’t have structural checks and balances between economic classes: not between rich and poor, and certainly not between corporate interests and ordinary workers. This was a radical change in the history of constitutional government.

And it wasn’t an oversight. The founding generation knew how to write class-warfare constitutions — they even debated such proposals during the summer of 1787. But they ultimately chose a framework for government that didn’t pit class against class. Part of the reason was practical. James Madison’s notes from the secret debates at the Philadelphia Convention show that the delegates had a hard time agreeing on how they would design such a class-based system. But part of the reason was political: They knew the American people wouldn’t agree to that kind of government.

At the time, many Americans believed the new nation would not be afflicted by the problems that accompanied economic inequality because there simply wasn’t much inequality within the political community of white men. Today we tend to emphasize how undemocratic the founding era was when judged by our values — its exclusion of women, enslavement of African-Americans, violence against Native Americans. But in doing so, we risk missing something important: Many in the founding generation believed America was exceptional because of the extraordinary degree of economic equality within the political community as they defined it.

Unlike Europe, America wasn’t bogged down by the legacy of feudalism, nor did it have a hereditary aristocracy. Noah Webster, best known for his dictionary, commented that there were “small inequalities of property,” a fact that distinguished America from Europe and the rest of the world. Equality of property, he believed, was crucial for sustaining a republic. During the Constitutional Convention, South Carolinian Charles Pinckney said America had “a greater equality than is to be found among the people of any other country.” As long as the new nation could expand west, he thought, it would be possible to have a citizenry of independent yeoman farmers. In a community with economic equality, there was simply no need for constitutional structures to manage the clash between the wealthy and everyone else.


To be continued in a future blog

The Contract Employee—America’s New Working Man and Woman: Part Four

The four-part article on contract workers concludes with more case studies.

By LAUREN WEBER, September 14, 2017 for the Wall Street Journal

At the large logistics firm where Mr. Preiss, the former IBM employee, was reprimanded for laughing too loudly, contractors were denied access to company email and calendars, making it hard to schedule meetings. The contractors had to use a separate email system, but employees often didn’t respond, so Mr. Preiss had to buttonhole them at their desks.

Mr. Preiss recalls spending three weeks trying to set up an important meeting with a company executive who worked in a different building. He finally asked the project’s leader to schedule the meeting. The person did but forgot to invite Mr. Preiss or mention the meeting until everyone else was assembled in a conference room, he says.

After the scolding about his laugh, Mr. Preiss felt obliged to train himself to snicker, he says. “Either that or just smile or put my hand over my mouth or whatever I could do to muffle the sound,” he says.


On some Wednesday nights, he gathers with friends for trivia night at an Irish pub in Roswell, Ga., near Atlanta. Most of the men work as contractors, so they call their trivia team Outsourced. They have had some second- and third-place finishes.

Between trivia questions, Mr. Preiss and teammate Rob Jones often swap stories about work. Among their frustrations: Employers want to essentially rent employees for short periods but then wonder why workers hop from company to company.

Mr. Jones, 59, has held more than a dozen jobs in nearly two decades of project-management contract work. He calls himself “one of those ‘forgotten men’ you hear about that has not had a raise in 18 years.”

The 2001 Toyota 4Runner that Mr. Jones drives has 215,000 miles on the odometer, but he won’t buy a new car. Monthly loan payments would be too risky, he says, since he never knows when a job will start or end.

In the late 1990s when companies were panicking about Y2K bugs, Mr. Jones bargained directly with clients and commanded $65 an hour. Few large companies are willing to manage thousands of self-employed contractors anymore, so they sign high-volume contracts with a handful of staffing or contracting agencies.

Mr. Jones says he now gets take-it-or-leave-it offers from recruiters, and the rate is usually about $45 an hour, or about $30 in 1999 terms. Some big projects offered as little as $24 an hour.

Foreign workers with H-1B visas compete for the types of jobs he used to do, he says. Such workers often are paid less than U.S. workers doing similar jobs.

“What am I doing wrong?” Mr. Jones asked a former boss, who took him out for a beer but offered no helpful advice.

Mr. Jones says he is now looking for any kind of work he can get, such as a government job, something “with a little bit of a pension to it.”

This concludes the four-part article on contract workers.

The Contract Employee—America’s New Working Man and Woman: Part Three


The four-part article on contract workers continues with more case studies.

By LAUREN WEBER, September 14, 2017 for the Wall Street Journal

Dan Fischer, 61, says his yearlong contract to install software for health-care provider Kaiser Permanente was abruptly cut short after nine months. “Boom, it was gone,” says Mr. Fischer, a contractor since 2013. Contract assignments usually include a clause saying they can be terminated anytime.

In June, he finished a two-year assignment at Bank of America Corp. in Charlotte, N.C., where he worked in a cubicle alongside other outside workers in an area they jokingly called Contractor Row. Most of the hand-me-down swivel chairs at their desks were broken, says Mr. Fischer.

After working from his home in Colorado for a few weeks, he discovered upon returning to Charlotte that his ID badge no longer unlocked the doors. Since there was no manager to agitate on his behalf, he had to work from his apartment until the ID was reactivated a week later. Mr. Fischer saw his wife every eight weeks or so when he flew back to Colorado for a visit.

Mr. Fischer hit the bank’s time limit for contractors this spring, which meant he had to stop working there for at least 90 days. After that, he went back to Bank of America doing the same job as before, except he is called a consultant and is on the payroll of a different contracting firm. Bank of America and Kaiser declined to comment.

Bob Zwicker had a 10-week contract in 2015 designing an electronic circuit for a medical-device maker. Officially, he was on the payroll of a contracting and recruiting firm near his home in Olympia, Wash. After that, he became an employee of tech outsourcing firm HCL Technologies Ltd. of India, which put him on an 18-month assignment at Microsoft Corp. to test power adapters for Surface laptops and tablets.

He gives Microsoft credit for maintaining a manual testing operation, rather than simply automating the task. He spent his time at a workbench with his laptop and electrical equipment, running rote checks such as testing adapters.

“I used to design such things,” says Mr. Zwicker, 65, an electronics engineer.  He attended product meetings and felt that he had the respect of Microsoft managers, even though he felt his opinions carried less weight than those of employees.

His contract ended in June, a couple of months ahead of schedule. Mr. Zwicker says he was told the position wasn’t in the budget for the next fiscal year. A new consulting project involves more design work and is a better fit.

At Microsoft, he missed making decisions on his own and having pride of ownership in his work. Those are the moments “where you realize, if this thing doesn’t work, it’s my fault,” he says. Microsoft declined to comment.

Outside workers say they are leery of doing anything that might backfire, costing them their contract assignment or hurting their chances of landing a full-time job.

Veronica Peinado, a project manager in Raleigh, N.C., says a manager recently asked her to conduct a product analysis, which wasn’t part of her contract assignment. She put about 60 hours of her own time into the project.

Ms. Peinado, 59, says the manager didn’t thank her when she turned in the project. A few months later, her contract ended with less than two weeks’ notice.

She also was forbidden to ask the company about her compensation or schedule. To take a day off, she was supposed to inform a staffing-agency representative, who then told another outside firm, which sought approval from the company, even though she spoke every day to the manager who ultimately said yes or no.

“It’s very, very weird,” says Ms. Peinado.

Ever since Microsoft agreed to pay $97 million in 2000 to settle an eight-year-old class-action lawsuit filed by “permatemps” who accused the tech giant of using temps to do the work of employees, companies have tried to keep their outside workers at a distance.

This article will be concluded in a future blog.

The Contract Employee—America’s New Working Man and Woman: Part Two

The four-part article on contract workers continues with a number of case studies.

By LAUREN WEBER, September 14, 2017 for the Wall Street Journal

No one knows how many Americans work as contractors, because they don’t fit neatly into the job categories tracked by government agencies. Rough estimates by economists range from 3% to 14% of the nation’s workforce, or as many as 20 million people. The surge might help explain a riddle of today’s labor market—jobs are plentiful, but many Americans feel anxious and insecure about their finances and careers.

Some contract workers say they like contract work for some of the same reasons companies do. Kara Sanders, 36, says it feels like contractors “have control over our destinies by putting ourselves out there and taking risks.”

She says she chose contracting after watching round after round of corporate layoffs hit family members and friends, while growing up in upstate New York. She has moved cross-country three times for data-analysis and consulting projects. She also is finishing an online master’s degree in data science.

“I don’t think it’s ever safe to let your skills atrophy or become too tied to one employer,” says Ms. Sanders.

People usually don’t go looking for contract work. It finds them.

Fernando Granthon of Austin, Texas

Fernando Granthon, 35, saw an outsourced human-relations position at Cisco Systems Inc. in Research Triangle Park, N.C., as a foot in the door. He says a recruiter at staffing agency ManpowerGroup Inc. told him contractors had a strong chance to get hired as Cisco employees.

The son of immigrants from Peru, Mr. Granthon grew up poor in South Florida, joined the U.S. Marine Corps straight out of high school and worked as a personnel clerk. He earned a bachelor’s degree at the age of 30 and was eager to parlay his experience into higher-level human-resources jobs.

At Cisco, Mr. Granthon worked on a team of employees and contractors who answered HR queries. He says he felt valued and trusted.

In 2014, the company split up the team. Outside workers including Mr. Granthon got simpler job duties than employees, he says. Worried that his career was stalling, he asked a Manpower representative about training opportunities and was told nothing was available. The same answer came when he pressed about getting a full-time job.

He left Cisco in 2015. Mr. Granthon didn’t receive a pay increase while working there. He is now pursuing an M.B.A. at St. Edward’s University in Austin, Texas.

“I realized there was no mobility,” he says. While he is grateful for the experience he gained as a contract worker, Mr. Granthon says he wishes bosses had realized that “contractors, like any other employees, want greater experiences, want to learn, and to move on.”

Manpower declined to comment on individual employees but says it offers workers free online training programs. Cisco declined to comment.

Neil Gimon of Waxhaw, North Carolina

Just asking about job openings can be risky, says information-technology project manager Neil Gimon, who for years has taken contract jobs that he hopes will turn into a full-time position.

“The manager says: ‘You’re unhappy with this position? What’s going on?’ ” says Mr. Gimon, 53.

When the manager at a recent assignment with Wells Fargo & Co. sent around job openings, contractors were steered to a general website, while bank employees got to apply through an internal system that reveals details about the hiring manager and human-resources contacts.

The bank says nonemployees “are subject to Wells Fargo recruiting requirements in the same manner as any other external job seeker.”

Last year, Mr. Gimon and his wife, Anita, opened the Dreamchaser’s Brewery in an old firehouse in Waxhaw, N.C. If all goes well, he will stop doing contract work. “I’m tired of being laid off,” he says.

While job security has ebbed in all walks of corporate life, many employees get a relatively stable paycheck, benefits and often some help to find a new job if they lose the one they had. Contract workers are on their own.

This article will be continued in future blogs.

The Contract Employee—America’s New Working Man and Woman: Part One


This four-part article from the Wall Street Journal explores the newest form of employment in America today—the contract worker. The arrangement is highly advantageous to the employer and disadvantageous to the employee, who can be exploited without benefits, standing or seniority. Everything that formerly offered opportunities for pride in one’s work is now ripped away. As a contractor interviewed for the article says, “There just is no career anymore.”

By LAUREN WEBER, September 14, 2017 for the Wall Street Journal

Michael Preiss was happy to escape the corporate grind after being laid off by International Business Machines Corp. in 2001. He became a contractor, earning more than $100,000 a year from steady assignments helping companies figure out how to do things faster and cheaper.

That work eventually dried up. The past decade has been a revolving door of outsourced jobs for shrinking pay, fear that any day at a company could be his last, and reminders that full-time employees live in a different world, even though they often sit at the next desk. Mr. Preiss says one manager reprimanded him because co-workers complained that he laughed too loudly.

“My career is shot,” says Mr. Preiss, 59 years old, who lives in Atlanta. “There just is no career anymore.”

Millions of contractors now do heavy lifting, paper pushing and other jobs for American companies that have replaced employees with outside workers. Within the next four years, nearly half of the private-sector workforce in the U.S. will have spent at least some time as a contractor, temporary employee or other type of outside worker, estimates MBO Partners, a provider of support services to self-employed professionals.



The contractor model offers companies lower costs, more flexibility and fewer management headaches. Workers get far less from the arrangement.

The costs hit home in every paycheck and every day on the job, according to interviews with dozens of current and former contractors, as well as many of the more than 150 responses to a Wall Street Journal survey. The survey asks readers of articles in this series to describe their experiences working as a contractor.

Outside workers usually aren’t surprised when they get no paid holidays, sick days, employer-sponsored health insurance, 401(k) plan or other perks routinely offered to traditional employees at the same companies.

What wounds more deeply are things taken for granted or barely considered at all by regular employees, outside workers often say. The work lives of contractors frequently feel like a series of tiny slights that reinforce their second-class status and bruise their self-worth. Even when contracting jobs are easy to get, they can vanish instantly, and turning contract assignments into a real career remains out of reach.

At many companies, contractors aren’t allowed to attend important meetings, go to the company gym or bring their kids to Take Your Child to Work Day. They keep quiet because only full-time employees are expected to speak up. Working harder, smarter or longer offers little advantage when applying for a job directly with the company.


Nothing is loathed more than the nametags or identification badges that advertise the lowly ranking of contractors in the workplace pecking order. Technical writer Don Cwiklowski Jr. worked as a contractor at Mastercard Inc. for four years. He says co-workers often glanced at the badge dangling from his neck, saw the red color that signaled his contractor status and looked right past him.

He got a green badge when he was hired as a full-time employee at Mastercard in St. Louis in 2012. Some of the same people who had shunned him started saying hello in the hallways, says Mr. Cwiklowski, 53.

The company says it “puts our people at the center of everything we do” and isn’t aware of the examples cited by Mr. Cwiklowski.

Such experiences are becoming more common as the outsourcing wave moves from less-skilled jobs such as security guard and cafeteria worker to a wider range of corporate tasks. Those include information technology, customer service, research, human resources and sales.

This article will be continued in future blogs.

At Last the Democrats Offer the Nation Something Positive—a Single Payer Health Care Plan

“Today is a historic day. Along with 15 co-sponsors—Tammy Baldwin, Richard Blumenthal, Cory Booker, Al Franken, Kirsten Gillibrand, Kamala Harris, Martin Heinrich, Mazie Hirono, Patrick Leahy, Ed Markey, Jeff Merkley, Brian Schatz, Tom Udall, Elizabeth Warren and Sheldon Whitehouse—I am proud to introduce Medicare for All, single payer health care legislation in the U.S. Senate. Today we begin the struggle to transform our dysfunctional health care system and make health care in the United States a right, not a privilege.”

So begins Bernie Sanders’s proud announcement this morning of his introduction, along with fifteen other Democratic senators, of a new single payer health care bill—new to this country but long extant (and even flourishing) in Canada and numerous European nations.

Not coincidentally, on the same day there appeared this essay by one of the Wall Street Journal’s most intelligent editorial contributors, William A. Galston, which cautions the progressive hotheads backing the bill that, if they don’t watch out, their party could suffer a debacle like the one the Republicans recently endured when they attempted to repeal the Affordable Care Act.

Read them both and try to decide how you think we should proceed in this perilous course that affects all of us so deeply.

By WILLIAM A. GALSTON, September 13, 2017 for the Wall Street Journal

There must be something special in the waters of Lake Champlain. In 2011 newly elected Vermont Gov. Peter Shumlin announced his intention to shift his state to a single-payer health-care system. He pursued that goal until late 2014, when a study by his staff and consultants projected that it would require imposing a payroll tax of 11.5% and raising the personal income tax by as much as 9.5 percentage points. “The risk of economic shock is too high,” Mr. Shumlin concluded as he withdrew his proposal.

There were political considerations as well. Despite successfully campaigning on a single-payer platform in 2010 and winning re-election in 2012 and 2014, Mr. Shumlin never succeeded in persuading a majority of his constituents to support his signature idea. An April 2014 survey found Vermont split down the middle, with 40% of residents approving and 39% disapproving. Perhaps the prospect of increasing the state budget by 45% gave Vermonters reason to doubt the wisdom of an abrupt shift to single-payer health care.

Vermont is not some random canary in the mineshaft. The Green Mountain State is among the most liberal in the country. Barack Obama prevailed by 37 percentage points in 2008 and 36 points in 2012. Hillary Clinton’s snake-bitten 2016 campaign managed a 26-point victory. The state is ethnically homogeneous, with a median household income above the national average. It is hard to think of a state better positioned to embrace single-payer health care, yet a determined governor couldn’t get close to pushing it through.

But now Democratic presidential aspirants are rushing to endorse Vermont Sen. Bernie Sanders’s soon-to-be-released national single-payer plan. Sens. Elizabeth Warren and Kamala Harris already back it. Sens. Cory Booker and Kirsten Gillibrand have announced plans to co-sponsor it as well.

From the perspective of the contest for the Democratic nomination in 2020, this strategy is easy to understand. Mr. Sanders came closer to upsetting Mrs. Clinton than most observers thought possible. For now, the progressive wing of the party is energized, and the party’s ideological center of gravity has shifted.

In 2000, when Al Gore defeated Bill Bradley for the Democratic presidential nomination, 44% of Democrats regarded themselves as moderate and only 28% as liberal. By 2008, when Mr. Obama narrowly prevailed over Mrs. Clinton, the moderates’ share had fallen to 41% while the liberal share had increased to 33%.

Since then, the pace of ideological change has accelerated. Today, liberals make up the largest share of Democrats—48%. Moderates have fallen further, to only 36%. And the conservative wing, nearly one-quarter of the total in 2000, now amounts to barely one-seventh of the party.

If you want to win the 2020 Democratic presidential nomination, it might seem, the best strategy is to emerge as the champion of its newly dominant progressive faction, and coming out for single-payer might seem the best way to do it.

Whether this is the best formula for winning a general election contest is another matter. Sens. Warren, Harris, Booker and Gillibrand are coastal Democrats from bright-blue states. Ohio Sen. Sherrod Brown, a veteran populist from a swing state that Donald Trump carried by a stunning eight points in 2016, has conspicuously declined to endorse the Sanders bill, preferring to build bipartisan support for a more modest proposal to allow Americans to buy into Medicare when they reach 55. Democrats should ask themselves which of their elected officials better understands how to win back the Midwestern states that made Mr. Trump president.

This is not just a political calculation. From a policy standpoint, the danger is that “Medicare for All” will become the Democrats’ “repeal and replace ObamaCare.”

In May 2016, the Urban Institute—not previously known as a hotbed of conservatism—released its analysis of the Medicare for All proposal Sen. Sanders offered during his presidential campaign. The study found that if the plan were enacted into law, the federal government would absorb the bulk of the current spending by states, localities, employers and households. Federal spending would rise by $2.5 trillion in the plan’s first year, and by $32 trillion over the first decade.

A parallel study conducted by the bipartisan Tax Policy Center found that Mr. Sanders’s revenue proposals would raise only $15.3 trillion over the first decade, leaving a gap of $16.6 trillion between expenditures and revenues. “The proposed taxes,” the Urban Institute observed, are “much too low to fully finance the plan,” and “additional sources of revenue would have to be identified.”

It will be interesting to see whether Sen. Sanders’s new proposal can meet these objections. Even if it does, Democrats interested in regaining a national majority should look before they leap.


“Affluent? Upper Class? No, Not Me.”

A most interesting article from this Sunday’s New York Times gives us a chance to look at the lifestyle of today’s very rich. They differ considerably from the wealthy of other periods of extreme wealth in America, such as the Gilded Age, the Roaring Twenties or even the immediate post-war WASP upper class, in that they prefer to hide their money and pretend to be hard-working middle-class people, a “meritocracy” rather than an “aristocracy.”

By RACHEL SHERMAN, September 10, 2017 for The New York Times

Nearly all [of those interviewed] were in the top 1 or 2 percent.

These people agreed to meet with me as part of research I conducted on affluent and wealthy people’s consumption. I interviewed 50 parents with children at home, including 18 stay-at-home mothers. Highly educated, they worked or had worked in finance and related industries, or had inherited assets in the millions of dollars. Nearly all were in the top 1 percent or 2 percent in terms of income or wealth or both. They came from a variety of economic backgrounds, and about 80 percent were white. . . .

We often imagine that the wealthy are unconflicted about their advantages and in fact eager to display them. Since Thorstein Veblen coined the term “conspicuous consumption” more than a century ago, the rich have typically been represented as competing for status by showing off their wealth. Our current president is the conspicuous consumer in chief, the epitome of the rich person who displays his wealth in the glitziest way possible.

The Gilded Age

Yet we believe that wealthy people seek visibility because those we see are, by definition, visible. In contrast, the people I spoke with expressed a deep ambivalence about identifying as affluent. Rather than brag about their money or show it off, they kept quiet about their advantages. They described themselves as “normal” people who worked hard and spent prudently, distancing themselves from common stereotypes of the wealthy as ostentatious, selfish, snobby and entitled. Ultimately, their accounts illuminate a moral stigma of privilege.

The ways these wealthy New Yorkers identify and avoid stigma matter not because we should feel sorry for uncomfortable rich people, but because they tell us something about how economic inequality is hidden, justified and maintained in American life.

Keeping silent about social class, a norm that goes far beyond the affluent, can make Americans feel that class doesn’t, or shouldn’t, matter. And judging wealthy people on the basis of their individual behaviors — do they work hard enough, do they consume reasonably enough, do they give back enough — distracts us from other kinds of questions about the morality of vastly unequal distributions of wealth. . . .

The stigma of wealth showed up in my interviews first in literal silences about money. When I asked one very wealthy stay-at-home mother what her family’s assets were, she was taken aback. “No one’s ever asked me that, honestly,” she said. “No one asks that question. It’s up there with, like, ‘Do you masturbate?’ ”

“Nobody knows how much we spend.”

Another woman, speaking of her wealth of over $50 million, which she and her husband generated through work in finance, and her home value of over $10 million, told me: “There’s nobody who knows how much we spend. You’re the only person I ever said those numbers to out loud.” She was so uncomfortable with having shared this information that she contacted me later the same day to confirm exactly how I was going to maintain her anonymity. Several women I talked with mentioned that they would not tell their husbands that they had spoken to me at all, saying, “He would kill me,” or “He’s more private.”

These conflicts often extended to a deep discomfort with displaying wealth. Scott, who had inherited wealth of more than $50 million, told me he and his wife were ambivalent about the Manhattan apartment they had recently bought for over $4 million. Asked why, he responded: “Do we want to live in such a fancy place? Do we want to deal with the person coming in and being like, ‘Wow!’ That wears on you. We’re just not the type of people who wear it on our sleeve. We don’t want that ‘Wow.’ ” His wife, whom I interviewed separately, was so uneasy with the fact that they lived in a penthouse that she had asked the post office to change their mailing address so that it would include the floor number instead of “PH,” a term she found “elite and snobby.”

My interviewees never talked about themselves as “rich” or “upper class,” often preferring terms like “comfortable” or “fortunate.” Some even identified as “middle class” or “in the middle,” typically comparing themselves with the super-wealthy, who are especially prominent in New York City, rather than with those with less.

When I used the word “affluent” in an email to a stay-at-home mom with a $2.5 million household income, a house in the Hamptons and a child in private school, she almost canceled the interview, she told me later. Real affluence, she said, belonged to her friends who traveled on a private plane.

Others said that affluence meant never having to worry about money, which many of them, especially those in single-earner families dependent on work in finance, said they did, because earnings fluctuate and jobs are impermanent.

The Roaring Twenties

American culture has long been marked by questions about the moral caliber of wealthy people. Capitalist entrepreneurs are often celebrated, but they are also represented as greedy and ruthless. Inheritors of fortunes, especially women, are portrayed as glamorous, but also as self-indulgent.

The negative side of this portrayal may be more prominent in times of high inequality (think of the robber barons of the Gilded Age or the Gordon Gekko figures of the 1980s). In recent years, the Great Recession and Occupy Wall Street, which were in the background when I conducted these interviews, brought extreme income inequality onto the national stage again. The top 10 percent of earners now garner over 50 percent of income nationally, and the top 1 percent over 20 percent.

A decades-long shift in the composition of the wealthy.

It is not surprising, then, that the people I talked with wanted to distance themselves from the increasingly vilified category of the 1 percent. But their unease with acknowledging their privilege also grows out of a decades-long shift in the composition of the wealthy. During most of the 20th century, the upper class was a homogeneous community. Nearly all white and Protestant, the top families belonged to the same exclusive clubs, were listed in the Social Register, and educated their children at the same elite institutions.

The post-war WASP upper class

This class has diversified, thanks largely to the opening of elite education to people of different ethnic and religious backgrounds starting after World War II, and to the more recent rise of astronomical compensation in finance. At the same time, the rise of finance and related fields means that many of the wealthiest are the “working rich,” not the “leisure class” Veblen described. The quasi-aristocracy of the WASP upper class has been replaced by a “meritocracy” of a more varied elite. Wealthy people must appear to be worthy of their privilege for that privilege to be seen as legitimate.

Being worthy means working hard, as we might expect. But being worthy also means spending money wisely. In both these ways, my interviewees strove to be “normal.”

Scott and his wife had spent $600,000 in the year before our conversation. “We just can’t understand how we spent that much money,” he told me. “That’s kind of a little spousal joke. You know, like: ‘Hey. Do you feel like this is the $600,000 lifestyle? Whooo!’ ” Rather than living the high life that he imagined would carry such a price tag, he described himself as “frenetic,” asserting, “I’m running around, I’m making peanut butter and jelly sandwiches.” Having money does not mean, in his view, that he is not ordinary.

The people I talked with never bragged about the price of something because it was high; instead, they enthusiastically recounted snagging bargains on baby strollers, buying clothes at Target and driving old cars. They critiqued other wealthy people’s expenditures, especially ostentatious ones such as giant McMansions or pricey resort vacations where workers, in one man’s sarcastic words, “massage your toes.”

They worried about how to raise children who would themselves be “good people” rather than entitled brats. The context of New York City, especially its private schools, heightened their fear that their kids would never encounter the “real world,” or have “fluency outside the bubble,” in the words of one inheritor. Another woman told me about a child she knew of whose father had taken the family on a $10,000 vacation; afterward the child had said, “It was great, but next time we fly private like everyone else.”

To be sure, these are New Yorkers with elite educations, and most are socially liberal. Wealthy people in other places or with other histories may feel more comfortable talking about their money and spending it in more obvious ways. And even the people I spoke with may be less reticent among their wealthy peers than they are in a formal interview.

A deep tension at the heart of the American dream.

Nonetheless, their ambivalence about recognizing privilege suggests a deep tension at the heart of the idea of the American dream. While pursuing wealth is unequivocally desirable, having wealth is not simple and straightforward. Our ideas about egalitarianism make even the beneficiaries of inequality uncomfortable with it. And it is hard to know what they, as individuals, can do to change things.

In response to these tensions, silence allows for a kind of “see no evil, hear no evil” stance. By not mentioning money, my interviewees follow a seemingly neutral social norm that frowns on such talk. But this norm is one of the ways in which privileged people can obscure both their advantages and their conflicts about these advantages.


Today’s “meritocracy”

And, as they try to be “normal,” these wealthy and affluent people deflect the stigma of wealth. If they can see themselves as hard workers and reasonable consumers, they can belong symbolically to the broad and legitimate American “middle,” while remaining materially at the top.

These efforts respond to widespread judgments of the individual behaviors of wealthy people as morally meritorious or not. Yet what’s crucial to see is that such judgments distract us from any possibility of thinking about redistribution. When we evaluate people’s moral worth on the basis of where and how they live and work, we reinforce the idea that what matters is what people do, not what they have. With every such judgment, we reproduce a system in which being astronomically wealthy is acceptable as long as wealthy people are morally good.

Calls from liberal and left social critics for advantaged people to recognize their privilege also underscore this emphasis on individual identities. For individual people to admit that they are privileged is not necessarily going to change an unequal system of accumulation and distribution of resources.

Instead, we should talk not about the moral worth of individuals but about the moral worth of particular social arrangements. Is the society we want one in which it is acceptable for some people to have tens of millions or billions of dollars as long as they are hardworking, generous, not materialistic and down to earth? Or should there be some other moral rubric, one that would strive for a society in which such high levels of inequality were morally unacceptable, regardless of how nice or moderate their beneficiaries are?

Rachel Sherman is an associate professor of sociology at the New School and the author of “Uneasy Street: The Anxieties of Affluence,” from which this essay is adapted.

Consumer Financial Protection Bureau Under Fire from the Right – Part Three

The three-part New York Times article on how the present administration intends to neuter the Consumer Financial Protection Bureau continues.

By STEVE EDER, JESSICA SILVER-GREENBERG and STACY COWLEY, September 2, 2017 for The New York Times

The consumer agency had been collaborating with the Department of Education on overhauling the $1.3 trillion student loan market to ensure that private companies collecting loan payments abided by consumer protections.

But soon after Betsy DeVos was appointed education secretary this year, the department scrapped much of that work. In particular, the department eliminated a requirement that federal student loan servicers adopt a simplified repayment disclosure form that the consumer bureau spent years developing.

Lobbyists are also feeling empowered by the change in administrations. Working on behalf of payday lenders, they have flooded the consumer agency with comments, more than a million in all, urging it to halt a proposed crackdown on the industry.

At some payday loan counters, customers were handed comment forms alongside their checks and urged to tell the bureau just how important payday lending was to their livelihood. Hundreds of thousands of those comments, often with nearly identical wording, poured into government databases.

So far, that push has not deterred the bureau. Within the agency, there is a mounting sense of urgency to get the final version of the payday rules out, according to two people familiar with the process. The new rules would represent the first time that the lucrative market — the payday industry collects $7 billion annually in fees — was directly regulated by the federal government.

The bureau’s rollout last month of its rule allowing class-action lawsuits in some arbitration cases has also rattled Wall Street, and is widely seen as a provocative stance against the prevailing political momentum in Washington.

Opponents of the rule have received an assist from the Trump administration. Keith Noreika, the acting comptroller of the currency, who serves as the chief bank regulator, asked Mr. Cordray to delay publication of the rule, saying his staff needed more time to review whether it posed a threat to the safety and soundness of the banks.

Mr. Cordray, in a response to Mr. Noreika, said the idea that class actions were a threat to the banking system was “plainly frivolous.” (He also said he had already sent the rule to the Federal Register for publication a week before he received Mr. Noreika’s letter.)

Senator Lindsey Graham, Republican of South Carolina

A challenge to the rule passed the House, but has stalled in the Senate. Senator Lindsey Graham, Republican of South Carolina, has said he would not back a repeal of the rule. Other Republicans are also wavering.

“Moderate Republicans don’t want to be painted as anti-consumer,” said Isaac Boltansky, the director of policy research at Compass Point, a research firm tracking the fate of the agency’s recent rules.

End of Three-Part Article