Friday, November 10, 2023

Bari Weiss: End DEI

 


It’s not about diversity, equity, or inclusion. It is about arrogating power to a movement that threatens not just Jews—but America itself.

By Bari Weiss

November 9, 2023

Twenty years ago, when I was a college student, I started writing about a then-nameless, niche ideology that seemed to contradict everything I had been taught since I was a child. 


It is possible I would not have perceived the nature of this ideology—or rather I would have been able to avoid seeing its true nature—had I not been a Jew. But I was. I am. And in noticing the way I had been written out of the equation, I started to notice that it wasn’t just me, but that the whole system rested on an illusion.


What I saw was a worldview that replaced basic ideas of good and evil with a new rubric: the powerless (good) and the powerful (bad). It replaced lots of things. Color blindness with race obsession. Ideas with identity. Debate with denunciation. Persuasion with public shaming. The rule of law with the fury of the mob. 


People were to be given authority in this new order not in recognition of their gifts, hard work, accomplishments, or contributions to society, but in inverse proportion to the disadvantages their group had suffered, as defined by radical ideologues. According to them, as James Kirchick concisely put it: “Muslim > gay, black > female, and everybody > the Jews.”


I was an undergraduate back then, but you didn’t need a PhD to see where this could go. And so I watched, in horror, sounding alarms as loudly as I could. 

I was told by most Jewish leaders that, yes, it wasn’t great, but not to be so hysterical. Campuses were always hotbeds of radicalism, they said. This ideology, they promised, would surely dissipate as young people made their way in the world. 

It did not.


Over the past two decades, I saw this inverted worldview swallow all of the crucial sense-making institutions of American life. It started with the universities. Then it moved on to cultural institutions—including some I knew well, like The New York Times—as well as every major museum, philanthropy, and media company. Then on to our medical schools and our law schools. It’s taken root at nearly every major corporation. It’s inside our high schools and even our elementary schools. The takeover is so comprehensive that it’s now almost hard to notice it—because it is everywhere. 

Including in the Jewish community.


Some of the most important Jewish communal organizations transformed themselves in order to prop up this ideology. Or at the very least, they contorted themselves to signal that they could be good allies in the fight for equal rights—even as those rights are no longer presumed inalienable or equal and are handed out rather than protected.


For Jews there are obvious and glaring dangers in a worldview that measures fairness by equality of outcome rather than opportunity. If underrepresentation is the inevitable outcome of systemic bias, then overrepresentation—and Jews are two percent of the American population—suggests not talent or hard work, but unearned privilege. This conspiratorial conclusion is not that far removed from the hateful portrait of a small group of Jews divvying up the ill-gotten spoils of an exploited world. 

It isn’t only Jews who suffer from the suggestion that merit and excellence are dirty words. It is strivers of every race, ethnicity, and class. That is why Asian American success, for example, is suspicious. The percentages are off. The scores are too high. Who did you steal all that success from? 


Of course, this new ideology doesn’t come right out and say all that. It doesn’t even like to be named. Some call it wokeness or anti-racism or progressivism or safetyism or Critical Social Justice or identity Marxism. But whatever term you use, what’s clear is that it has gained power in a conceptual instrument called “diversity, equity, and inclusion,” or DEI.


In theory, all three of these words represent noble causes. They are, in fact, all causes to which American Jews in particular have long been devoted, both individually and collectively. But in reality, these words are now metaphors for an ideological movement bent on recategorizing every American not as an individual, but as an avatar of an identity group, his or her behavior prejudged accordingly, setting all of us up in a kind of zero-sum game.


We have been seeing for several years now the damage this ideology has done: DEI, and its cadres of enforcers, undermine the central missions of the institutions that adopt it. But nothing has made the dangers of DEI clearer than what’s happening these days on our college campuses—the places where our future leaders are nurtured. 

It is there that professors are compelled to pledge fidelity to DEI in order to get hired, promoted, or tenured. (For more on this, please read John Sailer’s Free Press piece: How DEI Is Supplanting Truth as the Mission of American Universities.) And it is there that the hideousness of this worldview has been on full display over the past few weeks: we see students and professors immersed not in facts, knowledge, and history, but in a dehumanizing ideology that has led them to celebrate or justify terrorism. 

Jews, who understand that being made in the image of God bestows inviolate sanctity on every human life, must not stand by as that principle, so central to the promise of this country and its hard-won freedoms, is erased. 

What we must do is reverse this.


The answer is not for the Jewish community to plead its cause before the intersectional coalition or beg for a higher ranking in the new ladder of victimhood. That is a losing strategy—not just for Jewish dignity, but for the values we hold as Jews and as Americans. 


The Jewish commitment to justice—and the Jewish American community’s powerful and historic opposition to racism—is a source of tremendous pride. That should never waver. Nor should our commitment to stand by our friends, especially when they need our support as we now need theirs.


But DEI is not about the words it uses as camouflage. DEI is about arrogating power. 

And the movement that is gathering all this power does not like America or liberalism. It does not believe that America is a good country—at least no better than China or Iran. It calls itself progressive, but it does not believe in progress; it is explicitly anti-growth. It claims to promote “equity,” but its answer to the challenge of teaching math or reading to disadvantaged children is to eliminate math and reading tests. It demonizes hard work, merit, family, and the dignity of the individual.
 

An ideology that pathologizes these fundamental human virtues is one that seeks to undermine what makes America exceptional.


It is time to end DEI for good. No more standing by as people are encouraged to segregate themselves. No more forced declarations that you will prioritize identity over excellence. No more compelled speech. No more going along with little lies for the sake of being polite.


The Jewish people have outlived every single regime and ideology that has sought our elimination. We will persist, one way or another. But DEI is undermining America, and that for which it stands—including the principles that have made it a place of unparalleled opportunity, safety, and freedom for so many. Fighting it is the least we owe this country. 

Our friends at Tablet have been amazing collaborators over the past few weeks. This piece was originally published as part of a Tablet symposium about what we in the West should do after October 7.


Reposted from the Free Press website for ease of reading and sharing
https://www.thefp.com/p/end-dei-woke-capture

Sunday, April 26, 2020

The Bearer of Good Coronavirus News - under attack for questioning the prevailing wisdom about lockdown


We are already grading the response to the Coronavirus. 

Too soft, too firm, or just about right?

This point of view is important. The fact that this professor is now being attacked for simply pointing out a few obvious facts is not right. Is it that some have an agenda?


Original STAT Article referenced in this article  HERE
Profile of Professor Ioannidis on Stanford Website HERE

Original Wall Street Journal article (this one below) Link HERE

OPINION | THE WEEKEND INTERVIEW
The Bearer of Good Coronavirus News

Stanford scientist John Ioannidis finds himself under attack for questioning the prevailing wisdom about the lockdown.

By Allysia Finley
Updated April 24, 2020 5:14 pm


Defenders of coronavirus lockdown mandates keep talking about science. “We are going to do the right thing, not judge by politics, not judge by protests, but by science,” California’s Gov. Gavin Newsom said this week. Michigan Gov. Gretchen Whitmer defended an order that, among other things, banned the sale of paint and vegetable seeds but not liquor or lottery tickets. “Each action has been informed by the best science and epidemiology counsel there is,” she wrote in an op-ed.

But scientists are almost never unanimous, and many appeals to “science” are transparently political or ideological. Consider the story of John Ioannidis, a professor at Stanford’s School of Medicine. His expertise is wide-ranging—he juggles appointments in statistics, biomedical data, prevention research and health research and policy. Google Scholar ranks him among the world’s 100 most-cited scientists. He has published more than 1,000 papers, many of them meta-analyses—reviews of other studies. Yet he’s now found himself pilloried because he dissents from the theories behind the lockdowns—because he’s looked at the data and found good news.

In a March article for Stat News, Dr. Ioannidis argued that Covid-19 is far less deadly than modelers were assuming. He considered the experience of the Diamond Princess cruise ship, which was quarantined Feb. 4 in Japan. Nine of 700 infected passengers and crew died. Based on the demographics of the ship’s population, Dr. Ioannidis estimated that the U.S. fatality rate could be as low as 0.025% to 0.625% and put the upper bound at 0.05% to 1%—comparable to that of seasonal flu.

“If that is the true rate,” he wrote, “locking down the world with potentially tremendous social and financial consequences may be totally irrational. It’s like an elephant being attacked by a house cat. Frustrated and trying to avoid the cat, the elephant accidentally jumps off a cliff and dies.”

Dr. Ioannidis, 54, likes metaphors. A New York native who grew up in Athens, he also teaches comparative literature and has published seven literary works—poetry and fiction, the latest being an epistolary novel—in Greek. In his spare time, he likes to fence, swim, hike and play basketball.

Early in his career, he realized that “the common denominator for everything that I was doing was that I was very interested in the methods—not necessarily the results but how exactly you do that, how exactly you try to avoid bias, how you avoid error.” When he began examining studies, he discovered that few headline-grabbing findings could be replicated, and many were later contradicted by new evidence.

Scientific studies are often infected by biases. “Several years ago, along with one of my colleagues, we had mapped 235 biases across science. And maybe the biggest cluster is biases that are trying to generate significant, spectacular, fascinating, extraordinary results,” he says. “Early results tend to be inflated. Claims for significance tend to be exaggerated.”

An example is a 2012 meta-analysis on nutritional research, in which he randomly selected 50 common cooking ingredients, such as sugar, flour and milk. Eighty percent of them had been studied for links to cancer, and 72% of the studies linked an ingredient to a higher or lower risk. Yet three-quarters of the findings were weak or statistically insignificant.

Dr. Ioannidis calls the coronavirus pandemic “the perfect storm of that quest for very urgent, spectacular, exciting, apocalyptic results. And as you see, apparently our early estimates seem to have been tremendously exaggerated in many fronts.”

Chief among them was a study by modelers at Imperial College London, which predicted more than 2.2 million coronavirus deaths in the U.S. absent “any control measures or spontaneous changes in individual behaviour.” The study was published March 16—the same day the Trump administration released its “15 Days to Slow the Spread” initiative, which included strict social-distancing guidelines.

Dr. Ioannidis says the Imperial projection now appears to be a gross overestimate. “They used inputs that were completely off in some of their calculation,” he says. “If data are limited or flawed, their errors are being propagated through the model. . . . So if you have a small error, and you exponentiate that error, the magnitude of the final error in the prediction or whatever can be astronomical.”
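Dr. Ioannidis’s point about exponentiated error can be sketched in a few lines. This is a hypothetical toy projection, not any real epidemic model; the numbers are invented purely to illustrate the mechanism he describes:

```python
import math

# Toy exponential projection: cases = initial * e^(rate * days).
# All figures below are made up for illustration only.
def projected_cases(initial, growth_rate, days):
    return initial * math.exp(growth_rate * days)

true_rate = 0.20    # assumed "true" daily growth rate
noisy_rate = 0.25   # same rate measured with a small absolute error

days = 60
true_proj = projected_cases(100, true_rate, days)
noisy_proj = projected_cases(100, noisy_rate, days)

# A 0.05 error in the input rate, exponentiated over 60 days,
# becomes roughly a 20-fold error in the final projection.
print(f"ratio of projections: {noisy_proj / true_proj:.1f}")
```

A tiny slip in the measured growth rate, compounded day after day, is exactly the kind of "astronomical" final error he warns about.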

“I love models,” he adds. “I do a lot of mathematical modeling myself. But I think we need to recognize that they’re very, very low in terms of how much weight we can place on them and how much we can trust them. . . . They can give you a very first kind of mathematical justification to a gut feeling, but beyond that point, depending on models for evidence, I think it’s a very bad recipe.”

Modelers sometimes refuse to disclose their assumptions or data, so their errors go undetected. Los Angeles County predicted last week that 95.6% of its population would be infected by August if social distancing orders were relaxed. (Confirmed cases were 0.17% of the population as of Thursday.) But the basis for this projection is unclear. “At a minimum, we need openness and transparency in order to be able to say anything,” Dr. Ioannidis says.

Most important, “what we need is data. We need real data. We need data on how many people are infected so far, how many people are actively infected, what is really the death rate, how many beds do we have to spare, how has this changed.”

That will require more testing. Dr. Ioannidis and colleagues at Stanford last week published a study on the prevalence of coronavirus antibodies in Santa Clara County. Based on blood tests of 3,300 volunteers in the county—which includes San Jose, California’s third-largest city—during the first week of April, they estimated that between 2.49% and 4.16% of the county population had been infected. That’s 50 to 85 times the number of confirmed cases and implies a fatality rate between 0.12% and 0.2%, consistent with that of the Diamond Princess.

The study immediately came under attack. Some statisticians questioned its methods. Critics noted the study sample was not randomly selected, and white women under 64 were disproportionately represented. The Stanford team adjusted for the sampling bias by weighting the results by sex, race and ZIP Code, but the study acknowledges that “other biases, such as bias favoring individuals in good health capable of attending our testing sites, or bias favoring those with prior Covid-like illnesses seeking antibody confirmation are also possible. The overall effect of such biases is hard to ascertain.”

Dr. Ioannidis admits his study isn’t “bulletproof” and says he welcomes scrutiny. But he’s confident the findings will hold up, and he says antibody studies from around the world will yield more data. A study published this week by the University of Southern California and the Los Angeles County Department of Public Health estimated that the virus is 28 to 55 times as prevalent in that county as confirmed cases are. A New York study released Thursday estimated that 13.9% of the state and 21.2% of the city had been infected, more than 10 times the confirmed cases.

Yet most criticism of the Stanford study has been aimed at defending the lockdown mandates against the implication that they’re an overreaction. “There’s some sort of mob mentality here operating that they just insist that this has to be the end of the world, and it has to be that the sky is falling. It’s attacking studies with data based on speculation and science fiction,” he says. “But dismissing real data in favor of mathematical speculation is mind-boggling.”


In part he blames the media: “We have some evidence that bad news, negative news [stories], are more attractive than positive news—they lead to more clicks, they lead to people being more engaged. And of course we know that fake news travels faster than true news. So in the current environment, unfortunately, we have generated a very heavily panic-driven, horror-driven, death-reality-show type of situation.”

The news is filled with stories of healthy young people who die of coronavirus. But Dr. Ioannidis recently published a paper with his wife, Despina Contopoulos-Ioannidis, an infectious-disease specialist at Stanford, that showed this to be a classic man-bites-dog story. The couple found that people under 65 without underlying conditions accounted for only 0.7% of coronavirus deaths in Italy and 1.8% in New York City.

“Compared to almost any other cause of disease that I can think of, it’s really sparing young people. I’m not saying that the lives of 80-year-olds do not have value—they do,” he says. “But there’s far, far, far more . . . young people who commit suicide.” If the panic and attendant disruption continue, he says, “we will see many young people committing suicide . . . just because we are spreading horror stories with Covid-19. There’s far, far more young people who get cancer and will not be treated, because again, they will not go to the hospital to get treated because of Covid-19. There’s far, far more people whose mental health will collapse.”

He argues that public officials need to weigh these factors when making public-health decisions, and more hard data from antibody and other studies will help. “I think that we should just take everything that we know, put it on the table, and try to see, OK, what’s the next step, and see what happens when we take the next step. I think this sort of data-driven feedback will be the best. So you start opening, you start opening your schools. You can see what happens,” he says. “We need to be open minded, we need to just be calm, allow for some error, it’s unavoidable. We started knowing nothing. We know a lot now, but we still don’t know everything.”

He cautions against drawing broad conclusions about the efficacy of lockdowns based on national infection and fatality rates. “It’s not that we have randomized 10 countries to go into lockdown and another 10 countries to remain relatively open and see what happens, and do that randomly. Different prime ministers, different presidents, different task forces make decisions, they implement them in different sequences, at different times, in different phases of the epidemic. And then people start looking at this data and they say, ‘Oh look at that, this place did very well. Why? Oh, because of this measure.’ This is completely, completely opinion-based.”

People are making “big statements about ‘lockdowns save the world.’ I think that they’re immature. They’re tremendously immature. They may have worked in some cases, they may have had no effect in others, and they may have been damaging still in others.”

Most disagreements among scientists, he notes, reflect differences in perspective, not facts. Some find the Stanford study worrisome because it suggests the virus is more easily transmitted, while others are hopeful because it suggests the virus is far less lethal. “It’s basically an issue of whether you’re an optimist or a pessimist. Even scientists can be optimists and pessimists. Probably usually I’m a pessimist, but in this case, I’m probably an optimist.”

Ms. Finley is a member of the Journal’s editorial board.

Saturday, March 14, 2020

Comparing Nuclear War to the Coronavirus


It’s now clear that COVID-19 is a deadly serious global pandemic, and all necessary precautions should be taken. Still, C. S. Lewis’s words—written 72 years ago—ring with some relevance for us. Just replace “atomic bomb” with “coronavirus.”
In one way we think a great deal too much of the atomic bomb. “How are we to live in an atomic age?” I am tempted to reply: “Why, as you would have lived in the sixteenth century when the plague visited London almost every year, or as you would have lived in a Viking age when raiders from Scandinavia might land and cut your throat any night; or indeed, as you are already living in an age of cancer, an age of syphilis, an age of paralysis, an age of air raids, an age of railway accidents, an age of motor accidents.”
In other words, do not let us begin by exaggerating the novelty of our situation. Believe me, dear sir or madam, you and all whom you love were already sentenced to death before the atomic bomb was invented: and quite a high percentage of us were going to die in unpleasant ways. We had, indeed, one very great advantage over our ancestors—anesthetics; but we have that still. It is perfectly ridiculous to go about whimpering and drawing long faces because the scientists have added one more chance of painful and premature death to a world which already bristled with such chances and in which death itself was not a chance at all, but a certainty.
This is the first point to be made: and the first action to be taken is to pull ourselves together. If we are all going to be destroyed by an atomic bomb, let that bomb when it comes find us doing sensible and human things—praying, working, teaching, reading, listening to music, bathing the children, playing tennis, chatting to our friends over a pint and a game of darts—not huddled together like frightened sheep and thinking about bombs. They may break our bodies (a microbe can do that) but they need not dominate our minds.
— “On Living in an Atomic Age” (1948) in Present Concerns: Journalistic Essays
Original Link

Why H1N1 of 2009 is not like the Coronavirus



Remember the Last Global Pandemic? Probably Not

In 2009, a new strain of H1N1 influenza emerged. It did not cause anywhere near the disruption that Covid-19 has, and for good reason.
In late March 2009, two kids living more than 100 miles apart in Southern California came down with the flu. By mid-April, their illnesses had been diagnosed as being caused by a new strain of H1N1 influenza, aka swine flu. Flu outbreaks that had started a few weeks earlier in Mexico were soon ascribed to the new H1N1 as well. On April 25, with cases confirmed or suspected in 19 Mexican states and five U.S. ones, the World Health Organization declared the disease’s spread a “Public Health Emergency of International Concern.”
Swine flu has a history that makes health authorities pay special heed. In 1918, a variant of H1N1 influenza caused a global pandemic that is estimated to have killed as many as 50 million people, or 2.7% of the world’s population. After tests found H1N1 in two soldiers during a flu outbreak at the Fort Dix army base in New Jersey in 1976, the U.S. government jumped into action, with President Gerald Ford announcing a plan to vaccinate “every man, woman, and child in the United States.” That turned into something of a debacle, though, as the virus didn’t seem to spread beyond Fort Dix and the hastily assembled vaccine killed about 30 people.
In 2009, the reaction was more muted. In its public-health-emergency declaration in April, the WHO noted that the illnesses caused by the new H1N1 tended to be quite mild, with only one brief hospitalization and no deaths from the 20 confirmed U.S. cases. It also advised against any travel restrictions or border controls. When the U.S. government declared its own public health emergency the next day, Secretary of Homeland Security Janet Napolitano termed it more of a “declaration of emergency preparedness.”
It’s like declaring one for a hurricane. It means we can release funds and take other measures. The hurricane may not actually hit.
The hurricane did hit, although in some ways it was more like a tropical storm. The virus continued its spread, with the Centers for Disease Control and Prevention switching over on May 4 from counting confirmed cases to making estimates. As of May 5, 980 schools with 607,778 students had been closed in an effort to slow the epidemic. By late June, the CDC was estimating that 1 million Americans had contracted the disease. Meanwhile, on June 11, WHO Director-General Margaret Chan had declared that with the virus spreading in 74 countries “the world is now at the start of the 2009 influenza pandemic.” She also said that a vaccine was on the way, and that measures had been taken “to ensure the largest possible supply of pandemic vaccine in the months to come.”
By the time the vaccines became widely available in November, though, H1N1 was already on the decline. By January, many countries were canceling their vaccine orders, and a German physician and former Social Democratic politician was leading a campaign lambasting the WHO for declaring a “fake” pandemic to gin up business for pharmaceutical manufacturers.
That doesn’t seem fair, given that H1N1 did infect as much as 24% of the world’s population. The overall fatality rate was quite low, at about 0.02% of estimated cases — five times lower than the 0.1% average fatality rate for the seasonal flu — but that’s mainly because H1N1 had little effect on the demographic usually hit hardest by influenza: those 65 and older. For younger people, H1N1 was more dangerous than the seasonal flu, and in countries in South Asia and Africa with youthful populations the H1N1 pandemic really was a big deal, with the CDC later estimating a global death toll ranging from 151,700 to 575,400.
Still, that’s lower than the range that the CDC and WHO now put on the annual death toll from seasonal flu: 290,000 to 650,000. In the U.S., an estimated 60.8 million people contracted the new H1N1 virus from April 2009 through April 2010, 274,304 were hospitalized and 12,469 died. Because the CDC changed the statistical model it uses to make such estimates in 2010, that last number can’t really be compared to recent estimates of seasonal flu fatalities, which ranged from 12,000 in 2011-2012 to 61,000 in 2017-2018. But earlier estimates of overall flu-related deaths in 2008-2009 and 2009-2010 indicate that both flu seasons were less deadly than average.
I bring all this up of course because we are in the throes of a new global virus outbreak, although current WHO Director-General Tedros Adhanom Ghebreyesus has so far refused to call it a “pandemic.” I’ll admit that I had entirely forgotten about the H1N1 pandemic until a couple of readers emailed to ask about its absence from a column I wrote last week about the risks posed by the new coronavirus.
Calling attention to the 2009 pandemic has become a theme in pro-Donald-Trump circles, with extremely similar articles on PJ Media, Red State and Printly all claiming that President Barack Obama didn’t declare a public health emergency until the H1N1 outbreak had been raging for months (as seen above, the public health emergency was declared less than two weeks after the virus was discovered, although Obama did up that to a “national emergency” in late October). President Trump himself argued on Twitter that “the April 2009-10 Swine Flu, where nearly 13,000 people died in the U.S., was poorly handled.” Such charges are to some extent just “whataboutism,” a propaganda technique used heavily by the Soviet Union back in the day to divert attention from misdeeds and problems by calling attention to the purported misdeeds and problems of others. But comparing Covid-19 with H1N1 can shed some light on why the former has elicited the reaction it has.
For example: Why was H1N1 allowed to spread around the world more or less unchecked, while countries are going to far greater lengths to try to halt Covid-19? Why did the WHO call H1N1 a pandemic but not Covid-19? Isn’t 12,469 deaths a lot worse than the 26 that have been attributed to Covid-19 in the U.S. so far?


That last one is the simplest to answer: Covid-19 is near the beginning of its spread in the U.S., and thus cannot be compared with H1N1’s effect over a full year. If the U.S. death toll from Covid-19 is only 12,469 a year from now, that will likely be counted as a great success. The legitimate worry is that it could be many, many times higher, because Covid-19 is so much deadlier for those who get it than the 2009 H1N1 influenza was.
How much deadlier is still unknown, but of the cases reported to the WHO so far 3.4% have resulted in fatalities. That’s probably misleadingly high because there are so many unreported cases, and in South Korea, which has done the best job of keeping up with the spread of the virus through testing, the fatality rate so far is about 0.7%. But even that is 35 times worse than H1N1 in 2009 and 2010. Multiply 12,469 by 35 and you get 436,415 — which would amount to the biggest U.S. infectious-disease death toll since the 1918 flu. Hospitalization rates are also many times higher for Covid-19, meaning that if it spread as widely as H1N1 it would overwhelm the U.S. health-care system.
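The back-of-the-envelope arithmetic above can be checked directly. The inputs are the figures quoted in the column itself; the result is a rough illustration of scale, not a forecast:

```python
# Rough check of the column's arithmetic, using its own quoted figures.
h1n1_fatality = 0.0002   # ~0.02% estimated fatality rate, 2009 H1N1
covid_fatality = 0.007   # ~0.7% observed in South Korea at the time

multiplier = covid_fatality / h1n1_fatality
print(round(multiplier))   # 35: Covid-19 roughly 35x deadlier

h1n1_us_deaths = 12469
# Hypothetical toll if Covid-19 spread as widely as 2009 H1N1:
print(h1n1_us_deaths * round(multiplier))   # 436415
```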
That’s one very important reason governments (and stock markets) around the world have reacted so much more strongly to Covid-19 than to the 2009 H1N1 pandemic. Another reason is somewhat more hope-inspiring. It’s that public health experts generally don’t think influenza can be controlled once it starts spreading, other than with a vaccine, whereas several Asian countries seem to have successfully turned back the coronavirus tide, for now at least.
Influenza can’t be controlled because as much as half the transmission of the disease occurs before symptoms appear. With Covid-19 that proportion seems to be lower, meaning that even though it’s more contagious than influenza once symptoms appear, it may be possible to control by testing widely and quickly isolating those who have the disease. This is one reason (there are others) the WHO’s Tedros won’t call it a pandemic. “The threat of a pandemic has become very real,” he said Monday. “But it would be the first pandemic in history that could be controlled.” H1N1 couldn’t be controlled in 2009, but was mild enough that this did not lead to disaster. Covid-19 is a much more dangerous disease that maybe, just maybe, can be stopped.
This column does not necessarily reflect the opinion of Bloomberg LP and its owners.
To contact the author of this story:
Justin Fox at justinfox@bloomberg.net
To contact the editor responsible for this story:
Stacey Shick at sshick@bloomberg.net

    Wednesday, February 5, 2020

    Milton Friedman’s World Is Dead and Gone



    Economics

    Milton Friedman’s World Is Dead and Gone

    Five overlooked historical developments should reshape the debate between shareholder and stakeholder capitalists.

    The annual conclave of the rich and powerful this month in Davos, Switzerland, put the longstanding debate about the social responsibility of corporations front and center by proclaiming its official theme as “stakeholders for a cohesive and sustainable world.”
    By using the word “stakeholders,” the World Economic Forum confirmed that it’s taken sides in a debate rekindled last year by the Business Roundtable, a lobbying group representing chief executives of major U.S. corporations. The Roundtable had issued a statement highlighting a “fundamental commitment to all of our stakeholders,” including shareholders, clients, employees, suppliers and communities, thereby situating itself in opposition to the view of corporate responsibility made popular half a century ago by the economist Milton Friedman. Friedman had famously stated in a 1970 New York Times magazine essay that business executives who diverted corporate assets toward social goals were betraying their obligations to shareholders.

    >> Here is the link to the original article. The author overstates Friedman's case.
    >> http://umich.edu/~thecore/doc/Friedman.pdf
    >> The author also forgets that Friedman's point is much broader, and not untrue:
    >> In each of these cases, the corporate executive would be spending someone else's money for a general social interest. Insofar as his actions in accord with his "social responsibility" reduce returns to stockholders, he is spending their money. Insofar as his actions raise the price to customers, he is spending the customers' money. Insofar as his actions lower the wages of some employees, he is spending their money.
    Yet the debate between Roundtable supporters and Friedman supporters, a main topic of panels and unofficial conversation at Davos last week, missed five key points.
    1. When Friedman was writing, the consequences of his view were more modest than they later became. In the 1960s and 1970s the regulatory state was often more interventionist than it is today — especially in industries such as transportation and telecommunications — and social norms were different. Before the deregulation wave of the 1970s and 1980s, arguing that a business executive should focus only on maximizing shareholder value may or may not have been wrong in theory, but its practical impact was less significant. Whether business leaders pursued a narrow or broad definition of their responsibilities didn’t matter as much because government regulation constrained the consequences.
    2. In no small part because of Friedman’s influence, an extreme definition of capitalism has become dominant. By this definition, only perfectly competitive markets with minimally interventionist governments and business executives who maximize shareholder value should be considered capitalist. It’s a strange argument. I doubt that anyone would have said during the 1940s, 1950s or 1960s that the U.S. wasn’t capitalist, but somehow the qualification standards seem to have changed. I heard one person argue at Davos that regulating or even taxing carbon would be “anti-capitalist.” That’s nonsense. Virtually the entire range of policy options for more or less government action on climate change would not, if enacted, affect whether an economy remains capitalist.
    3. As Friedman’s worldview as taught in introductory economics classes became more dominant, policymakers emphasized the effect of incentives and individual skills. Economists focused on assessing how much more productive an individual could be if she faced a lower marginal tax rate or had more education. Studying, instead, how much more productive an individual could be if she worked at Company A instead of Company B, or lived in City X instead of City Y, went out of fashion. And yet the evidence over the past few decades shows the importance of the place-based perspective, with growing differences in productivity and wages for otherwise similar individuals working at different firms, growing differences in returns on capital across firms, and growing differences in upward mobility for people living in different cities.
    4. The evolving view of government’s proper role and the emphasis on individual-based policy instead of place-based policy coincided with fundamental changes in the global economy, especially a substantial expansion in the effective global labor supply, and the evolution of the computer era. Over roughly the same period, the U.S. experienced a disproportionate rise in political polarization, as a new analysis from the National Bureau of Economic Research shows. The authors of that article argue that diverging views among elites (plausibly over the role of government, though the authors don’t make that argument) may be the cause of the rapid rise in broader polarization relative to other countries. At the very least, it’s interesting that the country that has most forcefully adopted the Friedman-inflected approach to policy has polarized the most.
    5. Some people who want businesses to adopt a broad view of corporate responsibility argue that companies have to fill a gap left by the diminishing effectiveness of government. (They might not realize that Friedman addressed that issue in his 1970 essay.) Like the old saw about the child who murders her parents and then complains about being an orphan, however, the dominant paradigm of the past several decades has plausibly produced a dramatic rise in inequality and polarization, and that polarization in turn has made the government unable to function effectively. In other words, we have basically done this to ourselves.
    So what is the best pathway forward? There is no easy fix, but I like many recent ideas about making public investments and regulatory adjustments to encourage the creation of business ecosystems like technology hubs, as has been done in Palo Alto, California; Austin, Texas; and Boston. That would require more government action than is likely in the near term. But as Friedman’s success in altering the dialogue demonstrates, a first step is to be clear about what we should be doing, even before we’re capable of doing it. And an approach to policy making focused more on where we work and live seems vastly more promising than what we’ve tried over the past few decades.