
The Coronavirus signals need for reform of U.S. policies for approval of vaccines and advanced cures

I sent a message similar to this essay to Robyn Dixon, author of an article on Russian science and vaccine development in the Washington Post yesterday, February 9, 2021. The Dixon article quoted the scientist and journalist Irina Yakutenko as saying that “you should do everything according to the protocols. It takes a long, long time. It takes a lot of money.” That has been true for U.S. policies, which have required ten years to release new vaccines*. But the “miraculous” speed of vaccine development in 2020 tells us that those medical policies are grievously outdated and need to change. I copied this message to Senators Tim Kaine and Mark Warner, encouraging them to explore reforms with Senate colleagues and NIH Director Francis Collins. Republicans would likely agree about the importance of reform.

The length of time needed for vaccine development is due to the extreme rigor of U.S. requirements for vaccines and other critical cures, which in turn reflects the concern to minimize adverse effects. The positive potential of a new vaccine can be confirmed in a dozen cases, but ruling out a 1-in-1,000 adverse effect may require years and trials with 6,000 persons. The FDA operates in the world’s most litigious nation and is risk-averse. We saw what happened in 2020 when excess caution was swept aside because of the emergency created by the coronavirus. The speed of the approval was startling for our system, but other nations produced vaccines in the same time frame. Sixty-three coronavirus vaccines have been reported in clinical development. Because of the U.S.’s overwhelming dominance in research funding and the rigor and reputation of the National Institutes of Health, the sponsor of federally supported trials, our protocols are widely adopted in Germany and other EU nations.
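A back-of-the-envelope calculation of my own, offered only as an illustration, shows why the numbers get so large. If an adverse effect strikes 1 in 1,000 recipients, the chance that a trial with n participants sees no such event at all is (1 − 1/1000)^n. With n = 1,000 that chance is still about 37 percent, so the effect could easily be missed; with n = 3,000 it falls to roughly 5 percent; only around n = 6,000 does it drop to about 0.25 percent, making it nearly certain that even one such rare event will show up in the trial.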

A new vaccine can cost $500 million to $2 billion. This leads to exorbitant treatment costs and a lack of attention to rarer diseases that could be cured. An example is my wife, Lucy, who has a rare “SCA 8” ataxia that could be readily cured by gene editing – but it can’t get attention.

A sleeper factor also holds back treatment in America. The Washington Post article mentions scientific publication as being desirable for Russian medical development. To the extent that they report new knowledge and advances, scientific publications play critical roles. But the U.S. suffers from a flood of excess clinical publications. Reports offer many promising new treatments “for the future” while there is a dearth of new treatment opportunities today. The reason is that it is more advantageous for medical researchers to apply for research funding and get their names in print or in the news for promising developments than to take the risks of moving to formal treatment. The latter receives little public recognition while it incurs major risks of lawsuits over new procedures. Risk aversion operates on clinics as well as clinicians.

In 2016 I became personally familiar with a pioneering Austrian heart surgeon who saved the life of an American composer who had a heart attack while attending a concert in Vienna. Dr. Werner Mohl** developed a procedure for restoring heart tissue damaged in heart attacks. The American would probably have died in the U.S. because the procedure would not have been authorized until clinical trials proved its efficacy.

* Vaccines, 5th ed. Philadelphia: Saunders, 2008.

** https://esc365.escardio.org/Person/304114-prof-mohl-werner

Categories: American history, Economy, Environmental policy, Industry

America’s Paradox: Low Taxes on Corporations — and Higher Taxes on the Execs Who Run Them Could Stimulate U.S. Manufacturing

U.S. progressives call for higher corporate and personal taxes. Conservatives want no tax increases. Given that the Biden “green” plan at its core requires a robust American manufacturing sector, we need a more nuanced tax scenario.

The Biden campaign proposal to increase the nation’s corporate tax rate to 28 percent would keep that rate below pre-Trump corporate tax levels but would still put the United States above most European rates and leave American manufacturing at a competitive disadvantage.

Would federal policies that mandate the purchase of U.S.-made products offset this disadvantage? Mandates work mainly for subsidized activities. These tend to involve insider contracts rather than promote entrepreneurship. Germany has become an international manufacturing powerhouse — with large foreign exchange surpluses but industrial wage and benefit levels nearly double what U.S. corporations provide — without such mandates. Germany has a corporate tax rate of 15 percent, plus a 5 percent “solidarity” assessment.

On personal taxes, we have counterintuitive reasons for increasing progressivity beyond Biden’s proposed 39.6 percent top rate. The first reason looks back to the 1970s, when the United States suffered disproportionate deindustrialization. We hemorrhaged products and industries where we had traditionally excelled. European nations didn’t abandon traditional and new industries. Finland produces those beautiful cruise liners. Italy retains its shoe and clothing industries and leads in popular granite table tops and other stone products. Sweden’s IKEA is the world’s leading home furnishings company, and Volvo ranks as the second biggest truck maker after Daimler. Sweden also shares with Denmark the distinction of having the world’s highest personal tax rates on high incomes. In the United States, we have one of the lowest.

The subsequent Reagan administration’s relegitimization of business may have been timely. But the Reagan reduction of top personal tax rates from 70 to 28 percent led to executive pay becoming linked to corporate profits. This stimulated exponential increases in executive compensation (as well as lawyers’ and others’ pay), and those increases, in turn, encouraged corporate executives to pursue short-term profits over long-term goals.

No U.S. top executive pursued short-term profits with more zest and celebrity than General Electric’s CEO, Jack Welch, who retired in 2001 with a record severance of $457 million. The culture Welch installed at GE created bubble growth and an ultimate crash for the historic manufacturing company.

In 2014 GE contracted to sell the last major U.S. suite of household appliances — its appliance division — to Sweden’s Electrolux, a move designed to gain cash for more profitable investments. Earlier such investments included high-finance and real estate ventures that brought GE profits while deemphasizing making products and meeting U.S. technological needs. GE’s German counterpart, Siemens, took a different course. It continued its traditional long-term strategies and today rates as a major global player across the technological spectrum, as well as a key partner in Germany’s proactive climate-change policies.

The lesson I take from all this: America’s astoundingly high top incomes have a negative effect on our national productivity. Nor do top earners stimulate productivity. They buy expensive real estate like Bill Gates’ $147 million primary residence. They purchase foreign properties and luxury goods, art objects and Learjets, expensive security and financial services. They invest for personal income security and growth, but as Jack Welch demonstrated, investments for the highest possible returns do not necessarily serve national needs.

The incomes of our top earners have an additional negative impact. They grow our nation’s unsupportable inequality. Among advanced nations, we have the greatest disparity between rich and poor. Before 1970 the lowest income quintiles in the United States saw the fastest growth. In a post-war America where top-bracket personal tax rates never dipped below 70 percent, blue-collar breadwinners could support families on one income. Since 1970 incomes in our lowest quintile have remained nearly flat in inflation-adjusted terms, with our bottom 50 percent owning a mere 1 percent of the nation’s wealth. Older families, the St. Louis Federal Reserve Bank reports, have 12 times the wealth of younger families.

These extremes have triggered record public support for more radical political policies, with disaffection concentrated among younger voters. We will have no economic and political stability in the United States until we correct these imbalances.

Categories: American history, Journalism, Policy and Politics, Science and education

When NPR shut down its user comment line in 2016, it missed a major opportunity

Summary

NPR’s comment line was obviously troubled when it was shut down in 2016. But there had been no guidelines for contributions to it, nor were significant corrections made when abuses began clogging the system. As the premier U.S. public radio news network, NPR should have taken on a generic problem facing comment lines nationally – especially for news media. Rather than giving up, it should have developed strategies and led in raising national standards for comment lines. I suggest it can still succeed if it readdresses the problem in a new comment line.

====================

History: On August 23, 2016, National Public Radio shut down its comment line. According to a statement by NPR’s ombudsman (Jensen, 2016), NPR took this action after reviewing statistics showing that in July 2016 there were 491,000 comments from only 19,400 commenters. Letting cliques of users treat NPR’s site like a personal chat line was clearly not tolerable.

NPR statements: NPR claimed it lacked money for moderators and that other news programs had taken similar action. It suggested social media like Facebook might be better sites for discussion. I find these arguments inexcusably superficial. They overlook the rhinoceros in the room: is it a good thing for the federally supported, premier U.S. public radio network to eliminate opportunities for meaningful public comment and discussion?

Sure, viewers can send an email to NPR program managers or hosts, getting a form response like “thank you for your expression of opinion”. But let’s be realistic. Most people with meaningful things to say will no more choose to waste their time sending mail to faceless administrative staff than to participate in the earlier comment line – degraded as it became by trivial input or worse. Lack of meaningful input through NPR’s current contact addresses may lead program managers to assume that there is little need or useful value in reopening channels for public comment. Wrong!

I don’t say NPR has the same policies as Lenya.ru, a leading Russian radio program that delivers sanitized news and commentary. However, it now shares a no-public-input policy with the Russian program.

Critical omissions: NPR placed no limits on contributors. Ombudsman Jensen reported that in June and July of 2016, 4,300 commenters contributed an average of 145 comments apiece! Correcting this kind of problem could be straightforward: Kojo Nnamdi’s WAMU radio interview show from Washington D.C. limits comments to one a month. It doesn’t require live moderators to monitor comment frequency. That can be done by software.
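As a rough illustration (my own sketch, not anything NPR or WAMU actually runs), a frequency rule like “one comment per user per month” takes only a few lines of code and no human moderator:

from datetime import datetime, timedelta

# Hypothetical in-memory record; a real system would use a database.
last_accepted = {}  # user_id -> datetime of that user's last accepted comment

def allow_comment(user_id, now=None, min_gap_days=30):
    """Accept a comment only if the user has not had one accepted in the
    last min_gap_days days (roughly one comment per month)."""
    now = now or datetime.utcnow()
    last = last_accepted.get(user_id)
    if last is not None and now - last < timedelta(days=min_gap_days):
        return False  # too soon; rejected automatically
    last_accepted[user_id] = now
    return True

The point is not the particular code but that the enforcement is mechanical and cheap.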

Neither did NPR provide clear guidelines for contributors, e.g. stating priority for comments offering corrections, alternative arguments, new information, or personal experience. Expressions of gut-level opinion can be found on a vast number of web sites. On NPR’s website such expressions add no value. On the contrary, they guarantee that though many people may be attracted by the site (NPR recorded 33 million unique users in July 2016), people with meaningful things to say will not read or contribute to the comment line. As with repeat commenters, software could go far to screen out brief, crude posts or, say, a report about a sick cat.
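Purely as an illustration of what such screening might look like (the length threshold and the tiny word list below are hypothetical placeholders, not a recommendation), a first-pass filter could be just a few lines:

BANNED_WORDS = {"idiot", "moron"}  # placeholder list, not a real policy

def passes_basic_screen(text, min_words=25):
    """Reject posts too short to carry an argument or containing banned words."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if len(words) < min_words:
        return False
    return not any(w in BANNED_WORDS for w in words)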

When problematic posts began to clog the system, NPR leadership did not take meaningful action. It apparently learned nothing from 2016, because in March 2018 NPR also ended its Science Commentary Blog (Jensen, 2018).

Yes, it’s a widespread problem: Ombudsman Jensen was right in referring to other news sites that had dropped comment lines. One of the early sites to take this action was Popular Science (2013). Others are CNN, NBC, and Vox. It’s not just an American problem. Stephen Pritchard, readers’ editor at The Guardian, a left-of-center UK newspaper, wrote that at one point the paper’s comment line ballooned to a hopeless 65,000 comments (Pritchard, 2018). Pritchard said that in the future

“Subjects such as race, immigration and Islam too often attracted toxic commentary, so henceforth they would only have comments open if a moderated, positive debate were deemed possible – one without racism, abuse of vulnerable subjects, author abuse or trolling”.

It appears that, as in the case of NPR, The Guardian did not have strategies in place to minimize “toxic” posts and chose to give up on an open comment line.

A perceptive Australian economics blogger describes toxic comments as a problem of incentive structure (Murphy, 2015):

For some idiot with anti-social views, this is his one chance to get his views amplified. The pay-off here is high. Normally he can’t get anyone to listen. But if he quickly writes something inflammatory, he can spend a happy afternoon jousting with people he made angry.

Murphy advocates the solution used by Reddit (Wikipedia, 2018), an American website that aggregates links and discussion on a variety of subjects. Its solution to undesirable comments is up and down votes: up votes raise the visibility of a comment, down votes lower it, and five down votes extinguish a comment.
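A minimal sketch of the voting logic as Murphy describes it (my own simplification, not Reddit’s actual ranking algorithm) might look like this:

from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    up_votes: int = 0
    down_votes: int = 0

    @property
    def visible(self):
        # Five down votes "extinguish" the comment, as described above.
        return self.down_votes < 5

    @property
    def score(self):
        return self.up_votes - self.down_votes

def ranked(comments):
    """Return visible comments, highest score (up minus down votes) first."""
    return sorted((c for c in comments if c.visible),
                  key=lambda c: c.score, reverse=True)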

There are two problems with Reddit’s approach for more thoughtful news and commentary sites. Popularity has never been a reliable criterion for vision in statesmen or quality in comments on complex or controversial issues. Abraham Lincoln was widely reviled in 1863. Further, the Wikipedia article describing Reddit notes that the site requires significant effort from moderators.

NPR’s leadership responsibility and opportunity: There’s no getting around the fact that contemporary public response to comment lines of serious news and commentary media is often problematic. Crude or thoughtless comments not only drive away thoughtful readers. They can also add to an unrealistic sense of social chaos and fragmentation, because the attitudes expressed may disproportionately reflect radical fringes not representative of the majority of the audience.

In my view the first thing needed is for program managers to openly address the realities. In seeking to restore a comment line, they need to call on listeners for input on guidelines and for help in making the opportunity to comment substantive and useful. Relatively simple and cheap measures can filter out a large proportion of posts that don’t meet stated standards. However, I also suggest that even contributors of coarse or trivial posts should be treated courteously and invited to come back in the future, taking advantage of constructive suggestions.

Another measure to attract useful comments may be a category for superior comments. Meaningful posts too long to meet normal guidelines may emerge if more serious and knowledgeable individuals begin to be attracted. Instead of simply cutting off their contributions, commenters who go past standard length limits could provide summaries to be posted on the main site, while the full comment goes to a special address. Such comments might include documentation for alternatives that challenge the ideas of commentators or guests. I often find that talk shows nominally featuring “both sides” of controversial issues overlook important factors. This may be a way to get these into the discussion.

No doubt, robust comment systems need managers committed to bringing in valuable input from the public. Why give up and say they are unaffordable? I argue that a first-rate comment line could enhance the value of NPR and provide a model that may help upgrade comment lines more generally. Wouldn’t that be a worthy goal for National Public Radio? PBS gets support for ambitious programs from a large group of wealthy individuals and foundations. Would not the prospect of restoring meaningful public comment to NPR attract culturally oriented foundations or organizations like the Pew Trust?

REFERENCES

Jensen, E. (2016, August 17). NPR Website To Get Rid Of Comments. Retrieved from https://www.npr.org/sections/ombudsman/2016/08/17/489516952/npr-website-to-get-rid-of-comments

Jensen, E. (2018). Shifting Opinions: NPR Ends Science Commentary Blog.

Murphy, J. (2015, September 14). The problem with online comments, solved. Thomas the Think Engine. Retrieved from https://thomasthethinkengine.com/2015/09/14/the-problem-with-online-comments-solved/

Pritchard, S. (2018, January 13). The Observer: The readers’ editor on… closing comments below the line. The Guardian. Retrieved from https://www.theguardian.com/commentisfree/2016/mar/27/readers-editor-on-closing-comments-below-line

Wikipedia. (2018). Reddit. Retrieved from https://en.wikipedia.org/wiki/Reddit

Categories: American history, Civil War, Personal

Confederate Monuments, Road Signs, And School Names: Don’t Put Them Out Of Sight And Mind

Why I’m writing about this subject. Current actions to remove Confederate statues and change the names of schools are in the news in Virginia. I have special reasons for offering suggestions on these sensitive issues.

In 2005 I had a “born-again African American experience”. An emeritus professor of chemistry at the University of Missouri, Kansas City, and I unexpectedly discovered information buried for 50 years. We found that two formerly segregated black high schools in Kansas City, Missouri, and Kansas City, Kansas, had dominated national science awards among all schools in Greater Kansas City through the 1950s and up to 1965 (Manheim and Hellmuth 2006). This experience indirectly led me to greater insights about the Civil War.

Growing up and attending school in Kansas City, MO, during segregation, I remember wondering what education in the black schools was like. I took it for granted that it would be second class. Classmates and I confidently assumed that our elite white high school was the best in Kansas City. So when research in 2005 showed that black schools had topped mine and the other white schools in my fields of interest, chemistry and science, it shocked me to the core. How did they do this against the odds of discrimination and other handicaps of the times? It transformed me and opened me to the black experience in America.

Getting back to high school days, I was fascinated by Civil War history. I knew slavery was wrong but am now ashamed to admit that I mentally separated the slavery issue from the military campaigns. I rooted for underdog Confederates and their colorful leaders like J.E.B. Stuart. I am afraid many Americans still separate Civil War battles from slavery.

The true horror of the Civil War. It was only sixteen years ago that I grasped the true nature of the Civil War. Think of the death of 600,000 soldiers, many through infected wounds, before Louis Pasteur’s germ theory of disease became known in the 1870s. Six hundred thousand men in 1864 is equivalent to 5.4 million in today’s population. Can we even begin to imagine the agonies of the men and the collective grief of affected families who lost sons, husbands, and fathers?

Unfortunately, that’s not the last word on this ghastly time in the nation’s history. Consider that half the nation went to war to defend the hideous institution of slavery at a time when Canada, Mexico, and European nations had already banned it – some (e.g. France) before the 14th Century. Great Britain’s Abolition Act banned slavery throughout the empire in 1834. Confederates did not die for a noble cause. Apologists have claimed that they fought for their culture, not slavery. That shallow argument won’t wash. Slavery was the only real bone of contention that separated the North and South. The Confederate states rebelled, seceded from the Union and began hostilities at Fort Sumter, South Carolina. Grievous failings in judgment lay with leaders and literate citizens of the South, as well as southern churches that justified slavery.

Robert E. Lee is on record writing that “slavery is a moral and political evil”, and he regretted Virginia’s secession from the Union. But he took the evil lightly, made excuses for it, and placed loyalty to Virginia and his perception of “honor” above the human values asserted in the Declaration of Independence: “We hold these truths to be self-evident, that all men are created equal”.

Lee and J.E.B. Stuart were intelligent, educated men who swore allegiance to the nation as part of their officer training at West Point. What were they thinking when they abandoned their nation and assumed leading roles in defense of the reprehensible institution that subjugated African Americans?

The chilling answer – already articulated in Alexis De Tocqueville’s classic book, Democracy in America (1835, 1840) – is that Americans from early in their history have been uniquely prone to be swept away by beliefs of the moment. The Founders knew about and feared this tendency. It is the reason they instituted multiple checks and balances in the Constitution. Don’t we see it continue to be displayed in today’s political developments?

So the Civil War is a bigger thing than most people realize. The last thing we should do is bury it out of sight and mind. That would sanitize the awful stain on the nation and let people forget the fateful mistakes in judgment that brought about events whose consequences have still not been completely overcome.

I suggest that – if we have the wisdom to face the realities – we preserve those monuments in museums or other well-kept places, accompanied by carefully crafted commentaries that remind us of the costs of ignoring history and reason. Preaching would be counterproductive – unsparing, nuanced reality would be most effective. The lessons of history will be stronger and more acceptable if we allow that men like Lee and Stuart had estimable qualities as well as flawed judgment. African Americans should contribute their insights to such projects, since they have the greatest stake in that history.

What about the names of boulevards and roads? Keeping names of notables linked to the Confederate rebellion could be an important educational opportunity – if the same kind of clarification were provided. Lee-Jackson Highway, close to our house in Fairfax, Virginia, memorializes Robert E. Lee and Thomas Jonathan (“Stonewall”) Jackson. Present and future generations need to be aware not only that they were great generals, but that they misguidedly led a bloody war defending an inhuman institution. Quarterboards or distinctive metallic plaques like those commemorating battlefields could remind future citizens of the ease with which decent humans can be drawn to inhuman causes.

School names often honor individuals in ways designed to offer inspiration for future generations. Confederate generals fail these standards in the contemporary world. It’s therefore only common sense to replace names of notables linked to the Confederacy. However, the original names should nevertheless be prominently displayed inside the schools. They could be placed in smaller letters underneath the new name, along with appropriate messages that remind schoolchildren of history and of the damage done by bad human decisions in the past.

I have seen letters to editors of local papers belittling the idea of attaching signs to monuments. Yes, it would be uncomfortable and doing it right isn’t necessarily an easy job. But neither putting monuments out of sight nor facile expressions of guilt are what’s needed. Monuments ought to go into actively-visited museums or historical sites; soul-searching and artistic expression are called for if we want future generations to learn from the past while truly burying the pain and conflict associated with it.

Reference: Manheim, Frank T., and Eckhard Hellmuth. 2006. “Achievers Obscured by History.” U.S. Black Engineer, June-July 2006.

Categories: American history, Policy and Politics

Seymour Martin Lipset and American Exceptionalism

If everything else about him is forgotten, the noted American sociologist Seymour Lipset (1922-2006) will surely be remembered for popularizing the term “American exceptionalism” through his 1996 book, American Exceptionalism: A Double-Edged Sword. Before I took up social science as a “second language” at George Mason University, Lipset’s last academic residence, I was a federal earth scientist avocationally interested in public policy. The only social scientists whose names I recall seeing at regular intervals in the American flagship journal for the sciences, Science, were Lipset, along with Robert Merton, Amitai Etzioni, and Don Price.

Lipset had omnivorous curiosity and interests. Among his many memberships and honors, he was the only person to serve as president of both the American Sociological Association and the American Political Science Association. Lipset could marshal vast amounts of statistical data, and he tossed out bold generalizations that other academics had not arrived at or were afraid to venture without their conclusions being quantitatively established by “empirical” studies.

For readers not familiar with social science methods, the terms “empirical” and “normative” need explanation. Once I got serious about social science, I learned that social scientists used the term “empirical” differently than it was used among natural scientists like biologists, chemists, geoscientists, and oceanographers. For me it loosely referred to relationships suggested by systematic observation, rather than studies involving theory. For social scientists, however, “empirical studies” focus on rigorous statistical testing of specific hypotheses and are “king” in leading journals like the American Journal of Political Science. The other major type of research recognized by the social sciences is the “normative” study. It involves a broader approach to problems and may use any relevant methods. Although this more holistic approach better fits decisionmakers’ needs and is the only kind accessible to the general public, normative studies have lower professional prestige than empirical studies.

Back to Lipset’s generalizations. In the introduction to American Exceptionalism, Lipset states flatly that the U.S. is the most religious country in Christendom, and the only one where churchgoers adhere to sects. Protestantism influenced opposition to wars and, earlier, affected American foreign policy. The U.S. disdain of authority led to the highest crime rate and the lowest level of voting participation in the developed world.

I found that Lipset’s generalizations had to be respected but taken advisedly. This is illustrated by the abovementioned claim that the U.S. had the lowest level of voting participation in the developed world. In fact, this statement only applies to recent years. The Wikipedia article on “American election campaigns in the 19th century” points out that elections in the midwestern states of Illinois, Indiana, Iowa, Michigan, and Ohio reached 95% voter turnout in 1896. Generally high voter turnouts continued after the turn of the 20th Century.

Lipset used the term “American exceptionalism” to assert that America is qualitatively different from all other nations. He indicates that this was first observed by the 19th Century French observer Alexis De Tocqueville in a famous book, Democracy in America, published in two volumes in 1835 and 1840. Besides lack of respect for authority, Lipset identifies other characteristic features of U.S. society. These include identification with creeds (moralism) rather than ethnic or other commonalities; firm belief that the U.S. is best and unique among nations; distrust of a strong state; aversion to state-provided welfare; weak working-class radicalism; and lack of a significant socialist movement. Americans were described as holding tightly to the principle of democracy; elections at various levels are more pervasive than in any other nation.

Lipset recognized that the strong American penchant for moral absolutism could lead to excesses. This is abundantly demonstrated by today’s partisan political polarization and Congressional gridlock – unique among advanced nations. Each contending faction declares in ringing pronouncements the rightness of its principles and the hopeless error of the opposition. Andrew Bacevich’s recent book (2008), The Limits of Power: The End of American Exceptionalism, cites a long list of disastrous policies the author attributes to American moral utopianism and hubris over the past 40 years. He cites chicanery, dirty tricks, “suppression of open discussion and insulation of error against public criticism”, blatant corruption, making common cause with dictatorial regimes, and squandering of billions of dollars, all of which were justified in the name of higher moral goals. In asserting the end of American exceptionalism Bacevich clearly means the “America is best” kind; he would presumably admit to exceptionalism on the undesirable side. Lipset, like most social scientists, avoids “good/bad” characterizations.

From my own studies of the history of political polarization in the U.S., I see a problem with Lipset’s exuberantly presented theories and generalizations. He often fails to test them, to look for exceptions and differences due to location, and especially to weigh the factors of time and history in assessing social problems. Wherever I look I find problems with stereotypes when history isn’t taken into account. Let’s take U.S. “high crime”, for example. The U.S. has not been a high-crime nation always and everywhere. Prior to the emergence of mobs in the 1920s, police in Chicago, Philadelphia, and New York, as well as in lesser cities, did not wear guns. FBI crime statistics for Massachusetts show a more than ten-fold increase in violent crime from 1960 to 1980 (see figure). Criminality before 1960 was no higher in Massachusetts and other northeastern states than in many European nations from which most immigrants came. This agrees with accounts that the U.S. was a far safer place before the 1960s than it is today. Women are said to have been able to walk safely late at night in Harlem, NY, in the 1930s and 40s.

From around 1910 to 1950 America was generally pragmatic and efficient in planning and maintaining societal infrastructure. The American railroad network, which reached its peak extent around 1910, provided extraordinarily comprehensive service for such a geographically large nation. Larger cities maintained efficient central management and planning systems that anticipated traffic and waste treatment needs. They rapidly accommodated the introduction of electric light and power. The construction and integration of the record-breaking Chrysler Building into Manhattan, completed in 1930, took only 20 months. The New York subway system maintained its nickel fare from 1904 to 1948, during which time it expanded the network.

Andrew Bacevich documents that American exceptionalism took on a more extreme character after 1960. But a conclusion not mentioned by Bacevich or Lipset is that parochialism and the tendency to embrace new ideas for society without reasonable testing were spearheaded by leading educational institutions. For example, before the 1960s virtually all colleges and universities required at least one foreign language for admission. Leading universities required two. U.S. industry, academia, and government kept in touch with foreign developments (recall that the early U.S. space program was led by a German scientist, Wernher von Braun). But in the 1960s language requirements were eliminated altogether by Princeton and MIT. Brown University announced with pride the elimination of all mandatory course requirements for undergraduate students. And from the mid-1960s on, Congress virtually ignored foreign experience in lawmaking.

The effect of the huge growth of the U.S. academic establishment after the 1950s, accompanied by disciplinary fragmentation and increasing disengagement from the nation’s practical affairs, is almost completely overlooked by Lipset, possibly obscured by his absorption with academic analysis and stimulating contacts with academic peers and students. This isn’t a trivial issue. Academic leaders have the resources and the responsibility to train the nation’s educated workforce and future leaders, conduct research, and identify national problems and potential solutions. But have economists, sociologists, and political scientists improved the nation’s conditions in their areas of activity since the 1960s? If not, why not? This, and the possibility that we can no longer afford even brilliant academic jousting as a respectable game, would have been great questions to ask Lipset were he still with us.

Categories: American history, Economy, Policy and Politics

The U.S. has entered a 21st Century “Gilded Age”. Can earlier history offer insights on reform?

The “Gilded Age” from 1870 to ~1890 was a time of rampant public and private corruption. Congressional seats could be bought and sold. “Robber Barons” made giant killings through monopolies and manipulation – and brought on devastating panics.

The signposts are all around us that ethics in government and society have deteriorated. In recent years, record increases in the proportion of national income going to the ultra-rich (Fig. 1), appointment of partisan political loyalists rather than competent officials to high-level agencies, the rising influence of money on elections and politics, and flagrant public and private lapses in ethics have caused a number of economists and historians to refer to U.S. society as having entered a “New Gilded Age”. Many citizens are angry and looking for change. How can such change come about?

My recent research comparing developments in the old Gilded Age with the New Gilded Age (1) shows that reform was out of reach as long as the public accepted the extravagant promises and favorable treatment they got from politicians. A turnaround became possible only after a wave of public revulsion over events of the early 1870s (the Grant administration) allowed real reformers to be elected to high public office. This process began with the election of 1876, but it took two decades to bring about clean, competent governmental agencies and even-handed executive actions (Theodore Roosevelt) that gained widespread public trust.

Postscript: Since this essay was last edited in January 2014, evidence of public dissatisfaction and anger in both parties has emerged in the 2016 primary election campaigns. This may be a precursor to real reform.

 

POLITICAL HISTORY

Experts agree that the founders of the United States represented a flowering of political talent and statesmanship that forged a new government system designed to anticipate human fallibilities affecting democracies. Historians have suggested that the writing of the Constitution was aided by the fact that while American colonists inherited Enlightenment ideas from Great Britain, they were largely self-governing and free from the deep corruption that characterized politics in the mother country (until the middle of the 19th Century).

The first six presidents maintained George Washington’s policies of basing appointments of federal employees on competence. President Andrew Jackson broke with this tradition. His administration (1829-1837) introduced the “spoils system” that led to turnover in government appointments in subsequent administrations.

Public and private corruption peaked in the administration of Ulysses S. Grant (1869-1877). It marked the beginning of a “Gilded Age” of unprecedented venality after the Civil War. An example is the Salary Grab Act of 1873. It doubled President Grant’s salary to today’s equivalent of $900,000/year and awarded each Congressman a one-time bonus equivalent to $90,000 in today’s dollars. Public outrage forced its repeal and helped support the rise of reform candidates in the subsequent presidential election. President Rutherford B. Hayes (1877-1881) committed himself to a single term in order to focus on reform of the federal government. Subsequent reforms culminated in the administration of Theodore Roosevelt (1901-1909). They brought about a system of efficient government operation with independent federal agencies that operated largely free of arbitrary interference.

Stresses on federal government operations after World War II included growing environmental concerns, the assassination of President Kennedy in 1963, and other developments. An environmental crisis triggered by the Santa Barbara offshore oil spill of 1969 caused Congress, in effect, to take over responsibility for environmental management from federal and state professional agencies through unprecedentedly detailed laws. Rigorous centralized intervention in basic economic activities and expanded roles for federal courts politicized environmental policy. The Democratic Party became the party of environment, and the Republican Party became the party of industry. Both parties reintroduced patronage systems with turnover in federal agency administrations after elections.

RECENT “GILDING” TRENDS 

 Income disparities. Economic researchers Thomas Piketty and Emmanuel Saez have shown that the share of U.S. national income received by the top 10% bracket reached 50% in 2007, values last seen in 1927.

Role of money in elections. A recent CNN report showed that the average cost of a Congressional campaign increased from $360,000 in 1986 to $1.6 million in 2013 for a seat in the House of Representatives. The Supreme Court’s “Citizens United” decision in 2010 lifted restrictions on independent political expenditures by corporations, associations, and labor unions.

Interest groups dominate policymaking. Over the past 40 years, decisionmaking by Congress and official agencies in the U.S. has been increasingly influenced by partisan politics and diverse interest groups ranging from drug companies and gun lobbies to environmentalists and trial lawyers. Special-interest policies are promoted by lobbyists whose aggregate payments were recently estimated at $3.2 billion per year, by litigation, by mass mobilization for campaigns, and by cultivation of influential officials. The system has led to flawed decisionmaking and conflict, including Congressional gridlock. Courts and judges have increasingly come to decide issues outside their intended roles, in areas where they have no professional expertise.

Federal and state governments increasingly ignore the spirit or letter of laws. For decades the IRS has required payment of taxes on gambling winnings by individuals in states where gambling was illegal. To maximize compliance it pursued a de facto policy of not disclosing information on these payments to states. States, in turn, received taxes from illegal immigrants, turning a blind eye to their status and avoiding disclosure of information to the Immigration and Naturalization Service.  Twelve states have passed laws legitimizing marijuana that directly violate federal law. 

Presidents get around laws. Besides the Watergate scandal that led to Richard Nixon’s resignation in 1974, Democratic and Republican presidents have increasingly pursued policies in conflict with the spirit or letter of the law. In a retrospective essay ardent environmentalist Jimmy Carter reported that as President he asked his Secretary of the Interior, Cecil Andrus, to find ways to sequester Alaskan land. He then used the obscure American Antiquities Act of 1906, originally designed for parcels like the Statue of Liberty, to protect 60 million acres of federal land in Alaska from economic use through designations as “National Monuments”. The Act specifies that parcels “in all cases shall be confined to the smallest area compatible with proper care and management of the objects to be protected”.  Carter acknowledged that Ronald Reagan was furious about his action, regarding it as a “land grab”.

After Carter’s defeat in the 1980 election, Reagan appointees sought to roll back the tide of environmental regulations and sequestration of federal land by slashing enforcement budgets and curtailing (mandated) enforcement of 1970s environmental laws by the Environmental Protection Agency. When opposition at hearings got in the way of implementing expansive new leasing policies, Secretary of the Interior James Watt simply stopped holding hearings.

President George W. Bush set new records for “signing statements”, i.e. signing Congressional laws with reservations signaling that he did not intend to abide fully by the laws’ provisions. The White House and other federal agencies tried to influence or inhibit science and regulatory agency reports, actions that were formally censured by the Comptroller General of the United States and Interior’s Inspector General.

With support from his Attorney General, President Obama declared that he regarded the Defense of Marriage Act (DOMA) unconstitutional, and therefore would not enforce it. Regardless of the merits of this view, no such discretion is given to the President by the Constitution. Arbitrary interpretations or circumvention of drug and immigration laws have followed.

Financial scandals. In the early 1990s bad judgment and fraud closed 747 of the nation’s 3,200 savings and loan banks. In 2003 a multiagency settlement implicated ten of the nation’s largest investment firms in wrongdoing. Prolonged, unprecedented lapses in financial and ethical judgment on the part of the nation’s private and public economic and financial leaders led to the financial crash of 2008 and the worst recession since the Great Depression of the 1930s. The semi-public lending institutions Fannie Mae and Freddie Mac were involved in the fiscal meltdown. Notwithstanding tightened controls since 2008, a record fine of $2.6 billion for deceptive practices was recently levied against the iconic J.P. Morgan Chase bank.

Breakdown of moral and ethical standards. The above actions are signs of erosion of a sense of community that remained strong in the U.S. for a time after World War II. Over the past 30 years, behaviors of previously unthinkable kinds, like the wanton killing of innocent students and schoolchildren, have increased. Government employees without ties to foreign governments have taken it upon themselves to release vast amounts of classified and highly sensitive documents in response to perceived governmental abuses.

HISTORY’S INSIGHTS FOR FUTURE REFORM 

The U.S. is showing increasing disillusionment with political institutions. Experience from the earlier reform period suggests that meaningful restructuring of government must be comprehensive. Government is now vastly larger than in the past, and serious reform may seem unlikely. However, we can gain insights from earlier history on pathways to reform, if and when it comes.

How earlier change came about. Reform after 1877 took place through committed political leaders and influential citizens. Reform measures often met resistance, including from the public, which liked aspects of the patronage system. As reformer Carl Schurz observed, the public often created barriers to meaningful change. Serendipitous events were often keys to creating changes in opinion favorable to action. For example, the assassination of President Garfield by a disgruntled office seeker ultimately aroused the public and Congress to pass the monumental Civil Service (Pendleton) Act of 1883. Reform leaders generally prepared reform measures with balance so that once enacted they would be effective and gain support from the public and politicians.

Predictions about the future of reform. We should not expect governmental reforms from a Congress that is unable to reform its own operations. Academic researchers on government and policy have become dispersed in fragmented disciplines that study real-world politics from a safe distance and whose publications are not used by decision makers. Nor should we expect reform from popular movements. These, like Occupy Wall Street and the Tea Party, can register disapproval or demand specific actions, but are unlikely to have the in-depth knowledge and balance to produce effective policies. Moreover, Gallup polls in 2013 showed that while voters gave Congress as an institution all-time low approval ratings (approaching 10%), 60% of voters liked their own Congressman. This kind of relationship was already described in the 1830s by the famous French observer of U.S. society, Alexis De Tocqueville. He noted in his book, Democracy in America, that the surprising lack of vision in U.S. politicians could be explained by the fact that voters were often poorly informed and preferred politicians who served their immediate purposes and told them what they wanted to hear. If history is a guide, urgency about improving governmental operations must reach a point where genuine reformers can gain influence and be preferred over the charisma or ideological appeal of alternative candidates for high office. We may need deeper crises in order to reach that point.

Categories: American history, Policy and Politics

A 19th Century French Observer Sheds Light On America’s Unstable Politics

Alexis De Tocqueville arrived in the United States in 1831. The French nobleman, superbly educated and with Enlightenment ideals, came with a companion to study democracy in the young nation. The last major representative system, the Roman Republic, had come to an end in 27 BC, when Augustus made himself the first emperor of a Roman Empire that would continue for nearly 500 years. After ratification of the U.S. Constitution in 1788, the United States became the world’s first representative political system in more than 1800 years. In the early 19th Century, European governments continued to be ruled by monarchs or other systems lacking full popular representation. Power-sharing bodies like parliaments, such as in Great Britain, existed, but membership was mainly limited to the nobility or upper classes, and hereditary monarchs retained much power.

The turbulent French experience with democracy. Two years after the adoption of the U.S. Constitution, the French had their own popular revolution in 1789. But its bloody and chaotic outcome – including the execution of one of France’s greatest scientists, Antoine Lavoisier, on grounds that he was an aristocrat (the judge who condemned him famously observed that “France had no need of savants”) – ultimately led to restoration of the Bourbon monarchy from 1814 to 1830. The first restoration king, Louis XVIII, granted a written constitution and a bicameral legislative body. However, his successor, Charles X (1824-1830), returned to a more authoritarian governing style.

At the time of De Tocqueville’s departure from France on April 2, 1831, King Charles X had been forced by the July Revolution to abdicate in favor of a constitutional monarchy under Louis Philippe. De Tocqueville and his companion, Gustave De Beaumont, wanted an official government mission and financial support for their travel. Since democracy was too sensitive a subject to justify their request, they initially covered up their real goal and received approval to study the American penal system.

De Tocqueville finds national character traits different from those in the nations that supplied immigrants. The risk of reporting on democratic systems in America receded after Louis Philippe took the throne. De Tocqueville described his observations in a famous two-volume book, Democracy in America (1). Published simultaneously in French and English in 1835 and 1840, it created a sensation. De Tocqueville’s work has been described as the most perceptive book ever written about the United States. A highlight of his observations was that descendants of European immigrants in the United States looked very much like people in France and Europe. But they showed characteristics not found in any nation from which Americans had emigrated.

“American Exceptionalism – a double-edged sword”. The special features of American society, now often described as “American exceptionalism” (2), remain widely noted today. But few Americans who have heard of De Tocqueville realize that he described more than the positive qualities that continue to be associated with the United States, like strong commitment to democracy, entrepreneurial drive, willingness to take bold risks, and openness to communication and to forming new associations for mutual benefit.

De Tocqueville also described less flattering national tendencies that he apologized for detailing but felt needed attention. These include characteristics like shallowness, opinionatedness, and a focus on material acquisitions and self-interest. De Tocqueville suggested that these qualities helped explain the attraction of American voters to politicians who served their immediate concerns and told them what they wanted to hear. He described his surprise at the lack of vision in political and other societal leaders. The connection between self-interest and the lack of leaders with vision becomes more understandable when we consider that statesmen and other leaders with vision raise issues that may be valid but unwelcome and that people would prefer to avoid.

Polling data offer striking resemblances between the national characteristics that De Tocqueville reported more than 170 years ago and those we see today. Recent Gallup polls (3) report that public approval ratings for the performance of the U.S. Congress have declined to the lowest levels (approaching 10%) since polls on such questions have been conducted. At the same time, a representative sampling of citizens showed approval of their own Congressional representative at 62%.

We have met the enemy and he is us (Pogo). When the full implications of the above connections sank in after I recently read De Tocqueville in the original, rather than in the summaries and interpretations I had seen or heard earlier, it seemed tough medicine. If traits like the above were hard-wired in the population, improving the performance of American national politics would face steep odds. If voters don’t really want politicians who know and tell the truth and work toward realistic solutions, and if they prefer instead those who serve their immediate interests and offer charisma and “magical” policies to deal with larger national needs, then it’s not politicians who are the fundamental problem. We could throw out all the leaders who serve today – but we would elect another group just as bad in their place.

Realistic reform potentials. De Tocqueville does offer windows of hope in his observations of politics in America. He noted that at times of crisis – notably during the Revolutionary War – Americans became more willing to accept statesmanlike leaders. And Americans are flexible: once they accept the reality of major problems, they are quicker than other societies to correct them.

Movements like Occupy Wall Street and the Tea Party show deep dissatisfaction with the status quo. But projects like former Comptroller General of the U.S. David Walker’s Comeback America initiative, TED Talks based on a desire for informed action, and other reform movements still seem in early stages of development. We may need a bigger crisis before Americans become willing to take the need for true reform seriously.

References 

1. De Tocqueville, Alexis. Democracy in America. Four-volume multilingual edition, edited by Eduardo Nolla and translated from the French by James T. Schleifer. Indianapolis: Liberty Fund. This edition adds previously unpublished drafts, notes, and correspondence by De Tocqueville (who interviewed President Andrew Jackson as well as many other leaders, and also spent extended time with an Indian tribe).

2. Lipset, S. M. (1996). American Exceptionalism: A Double-Edged Sword. New York: Norton.

3. Mendes, Elizabeth (2013). Americans Down on Congress; OK With Their Own Representative. Gallup Inc. http://www.gallup.com/poll/162362/americans-down-congress-own-representative.aspx