Alexander Hamilton and his colleagues wrote 85 separate essays to make their case that Americans should take a risk and ratify the 1787 Constitution. Three sentences into the very first of those Federalist Papers, Hamilton made clear that he knew full well the stakes of the gamble he and his co-authors were proposing.
“Whether societies of men are really capable or not of establishing good government from reflection and choice” was, Hamilton wrote, entirely uncertain. People are prone to tribalism and irrationality, easy targets for leaders who traffic in demagoguery and corruption. Hamilton doubted that it was even “seriously to be expected” that “We the People” would be able to set aside existing passions and prejudices long enough to debate the ratification of the Constitution itself solely on its merits. How on earth could the Constitution’s Framers then have convinced themselves to expect the people to manage running an effective national government?
The Framers’ faith in that regard has been sorely tested by the American government’s catastrophic response to the COVID-19 pandemic. For as much blame as the president himself deserves for the country’s current dire condition—and his malign incompetence is breathtaking—the federal government’s failure here is not the president’s alone. Expert agencies such as the Centers for Disease Control and Prevention and the Food and Drug Administration have, from the start, stumbled and delayed, struggling to collect data, secure accurate tests, and convey a consistent message. Congress, too, has proven unable to compel swift action to address shortages or coordinate the distribution of essential supplies, and equally unable to constrain apparently rampant corruption in the allocation of the funds it has authorized. Even the courts have at times balked at accepting pandemic basics, such as the notion that states might have reasonable grounds for permitting large gatherings outdoors but not indoors. As the Atlantic journalist Ed Yong recently set out in painful detail, the response has “careened between inaction and ineptitude,” leaving the people in “illness and financial ruin.” In the face of such systemic failure, is it time to rethink the system itself?
To answer that question, it is worth recalling what that “system” is, and how it came to this point. The Framers had an idea about how to channel tribalism and irrationality into good governance; they wove throughout the Constitution their Enlightenment-era beliefs that knowledge and reason were good, that arbitrariness and corruption were bad, and that it was possible to design governing institutions and legal rules that raised the odds that the former would more often triumph. Those same beliefs have animated sweeping reform efforts that have added further reinforcements to the Framers’ design many times in the generations since.
Until the end of the 20th century. It was roughly then, following the last major era of good-government reforms in the 1970s, that America took a decidedly anti-Enlightenment turn in the way government does business. The array of institutional changes that produced this turn has more than one source—conservatives intent on downsizing government and remaking the courts, progressives more focused on governing policy than structure and particularly disinclined to cede presidential power of their own after years locked out of the White House. But the effect across all three branches is singular: It is harder today for the government to access and analyze independent, unbiased information, and easier to avoid processes that might compel that information’s consideration. The United States’ disastrous response to COVID-19 is at least in part the result of this shift, one the Trump administration has taken to uncharted extremes.
Should Trump be defeated in November, the next administration faces an already-daunting transition, fraught with political hostility and hobbled by the need to reconstitute a badly depleted and demoralized civil service. But unless the next administration also takes on the task of restoring the government’s capacity to harness knowledge and analyze it, even the world’s most devoted public servants will continue to struggle.
If the American Revolution was about gaining freedom from a system of government based on hereditary rule and social rank, the 1787 Constitution was about crafting a replacement system based on the Enlightenment values of knowledge and reason-driven process. The Constitution is shot through with rules of governance aimed at advancing these commitments, including the Senate’s role in advising and consenting to presidential appointments (the better to ensure the president appointed aides of merit) and the vigorous embrace of trial by evidence and jury (the better to mitigate impulses to passion and revenge).
Take, for example, the Framers’ efforts to ensure that the government would be populated with at least passably well-informed public servants. Why should members of the federal House of Representatives serve two-year terms, rather than the one-year term common to several state legislatures? As James Madison wrote, “no man can be a competent legislator who does not add … a certain degree of knowledge of the subjects on which he is to legislate.” The longer term was necessary to guarantee that members had time to accumulate the broader range of knowledge and practical experience necessary to the more complex task of federal legislation. Why not make judges run for election, like other officials in the federal government? Tenure and salary protections for federal judges would, among other things, help the judicial office attract those who most excelled in the “long and laborious study” necessary “to acquire a competent knowledge” of the rules and precedents to which they would be bound.
As clever as such mechanisms were, they would hardly be enough to secure government against every vice that undermines reason, and reform legislation aimed at better protecting expertise and competence in government quickly became a regular feature of American political life. When presidents turned to using acting officials to fill vacant federal positions, thereby bypassing the theoretically competence-promoting process of Senate confirmation, Congress passed a law (the first version in 1795) limiting the amount of time acting officials could serve before a permanent replacement had to be named. When the spoils system—through which 19th-century officeholders could reward campaign supporters with government jobs—produced a federal workforce of cronies rather than experts, Congress enacted sweeping changes beginning in 1883 to help ensure a civil service based not on political patronage but on merit.
Such protections became even more crucial in the mid-20th century, as Congress delegated increasing amounts of regulatory authority to expert executive-branch agencies. For example, concerns that administrators might act arbitrarily—based on self-interest or caprice rather than reason—led to the 1946 passage of the Administrative Procedure Act, prohibiting agencies from adopting rules without a sound basis supported by documented evidence. (This is the law that the Supreme Court found the Trump administration had twice failed to comply with—when it attempted to add citizenship questions to the census and when it rescinded Obama-era protections for undocumented immigrants brought to the United States as children.) And when the spectacular government scandals of the 1960s and early ’70s brought a new generation of reformers to Congress, these and other good-government laws were strengthened further, including by reinforcing merit-based civil-service protections, creating internal-agency watchdogs called “inspectors general,” and establishing channels through which federal employees could blow the whistle on officials who ran afoul of legal restrictions.
Similar reforms accumulated to promote at least the capacity for enlightened stewardship in Congress and the courts. As Congress grew from its initial form as a part-time enterprise governing a small nation, it established and expanded its own professional staff, setting and increasing salaries to levels comparable to those earned by executive-branch personnel, hoping to attract the kind of knowledgeable support representatives need to navigate a complex range of policy demands. By the early 20th century, Congress would establish its own reference and research agencies, setting up a process by which all legislators could access independent, nonpartisan information and analysis. This capacity expanded further in the 1970s with the addition of the Office of Technology Assessment, charged with supplementing legislators’ understanding of scientific matters.
And although the judicial branch had benefited from the existence of a legal profession that had been accustomed to operating with a degree of respect for rules and reason since long before the nation’s founding, the Progressive era especially saw dramatic innovations in training and structures that transformed the way the courts did business too. From the mid-19th to the mid-20th century, legal training shifted from a system of private apprenticeships to university-based institutions of higher education devoted to teaching not only rules but rigorous (even if not quite “scientific”) methods of analysis. The late 19th century saw the creation of the American Bar Association to further promote professional standards, and the turn of the 20th century brought the first national code of legal ethics, which required lawyers to temper their loyalty to the client with a professional duty of candor—a duty to provide complete and accurate information—to the court.
Although nearly all of these and many other such reforms are still on the books in some way, this generations-long effort to forge enlightened government hardly proceeded uninterrupted, without regression. In a country with perpetual strains of populism and anti-intellectualism, setbacks have been plentiful. But not until after the post-Watergate reforms of the 1970s did the country embark upon what has turned out to be a sustained retreat not just from the practice of good government, but from the capacity to restore it.
Start with the executive branch. Since Richard Nixon, presidents have been drawing control over major policy problems away from expert Cabinet agencies and consolidating decision-making power in the White House. In part, the shift has been driven by a range of executive-oriented concerns, including a sensible interest in better coordinating policy across agencies and a more politically driven desire to own the appearance of getting things done. But Congress has also aided the trend, enacting a series of laws that have delegated new, nearly uncheckable responsibility for matters such as military force, economic and trade policy, immigration, and legal advice—powers vested not in administrative agencies, but in the president himself.
So long as the White House was itself filled with expert staff interested in embracing reason-based process, enlightened decision-making stood a chance. But further structural changes made even that less likely. For as presidents and Congress were centralizing power in the White House, they were also shrinking the proportion of civil-service positions (subject to various merit requirements) inside the White House and out, replacing career staff with a growing cadre of political appointees (subject to no such regulation). Likewise, although powers Congress delegated to executive-branch agencies remained subject to APA rules requiring that policy actions be supported by a reasoned basis and documented evidence, policy making in the White House, it became clear, required no such compliance. “The president,” the Supreme Court noted in a 1992 decision declaring APA rules inapplicable, “is not an agency.” The FDA, for example, could be bound by all the merit and process requirements in the world, but such controls might matter far less if all the real action is in the White House.
In the meantime, Congress itself has fared even worse when it comes to preserving tools essential to enlightened governance. Where congressional staff support once grew steadily for nearly a century, in keeping with growth in the size of the country to be governed, Congress today employs fewer staff aides than it did in the early 1990s. Research-support agencies—once Congress’s primary independent source of policy expertise—have been particularly hard hit; the Office of Technology Assessment was, just after the dawn of the World Wide Web, zeroed out entirely. Even the most well-meaning legislators today have minimal time for sitting in hearings, engaging in oversight, or learning on the job; the Gingrich revolution of the 1990s saw Congress shorten its official workweek to three days, the better to afford members time to raise money for campaign spending. And although Congress is still capable of holding hearings to ask questions of executive-branch experts, presidents have come to rely on sweeping interpretations of their own constitutional power—produced by authoritative executive-branch legal offices with little or no competition from the courts or Congress—to resist sharing all they know.
And then there are the federal courts. The sharp politicization of the judicial confirmation process took off in the 1980s, producing the general vitriol and the Republicans’ now-categorical obstruction of Democratic appointments that characterize federal judicial confirmation hearings today. The same period witnessed the decline of formally nonpartisan professional organizations such as the ABA. Where about half of American lawyers were members of this common professional association 40 years ago, the fraction today is more like 20 percent. During that period, the conservative Federalist Society and its rather smaller progressive counterpart, the American Constitution Society, have become influential players in championing alternative channels of legal professionalization, up to and including the selection of judges to the federal bench.
Which brings us to the arrival of COVID-19, and the government-wide failure to muster the kind of response now apparent in the rest of the developed world. Start here too with the executive. Embracing the by-now-established norm, the Trump administration formed a White House–based task force to lead the federal response—a misbegotten body that has been, during its brief existence, a singular model of government dysfunction. Following the president’s early suggestion that injecting bleach might be a useful approach to virus treatment, and his assignment of a son-in-law without relevant knowledge or experience to manage crucial gaps in the national supply chain, the task force today finds even its more sensible warnings (such as urging people to wear masks) regularly ignored. As public-health experts rushed to point out, the competent management of such tasks was precisely what the knowledgeable, experienced civil servants in executive agencies such as the CDC, the National Institutes of Health, the FDA, and the Federal Emergency Management Agency were for.
Yet the Trump team had already been extraordinarily effective in crippling the civil service, circumventing or dismantling structures put in place to promote knowledge-based, reason-driven governance from the Constitution forward. To avoid the constitutional requirement that the Senate actually advise and consent on key positions in the executive branch, for instance, the administration has simply left positions vacant or filled them with “acting” officials who have not been subject to the confirmation process. As COVID-19 took hold earlier this year, just a third of the top roles at the Department of Homeland Security (the Cabinet agency supervising FEMA) had been filled with permanent officials. At the beginning of 2020, of 714 key administration positions tracked by a running Washington Post/Partnership for Public Service study, 170 were still awaiting a nominee for Senate confirmation. At the same time, the Merit Systems Protection Board—one of the agencies created as part of the post-Watergate reforms to help ensure executive-branch compliance with merit-based principles in the civil service—has been without a quorum since 2017, effectively gutting its ability to research and report on the administration’s adherence to those principles. And in one remarkable six-week period beginning in early April this year, the president fired five agency inspectors general, including the internal watchdogs overseeing the Department of Defense and the Department of Health and Human Services, the Cabinet agency housing the CDC, the NIH, and the FDA.
Meanwhile, the government has equally failed to harness America’s tremendous technical and private production capacity to meet emergency demands for items such as personal protective equipment and testing supplies. Among the many laws Congress passed decades ago delegating authority to the executive, the Defense Production Act was designed to ensure that the president had all the power he needed to respond to just the kind of crisis COVID-19 presents. Yet, however inexplicable the president’s failure to use those powers adequately in response to COVID-19 might be, it is even harder to understand why Congress hasn’t by now ordered sufficient production or compelled coordination itself. Partisan support for the president is only a marginally satisfying answer—even Senate Republicans have voted to authorize trillions in COVID-19-related spending, including a number of initiatives passed despite presidential opposition.
Another key part of the answer: After decades of letting the executive branch do all of the policy homework, the legislature has lost the institutional memory, and given away the core practical capacity, to think for itself. Having long since slashed committee staffing and research-support arms, Congress had scant institutional resources of its own to draw on when it wanted an independent assessment of COVID-19 risks and best practices, or needed supplies in the face of executive failure. Indeed, among the last hearings the House Rules Committee was wise enough to hold before COVID-19 put much of the national agenda on pause was an extraordinary bipartisan gathering to ask outside experts to help educate members on the nature of Congress’s own authority—including what powers Congress might have to push back should the executive resist oversight of how it spends the money Congress has granted. Congress is today a legislature that not only often lacks a baseline “degree of knowledge of the subjects on which” to legislate, but that also lacks access to enough knowledge or experience to do its job at all.
As for the courts, they have been called from the earliest days of the national COVID-19 response to adjudicate constitutional disputes over municipal shutdowns, restrictions on religious and other group gatherings, and limitations on travel. Due-process requirements, legal norms, and professional ethics continue to afford the judiciary Enlightenment-oriented advantages the political branches lack. But there are deeply concerning signs that the politicization of not only the confirmation process but the legal profession is taking a toll even here. One might thus now find among the published opinions of the U.S. federal courts an April 2020 judgment by District Court Judge Justin Walker, a former intern for Senator Mitch McConnell who was confirmed to a lifetime appointment on the federal bench last year at age 37, holding that the city of Louisville, Kentucky, would be enjoined from prohibiting drive-in church services during the pandemic on the grounds that the measure violated the constitutional right to the free exercise of religion.
The opinion is remarkable for any number of reasons, not least that the 86-footnote judgment, prepared in little more than a day, begins with a five-page, error-laden account of the world history of religious intolerance since Saint Paul, and concludes that the effect of the Louisville mayor’s action was to have “criminalized the communal celebration of Easter.” But whatever one’s views about the burden that virtual services impose on the free exercise of religion during a pandemic, the opinion is far more remarkable for its utter falsity. There was, in fact, no Louisville ban on drive-in church services in effect at any time. And although the mayor’s office had hoped to advise the court of this reality before judgment was issued, the court denied the city’s entirely standard request for a prejudgment hearing. The lawsuit has since, unsurprisingly, concluded, but Judge Walker’s tenure has not. In June, he was confirmed to a new lifetime appointment on the U.S. Court of Appeals for the D.C. Circuit.
How even to begin a course correction? A first necessary step is to recognize that America’s radical movement away from Enlightenment-based governance preceded Trump, and cannot be fixed solely by his removal. The constitutional decision to divide government power among three federal branches was made not only to protect Americans from tyrannical concentrations of power, but also to give the government a chance at correcting its own mistakes. Just as modern engineers recognized in fitting commercial jets with redundant engines, the Constitution’s Framers understood that it might be necessary to sacrifice some efficiency in exchange for a chance at staying aloft when disaster strikes. Avoiding more federal government-wide failures such as the COVID-19 response requires restoring a commitment to Enlightenment-based governance in all of the branches.
A second set of steps requires a package of legislative reforms. For example, when Congress delegates decision-making powers to the president, it should condition the exercise of that power on complying with process requirements such as meaningful fact-finding by relevant experts. When Congress revisits restrictions on the president’s ability to use acting officials to circumvent Senate confirmation requirements, it should include clear enforcement measures such as automatic funding cutoffs for failure to comply. Congress must also dramatically increase its own capacity, restoring staffing levels and salaries to those necessary to attract sufficient, high-quality staff, and expanding its ability to generate policy and legal research and analysis without having to rely solely on executive-branch experts who may or may not be willing to share everything they know. And it is past time to reform the judicial confirmation process. Judgeships, like elected offices in Congress and the presidency, should come with minimum age requirements and lengthy, but maximum, terms. These and other such proposed reforms aiming to denuclearize Senate confirmation proceedings have been circulating for years. Where an extraordinary set of national crises has opened the door, not seizing the moment for extraordinary reforms would be a tragic mistake.
A final step works across the whole of constitutional democracy, as Americans’ ignorance of the fundamental structure of government has become visible in recent surveys revealing, for example, that nearly 75 percent of respondents could not name all three branches of the federal government. As long as ours is a representative government, this staggering degree of basic incapacity will be represented among our elected officials and their staff. Part of the correction here will require improved civic education in elementary and secondary schools; one in five states, for example, currently has no civics requirement for graduation at all. But another part can be implemented more quickly, as soon as a new administration takes office. Just as congressional and government ethics offices have traditionally trained all new federal employees and even transition teams in the rules of ethical compliance (financial-disclosure requirements and more), it is easy to imagine requiring new hires to receive a refresher in constitutional civics. Among the essential topics: the duties of Congress, the president, and the courts, and the purposes of separating powers in the first place. Core among those purposes, after all, was promoting the Hamiltonian value of good government from both reflection and choice.