See also the RSS News feed of working papers as they are released.

Bound print copies of George Mason School of Law’s working paper series on law and economics are available in the Law Library. The bound set often includes initial drafts of papers. Search Mason’s Catalog to locate a working paper.

Recent Working Papers:

Overturning a Catch-22 in the Knick of Time: Knick v. Township of Scott and the Doctrine of Precedent

By Ilya Somin, Shelley Saxer


The Supreme Court’s decision in Knick v. Township of Scott was an important milestone in takings jurisprudence. But for many observers, it was even more significant because of its potential implications for the doctrine of stare decisis. Knick overruled a key part of a 34-year-old decision, Williamson County Regional Planning Commission v. Hamilton Bank, that had barred most takings cases from getting a hearing in federal court.

Some fear that the Knick decision signals the start of a campaign by the conservative majority on the Court that will lead to the ill-advised overruling of other precedents. In this article, we explain why such fears are misguided, because Knick’s overruling of Williamson County was amply justified under the Supreme Court’s established rules for overruling precedent, and also under leading alternative theories of stare decisis, both originalist and living constitutionalist.

Part I of this Article briefly summarizes the reasons why Williamson County was wrongly decided, and why the Knick Court was justified in overruling it on the merits — at least aside from the doctrine of stare decisis. The purpose of this Article is not to defend Knick’s rejection of Williamson County against those who believe the latter was correctly decided. For present purposes, we assume that Williamson County was indeed wrong, and consider whether the Knick Court should have nonetheless refused to overrule it because of the doctrine of stare decisis. But the reasons why Williamson County was wrong are relevant to assessing the Knick Court’s decision to reverse it rather than keeping it in place out of deference to precedent.

Part II shows that Knick’s overruling of Williamson County was amply justified based on the Supreme Court’s existing criteria for overruling constitutional decisions, which may be called its “precedent on overruling precedent.” It also addresses Justice Elena Kagan’s claim, in her Knick dissent, that the majority’s conclusion requires reversing numerous cases that long predate Knick. Part III explains why the overruling of Williamson County was justified based on leading current originalist theories of precedent advanced by prominent legal scholars, and by Supreme Court Justice Clarence Thomas in his recent concurring opinion in Gamble v. United States. In Part IV, we assess the overruling of Williamson County from the standpoint of prominent modern “living constitutionalist” and pragmatic theories of precedent. Here too, it turns out that overruling was well-justified.

Other recent decisions reversing established precedent may be more troubling. But Knick was amply justified.

The Data Gap: Promoting Analysis of Exposure-Related Harms

By Caroline Cecot


In the toxic tort context, both litigation and regulation require reliable scientific data to establish a causal connection between exposure to some substance and alleged harm before allowing recovery or mandating mitigation. On the one hand, it is important for litigation and regulation to be based on causal evidence of actual harms. Otherwise, these interventions could make society worse off by unduly limiting the availability of useful substances and diverting resources away from addressing true risks. On the other hand, for this system to comprehensively address all important environmental externalities, there must exist sufficient incentives to generate the data required for effective risk-management through litigation and regulation.

This Article argues that, in many cases, the incentives are insufficient. When it comes to latent harms, in particular, scientific research evaluating causal links is challenging and expensive. Independent researchers, who require funding for their work, are unlikely to systematically analyze the effects of new substances. As a result, thousands of substances now in use remain unstudied.

Given the increasing importance of reliable scientific data for efficient risk management, it is time to evaluate all options for incentivizing its production in order to promote optimal deterrence in the toxic tort context. This Article proposes several ways to combat the persistent data gap, including changes to tort common law and regulation. Most controversially, it proposes a new tort cause of action for informational monitoring and analysis in some circumstances when there exist no reliable studies on the potential harm of a particular substance. A successful claim would lead to the establishment of a scientific panel, paid for by the defendant, to analyze and monitor the link between exposure to the substance and subsequent health outcomes.

Swedish Competition Authority's Proposed Market Study of Digital Platforms, Comment of the Global Antitrust Institute, Antonin Scalia Law School, George Mason University

By Tad Lipsky, Joshua Wright, Douglas Ginsburg, John Yun


This comment is submitted by the Global Antitrust Institute (GAI) at the Antonin Scalia Law School, George Mason University to the Swedish Competition Authority regarding its proposed market study of digital platforms. The GAI Competition Advocacy Program provides a wide range of recommendations to facilitate adoption of economically sound competition policies.

Evolution Equations of Discount Functions and Metrics of Dynamic Inconsistency

By Terrence Chorvat


This article considers continuous models of time discounting that evolve dynamically. While constant exponential discounting is the paradigmatic model for time discounting, many models that depart from exponential discounting have been proposed to more closely match the behavior of individuals, firms, and markets. This article argues that it is the dynamic inconsistency of behavioral models that gives them their most salient features. The article then develops evolution equations for some of the most prominent continuous discounting models to more clearly consider their dynamic inconsistency. It then proposes metrics for the degree of dynamic inconsistency exhibited by discounting models, allowing comparison of dynamic inconsistency both across and within models.
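For readers unfamiliar with the distinction, the contrast between time-consistent and time-inconsistent discounting can be sketched with two standard textbook forms. These are illustrative conventions from the discounting literature, not necessarily the notation or the specific models used in the article:

```latex
% Exponential discounting: the weight placed on a payoff arriving
% \tau periods in the future, with constant discount rate \rho > 0.
D_{\mathrm{exp}}(\tau) = e^{-\rho \tau}

% A simple hyperbolic alternative, with parameter k > 0:
D_{\mathrm{hyp}}(\tau) = \frac{1}{1 + k\tau}

% The instantaneous rate of time preference is -D'(\tau)/D(\tau).
% For the exponential form it is constant in the delay:
-\frac{D_{\mathrm{exp}}'(\tau)}{D_{\mathrm{exp}}(\tau)} = \rho

% For the hyperbolic form it declines as the delay grows:
-\frac{D_{\mathrm{hyp}}'(\tau)}{D_{\mathrm{hyp}}(\tau)} = \frac{k}{1 + k\tau}
```

Because the hyperbolic rate falls with the delay, preferences between two fixed future dates can reverse as those dates draw nearer; this reversal is the dynamic inconsistency that metrics of the kind the article proposes are designed to quantify. Exponential discounting, with its constant rate, is the classic time-consistent benchmark.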

Testimony on the STRONGER Patents Act before the Senate Judiciary Committee, Intellectual Property Subcommittee

By Adam Mossoff


This invited testimony was presented at a hearing on the STRONGER Patents Act before the Senate Judiciary Committee, Intellectual Property Subcommittee, on September 11, 2019. It explains how the STRONGER Patents Act addresses two sources of uncertainty, instability and weakness in the U.S. patent system today. First, the bill would permanently end the willy-nilly operations of the Patent Trial & Appeal Board (PTAB), an administrative tribunal created by Congress in 2011 to cancel issued patents. The PTAB engages in numerous procedural “shenanigans” that have produced extremely high cancellation rates, earning it the moniker of a “death squad” for patents. The STRONGER Patents Act imposes structural reforms on the PTAB by hardwiring into it necessary limitations on arbitrary action. Second, the bill abrogates the Supreme Court’s 2006 decision in eBay v. MercExchange, which created a new test for issuing injunctions for the ongoing infringement of a valid patent. By eliminating the classic legal test of a presumptive injunction for an ongoing violation of a valid property right, eBay has led to a significant drop in courts issuing injunctions for all patent owners. Both the PTAB and eBay have created a cloud over the titles of patents, incentivizing “efficient infringement” by large companies and hampering the economic function of patents in driving licensing and other commercial activities in the innovation economy. Thus, the STRONGER Patents Act represents much-needed reform. It reestablishes reliable and effective patent rights, stable legal institutions, and the rule of law in the patent system — essential features of all legal property rights in driving economic growth in innovation economies.

The Decline in U.S. Criminal Antitrust Cases: ACPERA and Leniency in an International Context

By Douglas Ginsburg, Cecilia (Yixi) Cheng


Criminal cartel cases in the U.S. are at modern lows, spurring questions as to whether the Antitrust Criminal Penalty Enhancement and Reform Act of 2004 (ACPERA) and the Antitrust Division’s criminal enforcement program continue to be effective and, if not, why not. In this Chapter, we offer three non-exclusive hypotheses for the recent decline:

(1) increasingly large fines in multiple jurisdictions have lessened the incentive to apply for leniency in any one jurisdiction;

(2) technology has caused the substitution of lawful tacit for unlawful express collusion; and

(3) ACPERA and the Division’s criminal program have succeeded in deterring cartel formation – at least among U.S. companies.

Our analysis of the Antitrust Case Filings database leads us to be tentatively optimistic about the third possibility: Over the last decade, the number and percentage of foreign as opposed to U.S. corporate defendants has increased dramatically.

Testimony on 'Competition in Digital Technology Markets: Examining Acquisitions of Nascent or Potential Competitors by Digital Platform' before the Senate Judiciary Committee, Antitrust Subcommittee

By John Yun


Is there a problem with large technology firms, or platforms, purchasing nascent competitors and suppressing competition before they can mature into vibrant competitors? Further, if there is a problem, are the current antitrust laws and the enforcement of those laws sufficient to combat the problem? If not, is there a legislative solution? These are all critical questions given that innovation and incentives to innovate are at the heart of all vibrant modern economies. This testimony explores these questions.

The Right to Carry Your Gun Outside: A Snapshot History

By Joyce Malcolm


The right of self-defense is the core of the Second Amendment right to keep and bear arms, as the US Supreme Court has affirmed in two landmark decisions. The right does not, and cannot, stop at the domestic doorstep. Nevertheless, there are those who argue that somehow the right “to bear arms” is confined to the home. This essay addresses this latest effort to deny the individual right to keep and bear arms that the Court has affirmed. It focuses on the right to carry a gun outside the home, mindful that the right to keep and bear arms, like other rights, included some practical restrictions. In reviewing the history, the crucial time for understanding the meaning of the Second Amendment is the point in the evolution of the Anglo-American right at which the amendment was drafted and added to the American Bill of Rights.

Regulating Speech with Bayesian Audiences

By Yonathan Arbel, Murat Mungan


Defamation law fines speakers who make certain false statements, because such statements mislead their audience. It is commonly thought that stricter defamation laws offer better protection against misleading statements. Here, we study the audience's equilibrium behavior and beliefs in the presence of defamation laws of varying strictness.

We find that both lax and strict defamation laws have undesirable consequences. Strict and lax regulation of information make speech largely unreliable by deterring truthful negative remarks and failing to deter frivolous statements, respectively. Under a large set of circumstances, the optimal regulation of communications should be moderate, to facilitate the effective communication of private information and to balance a trade-off between deterring defamation, chilling truthful criticisms, and litigation costs. The court's competency plays a key role in determining the optimal defamation regime, and when courts are sufficiently capable of sorting out frivolous defamation claims, the optimal defamation regime leads to a separating equilibrium where statements always accurately inform the audience.

Although our analysis focuses specifically on defamation law, it is illustrative of the dynamics present in many other contexts where the law regulates disclosure by interested private parties to a Bayesian audience. We discuss some of these contexts, including securities regulations, whistle-blowers, jury trials, and reports of criminal activity.

Generic Drugs, Used Textbooks, and the Limits of Liability for Product Improvements

By Timothy Muris, Jonathan Nuechterlein


A key issue in "product-hopping" cases is how to reconcile society's interest in increased price competition with the need for continued pharmaceutical innovation, particularly where a new product formulation presents genuine therapeutic benefits. Some courts have proposed to weigh the acknowledged therapeutic value of a new pharmaceutical product against the monetary effects of suppressed generic competition. But the task of "weighing" such radically incommensurable social values lies well beyond the competence of generalist tribunals. Michael Carrier and Steve Shadowen have proposed to side-step this problem through what they call a "no business sense" test. Although this approach would avoid a direct comparison of therapeutic benefits and monetary harms, it would present intractable implementation problems of its own, and it asks the wrong conceptual question in any event. In the final analysis, developing and marketing a new formulation should not subject a manufacturer to antitrust liability if the formulation presents genuine therapeutic benefits for patients.

We underscore these points by comparing the pharmaceutical marketplace to the economically similar marketplace for college textbooks. That marketplace, too, features a "price disconnect," where the professors who assign textbooks do not pay for them, and the students who pay for textbooks do not choose them. Yet no one seriously proposes to subject publishers and authors to antitrust liability for conduct strikingly similar to pharmaceutical product-hopping: introducing new editions more often than they otherwise would, allegedly in order to suppress competition from used booksellers. There is no principled reason for applying different rules to successful reformulations of existing pharmaceutical products.

The Proper Role of History and Tradition in Second Amendment Jurisprudence

By Nelson Lund


The Supreme Court’s decisions in District of Columbia v. Heller (2008) and McDonald v. City of Chicago (2010) resolved two foundational issues. First, the Second Amendment protects the inherent right of individuals to self-defense, not a right of states to maintain an organized militia. Second, the Amendment applies to state and local governments in the same way that it applies to the federal government. Both cases also held that a general ban on the possession of a handgun in one’s home is unconstitutional. In the ensuing decade, the lower courts have confronted many questions about the scope and application of the Second Amendment that were left unanswered by these decisions.

Shortly after the retirement of Justice Anthony Kennedy, who was probably the median voter in the 5-4 decisions in Heller and McDonald, the Court granted certiorari in New York State Rifle & Pistol Association v. City of New York. This challenge to New York City’s uniquely severe restrictions on transporting firearms in public raises another foundational issue: whether the Second Amendment right to “bear Arms” is protected outside one’s own home. New York has attempted to render this case moot by changing the law to accommodate the plaintiffs’ very specific and modest demands. The plaintiffs maintain that the case is not moot, and the Court has not yet ruled on that issue.

Whether in this case or some other, Justice Brett Kavanaugh will have an opportunity to press an unusual jurisprudential approach that he developed in a dissenting opinion while he was on the D.C. Circuit. He contended that Heller requires courts to apply a history-and-tradition test to every issue that is not resolved by the constitutional text. No circuit court has adopted this position. Many, however, have employed a version of the means-end analysis that the Supreme Court routinely uses in analogous areas of constitutional law, and none has rejected the use of such analysis.

This Article will show that then-Judge Kavanaugh misinterpreted Heller, and it will explain why neither he nor other members of the Supreme Court should adopt the approach that he mistakenly imputed to Heller. Other circuit judges have developed a better framework, in which text, history, and tradition are relied on when, and only when, those sources provide reasonably clear guidance. In other cases, which in practice will be much more numerous, judges should engage in means-end analysis that is informed by what is known about the purpose of the Second Amendment from its text and history.

Knick v. Township of Scott: Ending a Catch-22 that Barred Takings Cases from Federal Court

By Ilya Somin


The Supreme Court’s decision in Knick v. Township of Scott put a long-overdue end to a badly misguided precedent that had barred most takings cases from federal court. The big issue at stake in Knick was whether the Court should overrule Williamson County Regional Planning Commission v. Hamilton Bank (1985). Under Williamson County, a property owner who contended that the government had taken his or her property and therefore owed “just compensation” under the Takings Clause of the Fifth Amendment could not file a case in federal court until he or she first secured a “final decision” from the relevant state agency and “exhausted” all possible remedies in state court. The validity of this second “exhaustion” requirement was at issue in Knick. Even after both Williamson County requirements were met, it was still usually impossible to bring a federal claim because procedural rules preclude federal courts from reviewing final decisions in cases that were initially brought in state court.

Part I of this article briefly describes the background of the Knick case and the Williamson County decision that the Court ended up reversing. In Part II, I explain why the Court was right to conclude that Williamson County created an indefensible double standard under which takings claims against state governments were effectively barred from federal court in situations where other types of constitutional claims would not be. Part III explains why overruling Williamson County is justified under the Supreme Court’s admittedly imprecise doctrine on overruling precedent. Justice Elena Kagan’s dissenting opinion is wrong to argue that overruling Williamson County also entails overruling numerous earlier precedents. Finally, Part IV assesses the potential real-world impact of the Knick decision. In many cases, it will make little difference whether a takings claim gets litigated in state court or federal court. In some situations, however, the right to bring a claim in federal court is a vital tool to avoid potential bias in state courts and procedural hoops that subject property owners to a prolonged ordeal before they have an opportunity to vindicate their rights. Claims that Knick will lead to a flood of new takings litigation are overblown. But to the extent that substantial new litigation does result, that is likely to be a feature, not a bug.

Privacy and Consumer Control

By Howard Beales, Timothy Muris


This essay, prepared for the Aspen Institute Congressional Program on the Internet, Big Data, and Algorithms, makes three points. First, personal information about commercial transactions does not belong solely to the consumer. Approaches to privacy regulation based on property, particularly notice and choice, do not give consumers meaningful control over their information or how it is used. Second, regulating information uses based on the consequences of information use and misuse is a more productive approach, providing important protections to consumers. Third, information about users is critical in determining the value of the targeted advertising on which the financing of internet content depends, allowing consumers to receive valuable information and services without direct payment.

The Promise and Peril of Epistocracy

By Ilya Somin


Jason Brennan's Against Democracy makes a strong case that democratic majorities' right to rule rests on shaky grounds so long as their ballot box decisions are heavily influenced by ignorance and bias. But his “epistocratic” alternative (empowering the better-informed segments of society) has significant flaws of its own. Ironically, the biggest shortcoming of epistocracy may be that we lack the knowledge necessary to make it work well.

Testimony on 'The State of Patent Eligibility in America' before the Senate Judiciary Committee, Intellectual Property Subcommittee

By Adam Mossoff


This invited testimony was prepared for the Senate Judiciary Committee, Intellectual Property Subcommittee, hearing on reforming § 101. It identifies several reasons that justify congressional action in reforming § 101 and abrogating the Alice-Mayo framework created by the Supreme Court in its patent eligibility cases between 2010 and 2014. First, it details the very high rates of invalidation of patents and rejection of patent applications under § 101 since 2014. Second, it explains why the USPTO’s recent reforms in its § 101 examination guidelines are insufficient to solve the problems of excessive cancelations of patents by courts and uncertainty for innovators. Third, it identifies a guidepost for congressional action today in the 1952 Patent Act. Congress has abrogated Supreme Court doctrines many times before, and its enactment of § 103 in the 1952 Patent Act is a model for reform of § 101 today. Section 103 is succinct, technology neutral, and simple in setting forth procedural and substantive limits for nonobviousness doctrine. It does this because it responded to the same problems in nonobviousness doctrine that innovators face today under patent eligibility doctrine: the Supreme Court created a very restrictive test (“the flash of creative genius”) that resulted in extensive invalidations of patents and uncertainty for innovators. In 1949, Justice Robert Jackson lamented in dissent that "the only patent that is valid is one which this Court has not been able to get its hands on." This could have been written today about patent eligibility cases. Congress should act again to rein in a judicial doctrine that is undermining the function of the patent system in driving the innovation economy.

Questioning Patent Alienability

By Tun-Jen Chiang


The standard economic rationale for the alienability of property rights is that it facilitates the flow of resources to those who can put them to the most valuable use, or the “highest utility user.” But patents do not come with a right to productively use some social resource—patent rights consist only of a right to stop others from using the claimed invention. The person who is most able to extract rents with a patent’s veto power is not necessarily the same as the person who will put an invention to its most socially valuable use. If one simply applied the conventional economic justification for the alienability of property rights to patents, then having patents flow to the highest rent extractor is not obviously desirable from a social viewpoint. Restricting transfers to predatory users would accordingly seem justified.

If the unrestricted alienability of patents is to be justified on economic grounds, it must be by reference to other reasons, such as an argument that allowing alienability increases the value of a patent and therefore increases ex ante incentives to invent. But such alternative justifications come with their own limits. Alienability is neither the only means to increase ex ante incentives to invent, nor a particularly effective one, given that inventors must share the surplus generated by alienability with the (more sophisticated) transferee. The case for unlimited alienability of patents is therefore an uneasy one.

Book Review, The US Supreme Court and the Centralization of Federal Authority, by Michael A. Dichio

By Ilya Somin


Does the U.S. Supreme Court protect the states from the expansion of federal authority? In this important new book, political scientist Michael Dichio argues that the answer is “no.” To the contrary, he contends that, throughout American history, “the Court … has persistently acted as an important instrument of the broader central state, expanding federal authority over society.” The theory that the Supreme Court expands federal power at the expense of the states is not a new idea, having been first raised by anti-Federalist critics of the Constitution over 200 years ago. But Dichio provides the most thorough and wide-ranging defense of it to date, drawing on an extensive database of notable Supreme Court decisions from 1789 through 1997. Among other things, he shows that the Court constrained the states in important ways even in historical periods that are often thought of as high points for “states’ rights,” such as the Jacksonian era and the late nineteenth century.

Dichio’s analysis is, in many ways, compelling, and is a major contribution to the literature on federalism and judicial review. But some of his methodological choices overstate the centralizing tendencies of the Supreme Court. He also unduly downplays some key ways in which the Court promotes decentralization of power. While the Supreme Court has never been a consistent ally of state governments seeking to limit federal authority, it is also not quite as consistent a centralizing force as Dichio suggests.

Antitrust After Big Data

By John Yun


With the rise of digital markets, the conventional wisdom was that big data was a new economic phenomenon that would allow incumbent firms with market power to entrench their market positions, foreclose competitors, and serve as a virtually insurmountable barrier to entry. This led to calls for greater antitrust enforcement and regulation of big data practices. Since that time, with the benefit of substantial growth in the theoretical and empirical economic literature involving big data, it is appropriate to revisit our understanding of big data’s implications for antitrust. This paper contributes to the discussion by detailing three things we have learned about big data as it applies to competition policy. First, we now have a better understanding of the role that big data plays in the production and innovation process. Second, it makes little sense to reflexively label big data as a barrier to entry. Competition policy is better served by considering actual entry conditions rather than basing competitive effects analysis on determining whether access to certain inputs is or is not a barrier to entry. Third, competition authorities now have a sizeable level of experience in assessing big data in actual cases. It is notable that, thus far, big data alone has not fueled a theory of harm that has led to an agency challenge in the U.S. or Europe. All these considerations suggest that we are perhaps in a new, more mature era regarding big data in competition policy — not because big data is any less important to innovation — but because researchers and regulators have consistently found that big data in and of itself does not represent a relevant antitrust concern.

The Political Economy of Enforcer Liability for Wrongful Police Stops

By Tim Friehe, Murat Mungan


This article questions whether excessive policing practices can persist in an environment where law enforcement policies are subject to political pressures. Specifically, it considers a setting where the police decide whether to conduct stops based on the suspiciousness of a person's behavior and the potential liability for conducting a wrongful stop. We establish that the liability level that results in a voting equilibrium is smaller than optimal, and, consequently, that excessive policing practices emerge in equilibrium.

The Destructive Legacy of McCulloch v. Maryland

By Nelson Lund


McCulloch v. Maryland is probably the Supreme Court’s single most influential opinion, and certainly one of its most celebrated. As countless commentators have recognized, McCulloch’s importance arises from its doctrine of implied congressional powers, which has been applied even to constitutional amendments adopted decades after the McCulloch decision itself. Revered though it may now be, Chief Justice Marshall’s opinion provoked a hostile commotion when it was issued, so much so that he was moved to defend it in a series of anonymous newspaper essays. The opinion remained controversial for many years, and it deserves to become controversial once again.

Like Marshall, all of the current Justices can say that the abstract principle of limited and enumerated powers is “now universally admitted.” But the legacy of his opinion has been the effective destruction of that principle. McCulloch famously proclaimed that “we must never forget, that it is a constitution we are expounding.” This sonorous aphorism is frequently, if unnecessarily and improperly, taken to mean that it is merely a constitution, which judges are free (or obligated!) to amend under the guise of interpretation. That attitude has triumphed historically, and perhaps irrevocably. Constitutional law is widely regarded now as a branch of political philosophy or as a field on which to play junior varsity statesmanship. Or, not infrequently, as an arena for flamboyant moral posturing or as a weapon of partisan warfare.

Rather than submissively celebrate these developments, we could choose to stop forgetting that the Constitution was originally meant to be a law, and that it was meant to be more authoritative than what the Supreme Court says about it. If we did, McCulloch and its rank progeny would become controversial once again.