In Kent Malone-Bey v. Lauderdale County School Board, 3:25-cv-380, the plaintiff, an instructor, was terminated by the defendant. He filed suit. At some point he filed a motion to disqualify the law firm defending the case. The motion was denied. The opinion offers another chapter in the AI hallucination saga.
The Court sua sponte addresses a separate issue: Plaintiff’s continued use of fake, nonexistent legal opinions in his filings before the Court. This practice unfortunately is not unique to the instant matter, as the number of filings that include fictitious legal opinions is rising. This trend perhaps correlates with the increasing frequency with which our society embraces and uses generative artificial intelligence (“AI”) programs in everyday life, including in legal matters. AI is certainly a tool capable of providing immense benefits and efficiency in the legal profession and in legal matters. Those benefits, however, are not without risk.
Starting with the obvious, it is well known that AI can generate fake sources of information, commonly referred to as “AI Hallucinations.” Wadsworth v. Walmart, Inc., 348 F.R.D. 489, 493 (D. Wyo. 2025). And individuals who use AI for legal research are not immune from these hallucinations. Indeed, AI programs are known to hallucinate nonexistent cases. Sanders v. United States, 176 Fed. Cl. 163, 169 (2025). As AI programs advance, they only increase the difficulty with which users (and the courts) can determine whether a case provided by an AI program is in fact “hallucinated.”
That is because hallucinated cases look like real cases. They are identified by a case name, a citation to a reporter, the name of a district or appellate court, and the year of the decision. United States v. Hayes, 763 F. Supp. 3d 1054, 1065 (E.D. Cal. 2025). But they are not real cases. These hallucinated cases are instead inaccurate depictions of information from AI models that suffer from incomplete, biased, or otherwise flawed training data. Wadsworth, 348 F.R.D. at 493. Thus, “[w]hen used carelessly, [AI] produces frustratingly realistic legal fiction that takes inordinately longer to respond to than to create. While one party can create a fake legal brief at the click of a button, the opposing party and court must parse through the case names, citations, and points of law to determine which parts, if any, are true.” Ferris v. Amazon.com Servs., LLC, 2025 WL 1122235, at *1 (N.D. Miss. Apr. 16, 2025).
Here, Plaintiff’s Brief [18] overflows with citations either to fake, nonexistent cases or to existing cases cited for incorrect legal propositions, i.e., the hallmarks of AI hallucinations. It therefore appears that Plaintiff may have used generative AI to assist in his legal research and in drafting his motion briefs.
As just one example, Plaintiff cites Coleman v. Ret. Plan for Dist. Managers, 969 F.3d 142, 149 (5th Cir. 2020) for the proposition that courts may disqualify counsel to “preserve the integrity of the adversary process.” [18] at 2. But the Coleman case does not exist.[5] By citing (and quoting) a fake opinion,[6] Plaintiff undermines the very integrity that he urges this Court to uphold. Indeed, submitting fictitious cases and quotations to the Court degrades or impugns the integrity of the Court. Hayes, 763 F. Supp. 3d at 1054. Moreover, Plaintiff’s attempt to persuade the Court “by relying on [a] fake opinion[] is an abuse of the adversary system.” Mata v. Avianca, Inc., 678 F. Supp. 3d 443, 461 (S.D.N.Y. 2023) (emphasis added).[7]
Even if Plaintiff did not use AI to assist with citing legal opinions and holdings, he still must comply with Federal Rule of Civil Procedure 11. See Yazdchi v. Am. Honda Fin. Corp., 217 F. App’x 299, 304 (5th Cir. 2007). Rule 11 provides that “[b]y presenting to the court a pleading, written motion, or other paper — whether by signing, filing, submitting, or later advocating it — an attorney or unrepresented party certifies that to the best of the person’s knowledge, information, and belief, formed after an inquiry reasonable under the circumstances… the claims, defenses, and other legal contentions are warranted by existing law.” Fed. R. Civ. P. 11(b)(2).
By citing to fake legal opinions (or hallucinated legal holdings therein), Plaintiff not only wasted the Court’s time and judicial resources, but he also violated his Rule 11 obligations. See Benjamin v. Costco Wholesale Corp., 2025 WL 1195925, at *2 (E.D.N.Y. Apr. 24, 2025) (“an attorney [or pro se party] who submits fake cases clearly has not read those nonexistent cases, which is a violation of Rule 11 of the Federal Rules of Civil Procedure”) (emphasis in original). Plaintiff is therefore reminded of his obligations under Rule 11 as stated herein and advised that future filings before this Court must contain only accurate representations and citations. Additionally, Plaintiff is warned that future filings with citations to fictitious legal opinions and holdings, or which do not otherwise comply with Rule 11, may result in sanctions, including but not limited to the striking of filings and the imposition of monetary penalties. Sanders, 176 Fed. Cl. at 170; Ferris, 2025 WL 1122235, at *2-3.
IT IS, THEREFORE, ORDERED that:
1. Plaintiff’s Motion to Disqualify Butler Snow LLP from Representing Individual Capacity Defendants [17] is DENIED; and
2. Plaintiff is reminded of his Rule 11 obligations as stated herein and warned that future filings with citations to fictitious legal opinions and holdings in filings before the Court, or which do not otherwise comply with Rule 11, may result in sanctions, including but not limited to striking Plaintiff’s filings and monetary penalties.
SO ORDERED.

