Contractors Vexed by Federal Definition of AI: Dr. Lance Eliot


Contractors can get blindsided by something as simple as a definition. Dr. Lance Eliot points out that the federal definition of artificial intelligence (AI) varies dramatically from contract to contract, which can trip up contractors that assert for contractual purposes that they have or are using AI.


A rose isn’t necessarily a rose by any other name. It all depends upon your definition.

Contractors know that the wording of federal contracts is carefully chosen and must be strictly followed. Nearly every contract contains definitions of key terms on which the core elements of the vendors’ services or products will hinge.

Let’s consider AI. Suppose you’re seeking a federal gig that says AI is integral to the work effort. The odds are pretty high that the contract will include a definition of the AI that’s being sought or expected to be used.

Here’s a handy warning: Pay attention to the definition of AI that is stated in the contract.

Unbeknownst to many, there isn’t yet an all-purpose, universally agreed-upon standard definition of AI. That’s right: each contract for each effort will almost certainly concoct its own idiosyncratic definition of AI.

You might have had prior contracts that said your AI fit within that particular definition of AI, and yet this might not at all be the case when approaching other contracts. It’s almost a roll of the dice as to which AI definition a specific contract might contain. It’s an AI definitional free-for-all.

This becomes a problem for you if your claimed AI doesn’t fit the contractual AI definition: the feds can toss out your bid. Furthermore, if your errant AI somehow slips through and you win the contract, the feds can contend down the road that you aren’t abiding by the contract because your AI doesn’t match the stated definition. That can lead to financial penalties and even ugly legal battles over whether you undercut what was promised.

I’ll up the ante. Imagine you’re engaged in a federal gig and you’re sure you aren’t using AI, and purposefully so. You didn’t want to get mired in any contested aspects of AI, so you said that the software you’re using as part of your delivered services or products is assuredly non-AI.

Be careful what you think you know. I say this because the feds might decide that your software does fit the definition of AI. In that case, your work and the AI-you-didn’t-know-you-had are bound by various new AI-related rules associated with federal work efforts. Oops.

You can get walloped either coming or going.

The Closest Definition

To get you ready for any AI definitional debates, I direct your attention to a federal definition that seems to have caught on and tends to be used in a lot of federal contracts. It was first codified in Section 238(g) of the John S. McCain National Defense Authorization Act for Fiscal Year 2019. Since then, many procurement officers have taken the easy road, simply copying and pasting that AI definition into their contracts.

Here’s the definition:

(g) ARTIFICIAL INTELLIGENCE DEFINED. In this section, the term “artificial intelligence” includes the following:
(1) Any artificial system that performs tasks under varying and unpredictable circumstances without significant human oversight, or that can learn from experience and improve performance when exposed to data sets.
(2) An artificial system developed in computer software, physical hardware, or other context that solves tasks requiring human-like perception, cognition, planning, learning, communication, or physical action.
(3) An artificial system designed to think or act like a human, including cognitive architectures and neural networks.
(4) A set of techniques, including machine learning, that is designed to approximate a cognitive task.
(5) An artificial system designed to act rationally, including an intelligent software agent or embodied robot that achieves goals using perception, planning, reasoning, learning, communicating, decision making, and acting.

Take a close look. Your legal beagles should, too.

You’ll want to be ready to pitch that your AI fits within this definition when a fed bid is explicitly asking for AI. On the flip side, if you’re claiming that your software is not AI, you’ll want to ensure that the software doesn’t fit this “commonplace” AI definition.

One other thing: the AI definition codified in Section 238(g) is absolutely not a rose. The definition is riddled with vagaries and confounding elements, and the fact that it continues to be used and has become a de facto standard is troubling and problematic. Some people bitterly complain that just about anything, kitchen sink included, would seem to fit within that AI definition. Others worry that bona fide AI could be argued not to fit snugly within it.

Whatever you do, don’t get yourself jabbed by this thorny rose. Keep your eyes open and make sure to scrutinize any federal work for that job’s definition of AI.


Author Information

Dr. Lance Eliot is a globally recognized expert on AI & Law and serves as founder and CEO of Techbrium Inc. In addition, he is a Stanford Fellow affiliated with the Stanford Law School and the Stanford Computer Science Department via the Center for Legal Informatics. Formerly a top executive at a major venture capital firm and a partner in a prominent consulting firm, he is a successful entrepreneur who has started, run, and sold several high-tech firms. His popular books on AI & Law are highly rated and available on Amazon and at other online booksellers.

