No Country for New Laws

Trailblazing developments in Artificial Intelligence have left a litany of complex legal cases in their wake, and judges are tasked with deciding how old law should be interpreted against the backdrop of this technological revolution.

Comptroller-General of Patents, Designs and Trade Marks v Emotional Perception AI Limited [2024] EWCA Civ 825 raises the important question of whether an invention involving an Artificial Neural Network (“ANN”) is excluded from patentability as a “program for a computer” under the Patents Act 1977 (“the Act”). After the High Court’s resounding “no”, it appeared that the UK may have become a more favourable jurisdiction than the European Patent Office (EPO) for patenting ANN-related inventions. Last week, however, in a mood of apparent legal conservatism, the Court of Appeal (Nicola Davies, Arnold and Birss LJJ) unanimously decided to overturn the lower court’s decision: the ANN is excluded after all.

Emotional Perception’s Case

In its patent application, Emotional Perception AI Ltd (“EPAI”) claims a system that contains an ANN. Consider a media file containing a music track. The song captures the right mood. How to find more of the same? EPAI’s ANN can identify a “semantically similar” music track, which the system may recommend and send to a listener. To do this, the ANN analyses features of the song, such as its tempo (bpm), and estimates distances from that song to other candidate songs in an abstracted embedding space of those features. During training, the ANN learns to correlate distances in the feature embedding space with separations in a semantic embedding space (think “groovy” versus “romantic”) as predicted by a natural language processor. As a result, the system can return a semantically similar file to the user. EPAI’s system has a noble cause: because it recommends songs based on characteristics of the music itself, rather than statistics such as numbers of streams, it allows budding artists to be promoted on an equal footing with the Swifts and Beatles of the industry.
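
To make the mechanics concrete, below is a minimal, purely illustrative sketch of this kind of distance-based recommendation in Python. The feature dimensions, the random catalogue, and the least-squares projection standing in for the trained ANN are all assumptions invented for brevity; nothing here reproduces EPAI’s actual claimed system.

```python
import numpy as np

# Hypothetical catalogue: each track has a vector of measurable audio
# features (tempo in bpm, energy, ...) and, for training purposes only,
# a semantic "mood" vector derived from human descriptions by an NLP model.
rng = np.random.default_rng(0)
feature_vecs = rng.normal(size=(100, 8))   # 100 tracks, 8 audio features
semantic_vecs = rng.normal(size=(100, 4))  # 4-dimensional semantic space

# Learn a linear projection W so that projected feature vectors sit close
# to their semantic vectors; distances in the projected space then track
# semantic separations. (EPAI trains an ANN for this; least squares is a
# deliberately crude stand-in.)
W, *_ = np.linalg.lstsq(feature_vecs, semantic_vecs, rcond=None)

def recommend(query_idx: int) -> int:
    """Return the catalogue index of the track closest to the query
    in the semantically aligned embedding space."""
    embedded = feature_vecs @ W
    dists = np.linalg.norm(embedded - embedded[query_idx], axis=1)
    dists[query_idx] = np.inf  # never recommend the query track itself
    return int(np.argmin(dists))

print(recommend(0))  # index of a "semantically similar" track
```

In the claimed invention, the role played here by the fixed projection is performed by the ANN’s trained weights, which is precisely the element the Court of Appeal goes on to characterise as the “program”.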

EPAI’s patent application was refused by the UK Intellectual Property Office (UKIPO). Section 1(2)(c) of the Act provides that “a program for a computer” is excluded from patentability when claimed “as such”. The leading precedent for applying this exclusion, and the additional exclusion for “mathematical methods” under Section 1(2)(a), is Aerotel v Telco Holdings (“Aerotel”). Central to the Aerotel approach is determining whether the “contribution” of the invention is “technical” in nature. On this point, the hearing officer at the UKIPO rejected EPAI’s argument that an equivalence between a hypothetical implementation of the ANN on dedicated hardware, and its software equivalent, meant that the software implementation was inherently technical. Even if the software ANN could be “decoupled” from the hardware ANN, the hearing officer found that it would be excluded as a mathematical method. Furthermore, the subjectivity of the semantic descriptions used during training meant that providing a semantically similar recommendation was not a technical contribution: a failed application either way. EPAI appealed.

Turning the hearing officer’s logic on itself, the High Court rejected both of his findings. The judge, Sir Anthony Mann, concluded that because a hardware implementation does not involve a computer program, the equivalent software implementation ought not to be treated as involving one either. He also held that, though a semantically similar recommendation may be an inherently subjective judgement, it is equally a product of the ANN’s internal logic, i.e. the “technical criteria” that it had worked out for itself over the course of training. The Comptroller appealed.

The Court of Appeal’s Assessment

In the Court of Appeal’s judgment, Birss LJ notes that all four of the Comptroller’s grounds of appeal are tied to a fundamental question of law: should a “program for a computer”, wording from a statutory clause drafted in 1977 to implement the European Patent Convention (EPC) of 1973, be understood in 2024 to encompass ANNs? Only if the answer is “yes” does the need to assess the technicality of the contribution arise. If the answer is “no”, one must still consider whether an ANN is excluded as a mathematical method.

In his own answer to the fundamental question, Birss LJ deftly avoids defining either a computer or a program in scientific terms. He reasons that pillars of British case law support the use of ordinary meanings instead: a computer “is a machine which processes information” in the form of instructions, and a program is a “set of instructions for a computer to do something”. The definitions hold regardless of what the something is, or what the instructions are. In the courtroom, Birss LJ posed another question: “Is there magic in the word ‘instruction’?” The judgment suggests no such “magic” exists: Birss LJ rejects the submission that the instructions of a computer program must take the form of a logical series of ‘if-then’ type statements, or that they must be created by a human author. In a somewhat creative flourish, Birss LJ interprets an ANN as a computer (a machine for processing information) and the ANN’s weights as a computer program (the set of instructions for that computer). Quite how the ANN differs from its own weights is not explained (perhaps the former refers to the network architecture), but the distinction appears to rest on the Comptroller’s contrasting, at the hearing, of a “generic” untrained ANN with a “specific” trained ANN. Regardless, the adopted interpretation means that the ANN falls within the Section 1(2)(c) exclusion. (Birss LJ cites EPO case law to suggest that the same approach would be taken in Europe.)

The fact that the exclusion is engaged means that “ANN-implemented inventions are in no better and no worse position than other computer-implemented inventions”. Accordingly, Birss LJ moves on to the question of whether there is a technical contribution. In this regard, EPAI argued that the nature of the analysis of the inputs, and/or the external transfer of an electronic file, confers the requisite technical character on the system, and emphasised during the hearing the need to consider the “whole contribution” of the claimed invention. Birss LJ, however, conducts his assessment based solely on the contribution identified by the lower court, and rejects its finding that that contribution is technical, for two reasons: first, the step of sending a recommended file is a “presentation of information”, which is excluded under Section 1(2)(d) of the Act; and second, the running of the program (the ANN’s weights) does not improve the computer (the ANN itself). Rather, the claimed system merely provides an improved file recommendation, and what makes the recommended file “worth recommending” are its semantic qualities, which are subjective rather than technical.

Ramifications

The Court of Appeal’s decision may disappoint certain members of the UK patent profession and would-be applicants, who have been licking their lips at the prospect that the UK may have become a more favourable jurisdiction than the EPO for patenting inventions involving ANNs. After all, the UK has sought to position itself as a world leader in AI innovation, and it could be argued that a wider availability of patent protection for such inventions would support this aim. However, in the absence of a successful appeal to the Supreme Court, the UK remains closely aligned with the EPO, and presents a considerably higher hurdle than the US. With such a hurdle in place, it is more important than ever for prospective applicants to get candid advice from a patent attorney who knows where the line is (and how it can be pushed) before going ahead.

A Pragmatic Counterpoint

In the courtroom, Birss LJ posed an intriguing thought experiment. Imagine that EPAI’s ANN were a mechanical jukebox with in-built “clever technicality” that allowed it to provide a semantically similar track recommendation in the same way as the ANN. The implication was, of course, that the jukebox would be technical, suggesting that the ANN ought to be technical by analogy.

The Comptroller’s retort was that a computer program with clever technicalities is excluded because of “policy considerations” and not due to qualities intrinsic to a computer program. Specifically, the exclusion of computer programs from patentability is usually justified by the policy consideration that protection of programming expressions belongs to the realm of copyright law. This exclusion, it is argued, avoids overlapping IP rights and thus promotes legal certainty for competitors and third parties. Although the High Court and the Court of Appeal reached opposite conclusions, both decisions follow a literalist approach, turning on whether a “program for a computer” can be read to encompass an ANN. An alternative approach, at once purposive and pragmatic, would have assessed whether the purpose of the exclusion applies equally to ANNs: unlike computer programs, it is not clear that ANNs enjoy copyright protection. The exclusion of ANNs from patentability arguably perpetuates a gap in their legal protection: they are neither patentable nor protected by copyright.

The Court of Appeal grounds its assessment of the “program for a computer” exclusion in the doctrine that a statutory clause is “always speaking”, meaning that it can broaden in scope over time whilst retaining its original wording. Still, Birss LJ finds there to be “no justification for drawing a distinction in law between instructions created by a computer and those created by a human”. Yet ANNs increasingly contribute to inventions alongside human inventors, and in Thaler v Comptroller-General, the UK Supreme Court recently held that an AI system cannot be an inventor. Evidently, the UK’s highest court has already drawn the distinction for which Birss LJ finds no justification.

At the hearing, Arnold LJ ventured into a historical consideration of the types of protection for computer programs debated during the drafting of the Patents Act 1977. Three possibilities were considered: patent, copyright, and a sui generis right. In the absence of either patent or copyright protection for AI models, should the notion of a sui generis right be revisited?