Federal Court of Australia

Thaler v Commissioner of Patents [2021] FCA 879

Review of:

Stephen L. Thaler [2021] APO 5

File number:

VID 108 of 2021

Judgment of:

BEACH J

Date of judgment:

30 July 2021

Catchwords:

PATENTS – artificial intelligence – machine learning – artificial neural networks – invention produced by a computer – semi-autonomous systems – concept of inventor – whether a machine can be an inventor – whether only a human can invent – device for the autonomous boot-strapping of unified sentience (DABUS) – invention created by DABUS in the form of containers, devices and methods for attracting enhanced attention – patent application – nature of inventor as distinct from patent applicant – concept of “inventor” under Patents Act 1990 (Cth) – validity of patent application where inventor a machine – relevance of human-ness of inventor to patent grant – whether decision to reject patent application valid

Legislation:

Acts Interpretation Act 1901 (Cth) s 2C

Administrative Decisions (Judicial Review) Act 1977 (Cth) s 5

Judiciary Act 1903 (Cth) s 39B

Patents Act 1990 (Cth) ss 2A, 3, 7, 15, 18, 29, 29A, 40, 45, 59, 64, 101B, 101E, 113, 138, 142, 172, 182, 185, 208; sch 1

21 Jac 1 c 3 (Statute of Monopolies) 1623 (Imp) s 6

Intellectual Property Legislation Amendment (Raising the Bar) Regulation 2013 (No. 1) (Cth)

Intellectual Property Legislation Amendment (TRIPS Protocol and Other Measures) Regulation 2015 (Cth)

Patents Regulations 1991 (Cth) regs 3.1A, 3.2A, 3.2B, 3.2C, 3.18

Patent Cooperation Treaty (Washington, 19 June 1970) arts 4, 9, 27, 58

Regulations under the Patent Cooperation Treaty (Washington, 19 June 1970) rr 4, 51bis

Cases cited:

Atlantis Corp Pty Ltd v Schindler (1997) 39 IPR 29

Brent v Federal Commissioner of Taxation (1971) 125 CLR 418

D’Arcy v Myriad Genetics Inc (2015) 258 CLR 334

Dodson v Grew (1767) Wilm 272; 97 ER 106

Dunlop v Cooper (1908) 7 CLR 146

Federal Commissioner of Taxation v Clarke (1927) 40 CLR 246

House of Peace Pty Ltd v Bankstown City Council (2000) 48 NSWLR 498

JMVB Enterprises Pty Ltd v Camoflag Pty Ltd (2006) 154 FCR 348

Martin v Scribal Pty Ltd (1954) 92 CLR 17

Martin v Scribal Pty Ltd (1956) 95 CLR 213

Otsuka Pharmaceutical Co Ltd v Generic Health Pty Ltd (No 2) (2016) 120 IPR 431

PMT Partners Pty Ltd (in liq) v Australian National Parks & Wildlife Service (1995) 184 CLR 301

Preston Erection Pty Ltd v Speedy Gantry Hire Pty Ltd (1998) 43 IPR 74

Provincial Insurance Australia Pty Ltd v Consolidated Wood Products Pty Ltd (1991) 25 NSWLR 541

Russell v Wilson (1923) 33 CLR 538

Speedy Gantry Hire Pty Ltd v Preston Erection Pty Ltd (1998) 40 IPR 543

Stack v Davies Shephard Pty Ltd (2001) 108 FCR 422

Tate v Haskins (1935) 53 CLR 594

Towne v Eisner, 245 US 418 (1918)

University of British Columbia v Conor Medsystems Inc (2006) 155 FCR 391

William Blackstone Esq., Commentaries on the Laws of England (Clarendon Press, 1766) bk 2

Division:

General Division

Registry:

Victoria

National Practice Area:

Intellectual Property

Sub-area:

Patents and associated Statutes

Number of paragraphs:

228

Date of hearing:

2 July 2021

Counsel for the Applicant:

Mr D Shavin QC with Ms C Cunliffe

Solicitor for Applicant:

Allens

Counsel for the Respondent:

Mr H P T Bevan

Solicitor for the Respondent:

Australian Government Solicitor

ORDERS

VID 108 of 2021

BETWEEN:

STEPHEN THALER

Applicant

AND:

COMMISSIONER OF PATENTS

Respondent

order made by:

BEACH J

DATE OF ORDER:

30 JULY 2021

THE COURT ORDERS THAT:

1.    The determination of the Deputy Commissioner of Patents made on 9 February 2021 to treat patent application no. 2019363177 as lapsed be set aside.

2.    The determination of the Deputy Commissioner that s 15(1) of the Patents Act 1990 (Cth) is inconsistent with an artificial intelligence system or device being treated as an inventor be set aside.

3.    The matter as to whether patent application no. 2019363177 satisfies the formalities under the Patents Regulations 1991 (Cth) and its examination be remitted to the Deputy Commissioner to be determined according to law in accordance with these reasons.

4.    There be liberty given to either party within 14 days of the date of these orders to apply to supplement orders 1 to 3 by declaration(s) if thought necessary.

5.    There be no order as to costs.

Note:    Entry of orders is dealt with in Rule 39.32 of the Federal Court Rules 2011.

REASONS FOR JUDGMENT

BEACH J:

1    The Deputy Commissioner of Patents has determined that patent application no. 2019363177 did not comply with reg 3.2C(2)(aa) of the Patents Regulations 1991 (Cth). Regulation 3.2C(2)(aa) requires that the applicant, who in this case is Dr Stephen Thaler, must provide the name of the inventor of the invention to which the application relates.

2    An artificial intelligence system, which has been described as a device for the autonomous bootstrapping of unified sentience (DABUS), was named as the inventor by Dr Thaler. But it has been determined by the Deputy Commissioner that such a system could not be an inventor.

3    The Deputy Commissioner considered that s 15(1) of the Patents Act 1990 (Cth) is inconsistent with an artificial intelligence machine being treated as an inventor; so too reg 3.2C(2)(aa). It was also said that Dr Thaler’s application and its deficiencies were incapable of remedy.

4    Further, it was determined that Dr Thaler had not complied with a direction under reg 3.2C(4), and that his application had lapsed.

5    Dr Thaler seeks judicial review of these decisions under s 5(1) of the Administrative Decisions (Judicial Review) Act 1977 (Cth) and s 39B of the Judiciary Act 1903 (Cth). He says that s 15, and the Act and the Regulations more generally, do not preclude an artificial intelligence system being treated as an inventor. He says that the Deputy Commissioner misconstrued s 15(1), and the Act and the Regulations more generally, as being inconsistent with an artificial intelligence system being treated as an inventor.

6    The underlying question for my determination is whether an “inventor” for the purposes of the Act and the Regulations can be an artificial intelligence system. As I have indicated, that question arises because reg 3.2C(2)(aa) requires that an applicant of a Patent Cooperation Treaty (Washington, 19 June 1970) (PCT) application must provide the name of the inventor of the invention to which the application relates.

7    The Deputy Commissioner held that an artificial intelligence system cannot be an inventor because “[s]ection 15(1) is clear, but not capable of sensible operation in the situation where an inventor would be an artificial intelligence machine as it is not possible to identify a person who could be granted a patent” (at [32]). The effect of his reasoning is that an artificial intelligence system can invent something that satisfies all of the requirements of patentability in terms of novelty, inventiveness and utility, but such an invention will be unpatentable because the Act requires a human inventor.

8    Now Dr Thaler is the owner of copyright in DABUS’s source code. He is also the owner and operator of, and is responsible for, the computer on which DABUS operates. But Dr Thaler is not the inventor of the alleged invention the subject of the application. The inventor is identified on the application as “DABUS, The invention was autonomously generated by an artificial intelligence”. DABUS is not a natural or legal person. DABUS is an artificial intelligence system that incorporates artificial neural networks.

9    The alleged invention is the output of DABUS’s processes. Various products and methods are claimed concerning containers, devices and methods for attracting enhanced attention using convex and concave fractal elements.

10    In summary and for the following reasons, in my view an artificial intelligence system can be an inventor for the purposes of the Act. First, an inventor is an agent noun; an agent can be a person or thing that invents. Second, so to hold reflects the reality in terms of many otherwise patentable inventions where it cannot sensibly be said that a human is the inventor. Third, nothing in the Act dictates the contrary conclusion.

11    It follows that I reject the Deputy Commissioner’s determination and the Commissioner’s position before me.

12    First, that position confuses the question of ownership and control of a patentable invention including who can be a patentee, on the one hand, with the question of who can be an inventor, on the other hand. Only a human or other legal person can be an owner, controller or patentee. That of course includes an inventor who is a human. But it is a fallacy to argue from this that an inventor can only be a human. An inventor may be an artificial intelligence system, but in such a circumstance could not be the owner, controller or patentee of the patentable invention.

13    Second, on the Commissioner’s logic, if you had a patentable invention but no human inventor, you could not apply for a patent. So by employing the Commissioner’s device of using a procedural requirement in a subordinate instrument, you would substantively preclude the possibility of a patent grant for that invention. Nothing in the Act justifies such a result. And it is the antithesis of the s 2A object.

14    Third, in my view the Commissioner has not kept faith with the tenet that “[i]t is also of fundamental importance that limitations and qualifications are not read into a statutory definition unless clearly required by its terms or its context, as for example if it is necessary to give effect to the evident purpose of the Act” (PMT Partners Pty Ltd (in liq) v Australian National Parks & Wildlife Service (1995) 184 CLR 301 at 310 per Brennan CJ, Gaudron and McHugh JJ). Indeed the evident purpose of the Act, a proxy for which is s 2A, is at odds with the unreality of persisting with the notion that artificial intelligence systems cannot be inventors.

15    Fourth, much of the Commissioner’s argument descended into dictionary definitions of “inventor”. But more is required of me than mere resort to old millennium usages of that word. If words are only pictures of ideas upon paper (Dodson v Grew (1767) Wilm 272 at 278; 97 ER 106 at 108 per Wilmot CJ) and if, as Holmes J described it, they are not “crystal[s], transparent and unchanged, [but] the skin of a living thought and may vary greatly in colour and content according to the circumstances and the time in which [they] are used” (Towne v Eisner, 245 US 418, 425 (1918)), I need to grapple with the underlying idea, recognising the evolving nature of patentable inventions and their creators. We are both created and create. Why cannot our own creations also create?

Background

16    I should first deal with some background technical matters concerning artificial neural networks generally and DABUS particularly. I will then say something about the use of artificial intelligence in pharmaceutical research to exemplify where the science is and where it is going in terms of patentable inventions involving artificial intelligence. I do this to demonstrate that unless the Act demands it, outmoded notions of the agent that may invent and the subject matter of a patentable invention are not controlling. And if the concept of “invention” in terms of manner of manufacture evolves, as it must, why not the concept of “inventor”?

17    I should say at the outset that I do not propose to offer any definition of “artificial intelligence”. Discussion of the Turing Test, the Chinese Room thought experiment, functionalism and so on can be left to others. Of course I am dealing with more than a brute force computational tool. And perhaps one could apply the label of “artificial intelligence” to a system that has the capacity for deductive reasoning, inductive reasoning and pattern recognition, leaving to one side any possible embodiment of awareness, consciousness or sense of self. But whatever be the position, I am prepared for the moment to equate DABUS with artificial intelligence if it operates in the fashion that Dr Thaler has represented, but without anthropomorphising its algorithms.

18    I should also say that I do not need to discuss the concept of autonomy in terms of autonomous systems, semi-autonomous systems or non-autonomous systems. But I will make two points here. First, there is a distinction between automation and autonomy, which I will return to later. Second, as DABUS has been described to me, I would only consider it to be semi-autonomous, as opposed to Dr Thaler’s more ambitious label, although its output in terms of the operation of the artificial neural networks could be said in one sense to be autonomously generated. Let me turn then to artificial neural networks.

Artificial neural networks

19    Artificial intelligence systems may incorporate, or be constituted by, artificial neural networks. Such artificial neural networks may be implemented within machines and self-organise to simulate the way in which the human brain processes and generates information.

20    Artificial neural networks are a subfield of machine learning. Machine learning involves computer systems that learn from data. Generally speaking there are three types of machine learning, being supervised, unsupervised and reinforcement learning.

21    Artificial neural networks are based on mathematical modelling designed to mimic natural neural networks. In humans this is constituted by billions of neurons linked by a complex network, with intelligence produced as a result of the interconnections between neurons. The input of one neuron consists of the output signals of other neurons. That input neuron may then be fired, producing output signals which are then the input signals for other neurons, and so on. The pattern of the cascading firing of numerous neurons is non-linear. I described part of the mechanism in Otsuka Pharmaceutical Co Ltd v Generic Health Pty Ltd (No 2) (2016) 120 IPR 431; [2016] FCAFC 111 in the following terms (at [135] to [138]):

The transmission of nerve impulses takes place by means of specific chemical agents called “neurotransmitters”. This transmission occurs in the brain, other parts of the central nervous system and the peripheral nervous system. Transmission within the peripheral nervous system is important for everyday processes without which a human could not survive, eg breathing, heartbeat, movement and generally every important bodily function. This transmission is coordinated by electrical signals (nerve impulses) to and from the central nervous system in which the brain and spinal cord are key components. In addition to coordinating these everyday functions, the central nervous system subsumes other roles such as thinking (cognition) and feeling (emotion).

Neurotransmission is the process by which small molecules (neurotransmitters) transmit signals (electrical impulses) from one neuron to the next. Neurotransmission takes place at a “synapse”, ie where two axons (ie neurons) are in close proximity but not touching one another. These are the sites of communication between neurons in the central nervous system. The gap between the two adjacent neurons is the “synaptic cleft” or “synaptic gap”. It is across this gap that impulses are transmitted by the use of specialised chemicals known as neurotransmitters. When the nerve impulses arrive at a synapse, neurotransmitters are released, which in turn can influence another neuron, either by inhibitory or excitatory impulses. Further, the receiving neuron may “on-transmit” to many other neurons which then influence these neurons and so on. The neurotransmitters are released by a “presynaptic neuron” which bind to and activate “receptors”

Neurotransmission occurs when an action potential (ie an electrical impulse) is initiated in a neuron and arrives at the nerve terminal of the pre-synaptic neuron. The mechanism of the action potential is a function of the electrical potential across part of the neuron, being the axon. It involves “travelling” switching polarity across the membrane of the axon in one direction only mediated by the action of the opening and closing of voltage gated ion channels. The action potential or electrical impulse when it arrives at the nerve terminal causes the release of chemical neurotransmitters. Neurotransmission then takes place, but in one direction, ie from the pre-synaptic cell to the post-synaptic cell. When neurotransmitters bind to their receptors on the postsynaptic neuron this may result in short term changes, such as changes in the membrane potential (ie the electrical charge contained by that postsynaptic neuron) called a postsynaptic potential; this may trigger a further action potential.

Neurons are arranged in the form of networks (neural networks) through which signals can travel. Information arrives at each neuron from many others. The human brain has approximately 100 billion neurons which make about 100 trillion synapses. Information is transferred in the brain, and signals are propagated throughout the body, by way of nerve impulses. Signals are sent to and from the central nervous system by efferent (ie conducting away) and afferent (ie conducting to) neurons in order to coordinate functions essential for survival.

22    Artificial neural networks in their software structure and in their mathematical elements roughly model natural neural networks. They are a more sophisticated form of machine learning. A simple example is an object recognition system that might be fed thousands of labelled images of a particular item so as to then find visual patterns in the images that consistently correlate with particular labels.

23    An artificial neural network consists of thousands or even millions of simple processing nodes that are densely connected. Neural networks are organised into layers of nodes, which are feed-forward in the sense that, generally speaking, data moves through them in only one direction; I will discuss backpropagation in a moment. In a feed-forward arrangement, simply put, an individual node might be connected to several nodes in the earlier layer, from which it receives data, and several nodes in the succeeding layer, to which it sends data.

24    To each of its incoming connections, a node will assign a number known as a weight. When the network is active, the node receives a different data item, being a different number, over each of its connections and multiplies it by the associated weight. It then adds the resulting products together, yielding a single number. If that number is below a threshold value, the node passes no data to the next layer. If the number exceeds the threshold value, the node fires, which generally means sending the number, being the sum of the weighted inputs, along all its outgoing connections.
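
The weighted-sum and threshold mechanism just described can be illustrated by a short schematic sketch. The inputs, weights and threshold below are values assumed purely for exposition; they are not drawn from any system in evidence.

    def node_output(inputs, weights, threshold):
        # Multiply each incoming data item by the weight assigned to its
        # connection and add the resulting products together, yielding a
        # single number.
        total = sum(x * w for x, w in zip(inputs, weights))
        # If that number exceeds the threshold, the node "fires" and sends the
        # sum of the weighted inputs along its outgoing connections; otherwise
        # it passes no data to the next layer.
        return total if total > threshold else None

    # Three incoming connections with assumed weights; here the node fires,
    # returning approximately 0.86 (which exceeds the threshold of 0.5).
    print(node_output([0.5, 0.2, 0.9], [0.4, -0.3, 0.8], threshold=0.5))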

25    When an artificial neural network is being trained, all of its weights and thresholds are initially set to random values. Training data is fed to the input layer, and it passes through the succeeding layers, getting multiplied and added together in complex ways, until it arrives in a transformed form at the output layer. During training, the weights and thresholds are continually adjusted until training data with the same labels consistently yield similar outputs.

26    There are various significant features of an artificial neural network. First, there is parallel processing given that neurons simultaneously process the information. Second, there is the twofold function of the neuron; it acts simultaneously as both memory and a signal processor. Third, one has the distributed nature of the data representation; knowledge is distributed throughout the network. Fourth, one has the ability of the network to learn from experience.

27    Layered networks consist of a number of layers. First, one has input layers, consisting of neurons for each network input. Second, one has the hidden layers, composed of one or more hidden or intermediate layers consisting of neurons. Third, one has an output layer, consisting of neurons for each network output. Further, there may be a feedback architecture, with connections between neurons of the same or previous layer. Further or alternatively, there may only be a feedforward architecture without feedback connections.
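
The layered arrangement of input, hidden and output layers may be rendered, in feed-forward form only, as the following sketch. The network shape and weights are assumed for illustration, and thresholds and feedback connections are omitted for brevity.

    def forward(inputs, network):
        # "network" is a list of layers; each layer is a list of weight
        # vectors, one vector per neuron in that layer.
        activations = inputs
        for layer in network:
            activations = [sum(a * w for a, w in zip(activations, weights))
                           for weights in layer]
        return activations

    # Assumed toy network: two inputs, a hidden layer of two neurons, and a
    # single output neuron; prints approximately [0.9].
    print(forward([1.0, 0.5], [[[0.2, 0.8], [0.5, -0.4]], [[1.0, 1.0]]]))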

28    Now a backpropagation algorithm may be used that allows for the modification of connection weights in such a way that it minimises the error function. To minimise the error function a gradient-descent algorithm can be used, which starts from a generic data point and calculates the gradient, being the direction of steepest increase in the error function; moving against the gradient gives the steepest decrease. The gradient is constantly recalculated. The algorithm continues iteratively until the gradient is zero. The backpropagation algorithm can be divided into two steps. First, one has the forward step, where the input data to the network is propagated to the next level and so on, and then the network error function is computed. Second, one has the backward step, where the error made by the network is propagated backwards, and the weights are updated appropriately.
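
The two steps just described, a forward step that computes the error and a backward step that updates the weights against the gradient, can be reduced to a minimal sketch for a single weight. The data, learning rate and number of iterations below are assumed for illustration only.

    def train(samples, w=0.0, learning_rate=0.1, steps=100):
        for _ in range(steps):
            for x, target in samples:
                prediction = w * x              # forward step: propagate the input
                error = prediction - target     # compute the error
                gradient = 2 * error * x        # gradient of the squared error with respect to w
                w -= learning_rate * gradient   # backward step: move against the gradient
        return w

    # The assumed data are consistent with target = 2 * input, so training
    # drives w towards 2.
    print(train([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]))

On these assumed figures the weight settles near 2, the point at which the gradient is zero, mirroring the stopping condition described above.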

29    I should now say something about DABUS.

DABUS

30    Predecessors to DABUS seem to have combined one type of artificial neural network with a second type of artificial neural network. Let me identify these types.

31    One type of artificial neural network generates output in response to self-stimulation of the network’s connections. It can be described as a generator which is chaotically stimulated by internal and/or external disturbances to create what might be described as potential memories.

32    The other type of artificial neural network perceives value in the stream of output. The potential memories produced by the first artificial neural network are evaluated by this other network, which evaluates or predicts their novelty, utility or value. This other network, apparently, modulates synaptic perturbation levels within the first network to selectively reinforce the most important whilst weakening the rest.

33    The combination results in an artificial intelligence system that is said to produce new ideas after it perturbs the connections within its neural network(s). It is said that the two artificial neural networks mimic the human brain’s major cognitive circuit: the thalamocortical loop. The cerebral cortex generates a stream of output. The thalamus brings attention to ideas that are of interest.
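
On Dr Thaler’s description, the combination may be caricatured as a generator perturbed by noise and an evaluator that retracts or amplifies that perturbation. The following is a deliberately simplified, hypothetical sketch of that generator-and-evaluator pairing, not a rendering of DABUS itself; the scoring function, noise schedule and vector size are all assumed.

    import random

    def generate(noise_level):
        # Generator network: chaotic self-stimulation yields a candidate
        # "potential memory" (here, an assumed four-element vector).
        return [random.gauss(0.0, noise_level) for _ in range(4)]

    def evaluate(candidate):
        # Evaluator network: a stand-in score for novelty, utility or value.
        return sum(abs(v) for v in candidate)

    noise, best = 1.0, float("-inf")
    for _ in range(100):
        candidate = generate(noise)
        score = evaluate(candidate)
        if score > best:
            best = score
            noise *= 0.9   # retract perturbation to reinforce a promising topology
        else:
            noise *= 1.05  # restore perturbation to keep exploring
    print(best)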

34    Apparently, DABUS was devised to overcome challenges in these compound artificial neural networks. It is an extension of earlier conceptual designs. Apparently, it uses multiple generator artificial neural networks, the first type of network that I described above, which operate in parallel. Apparently, so Dr Thaler asserts, DABUS allows full extensibility of the generative component to trillions of computational neurons.

35    So, DABUS is a form of neurocomputing that allows a machine to generate new concepts which are encoded as chained associative memories within the artificial neural networks. And it consists of assemblies of artificial neural networks where the individual networks represent linguistic or visual concepts.

36    Random disturbances to the connections joining the artificial neural networks promote the formation of alternative chaining topologies. By triggering the retraction of the random disturbances, the system is able to perpetuate and reinforce particular topologies as memories within the artificial neural networks.

37    DABUS was trained by a combination of supervised and unsupervised learning. Initially, training was human assisted via ongoing presentation of fundamental concepts. Linguistic individual networks focused on a group of synonyms and immediate associations for various common and technical terms; for example, for the conceptual space “money”, training exemplars were “money”, “currency”, “bill” and “capital”. Other individual networks focused on images. The system accumulated knowledge as the individual networks connected themselves, at first into binary and ternary combinations of concepts, and then into longer chains of concepts. Later, the system generated notions that were offered by it for human approval or disapproval. The system was then allowed to free run; this was the unsupervised generative learning phase.

38    Once in the unsupervised generative learning phase, random disturbances promoted the formation of alternative chaining topologies which represented different concepts. Those topologies which differed from those previously observed were identified by the system as novel.

39    When these novel topologies incorporated networks that were also identified as significant, in that they incorporated certain predetermined individual networks (so-called hot buttons labelled with salient outcomes, for example, pleasure), the system triggered the retraction of random disturbances to reinforce these topologies as memories within its artificial neural network.

40    A “witness” neural network recorded memories of the topologies incorporating one or more hot buttons. With contextual input, these memories could be reported from the witness network in natural language format.

41    The upshot of all of this, which I accept for present purposes, is that DABUS could be described as self-organising as a cumulative result of algorithms collaboratively generating complexity. DABUS generates novel patterns of information rather than simply associating patterns. Further, it is capable of adapting to new scenarios without additional human input. Further, the artificial intelligence’s software is self-assembling. So, it is not just a human generated software program that then generates a spectrum of possible solutions to a problem combined with a filtering algorithm to optimise the outcome.

42    Further, DABUS in one sense can be said to mimic aspects of human brain function. Now Dr Thaler would assert that DABUS perceives and thinks like a person, but I need not comment further on such a description. But I would accept, for present purposes, Dr Thaler’s assertion in material before me that:

DABUS, and its underlying neural paradigm, represents a paradigm shift in machine learning since it is based upon the transient chaining topologies formed among associative memories, rather than activation patterns of individual neurons appearing within static architectures. From an engineering perspective, the use of network resonances to drive the formation of chaining topologies, spares programmers the ordeal of matching the output nodes of one [artificial neural network] with the input nodes of others, as in deep learning schemes. In effect, complex neural architectures autonomously wire themselves together using only scalar resonances.

Reinforcement or weakening of such chains takes place when they appropriate special hot button nets containing memories of salient consequences. Therefore, instead of following error gradients, as in traditional artificial neural net training, conceptual chains are reinforced in proportion to the numbers and significances of advantages offered. Classification is not in terms of human defined categories, but via the consequence chains branching organically from any given concept, effectively providing functional definitions of it. Ideas form as islands of neural modules aggregate through simple learning rules, the semantic portions thereof, being human readable as pidgin language.

43    Finally, an output of the process described above is the alleged invention the subject of Dr Thaler’s application.

Artificial intelligence – pharmaceutical research

44    Before proceeding further I should make some other general observations now which will become relevant later to my discussion concerning s 2A.

45    Let me elaborate on how artificial intelligence has been used in pharmaceutical research, drawing on the discussion by the Joint Institute for Innovation Policy in its September 2020 final report on “Trends and Developments in Artificial Intelligence: Challenges to the Intellectual Property Rights Framework” prepared for the European Commission.

46    First, artificial intelligence has been applied to finding molecular targets, being a molecular structure with which an active pharmaceutical ingredient interacts and thus triggers its effects. Most drug targets are proteins such as receptors, enzymes, transporters and ion channels. Nucleic acids and structural proteins can also be targets. Scientists have recently developed a machine learning algorithm capable of predicting biological targets of prospective drug molecules. An example is the Bayesian analysis for the determination of drug interaction targets (BANDIT), which is designed to use virtually all available data on any potential drug to predict which enzyme or receptor or other target in the cells it interacts with to achieve its therapeutic effect. BANDIT can streamline the process of target identification by limiting the possibilities to perhaps only one or two targets, which can then be studied.

47    Second, artificial intelligence has been applied to finding a hit or lead. Automated high-throughput screening is a method for finding so-called hits and thus candidates for lead structures. A lead is a chemical substance that is investigated in the course of drug design as a starting point for the development of a drug candidate. A lead already shows a desired biological effect in vitro, but does not yet have the qualities required for a drug substance in terms of potency, selectivity and pharmacokinetics. A lead differs from any hit from high-throughput screening in that its structure allows the synthesis of analogue molecules. A lead can also be a natural substance whose pharmacological properties must be improved by chemical modifications.

48    Third, artificial intelligence has been used in drug repurposing. Many drugs are small, flexible molecules that not only interact with the originally determined target structure, but can also interact with other biological structures. Such associated effects can lead to the identification of new applications.

49    Fourth, artificial intelligence can help in identifying candidate substances by screening existing databases. Scientists have applied machine learning models to identify completely new antibiotically-active substances. The machine learning models can analyse molecular structures and correlate them with certain properties, such as the ability to kill bacteria, and convert them into algorithms.

50    Fifth, artificial intelligence has been used in polypharmacology. Drug development in recent decades has been guided by the paradigm “one drug – one target structure – one disease”. This principle is based on the approach that a pharmacological agent is used to selectively address a target structure in the body in order to treat a specific disease pattern. A large number of serious illnesses, such as cancer, mental disorders or cardiovascular diseases, often cannot be explained by a single biological defect, but only by the interplay of complex and usually networked malfunctions that together produce the clinical picture. Artificial intelligence can assist.

51    Sixth, artificial intelligence is a useful tool in the development of vaccines. Artificial intelligence can help with gene sequencing, so as to decode the genetic material of the virus. Artificial intelligence can also help with the simulation of vaccines. In a simulation, possible vaccines can be released on the virus. The data on the nature of the virus and the vaccine may be sufficient to simulate how they react to each other. This makes it possible to exclude less promising vaccines even before the actual tests begin.

52    Seventh, artificial intelligence can assist in determining the three-dimensional structure of proteins including their folding forms. The Alphabet subsidiary DeepMind used deep learning, a subfield of machine learning, to develop the AlphaFold system, which generates 3D models of proteins based on their genetic sequences. AlphaFold has been used to predict protein structures of the SARS-CoV-2 virus. This is important in research for a vaccine. The antibodies of a vaccine must act precisely on the proteins of the virus in order to neutralise it.

53    Eighth, let me say something about databases. There are many databases that are available to integrate diverse information on molecular pathways, crystal structures, binding affinities, drug targets, disease relevance, chemical properties and biological activities. Artificial intelligence could be used to harvest information from these databases to design new compounds, not only for single targets but also in polypharmacology.

54    Ninth, artificial intelligence can also be used to support clinical trials by detecting the disease in patients, identifying the gene targets and predicting the effect of the designed molecule and the on- and off-target effects. Further, patient selection is a critical process in clinical trials. In clinical studies, the relationship between human biomarkers and in vitro phenotypes is essential. It allows a more predictable, quantifiable evaluation of the uncertainty of therapeutic responses in a given patient. The development of artificial intelligence approaches for the identification and prediction of human relevant disease biomarkers enables the recruitment of a specific patient population in Phase II and Phase III clinical trials. The success rate in clinical trials can thus be increased by predictive modelling using artificial intelligence in the selection of a patient population.

55    Generally, as the report set out (pp 38 and 39):

There are two key drivers for the recent AI advances that could significantly accelerate drug discovery:

    First, the digitisation of existing scientific knowledge has enabled the discovery of new compounds that could address certain disease targets. For example, digitising the human genome has drastically increased the search space for new chemical compounds. Rich and multimodal digital data (e.g. chemical structures, X-ray imaging) have further enlarged the search space. However, the size and modality of the data also increased the complexity of navigating in the search space.

    Second, recent advances in AI – specifically in the field of machine learning – can help in managing the complexity of navigating the vast search space for potential drugs. AI can automatically and intelligently collect, digest, analyse and detect complex patterns in the existing data on chemical compounds and biological reactions in the human body. [Machine learning] algorithms can generate a large feature space to find hidden linkages, a process that is extremely labour-intensive for medical experts to perform. They can also search systematically to predict whether new drug compounds can treat a target and generate novel hypotheses faster than scientists can. Accordingly, by overcoming the barriers in computation and data management, AI technologies can reduce the search costs to find drug-target pairs even when the search space has expanded. The resulting drug candidates could then serve as useful starting points to examine whether they are safe and effective in treating their disease targets.

Once a list of potential candidates for a drug compound is identified, AI can also be used to choose which molecules possess suitable characteristics to address the biological target of interest. For example, AI can find the optimal chemical structures to reduce toxicity and to satisfy metabolic requirements, both of which searches can be costly and data-intensive. Similarly, human errors – which have proven to constitute a barrier during this discovery process – are less likely to occur if the process is primarily driven by algorithms. By facilitating the discovery and verification of drug candidates, AI is expected to have a strong effect on the early stage of drug development (before clinical trials).

56    Now I have just dealt with one field of scientific inquiry of interest to patent lawyers. But the examples can be multiplied. What this all indicates is that no narrow view should be taken as to the concept of “inventor”. And to take such a narrow view would inhibit innovation not just in the field of computer science but in all other scientific fields which may benefit from the output of an artificial intelligence system.

57    It is now appropriate to turn to the relevant statutory provisions.

Some statutory provisions

58    Part 2 of Ch 2 of the Act is concerned with “Ownership”. Relevantly to the present context, s 15(1) provides:

Subject to this Act, a patent for an invention may only be granted to a person who:

 (a)    is the inventor; or

(b)    would, on the grant of a patent for the invention, be entitled to have the patent assigned to the person; or

(c)    derives title to the invention from the inventor or a person mentioned in paragraph (b); or

(d)    is the legal representative of a deceased person mentioned in paragraph (a), (b) or (c).

59    There is no definition of “inventor” in the Act. But “invention” is defined to mean “any manner of new manufacture the subject of letters patent and grant of privilege within section 6 of the Statute of Monopolies, and includes an alleged invention” (s 3 and sch 1).

60    Section 15(1) refers to “person”. Relevantly, s 2C(1) of the Acts Interpretation Act 1901 (Cth) provides that:

In any Act, expressions used to denote persons generally (such as “person”, “party”, “someone”, “anyone”, “no-one”, “one”, “another” and “whoever”), include a body politic or corporate as well as an individual.

61    Part 1 of Ch 3 of the Act is concerned with “Patent applications”. Section 29(1) provides:

A person may apply for a patent for an invention by filing, in accordance with the regulations, a patent request and such other documents as are prescribed.

62    For completeness, s 29(5) extends the reach of s 2C(1) of the Acts Interpretation Act by providing that:

In this section:

person includes a body of persons, whether incorporated or not.

63    For the purposes of s 29(1) a “patent request” means “a request for the grant of a patent to a nominated person” (s 3 and sch 1). And “nominated person” means “the person identified in a patent request as the person to whom the patent is to be granted” (s 3 and sch 1).

64    Stopping here for a moment, none of these provisions exclude an inventor from being a non-human artificial intelligence device or system.

65    Section 29A deals with PCT applications. A “PCT application” means “an international application in which Australia is specified as a designated State under Article 4(1)(ii) of the PCT” (s 3 and sch 1). Sections 29A(1), (2) and (4) are deeming provisions, which provide:

(1)    A PCT application is to be treated as a complete application under this Act for a standard patent.

(2)    The description, drawings, graphics, photographs and claims contained in a PCT application are to be treated as a complete specification filed in respect of the application.

(4)    A PCT application is to be taken to comply with the prescribed requirements of this Act that relate to applications for standard patents, but is not to be taken, merely because of subsection (1) or (2), to comply with any other requirements of this Act.

66    Section 29A(5)(a) imposes an obligation on the applicant of a PCT application, relevantly, to “file the prescribed documents and pay the prescribed fees”.

67    If the applicant of a PCT application has satisfied the precondition in s 29A(5), reg 3.2C of the Regulations applies. Regulation 3.2C concerns the formalities for a PCT application. I should set out reg 3.2C in full, which provides:

(1)    This regulation applies to a PCT application if the applicant complied with the requirements of subsection 29A(5) of the Act.

 (2)    The applicant must:

  (a)    provide:

(i)    an address for service in Australia or New Zealand at which a document under the Act or these Regulations may be given to the applicant personally, or to a person nominated as the applicant’s representative; or

(ii)    another address for service in Australia to which it is practicable and reasonable for Australia Post, or a person acting for Australia Post, to deliver mail; or

(iii)    an address for service in New Zealand to which it is practicable and reasonable for a person providing mail delivery services to deliver mail; and

(aa)    provide the name of the inventor of the invention to which the application relates.

(3)    The PCT application must comply with the formalities requirements determined in an instrument under section 229 of the Act.

(4)    The Commissioner may, within one month from the date the applicant complied with subsection 29A(5) of the Act, direct the applicant to do anything necessary to ensure that the requirements mentioned in subregulations (2) and (3) are met.

 (5)    The PCT application lapses if:

(a)    the applicant has been given a direction under subregulation (4); and

(b)    the applicant has not complied with the direction within 2 months of the date of the direction.

(6)    If the PCT application lapses under subregulation (5), the Commissioner must:

(a)    advertise that fact in the Official Journal; and

(b)    notify the applicant that the PCT application has lapsed.

68    Under reg 3.2C(2)(aa), an applicant must provide “the name of the inventor of the invention to which the application relates”. Regulation 3.2C(4) is a discretionary power to the Commissioner to “within one month from the date the applicant complied with subsection 29A(5) of the Act, direct the applicant to do anything necessary to ensure that the requirements mentioned in subregulations (2) and (3) are met”. If such a direction is given and the applicant does not comply with it within two months of the date of the direction, then the PCT application lapses under reg 3.2C(5). This is a “prescribed circumstance” under s 142(2)(f) of the Act for the lapsing of a complete application for a standard patent that is a PCT application.

69    By reg 3.1A(2), for a PCT application, the applicant is taken to be the “nominated person”.

70    I should pause at this point to make the following points.

71    The PCT does not define “inventor”. Article 4 deals with requests for applications to be processed according to the PCT and states (art 4(1)(v)) that such a request shall contain:

[T]he name of and other prescribed data concerning the inventor where the national law of at least one of the designated States requires that these indications be furnished at the time of filing a national application. Otherwise, the said indications may be furnished either in the request or in separate notices addressed to each designated Office whose national law requires the furnishing of the said indications but allows that they be furnished at a time later than that of the filing of a national application.

72    That provision does not preclude an inventor from being a non-human artificial intelligence device or system.

73    Of course, the applicant has to be a legal person. Article 9 provides:

(1)    Any resident or national of a Contracting State may file an international application.

(2)    The Assembly may decide to allow the residents and the nationals of any country party to the Paris Convention for the Protection of Industrial Property which is not party to this Treaty to file international applications.

(3)    The concepts of residence and nationality, and the application of those concepts in cases where there are several applicants or where the applicants are not the same for all the designated States, are defined in the Regulations.

74    I should also note art 27(3), which provides:

Where the applicant, for the purposes of any designated State, is not qualified according to the national law of that State to file a national application because he is not the inventor, the international application may be rejected by the designated Office.

75    The provision of course just takes you back to the Act.

76    Rule 4 of the Regulations under the Patent Cooperation Treaty (Washington, 19 June 1970) (PCT Regulations) contains the following regarding the contents of a request:

4.1 Mandatory and Optional Contents; Signature

(a)    The request shall contain:

(iv)    indications concerning the inventor where the national law of at least one of the designated States requires that the name of the inventor be furnished at the time of filing a national application.

(c)    The request may contain:

(i)    indications concerning the inventor where the national law of none of the designated States requires that the name of the inventor be furnished at the time of filing a national application,

4.5 The Applicant

(a)    The request shall indicate:

(i)    the name,

(ii)    the address, and

(iii)    the nationality and residence

of the applicant or, if there are several applicants, of each of them.

(b)    The applicant’s nationality shall be indicated by the name of the State of which he is a national.

(c)    The applicant’s residence shall be indicated by the name of the State of which he is a resident.

(d)    The request may, for different designated States, indicate different applicants. In such a case, the request shall indicate the applicant or applicants for each designated State or group of designated States.

(e)    Where the applicant is registered with the national Office that is acting as receiving Office, the request may indicate

4.6 The Inventor

(a)    Where Rule 4.1(a)(iv) or (c)(i) applies, the request shall indicate the name and address of the inventor or, if there are several inventors, of each of them.

(b)    If the applicant is the inventor, the request, in lieu of the indication under paragraph (a), shall contain a statement to that effect.

(c)    The request may, for different designated States, indicate different persons as inventors where, in this respect, the requirements of the national laws of the designated States are not the same. In such a case, the request shall contain a separate statement for each designated State or group of States in which a particular person, or the same person, is to be considered the inventor, or in which particular persons, or the same persons, are to be considered the inventors.

77    Rules 4.17 and 51bis deal with national law requirements concerning the identity of the inventor but are otherwise silent. But in any event, if the PCT Regulations were inconsistent with the PCT, the latter would prevail (art 58(5)).

78    Before proceeding further, I should pause to note that it is curious that the Deputy Commissioner has used a subordinate instrument passed in 2015 to read “inventor” as excluding a non-human in the context of a PCT application where neither the Act nor the PCT nor the PCT Regulations so stipulate or require it. I will say something about the history of reg 3.2C(2)(aa) much later. I should also note that for a non-PCT application, whether a request for a standard patent (reg 3.2A) or a request for an innovation patent (reg 3.2B), there is no express requirement to name the inventor, although the applicable pro formas for the patent requests have that field for completion.

79    There are other provisions dealing with entitlement, arising in contexts such as examination, opposition and revocation, that I should mention before proceeding further.

80    Division 1 of Pt 2 of Ch 3 is concerned with “Examination”. Section 45(1)(d) imposes an obligation on the Commissioner, if requested to do so by an applicant, to examine the patent request and complete specification and report on, inter alia, such matters (if any) as are prescribed. Relevantly, two of the matters prescribed by reg 3.18(2) are whether, to the best of the Commissioner’s knowledge, the request and specification comply with s 15 (reg 3.18(2)(a)(i)), and for a PCT application, whether the requirements of subreg 3.2C(2) are met (reg 3.18(2)(f)).

81    Chapter 5 concerns “Opposition to grant of standard patent” and the first ground of opposition to the grant of a standard patent for which s 59 provides is that the nominated person is “not entitled to a grant of a patent for the invention” (s 59(a)(i)).

82    Part 4 of Ch 12 covers “Surrender and revocation of patents”. By s 138(3)(a), a ground of revocation is that “the patentee is not entitled to the patent”.

83    I will return to discuss ss 7, 18, 40, 64, 101B, 101E, 172, 182 and 185 later.

The Deputy Commissioner’s reasons

84    The Deputy Commissioner, exercising powers under ss 208(2) and (3) as and for the Commissioner rather than separately as a delegate, identified the issue as whether an artificial intelligence machine is capable of being an inventor for the purposes of the Act and Regulations (at [4]).

85    He considered that as “inventor” is not defined, it bears its ordinary English meaning, referring to Atlantis Corp Pty Ltd v Schindler (1997) 39 IPR 29 at 54. He then said (at [12]):

For the purposes of that decision it was not necessary to go further. Any standard dictionary shows that the traditional meaning of inventor is a person who invents. At the time that the Act came into operation (in 1991) there would have been no doubt that inventors were natural persons, and machines were tools that could be used by inventors. However, it is now well known that machines can do far more than this, and it is reasonable to argue that artificial intelligence machines might be capable of being inventors. I have no evidence whether the ordinary meaning of “inventor”, assessed at the present day, can include a machine. But if this were the ordinary meaning, would this be consistent with the other provisions of the Act?

86    He went on to address whether, if it were so, it was consistent with the Act. He identified s 15 as the most important provision, from which he drew the proposition that “the entitlement of the patentee flows from the inventor, and absent devolution the inventor will become the patentee” (at [20]). From JMVB Enterprises Pty Ltd v Camoflag Pty Ltd (2006) 154 FCR 348 at [69] to [72] per Emmett, Stone and Bennett JJ, he purported to synthesise the following propositions (at [22]):

… First, the inventor refers to whoever devises the invention. A person who did not devise the invention but acquired knowledge from the inventor is not themselves an inventor. Second, a person can derive title from the inventor in many ways. Third, it is essential that the derivation of title must be from the inventor. JMVB also suggests that section 15(1)(c) incorporates the traditional category of communicatees as a person who derives title from the inventor.

(Emphasis in original.)

87    Returning to the question of whether s 15(1) was “workable in the situation where the inventor is an artificial intelligence machine” (at [26]), the Deputy Commissioner considered, starting with s 15(1)(b), that “[i]t is an uncontroversial observation that the law does not presently recognise the capacity of an artificial intelligence machine to assign property” (at [26]). Further, with regard to s 15(1)(c), he could not see “how the owner of a machine can be regarded as the communicatee of artificial intelligence” as, even if information could have been communicated for the purpose of applying for a patent, the artificial intelligence machine cannot have a beneficial interest in any property (at [27] and [28]). And in the context of s 15(1)(c), he also rejected as inapplicable the doctrines of accession or first possession as ownership “automatically vests” rather than “by conceptually moving title ‘from’ the artificial intelligence machine to the owner of the machine” (at [30]). Further, he stated that s 15(1)(d) was “clearly not capable of including the owner of an artificial intelligence machine” (at [31]).

88    The Deputy Commissioner’s view was that s 15(1) “is clear, but not capable of sensible operation in the situation where an inventor would be an artificial intelligence machine as it is not possible to identify a person who could be granted a patent” (at [32], see also [34]). In that light, he did not find it necessary to consider the object clause (at [32]).

89    Before me, the Commissioner’s counsel sought to support the Deputy Commissioner’s reasoning. It is convenient at this point to elaborate on the Commissioner’s reinforcing arguments.

The Commissioner’s arguments

90    The Commissioner accepted that the word “inventor” in reg 3.2C(2)(aa) has the same meaning as it does in s 15(1)(a). This is not to elide considerations of entitlement under s 15 with the assessment of validity of an application for a patent. Rather, according to the Commissioner, it is to recognise the congruence of meaning stemming from the use of the same word in circumstances where the primary person to whom the patent will be granted is the inventor. All other people derive title to the patent, and the invention claimed therein, via the inventor (Stack v Davies Shephard Pty Ltd (2001) 108 FCR 422 at [21] per Whitlam, Sundberg and Dowsett JJ and JMVB at [71] per Emmett, Stone and Bennett JJ).

91    I will return to what the Commissioner seeks to finesse from Stack and JMVB later, although it should be obvious that they did not deal with the issue I am now considering.

92    Now a valid application for a patent may be made by a person even though that person is not ultimately eligible for the grant of the patent for which the application has been made. But the Commissioner says that this is no answer to whether the name of the inventor has been provided for a PCT application in accordance with reg 3.2C(2)(aa).

93    As to Dr Thaler’s position that the provision of “DABUS” satisfied this requirement so that the application should proceed to examination, thereby justifying the grant of relief on this application, the Commissioner says that this suggestion is misplaced.

94    First, it assumes that the mere supply of an identifier signifying the claimed inventor is sufficient. But it is said that such an approach gives no substantive content to the word “inventor”. Put another way, it begs the question: what does “inventor” mean in reg 3.2C(2)(aa)?

95    Second and in any event, it is said that issues of potential futility for the purposes of relief on this application would arise given, as the Deputy Commissioner identified (at [34]), the same issue would need to be considered under reg 3.18(2)(a)(i).

96    Further, the Commissioner made the following submissions concerning the meaning of “inventor”.

97    The Commissioner says that in JMVB it was held (at [71] and [72]) that:

… There is no warrant for reading the word inventor as meaning anything different from the person who is responsible for making the invention, namely, the person who makes or devises the process or product. The word bears its ordinary English meaning (Atlantis Corp Pty Ltd v Schindler (1997) 39 IPR 29 at 54 per Wilcox and Lindgren JJ). …

Inventor in s 15(1)(a) refers to the person who makes or devises the invention, wherever the invention may be made. …

98    The Commissioner says that the ordinary meaning of “inventor” is inherently human: “someone who invents, especially one who devises some new process, appliance, machine or article; someone who makes inventions” (see Macquarie Dictionary (online)). This accords with s 2C(1) of the Acts Interpretation Act. It is said that the human quality of ingenuity that resides in the notion of invention as a matter of ordinary language resonates through the definitions of the cognate words in the dictionaries that explicate the meaning of “inventor”, such as devise, contrive or contrivance, conceive or conception.

99    Further, the Commissioner says that the characterisation of “inventor” as an agent noun does not alter this ordinary meaning. The distinction between the personal and the mechanical is one mentioned in Fowler’s Dictionary of Modern Usage. After explaining that “–or is the Latin agent-noun ending corresponding to English –er … English verbs derived from the supine stem of Latin ones – i.e. especially most verbs in -ate, but also many others such as act, credit, invent, oppress, possess, prosecute, protect – usually prefer this Latin ending to the English one in –er”, it is said that “[s]ome verbs generate alternative forms, generally preferring –er for the personal and –or for the mechanical agent (e.g. adapt, convey, distribute, resist) or –or for the lawyers and –er for ordinary use” (see Fowler’s Dictionary of Modern Usage (3rd rev ed, 1998) ‘–or’).

100    The Commissioner says that the ordinary meaning of “inventor” does not signify a non-human actor. It is said that this may be contrasted, for example, with “computer”, originally a person and more recently an apparatus; see the definition of “computer” (def 1 and 2): “1. A person who makes calculations or computations; a calculator, a reckoner; spec. a person employed to make calculations in an observatory, in surveying, etc. Now chiefly historical. 2. A device or machine for performing or facilitating calculation.” in the Oxford English Dictionary (online); see also the definition of “computer” (def 1 and 2): “1. someone who computes. 2. an apparatus for performing mathematical computations electronically according to a series of stored instructions called a program.” in the Macquarie Dictionary (online).

101    The Commissioner says that the ordinary signification of “inventor” entailing a human person gives effect to the s 2A object, which is to provide a patent system that “balances over time the interests of producers, owners and users of technology and the public”. The Commissioner says that the construction adopted by the Deputy Commissioner does not fail to recognise technological innovation. Rather it gives effect to the current statutory purpose. The Commissioner says that it may be that in the future Parliament considers that artificial intelligence machines can invent for the purpose of the patent system, and that their owners should be rewarded for those machines’ inventions, but such a scenario is not reflected in the current statutory scheme.

102    The Commissioner also asserts that this conclusion mirrors the position in copyright, in which it is established that those intellectual property rights originate by the application of human intellectual effort. I should say immediately that I did not find the Commissioner’s refuge in the copyright analogy to be of any assistance in the present context.

103    Further, the Commissioner says that the construction for which Dr Thaler contends is strained and renders the scheme ineffective because it would not yield the grant of a patent. It is said that a construction that fails to achieve the grant of a patent is a cogent indication that it is not to be preferred. Reference was also made to University of British Columbia v Conor Medsystems Inc (2006) 155 FCR 391 at [38] and [39] per Emmett J. I will discuss this case later.

104    Further, the Commissioner says that an artificial intelligence machine cannot own a patent, so s 15(1)(a) does not apply; nor can it assign one under s 15(1)(b). This is so even accepting that s 15 specifies no formalities as being necessary (University of British Columbia at [37] per Emmett J referring to Speedy Gantry Hire Pty Ltd v Preston Erection Pty Ltd (1998) 40 IPR 543 at 550 per Emmett J).

105    The Commissioner says that s 15(1)(b) necessarily presupposes an “inventor”, whether as the origin of the invention giving rise to title under the general law prior to grant or upon grant, consistently with ss 15(1)(a) and (c). The Commissioner says that in the present case, there is no relevant juridical act, and no requisite legal relationship beyond possession, that establishes that upon grant Dr Thaler would be entitled to an assignment.

106    Further, the Commissioner says, in relation to s 15(1)(c), that it cannot be said that Dr Thaler “derives title to the invention from the inventor”. The Commissioner says that s 15(1)(c) requires the existence of a title that moves from the inventor to the other person. On the facts, it is said that this does not exist here.

Dr Thaler’s arguments

107    Dr Thaler points out that the definition of “invention” in the Act includes an “alleged invention”. As I have said, the Act defines invention to mean “any manner of new manufacture the subject of letters patent and grant of privilege within section 6 of the Statute of Monopolies, and includes an alleged invention” (s 3 and sch 1). Moreover, any person may apply for a patent for an alleged invention.

108    Dr Thaler says that he is a person who through his patent attorneys filed the prescribed forms. In those forms, Dr Thaler provided the name of the inventor, consistent with the requirements of reg 3.2C(2)(aa).

109    He points out that the Act and Regulations set out a scheme for the Commissioner to consider, through an examination process, whether, inter alia, the request and specification comply with s 15.

110    Section 45(1) provides that if an applicant asks for an examination of a patent request and complete specification relating to an application for a standard patent, the Commissioner must examine the request and specification and report on certain matters, including such other matters (if any) as are prescribed. Regulation 3.18(2)(a)(i) provides that, for the purposes of s 45(1)(d), whether the request and specification comply with s 15 is a prescribed matter.

111    Accordingly, Dr Thaler says that the determination by the Deputy Commissioner that the application has lapsed because Dr Thaler has not complied with the direction under reg 3.2C(4) was in error.

112    Further, Dr Thaler contends that an “inventor” under the Act can be an artificial intelligence system or device.

113    Further, Dr Thaler says that it is accepted that DABUS is not capable of owning or assigning a patent. Nonetheless, Dr Thaler says that he falls within s 15, on two alternative bases.

114    First, Dr Thaler says that he falls within s 15(1)(b). He submits that s 15(1)(b) has been applied to employee inventions, where there is no contract in place which sets out the employer’s entitlements.

115    Second, and in any event, Dr Thaler submits that he falls within s 15(1)(c) because he has derived title from the inventor, namely, DABUS. Even though DABUS, as an artificial intelligence system, is not a legal person and cannot legally assign the invention, he says that it does not follow that it is not possible to derive title from DABUS.

Analysis

116    It is convenient to begin my analysis by making some general observations, particularly concerning s 2A and also inventive step. I will then turn to dictionary definitions. And with all of this out of the way I will then analyse s 15 more directly.

General observations

117    Let me begin with some general observations.

118    First, there is no specific provision in the Act that expressly refutes the proposition that an artificial intelligence system can be an inventor.

119    Second, there is no specific aspect of patent law, unlike copyright law involving the requirement for a human author or the existence of moral rights, that would drive a construction of the Act as excluding non-human inventors.

120    Third, as the word “inventor” is not defined in the Act or the Regulations, it has its ordinary meaning. In this respect then, the word “inventor” is an agent noun. In agent nouns, the suffix “or” or “er” indicates that the noun describes the agent that does the act referred to by the verb to which the suffix is attached. “Computer”, “controller”, “regulator”, “distributor”, “collector”, “lawnmower” and “dishwasher” are all agent nouns. As each example demonstrates, the agent can be a person or a thing. Accordingly, if an artificial intelligence system is the agent which invents, it can be described as an “inventor”.

121    Fourth, in considering the scheme of the Act, it has been said that “a widening conception of ‘manner of manufacture’ is a necessary feature of the development of patent law in the twentieth and twenty-first centuries as scientific discoveries inspire new technologies” (D’Arcy v Myriad Genetics Inc (2015) 258 CLR 334 at [18] per French CJ, Kiefel, Bell and Keane JJ). I see no reason why the concept of “inventor” should not also be seen in an analogously flexible and evolutionary way. After all, the expressions “manner of [new] manufacture” and “inventor” derive from the 21 Ja 1 c 3 (Statute of Monopolies) 1623 (Imp) s 6. There is a synergy if not a symmetry in both being flexibly treated. Indeed, it makes little sense to be flexible about one and not the other. Tension is created if you give flexibility to “manner of manufacture” and then restrict “inventor”. You would be recognising an otherwise patentable invention and then saying that as there is no inventor it cannot be patented.

122    Fifth, the approach to the construction of the Act should be consistent with the recently inserted object clause. The object clause in s 2A is expressed in the following terms:

The object of this Act is to provide a patent system in Australia that promotes economic wellbeing through technological innovation and the transfer and dissemination of technology. In doing so, the patent system balances over time the interests of producers, owners and users of technology and the public.

123    Now the Deputy Commissioner noted the existence of the object clause, but incorrectly expressed the view that he should have regard to it only where there was ambiguity. But the object clause should always be considered when construing the legislation whether or not any ambiguity is identified.

124    In my view it is consistent with the object of the Act to construe the term “inventor” in a manner that promotes technological innovation and the publication and dissemination of such innovation by rewarding it, irrespective of whether the innovation is made by a human or not.

125    Consistently with s 2A, computer inventorship would incentivise the development by computer scientists of creative machines, and also the development by others of the facilitation and use of the output of such machines, leading to new scientific advantages.

126    Further, machines have been autonomously or semi-autonomously generating patentable results for some time now. So, and consistently with s 2A, you are simply recognising the reality by according artificial intelligence the label of “inventor”. I should say something further about the autonomous operation of an artificial intelligence system. Let me start by posing some questions and making some assumptions.

127    Who sets the goal for the system? The human programmer or operator? Or does the system set and define its own goal? Let the latter be assumed. Further, even if the human programmer or operator sets the goal, does the system have free choice in choosing between various options and pathways in order to achieve the goal? Let that freedom also be assumed. Further, who provides or selects the input data? Let it be assumed that the system can trawl for and select its own data. Further, the larger the choice for the system in terms of the algorithms and iterations developed for the artificial neural networks and their interaction, the more autonomous the system. Let it be assumed that one is dealing with a choice of the type that DABUS has in the sense that I have previously described.

128    Making all of these assumptions, can it seriously be said that the system is just a brute force computational tool? Can it seriously be said that the system just manifests automation rather than autonomy? Admittedly, one should not confuse the two concepts as is sometimes the case for certain types of machine learning (Daria Kim, ‘“AI-Generated Inventions”: Time to Get the Record Straight?’ (2020) 69(5) GRUR International 443, 446). But it would seem to me that such a system could at least be described as semi-autonomous if not autonomous. And it would likely be fair to describe its output in such terms at least.

129    Let me make another point that is relevant to s 2A. Not recognising the reality could produce inefficiency if not logical difficulties, which would be the antithesis of the s 2A object. As Abbott explained the matter (Ryan Abbott, ‘I Think, Therefore I Invent: Creative Computers and the Future of Patent Law’ (2016) 57(4) Boston College Law Review 1079, 1103 to 1104):

Preventing patents on computational inventions by prohibiting computer inventors, or allowing such patents only by permitting humans who have discovered the work of creative machines to be inventors, is not an optimal system. In the latter case, AI may be functioning more or less independently, and it is only sometimes the case that substantial insight is needed to identify and understand a computational invention. Imagine that Person C instructs their AI to develop an iPhone battery with twice the standard battery life and gives it some publically available battery schematics. The AI could produce results in the form of a report titled “Design for Improved iPhone Battery” complete with schematics and potentially even pre-formatted as a patent application. It seems inefficient and unfair to reward C for recognizing the AI’s invention when C has not contributed significantly to the innovative process.

Such a system might also create logistical problems. If C had created an improved iPhone battery as a human inventor, C would be its inventor regardless of whether anyone subsequently understood or recognized the invention. If C instructed C’s AI to develop an improved iPhone battery, the first person to notice and appreciate the AI’s result could become its inventor (and prevent C from being an inventor). One could imagine this creating a host of problems: the first person to recognize a patentable result might be an intern at a large research corporation or a visitor in someone’s home. A large number of individuals might also concurrently recognize a result if access to an AI is widespread.

130    Further, as Abbott explains, recognising computer inventors and patents on computational inventions could promote disclosure and commercialisation consistently with the s 2A object. Without the ability to obtain patent protection, owners of creative computers might choose to protect patentable inventions as trade secrets without any public disclosure.

131    Let me make a more general point. If the output of an artificial intelligence system is said to be the invention, who is the inventor? And if a human is required, who? The programmer? The owner? The operator? The trainer? The person who provided input data? All of the above? None of the above? In my view, in some cases it may be none of the above. In some cases, the better analysis, which is consistent with the s 2A object, is to say that the system itself is the inventor. That would reflect the reality. And you would avoid the uncertainty that would otherwise arise. And indeed that may be the case if the unit embodying the artificial intelligence has its own autonomy. What if it is free to trawl the internet to obtain its own input or training data? What about a robot operating independently in a public space, having its own senses, learning from the environment, and making its own decisions? And what about the more exotic, such as a mobile unit on Mars left to its own devices and not seeking instructions from Earth? Take the examples given by Fraser (Erica Fraser, ‘Computers as Inventors – Legal and Policy Implications of Artificial Intelligence on Patent Law’ (2016) 13(3) SCRIPTed 318 at 324):

Comparatively, if we imagine contexts in which autonomous invention-generating computers would be particularly useful, from deep sea or outer space exploration where humans cannot practically perceive and solve problems in situ, to do-it-yourself problem-solving by non-technical individuals (especially armed with 3-D printers), it is arguable that the solutions produced would be patentable if invented by a human. For example, consider a situation whereby an AI-enabled machine is involved in deep space exploration at such distances that the associated lag in radio communications would make human intervention in the mission impractical. If an unforeseen local effect were encountered that prevented communication using fitted technology, it could threaten the mission. Therefore, autonomous invention-generation technology could enable the machine to engineer a novel antenna and manufacture it on site with an on-board 3D printer.

132    Further, if only an artificial intelligence system could be said to have created the output, but you only permit of human inventors, you may not have an inventor. Hence one may not be able to patent the invention. So you may have the case where artificial intelligence has created an invention in terms of the output, but there is no inventor and it cannot at that time be patented. But then, say, as Abbott touched on, along comes a human, observes the output for what it is, “discovers” the output and correspondingly the invention and then asserts that he is the inventor because of his discovery. That would be an odd outcome to say the least. But even putting that example aside, it is generally quite undesirable to preclude a class of otherwise patentable inventions from patentability on the basis of an exclusion that is not apparent from the express words of the Act. Indeed, that would be the antithesis of promoting innovation.

133    I should make another point that may be relevant to s 2A. The spectre has been raised that if one permits of computer-generated inventions and computer-generated patent applications, the patent system will reach a breaking point. Many more novel inventions are likely to be created because algorithms generate more randomness and variability. And how will the volume be processed? Will such a volume have a chilling effect on innovation by others? There are a number of ways to dispose of these phantoms. One requires a legal person to make a patent application. So, a person will have to have ultimate control over any computer-generated application. A computer cannot be an applicant. Further, only a legal person can be granted a patent. So, only a person will have title and control over a patented invention.

134    Generally, the outcome of the Commissioner’s position is incompatible with s 2A. The Commissioner accepts that Dr Thaler is not the inventor, and indeed in analogous circumstances concerning the output of an artificial intelligence system would seem to suggest that the person owning or controlling the machine would not be the inventor. But the product or method that is described or detailed in such an output could involve an inventive step as that concept is used in the Act. But on the Commissioner’s logic there would be no inventor. Accordingly, it would follow on the Commissioner’s reasoning that you could not make a PCT application for the invention, as you would not satisfy reg 3.2C(2)(aa). This would be a strange result, and at odds with the object in s 2A. Let me move to the next topic.

135    Sixth, at this point I should say something about inventive step, which is what the Act is really concerned with. The focus of the Act is not really on the inventor at all, a matter on which I will say something further later in these reasons when I turn to other statutory provisions.

136    Section 18(1)(b)(ii) provides that “an invention is a patentable invention for the purposes of a standard patent if the invention, so far as claimed in any claim … when compared with the prior art base as it existed before the priority date of that claim … involves an inventive step.”

137    Section 7(2) provides that:

For the purposes of this Act, an invention is to be taken to involve an inventive step when compared with the prior art base unless the invention would have been obvious to a person skilled in the relevant art in the light of the common general knowledge as it existed (whether in or out of the patent area) before the priority date of the relevant claim, whether that knowledge is considered separately or together with the information mentioned in subsection (3).

138    There is nothing in ss 7(2) and 18(1) requiring an “inventor” as such, let alone only an inventor who is a legal person. There is no ground of invalidity based upon the absence of any “inventor”, let alone the absence of an inventor who is a legal person. Further, there is nothing in s 40 that makes “inventor”, let alone an inventor who is a legal person, of any relevance.

139    Is there something in the definitional provision of s 7(2) concerning inventive step that requires a human inventor? In my view, no. Indeed the Commissioner before me and the Deputy Commissioner below never suggested otherwise. They were correct not to float such an idea.

140    First, s 7(2) refers to a hypothetical construct of “a person skilled in the relevant art in the light of the common general knowledge” at the relevant date. It is not focusing on the thought processes of an actual human, let alone the subjective thought processes of a human inventor.

141    Is an actual mental state or act of an inventor required for an “invention” or “inventive step”? In my view there is no such express or necessarily implied requirement under the Act. Moreover, there are considerable difficulties with such a concept.

142    The Act focuses on inventive step, which uses a hypothetical and objective construct. One is not at all concerned with the inventor’s mental processes as such.

143    Further, what is meant by “mental act” or “thought”? If you simply define this as something engaged in within the human cerebral cortex, not only have you not defined any complete cognitive content but you have conveniently defined away the problem at hand. Further, as I have said, it is not necessary for my purposes to linger on the meaning of “intelligence” in the phrase “artificial intelligence” or otherwise.

144    Second, whether the inventive step is produced by a human or a machine is simply irrelevant to the inquiry in s 7(2). One takes the inventive step, howsoever it arose, and compares it with the prior art base and common general knowledge. Moreover, the proviso “unless the invention would have been obvious to a person skilled in the relevant art …” is not about the inventor or his or its characteristics or status whatsoever.

145    Third, perhaps it may be said that the threshold for inventiveness might rise if the “person skilled in the relevant art” can be taken, in the future, to be assisted by or to have access to artificial intelligence, or is taken as part of the common general knowledge to have knowledge of developments produced by artificial intelligence in the relevant field. But that is a separate question. And even if such matters be assumed, that does not entail that one can only have human inventors. Indeed, such a scenario may rather be said to tend in the reverse direction. Say that the person possessing the skills that are expected from the average person in the relevant field is taken to have access to and use artificial intelligence. Say that such a person is taken to be able to perform routine experimentation and research having access to and using artificial intelligence. And say that such a person’s possession of common general knowledge is assessed with such a person being taken to have access to the search functions, filtering and assimilation of artificial intelligence tools. In such a case, as we go forward it is more likely than not that any inventive step will be generated by artificial intelligence rather than by human intelligence. But that is not a difficulty in terms of efficiency and incentives, if it is recognised, as I think it must be, that the Act permits of inventors which are artificial intelligence systems or devices.

146    Let me now turn to the Commissioner’s case on ordinary meaning derived from dictionary definitions.

Dictionary definitions

147    The Commissioner’s reliance on dictionary definitions is problematic to say the least.

148    First, there are competing dictionary definitions of “inventor” that merely define the term as the noun corresponding to the verb “invent”, without describing the agent which invents. So, which dictionary do you use? And what meaning, and in what context? Mahoney JA in Provincial Insurance Australia Pty Ltd v Consolidated Wood Products Pty Ltd (1991) 25 NSWLR 541 at 560 to 562 nicely laid out the problem (see also House of Peace Pty Ltd v Bankstown City Council (2000) 48 NSWLR 498 at [25] to [29] per Mason P). Apparently Lord Wilberforce had a sensible solution to the problem of using dictionaries that I am tempted to emulate.

149    Second, in terms of agent nouns, the word “inventor”, like “computer”, is one that might originally have been used only to describe persons, when only humans could make inventions, but can now aptly be used to describe machines which can carry out the same function.

150    Third, the Commissioner incorrectly treats dictionary definitions as being exclusive legal definitions, when the nature of dictionary definitions is inclusive and exemplary. Most words have multiple dictionary definitions, meaning that any one definition is an example of usage rather than an exhaustive statement of meaning. No one or more instantiations can be controlling of meaning.

151    Fourth, to say that someone who invents is an inventor does not mean that something which invents is not also correctly denoted by “inventor”. Given the nature of dictionary definitions, it is a fallacy to say that because one thing is defined as being within the scope of a term, other things are not.

152    Fifth, dictionaries are by their nature developed from historical usage. Accordingly, there would be no call for a definition directed to “something that invents” until that became possible. Similarly, prior to the development of the electronic computer, computations were made only by humans and the term “computer” referred to a person who made computations. But the term was correctly applied to a thing that made computations when such machines became available; this is now the dominant usage.

153    Generally, dictionaries are not a substitute for statutory interpretation, and the application of a dictionary definition in place of the words in the statute can lead to error by introducing requirements not contained in the statutory text.

154    Finally, it is somewhat curious for the Commissioner to focus on the distinction between human and non-human. After all, the Act also includes non-humans as “persons”. Expressions used to denote persons include a body politic or corporation as well as an individual.

155    Let me now turn to s 15 as this seems to have been the high point for the Commissioner’s case.

Section 15

156    Considerable reliance has been placed by the Commissioner and Deputy Commissioner on s 15, although this is curious in terms of timing given that Dr Thaler’s application is nowhere near the stage of grant. Who can say which limb of s 15(1) may or may not be satisfied when the time arises?

157    Section 15 is directed to who may be granted a patent. But at the formalities stage of the present application, the only requirement is that the inventor be named. And it is not even necessary now to consider whether the named entity is an actual inventor.

158    Now patents can only be granted to persons. DABUS cannot be granted a patent. Further, an artificial intelligence system cannot be a patent applicant, as it is not relevantly a “person”. And nor can an artificial intelligence system own or legally assign an invention. But does this all mean that DABUS cannot be an inventor? In my view such a conclusion is not entailed by s 15.

159    Section 15 contemplates that four classes of person may be granted a patent. But I am dealing with a separate point. I am not looking at who can be a grantee. Rather, I am looking at who or what can be an inventor. But in any event, let me discuss these categories.

160    First, one has under s 15(1)(a) the inventor who is a person. But that limb is not triggered in the present case because DABUS is not a person. Section 15(1)(a) does however demonstrate that the concept of a “person” is different to an “inventor”. Moreover, it is a fallacy to argue from s 15(1)(a) that a non-human, indeed a non-person, cannot be an inventor. It could be, but it could not be granted a patent.

161    Second, one has under s 15(1)(b) a person who would, on the grant of a patent for the invention, be entitled to have the patent assigned to them. Such an entitlement and any assignment could arise by agreement or by conduct or informally. No instrument need be involved. Moreover, such an entitlement could arise by operation of law. I will return to this limb in a moment as interesting questions arise.

162    Third, one has under s 15(1)(c) a person who derives title to the invention from the inventor or a person mentioned in s 15(1)(b). Clearly, the concept of derivation is broad and is not limited to assignments or any transfer of title as such. I will also return to this limb later.

163    Fourth, one has under s 15(1)(d) a person who is the legal representative of a deceased person in one of the earlier named classes. I do not need to discuss this limb further.

164    Now I should note at the outset that the Deputy Commissioner (at [20]) said that “the entitlement of the patentee flows from the inventor, and absent devolution the inventor will become the patentee”. But I do not consider this to be strictly correct if it be suggested that under s 15(1) there must always first be shown to be a vesting of title in the inventor.

165    Let me now discuss ss 15(1)(b) and (c) in some detail. But I should say at the outset that in my view Dr Thaler is, in principle, capable of being entitled to be granted a patent in relation to an invention made by an artificial intelligence system such as DABUS under at least s 15(1)(c), and possibly s 15(1)(b).

166    First, in my view Dr Thaler could potentially fall within s 15(1)(b).

167    Dr Thaler is the owner, programmer and operator of DABUS, the artificial intelligence system that made the invention; in that sense the invention was made for him. On established principles of property law, he is the owner of the invention. In that respect, the ownership of the work of the artificial intelligence system is analogous to ownership of the progeny of animals or of the fruit or crops produced by the labour and expense of the occupier of the land (fructus industriales), which are treated as chattels with an existence separate from the land.

168    If another person were to take the invention without his consent and apply for a patent, Dr Thaler would be entitled, as the owner of the invention, to have the patent assigned to him from that other person. In other words, s 15(1)(b) does not necessarily require any assignment from the inventor at all. Dr Thaler could be entitled to an assignment from the miscreant that stole it.

169    More generally, on one view s 15(1)(b) does not require the existence of an inventor at all: it requires no more than that the applicant is entitled to have a patent assigned to him, in the event that there is a grant. In other words it is dealing with a future conditional.

170    Now the Commissioner suggests that s 15(1)(b), which posits a hypothetical situation, applies only to the situation where a person is entitled to assignment of a patent from a human inventor, because an artificial intelligence machine cannot assign a patent. Of course, that is one scenario where s 15(1)(b) may apply. Inventors are usually employees. It is clear that s 15(1)(b) encompasses an employer who may take the fruits of an employee’s labour, even in the absence of an express contractual provision, where such labour was in the course of the employee’s duties. If an employee makes an invention which it falls within his duty to make, he holds his interest as trustee for the employer. But that is not the only scenario.

171    Section 15(1)(b) also applies to the situation where a person has contracted with an employer to own an invention made by the employer’s employees. In that event, the assignment referred to by s 15(1)(b) is from the employer to the person entitled by contract to the invention. But the inventor is not a party to such an assignment.

172    Further, and as I have touched on, s 15(1)(b) can apply where a third party misappropriates an invention. In that case, the inventor’s employer could bring an action seeking an equitable assignment from the third party; and such an assignment referred to in s 15(1)(b) would be from the third party to the inventor’s employer. Again, the inventor would not be a party to the assignment.

173    Further, on its face, s 15(1)(b) could also apply in circumstances where an invention made by an artificial intelligence system, rather than by a human inventor, was the subject of contract, or had been misappropriated, giving rise in either case to a legal or equitable right of assignment.

174    Further, I also note that s 113 similarly does not refer to the inventor. It provides:

(1)    Where, before a patent is granted, a person would, if the patent were then granted, be entitled under an assignment or agreement, or by operation of law, to:

(a)    the patent or an interest in it; or

(b)    an undivided share in the patent or in such an interest;

the Commissioner may, on a request made by the person in accordance with the regulations, direct that the application proceed in the name of the person, or in the name of the person and the applicant or the other joint applicant or applicants, as the case requires.

(2)    Where the Commissioner gives a direction:

(a)    the person is to be taken to be the applicant, or a joint applicant, as the case requires; and

(b)    the patent request is to be taken to have been amended so as to request the grant of a patent to the person, either alone or as a joint patentee, as the case requires.

175    Finally, if the Deputy Commissioner (at [26]) is suggesting that s 15(1)(b) is limited to the case only of an assignment from the inventor or pre-supposes an earlier vesting of title in the inventor, I consider him to be incorrect. Section 15(1)(b) does not require this expressly or by necessary implication. But of course if you have such a scenario then s 15(1)(b) will embrace it.

176    In summary, it cannot be said now that Dr Thaler could not bring himself within s 15(1)(b) when the time arises.

177    Second, in my view Dr Thaler prima facie falls within s 15(1)(c) because he has derived title to the invention from the inventor, DABUS.

178    Now whilst DABUS, as an artificial intelligence system, is not a legal person and cannot legally assign the invention, it does not follow that it is not possible to derive title from DABUS. The language of s 15(1)(c) recognises that the rights of a person who derives title to the invention from an inventor extend beyond assignments to encompass other means by which an interest may be conferred.

179    Now although the Explanatory Memorandum, Patents Bill 1990 (Cth) at [27] and the Industrial Property Advisory Committee in its report on “Patents, Innovation and Competition in Australia” prepared for the Minister for Science and Technology on 29 August 1984 provide no guidance on the meaning of the word “derives”, it was accepted by the parties before me that its ordinary meaning includes to receive or obtain from a source or origin; to get, gain or obtain; and to emanate or arise from.

180    In this context, I also note that the word has been given its ordinary meaning in the context of revenue legislation; see Federal Commissioner of Taxation v Clarke (1927) 40 CLR 246 at 261 per Isaacs ACJ, where it was taken to mean “obtained”, “got” or “acquired”, and also Brent v Commissioner of Taxation (1971) 125 CLR 418 at 427 to 428, where it was taken by Gibbs J to mean “to draw, fetch, get, gain, obtain (a thing from a source)”.

181    Now at this point I should say something about JMVB and Stack given the Commissioner’s reliance thereon.

182    First, JMVB did not concern whether a non-human artificial intelligence system or device could be an “inventor”. Its paradigm and context were simply about persons. So, the reference to “person” in [71] must be seen in that context. Further, statements such as that an “inventor” means the person responsible for making the invention cannot be taken to be exhaustive of the scope of “inventor”.

183    Further, the context of JMVB was whether the first importer or communicatee of an invention was an “inventor” or derived title as contemplated by s 15(1)(c). So, it was said (at [72]):

Inventor in s 15(1)(a) refers to the person who makes or devises the invention, wherever the invention may be made. It does not include a person who is not the inventor but who first imports the invention into Australia or to whom the invention is first communicated in Australia. To the extent that communication or importing of the invention gives rise to an interest on the part of the communicatee or importer, such that the communicatee or importer can say that the communicatee or importer derives title to the invention from the inventor, s 15(1) may be attracted. However, the language of s 15(1) is limited to a person who derives title to the invention from the inventor. Mere communication or importation, without anything further, is not sufficient to give rise to a title on the part of the communicatee or importer. That, of itself, is not sufficient to make Mr Van Baardwyk the inventor. The fresh ground of appeal has no substance.

(Emphasis in original.)

184    Such a scenario has little to do with my context in terms of the scope of “inventor”. Moreover, I should also say in passing that it simply did not address or take into account the language of s 15(1)(b).

185    The case did, however, make it plain that if the party claiming an interest has an interest in the invention, even if that interest has not been conferred by means of an assignment, that party can be said to derive title to the invention from the inventor. This is also consistent with University of British Columbia at [37] to [39], where Emmett J distinguished between assignment and entitlement under the general law.

186    Second, as for Stack, particularly at [21], it did not concern the issue that I am addressing.

187    Let me make a broader point concerning s 15(1)(c) and the concept of possession.

188    Clearly, proprietary rights may subsist in an invention before any application for a patent is made. Further, an invention is capable of being possessed and ownership may arise from possession.

189    In my view, Dr Thaler, as the owner and controller of DABUS, would own any inventions made by DABUS, when they came into his possession. In this case, Dr Thaler apparently obtained possession of the invention through and from DABUS. And as a consequence of his possession of the invention, combined with his ownership and control of DABUS, he prima facie obtained title to the invention. By deriving possession of the invention from DABUS, Dr Thaler prima facie derived title. In this respect, title can be derived from the inventor notwithstanding that it vests ab initio other than in the inventor. That is, there is no need for the inventor ever to have owned the invention, and there is no need for title to be derived by an assignment.

190    Inventions are one of the classes of intangible assets that have long been regarded as being capable of physical possession, such possession giving rise to ownership. In this respect, the previous form of application for an invention required the applicant to declare: “I further declare that I am in possession of the said invention”, which type of declaration was considered in Martin v Scribal Pty Ltd (1954) 92 CLR 17 at 67 and 68 per Dixon CJ; the Privy Council in Martin v Scribal Pty Ltd (1956) 95 CLR 213 at 222 reached a different view on a construction point. Further, as to “possessing” the invention, see also Dunlop v Cooper (1908) 7 CLR 146 at 155 per Griffith CJ and Tate v Haskins (1935) 53 CLR 594 at 607 per Rich, Dixon, Evatt and McTiernan JJ.

191    Indeed, the notion that possession of an invention is the foundation of ownership is consistent with older commentary related to intellectual property (see William Blackstone Esq., Commentaries on the Laws of England (Clarendon Press, 1766) bk 2, 405 to 407). Moreover, the present case is not a case of mere possession. It is coupled with the ownership of the computer on which DABUS operates and the fact that the invention has not been previously published.

192    Further, original or exclusive possession can found title, without the need for any assignment. And possessory title is as good as an absolute title of ownership as against all the world except the true owner (Russell v Wilson (1923) 33 CLR 538 at 546 per Isaacs and Rich JJ).

193    In my view on the present material there is a prima facie basis for saying that Dr Thaler is a person who derives title from the inventor, DABUS, by reason of his possession of DABUS, his ownership of the copyright in DABUS’ source code, and his ownership and possession of the computer on which it resides.

194    Now more generally there are various possibilities for patent ownership of the output of an artificial intelligence system. First, one might have the software programmer or developer of the artificial intelligence system, who no doubt may directly or via an employer own copyright in the program in any event. Second, one might have the person who selected and provided the input data or training data for and trained the artificial intelligence system. Indeed, the person who provided the input data may be different from the trainer. Third, one might have the owner of the artificial intelligence system who invested, and potentially may have lost, their capital to produce the output. Fourth, one might have the operator of the artificial intelligence system. But in the present case it would seem that Dr Thaler is the owner.

195    Let me now turn to the Deputy Commissioner’s reasons that the Commissioner sought to support.

196    First, the Deputy Commissioner (at [27] and [28]) referred to the concept of communication and posed the question as to whether a machine can communicate information relating to the invention for the purpose of applying for a patent. I must say that I did not find such a focus helpful. And in any event it is too narrow. The communication does not have to have such a purposive element. The fact is that Dr Thaler is in possession and control of the output of DABUS which has been communicated to him. Accordingly he is entitled to be the applicant. Further, and to be clear in any event, no one is saying that the bare fact of communication without more could constitute a derivation of the type referred to in s 15(1)(c).

197    Second, the Deputy Commissioner (at [27]) referred to s 15(1)(c) and said that the “normal means of deriving title is through assignment”. As I have indicated, this lens is too confined. Section 15(1)(c) is considerably broader.

198    Third, the Deputy Commissioner (at [30]) seems to suggest that the only way one could derive title to the invention from the inventor is if the title first had vested in the inventor. That is too narrow. It is not the only way. Now he referred to the lens of “conceptually moving title ‘from’ the artificial intelligence machine to the owner of the machine”. But “derive” is broader. Moreover it does not necessarily require, as the first step, title in A (the inventor) moving from A to B. That may be a standard way but not the only way. Dr Thaler could have and did derive possessory title at the instant the output was created by DABUS. He achieved power, custody and control over the output on its creation. Nothing more was required for the purpose of s 15(1)(c). Further, the operative concept of s 15(1)(c) concerns the derivation of title to the invention rather than unnecessarily restricting how that might come about by artificially narrowing the word “from”. That simply indicates the source or the starting point rather than that title has to vest first in the inventor.

199    Fourth and for completeness, although Emmett J in University of British Columbia at [39] in referring to s 15(1)(c) talked about “something that is capable of assignment, disposition or alienation by the inventor”, he was not purporting to be exhaustive of the possibilities and nor was he considering the issues before me as to the scope of “inventor” and the derivation of possessory title to the invention. Similarly of limited assistance to the context of s 15(1)(c) is his discussion of the predecessor provision to the present s 15 in Speedy Gantry at 553, which was in any event overturned on other matters, although there was some acceptance as to what his Honour had said concerning the predecessor to s 15 (Preston Erection Pty Ltd v Speedy Gantry Hire Pty Ltd (1998) 43 IPR 74 at 82 per Wilcox, Heerey and Lindgren JJ).

200    Generally, on a fair reading of ss 15(1)(b) and 15(1)(c), a patent can be granted to a legal person for an invention with an artificial intelligence system or device as the inventor.

Miscellaneous statutory provisions and other matters

201    Let me now refer to some other statutory provisions to make good the point that the Act’s focus is not on the inventor. Not only is there no mention of the concept of “inventor” in ss 7, 18 and 40, but apart from s 15 there is scant reference to it elsewhere.

202    Section 64(2)(a) refers to “the same inventor” in the context of a bar on multiple applications. Sections 101B(2)(h) and 101E(1)(a)(viii) refer to “the same inventor” in an analogous context. But these provisions do not assist the debate before me one way or the other.

203    Sections 172(1), 182(3) and 185(a) may be more on point, although the Commissioner made no reference to these provisions before me. Perhaps it was also thought by the Commissioner that these provisions did not assist one way or the other.

204    Section 172(1) provides:

An inventor, or an inventor’s successor in title, may assign the invention, and any patent granted or to be granted for the invention, to the Commonwealth.

205    Now as to the first possibility, clearly only an inventor who is a legal person could assign. But that does not mean that “inventor” more generally under the Act can only be a legal person. Given the permissive “may assign” in the statutory language, if you are an inventor who is a legal person then you can assign. But if you are not then you cannot assign.

206    But what about the second possibility concerning the expression “inventor’s successor in title”? Does this mean that the inventor had to have prior title? If so, then the inventor would have to be a person. And that being the case, the phrase “An inventor, or an inventor’s successor in title, may assign the invention”, if it covers the universe, may imply that the inventor has to be a person. Moreover, s 172(1) seems to have been intended to cover the universe of possibilities. It does not seem to have been intended that there could be a class of inventions or patents not capable of being assigned to the Commonwealth.

207    But does “inventor’s successor in title” permit of a situation where the inventor did not hold prior title as such? If so, then “inventor” did not need to be a person, although the successor in title must be. In my view, this is the correct meaning. It is also consistent with the scope of s 15 itself that does not necessarily require the inventor to have prior title; see for example s 15(1)(c). In other words the composite phrase in s 172(1) really is intended to cover the possibilities in s 15(1).

208    Let me turn to two other provisions, being ss 182 and 185.

209    Section 182 provides:

(1)    The Commissioner, a Deputy Commissioner or an employee must not buy, sell, acquire or traffic in:

(a)    an invention or patent, whether granted in Australia or anywhere else; or

(b)    a right to, or licence under, a patent, whether granted in Australia or anywhere else.

(2)    A purchase, sale, acquisition, assignment or transfer made or entered into in contravention of this section is void.

(3)    This section does not apply to the inventor or to an acquisition by bequest or devolution by law.

210    Clearly, s 182(3) assumes that an inventor is a person. But s 182(3) does not entail that an inventor has to be a person under the Act generally. Moreover, in such an eventuality, s 182(3) in its first part would simply not operate.

211    Further, s 185 does not take the present debate further, although it requires in its operation the inventor to be a person.

212    In summary, subject to the qualification I made concerning the phrase “inventor’s successor in title” in s 172(1), ss 172(1), 182(3) and 185 seem to predicate that the inventor in the context with which they deal is a person. But the fact that the Act stipulates rights or consequences for an inventor who is a person in some places does not logically entail that an inventor must be and can only be a person for all purposes.

213    Let me make two other points.

214    First, let me say something concerning the history to reg 3.2C(2)(aa).

215    Regulation 3.2C was introduced in 2013 by the Intellectual Property Legislation Amendment (Raising the Bar) Regulation 2013 (No. 1) (Cth). The regulation was inserted to require PCT applications to meet certain formal requirements and to permit the Commissioner to carry out a formalities check. But in the form introduced there was no reg 3.2C(2)(aa). In other words, there was no perceived need to stipulate the name of the inventor.

216    Curiously, in 2015 reg 3.2C(2) was amended by the Intellectual Property Legislation Amendment (TRIPS Protocol and Other Measures) Regulation 2015 (Cth) to add paragraph (2)(aa) (item 5).

217    The explanatory statement seeking to justify this insertion stated:

Item 5 inserts a new paragraph 3.2C(2)(aa) into the Patents Regulations, to require an applicant for a PCT application to provide the name(s) of the inventor(s) of the invention in the application. This information is required to ensure that the entitlement of the applicant to be granted a patent is clear.

Just as with the existing requirement for a PCT applicant to provide an address for service in Australia, the name(s) of the inventor(s) would not be required until after the PCT application has entered national phase in Australia.

218    But this purported purpose was incomplete to say the least. It ignored the broader framework of s 15. Moreover, it seems to have started from the misconceived assumption that the chain of title had to start with the inventor, in terms of the title being vested in the inventor before any part of s 15 could operate. In my view, its relatively recent introduction has only a problematic justification, particularly when s 15 only speaks at the time of grant.

219    Further, as I have said, there was nothing in the PCT itself expressly requiring a human inventor. I need not linger further on such matters.

220    Second, I should say that apart from the PCT and the PCT Regulations I have not discussed foreign legislative instruments or decisions made under them. Whether they require or have held that an inventor must be a person cannot assist me. And if they have so required or held, that is just a function of the text and context of such instruments, construed in the light of their applicable extrinsic material. Perhaps on their proper construction this was expressly required. Or perhaps they are silent on the question, but because the possibility was discussed in the extrinsic material but then omitted before the relevant instrument was promulgated, they have been construed so as to not include the possibility. Whatever be the position, this is not a voyage of discovery that I need embark on.

Conclusion

221    As I have said, s 15 concerns who may be granted the patent. The Commissioner is not being asked to decide that question now. The question is whether a valid PCT application has presently been lodged. The only impediment, it would seem, is reg 3.2C(2)(aa) and the Commissioner’s interpretation of that requirement.

222    First, in my view the name of the inventor can be a non-human. The Commissioner is incorrect in saying that you cannot have a non-human inventor.

223    Second, if the Commissioner would have it that reg 3.2C(2)(aa) requires the name of a human inventor, that is not what the Act mandates. Accordingly, if the Commissioner is correct, I would read down the regulation to avoid it being ultra vires, so that in effect it reads “the name of a human inventor (if applicable)”.

224    Third, the Deputy Commissioner ought not to have used subordinate legislation to summarily rule out a substantive consideration and examination of Dr Thaler’s application in circumstances where:

(a)    Dr Thaler was a valid applicant;

(b)    prima facie, it is not suggested that his application fails to disclose a patentable invention;

(c)    no other difficulties with his application have been identified;

(d)    the question of grant is some years away; and

(e)    it cannot be said now that Dr Thaler could not later bring himself within s 15(1)(b) and/or s 15(1)(c) in terms of being entitled to a grant.

225    On this aspect, and if it is necessary to say so, I also agree with Dr Thaler’s procedural point that I referred to earlier.

226    In summary, in my view, an inventor as recognised under the Act can be an artificial intelligence system or device. But such a non-human inventor can neither be an applicant for a patent nor a grantee of a patent. So to hold is consistent with the reality of the current technology. It is consistent with the Act. And it is consistent with promoting innovation.

227    I will set aside the Deputy Commissioner’s determinations and remit the matter for reconsideration in accordance with these reasons. The parties have already agreed that given the significance of the issue there should be no order for costs, whoever wins.

228    If any party considers that formal declarations are required, I will grant liberty to apply for that purpose, although whether they are necessary is another matter.

I certify that the preceding two hundred and twenty-eight (228) numbered paragraphs are a true copy of the Reasons for Judgment of the Honourable Justice Beach.

Associate:

Dated:    30 July 2021