2 August 2021

Ghost in the Machine: but does the ghost “invent” what it makes?

by Michael Finney (Senior Associate, Bennett & Philp Lawyers) & Dr Amanda-Jane George (Senior Lecturer, College of Law, Criminology & Justice, Central Queensland University)

Today cloud computing is an integral metaphor for the internet: an amorphous system we work with; we store and retrieve stuff from; something we experience all the time without really understanding what it is – everything just works. It’s a place where we share, transact, shop, bank, socialise, borrow books and vote. Sure, its physical infrastructure consists of tangible telephone lines, fibre optics, satellites, cables on the seafloor, and distributed networked computers, but what level of control and understanding do we really have over the treasure trove of data stored within and the software employed to bring it to life? Given its relative obscurity, the cloud is inherently less visible and less amenable to critique, investigation, preservation and regulation.

With the emergence of the cloud and its vast associated computing power, companies worldwide have recognised the opportunity to mine this data warehouse: they are amassing huge amounts of data and scrambling to develop the right tools to analyse and leverage it. With the trend towards predictive analytics, machine learning and other data sciences already underway, marketers have started paying attention to how they can use these techniques to build a more data-driven marketing strategy. However, marketers are not the sole beneficiaries of such technologies, and applications in pharmaceutical research and other scientific fields of human endeavour are well advanced.

This trend is accelerating with artificial intelligence (AI) now becoming mainstream. While many organisations are just starting to understand how AI can fit into their digital strategies, it is quickly becoming a necessity for leveraging data effectively. Most of today’s AI is narrow AI, which targets a specific problem and seeks to match or exceed human ability at that task. However, the next phase, artificial general intelligence (AGI), aims to tackle the broad spectrum of problems that intelligent humans can solve.

As a result of the emergence and proliferation of AI technologies and their application to improving business performance, the issue of ownership of the ‘solutions’ made possible by these new technologies is only now being addressed. Mining data with predictive analytics and machine learning to develop personalised, data-driven marketing strategies is one thing, but what happens when AI is unleashed on the data cloud to develop solutions to problems posed for it? Or when AGI is empowered to solve problems where the techniques are not known and there is no concrete problem statement? Who owns those solutions? Specifically, if AI software creates an invention, who owns that intellectual property?

The recent Federal Court decision in Thaler v Commissioner of Patents [2021] FCA 879 brings us one step closer to an understanding of the issue. In a world-first, the Court was asked to review a patent office decision that a patent application had lapsed as a result of the office finding that an artificial intelligence system or device could not be an inventor and that the applicant therefore lacked entitlement.

By way of background, the invention the subject of the patent application was autonomously or semi-autonomously generated by an artificial intelligence system known as DABUS (device for the autonomous bootstrapping of unified sentience) and was an output of DABUS’ artificial neural network processes. The Deputy Commissioner had found that section 15 of the Patents Act 1990 (Cth) was “not capable of sensible operation in the situation where an inventor would be an artificial intelligence machine as it is not possible to identify a person who could be granted a patent”. The Commissioner argued that, as an artificial intelligence machine is not a ‘person’, it cannot own a patent, so section 15(1)(a) does not apply, and that section 15(1)(b) pre-supposes that the inventor can assign the invention. Further, the Commissioner argued in relation to section 15(1)(c) that it cannot be said that Dr Thaler “derives title to the invention from the inventor”, since that provision requires the existence of a title that moves from the inventor to the other person. On the Commissioner’s approach, because Dr Thaler’s application did not comply with the Patents Act and regulations, its deficiencies were incapable of remedy and the application lapsed. The Commissioner also suggested that Parliament may in the future consider that artificial intelligence machines can “invent” for the purposes of the patent system, and that their “owners” should be rewarded for those machines’ “inventions”, but that such a scenario is not reflected in the current statutory scheme.

In response, Dr Thaler argued that even though DABUS, as an artificial intelligence system, is not a legal person and cannot legally assign the invention, it does not follow that it is impossible to derive title from DABUS.

In rejecting the Deputy Commissioner’s determination and the Commissioner’s position before him, Justice Beach found that the Patents Act did not require a narrow interpretation of ‘inventor’ and that the inventor of a patented invention need not be human (unlike copyright law, where it is established that those intellectual property rights originate in the application of human intellectual effort). Justice Beach clarified that the word ‘inventor’ has its ordinary meaning and in this respect is an agent noun,[1] where an agent can be a person or thing that invents. Whilst declining to rigorously define the boundaries of artificial intelligence, Justice Beach suggested that it may be appropriate to “apply the label of “artificial intelligence” to a system that has the capacity for deductive reasoning, inductive reasoning and pattern recognition, leaving to one side any possible embodiment of awareness, consciousness or sense of self”. For present purposes, Justice Beach found that DABUS “could be described as self-organising as a cumulative result of algorithms collaboratively generating complexity”, that it “generates novel patterns of information rather than simply associating patterns” and that it is “capable of adapting to new scenarios without additional human input”.

On the central question of the meaning of ‘inventor’, Justice Beach noted that the Patents Act provides no definition of ‘inventor’ and that there are no specific provisions in the Patents Act expressly refuting the proposition that an artificial intelligence system can be an inventor. His Honour explained that, as a matter of statutory interpretation, “it is also of fundamental importance that limitations and qualifications are not read into a statutory definition unless clearly required by its terms or its context, as for example if it is necessary to give effect to the evident purpose of the Act”.

Importantly, drawing upon the High Court’s guidance in Myriad[2] that “a widening conception of “manner of manufacture” is a necessary feature of the development of patent law in the twentieth and twenty-first centuries as scientific discoveries inspire new technologies”, Justice Beach saw no reason why the concept of ‘inventor’ should not also be seen in an analogously flexible and evolutionary way. This was especially the case given the new objects clause, section 2A,[3] of the Patents Act, which “should always be considered when construing the legislation whether or not any ambiguity is identified”. Consistent with the object of the Patents Act and underpinning such reasoning, Justice Beach noted that “computer inventorship would incentivise the development by computer scientists of creative machines, and also the development by others of the facilitation and use of the output of such machines, leading to new scientific advantages” and that “without the ability to obtain patent protection, owners of creative computers might choose to protect patentable inventions as trade secrets without any public disclosure”. In short, Justice Beach drew an analogy: if the concept of “invention” in terms of manner of manufacture evolves, as it must, why not the concept of “inventor”? He opined that “more is required of [me] than mere resort to old millennium usages of that word” and that it was necessary to “grapple with the underlying idea, recognising the evolving nature of patentable inventions and their creators”. Such must surely be the nature of the evolution of the common law.

Somewhat pithily, Justice Beach asked, “Why cannot our own creations also create?” and explained that to deny this “would inhibit innovation not just in the field of computer science but all other scientific fields which may benefit from the output of an artificial intelligence system”. Indeed, that would be the antithesis of promoting innovation. However, Justice Beach explained that such a non-human inventor can neither be an applicant for a patent nor a grantee of a patent.

In Dr Thaler’s case, given that he was in possession of the invention and was the sole owner and controller of the copyright in the DABUS AI source code, and with the question of the identity of the inventor resolved, it followed that Dr Thaler was entitled to the invention as patentee by operation of law (rather than by devolution through assignment). Justice Beach explained that section 15(1)(b) encompasses an employer who may take the fruits of an employee’s labour, even in the absence of an express contractual provision, where such labour was in the course of the employee’s duties. If an employee makes an invention which falls within his duty to make, he holds his interest as trustee for the employer. But that is not the only scenario, and Justice Beach outlined other scenarios in which a legal or equitable entitlement arises even though the inventor is not a party to any assignment.

Similarly, Justice Beach considered that the language of section 15(1)(c) recognised that the rights of a person who derives title to the invention from an inventor extend beyond assignments to encompass other means by which an interest may be conferred. As such, Dr Thaler could derive title from the DABUS artificial intelligence system when the word ‘derive’ is given its ordinary meaning, which includes to receive or obtain from a source or origin; to get, gain or obtain; and to emanate or arise from. Importantly, drawing upon University of British Columbia,[4] an invention is capable of being possessed and ownership may arise from possession; on a fair reading of sections 15(1)(b) and 15(1)(c), a patent can therefore be granted to a legal person for an invention with an artificial intelligence system or device as the inventor.

However, further complicating the issue is the recent trend towards open-source solutions, including those for AI and, one would expect, AGI. For such open-source AI software, there is a range of possibilities for patent ownership of the output of an artificial intelligence system. Justice Beach acknowledged this possibility in outlining four classes of possible owners:

  1. the software programmer or developer of the artificial intelligence system;
  2. the person who selected and provided the input data or training data for and trained the artificial intelligence system (noting that the person who provided the input data may be different from the trainer);
  3. the owner of the artificial intelligence system who invested, and potentially may have lost, their capital to produce the output; and
  4. the operator of the artificial intelligence system.

Indeed, there may be others.

In the case of artificial intelligence systems made available under an open-source licence,[5] the proliferation and diversity of such licences (e.g. the copyleft GNU family of general public licences and permissive-style licences) will make it difficult to understand the legal implications of the differences between them. Given the trend towards open-source AI and, eventually, AGI software, it seems that ownership disputes over the output of an artificial intelligence system will occupy the Court’s time for the foreseeable future. Let’s trust that, in dealing with the spectre discussed by Justice Beach (that if one permits computer-generated inventions and computer-generated patent applications, the patent system will reach a breaking point), there will be “a number of ways to dispose of these phantoms”.

It seems that the ghost in the machine has at last gained a degree of consciousness in existing independently of human innovation.

If you would like to discuss this article, please contact Michael Finney.

 

[1]  Justice Beach found that an agent noun is one describing the agent that does the act referred to by the verb to which the suffix is attached.

[2]  D’Arcy v Myriad Genetics Inc (2015) 258 CLR 334 at [18] per French CJ, Kiefel, Bell and Keane JJ.

[3]  Section 2A of the Patents Act provides: “The object of this Act is to provide a patent system in Australia that promotes economic wellbeing through technological innovation and the transfer and dissemination of technology. In doing so, the patent system balances over time the interests of producers, owners and users of technology and the public.”

[4]  University of British Columbia v Conor Medsystems Inc (2006) 155 FCR 391 at [37] to [39], where Emmett J distinguished between assignment and entitlement under the general law.

[5]  Open-source software is software released under a licence in which the copyright holder grants users the rights to use, study, change, and distribute the software and its source code to the general public, with relaxed or non-existent restrictions on the use and modification of the code.

 



 
