The term “artificial intelligence” was coined in 1955 by John McCarthy, Professor Emeritus at Stanford, who defined it as “the science and engineering of making intelligent machines.” AI has evolved radically since then. Today, the integration of generative AI presents new challenges within the business landscape and beyond, requiring a thorough examination of its legal implications.
In this article, we will focus on the legal implications of using generative AI in product creation and service delivery.
Generative AI employs deep learning models trained on large datasets to create new content, products, and services. Examples of generative AI applications include chatbots, image editing and creation, support in software development, and scientific research. In product and service creation, consider design and prototyping: generative AI can analyse vast amounts of data, identify consumer preferences, and spot market trends to generate highly personalised product or service ideas, as seen in the fashion industry. In manufacturing, generative AI can optimise processes, plan production schedules, monitor quality control, and assist in supply chain optimisation, ensuring that materials and products are delivered on time. Chatbots use generative AI to respond to customer queries, provide assistance, and resolve issues. A simplified illustration of the underlying mechanism is sketched below.
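To make the mechanism concrete, here is a minimal sketch that uses an off-the-shelf open-source text-generation model to draft a personalised product description from a prompt. The model name, prompt, and parameters are illustrative assumptions only, not any particular vendor’s workflow.

```python
# Minimal sketch: drafting a personalised product description with an
# open-source text-generation model. Model, prompt, and parameters are
# illustrative assumptions, not a specific company's pipeline.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Write a short product description for a lightweight running shoe "
    "aimed at city commuters who value sustainability:"
)

result = generator(prompt, max_new_tokens=60, num_return_sequences=1)
print(result[0]["generated_text"])
```

In a real deployment, the prompt would typically be assembled from customer data and market signals, and the output reviewed by a human before use, which is precisely where questions of ownership and liability over the generated content arise.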
Legal Implications and Copyright Challenges in Generative AI
The opportunities offered by generative AI and machine learning are also multiplying in the creative field, where copyright issues are often overlooked. Copyright holders have begun suing some of the major AI tool companies. In November 2022, for instance, lawsuits were filed against GitHub, Microsoft (the owner of GitHub), and OpenAI: the code-generating tools GitHub Copilot and OpenAI Codex allegedly used vast amounts of code from the open-source platform GitHub to generate their own code, violating user licensing rights as stipulated by the platform’s terms and conditions. In January 2023, artists S. Andersen, K. McKernan, and K. Ortiz filed a class action lawsuit in the US challenging the legality of the image-generating tools Stable Diffusion, Midjourney, and DreamUp, from the companies Stability AI, Midjourney, and DeviantArt. In a separate case, Getty Images alleged that Stability AI copied over 10 million images, texts, and associated metadata from its website to train Stable Diffusion, even though the reproduction of materials for commercial purposes without permission was prohibited.
Thus, there are clear legal implications regarding the use of generative AI in product creation and service delivery. Responsible and ethical use of these technologies requires addressing issues such as data privacy, intellectual property rights protection, accountability, and transparency. On accountability, opinions diverge: some argue that the creators of an AI system should be held responsible for its output, while others argue that the AI system itself should be held accountable.
The recent AI Act, adopted by the Council of the European Union, addresses this issue by classifying AI systems according to their level of risk and imposing different obligations, including conformity assessments, depending on that classification. Embracing the potential of generative AI necessarily requires addressing the legal challenges that arise from its use.
Blockchain technology provides a means to assert ownership of a work, patent, product, or service by registering it on the blockchain. Kodak, for example, announced KODAKOne, a blockchain-based image rights management platform, together with an associated cryptocurrency, KODAKCoin.
The registration of intellectual property rights, the preservation of original works, and their cataloguing are just a few of the uses of blockchain. Information recorded on the blockchain can establish and enforce intellectual property agreements and enable real-time transmission of payments to owners.
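To make the registration idea concrete, the minimal sketch below computes a cryptographic fingerprint of a work and bundles it with ownership metadata, the kind of record that would then be anchored on a chain. The work’s content, the owner name, and the on-chain write itself are illustrative assumptions; real platforms define their own registry APIs.

```python
# Minimal sketch: a tamper-evident fingerprint of a work plus ownership
# metadata, as a record that could be anchored on a blockchain. The
# actual on-chain write is out of scope and platform-specific.
import hashlib
import json
from datetime import datetime, timezone

def fingerprint_work(content: bytes, owner: str) -> dict:
    """Return a record suitable for registering a work's existence."""
    return {
        "sha256": hashlib.sha256(content).hexdigest(),
        "owner": owner,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical work and owner, for illustration only.
work = b"...raw bytes of the design file, image, or document..."
record = fingerprint_work(work, owner="Example Studio Ltd")
print(json.dumps(record, indent=2))
# Once this record (or its hash) is written to an immutable ledger, the
# digest and timestamp can later support a claim of prior ownership.
```

The design point is that only the fingerprint and metadata need to be registered, not the work itself, so ownership can be asserted without publishing the underlying content.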
The no-code platform Trakti integrates AI, blockchain technology, and smart contracts for automated contract management across the entire Contract Lifecycle Management (CLM) process, including certified digital signatures compliant with the eIDAS Regulation, as well as payments.