Wednesday, November 20, 2024

Indian news agency sues OpenAI alleging copyright infringement; TechCrunch, November 18, 2024

 Manish Singh, TechCrunch; Indian news agency sues OpenAI alleging copyright infringement

"One of India’s largest news agencies, Asian News International (ANI), has sued OpenAI in a case that could set a precedent for how AI companies use copyrighted news content in the world’s most populous nation.

Asian News International filed a 287-page lawsuit in the Delhi High Court on Monday, alleging the AI company illegally used its content to train its AI models and generated false information attributed to the news agency. The case marks the first time an Indian media organization has taken legal action against OpenAI over copyright claims.

During Tuesday’s hearing, Justice Amit Bansal issued a summons to OpenAI after the company confirmed it had already ensured that ChatGPT wasn’t accessing ANI’s website. The bench said that it was not inclined to grant an injunction order on Tuesday, as the case required a detailed hearing for being a “complex issue.”

The next hearing is scheduled to be held in January."

Sunday, November 17, 2024

Cuban citizen convicted in U.S. streaming piracy scheme; UPI, November 16, 2024

Mike Heuer, UPI; Cuban citizen convicted in U.S. streaming piracy scheme

"A federal jury in Las Vegas found Yoany Vaillant guilty of conspiring to commit criminal copyright infringement for his work on behalf of illegal streamer Jetflicks.

Vaillant, 43, is a Cuban citizen and knows 27 computer programming languages, which he used to streamline the subscription-based but illegal Jetflicks content for its subscribers who were located throughout the United States, the Department of Justice announced in a news release Friday...

Jetflicks is headquartered in Las Vegas and claimed to have 183,285 copyrighted episodes of television programming, which is much more than Netflix, Hulu, Amazon Prime and any other streaming services.

Prosecutors provided evidence showing Vaillant and seven co-conspirators scoured pirate sites located around the world to access and download its extensive library of streaming titles without obtaining permission or paying respective copyright holders...

"The vast scale of Jetflicks' piracy affected every significant copyright owner of a television program in the United States," the DOJ said.

The illegal streaming caused "millions of dollars of losses to the U.S. television show and streaming industries," the agency said.

Vaillant was among eight defendants indicted in the U.S. District Court for Eastern Virginia in 2019."

Saturday, November 16, 2024

What Intellectual Property Policies Should We Expect from the Second Trump Administration?; American Enterprise Institute, November 15, 2024

Michael M. Rosen, American Enterprise Institute; What Intellectual Property Policies Should We Expect from the Second Trump Administration?

"Days after President-Elect Trump announced numerous conventional cabinet appointments, and several highly idiosyncratic ones, we can be forgiven for throwing our hands up rather than trying to forecast how his new administration will handle the most pressing IP issues. But we can certainly try, based on the limited information we have before us.

1. Legislative patent reform...

2. Artificial intelligence (AI) regulation and IP...

3. Pharmaceutical protection"

Tracking The Slow Movement Of AI Copyright Cases; Law360, November 7, 2024

Mark Davies and Anna Naydonov, Law360; Tracking The Slow Movement Of AI Copyright Cases

"There is a considerable gap between assumptions in the technology community and assumptions in the legal community concerning how long the legal questions around artificial intelligence and copyright law will take to reach resolution.

The principal litigated question asks whether copyright law permits or forbids the process by which AI systems are using copyright works to generate additional works.[1] AI technologists expect that the U.S. Supreme Court will resolve these questions in a few years.[2] Lawyers expect it to take much longer.[3] History teaches the answer...

Mark S. Davies and Anna B. Naydonov are partners at White & Case LLP.

Mark Davies represented Stephen Thaler in Thaler v. Vidal, Oracle in Google v. Oracle, and filed an amicus brief on behalf of a design professional in Apple v. Samsung."

Anheuser-Busch sued for copyright infringement of Montana artist’s fishing illustration; KMOV.com, November 15, 2024

 Pat Pratt, KMOV.com; Anheuser-Busch sued for copyright infringement of Montana artist’s fishing illustration

"A Montana wildlife artist is suing Anheuser-Busch for copyright infringement of one of his fishing illustrations. 

Artist Jon Q. Wright filed the lawsuit Thursday in U.S. District Court in St. Louis, where the company is headquartered. He has requested damages including profits made from the artwork, that illicit copies be impounded and further use be prohibited.

First Alert 4 has reached out to Anheuser-Busch requesting comment and is awaiting a response.

Wright states in the lawsuit he penned the image in 1999 and copyrighted it the following year. The image depicts a fishing scene with a fish in the foreground and a man in a boat in the background.

According to the lawsuit, Wright gave Anheuser-Busch a limited-term, non-exclusive license for specific works of art about 20 years ago, including the image at the center of the litigation filed Thursday. The license also included that several of the company’s affiliates could use the work.

The lawsuit filed Thursday alleges that the license has expired and Anheuser-Busch has altered the photo and continues to use it."

Friday, November 15, 2024

Icelandic Fishing Giant Wins Copyright Case Against Artist; artnet, November 14, 2024

Jo Lawson-Tancred, artnet; Icelandic Fishing Giant Wins Copyright Case Against Artist

"The work by the artist known as Odee had publicly impersonated Iceland’s biggest fishing company Samherji, issuing a fake apology for its role in the so-called “fishrot” corruption scandal of 2019. In his ruling, the judge described the artwork as “an instrument of fraud, copyright infringement, and malicious falsehood.”

The case never went to trial but the artist said he plans to appeal the judgement. His defenders have argued that any punitive action taken against him could result in a “chilling effect” that discourages artists from daring to critique big corporations for fear of legal action.

Samherji sued Odee, the moniker for 41-year-old Icelandic artist Oddur Fridriksson, over We’re Sorry (2023), for which Odee created the website samherji.co.uk, imitating the company’s brand identity. On this platform, he issued the statement: “Samherji Apologizes, Pledges Restitution and Cooperation with Authorities.”

In Samherji’s complaint filed in London’s high court, it accused Odee of trademark infringement and malicious falsehood. The company’s lawyers applied for a summary judgement to avoid a trial."

Thursday, November 14, 2024

Perlmutter Says Copyright Office Is Still Working to Meet ‘Ambitious Deadline’ for AI Report; IPWatchdog, November 14, 2024

Eileen McDermott, IPWatchdog; Perlmutter Says Copyright Office Is Still Working to Meet ‘Ambitious Deadline’ for AI Report

"Asked by Subcommittee Chair Chris Coons (D-DE) what keeps her up at night when it comes to the AI issue, Perlmutter said “the speed at which this is all developing.” In September during IPWatchdog LIVE 2024, Perlmutter told LIVE attendees that while she’s confident the issues around copyright and AI will eventually be solved, she’s “less comfortable about what it means for humankind.”

Perlmutter recently came under fire from Committee on House Administration Chairman Bryan Steil (R-WI), who sent a letter on Tuesday, October 29, to the Office asking for an update on the AI report, which Steil charged is no longer on track to be published by its stated target dates. Steil’s letter asked the Office to explain the delay in issuance of parts two and three, which Register of Copyrights Shira Perlmutter indicated in an oversight hearing by the Committee on House Administration would be published before the end of the summer and in the fall, respectively. “The importance of these reports cannot be overstated,” Steil wrote, explaining that copyright owners are relying on the Office to provide clear guidance. “The absence of these reports creates uncertainty for industries that are already grappling with AI-related challenges and hinders lawmakers’ ability to craft effective policy,” the letter added.

Perlmutter commented in the hearing that “we’ve been trying to set and follow our own ambitious deadlines” and the goal remains to get the rest of the report out by the end of the year, but that her key concern is to be “accurate and thoughtful.”

The forthcoming reports will include recommendations on how to deal with copyrightability of materials created using GAI and the legal implications of training on copyrighted works. The latter is most controversial and may in fact require additional legislation focusing on transparency requirements."

Satire publication The Onion buys Alex Jones' Infowars at auction with Sandy Hook families' backing; AP, November 14, 2024

Dave Collins, AP; Satire publication The Onion buys Alex Jones' Infowars at auction with Sandy Hook families' backing

"The satirical news publication The Onion won the bidding for Alex Jones’ Infowars at a bankruptcy auction, backed by families of Sandy Hook Elementary School shooting victims whom Jones owes more than $1 billion in defamation judgments for calling the massacre a hoax.

“The dissolution of Alex Jones’ assets and the death of Infowars is the justice we have long awaited and fought for,” Robbie Parker, whose daughter Emilie was killed in the 2012 shooting in Connecticut, said in a statement Thursday provided by his lawyers.

The Onion acquired the conspiracy theory platform’s website; social media accounts; studio in Austin, Texas; trademarks; and video archive for an undisclosed sales price. The purchase gives a satirical outlet — which carries the banner of “America’s Finest News Source” on its masthead — control over a brand that has long peddled misinformation and conspiracy."

Sunday, November 10, 2024

What’s Happening with AI and Copyright Law; JD Supra, November 4, 2024

AEON Law, JD Supra; What’s Happening with AI and Copyright Law

"Not surprisingly, a lot is happening at the intersection of artificial intelligence (AI) and intellectual property (IP) law.

Here’s a roundup of some recent developments in the area of copyright law and AI.

Copyright Office Denies AI Security Research Exemption under DMCA...

Former OpenAI Employee Says It Violates Copyright Law...

Blade Runner Production Company Sues Tesla for AI-Aided Copyright Infringement"

Saturday, November 9, 2024

OpenAI Gets a Win as Court Says No Harm Was Demonstrated in Copyright Case; Gizmodo, November 8, 2024

Gizmodo; OpenAI Gets a Win as Court Says No Harm Was Demonstrated in Copyright Case

"OpenAI won an initial victory on Thursday in one of the many lawsuits the company is facing for its unlicensed use of copyrighted material to train generative AI products like ChatGPT.

A federal judge in the southern district of New York dismissed a complaint brought by the media outlets Raw Story and AlterNet, which claimed that OpenAI violated copyright law by purposefully removing what is known as copyright management information, such as article titles and author names, from material that it incorporated into its training datasets.

OpenAI had filed a motion to dismiss the case, arguing that the plaintiffs did not have standing to sue because they had not demonstrated a concrete harm to their businesses caused by the removal of the copyright management information. Judge Colleen McMahon agreed, dismissing the lawsuit but leaving the door open for the plaintiffs to file an amended complaint."

Thursday, November 7, 2024

‘I’m going to sue the living pants off them’: AI’s big legal showdown – and what it means for Dr Strange’s hair; The Guardian, November 6, 2024

The Guardian; ‘I’m going to sue the living pants off them’: AI’s big legal showdown – and what it means for Dr Strange’s hair

"“The intersection of generative AI and CGI image creation is the next wave.”

Now that wave is threatening to flood an unprepared industry, washing away jobs and certainties. How do people in the industry feel? To find out, I attended Trojan Horse Was a Unicorn (THU), a digital arts festival near Lisbon in Portugal. Now in its 10th year, THU is a place where young artists entering these industries, some 750 of them, come to meet, get inspired and learn from veterans in their fields: film-makers, animators, VFX wizards, concept artists, games designers. This year, AI is the elephant in the room. Everyone is either talking about it – or avoiding talking about it...

Andre Luis, the 43-year-old CEO and co-founder of THU, acknowledges that “the anxiety is here” at this year’s event, but rather than running away from it, he argues, artists should be embracing it. One of the problems now is that the people eagerly adopting AI are executives and managers. “They don’t understand how to use AI to accelerate creativity,” he says, “or to make things better for everyone, so it’s up to us [the artists] to teach them. You need people who actually are creative to use AI.”

Luis likens generative AI to ultra processed food: it cannot create anything new; it can only reconstitute what’s already there, turning it into an inferior product. “And a lot of companies are trying to make fast food,” he says. Many see AI as a way to churn out quick, cheap content, as opposed to higher quality fare that has been created “organically” over time, with loving human input...

The democratising potential of AI could usher in what Luis calls “a new era of indie” in films, games, TV. Just as digital technology put cameras, editing and graphics tools into the hands of many more people...

“AI is something that is here,” he tells the young creators at THU, “so you need to adapt. See the opportunities, see the problems, but understand that it can help you do things in a different way. You need to ask yourselves, ‘How can I be part of that?’"

Tuesday, November 5, 2024

The Heart of the Matter: Copyright, AI Training, and LLMs; SSRN, November 1, 2024

Daniel J. Gervais (Vanderbilt University - Law School), Noam Shemtov (Queen Mary University of London, Centre for Commercial Law Studies), Haralambos Marmanis (Copyright Clearance Center), and Catherine Zaller Rowland (Copyright Clearance Center), SSRN; The Heart of the Matter: Copyright, AI Training, and LLMs

"Abstract

This article explores the intricate intersection of copyright law and large language models (LLMs), a cutting-edge artificial intelligence technology that has rapidly gained prominence. The authors provide a comprehensive analysis of the copyright implications arising from the training, fine-tuning, and use of LLMs, which often involve the ingestion of vast amounts of copyrighted material. The paper begins by elucidating the technical aspects of LLMs, including tokenization, word embeddings, and the various stages of LLM development. This technical foundation is crucial for understanding the subsequent legal analysis. The authors then delve into the copyright law aspects, examining potential infringement issues related to both inputs and outputs of LLMs. A comparative legal analysis is presented, focusing on the United States, European Union, United Kingdom, Japan, Singapore, and Switzerland. The article scrutinizes relevant copyright exceptions and limitations in these jurisdictions, including fair use in the US and text and data mining exceptions in the EU. The authors highlight the uncertainties and challenges in applying these legal concepts to LLMs, particularly in light of recent court decisions and legislative developments. The paper also addresses the potential impact of the EU's AI Act on copyright considerations, including its extraterritorial effects. Furthermore, it explores the concept of "making available" in the context of LLMs and its implications for copyright infringement. Recognizing the legal uncertainties and the need for a balanced approach that fosters both innovation and copyright protection, the authors propose licensing as a key solution. They advocate for a combination of direct and collective licensing models to provide a practical framework for the responsible use of copyrighted materials in AI systems.

This article offers valuable insights for legal scholars, policymakers, and industry professionals grappling with the copyright challenges posed by LLMs. It contributes to the ongoing dialogue on adapting copyright law to technological advancements while maintaining its fundamental purpose of incentivizing creativity and innovation."
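
The abstract above grounds its legal analysis in technical concepts such as tokenization and word embeddings. As a rough illustration only (it is not taken from the paper), the short Python sketch below shows what tokenization means in practice: text is split into pieces and mapped to integer IDs from a fixed vocabulary, which is the form in which ingested text, copyrighted or not, actually reaches a model. The vocabulary and splitting rule here are invented for the example; real systems use learned subword vocabularies.

```python
# Toy illustration of tokenization (not the paper's method): text is split into
# pieces and mapped to integer IDs from a fixed vocabulary. Real LLMs use learned
# subword vocabularies (e.g., byte-pair encoding) with tens of thousands of
# entries; this hypothetical vocabulary exists only for illustration.
vocab = {"the": 0, "heart": 1, "of": 2, "matter": 3, "copyright": 4, "<unk>": 5}

def tokenize(text: str) -> list[int]:
    """Lowercase, split on whitespace, and map each word to a vocabulary ID."""
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

print(tokenize("The heart of the matter"))  # [0, 1, 2, 0, 3]
print(tokenize("Copyright and AI"))         # [4, 5, 5]  (unknown words map to <unk>)
```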

Penguin Random House books now explicitly say ‘no’ to AI training; The Verge, October 18, 2024

Emma Roth, The Verge; Penguin Random House books now explicitly say ‘no’ to AI training

"Book publisher Penguin Random House is putting its stance on AI training in print. The standard copyright page on both new and reprinted books will now say, “No part of this book may be used or reproduced in any manner for the purpose of training artificial intelligence technologies or systems,” according to a report from The Bookseller spotted by Gizmodo. 

The clause also notes that Penguin Random House “expressly reserves this work from the text and data mining exception” in line with the European Union’s laws. The Bookseller says that Penguin Random House appears to be the first major publisher to account for AI on its copyright page. 

What gets printed on that page might be a warning shot, but it also has little to do with actual copyright law. The amended page is sort of like Penguin Random House’s version of a robots.txt file, which websites will sometimes use to ask AI companies and others not to scrape their content. But robots.txt isn’t a legal mechanism; it’s a voluntarily-adopted norm across the web. Copyright protections exist regardless of whether the copyright page is slipped into the front of the book, and fair use and other defenses (if applicable!) also exist even if the rights holder says they do not."
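
For readers unfamiliar with the robots.txt comparison: robots.txt is a plain-text file of advisory crawl directives, and a compliant crawler simply checks it before fetching. Below is a minimal sketch, assuming the publicly documented crawler names GPTBot (OpenAI) and CCBot (Common Crawl) and an invented example site; as the article notes, honoring these directives is voluntary, not a legal mechanism.

```python
# Minimal sketch (hypothetical site): how a robots.txt file asks named AI crawlers
# not to scrape, and how a compliant crawler would check it before fetching.
# Nothing here is legally binding; compliance is voluntary.
import urllib.robotparser

robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# A well-behaved crawler checks first; a non-compliant one simply ignores the file.
print(parser.can_fetch("GPTBot", "https://example.com/excerpt"))        # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/excerpt"))  # True
```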

Monday, November 4, 2024

What AI knows about you; Axios, November 4, 2024

Ina Fried, Axios; What AI knows about you

"Most AI builders don't say where they are getting the data they use to train their bots and models — but legally they're required to say what they are doing with their customers' data.

The big picture: These data-use disclosures open a window onto the otherwise opaque world of Big Tech's AI brain-food fight.

  • In this new Axios series, we'll tell you, company by company, what all the key players are saying and doing with your personal information and content.

Why it matters: You might be just fine knowing that picture you just posted on Instagram is helping train the next generative AI art engine. But you might not — or you might just want to be choosier about what you share.

Zoom out: AI makers need an incomprehensibly gigantic amount of raw data to train their large language and image models. 

  • The industry's hunger has led to a data land grab: Companies are vying to teach their baby AIs using information sucked in from many different sources — sometimes with the owner's permission, often without it — before new laws and court rulings make that harder. 

Zoom in: Each Big Tech giant is building generative AI models, and many of them are using their customer data, in part, to train them.

  • In some cases it's opt-in, meaning your data won't be used unless you agree to it. In other cases it is opt-out, meaning your information will automatically get used unless you explicitly say no. 
  • These rules can vary by region, thanks to legal differences. For instance, Meta's Facebook and Instagram are "opt-out" — but you can only opt out if you live in Europe or Brazil.
  • In the U.S., California's data privacy law is among the laws responsible for requiring firms to say what they do with user data. In the EU, it's the GDPR."
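
To illustrate the opt-in versus opt-out distinction the Axios excerpt describes, here is a minimal, hypothetical sketch: the only real difference is the default when a user has made no choice, and whether an opt-out is even offered can depend on region. The region rules and field names below are invented, loosely echoing the Meta example above, and are not any company's actual policy logic.

```python
# Hypothetical sketch of opt-in vs. opt-out defaults for training data.
# Region rules and field names are invented for illustration only.
from dataclasses import dataclass

OPT_OUT_REGIONS = {"EU", "BR"}  # invented: regions where declining is even offered

@dataclass
class User:
    region: str
    consent_choice: bool | None = None  # None means the user never made a choice

def eligible_for_training(user: User, opt_in_product: bool) -> bool:
    """Return True if the user's content may be used for model training."""
    if opt_in_product:
        # Opt-in: data is used only with an explicit "yes".
        return user.consent_choice is True
    # Opt-out: data is used unless the user explicitly said "no",
    # and (per the article) saying no may only be possible in some regions.
    if user.region in OPT_OUT_REGIONS and user.consent_choice is False:
        return False
    return True

print(eligible_for_training(User("US"), opt_in_product=False))                        # True: no opt-out offered
print(eligible_for_training(User("EU", consent_choice=False), opt_in_product=False))  # False: opted out
print(eligible_for_training(User("US"), opt_in_product=True))                         # False: never opted in
```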

Sunday, November 3, 2024

An ‘Interview’ With a Dead Luminary Exposes the Pitfalls of A.I.; The New York Times, November 3, 2024

The New York Times; An ‘Interview’ With a Dead Luminary Exposes the Pitfalls of A.I.

"When a state-funded Polish radio station canceled a weekly show featuring interviews with theater directors and writers, the host of the program went quietly, resigned to media industry realities of cost-cutting and shifting tastes away from highbrow culture.

But his resignation turned to fury in late October after his former employer, Off Radio Krakow, aired what it billed as a “unique interview” with an icon of Polish culture, Wislawa Szymborska, the winner of the 1996 Nobel Prize for Literature.

The terminated radio host, Lukasz Zaleski, said he would have invited Ms. Szymborska on his morning show himself, but never did for a simple reason: She died in 2012.

The station used artificial intelligence to generate the recent interview — a dramatic and, to many, outrageous example of technology replacing humans, even dead ones."

Friday, November 1, 2024

AI Training Study to Come This Year, Copyright Office Says; Bloomberg Law, October 31, 2024

Annelise Gilbert, Bloomberg Law; AI Training Study to Come This Year, Copyright Office Says

"The Copyright Office’s report on the legal implications of training artificial intelligence models on copyrighted works is still expected to publish by the end of 2024, the office’s director told lawmakers.

Director Shira Perlmutter on Wednesday said the office aims to complete the remaining two sections of its three-part AI report in the next two months—one on the copyrightability of generative AI output and the other about liability, licensing, and fair use in regards to AI training on protected works."

Thursday, October 31, 2024

Thousands of published studies may contain images with incorrect copyright licences; Chemistry World, October 28, 2024

Chemistry World; Thousands of published studies may contain images with incorrect copyright licences

"More than 9000 studies published in open-access journals may contain figures published under the wrong copyright licence.

These open-access journals publish content under the CC-BY copyright licence, which means that anyone can copy, distribute or transmit that work including for commercial purposes as long as the original creator is credited. 

All the 9000+ studies contain figures created using the commercial scientific illustration service BioRender, which should technically mean that these are also available for free reuse. But that doesn’t appear to be the case.

When Simon Dürr, a computational enzymologist at the Swiss Federal Institute of Technology Lausanne in Switzerland, reached out to BioRender to ask if two figures produced using BioRender by the authors of both studies were free to reuse, he was told that they weren’t. The company said it would approach both journals and ask them to issue corrections.

Dürr runs an open-source, free-to-use competitor to BioRender called BioIcons and wanted to host figures produced using BioRender that were published in open access journals because he thought they would be free to use. According to Dürr, he followed up with BioRender near the end of 2023, flagging a total of 9277 academic papers published under the CC-BY copyright licence but never heard back on their copyright status. In total, Dürr says he found 12,059 papers if one includes other copyright licences that restrict commercial use or have other similar conditions."

Wednesday, October 30, 2024

A Harris Presidency Is the Only Way to Stay Ahead of A.I.; The New York Times, October 29, 2024

Thomas L. Friedman, The New York Times; A Harris Presidency Is the Only Way to Stay Ahead of A.I.

"Kamala Harris, given her background in law enforcement, connections to Silicon Valley and the work she has already done on A.I. in the past four years, is up to this challenge, which is a key reason she has my endorsement for the presidency...

I am writing a book that partly deals with this subject and have benefited from my tutorials with Craig Mundie, the former chief research and strategy officer for Microsoft who still advises the company. He is soon coming out with a book of his own related to the longer-term issues and opportunities of A.G.I., written with Eric Schmidt, the former Google C.E.O., and Henry Kissinger, who died last year and worked on the book right up to the end of his life.

It is titled “Genesis: Artificial Intelligence, Hope, and the Human Spirit.” The book invokes the Bible’s description of the origin of humanity because the authors believe that our A.I. moment is an equally fundamental turning point for our species.

I agree. We have become Godlike as a species in two ways: We are the first generation to intentionally create a computer with more intelligence than God endowed us with. And we are the first generation to unintentionally change the climate with our own hands.

The problem is we have become Godlike without any agreement among us on the Ten Commandments — on a shared value system that should guide the use of our newfound powers. We need to fix that fast. And no one is better positioned to lead that challenge than the next U.S. president, for several reasons."

Monday, October 28, 2024

The Copyright Controversy Behind a Viral Gospel Hit; Christianity Today, October 28, 2024

Christianity Today; The Copyright Controversy Behind a Viral Gospel Hit

"Like any growing genre, Christian music’s increasing global popularity has placed a higher value on hits—and raised the stakes of proper attribution and credit. But in a Christian context, conflicts over credit and compensation can be especially fraught. The appearance of greed or opportunism can threaten a Christian artist’s reputation, but failure to claim credit threatens their livelihood—especially for independent musicians and producers or those working in smaller and developing industries like Ghana’s.

“Many dismiss the importance of legal considerations with statements like ‘Since it’s a God thing, it’s free and for everyone,’” said Eugene Zuta, a Ghanaian songwriter and worship leader. “As a result, copyright issues are often disregarded, and regulations are violated. Some of my songs have been used by others, who make light of their infringement with lame excuses.”"

Video game libraries lose legal appeal to emulate physical game collections online; Ars Technica, October 25, 2024

Kyle Orland, Ars Technica; Video game libraries lose legal appeal to emulate physical game collections online

"Earlier this year, we reported on the video game archivists asking for a legal DMCA exemption to share Internet-accessible emulated versions of their physical game collections with researchers. Today, the US Copyright Office announced once again that it was denying that request, forcing researchers to travel to far-flung collections for access to the often-rare physical copies of the games they're seeking.

In announcing its decision, the Register of Copyrights for the Library of Congress sided with the Entertainment Software Association and others who argued that the proposed remote access could serve as a legal loophole for a free-to-access "online arcade" that could harm the market for classic gaming re-releases. This argument resonated with the Copyright Office despite a VGHF study that found 87 percent of those older game titles are currently out of print."

Sunday, October 27, 2024

Public Knowledge, iFixit Free the McFlurry, Win Copyright Office DMCA Exemption for Ice Cream Machines; Public Knowledge, October 25, 2024

Shiva Stella, Public Knowledge; Public Knowledge, iFixit Free the McFlurry, Win Copyright Office DMCA Exemption for Ice Cream Machines

"Today, the U.S. Copyright Office partially granted an exemption requested by Public Knowledge and iFixit to allow people to circumvent digital locks in order to repair commercial and industrial equipment. The Office did not grant the full scope of the requested exemption, but did grant an exemption specifically allowing for repair of retail-level food preparation equipment – including soft serve ice cream machines similar to those available at McDonald’s. The Copyright Office reviewed the request as part of its 1201 review process, which encourages advocates and public interest groups to present arguments for exemption to the Digital Millennium Copyright Act.

Section 1201 of the DMCA makes it illegal to bypass a digital lock that protects a copyrighted work, such as a device’s software, even when there is no copyright infringement. Every three years, the Copyright Office reviews exemption requests and issues recommendations to the Librarian of Congress on granting certain exceptions to Section 1201. The recommendations go into effect once approved by the Librarian of Congress."

Friday, October 25, 2024

Biden Administration Outlines Government ‘Guardrails’ for A.I. Tools; The New York Times, October 24, 2024

The New York Times; Biden Administration Outlines Government ‘Guardrails’ for A.I. Tools

"President Biden on Thursday signed the first national security memorandum detailing how the Pentagon, the intelligence agencies and other national security institutions should use and protect artificial intelligence technology, putting “guardrails” on how such tools are employed in decisions varying from nuclear weapons to granting asylum.

The new document is the latest in a series Mr. Biden has issued grappling with the challenges of using A.I. tools to speed up government operations — whether detecting cyberattacks or predicting extreme weather — while limiting the most dystopian possibilities, including the development of autonomous weapons.

But most of the deadlines the order sets for agencies to conduct studies on applying or regulating the tools will go into full effect after Mr. Biden leaves office, leaving open the question of whether the next administration will abide by them...

The new guardrails would also prohibit letting artificial intelligence tools make a decision on granting asylum. And they would forbid tracking someone based on ethnicity or religion, or classifying someone as a “known terrorist” without a human weighing in.

Perhaps the most intriguing part of the order is that it treats private-sector advances in artificial intelligence as national assets that need to be protected from spying or theft by foreign adversaries, much as early nuclear weapons were. The order calls for intelligence agencies to begin protecting work on large language models or the chips used to power their development as national treasures, and to provide private-sector developers with up-to-the-minute intelligence to safeguard their inventions."

Wednesday, October 23, 2024

Former OpenAI Researcher Says the Company Broke Copyright Law; The New York Times, October 23, 2024

The New York Times; Former OpenAI Researcher Says the Company Broke Copyright Law

"Mr. Balaji believes the threats are more immediate. ChatGPT and other chatbots, he said, are destroying the commercial viability of the individuals, businesses and internet services that created the digital data used to train these A.I. systems.

“This is not a sustainable model for the internet ecosystem as a whole,” he told The Times."

Monday, October 21, 2024

Microsoft boss urges rethink of copyright laws for AI; The Times, October 21, 2024

Katie Prescott, The Times; Microsoft boss urges rethink of copyright laws for AI

"The boss of Microsoft has called for a rethink of copyright laws so that tech giants are able to train artificial intelligence models without risk of infringing intellectual property rights.

Satya Nadella, chief executive of the technology multinational, praised Japan’s more flexible copyright laws and said that governments need to develop a new legal framework to define “fair use” of material, which allows people in certain situations to use intellectual property without permission.

Nadella, 57, said governments needed to iron out the rules. “What are the bounds for copyright, which obviously have to be protected? What’s fair use?” he said. “For any society to move forward, you need to know what is fair use.”"

News Corp Sues AI Company Perplexity Over Copyright Claims, Made Up Text; The Hollywood Reporter, October 21, 2024

Caitlin Huston, The Hollywood Reporter; News Corp Sues AI Company Perplexity Over Copyright Claims, Made Up Text

"Dow Jones, the parent company to the Wall Street Journal, and the New York Post filed a lawsuit Monday against artificial intelligence company Perplexity, alleging that the company is illegally using copyrighted work.

The suit alleges that Perplexity, which is an AI research and conversational search engine, draws on articles and other copyrighted content from the publishers to feed into its product and then repackages the content in its responses, or sometimes uses the content verbatim, without linking back to the articles. The engine can also be used to display several paragraphs or entire articles, when asked."

‘Blade Runner 2049’ Producers Sue Elon Musk, Tesla and Warner Bros. Discovery, Alleging Copyright Infringement; Variety, October 21, 2024

Todd Spangler, Variety; ‘Blade Runner 2049’ Producers Sue Elon Musk, Tesla and Warner Bros. Discovery, Alleging Copyright Infringement

"Alcon Entertainment, the production company behind “Blade Runner 2049,” sued Tesla and CEO Elon Musk, as well as Warner Bros. Discovery, alleging that AI-generated images depicting scenes from the film used for the launch of Tesla’s self-driving Robotaxi represent copyright infringement.

In its lawsuit, filed Monday in L.A., Alcon said it had adamantly insisted that “Blade Runner 2049,” which stars Ryan Gosling and Harrison Ford, have no affiliation of any kind with “Tesla, X, Musk or any Musk-owned company,” given “Musk’s massively amplified, highly politicized, capricious and arbitrary behavior, which sometimes veers into hate speech.”"

Saturday, October 19, 2024

Courts Agree That No One Should Have a Monopoly Over the Law. Congress Shouldn’t Change That; Electronic Frontier Foundation (EFF), October 16, 2024

Corynne McSherry, Electronic Frontier Foundation (EFF); Courts Agree That No One Should Have a Monopoly Over the Law. Congress Shouldn’t Change That

"For more than a decade, giant standards development organizations (SDOs) have been fighting in courts around the country, trying use copyright law to control access to other laws. They claim that that they own the copyright in the text of some of the most important regulations in the country – the codes that protect product, building and environmental safety--and that they have the right to control access to those laws. And they keep losing because, it turns out, from New York, to Missouri, to the District of Columbia, judges understand that this is an absurd and undemocratic proposition. 

They suffered their latest defeat in Pennsylvania, where  a district court held that UpCodes, a company that has created a database of building codes – like the National Electrical Code--can include codes incorporated by reference into law. ASTM, a private organization that coordinated the development of some of those codes, insists that it retains copyright in them even after they have been adopted into law. Some courts, including the Fifth Circuit Court of Appeals, have rejected that theory outright, holding that standards lose copyright protection when they are incorporated into law. Others, like the DC Circuit Court of Appeals in a case EFF defended on behalf of Public.Resource.Org, have held that whether or not the legal status of the standards changes once they are incorporated into law, posting them online is a lawful fair use. 

In this case, ASTM v. UpCodes, the court followed the latter path. Relying in large part on the DC Circuit’s decision, as well as an amicus brief EFF filed in support of UpCodes, the court held that providing access to the law (for free or subject to a subscription for “premium” access) was a lawful fair use. A key theme to the ruling is the public interest in accessing law:"