Thursday, October 31, 2024

Thousands of published studies may contain images with incorrect copyright licences; Chemistry World, October 28, 2024

Chemistry World; Thousands of published studies may contain images with incorrect copyright licences

"More than 9000 studies published in open-access journals may contain figures published under the wrong copyright licence.

These open-access journals publish content under the CC-BY copyright licence, which means that anyone can copy, distribute or transmit that work including for commercial purposes as long as the original creator is credited. 

All the 9000+ studies contain figures created using the commercial scientific illustration service BioRender, which should technically mean that these are also available for free reuse. But that doesn’t appear to be the case.

When Simon Dürr, a computational enzymologist at the Swiss Federal Institute of Technology Lausanne in Switzerland, reached out to BioRender to ask if two figures produced using BioRender by the authors of both studies were free to reuse, he was told that they weren’t. The company said it would approach both journals and ask them to issue corrections.

Dürr runs an open-source, free-to-use competitor to BioRender called BioIcons and wanted to host figures produced using BioRender that were published in open access journals because he thought they would be free to use. According to Dürr, he followed up with BioRender near the end of 2023, flagging a total of 9277 academic papers published under the CC-BY copyright licence but never heard back on their copyright status. In total, Dürr says he found 12,059 papers if one includes other copyright licences that restrict commercial use or have other similar conditions."

Wednesday, October 30, 2024

A Harris Presidency Is the Only Way to Stay Ahead of A.I.; The New York Times, October 29, 2024

 THOMAS L. FRIEDMAN, The New York Times; A Harris Presidency Is the Only Way to Stay Ahead of A.I.

"Kamala Harris, given her background in law enforcement, connections to Silicon Valley and the work she has already done on A.I. in the past four years, is up to this challenge, which is a key reason she has my endorsement for the presidency...

I am writing a book that partly deals with this subject and have benefited from my tutorials with Craig Mundie, the former chief research and strategy officer for Microsoft who still advises the company. He is soon coming out with a book of his own related to the longer-term issues and opportunities of A.G.I., written with Eric Schmidt, the former Google C.E.O., and Henry Kissinger, who died last year and worked on the book right up to the end of his life.

It is titled “Genesis: Artificial Intelligence, Hope, and the Human Spirit.” The book invokes the Bible’s description of the origin of humanity because the authors believe that our A.I. moment is an equally fundamental turning point for our species.

I agree. We have become Godlike as a species in two ways: We are the first generation to intentionally create a computer with more intelligence than God endowed us with. And we are the first generation to unintentionally change the climate with our own hands.

The problem is we have become Godlike without any agreement among us on the Ten Commandments — on a shared value system that should guide the use of our newfound powers. We need to fix that fast. And no one is better positioned to lead that challenge than the next U.S. president, for several reasons."

Monday, October 28, 2024

The Copyright Controversy Behind a Viral Gospel Hit; Christianity Today, October 28, 2024

Christianity Today; The Copyright Controversy Behind a Viral Gospel Hit

"Like any growing genre, Christian music’s increasing global popularity has placed a higher value on hits—and raised the stakes of proper attribution and credit. But in a Christian context, conflicts over credit and compensation can be especially fraught. The appearance of greed or opportunism can threaten a Christian artist’s reputation, but failure to claim credit threatens their livelihood—especially for independent musicians and producers or those working in smaller and developing industries like Ghana’s.

“Many dismiss the importance of legal considerations with statements like ‘Since it’s a God thing, it’s free and for everyone,’” said Eugene Zuta, a Ghanaian songwriter and worship leader. “As a result, copyright issues are often disregarded, and regulations are violated. Some of my songs have been used by others, who make light of their infringement with lame excuses.”"

Video game libraries lose legal appeal to emulate physical game collections online; Ars Technica, October 25, 2024

KYLE ORLAND, Ars Technica; Video game libraries lose legal appeal to emulate physical game collections online

"Earlier this year, we reported on the video game archivists asking for a legal DMCA exemption to share Internet-accessible emulated versions of their physical game collections with researchers. Today, the US Copyright Office announced once again that it was denying that request, forcing researchers to travel to far-flung collections for access to the often-rare physical copies of the games they're seeking.

In announcing its decision, the Register of Copyrights for the Library of Congress sided with the Entertainment Software Association and others who argued that the proposed remote access could serve as a legal loophole for a free-to-access "online arcade" that could harm the market for classic gaming re-releases. This argument resonated with the Copyright Office despite a VGHF study that found 87 percent of those older game titles are currently out of print."

Sunday, October 27, 2024

Public Knowledge, iFixit Free the McFlurry, Win Copyright Office DMCA Exemption for Ice Cream Machines; Public Knowledge, October 25, 2024

Shiva Stella, Public Knowledge; Public Knowledge, iFixit Free the McFlurry, Win Copyright Office DMCA Exemption for Ice Cream Machines

"Today, the U.S. Copyright Office partially granted an exemption requested by Public Knowledge and iFixit to allow people to circumvent digital locks in order to repair commercial and industrial equipment. The Office did not grant the full scope of the requested exemption, but did grant an exemption specifically allowing for repair of retail-level food preparation equipment – including soft serve ice cream machines similar to those available at McDonald’s. The Copyright Office reviewed the request as part of its 1201 review process, which encourages advocates and public interest groups to present arguments for exemption to the Digital Millennium Copyright Act.

Section 1201 of the DMCA makes it illegal to bypass a digital lock that protects a copyrighted work, such as a device’s software, even when there is no copyright infringement. Every three years, the Copyright Office reviews exemption requests and issues recommendations to the Librarian of Congress on granting certain exceptions to Section 1201. The recommendations go into effect once approved by the Librarian of Congress."

Friday, October 25, 2024

Biden Administration Outlines Government ‘Guardrails’ for A.I. Tools; The New York Times, October 24, 2024

The New York Times; Biden Administration Outlines Government ‘Guardrails’ for A.I. Tools

"President Biden on Thursday signed the first national security memorandum detailing how the Pentagon, the intelligence agencies and other national security institutions should use and protect artificial intelligence technology, putting “guardrails” on how such tools are employed in decisions varying from nuclear weapons to granting asylum.

The new document is the latest in a series Mr. Biden has issued grappling with the challenges of using A.I. tools to speed up government operations — whether detecting cyberattacks or predicting extreme weather — while limiting the most dystopian possibilities, including the development of autonomous weapons.

But most of the deadlines the order sets for agencies to conduct studies on applying or regulating the tools will go into full effect after Mr. Biden leaves office, leaving open the question of whether the next administration will abide by them...

The new guardrails would also prohibit letting artificial intelligence tools make a decision on granting asylum. And they would forbid tracking someone based on ethnicity or religion, or classifying someone as a “known terrorist” without a human weighing in.

Perhaps the most intriguing part of the order is that it treats private-sector advances in artificial intelligence as national assets that need to be protected from spying or theft by foreign adversaries, much as early nuclear weapons were. The order calls for intelligence agencies to begin protecting work on large language models or the chips used to power their development as national treasures, and to provide private-sector developers with up-to-the-minute intelligence to safeguard their inventions."

Wednesday, October 23, 2024

Former OpenAI Researcher Says the Company Broke Copyright Law; The New York Times, October 23, 2024

The New York Times; Former OpenAI Researcher Says the Company Broke Copyright Law

"Mr. Balaji believes the threats are more immediate. ChatGPT and other chatbots, he said, are destroying the commercial viability of the individuals, businesses and internet services that created the digital data used to train these A.I. systems.

“This is not a sustainable model for the internet ecosystem as a whole,” he told The Times."

Monday, October 21, 2024

Microsoft boss urges rethink of copyright laws for AI; The Times, October 21, 2024

Katie Prescott, The Times; Microsoft boss urges rethink of copyright laws for AI

"The boss of Microsoft has called for a rethink of copyright laws so that tech giants are able to train artificial intelligence models without risk of infringing intellectual property rights.

Satya Nadella, chief executive of the technology multinational, praised Japan’s more flexible copyright laws and said that governments need to develop a new legal framework to define “fair use” of material, which allows people in certain situations to use intellectual property without permission.

Nadella, 57, said governments needed to iron out the rules. “What are the bounds for copyright, which obviously have to be protected? What’s fair use?” he said. “For any society to move forward, you need to know what is fair use.”"

News Corp Sues AI Company Perplexity Over Copyright Claims, Made Up Text; The Hollywood Reporter, October 21, 2024

Caitlin Huston, The Hollywood Reporter; News Corp Sues AI Company Perplexity Over Copyright Claims, Made Up Text

"Dow Jones, the parent company to the Wall Street Journal, and the New York Post filed a lawsuit Monday against artificial intelligence company Perplexity, alleging that the company is illegally using copyrighted work.

The suit alleges that Perplexity, which is an AI research and conversational search engine, draws on articles and other copyrighted content from the publishers to feed into its product and then repackages the content in its responses, or sometimes uses the content verbatim, without linking back to the articles. The engine can also be used to display several paragraphs or entire articles, when asked."

‘Blade Runner 2049’ Producers Sue Elon Musk, Tesla and Warner Bros. Discovery, Alleging Copyright Infringement; Variety, October 21, 2024

Todd Spangler, Variety; ‘Blade Runner 2049’ Producers Sue Elon Musk, Tesla and Warner Bros. Discovery, Alleging Copyright Infringement

"Alcon Entertainment, the production company behind “Blade Runner 2049,” sued Tesla and CEO Elon Musk, as well as Warner Bros. Discovery, alleging that AI-generated images depicting scenes from the film used for the launch of Tesla’s self-driving Robotaxi represent copyright infringement.

In its lawsuit, filed Monday in L.A., Alcon said it had adamantly insisted that “Blade Runner 2049,” which stars Ryan Gosling and Harrison Ford, have no affiliation of any kind with “Tesla, X, Musk or any Musk-owned company,” given “Musk’s massively amplified, highly politicized, capricious and arbitrary behavior, which sometimes veers into hate speech.”"

Saturday, October 19, 2024

Courts Agree That No One Should Have a Monopoly Over the Law. Congress Shouldn’t Change That; Electronic Frontier Foundation (EFF), October 16, 2024

 CORYNNE MCSHERRY, Electronic Frontier Foundation (EFF); Courts Agree That No One Should Have a Monopoly Over the Law. Congress Shouldn’t Change That

"For more than a decade, giant standards development organizations (SDOs) have been fighting in courts around the country, trying use copyright law to control access to other laws. They claim that that they own the copyright in the text of some of the most important regulations in the country – the codes that protect product, building and environmental safety--and that they have the right to control access to those laws. And they keep losing because, it turns out, from New York, to Missouri, to the District of Columbia, judges understand that this is an absurd and undemocratic proposition. 

They suffered their latest defeat in Pennsylvania, where a district court held that UpCodes, a company that has created a database of building codes – like the National Electrical Code – can include codes incorporated by reference into law. ASTM, a private organization that coordinated the development of some of those codes, insists that it retains copyright in them even after they have been adopted into law. Some courts, including the Fifth Circuit Court of Appeals, have rejected that theory outright, holding that standards lose copyright protection when they are incorporated into law. Others, like the DC Circuit Court of Appeals in a case EFF defended on behalf of Public.Resource.Org, have held that whether or not the legal status of the standards changes once they are incorporated into law, posting them online is a lawful fair use.

In this case, ASTM v. UpCodes, the court followed the latter path. Relying in large part on the DC Circuit’s decision, as well as an amicus brief EFF filed in support of UpCodes, the court held that providing access to the law (for free or subject to a subscription for “premium” access) was a lawful fair use. A key theme to the ruling is the public interest in accessing law:"

Friday, October 18, 2024

Mass shooting survivors turn to an unlikely place for justice – copyright law; The Guardian, October 18, 2024

The Guardian; Mass shooting survivors turn to an unlikely place for justice – copyright law

"In a Nashville courtroom in early July, survivors of the 2023 Covenant school shooting celebrated an unusual legal victory. Citing copyrightlaw, Judge l’Ashea Myles ruled that the assailant’s writings and other creative property could not be released to the public.

After months of hearings, the decision came down against conservative lawmakers, journalists and advocates who had sued for access to the writings, claiming officials had no right to keep them from the public. But since parents of the assailant – who killed six people at the private Christian elementary school, including three nine-year-old children – signed legal ownership of the shooter’s journals over to the families of surviving students last year, Myles said releasing the materials would violate the federal Copyright Act...

Keeping a shooter’s name or creative property – such as a manifesto or recording – out of the public eye does more than protect the emotional wellbeing of those impacted, experts say. It also helps to prevent future massacres.

That such material can serve as inspiration has been widely documented, explains Rachel Carroll Rivas of the Southern Poverty Law Center’s Intelligence Project. “Those videos just have an inherent dangerous factor, and they really shouldn’t be allowed to spread across the internet,” she said.

The danger stems from the fact that shooters, research has shown, often desire attention, recognition and notoriety."

It Sure Looks Like Trump Watches Are Breaking Copyright Law; Wired, October 18, 2024

Matt Giles, Wired; It Sure Looks Like Trump Watches Are Breaking Copyright Law

"According to the Associated Press, though, TheBestWatchesonEarth LLC advertised a product it can’t deliver, as that image is owned by the 178-year-old news agency. This week, the AP told WIRED it is pursuing a cease and desist against the LLC, which is registered in Sheridan, Wyoming. (The company did not reply to a request for comment about the cease and desist letter.)

Evan Vucci, the AP’s Pulitzer Prize–winning chief photographer, took that photograph, and while he told WIRED he does not own the rights to that image, the AP confirmed earlier this month in an email to WIRED that it is filing the written notice. “AP is proud of Evan Vucci’s photo and recognizes its impact,” wrote AP spokesperson Nicole Meir. “We reserve our rights to this powerful image, as we do with all AP journalism, and continue to license it for editorial use only.”"

Penguin Random House underscores copyright protection in AI rebuff; The Bookseller, October 18, 2024

 MATILDA BATTERSBY, The Bookseller; Penguin Random House underscores copyright protection in AI rebuff

"The world’s biggest trade publisher has changed the wording on its copyright pages to help protect authors’ intellectual property from being used to train large language models (LLMs) and other artificial intelligence (AI) tools, The Bookseller can exclusively reveal.

Penguin Random House (PRH) has amended its copyright wording across all imprints globally, confirming it will appear “in imprint pages across our markets”. The new wording states: “No part of this book may be used or reproduced in any manner for the purpose of training artificial intelligence technologies or systems”, and will be included in all new titles and any backlist titles that are reprinted.

The statement also “expressly reserves [the titles] from the text and data mining exception”, in accordance with a European Parliament directive.

The move specifically to ban the use of its titles by AI firms for the development of chatbots and other digital tools comes amid a slew of copyright infringement cases in the US and reports that large tranches of pirated books have already been used by tech companies to train AI tools. In 2024, several academic publishers including Taylor & Francis, Wiley and Sage have announced partnerships to license content to AI firms.

PRH is believed to be the first of the Big Five anglophone trade publishers to amend its copyright information to reflect the acceleration of AI systems and the alleged reliance by tech companies on using published work to train language models."

Thursday, October 17, 2024

Californians want controls on AI. Why did Gavin Newsom veto an AI safety bill?; The Guardian, October 16, 2024

 Garrison Lovely, The Guardian; Californians want controls on AI. Why did Gavin Newsom veto an AI safety bill? 

"I’m writing a book on the economics and politics of AI and have analyzed years of nationwide polling on the topic. The findings are pretty consistent: people worry about risks from AI, favor regulations, and don’t trust companies to police themselves. Incredibly, these findings tend to hold true for both Republicans and Democrats.

So why would Newsom buck the popular bill?

Well, the bill was fiercely resisted by most of the AI industry, including Google, Meta and OpenAI. The US has let the industry self-regulate, and these companies desperately don’t want that to change – whatever sounds their leaders make to the contrary...

The top three names on the congressional letter – Zoe Lofgren, Anna Eshoo, and Ro Khanna – have collectively taken more than $4m in political contributions from the industry, accounting for nearly half of their lifetime top-20 contributors. Google was their biggest donor by far, with nearly $1m in total.

The death knell probably came from the former House speaker Nancy Pelosi, who published her own statement against the bill, citing the congressional letter and Li’s Fortune op-ed.

In 2021, reporters discovered that Lofgren’s daughter is a lawyer for Google, which prompted a watchdog to ask Pelosi to negotiate her recusal from antitrust oversight roles.

Who came to Lofgren’s defense? Eshoo and Khanna.

Three years later, Lofgren remains in these roles, which have helped her block efforts to rein in big tech – against the will of even her Silicon Valley constituents.

Pelosi’s 2023 financial disclosure shows that her husband owned between $16m and $80m in stocks and options in Amazon, Google, Microsoft and Nvidia...

Sunny Gandhi of the youth tech advocacy group Encode Justice, which co-sponsored the bill, told me: “When you tell the average person that tech giants are creating the most powerful tools in human history but resist simple measures to prevent catastrophic harm, their reaction isn’t just disbelief – it’s outrage. This isn’t just a policy disagreement; it’s a moral chasm between Silicon Valley and Main Street.”

Newsom just told us which of these he values more."

Wednesday, October 16, 2024

What's Next in AI: How do we regulate AI, and protect against worst outcomes?; Pittsburgh Post-Gazette, October 13, 2024

EVAN ROBINSON-JOHNSON, Pittsburgh Post-Gazette; What's Next in AI: How do we regulate AI, and protect against worst outcomes?

"Gov. Josh Shapiro will give more of an update on that project and others at a Monday event in Pittsburgh.

While most folks will likely ask him how Pennsylvania can build and use the tools of the future, a growing cadre in Pittsburgh is asking a broader policy question about how to protect against AI’s worst tendencies...

There are no federal laws that regulate the development and use of AI. Even at the state level, policies are sparse. California Gov. Gavin Newsom vetoed a major AI safety bill last month that would have forced greater commitments from the nation’s top AI developers, most of which are based in the Golden State...

Google CEO Sundar Pichai made a similar argument during a visit to Pittsburgh last month. He encouraged students from local high schools to build AI systems that will make the world a better place, then told a packed audience at Carnegie Mellon University that AI is “too important a technology not to regulate.”

Mr. Pichai said he’s hoping for an “innovation-oriented approach” that mostly leverages existing regulations rather than reinventing the wheel."

NASCAR aware of allegations a team engineer stole intellectual property to give to rival team; AP, October 14, 2024

JENNA FRYER, AP; NASCAR aware of allegations a team engineer stole intellectual property to give to rival team

"NASCAR has acknowledged it is aware of allegations that an engineer for a Cup Series team accessed proprietary information and shared it with another team...

Until a lawsuit is filed or a complaint is lodged with NASCAR, there is nothing the series can do, raising concerns that employees will be able to hand over intellectual property to rivals without ramifications."

His daughter was murdered. Then she reappeared as an AI chatbot.; The Washington Post, October 15, 2024

The Washington Post; His daughter was murdered. Then she reappeared as an AI chatbot.

"Jennifer’s name and image had been used to create a chatbot on Character.AI, a website that allows users to converse with digital personalities made using generative artificial intelligence. Several people had interacted with the digital Jennifer, which was created by a user on Character’s website, according to a screenshot of her chatbot’s now-deleted profile.

Crecente, who has spent the years since his daughter’s death running a nonprofit organization in her name to prevent teen dating violence, said he was appalled that Character had allowed a user to create a facsimile of a murdered high-schooler without her family’s permission. Experts said the incident raises concerns about the AI industry’s ability — or willingness — to shield users from the potential harms of a service that can deal in troves of sensitive personal information...

The company’s terms of service prevent users from impersonating any person or entity...

AI chatbots can engage in conversation and be programmed to adopt the personalities and biographical details of specific characters, real or imagined. They have found a growing audience online as AI companies market the digital companions as friends, mentors and romantic partners...

Rick Claypool, who researched AI chatbots for the nonprofit consumer advocacy organization Public Citizen, said while laws governing online content at large could apply to AI companies, they have largely been left to regulate themselves. Crecente isn’t the first grieving parent to have their child’s information manipulated by AI: Content creators on TikTok have used AI to imitate the voices and likenesses of missing children and produce videos of them narrating their deaths, to outrage from the children’s families, The Post reported last year.

“We desperately need for lawmakers and regulators to be paying attention to the real impacts these technologies are having on their constituents,” Claypool said. “They can’t just be listening to tech CEOs about what the policies should be … they have to pay attention to the families and individuals who have been harmed.”"

Tuesday, October 15, 2024

AI Ethics Council Welcomes LinkedIn Co-Founder Reid Hoffman and Commentator, Founder and Author Van Jones as Newest Members; Business Wire, October 15, 2024

Business Wire; AI Ethics Council Welcomes LinkedIn Co-Founder Reid Hoffman and Commentator, Founder and Author Van Jones as Newest Members

"The AI Ethics Council, founded by OpenAI CEO Sam Altman and Operation HOPE CEO John Hope Bryant, announced today that Reid Hoffman (Co-Founder of LinkedIn and Inflection AI and Partner at Greylock) and Van Jones (CNN commentator, Dream Machine Founder and New York Times best-selling author) have joined as a members. Formed in December 2023, the Council brings together an interdisciplinary body of diverse experts including civil rights activists, HBCU presidents, technology and business leaders, clergy, government officials and ethicists to collaborate and set guidelines on ways to ensure that traditionally underrepresented communities have a voice in the evolution of artificial intelligence and to help frame the human and ethical considerations around the technology. Ultimately, the Council also seeks to help determine how AI can be harnessed to create vast economic opportunities, especially for the underserved.

Mr. Hoffman and Mr. Jones join an esteemed group on the Council, which will serve as a leading authority in identifying, advising on and addressing ethical issues related to AI. In addition to Mr. Altman and Mr. Bryant, founding AI Ethics Council members include: