Showing posts with label ethics.

Monday, September 30, 2024

USU's College of Humanities & Social Sciences Hosts Conference on Ethics of AI; Utah State University (USU), September 23, 2024

 Utah State University (USU); USU's College of Humanities & Social Sciences Hosts Conference on Ethics of AI

"AI’s emergence from the obscure to the unavoidable has come with many questions and concerns — some of which deal with how we can and should use it ethically.


To help answer some of these questions, the USU Communication Studies and Philosophy Department and the Center for Anticipatory Intelligence hosted a conference.

They brought in scholars from a variety of disciplines to discuss these issues — with these experts coming from the University of Cambridge, New York University and Northeastern University, among others."

Wednesday, July 10, 2024

Considering the Ethics of AI Assistants; Tech Policy Press, July 7, 2024

Justin Hendrix, Tech Policy Press; Considering the Ethics of AI Assistants

"Just a couple of weeks before Pichai took the stage, in April, Google DeepMind published a paper that boasts 57 authors, including experts from a range of disciplines from different parts of Google, including DeepMind, Jigsaw, and Google Research, as well as researchers from academic institutions such as Oxford, University College London, Delft University of Technology, University of Edinburgh, and a think tank at Georgetown, the Center for Security and Emerging Technology. The paper speculates about the ethical and societal risks posed by the types of AI assistants Google and other tech firms want to build, which the authors say are “likely to have a profound impact on our individual and collective lives.”"

Friday, June 14, 2024

Pope Francis is first pontiff to address G7 leaders with AI speech; Axios, June 14, 2024

"Pope Francis made history Friday as the first pontiff to speak at the Group of Seven meeting in Fasano, Italy, where he discussed his concerns with artificial intelligence.

Why it matters: The pope has long urged caution around AI, calling it "a fascinating tool and also a terrifying one," during his remarks Friday even as he acknowledged its potential applications in medicine, labor, culture, communications, education and politics. 

  • "The holy scriptures say that God gave to human beings his spirit in order for them to have wisdom, intelligence and knowledge in all kinds of tasks," he said. "Science and technology are therefore extraordinary products of the potential which is active in us human beings.""

Saturday, April 6, 2024

Where AI and property law intersect; Arizona State University (ASU) News, April 5, 2024

 Dolores Tropiano, Arizona State University (ASU) News; Where AI and property law intersect

"Artificial intelligence is a powerful tool that has the potential to be used to revolutionize education, creativity, everyday life and more.

But as society begins to harness this technology and its many uses — especially in the field of generative AI — there are growing ethical and copyright concerns for both the creative industry and legal sector.

Tyson Winarski is a professor of practice with the Intellectual Property Law program in Arizona State University’s Sandra Day O’Connor College of Law. He teaches an AI and intellectual property module within the course Artificial Intelligence: Law, Ethics and Policy, taught by ASU Law Professor Gary Marchant.

“The course is extremely important for attorneys and law students,” Winarski said. “Generative AI is presenting huge issues in the area of intellectual property rights and copyrights, and we do not have definitive answers as Congress and the courts have not spoken on the issue yet.”"

Friday, February 2, 2024

European Publishers Praise New EU AI Law; Publishers Weekly, February 2, 2024

 Ed Nawotka, Publishers Weekly; European Publishers Praise New EU AI Law

"The Federation of European Publishers (FEP) was quick to praise the passage of new legislation by the European Union that, among its provisions, requires "general purpose AI companies" to respect copyright law and have policies in place to this effect.

FEP officials called the EU Artificial Intelligence (AI) Act, which passed on February 2, the "world’s first concrete regulation of AI," and said that the legislation seeks to "ensure the ethical and human-centric development of this technology and prevent abusive or illegal practices," adding that the law also demands transparency about what data is being used in training the models."

Thursday, February 1, 2024

The economy and ethics of AI training data; Marketplace.org, January 31, 2024

Matt Levin, Marketplace.org; The economy and ethics of AI training data

"Maybe the only industry hotter than artificial intelligence right now? AI litigation. 

Just a sampling: Writer Michael Chabon is suing Meta. Getty Images is suing Stability AI. And both The New York Times and The Authors Guild have filed separate lawsuits against OpenAI and Microsoft. 

At the heart of these cases is the allegation that tech companies illegally used copyrighted works as part of their AI training data. 

For text-focused generative AI, there’s a good chance that some of that training data originated from one massive archive: Common Crawl.

“Common Crawl is the copy of the internet. It’s a 17-year archive of the internet. We make this freely available to researchers, academics and companies,” said Rich Skrenta, who heads the nonprofit Common Crawl Foundation."

Tuesday, January 30, 2024

Florida’s New Advisory Ethics Opinion on Generative AI Hits the Mark; JDSupra, January 29, 2024

Ralph Artigliere, JDSupra; Florida’s New Advisory Ethics Opinion on Generative AI Hits the Mark

"As a former Florida trial lawyer and judge who appreciates emerging technology, I admit that I had more than a little concern when The Florida Bar announced it was working on a new ethics opinion on generative AI. Generative AI promises to provide monumental advantages to lawyers in their workflow, quality of work product, productivity, time management, and more. For clients, use of generative AI by their lawyers can mean better legal services delivered faster and with greater economy. In the area of eDiscovery, generative AI promises to surpass technology-assisted review in helping manage the increasingly massive amounts of data.

Generative AI is new to the greater world, and certainly to busy lawyers who are not reading every blogpost on AI. The internet and journals are afire over concerns of hallucinations, confidentiality, bias, and the like. I felt a new ethics opinion might throw a wet blanket on generative AI and discourage Florida lawyers from investigating the new technology.

Thankfully, my concerns did not become reality. The Florida Bar took a thorough look at the technology and the existing ethical guidance and law and applied existing guidelines and rules in a thorough and balanced fashion. This article briefly summarizes Opinion 24-1 and highlights some of its important features.

The Opinion

On January 19, 2024, The Florida Bar released Ethics Opinion 24-1 (“Opinion 24-1”) regarding the use of generative artificial intelligence (“AI”) in the practice of law. The Florida Bar and the State Bar of California are leaders in issuing ethical guidance on this issue. Opinion 24-1 draws from a solid background of ethics opinions and guidance in Florida and around the country and provides positive as well as cautionary statements regarding the emerging technologies. Overall, the guidance is well-placed and helpful for lawyers at a time when so many are weighing the use of generative AI technology in their law practices."

Lawyers weigh strength of copyright suit filed against BigLaw firm; Rhode Island Lawyers Weekly, January 29, 2024

Pat Murphy, Rhode Island Lawyers Weekly; Lawyers weigh strength of copyright suit filed against BigLaw firm

"Jerry Cohen, a Boston attorney who teaches IP law at Roger Williams University School of Law, called the suit “not so much a copyright case as it is a matter of professional responsibility and respect.”"

Friday, January 26, 2024

A Stranger Bought a Set of Highly Personal Letters. Can I Call Him Out?; The Ethicist, The New York Times Magazine, January 25, 2024

Kwame Anthony Appiah, The Ethicist, The New York Times Magazine; A Stranger Bought a Set of Highly Personal Letters. Can I Call Him Out?

"From the Ethicist:

It was thoughtless, I agree, to sell off a cache of letters that included some that were intimate and came from living people. The thought of strangers’ digging through letters written in the spirit of love and friendship can be upsetting. That the person who has acquired these letters has failed to grasp this suggests a certain lack of empathy. But it doesn’t establish that he lacks a moral sense, because you don’t really have any idea what he plans to do with this material. 

And there are constraints on this. When you acquire letters, you don’t thereby acquire the copyright in those letters, and copyright protection typically lasts until 70 years after the author’s death. So he has to deal with the murky issue of what counts as the “fair use” of such intellectual property. There are also a few privacy torts that individuals can try to pursue in the courts (e.g., intrusion upon seclusion; public disclosure of private facts). Even though he isn’t a party to a covenant of confidentiality, as someone in A.A. is, it remains true that, as you imply, exposing details of the intimate lives of private people is generally wrong."

Thursday, December 28, 2023

AI starts a music-making revolution and plenty of noise about ethics and royalties; The Washington Times, December 26, 2023

Tom Howell Jr., The Washington Times; AI starts a music-making revolution and plenty of noise about ethics and royalties

"“Music’s important. AI is changing that relationship. We need to navigate that carefully,” said Martin Clancy, an Ireland-based expert who has worked on chart-topping songs and is the founding chairman of the IEEE Global AI Ethics Arts Committee...

The Biden administration, the European Union and other governments are rushing to catch up with AI and harness its benefits while controlling its potentially adverse societal impacts. They are also wading through copyright and other matters of law.

Even if they devise legislation now, the rules likely will not go into effect for years. The EU recently enacted a sweeping AI law, but it won’t take effect until 2025.

“That’s forever in this space, which means that all we’re left with is our ethical decision-making,” Mr. Clancy said.

For now, the AI-generated music landscape is like the Wild West. Many AI-generated songs are hokey or just not very good."

Saturday, July 22, 2023

How a Drug Maker Profited by Slow-Walking a Promising H.I.V. Therapy; The New York Times, July 22, 2023

Rebecca Robbins, The New York Times; How a Drug Maker Profited by Slow-Walking a Promising H.I.V. Therapy

"Gilead, one of the world’s largest drugmakers, appeared to be embracing a well-worn industry tactic: gaming the U.S. patent system to protect lucrative monopolies on best-selling drugs...

Gilead ended up introducing a version of the new treatment in 2015, nearly a decade after it might have become available if the company had not paused development in 2004. Its patents now extend until at least 2031.

The delayed release of the new treatment is now the subject of state and federal lawsuits in which some 26,000 patients who took Gilead’s older H.I.V. drugs claim that the company unnecessarily exposed them to kidney and bone problems."

Wednesday, July 12, 2023

Three things to know about how the US Congress might regulate AI; MIT Technology Review, July 3, 2023

Tate Ryan-Mosley, MIT Technology Review; Three things to know about how the US Congress might regulate AI

"Here are three key themes in all this chatter that you should know to help you understand where US AI legislation could be going."

Saturday, July 1, 2023

AMP v. Myriad: The Fight to Take Back Our Genes; ACLU, June 13, 2023

Lora Strum, ACLU; AMP v. Myriad: The Fight to Take Back Our Genes

"Ten years after the Supreme Court invalidated the patents on two human genes in AMP v. Myriad, we revisit the landmark case amid renewed calls for gene patenting."

Thursday, May 4, 2023

OpenAI's ChatGPT may face a copyright quagmire after 'memorizing' these books; The Register, May 3, 2023

Thomas Claburn, The Register; OpenAI's ChatGPT may face a copyright quagmire after 'memorizing' these books

"Tyler Ochoa, a professor in the Law department at Santa Clara University in California, told The Register he fully expects to see lawsuits against the makers of large language models that generate text, including OpenAI, Google, and others.

Ochoa said the copyright issues with AI text generation are exactly the same as the issues with AI image generation. First: is copying large amounts of text or images for training the model fair use? The answer to that, he said, is probably yes.

Second: if the model generates output that's too similar to the input – what the paper refers to as "memorization" – is that copyright infringement? The answer to that, he said, is almost certainly yes.

And third: if the output of an AI text generator is not a copy of an existing text, is it protected by copyright?

Under current law, said Ochoa, the answer is no – because US copyright law requires human creativity, though some countries will disagree and will protect AI-generated works. However, he added, activities like selecting, arranging, and modifying AI model output make copyright protection more plausible."

Monday, May 1, 2023

Generative AI: Ethical, Legal, and Technical Questions; Markkula Center for Applied Ethics, Santa Clara University, Tuesday, May 16, 2023, 12 Noon Pacific/3 PM Eastern

 

Join us May 16th at noon for an online panel discussion on ethical, legal, and technical questions related to generative AI.

Generative AI: Ethical, Legal, and Technical Questions

Noon to 1:00 p.m. Pacific
Tuesday, May 16, 2023

"As artists, composers, and other “content creators” and intellectual property owners use generative AI tools or decry their development, many legal and ethical issues arise. In this panel discussion, a copyright law expert, an AI researcher who is also a composer and music performer, and a multi-disciplinary visual artist (all of whom teach at Santa Clara University) will address some of those questions–from training data collection to fair use, impact on creativity and creative labor, the balancing of various rights, and our ability to assess and respond to fast-moving technologies."

Register to Attend the Webinar

Saturday, April 29, 2023

Editors quit top neuroscience journal to protest against open-access charges; Nature, April 21, 2023

Katharine Sanderson, Nature; Editors quit top neuroscience journal to protest against open-access charges

"More than 40 editors have resigned from two leading neuroscience journals in protest against what the editors say are excessively high article-processing charges (APCs) set by the publisher. They say that the fees, which publishers use to cover publishing services and in some cases make money, are unethical. The publisher, Dutch company Elsevier, says that its fees provide researchers with publishing services that are above average quality for below average price. The editors plan to start a new journal hosted by the non-profit publisher MIT Press.

The decision to resign came about after many discussions among the editors, says Stephen Smith, a neuroscientist at the University of Oxford, UK, and editor-in-chief of one of the journals, NeuroImage. “Everyone agreed that the APC was unethical and unsustainable,” says Smith, who will lead the editorial team of the new journal, Imaging Neuroscience, when it launches.

The 42 academics who made up the editorial teams at NeuroImage and its companion journal NeuroImage: Reports announced their resignations on 17 April. The journals are open access and require authors to pay a fee for publishing services. The APC for NeuroImage is US$3,450; NeuroImage: Reports charges $900, which will double to $1,800 from 31 May. Elsevier, based in Amsterdam, says that the APCs cover the costs associated with publishing an article in an open-access journal, including editorial and peer-review services, copyediting, typesetting, archiving, indexing, marketing and administrative costs. Andrew Davis, Elsevier’s vice-president of corporate communications, says that NeuroImage’s fee is less than that of the nearest comparable journal in its field, and that the publisher’s APCs are “set in line with our policy [of] providing above average quality for below average price”."

Thursday, April 13, 2023

To Ingrain AI Ethics, We Should Get Creative About Copyrights; Undark Magazine, April 13, 2023

Cason Schmit and Jennifer Wagner, Undark Magazine; To Ingrain AI Ethics, We Should Get Creative About Copyrights

"What’s clear, however, is that the risk of doing nothing is tremendous. AI is rapidly evolving and disrupting existing systems and structures in unpredictable ways. We need disruptive innovation in AI policy perhaps even more than we need disruption in the technology itself — and AI creators and users must be willing participants in this endeavor. Efforts to grapple with the ethical, legal, social, and policy issues around AI must be viewed not as a luxury but as a necessity, and as an integral part of AI design. Otherwise, we run the risk of letting industry set the terms of AI’s future, and we leave individuals, groups, and even our very democracy vulnerable to its whims."

Friday, January 13, 2023

Advances in artificial intelligence raise new ethics concerns; PBS News Hour, January 10, 2023

PBS News Hour; Advances in artificial intelligence raise new ethics concerns

"In recent months, new artificial intelligence tools have garnered attention, and concern, over their ability to produce original work. The creations range from college-level essays to computer code and works of art. As Stephanie Sy reports, this technology could change how we live and work in profound ways."

Tuesday, December 20, 2022

Some of Trump’s New NFTs Look Like Photoshops of Google Search Results; PetaPixel, December 16, 2022

Jaron Schneider, PetaPixel; Some of Trump’s New NFTs Look Like Photoshops of Google Search Results

"After hyping a major announcement, Donald Trump revealed his next major project: NFTs. But reverse image searches of some of the “digital trading cards” revealed them to be edits of clothing easily found in Google search, raising copyright questions...

While these images aren’t what most would consider to be the height of photographic art, they are still photos that are presumably owned by a manufacturer and using images — even e-commerce photos — without permission in this manner brings up copyright questions: it may not be legal, not to mention unethical, to just take photos off web stores, turn them into “art,” and then sell them for $99 each.

Gizmodo says it reached out to the manufacturer of both pieces of clothing to ask if either granted the former U.S. President permission to use their images, but neither immediately responded." 

Sunday, November 20, 2022

James Vincent, The Verge; The scary truth about AI copyright is nobody knows what will happen next

"...is any of this actually legal?

The question arises because of the way generative AI systems are trained. Like most machine learning software, they work by identifying and replicating patterns in data. But because these programs are used to generate code, text, music, and art, that data is itself created by humans, scraped from the web and copyright protected in one way or another.

For AI researchers in the far-flung misty past (aka the 2010s), this wasn’t much of an issue. At the time, state-of-the-art models were only capable of generating blurry, fingernail-sized black-and-white images of faces. This wasn’t an obvious threat to humans. But in the year 2022, when a lone amateur can use software like Stable Diffusion to copy an artist’s style in a matter of hours or when companies are selling AI-generated prints and social media filters that are explicit knock-offs of living designers, questions of legality and ethics have become much more pressing."