Showing posts with label transparency.

Monday, February 9, 2026

The New Fabio Is Claude; The New York Times, February 8, 2026

The New York Times; The New Fabio Is Claude

The romance industry, always at the vanguard of technological change, is rapidly adapting to A.I. Not everyone is on board.

"A longtime romance novelist who has been published by Harlequin and Mills & Boon, Ms. Hart was always a fast writer. Working on her own, she released 10 to 12 books a year under five pen names, on top of ghostwriting. But with the help of A.I., Ms. Hart can publish books at an astonishing rate. Last year, she produced more than 200 romance novels in a range of subgenres, from dark mafia romances to sweet teen stories, and self-published them on Amazon. None were huge blockbusters, but collectively, they sold around 50,000 copies, earning Ms. Hart six figures...

Ms. Hart has become an A.I. evangelist. Through her author-coaching business, Plot Prose, she’s taught more than 1,600 people how to produce a novel with artificial intelligence, she said. She’s rolling out her proprietary A.I. writing program, which can generate a book based on an outline in less than an hour, and costs between $80 and $250 a month.

But when it comes to her current pen names, Ms. Hart doesn’t disclose her use of A.I., because there’s still a strong stigma around the technology, she said. Coral Hart is one of her early, now retired pseudonyms, and it’s the name she uses to teach A.I.-assisted writing; she requested anonymity because she still uses her real name for some publishing and coaching projects. She fears that revealing her A.I. use would damage her business for that work.

But she predicts attitudes will soon change, and is adding three new pen names that will be openly A.I.-assisted, she said.

The way Ms. Hart sees it, romance writers must either embrace artificial intelligence, or get left behind...

The writer Elizabeth Ann West, one of Future Fiction’s founders, who came up with the plot of “Bridesmaids and Bourbon,” believes the audience would be bigger if the books weren’t labeled as A.I. The novels, which are available on Amazon, come with a disclaimer on their product page: “This story was produced using author‑directed AI tools.”

“If you hide that there’s A.I., it sells just fine,” she said."


Saturday, February 7, 2026

NBC appears to cut crowd’s booing of JD Vance from Winter Olympics broadcast; The Guardian, February 6, 2026

The Guardian; NBC appears to cut crowd’s booing of JD Vance from Winter Olympics broadcast


[Kip Currier: NBC's decision to edit out the booing of JD Vance during the Winter Olympics' Opening Ceremony is not surprising, given prior instances of U.S. media editing out similar moments, as noted in this Guardian article. But it is nevertheless troubling. NBC is distorting and altering what actually happened, without informing viewers and listeners of its editorial decision-making.

The Opening Ceremony isn't a fictional movie: it's an historical, newsworthy event. As such, alterations to the historical record should not have been made.

Additionally, if a news organization like NBC decides to make changes to news reporting, such as removing or suppressing sound for non-technical reasons, it should be transparent about having done so and explain the reasons for such alterations. Trust in news organizations is vital. Sanitizing and altering news reporting diminishes public trust in the accuracy and integrity of news sources and disseminators.

NBCU Academy's website provides information on ethics in journalism. Its first principle, "Seek the truth and be truthful in your reporting," is relevant to the editorial decision to edit out the booing of JD Vance:


What are journalism ethics?

Ethics are the guiding values, standards and responsibilities of journalism. At NBCU News Group, the following principles act as the foundation of ethical journalism:

Seek the truth and be truthful in your reporting. Your reporting should be accurate and fair. Ensure that the facts you gathered are verified, sources are attributed and context is provided. Journalists should be bold in seeking and presenting truths to the public, serving as watchdogs over public officials and holding the powerful accountable.

https://nbcuacademy.com/journalism-ethics/

The Society of Professional Journalists (SPJ) also maintains a Code of Ethics. One of its four guiding principles addresses transparency and accountability:

BE ACCOUNTABLE AND TRANSPARENT

Ethical journalism means taking responsibility for one's work and explaining one’s decisions to the public.

Journalists should:

Explain ethical choices and processes to audiences. Encourage a civil dialogue with the public about journalistic practices, coverage and news content.

Respond quickly to questions about accuracy, clarity and fairness.

Acknowledge mistakes and correct them promptly and prominently.

Explain corrections and clarifications carefully and clearly.

Expose unethical conduct in journalism, including within their organizations.

Abide by the same high standards they expect of others.

https://www.spj.org/pdf/spj-code-of-ethics.pdf]


[Excerpt]

"The US vice-president, JD Vance, was greeted by a chorus of boos when he appeared at the opening ceremony of the Winter Olympics in Milan on Friday, although American viewers watching NBC’s coverage would have been unaware of the reception.

As speedskater Erin Jackson led Team USA into the San Siro stadium she was greeted by cheers. But when the TV cameras cut to Vance and his wife, Usha, there were boos, jeers and a smattering of applause from the crowd. The reaction was shown on Canadian broadcaster CBC’s feed, with one commentator saying: “There is the vice-president JD Vance and his wife Usha – oops, those are not … uh … those are a lot of boos for him. Whistling, jeering, some applause.”

The Guardian’s Sean Ingle was also at the ceremony and noted the boos, as did USA Today’s Christine Brennan. However, on the NBC broadcast the boos were not heard or remarked upon when Vance appeared on screen, with the commentary team simply saying “JD Vance”. That didn’t stop footage of the boos being circulated and shared on social media in the US. The White House posted a clip of Vance applauding on NBC’s broadcast without any boos.

Friday was not the first time there have been moves to stop US viewers from witnessing dissent against the Trump administration. At September’s US Open, tournament organizers asked broadcasters not to show the crowd’s reaction to Donald Trump, who attended the men’s final. Part of the message read: “We ask all broadcasters to refrain from showing any disruptions or reactions in response to the president’s attendance in any capacity.”

Earlier on Friday in Milan, hundreds of people protested against the presence of US Immigration and Customs Enforcement (ICE) agents at this year’s Olympics. The US state department has said that several federal agencies, including ICE, will be at the Games to help protect visiting Americans. The state department said the ICE unit in Italy is separate from those involved in the immigration crackdown in the United States."

Monday, February 2, 2026

How the Supreme Court Secretly Made Itself Even More Secretive; The New York Times, February 2, 2026

The New York Times; How the Supreme Court Secretly Made Itself Even More Secretive

Amid calls to increase transparency and revelations about the court’s inner workings, the chief justice imposed nondisclosure agreements on clerks and employees.

"n November of 2024, two weeks after voters returned President Donald Trump to office, Chief Justice John G. Roberts Jr. summoned employees of the U.S. Supreme Court for an unusual announcement. Facing them in a grand conference room beneath ornate chandeliers, he requested they each sign a nondisclosure agreement promising to keep the court’s inner workings secret.

The chief justice acted after a series of unusual leaks of internal court documents, most notably of the decision overturning the right to abortion, and news reports about ethical lapses by the justices. Trust in the institution was languishing at a historic low. Debate was intensifying over whether the black box institution should be more transparent.

Instead, the chief justice tightened the court’s hold on information. Its employees have long been expected to stay silent about what they witness behind the scenes. But starting that autumn, in a move that has not been previously reported, the chief justice converted what was once a norm into a formal contract, according to five people familiar with the shift."

Tuesday, January 13, 2026

Türkiye issues ethics framework to regulate AI use in schools; Daily Sabah, January 11, 2026

 Daily Sabah; Türkiye issues ethics framework to regulate AI use in schools

"The Ministry of National Education has issued a comprehensive set of ethical guidelines to regulate the use of artificial intelligence in schools, introducing mandatory online ethical declarations and a centralized reporting system aimed at ensuring transparency, accountability and student safety.

The Ethical Guidelines for Artificial Intelligence Applications in Education set out the rules for how AI technologies may be developed, implemented, monitored and evaluated across public education institutions. The guidelines were prepared under the ministry’s Artificial Intelligence Policy Document and Action Plan for 2025-2029, which came into effect on June 17, 2025."

Tuesday, December 30, 2025

The IP Legislation That Shaped 2025 and Prospects for the New Year; IP Watchdog, December 29, 2025

Barry Schindler, IP Watchdog; The IP Legislation That Shaped 2025 and Prospects for the New Year

"As 2025 draws to a close, the intellectual property ecosystem faces a wave of transformative changes driven by artificial intelligence (AI) and evolving legislative priorities. From sweeping federal proposals aimed at harmonizing AI governance and overriding state laws, to new copyright and media integrity measures designed to address deepfakes and transparency, and finally to renewed momentum behind patent eligibility and Patent Trial and Appeal Board (PTAB) reform, these developments signal a pivotal moment for innovators, rights holders, and policymakers alike. This article explores three critical fronts shaping the future of IP: federal AI legislation and executive preemption, copyright accountability and media integrity, and the year-end outlook for patent reform—each redefining the balance between innovation, protection, and compliance."

A code of ethics for AI in education; The Times of Israel, December 29, 2025

 Raz Frohlich, The Times of Israel; A code of ethics for AI in education

"Generative artificial intelligence is transforming every corner of our lives — how we communicate, create, work, and, inevitably, how we teach and learn. As educators, we cannot ignore its power, nor can we embrace it blindly. The rapid pace of AI innovation requires not only technical adaptation, but also deep ethical reflection.

As the largest education provider in Israel, at Israel Sci-Tech Schools (ISTS), we believe that, as AI becomes increasingly present in classrooms, we must ensure that human judgment, accountability, and responsibility remain at the center of education. That is why we are the first in Israel to create a Code of Ethics for Artificial Intelligence in Education. This is not just a policy document but an open invitation for discussion, learning, and shared responsibility across the education system.

This ethical code is not a technical manual, and it does not provide instant answers for daily classroom situations. Instead, it offers a holistic approach — a way of thinking, a framework for educators, students, and policymakers to use AI consciously and responsibly. It asks essential, core-value questions: How do we balance innovation with privacy? How do we ensure equality when access to technology is uneven? How do we maintain transparency when using AI? And when should we pause, reflect, and reconsider how we use AI in the classroom?

To develop the code, we drew from extensive global research and local experience. We consulted with ethicists, educators, technologists, psychologists, and legal experts — and, perhaps most importantly, we listened to students, teachers, and parents. Through roundtable discussions, they shared real concerns and insights about AI’s potential and its pitfalls. Those conversations shaped the code’s seven guiding principles, designed to help schools integrate AI ethically, transparently, and with respect for human dignity."

Thursday, December 11, 2025

Banning AI Regulation Would Be a Disaster; The Atlantic, December 11, 2025

 Chuck Hagel, The Atlantic; Banning AI Regulation Would Be a Disaster

"On Monday, Donald Trump announced on Truth Social that he would soon sign an executive order prohibiting states from regulating AI...

The greatest challenges facing the United States do not come from overregulation but from deploying ever more powerful AI systems without minimum requirements for safety and transparency...

Contrary to the narrative promoted by a small number of dominant firms, regulation does not have to slow innovation. Clear rules would foster growth by hardening systems against attack, reducing misuse, and ensuring that the models integrated into defense systems and public-facing platforms are robust and secure before deployment at scale.

Critics of oversight are correct that a patchwork of poorly designed laws can impede that mission. But they miss two essential points. First, competitive AI policy cannot be cordoned off from the broader systems that shape U.S. stability and resilience...

Second, states remain the country’s most effective laboratories for developing and refining policy on complex, fast-moving technologies, especially in the persistent vacuum of federal action...

The solution to AI’s risks is not to dismantle oversight but to design the right oversight. American leadership in artificial intelligence will not be secured by weakening the few guardrails that exist. It will be secured the same way we have protected every crucial technology touching the safety, stability, and credibility of the nation: with serious rules built to withstand real adversaries operating in the real world. The United States should not be lobbied out of protecting its own future."

Thursday, December 4, 2025

New York Times Sues Pentagon Over First Amendment Rights; The New York Times, December 4, 2025

The New York Times; New York Times Sues Pentagon Over First Amendment Rights

"The New York Times accused the Pentagon in a lawsuit on Thursday of infringing on the constitutional rights of journalists by imposing a set of new restrictions on reporting about the military.

In the suit, filed in the U.S. District Court in Washington, The Times argued that the Defense Department’s new policy violated the First Amendment and “seeks to restrict journalists’ ability to do what journalists have always done — ask questions of government employees and gather information to report stories that take the public beyond official pronouncements.”

The rules, which went into effect in October, are a stark departure from the previous ones, in both length and scope. They require reporters to sign a 21-page form that sets restrictions on journalistic activities, including requests for story tips and inquiries to Pentagon sources. Reporters who don’t comply could lose their press passes, and the Pentagon has accorded itself “unbridled discretion” to enforce the policy as it sees fit, according to the lawsuit."

Friday, November 7, 2025

To Preserve Records, Homeland Security Now Relies on Officials to Take Screenshots; The New York Times, November 6, 2025

The New York Times; To Preserve Records, Homeland Security Now Relies on Officials to Take Screenshots


[Kip Currier: This new discretionary DHS records policy is counter to sound ethics practices and democracy-centered values.

Preservation of records promotes transparency, the historical record, accountability, access to information, informed citizenries, the right to petition one's government, free and independent presses, and more. The new DHS records policy undermines all of the above.]



[Excerpt]

"The Department of Homeland Security has stopped using software that automatically captured text messages and saved trails of communication between officials, according to sworn court statements filed this week.

Instead, the agency began in April to require officials to manually take screenshots of their messages to comply with federal records laws, citing cybersecurity concerns with the autosave software.

Public records experts say the new record-keeping policy opens ample room for both willful and unwitting noncompliance with federal open records laws in an administration that has already shown a lack of interest in, or willingness to skirt, records laws. That development could be particularly troubling as the department executes President Trump’s aggressive agenda of mass deportations, a campaign that has included numerous accusations of misconduct by law enforcement officials, the experts said.

“If you are an immigration official or an agent and believe that the public might later criticize you, or that your records could help you be held accountable, would you go out of the way to preserve those records that might expose wrongdoing?” said Lauren Harper, who advocates government transparency at the Freedom of the Press Foundation."

Thursday, September 18, 2025

AI could never replace my authors. But, without regulation, it will ruin publishing as we know it; The Guardian, September 18, 2025

The Guardian; AI could never replace my authors. But, without regulation, it will ruin publishing as we know it


[Kip Currier: This is a thought-provoking piece by literary agent Jonny Geller. He suggests an "artists’ rights charter for AI that protects two basic principles: permission and attribution". His charter idea conveys some aspects of the copyright area called "moral rights".

Moral rights provide copyright creators with a right of paternity (i.e., attribution) and a right of integrity. The latter can enable creators to exercise some level of control over how their copyrighted works are adapted. The moral right of integrity, for example, was an argument in cases involving whether black and white films (legally) could be or (ethically) should be colorized. (See Colors in Conflicts: Moral Rights and the Foreign Exploitation of Colorized U.S. Motion Pictures.) Moral rights are not widespread in U.S. copyright law because of tensions between the moral right of integrity and the right of free expression/free speech under the U.S. Constitution (whose September 17, 1787 birthday was yesterday). The Visual Artists Rights Act (1990) is a narrow example of moral rights under U.S. copyright law.

To Geller's proposed Artists' Rights Charter for AI, I'd suggest adding the word and concept of "Responsibilities". Compelling arguments can be made for providing authors with some rights regarding use of their copyrighted works as AI training data. And, commensurately, persuasive arguments can be made that authors have certain responsibilities if they use AI at any stage of their creative processes. Authors can and ethically should be transparent about how they have used AI, if applicable, in the creation stages of their writing.

Of course, how to operationalize that as an ethical standard is another matter entirely. But the fact that it may be challenging to develop ethical language as guidance for authors, and to instill it as a broad standard, doesn't mean it shouldn't be attempted.]


[Excerpt]

"The single biggest threat to the livelihood of authors and, by extension, to our culture, is not short attention spans. It is AI...

As a literary agent and CEO of one of the largest agencies in Europe, I think this is something everyone should care about – not because we fear progress, but because we want to protect it. If you take away the one thing that makes us truly human – our ability to think like humans, create stories and imagine new worlds – we will live in a diminished world.

AI that doesn’t replace the artist, or that will work with them transparently, is not all bad. An actor who is needed for reshoots on a movie may authorise use of the footage they have to complete a picture. This will save on costs, the environmental impact and time. A writer may wish to speed up their research and enhance their work by training their own models to ask the questions that a researcher would. The translation models available may enhance the range of offering of foreign books, adding to our culture.

All of this is worth discussing. But it has to be a discussion and be transparent to the end user. Up to now, work has simply been stolen and there are insufficient guardrails on the distributors, studios, publishers. As a literary agent, I have a more prosaic reason to get involved – I don’t think it is fair for someone’s work to be taken without their permission to create an inferior competitor.

What can we do? We could start with some basic principles for all to sign up to. An artists’ rights charter for AI that protects two basic principles: permission and attribution."

Saturday, September 13, 2025

World Meeting on Human Fraternity: Disarming words to disarm the world; Vatican News, September 13, 2025

Roberto Paglialonga, Vatican News; World Meeting on Human Fraternity: Disarming words to disarm the world


[Kip Currier: There is great wisdom and guidance in these words from Pope Leo and Fr. Enzo Fortunato (highlighted from this Vatican News article for emphasis):

“Pope Leo XIV’s words echo: ‘Before being believers, we are called to be human.’” Therefore, Fr. Fortunato concluded, we must “safeguard truth, freedom, and dignity as common goods of humanity. That is the soul of our work—not the defense of corporations or interests.”

What is in the best interests of corporations and shareholders should not -- must not -- ever be this planet's central organizing principle.

To the contrary, that which is at the very center of our humanity -- truth, freedom, the well-being and dignity of each and every person, and prioritization of the best interests of all members of humanity -- MUST be our North Star and guiding light.]


[Excerpt]

"Representatives from the world of communication and information—directors and CEOs of international media networks— gathered in Rome for the “News G20” roundtable, coordinated by Father Enzo Fortunato, director of the magazine Piazza San Pietro. The event took place on Friday 12 September in the Sala della Protomoteca on Rome's Capitoline Hill. The participants addressed a multitude of themes, including transparency and freedom of information in times of war and conflict: the truth of facts as an essential element to “disarm words and disarm the world,” as Pope Leo XIV has said, so that storytelling and narrative may once again serve peace, dialogue, and fraternity. They also discussed the responsibility of those who work in media to promote the value of competence, in-depth reporting, and credibility in an age dominated by unchecked social media, algorithms, clickbait slogans, and rampant expressions of hatred and violence from online haters.

Three pillars of our time: truth, freedom, dignity


In opening the workshop, Father Fortunato outlined three “pillars” that can no longer be taken for granted in our time: truth, freedom, and dignity. Truth, he said, is “too often manipulated and exploited,” and freedom is “wounded,” as in many countries around the world “journalists are silenced, persecuted, or killed.” Yet “freedom of the press should be a guarantee for citizens and a safeguard for democracy.” Today, Fr. Fortunato continued, “we have many ‘dignitaries’ but little dignity”: people are targeted by “hate and defamation campaigns, often deliberately orchestrated behind a computer screen. Words can wound more than weapons—and not infrequently, those wounds lead to extreme acts.” Precisely in a historical period marked by division and conflict, humanity—despite its diverse peoples, cultures, and opinions—is called to rediscover what unites it. “Pope Leo XIV’s words echo: ‘Before being believers, we are called to be human.’” Therefore, Fr. Fortunato concluded, we must “safeguard truth, freedom, and dignity as common goods of humanity. That is the soul of our work—not the defense of corporations or interests.”"

Sunday, June 29, 2025

ACM FAccT ACM Conference on Fairness, Accountability, and Transparency; June 23-26, 2025, Athens, Greece

 

ACM FAccT

ACM Conference on Fairness, Accountability, and Transparency

A computer science conference with a cross-disciplinary focus that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.

"Algorithmic systems are being adopted in a growing number of contexts, fueled by big data. These systems filter, sort, score, recommend, personalize, and otherwise shape human experience, increasingly making or informing decisions with major impact on access to, e.g., credit, insurance, healthcare, parole, social security, and immigration. Although these systems may bring myriad benefits, they also contain inherent risks, such as codifying and entrenching biases; reducing accountability, and hindering due process; they also increase the information asymmetry between individuals whose data feed into these systems and big players capable of inferring potentially relevant information.

ACM FAccT is an interdisciplinary conference dedicated to bringing together a diverse community of scholars from computer science, law, social sciences, and humanities to investigate and tackle issues in this emerging area. Research challenges are not limited to technological solutions regarding potential bias, but include the question of whether decisions should be outsourced to data- and code-driven computing systems. We particularly seek to evaluate technical solutions with respect to existing problems, reflecting upon their benefits and risks; to address pivotal questions about economic incentive structures, perverse implications, distribution of power, and redistribution of welfare; and to ground research on fairness, accountability, and transparency in existing legal requirements." 

Saturday, June 28, 2025

Global South voices ‘marginalised in AI Ethics’; Gates Cambridge, June 27, 2025

Gates Cambridge; Global South voices ‘marginalised in AI Ethics’

"A Gates Cambridge Scholar is first author of a paper how AI Ethics is sidelining Global South voices, reinforcing marginalisation.

The study, Distributive Epistemic Injustice in AI Ethics: A Co-productionist Account of Global North-South Politics in Knowledge Production, was published by the Association for Computing Machinery and is based on a study of nearly 6,000 AI Ethics publications between 1960 and 2024. Its first author is Abdullah Hasan Safir [2024 – pictured above], who is doing a PhD in Interdisciplinary Design. Other co-authors include Gates Cambridge Scholars Ramit Debnath [2018] and Kerry McInerney [2017].

The findings were recently presented at the ACM’s FAccT conference, considered one of the top AI Ethics conferences in the world. They show that experts from the Global North currently legitimise their expertise in AI Ethics through dynamic citational and collaborative practices in knowledge production within the field, including co-citation and the institutionalisation of AI Ethics."

Saturday, June 14, 2025

Two men jailed for life for supplying car bomb that killed Daphne Caruana Galizia; The Guardian, June 10, 2025

The Guardian; Two men jailed for life for supplying car bomb that killed Daphne Caruana Galizia


[Kip Currier: It's encouraging to see that justice can occur, even in places and situations where corruption is deeply entangled and seemingly intractable. I vividly remember learning from The Guardian's reporting about the horrific car bomb murder of courageous investigative journalist Daphne Caruana Galizia in Malta in October 2017:

The journalist who led the Panama Papers investigation into corruption in Malta was killed on Monday in a car bomb near her home.

Daphne Caruana Galizia died on Monday afternoon when her car, a Peugeot 108, was destroyed by a powerful explosive device which blew the vehicle into several pieces and threw the debris into a nearby field.

A blogger whose posts often attracted more readers than the combined circulation of the country’s newspapers, Caruana Galizia was recently described by the Politico website as a “one-woman WikiLeaks”. Her blogs were a thorn in the side of both the establishment and underworld figures that hold sway in Europe’s smallest member state.

Her most recent revelations pointed the finger at Malta’s prime minister, Joseph Muscat, and two of his closest aides, connecting offshore companies linked to the three men with the sale of Maltese passports and payments from the government of Azerbaijan.

https://www.theguardian.com/world/2017/oct/16/malta-car-bomb-kills-panama-papers-journalist

As mentioned in the 2017 article, Galizia was reporting about corruption that involved the Maltese government at the time. Journalists like Galizia risk -- and all too often lose -- their lives to expose corruption and promote public awareness and accountability for wrongdoing.

These intrepid reporters also shed important light on the ways that the wealthy, powerful, and famous are frequently able to circumvent laws and ethical standards that apply to everyone else, as was revealed by the Panama Papers investigation.

Non-profit groups like Transparency International are committed to exposing corruption and promoting democracy and accountability:

We are Transparency International U.S. (TI US), part of the world’s largest coalition against corruption. We give voices to victims and witnesses of corruption, and work with governments, businesses, and citizens to stop the abuse of entrusted power.

In collaboration with national chapters in more than 100 countries, we are leading the fight to turn our vision of a world free from corruption into reality. Our U.S. office focuses on stemming the harms caused by illicit finance, strengthening political integrity, and promoting a positive U.S. role in global anti-corruption initiatives. Through a combination of research, advocacy, and policy, we engage with stakeholders to increase public understanding of corruption and hold institutions and individuals accountable.

https://us.transparency.org/who-we-are/

My forthcoming Bloomsbury book Ethics, Information, and Technology (January 2026) examines the corrosive impacts of corruption. It also explores organizations like Transparency International that report on and educate about corrupt practices, as well as efforts to root out public trust-damaging activities and positively influence and change organizational cultures where corruption exists.

Corruption is often intertwined, too, with other ethical issues critically assessed and considered in the book, such as conflicts of interest, censorship, research misconduct, misinformation and disinformation, counterfeit goods, and deficits of transparency, accountability, data integrity, freedom of expression, and free and independent presses.]


[Excerpt]

"Two men have been sentenced to life in prison for supplying the car bomb that killed the anti-corruption journalist Daphne Caruana Galizia in Malta eight years ago.

The sentencing on Tuesday of Robert Agius and Jamie Vella, reported to be members of the island’s criminal underworld, marked a significant step in the long campaign to bring those charged with Caruana Galizia’s murder to justice.

Her death in October 2017 sparked outrage across Europe and embroiled Malta’s governing party in accusations of a coverup, ultimately leading to the resignation of the then prime minister, Joseph Muscat.

Prosecutors have brought charges against seven people, including a millionaire businessman who is still awaiting trial."

Thursday, June 5, 2025

Government AI copyright plan suffers fourth House of Lords defeat; BBC, June 2, 2025

Zoe Kleinman, BBC; Government AI copyright plan suffers fourth House of Lords defeat

"The argument is over how best to balance the demands of two huge industries: the tech and creative sectors. 

More specifically, it's about the fairest way to allow AI developers access to creative content in order to make better AI tools - without undermining the livelihoods of the people who make that content in the first place.

What's sparked it is the Data (Use and Access) Bill.

This proposed legislation was broadly expected to finish its long journey through parliament this week and sail off into the law books. 

Instead, it is currently stuck in limbo, ping-ponging between the House of Lords and the House of Commons.

A government consultation proposes AI developers should have access to all content unless its individual owners choose to opt out. 

But 242 members of the House of Lords disagree with the bill in its current form.

They think AI firms should be forced to disclose which copyrighted material they use to train their tools, with a view to licensing it."

Thursday, May 1, 2025

Ministers to amend data bill amid artists’ concerns over AI and copyright; The Guardian, April 30, 2025

The Guardian; Ministers to amend data bill amid artists’ concerns over AI and copyright

"Ministers have drawn up concessions on copyright changes in an attempt to appease artists and creators before a crucial vote in parliament next week, the Guardian has learned.

The government will promise to carry out an economic impact assessment of its proposed copyright changes and to publish reports on issues including transparency, licensing and access to data for AI developers.

The concessions are designed to mollify concerns in parliament and in creative industries about the government’s proposed shake-up of copyright rules."

Wednesday, April 16, 2025

When was the last time AI made you laugh? Scenes from the 2025 Summit on AI, Ethics and Journalism; Poynter, April 11, 2025

Poynter; When was the last time AI made you laugh? Scenes from the 2025 Summit on AI, Ethics and Journalism

"This year’s Summit on AI, Ethics and Journalism, led by Poynter and The Associated Press, unfolded over two days in New York City’s financial district at the AP’s headquarters.

Here’s a brief summit recap through images:"

Saturday, January 25, 2025

Paul McCartney: Don't let AI rip off artists; BBC, January 25, 2025

Laura Kuenssberg, BBC; Paul McCartney: Don't let AI rip off artists

"Sir Paul McCartney has told the BBC proposed changes to copyright law could allow "rip off" technology that might make it impossible for musicians and artists to make a living.

The government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.

In a rare interview for Sunday with Laura Kuenssberg, Sir Paul said "when we were kids in Liverpool, we found a job that we loved, but it also paid the bills", warning the proposals could remove the incentive for writers and artists and result in a "loss of creativity". 

The government said it aimed to deliver legal certainty through a copyright regime that provided creators with "real control" and transparency."

Friday, October 4, 2024

Beyond the hype: Key components of an effective AI policy; CIO, October 2, 2024

  Leo Rajapakse, CIO; Beyond the hype: Key components of an effective AI policy

"An AI policy is a living document 

Crafting an AI policy for your company is increasingly important due to the rapid growth and impact of AI technologies. By prioritizing ethical considerations, data governance, transparency and compliance, companies can harness the transformative potential of AI while mitigating risks and building trust with stakeholders. Remember, an effective AI policy is a living document that evolves with technological advancements and societal expectations. By investing in responsible AI practices today, businesses can pave the way for a sustainable and ethical future tomorrow."