"Some of the web’s biggest destinations for watching videos have quietly started using automation to remove extremist content from their sites, according to two people familiar with the process. The move is a major step forward for internet companies that are eager to eradicate violent propaganda from their sites and are under pressure to do so from governments around the world as attacks by extremists proliferate, from Syria to Belgium and the United States. YouTube and Facebook are among the sites deploying systems to block or rapidly take down Islamic State videos and other similar material, the sources said. The technology was originally developed to identify and remove copyright-protected content on video sites. It looks for "hashes," a type of unique digital fingerprint that internet companies automatically assign to specific videos, allowing all content with matching fingerprints to be removed rapidly. Such a system would catch attempts to repost content already identified as unacceptable, but would not automatically block videos that have not been seen before."
Issues and developments related to IP, AI, and OM, examined in the IP and tech ethics graduate courses I teach at the University of Pittsburgh School of Computing and Information. My Bloomsbury book "Ethics, Information, and Technology", coming in Summer 2025, includes major chapters on IP, AI, OM, and other emerging technologies (IoT, drones, robots, autonomous vehicles, VR/AR). Kip Currier, PhD, JD
Thursday, June 30, 2016
Exclusive: Google, Facebook Quietly Move Toward Automatic Blocking of Extremist Videos; Reuters via New York Times, 6/24/16
Reuters via New York Times; Exclusive: Google, Facebook Quietly Move Toward Automatic Blocking of Extremist Videos:
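
The mechanism the article describes can be sketched in a few lines: a platform keeps a set of fingerprints for videos it has already reviewed and removed, and compares each new upload's fingerprint against that set. The snippet below is a minimal illustration under stated assumptions, not any company's actual system; it uses a cryptographic SHA-256 hash as a stand-in for the more robust perceptual fingerprints production systems rely on, and the names (BLOCKED_HASHES, should_block, and so on) are hypothetical.

import hashlib

# Hypothetical blocklist of fingerprints for videos already flagged as
# unacceptable. In the systems the article describes, these would be
# perceptual fingerprints robust to re-encoding; SHA-256 is used here
# only to keep the sketch self-contained.
BLOCKED_HASHES = set()

def fingerprint(video_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded video file."""
    return hashlib.sha256(video_bytes).hexdigest()

def register_blocked(video_bytes: bytes) -> None:
    """Record a video that reviewers have already identified as unacceptable."""
    BLOCKED_HASHES.add(fingerprint(video_bytes))

def should_block(upload_bytes: bytes) -> bool:
    """Return True only if the upload matches a known-bad fingerprint.

    A never-before-seen video produces a new fingerprint, so, as the
    article notes, it would not be blocked automatically.
    """
    return fingerprint(upload_bytes) in BLOCKED_HASHES

# Example: an exact re-upload of a flagged clip is caught; new footage is not.
register_blocked(b"bytes of a video already reviewed and removed")
print(should_block(b"bytes of a video already reviewed and removed"))  # True
print(should_block(b"bytes of a previously unseen video"))             # False

The design choice this illustrates is the one the article emphasizes: matching is only as good as the fingerprint database, so automation handles re-uploads of known material while genuinely new videos still require human review.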