Showing posts with label data collection and use. Show all posts

Saturday, February 8, 2020

Montana seeks balancing act with wildlife location data, hunting ethics; Independent Record, February 6, 2020

Montana seeks balancing act with wildlife location data, hunting ethics


"While GPS collars are invaluable to researchers and wildlife managers, the data they produce are the subject of debate about who should have access to the information and why. Some hunters have requested and received the exact latitude and longitude of collared animals, and that has conservation groups and lawmakers concerned about violating the edict of fair chase hunting or the potential to monetize the data."

Tuesday, January 7, 2020

UK Government Plans To Open Public Transport Data To Third Parties; Forbes, December 31, 2019

Simon Chandler, Forbes; UK Government Plans To Open Public Transport Data To Third Parties

"The launch is a significant victory for big data. Occasionally derided as a faddish megatrend or empty buzzword, the announcement of the Bus Open Data Service shows that national governments are willing to harness masses of data and use them to create new services and economic opportunities. Similarly, it's also a victory for the internet of things, insofar as real-time data from buses will be involved in providing users with up-to-date travel info.

That said, the involvement of big data inevitably invites fears surrounding privacy and surveillance."

Sunday, December 2, 2018

I Wanted to Stream Buffy, Angel, and Firefly for Free, But Not Like This; Gizmodo, November 30, 2018

Alex Cranz, Gizmodo; I Wanted to Stream Buffy, Angel, and Firefly for Free, But Not Like This

"This is TV that should be accessible to everyone, but Facebook Watch? Really? In order to watch Buffy take on a demon with a rocket launcher you have to be willing to sit there and stare at a video on the Facebook platform—the same place your cousin continues to post Daily Caller Trump videos and that friend from high school shares clips of a Tasty casserole made of butter, four tubes of biscuit dough, baked beans, and a hot dog? The price for complimentary access to three of the best shows produced is bargaining away your data and privacy?

No, thanks.

But Facebook is hoping we’ll all say yes, please. Facebook’s user growth in the U.S. notably hit a wall over the summer and it’s been trying to fix things. It’s also trying to make itself more “sticky,” so people stay on Facebook to get not just family and friend updates and memes, but also the streams and standard videos more commonly found on YouTube. Last year Facebook launched Watch, its YouTube competitor that was, from the start, filled with trash. But things have slowly improved, with the show Sorry for Your Loss gaining rave reviews."

Thursday, May 24, 2018

New privacy rules could spell the end of legalese — or create a lot more fine print; The Washington Post, May 24, 2018

Elizabeth Dwoskin, The Washington Post; New privacy rules could spell the end of legalese — or create a lot more fine print

"“The companies are realizing that it is not enough to get people to just click through,” said Lorrie Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University and the U.S. Federal Trade Commission’s former chief technologist. “That they need to communicate so that people are not surprised when they find out what they consented to.”

That has become more apparent in the past two months since revelations that a Trump-connected consultancy, Cambridge Analytica, made off with the Facebook profiles of up to 87 million Americans. Cranor said that consumer outrage over Cambridge was directly related to concerns that companies were engaging in opaque practices behind the scenes, and that consumers had unknowingly allowed it to happen by signing away their rights.

Irrespective of simpler explanations, the impact and success of the GDPR will hinge upon whether companies will try to force users to consent to their tracking or targeting as a condition for access to their services, said Alessandro Acquisti, a Carnegie Mellon computer science professor and privacy researcher. “This will tell us a lot regarding whether the recent flurry of privacy policy modifications demonstrates a sincere change in the privacy stance of those companies or is more about paying lip service to the new regulation. The early signs are not auspicious.”"

Monday, May 21, 2018

‘Westworld’ Season 2: Seven Big Things We Still Don’t Know at Midseason; The New York Times, May 21, 2018

Scott Tobias, The New York Times; ‘Westworld’ Season 2: Seven Big Things We Still Don’t Know at Midseason

[SPOILERS BELOW]

"Delos cares about its intellectual property. The theme parks are merely a means to an end — that much was clear from the first season, even clearer from the second, in which recovering Dolores’s father, Peter Abernathy (Louis Herthum), has been a top priority."

Sunday, April 15, 2018

Transcript of Mark Zuckerberg’s Senate hearing; Transcript courtesy of Bloomberg Government via The Washington Post, April 10, 2018

Transcript courtesy of Bloomberg Government via The Washington Post; Transcript of Mark Zuckerberg’s Senate hearing

"SEN. JOHN CORNYN (R-TEX): Thank you, Mr. Zuckerberg, for being here. I know in — up until 2014, a mantra or motto of Facebook was move fast and break things. Is that correct?

ZUCKERBERG: I don't know when we changed it, but the mantra is currently move fast with stable infrastructure, which is a much less sexy mantra.

CORNYN: Sounds much more boring. But my question is, during the time that it was Facebook's mantra or motto to move fast and break things, do you think some of the misjudgments, perhaps mistakes that you've admitted to here, were as a result of that culture or that attitude, particularly as it regards to personal privacy of the information of your subscribers?

ZUCKERBERG: Senator, I do think that we made mistakes because of that. But the broadest mistakes that we made here are not taking a broad enough view of our responsibility. And while that wasn't a matter — the “move fast” cultural value is more tactical around whether engineers can ship things and — and different ways that we operate.

But I think the big mistake that we've made looking back on this is viewing our responsibility as just building tools, rather than viewing our whole responsibility as making sure that those tools are used for good."

Sunday, July 16, 2017

How can we stop algorithms telling lies?; Guardian, July 16, 2017

Cathy O'Neil, Guardian; How can we stop algorithms telling lies?

[Kip Currier: Cathy O'Neil is shining much-needed light on the little-known but influential power of algorithms on key aspects of our lives. I'm using her thought-provoking 2016 Weapons of Math Destruction: How Big Data Increases Inequality And Threatens Democracy as one of several required reading texts in my Information Ethics graduate course at the University of Pittsburgh's School of Computing and Information.]

"A proliferation of silent and undetectable car crashes is harder to investigate than when it happens in plain sight.

I’d still maintain there’s hope. One of the miracles of being a data sceptic in a land of data evangelists is that people are so impressed with their technology, even when it is unintentionally creating harm, they openly describe how amazing it is. And the fact that we’ve already come across quite a few examples of algorithmic harm means that, as secret and opaque as these algorithms are, they’re eventually going to be discovered, albeit after they’ve caused a lot of trouble.

What does this mean for the future? First and foremost, we need to start keeping track. Each criminal algorithm we discover should be seen as a test case. Do the rule-breakers get into trouble? How much? Are the rules enforced, and what is the penalty? As we learned after the 2008 financial crisis, a rule is ignored if the penalty for breaking it is less than the profit pocketed. And that goes double for a broken rule that is only discovered half the time...

It’s time to gird ourselves for a fight. It will eventually be a technological arms race, but it starts, now, as a political fight. We need to demand evidence that algorithms with the potential to harm us be shown to be acting fairly, legally, and consistently. When we find problems, we need to enforce our laws with sufficiently hefty fines that companies don’t find it profitable to cheat in the first place. This is the time to start demanding that the machines work for us, and not the other way around."

Thursday, May 25, 2017

Obama chief data scientist: Trumpcare health plan would ‘cripple’ precision medicine; FedScoop, May 24, 2017

Billy Mitchell, FedScoop; Obama chief data scientist: Trumpcare health plan would ‘cripple’ precision medicine

"DJ Patil, U.S. chief data scientist in the latter years of Barack Obama’s presidency, wrote on Medium that Trumpcare, as the AHCA is nicknamed, would threaten the country’s ability to leverage data to advance medical science, particularly in the fight against major diseases like cancer. The White House’s proposal would allow insurance companies to deny coverage or charge more when people have preexisting medical conditions. That provision could make people less willing to share important information about themselves with researchers, Patil says, because of fear it could be used against them later.

“[M]y deep fear is that people won’t be willing to donate their data. And there are too many people who have diseases that need us to donate our data to help,” Patil writes.

At the center of the Precision Medicine Initiative introduced under Obama is the “responsible collection of large amounts of data to be able to develop truly customized medical treatments for each patient,” Patil explains. The Trump legislation essentially threatens that project."

Wednesday, May 10, 2017

Taking Pittsburgh’s Open Data on the Road; Government Technology, May 8, 2017

Robert Burack, Government Technology; Taking Pittsburgh’s Open Data on the Road

[Kip Currier: 2017 marks the 10th year of this blog. This post is the 3,000th: an illustrative "lessons-learned" case study of "grassroots" Open Data sharing between City of Pittsburgh data analysts and neighborhood residents.]


"This story was originally published by Data-Smart City Solutions.

When Pittsburgh developed Burgh’s Eye View, the city’s recently-launched open data application, the city’s Analytics & Strategy team visited 26 community meetings in early 2017 to gather actionable feedback and share the application with the community...

The team of analysts offered short presentations focused on how residents might use the application to understand what’s happening in their neighborhood, solicited suggestions for improvements and additional data to include in future updates, and engaged in one-on-one conversations to gain a better understanding of individual residents’ needs.

The team had to thoughtfully consider how to “filter out the tech speak” and present in an accessible and digestible way. The resulting takeaways from the team outline pathways for transforming city analysts into user advocates, show the value of building a broad constituency and taking iterative action based on resident feedback, and provide insight into why cities should pursue open data civic engagement in addition to user research."

Tuesday, February 2, 2016

At Berkeley, a New Digital Privacy Protest; New York Times, February 1, 2016

Steve Lohr, New York Times; At Berkeley, a New Digital Privacy Protest

"While some of the professors criticize the monitoring program as one that invades their privacy, the University of California has responded that “privacy perishes in the absence of security.”

It’s part of the larger challenge that fast-moving technology poses for social values. Every day, corporations, government agencies and universities must balance the need for computer security with the expected right to privacy of the people who use their networks. In different settings, there are different rules, expectations and levels of threat.

“We’re really just starting to sort out the risks and rules for digital security and data collection and use,” said Elana Zeide, a privacy expert at New York University’s Information Law Institute."