All posts by lisa

Chelsea Manning to Technologists: Please Take the Time To Contemplate Your System’s Potential Misuse

Chelsea Manning will be speaking at the Fifth Annual Aaron Swartz Day Evening Event – Saturday, November 4, 2017 – 7:30 pm – TICKETS (Just going to the hackathon? It’s free.)

Chelsea E. Manning at Dolores Park in San Francisco, September, 2017.

From October 8, 2017, in New York City (at the New Yorker Festival):

I think the most important thing that we have to learn, because I think it’s been forgotten, is that every single one of us has the ability to change things. Each and every one of us has this ability. We need to look to each other and realize our values are what we care about, and then assert them, and say these things, and take actions in our political discourse to make that happen. Because it’s not going to happen at the Ballot Box. It’s not.

Make your own decisions. Make your own choices. Make your own judgement.

You have to pay attention, for engineers in particular. We design and we develop systems, but the systems that we develop can be used for different things. The software that I was using in Iraq for predictive analysis was the same that you would use in marketing. It’s the same tools. It’s the same analysis. I believe engineers and software engineers and technologists… (That’s a new term that came out while I was away :-)

I guess technologists should realize that we have an ethical obligation to make decisions that go beyond just meeting deadlines or creating a product. What actually takes some chunks of time is to say “what are the consequences of this system?” “How can this be used?” “How can this be misused?” Let’s try to figure out how we can mitigate a software system from being misused. Or decide whether you want to implement it at all. There are systems that, if misused, could be very dangerous. — Chelsea E. Manning, October 8, 2017.

Excerpt from WNYC’s The New Yorker Radio Hour (starts at 31:45):
http://www.wnyc.org/story/chelsea-manning-life-after-prison/

About the Ethical Algorithms Panel and Technology Track

This panel is part of the San Francisco Aaron Swartz Day Hackathon. Admission is FREE.

See Caroline Sinders and Kristian Lum, live at 2pm, on November 4th.

Technology Track – Ethical Algorithms
2:00 – 2:45 pm – Ethical Algorithms Panel – w/Q and A.
Kristian Lum (Human Rights Data Analysis Group – HRDAG) – As the Lead Statistician at HRDAG, Kristian’s research focus has been on furthering HRDAG’s statistical methodology (population estimation, or multiple systems estimation, with a particular emphasis on Bayesian methods and model averaging).
Caroline Sinders (Wikimedia Foundation) – Caroline uses machine learning to address online harassment at Wikimedia, and before that, she helped design and market IBM’s Watson. Caroline was also just named as one of Forbes’ “8 AI Designers You Need to Know.” Plus special guests TBA.

About the Ethical Algorithms Panel and Technology Track
by Lisa Rein, Co-founder, Aaron Swartz Day

I created this track based on my phone conversations with Chelsea Manning on this topic.

Chelsea was an Intelligence Analyst for the Army and used algorithms in the day-to-day duties of her job. She and I have been discussing algorithms, and their ethical implications, since the very first day we spoke on the phone, back in October 2015.

Chelsea recently published a New York Times Op-Ed on the subject: The Dystopia We Signed Up For.

From the Op-Ed:

“The consequences of our being subjected to constant algorithmic scrutiny are often unclear… algorithms are already analyzing social media habits, determining credit worthiness, deciding which job candidates get called in for an interview and judging whether criminal defendants should be released on bail. Other machine-learning systems use automated facial analysis to detect and track emotions, or claim the ability to predict whether someone will become a criminal based only on their facial features. These systems leave no room for humanity, yet they define our daily lives.”

A few weeks later, in December, I went to the Human Rights Data Analysis Group (HRDAG) holiday party, and met HRDAG’s Executive Director, Megan Price. She explained a great deal to me about the predictive software used by the Chicago police, and how it was predicting crime in the wrong neighborhoods because of the biased data it was being fed from meatspace. The data itself was “good” in the sense that it was accurate, but it reflected the less-than-desirable behavior of the Chicago PD, and that record was then used as a guide for sending officers out into the field. Basically, the department’s existing bad behavior was being used to assign its future behavior.

This came as a revelation to me. Here we have a chance to stop the cycle of bad behavior, by using technology to predict where the next real crime may occur, but instead we have chosen to memorialize the faulty techniques of the past in software, to be used forever.
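
To make that feedback loop concrete, here is a small, purely illustrative simulation, written as a Python sketch with made-up numbers; it is not based on any department’s actual system. Two neighborhoods have the same underlying crime rate, but one starts out more heavily patrolled, so it generates more recorded incidents, and a model that allocates patrols according to those records keeps sending officers right back:

    # Toy simulation of the predictive-policing feedback loop described above.
    # All numbers are invented for illustration only.
    import random

    random.seed(0)

    true_crime_rate = {"A": 0.5, "B": 0.5}   # identical underlying crime in both neighborhoods
    patrol_share = {"A": 0.2, "B": 0.8}      # historical bias: B starts out over-patrolled
    recorded_incidents = {"A": 0, "B": 0}

    for day in range(1000):
        for hood in ("A", "B"):
            # A crime only enters the data if a patrol is there to record it.
            crime_happened = random.random() < true_crime_rate[hood]
            patrol_present = random.random() < patrol_share[hood]
            if crime_happened and patrol_present:
                recorded_incidents[hood] += 1

        # "Predictive" step: allocate tomorrow's patrols in proportion to recorded
        # incidents, which reflect where patrols already were, not where crime is.
        total = sum(recorded_incidents.values())
        if total:
            patrol_share = {h: recorded_incidents[h] / total for h in recorded_incidents}

    print("recorded incidents:", recorded_incidents)
    print("final patrol allocation:", patrol_share)
    # Despite identical true crime rates, the biased starting data typically leaves
    # neighborhood B with most (or all) of the patrols.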

I have gradually come to understand that, although these algorithms are being used in all aspects of our lives, it is often not clear how or why they work. It has also become clear that they can develop their own biases, based on the data they have been given to “learn” from, and often the origin of that “learning data” is not shared with the public.

I’m not saying that we have to understand exactly how every useful algorithm works (which I understand would be next to impossible), but I’m not sure a completely “black box” approach is best, at least when the public, public data, and public safety are involved. (Thomas Hargrove’s Murder Accountability Project‘s “open” database is one example of a transparent approach that seems to be doing good things.)

There also appears to be a disconnect within law enforcement. Some precincts seem content to rely on technology for direction, for better or worse, such as the predictive software used by the Chicago Police Department. In other situations, such as Thomas Hargrove’s Murder Accountability Project (featured in the article Murder He Calculated), technologists are having a hard time getting law enforcement to take these tools seriously. Even when the tools appear to have the potential to find killers, there seem to be numerous invisible hurdles in the way of any kind of timely implementation. Even in these life-and-death cases, Hargrove has had a very hard time getting anyone to listen to him.

So, how do we convince law enforcement to do more with some data while we are, at the same time, concerned about the oversharing of other forms of public data?

I find myself wondering what can even be done, if simple requests such as “make the NCIC database’s data for unsolved killings searchable” seem to be falling on deaf ears.

I am hoping that this panel will produce some actual action items that can be followed up on in the months to come.

References:

1. The Dystopia We Signed Up For, Op-Ed by Chelsea Manning, New York Times, September 16, 2017. (Link goes to a free version not behind a paywall, at Op-Ed News)

2. Pitfalls of Predictive Policing, by Jessica Saunders for Rand Corporation, October 11, 2016. https://www.rand.org/blog/2016/10/pitfalls-of-predictive-policing.html

3. Predictions Put Into Practice: A Quasi-Experimental Evaluation of Chicago’s Predictive Policing Pilot, by Jessica Saunders, Priscillia Hunt, and John S. Hollywood, for the Journal of Experimental Criminology, August 12, 2016. https://link.springer.com/article/10.1007/s11292-016-9272-0

4. Murder He Calculated, by Robert Kolker, for Bloomberg.com, February 12, 2017.

5. Murder Accountability Project, founded by Thomas Hargrove. http://www.murderdata.org/

6. Secret Algorithms Are Deciding Criminal Trials and We’re Not Even Allowed to Test Their Accuracy – By Vera Eidelman, William J. Brennan Fellow, ACLU Speech, Privacy, and Technology Project, September 15, 2017. https://www.aclu.org/blog/privacy-technology/surveillance-technologies/secret-algorithms-are-deciding-criminal-trials-and

7. Machine Bias – There’s software used across the country to predict future criminals. And it’s biased against blacks. by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

8. Criminality Is Not A Nail – A new paper uses flawed methods to predict likely criminals based on their facial features. by Katherine Bailey for Medium.com, November 29, 2016. https://medium.com/backchannel/put-away-your-machine-learning-hammer-criminality-is-not-a-nail-1309c84bb899

What Makes A Good Lightning Talk

Lightning Talk Schedule

Our lightning talks are only 20 minutes in length, and usually focus on working code – or often, a collection of working implementations that someone has done over time.

These are very advanced, not general in scope, and implementation-oriented. Additionally, the goal is to feature projects that represent our community’s ideals.

Saturday Lightning talks are meant to explain potential hackathon projects.

Sunday talks are to present work done on projects over the weekend.

Think of a topic this way:

– What is the exact problem space?
– How do you plan to fix it?
– How is this idea different from other ideas for fixing that problem?
– How have you *implemented* your idea? Preferably with at least on-screen code, if not working code.

EFF Pioneer Awards – Part Two – Ashley Nicole Black

Come to the Fifth Annual Aaron Swartz Day Evening Event! Only 75 tickets left :)

“I’m afraid of my own government targeting me for surveillance because I make fun of the President for a living, and while I do it, I’m also black. I need government transparency and accountability. I need Freedom of Speech. I need quality journalism by journalists who feel safe to do their jobs. Because, without them, I can’t do my job.”

– Ashley Nicole Black, September 14, 2017

Ashley Nicole Black is an American comedian, actress, and writer from Los Angeles, California. In 2016, she became a writer and correspondent for Full Frontal w/ Samantha Bee. She was the Keynote Speaker for the 2017 EFF Pioneer Awards.

2013 Post by Jason Leopold About Aaron’s PACER-related FOIA Requests

Jason Leopold, who is speaking Saturday at the San Francisco hackathon, and also that night at the evening event, wrote about Aaron’s FOIA requests immediately following his death.

We will be going through the articles referenced in this excerpt below, one by one.

Aaron Swartz’s FOIA Requests Shed Light on His Struggle

From the Truthout article:

Swartz filed his first FOIA request in December 2010, more than two years after he landed on the government’s radar. He was seeking information about himself.

In 2008, Swartz’s friend and fellow open government activist Carl Malamud, the founder of the nonprofit public.resource.org, wanted to make federal court documents housed on the Public Access to Court Electronic Records system (PACER) available to the public for free. Using $600,000 he raised from supporters, Malamud purchased 50 years worth of appellate court documents and posted them on his website.

Then, the government started a pilot program in which access to federal court documents on PACER would be made available to users at no cost at 17 libraries around the country. Malamud urged activists like Swartz to visit the libraries, download the documents and send them over to him so he could make them available to the public via his website.

“So Aaron went to one of them and installed a small PERL script he had written that cycled sequentially through case numbers, requesting a new document from Pacer every three seconds, and uploading it to” Amazon’s Elastic Compute (EC2) Cloud server, Wired reported. “Aaron pulled nearly 20 million pages of public court documents, which are now available for free on the Internet Archive.”

The court documents Swartz legally accessed were worth $1.5 million. The government shut down the PACER pilot program and the FBI launched an investigation. Malamud has since published on his website emails he exchanged with Swartz about the incident.

On December 10, 2010, Swartz filed a FOIA request with the Justice Department’s Criminal Division seeking “documents related to me, Aaron Swartz, as well as any documents related to any associated PACER investigation.” The Justice Department responded by stating it could not locate any records. He also filed an identical FOIA request that day with the Executive Office of United States Attorneys. That office identified 72 documents, all of which were withheld in full.
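
As a rough illustration of the approach described in the excerpt above, a script that steps sequentially through document numbers and pauses a few seconds between requests might look like the following minimal Python sketch. This is not Aaron’s actual Perl script; the endpoint, ID range, and filenames here are hypothetical stand-ins.

    # Minimal, hypothetical sketch of a sequential document fetcher with a polite
    # delay between requests. The URL pattern and ID scheme are invented for
    # illustration; they are not the real PACER pilot's.
    import time
    import urllib.request

    BASE_URL = "https://example-court-library.example/doc?id={}"  # hypothetical endpoint

    def fetch_documents(start_id, end_id, delay_seconds=3):
        for doc_id in range(start_id, end_id + 1):
            url = BASE_URL.format(doc_id)
            try:
                with urllib.request.urlopen(url) as response:
                    data = response.read()
                with open("doc_{}.pdf".format(doc_id), "wb") as out:
                    out.write(data)
            except Exception as err:
                print("skipping {}: {}".format(doc_id, err))
            time.sleep(delay_seconds)  # wait a few seconds between requests, as described

    if __name__ == "__main__":
        fetch_documents(1, 10)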

Case Challenging PACER Fees Is Allowed to Move Forward

Editor’s Note: we will be writing a lot about this in the weeks to come, regarding how the PACER system in the United States is highly questionable, as it actually forces members of the public to pay, page by page (and only if they have a credit card), to view the law.

Theodore D’Apuzzo received a favorable opinion, denying the U.S. Government’s Motion to Dismiss in his case against PACER.

In a nutshell:

  1. The case can proceed.
  2. Stay on discovery is now lifted.
  3. Government must now answer the complaint by October 10, 2017.

EFF Pioneer Awards – Part One

Last week’s Pioneer Awards were absolutely amazing. I will be posting video soon, but here are some photos.

Come to this year’s Aaron Swartz Day evening event!

Lawrence Lessig & Chelsea Manning – So great finally introducing these two to each other :-) !
Noah Swartz & Brewster Kahle
Brewster Kahle & Chelsea Manning – both will be speaking at the Aaron Swartz Day evening event!
Chelsea and the EFF gang! :)
Rainey Reitman (EFF)
Dave Maass (EFF)
Cindy Cohn (Executive Director, EFF) – Cindy will be speaking at the Aaron Swartz Day Evening Event on November 4th!

Caroline Sinders Named By Forbes as an “AI Designer That You Need To Know”

See Caroline Sinders at this year’s Aaron Swartz Day International Hackathon, on the San Francisco Hackathon‘s Ethical Algorithms Panel, Saturday at 2 pm, and at the evening event, Saturday night, November 4, 7:30 pm.

8 AI Designers That You Need To Know by Adelyn Zhou for Forbes.

Caroline Sinders – Machine Learning Designer and Researcher, former Interaction Designer for IBM Watson


Caroline is an artist, designer, and activist who also loves writing code. She helped design and market IBM Watson, a billion-dollar artificial intelligence system built on advanced natural language processing, automated reasoning, machine learning, and other technologies. Sinders’ work on Watson focused on user flows and the impact of human decision-making in the development of robotics software. She recently left her dream job at IBM to pursue an equally challenging fellowship at Open Labs. A passionate crusader against online harassment, Caroline probes the different ways design can influence and shape digital conversations, with the ultimate goal of using machine learning to address online harassment. You can weigh her strong opinions on Twitter, Medium, LinkedIn, and her personal website.

Plan A November 4th Hackathon In Your Town

Hackathons are being planned for November 4th in San Francisco, New York, and even Cairo!

So far, the projects are the Freedom of the Press Foundation’s SecureDrop and these topics:

  1. Ethical Algorithms
  2. Usable Crypto
  3. Post-Quantum Crypto
  4. FOIA

Send an email to lisa@lisarein.com if you are planning a hackathon :-)

We are putting together new “Hackathon 101” materials too!

So, if you have good “how to have a hackathon” resources, please email them too! :-)