Description:
Aaron Swartz once published a blog post entitled “Squaring the Triangle,” hypothesizing that a blockchain could be used to create a name system that had secure, decentralized, and human-readable names, thus “squaring” Zooko’s Triangle.
Since that post was published, numerous blockchain name systems have been developed, putting Aaron’s idea into practice. This talk will give a brief overview of the most popular blockchain name systems in production and show some of their applications.
Namecoin was the first fork of Bitcoin and is still one of the most innovative “altcoins”. It was the first to implement merged mining and a decentralized DNS. Namecoin was also the first solution to Zooko’s Triangle, the long-standing problem of producing a naming system that is simultaneously secure, decentralized, and human-meaningful.
Blockstack is a new internet for decentralized apps where users own their data. With Blockstack, users get digital keys that let them own their identity. They sign in to apps locally without remote servers or identity providers.
ENS offers a secure and decentralised way to address resources both on and off the blockchain using simple, human-readable names. ENS is built on smart contracts on the Ethereum blockchain, meaning it doesn’t suffer from the insecurity of the DNS system. You can be confident names you enter work the way their owner intended.
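Abstracting away their differences, all three systems implement the same core primitive: a first-come, first-served mapping from a human-readable name to a value, where only the holder of the registering key can update the record. A minimal sketch of that primitive, assuming a toy in-memory registry (the class, method names, and key handling here are invented simplifications; real systems record each of these operations as a blockchain transaction):

```python
import hashlib


class NameRegistry:
    """Toy first-come-first-served name registry.

    Illustrates the naming model Namecoin and ENS share; ownership is
    simulated with a hash of a secret rather than real key-pair signatures.
    """

    def __init__(self):
        self.records = {}  # name -> (owner_key_hash, value)

    def register(self, name, owner_secret, value):
        """Claim an unclaimed name; first claimant wins."""
        if name in self.records:
            raise ValueError("name already taken")
        owner = hashlib.sha256(owner_secret.encode()).hexdigest()
        self.records[name] = (owner, value)

    def update(self, name, owner_secret, value):
        """Change a record; only the original registrant may do this."""
        owner = hashlib.sha256(owner_secret.encode()).hexdigest()
        if self.records.get(name, (None,))[0] != owner:
            raise PermissionError("not the owner")
        self.records[name] = (owner, value)

    def resolve(self, name):
        """Look up the value currently bound to a name."""
        return self.records[name][1]
```

In this framing, security comes from the ownership check, decentralization comes from replicating the record log (the blockchain), and human-readability comes from the names themselves, which is the sense in which these systems “square” Zooko’s Triangle.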
Chelsea Manning will be speaking at the Fifth Annual Aaron Swartz Day Evening Event – Saturday, November 4, 2017 – 7:30 pm – TICKETS (Just going to the hackathon? It’s free.)
From October 8, 2017, in New York City (at the New Yorker Festival):
I grew up in central Oklahoma. A small town, Crescent, Oklahoma. And my parents were both voting Republicans and I wasn’t aware there was an alternative. Everybody held those views. And I didn’t really understand them.
I’m trans and I felt different than everybody else. I knew I was different. I didn’t have words to like, describe that. All of my friends. All of my family. All of my teachers. They all knew it as well. It felt like there was something about me that was different. It caused friction. And it caused difficulty for me.
My mother is British, and when my mother and my father split up, my mother decided to move back to the UK, and so I went and I spent four years there. I went to school there, ya know, it was different. I was a kid from the Midwest. I didn’t fit in. I didn’t know. It was just a completely different world for me.
My father exposed me to computers at a young age. I learned how to program by the time I was about 8 or 9, although I didn’t fully understand probably till I was about 10. And my parents, we always had a computer in the house. And we always had internet access. So, it was a “normal” thing for me. Even though, at the time, in the early to mid 90s, it wasn’t a normal thing. And there were a lot of communities on the Internet in this time. And so, I was exploring. I was exploring who I was. I was exploring different ways of presenting myself.
I spent more time text messaging and instant messaging my friends than actually spending time with them. The term is IRL (In Real Life), but, ya know, we weren’t spending a whole lot of time IRL. My mother didn’t know how to write checks, so I used the internet to learn how. It ended up being a symbiotic relationship, but also my mother had a drinking problem, and as I got older, I realized how bad it was. And I love my mother. It just, I realized this is not the environment I needed to be in at the time. So I decided to move after my mom, she had a medical problem happen. And it was a scare for me, because I realized, if something happened to my mother, I didn’t have a back up plan. I didn’t have anywhere else to go.
So, I moved back. We didn’t get along. To say the least. I was 17, and I moved back to the states, and it was just very difficult because she (her father’s wife) didn’t like me, and so she was creating all these rules that were impossible to follow. Like, “you can’t leave your bedroom after 8pm.”
So she called the police on me one night, after an argument. It was over a sandwich, because I wanted to have a sandwich. It was 8:30 at night. So, I went out of the room, and I used *her* kitchen, after like 8 o’clock or whatever, to like make a sandwich. It was a swiss cheese and baloney sandwich. And I would cut it with a knife, so I had a knife in my hand. I wasn’t wielding it or anything like that. She had ran off and like, called the police on me. And I’m just like ok that’s weird. And so the Oklahoma Police Department knocked on the door. I’m like “hello,” and they’re like “we’re here for a domestic incident.” And I was like “Okay. She’s in there.” And so, like, the police officer understood what was going on. He basically said “you shouldn’t go back there.”
I borrowed my dad’s truck. I ended up driving to Chicago and living on the streets there for a summer, and here I am living out of a pickup truck, and dealing with that.
My aunt did some detective work, and she asked around all the people that I used to hang out with. She told me that she called about 50 or 60 people, until she finally found somebody that had my cell phone number. So, I get a call from my aunt, and she’s like “come to my house,” and I did. I drove a night and a day, all the way to Maryland. And I lived with her for a year. It was so wonderful for her to be there for me at a time like this, and I realize now, that she really saved my life in many ways, and I didn’t realize it, I didn’t understand it at the time, cause I was so used to being in crisis mode that even whenever I was there, I was like “this is temporary.” So I was scared.
I was trying to re-establish a relationship with my father, and so I’m calling him, and he kept on saying “You need structure. You need the military. I was in the Navy for four years: You should go into the Navy or the Air Force.” And, at that time, the Iraq war was going on. So I saw the images on TV every day of chaos and violence in Baghdad, and I really wanted to do something. And I joined the Army because, ya know, it was Baghdad, where the fight was, and I wanted to help with that. I thought, “if I become an intelligence analyst, I can use my skills or learn something, and make a difference, and maybe stop this.” — Chelsea E. Manning, October 8, 2017.
From October 8, 2017, in New York City (at the New Yorker Festival):
I think the most important thing that we have to learn, because I think it’s been forgotten, is that every single one of us has the ability to change things. Each and every one of us has this ability. We need to look to each other and realize our values are what we care about, and then assert them, and say these things, and to take actions in our political discourse to make that happen. Because it’s not going to happen at the Ballot Box. It’s not.
Make your own decisions. Make your own choices. Make your own judgement.
You have to pay attention. For engineers in particular. We design and we develop systems, but the systems that we develop can be used for different things. The software that I was using in Iraq for predictive analysis was the same that you would use in marketing. It’s the same tools. It’s the same analysis. I believe engineers and software engineers and technologists… (That’s a new term that came out while I was away :-)
I guess technologists should realize that we have an ethical obligation to make decisions that go beyond just meeting deadlines or creating a product. What actually takes some chunks of time is to say “what are the consequences of this system?” “How can this be used?” “How can this be misused?” Let’s try to figure out how we can mitigate a software system from being misused. Or decide whether you want to implement it at all. There are systems where, if misused, could be very dangerous. — Chelsea E. Manning, October 8, 2017.
Technology Track – Ethical Algorithms
2:00 – 2:45 pm – Ethical Algorithms Panel – w/ Q and A.
Kristian Lum (Human Rights Data Analysis Group – HRDAG) – As the Lead Statistician at HRDAG, Kristian’s research focus has been on furthering HRDAG’s statistical methodology (population estimation, or multiple systems estimation, with a particular emphasis on Bayesian methods and model averaging).
Caroline Sinders (Wikimedia Foundation) – Caroline uses machine learning to address online harassment at Wikimedia, and before that, she helped design and market IBM’s Watson. Caroline was also just named as one of Forbes’ “8 AI Designers You Need to Know.”
Plus special guests TBA
About the Ethical Algorithms Panel and Technology Track by Lisa Rein, Co-founder, Aaron Swartz Day
I created this track based on my phone conversations with Chelsea Manning on this topic.
Chelsea was an Intelligence Analyst for the Army and used algorithms in the day to day duties of her job. She and I have been discussing algorithms, and their ethical implications, since the very first day we spoke on the phone, back in October 2015.
“The consequences of our being subjected to constant algorithmic scrutiny are often unclear… algorithms are already analyzing social media habits, determining credit worthiness, deciding which job candidates get called in for an interview and judging whether criminal defendants should be released on bail. Other machine-learning systems use automated facial analysis to detect and track emotions, or claim the ability to predict whether someone will become a criminal based only on their facial features. These systems leave no room for humanity, yet they define our daily lives.”
A few weeks later, in December, I went to the Human Rights Data Analysis Group (HRDAG) holiday party, and met HRDAG’s Executive Director, Megan Price. She explained a great deal to me about the predictive software used by the Chicago police, and how it was predicting crime in the wrong neighborhoods based on the biased data it was getting from meatspace. Meaning, the data itself was “good” in that it was accurate, but unfortunately, the actual less-than-desirable past behavior of the Chicago PD was being used as a guide for sending officers out into the field. Basically, the existing bad behavior of the Chicago PD was being used to determine its future behavior.
This came as a revelation to me. Here we have a chance to stop the cycle of bad behavior, by using technology to predict where the next real crime may occur, but instead, we have chosen to memorialize the faulty techniques used in the past into software, to be used forever.
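The feedback loop described above can be made concrete with a toy simulation (all numbers and neighborhoods here are invented; this is a sketch of the mechanism, not a model of the actual Chicago data):

```python
# Two neighborhoods with identical true crime rates, but historical records
# skewed toward neighborhood 0. Officers are dispatched where past *recorded*
# crime is highest, and crime is only recorded where officers are present,
# so the initial bias is self-reinforcing.
import random

random.seed(0)
true_rate = [0.3, 0.3]   # actual per-patrol chance of witnessing a crime (equal!)
recorded = [10, 1]       # biased historical record counts

for _ in range(100):
    # dispatch to the neighborhood with the most recorded crime
    target = 0 if recorded[0] >= recorded[1] else 1
    if random.random() < true_rate[target]:
        recorded[target] += 1  # crime only enters the data where police patrol

print(recorded)
```

Even though both neighborhoods have identical true crime rates, the neighborhood that started with more recorded incidents receives every patrol, so it accumulates all of the new records while the other neighborhood’s count never moves, and the bias compounds.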
I have gradually come to understand that, although these algorithms are being used in all aspects of our lives, it is not often clear how or why they are working. Now, it has become clear that they can develop their own biases, based on the data they have been given to “learn” from. Often the origin of that “learning data” is not shared with the public.
I’m not saying that we have to understand exactly how every useful algorithm works, which I understand would be next to impossible, but I’m not sure a completely “black box” approach is best, at least when the public, public data, and public safety are involved. (Thomas Hargrove’s Murder Accountability Project’s “open” database is one example of a transparent approach that seems to be doing good things.)
There also appears to be a disconnect within law enforcement. Some precincts seem content to rely on technology for direction, for better or worse, as with the predictive software used by the Chicago Police Department. In other situations, such as Thomas Hargrove’s Murder Accountability Project (featured in the article “Murder He Calculated”), technologists are having a hard time getting law enforcement to take these tools seriously. Even when these tools appear to have the potential to find killers, there appear to be numerous invisible hurdles in the way of any kind of timely implementation. Even for these “life and death” cases, Hargrove has had a very hard time getting anyone to listen to him.
So, how do we convince law enforcement to do more with some data while we are, at the same time, concerned about the oversharing of other forms of public data?
I find myself wondering what can even be done, if simple requests such as “make the NCIC database’s data for unsolved killings searchable” seem to be falling on deaf ears.
I am hoping to have some actual action items that can be followed up on in the months to come, as a result of this panel.
Saturday, November 4, 2017, 2:00 – 2:45 pm – Ethical Algorithms Panel – w/ Q and A.
Our lightning talks are only 20 minutes in length, and usually focus on working code – or often, a collection of working implementations that someone has done over time.
These are very advanced, not general in scope, and implementation-oriented. Additionally, the goal is to feature projects that represent our community’s ideals.
Saturday Lightning talks are meant to explain potential hackathon projects.
Sunday talks are to present work done on projects over the weekend.
Think of a topic this way:
– What is the exact problem space?
– How do you plan to fix it?
– How is this idea different from other ideas for fixing that problem?
– How have you *implemented* your idea? Preferably with at least on-screen code, if not working code.
“I’m afraid of my own government targeting me for surveillance because I make fun of the President for a living, and while I do it, I’m also black. I need government transparency and accountability. I need Freedom of Speech. I need quality journalism by journalists who feel safe to do their jobs. Because, without them, I can’t do my job.”
Jason Leopold, who is speaking Saturday at the San Francisco hackathon, and also that night at the evening event, wrote about Aaron’s FOIA requests, immediately following his death.
We will be going through the articles referenced in this excerpt below, one by one.
Swartz filed his first FOIA request in December 2010, more than two years after he landed on the government’s radar. He was seeking information about himself.
In 2008, Swartz’s friend and fellow open government activist Carl Malamud, the founder of the nonprofit public.resource.org, wanted to make federal court documents housed on the Public Access to Court Electronic Records system (PACER) available to the public for free. Using $600,000 he raised from supporters, Malamud purchased 50 years worth of appellate court documents and posted them on his website.
Then, the government started a pilot program in which access to federal court documents on PACER would be made available to users at no cost at 17 libraries around the country. Malamud urged activists like Swartz to visit the libraries, download the documents and send them over to him so he could make them available to the public via his website.
“So Aaron went to one of them and installed a small PERL script he had written that cycled sequentially through case numbers, requesting a new document from Pacer every three seconds, and uploading it to” Amazon’s Elastic Compute (EC2) Cloud server, Wired reported. “Aaron pulled nearly 20 million pages of public court documents, which are now available for free on the Internet Archive.”
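The pattern the Wired quote describes (cycle through sequential case numbers, make one request every few seconds, upload each document found) can be sketched in a few lines. This is a hypothetical Python rendering, not the original Perl; `fetch` and `upload` stand in for the PACER request and the cloud upload:

```python
import time


def crawl(fetch, upload, start, end, delay=3.0):
    """Request one document per `delay` seconds, cycling through case numbers.

    `fetch` and `upload` are caller-supplied stand-ins for the HTTP request
    to the document service and the push to cloud storage, respectively.
    """
    for case_number in range(start, end + 1):
        doc = fetch(case_number)
        if doc is not None:            # some case numbers have no document
            upload(case_number, doc)
        time.sleep(delay)              # throttle: one request per `delay` sec
```

The fixed delay is the notable design choice: rather than hammering the service, the crawler trades speed for politeness, and the sequential case numbers mean no index of documents is needed in advance.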
The court documents Swartz legally accessed were worth $1.5 million. The government shut down the PACER pilot program and the FBI launched an investigation. Malamud has since published on his website emails he exchanged with Swartz about the incident.
On December 10, 2010, Swartz filed a FOIA request with the Justice Department’s Criminal Division seeking “documents related to me, Aaron Swartz, as well as any documents related to any associated PACER investigation.” The Justice Department responded by stating it could not locate any records. He also filed an identical FOIA request that day with the Executive Office of United States Attorneys. The office identified 72 documents that were withheld in full.
Editor’s Note: we will be writing a lot about this in the weeks to come, re: how the PACER system in the United States is highly questionable, as it actually forces members of the public to pay, page by page (and only if they have a credit card), to view the law.