All posts by lisa

Come to the Rally and Press Conference Monday Dec 5th 9:30 am and Demand the SF Board of Supervisors Vote NO on “Killer Robots”

Come to the Dec 5 Rally and Press Conference at 9:30 am at SF City Hall

Update: December 5: Transcription and Audio Video Downloads here: https://www.aaronswartzday.org/no-killer-robots-press-conference-december-5-2022/

***

December 4, 2022

Contact: Tracy Rosenberg, Co-Founder – Aaron Swartz Day Police Surveillance Project
email: tracy@media-alliance.org
phone: 510-684-6853

When: 9:30 am 

Where: City Hall, Polk Street Steps (the steps facing Civic Center Park)

What: Rally and Press Conference organized by Dean Preston, District 5 Supervisor, San Francisco (who voted “no” on this issue last week) — the EFF and ACLU will be there too!

Why: To convince the SF Board of Supervisors to vote “No” on the final vote taking place this Tuesday at 2 pm, or to postpone the second vote to a later date.

 

Summary

This policy amendment would allow the SFPD to arm any of its existing robots. For instance, it could take the bomb robots it already has, which are normally used to find and dismantle bombs, and retrofit them to actually carry a bomb into a situation, as the Dallas Police Department did back in 2016. (https://www.nytimes.com/2016/07/09/science/dallas-bomb-robot.html)

Police Departments can already violate their standing policies in extreme situations citing “exigent circumstances” (including an imminent threat to life), when everything else has been tried and failed. For this reason, a policy amendment isn’t actually necessary for the robots to be used in this manner in a crisis scenario, as long as the department self-reports.

The policy amendment is an attempt to make the use of lethal force by robots a “standard operating procedure” rather than something that is only allowed during the most extreme of circumstances. The policy amendment’s only requirement to use one of these armed robots is that it be authorized by a single one of the three highest commanding officers, who only needs to “evaluate” the situation prior to use.

 

Exigent Circumstances

Police use the “exigent circumstances” exception frequently when they want to borrow and use surveillance equipment that isn’t authorized by the local surveillance or military equipment policy for regular use. For instance, when the city of Oakland borrows drones from Alameda County, or when San Francisco borrows a cell site simulator from the Department of Homeland Security.

“Exigent circumstances” is how the cops can get around restrictions they would otherwise be bound by, and, when misused, it can cover for First and Fourth Amendment violations. It also gives the cops the flexibility to do basically anything else they want to do when the circumstances are extreme and abnormal. In the case of surveillance and militarized equipment (which is what the robots fall under), an exigent circumstances clause lets the equipment be used, on a one-time emergency basis, in ways the policy does not otherwise allow. (https://www.law.cornell.edu/wex/exigent_circumstances)

Here, for example, are some exigent circumstance reports for drone use in Oakland, from periods when drone use was not otherwise permitted.

https://oaklandprivacy.org/wp-content/uploads/2022/12/January-2020-Exigent-Drone-Use.pdf

https://oaklandprivacy.org/wp-content/uploads/2022/12/August-2020-Exigent-Drone-Use.pdf

We only know about these uses because departments have to report on themselves when they use unapproved technology due to an emergency. That is an acceptable arrangement, as long as they disclose the use and explain the “exigent circumstances” that were occurring.

But they cannot use, and should not be able to use, unapproved equipment or techniques without a present, severe emergency and disclosure. That is the difference between exigent-use-only and standard operating procedure. The proposed policy amendment goes beyond exigent use: it allows standard use whenever police commanders decide to do so, and it does not require prompt public disclosure after use.


 

How soon can this thing pass the Board of Supervisors? (Or, “Where is the Policy Amendment exactly in its process?”)

The “first reading” of the ordinance was already voted on and passed 8-3, but several of the “Yes” votes were clearly uncomfortable with the situation. (We’re not sure why they voted “yes” anyway…)

The next vote is this TUESDAY DECEMBER 6 at 2 PM.

The amended ordinance with the killer robot permission will become law after the second reading, so if we can’t convince some of the supervisors to change their stance in the next few days, San Francisco will become the first city in the country to explicitly authorize deadly force by robots.

 

What language should the Policy Amendment include?

Well, it should say that robots cannot be armed under any circumstances. 

However, since we may need to just slow down a runaway train, we wish to make it clear what any Amendment would need to include. 

The policy should say the robots would only be used:

1) under threat of imminent and significant casualties or severe physical injury, and

2) only after de-escalation efforts and alternative use-of-force techniques have been tried and have failed to subdue the subject, and

3) where there would be no collateral loss of life whatsoever, including bystanders and hostages.

These minimum requirements are humane in a way that hopefully does not need justification or debate.

 

About Dallas

In July of 2016 in Dallas, a suspect had shot a number of police officers and barricaded himself in a building. The Dallas Police Department armed an existing bomb-sniffing robot with an explosive and sent it into the building to kill the suspect. The suspect was killed, and the Dallas Police Department was extremely lucky that there was no collateral damage.

It is the only known case in the United States of civilian law enforcement using a robot to deliver a bomb. Police delivering explosives is not entirely without precedent, however: in 1985, the Philadelphia Police Department dropped a bomb from a helicopter onto the headquarters of the MOVE black liberation group. The MOVE bombing killed eleven people, including five children, burned down over 60 residences, and is generally seen as egregious police violence.

The Dallas incident is often used to justify why it might be necessary to arm a robot with bombs, but if we examine it closely, it does quite the opposite.

Considering it was one man, locked inside a building, alone, with little or no chance of hurting anyone else unless they forced their way inside, it’s pretty easy to say that the Dallas PD jumped the gun. DPD had plenty of other options short of blowing up the building. For instance, they could have stayed far enough away that he couldn’t shoot them and waited hours or days for him to eventually come out.

With all this in mind, there is no reason for San Francisco to establish the blueprint for regularized use of killer robots with few restrictions besides an evaluation by senior police command.

 

Aaron Would Have Been 35 Years Old Today

It has been eight years since Aaron’s death, on January 11, 2013.

We miss you Aaron.

November 8, 2021 would have been Aaron’s 35th birthday, but instead we mourn our friend and wonder what could have been, had he not taken his own life eight years ago after being terrorized by a career-driven prosecutor and U.S. Attorney who decided to just make shit up, make an example out of Aaron, impress their bosses and further their own careers.

As it turns out though, Aaron’s downloading wasn’t even illegal, as he was a Harvard Ethics Fellow at the time and Harvard and MIT had contractual agreements allowing Aaron to access those materials en masse.

But all this didn’t come to light until it was too late.

Aaron was careful not to tell his friends too much about his case for fear he would involve them in the quagmire. In truth, we wouldn’t have minded doing anything we could to help him, but we didn’t realize he needed help, or that the grand jury’s runaway train had gone so far off the rails.

We should have known though, as Grand Juries are a dangerous, outdated practice that gives prosecutors unlimited power, making it easy to manipulate the way that witnesses and evidence are presented to the Grand Jury and to convince jurors of almost anything. These kinds of proceedings also often violate subjects’ and witnesses’ constitutional rights in different ways. For these reasons, most civilized countries have transitioned away from them in favor of preliminary hearings.

We learned many other lessons from his case, after the smoke had cleared. We learned that Aaron’s Grand Jury prosecutor, Assistant U.S. Attorney Stephen Heymann, and the U.S. Attorney in charge of his case, Carmen Ortiz, were so obsessed with trying to make names for themselves that they were willing to fabricate charges and evidence in order to get indictments that would otherwise be unachievable.

As Dan Purcell explained:

“Steve Heymann did what bureaucrats and functionaries often choose to do. He wanted to make a big case to justify his existence and justify his budget. The casualties be damned…

Our bottom line was going to be that Aaron had done only what MIT permitted him to do. He hadn’t gained unauthorized access to anything. He had gained access to JSTOR with full authorization from MIT. Just like anyone in the jury pool, anyone reading Boing Boing, or anyone in the country could have done.

We hoped that the jury would understand that and would acquit Aaron, and it quickly became obvious to us that there really wasn’t going to be opportunity to resolve the case short of trial because Steve Heymann was unreasonable.”

We also learned that MIT was more concerned with its own reputation than with standing up for the truth or protecting Aaron. In fact, we learned that MIT decided to assist the government with its case against Aaron, rather than helping him by pressuring the Feds to drop the case, even after JSTOR had made it clear it did not wish to prosecute.

We know all this because Kevin Poulsen explained to us how he had to sue the Department of Homeland Security to get access to documents in Aaron’s FBI file, and how MIT blocked their release, intervening as a third party to demand a chance to further redact the documents before they were released to Kevin – and the judge granted the request! Only time will tell what MIT was so worried about, but its behavior suggests that there may have been some kind of cover-up regarding its involvement in Aaron’s case.

Most recently, thanks to Property of the People’s Ryan Shapiro, we learned that Aaron had an erroneous code in his FBI record that meant “International Terrorism involving Al Qaeda” – deriving from his sending a single email to the University of Pittsburgh – which might explain why the FBI was so suspicious of him during his case.

There are still many pieces of the puzzle missing, but we won’t stop trying to put it all together. We hope you will join us on November 13th to honor him and learn about his projects and ideas that are still bearing fruit to this day, such as SecureDrop, Open Library, and the Aaron Swartz Day Police Surveillance Project.

Until then, we will continue to come together to help each other and share information, knowledge and resources, and to try to make things better in our world.

 

 Howl For Aaron Swartz (by Brewster Kahle)

Howl for Aaron Swartz

Written by Brewster Kahle, shortly after Aaron’s Death, on January 11, 2013.

Howl for Aaron Swartz
New ways to create culture
Smashed by lawsuits and bullying
Laws that paint most of us criminal

Inspiring young leaders
Sharing everything
Living open source lives
Inspiring communities selflessly

Organizing, preserving
Sharing, promoting
Then crushed by government
Crushed by politicians, for a modest fee
Crushed by corporate spreadsheet outsourced business development

New ways
New communities
Then infiltrated, baited
Set-up, arrested

Celebrating public spaces
Learning, trying, exploring
Targeted by corporate security snipers
Ending up in databases
Ending up in prison

Traps set by those that promised change
Surveillance, wide-eyes, watching everyone now
Government surveillance that cannot be discussed or questioned
Corporate surveillance that is accepted with a click

Terrorists here, Terrorists there
More guns in schools to promote more guns, business
Rendition, torture
Manning, solitary, power

Open minds
Open source
Open eyes
Open society

Public access to the public domain
Now closed out of our devices
Closed out of owning books
Hands off
Do not open
Criminal prosecution

Traps designed by the silicon wizards
With remarkable abilities to self-justify
Traps sprung by a generation
That vowed not to repeat
COINTELPRO and dirty tricks and Democratic National Conventions

Government-produced malware so sophisticated
That career engineers go home each night thinking what?
Saying what to their families and friends?

Debt for school
Debt for houses
Debt for life
Credit scores, treadmills, with chains

Inspiring and optimistic explorers navigating a sea of traps set by us
I see traps ensnare our inspiring generation
Leaders and discoverers finding new ways and getting crushed for it

 

Brewster Kahle: Plea Bargaining and Torture

Brewster Kahle at the Internet Archive’s Aaron Swartz Day Celebration, San Francisco, California, November 8, 2014

Audio Clip:

Link to video of Brewster’s talk (Direct link to Brewster’s talk from within the complete video of all speakers from the event.)

The transcript below has been edited slightly for readability.

Complete transcription:

Welcome to the Internet Archive. I’m Brewster Kahle, Founder and Digital Librarian here, and welcome to our home.

For those that haven’t been here before… The little blinking lights on the 5 petabytes of servers that are in the back, are actually serving millions of people a day, and being kind of a digital library. The little sculptures around are people who have worked at the Internet Archive, including one of Aaron Swartz up toward the front. In the front because he was the architect and lead builder of OpenLibrary.org, which is an Internet Archive site. And also worked on putting Pacer into the Internet Archive (RECAP), Google Books public domain books, and other projects that we’ve worked on over the years. So with this, we’d like to say, “Happy Birthday Aaron, we miss you.”

I’m going to talk about a cheery subject: Plea Bargaining and Torture. When I was trying to think through the approach that was used to bring down Aaron Swartz and to try to make a symbol out of him, I typed these words into my favorite search engine (“Plea Bargaining and Torture”), and back came a paper on the subject, which I am going to summarize and also elaborate on.

I found this wonderful paper, written by a Yale Law Professor in 1978, comparing European torture law and current plea bargaining. This might sound a little bit far-fetched, but stick with me for a minute.

European torture law, I had no idea, was actually a regulated, implemented part of their court system. It started in 1215, when they stopped saying “you’re guilty because God said so.” They had to come up with something else, something that was *that* sure. So they said you either had to have two eyewitnesses, or you had to confess. This was actually an unworkable system. And instead of changing it, they tried to force confessions, and they had a whole system for how to do it: regulations for how much leg clamping you had, how many minutes of this, for different crimes.

[Slide 7: Plea Bargaining and Torture]

So you can see in this diagram this guy getting tortured, but he is surrounded by court clerks. So it’s not, sort of, the Spanish Inquisition as Monty Python would have it. This was actually a state-sponsored system, run by smart people, that was trying to fix a bug in their court system: it was too hard to convict people. So they tortured them into confessions.

Sound familiar?

So, in the United States, now, we have between 90 and 99 percent of convictions coming from plea bargains. It depends on whether you are in Federal or State court, or which county you’re in. 97% of all convictions at the Federal level are done with plea bargaining.

[Slide 3: Plea Bargaining and Torture]

So you have basically no chance of having a trial before a jury of your peers. This is basically a threat system. They actually did studies in Florida where they jacked up the sentences, and the number of people that plea bargained went up. It’s a system to handle convictions outside of the Court System. Outside of the Jury System. Unfortunately, our Constitution actually has something to say about this that’s in pretty direct contradiction:

“The Trial of all Crimes, except in Cases of Impeachment, shall be by Jury;…”
– Article III.2 U.S. Constitution (http://www.archives.gov/exhibits/charters/constitution_transcript.html)

[Slide 2: Plea Bargaining and Torture]

But as another thinker on this has said, basically Plea Bargains have made jury trials obsolete.

[Slide 4: Plea Bargaining and Torture]

When Aaron Swartz was threatened with 35 years, it’s got to have hit a young, idealistic person pretty hard. 35 years for downloading books too fast from the library? This doesn’t make any sense. Yet that’s a pretty big threat, and may have had something to do with it. When this sort of played out, after his death, I just found these quotations notable enough that I’m going to, sort of, bore you with putting them up.

[Slide 5: Plea Bargaining and Torture]

So he was faced with 35 years, thanks to Carmen Ortiz. Wonderful. And the Justice Department had never intended for this: “no more than a three, four, or potentially five-month range,” said the top attorney in the United States. And we shouldn’t really judge what the prosecutors were doing based on what they threatened him with, just by what they were going to do if he pled guilty.

So I think we’ve got a real problem with this. So what’s to do?

Well, I say we should make some noise about it. I think some of the reasons that we don’t make noise about it is it doesn’t happen to our friends. This sort of thing happens to a lot of “other people.” But, in this case, it did happen to our friend, and I think that it’s important for us to respond to it.

[Slide 6: Plea Bargaining and Torture]

I think John Oliver has been on a roll with some of these unbelievable diatribes, actually doing research and bringing it in front of people in an interesting way. I’d also like to pitch: is there a documentarian in the house? We should really go and put this type of behavior in front of more people.

There are others that are trying by not pleading, but it has its downsides: basically, you gum up the courts. At least for me, I take off my… I don’t go through the surveillance device in the airports, and yes, it gums them up a little bit, and I feel like that’s my part to help. Would I actually, if it came right down to it, not plead, to help move this forward? I don’t know. By and large, we’ve got ridiculous catch-all laws, and we’ve got sentences that are just outrageous, and these have just got to come under control. And let’s actually hire some judges.

Chelsea Manning’s Op-Ed for the NY Times: The Dystopia We Signed Up For

From September 13, 2017

The Dystopia We Signed Up For

By Chelsea Manning

In recent years our military, law enforcement and intelligence agencies have merged in unexpected ways. They harvest more data than they can possibly manage, and wade through the quantifiable world side by side in vast, usually windowless buildings called fusion centers.

Such powerful new relationships have created a foundation for, and have breathed life into, a vast police and surveillance state. Advanced algorithms have made this possible on an unprecedented level. Relatively minor infractions, or “microcrimes,” can now be policed aggressively. And with national databases shared among governments and corporations, these minor incidents can follow you forever, even if the information is incorrect or lacking context…

In literature and pop culture, concepts such as “thoughtcrime” and “precrime” have emerged out of dystopian fiction. They are used to restrict and punish anyone who is flagged by automated systems as a potential criminal or threat, even if a crime has yet to be committed. But this science fiction trope is quickly becoming reality. Predictive policing algorithms are already being used to create automated heat maps of future crimes, and like the “manual” policing that came before them, they overwhelmingly target poor and minority neighborhoods.

The world has become like an eerily banal dystopian novel. Things look the same on the surface, but they are not. With no apparent boundaries on how algorithms can use and abuse the data that’s being collected about us, the potential for it to control our lives is ever-growing.

*** full text below for archival purposes***

The Dystopia We Signed Up For

By Chelsea Manning

For seven years, I didn’t exist.

While incarcerated, I had no bank statements, no bills, no credit history. In our interconnected world of big data, I appeared to be no different than a deceased person. After I was released, that lack of information about me created a host of problems, from difficulty accessing bank accounts to trouble getting a driver’s license and renting an apartment.

In 2010, the iPhone was only three years old, and many people still didn’t see smartphones as the indispensable digital appendages they are today. Seven years later, virtually everything we do causes us to bleed digital information, putting us at the mercy of invisible algorithms that threaten to consume our freedom.

Information leakage can seem innocuous in some respects. After all, why worry when we have nothing to hide?

We file our taxes. We make phone calls. We send emails. Tax records are used to keep us honest. We agree to broadcast our location so we can check the weather on our smartphones. Records of our calls, texts and physical movements are filed away alongside our billing information. Perhaps that data is analyzed more covertly to make sure that we’re not terrorists — but only in the interest of national security, we’re assured.

Our faces and voices are recorded by surveillance cameras and other internet-connected sensors, some of which we now willingly put inside our homes. Every time we load a news article or page on a social media site, we expose ourselves to tracking code, allowing hundreds of unknown entities to monitor our shopping and online browsing habits. We agree to cryptic terms-of-service agreements that obscure the true nature and scope of these transactions.

According to a 2015 study from the Pew Research Center, 91 percent of American adults believe they’ve lost control over how their personal information is collected and used.

Just how much they’ve lost, however, is more than they likely suspect.

The real power of mass data collection lies in the hand-tailored algorithms capable of sifting, sorting and identifying patterns within the data itself. When enough information is collected over time, governments and corporations can use or abuse those patterns to predict future human behavior. Our data establishes a “pattern of life” from seemingly harmless digital residue like cellphone tower pings, credit card transactions and web browsing histories.

The consequences of our being subjected to constant algorithmic scrutiny are often unclear. For instance, artificial intelligence — Silicon Valley’s catchall term for deep-thinking and deep-learning algorithms — is touted by tech companies as a path to the high-tech conveniences of the so-called internet of things. This includes digital home assistants, connected appliances and self-driving cars.

Simultaneously, algorithms are already analyzing social media habits, determining creditworthiness, deciding which job candidates get called in for an interview and judging whether criminal defendants should be released on bail. Other machine-learning systems use automated facial analysis to detect and track emotions, or claim the ability to predict whether someone will become a criminal based only on their facial features.

These systems leave no room for humanity, yet they define our daily lives. When I began rebuilding my life this summer, I painfully discovered that they have no time for people who have fallen off the grid — such nuance eludes them. I came out publicly as transgender and began hormone replacement therapy while in prison. When I was released, however, there was no quantifiable history of me existing as a trans woman. Credit and background checks automatically assumed I was committing fraud. My bank accounts were still under my old name, which legally no longer existed. For months I had to carry around a large folder containing my old ID and a copy of the court order declaring my name change. Even then, human clerks and bank tellers would sometimes see the discrepancy, shrug and say “the computer says no” while denying me access to my accounts.

Such programmatic, machine-driven thinking has become especially dangerous in the hands of governments and the police.

In recent years our military, law enforcement and intelligence agencies have merged in unexpected ways. They harvest more data than they can possibly manage, and wade through the quantifiable world side by side in vast, usually windowless buildings called fusion centers.

Such powerful new relationships have created a foundation for, and have breathed life into, a vast police and surveillance state. Advanced algorithms have made this possible on an unprecedented level. Relatively minor infractions, or “microcrimes,” can now be policed aggressively. And with national databases shared among governments and corporations, these minor incidents can follow you forever, even if the information is incorrect or lacking context.

At the same time, the United States military uses the metadata of countless communications for drone attacks, using pings emitted from cellphones to track and eliminate targets.

In literature and pop culture, concepts such as “thoughtcrime” and “precrime” have emerged out of dystopian fiction. They are used to restrict and punish anyone who is flagged by automated systems as a potential criminal or threat, even if a crime has yet to be committed. But this science fiction trope is quickly becoming reality. Predictive policing algorithms are already being used to create automated heat maps of future crimes, and like the “manual” policing that came before them, they overwhelmingly target poor and minority neighborhoods.

The world has become like an eerily banal dystopian novel. Things look the same on the surface, but they are not. With no apparent boundaries on how algorithms can use and abuse the data that’s being collected about us, the potential for it to control our lives is ever-growing.

Our drivers’ licenses, our keys, our debit and credit cards are all important parts of our lives. Even our social media accounts could soon become crucial components of being fully functional members of society. Now that we live in this world, we must figure out how to maintain our connection with society without surrendering to automated processes that we can neither see nor control.

The Intentionality of Evil

Photo by Michael Francis McElroy. From the February 12, 2009 New York Times article about Aaron’s PACER project: https://www.nytimes.com/2009/02/13/us/13records.html

 

By Aaron Swartz:

Everybody thinks they’re good.

And if that’s the case, then intentionality doesn’t really matter. It’s no defense to say (to take a recently famous example) that New York bankers were just doing their jobs, convinced that they were helping the poor or something, because everybody thinks they’re just doing their jobs; Eichmann thought he was just doing his job.

Eichmann, of course, is the right example because it was Hannah Arendt’s book “Eichmann in Jerusalem: A Report on the Banality of Evil” that is famously cited for this thesis. Eichmann, like almost all terrorists and killers, was by our standards a perfectly normal and healthy guy doing what he thought were perfectly reasonable things.

And if that normal guy could do it, so could we. And while we could argue who’s worse — them or us — it’s a pointless game since it’s our actions that we’re responsible for. And looking around, there’s no shortage of monstrous crimes that we’ve committed.

 

Complete version from Aaron’s blog, June 23, 2005:

As children we’re fed a steady diet of comic books (and now, movies based off of them) in which brave heros save the planet from evil people. It’s become practically conventional wisdom that such stories wrongly make the line between good and evil too clear — the world is more nuanced than that, we’re told — but this isn’t actually the problem with these stories. The problem is that the villains know they’re evil.

And people really grow up thinking things work this way: evil people intentionally do evil things. But this just doesn’t happen. Nobody thinks they’re doing evil — maybe because it’s just impossible to be intentionally evil, maybe because it’s easier and more effective to convince yourself you’re good — but every major villain had some justification to explain why what they were doing was good. Everybody thinks they’re good.

And if that’s the case, then intentionality doesn’t really matter. It’s no defense to say (to take a recently famous example) that New York bankers were just doing their jobs, convinced that they were helping the poor or something, because everybody thinks they’re just doing their jobs; Eichmann thought he was just doing his job.

Eichmann, of course, is the right example because it was Hannah Arendt’s book “Eichmann in Jerusalem: A Report on the Banality of Evil” that is famously cited for this thesis. Eichmann, like almost all terrorists and killers, was by our standards a perfectly normal and healthy guy doing what he thought were perfectly reasonable things.

And if that normal guy could do it, so could we. And while we could argue who’s worse — them or us — it’s a pointless game since it’s our actions that we’re responsible for. And looking around, there’s no shortage of monstrous crimes that we’ve committed.

So the next time you mention one to someone and they reply “yes, but we did it with a good intent,” explain to them that’s no defense; the only people who don’t are characters in comic books.

You should follow me on twitter here.

 

Angela Davis: This is a very exciting moment.

Angela Davis, being interviewed “Live” from Oakland, California, on the UK’s @Channel4News. Approximate date: June 9, 2020.

Video Clip here:
https://twitter.com/Channel4News/status/1270434723064696835?s=20

Full interview here: https://www.youtube.com/watch?v=i3TU3QaarQE&feature=emb_logo

Full Transcription of video clip:

Change has to come in many forms. It has to be political. It has to be economic. It has to be social.

What we are witnessing are very new demands. For who knows how long, we’ve been calling for accountability for individual police officers responsible for what amounts to lynchings – for continuing the tradition of extra-judicial lynching, but under the color of the law.

What we are seeing now are new demands. Demands to demilitarize the police. Demands to defund the police. Demands to dismantle the police and envision different modes of public safety.

We’re asked now to consider how we might imagine justice in the future.

This is a very exciting moment. I don’t know if we have ever experienced this kind of global challenge to racism and to the consequences of slavery and colonialism.

Supreme Court: All federal laws that prohibit discrimination on the basis of sex also outlaw discrimination based on sexual orientation or gender identity

Big decision by the Supremes today re: Title VII of the Civil Rights Act!

Here’s an op ed by Erwin Chemerinsky, Dean of the UC Berkeley School of Law: https://news.yahoo.com/op-ed-supreme-court-victory-175702939.html

From the op-ed by Erwin Chemerinsky:

The decision is hugely important in protecting gay, lesbian and transgender individuals from discrimination in workplaces across the country. But its significance is broader than that. It should be understood to say that all federal laws that prohibit discrimination on the basis of sex also outlaw discrimination based on sexual orientation or gender identity.


Here is decision itself:
https://d2qwohl8lx5mh1.cloudfront.net/8hVHe52Cq4sPdF0wEaTaCQ/content

From the Supreme Court decision itself: (page 13):

From the ordinary public meaning of the statute’s language at the time of the law’s adoption, a straightforward rule emerges: An employer violates Title VII when it intentionally fires an individual employee based in part on sex. It doesn’t matter if other factors besides the plaintiff’s sex contributed to the decision. And it doesn’t matter if the employer treated women as a group the same when compared to men as a group. If the employer intentionally relies in part on an individual employee’s sex when deciding to discharge the employee—put differently, if changing the employee’s sex would have yielded a different choice by the employer—a statutory violation has occurred. Title VII’s message is “simple but momentous”: An individual employee’s sex is “not relevant to the selection, evaluation, or compensation of employees.” Price Waterhouse v. Hopkins, 490 U. S. 228, 239 (1989) (plurality opinion).

The statute’s message for our cases is equally simple and momentous: An individual’s homosexuality or transgender status is not relevant to employment decisions. That’s because it is impossible to discriminate against a person for being homosexual or transgender without discriminating against that individual based on sex.


How to use Bridgefy as an Emergency Mesh Network

by Lisa Rein & Matteo Borri – June 1, 2020

Download link for Bridgefy

Introduction (What is a mesh network and how does it work?)

A mesh network application will allow you to continue to communicate – over Bluetooth – even after all WiFi and cellular service has been interrupted.

So, for instance if your local city decides to turn off phone service on a given city block, you and your group of people will still be able to communicate with each other – in a small area. (About 100 meters, for example, if you had five people who were all about 20 meters away from each other.)

As long as there are a lot of people with bluetooth and Bridgefy installed within a given area, a “mesh network” will be enabled that has the potential to go even farther distances (than 100 meters).
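The way short Bluetooth hops chain together into longer range can be sketched in a few lines of code. This is a toy flooding simulation – not Bridgefy’s actual protocol – and the node names, positions, and the 20-meter radio range are assumptions made up for the example:

```python
# Toy simulation of multi-hop mesh flooding. Not Bridgefy's actual
# protocol -- just an illustration of how several short Bluetooth
# hops relay a message much farther than one phone's direct reach.

RADIO_RANGE = 20  # meters one phone can reach directly (assumed)

def in_range(a, b):
    """True if two phones (positions in meters) can hear each other."""
    return abs(a - b) <= RADIO_RANGE

def flood(nodes, source, dest):
    """Flood a message outward from source; return hops to dest, or None."""
    reached = {source: 0}
    frontier = [source]
    while frontier:
        nxt = []
        for n in frontier:
            for m in nodes:
                if m not in reached and in_range(nodes[n], nodes[m]):
                    reached[m] = reached[n] + 1
                    nxt.append(m)
        frontier = nxt
    return reached.get(dest)

# Five phones spaced 20 m apart along a street: 80 m end to end,
# far beyond any single phone's direct 20 m reach.
phones = {"A": 0, "B": 20, "C": 40, "D": 60, "E": 80}
print(flood(phones, "A", "E"))  # message relays through B, C, D: 4 hops
```

The point of the sketch: remove the middle phones and the message from A never reaches E, which is why the app works best when lots of people in the area have it installed.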

More details here: https://medium.com/bridgefy/how-to-use-the-bridgefy-offline-messaging-app-b4799af7649b

How to hold your phone for the best reception

Reminder that Bridgefy basically turns a phone into a digital walkie talkie.  So, you may need to hold it up and vertically to get the best range.

BATTERY Considerations

Using mesh networks uses up your phone’s battery very quickly – so be sure to bring extra batteries.

Bandwidth issues (why you should keep it to text and LOW RES images, and not use too much voice or, god forbid, video)

Bluetooth’s bandwidth isn’t the greatest. So, although you CAN send anything – anything but text will be very slow to move around and can slow down the network. It will work, but it’ll glitch out a lot.
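Some rough arithmetic shows why. The ~10 KB/s effective throughput below is an assumed ballpark for a Bluetooth LE link, not a measured Bridgefy figure – real rates vary with distance, interference, and phone hardware:

```python
# Back-of-the-envelope transfer times over a slow Bluetooth link.
# THROUGHPUT is an assumed ~10 KB/s; actual rates will vary.
THROUGHPUT_BYTES_PER_SEC = 10_000

def transfer_seconds(size_bytes):
    """Seconds to move a payload across one hop at the assumed rate."""
    return size_bytes / THROUGHPUT_BYTES_PER_SEC

print(transfer_seconds(200))        # a short text message: 0.02 s
print(transfer_seconds(2_000_000))  # a 2 MB photo: 200 s -- per hop
```

And remember the cost repeats at every relay hop, so one big photo can tie up the whole mesh while a text slips through almost instantly.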

Security Considerations

Using Anonymously

If you wish to create an “anonymous, unverified account” – remember when you are installing to NOT SYNC THE APP WITH YOUR CONTACTS when it asks.

Note that you and your friends can quickly “see” each other on the network, using the “Broadcast” group chat feature, BUT GROUP CHATS ARE NOT ENCRYPTED. (Perhaps you don’t care if the group messages are encrypted, since everyone is using anonymous names. Or, it could be really important that your group’s communications are not publicly seen.)

More details here: https://medium.com/bridgefy/how-to-use-the-bridgefy-offline-messaging-app-b4799af7649b

Using with your phone identified (forever, after the first time) and your contacts synced

If you sync contacts, it will grab your phone’s IMEI identifier. If you try to change your account name and save it (in “settings” under “About” and then “Profile”), it will give you the new name, but keep the old name in parentheses when you are identified to others as a user on their phones. Like this: “new name (old name).”

If you decide to let your phone be identified and let the app sync up with your contacts, you will have to be connected to the internet for it to configure properly.

Encrypted messages between individuals vs. Clear text “group” messages using “Broadcast” feature

Messages between two accounts are encrypted. The “Broadcast” feature is useful because it allows you to message everyone on the network at once, but in CLEAR TEXT.

Bridgefy uses RSA encryption for messages sent between individual accounts. Encrypted group chat is not yet available. (Only unencrypted group chat – with a default of “all users in range” – is available, using the “Broadcast” feature.)
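To give a feel for what RSA between two accounts means, here is textbook RSA with toy numbers. This is only an illustration of the idea – anyone can encrypt to your public key, but only you can decrypt – and it is emphatically not Bridgefy’s implementation, which would use large keys and proper padding:

```python
# Textbook RSA with tiny toy primes -- concept demo only. Real RSA
# (including whatever Bridgefy actually ships) uses ~1024-bit-plus
# primes and padding schemes such as OAEP; never use this for real.

p, q = 61, 53              # toy primes
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent = modular inverse (2753)

def encrypt(m, pub_n=n, pub_e=e):
    return pow(m, pub_e, pub_n)   # anyone with (n, e) can encrypt

def decrypt(c, priv_d=d, pub_n=n):
    return pow(c, priv_d, pub_n)  # only the private-key holder can decrypt

msg = 42
print(decrypt(encrypt(msg)) == msg)  # True
```

The “Broadcast” channel skips this entirely: everything there goes out in the clear to every phone in range.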

BEFORE YOU INSTALL

IMPORTANT:

Before you start installing the Bridgefy app on your phone, you need to decide if you wish to use Bridgefy anonymously OR if you want to sync it to your contacts and have your phone uniquely identified.

DO NOT sync with your contacts if you EVER want to have an anonymous, unverified account. If you sync even ONE TIME, Bridgefy grabs your phone’s IMEI identifier – and won’t ever forget it. (Even if you uninstall it, reboot your phone, and reinstall it.)

Again: Once you sync your contacts and your phone’s IMEI number has been identified, you can’t create an anonymous account. You can create a new name, but when you look at your app – and when others see you on the network – the old name (in parentheses) will keep showing up next to the new name.

Like this: New name (old name)

NOTE: You do not need to install the SDK to send and receive text messages over the mesh network. The SDK requires an account and is not needed to simply communicate and send text messages over the network.

Install Instructions

1. After deciding to be anonymous or to sync your contacts and be forever identified, install Bridgefy onto your phone (from your phone’s app store).

2. Give yourself an account name (make note of it perhaps – so you can tell others what it is).

3. If you are making an anonymous account, the program will say you have created an “unverified” account.

4. If you decide to sync with your contacts, make sure you are connected to the internet, say “yes,” and then give it a couple minutes.

5. After it syncs with them (which takes a minute) – all of your contacts that have Bridgefy installed will pop up under “contacts.” (Just like the way Signal brings up your contacts, if they also have the app installed.)

Click on the “Contacts” icon along the bottom of the screen, and it will display:

1) any of your contacts that have the app installed
2) ALL USERS on the network that are “nearby” and can be messaged individually.

Note: Messages between individual users are encrypted, while group chats via the “Broadcast” feature are in CLEAR TEXT.

6. If no one is showing up under “nearby” in “contacts,” the other way of seeing folks on the network – and a good way to kinda “wake up” your system if you’re not seeing anyone on it – is to click on “Broadcast” (in the row of icons along the bottom of the screen).

7. After you click on it, an empty looking “Broadcast” window comes up with a text box at the very bottom where you can text to EVERYONE on the mesh network (IN CLEAR TEXT).

8. While on the “Broadcast” screen, you can find other users on the network by clicking on the upper right corner, where there is a contact-y looking icon with a red number showing you how many other people are on the network. Touch that – and it will give you a list of the people’s handles that are within range, so you can message people individually.

9. Although it’s very easy to text the whole group from the BROADCAST page, remember that it’s in CLEAR TEXT – AND EVEN USERS YOU HAVE BLOCKED CAN SOMETIMES SEE ALL BROADCAST MESSAGES (according to our testing). (Speak up if we’re wrong about this :)

10. So, whether you access your contacts via “contacts” along the bottom (after allowing it to sync to your contacts) – or access a list of “nearby” people you can message by clicking on the contact-ish icon in the upper right of the “Broadcast” window, once you can see a name, you can select it and:

-start a conversation
-delete a conversation
-block that user from getting any texts from you

Okay we think that about covers it. But we are open to adding more details or changing anything we might have gotten wrong. Please email us at aaronswartzday@protonmail.com.

New “Toolkit” from the ACLU & Oakland Privacy Can Help You Get A Surveillance Ordinance Passed

via Oakland Privacy

The ACLU has just announced a new Surveillance Ordinance Toolkit!

Two years in the making – and in collaboration with Oakland Privacy – this toolkit from the ACLU provides the critical information for part three of the Aaron Swartz Day Police Surveillance Project’s goals: to help folks present the information to their own local city councils to help get a surveillance policy in place for how any surveillance equipment is allowed to be used on its citizens.

Since the November 2017 Aaron Swartz Day weekend, our police surveillance project has taught activists how to use our templates with MuckRock to file public records requests – with a city’s police and sheriff departments (in the spirit of Aaron Swartz’s many FOIA requests via MuckRock) – so that they will have to hand over any relevant documentation when specific types of surveillance equipment have already been purchased.

The last, and most complicated, of our project’s three goals involves figuring out what to do once you have evidence of the surveillance equipment’s existence. Ideally, once a piece of surveillance equipment has been identified, a city council can start the process of implementing a use policy, based on an ordinance, which regulates how that equipment is allowed to be used. But what’s the best way to present that evidence to a city council for it to act upon?

Thanks to Tracy Rosenberg, we have provided some easy-to-use templates for obtaining information directly from a city’s own police and sheriff departments, so that the information can be taken to the City Council. Now, the ACLU & Oakland Privacy have created a toolkit for the process of taking that information to the City Council.

From the Oakland Privacy article:

Getting a surveillance transparency ordinance or a facial recognition ban passed in your town can seem like an overwhelming task. It’s not. You can do it!

Oakland Privacy and ACLU of Northern California sat down to write a step by step guide based on the dozen ordinances in place across the country, including 8 in the San Francisco Bay Area.

This free guide includes loads of advice on coalition-building, public education, strategy, research, messaging and advocacy and samples of useful documents.

Howl For Aaron Swartz

If Aaron were still alive he’d be 33 years old, with most of his life still ahead of him.

It’s never easy on January 11th. This year will be no exception.

Brewster Kahle wrote this poem about Aaron shortly after Aaron’s death, in 2013. It was filmed in 2015, and first published in January 2017.

Howl for Aaron Swartz

Written by Brewster Kahle, shortly after Aaron’s death on January 11, 2013.

Howl for Aaron Swartz
New ways to create culture
Smashed by lawsuits and bullying
Laws that paint most of us criminal

Inspiring young leaders
Sharing everything
Living open source lives
Inspiring communities selflessly

Organizing, preserving
Sharing, promoting
Then crushed by government
Crushed by politicians, for a modest fee
Crushed by corporate spreadsheet outsourced business development

New ways
New communities
Then infiltrated, baited
Set-up, arrested

Celebrating public spaces
Learning, trying, exploring
Targeted by corporate security snipers
Ending up in databases
Ending up in prison

Traps set by those that promised change
Surveillance, wide-eyes, watching everyone now
Government surveillance that cannot be discussed or questioned
Corporate surveillance that is accepted with a click

Terrorists here, Terrorists there
More guns in schools to promote more guns, business
Rendition, torture
Manning, solitary, power

Open minds
Open source
Open eyes
Open society

Public access to the public domain
Now closed out of our devices
Closed out of owning books
Hands off
Do not open
Criminal prosecution

Traps designed by the silicon wizards
With remarkable abilities to self-justify
Traps sprung by a generation
That vowed not to repeat
COINTELPRO and dirty tricks and Democratic National Conventions

Government-produced malware so sophisticated
That career engineers go home each night thinking what?
Saying what to their families and friends?

Debt for school
Debt for houses
Debt for life
Credit scores, treadmills, with chains

Inspiring and optimistic explorers navigating a sea of traps set by us
I see traps ensnare our inspiring generation
Leaders and discoverers finding new ways and getting crushed for it