
Electronic Frontier Foundation

European Court of Human Rights Confirms: Weakening Encryption Violates Fundamental Rights

EFF - Tue, 03/05/2024 - 9:09am

In a milestone judgment, Podchasov v. Russia, the European Court of Human Rights (ECtHR) has ruled that weakening encryption can lead to general and indiscriminate surveillance of the communications of all users and violates the human right to privacy.

In 2017, the landscape of digital communication in Russia faced a pivotal moment when the government required Telegram Messenger LLP and other “internet communication” providers to store all communication data—and content—for specified durations. These providers were also required to supply law enforcement authorities with users’ data, the content of their communications, as well as any information necessary to decrypt user messages. The FSB (the Russian Federal Security Service) subsequently ordered Telegram to assist in decrypting the communications of specific users suspected of engaging in terrorism-related activities.

Telegram opposed this order on the grounds that it would create a backdoor that would undermine encryption for all of its users. As a result, Russian courts fined Telegram and ordered the blocking of its app within the country. The controversy extended beyond Telegram, drawing in numerous users who contested the disclosure orders in Russian courts. A Russian citizen, Mr Podchasov, escalated the issue to the European Court of Human Rights (ECtHR), arguing that forced decryption of user communication would infringe on the right to private life under Article 8 of the European Convention of Human Rights (ECHR), which reads as follows:  

Everyone has the right to respect for his private and family life, his home and his correspondence (Article 8 ECHR, right to respect for private and family life, home and correspondence) 

EFF has always stood against government intrusion into the private lives of users and advocated for strong privacy guarantees, including the right to confidential communication. Encryption not only safeguards users’ privacy but also protects their right to freedom of expression protected under international human rights law. 

In a great victory for privacy advocates, the ECtHR agreed. The Court found that the requirement of continuous, blanket storage of private user data interferes with the right to privacy under the Convention, emphasizing that the possibility for national authorities to access these data is a crucial factor in determining a human rights violation [at 53]. The Court identified the inherent risks of arbitrary government action in secret surveillance in the present case and found again, following its stance in Roman Zakharov v. Russia, that the relevant legislation failed to meet quality-of-law standards and lacked adequate and effective safeguards against misuse [75]. Turning to a potential justification for such interference, the ECtHR emphasized the need for a careful balancing test that considers the use of modern data storage and processing technologies and weighs the potential benefits against important private-life interests [62-64].

In addressing the State mandate for service providers to submit decryption keys to security services, the court's deliberations culminated in the following key findings [76-80]:

  1. Encryption is important for protecting the right to private life and other fundamental rights, such as freedom of expression: The ECtHR emphasized the importance of encryption technologies for safeguarding the privacy of online communications. Encryption protects the right to private life generally while also supporting the exercise of other fundamental rights, such as freedom of expression.
  2. Encryption as a shield against abuses: The Court emphasized the role of encryption to provide a robust defense against unlawful access and generally “appears to help citizens and businesses to defend themselves against abuses of information technologies, such as hacking, identity and personal data theft, fraud and the improper disclosure of confidential information.” The Court held that this must be given due consideration when assessing measures which could weaken encryption.
  3. Decryption of communications orders weakens the encryption for all users: The ECtHR established that the need to decrypt Telegram's "secret chats" requires the weakening of encryption for all users. Taking note again of the dangers of restricting encryption described by many experts in the field, the Court held that backdoors could be exploited by criminal networks and would seriously compromise the security of all users’ electronic communications. 
  4. Alternatives to decryption: The ECtHR took note of a range of alternative solutions to compelled decryption that would not weaken the protective mechanisms, such as forensics on seized devices and better-resourced policing.  
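The Court's third finding, that an order to decrypt one user's messages weakens encryption for everyone, follows from how encryption keys work. The toy sketch below (plain Python using a one-time-pad XOR cipher, not any real messaging protocol) illustrates the core property: only a holder of the key can read the message, so any mandated "extra" copy of the key is a single point of failure for every user of the same system.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: without the key, the ciphertext is unreadable.
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return encrypt(ciphertext, key)

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints

ciphertext = encrypt(message, key)
assert decrypt(ciphertext, key) == message  # the intended recipient can read it

# Anyone without the exact key (here, a key differing in every byte)
# recovers only gibberish -- which is why escrowing keys for authorities
# is equivalent to breaking the scheme for everyone who relies on it.
wrong_key = bytes(k ^ 0xFF for k in key)
assert decrypt(ciphertext, wrong_key) != message
```

Real end-to-end messengers use authenticated public-key protocols rather than one-time pads, but the dependency is the same: whoever is given key material, including a government, can read everything protected by it.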

In light of these findings, the Court held that the mandate to decrypt end-to-end encrypted communications risks weakening the encryption mechanism for all users, which was disproportionate to the legitimate aims pursued.

In summary [80], the Court concluded that the retention and unrestricted state access to internet communication data, coupled with decryption requirements, cannot be regarded as necessary in a democratic society, and are thus unlawful. It emphasized that a direct access of authorities to user data on a generalized basis and without sufficient safeguards impairs the very essence of the right to private life under the Convention. The Court also highlighted briefs filed by the European Information Society Institute (EISI) and Privacy International, which provided insight into the workings of end-to-end encryption and explained why mandated backdoors represent an illegal and disproportionate measure. 

Impact of the ECtHR ruling on current policy developments 

The ruling is a landmark judgment, which will likely draw new normative lines about human rights standards for private and confidential communication. We are currently supporting Telegram in its parallel complaint to the ECtHR, contending that blocking its app infringes upon fundamental rights. As part of a collaborative effort of international human rights and media freedom organisations, we have submitted a third-party intervention to the ECtHR, arguing that blocking an entire app is a serious and disproportionate restriction on freedom of expression. That case is still pending.

The Podchasov ruling also directly challenges ongoing efforts in Europe to weaken encryption to allow access and scanning of our private messages and pictures.

For example, the UK's controversial Online Safety Act creates the risk that online platforms will use software to search all users' photos, files, and messages, scanning for illegal content. We recently submitted comments to the relevant UK regulator, Ofcom, urging it to avoid any weakening of encryption when the law becomes operational.

In the EU, we are concerned that the European Commission's message-scanning proposal (CSAR) would be a disaster for online privacy. It would allow EU authorities to compel online services to scan users' private messages, compare users' photos against law enforcement databases, or use error-prone AI algorithms to detect criminal behavior. Such detection measures will inevitably lead to dangerous and unreliable client-side scanning practices, undermining the essence of end-to-end encryption. As the ECtHR deems generalized user scanning disproportionate, specifically criticizing measures that weaken existing privacy standards, forcing platforms like WhatsApp or Signal to weaken security by inserting a vulnerability into all users' devices to enable message scanning must be considered unlawful.
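Mechanically, client-side scanning means the user's own device checks each message against a database before encryption ever happens. The sketch below is a deliberately simplified illustration using exact SHA-256 matching; the blocklist and file contents are hypothetical, and real proposals typically rely on perceptual hashes, which tolerate small image changes but also produce false positives.

```python
import hashlib

# Hypothetical blocklist of known content, stored as SHA-256 digests.
BLOCKLIST = {hashlib.sha256(b"known illegal file").hexdigest()}

def client_side_scan(message: bytes) -> bool:
    # Runs on the user's own device *before* encryption: every message
    # must be inspected on a third party's behalf, which is why such
    # scanning is widely seen as incompatible with end-to-end encryption.
    return hashlib.sha256(message).hexdigest() in BLOCKLIST

assert client_side_scan(b"known illegal file") is True
assert client_side_scan(b"holiday photo") is False
```

Even in this toy form, the design point is visible: the scanner, not the user, decides what leaves the device unreported, so the confidentiality of the channel depends entirely on who controls the blocklist.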

The EU regulation proposal is likely to be followed by other proposals to grant law enforcement access to encrypted data and communications. An EU high level expert group on ‘access to data for effective law enforcement’ is expected to make policy recommendations to the next EU Commission in mid-2024. 

We call on lawmakers to take the European Court of Human Rights' ruling seriously: blanket and indiscriminate scanning of user communication and the general weakening of encryption for users is unacceptable and unlawful.

Voting No on Prop E Is Easy and Important for San Francisco

EFF - Mon, 03/04/2024 - 5:11pm

San Francisco’s ballot initiative Proposition E is a dangerous and deceptive measure that threatens our privacy, safety, and democratic ideals. It would give the police more power to surveil, chase, and harm. It would allow the police to secretly acquire and use unproven surveillance technologies for a year or more without oversight, eliminating the hard-won protections backed by a majority of San Franciscans that are currently in place. Prop E is not a solution to the city’s challenges, but rather a threat to our rights and freedoms. 

Don’t be fooled by the misleading arguments of Prop E's supporters. A group of tech billionaires have contributed a small fortune to convince San Francisco voters that they would be safer if surveilled. They want us to believe that Prop E will make us safer and more secure, but the truth is that it will do the opposite. Prop E will allow the police to use any surveillance technology they want for up to a year without considering whether it works as promised—or at all—or whether it presents risks to residents’ privacy or safety. Police only have to present a use policy after a year of free and unaccountable use, and absent a majority vote of the Board of Supervisors rejecting the policy, this unaccountable use could continue indefinitely. Worse still, some technologies, like surveillance cameras and drones, would be exempt from oversight indefinitely, putting the unilateral decision about when, where, and how to deploy such technology in the hands of the SFPD.

We want something different for our city. In 2019, with the support of a wide range of community members and civil society groups including EFF, San Francisco's Board of Supervisors took a historic step forward by passing a groundbreaking surveillance transparency and accountability ordinance through a 10-1 vote. The law requires that before a city department, including the police, acquires or uses a surveillance technology, the department must present a use policy to the Board of Supervisors, which then considers the proposal in a public process that offers opportunity for public comment. This process respects privacy, dignity, and safety and empowers residents to make their voices heard about the potential impacts and risks.

Despite what Prop E proponents would have you believe, the city’s surveillance ordinance has not stopped police from acquiring new technologies. In fact, they have gained access to broad networks of live-feed cameras. Current law helps ensure that the police follow reasonable guidelines on using technology and mitigating potentially disparate harms. Prop E would gut police accountability from this law and return decision-making about how we are surveilled to closed spaces where unproven and unvetted vendor promises rule the narrative. 

As San Francisco residents, we must stand up for ourselves and our city and vote No on Prop E. Voting No on Prop E is not only an easy choice, but also a necessary one. It is a choice that reflects our values and vision for San Francisco. It is a choice that shows that we will not let a million-dollar campaign of fear drive us to sacrifice our rights. Voting No on Prop E is a choice that proves we are unwilling to accept anything less than what we deserve: privacy, safety, and accountability.

March 5 is election day. Make your voice heard. Vote No on Prop E.  

Celebrating 15 Years of Surveillance Self-Defense

EFF - Mon, 03/04/2024 - 1:59pm

On March 3rd, 2009, we launched Surveillance Self-Defense (SSD). At the time, we pitched it as, "an online how-to guide for protecting your private data against government spying." In the 15 years since, hundreds of people have contributed to SSD, over 20 million people have read it, and the content has nearly doubled in length from 40,000 words to almost 80,000. SSD has served as inspiration for many other guides focused on keeping specific populations safe, and those guides have in turn affected how we've approached SSD. A lot has changed in the world over the last 15 years, and SSD has changed with it.

The Year Is 2009

Let's take a minute to travel back in time to the initial announcement of SSD. Launched with the support of the Open Society Institute, and written entirely by just a few people, we detailed exactly what our intentions were with SSD at the start:

EFF created the Surveillance Self-Defense site to educate Americans about the law and technology of communications surveillance and computer searches and seizures, and to provide the information and tools necessary to keep their private data out of the government's hands… The Surveillance Self-Defense project offers citizens a legal and technical toolkit with tips on how to defend themselves in case the government attempts to search, seize, subpoena or spy on their most private data.

SSD's design when it first launched in 2009.

To put this further into context, it's worth looking at where we were in 2009. Avatar was the top grossing movie of the year. Barack Obama was in his first term as president in the U.S. In a then-novel approach, Iranians turned to Twitter to organize protests. The NSA has a long history of spying on Americans, but we hadn't gotten to Jewel v. NSA or the Snowden revelations yet. And while the iPhone had been around for two years, it hadn't seen its first big privacy controversy yet (that would come in December of that year, but it'd be another year still before we hit the "your apps are watching you" stage).

Most importantly, in 2009 it was more complicated to keep your data secure than it is today. HTTPS wasn't common, using Tor required more technical know-how than it does nowadays, encrypted IMs were the fastest way to communicate securely, and full-disk encryption wasn't a common feature on smartphones. Even for computers, disk encryption required special software and knowledge to implement (not to mention time: solid state drives were still extremely expensive in 2009, so most people still had spinning-disk hard drives, which took ages to encrypt and usually slowed down your computer significantly).

And thus, SSD in 2009 focused heavily on law enforcement and government access with its advice. Not long after the launch in 2009, in the midst of the Iranian uprising, we launched the international version, which focused on the concerns of individuals struggling to preserve their right to free expression in authoritarian regimes.

And that's where SSD stood, mostly as-is, for about six years. 

The Redesigns

In 2014, we redesigned and relaunched SSD with support from the Ford Foundation. The relaunch had at least 80 people involved in the writing, reviewing, design, and translation process. With the relaunch, there was also a shift in the mission as the threats expanded from just the government, to corporate and personal risks as well. From the press release:

"Everyone has something to protect, whether it's from the government or stalkers or data-miners," said EFF International Director Danny O'Brien. "Surveillance Self-Defense will help you think through your personal risk factors and concerns—is it an authoritarian government you need to worry about, or an ex-spouse, or your employer?—and guide you to appropriate tools and practices based on your specific situation."

2014 proved to be an effective year for a major update. After the murders of Michael Brown and Eric Garner, protestors hit the streets across the U.S., which made our protest guide particularly useful. There were also major security vulnerabilities that year, like Heartbleed, which caused all sorts of security issues for website operators and their visitors, and Shellshock, which opened up everything from servers to cameras to bug exploits, ushering in what felt like an endless stream of software updates on everything with a computer chip in it. And of course, there was still fallout from the Snowden leaks in 2013.

In 2018 we did another redesign, and added a new logo for SSD that came along with EFF's new design. This is more or less the same design of the site today.

SSD's current design, which further clarifies what sections a guide is in, and expands the security scenarios.

Perhaps the most notable difference between this iteration of SSD and the years before is the lack of detailed reasoning explaining the need for its existence on the front page. No longer was it necessary to explain why we all need to practice surveillance self-defense. Online surveillance had gone mainstream.

Shifting Language Over the Years

As the years passed and the site was redesigned, we also shifted how we talked about security. In 2009 we wrote about security with terms like, "adversaries," "defensive technology," "threat models," and "assets." These were all common cybersecurity terms at the time, but made security sound like a military exercise, which often disenfranchised the very people who needed help. For example, in the later part of the 2010s, we reworked the idea of "threat modeling," when we published Your Security Plan. This was meant to be less intimidating and more inclusive of the various types of risks that people face.

The advice in SSD has changed over the years, too. Take passwords as an example, where in 2009 we said, "Although we recommend memorizing your passwords, we recognize you probably won't." First off, rude! Second off, maybe that could fly with the lower number of accounts we all had back in 2009, but nowadays nobody is going to remember hundreds of passwords. And regardless, that seems pretty dang impossible when paired with the final bit of advice, "You should change passwords every week, every month, or every year — it all depends on the threat, the risk, and the value of the asset, traded against usability and convenience."

Moving on to 2015, we phrased this same sentiment much differently: "Reusing passwords is an exceptionally bad security practice, because if an attacker gets hold of one password, she will often try using that password on various accounts belonging to the same person… Avoiding password reuse is a valuable security precaution, but you won't be able to remember all your passwords if each one is different. Fortunately, there are software tools to help with this—a password manager."

Well, that's much more polite!

Since then, we've toned that down even more: "Reusing passwords is a dangerous security practice. If someone gets ahold of your password—whether that's from a data breach, or wherever else—they can often gain access to any other account you used that same password for. The solution is to use unique passwords everywhere and take additional steps to secure your accounts when possible."
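The advice above boils down to one mechanical idea: one long random password per account, generated and remembered by software rather than by you. As a hedged illustration (plain Python standard library, with made-up site names; real password managers also encrypt the vault at rest), it looks roughly like this:

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    # secrets (not random) draws from the OS CSPRNG, which is
    # what makes the output suitable for credentials.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# A password manager keeps one unique password per site, so a breach
# at one service never unlocks any other account.
vault = {site: generate_password() for site in ("mail", "bank", "forum")}

assert len(set(vault.values())) == len(vault)     # all unique
assert all(len(p) == 20 for p in vault.values())  # all full-length
```

Nothing here needs memorizing except the single passphrase that unlocks the vault, which is exactly the trade the 2015 and current SSD guidance recommends.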

Security is an always-evolving process, and so is how we talk about it. But the more people we bring on board, the better it is for everyone. How we talk about surveillance self-defense will assuredly continue to adapt in the future.

Shifting Language(s) Over the Years

Initially in 2009, SSD was only available in English, and soon after launch, in Bulgarian. In the 2014 re-launch, we added Arabic and Spanish. We then added French, Thai, Vietnamese, and Urdu in 2015. Later that year, we added a handful of Amharic translations, too. This was accomplished through a web of people in dozens of countries who volunteered to translate and review everything. Many of these translations were done for highly specific reasons. For example, we had a Google Policy Fellow, Endalk Chala, who was part of the Zone 9 bloggers in Ethiopia. He translated everything into Amharic as he was fighting for his colleagues and friends who were imprisoned in Ethiopia on terrorism charges.

By 2019, we were translating most of SSD into at least 10 languages: Amharic, Arabic, Spanish, French, Russian, Turkish, Vietnamese, Brazilian Portuguese, Thai, and Urdu (as well as additional, externally-hosted community translations in Indonesian Bahasa, Burmese, Traditional Chinese, Igbo, Khmer, Swahili, Yoruba, and Twi).

Currently, we're focusing on getting the entirety of SSD re-translated into seven languages, then focusing our efforts on translating specific guides into other languages. 

Always Updating

Since 2009, we've done our best to review and update the guides in SSD. This has included minor changes to respond to news events, deprecating guides completely when they're no longer applicable in modern security plans, and massive rewrites when technology has changed.

The original version of SSD launched mostly as static text (we even offered a printer-friendly version); updates and revisions did occur, but they were not publicly tracked as clearly as they are today. In its early years, SSD provided useful guidance across a number of important events, like Occupy Wall Street, before the major site redesign in 2014, which helped it become more useful for training activists, including for Ferguson and Standing Rock, amongst others. The ability to update SSD along with changing trends and needs has ensured it can always be useful as a resource.

That redesign also better facilitated the updates process. The site became easier to navigate and use, and easier to update. For example, in 2017 we took on a round of guide audits in response to concerns following the 2016 election. In 2019 we continued that process with around seven major updates to SSD, and in 2020, we did five. We don't have great stats for 2021 and 2022, but in 2023 we managed 14 major updates or new guides. We're hoping to have the majority of SSD reviewed and revamped by the end of this year, with a handful of expansions along the way.

Which brings us to the future of SSD. We will continue updating, adapting, and adding to SSD in the coming years. It is often impossible to know what will be needed, but rest assured we'll be there to answer that whenever we can. As mentioned above, this includes getting more translations underway, and continuing to ensure that everything is accurate and up-to-date so SSD can remain one of the best repositories of security information available online.

We hope you’ll join EFF in celebrating 15 years of SSD!

Privacy Isn't Dead. Far From It. | EFFector 36.3

EFF - Mon, 03/04/2024 - 1:31pm

As we continue the journey of fighting for digital freedoms, it can be hard to keep up on the latest happenings. Thankfully, EFF has a guide to keep you in the loop! EFFector 36.3 is out now and covers the latest news, including recent changes to the Kids Online Safety Act (spoiler alert: IT'S STILL BAD), why we flew a plane over San Francisco, and the first episode of Season 5 of our award-winning "How to Fix the Internet" podcast!

You can read the full newsletter here, or subscribe to get the next issue in your inbox automatically! You can also listen to the audio version of the newsletter on the Internet Archive, or by clicking the button below:

LISTEN ON YouTube

EFFector 36.3 | Privacy Isn't Dead. Far From It.

Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression. 

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.

A Virtual Reality Tour of Surveillance Tech at the Border: A Conversation with Dave Maass of the Electronic Frontier Foundation

EFF - Mon, 03/04/2024 - 12:13pm

This interview is crossposted from The Markup, a nonprofit news organization that investigates technology and its impact on society.

By: Monique O. Madan, Investigative Reporter at The Markup

After reading my daily news stories amid his declining health, my grandfather made it a habit of traveling the world—all from his desk and wheelchair. When I went on trips, he always had strong opinions and recommendations for me, as if he’d already been there. “I've traveled to hundreds of countries," he would tell me. "It's called Google Earth. Today, I’m going to Armenia.” My Abuelo’s passion for teleporting via Google Street View has always been one of my fondest memories and has never left me. 

So naturally, when I found out that Dave Maass of the Electronic Frontier Foundation gave virtual reality tours of surveillance technology along the U.S.–Mexico border, I had to make it happen. I cover technology at the intersection of immigration, criminal justice, social justice and government accountability, and Maass’ tour aligns with my work as I investigate border surveillance. 

My journey began in a small, quiet, conference room at the Homestead Cybrarium, a hybrid virtual public library where I checked out virtual reality gear. The moment I slid the headset onto my face and the tour started, I was transported to a beach in San Diego. An hour and a half later, I had traveled across 1,500 miles worth of towns and deserts and ended up in Brownsville, Texas.

During that time, we looked at surveillance technology in 27 different cities on both sides of the border. The tech I saw included autonomous towers, aerostat blimps, sky towers, automated license plate readers, and border checkpoints.

After the excursion, I talked with Maass, a former journalist, about the experience. Our conversation has been edited for brevity and clarity.

Monique O. Madan: You began by dropping me in San Diego, California, and it was intense. Tell me why you chose the location to start this experience.

Dave Maass: So I typically start the tour in San Diego for two reasons. One is because it is the westernmost part of the border, so it's a natural place to start. But more importantly, it is such a stark contrast to be able to jump from one side to the other, from the San Diego side to the Tijuana side.

When you're in San Diego, you're in this very militarized park that's totally empty, with patrol vehicles and this very fierce-looking wall and a giant surveillance tower over your head. You can really get a sense of the scale.

And once you're used to that, I jump you to the other side of the wall. You're able to suddenly see how it's party time in Tijuana, how they painted the wall, and how there are restaurants and food stands and people playing on the beach and there are all these Instagram moments.

Credit: Electronic Frontier Foundation

Yet on the other side is the American militarized border, you know, essentially spying on everybody who's just going about their lives on the Mexican side.

It also serves as a way to show the power of VR. If there were no wall, you could walk that in a minute. But because of the border wall, you've got to go all the way to the border crossing, and then come all the way back. And we're talking, potentially, hours for you to be able to go that distance. 

Madan: I felt like I was in two different places, but it was really the same place, just feet away from each other. We saw remote video surveillance systems, relocatable ones. We saw integrated fixed towers, autonomous surveillance towers, sky towers, aerostat radar systems, and then covert automated license plate readers. How do you get the average person to digest what all these things really mean?


Maass: Me and some colleagues at EFF, we were looking at how we could use virtual reality to help people understand surveillance. We came up with a very basic game called “Spot the Surveillance,” where you could put on a headset and it puts you in one location with a 360-degree camera view. We took a photo of a corner in San Francisco that already had a lot of surveillance, but we also Photoshopped in other pieces of surveillance. The idea was for people to look around and try to find the surveillance.

When they found one, it would ping, and it would tell you what the technology could do. And we found that that helped people learn to look around their environment for these technologies, to understand it. So it gave people a better idea of how we exist in the environment differently than if they were shown a picture or a PowerPoint presentation that was like, “This is what a license plate reader looks like. This is what a drone looks like.”

That is why when we're on the southern border tour, there are certain places where I don't point the technology out to you. I ask you to look around and see if you can find it yourself.

Sometimes I start with one where it's overhead because people are looking around. They're pointing to a radio tower, pointing to something else. It takes them a while before they actually look up in the sky and see there's this giant spy blimp over their head. But, yeah, one of the other ones is these license plate readers that are hidden in traffic cones. People don't notice them there because they're just these traffic cones that are so ubiquitous along highways and streets that they don't actually think about it.

Madan: People have the impression that surveillance ops are only in militarized settings. Can you talk to me about whether that’s true?

Maass: Certainly there are towers in the middle of the desert. Certainly there are towers that are in remote or rural areas. But there are just so many that are in urban areas, from big cities to small towns.

Rather than just a close-up picture of a tower, once you actually see one and you're able to look at where the cameras are pointed, you start to see things like towers that are able to look into people's back windows, and towers that are able to look into people's backyards, and whole communities that are going to have glimpses over their neighborhood all the time.

But so rarely in the conversation is the impact on the communities that live on both the U.S. and Mexican side of the border, and who are just there all the time trying to get by and have, you know, the normal dream of prospering and raising a family.

Madan: What does this mean from a privacy, human rights, and civil liberties standpoint? 

Maass: There’s not a lot of transparency around issues of technology. That is one of the major flaws, both for human rights and civil liberties, but it's also a flaw for those who believe that technology is going to address whatever amorphous problem they've identified or failed to identify with border security and migration. So it's hard to know when this is being abused and how.

But what we can say is that as [the government] is applying more artificial intelligence to its camera system, it's able to document the pattern of life of people who live along the border.

It may be capturing people and learning where they work and where they're worshiping or who they are associated with. So you can imagine that if you are somebody who lives in that community and if you're living in that community your whole life, the government may have, by the time you're 31 years old, your entire driving history on file that somebody can access at any time, with who knows what safeguards are in place.

But beyond all that, it really normalizes surveillance for a whole community.

There are a lot of psychological studies out there about how surveillance can affect people over time, affect their behavior, and affect their perceptions of a society. That's one of the other things I worry about: What kind of psychological trauma is surveillance causing for these communities over the long term, in ways that may not be immediately perceptible?

Madan: One of the most interesting parts of experiencing this tour via VR technology was being able to pause and observe every single detail at the border checkpoint.

Maass: Most people are just rolling through, and so you don't get to notice all of the different elements of a checkpoint. But because the Google Street View car went through, we can roll through it at our leisure and point out different things. I have a series of checkpoints that I go through with people, show them this is where the license plate reader is, this is where the scanner truck is, here's the first surveillance camera, here's the second surveillance camera. We can see the body-worn camera on this particular officer. Here's where people are searched. Here's where they're detained. Here's where their car is rolled through an X-ray machine.

Madan: So your team has been mapping border surveillance for a while. Tell us about that and how it fits into this experience.

Maass: We started mapping out the towers in 2022, but we had started researching and building a database of the number of surveillance towers by district back in 2019. 

I don't think it was apparent to anyone, until we started mapping these out, how concentrated towers are in populated areas. Maybe if you were in one of those populated areas, you knew about it, or maybe you didn't.

In the long haul, it may start to tell a little bit more about border policy in general and whether any of these are having any kind of impact, and maybe we start to learn more about apprehensions and other kinds of data that we can connect to.

Madan: If someone wanted to take a tour like this, if they wanted to hop on in VR and visit a few of these places, how can they do that? 

Maass: So if they have a VR headset, a Meta Quest 2 or newer, the Wander app is what you're going to use. You can just go into the app and position yourself somewhere on the border. Jump around a little bit, maybe it will be like five feet, and you can start seeing a surveillance tower.

If you don’t have a headset and want to do it in your browser, you can go to EFF’s map and click on a tower. You’ll see a Street View link when you scroll down. Or you can use those tower coordinates and then go to your VR headset and try to find it.

Madan: What are your thoughts about the Meta Quest headset—formerly known as the Oculus Rift—having been created by Palmer Luckey, who also founded the company that made one of the towers on the tour?

Maass: There’s certainly some irony about using a technology that was championed by Palmer Luckey to shine light on another technology championed by Palmer Luckey. That's not the only tech irony, of course: Wander [the app used for the tour] also depends on using products from Google and Meta, both of whom continue to contribute to the rise of surveillance in society, to investigate surveillance.

Madan: What's your biggest takeaway as the person giving this tour?

Maass: I am a researcher and educator, and an activist and communicator. To me, this is one of the most impactful ways that I can reach people and give them a meaningful experience about the border. 

I think that when people are consuming information about the border, they're just getting little snippets from a little particular area. You know, it's always a little place that they're getting a little sliver of what's going on. 

But when we're able to do this with VR, I'm able to take them everywhere. I'm able to take them to both sides of the border. We're able to see a whole lot, and they're able to come away by the end of it, feeling like they were there. Like your brain starts filling in the blanks. People get this experience that they wouldn't be able to get any other way.

Being able to linger over these spaces on my own time showed me just how much surveillance is truly embedded in people's daily lives. When I left the library, I found myself inspecting traffic cones for license plate readers. 

As I continue to investigate border surveillance, this experience showed me just how educational these tools can be for academics, researchers, and journalists. 

Thanks for reading,
Monique
Investigative Reporter
The Markup

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

Ghana's President Must Refuse to Sign the Anti-LGBTQ+ Bill

EFF - Thu, 02/29/2024 - 5:52pm

After three years of political discussions, MPs in Ghana's Parliament voted to pass the country’s draconian Promotion of Proper Human Sexual Rights and Ghanaian Family Values Bill on February 28th. The bill now heads to Ghana’s President Nana Akufo-Addo to be signed into law. 

President Nana Akufo-Addo must protect the human rights of all people in Ghana and refuse to provide assent to the bill.

This anti-LGBTQ+ legislation introduces prison sentences for those who partake in LGBTQ+ sexual acts, as well as those who promote the rights of people with gay, lesbian, or other non-conventional sexual or gender identities. This would effectively ban all speech and activity, on and offline, that even remotely supports LGBTQ+ rights.

Ghanaian authorities could probe the social media accounts of anyone applying for a visa for pro-LGBTQ+ speech or create lists of pro-LGBTQ+ supporters to be arrested upon entry. They could also require online platforms to suppress content about LGBTQ+ issues, regardless of where it was created. 

Doing so would criminalize the activity of many major cultural and commercial institutions. If President Akufo-Addo does approve the bill, musicians, corporations, and other entities that openly support LGBTQ+ rights would be banned in Ghana.

Despite this direct threat to online freedom of expression, tech giants have yet to speak out publicly against the LGBTQ+ persecution in Ghana. Twitter opened its first African office in Accra in April 2021, citing Ghana as “a supporter of free speech, online freedom, and the Open Internet.” Adaora Ikenze, Facebook’s head of Public Policy in Anglophone West Africa, has said: “We want the millions of people in Ghana and around the world who use our services to be able to connect, share and express themselves freely and safely, and will continue to protect their ability to do that on our platforms.” Both companies have essentially dodged the question.

For many countries across Africa, and indeed the world, the codification of anti-LGBTQ+ discourses and beliefs can be traced back to colonial rule, and a recent CNN investigation from December 2023 found alleged links between the drafting of homophobic laws in Africa and a US nonprofit. The group denied those links, despite having hosted a political conference in Accra shortly before an early version of this bill was drafted.

Regardless of its origin, the past three years of political and social discussion have contributed to a decimation of LGBTQ+ rights in Ghana, and the decision by MPs in Ghana’s Parliament to pass this bill creates severe impacts not just for LGBTQ+ people in Ghana, but for the very principle of free expression online and off. President Nana Akufo-Addo must reject it.

We Flew a Plane Over San Francisco to Fight Proposition E. Here's Why.

EFF - Thu, 02/29/2024 - 3:19pm

Proposition E, which San Franciscans will be asked to vote on in the March 5 election, is so dangerous that last weekend we chartered a plane to inform our neighbors about what the ballot measure does and urge them to vote NO on it. If you were in Dolores Park, Golden Gate Park, Chinatown, or anywhere in between on Saturday, there’s a chance you saw it, with a huge banner flying through the sky: “No Surveillance State! No on Prop E.”

Despite the fact that the San Francisco Chronicle has endorsed a NO vote on Prop E, and has even quoted some police who don’t find its changes useful to keeping the public safe, proponents of Prop E have raised over $1 million to push this unnecessary, ill-thought-out, and downright dangerous ballot measure.

San Francisco, Say NOPE: Vote NO on Prop E on March 5

What Does Prop E Do?

Prop E is a haphazard mess of proposals that tries to capitalize on residents’ fear of crime in an attempt to gut commonsense democratic oversight of the San Francisco Police Department (SFPD). In addition to removing certain police oversight authority from the civilian-staffed Police Commission and expanding the circumstances under which police may conduct high-speed vehicle chases, Prop E would also amend existing law passed in 2019 to protect San Franciscans from invasive, untested, or biased police surveillance technologies. Currently, if the SFPD wants to acquire a new technology, they must provide a detailed use policy to the democratically-elected Board of Supervisors, in a process that allows for public comment. The Board then votes on whether and how the police can use the technology.

Prop E guts these protective measures designed to bring communities into the conversation about public safety. If Prop E passes on March 5, then the SFPD can unilaterally use any technology they want for a full year without the Board’s approval, without publishing an official policy about how they’d use the technology, and without allowing community members to voice their concerns.

Why is Prop E Dangerous and Unnecessary?

Across the country, police often buy and deploy surveillance equipment without residents of their towns even knowing what police are using or how they’re using it. This means that dangerous technologies—technologies other cities have even banned—are being used without any transparency, accountability, or democratic control.

San Franciscans advocated for and overwhelmingly supported a law that provides them with more knowledge of, and a voice in, what technologies the police use. Under current law, if the SFPD wanted to use racist predictive policing algorithms that U.S. Senators are currently advising the Department of Justice to stop funding, or if the SFPD wanted to buy up geolocation data harvested from people’s cell phones and sold on the advertising data broker market, it would have to let the public know and put the question to a vote before the city’s democratically-elected governing body. Prop E would gut any meaningful democratic check on the police’s acquisition and use of surveillance technologies.

What Technology Would Prop E Allow Police to Use?

That's the thing—we don't know, and if Prop E passes, we may never know. Today, if the SFPD decides to use a piece of surveillance technology, there is a process for sharing that information with the public. With Prop E, that process won't happen until the technology has been in use for a full year. And if police abandon use of a technology before the year is up, we may never find out what technology they tried out and how they used it. 

Even though we don't know what technologies the SFPD is eyeing, we do know what technologies other police departments have been buying in cities around the country: AI-based “predictive policing,” and social media scanning tools are just two examples. And according to the City Attorney, Prop E would even enable the SFPD to outfit surveillance tools such as drones and surveillance cameras with face recognition technology. San Francisco currently has a ban on police using remote-controlled robots to deploy deadly force, but if passed, Prop E would allow police to invest in technologies like taser-armed drones without any oversight or potential for elected officials to block the sale. 

Don’t let police experiment on San Franciscans with dangerous, untested surveillance technologies. Say NOPE to a surveillance state. Vote NO on Prop E on March 5.  
