Amazon and Google Must Keep Their Promises on Project Nimbus
When a company makes a promise, the public should be able to rely on it. Today, nearly every person in the U.S. is a customer of either Amazon or Google—and many of us are customers of both technology giants. Both of these companies have made public promises that they will ensure their technologies are not being used to facilitate human rights violations. These promises are not just corporate platitudes; they’re commitments to every customer and to society at large.
It’s a reasonable thing to ask if these promises are being kept. And it’s especially important since Amazon and Google have been increasingly implicated by reports that their technologies, specifically their joint cloud computing initiative called Project Nimbus, are being used to facilitate mass surveillance and human rights violations of Palestinians in the Occupied Territories of the West Bank, East Jerusalem, and Gaza. This was the basis of our public call in August 2024 for the companies to come clean about their involvement.
But we didn’t just make a public call. We sent letters directly to the Global Head of Public Policy at Amazon and to Google’s Global Head of Human Rights in late September. We detailed what these companies have promised and asked them to tell us by November 1, 2024 how they were complying. We hoped that they could clear up the confusion, or at least explain where we, or the reporting we were relying on, were wrong.
But instead, they failed to respond. This is unfortunate, since it leads us to question how serious they were about their promises. And it should lead you to question that too.
Project Nimbus: Technology at the Expense of Human Rights
Project Nimbus provides advanced cloud and AI capabilities to the Israeli government, tools that an increasing number of credible reports suggest are being used to target civilians under pervasive surveillance in the Occupied Palestinian Territories. This is more than a technical collaboration—it’s a human rights crisis in the making as evidenced by data-driven targeting programs like Project Lavender and Where’s Daddy, which have reportedly led to detentions, killings, and the systematic oppression of journalists, healthcare workers, aid workers, and ordinary families.
The consequences are serious. Vulnerable communities in Gaza and the West Bank suffer violations of their human rights, including their rights to privacy, freedom of movement, and free association, violations that pervasive surveillance fosters and furthers. These documented violations underscore the ethical responsibility of Amazon and Google, whose technologies are at the heart of this surveillance scheme.
Amazon and Google’s Promises
Amazon and Google have made public commitments to align with the UN Guiding Principles on Business and Human Rights and their own AI ethics frameworks. These frameworks are supposed to ensure that their technologies do not contribute to harm. But their silence on these pressing concerns speaks volumes, undermining trust in their supposed dedication to these principles and casting doubt on their sincerity.
Unanswered Letters, Unanswered Accountability
When we sent letters to Amazon and Google, it was with direct, actionable questions about their involvement in Project Nimbus. We asked for transparency about their contracts, clients, and risk assessments. We called for evidence that due diligence had been conducted and demanded explanations of the steps taken to prevent their technologies from facilitating abuse.
Our core demands were straightforward and tied directly to the companies’ commitments:
- Disclose the scope of their involvement in Project Nimbus.
- Provide evidence of risk assessments tied to this project.
- Explain how they are addressing credible reports of misuse.
Despite these reasonable and urgent requests, which are tied directly to the companies’ stated legal and ethical commitments, both companies have remained silent, and their silence isn’t just an insufficient response—it’s an alarming one.
Why Transparency Cannot Wait
Transparency is not a luxury when human rights are at risk—it’s an ethical and legal obligation. For both of these companies, it’s an obligation they have promised to the rest of us. For global companies that wield immense power, silence in the face of abuse is inexcusable.
The Fight for Accountability
EFF is making these letters public to highlight the human rights obligations Amazon and Google have undertaken and to raise reasonable questions they should answer in light of public reports about the misuse of their technologies in the Occupied Palestinian Territories. We aren’t the first ones to raise concerns, but, having raised these questions publicly, and now having given the companies a chance to clarify, we are increasingly concerned about their complicity.
Google and Amazon have promised all of us—their customers and noncustomers alike—that they would take steps to ensure that their technologies support a future where technology empowers rather than oppresses. It’s increasingly clear that those promises are being ignored, if not entirely broken. EFF will continue to push for transparency and accountability.
One Down, Many to Go with Pre-Installed Malware on Android
Last year, we investigated a Dragon Touch children’s tablet (KidzPad Y88X 10) and confirmed that it was linked to a string of fully compromised Android TV boxes that also had multiple reports of malware, adware, and a sketchy firmware update channel. Since then, Google has removed the (now former) tablet distributor from its list of Play Protect certified phones and tablets. The burden of catching this type of threat should not be placed on the consumer. Due diligence by manufacturers, distributors, and resellers is the only way to keep pre-installed compromised devices out of the hands of unknowing customers, and regulation and transparency need to be part of that strategy.
As of October, Dragon Touch is no longer selling tablets on its website. However, lingering inventory is still out there at retailers like Amazon and Newegg. Some storefronts exist only on reseller sites for better customer reach, but considering Dragon Touch also wiped its blog of any mention of its tablets, we assume a little more than a strategy shift happened here.
We wrote a guide to help parents set up their kid’s Android devices safely, but it’s difficult to know which device to purchase in the first place. Advising people to simply buy a more expensive iPad or Amazon Fire tablet doesn’t change the fact that people are going to purchase low-budget devices. Lower-budget devices could be just as reputable if the ecosystem provided a path for better accountability.
Who is Responsible?
There are some tools in development for consumer education, like the FCC’s newly developed, voluntary Cyber Trust Mark. This label aims to inform consumers of a device’s capabilities and guarantee that an IoT device meets minimum security standards. However, placing the burden on consumers to check for pre-installed malware is absolutely ridiculous. Responsibility should fall to regulators, manufacturers, distributors, and resellers to check for this kind of threat.
More often than not, when you search for low-budget Android devices on retailers like Amazon or Newegg, you find storefront pages with little transparency about who runs the store or whether the devices come from a reputable distributor. This is true for more than just Android devices, but considering how many products are created for and with the Android ecosystem, working on this problem could mean better security for thousands of products.
Yes, it is difficult to track hundreds to thousands of distributors and all of their products. It is hard to keep up with rapidly developing threats in the supply chain. You can’t possibly know of every threat out there.
With all due respect to giant resellers, especially the multi-billion dollar ones: tough luck. This is what you inherit when you want to “sell everything.” You also inherit the responsibility and risk of each market you encroach on or supplant.
Possible Remedy: Firmware Transparency
Thankfully, there is hope on the horizon and tools exist to monitor compromised firmware.
Last year, Google presented Android Binary Transparency in response to pre-installed malware. It aims to help track compromised firmware through two components (see the sketch after this list):
- An append-only log of firmware information that is immutable, globally observable, consistent, and auditable, with those properties assured cryptographically.
- A network of participants that invest in witnesses, log health, and standardization.
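To make the append-only log idea concrete, here is a minimal Python sketch of a hash-chained log of firmware metadata. The record fields, class, and values are our own illustrative assumptions, not Google’s actual Android Binary Transparency design, which, like Certificate Transparency, relies on Merkle-tree logs (such as Trillian) so witnesses can verify consistency proofs efficiently instead of replaying the whole log.

```python
# Minimal sketch of an append-only, tamper-evident log of firmware builds.
# Field names and structure are illustrative; real transparency logs use
# Merkle trees so witnesses can check compact consistency proofs.
import hashlib
import json


def entry_hash(prev_hash: str, record: dict) -> str:
    """Bind each record to everything logged before it."""
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()


class FirmwareLog:
    def __init__(self):
        self.entries = []        # (record, running hash) pairs, append-only
        self.head = "0" * 64     # hash representing the empty log

    def append(self, model: str, build_id: str, image_sha256: str) -> str:
        record = {"model": model, "build": build_id, "image": image_sha256}
        self.head = entry_hash(self.head, record)
        self.entries.append((record, self.head))
        return self.head         # publish this so witnesses can compare heads

    def verify(self) -> bool:
        """Replay the chain; rewriting or dropping any entry changes the head."""
        h = "0" * 64
        for record, stored in self.entries:
            h = entry_hash(h, record)
            if h != stored:
                return False
        return h == self.head


# Hypothetical usage: a manufacturer logs a build, witnesses re-check the head.
log = FirmwareLog()
log.append("ExampleTab-10", "BUILD.2024.11.01", "ab12...")  # placeholder values
assert log.verify()
```

The key property is that altering or removing any earlier entry changes the published head hash, which independent witnesses monitoring the log would notice.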
Google is not the first to think of this concept; it largely extracts lessons from the success of Certificate Transparency. Still, better support for Android images directly from the Android ecosystem would definitely help. It would create an ecosystem of transparency that allows manufacturers and developers who build on the Android Open Source Project (AOSP) to be just as respected as higher-priced brands.
We love open source here at EFF and would like to continue to see innovation and availability of devices that aren’t necessarily created by bigger, more expensive names. But there needs to be an accountable ecosystem for these products so that pre-installed malware can be more easily detected and doesn’t land in consumers’ hands so easily. Right now, you can verify your Pixel device if you have a little technical skill. We would like verification to be done by regulators and/or distributors instead of asking consumers to break out their command lines and verify devices themselves.
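As a rough illustration of the kind of manual check we’d rather not leave to consumers, verification often boils down to hashing a downloaded firmware image and comparing it against a digest published by the manufacturer or a transparency log. The script below is a generic sketch with placeholder inputs, not Google’s actual Pixel verification tooling.

```python
# Sketch of the manual verification step: hash a downloaded firmware image
# and compare it to a digest published by the manufacturer or a transparency
# log. The file path and expected digest are placeholders supplied by the user.
import hashlib
import sys


def sha256_of_file(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    image_path, published_digest = sys.argv[1], sys.argv[2]
    actual = sha256_of_file(image_path)
    if actual == published_digest.lower():
        print("OK: image matches the published digest")
    else:
        print("MISMATCH: image does not match; do not flash it")
```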
It would be ideal to see existing programs like Play Protect certification run a log like this using open-source log implementations, like Trillian. That way, security researchers, resellers, and regulatory bodies could begin to monitor and query information on different Android Original Equipment Manufacturers (OEMs).
There are tools that exist to verify firmware, but right now this ecosystem is a wishlist of sorts. At EFF, we like to imagine what could be better. While a hosted comprehensive log of Android OEMs doesn’t currently exist, the tools to create it do. Some early participants for accountability in the Android realm include F-Droid’s Android SDK Transparency Log and the Guardian Project’s (Tor) Binary Transparency Log.
Time would be better spent solving this problem systemically than researching whether every new electronic evil rectangle or IoT device has malware.
A solution complementary to binary transparency is the Software Bill of Materials (SBOM). Think of an SBOM as a “list of ingredients” that make up software. This idea isn’t new either, but it has gathered more institutional and government support. The components listed in an SBOM can highlight issues or vulnerabilities reported for certain parts of a piece of software. Without binary transparency, though, researchers, verifiers, auditors, and others could still be left attempting to extract firmware from devices whose images haven’t been published. If manufacturers readily provided these images, SBOMs could be generated more easily, helping create a less opaque market of electronics, low budget or not.
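As a rough sketch of the “list of ingredients” idea, an SBOM lets a verifier cross-reference a firmware image’s components against known vulnerability reports. The component names, versions, and advisory entries below are made up, and real SBOMs typically follow standard formats like SPDX or CycloneDX.

```python
# Toy example of what an SBOM records: the components and versions inside a
# firmware image, which a verifier can check against vulnerability advisories.
# All names, versions, and the advisory set here are hypothetical.
firmware_sbom = {
    "device": "ExampleTab-10",
    "build": "BUILD.2024.11.01",
    "components": [
        {"name": "linux-kernel", "version": "5.10.43"},
        {"name": "openssl", "version": "1.1.1k"},
        {"name": "vendor-launcher", "version": "2.3"},
    ],
}

# A verifier could cross-reference each component against known advisories.
known_bad = {("openssl", "1.1.1k"), ("vendor-launcher", "2.3")}  # hypothetical

flagged = [
    c for c in firmware_sbom["components"]
    if (c["name"], c["version"]) in known_bad
]
for component in flagged:
    print(f"Review needed: {component['name']} {component['version']}")
```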
We are glad to see some movement following last year’s investigations, just in time for Black Friday. More can be done, and we hope to see not only devices taken down more swiftly when reported, especially those with shady components, but also better support for proactive detection. Regardless of how much someone can spend, everyone deserves a safe, secure device that doesn’t have malware crammed into it.
Tell the Senate: Don’t Weaponize the Treasury Department Against Nonprofits
Last week the House of Representatives passed a dangerous bill that would allow the Secretary of the Treasury to strip a U.S. nonprofit of its tax-exempt status. If it passes the Senate and is signed into law, H.R. 9495 would give broad and easily abused new powers to the executive branch. Nonprofits would not have a meaningful opportunity to defend themselves, and the government could target them without disclosing the reasons or evidence behind its decision.
This bill is an existential threat to nonprofits of all stripes. Future administrations could weaponize the powers in this bill to target nonprofits on either end of the political spectrum. Even if they are not targeted, the threat alone could chill the activities of some nonprofit organizations.
The bill’s authors have combined this attack on nonprofits, originally written as H.R. 6408, with other legislation that would prevent the IRS from imposing fines and penalties on hostages while they are held abroad. These are separate matters, and Congress should consider them separately to allow a meaningful vote on this dangerous expansion of executive power. No administration should be given this much power to target nonprofits without due process.
Over 350 civil liberties, religious, reproductive health, immigrant rights, human rights, racial justice, LGBTQ+, environmental, and educational organizations signed a letter opposing the bill as written. Now, we need your help. Tell the Senate not to pass H.R. 9495, the so-called “Stop Terror-Financing and Tax Penalties on American Hostages Act.”
EFF Tells the Second Circuit a Second Time That Electronic Device Searches at the Border Require a Warrant
EFF, along with ACLU and the New York Civil Liberties Union, filed a second amicus brief in the U.S. Court of Appeals for the Second Circuit urging the court to require a warrant for border searches of electronic devices, an argument EFF has been making in the courts and Congress for nearly a decade.
The case, U.S. v. Smith, involved a traveler who was stopped at Newark airport after returning from a trip to Jamaica. He was detained by border officers at the behest of the FBI and his cell phone was forensically searched. He had been under investigation for his involvement in a conspiracy to control the New York area emergency mitigation services (“EMS”) industry, which included (among other things) insurance fraud and extortion. He was subsequently prosecuted and sought to have the evidence from his cell phone thrown out of court.
As we wrote about last year, the district court made history in holding that border searches of cell phones require a warrant and therefore warrantless device searches at the border violate the Fourth Amendment. However, the judge allowed the evidence to be used in Mr. Smith’s prosecution because, the judge concluded, the officers had a “good faith” belief that they were legally permitted to search his phone without a warrant.
Warrantless device searches at the border, and the significant invasion of privacy they represent, are only increasing. In Fiscal Year 2023, U.S. Customs and Border Protection (CBP) conducted 41,767 device searches.
The Supreme Court has recognized for a century a border search exception to the Fourth Amendment’s warrant requirement, allowing not only warrantless but also often suspicionless “routine” searches of luggage, vehicles, and other items crossing the border.
The primary justification for the border search exception has been to find—in the items being searched—goods smuggled to avoid paying duties (i.e., taxes) and contraband such as drugs, weapons, and other prohibited items, thereby blocking their entry into the country.
In our brief, we argue that the U.S. Supreme Court’s balancing test in Riley v. California (2014) should govern the analysis here—and that the district court was correct in applying Riley. In that case, the Supreme Court weighed the government’s interests in warrantless and suspicionless access to cell phone data following an arrest against an arrestee’s privacy interests in the depth and breadth of personal information stored on a cell phone. The Supreme Court concluded that the search-incident-to-arrest warrant exception does not apply, and that police need to get a warrant to search an arrestee’s phone.
Travelers’ privacy interests in their cell phones and laptops are, of course, the same as those considered in Riley. Modern devices, a decade later, contain even more data points that together reveal the most personal aspects of our lives, including political affiliations, religious beliefs and practices, sexual and romantic affinities, financial status, health conditions, and family and professional associations.
In considering the government’s interests in warrantless access to digital data at the border, Riley requires analyzing how closely such searches hew to the original purpose of the warrant exception—preventing the entry of prohibited goods themselves via the items being searched. We argue that the government’s interests are weak in seeking unfettered access to travelers’ electronic devices.
First, physical contraband (like drugs) can’t be found in digital data.
Second, digital contraband (such as child pornography) can’t be prevented from entering the country through a warrantless search of a device at the border because it’s likely, given the nature of cloud technology and how internet-connected devices work, that identical copies of the files are already in the country on servers accessible via the internet. As the Smith court stated, “Stopping the cell phone from entering the country would not … mean stopping the data contained on it from entering the country” because any data that can be found on a cell phone—even digital contraband—“very likely does exist not just on the phone device itself, but also on faraway computer servers potentially located within the country.”
Finally, searching devices for evidence of contraband smuggling (for example, text messages revealing the logistics of an illegal import scheme) and other evidence for general law enforcement (i.e., investigating non-border-related domestic crimes, as was the case of the FBI investigating Mr. Smith’s involvement in the EMS conspiracy) are too “untethered” from the original purpose of the border search exception, which is to find prohibited items themselves and not evidence to support a criminal prosecution.
If the Second Circuit is not inclined to require a warrant for electronic device searches at the border, we also argue that such a search—whether manual or forensic—should be justified only by reasonable suspicion that the device contains digital contraband and be limited in scope to looking for digital contraband. This extends the Ninth Circuit’s rule from U.S. v. Cano (2019) in which the court held that only forensic device searches at the border require reasonable suspicion that the device contains digital contraband, while manual searches may be conducted without suspicion. But the Cano court also held that all searches must be limited in scope to looking for digital contraband (for example, call logs are off limits because they can’t contain digital contraband in the form of photos or files).
In our brief, we also highlighted two other district courts within the Second Circuit that required a warrant for border device searches: U.S. v. Sultanov (2024) and U.S. v. Fox (2024). We plan to file briefs in their appeals, as well. Earlier this month, we filed a brief in another Second Circuit border search case, U.S. v. Kamaldoss. We hope that the Second Circuit will rise to the occasion in one of these cases and be the first circuit to fully protect travelers’ Fourth Amendment rights at the border.
Looking for the Answer to the Question, "Do I Really Own the Digital Media I Paid For?"
Sure, buying your favorite video game, movie, or album online is super convenient. I personally love being able to pre-order a game and play it the night of release, without needing to go to a store.
But something you may not have thought about before making your purchase is the difference between owning a physical copy and a digital copy of that media. Unfortunately, there are quite a few rights you give up by purchasing a digital copy of your favorite game, movie, or album! On our new site, Digital Rights Bytes, we outline the differences between owning physical and digital media, and why we need to break down that barrier.
Digital Rights Bytes answers this and other common questions about technology that may be getting on your nerves, with short videos featuring adorable animals. You can also read up on what EFF is doing to ensure you actually own the digital media you pay for, and how you can take action, too.
Got other questions you’d like us to answer in the future? Let us know on your favorite social platform using the hashtag #DigitalRightsBytes.