Given our work with security, we do everything we can to protect our hardware, even if that means using glittery nail polish. We’ll show you one of our more creative and fun yet effective methods for tamper-protecting computers.

The Australian government has new laws on the books that allow it to hack your computer, your online accounts, and just about any piece of technology and network you come into contact with. It can happen without a warrant and without you ever knowing. That’s just the start of it. Outraged? Good.

Earlier in August, the Parliamentary Joint Committee on Intelligence and Security (PJCIS) released a report on the Surveillance Legislation Amendment (Identify and Disrupt) Bill 2020 recommending it be passed with significant changes. Most notably, the committee recommended narrowing the scope of the bill’s new powers by limiting the criteria for issuing warrants and requiring approval from a superior court judge, and called for stronger oversight and review mechanisms.

Read More

When Mark Zuckerberg unveiled a new “privacy-focused vision” for Facebook in March 2019, he cited the company’s global messaging service, WhatsApp, as a model. Acknowledging that “we don’t currently have a strong reputation for building privacy protective services,” the Facebook CEO wrote that “I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever. This is the future I hope we will help bring about. We plan to build this the way we’ve developed WhatsApp.”

Read More

Apple announced today that it would “take additional time over the coming months to collect input and make improvements” to a program that will weaken privacy and security on iPhones and other products. EFF is pleased Apple is now listening to the concerns of customers, researchers, civil liberties organizations, human rights activists, LGBTQ people, youth representatives, and other groups, about the dangers posed by its phone scanning tools. But the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely.

Read More

Apple has today announced that it is delaying the rollout of the Child Safety Features it unveiled last month, following negative feedback.

The planned features include scanning users’ iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Apple confirmed that feedback from customers, non-profit and advocacy groups, researchers, and others about the plans has prompted the delay to give the company time to make improvements. Apple issued the following statement about its decision:

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook’s former security chief, politicians, policy groups, university researchers, and even some Apple employees. Apple has since endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and various new documents, giving interviews with company executives, and more.

The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It is now unclear when Apple plans to roll out the “critically important” features, but the company still appears to be intent on releasing them.

Read More

A little less than a year ago, I wrote a now-popular post about how I over-engineered my home network for privacy and security. If you haven’t already checked that post out, it walks through how I used a UniFi Dream Machine (although most routers would work), a Pi-Hole to block ads and tracking, cloudflared for DNS over HTTPS, and Cloudflare Gateway to block malware/phishing to (over) optimize my home network for privacy and security.
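
To make the DNS-over-HTTPS piece concrete, here is a minimal, illustrative Python sketch of the kind of lookup cloudflared performs on the network’s behalf, using Cloudflare’s documented public JSON endpoint. The code is not part of the original setup; it only shows what resolving a name over HTTPS (rather than plaintext UDP) looks like.

```python
# Illustrative only: one DNS-over-HTTPS lookup against Cloudflare's
# public JSON API -- the same protocol cloudflared proxies for Pi-Hole.
import requests

def doh_lookup(name: str, record_type: str = "A") -> list[str]:
    """Resolve `name` over HTTPS instead of plaintext UDP port 53."""
    resp = requests.get(
        "https://cloudflare-dns.com/dns-query",
        params={"name": name, "type": record_type},
        headers={"accept": "application/dns-json"},
        timeout=5,
    )
    resp.raise_for_status()
    # Each answer record carries the resolved value in its "data" field.
    return [answer["data"] for answer in resp.json().get("Answer", [])]

if __name__ == "__main__":
    # On-path observers see only TLS traffic to cloudflare-dns.com,
    # not the name being looked up.
    print(doh_lookup("example.com"))
```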

What I wrote then remains true, but after having relied on, optimized, and upgraded what I described in my previous post for about eighteen months now, I’ve decided to build on that foundation by re-over-engineering how I set up, maintain, and manage the software and services that power and protect the network, with a number of specific goals in mind:

  • Config (and infrastructure) as code - This is far from a new concept in the industry, but I was somewhat recently introduced to the idea of treating servers like cattle, not pets. While config as code may come more naturally when managing a cluster of servers, even when managing only a single Raspberry Pi, prefer defined and well-understood changes over guess-and-check server administration (a minimal sketch of this idea follows the list).

  • Outsource to the experts - The less I have to trust myself to “get it right”, the better. “Copy and paste these random commands from Stack Overflow” isn’t the best way to run a security-conscious home network. Instead, rely on the open source community’s established, vetted, and maintained builds, configurations, and defaults through known and trusted distribution channels.

  • It (still) needs to “just work” - A dependency update shouldn’t be able to steal hours of my weekend due to an unexpected conflict or config change. I wanted to get out of the bespoke sysadmin business, provisioning and then immediately walking away from “set it and forget it” systems wherever possible. Ideally, systems would update themselves regularly, and upgrades would be predictable and boring.
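
To make the “config as code” goal concrete, here is a hypothetical Python sketch of the declarative, idempotent pattern that tools like Ansible formalize: declare the desired end state, converge only when reality differs, and make repeated runs harmless. The file path and contents below are invented for illustration and are not from the original post.

```python
# Hypothetical sketch of "config as code": declare desired state in
# version control, converge to it idempotently, report what changed.
from pathlib import Path

DESIRED_FILES = {
    # Invented example: a dnsmasq tuning file a Pi-Hole host might carry.
    Path("/etc/dnsmasq.d/01-custom.conf"): "cache-size=10000\n",
}

def converge(desired: dict[Path, str], dry_run: bool = True) -> list[str]:
    """Bring each file to its declared content; a no-op when already correct."""
    changes = []
    for path, content in desired.items():
        current = path.read_text() if path.exists() else None
        if current != content:
            changes.append(f"{path}: {'would update' if dry_run else 'updated'}")
            if not dry_run:
                path.parent.mkdir(parents=True, exist_ok=True)
                path.write_text(content)
    return changes

if __name__ == "__main__":
    # Running twice changes nothing the second time -- the property that
    # makes "cattle, not pets" administration safe to automate.
    print(converge(DESIRED_FILES) or ["already converged"])
```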

Read More

With the warrants, both agencies can take control of a person’s online account to gather evidence about serious offences without consent, as well as add, copy, delete or alter material to disrupt criminal activity and collect intelligence from online networks.

By now you’ve probably heard that Apple plans to push a new and uniquely intrusive surveillance system out to many of the more than one billion iPhones it has sold, which all run the behemoth’s proprietary, take-it-or-leave-it software. This new offensive is tentatively slated to begin with the launch of iOS 15⁠—almost certainly in mid-September⁠—with the devices of its US user-base designated as the initial targets. We’re told that other countries will be spared, but not for long.

You might have noticed that I haven’t mentioned which problem it is that Apple is purporting to solve. Why? Because it doesn’t matter.

Read More

Hamburg’s state government has been formally warned against using Zoom over data protection concerns.

The German state’s data protection agency (DPA) took the step of issuing a public warning yesterday, writing in a press release that the Senate Chancellery’s use of the popular videoconferencing tool violates the European Union’s General Data Protection Regulation (GDPR) since user data is transferred to the US for processing.


Politicians regularly claim that they need to ban encryption to protect the children. But who is actually being monitored?

Apple’s new program for scanning images sent on iMessage steps back from the company’s prior support for the privacy and security of encrypted messages. The program, initially limited to the United States, narrows the understanding of end-to-end encryption to allow for client-side scanning. While Apple aims at the scourge of child exploitation and abuse, the company has created an infrastructure that is all too easy to redirect to greater surveillance and censorship. The program will undermine Apple’s defense that it cannot comply with broader government demands for access to users’ messages.

For years, countries around the world have asked for access to and control over encrypted messages, asking technology companies to “nerd harder” when faced with the pushback that access to messages in the clear was incompatible with strong encryption. The Apple child safety message scanning program is currently being rolled out only in the United States. 
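
To make “client-side scanning” concrete, here is a deliberately simplified, hypothetical Python sketch. Apple’s actual design uses a perceptual hash (NeuralHash) and private set intersection rather than the plain digest comparison below; the stand-in only illustrates where the check sits: on the device, before encryption ever happens.

```python
# Simplified stand-in for client-side scanning. A plain SHA-256 replaces
# Apple's perceptual hash purely to show where the gate sits in the flow.
import hashlib

BLOCKED_DIGESTS = {
    # Example entry: the SHA-256 of the empty byte string, not a real list.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def passes_client_side_scan(image_bytes: bytes) -> bool:
    """On-device check that runs *before* end-to-end encryption."""
    return hashlib.sha256(image_bytes).hexdigest() not in BLOCKED_DIGESTS

if __name__ == "__main__":
    photo = b"holiday photo bytes"
    if passes_client_side_scan(photo):
        print("ok to encrypt and send")  # the E2E pipeline proceeds
    else:
        # Whoever controls the hash list controls what gets flagged --
        # the redirection risk the paragraph above describes.
        print("flagged before encryption")
```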

Read More

As the norms for how people connect have changed, much of the communication that once took place through the medium of coffee shops, bars, and parks now takes place through the medium of digital devices. One side effect of this shift from analog to digital is the conjoined shift from the ephemeral to the eternal: words once transiently spoken are now – more often than not – data stored forever.

We’ve designed Signal so that your data always stays in your hands. We think there’s something special about sharing a private fleeting moment between friends, so Signal also supports disappearing messages. Now, we’ve added the ability to preconfigure all conversations you initiate with a default disappearing messages timer.

Read More

On 29 April 2021, the European Union co-legislators (EU Member States and the European Parliament (EP)) reached a provisional agreement (subject to formal approval) on temporary legislation to allow providers of electronic communications services, such as web-based email and messaging services, to continue to detect, remove, and report child sexual abuse material (CSAM) online. The temporary legislation removes protections for confidential conversations between lawyers and their clients and between doctors and patients. Furthermore, the interim legislation would cover anti-grooming practices. Unlike scanning for known illegal images, detecting grooming requires scanning entire conversations.

Read More

It’s personal. It’s private. And it’s no one’s business but yours. You may be planning a political campaign, discussing your taxes, or having a secret romance. Or you may be communicating with a political dissident in a repressive country. Whatever it is, you don’t want your private electronic mail (email) or confidential documents read by anyone else. There’s nothing wrong with asserting your privacy. Privacy is as apple-pie as the Constitution.

The right to privacy is spread implicitly throughout the Bill of Rights. But when the United States Constitution was framed, the Founding Fathers saw no need to explicitly spell out the right to a private conversation. That would have been silly. Two hundred years ago, all conversations were private. If someone else was within earshot, you could just go out behind the barn and have your conversation there. No one could listen in without your knowledge. The right to a private conversation was a natural right, not just in a philosophical sense, but in a law-of-physics sense, given the technology of the time.

Read More

Apple has announced impending changes to its operating systems that include new “protections for children” features in iCloud and iMessage. If you’ve spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system.

Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.

To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again. Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.

Read More

The forthcoming Senate draft of Biden’s infrastructure bill—a 2,000+ page bill designed to update the United States’ roads, highways, and digital infrastructure—contains a poorly crafted provision that could create new surveillance requirements for many within the blockchain ecosystem. This could include developers and others who do not control digital assets on behalf of users.

While the language is still evolving, the proposal would seek to expand the definition of “broker” under section 6045(c)(1) of the Internal Revenue Code of 1986 to include anyone who is “responsible for and regularly providing any service effectuating transfers of digital assets” on behalf of another person. These newly defined brokers would be required to comply with IRS reporting requirements for brokers, including filing form 1099s with the IRS. That means they would have to collect user data, including users’ names and addresses.

The broad, confusing language leaves open a door for almost any entity within the cryptocurrency ecosystem to be considered a “broker”—including software developers and cryptocurrency startups that aren’t custodying or controlling assets on behalf of their users. It could even potentially implicate miners, those who confirm and verify blockchain transactions. The mandate to collect names, addresses, and transactions of customers means almost every company even tangentially related to cryptocurrency may suddenly be forced to surveil their users. 

Read More

Facebook representatives approached controversial surveillance vendor NSO Group to try and buy a tool that could help Facebook better monitor a subset of its users, according to an extraordinary court filing from NSO in an ongoing lawsuit.

Facebook is currently suing NSO for how the hacking firm leveraged a vulnerability in WhatsApp to help governments hack users. NSO sells a product called Pegasus, which allows operators to remotely infect cell phones and lift data from them.

According to a declaration from NSO CEO Shalev Hulio, two Facebook representatives approached NSO in October 2017 and asked to purchase the right to use certain capabilities of Pegasus.

Read More

Internet of Things (IoT) devices that record copious details of the daily lives of users raise natural privacy concerns. Manufacturers include measures meant to address these concerns, such as the option of a history-clearing factory reset. But the consumer must trust that these privacy and safety measures work as advertised. It would appear that in the case of at least one high-profile smart speaker, that trust would be misplaced.

Academic research performed on 86 used Amazon Echo Dots has found that the factory reset does not truly wipe data from the devices; it can still be recovered with relatively basic forensic techniques. Echo Dots commonly contain WiFi passwords, router MAC addresses, and Amazon logins, among other pieces of sensitive information.

Read More

Every now and then, due to some egregious blunder or blatant overreach on the part of government agencies or tech companies, concerns about surveillance and technology break out beyond the confines of academic specialists and into the public consciousness: the Snowden leaks about the NSA in 2013, the Facebook emotional manipulation study in 2014, the Cambridge Analytica scandal in the wake of the 2016 election. These moments seem to elicit a vague anxiety that ultimately dissipates as quickly as it materialized. Concerns about the NSA are now rarely heard, and while Facebook has experienced notable turbulence, it is not at all clear that meaningful regulation will follow or that a significant number of users will abandon the platform. Indeed, the chief effect of these fleeting moments of surveillance anxiety may be a gradual inoculation to them. In my experience, most people are not only untroubled by journalistic critiques of exploitative surveillance practices; they may even be prepared to defend them: There are trade-offs, yes, but privacy appears to be a reasonable price to pay for convenience or security.

Read More

One of the most troubling features of the digital revolution is that some people pay to subject themselves to surveillance that others are forced to endure and would, if anything, pay to be free of.

Consider a GPS tracker you can wear around one of your arms or legs. Make it sleek and cool — think the Apple Watch or Fitbit — and some will pay hundreds or even thousands of dollars for the privilege of wearing it. Make it bulky and obtrusive, and others, as a condition of release from jail or prison, being on probation, or awaiting an immigration hearing, will be forced to wear one — and forced to pay for it too.

In each case, the device collects intimate and detailed biometric information about its wearer and uploads that data to servers, communities, and repositories. To the providers of the devices, this data and the subsequent processing of it are the main reasons the devices exist. They are means of extraction: That data enables further study, prediction, and control of human beings and populations. While some providers certainly profit from the sale of devices, this secondary market for behavioral control and prediction is where the real money is — the heart of what Shoshana Zuboff rightly calls surveillance capitalism.

Read More