The Hamburg commissioner for data protection and freedom of information, Johannes Caspar, is looking to stop Facebook from aggregating the data from WhatsApp, fearing that the company would use it to expand its marketing and advertising business.
Caspar said in a statement: “Currently, there is reason to believe that the data sharing provisions between WhatsApp and Facebook are intended to be unlawfully enforced due to the lack of voluntary and informed consent. In order to prevent unlawful mass data sharing and to put an end to unlawful consent pressure on millions of people, a formal administrative procedure has now been initiated to protect data subjects.”
The technology behind Bitcoin is a “boon for surveillance” and shouldn’t be shunned by governments but embraced, according to an ex-CIA boss.
Michael Morell, who was previously the CIA’s acting director, said in ‘An Analysis of Bitcoin’s Use in Illicit Finance’ that “blockchain technology is a powerful but underutilized forensic tool for governments to identify illicit activity and bring criminals to justice.”
The report, co-authored by Josh Kirshner and Thomas Schoenberger, was ostensibly written as a defense of Bitcoin—a response to growing “concerns about the illicit finance implications of the cryptocurrency ecosystem.”
Moxie should just admit that he’s getting older and wants to make a lot of money really fast. No shame in that.
Where are all the moxie fanboys???
Moxie Marlinspike, founder and CEO of the encrypted messaging app Signal, may have formerly been the CTO of MobileCoin, a cryptocurrency that Signal recently integrated for in-app payments, early versions of MobileCoin technical documents suggest.
MobileCoin CEO Joshua Goldbard told CoinDesk that the 2017 white paper is “not something [he] or anyone at MobileCoin wrote,” though it is very nearly a verbatim precursor to MobileCoin’s current white paper. Additionally, snapshots of MobileCoin’s homepage from Dec. 18, 2017, until April 2018 list Marlinspike as one of three members of “The Team,” though his title is not given there. He is not listed as an adviser until May 2018.
The team for the self-described privacy coin has always acknowledged Marlinspike as an adviser to the project, but neither the team nor Marlinspike has ever disclosed direct involvement through an in-house role, much less one so involved as Chief Technical Officer.
I’m shocked. This is my shocked face. :-|
Given that the app is blowing up, I figure it’s a good time to roll out my periodic public service announcement: Signal was created and funded by a CIA spinoff. Yes, a CIA spinoff. Signal is not your friend.
Here are the cold hard facts.
Signal was developed by Open Whisper Systems, a for-profit corporation run by “Moxie Marlinspike,” a tall, lanky cryptographer who has a head full of dreadlocks and likes to surf and sail his boat. Moxie was an old friend of Tor’s now-banished chief radical promoter Jacob Appelbaum, and he’s played a similar fake-radical game — although he’s never been able to match Jake’s raw talent and dedication to the art of the con. Still, Moxie wraps himself in an air of danger and mystery and hassles reporters about not divulging any personal information, not even his age. He constantly talks up his fear of Big Brother and tells stories about his FBI file.
So how big a threat is Moxie to the federal government?
Many technologists viscerally felt yesterday’s announcement as a punch to the gut when we heard that the Signal messaging app was bundling an embedded cryptocurrency. The news cut to the heart of what many technologists have felt before when, as loyal users, we have been exploited and betrayed by corporations — but this time it felt much deeper, because it introduced a conflict of interest from fellow technologists we truly believed were advancing a cause many of us also believed in. So many of us have spent significant time and social capital moving our friends and family away from the exploitative data-siphon platforms that Facebook et al. offer, and on to Signal, in the hope of breaking the cycle of commercial exploitation of our online relationships. And some of us feel used.
Okay, this is horrible.
The UK has gone mental.
24 March 2020 will be remembered by some for the news that Prince Charles tested positive for Covid and was isolating in Scotland. In Athens it was memorable as the day the traffic went silent. Twenty-four hours into a hard lockdown, Greeks were acclimatising to a new reality in which they had to send an SMS to the government in order to leave the house. As well as millions of text messages, the Greek government faced extraordinary dilemmas. The European Union’s most vulnerable economy, its oldest population (along with Italy’s), and one of its weakest health systems faced the first wave of a pandemic that had overwhelmed richer countries with fewer pensioners and stronger health provision. The carnage in Italy loomed large across the Adriatic.
One thing I’ve always wanted to do is support Tor by running a public relay. However, I didn’t have a machine to dedicate to it. All of my normal systems hold data I don’t want to expose to Tor (ssh keys, browser sessions, etc.). Now that FreeBSD has initial support for the Raspberry Pi 3, I can run an inexpensive Tor relay. At the same time, I could use it to create a special Tor network at home: all data transmitted over the network destined for the public Internet would go through Tor first.
Since I prefer HardenedBSD over normal FreeBSD, that’s what we’ll be setting up in this article. Though this article focuses on using HardenedBSD on the RPi3, the concepts apply equally to FreeBSD or HardenedBSD on any architecture.
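As a preview of where the setup ends up, the core of a public (non-exit) relay on FreeBSD or HardenedBSD is a short torrc plus enabling the service in rc.conf. This is a minimal sketch, not the article’s full configuration; the nickname and contact address below are placeholder values you would replace with your own:

```
# Install Tor from packages (run as root):
#   pkg install tor

# /usr/local/etc/tor/torrc -- minimal non-exit relay sketch
Nickname    MyRPi3Relay          # placeholder; pick your own
ContactInfo operator@example.org # placeholder contact address
ORPort      9001                 # listen for relay traffic
ExitRelay   0                    # relay traffic, but never act as an exit
SocksPort   0                    # a pure relay needs no local SOCKS listener

# Enable and start the service:
#   sysrc tor_enable="YES"
#   service tor start
```

Setting `ExitRelay 0` keeps the node a middle relay, which avoids the abuse complaints that exit operators typically field.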
Our phones are our most personal computers, and the most vulnerable to privacy abuses. They carry personal files and photos, our contact lists, and our email and private chat messages. They are also typically left on and connected to the Internet at all times, over either a WiFi or cellular network. Phones also contain more sensors and cameras than your average computer, so they can not only collect and share your location: the GPS, along with other sensors such as the gyroscope, light sensor, compass, and accelerometer, can reveal far more about a person than you might suspect (which is why we designed the Librem 5 with a “lockdown mode” so you can turn all of that off).
Beginning in April, new iPhones and other iOS devices sold in Russia will include an extra setup step. Alongside questions about language preference and whether to enable Siri, users will see a screen that prompts them to install a list of apps from Russian developers. It’s not just a regional peculiarity. It’s a concession Apple has made to legal pressure from Moscow—one that could have implications far beyond Russia’s borders.
The law in question dates back to 2019, when Russia dictated that all computers, smartphones, smart TVs, and so on sold there must come preloaded with a selection of state-approved apps that includes browsers, messenger platforms, and even antivirus services. Apple has stopped short of that; the suggested apps aren’t pre-installed, and users can opt not to download them. But the company’s decision to bend its rules on pre-installs could inspire other repressive regimes to make similar demands—or even more invasive ones.
Federal law enforcement has been asking for a backdoor to read Americans’ encrypted communications for years now. FBI Director Christopher Wray did it again last week in testimony to the Senate Judiciary Committee. As usual, the FBI’s complaints involved end-to-end encryption employed by popular messaging platforms, as well as the at-rest encryption of digital devices, which Wray described as offering “user-only access.”
This updated guide aims to provide an introduction to various de-anonymization, tracking, and ID-verification techniques, along with optional guidance on safely creating and maintaining reasonably anonymous online identities, including social media accounts. This covers mainstream platforms, not only privacy-friendly ones.
It is important to understand that the purpose of this guide is anonymity, not just privacy — but much of the guidance you will find here will also help you improve your privacy even if you are not interested in anonymity. There is substantial overlap in the techniques and tools used for privacy and anonymity, but at some point they diverge:
Privacy is about people knowing who you are but not knowing what you are doing.
Anonymity is about people knowing what you are doing but not knowing who you are.
Two unnamed broadband or mobile ISPs are reportedly helping the UK Home Office and the National Crime Agency (NCA) to trial a new internet snooping system on their customers, conducted as part of the controversial 2016 UK Investigatory Powers Act (aka the “snoopers’ charter”).
The act introduced a new power that, among many other things, could force ISPs — upon being ordered to do so by a senior judge — to log the Internet Connection Records (ICRs) of all their customers for up to 12 months (e.g. the IP addresses of the servers you’ve visited and when). These records can be accessed without a warrant, regardless of whether or not you’re suspected of a crime.
New consent management platforms (CMPs) have been introduced to the web to conform with the EU’s General Data Protection Regulation, particularly its requirements for consent when companies collect and process users’ personal data. This work analyses how the most prevalent CMP designs affect people’s consent choices. We scraped the designs of the five most popular CMPs on the top 10,000 websites in the UK (n=680). We found that dark patterns and implied consent are ubiquitous; only 11.8% meet our minimal requirements based on European law. Second, we conducted a field experiment with 40 participants to investigate how the eight most common designs affect consent choices. We found that notification style (banner or barrier) has no effect; removing the opt-out button from the first page increases consent by 22-23 percentage points; and providing more granular controls on the first page decreases consent by 8-20 percentage points. This study provides an empirical basis for the necessary regulatory action to enforce the GDPR, in particular the possibility of focusing on the centralised, third-party CMP services as an effective way to increase compliance.