Tag: end-to-end encryption

  • Telegram CEO Pavel Durov’s Arrest Linked to Sweeping Criminal Investigation

    French prosecutors gave preliminary information in a press release on Monday about the investigation into Telegram CEO Pavel Durov, who was arrested suddenly on Saturday at Paris’ Le Bourget airport. Durov has not yet been charged with any crime, but officials said that he is being held as part of an investigation “against person unnamed” and can be held in police custody until Wednesday.

    The investigation began on July 8 and involves wide-ranging charges related to alleged money laundering, violations related to import and export of encryption tools, refusal to cooperate with law enforcement, and “complicity” in drug trafficking, possession and distribution of child pornography, and more.

    The investigation was initiated by “Section J3” cybercrime prosecutors and has involved collaboration with France’s Centre for the Fight against Cybercrime (C3N) and Anti-Fraud National Office (ONAF), according to the press release. “It is within this procedural framework in which Pavel Durov was questioned by the investigators,” Paris prosecutor Laure Beccuau wrote in the statement.

    Telegram did not respond to multiple requests for comment about the investigation but asserted in a statement posted to the company’s news channel on Sunday that Durov has “nothing to hide.”

“Given the existence of several preliminary investigations in France concerning Telegram in relation to the protection of minors’ rights and in cooperation with other French investigation units—for instance, on cyber harassment—the arrest of Durov does not seem to me like a highly exceptional move,” says Cannelle Lavite, a French lawyer who specializes in free-speech matters.

    Lavite notes that Durov is a French citizen who was arrested in French territory with an arrest warrant issued by French judges. She adds that the list of charges involved in the investigation is “extensive,” a wide net that she says is not entirely surprising in the context of “France’s ambiguous legislative arsenal” meant to balance content moderation and free speech.

    Durov is a controversial figure for his leadership of Telegram, in large part because he has not typically cooperated with calls to moderate the platform’s content. In some ways, this has positioned him as a free-speech defender against government censorship, but it has also made Telegram a haven for hate speech, criminal activity, and abuse. Additionally, the platform is often billed as a secure communication tool, but much of it is open and accessible by default.

    “Telegram is not primarily an encrypted messenger; most people use it almost as a social network, and they’re not using any of its features that have end-to-end encryption,” says John Scott-Railton, senior researcher at Citizen Lab. “The implication there is that Telegram has a wide range of abilities and access to potentially do content moderation and respond to lawful requests. This puts Pavel Durov very much in the center of all kinds of potential governmental pressure.”

On top of all of this, many researchers have questioned whether Telegram’s end-to-end encryption is actually robust when users do elect to enable it.

    French president Emmanuel Macron said in a social media post on Monday that “France is deeply committed to freedom of expression and communication … The arrest of the president of Telegram on French soil took place as part of an ongoing judicial investigation. It is in no way a political decision.”

    News of Durov’s arrest is fueling concerns, though, that the move could threaten Telegram’s stability and undermine the platform. The case seems poised, too, to have implications in long-standing debates around the world about social media moderation, government influence, and use of privacy-preserving end-to-end encryption.

    Lavite says the case certainly invokes debates about “the balance between the right to encrypted communication and free speech on the one hand, and users’ protection—content moderation—on the other hand.” But she notes that there is a lot of information about the investigation that is unknown and “a lot of blurry zones still.”

    On Monday afternoon, Telegram seemed to be receiving a download boost from the situation, moving from 18th to 8th place in Apple’s US App Store apps ranking. Global iOS downloads were up by 4 percent, and in France the app was number one in the App Store social network category and number three overall.

  • Apple’s iMessage Encryption Puts Its Security Practices in the DOJ’s Crosshairs

    The argument is one that some Apple critics have made for years, as spelled out in an essay in January by Cory Doctorow, the science fiction writer, tech critic, and co-author of Chokepoint Capitalism. “The instant an Android user is added to a chat or group chat, the entire conversation flips to SMS, an insecure, trivially hacked privacy nightmare that debuted 38 years ago—the year Wayne’s World had its first cinematic run,” Doctorow writes. “Apple’s answer to this is grimly hilarious. The company’s position is that if you want to have real security in your communications, you should buy your friends iPhones.”

    In a statement to WIRED, Apple says it designs its products to “work seamlessly together, protect people’s privacy and security, and create a magical experience for our users,” and adds that the DOJ lawsuit “threatens who we are and the principles that set Apple products apart” in the marketplace. The company also says it hasn’t released an Android version of iMessage because it couldn’t ensure that third parties would implement it in ways that met the company’s standards.

    “If successful, [the lawsuit] would hinder our ability to create the kind of technology people expect from Apple—where hardware, software, and services intersect,” the statement continues. “It would also set a dangerous precedent, empowering government to take a heavy hand in designing people’s technology. We believe this lawsuit is wrong on the facts and the law, and we will vigorously defend against it.”

    Apple has, in fact, not only declined to build iMessage clients for Android or other non-Apple devices, but actively fought against those who have. Last year, a service called Beeper launched with the promise of bringing iMessage to Android users. Apple responded by tweaking its iMessage service to break Beeper’s functionality, and the startup called it quits in December.

Apple argued in that case that Beeper had harmed users’ security—in fact, it did compromise iMessage’s end-to-end encryption by decrypting and then re-encrypting messages on a Beeper server, though Beeper had vowed to change that in future updates. Beeper cofounder Eric Migicovsky argued that Apple’s heavy-handed move to reduce Apple-to-Android texts to traditional text messaging was hardly a more secure alternative.

    “It’s kind of crazy that we’re now in 2024 and there still isn’t an easy, encrypted, high-quality way for something as simple as a text between an iPhone and an Android,” Migicovsky told WIRED in January. “I think Apple reacted in a really awkward, weird way—arguing that Beeper Mini threatened the security and privacy of iMessage users, when in reality, the truth is the exact opposite.”

Even as Apple has faced accusations of hoarding iMessage’s security properties to the detriment of smartphone owners worldwide, it has only continued to improve those features: In February it upgraded iMessage to use new cryptographic algorithms designed to be immune to quantum codebreaking, and last October it added Contact Key Verification, a feature designed to prevent man-in-the-middle attacks that spoof intended contacts to intercept messages. Perhaps more important, Apple has said it will adopt the RCS standard to allow for improvements in messaging with Android users—although the company did not say whether those improvements would include end-to-end encryption.

  • How Apple’s Advanced Data Protection Works, and How to Enable It on Your iPhone

    ADP extends that protection pretty considerably to also cover your iCloud backups, iCloud Drive, and the information in Photos, Notes, and Reminders. Without ADP enabled, this data is still encrypted, which adds a strong layer of protection from third parties or bad actors. But Apple can still access this data and turn it over to the cops.

End-to-end encryption closes that loophole. If an Apple employee decides to go rogue, or Apple gets hacked and your files get leaked—and neither of those scenarios has happened, to our knowledge—ADP will ensure your data is still safe. It also means Apple can’t get your files back if you lose access to them: The passcodes and passwords on your Apple devices are the only route through which end-to-end encrypted data can be unlocked. Apple publishes a comprehensive explanation of iCloud’s security and privacy features.
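The distinction the paragraph draws can be sketched in a few lines of Python: a key derived from the device passcode encrypts data before upload, so the server holds only ciphertext it cannot open. This is a conceptual illustration under loose assumptions, not Apple’s actual scheme (which uses hardware-bound keys and standard ciphers); the toy XOR “cipher” below exists only to keep the sketch self-contained and should never be used for real data.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # The key comes from the device passcode; in an ADP-style design,
    # neither the passcode nor this derived key ever leaves the device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR keystream built from repeated hashing -- illustration
    # only, NOT a real cipher (production systems use AES or similar).
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

salt = os.urandom(16)
key = derive_key("123456", salt)

# The server stores only ciphertext; without the passcode-derived key,
# neither the provider nor an attacker can recover the plaintext.
ciphertext = toy_encrypt(key, b"vacation photo bytes")
assert ciphertext != b"vacation photo bytes"

# XOR with the same keystream decrypts -- only the key holder can do this.
assert toy_encrypt(key, ciphertext) == b"vacation photo bytes"
```

The key property is in the last two lines: decryption succeeds only for whoever can re-derive the key from the passcode, which is exactly why Apple cannot recover ADP-protected files for a locked-out user.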

ADP does make iCloud on the web harder to get into. Image: Apple via David Nield

    So, do you need ADP? If you want the most complete level of security and privacy possible, then yes. Just make sure you have backup methods for recovering your account (more on this in a moment), as Apple won’t be able to help you unlock your encrypted files if your account becomes inaccessible. While the default, standard encryption Apple puts in place is already very strong, ADP covers more of your data, which is reassuring in the slim chance Apple’s data centers suffer a breach or the FBI wants to take a look at your iCloud files.

    It’s also worth bearing in mind that all this extra encryption restricts iCloud access in your web browser (the web portal where you can get at your emails, photos, and so on). You can still log in to iCloud on the web, but you’ll need to confirm the connection on a trusted device (like an iPhone or Mac) every single time you log in—and you’ll need to reauthorize the link every hour while you browse your files. If you use iCloud on the web a lot, you might find life more convenient without ADP enabled.

    How to Enable Advanced Data Protection

    If you’re ready to enable ADP, you can do it right from your iPhone—as long as all the devices associated with your Apple ID are running the latest software, and your Apple ID has two-factor authentication switched on. (If you haven’t yet done this, you’ll find instructions on the Apple website.)

    If you’re using an iPhone or an iPad, open Settings, then tap your name at the top. Choose iCloud, then Advanced Data Protection: You’ll see a screen briefly explaining how the feature works, and you can tap Turn On Advanced Data Protection to do just that. At this point you’ll be told if there are any devices connected to your Apple ID that aren’t compatible with ADP, and you’ll be given the option to “remove” them. If you do remove a device, it will no longer be linked to your Apple ID, and it won’t sync to your iCloud account, so it’s not recommended you “remove” any devices you’re still using. A better option would be to update the software on these devices to make them compatible with ADP, or replace the devices with newer versions.

  • Signal Finally Rolls Out Usernames, So You Can Keep Your Phone Number Private

    The third new feature, which is not enabled by default and which Signal recommends mainly for high-risk users, allows you to turn off not just your number’s visibility but its discoverability. That means no one can find you in Signal unless they have your username, even if they already know your number or have it saved in their address book. That extra safeguard might be important if you don’t want anyone to be able to tie your Signal profile to your phone number, but it will also make it significantly harder for people who know you to find you on Signal.

    The new phone number protections should now make it possible to use Signal to communicate with untrusted people in ways that would have previously presented serious privacy risks. A reporter can now post a Signal username on a social media profile to allow sources to send encrypted tips, for instance, without also sharing a number that allows strangers to call their cell phone in the middle of the night. An activist can discreetly join an organizing group without broadcasting their personal number to people in the group they don’t know.

In the past, using Signal without exposing a private number in either of those situations would have required setting up a new Signal number on a burner phone—a difficult privacy challenge for people in many countries that require identification to buy a SIM card—or with a service like Google Voice. Now you can simply set a username instead, which can be changed or deleted at any time. (Any conversations you’ve started with the old username will switch over to the new one.) To avoid storing even those usernames, Signal uses a cryptographic function called a Ristretto hash, storing only unique strings of characters derived from the handles rather than the handles themselves.

    Amid these new features designed to calibrate exactly who can learn your phone number, however, one key role for that number hasn’t changed: There’s still no way to avoid sharing your phone number with Signal itself when you register. The fact that this requirement persists even after Signal’s upgrade will no doubt rankle some critics who have pushed Signal’s developers to better cater to users seeking more complete anonymity, such that even Signal’s own staff can’t see a phone number that might identify users or hand that number over to a surveillance agency wielding a court order.

Whittaker says that, for better or worse, a phone number remains necessary as the identifier Signal privately collects from its users. That’s partly because phone numbers are scarce, which prevents spammers from creating endless accounts. Phone numbers are also what allow anyone to install Signal and have it immediately populate with contacts from their address book, a key element of its usability.

    In fact, designing a system that prevents spam accounts and imports the user’s address book without requiring a phone number is “a deceptively hard problem,” says Whittaker. “Spam prevention and actually being able to connect with your social graph on a communications app—those are existential concerns,” she says. “That’s the reason that you still need a phone number to register, because we still need a thing that does that work.”
