
Spies want to turn FaceTime eavesdropping into a feature



On Monday, we learned that Apple's FaceTime video chat service has a bug that lets a caller receive audio, and even video, directly from your iPhone or Mac. This happens without your permission and without the standard indication that someone else is listening and watching. Anyone with FaceTime can eavesdrop on any other FaceTime user simply by placing a call and performing a simple maneuver; the victim's device starts transmitting even if the call is never answered.

This is a catastrophic bug. Yet if powerful national spy agencies get their way, a similar flaw would become a standard feature of nearly every common communications product. As unbelievable as that sounds, we know it because they have told us so.

The FaceTime flaw is a bug in the user interface: the parts of the software that alert the user and control what the device is doing. The FaceTime interface fails in at least two related but distinct ways. First, it sends the attacker audio and video without the victim's permission; the transmission begins without the victim ever approving it. Second, it does so without the victim's knowledge; the normal indication that a call is in progress never appears.

The engineering community has understood for years that user interface flaws are often the cause of security failures, and that these flaws can be worse than other kinds of bugs. There are organizations, books, and conferences devoted to trustworthy and secure user interfaces, and Apple itself has guidelines emphasizing the importance of the user interface in security software.

Government Communications Headquarters (GCHQ), a close surveillance partner of the US National Security Agency, recently suggested that government agents be given the ability to join secure messaging services as hidden participants. This suggestion has become known as the "ghost proposal."

The proposal, written by GCHQ's Ian Levy and Crispin Robinson, recommends institutionalizing an untrustworthy user interface whenever the government wants to spy on a conversation:

It's relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who's who and which devices are involved; they're usually involved in introducing the parties to a chat or call. In a solution like this, we're normally talking about suppressing a notification on a target's device … and possibly those they communicate with.

In short, Apple or any other company that lets people chat privately would be forced to allow the government to participate in those chats as a silent, invisible eavesdropper. Even the most secure apps, like Signal (which we recommend) and WhatsApp, which use end-to-end encryption, would become unsafe if they had to implement this proposal.
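To make concrete what "silently adding a participant" would mean in an end-to-end encrypted system, here is a minimal, hypothetical sketch (written with the PyNaCl library; it models no real messaging protocol, and all names are made up). The encryption is never broken: the provider simply hands the sending client a membership list containing one extra key, and the client suppresses the "participant added" notice.

```python
# Toy model of the "ghost" idea: end-to-end encryption stays intact,
# but the provider-controlled membership list quietly gains a recipient.
from nacl.public import PrivateKey, SealedBox

alice, bob = PrivateKey.generate(), PrivateKey.generate()
ghost = PrivateKey.generate()  # law-enforcement key, never shown in the UI

# The provider, which controls the identity system, hands the sender this
# membership list. Nothing about the extra entry is visible to Alice.
membership = {"bob": bob.public_key, "ghost": ghost.public_key}

def send(plaintext: bytes) -> dict:
    # Each message is encrypted separately to every listed member, so the
    # cryptography is never "touched"; it simply has one more recipient.
    return {name: SealedBox(pk).encrypt(plaintext)
            for name, pk in membership.items()}

msgs = send(b"meet at noon")
print(SealedBox(bob).decrypt(msgs["bob"]))      # b'meet at noon'
print(SealedBox(ghost).decrypt(msgs["ghost"]))  # the eavesdropper reads it too
```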


The ghost proposal institutionalizes a user interface failure far worse than Monday's FaceTime bug. The FaceTime bug at least notifies the victim of an incoming call, so something visible happens, even though the interface misrepresents the situation and violates the user's expectations. Under the ghost proposal, users would have no way to know that anything violating their expectations is happening at all.

The GCHQ authors claim that the ghost approach gives law enforcement an interception capability where "you don't even have to touch the encryption." This is true, but only in the most disingenuous sense.

When users want encryption in their communications, it's not because they love mathematics. People care about what encryption does for them. Encryption and other cryptographic protocols exist to protect people by providing properties such as confidentiality, integrity, and authenticity. In essence, the ghost proposal says: "Let us violate authenticity, and you can keep your encryption." But if you don't know who you're talking to, what security guarantee is left?

Cryptography is necessary to provide these properties, but it is not sufficient. The entire system, from the cryptographic math through the software implementation and network protocols to the user interface, matters for delivering secure communications in an increasingly hostile online environment.
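This is also why clients that let users verify each other's keys out of band matter; Signal's "safety numbers" work on a similar principle. As a rough, hypothetical illustration (again using PyNaCl, with made-up names and flow), a client that pins a contact's key fingerprint can at least notice when the provider's identity system presents a roster that doesn't match what the user actually verified:

```python
# A minimal sketch, assuming a client that pins contacts' public keys and
# compares short fingerprints verified over another channel.
import hashlib
from nacl.public import PrivateKey

def fingerprint(public_key_bytes: bytes) -> str:
    # Short digest two people can read to each other out of band.
    return hashlib.sha256(public_key_bytes).hexdigest()[:16]

bob = PrivateKey.generate()
ghost = PrivateKey.generate()   # extra key slipped in by the provider

# What Alice verified with Bob in person:
pinned = {"bob": fingerprint(bytes(bob.public_key))}

# What the provider's identity system now claims the conversation contains:
claimed = {
    "bob": fingerprint(bytes(bob.public_key)),
    "ghost": fingerprint(bytes(ghost.public_key)),
}

for name, fp in claimed.items():
    if pinned.get(name) != fp:
        print(f"Unverified participant in conversation: {name}")
```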

And let's not forget: if companies like Apple can be forced to let governments quietly join private conversations, this capability will not be available only to democratic governments. It will also be used by the world's worst human rights abusers to target journalists, activists, and others.

To be clear: all software has bugs, and Apple's software, however good, is no exception. Although it took Apple too long to acknowledge the flaw, the company is now treating it with the seriousness it deserves.

Because the vulnerability is triggered through the Group FaceTime servers, Apple has taken those servers entirely offline until a fix to the FaceTime app itself can be shipped. But any FaceTime app that hasn't been updated will still be vulnerable once Apple re-enables the Group FaceTime servers, so until an update is delivered and installed, users should probably leave FaceTime disabled. (This is a good reminder of why it is important to install software updates as soon as they become available.)

That such a devastating flaw could appear in the software of a company known for prioritizing privacy should be a warning to anyone, including GCHQ and the NSA, who advocates deliberately weakening security to facilitate government surveillance. It is hard enough to build software correctly in the first place; it is harder still to build it with deliberate flaws and keep those flaws limited. Any mechanism for purposely making the user interface untrustworthy will be an attractive target for malicious hackers and other hostile actors. Who will be accountable for the inevitable abuse?

Any future discovery of a software flaw that allows eavesdropping, impersonation, message tampering, or other compromise of communications security should be treated the way this vulnerability has been: as a serious emergency, remedied as quickly as possible with a software update that eliminates the flaw. And governments certainly should not be contemplating deliberately introducing such flaws.

