|
I believe I read somewhere that there's currently a security measure that deletes the encryption key after too many failed login attempts. If I'm not mistaken, they're asking Apple to change that setting so that they can brute force the password (i.e. make it so nothing gets deleted when the phone is faced with a brute force attack).
|
|
|
|
|
Duncan Edwards Jones wrote: If the file exists on the phone and was encrypted using an existing version of the data, how would installing a new version of iOS allow easier decryption?
My understanding is that if you attempt bad passwords X number of times, the phone essentially bricks itself. The "new" iOS being requested by the courts/FBI would allow unlimited attempts, therefore making any phone that can have that OS installed brute-forcible.
|
|
|
|
|
We answered the same thing at just about the same time, so I guess that is the stated story.
I can see the concern: if this "modified" version of the OS got out into the wild, anybody could brute force an iPhone.
|
|
|
|
|
Vark111 wrote:
My understanding is that if you attempt bad passwords X number of times, the phone essentially bricks itself. The "new" iOS being requested by the courts/FBI would allow unlimited attempts, therefore making any phone that can have that OS installed brute-forcible.
Ten attempts. Then the phone doesn't just block all info on the phone, it erases it completely. After that no tool can recover it; there is nothing to recover.
The code in question is a 4-decimal-digit code, so a brute force attack requires only ten thousand tries (or, on average, half of that) - so few that it sounds neither very "brute" nor like very strong "force".
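To make "so few" concrete, here is a sketch of exhausting the whole 4-digit space. The `try_unlock` callback is hypothetical; a stock device would throttle attempts and then wipe itself long before this loop finished, which is exactly the protection at issue.

```python
import itertools

def brute_force_pin(try_unlock):
    # Enumerate every 4-digit passcode, 0000 through 9999, until one
    # unlocks the device: at most 10,000 calls, ~5,000 on average.
    for digits in itertools.product("0123456789", repeat=4):
        pin = "".join(digits)
        if try_unlock(pin):
            return pin
    return None

# Toy stand-in for the device: the "secret" passcode is 7389.
print(brute_force_pin(lambda pin: pin == "7389"))  # prints 7389
```

Even at a generous one attempt per second, the full space takes under three hours, which is why the wipe-after-ten-failures rule carries all the weight here.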
|
|
|
|
|
Duncan Edwards Jones wrote:
If the file exists on the phone and was encrypted using an existing version of the data, how would installing a new version of iOS allow easier decryption?
Unless the user specifies the full encryption key every time the encrypted information is accessed, the software does know the key. It is stored somewhere in the file system. Move that flash (or disk, for general PCs) over to another machine, as a secondary storage device, and the key can be read by that other machine.
Sure, the key is usually encrypted; you won't find it in cleartext. But the OS/application knows how to decrypt it. It must know, in order to decrypt the info for the proper user. But in a standard version, the OS/app refuses to do it until the operator has authenticated himself. The special OS edition on the other machine may be willing to decrypt the key without the owner authenticating himself, e.g. by presenting a password or fingerprint.
Couldn't that info, given by the user, be (part of) what encrypts the key, so that an intruder would have to know that?
But the OS knows that, too. It must know the PW (or some transformation of it) in order to check that the user gives the right one. So the alternate OS version may pretend that it has just read from the user a PW corresponding to the expected one, even if no user ever typed anything.
Whether you install the alternate OS version on the same device or you move the storage device (flash/disk) to another machine makes no essential difference, as long as there exists a possibility of loading a new OS version without logging in to the machine. In the old days, that wasn't always the case, but with modern automatic over-the-air updates and fixes, it is probably possible to replace all essential parts of the OS that way.
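The "encrypted key, decryptable by the OS given the passcode" idea above can be sketched like this. The function names are hypothetical, the XOR "wrap" is a toy stand-in for a real authenticated cipher, and real devices also mix a hardware-bound UID into the derivation; only the general shape is illustrated.

```python
import hashlib
import os

def derive_wrap_key(passcode: str, salt: bytes) -> bytes:
    # Stretch the user's passcode into a 32-byte key-wrapping key.
    # Real devices also mix in a hardware UID and tune the work factor.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def wrap_key(data_key: bytes, passcode: str, salt: bytes) -> bytes:
    # Toy XOR "wrap" of the 32-byte data key; a real system would use
    # something like AES key wrap instead.
    kek = derive_wrap_key(passcode, salt)
    return bytes(a ^ b for a, b in zip(data_key, kek))

def unwrap_key(wrapped: bytes, passcode: str, salt: bytes) -> bytes:
    return wrap_key(wrapped, passcode, salt)  # XOR is its own inverse

salt = os.urandom(16)      # stored in cleartext next to the wrapped key
data_key = os.urandom(32)  # the key that actually encrypts the files
stored = wrap_key(data_key, "7389", salt)
assert unwrap_key(stored, "7389", salt) == data_key  # right passcode works
assert unwrap_key(stored, "0000", salt) != data_key  # wrong one: garbage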
The only safe encryption is where you are the one generating the key, the only one knowing it, and you never present it to the OS or to any application. For standard PC use, I would like to have a USB dongle where I can load, say, my X.509 certificates into a flash area that is not addressable across the USB interface; only the processor in the dongle can see it. So the PC sends the ciphertext across the USB interface, the dongle decrypts it, and returns the cleartext to the PC across the USB interface. (Or it receives cleartext and returns ciphertext.) In many applications (such as S/MIME), the ciphertext will not be the full document text but e.g. a one-time 3DES or AES256 key, used for the text body, but in principle, the dongle could encrypt/decrypt the entire text body.
This dongle could itself require authentication. E.g. it could have a Bluetooth [Smart] interface to your smartphone, requesting a 6-digit PIN to be keyed in on the phone. No keylogger on the PC would be able to pick it up (the way it can pick up any PIN, PW or key you type at the PC keyboard). So accessing an encrypted document would require the right USB dongle with the proper keys loaded, the right smartphone for authentication, and knowledge of the PIN requested by the dongle. (Plus, implicitly, the ability to unlock the smartphone, e.g. by fingerprint.) In principle, a keylogger may be installed on the phone, but the risk of the intruder knowing how those typed digits are actually used - as a PIN code for some independent dongle - is rather small.
The biggest problem is to make e.g. an email program use that dongle for decrypting/encrypting the one-time key (or the entire text body). Even if standard encryption APIs exist, there is a great risk that common mail programs insist on accessing the X.509 certificate itself; maybe they don't use that standard encryption API at all. So if I make myself such a dongle (in fact, I do have access to a programmable USB dongle that could do the job - I just have to learn to develop software for it!), I guess I would have to obtain some open-source email reader (such as Thunderbird) and adapt its source code for it. I guess I might get the time to complete that project as soon as I retire as an old age pensioner...
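As a toy model of the dongle idea above: the class and method names are hypothetical, XOR stands in for a real cipher, and a real dongle would hold the key in tamper-resistant hardware rather than process memory. The point illustrated is only the interface: the host sends data across, and the key never crosses back.

```python
import os

class Dongle:
    """Toy model: the key lives only inside this object. The host PC
    only ever sends bytes across the 'USB interface' (method calls
    here) and never sees the key itself."""

    def __init__(self, pin: str):
        self._key = os.urandom(32)   # never leaves the dongle
        self._pin = pin
        self._unlocked = False

    def authenticate(self, pin: str) -> bool:
        # In the scheme described above, this PIN would arrive via the
        # paired smartphone, not the (possibly keylogged) PC keyboard.
        self._unlocked = (pin == self._pin)
        return self._unlocked

    def transform(self, data: bytes) -> bytes:
        # XOR keystream stands in for real encryption; being its own
        # inverse, the same call both encrypts and decrypts here.
        if not self._unlocked:
            raise PermissionError("dongle locked")
        return bytes(b ^ self._key[i % 32] for i, b in enumerate(data))
```

In the S/MIME case described above, the mail client would hand `transform` the wrapped one-time session key rather than the whole message body.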
|
|
|
|
|
More people have been killed by babies with guns than by terrorists. Don't let hoopla and propaganda cloud your judgement. Yes, it was sad, but the media blew it up to play the fear card and make it seem like a much bigger problem than it really is. So it's not worth opening Pandora's box.
Jeremy Falcon
|
|
|
|
|
I say we ban babies!
|
|
|
|
|
Agreed. They don't do anything but cry and poop anyway. Who needs them.
Jeremy Falcon
|
|
|
|
|
It could be a marketing stunt on the part of Apple:
(1) Apple publish that they refuse to unlock phones, knowing damn well that they will end up unlocking them.
(2) Their sales go up and they gain market share from Android users who think 'Apple have an ethical stance'.
(3) Apple then say that sadly they had no choice and unlock the phone - they come out of it smelling of roses.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
It is not like that... Apple simply has no idea how to unlock the iPhone.
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
|
|
|
|
You need to get the full story. The government has not asked Apple to "unlock" that phone. The government wants Apple to create and install software on that phone which makes it hackable. Software can be copied.
You may or may not love to hate Apple. But their words open another perspective: Customer Letter - Apple[^]
Life is too short
|
|
|
|
|
The question is... can you trust the government?
When you see how the IRS abused its power, I think the answer is obvious: No. Any tools given to the government will be used against real and 'perceived' enemies.
A 'perceived' enemy is someone you disagree with politically.
|
|
|
|
|
Apple are saying no because it will devalue their biggest selling product. Even if we believe that it was special software written to access one particular phone, the fact that it could be done to anyone on a court order may well deter people from their products in the future.
Having said that, I find it disingenuous of Apple to stand up for security concerns when they've allowed such easy access to the data by, albeit legitimately, installed applications such as Facebook.
Ultimately we should consider any computing device, especially devices capable of over the air comms, as insecure anyway.
|
|
|
|
|
It not only could, it will. If one phone got unlocked, what's wrong with two? If two, what's wrong with three? And so on and so forth.
|
|
|
|
|
"The Gov isn't asking them to unlock EVERYONE's phone"
No, that's exactly what the government is demanding. They want a tool that will unlock any iPhone. And that is a dangerous precedent. If history has taught us anything, it is that no government should be trusted, at any time, to do the right thing, when the wrong thing is an option. It also represents a significant reduction in security, whose primary purpose is preventing hackers/crackers from gaining access to your data. If a backdoor is created, attackers will find it, and they will exploit it.
What can this strange device be?
When I touch it, it gives forth a sound
It's got wires that vibrate and give music
What can this thing be that I found?
|
|
|
|
|
I do not understand what this "No" means. The article says the authorities have the device. So if the device could be unlocked (it doesn't matter whether it is, it matters that it could be), then everyone could unlock the iPhone (okay, without the source code it takes a little longer, but not that much). If the device is strongly encrypted (as it should be), no backdoor can unlock it; strong encryption would take a supercomputer several million years to brute force. Finally, if the device is not really encrypted, or the private key can be reached through the hardware, or it is merely obfuscated, then the device is already effectively unlocked, just use the right tools (obfuscation is not security, it merely stops power users from poking around the device).
So what does "We could, but we said 'No'!" mean? Are iPhones really secure, or are they only "secure" because normal users don't have the proper hardware/source code (the first is easy to build, the second can be reverse-engineered)? A really secure device should be impossible for even its manufacturer to unlock, short of wiping it.
|
|
|
|
|
From my understanding, what is being requested is a version of iOS that will not wipe the device if an incorrect password is typed more than 10 times. If the Feds can have a version of iOS that allows an unlimited number of password attempts, then they can eventually hit the correct password and access the phone.
I suspect the source code change to allow an unlimited number of sign-in attempts before wiping the phone is a pretty easy one.
|
|
|
|
|
So what? If someone steals your device, they cannot unlock it unless they connect directly to the processor bus (yes, there are devices that can do that). Normally the price of such hardware is sky-high for normal users (which the thief usually is). I don't think the government cannot afford such hardware, so legal issues aside, why do they need Apple? Yes, modifying source code is easier, cheaper and faster, but modifying machine code is not that difficult.
|
|
|
|
|
Once someone is convicted of a crime, have they not given up their right to privacy? Like a felon has given up the right to vote?
I'm not for spying on innocent citizens, but what about citizens that have been proven to NOT be innocent?
Liberty comes with a price, and so does wickedness.
|
|
|
|
|
I think it extremely interesting that the government is forced to go to the manufacturer to get the data.
The entire situation itself indicates that the government (which obviously includes the NSA) isn't all powerful when it comes to invasion of personal privacy.
I have to admit, my first reaction was, "why can't they just give the damn phone to Apple and have a government (FBI) representative (for chain of evidence reasons) present when the data is produced." That way the code-breaking capability doesn't leave Apple's "clean room" and reduces by many factors the vulnerability of such a program escaping into the wild.
However, if Apple did such a thing, the government would be knocking on their door to do it again in less time than it takes to say iPhone. Ah those pesky precedents.
I'll be stepping out shortly to get more popcorn for the rest of the show.
Talk amongst yourselves.
Cheers,
Mike Fidler
"I intend to live forever - so far, so good." Steven Wright
"I almost had a psychic girlfriend but she left me before we met." Also Steven Wright
"I'm addicted to placebos. I could quit, but it wouldn't matter." Steven Wright yet again.
|
|
|
|
|
While I absolutely oppose inserting a backdoor into any security scheme, this case is a bit different in that the owner of the phone also wants the security hacked. I have no problem with that, as a one-off hack. Most of us use company resources for personal purposes (email, for instance), but I don't pretend that anything that touches a company server is private. It's spelled out in company policy. The same applies to a company phone. All those records and content belong to the company. If an employee is stupid enough to give private information to the company, it's on them.
|
|
|
|
|
Actually they aren't asking Apple to unlock one person's phone. They are asking Apple to create software that can unlock that iPhone, which could then obviously be used to open any other iPhone or maybe any iDevice.
Suppose the government said they wanted to create a strain of Super Ebola transmittable through the air so they could study it. You know, just in case it naturally mutates that way, so we can be prepared. Don't worry, we will keep it safe in just one lab in San Bernardino where only authorized scientists will have access. Would you be OK with that?
There may be nothing useful in the phone at all.
Everything that can fall into the wrong hands will fall into the wrong hands. Once that software is created, it will leak. Then every lost iPhone means that person loses every dime in their bank accounts. If a thief gets your phone they can log into your bank app and transfer funds. Even if you don't have the password saved, they can reset your password because your e-mail password is auto-saved. Heck it could even mean a huge spike in iPhone theft once the thieves have the tools to make so much more money from each stolen phone.
|
|
|
|
|
Everyone who really thinks Apple should provide the FBI with tools to access or unlock phones, should immediately turn off all locking and privacy features on their phones right now and leave them off forever. If you're not willing to do that, then you really don't want Apple to provide unlock tools to anybody, you're just not thinking things all the way through.
Tools mean an exploit must be present. They also set a precedent, with the expectation that those tools will continue to work, which means that the exploit must become a maintained feature of the product. What happens when that exploit is discovered by the bad guys? Will the FBI take responsibility and give up their tools so Apple can close the hole? Never.
We can program with only 1's, but if all you've got are zeros, you've got nothing.
|
|
|
|
|
A quick look through the many replies below seems to indicate most people have an immediate feeling that we want to be protected from terrorists so it is petty of Apple to "refuse to unlock this one phone" simply because they believe in privacy rights. I would suggest you know what is actually being required of Apple:
The government has invoked a centuries-old writ requiring the general cooperation of third parties in executing writs or orders of the court/government. It has invoked that general writ in this case to insist that Apple engineers write a new operating system for the iPhone that will remove the multiple-password-attempt protection (i.e., remove the increasing delay of response and the ultimate locking of the device on repeated password errors) so the government can try brute force cracking the password on the terrorist's phone (by automatically running millions of password attempts until one works).
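The protection being removed works roughly like an escalating delay schedule ahead of the final lockout. The numbers below are illustrative only, not Apple's actual schedule:

```python
def lockout_delay(consecutive_failures: int) -> int:
    """Seconds the device forces you to wait before accepting the next
    passcode attempt. Hypothetical schedule for illustration."""
    schedule = {5: 60, 6: 300, 7: 3600}
    if consecutive_failures < 5:
        return 0                      # the first few attempts are free
    return schedule.get(consecutive_failures, 28800)  # then hours each
```

Even before any wipe, delays like these stretch an automated 10,000-guess run from minutes into years, which is why the request targets this mechanism specifically.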
To paraphrase a federal judge who refused to allow the use of the All Writs Act in that way in 2005, the government need only run this Hail Mary play if its arguments under the relevant laws fail to allow it to do what it wants to do (US Magistrate Judge Orenstein).
This controversy will surely take years to resolve, since it will likely proceed to the US Supreme Court (which may not be fully staffed since the Congress apparently views the President's power to appoint justices as optional and politically inconvenient).
Aside from the implications of demanding a business abandon a marketing feature or do slave labor for the government (and these do involve constitutional questions re the 2nd and 5th amendments, among other issues), you really need to slow down on this reaction that we want to be protected, and what does it matter if the government can look at any and all of my communications (which they do anyway, for the most part). There is a difference between being protected by law and being protected by the good will of a particular official of the government.
We've come a long way from Patrick Henry's "give me liberty or give me death," the attitude of those who risked their lives that we might have a country like America. Now it seems to be, "to hell with liberty---I want to live at any cost." If you look at history you will see populations that made that decision always suffered severe consequences.
|
|
|
|
|
Excellent commentary. Well written.
If it's not broken, fix it until it is
|
|
|
|
|