Khariz

A Letter from Apple CEO Tim Cook

I think everyone needs to read this:  https://www.apple.com/customer-letter/

 

I'll quote it here too, for ease of reading:


February 16, 2016

 

A Message to Our Customers

 

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.  This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

 

The Need for Encryption

 

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

 

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

 

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

 

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

 

The San Bernardino Case

 

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

 

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

 

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

 

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

 

The Threat to Data Security

 

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

 

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

 

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

 

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

 

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

 

A Dangerous Precedent

 

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

 

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

 

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

 

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.  We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

 

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

 

Tim Cook
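Worth putting numbers on the brute-force paragraph in the letter. Here's a rough sketch in Python, assuming the roughly 80 ms per-attempt passcode key-derivation cost Apple has described in its iOS security documentation (an assumption on my part, not something the letter states), with the escalating delays and ten-try erase disabled as the order demands:

```python
# Rough arithmetic for electronically brute-forcing an iPhone passcode,
# assuming each guess costs ~80 ms (the hardware-calibrated key-derivation
# time Apple has described) and that the escalating delays and the
# ten-attempt erase have been removed, as the requested software would do.
PER_ATTEMPT_SECONDS = 0.08  # assumed per-guess cost

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * PER_ATTEMPT_SECONDS / 3600.0

for length in (4, 6):
    print(f"{length}-digit passcode: at most {worst_case_hours(length):.1f} hours")
```

Even a 6-digit passcode falls within a day at this rate, which is exactly why the delay and erase protections matter far more than the passcode itself, and why removing them makes "thousands or millions of combinations" tractable.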

 


I think Tim Cook was very clear about what kind of compliance they have done and what kind they haven't. He said that they have handed over everything that they could, but that they aren't willing to cross this line. For whatever reason, I believe him that there is no backdoor into the iPhone. I don't think it's safe to store things on iCloud, though.


I do not use Apple products or services in any way. But I applaud them for having the decency and courage to say no.

 

Regardless of what they say, I would never take the word of anyone that their product or service has no backdoors. Take precautions lest you wish you had.

 

The OpenSSL Heartbleed bug, patched in 2014, had existed in the wild since 2012, and no one thought to look for it. Likewise, I think only a fool would trust proprietary software to be free of exploitable bugs.

 

Be paranoid. They do not have to be out to get you to ruin your life.


Debugging is at least twice as hard as writing the program in the first place.

So if you write your code as clever as you can possibly make it, then by definition you are not smart enough to debug it.


It's upon law enforcement officers (LEOs) to prove guilt or gain access to private property. That's liberty for the citizen. How silly to think the property owner should be expected to open his or her "doors" to LEOs. The Constitution of the USA protects citizens in this way, though it's certainly not being followed by government. For example, a person is allowed to NOT speak. It behooves the homeowner to *not* open his "door" to LEOs. That is his right. However, if LEOs have sufficient reason to search private property, then they should seek justice in whatever lawful way they can. Still, it is not upon the property owner to provide the means. When gaining access to a domicile, authorities with probable cause don't wait for the door to be opened for them; they bust the door down. So why should we now be expected to provide access to a locked device? No, it should be upon the FBI to bust down the door of the iPhone.


In general, if companies really are concerned about customers' privacy, dragnet surveillance, etc., they have to leave the US.

Because it's not going to change soon.

It's very interesting what Apple is going to do, or what they say they are going to do.

Have a good day.

 

More detail on the iPhone case here, very interesting:

http://www.jupiterbroadcasting.com/93956/open-that-iphone-unfilter-176/


I certainly applaud Apple here. Tim Cook said a really obvious thing to many of us in the "hobby" of securing privacy (very loose quoting on my part): "criminals will/can select from any number of products out there even if we build the backdoor". Many sage privacy folks will still go for open-source, proven solutions rather than proprietary ones. Still, Apple's stance and posture here is impressing me, and I don't carry an iPhone.


My 10 cents: governments such as the UK's (mine) are using their propaganda machines to demonize anything that might make it difficult or impossible for them to snoop on us. So anyone using any type of encryption must be doing something underhand or illegal, unless you are using a bank or shopping online, but don't mention those, because they are legit and people might start thinking encryption is good.

 

If terrorists or any criminals want to keep stuff secret, they will (or at least try), no matter what the stupid governments want with regard to backdoors and other such nonsense. In essence there is little difference from before digital technology was around; cryptography has been alive and well for centuries.

 

I just hate slimy, underhanded liars, (lots of expletives) governments, totally hypocritical teapot heads, members of which do loads of secretive deals and other such deeds that they would not like in the public arena. Thank the Lord for Snowden and others like him.


I could stomach a compromise perhaps along this line (I don't like a compromise, but I could live with this): Apple HQ constructs a concrete and steel (Faraday) room where a closed-system computer sits. Like the nuclear football, it takes two people to get into the room. One is ONLY Tim Cook himself, and the other could be one of maybe 5 people. Then the same two people are needed to mount the computer, and THAT specific machine has NO backdoor - EVER.

 

The created iPhone backdoor would only exist there, and the phone MUST be physically connected to this ONE computer. Nothing happens without a direct physical connection, to prevent this thing getting into the wild --- EVER. Once the computer is mounted, ONLY Tim Cook stays in the room, and he prints out the entire contents of the phone on PAPER! Once he verifies it printed correctly, he takes the hacked phone and drops it in a slot right there in the room, which feeds a high-end commercial shredder that turns the device into dust!!

 

This would be acceptable to me.

 

In my plan it would take three things to take a phone into that room.

 

1.  A fully certified court order from a CIRCUIT-LEVEL judge, not a municipal judge who may have graduated law school last year.

 

2.  The ONE iPhone documented in the order. NO others allowed, not even on Tim Cook.

 

3.  A $50,000.00 (US dollars) non-negotiable opening fee that must be paid in advance. This is needed to prevent LEOs from lining up with phones from sunrise to sunset. If local LE has a case important enough to invest 50 grand, then so be it. Apple can donate the money to homeless shelters or whatever, but don't even cry the poor-boy thing from the LE side of this.

 

 

I can tell you that the FBI would pay 50 grand in a snap to have that one iPhone opened. Hopefully Apple is smart enough to configure the encryption on the backdoor computer to be unbreakable, in case some "politicians" decide to break through the Faraday door anyway.

 

Does this setup sound too out there?


Before people continue the debate about "how" this should be handled, let me point out that it may not even be physically possible in the first place. If Apple can do this, then the backdoor existed from the start. They made a lot of noise because it gets public attention on the case. And it is favorable PR for them to refuse. But legally, despite anything they said, if they are even capable of doing it, and they actually refuse a court order to do it, they are already in contempt of court. And that is potentially more damning than everyone at Apple actually being a terrorist.

 

So stop believing the lies. Apple either has no capacity to do as ordered by the court, or they are making their damnation as public as possible. Think it over.

 

If it was possible to break the encryption on the device in question, then the encryption in use was never worth using. Look at TrueCrypt. It has been out for 12 years and has *NEVER* been cracked even one time. Need proof? Read the Legal cases section of its Wikipedia article.
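A quick back-of-the-envelope sketch of why that holds: exhaustively searching a 256-bit key space (the size used by the AES-256 option in TrueCrypt) is out of reach even for an absurdly capable attacker. The guesses-per-second figure below is a deliberately generous assumption, not a measurement.

```python
# Back-of-the-envelope estimate of the time needed to exhaust a
# 256-bit key space, assuming an attacker who can test a trillion
# keys per second (a generous, hypothetical figure).
KEY_SPACE = 2 ** 256
GUESSES_PER_SECOND = 10 ** 12
SECONDS_PER_YEAR = 365 * 24 * 3600

years = KEY_SPACE / (GUESSES_PER_SECOND * SECONDS_PER_YEAR)
print(f"Exhausting the key space would take about {years:.1e} years")
```

Which is why real-world attacks go after the passphrase, the implementation, or the user, never the cipher itself.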



