LIVE AS IF YOU WERE TO DIE TOMORROW. LEARN AS IF YOU WERE TO LIVE FOREVER (GANDHI)
Monday, March 6, 2017
QUESTION OF THE WEEK NO. 8
Should Congress enact a statute that requires
any smartphone or tablet sold in the United States to have a "backdoor" to ensure that data on such devices is
accessible to the government pursuant to a search warrant?
Absolutely not. This isn’t the first time this debate has come up. Back in the 90s, the government pressed companies to use a “Clipper chip” in their hardware, which would give the government access to encrypted data on private systems. Although the technology was claimed to be foolproof, it was quickly shown to be flawed, which would have given hackers access to any system with a Clipper chip installed. Without a doubt, this is exactly what will happen if a “master key” is created for the government for any service. Forcing such a feature upon all encrypted data would undermine the whole point of encryption, and would seriously harm the protection of everyone’s private data.
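To make the "master key" worry concrete, here is a toy sketch (not real cryptography, and purely illustrative) of why a single escrowed key is a single point of failure: every device's data is derived from one secret, so whoever obtains that one secret can read all of them.

```python
# Toy illustration only (NOT a real cipher): with a single escrowed
# MASTER_KEY, one leaked secret unlocks every device that uses it.
import hashlib
import secrets

MASTER_KEY = secrets.token_bytes(32)  # hypothetical government escrow key


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key plus a per-device nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(device_nonce: bytes, plaintext: bytes) -> bytes:
    ks = keystream(MASTER_KEY, device_nonce, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))


decrypt = encrypt  # an XOR stream cipher is its own inverse

# Two unrelated users' devices, each with its own nonce:
nonce_a, nonce_b = secrets.token_bytes(16), secrets.token_bytes(16)
ct_a = encrypt(nonce_a, b"Alice's private notes")
ct_b = encrypt(nonce_b, b"Bob's bank details")

# Anyone who steals MASTER_KEY can read BOTH devices:
assert decrypt(nonce_a, ct_a) == b"Alice's private notes"
assert decrypt(nonce_b, ct_b) == b"Bob's bank details"
```

The point of the sketch: the per-device nonces do nothing to protect users once the one master secret leaks, which is exactly the failure mode the Clipper-chip episode warned about.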
I do not think that Congress should require all smartphones sold in the U.S. to have a "backdoor" for the government. Both individuals and businesses purchase cell phones and use them for different reasons. Even if nothing illegal is going on, phone purchasers might not want an easy way for others to get hold of their private information. As Alex said above, it could be detrimental if hackers had some type of "master key" for all cell phones.
My first thought is "no". If there were a "backdoor," it would be too easy for ANYONE to gain access to and misuse the information on another person's smartphone. Also, if people knew this policy was in place they would probably feel "chilled" and less free to go about their lives as they wish. However, a smartphone is a person's property, and under the Fourth Amendment property can be searched with a warrant. So, I personally don't think this should happen, but in accordance with current law, maybe it should.
Not at all. I think that our phones are private to the individual. If a back door is created, that opens up the possibility of misuse not only by the government but by hackers as well. In today's world, cell phones hold lots of private and personal information. I think that making a "backdoor" gives a lot of power to "Big Brother" and does not allow the people to act freely. If we knew that our cell phones were being monitored, I don't think we would act the same, even if we weren't doing anything illegal. This would be a serious invasion of privacy and should not be implemented.
Definitely not. Any back door created for even the best of reasons can and will be exploited for malicious purposes. Since our entire lives are on our phones, including passwords and possibly credit card information, our digital information would be put in serious danger and our privacy would be undermined in yet one more way.
No, unless there were a secure way to reach the information (a key) that only the government or the phone company could use. Creating a backdoor would likely put the smartphone or tablet at risk of hacking from outside law enforcement.
No, I do not think so. I don't think that any sort of regulation on industry is inherently bad; it is good that the government puts safety standards on industry. However, a back door to phone encryption is not the same. This regulation wouldn't exist to make sure that people are safe using the product. It just doesn't seem right for the government to force companies to make their products more accessible to the government.
No, there should be no "backdoor" made on smartphones and tablets for the U.S. to use. It is my understanding that authorities can gain access to text messages at least 180 days old with a warrant, and currently police only need a subpoena to access emails that have been opened or are at least 180 days old. I believe that degree of access should be sufficient. Access to other information should be on a case by case basis, not full open-door access. This way technology providers can maintain the contract of trust they have with customers to be the safeguard of their privacy. And obviously there is the argument that hackers can also gain access through this backdoor. I wonder what people think about the position that Apple et al. should not be forced to create a backdoor, but the government can still try to crack its way in.
No. I thought the Manhattan District Attorney's Office's report, "Smartphone Encryption and Public Safety" made a very convincing argument about how necessary data on phones can be in catching criminals. I read through this article trying to find some justification for allowing a "backdoor" into encryption, but I couldn't find one. First, I think criminals will always find a way to communicate secretly. Exploiting encrypted devices is not going to solve this problem. I think that a backdoor will cause more harm to the innocent than it would bring justice to the guilty.
No, I do not believe that there should be a "backdoor" made to allow the government to get around encryption. If this were made, it could be exploited by criminals and compromise the encryption. There is no safe way to create a "backdoor" and still protect the integrity of the encryption. I believe that the government can use other methods to obtain evidence for cases that do not jeopardize everyone's privacy as a "backdoor" would.
In the readings for this week, I was half convinced by the argument for a multi-envelope scheme in which the producing company and the government each hold a key, with both required to open the phone. This is perfectly technically feasible, but has two problems: One: Which governments would have access? Two: The possibility of a key being leaked is still non-zero.
One possible solution to the key being leaked is to force a change every so often (every software update, for instance). The question of which governments have access is a stickier problem, but leaving that aside, the real problem is that anybody with basic, googleable technical knowledge could encrypt anything they want to protect themselves, thus making any backdoor useless.
Thus, my conclusion is that, no, tech companies should not be required to build in a backdoor. Even though I don't think such a backdoor would cause a massive security problem, the security problems it causes aren't balanced by a significant benefit to any kind of security.
Vehemently no. Once a backdoor exists, it becomes infinitely easier for hackers to gain access as well. This specific issue was brought up when Apple refused to grant the FBI access to that phone. The tech world agreed almost unanimously that installing backdoors on mobile phones would be severely detrimental to privacy and data security.
Initially, I would have said no, but after the readings, I will tentatively say yes. I believe that the envelope method mentioned in our readings would be very effective - if the FBI and the company both had to use their keys in order to access the phone, this would safeguard privacy. Of course, there is a possibility that this capability could be abused by both the FBI and companies, but it would be extremely unlikely. Additionally, the scenarios presented in the readings where access to the phones could have stopped a child molester, thwarted a burglary, etc., seem exceedingly beneficial to society. If criminals believe that they don't have to fear any threat to their mobile devices, they will be able to engage in operations much more effectively, and law enforcement will have a much more difficult job preventing them. Furthermore, I don't believe that creating a backdoor would create massive security issues. Before Apple began using iOS 9, there were no such data breaches. If the envelope method is employed, I believe that society's rights to privacy and security will be fairly balanced.
No, I don't believe there should be a universal back door incorporated into electronic devices. As has been said, mass-scale security breaches by hackers into individuals' devices may be unlikely. Nevertheless, the type of information gained through such a backdoor doesn't outweigh the public's privacy interest, and there are other ways to obtain information that's more helpful to such a case.
No, it would be dangerous for the average user if any local police officer could access the backdoor key with a search warrant. Giving so many people an important key almost guarantees someone will leak it or leave it out by accident. This policy would also be ineffective, as Simon noted. Encryption is not some highly technical computer science mumbo-jumbo. It only takes a few steps, and anyone with Google and half a brain could encrypt any message they want with the same level of security as big companies like Apple.
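To illustrate the "a few steps" point above: genuinely strong encryption is available to anyone in a handful of lines of code, so a determined criminal can always layer their own encryption on top of a backdoored device. A one-time pad (a truly random key as long as the message, used exactly once) is even provably unbreakable. This sketch uses only the Python standard library.

```python
# A one-time pad in a few lines: with a truly random, single-use key
# as long as the message, the ciphertext reveals nothing about the
# plaintext, regardless of any backdoor in the device itself.
import secrets

message = b"meet at midnight"
pad = secrets.token_bytes(len(message))  # random single-use key

ciphertext = bytes(m ^ p for m, p in zip(message, pad))   # encrypt
recovered = bytes(c ^ p for c, p in zip(ciphertext, pad))  # decrypt

assert recovered == message
```

The catch, of course, is key management (the pad must be shared securely and never reused), but the example shows why a mandated backdoor cannot stop anyone willing to take those few extra steps.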
No, this backdoor could be too easily taken advantage of in other situations. Other people may hack it, or the government may try to get information without obtaining a warrant first. Because our cell phones are our own private property, I think that we have the right to store private information on them without creating a back door.
No, I don't think that the United States Federal Government should mandate a backdoor to get into smartphones or tablets. However, manufacturers and administrators of operating systems should be required to give passwords to unlock said devices if the government provides a warrant. That way, everybody keeps their privacy, and the "backdoor" cannot be hacked. But the government still gets what it requires.
No, I think that a mandatory backdoor would be vulnerable enough for people outside the government to take advantage of. I also think that it represents a privacy issue, in that there ought to be a genuine expectation of privacy when it comes to somebody's phone, and the idea that the government can readily breach that privacy is too far on the side of security issues.