Law enforcement officials are revisiting proposals that would require tech companies to build backdoors into electronic devices so investigators can more easily access data in criminal investigations, reports The New York Times.
The issue was heavily debated following the 2016 legal dispute between Apple and the FBI over the iPhone 5c belonging to San Bernardino shooter Syed Farook. The government wanted Apple to create software that would allow it to access data on the device, which Apple refused to do.
In response to rumors of renewed efforts to build such a tool, Apple software engineering chief Craig Federighi told The New York Times that weakening security protections in iOS devices would be a grave mistake, maintaining Apple's stance on the issue.
"Proposals that involve giving the keys to customers' device data to anyone but the customer inject new and dangerous weaknesses into product security," he said in a statement. "Weakening security makes no sense when you consider that customers rely on our products to keep their personal information safe, run their businesses or even manage vital infrastructure like power grids and transportation systems."
Apple has consistently argued that device security must keep improving to stay ahead of hackers and other bad actors who exploit vulnerabilities in iOS devices. During the dispute over the San Bernardino device, Apple refused to build a backdoor tool into its devices, arguing that if such a tool existed, it could easily end up in non-government hands.
Federighi has previously spoken passionately on the issue, and in early 2016, he published an op-ed in The Washington Post using the same argument he reiterated in his statement to The New York Times. iOS devices, he said, are "part of the security perimeter that protects your family and co-workers." From Federighi in 2016:
To get around Apple's safeguards, the FBI wants us to create a backdoor in the form of special software that bypasses passcode protections, intentionally creating a vulnerability that would let the government force its way into an iPhone. Once created, this software -- which law enforcement has conceded it wants to apply to many iPhones -- would become a weakness that hackers and criminals could use to wreak havoc on the privacy and personal safety of us all.
According to The New York Times, FBI and DOJ officials have been "quietly" meeting with security researchers to work on approaches that would provide "extraordinary access" to encrypted devices like the iPhone. Based on this research, DOJ officials "are convinced" there's a way to create a backdoor to access data without weakening a device's defense against hacking.
One method under discussion involves a special access key generated when a device encrypts itself, allowing data to be unlocked without the user's passcode. The key would be stored on the device itself, in a separately encrypted portion of its storage, and only the device manufacturer, acting under a court order, would be able to access it.
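To make the general shape of such a key-escrow scheme concrete, here is a minimal sketch using the Python cryptography library. It is only an illustration of the idea described above, not any actual proposal or Apple implementation; the manufacturer key pair, the "escrow blob," and the RSA/AES-GCM choices are all assumptions made for the example.

```python
# Conceptual sketch only: a data key is wrapped ("escrowed") under a
# manufacturer-held key so that only the manufacturer could later unwrap it.
# All names and parameters here are illustrative assumptions.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Manufacturer key pair; under the proposal, the private half would live
# off-device and be used only in response to a court order.
manufacturer_private = rsa.generate_private_key(public_exponent=65537, key_size=3072)
manufacturer_public = manufacturer_private.public_key()

# 1. The device generates a random data-encryption key when it encrypts itself.
data_key = AESGCM.generate_key(bit_length=256)

# 2. User data is encrypted under that key (passcode-derived protections omitted).
nonce = os.urandom(12)
ciphertext = AESGCM(data_key).encrypt(nonce, b"user data", None)

# 3. A copy of the data key is wrapped with the manufacturer's public key and
#    stored in a separately encrypted region of device storage.
escrow_blob = manufacturer_public.encrypt(data_key, OAEP)

# 4. With a court order, the manufacturer unwraps the escrowed key and the
#    data can be decrypted without the user's passcode.
recovered_key = manufacturer_private.decrypt(escrow_blob, OAEP)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"user data"
```

The trade-off in any scheme of this shape is that the escrowed key and the manufacturer's private key become high-value targets: whoever obtains them gains exactly the access the scheme reserves for law enforcement, which is the kind of risk critics point to.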
Susan Landau, a computer security professor at Tufts University, told The New York Times that this would create "significant additional security risks" given that "so many more tech companies" would need to access these keys to comply with the inevitable flood of law enforcement access requests.
Talks inside the executive branch have reportedly resumed over whether to ask Congress to enact legislation requiring tech companies to create new access mechanisms for law enforcement officials. The talks are said to be preliminary, with no request for legislation imminent.
Note: Due to the political nature of the discussion regarding this topic, the discussion thread is located in our Politics, Religion, Social Issues forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.