Since the 1990s, there has been a struggle between the tech industry and the government over the implementation of encryption. The tech industry has long argued that people deserve and need the ability to protect their data. Corporations, for example, may want to protect financial data or information about new product development, and individuals have the right to protect themselves from theft. Encrypting photos could have helped protect Apple users during the iCloud hack.
On the other side of the argument is the government, which believes that encryption should be breachable by law enforcement. For over two decades, law enforcement agencies have claimed that strong encryption makes data warrant-proof and that they need a way to deal with that problem. The Department of Justice has petitioned time and again for what Deputy Attorney General Rod Rosenstein currently calls "responsible encryption," which essentially means that data companies are expected to provide a backdoor through which investigators can reach your data.
Tech companies have refused to comply with this request, sometimes taking the issue to court. Apple famously fought an order to unlock an iPhone used by one of the San Bernardino attackers, arguing that once a backdoor was created, it could not be un-created; it would remain an ever-present threat. This is the same argument made against building backdoors into standard encryption processes.
During a discussion at Ars Live, Riana Pfefferkorn, a legal fellow at the Stanford Center for Internet and Society, made the point that the term "responsible" is misleading.
"I think what Rosenstein is getting at is that he believes that companies in their deployment of encryption should be responsible to law enforcement above all and public safety rather than being responsible to their users or the broader security ecosystem."
She believes that the topic has resurfaced now because the DOJ smells "blood in the water" after the perceived mishandling of data regarding Russia during the presidential election, even though the two topics are unrelated. But the DOJ's arguments and tactics have hardly changed in over two decades. A 2015 research paper, "Keys Under Doormats," showed that little had changed on either side of the debate:
"The complexity of today's Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard-to-detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law."
From a software standpoint, the key argument is that a backdoor is a backdoor; once it exists, it exists. There is no way to secure one, because people are people. A key will leak, because too many people are involved: developers, law enforcement, government officials. And once the breach exists, it will be exploited, whether by an insider, a hacker, or law enforcement acting without a warrant. No good can come from deliberately making encryption insecure.
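The structural problem can be sketched in code. The toy Python example below is not real cryptography, and every name and key in it is invented for illustration; it simply shows the shape of an "exceptional access" design, in which each message key is wrapped under both the user's key and a single escrow key. Whoever holds that escrow key, or obtains a leaked copy of it, can read every message from every user, and the user's own key never enters into it.

```python
# Toy sketch only -- a trivial XOR stream cipher, NOT real cryptography.
# All names (ESCROW_KEY, encrypt_with_backdoor, etc.) are hypothetical.
import hashlib


ESCROW_KEY = b"law-enforcement-master-key"  # one secret shared by many parties


def _keystream(key: bytes):
    """Derive an endless keystream from a key by chained hashing."""
    block = hashlib.sha256(key).digest()
    while True:
        yield from block
        block = hashlib.sha256(block).digest()


def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR data against the keystream; applying it twice round-trips."""
    return bytes(b ^ k for b, k in zip(data, _keystream(key)))


def encrypt_with_backdoor(plaintext: bytes, user_key: bytes) -> dict:
    """Wrap the per-message key under BOTH the user key and the escrow key."""
    msg_key = hashlib.sha256(user_key + b"msg").digest()  # toy message key
    return {
        "ciphertext": xor_cipher(plaintext, msg_key),
        "wrapped_for_user": xor_cipher(msg_key, user_key),
        "wrapped_for_escrow": xor_cipher(msg_key, ESCROW_KEY),  # the backdoor
    }


def decrypt_via_escrow(msg: dict) -> bytes:
    """Anyone holding ESCROW_KEY -- or a leaked copy -- reads everything."""
    msg_key = xor_cipher(msg["wrapped_for_escrow"], ESCROW_KEY)
    return xor_cipher(msg["ciphertext"], msg_key)


msg = encrypt_with_backdoor(b"quarterly financials", b"alice-secret")
print(decrypt_via_escrow(msg))  # recovered without ever touching the user key
```

The point of the sketch is that `ESCROW_KEY` is a single, static secret: every encrypted message carries a copy of its message key wrapped under it, so compromising that one value compromises all users at once, exactly the "unanticipated, hard-to-detect security flaw" the researchers warn about.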