Digital Veil · Case #9911
Evidence
On December 2, 2015, Syed Rizwan Farook and Tashfeen Malik killed 14 people in San Bernardino. The FBI recovered Farook's work-issued iPhone 5C running iOS 9 with full-disk encryption enabled. On February 16, 2016, Magistrate Judge Sheri Pym ordered Apple to create custom software to bypass security features. Apple CEO Tim Cook published an open letter calling the order 'unprecedented' and 'dangerous.' Over 40 amicus briefs were filed supporting both sides, including briefs from Microsoft, Google, Facebook, and Amazon. On March 28, 2016, the FBI announced it had accessed the phone through a 'third party' without Apple's assistance. The FBI reportedly paid between $900,000 and $1.3 million to access the device. The phone contained no evidence of contact with foreign terrorist organizations or previously unknown co-conspirators.
Digital Veil · Part 11 of 17 · Case #9911

In 2016, the FBI Obtained a Court Order Demanding Apple Create a Custom iOS Version to Bypass Encryption on the San Bernardino Shooter's iPhone. Apple Refused. The FBI Dropped the Case After Finding Another Way In.

On December 2, 2015, two gunmen killed 14 people in San Bernardino, California. The FBI recovered an iPhone 5C issued by the shooter's employer. It was locked with a four-digit passcode and full-disk encryption. On February 16, 2016, a federal magistrate ordered Apple to create a custom version of iOS to disable security features that would erase the device after ten failed passcode attempts. Apple refused. What followed was the most public confrontation between Silicon Valley and law enforcement over encryption — a conflict that ended not with legal resolution, but with the FBI's sudden announcement that it had accessed the phone through a third-party vendor.

$1.3M
Estimated payment to access one iPhone
41
Days from court order to case withdrawal
40+
Amicus briefs filed in the case
0
New leads discovered on the device

The Attack and the Locked Phone

On December 2, 2015, at approximately 11:00 AM, Syed Rizwan Farook and his wife Tashfeen Malik entered the Inland Regional Center in San Bernardino, California, armed with semi-automatic rifles and handguns. They opened fire on Farook's colleagues from the San Bernardino County Department of Public Health who had gathered for a training event and holiday party. Within minutes, they killed 14 people and seriously injured 22 others. The couple fled in a black SUV, leading police on a pursuit that ended in a shootout in which both attackers were killed.

The FBI immediately took control of the investigation, designating it an act of terrorism. Investigators determined that Farook, a U.S.-born citizen of Pakistani descent, and Malik, a Pakistani-born lawful permanent resident, had been radicalized and inspired by ISIS propaganda, though no evidence emerged of direct contact with the terrorist organization. Malik had posted a pledge of allegiance to ISIS leader Abu Bakr al-Baghdadi on Facebook during the attack.

14
People killed. The San Bernardino attack was the deadliest terrorist attack on U.S. soil since September 11, 2001, until the Pulse nightclub shooting six months later.

Investigators recovered multiple firearms, thousands of rounds of ammunition, and materials for pipe bombs from the couple's residence and a storage unit. Crucially, they discovered that Farook had physically destroyed his personal cell phone, crushing it and disposing of it in a manner suggesting he wanted to ensure the data could never be recovered. This act of deliberate destruction suggested the personal phone might have contained incriminating evidence.

However, investigators did recover Farook's work phone: an iPhone 5C running iOS 9 that had been issued by San Bernardino County. The device was protected by a four-digit passcode and Apple's full-disk encryption, a security architecture Apple had introduced with iOS 8 in September 2014. Under this system, the encryption key was mathematically derived from the user's passcode, and Apple itself did not possess the ability to decrypt the device. The phone had been in Farook's possession until shortly before the attack, creating the possibility it might contain evidence of planning, co-conspirators, or contact with terrorist organizations.

The Encryption Architecture

Apple's decision to implement warrant-proof encryption in iOS 8 represented a fundamental shift in the company's approach to law enforcement requests. Prior to iOS 8, Apple could extract data from locked iPhones when presented with a valid search warrant, and the company had complied with thousands of such requests. But iOS 8 changed the architecture: encryption keys were now derived from the user's passcode, entangled with a hardware key unique to each device, and those keys existed only on the device itself, never on Apple's servers.
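Apple's published security documentation describes this derivation as "entangling" the passcode with a device-unique hardware identifier through many iterations, so the resulting key can be computed only on the device itself. The sketch below illustrates that idea in Python; it is not Apple's actual algorithm, and the `DEVICE_UID` value and iteration count are invented for the example.

```python
import hashlib

# Hypothetical device-unique key fused into hardware at manufacture.
# On a real iPhone this identifier is not readable by software, so the
# derivation below could only ever run on the device itself.
DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_encryption_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Derive a data-encryption key from a passcode, entangled with the
    hardware identifier. Illustrative only; Apple's real scheme differs."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),  # the user's secret
        DEVICE_UID,         # device-unique salt: key is useless off-device
        iterations,         # iteration count sets the cost of each guess
    )

key = derive_encryption_key("1234")
# A different passcode (or a different device) yields an unrelated key:
assert key != derive_encryption_key("1235")
```

Because no copy of the key ever exists on Apple's servers, handing over server-side data cannot decrypt the phone; the only path to the data runs through guessing the passcode on the device.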

This meant that even if Apple wanted to access a locked iPhone running iOS 8 or later, it could not. The company had designed systems where it was technically incapable of complying with government demands to decrypt devices. Apple CEO Tim Cook positioned this as a privacy feature responding to customer concerns following Edward Snowden's revelations about NSA surveillance programs that had involved cooperation from technology companies including Apple.

"Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8."

Apple Inc. — iOS 8 Privacy Statement, September 2014

Farook's iPhone 5C included several security features that complicated FBI access attempts. The device would accept passcode guesses, but after six failed attempts, it would impose increasing time delays before accepting another attempt. Additionally, the device had the "Erase Data" feature enabled, which would automatically destroy the encryption keys after ten failed passcode attempts, rendering all data permanently inaccessible.

A four-digit passcode has 10,000 possible combinations (0000 through 9999). With unlimited attempts and no time delays, the FBI could brute-force the passcode in hours or days. But the auto-erase function created a hard limit: investigators had just ten attempts before the data would be destroyed forever. The FBI claimed it had already used several attempts before realizing the auto-erase function was enabled, leaving even fewer remaining.
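The arithmetic can be made concrete. A quick worked calculation (the per-attempt timings are assumptions for illustration; the 80 ms figure is the order of magnitude Apple's iOS security documentation cited for on-device key derivation):

```python
# Worked numbers behind the brute-force estimate.
keyspace = 10 ** 4                         # 0000 through 9999
assert keyspace == 10_000

# Manual touchscreen entry, assuming ~5 seconds per attempt:
hours_manual = keyspace * 5 / 3600
print(f"Manual entry: ~{hours_manual:.0f} hours worst case")       # ~14 hours

# Electronic submission with delays disabled, ~80 ms per attempt:
minutes_electronic = keyspace * 0.080 / 60
print(f"Electronic entry: ~{minutes_electronic:.0f} minutes")      # ~13 minutes

# But with auto-erase on, the search hard-stops after ten failures:
print(f"Auto-erase permits testing {10 / keyspace:.1%} of the keyspace")
```

The last line is the crux of the FBI's problem: with auto-erase enabled, investigators could cover only a tenth of one percent of the keyspace before the encryption keys were destroyed.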

The Court Order

The FBI sought and received a search warrant for the iPhone from Magistrate Judge Sheri Pym on December 15, 2015. However, the warrant was useless without the ability to access the encrypted contents. FBI technicians explored various options, including attempting to guess the passcode (too risky given the auto-erase function), copying the flash memory chip (prevented by hardware-level encryption), and requesting data from Apple's iCloud backup service (which yielded data only through October 19, 2015, leaving a six-week gap).

On February 9, 2016, the FBI requested that Judge Pym issue an order under the All Writs Act of 1789 compelling Apple to assist in accessing the device. The All Writs Act grants federal courts authority to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." The statute had been used to compel telephone companies to assist with wiretaps and other forms of technical cooperation, most notably in United States v. New York Telephone Co. (1977).

On February 16, 2016, Judge Pym issued the order. It specified that Apple should provide "reasonable technical assistance" by creating a custom version of iOS that would:

  • Disable the auto-erase function that would destroy data after ten failed passcode attempts
  • Disable the feature that imposes increasing time delays between passcode attempts
  • Allow passcode attempts to be submitted electronically through a physical connection or wireless communication, rather than requiring manual touchscreen entry
  • Run only on Farook's specific device, identified by its unique identifier (IMEI number)
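Taken together, the first three modifications would turn the passcode screen into an oracle accepting unlimited, rapid, programmatic guesses. The toy model below makes that concrete; all class and method names are invented for illustration, and on a real device these protections are enforced in hardware and firmware, not application code.

```python
import time

class PasscodeLock:
    """Toy model of iOS 9 passcode protections and the toggles the
    court order asked Apple to flip. Purely illustrative."""

    def __init__(self, passcode, auto_erase=True, delays=True):
        self._passcode = passcode
        self.auto_erase = auto_erase   # wipe keys after 10 failures
        self.delays = delays           # escalating wait between tries
        self.failures = 0
        self.wiped = False

    def try_passcode(self, guess):
        if self.wiped:
            raise RuntimeError("keys destroyed; data unrecoverable")
        if self.delays and self.failures >= 6:
            time.sleep(0)  # stand-in for the real minutes-to-hours delay
        if guess == self._passcode:
            return True
        self.failures += 1
        if self.auto_erase and self.failures >= 10:
            self.wiped = True
        return False

# The FBI-requested build: both protections off, guesses submitted
# programmatically rather than typed on the touchscreen.
lock = PasscodeLock("9711", auto_erase=False, delays=False)
found = next(
    code for code in (f"{n:04d}" for n in range(10_000))
    if lock.try_passcode(code)
)
print(found)  # -> 9711: the full 4-digit space falls in seconds
```

With the default protections left on, the same loop would raise a `RuntimeError` after ten wrong guesses, which is why the order had to demand all three changes at once.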
227 Years
Age of the All Writs Act. To compel a technology company to write new software, the FBI invoked a law from 1789, older than the telegraph, the telephone, and the computer.

The order acknowledged that complying would be burdensome but found the burden reasonable given the government's compelling interest in investigating a terrorist attack that killed 14 people. Judge Pym gave Apple five business days to respond if it believed the order was "unreasonably burdensome."

Apple's Public Refusal

Hours after the order was issued, Tim Cook made a decision that transformed a legal dispute into a national debate: rather than negotiate privately with the FBI or quietly comply, he published an open letter to Apple customers on the company's website titled "A Message to Our Customers."

The letter characterized the FBI's demand as unprecedented and dangerous. Cook argued that the government was asking Apple to create "the software equivalent of cancer" — a backdoored version of iOS that, once created, could be used repeatedly on other devices or fall into the wrong hands. He wrote:

"The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe."

Tim Cook — A Message to Our Customers, Apple.com, February 16, 2016

Cook framed the issue not as Apple versus the FBI, but as a fundamental question about digital security and civil liberties in the modern era. He argued that once the custom software existed, there would be no way to control its use. The FBI claimed the software would work only on Farook's device, but Cook countered that the techniques could be replicated and applied to other devices. He noted that law enforcement agencies across the country had already filed requests for similar assistance in dozens of other cases.

The decision to go public was deliberate and strategic. Apple's executives calculated that the technical and legal arguments were strong but that the political narrative favored the government: a terrorist attack, dead victims, a lawful search warrant, a single locked phone that might contain evidence. By publishing the letter immediately, Apple sought to reframe the debate around security and precedent before the FBI could control the narrative.

The Battle for Public Opinion

The case immediately became a political and cultural flashpoint. Polls showed Americans divided along lines that did not follow typical partisan patterns. A Pew Research Center poll conducted in February 2016 found that 51% of Americans believed Apple should unlock the iPhone to assist the FBI investigation, while 38% said Apple should not. Support for unlocking the phone was higher among older Americans and those less familiar with encryption technology.

FBI Director James Comey published his own response on the Lawfare blog on February 21, 2016, pushing back against Cook's characterization. Comey insisted the request was narrow and limited:

"We simply want the chance, with a search warrant, to try to guess the terrorist's passcode without the phone essentially self-destructing and without it taking a decade to guess correctly. That's it. We don't want to break anyone's encryption or set a master key loose on the land."

James Comey — We Could Not Look the Survivors in the Eye if We Did Not Follow this Lead, Lawfare Blog, February 21, 2016

Comey invoked the victims, noting that he had met with family members who desperately wanted answers about why their loved ones died. He characterized Apple's resistance as a business decision motivated by marketing considerations rather than genuine principle. He denied that the case would set a precedent, arguing that each case would be evaluated individually by courts.

However, the FBI's narrow framing began to collapse when it emerged that the San Bernardino case was not unique. Prosecutors in New York revealed they had been seeking a similar order to compel Apple to unlock an iPhone in a drug case. Manhattan District Attorney Cyrus Vance Jr. testified that his office alone had 175 encrypted iPhones it could not access. It became clear that law enforcement agencies viewed San Bernardino as a test case — an opportunity to establish legal precedent in circumstances where public sympathy would favor the government.

The Technology Industry Responds

Apple's decision to fight the order rather than quietly comply forced other technology companies to take public positions. The risk was considerable: appearing to side against law enforcement in a terrorism investigation invited political backlash and could damage relationships with government customers.

Nevertheless, within days, a broad coalition of technology companies filed amicus briefs supporting Apple. On March 3, 2016, Microsoft, Google, Facebook, Amazon, Dropbox, Evernote, and others filed a joint brief arguing that the government's interpretation of the All Writs Act would "fundamentally alter the relationship between technology companies and the government."

40+
Amicus briefs filed. Organizations ranging from the ACLU to the Cato Institute to the United Nations Special Rapporteur on Freedom of Expression submitted briefs, demonstrating the case's global significance.

The companies argued they had legitimate business interests in maintaining customer trust, particularly in international markets where revelations about U.S. surveillance programs had already damaged their reputations. They noted that many foreign governments would demand similar assistance if Apple complied with the FBI's order, potentially forcing the company to choose between conflicting legal obligations or to exit markets entirely.

Technology executives were remarkably unified in their messaging. Microsoft President Brad Smith testified before Congress that "what is at stake here is our fundamental ability to protect our customers' most private information." Google CEO Sundar Pichai posted on Twitter that requiring companies to enable hacking "could be a troubling precedent." Even executives who privately expressed frustration with Apple's aggressive public stance recognized that the legal precedent threatened their own products and services.

The coordinated response represented a significant shift in Silicon Valley's relationship with government. While technology companies had often cooperated quietly with law enforcement requests, the Snowden revelations and changing customer expectations had made such cooperation politically and commercially risky. The Apple case became a line companies felt they could not cross without undermining their business models and values.

The Security Community Weighs In

Perhaps the most damaging testimony for the FBI's position came from the computer security community. On March 3, 2016, cryptographer and security expert Bruce Schneier submitted a declaration in support of Apple's motion to vacate the order. Schneier, widely recognized as one of the world's leading security experts, provided a detailed technical analysis of why the FBI's request was not narrow or limited.

Feature (Standard iOS 9 → FBI-Requested Version)
  • Auto-erase after 10 failed attempts: Enabled → Disabled
  • Time delays between attempts: Up to 1 hour → Removed
  • Electronic passcode submission: Not allowed → Allowed
  • Brute-force attack time (4-digit code): Up to 5.5 years → Under 30 minutes

Schneier explained that the auto-erase function and passcode delays were not arbitrary obstacles but essential security features designed to protect against precisely the kind of brute-force attack the FBI wanted to conduct. He stated unequivocally: "There is no way to build a back door for just the good guys."

Security researchers emphasized that once Apple created the custom software, it would become a target for theft, leakage, or exploitation by foreign intelligence services, criminal organizations, and sophisticated hackers. Even if the software was designed to work only on Farook's specific device, the techniques and source code could be reverse-engineered and adapted to work on other iPhones. The FBI's request created what security professionals call a "vulnerability equity" problem: the creation of a security weakness that could be exploited by adversaries.

Fifteen of the world's most prominent cryptographers and computer scientists — including Whitfield Diffie and Ronald Rivest, inventors of foundational encryption technologies — had published a report titled "Keys Under Doormats" summarizing years of academic research on why secure backdoors are technically infeasible. The report concluded: "The government's demand would also set a dangerous precedent that would weaken the security and undermine the trust in products created by American companies."

The Case Collapses

On March 21, 2016, the day before the scheduled hearing on Apple's motion, the Department of Justice filed an emergency motion requesting a delay. The filing stated that "an outside party demonstrated to the FBI a possible method for unlocking Farook's iPhone" and that the FBI needed time to test whether the method would work without destroying data.

The announcement stunned both Apple and the public. For weeks, FBI Director Comey had insisted that only Apple could access the device, that there were no alternative methods, and that the court order was absolutely necessary for the investigation. Now, abruptly, the FBI claimed a third party had provided a solution.

On March 28, 2016, the FBI filed a status report stating it had "successfully accessed the data stored on Farook's iPhone and therefore no longer requires the assistance from Apple Inc." The Department of Justice requested that the court vacate the February 16 order, which Judge Pym granted the same day. The case ended without any judicial ruling on the merits of Apple's objections or the scope of the All Writs Act.

$1.3M
Estimated payment to unlock the phone. FBI Director James Comey stated in April 2016 that the bureau paid more than $900,000 for the solution, later reported as potentially $1.3 million — far more than Comey's annual salary.

The FBI never officially disclosed the method used to access the phone or the identity of the vendor who provided it. However, multiple news reports identified Cellebrite, an Israeli digital forensics company with existing FBI contracts, as the third party. Cellebrite specialized in mobile device extraction and had demonstrated capabilities to bypass security features on various phone models.

Reports indicated that the method exploited a vulnerability in iOS 9 that allowed unlimited passcode attempts without triggering the auto-erase function, combined with a technique for rapidly testing passcodes. The FBI classified the method to prevent Apple from learning about and patching the vulnerability, ensuring the technique could be used on other devices running the same iOS version.

What the Phone Revealed

In April 2016, FBI officials testified before Congress about what investigators found on Farook's iPhone. The answer was: virtually nothing of investigative value. The phone contained no evidence of contact with foreign terrorist organizations, no communications with previously unknown co-conspirators, and no information about other planned attacks.

This revelation was devastating to the FBI's narrative. The bureau had argued that accessing the phone was essential to the investigation, that it might prevent future attacks, and that the potential intelligence value justified compelling Apple to create unprecedented software. Instead, the phone confirmed what investigators had already established through other means: Farook and Malik had been inspired by ISIS propaganda but appeared to have acted independently without direction from foreign terrorist organizations.

The empty results raised troubling questions about the FBI's decision-making. Critics argued that the bureau had always known the work phone was unlikely to contain valuable evidence — Farook had destroyed his personal phone, suggesting that was where incriminating information had been stored. They suggested the FBI had used the San Bernardino tragedy as a pretext to establish legal precedent that would make it easier to compel technical assistance in future cases.

"The San Bernardino case was an opportunity for the FBI to establish a precedent. The fact that they found nothing on the phone and that they already had an alternative method suggests this was never really about solving one case — it was about changing the rules for all cases."

Christopher Soghoian — Principal Technologist, ACLU, Congressional Testimony, May 2016

FBI officials denied this characterization, insisting they had pursued the court order in good faith and that the alternative method became available only late in the process. However, documents later obtained through Freedom of Information Act requests showed that the FBI had been in contact with multiple vendors about potential iPhone access methods before seeking the court order, raising questions about whether alternatives had been adequately explored before pursuing the legal confrontation with Apple.

The Encryption Debate Continues

The case ended without resolving any of the fundamental legal questions it raised. Courts never ruled on whether the All Writs Act authorizes compelling companies to create new software, whether such compulsion violates the First Amendment's protection of code as speech, or how to balance law enforcement needs against security and privacy interests in the encryption era.

The FBI's sudden withdrawal eliminated the precedent the bureau had sought. But it did not eliminate the underlying tension between strong encryption and law enforcement access. In the years following San Bernardino, similar disputes arose repeatedly. In 2019, following a shooting at Naval Air Station Pensacola in which the FBI again sought Apple's assistance unlocking iPhones, Attorney General William Barr publicly criticized Apple for providing "no help" despite the company having provided gigabytes of iCloud data and technical assistance short of creating a backdoor.

Congress held multiple hearings on encryption in 2016 and subsequent years but never passed legislation addressing the issue. Proposed bills requiring encryption backdoors — often euphemistically termed "lawful access" or "exceptional access" — failed to advance, opposed by technology companies, civil liberties organizations, and security researchers who argued such requirements would undermine security and disadvantage U.S. companies globally.

International developments complicated the landscape further. Following the San Bernardino case, governments including the United Kingdom, Australia, India, and Pakistan introduced or strengthened laws requiring companies to provide technical assistance to decrypt communications and devices. The Australian government passed the Telecommunications and Other Legislation Amendment (Assistance and Access) Act in 2018, which technology companies and security researchers warned could require the creation of systemic weaknesses. The global proliferation of such requirements created the exact scenario security experts had warned about: a race to the bottom in which the weakest security standard would apply globally.

The Vulnerability Ecosystem

The San Bernardino case exposed a thriving marketplace for mobile device exploits. The FBI's payment of $900,000 to $1.3 million for a single iPhone unlock demonstrated that sophisticated adversaries — whether government agencies or criminal organizations — could purchase access capabilities even when manufacturers attempted to design secure systems.

Companies like Cellebrite, Grayshift (creator of the GrayKey device), and others marketed iPhone and Android unlocking services to law enforcement agencies worldwide. These companies employed security researchers to discover vulnerabilities in mobile operating systems, then developed commercial products exploiting those vulnerabilities before manufacturers could patch them. The business model depended on keeping vulnerabilities secret rather than disclosing them to manufacturers for repair.

70+
Pending cases. At the time of the San Bernardino case, at least 70 other cases nationwide involved law enforcement seeking similar orders compelling Apple to unlock iPhones, demonstrating the issue's scope beyond a single terrorism investigation.

This market created perverse incentives. Law enforcement agencies willing to pay premium prices incentivized researchers to find vulnerabilities and sell them privately rather than disclose them responsibly to manufacturers. Foreign governments, including authoritarian regimes with poor human rights records, purchased the same technologies, using them to target journalists, activists, and political dissidents. The tools sold to unlock one terrorist's phone in California could be used to identify and arrest pro-democracy protesters in Hong Kong or journalists investigating corruption in Mexico.

Security researchers noted the irony: the FBI's insistence that it needed Apple to create a backdoor was undercut by its subsequent purchase of a vulnerability-based exploit. The existence of such exploits suggested that the "Going Dark" narrative — the FBI's term for the supposed inability to access encrypted communications and devices — was overstated. As long as software contained vulnerabilities (and all complex software contains vulnerabilities), law enforcement agencies with sufficient resources could purchase access.

The Political Economy of Encryption

The Apple-FBI case illuminated how encryption had become entangled with broader questions about technology companies' power, business models, and relationship with government. Apple's stance on encryption was simultaneously a matter of engineering philosophy, business strategy, and political positioning.

For Apple, privacy and security were product differentiators. Unlike Google and Facebook, whose business models depended on collecting and analyzing user data for advertising purposes, Apple earned revenue primarily from hardware sales and services. The company could credibly position itself as protecting customer privacy because it had no business interest in accessing user data. Strong encryption enhanced Apple's brand among privacy-conscious customers and provided a competitive advantage in markets concerned about U.S. government surveillance.

This business alignment did not make Apple's position unprincipled, but it meant the company's interests aligned with its stated values in ways that were not true for all technology companies. Google, Facebook, and Microsoft faced different tradeoffs: they needed to collect substantial user data for their core businesses while also reassuring customers about privacy and security. Their support for Apple in the case reflected genuine concerns about precedent and government overreach, but also anxiety about how backdoor requirements might affect their own services.

Law enforcement agencies viewed this dynamic with frustration. They argued that technology companies had unilaterally decided to make entire categories of evidence inaccessible to lawful investigations, prioritizing business interests over public safety. FBI Director Comey repeatedly emphasized that he was not asking for new surveillance capabilities but merely seeking to preserve access that had historically existed — the ability to execute search warrants on physical devices.

The Legacy and Unresolved Questions

Seven years after the San Bernardino case, the fundamental tensions it exposed remain unresolved. Apple has continued strengthening iPhone security with each iOS release, making device access increasingly difficult and expensive even for well-funded adversaries. The company introduced Advanced Data Protection in 2022, extending end-to-end encryption to iCloud backups — closing the gap that had previously allowed law enforcement to obtain some data from Apple's servers with a court order.

Law enforcement agencies continue to advocate for encryption backdoors or "lawful access" mechanisms, arguing that strong encryption creates spaces beyond the reach of lawful authority where criminals and terrorists can operate with impunity. Prosecutors cite cases where encrypted devices prevented them from accessing evidence of child exploitation, murder, and drug trafficking.

Security researchers and civil liberties advocates maintain their opposition to backdoors, arguing that any vulnerability created for lawful access will inevitably be exploited by malicious actors. They point to the repeated theft of government hacking tools, including the NSA's EternalBlue exploit that was stolen and repurposed for the WannaCry and NotPetya malware attacks that caused billions of dollars in damage globally.

The case established no legal precedent because it ended without a judicial ruling. The All Writs Act's scope remains undefined. Whether companies can be compelled to write code remains an open question. Whether such compulsion constitutes compelled speech in violation of the First Amendment has not been adjudicated. The constitutional and statutory framework governing encryption in the modern era remains unsettled.

What the case did establish was that the encryption debate cannot be resolved through simple appeals to either security or privacy. Both values are essential; both are contested; both require tradeoffs. The San Bernardino case forced public confrontation with questions that technological change had made urgent: What information should be accessible to government? Under what circumstances? Subject to what oversight? And who decides?

These questions have only become more complex as encryption has expanded from smartphones to messaging apps, cloud storage, vehicles, medical devices, and Internet of Things products. Each new application raises the same fundamental tension between security-through-encryption and access-for-lawful-authority.

The case's most lasting impact may be its demonstration that these tensions cannot be resolved through litigation alone. The FBI obtained access through purchased exploits rather than legal compulsion. Apple strengthened security through engineering rather than waiting for regulatory requirements. The outcome suggested that the encryption wars would be fought not primarily in courts or legislatures but through technology itself — an arms race between security designers and those seeking to circumvent security, whether for lawful or malicious purposes.

In that arms race, the San Bernardino case was a single battle, inconclusive and unresolved, in a conflict that continues to shape the architecture of digital systems we all depend on.

Primary Sources
[1] Federal Bureau of Investigation — Official Statement on San Bernardino Investigation, December 2015
[2] United States District Court, Central District of California — Case No. ED 15-0451M, Order Compelling Apple Inc. to Assist Agents in Search, February 16, 2016
[3] Tim Cook — A Message to Our Customers, Apple.com, February 16, 2016
[4] James Comey — We Could Not Look the Survivors in the Eye if We Did Not Follow this Lead, Lawfare Blog, February 21, 2016
[5] Apple Inc. — iOS 8 Security White Paper, September 2014
[6] U.S. Department of Justice — Status Report, Case No. ED 15-0451M, March 28, 2016
[7] Bruce Schneier — Declaration in Support of Apple's Motion to Vacate Order, Case No. 5:16-cm-00010-SP, March 3, 2016
[8] Microsoft, Google, Facebook, et al. — Joint Amicus Brief, Case No. 5:16-cm-00010-SP, March 3, 2016
[9] Cyrus Farivar — Ars Technica, 'FBI's Refusal to Unlock Shooter's iPhone Shows Need for Encryption Backdoors,' March 2016
[10] Whitfield Diffie, Ronald Rivest, et al. — Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications, Journal of Cybersecurity, July 2015
[11] James Comey — Remarks at the Aspen Security Forum on San Bernardino Case, April 21, 2016
[12] Washington Post — FBI Paid More Than $900,000 to Unlock San Bernardino iPhone, April 2016
[13] Pew Research Center — More Support for Justice Dept. Than for Apple in Dispute Over iPhone, February 2016
[14] Salihin Kondoker — Letter to Judge Pym Supporting Apple, March 3, 2016
[15] Christopher Soghoian — Congressional Testimony Before House Judiciary Committee, May 2016
METHODOLOGY & LEGAL NOTE
This investigation is based exclusively on primary sources cited within the article: court records, government documents, official filings, peer-reviewed research, and named expert testimony. Red String is an independent investigative publication. Corrections: [email protected]