Nashville Christmas Bombing: Lessons Learned

Early on Christmas morning 2020, a man detonated a bomb in a tourist district of downtown Nashville after broadcasting a warning telling the neighborhood it had 30 minutes to evacuate. Along the same street sat a critical piece of network infrastructure – owned by AT&T – routing telecommunications and internet service across the Southeast United States. The blast severed connections to commercial power and irreparably damaged three generators tasked with supporting most of the facility, including the floor hosting 9-1-1 infrastructure. Telecommunications service was not immediately impacted, as the facility fell back on uninterruptible power supply (UPS) battery power.

In his presentation, Director Stephen P. Martini summarized the impact to his department that morning and subsequent days, the challenges they faced and worked to overcome, and the continuity conversations they are engaged in for the future.

Martini said that as he watched the news coverage from his home minutes after the explosion, it wasn’t immediately apparent that the blast would become an infrastructure emergency. At the ECC, phones rang, internet connections delivered data. The center’s telecommunicators had assisted in setting up a perimeter around the downtown area before the explosion and focused after the explosion on supporting the officers on the street and sending firefighters and medics. They even confirmed with AT&T that alternate routing solutions remained intact if power to the block was eventually turned off while buildings were inspected. It appeared to be a de-escalating incident.

Shortly after he arrived at the center, however, the UPS batteries at the AT&T facility started to fail. The batteries attempted to pass the load to the existing generators, but those had been rendered inoperable. This set in motion a domino effect of failures impacting 9-1-1, business and internet connectivity across seven states for 72 hours.

Around 11 a.m., the ECC noticed intermittent 9-1-1 connectivity with callers reporting they received a busy signal or message saying their call couldn’t be completed. The tech team quickly routed 9-1-1 traffic to their CAMA trunks. Martini believes this single act was the difference between Metro Nashville retaining 9-1-1 service while their neighbors did not, especially since remote connectivity to the AT&T facility was lost when the power failed, preventing neighboring counties from taking similar action.

Sixty-six ECCs were impacted by the loss of power caused by the blast. More than 100 ECCs in Tennessee, Kentucky, Alabama and Georgia lost ANI (automatic number identification) and ALI (automatic location identification) information. To counter this loss, ECCs throughout the Southeast United States used RapidSOS technology to identify callers' locations.

When fire at the downtown facility activated sprinklers and soaked critical equipment, a team of 50-100 engineers worked to power batteries up and down in phases to evaporate the water safely and restore equipment. This created operational confusion around the region, as service would return for stretches while power was on and then disappear again when the batteries were powered down.

Nashville also lost commercial phone service, including the center’s administrative lines hosting their alarm line queue and administrative non-emergency number. Public safety telecommunicators used older cell phones, provided by a different carrier and pre-positioned at each console, to make outbound calls, confirming that carrier diversity is an absolute must in events like these.

Through Christmas Day, Saturday, Dec. 26, and into the early morning of Sunday, Dec. 27, administrators and technology specialists across three agencies and two counties explored various options to restore connectivity to the non-emergency numbers. They publicized the Hub.Nashville.gov site and app – a web-based alternative for submitting non-emergency requests – but other solutions proved unattainable within the time available.

By 7:30 p.m. Sunday night, the administrative lines were functioning normally and, by Tuesday, Dec. 29, at 3 p.m., 9-1-1 lines had returned to their SIP trunks.

During and following the event, discussions with neighboring counties and internal IT professionals emphasized the need for interagency coordination and carrier redundancy. But, Martini said, no single solution would have prevented the impact of this outage, which thrust the ECC into the backup to their backup plans with very little warning and only a patchwork of layered redundancy. He emphasized the importance of Continuity of Operations Plans and of identifying opportunities for redundancy to provide primary, alternate, contingency and emergency (PACE) levels of support for mission-critical operations.