Real-world data breaches have shown that the devil is in the detail. Systems are often exploited through a small opening that was overlooked in favour of building extensive defences around the front door, or because only one large-scale solution had been deployed.

Continuous monitoring practices, as recommended by the National Institute of Standards and Technology (NIST), should bring quality as well as quantity to an organisation’s security. Visibility in an age when everything is connected is essential. However, security policies must ensure not only the extended coverage that full enterprise consoles offer, but also the level of detail and layered security that only the deployment of multiple, tailored tools can bring.

Automation, when executed correctly, offers the freedom to take the important decisions that can make or break an organisation. The amount of available data is such that security analysts cannot respond to every alarm; consistent evaluation and a clear understanding of risk factors are the only way to navigate the threat landscape appropriately.

The following hypothetical scenario illustrates how security teams bear the added responsibility of responding quickly, as well as correctly, to the data they receive through regular audits and assessments.

It was a quiet evening at a renowned bank in the city’s financial district. The lights were off save for a few in the IT department. The Network Operations Centre was running the newly installed enterprise monitoring tool, which had been approved after tense negotiations with the board. The chief information security officer (CISO) had argued that it would offer the team better visibility across the enterprise network. The board had not been easy to convince, but the team eventually won approval for the system. For now, they were putting in a late night’s work in preparation for tomorrow morning’s assessment with the external pen testers.

Oct-Dec 2015 Issue