Market Focus | 6 pages of original research
In-depth analysis from Gigamon, February 2016

DOUBLE VISION

Recognizing a breach in your network requires visibility into the traffic itself. SC Magazine asked its readers just how much visibility they have into their network activity. The answers might surprise you.



Double vision

Historically, companies believed that if you threw enough technology against the network perimeter, you could stop malware or intruders. That’s no longer the case. Today, visibility deep into network activity and analysis of network traffic can show breaches before serious damage is done. Stephen Lawton reports.

Despite the conventional wisdom among data security pundits that companies always should consider their networks breached, the percentage of companies that acknowledge they were victims of a targeted breach within the past year continues to be relatively small, according to a recent survey conducted by SC Magazine and sponsored by Gigamon.

In the survey of 294 security professionals, only 23 percent say they knew they had been the victim of a targeted attack or advanced persistent threat. A full third of respondents say they are unsure, leaving nearly half, some 43 percent, who say they did not suffer such a breach. In fact, only 21 percent of those responding to the survey indicate they assume attackers have infiltrated their network. But perhaps that number is skewed because the security team might not have recognized the breach yet – or maybe they do not consider a malware breach to be a targeted attack.

This surprises Johnnie Konstantas, director of security solutions marketing and business development at Gigamon, who notes that “the assumption of compromise is quite low.” While six in 10 respondents say their networks are safe, Konstantas says these executives need to reconsider their position. She notes that there are multiple statistics from vendor and industry surveys that state “nearly 97 percent of networks have been breached.”

Of those who acknowledge in this survey that they were breached, nearly 70 percent say they were able to identify the breach in less than one week. Another 10 percent do not know how long it took to identify the breach, so it is possible the attackers had been in the network for quite some time. Just two respondents acknowledge that the attack had been going on for more than six months before it was detected.

While the initial malware attack might well have been detected quickly, Konstantas says, it often is too late because many new network breach points can germinate from the first. Often, she notes, once malware gets embedded it calls out to the command-and-control servers for additional malware. It also has a tendency to make changes in infected systems to hide its actions, so it can be extremely difficult to fully remediate an infected system.

How would you define your organization’s general approach towards security?
- A layered approach to security: 60%
- Latest methods of detection and expulsion: 21%
- Heuristics-based defensive measures: 19%

Konstantas notes that in some of the high-profile attacks in recent years – Target, Sony, Anthem and Ashley Madison – the attackers were in the victims’ networks for an extended period of time; in the case of Ashley Madison, she says, it was years. Often when companies are attacked they find out well after the incursion has had time to settle in and metastasize into a much larger breach. When malware spreads, it takes a long time to find all the locations where it was deposited, she warns.

Even if a company says it has cleaned its servers and mitigated the attack, the penetration might still be ongoing through malware that the company has yet to identify. “It is very unlikely that you can fully remediate [the breach],” she says. “For that reason, you need visibility to the traffic inside the network perpetually to make sure that once you’ve discovered malware that you’re on top of finding all the places that it may have spread.”

Konstantas compares network monitoring after a breach to consumers who monitor their credit score after an identity theft incident. Even though the problem might not show up immediately, it might well do so later. Constant vigilance of network activity is the best way to determine if any semblance of the attack still resides on the network.

Some 28 percent of the respondents to the SC Magazine survey say that social engineering played a role in the attacker being able to infiltrate a corporate network. There is only so much a victim can do about that, Konstantas says, because social engineering implies that someone was tricked into allowing the breach, perhaps through a cleverly disguised email or perhaps a phone call.

A similar number of respondents, 29 percent, say advanced malware that somehow bypassed network defenses was responsible for the breach. However, 34 percent say a combination was responsible: both a network vulnerability that might have been identified using network visibility and a person who was tricked into allowing the access. None of the defenses that a company might have in place on the perimeter will stop an attack if the user opts in to being infected, she adds.

Another 24 percent of respondents indicate that a software vulnerability, such as an unpatched application, was at fault. When taken together, Konstantas says, this clearly shows why nearly 100 percent of companies are vulnerable to a breach. (Because some respondents had more than one reason for a breach, the totals in the survey add up to more than 100 percent.)

Network visibility

Richard A. Clarke, chairman and CEO of the Arlington, Va.-based Good Harbor Consulting, is just one of many security professionals who believe companies need better visibility into their networks to find breaches. The former chairman of the federal government’s Counterterrorism Security Group, a former member of the National Security Council and the top cybersecurity adviser to three presidents has been quoted frequently as saying there are two kinds of companies: those that know they have been breached and those that don’t know it yet. Being breached, Clarke says, should not be considered an anomaly but rather a consequence of being in business.

In terms of network-level traffic visibility, which would be the most helpful from a security perspective?
- Visibility into encrypted traffic: 30%
- Identification of applications requiring inspection: 20%
- Application session filtering: 19%
- Visibility to public cloud environments: 13%
- Visibility to private cloud environments: 10%
- Metadata extraction for forensics purposes: 7%

The key to identifying breaches is having visibility into the network, says Jai Balasubramaniyan, director of product management at Gigamon. He is not alone in that belief. At the recent Usenix Enigma conference in San Francisco, the National Security Agency’s Rob Joyce, head of the Tailored Access Operations (TAO) hacking team, underscored Balasubramaniyan’s assertion. According to The Register, Joyce told the attendees: “If you really want to protect your network you have to know your network, including all the devices and technology in it. In many cases, we [the NSA] know networks better than the people who designed and run them.”

When respondents to the SC Magazine survey were asked how they would use network visibility, they say the most useful tool from a security perspective is visibility into encrypted traffic for threat detection, with the second priority being identification of applications requiring targeted inspections.

Has your organization experienced a targeted attack or APT in the past 12 months?
- Yes: 23%
- No: 43%
- Unsure: 34%

If so, how long did it take you to detect it?
- Less than one week: 69%
- Don’t know: 15%
- 1-3 months: 7%
- Less than 1 month: 6%
- More than 6 months: 3%

If so, do you know how attackers were able to infiltrate your network?
- A combination of all: 34%
- Advanced malware: 29%
- Social engineering: 28%
- Software vulnerabilities: 24%
- Uncertain: 12%
- Other: 7%

If so and the attackers were able to exfiltrate data, how was it detected?
- Network visibility: 50%
- Forensic investigation: 28%
- Contacted by affected party: 15%
- Contacted by 3rd party: 4%
- Contacted by attackers: 3%

Some totals are greater than 100 percent due to multiple answers by respondents.

Balasubramaniyan says he is not surprised that survey respondents say they currently make network visibility a top priority—right up there with traditional security techniques. Eighty-three percent of respondents say they use firewalls – network, application or both – and 77 percent say that they use intrusion prevention and detection systems (IPS/IDS) in their networks. In third place with 67 percent was network visibility. He says that means companies are investing in security beyond the traditional perimeter. They are recognizing that network visibility can help them proactively find problems before they become serious.

In addition to the 67 percent level for network visibility, there were other indicators of a shifting emphasis to securing networks from the inside: 66 percent say they use anti-malware products and 64 percent have wireless security products. Among the other common components of corporate network security, in descending order, were endpoint or file and folder encryption (61 percent), SIEM (61 percent), email encryption (56 percent), mobile security solutions (53 percent), network access control (NAC) for endpoints (53 percent), identity and access management (48 percent), virtualization and cloud security (44 percent), unified threat management appliances (33 percent), and server-side NAC (30 percent).

Traditional network security hardware concentrated at the perimeter might flag an ongoing attack during the data exfiltration process, he says. “Ideally though you want to catch it sooner, at the earlier stages of the kill chain during, say, lateral movement.” Using technologies that can analyze network traffic and network information can give firms a chance to spot anomalous patterns before the breach has resulted in data theft.

Network visibility is key to identifying actions, such as large amounts of data leaving the network at an unusual time, or perhaps from a user who generally does not create such large transfers, Balasubramaniyan adds. While monitoring network traffic is hardly a new or unique technology, it is an excellent example of how visibility works. If network traffic suddenly increases at 3 a.m. to an unusual server in another country where the company does not do business, there is a good chance that the transfer is unwarranted or, worse, consistent with a breach. When other actions similar to this take place in a network that has a visibility fabric as well as advanced threat detection and network analytics in place, he continues, the chance of spotting the attack closer to the point of initial compromise goes up dramatically.
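The kind of anomaly described here can be approximated with a simple rule-based check. The sketch below is purely illustrative: the flow records, thresholds, country list and `suspicious` function are all invented for this example, not part of any product mentioned in the article.

```python
from datetime import datetime

# Hypothetical flow records: (user, UTC timestamp, destination country, bytes sent)
flows = [
    ("alice", datetime(2016, 2, 1, 14, 5), "US", 2_000_000),
    ("alice", datetime(2016, 2, 2, 3, 12), "XZ", 900_000_000),  # 3 a.m., unusual country
]

BUSINESS_COUNTRIES = {"US", "DE"}   # where the company actually does business (assumed)
BASELINE_BYTES = 50_000_000         # per-flow volume considered normal (assumed)
WORK_HOURS = range(7, 20)           # 07:00-19:59 counts as business hours (assumed)

def suspicious(user, ts, country, nbytes):
    """Return the list of indicators a flow trips: off-hours timing,
    an unusual destination country, or volume far above baseline."""
    reasons = []
    if ts.hour not in WORK_HOURS:
        reasons.append("off-hours")
    if country not in BUSINESS_COUNTRIES:
        reasons.append("unusual destination")
    if nbytes > BASELINE_BYTES:
        reasons.append("high volume")
    return reasons

for user, ts, country, nbytes in flows:
    reasons = suspicious(user, ts, country, nbytes)
    if len(reasons) >= 2:            # require two indicators to reduce false alarms
        print(f"ALERT {user} -> {country}: {', '.join(reasons)}")
```

Requiring two simultaneous indicators is one simple way to cut noise; real analytics platforms use statistical baselines per user and per host rather than fixed thresholds.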

“The attackers have to scan the network and move within it in order to find the information they want to steal. All that movement is security gold and the combination of pervasive visibility and analytics is the key to leveraging it with reliable results,” he says.

Perception versus reality

It is possible that survey respondents believe they have implemented network visibility everywhere that it is needed, Balasubramaniyan says, but the results of the survey demonstrate that this simply is not the case. However, “the definition of network visibility – how deep, how broad – varies,” so perhaps what one company describes as network visibility differs from others, he notes.

If the attackers attempted to exfiltrate data and were stopped, which solution(s) prevented it?
- Security gateway: 51%
- Data loss prevention: 46%
- Intrusion prevention system: 40%
- Next generation firewall: 34%
- Data was encrypted: 34%
- Sandbox: 10%

Security analytics can leverage logs and events, network data (i.e., packets), and network metadata (e.g., NetFlow/IPFIX, or Internet Protocol Flow Information Export). Rarely do organizations have the analytics facilities to leverage all three types of data, but that is what is required in today’s security architectures.

Log analysis requires you to examine the log files themselves, which commonly is done by security information and event management (SIEM) applications and systems. This type of analysis is very important in order to understand how a breach may have traversed the network and which were the touch points. Still, it is an analysis conducted after the fact, meaning some unwarranted action triggered the event or log, he continues.
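A toy version of this after-the-fact log analysis can show the idea: reconstruct which hosts an account touched, and flag a login that succeeds immediately after repeated failures. The log format, account names and two-failure threshold below are all invented for illustration; they are not how any particular SIEM works.

```python
import re
from collections import defaultdict

# Invented auth-log lines; a real SIEM ingests many formats from many sources.
logs = [
    "2016-02-01T02:14:07 host=db01 user=svc_backup event=login_failed",
    "2016-02-01T02:14:09 host=db01 user=svc_backup event=login_failed",
    "2016-02-01T02:14:12 host=db01 user=svc_backup event=login_ok",
    "2016-02-01T02:15:30 host=fileserv user=svc_backup event=login_ok",
]

LINE = re.compile(r"(?P<ts>\S+) host=(?P<host>\S+) user=(?P<user>\S+) event=(?P<event>\S+)")

def touch_points(lines):
    """Return (hosts each account logged in to, accounts that succeeded
    right after repeated failures -- a possible brute-force indicator)."""
    hosts = defaultdict(list)
    failures = defaultdict(int)
    flagged = set()
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue
        user, host, event = m["user"], m["host"], m["event"]
        if event == "login_failed":
            failures[user] += 1
        elif event == "login_ok":
            hosts[user].append(host)
            if failures[user] >= 2:      # success after two or more failures
                flagged.add(user)
            failures[user] = 0
    return dict(hosts), flagged

hosts, flagged = touch_points(logs)
```

Here the service account’s path (db01, then fileserv) is exactly the “touch points” trail the article describes, recovered only after the events have already happened.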

Analyzing full packets or network traffic is also a requirement at this point. The packets themselves might contain malware or portions of malware, so the analysis will surface the type of threat. Some SIEMs are able to consume both log and event information as well as full packets to conduct this type of analysis. Next generation firewalls, next generation IPS and advanced threat detection systems also analyze network traffic for this purpose, he adds. Still, with increasing network speeds and multi-stage applications, there is more network data to analyze than ever. This type of analysis can be extremely computationally expensive. Some organizations deploy network traffic analysis judiciously rather than on all traffic. Networks with visibility fabrics that forward data to security devices can be tuned so that only the traffic of interest is sent on for deep security inspection.

Yet another option is security analysis conducted on network metadata, or what is essentially summary information about the network traffic. Some experts describe network metadata and what it can reveal about security as akin to what a phone bill can tell you about a person. Network metadata like that contained in NetFlow/IPFIX records can be generated on routers or, better yet, by a visibility fabric, which can send it unsampled to security devices optimized to analyze it, Balasubramaniyan says. The key is to know whether you have one of those types of devices in your security stack.
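The phone-bill analogy can be made concrete: flow metadata records who talked to whom, when and how much, without ever inspecting payloads. Below is a minimal sketch over invented NetFlow-style records; the 5-tuple layout, addresses and watch-list ports are assumptions for the example, not a real IPFIX parser.

```python
from collections import Counter

# Invented flow records: (source IP, destination IP, destination port, bytes)
records = [
    ("10.0.0.5", "198.51.100.7", 443, 120_000),
    ("10.0.0.5", "198.51.100.7", 443, 95_000),
    ("10.0.0.9", "203.0.113.4", 6667, 4_000),   # IRC port, historically used for C2
]

WATCH_PORTS = {6667, 4444}   # ports often associated with command-and-control (assumed list)

def summarize(flows):
    """'Phone bill' view of traffic: total bytes per talker pair, plus any
    flows to ports on a command-and-control watch list."""
    volume = Counter()
    c2_hits = []
    for src, dst, port, nbytes in flows:
        volume[(src, dst)] += nbytes
        if port in WATCH_PORTS:
            c2_hits.append((src, dst, port))
    return volume, c2_hits

volume, c2_hits = summarize(records)
```

Even this crude summary surfaces the small, low-volume flow to an odd port, which full-packet analysis might only reach after far more processing.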

How companies respond to attacks was clear, the survey notes: 45 percent say they immediately take the server or system in question offline. That response generally was consistent across company size and revenue breakouts. However, it raises the issue of potentially damaging forensic data versus retaining that data but continuing to suffer a breach. Balasubramaniyan says that many companies would prefer to stop the data loss immediately – even if it means compromising forensic data – if it also stops the company from leaking money through the breached system.

Which of these poses the biggest security challenge for your organization?
- BYOD/BYOA (apps): 29%
- User error: 14%
- Mobile devices: 13%
- Rate of change in the environment: 10%
- Poor security hygiene: 10%
- Cloud-based applications and storage: 6%
- Lack of specific tools needed: 6%
- Difficult security workflows: 6%
- Virtualization: 3%
- Lack of good multi-vendor interoperability: 2%

Some 40 percent of survey respondents say they would begin an investigation of the attacker’s communications and sources, while just 10 percent say their first reaction is to observe the breach to identify the ultimate target. A mere five percent say they immediately bring in consultants that specialize in advanced mitigation, effectively turning over the investigation to non-employees.

On the plus side of opting for a security specialist that is not part of the company, a non-employee conducting an investigation of a data breach has the benefit of not having any potential conflict of interest if a network vulnerability is found. The downside is that outsiders might not have as clear an understanding of how the company uses its network when they see it for the first time after a breach.

There is a longstanding debate in the industry about whether to use staff to engage with the incident response team as opposed to using just outsiders. These results, however, indicate that the debate might be far more academic than real.

Having the ability to generate traffic packet duplicates and store them is key to network forensics, Balasubramaniyan says. The investigators essentially can go back in time and follow the attacker’s path and actions.

While saving every packet might seem extreme and require a lot of storage, those with visibility fabrics can de-duplicate the data so that only one copy of each packet is stored, significantly reducing the storage overhead and impact on the network. Also, the storage can be purged periodically and only specific data can be stored long-term in order to reduce the ever-growing data storage requirements.
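The de-duplication step described above can be illustrated with a hash-based filter. This is only a sketch of the idea; a visibility fabric does this in-line at wire speed, typically within a short time window, and the packet bytes here are invented.

```python
import hashlib

def dedupe(packets):
    """Keep one copy of each distinct packet by hashing its raw bytes.
    Duplicate copies (e.g. the same packet seen at several tap points)
    are dropped, cutting the storage needed for forensic capture."""
    seen = set()
    unique = []
    for pkt in packets:
        digest = hashlib.sha256(pkt).digest()
        if digest not in seen:
            seen.add(digest)
            unique.append(pkt)
    return unique

# The same packet often arrives from several tap points in the network.
packets = [b"\x45\x00 pkt-A", b"\x45\x00 pkt-B", b"\x45\x00 pkt-A"]
stored = dedupe(packets)   # only the two distinct packets are retained
```

Production systems bound the `seen` set with a sliding time window, since two genuinely identical packets sent hours apart should both be kept; the unbounded set here is a simplification.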

When it comes to security architectures as a whole, the same rules about looking at data packets and metadata on the corporate network apply to the cloud as well, Balasubramaniyan says. If you have critical workloads that are virtualized, you’ll want the facility to know what is going in and out of those virtual machines. So whatever your network visibility measures, they should be extensible to your clouds.

“Visibility is central to doing security right,” Konstantas says.


Methodology

This SC Magazine survey, sponsored by Gigamon, was based on 294 responses from a broad cross-section of company sizes and revenues and eight industry verticals, including federal and state and local government, technology services, finance, education, manufacturing, medical and health care, legal/real estate, and retail and wholesale distribution. The survey was conducted in January 2016 by C.A. Walker Research Solutions, Glendale, Calif.


Gigamon (NYSE: GIMO) provides active visibility into physical and virtual network traffic, enabling stronger security and superior performance. Gigamon’s Visibility Fabric™ and GigaSECURE®, the industry’s first Security Delivery Platform, deliver advanced intelligence so that security, network and application performance management solutions operate more efficiently and effectively.

See more at www.gigamon.com

This supplement was commissioned by Gigamon and produced by SC Magazine, a Haymarket Media, Inc. brand.