Netwitness – next generation network traffic analysis

Imagine Harrison Ford doing traffic analysis on your network.

Hmm – there’s a thought.

The US-based company NetWitness has been making a lot of noise lately about its “next generation” capability to perform full session reassembly and threat analysis from packet capture. This is a great feature for traffic analysis, but it has been available from open source tools like Snort, Sguil and NetworkMiner for years. I was doing full session traffic analysis with Snort over 5 years ago – when we had problems in a UDP-based physical security control network that opened and closed doors in a 40-story office building…

NetWitness Investigator is the award-winning interactive threat analysis application of the NextGen product suite. Our patented methods of viewing network session and application data have helped our clients fill in the visibility gaps that exist in their firewall, intrusion detection, SIEM and other security infrastructures. Now, the entire community of security practitioners will have the capability to obtain faster and clearer insight into today’s advanced threats.

Download Investigator and see for yourself using your own data why top government agencies, banks, and Fortune 1000 companies have turned to NetWitness.

NetWitness is exactly what they say it is – a very good network traffic analyzer. However – beware of vendor marketing overshoot – network traffic analyzers are not data loss prevention systems like Fidelis Security Systems XPS, Vontu (now Symantec) or Websense (formerly Port Authority).

  • Recording all the traffic is not the same as producing potential data loss events with a high level of precision and recall.
  • NetWitness performs session reassembly and extracts meta-data such as hostname and filename, BUT NetWitness doesn’t perform file-format-independent content analysis. A regex for a keyword might work on a plain-text string in a simple SMTP email, but it is worthless for URL-encoded text in Webmail, Microsoft Office, PDF, Open Office etc.
  • It records all the traffic. On a 1 Gbit/s network, that is roughly 125 MByte/second. Do the math regarding disk space, network performance and computing capacity. Recording all the traffic also means that NetWitness users are in violation of EU privacy laws that specifically prohibit recording of personal information. BTW – the last time I benchmarked pcap, it maxed out at about 100 MB/s.
  • NetWitness doesn’t provide rule-based policy capability.
  • NetWitness doesn’t provide event management, an event analytics database, central console management, or distributed sensor provisioning and management.
  • NetWitness doesn’t provide extrusion prevention / data loss prevention.
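The disk-space point in the bullets above is easy to sanity-check with back-of-the-envelope arithmetic (rough figures of my own, assuming a fully saturated 1 Gbit/s link captured around the clock):

```python
# Back-of-the-envelope storage math for full packet capture.
# Assumption (mine): a fully saturated 1 Gbit/s link, captured 24x7.

LINK_GBPS = 1.0                       # link speed, gigabits per second
bytes_per_sec = LINK_GBPS * 1e9 / 8   # = 125,000,000 bytes/s (~125 MB/s)

SECONDS_PER_DAY = 86_400
bytes_per_day = bytes_per_sec * SECONDS_PER_DAY

TB = 1e12
print(f"{bytes_per_sec / 1e6:.0f} MB/s sustained")    # 125 MB/s
print(f"{bytes_per_day / TB:.2f} TB per day")         # 10.80 TB per day
print(f"{bytes_per_day * 30 / TB:.0f} TB per month")  # 324 TB per month
```

Real links are rarely saturated, but even at 30% utilization that is roughly 100 TB a month of raw capture before you reassemble a single session.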

8 thoughts on “Netwitness – next generation network traffic analysis”

  1. Danny,

     Thank you for the post, but I have a different perspective on several of the points you make. First – you seem to be making some fairly broad observations about NetWitness solutions based on the free Investigator release. You need to know that our enterprise solutions are dedicated Linux appliances that perform distributed multi-gigabit collection, easily creating hundreds-of-terabyte collections. Investigator is simply the front end to that solution. It is, however, very useful when working with capture files, and has much of the logic built into it that our enterprise solutions have, albeit on a much smaller scale.

     We are not a DLP, nor do I think our marketing makes us out to be. However, the reverse is also true. Trying to make a pervasive network capture solution using any of the DLP solutions you mentioned would not be a successful venture. Similarly – your point about file format analysis seems to assume that the other DLP solutions are impervious to obfuscation techniques. Not only are most of them rife with assumptions about how to decode documents, simply password-protecting a zip file will bypass detection on all of them. Not to mention that assumptions about how protocols such as FTP, IM and various webmail services work leave many if not all of them very prone to missing critical information. In my opinion, DLP solutions are great at preventing intentional or accidental data leakage, but do very little to prevent any motivated insider or any of the more recent malware and exfiltration techniques. In the end, our solution does not obviate the need for a DLP, nor does a DLP solution obviate the need for pervasive capture. You have been using Sguil and NetworkMiner – I guarantee you would like our solutions, because the capability you have assembled is our target, and we do it well. BTW – you should turn on the webmail parser in Investigator – located under the options tab.

     Lastly – there is an extensive rules engine that, among other things, is used to ensure we can meet privacy requirements. The software can filter, truncate, or alert on any aspect of the session analysis. The system can be tuned to meet any privacy requirements. If privacy laws allow you to operate IDS or IPS systems, you can use NetWitness. Our enterprise solutions, of course, have the ability to take those alerts and feed SIM/SIEM infrastructures, including full integration with ArcSight, Cisco, Sourcefire and others. Of course we have distributed sensors and central, as well as hierarchical, management capabilities.

    Hope this helps.

    Tim Belcher
    CTO
    NetWitness

  2. Tim,

    Wow – thanks for such a fast response, you’re certainly doing a good job tracking comments on Netwitness. First of all, I can appreciate the engineering work you guys have put into the product. I will definitely spend some more time with Netwitness.

    I was turned on to Netwitness recently when a reseller of yours pinged a client of ours and claimed it was an extrusion detection solution that competes (sic) with Fidelis XPS. For openers, I totally agree that network surveillance requires a bag of tools and DLP / extrusion detection is no exception. Your Web site collateral does imply that you are in the compliance enforcement and DLP space.

     The business need has shifted to detection of unauthorized transfer of content within the network and in and out of it, not anomalous TCP/IP traffic. I am sure NetWitness can be a valuable tool for a security analyst. From what I’ve seen it’s a lot friendlier than many of the Open Source tools.

    However, I still have a number of concerns which your response has not allayed.

    1. The difficulty of making sense out of large unstructured data sets of decoded network traffic.

    My experience with network surveillance in general and extrusion detection in particular has taught me that signal/noise ratio is more important than a low number of false negatives/false positives (which many vendors in the SIM/DLP/IDS space try to achieve). I fail to see how a system that collects hundreds of terabytes of decoded network traffic can be a useful analytical tool with high S/N ratio. Pumping the data into ArcSight is not going to cut it when you need real-time alerts on violations.

    2. Performance on gigabit networks
     Unless you have written your own Windows NDIS kernel packet capture driver, I don’t think that pcap on a Windows box can do much more than 100 MB/s – even on a Linux box it might manage another 20–40% before it melts down. Maybe you’re using some special network cards… I’d be curious to see what your session reassembly and payload decoding throughput really is.

     3. Regex versus content analysis
     Your regex support will work fine on plain Latin-1; I suspect it will not work on plain text in Unicode, or in right-to-left languages like Arabic and Hebrew. Regex will not work on MS Office, PDF or OO documents because the data is stored neither in plain text nor in a linear format. Nothing to do with obfuscation.
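     To make the point concrete, here is a toy illustration (my own example, not a test of either product): the same keyword regex that matches a plain-text string finds nothing once the content is URL-encoded or sits inside a compressed container, as Office and PDF formats do – unless the session is decoded first.

```python
import re
import urllib.parse
import zlib

# A keyword regex of the kind a traffic analyzer might apply to a session.
KEYWORD = re.compile(r"project\s+falcon", re.IGNORECASE)
secret = "Attached: Project Falcon roadmap"

# 1. Plain text (a simple SMTP body): the regex matches directly.
assert KEYWORD.search(secret) is not None

# 2. URL-encoded text (a webmail form post): spaces become %20,
#    so the same regex misses until the content is decoded.
encoded = urllib.parse.quote(secret)
assert KEYWORD.search(encoded) is None
assert KEYWORD.search(urllib.parse.unquote(encoded)) is not None

# 3. Compressed streams (how PDF and modern Office files store text):
#    the raw bytes are opaque to any regex; decompress first.
compressed = zlib.compress(secret.encode())
assert KEYWORD.search(compressed.decode("latin-1")) is None
assert KEYWORD.search(zlib.decompress(compressed).decode()) is not None
```

     Real Office/PDF extraction is more involved than a single decompress, of course; the point is only that matching has to happen after decoding, not on the wire bytes.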

     4. Application protocol detection is hard

     Yes it is. Quite hard. I cannot speak for the current version of Vontu since it’s been a couple of years, but speaking for Fidelis XPS (full disclosure – I’ve been working with Fidelis for about 5 years as an industry partner) – I can say that the XPS application protocol decoders (FTP, telnet, SMTP, POP3, IMAP, P2P, IM, Oracle, DB2, RPC, HTTP, Gmail, Yahoo Mail…) are kept up to date and miss very little – extracting the content and analyzing structured and unstructured data on the fly.

     5. Recording sessions versus storing violations
     I am not sure I understand how a rules engine mitigates the privacy concerns. You’re still monitoring all employee endpoint traffic; you’re just not storing the recorded sessions. I’m not a lawyer, but I would be curious to hear how customers have worked this issue in the EU. Incidentally, since IDS/IPS deal with inbound traffic, they are not a valid point of comparison nor a justification for recording user endpoint traffic.

     6. Hyping network security with regulatory issues
    Just don’t do it.

    Privacy compliance and corporate governance have practically nothing to do with network traffic analysis. Security vendor hype and attempts to build franchises around regulation in order to sell product have done a great disservice to the security industry and our customers. You and I are in the business of helping customers understand threats that may exploit asset vulnerabilities and find cost-effective security countermeasures. Neither Fidelis nor Netwitness nor Symantec can claim to help prevent back-dating of options or manipulation of a trial balance before it gets processed into a 10Q.

    Finally – regarding insider abuse.
     I totally agree that a determined insider will figure out how to bypass the DLP systems, especially if they’re endpoint-based (obviously he or she knows there is a piece of software that the IT security guys installed on their workstation). There are so many ways to do it, but my favorite is powering down the machine, taking out the disk and taking it home.

    Respectfully

    Danny Lieberman

  3. Google alerts work well ☺

     I will address what I can in the order you asked. As for resellers – we do take our time and responsibility in choosing them, but obviously can’t control every aspect. My assumption would be that, like all security product resellers, they tend to compete for budget with products that have some overlap. I should first say I share your disdain for marketing hype in this industry. I think we are far more careful than most in making sure we are not selling and marketing ahead of our capabilities. In many ways, we are marketing and selling far behind them. In terms of our website – we take the approach that our solution can significantly AID in the problem. Which is very valid. We have many clients using them side by side – and I think the benefits of both are increased. The same can be said for using us to aid in governance issues (another problem area on the website). First and foremost, though, is incident response. Our ability to provide all associated activity to any incident is the core feature of our technology. The exact reasons you have turned to Sguil and (assumed) home-grown capture solutions – is our contribution to the process. However, we want to make the information and data available to any concern the customer may have. HR investigations. Insider threat. Zero day. Malcode. Competitive analysis.

    The next point – which to some degree echoes the positioning above – about working with hundreds of terabytes… We are not a replacement for IDS/IPS, or DLP, or SIM/SIEM. What we offer each is the ability to quickly reach in to 100TB of network traffic, and view only the relevant sessions that each or all of these solutions have identified – and to quickly investigate peripheral activities as far back as data exists. Easy examples first. You have a compromised host, and have detected it using your IDS or SIM. Regardless of the amount of network data – we can answer questions like – show me any communication between the attacker and the victim for 30 minutes prior to the attack to now. Same analysis capabilities you see in investigator, with the ability to follow any trail you find. OK – I have discovered how the system was compromised – let me quickly pivot on the malicious javascript, or the attacking host, or the ports, or the URL, or any other piece or combination of information, and see the entire enterprise communication with that profile. Let me examine the actual data that was lost – regardless of protocol. Etc. DLP – same sort of thing applies. DLP detects source code being transferred offsite by an insider. What else has that person shipped – to whom – etc? The point is – and one that I make internally – capturing the data is not the difficult part. Making the data usable has, and remains, the real challenge. With investigator – you can see some of what we do to make it usable, and it works at scale.
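     The pivot-and-query idea above can be sketched in a few lines, modeling captured sessions as metadata records. The field names and data are invented for illustration, not the NetWitness schema:

```python
# Toy "pivot" over captured session metadata: given one finding, pull every
# related session. Fields and values are hypothetical illustration only.

sessions = [
    {"time": 100, "src": "198.51.100.7", "dst": "10.0.0.5",
     "service": "http", "url": "http://evil.example/x.js"},
    {"time": 130, "src": "10.0.0.5", "dst": "198.51.100.7",
     "service": "dns"},
    {"time": 150, "src": "10.0.0.8", "dst": "192.0.2.1",
     "service": "smtp"},
]

def pivot(sessions, **criteria):
    """Return the sessions matching every key=value criterion."""
    return [s for s in sessions
            if all(s.get(k) == v for k, v in criteria.items())]

def involving(sessions, host, start, end):
    """All sessions touching `host` in [start, end] -- the 'show me all
    communication with the attacker for 30 minutes prior' query."""
    return [s for s in sessions
            if start <= s["time"] <= end and host in (s["src"], s["dst"])]

hits = involving(sessions, "198.51.100.7", 90, 140)
assert len(hits) == 2                      # both directions of the attack
assert pivot(sessions, service="smtp")[0]["src"] == "10.0.0.8"
```

     At enterprise scale the hard part is indexing this metadata so such queries stay fast over terabytes of capture, which is exactly the "making the data usable" challenge described above.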

     My point about integration with other products was that once, for instance, you find some new vector of attack, or some new concerning aspect, it is simple to create alerts that will feed existing solutions. In the instance above – easily flagging the host, the script, or any component of the meta can help feed good information into other management products. Also – once you flag it, the flag itself provides an instant pivot point in Investigator, and one more piece of meta you can use for your queries. Take the entire Spamhaus zone for instance – you can load it into Investigator, or into our enterprise decoder, and flag any session involving any Spamhaus-tracked host. You can take lists from any intelligence vendor, and flag and alert on their presence. This not only provides good navigation points within our solutions, but can also then send that intel into products that don’t support such lists directly.

     As for performance of WinPcap – I want to say the last time I benchmarked it, it began to have problems on Windows at around 250 MB/s. But again – realize that is not how we collect in our enterprise solutions. Our decoders – which are dedicated Linux appliances with over 12TB of disk – use our own kernel modifications to achieve multi-gigabit capture. At around 2 to 3 Gbit/s you need more disks to achieve the write speeds needed. Our decoders are appliances that do the collection, sessionization, analysis and meta extraction at high speed. You can then just connect to them using Investigator – no processing needed.

     The search parser does use regex, so what is doable in regex is doable in that parser – or in the search dialog. Our parsing of the sessions at import, however, attempts to render the session first, before data extraction. So while you are correct about searches, file/attachment names in Webmail for instance are rendered prior to lexing the session for information. So after-the-fact searching is regex, and real-time is programmatic manipulation of the session prior to looking for content. Again – in no way are we a DLP in terms of document analysis.

    My only point about the application detection comment, is that we parse in a port and protocol agnostic manner – not making assumptions on ports for instance – to determine content. The way we do it provides you the options of asking for instance, what sessions traversed port 80 that were not HTTP? Another way to say it would be – if you were to tunnel HTTP traffic over DNS – we would identify the HTTP aspects as well as any DNS information. Again – DLP solutions generally work well targeting their specific ports and communication channels. I will say that I see frequent scalability issues with many of them – but what they are analyzing is difficult.
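     Port-agnostic parsing of the kind described above can be illustrated with a minimal classifier that inspects payload bytes rather than port numbers. The signatures below are well-known protocol prefixes; the code is my own sketch, not NetWitness internals:

```python
# Toy port-agnostic protocol classifier: look at the payload, not the port.
# Signatures are standard protocol prefixes; the selection is illustrative.

HTTP_METHODS = (b"GET ", b"POST ", b"HEAD ", b"PUT ", b"DELETE ", b"OPTIONS ")

def classify(payload: bytes) -> str:
    """Guess the application protocol from the first bytes of a session."""
    if payload.startswith(HTTP_METHODS) or payload.startswith(b"HTTP/1."):
        return "http"
    if payload.startswith(b"SSH-"):
        return "ssh"
    if payload[:2] == b"\x16\x03":   # TLS handshake record header
        return "tls"
    return "unknown"

# HTTP on a non-standard port is still HTTP to a payload-based classifier:
assert classify(b"GET /index.html HTTP/1.1\r\nHost: x\r\n\r\n") == "http"
assert classify(b"SSH-2.0-OpenSSH_8.9\r\n") == "ssh"
assert classify(b"\x16\x03\x01\x00\xa5") == "tls"
assert classify(b"\x00\x01\x02\x03") == "unknown"
```

     The "what traversed port 80 that was not HTTP" question then becomes a simple filter: sessions where the port says one thing and the classifier says another.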

     Lastly – I think ☺ – the privacy issues involved. I am not sure about a couple of things you mentioned. My point about IDS – and certainly not assuming inbound traffic only – is that with either the IDS or the DLP/CMF solutions you mention, you of course cannot promise that no user information will be captured. With IDS, some signature may hit and capture information from a mail message or IM conversation. Even worse with DLP/CMF solutions – you are by nature targeting the communications channels of your employees. I guess I just don’t see how you reconcile those issues with privacy laws, yet think egress monitoring is an inherent violation. With regard to how we address it, you can create and store very detailed filtering OR truncation instructions that do not store information matching the criteria. For instance, it is simple to filter email, IM or any other identified services – again in a port- and protocol-agnostic fashion. You can then apply them to any collection device. We also have RBAC permissions applied from the server, to provide even more granularity in viewing certain types of content. I am talking about our enterprise appliances of course – but you can see some of the capability in the rules section of Investigator as well.
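     The filter/truncate approach can be sketched as follows. This is a hypothetical illustration of the concept only – the field names and rule shape are invented, not NetWitness rule syntax:

```python
# Illustrative privacy rule: keep session metadata for analysis, but
# truncate the payload of privacy-sensitive services before storage.
# Service names and record fields are made up for this sketch.

PRIVATE_SERVICES = {"smtp", "imap", "webmail", "im"}

def apply_privacy_rules(session: dict) -> dict:
    """Return a storable copy of the session: payloads of private
    services are truncated to zero bytes, metadata is retained."""
    stored = dict(session)
    if stored["service"] in PRIVATE_SERVICES:
        stored["payload"] = b""       # truncate: no personal content stored
        stored["truncated"] = True
    return stored

s = {"service": "webmail", "src": "10.0.0.5", "dst": "203.0.113.9",
     "payload": b"Subject: performance review ..."}
out = apply_privacy_rules(s)
assert out["payload"] == b"" and out["truncated"] is True
assert out["src"] == "10.0.0.5"       # metadata survives for investigation
```

     Whether truncation-after-inspection satisfies a given jurisdiction's privacy law is exactly the open question in this thread; the sketch only shows the mechanism, not a legal answer.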

    This conversation is becoming too long for HTTP posts ☺

    Have a good one.

    Tim

  4. Tim

    For sure this is a long thread. You have the advantage on me of knowing your product – so it’s tough for me to add value right now to the discussion.

     Many people like IDS for outbound content filtering – but the fact is, it’s not a good fit for the problem, and people end up with too many rules and just stop maintaining them.

    Regarding privacy and DLP systems – your mileage varies according to your country and what you do and what you say or don’t say – if you get my drift.

     Will continue by mail – sounds like the enterprise product is pretty good indeed.

    Danny Lieberman

  5. Hi Danny and Tim,

     Interesting thread on network security analysis, and it’s nice to see that you are mentioning my NetworkMiner application. I agree with you that application protocol detection is difficult; it has been one of the issues I have struggled with when developing NetworkMiner. Relying on the port number to know which protocol parsers to apply to a session is simply not enough, especially now that many P2P protocols, backdoors etc. use randomized or at least non-standard ports.

    I have therefore developed an algorithm called the SPID algorithm (Statistical Protocol IDentification), which is freely available for open source usage. You can read more about this on:

    http://spid.wiki.sourceforge.net/

     This port-independent protocol detection will be available in future versions of NetworkMiner, and I am pretty sure that NetWitness will not even be close to identifying protocols as well as NetworkMiner does using the SPID algorithm.
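     The core idea behind statistical protocol identification can be sketched with a single attribute – a byte-frequency distribution compared via Kullback–Leibler divergence. The real SPID algorithm uses many attribute meters; this toy version is my own simplification showing the shape of the approach:

```python
import math
from collections import Counter

def byte_distribution(data: bytes) -> list[float]:
    """Normalized byte-frequency histogram (one crude SPID-style attribute)."""
    counts = Counter(data)
    total = len(data)
    return [counts.get(b, 0) / total for b in range(256)]

def kl_divergence(p, q, eps=1e-9):
    """Kullback-Leibler divergence D(p || q), smoothed to avoid log(0)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Build a tiny per-protocol "model" from known traffic samples...
http_model = byte_distribution(
    b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n" * 50)
ssh_model = byte_distribution(
    b"SSH-2.0-OpenSSH_8.9\r\n\x00\x00\x03\x14" * 50)

def identify(payload: bytes) -> str:
    """Pick the protocol model with the lowest divergence from the payload."""
    obs = byte_distribution(payload)
    models = {"http": http_model, "ssh": ssh_model}
    return min(models, key=lambda name: kl_divergence(obs, models[name]))

# Unseen HTTP traffic is still closer to the HTTP model than the SSH one:
assert identify(b"POST /login HTTP/1.1\r\nHost: y\r\n\r\nuser") == "http"
assert identify(b"SSH-2.0-libssh_0.9\r\n") == "ssh"
```

     Because the decision is based on the payload's statistics rather than the port, the same model fingerprints a protocol even on randomized or non-standard ports.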

  6. Erik,
     Very cool – I will definitely take a look. I have been thinking for a while that it is time for an open source DLP project – I think many of the pieces are out there, like the Snort security platform and now SPID. Now I need a sponsor 😉

    Danny

