Last Updated on May 24, 2015

A few days ago I discovered that the city I live in (Rome) ranks number two in the world for the number of bot infections, at least according to the Symantec Internet Security Threat Report, Edition XVII.

Of course, reports must be taken with caution, but there is no doubt that bot infections are becoming a huge problem for the information security community (a modern biblical plague), so huge that they have earned the attention of the Federal Communications Commission. As a matter of fact, in March 2012 the FCC, working with communications companies including Verizon, Cox, and Comcast, passed a voluntary code that delineates the steps ISPs should take to combat botnets. As you probably know, botnets may be used by cybercrooks to make money through a range of criminal activities, from information theft to the execution of DDoS attacks: have a look at this interview with a botnet operator to get an idea (and to discover that botnets are also used to counterfeit virtual currency).

This plague is pushing a major change to the traditional security paradigm, a change that can be summarized in a few words: if yesterday the refrain for system administrators was “Beware of what enters your network” (so all the security warfare was focused on checking ingress traffic), today it is becoming: “Beware of what leaves your network”.

This is simply a consequence of the fact that traditional endpoint technologies are proving not so effective against bots, so a new approach is needed, one that aims to control the egress traffic generated by compromised endpoints as it leaves the organization. The effectiveness of traditional endpoint technologies is not optimal since new variants (capable of evading antivirus controls) come out much faster than the related signatures developed by vendors: have a look at the average antivirus detection rate against Zeus (the god of bots), and you will probably be disappointed to notice that it sits at a poor 38%. On the other hand, recognizing the communication patterns at the perimeter is a more profitable strategy, since the different variants generally do not deeply change the communication protocols with the C&C server (unless a P2P protocol is used, see below).
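To make the idea concrete, here is a minimal Python sketch of perimeter-side pattern matching against an egress proxy log. The log format and the indicators (the gate.php and config.bin URLs used by classic Zeus variants) are assumptions for illustration, not a complete or current signature set:

```python
import re

# Illustrative Zeus-style C&C indicators (assumed examples, not a full
# signature set): classic Zeus variants fetched a config file and posted
# stolen data back to a "gate" script on the C&C server.
CC_PATTERNS = [
    re.compile(r"POST\s+\S*/gate\.php", re.IGNORECASE),
    re.compile(r"GET\s+\S*/config\.bin", re.IGNORECASE),
]

def scan_proxy_log(path):
    """Yield (line_number, line) for egress requests matching a C&C pattern.

    Assumes one HTTP request per line, e.g. a squid-style access log.
    """
    with open(path) as log:
        for number, line in enumerate(log, start=1):
            if any(pattern.search(line) for pattern in CC_PATTERNS):
                yield number, line.rstrip()

if __name__ == "__main__":
    for number, line in scan_proxy_log("access.log"):
        print(f"possible C&C traffic at line {number}: {line}")
```

The point of the sketch is that the perimeter rule stays stable across variants: the binary changes faster than the signatures, but the URL it calls home to tends not to.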

The strategy to mitigate botnets relies on the fact that each botnet has (in theory) a single point of failure: the C&C server, at which cyber hunters and law enforcement agencies aim their takeover attempts, either to take it down definitively or to turn it into a sinkhole for studying the exact morphology and extent of the infection. Depending on the botnet configuration, each infected endpoint polls the C&C server for new instructions at a given time interval, and that is the point in the process where the good guys may act: detecting (and blocking) that traffic allows them to identify infected machines (and my experience indicates that too often those machines are equipped with an up-to-date, yet blind, antivirus).
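As a rough illustration of how that polling behavior could be surfaced, here is a minimal Python sketch that flags near-constant contact intervals in flow records; the record format and the thresholds are assumptions for the example:

```python
from collections import defaultdict
from statistics import mean, pstdev

def find_beacons(flows, min_polls=5, max_jitter=0.1):
    """Flag (src, dst) pairs whose contacts recur at near-constant intervals.

    flows: iterable of (timestamp_seconds, src_ip, dst_ip) tuples, assumed
    pre-extracted from firewall or NetFlow logs. Thresholds are illustrative.
    """
    contacts = defaultdict(list)
    for ts, src, dst in flows:
        contacts[(src, dst)].append(ts)

    suspects = []
    for (src, dst), times in contacts.items():
        if len(times) < min_polls:
            continue
        times.sort()
        gaps = [b - a for a, b in zip(times, times[1:])]
        avg = mean(gaps)
        # Near-zero spread relative to the mean gap suggests automated
        # polling rather than human-driven traffic.
        if avg > 0 and pstdev(gaps) / avg < max_jitter:
            suspects.append((src, dst, avg))
    return suspects
```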

For the record, the C&C server is only a theoretical single point of failure, since C&C servers are generally highly volatile and dynamic, so it is not easy to intercept and block them (the only way to take down a botnet). Hence, in my opinion, it would be more correct to say that a botnet has many single points of failure (an information security oxymoron!).

As if that were not enough, in order to make life harder for the good guys, next-generation botnets are deploying P2P protocols to decentralize the C&C function and make their takedown even tougher.
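Even then, the perimeter offers a hint: a P2P node tends to contact far more distinct external peers than a normal client does. A crude Python heuristic, with a purely illustrative threshold, might look like this:

```python
from collections import defaultdict

def find_p2p_candidates(flows, max_expected_peers=50):
    """Flag internal hosts that contact an unusually large set of distinct
    external peers, a rough hint of P2P behavior.

    flows: iterable of (src_ip, dst_ip) tuples from egress flow logs.
    The threshold is illustrative and would need per-network tuning.
    """
    peers = defaultdict(set)
    for src, dst in flows:
        peers[src].add(dst)
    return {src: len(dsts) for src, dsts in peers.items()
            if len(dsts) > max_expected_peers}
```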

But the good guys have a further weapon in this cat-and-mouse game: cloud intelligence. Even if I am not a cloud enthusiast, I must confess that this technology is proving to be a crucial element in thwarting botnets, since it allows the collection of real-time information about new threats and centralizes the “intelligence” needed to dynamically (and quickly) classify them. Real-time information is collected directly from the enforcement points placed at the perimeter, which analyze the egress traffic of organizations containing compromised machines. Of course, after successful analysis and classification, the new patterns may be shared among the enforcement points across the five continents in order to provide real-time detection (and hence protection) against new threats. This approach is clearly much more efficient than endpoint-based enforcement (which would need to share the information among a far larger number of devices), provided the enforcement points are positioned adequately, that is, they are capable of monitoring all the egress traffic.
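As a sketch of how an enforcement point might talk to such a service, here is a minimal Python example; the endpoint URL, API shape, and field names are hypothetical, invented for illustration only:

```python
import json
from urllib import request

# Hypothetical cloud-intelligence endpoint; any real service would differ.
REPUTATION_URL = "https://intel.example.com/v1/reputation"

def check_destination(dst_host):
    """Ask the (hypothetical) cloud service whether an egress destination
    is a known C&C host. Returns a verdict string such as "malicious"."""
    req = request.Request(f"{REPUTATION_URL}?host={dst_host}")
    with request.urlopen(req, timeout=2) as resp:
        return json.load(resp).get("verdict", "unknown")

def report_suspect(dst_host, sample):
    """Submit a suspicious pattern so that, once classified, the verdict
    can be shared with every other enforcement point."""
    payload = json.dumps({"host": dst_host, "sample": sample}).encode()
    req = request.Request(REPUTATION_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req, timeout=2):
        pass
```

The design point is the asymmetry: thousands of perimeter devices query one shared brain, instead of millions of endpoints each waiting for a signature update.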

The combination of egress traffic analysis and cloud intelligence is a good starting point for mitigating the effects of botnets (it is certainly necessary to identify infected machines) but, as usual, do not forget that the user is the first barrier, so a good level of education is a key factor, together with consolidated processes and procedures to handle infections.

This Post Has 3 Comments

  1. Jule

    Good story, I like your suggestion on the analysis of egress traffic and cloud intelligence. The botnet traffic could be detected, and measures taken. But how do you deal with other cyber vulnerabilities, like DDoS for example, if the endpoints are concentrated?

    1. Paolo Passeri

      I believe the best way to address the issues you describe could be to centralize the security enforcement point where the traffic is monitored and compromised endpoints belonging to a botnet are detected. In a few words, I believe that ISPs will soon offer advanced anti-malware (read anti-bot) services in the cloud by routing (or better, switching) the customer’s traffic to their data centers. You may think of the same approach used for URL filtering services in the cloud, with the difference that in this scenario the clients must arrive at the ISP’s data center with their original IP address or a statically NATed address, so that it is always possible to recognize the source. This would make it possible to face other kinds of attacks (such as DDoS, where the sources are spread among different locations, even if inside the same ISP): another contribution of the cloud against botnets that I forgot to mention in the original post.
