- Remotely exploitable SQL injection vulnerabilities act as the infection vector
- By taking advantage of the most popular search engines' indexes, vulnerable sites and web pages are automatically detected and simultaneously exploited (a sketch of the underlying flaw follows this list)
- The scanning bots then inject c99shell, the most popular web shell, so that full control with a UID based on the web server's user privileges is gained
- Hosting malware-embedded pages, phishing and spam pages, and blackhat SEO taking advantage of the domain's PageRank are just a few examples of how the access is abused
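For perspective on how basic the infection vector above is, here's a minimal sketch of the vulnerable pattern these bots hunt for, next to the parameterized fix that closes it; the table and column names are purely illustrative, not taken from any of the exploited sites:

```python
import sqlite3  # stand-in for any DB-API 2.0 compliant driver

def get_article(conn: sqlite3.Connection, article_id: str):
    # VULNERABLE: the pattern the scanning bots exploit; a crafted
    # article_id such as "1 UNION SELECT ..." rewrites the whole query:
    # conn.execute("SELECT title, body FROM articles WHERE id = " + article_id)

    # FIXED: a parameterized query treats article_id strictly as data,
    # closing the injection vector the bots rely on.
    cursor = conn.execute(
        "SELECT title, body FROM articles WHERE id = ?", (article_id,)
    )
    return cursor.fetchone()
```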
These so-called "malicious economies of scale" showcase the following:
- a new command is gaining malware authors' attention, namely !milw0rm, which directly syndicates remotely exploitable web application vulnerabilities
- approximately 10 to 15 sites were remotely SQL injected within the first minute of monitoring the bot
- web application vulnerabilities continue to receive low priority in infosec budgets
- XSS vulnerabilities that allow a phishing attack to have e-bank.com itself forward the captured information to a third party undermine SSL certificates and the rest of the "yes, we're working on it" security-for-the-masses approaches
- c99shell may be the most popular web shell, but taking into consideration the Web-ization of malware, and how a huge number of web application backdoors remain undetected by antivirus software, botnet masters and malicious attackers are gaining a competitive advantage in a very efficient way (a crude detection sketch follows this list)
- botnet masters are not rocket scientists; in some of the IRC channels used to control the scanning bots, the administrators were so careless they even allowed complete outsiders to issue scanning commands of their own choosing
- although the majority of the SQL injected sites are connected to a centralized web shell, with a home user somewhere across the world acting as the C&C for the entire campaign, even if that C&C gets shut down the sites remain vulnerable, and anyone can make them "phone" wherever they want to
- the botnet masters in this particular case were also interested in the FREE SPACE available at the exploited domains
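Given how poorly antivirus software covers web application backdoors, even a crude signature sweep of the web root catches many in-the-wild c99shell drops. A minimal sketch, assuming a Unix-style web root; the marker strings are common c99shell and PHP backdoor indicators, far from an exhaustive set:

```python
import os

# Markers commonly seen in c99shell variants and PHP backdoors in
# general; an illustrative, deliberately incomplete list.
SIGNATURES = [b"c99shell", b"c99sh_", b"eval(base64_decode(", b"eval(gzinflate("]

WEB_ROOT = "/var/www/html"  # assumed web root; adjust for your server

def find_suspect_files(root: str):
    """Yield paths of PHP files containing any known web shell marker."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith((".php", ".php3", ".phtml")):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    data = f.read()
            except OSError:
                continue
            if any(sig in data for sig in SIGNATURES):
                yield path

if __name__ == "__main__":
    for suspect in find_suspect_files(WEB_ROOT):
        print("possible web shell:", suspect)
```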
What are the search engines doing to tackle the search engine hacking possibilities, especially Google, as the most widely used and with the most comprehensive index? They're successfully implementing CAPTCHAs for such suspicious scanning-bot behaviour:
"At ACM WORM 2006, we published a paper on Search Worms [PDF] that takes a much closer look at this phenomenon. Santy, one of the search worms we analyzed, looks for remote-execution vulnerabilities in the popular phpBB2 web application. In addition to exhibiting worm like propagation patterns, Santy also installs a botnet client as a payload that connects the compromised web server to an IRC channel. Adversaries can then remotely control the compromised web servers and use them for DDoS attacks, spam or phishing. Over time, the adversaries have realized that even though a botnet consisting of web servers provides a lot of aggregate bandwidth, they can increase leverage by changing the content on the compromised web servers to infect visitors and in turn join the computers of compromised visitors into much larger botnets."
It will not stop the parsing approach the scanning bots are implementing, so I think that in the short term queries matching a database of Google hacking searches may indeed get CAPTCHA verification by default. An IP reputation system has a lot of potential too, and with Google's acquisition of Postini, they already have a huge population of IPs you should not trust for anything. My experience shows that once you get a phishing email from a single IP, you will sooner or later see the same IP hosting and sending malware, hosting as well as sending spam, and doing pretty much anything malicious.
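For illustration, here is a minimal sketch of how such a default could work on the search engine's side, combining a dork-pattern check with an IP reputation lookup; both data sources here are hypothetical placeholders, not anything Google has confirmed deploying:

```python
import re

# Hypothetical sample of known Google hacking (dork) patterns; a real
# deployment would maintain a much larger server-side database.
DORK_PATTERNS = [
    re.compile(r"\binurl:", re.IGNORECASE),
    re.compile(r"\bintitle:index\.of\b", re.IGNORECASE),
    re.compile(r"\bfiletype:(sql|log|env)\b", re.IGNORECASE),
]

# Hypothetical IP reputation set, e.g. seeded from addresses already
# observed sending phishing or spam.
BAD_REPUTATION_IPS = {"203.0.113.7", "198.51.100.23"}

def requires_captcha(query: str, client_ip: str) -> bool:
    """Gate the request behind a CAPTCHA when the query looks like a
    dork or the client IP already has a bad reputation."""
    if client_ip in BAD_REPUTATION_IPS:
        return True
    return any(p.search(query) for p in DORK_PATTERNS)

# A scanning bot's query trips the dork check even from a clean IP:
print(requires_captcha("inurl:index.php?id=", "192.0.2.1"))  # True
```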