Last Updated: 2008-07-25 14:12:49 UTC
by Swa Frantzen (Version: 1)
As indicated in earlier diary entries, an authoritative server sees queries from recursive servers for nonexistent names if its domain is being targeted by the latest DNS attack. Its operators can't do much: all they can do is report them.
Yesterday Ray contacted us with logs from his authoritative DNS server indicating that some ISPs' servers in former USSR countries were being used in a very suspicious manner. The logs were somewhat sanitized, so we don't know the first thing about the domains that were targeted, but they had enough detail to show that the pattern of randomness in the queries differed from any of the publicly known exploits.
This can mean a lot of things, but none of them are good news for the good guys: further development of attack tools, obfuscation of existing tools, or simply somebody exploring how hard the attack is to write from scratch.
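By way of illustration (Ray's logs were sanitized, so the query names and threshold below are invented for the example), a minimal Python sketch of one way an authoritative operator might flag the random-prefix queries this class of attack generates:

```python
import math
from collections import Counter

def label_entropy(label: str) -> float:
    """Shannon entropy, in bits per character, of a DNS label."""
    counts = Counter(label.lower())
    n = len(label)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_random(qname: str, threshold: float = 3.0) -> bool:
    """Heuristic: the attack prepends a random label to the victim zone,
    e.g. 'a1k9x2qz.example.com'.  A long leftmost label with high
    per-character entropy is a hint (not proof) of a generated name."""
    first = qname.split(".")[0]
    return len(first) >= 8 and label_entropy(first) >= threshold

# Hypothetical query names as they might appear in an authoritative log:
for q in ["www.example.com", "a1k9x2qz.example.com", "mail.example.com"]:
    print(q, looks_random(q))
```

High entropy in a long leftmost label is only a hint, of course; CDN and DNSBL names can look just as random, so a heuristic like this needs a human eye on the output before anyone is accused of anything.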
Better get the patches in over the weekend if you haven't already.
Verify that any firewall, NAT device, etc. you use doesn't undo what the patches provide.
If you use your ISP's DNS servers, verify that they have been patched; if not, switch to alternative servers such as those of OpenDNS.
The Bad Guys
But why would you write your own code as a bad guy?
- Well, there are a few trends they follow. One of them is the drive-by attack: a hacked web server loads something onto a visiting web browser. Now imagine that client on your internal network querying your recursive, "internal-only" DNS caches. Yep, the next thing you know, www.google.com or something similar points, company-wide, to an IP address run by those infecting your clients.
- Bad guys today run botnets. If they can coordinate them well enough, they can win the individual races much faster than the public tools do, giving them much less exposure to the authoritative DNS server operators (although I doubt they care).
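To see why winning the race faster matters, some back-of-the-envelope math (the packet count is an illustrative assumption, not a measurement): an off-path attacker must match the 16-bit transaction ID of the outstanding query, and against a patched resolver also its randomized UDP source port.

```python
# Odds of winning one poisoning race: the spoofer gets n forged responses
# in before the real answer arrives, each guessing the query's entropy.
# The "+ ~14 bits" figure for port randomization is a rough assumption.

def race_success(spoofed_responses: int, entropy_bits: int) -> float:
    """Probability that at least one of n spoofed responses matches."""
    space = 2 ** entropy_bits
    return 1.0 - (1.0 - 1.0 / space) ** spoofed_responses

n = 100  # assumed forged packets squeezed in per race
print(f"TXID only (16 bits):     {race_success(n, 16):.4%}")
print(f"TXID + ~14 port bits:    {race_success(n, 30):.6%}")
```

The point of the patches is exactly this: they don't make the race unwinnable, they just multiply the search space so that each race takes far longer, while a well-coordinated botnet pushes in the opposite direction.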
What's the motive for a bad guy to change where a site points at the caching DNS server level?
- As always: follow the money!
- End users are notoriously good at clicking "next" without reading what's in the window they click away; they want it all to just work. They do the same for SSL certificate warnings, and hence expose themselves to man-in-the-middle attacks. So are the bad guys interested in diverting www.your-bank.com (or anything else they can scrape something usable or sellable off of) to their servers? For all customers of a given ISP at once?
- What about advertising? Instead of giving you a click-bot on a drive-by site, how about changing which ads are served to all customers of an ISP, or all users at a large company, at once? They could simply replace all advertising on a given network with ads they collect the revenue from, instead of the publishers of the web sites being visited.
Don't worry, those interested in doing this know this, I'm not giving them any new ideas.
We're not (yet) on yellow for this: we lack exploitation in the wild to justify that. Moreover, the trade press has been running this story to the point of boredom since last Black Tuesday.
What to patch?
- All recursive DNS servers
- DNS clients
- NAT devices and firewalls if they undo the UDP source port randomization
Only non-recursive DNS servers are exempt from this need.
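For the NAT point in particular: if you can capture the source ports of your resolver's outgoing queries on the outside of the NAT (with tcpdump or similar), even a handful of samples will show whether the device is rewriting them into a fixed or sequential range. The port lists below are made up for illustration:

```python
# A patched resolver behind a well-behaved NAT should show source ports
# spread widely across the ephemeral range, not fixed or incrementing.

def port_spread(ports: list[int]) -> dict:
    """Summarize a sample of observed UDP source ports."""
    deltas = [b - a for a, b in zip(ports, ports[1:])]
    return {
        "samples": len(ports),
        "distinct": len(set(ports)),
        "range": max(ports) - min(ports),
        "sequential_steps": sum(1 for d in deltas if d == 1),
    }

# Hypothetical samples: a NAT rewriting ports sequentially vs. real randomization.
bad = port_spread([53124, 53125, 53126, 53127, 53128])
good = port_spread([14031, 60233, 3049, 48712, 33190])
print("sequential NAT:", bad)
print("randomized:    ", good)
```

A narrow range or a run of +1 steps means the device in the middle is undoing the patch and needs reconfiguring or replacing; a wide, non-sequential spread is what you want to see.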
So is this bad? Yes, it is, unless your DNS clients, name servers, and the name servers you forward to are up to date on patches, and the NAT devices (routers, firewalls, ...) in between are confirmed not to undo the randomization of source ports.
The clock is ticking ...
Swa Frantzen -- Section 66