Last Updated: 2015-02-09 12:32:21 UTC
by Chris Mohan (Version: 1)
The mantra of "A working, tried and tested backup is something that has to be done" almost seems to be catching on after more than twenty years of repetition. The horror stories in the media of people and companies losing data may have helped reinforce the message and the need. As long as backups are happening, we’re one step closer to a reasonable, working recovery process. But there are businesses (and people) that have implemented a backup process without thinking beyond backing up the data.
Let me offer an example. I got one of those phone calls from a friend, saying a friend of theirs was having “some really bad internet virus trouble” and could I help out. Those amongst you who get these types of calls can immediately spot that my friend had no real idea what the problem was. A call to the poor soul with “some really bad internet virus trouble” revealed a reasonably frustrated, but still rational, person. She was quickly able to clearly describe what was happening to some of the Windows systems there, and it took seconds to work out they had been hit by CryptoWall-type malware. Australian email addresses have been plagued with emails using well-known Australian companies as lures to trick the unsuspecting user into running the encrypting malware. And that’s what had happened here.
My questions were: “Do you have backups of the now encrypted files?” and the answer came back they did. “Do you know how to restore from those backups and can you do that to a safe machine?”, again they did and found a machine to restore the data to.
A bit of time passed as the files were restored and checked. The happy outcome was that all the files and data were there and they’d only lost about a day’s worth of work. I could hear her relief once the files were recovered, but being a cheery security professional I asked one final question: “Do you have copies of the software on those three encrypted machines? You need to format them and start again, just to be safe.” That produced an “Oh. Er. Um, let me find out”. One of the encrypted machines was their accountant’s system, which was running software that would have looked old and out of date in Hackers. After a bit more advice I left them to it, as there wasn’t anything further I could do.
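The “restored and checked” step is worth automating if you can. As a minimal sketch (the directory layout and function names here are my own illustration, not anything this business actually ran), comparing SHA-256 checksums of the backup set against the restored copies catches files that are missing or were silently corrupted:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict:
    """Map each file's path (relative to root) to its checksum."""
    return {
        str(p.relative_to(root)): sha256_of(p)
        for p in root.rglob("*") if p.is_file()
    }

def verify_restore(backup_root: Path, restored_root: Path) -> list:
    """Return relative paths that are missing or differ after a restore."""
    expected = build_manifest(backup_root)
    actual = build_manifest(restored_root)
    return sorted(
        rel for rel, digest in expected.items()
        if actual.get(rel) != digest
    )
```

An empty list from `verify_restore` means every file in the backup made it across intact; anything else is a file to investigate before declaring the recovery done.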
The outcome was the owner buying three new computers, all preloaded with current versions of the required software. The ancient accounts data could be imported into new, supported accounting software, and the business lost roughly a day’s worth of work*. That’s a pretty impressive turnaround for a small company with no internal IT support.
It does, however, point out that backups are only part of a business continuity and disaster recovery plan. As security and IT folks, we can advise people and their businesses, and recommend they understand the entirety of business continuity and disaster recovery planning. We’ve discussed it a number of times in Diaries at the Internet Storm Center. For the owner I was talking with, I pointed her to a local resource here in Australia, but for those in the United States this is a great resource as a starting point. Find out if plans are in place, and if not, start these conversations now, rather than during or after an incident.
If you have any other suggestions or advice on getting business continuity and disaster recovery plans in place and kept up to date, please feel free to add a comment.
Chris Mohan --- Internet Storm Center Handler on Duty
CryptoWall 3.0 Sample: http://malware-traffic-analysis.net/2015/02/06/index2.html
* The accountant had to learn some new software which took time and effort not accounted for in the total downtime, as it wasn’t impacting the rest of the business.