New Tools and Links
ProDiscover
Chris Brown has updated ProDiscover to version 6.8. This may not interest a lot of folks, but if you haven't kept up with PD, you should consider taking a look.
If you go to the Resource Center, you'll find a couple of things. First off, there's a whitepaper that demonstrates how to use ProDiscover to access Volume Shadow Copies on live remote systems. There's also a webinar available that demonstrates this. Further down the page, ProDiscover Basic Edition (BE) v6.8 is available for download...BE now incorporates the Registry, EventLog, and Internet History viewers.
Chris also shared with me that PD v6.8 (not BE, of course) includes the following:
Added full support for Microsoft BitLocker-protected disks on Vista and Windows 7. This means that users can add any BitLocker-protected disk/image to a project and perform all investigative functions, provided that they have the BitLocker recovery key.
The image compare feature in the last update is very cool for getting the diffs on Volume Shadow Copies.
Added support for Linux Ext4 file system.
Added a Thumbs.db viewer.
These are just some of the capabilities he mentioned, and there are more updates to come in the future. Chris is really working hard to make ProDiscover a valuable resource.
MS Tool
Troy Larson reached out to me the other day to let me know that MS had released the beta of their Attack Surface Analyzer tool. I did some looking around with respect to this tool, and while there are a lot of 'retweets', there isn't much out there showing its use.
Okay, so here's what the tool does...you install the tool and run a baseline of the system. After you do something...install or update an app, for example...you rerun the tool. In both cases, .cab files are created, and you can then run a diff between the two of them. I see two immediate uses for something like this...first, analysts and forensic researchers can add this to their bag of tricks and see what happens on a system when an application is installed or updated, or when patches are applied. The second, which I don't really see happening, is that organizations could install this on their critical systems (after testing, of course) and create baselines of those systems, which could then be compared to another snapshot after an incident.
I'll admit, I haven't worked with this tool yet, so I don't know if it creates the .cab files in a specific location, if the user can specify the location, or even what's covered in the snapshot, but something like this might end up being very useful. Troy says that this tool has "great potential for artifact hunters", and I agree.
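Just to illustrate the general "snapshot, change something, snapshot again, diff" idea (and not the Attack Surface Analyzer's actual report format, which again, I haven't looked at yet), here's a rough Python sketch. It assumes each snapshot is a .cab file, unpacks them with the expand.exe utility that ships with Windows, and compares hashes of what's inside; the .cab file names are made up for illustration:

```python
# Rough sketch of the snapshot/diff idea, NOT of Attack Surface Analyzer's
# actual report format. Assumes each snapshot is a .cab file; uses the
# expand.exe utility that ships with Windows to unpack them, then compares
# hashes of the extracted files. The .cab file names below are made up.
import hashlib
import pathlib
import subprocess
import tempfile

def extract_cab(cab_path, dest_dir):
    """Unpack a .cab file into dest_dir using Windows' built-in expand.exe."""
    subprocess.run(["expand", "-F:*", cab_path, dest_dir], check=True)

def hash_tree(root):
    """Return {relative_path: md5} for every file under root."""
    root = pathlib.Path(root)
    return {
        str(p.relative_to(root)): hashlib.md5(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

def diff_snapshots(before_cab, after_cab):
    """Report files added, removed, or changed between two snapshots."""
    with tempfile.TemporaryDirectory() as d1, tempfile.TemporaryDirectory() as d2:
        extract_cab(before_cab, d1)
        extract_cab(after_cab, d2)
        before, after = hash_tree(d1), hash_tree(d2)
    added = sorted(set(after) - set(before))
    removed = sorted(set(before) - set(after))
    changed = sorted(f for f in set(before) & set(after) if before[f] != after[f])
    return added, removed, changed

if __name__ == "__main__":
    added, removed, changed = diff_snapshots("baseline.cab", "post_install.cab")
    for label, items in (("Added", added), ("Removed", removed), ("Changed", changed)):
        print(f"{label}: {len(items)}")
        for item in items:
            print(f"  {item}")
```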
CyberSpeak is back!
After a bit of an absence, Ovie is back with the CyberSpeak podcast, posting an interview with Mark Wade of the Harris Corporation. The two of them talked about an article that Mark had written for DFINews...the interview was apparently based on pt. 1 of the article, and now there's a pt. 2. Mark's got some great information based on his research into the application prefetch files generated by Windows systems.
During the interview, Mark mentioned being able to use time-based analysis of the application prefetch files to learn something about the user and their actions. Two thoughts on this...the first is that unless the programs that were run are in a specific user's profile directory (and in some cases, even if they are...), you're going to have to do more analysis to tie the prefetch files to when a user was logged in...application prefetch files are indirect artifacts generated by the OS, and are not directly tied to a specific user.
The second thought is...timeline analysis! All you would need to do to perform the analysis Mark referred to is generate a nano-timeline using only the metadata from the application prefetch files themselves. Of course, you could build on that, using the file system metadata for those files, and the contents of the UserAssist subkeys (and possibly the RecentDocs key) to build a more complete picture of the user's activities.
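As a quick illustration of what a nano-timeline like that might look like, here's a minimal Python sketch that simply sorts the .pf files in the Prefetch folder by their file system last-modified times (which generally track the last time the application was run); parsing the last-run timestamps embedded in the .pf files themselves would be the obvious next step:

```python
# Minimal sketch of a prefetch "nano-timeline": sort the .pf files by their
# file system last-modified times. The last-modified time of a .pf file
# generally tracks the last time the application was run; parsing the
# last-run timestamp embedded in each .pf file would be the next step.
import datetime
import pathlib

PREFETCH_DIR = pathlib.Path(r"C:\Windows\Prefetch")

def prefetch_timeline(prefetch_dir=PREFETCH_DIR):
    events = []
    for pf in prefetch_dir.glob("*.pf"):
        mtime = datetime.datetime.fromtimestamp(pf.stat().st_mtime)
        events.append((mtime, pf.name))
    return sorted(events)

if __name__ == "__main__":
    for when, name in prefetch_timeline():
        print(f"{when:%Y-%m-%d %H:%M:%S}  {name}")
```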
Gettin' Local
A recent article in the Washington Post stated that Virginia has seen a rise in CP cases. I caught this on the radio, and decided to see if I could find the article. The article attributes the increase to the growth of the Internet and P2P file sharing networks. I'm sure that along with this there has been an increase in the "I didn't do it" claims, more commonly referred to as the "Trojan Defense".
There's a great deal of analysis that can be done quickly and thoroughly to obviate the "Trojan Defense" before it's ever actually raised. Analysts can look to Windows Forensic Analysis, Windows Registry Forensics, and the upcoming Digital Forensics with Open Source Tools for guidance on how to address this situation. One example is to create a timeline...one that shows the user logging into the system and launching the P2P application, and then adds any available logs of file downloads or uploads, the launch of an image viewing application (and the associated MRU list...), etc.
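Conceptually, the timeline itself is nothing more than events from different sources normalized and sorted on time; the toy Python sketch below makes that point with hard-coded placeholder events standing in for the output of whatever parsers you'd actually use:

```python
# Toy illustration only: the events below are hard-coded placeholders standing
# in for output from whatever parsers you actually use (Event Logs, P2P client
# logs, Registry MRU keys, prefetch files, etc.). The point is simply that a
# timeline is events from different sources normalized and sorted on time.
import datetime

events = [
    (datetime.datetime(2011, 1, 10, 18, 2), "EventLog", "User logon"),
    (datetime.datetime(2011, 1, 10, 18, 5), "Prefetch", "P2P client executed"),
    (datetime.datetime(2011, 1, 10, 18, 21), "App log", "File download completed"),
    (datetime.datetime(2011, 1, 10, 18, 23), "Registry", "Image viewer MRU updated"),
]

for when, source, desc in sorted(events):
    print(f"{when:%Y-%m-%d %H:%M:%S}  [{source}]  {desc}")
```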
Another issue that needs to be addressed involves determining what the artifacts "look like" when a user connects a smart phone to a laptop in order to copy or move image or video files (or uploads them directly from the phone), and then shares them via a P2P network.
Free Stuff
Ken Pryor has posted his second article about doing "Digital Forensics on a (less than) shoestring budget" to the SANS Forensic blog. Ken's first post addressed training options, and his second post presents some of the tools described in the upcoming Digital Forensics with Open Source Tools book.
What I like about these posts is that by going the free, open source, and/or low-cost route for tools, we start getting analysts to understand that analysis is not about tools, it's about the process. I think that this is critically important, and it doesn't take much to understand why...just look around at all of the predictions for 2011, and see what they're saying about cybercrime continuing to become more sophisticated.