My favorite tool for parsing Windows link files was created by Jake Cunningham, who posts the tools he creates at the "JAFAT: Archive of Forensic Analysis Tools" website. Jake has created some well-known Safari browser tools, but the tool that has caught my eye is "lnk-parse-1.0.pl". The lnk-parse tool is a Perl script that provides the most detail I have ever seen in a link file parser. Here is some sample output:
Link File: /WINDOWS/system32/config/systemprofile/Start Menu/Programs/Windows Media Player.lnk
Link Flags: HAS SHELLIDLIST | POINTS TO FILE/DIR | HAS DESCRIPTION | HAS RELATIVE PATH STRING | NO WORKING DIRECTORY | HAS CMD LINE ARGS | NO CUSTOM ICON |
File Attributes: ARCHIVE
Create Time: Wed Jan 30 2008 05:36:33
Last Accessed time: Wed Jan 30 2008 05:39:37
Last Modified Time: Mon Jul 07 2003 05:00:00
Target Length: 520192
Icon Index: 0
ShowWnd: 1 SW_NORMAL
HotKey: 0
Target is on local volume
Volume Type: Fixed (Hard Disk)
Volume Serial: b0a65d8e
Vol Label:
Base Path: C:\Program Files\Windows Media Player\wmplayer.exe (App Path:)
Remaining Path:
Description: @%SystemRoot%\inf\unregmp2.exe,-155
Relative Path: ..\..\..\..\Program Files\Windows Media Player\wmplayer.exe
Command Line: /prefetch:1
Lnk-parse takes a single link file as an argument. But anyone who knows Windows knows there are usually hundreds of link files on a system. So, how does one quickly find all the link files in a Windows operating system using Linux tools? I'm going to avoid all the caveats, such as deleted files, renamed files, etc., and make the following assumption: the examiner is interested in allocated link files that have not been obfuscated by renaming or other anti-forensic techniques.
The fastest way to find the link files is to mount the Windows file system and search for link files by name. The search tool to use is the 'find' command. I'll discuss mounting in another post (I recently wrote a tool to read disk/forensic image partition tables and automatically mount all allocated file systems), but here I want to show how a simple while loop can make using commands like lnk-parse on multiple files easy.
The following command assumes it is being executed from the root of the Windows file system. Strictly speaking, "Windows file system" is a misnomer; I mean a file system on which the Windows operating system is installed.
$ find . -iname "*.lnk" | while read line; do lnk-parse-1.0.pl "$line"; done
I'll break that down by moving the major parts to their own lines:
find . -iname "*.lnk" | \
while read line
do
    lnk-parse-1.0.pl "$line"
done
Line by line:
- find, recursively from the current dir, all files ending with .lnk, regardless of case, and pipe the output to the while loop
- "while" there is output from the find command, read each line of output and assign the output (a file name and path) to the variable "line"
- a required argument in a while loop, essentially: "do" the following to $line
- process $line with lnk-parse-1.0.pl
- Finished with this loop cycle, return to while and get another line from "find", if any.
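The loop above works for typical Windows paths (the quotes around $line handle embedded spaces), but a bare read can still trim leading whitespace or mangle backslashes in unusual file names. Below is a hardened sketch using find's -print0 with read -d ''; the sandbox directory and sample file names are invented purely for illustration, and you would substitute lnk-parse-1.0.pl for the echo:

```shell
# Throwaway sandbox with a couple of sample .lnk files (illustration only)
tmp=$(mktemp -d)
mkdir -p "$tmp/Documents and Settings/user"
touch "$tmp/Documents and Settings/user/My Report.LNK" "$tmp/notes.lnk"

# -print0 emits NUL-terminated paths; IFS= and read -r -d '' consume them
# without trimming whitespace or interpreting backslashes
out=$(find "$tmp" -iname "*.lnk" -print0 | while IFS= read -r -d '' line
do
    echo "would parse: $line"    # substitute: lnk-parse-1.0.pl "$line"
done)
echo "$out"

rm -rf "$tmp"
```

Note that -iname still matches "My Report.LNK" because the comparison is case-insensitive.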
In 1.23 seconds, I processed all 103 link files on a small Windows XP system. I could redirect the output to a text file for a report, if necessary.
$ find . -iname "*.lnk" | while read line; do lnk-parse-1.0.pl "$line"; done > some_path/link_files.txt
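Before trusting the report, it's worth sanity-checking how many link files the loop will process, since find alone gives you the count. A small sketch, again with invented sample files standing in for a real mounted file system:

```shell
# Invented sandbox; against a real mount you'd run the find from the root
tmp=$(mktemp -d)
touch "$tmp/a.lnk" "$tmp/b.LNK" "$tmp/c.txt"

# Count matches; -iname catches both .lnk and .LNK, c.txt is ignored
n=$(find "$tmp" -iname "*.lnk" | wc -l)
echo "$n link files"

rm -rf "$tmp"
```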
I use while loops like this all the time. One gotcha to be aware of is that variables assigned within the while loop don't survive outside of it. I'll explain that sometime in the future, and provide some strategies on how to handle it. But for now, if you mimic this usage, you won't run into that issue, you'll save yourself a lot of work, and you'll increase your productivity tremendously.
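To see the gotcha in action, here is a minimal demonstration, assuming bash (which runs each segment of a pipeline, including the while loop, in a subshell):

```shell
count=0
printf 'a.lnk\nb.lnk\n' | while read -r line
do
    count=$((count + 1))       # increments inside the subshell only
done
echo "after loop: count=$count"    # still 0 -- the loop's copy is gone
```

Inside the loop the counter climbs to 2, but the parent shell never sees it, because the assignment happened in a child process.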