Triaging My Way

Tuesday, May 17, 2011 Posted by Corey Harrell 7 comments
You’re onsite performing a collection, or a computer is handed to you in your lab, and you don’t have a lot of time to answer a few initial questions. How would you quickly determine someone’s activity on the computer? What pictures were viewed, programs run, files accessed, or removable devices used? Quickly assessing the computer will not only provide information to answer these questions but will also reveal relevant data storage locations for an investigation. This post describes a process – including tool usage – where a few initial questions can be answered in a matter of minutes by examining the user activity on a computer.

Approaching the Triage

I know I promised to write about how to answer questions in less than two minutes by examining user activity on a computer. Before I dive into the land of ones and zeros I wanted to take a step back to talk about the thought process of how to approach the triage. Goals need to be established and a plan needs to be developed on how those goals can be obtained. To accomplish this, the first three questions of the Alexiou Principle can be used:

* What question are you trying to answer?
* What data do you need to answer the question?
* How do you extract that data?

The Alexiou Principle is very versatile since it can be used to create analysis design plans, assist with DFIR training (as I described in the Forensicator Readiness post), and guide the thought process for triaging. The first Alexiou Principle question – what question are you trying to answer? – is pretty self explanatory. As it relates to triage, what are the initial questions to determine if “something” is relevant to the digital forensic examination? The initial questions will vary based on the type of DFIR case and customer needs, but the opening paragraph provided a few example questions.

The second Alexiou Principle question is what data do you need to answer the question? The data not only includes data sources such as computers, servers, and people but also what information in those data sources can help answer the questions. Take for example the question of whether someone was accessing the jIIr blog when the only data source available is the person’s computer. What data on the computer can help determine if the person visited my blog? A few areas to check could be the installed software (which web browsers are installed), web browser artifacts (history, cookies, or favorites/bookmarks), and maybe the TypedURLs key in the user account’s NTUSER.DAT hive.

The third Alexiou Principle question is how do you extract that data? The tools selected to perform the triage need to be able to extract the data required to answer the initial questions. This means the selection of tools should not occur before the data is identified, since that may force people to work within the confines of the tools. Instead, the selection of tools should be one of the last things completed, since the tools to use will depend on the goal(s) of the assessment. I wanted to mention this point because I’ve seen numerous times where discussions start with “should I use this tool” when they should start with “this is what I’m trying/need to accomplish.”

Triaging User Activity in Under Two Minutes

Now that I’ve explained the thought process of how to approach the assessment of user activity on a computer, I’ll walk through an issue I had at one point. I work in corporate environments and, as expected, the majority of the networks are running Windows domains. A Windows domain can have a significant impact on a digital forensic examination because the computer being collected may not contain all of the data relevant to an investigation. The IT department may have assigned home folders to employees using company computers to make it easier to back up people’s files. If an organization is using home folders then most likely it is encouraging users to store all of their data in their home folders instead of the computers’ My Documents folders. In addition to home folders, the person may be accessing and storing data in network shares. One of the initial questions I need to answer in this type of environment is: what data sources – besides the person’s computer – do I need to collect?

Two options to determine what data sources – besides the person’s computer – need to be collected are speaking with the IT department or quickly triaging the computer. IT departments are not known for their great documentation skills, so the better option is to triage the computer to find the answer. What data is needed to determine the network shares a person accessed? The data to answer the question could be located in the person’s activity on the computer. Three locations containing user activity are the registry, the registry files in system restore points/volume shadow copies, and the link files in the user account’s profile. For additional references on the evidentiary value of the registry check out the book Windows Registry Forensics, while the Digital Forensic Search custom Google query for link files can be used to learn more about them.

How do you extract data stored in registry and link files? I took into account the following for my tool selection: the tools had to extract the data, had to be fast, had to be command-line tools (you’ll see why), and my preference was for tools already in my toolbox. Harlan Carvey’s Regripper was chosen to parse the registry files, Harlan’s RipXP (included in the Regripper download) was chosen to parse the registry files in system restore points, a modified version of Harlan’s lslnk.pl script was chosen to parse the link files, and FTK Imager was chosen to mount the hard drive/image.

The example I’m using is to determine the network shares accessed but keep in mind this will work for other types of user activity since the registry and link files store the information. I promised to write about how to answer a few initial questions in less than two minutes by assessing the user activity on the computer. Here goes ………..

Mount the Image/Hard Drive

The situation will dictate whether the computer’s hard drive will be examined using a write blocker or whether a forensic image of the computer’s hard drive will be examined. The triage works the same regardless of whether the hard drive or the image is being examined. When quickly triaging a system my preference is to parse the data of interest where it’s located instead of copying the data to my forensic computer. The image/hard drive has to be mounted on the forensic computer in order for the registry and link files to be parsed in their storage locations, and this can be accomplished using FTK Imager version 3.0 with the “File System / Read Only” mount option (which allows access to system restore points). F:\[root]\ is the path to the root of the volume I’m examining and the commands below reflect that path.

User Activity Stored in Registry

A user account’s NTUSER.DAT registry hive stores configuration information and the user account’s activity on the system. A quick way to identify registry keys of interest (in addition to the Windows Registry Forensics book) is to reference registry checklists such as AccessData’s Registry Quick Find Chart or to review Regripper’s plugin files. The three registry keys of interest are: Map Network Drive MRU, since it lists the recently mapped network drives; MountPoints2, since it lists devices accessed; and RunMRU, since one way to access network shares is through the Run dialog box. Regripper has plugins to parse all three registry keys, and the commands for running the command-line version of Regripper, with the output redirected to a text file, are below.

rip.exe -r "F:\[root]\Documents and Settings\Administrator\NTUSER.DAT" -p mndmru >> rip-drives.txt

rip.exe -r "F:\[root]\Documents and Settings\Administrator\NTUSER.DAT" -p mp2 >> rip-drives.txt

rip.exe -r "F:\[root]\Documents and Settings\Administrator\NTUSER.DAT" -p runmru >> rip-drives.txt

The first command uses -p mndmru to specify the plugin for the Map Network Drive MRU registry key. The output of this command identifies four network shares that were mapped to the user account of interest as shown below.

Map Network Drive MRU
Software\Microsoft\Windows\CurrentVersion\Explorer\Map Network Drive MRU
LastWrite Time Sat May 14 18:13:18 2011 (UTC)
MRUList = dcba
c \\192.168.1.80\Map Drive 2
a \\192.168.1.80\Map Drive D
b \\192.168.1.80\Map Drive 1
d \\192.168.1.80\Map Drive 3
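The MRUList value determines the order: each letter names a value in the key, and the leftmost letter is the most recently used. Decoding the ordering can be sketched in a few lines of Python (the helper name is my own, not part of Regripper):

```python
def order_mru(entries, mru_list):
    """Order MRU entries most-recent-first using the MRUList string.

    entries maps single-letter value names to their data; mru_list is
    the MRUList string (e.g. "dcba"), leftmost letter = most recent.
    """
    return [(letter, entries[letter]) for letter in mru_list if letter in entries]

# Values taken from the Regripper output above
entries = {
    "a": r"\\192.168.1.80\Map Drive D",
    "b": r"\\192.168.1.80\Map Drive 1",
    "c": r"\\192.168.1.80\Map Drive 2",
    "d": r"\\192.168.1.80\Map Drive 3",
}
for letter, share in order_mru(entries, "dcba"):
    print(letter, share)
```

So entry d (\\192.168.1.80\Map Drive 3) was the most recently mapped share.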

The second command uses -p mp2 to specify the plugin for the MountPoints2 registry key. The output of this command also identifies the same four network shares. A portion of the output is shown below.

MountPoints2
Software\Microsoft\Windows\CurrentVersion\Explorer\MountPoints2
LastWrite Time Sat May 14 18:19:16 2011 (UTC)

Remote Drives:
Sat May 14 18:13:18 2011 (UTC)
##192.168.1.80#Map Drive 3
Sat May 14 18:04:37 2011 (UTC)
##192.168.1.80#Map Drive 2
Sat May 14 17:59:27 2011 (UTC)
##192.168.1.80#Map Drive 1
Sat May 14 17:56:10 2011 (UTC)
##192.168.1.80#Map Drive D
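The ## notation is simply the UNC path with each backslash replaced by a # character, so the subkey names can be converted back to share paths mechanically. A minimal sketch:

```python
def mountpoints2_to_unc(key_name):
    r"""Convert a MountPoints2 subkey name back to a UNC path.

    MountPoints2 replaces each backslash in a UNC path with '#',
    so ##server#share becomes \\server\share.
    """
    return key_name.replace("#", "\\")

print(mountpoints2_to_unc("##192.168.1.80#Map Drive 3"))
```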

The third and last command uses -p runmru to specify the plugin for the RunMRU registry key. The output of this command shows the same device with the four network shares (192.168.1.80) as well as a new device (192.168.2.50). The output doesn’t show any network shares being accessed, but it does show attempts were made to access the devices in a way that would display their network shares (\\IP-Address). The runmru output is shown below.

RunMru
Software\Microsoft\Windows\CurrentVersion\Explorer\RunMRU
LastWrite Time Sat May 14 18:22:00 2011 (UTC)
MRUList = bca
a cmd\1
b \\192.168.2.50\1
c \\192.168.1.80\1

User Activity Stored in System Restore Points (or Volume Shadow Copies) Registry Files

Windows system restore uses restore points to return system files and settings to an earlier point in time on computers running Windows 2000 or XP; Windows Vista and 7 use volume shadow copies instead. Registry files covering different points in time are located in the restore points and volume shadow copies, and these files may contain additional information about user account activity. To parse the registry hives in volume shadow copies, a simple batch file using the command-line version of Regripper can be used. RipXP.exe can be used to parse a key in both the current registry hive and the registry hives in the restore points (since RipXP.exe parses the current registry hive, the rip.exe commands can be skipped when using RipXP.exe in this triage method). RipXP.exe requires three switches: -r specifies the current registry hive, -d specifies the restore point directory, and -p specifies the plugin to use. The commands to parse the three registry keys of interest, with the output redirected to text files, are below.

ripxp.exe -r "F:\[root]\Documents and Settings\Administrator\NTUSER.DAT" -d "F:\[root]\System Volume Information\_restore{3F806DB1-464B-46B0-B724-4376EC868222}" -p mndmru >> rip-rp-mndmru.txt

ripxp.exe -r "F:\[root]\Documents and Settings\Administrator\NTUSER.DAT" -d "F:\[root]\System Volume Information\_restore{3F806DB1-464B-46B0-B724-4376EC868222}" -p runmru >> rip-rp-runmru.txt

ripxp.exe -r "F:\[root]\Documents and Settings\Administrator\NTUSER.DAT" -d "F:\[root]\System Volume Information\_restore{3F806DB1-464B-46B0-B724-4376EC868222}" -p mp2 >> rip-rp-mp2.txt

The output of all three commands will be similar to the Regripper output shown before, with the exception that the registry keys from all of the restore points are also displayed. I'm only showing a portion of the Map Network Drive MRU registry key output since it demonstrates how the output will look. As can be seen below, the RipXP.exe output first displays the registry data from the current hive, followed by the restore point registry data. The current Map Network Drive MRU key has four mapped network drives while the restore point shown only has three.

RipXP v.20090818
Launched Sat May 14 19:46:12 2011 Z

F:\[root]\Documents and Settings\Administrator\NTUSER.DAT
Map Network Drive MRU
Software\Microsoft\Windows\CurrentVersion\Explorer\Map Network Drive MRU
LastWrite Time Sat May 14 18:13:18 2011 (UTC)
MRUList = dcba
c \\192.168.1.80\Map Drive 2
a \\192.168.1.80\Map Drive D
b \\192.168.1.80\Map Drive 1
d \\192.168.1.80\Map Drive 3
----------------------------------------
Restore Point Info
Description : access share files
Type : System CheckPoint
Creation Time : Sat May 14 18:08:38 2011

F:\[root]\System Volume Information\_restore{3F806DB1-464B-46B0-B724-4376EC868222}\RP10\snapshot\_REGISTRY_USER_NTUSER_S-1-5-21-1214440339-1708537768-725345543-500

Map Network Drive MRU
Software\Microsoft\Windows\CurrentVersion\Explorer\Map Network Drive MRU
LastWrite Time Sat May 14 18:04:37 2011 (UTC)
MRUList = cba
c \\192.168.1.80\Map Drive 2
a \\192.168.1.80\Map Drive D
b \\192.168.1.80\Map Drive 1
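Comparing the current hive against a restore point copy makes the change jump out: the entry present now but missing from the snapshot is the share mapped after the restore point was created. A small sketch using the shares from the output above (set arithmetic only; parsing the RipXP output into these sets is left out):

```python
def new_since_restore_point(current, restore_point):
    """Entries in the current hive that are absent from the restore point copy."""
    return sorted(set(current) - set(restore_point))

current = {
    r"\\192.168.1.80\Map Drive D",
    r"\\192.168.1.80\Map Drive 1",
    r"\\192.168.1.80\Map Drive 2",
    r"\\192.168.1.80\Map Drive 3",
}
restore_point = {
    r"\\192.168.1.80\Map Drive D",
    r"\\192.168.1.80\Map Drive 1",
    r"\\192.168.1.80\Map Drive 2",
}
for share in new_since_restore_point(current, restore_point):
    print(share)
```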

User Activity Stored in Link Files

The registry provides a wealth of information, but link files are another location containing information about a user account’s activity on a computer. A link file is created when a person accesses a file on their computer's hard drive, removable media, or a network share. The link file contains information about the file, including its storage location, which will show someone accessing network shares. A few locations containing link files are:

Windows XP: C:\Documents and Settings\username\Recent and C:\Documents and Settings\username\Application Data\Microsoft\Office\Recent

Windows Vista and 7: C:\Users\username\AppData\Roaming\Microsoft\Windows\Recent and C:\Users\username\AppData\Roaming\Microsoft\Office\Recent
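A small helper can sweep those folders for shortcut files before handing them to a parser. A hedged Python sketch (the function names are mine, just for illustration):

```python
from pathlib import Path

def is_link_file(name):
    """True for Windows shortcut (.lnk) files, matched case-insensitively."""
    return name.lower().endswith(".lnk")

def collect_link_files(recent_dir):
    """Gather every .lnk file beneath a Recent folder, such as the paths above."""
    return sorted(p for p in Path(recent_dir).rglob("*") if is_link_file(p.name))
```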

One of my requirements for a tool to parse link files was that it had to be a command-line tool. My reasoning is that command-line tools can be used in scripts, but more importantly they can be used in batch files to parse artifacts in volume shadow copies. I couldn't find a tool to meet my needs. Harlan provided the lslnk.pl script in WFA 2nd edition and it displays the information contained in link files. However, lslnk.pl only works against individual files when I needed a tool to parse an entire directory of link files. I'm not a programmer and I don't know Perl, but I can use search engines, so I decided to try to modify lslnk.pl. The first modification I made was to enable lslnk.pl to parse all of the link files in a directory with the output in report format. The modified script - lslnk-directory-parse.pl - worked fine but the output wasn't the best for filtering data. I needed the output to display all of the information from a link file on one line so that when I perform a search I can see all information for a specific link file. I made another change so the output contains one link file per line in comma-delimited format, which resulted in the lslnk-directory-parse2.pl script. Both modified scripts can be found in the Yahoo Win4n6 group's tools folder. The only parameter required by the script is the directory to parse, as shown below.

C:\Perl>lslnk-directory-parse2.pl "F:\[root]\Documents and Settings\Administrator\Recent" > lsnk-parse2-output.txt

The output is a comma-delimited text file, which means it can be opened in Excel/Calc (to see how to import a text file in Excel check out my posts Reviewing Timelines with Excel or Reviewing Timelines in Calc). Opening the text file at this point will show the information from all of the link files instead of the information specific to the question at hand, which is what network shares an account accessed. The text file can be searched before being reviewed in Excel/Calc, and I use grep (available in UnxUtils) to do this. There are numerous characteristics to search on such as filenames, directory paths, file extensions, removable media, and network shares. That's right, link files indicate if a person accessed a file on a network share. The lslnk-directory-parse2.pl output can be piped to grep for searching before the text file is created. The command below shows the lslnk-directory-parse2.pl output being searched for the phrase "network share" (the --binary-files=text switch forces grep to treat the output as text).

C:\Perl>lslnk-directory-parse2.pl "F:\[root]\Documents and Settings\Administrator\Recent" | grep.exe -i --binary-files=text "network share" > lsnk-parse2-output.txt

The output file is still a comma-delimited file, but the only link files present will be the ones with the phrase "network share" in their information. Reviewing a portion of the output not only shows files accessed on the network shares identified previously with Regripper and RipXP, but it also identifies files accessed on a new share (\\192.168.2.50\Share) as shown below.
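If grep isn't available, the same filtering can be done with a few lines of Python over the script's comma-delimited output. The sample rows below are made up to show the idea; the real column layout comes from lslnk-directory-parse2.pl:

```python
def filter_rows(rows, phrase="network share"):
    """Keep only the rows mentioning the phrase, case-insensitively (like grep -i)."""
    return [row for row in rows if phrase.lower() in row.lower()]

# Hypothetical rows in the one-link-file-per-line format
rows = [
    r"budget.xls,\\192.168.1.80\Map Drive 1\budget.xls,network share",
    r"notes.txt,C:\Documents and Settings\Administrator\notes.txt,fixed disk",
]
for row in filter_rows(rows):
    print(row)
```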

Summary

The triage extracted data from registry and link files to answer the question of what network shares a person accessed. There was some redundancy in the extracted data since both locations showed the same user account activity (the same network share access). However, at times one location may reveal information that is not present in the other. I promised to write about how to answer a few initial questions in less than two minutes by examining the activity on the computer. It took me less than one minute to run Regripper, RipXP, and lslnk-directory-parse2.pl, and to examine the output from the three tools. Within this one minute I was able to determine the network shares a user account had readily available (mapped drives) and the network shares accessed. The process can be made even quicker by creating a batch file with all of the commands since a batch file can just be executed. I answered a question about network share access, but the process I described should not be limited to only this activity. Different registry keys in combination with information contained in link files can be used to quickly determine someone’s activity on a computer.
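As a sketch of that batch-file idea, the command lines can be generated from one list of plugins; the hive path and plugin names are taken from the commands earlier in the post, and actually running them is left to the reader since rip.exe lives outside Python:

```python
def build_rip_commands(hive, plugins, output="rip-drives.txt"):
    """Build the rip.exe command lines for a list of plugins against one hive."""
    return [f'rip.exe -r "{hive}" -p {plugin} >> {output}' for plugin in plugins]

hive = r"F:\[root]\Documents and Settings\Administrator\NTUSER.DAT"
for cmd in build_rip_commands(hive, ["mndmru", "mp2", "runmru"]):
    print(cmd)
```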

Coming To A System Near You

Wednesday, May 11, 2011 Posted by Corey Harrell 0 comments
According to Websense, there is a new trend where cyber criminals are spreading malware by taking advantage of Google Image search rankings. The attack involves poisoned pictures being displayed in Google’s image search results which, when clicked, redirect a user to a malicious site. As I was in the middle of putting together this write-up, the Unmask Parasites blog published a great post, Thousands of Hacked Sites Seriously Poison Google Image Search Results, on how the websites involved in this attack were compromised and how Google image search is poisoned, while Brian Krebs wrote his own article on the subject, Scammers Swap Google Images for Malware.

Those write-ups provided good information about the Google image search poisoning technique, but I was approaching the topic from a different angle: the perspective of the digital forensic practitioner who investigates this attack on a computer (client side). Google image search is being leveraged to spread malware, but one important question I haven’t seen addressed is what are the potential artifacts that indicate the malware came from a Google image search. Along the same line of thinking, how are the artifacts of this delivery mechanism different from those of a Google web search, a SPAM email, or a network share? These questions will be discussed in detail, hopefully before the Google image search attack comes to a system near you.

Simulation Setup

I tried to simulate how a user would perform Google searches on a selected topic. The topic I selected for my searches was the news of the day on 05/02/11 since the media coverage was everywhere, which made it seem like a candidate for cyber criminals to leverage for spreading malware. I performed Google web and image searches using different word combinations until I had my first sign of infection: a warning message saying my unpatched Windows XP SP3 system was infected. I pretended to be a “normal” user and tried to get rid of the warning by clicking cancel, but within a short period of time the computer was held hostage by a fake antivirus program.

The Search Hit Culprit

I usually write my posts the way I conducted the examination: the malware is located, then I work backwards in time examining the system activity to identify the initial infection vector. I’m taking a slightly different approach for this write-up by first explaining what the user saw, followed by what the digital forensic practitioner would see during an examination. The potential artifacts of a Google image search being used to deliver a payload are shown through the DF perspective.

***** Heads Up: some of the URLs and domains mentioned in this write-up were malicious at one point in time so caution should be used if anyone tries to access them for their own research. All URLs were sanitized (or purposely only shown in images) to prevent anyone from accidentally accessing the URLs. *****

User Perspective 1

Starting at 09:41:38 PM on 05/02/11, Google web and image searches were performed looking for sites and images about the news of the day. After about 20 minutes I performed the Google image search shown in the picture below. The highlighted image in the first row of search results is the image I accessed which led to my system being infected.

DF Perspective 1

The above picture shows what a user sees when performing a Google image search. Different tools/techniques can be used to see what the search looks like on a system post mortem. The picture below shows the part of the timeline where the Google image search occurred and the images in the timeline were downloaded because of the search.

User Perspective 2

Clicking on the image highlighted in red resulted in the Internet Explorer window disappearing and being replaced by the warning message below.

It wasn’t too long until an Internet Explorer window appeared which was pointing to the malicious mlrglrqj.co.cc domain as illustrated below.

DF Perspective 2

At this point a Google image search resulted in the Internet Explorer browser being redirected to the mlrglrqj.co.cc domain where a fake online scanner was located. To see how this occurred forensically, the activity of the Google image being accessed needs to be examined. The portion of the timeline below shows the Google image URL that was accessed and this resulted in the image (line 151879) and a webpage (line 151880) being downloaded to the system. The timeline also shows a webpage, mlrglrqj.co[2].htm, being downloaded six seconds after the image was accessed (line 151881).

The URL in the above picture shows that when the Google image was accessed it brought the user to hxxp://pimpit.com/pr-Osama-Binladen-Dead.html (the imgrefurl variable contained the URL) and the webpage was using an image located at hxxp://theblackboxoffice.com/wp-content/uploads/2010/08/binladen_dead_alive.jpg (the imgurl variable contained the URL). Besides the image of interest, the only other file downloaded to the system before the browser redirect was an htm file named CA16L2DT.htm (this file was uploaded to jsunpack and can be viewed here). I examined CA16L2DT.htm to see if I could find in the file what caused the browser redirect. There was a reference to the t3.gstatic.com domain so I decided to look into the domain a little closer. The first Google search hit for the domain was a thread in a CNET forum titled “Phishing on Google Image Search - t3.gstatic.com/images” from July 2010. A person in the thread mentioned how Kaspersky antivirus was blocking the t3.gstatic.com domain due to it being a phishing attack. I did a search for the domain using the Malware Analysis Search which found malware samples associated with URLs that looked similar to the URL I found in the CA16L2DT.htm file (two of the malware sample reports can be found here and here). I wasn’t able to confirm what caused the browser redirect but I was able to determine the pimpit.com domain was involved with the redirect and a suspicious URL was present on pimpit.com’s webpage.
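The imgrefurl and imgurl variables are ordinary query-string parameters, so they can be pulled out of a Google image result URL with standard parsing. A sketch using a harmless made-up URL (not the actual malicious one):

```python
from urllib.parse import urlparse, parse_qs

def image_search_refs(url):
    """Extract the imgrefurl and imgurl query variables from a Google
    image result URL; returns None for any variable that is absent."""
    params = parse_qs(urlparse(url).query)
    return params.get("imgrefurl", [None])[0], params.get("imgurl", [None])[0]

url = ("http://www.google.com/imgres?"
       "imgrefurl=http://example.com/page.html&"
       "imgurl=http://example.com/photo.jpg")
print(image_search_refs(url))
```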

User Perspective 3

A “Windows Security Alert” appeared on the fake online scanner as shown below.

Shortly after the “Windows Security Alert” a program named XP Home Security appeared on the system. XP Home Security was the program holding the test system hostage.

DF Perspective 3

User Perspective 3 showed the payload of the attack wasn’t the fake online scanner but the XP Home Security program, which was successfully installed on the system. Continuing with the examination of the timeline, the activity on the system indicates the fake online scanner was still open, as can be seen in the timeline below.

After the fake online scanner activity there was an Internet Explorer history entry for the following URL hxxp://mlrglrqj.co.cc/file/sc1/SecurityScanner.exe. Immediately after this URL was accessed there were a few registry modifications and the creation of a prefetch file indicating the SecurityScanner.exe program was executed. The picture below shows this activity in the timeline.

The timeline showed no indications of a software exploit (vulnerable programs executing, new files appearing on the system, etc.), so it doesn’t appear an exploit was responsible for installing the malware. However, the administrator user account was responsible for the suspicious Internet activity, so the account's recent activity was examined to shed light on how the malware was installed. I used Regripper to examine the user activity stored in the registry by parsing the administrator user account’s NTUSER.DAT registry hive. The MUICache registry key entry in the Regripper report shows the administrator user account executed the SecurityScanner.exe and ieh.exe programs. The MUICache data for these programs is shown below:

Software\Microsoft\Windows\ShellNoRoam\MUICache
LastWrite Time Tue May 3 02:09:07 2011 (UTC)
     C:\Documents and Settings\Administrator\Local Settings\Temporary Internet Files\Content.IE5\4967GLU3\SecurityScanner[1].exe (SecurityScanner[1])
     C:\Documents and Settings\Administrator\Local Settings\Application Data\ieh.exe (ieh)

The lack of exploit artifacts and the MUICache registry key data indicate the administrator user account installed the malware which was exactly what happened. Further examination of the system identified the ieh.exe file as the program holding the system hostage and a VirusTotal scan of the file had a detection rate around 30%.

Potential Google Image Search Delivery Artifacts

At this point the user and digital forensics perspectives showed malware being installed on a system because of a Google image search. The purpose of this post was to identify the potential artifacts of a Google image search being used to deliver malware, which is why I stopped writing about the DF perspective once the malware was installed on the system. The portions of the timeline in my write-up showed a lot of deleted files, which helped explain how this attack happened. Most likely the deleted files will be overwritten since a system won’t normally be preserved within 30 seconds of being infected. However, the potential artifacts of a Google image search being used as the delivery mechanism may still be present on a system in the Internet browsing history. If the browser history artifacts occur around the time when malware first executes on a system (prefetch files, registry modifications, etc.), then this may indicate a Google image search was used as the delivery method. For example, the malware executed on my test system around 10:03 PM on 05/02/11 and the Internet browsing history around this time showed a Google image being accessed followed by my Internet browser visiting a malicious domain. My browser history showing the Google image search is below.
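That correlation can be sketched as a simple time-window check: keep only the history entries within a short window before the malware's first execution. The timestamps below are made up to mirror the example, and the window size is an arbitrary assumption:

```python
from datetime import datetime, timedelta

def history_near(history, exec_time, window_seconds=60):
    """Return the (timestamp, url) history entries that fall within
    window_seconds before the malware's first-execution timestamp."""
    window = timedelta(seconds=window_seconds)
    return [(t, u) for t, u in history if exec_time - window <= t <= exec_time]

exec_time = datetime(2011, 5, 2, 22, 3, 0)   # malware first ran ~10:03 PM
history = [
    (datetime(2011, 5, 2, 21, 41, 38), "http://www.google.com/search?q=news"),
    (datetime(2011, 5, 2, 22, 2, 40), "http://www.google.com/imgres?..."),
    (datetime(2011, 5, 2, 22, 2, 46), "hxxp://mlrglrqj.co.cc/"),
]
for t, url in history_near(history, exec_time):
    print(t, url)
```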

All Things Encase

Thursday, May 5, 2011 Posted by Corey Harrell 0 comments
I use a range of tools to perform digital forensics and these tools fall into different categories such as free, open source, and commercial tools. Some readers of this blog may have picked up on the fact that Encase is one of the commercial tools in my toolbox. I thought I would share some of the interesting links I came across over the past month about Encase.

Forensic Analysis Techniques Using Encase

Lance Mueller put together a couple of posts about computer forensic analysis techniques using Encase. First up is the post Basic Computer Forensic Analysis Techniques in Encase, which outlines the techniques commonly used in cases and techniques specific to certain types of cases. His second post, General Forensics (using EnCase Enterprise) Flow chart, provides some ideas on the different ways to use Encase Enterprise in support of investigations, incident response, and e-discovery.

Lance mentioned that both posts are not meant to be all-inclusive lists but are to be used as starting points. He also said in one of the posts that the type of investigation will impact the techniques to use. I couldn’t agree more. To help determine what techniques to use, a person should take a step back before an image is loaded into Encase or a servlet is pushed across the network. Taking a step back provides time to think about the goals of the forensic examination, the questions that need to be answered, and what data is needed to answer those questions. This quick reflection (or better yet, an analysis design plan) will not only help determine what techniques/activities are needed to extract the data of interest but can also help keep the examination focused on what the customer wants or needs.

A New Option for Creating Timelines

Kristinn Gudjonsson released version 0.52 of log2timeline in April. I was checking out the change log to see what was new and one of the changes is the ENCASE_DIRLISTING input module. According to the change log, this new module imports a text file exported by Encase which contains the file listing of an image. It’s good to see more options for creating timelines. Now we have the Sleuthkit, Sleuthkit with Harlan’s timeline tools, Sleuthkit with log2timeline, FTK file listing, FTK file listing with log2timeline, Encase enscript, Encase file listing, and now the Encase file listing with log2timeline. Having options lets me test the different ways to create timelines and choose the method that best meets my needs. An additional thought that came to me as I was typing the various options was to do a write up on the different ways to create timelines. One more idea added to my blog hopper.

Encase version 7

Just in case anyone missed the announcements from Guidance Software’s advertising machine, Encase version 7 is on the horizon. If you’re interested in some of the new features or changes, check out Lee Whitfield’s podcast Episode 36 Encase Forensic 7 and Geoff Black’s Forensic Gremlins post Encase 7 Sneak Peek (NYC).

Besides the new layout of the user interface, two improvements I’m also interested in are the index and email functionality. At times, and in certain types of cases, I need the flexibility to search an index on the fly, so I’m curious how well the new index will work. I always found the email analysis in Encase to be lacking, so I'll welcome any improvements in this area. Unfortunately, the new email functionality still lacks support for Lotus Notes version 8.x, but I have other options to address this need.

Encase Version 7 Preview

Speaking of wanting to see the new features in Encase 7, Guidance released the Encase 7 preview software last weekend. Paul Bobby of SecureArtisan has been testing the software and sharing his thoughts on his blog. Encase v7 Preview, Encase v7 Conditions, and Tagging in Encase v7 are his posts so far. Hopefully I’ll find some time over the next week to play with my preview software. I was a little disappointed to see that the software is restricted to the evidence files provided by Guidance. I was looking forward to throwing my images and email files at the new version to see how it performs … at least in the meantime I can see the new layout.