2014 Conference Abstracts

 

Autopsy 3.1: Faster, Better, and still Free  (slides)

Brian Carrier
Basis Technology

Two years and ten point releases after 3.0.0, Autopsy™ 3.1 has been released, with many performance improvements and feature enhancements. This talk will cover what is new in Autopsy, including multi-threaded ingest pipelines that take better advantage of multi-core systems, hash sets that users can create and update, image galleries to better organize images, improved timelines, and mobile phone support. We’ll also cover the basics of Autopsy for audience members who have not seen or used it before. Autopsy 3 is a Windows-based open source digital forensics platform that has all of the basic features needed to conduct a digital investigation.

Supersize your Internet Timeline with Google Analytic Artifacts  (slides)

Mari DeGrazia
Verizon RISK Team

What Internet evidence might you be missing? Learn how to find additional information by leveraging the artifacts left behind by Google Analytics.

These artifacts include valuable information such as timestamps, search terms, and referral pages. Learn how to use a Google Analytics Python script to recover these artifacts from browser history, cache files, RAM, unallocated space, iTunes backups, and more.
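
As a hedged illustration of the kind of parsing involved (not the script referenced in this talk), the sketch below decodes a Google Analytics __utmz cookie value; the field layout follows the commonly documented format, and the cookie value shown is hypothetical.

```python
# Hedged sketch: parse a Google Analytics __utmz cookie value found in browser
# history, cache, RAM, or unallocated space. Field layout follows the commonly
# documented __utmz format; verify against your own data before relying on it.
from datetime import datetime, timezone

def parse_utmz(value):
    # <domain hash>.<timestamp>.<session count>.<campaign count>.<campaign data>
    domain_hash, ts, sessions, campaigns, campaign_data = value.split(".", 4)
    fields = dict(pair.split("=", 1) for pair in campaign_data.split("|") if "=" in pair)
    return {
        "timestamp": datetime.fromtimestamp(int(ts), tz=timezone.utc),
        "source": fields.get("utmcsr"),        # referring site / campaign source
        "medium": fields.get("utmcmd"),        # e.g. organic, referral
        "search_term": fields.get("utmctr"),   # keyword, if any
        "referral_path": fields.get("utmcct"), # content / referral path
    }

# Example (hypothetical cookie value):
print(parse_utmz("12345678.1404000000.1.1."
                 "utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=forensics"))
```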

Vortessence – Automating memory forensics (slides)

Endre Bangerter and Beni Urech
Bern University of Applied Sciences, Security Engineering Lab

Memory forensics is a key technique for detecting and analyzing malware and related attack tools. While there are several memory forensics tools, the Volatility framework is probably the most widely used and significant tool.

Volatility features a large number of commands, which for Windows systems roughly fall into two categories. One consists of commands geared directly towards detecting malware-specific artifacts such as code injections or various kinds of hooks. The other consists of commands for inspecting the state of the system being investigated; for instance, one can display running processes, loaded DLLs, drivers, etc.

To detect malware using the latter tools, the analyst is typically looking for anomalous system properties. A simple example would be a bogus process name or a process started from an unusual directory. Stuxnet, for instance, introduces two malicious instances of the alleged “lsass.exe” process, which have the wrong parent process. Another, more subtle anomaly would be an unusually low or high number of DLLs in a certain process.

The key point is that finding these anomalies requires a lot of encyclopedic knowledge about the state of a non-infected Windows system. This knowledge is hard to memorize (at least for some of us). Moreover, searching for actual discrepancies deviating from the clean state can be a boring and rather mechanical task.

Due to laziness and a lack of brain capacity, we have developed a Volatility-based tool that automates the process of finding certain anomalies in memory images. The idea underlying our tool is, at a high level, rather straightforward: we first generate a whitelist of known clean memory states, and then detect malware by checking for deviations from the whitelist. The anomalies are described in a report that the tool generates automatically. The tool thus detects certain anomalies more efficiently and likely covers more anomalies than most humans are able to memorize.

Let us sketch in some more detail how our tool works. We first obtain memory images of known clean systems and extract from these images a large number of properties (e.g., parent-child relationships of processes, drivers loaded, which process contains which DLLs, and many, many more) using various Volatility commands. The resulting data is stored in a database. To analyze a potentially infected image, we again perform property extraction using Volatility, and then check the resulting values against the whitelist stored in the database.
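
To make the whitelist idea concrete, here is a minimal Python sketch of the comparison step, assuming the properties have already been extracted from the images with Volatility; the whitelist values are illustrative, and none of this is Vortessence's actual code or schema.

```python
# Minimal sketch of the whitelist idea (not Vortessence's actual code or schema).
# Assume properties have already been extracted from memory images with
# Volatility and reduced to simple records.

# Whitelist built from known-clean images: expected parent and DLL-count range
# (values below are purely illustrative).
WHITELIST = {
    "lsass.exe":    {"parent": "wininit.exe", "dll_range": (40, 120)},
    "services.exe": {"parent": "wininit.exe", "dll_range": (30, 110)},
}

def check_process(proc, whitelist=WHITELIST):
    """Return a list of anomaly strings for one extracted process record."""
    expected = whitelist.get(proc["name"])
    anomalies = []
    if expected is None:
        anomalies.append("process name not seen in any clean image")
        return anomalies
    if proc["parent"] != expected["parent"]:
        anomalies.append(f"unexpected parent {proc['parent']!r} "
                         f"(expected {expected['parent']!r})")
    lo, hi = expected["dll_range"]
    if not lo <= proc["dll_count"] <= hi:
        anomalies.append(f"DLL count {proc['dll_count']} outside clean range {lo}-{hi}")
    return anomalies

# The Stuxnet example from above: a bogus lsass.exe with the wrong parent.
suspicious = {"name": "lsass.exe", "parent": "services.exe", "dll_count": 12}
for finding in check_process(suspicious):
    print(finding)
```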

Another, more advanced and “researchy” component of the tool does not use Volatility, but rather takes a “binary diffing” approach, comparing known clean memory images with the image to be analyzed. A simple byte-wise diff won’t be successful; instead, our tool has the capability to identify relevant modifications. This diffing approach allows finding, for instance, hooks and memory injections using a different method than the heuristics used by Volatility’s malware commands. Moreover, by combining both approaches we can reduce the false positive rate of the current Volatility tools.

We have tested our tool with real-world malware samples and the results are encouraging. For instance, in the case of Stuxnet, we can automatically identify many of its infection characteristics, with a relatively moderate false positive rate. We believe that our tool is quite useful for practitioners, and we are going to open-source it in the second half of 2014.

What’s New in RegRipper (slides)

Harlan Carvey
Dell SecureWorks

In this presentation, we will discuss updates to RegRipper (RR), which include:

  • Plugins have four options for output format: ‘regular’, bodyfile, TLN, and CSV (see the sketch after this list)
  • Artifact categories
  • Usage examples; how to get the most out of RR
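
RegRipper itself is written in Perl, so purely to illustrate the bodyfile and TLN layouts mentioned above, here is a hedged Python sketch; the registry key and timestamp are hypothetical, and the field layouts follow the commonly documented formats (TSK 3.x body and five-field TLN).

```python
# Illustrative sketch of the bodyfile and TLN line formats a plugin can emit.
# RegRipper itself is written in Perl; the key path and timestamp below are
# hypothetical, and the field layouts follow the commonly documented formats.

def to_bodyfile(key_path, lastwrite_epoch):
    # TSK 3.x body format: MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
    # Registry keys only have a LastWrite time, carried here in the mtime field.
    return f"0|[Registry] {key_path}|0|0|0|0|0|0|{lastwrite_epoch}|0|0"

def to_tln(key_path, lastwrite_epoch, host="", user=""):
    # TLN format: time|source|host|user|description
    return f"{lastwrite_epoch}|REG|{host}|{user}|LastWrite key {key_path}"

print(to_bodyfile("HKLM\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Run", 1404000000))
print(to_tln("HKLM\\SOFTWARE\\Microsoft\\Windows\\CurrentVersion\\Run", 1404000000))
```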

Sceadan – Systematic Classification Engine for Advanced Data Analysis (slides)

Nicole L. Beebe
University of Texas at San Antonio

Simson Garfinkel
Naval Postgraduate School

Data type classification involves the determination of “type” (e.g., image, HTML text, video, etc.) without reference to metadata (e.g., file names, extensions, or network headers) or explicit reference to file signatures (e.g., “magic numbers”). Use cases include classification of deleted data fragments, triage and disk profiling, search hit relevancy ranking, intrusion detection, and data loss prevention. Naïve statistical classification via machine learning has been studied by academics for over a decade; however, the research has produced varied results and has failed to yield any widely available open source tools. Researchers at UTSA and NPS have significantly advanced the state of the art in naïve statistical data type classification and have instantiated their findings in an open-source tool called SCEADAN. We used skillfully crafted training sets to perform statistical, n-gram-based analysis to predict data type across 48 file/data/encoding types. Prediction accuracies vary across data types, but SCEADAN averages 75-90% accuracy and automatically produces confusion matrices. SCEADAN comes with a built-in model, but researchers can easily train their own.
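
As a rough illustration of the underlying idea (this is not Sceadan's implementation), the sketch below turns a raw fragment into normalized byte n-gram frequencies of the kind a statistical classifier would consume.

```python
# Minimal sketch of n-gram feature extraction for fragment type classification.
# This is not Sceadan's implementation; it only illustrates turning a raw data
# fragment into normalized byte-frequency features for a classifier.
from collections import Counter

def unigram_features(fragment: bytes):
    """Normalized frequency of each byte value (256-dimensional vector)."""
    counts = Counter(fragment)
    total = len(fragment) or 1
    return [counts.get(b, 0) / total for b in range(256)]

def bigram_features(fragment: bytes, top_k=500):
    """Frequencies of the most common byte pairs; higher-order n-grams work the same way."""
    pairs = Counter(zip(fragment, fragment[1:]))
    total = max(len(fragment) - 1, 1)
    return {pair: n / total for pair, n in pairs.most_common(top_k)}

# A real system would feed these vectors to a trained statistical model and
# report the predicted data type for the fragment.
vec = unigram_features(b"%PDF-1.4 example fragment ...")
print(len(vec), max(vec))
```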

Python Autopsy: Easier Forensics Scripting (not dead snakes) (slides)

Richard Cordovano
Basis Technology

Lots of people love to write code in Python, and lots of people have been disappointed that they couldn’t write Autopsy™ modules in Python. That has now changed. Autopsy allows you to write ingest modules in Python so that you can more easily analyze file content. Ingest modules give you access to any file in a disk image or set of logical files, along with full access to the database and blackboard. Writing a Python Autopsy module is the easiest way to search for and analyze files in a custom way. This talk is for the Python lovers who want to learn the basics of making an Autopsy module, and why it will make your life easier.
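
For orientation, here is a rough skeleton of what a Python (Jython) file ingest module looks like, based on Autopsy's published sample modules; treat the exact class and method names as assumptions and confirm them against the developer documentation for your Autopsy version.

```python
# Rough skeleton of a Python (Jython) file ingest module, based on Autopsy's
# published sample modules. Treat the exact class/method names as assumptions
# and check the developer documentation for your Autopsy version.
from org.sleuthkit.autopsy.ingest import IngestModule, FileIngestModule, IngestModuleFactoryAdapter

class MyModuleFactory(IngestModuleFactoryAdapter):
    def getModuleDisplayName(self):
        return "Example Python module"

    def getModuleDescription(self):
        return "Flags files with a suspicious extension."

    def getModuleVersionNumber(self):
        return "1.0"

    def isFileIngestModuleFactory(self):
        return True

    def createFileIngestModule(self, settings):
        return MyFileIngestModule()

class MyFileIngestModule(FileIngestModule):
    def startUp(self, context):
        pass  # open databases, load hash sets, etc.

    def process(self, file):
        # Called once per file in the data source; the case database and
        # blackboard are available here for posting artifacts.
        if file.getName().lower().endswith(".dll.mui"):
            pass  # post an artifact to the blackboard, log, etc.
        return IngestModule.ProcessResult.OK

    def shutDown(self):
        pass
```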

Live Disk Forensics on Bare Metal (slides)

Hongyi Hu & Chad Spensky
MIT Lincoln Laboratory

We have developed a hardware/software-based framework to perform live disk forensics on both physical and virtual machines. For physical machines, we developed a hardware SATA sensor capable of passively monitoring disk traffic. Similarly, for virtual machines, we inserted hooks into KVM and Xen to provide similar monitoring capabilities. Our software tools use The Sleuth Kit, pyTSK and analyzeMFT to convert the raw disk traffic into semantically useful data in real time. Our platform is difficult to detect and can perform functions such as live forensic analysis, tamper-resistant logging, and replay of disk events with minimal impact to the system under test.

Thus, our framework provides researchers and developers a rich toolset for building live forensics and monitoring applications and for conducting forensics research.
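
As a hedged illustration of the kind of TSK-backed interpretation involved (this is not the authors' framework), the sketch below uses pytsk3 to map an image's directory entries to names, inodes, and timestamps; the image path is a placeholder.

```python
# Hedged sketch (not the authors' framework): using pytsk3 to turn raw disk
# data into file-system-level facts, the same kind of interpretation the
# sensor's disk traffic needs. The path "disk.dd" is a placeholder.
import pytsk3

img = pytsk3.Img_Info("disk.dd")
fs = pytsk3.FS_Info(img)

for entry in fs.open_dir(path="/"):
    name = entry.info.name.name.decode("utf-8", "replace")
    meta = entry.info.meta
    if meta is None or name in (".", ".."):
        continue
    # inode/MFT entry number and modification time for each directory entry
    print(meta.addr, meta.mtime, name)
```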

A Case Study on Network Anti-Forensics (slides)

Ben Schmidt
Narf Industries

Forensic analysts have plenty to worry about when it comes to the security of their network. Seldom do they worry about their packet analysis software. We will demonstrate why they should.

In this talk, we detail how we uncovered a handful of remotely exploitable Wireshark vulnerabilities and deployed them as anti-forensic measures during the DEFCON 20 and 21 CTFs. These vulnerabilities can be used to compromise Wireshark by sending specially crafted input to a live sniffer, or by supplying a malicious packet capture. We walk through exploitation of these vulnerabilities to cause denial-of-service conditions and to execute arbitrary code on modern operating systems.

We dive into the inner workings of the Wireshark network capture and reconstruction tool. We detail the interactions between protocol layers, Wireshark’s dissector model, and mitigations built into the Wireshark common API. We walk through the reconstruction of popular network traffic, from Ethernet frame to IP packet to TCP stream to application data. We highlight where things can (and often do) go wrong and how to exploit this popular software package. We close by offering suggestions for how Wireshark and other forensic tools can mitigate risk and decrease their attack surface.
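
To illustrate the layer-by-layer reconstruction described above (a simplified sketch, not Wireshark code), the following walks an Ethernet frame down to TCP ports and shows the kind of attacker-controlled length field that dissectors must bounds-check.

```python
# Illustrative sketch (not Wireshark code) of layer-by-layer reconstruction:
# Ethernet frame -> IPv4 header -> TCP ports. A dissector that trusts the
# header length field without the bounds checks below is exactly where
# parsing bugs creep in.
import struct

def dissect(frame: bytes):
    if len(frame) < 14:
        raise ValueError("truncated Ethernet frame")
    ethertype = struct.unpack("!H", frame[12:14])[0]
    if ethertype != 0x0800:                       # not IPv4
        return None
    ip = frame[14:]
    if len(ip) < 20:
        raise ValueError("truncated IPv4 header")
    ihl = (ip[0] & 0x0F) * 4                      # attacker-controlled length
    if ihl < 20 or len(ip) < ihl:
        raise ValueError("bogus IPv4 header length")
    proto = ip[9]
    src, dst = ip[12:16], ip[16:20]
    if proto != 6 or len(ip) < ihl + 4:           # not TCP, or truncated
        return None
    sport, dport = struct.unpack("!HH", ip[ihl:ihl + 4])
    return (".".join(map(str, src)), sport, ".".join(map(str, dst)), dport)
```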

Incident Response with STIX and Autopsy (slides)

Ann Priestman
Basis Technology

When responding to a computer incident, you may want to scan the hard drive for signatures and indicators from known malware and threat actors. STIX™ is a standardized way to share information about cyber threats. It is being used in the US Government and in industry to share intelligence. This presentation will talk about how you can use STIX with the open source Autopsy™ tool to find files, registry keys, and other indicators that other incident responders have also seen.
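
As a hedged sketch of the idea (not Autopsy's STIX module), the snippet below pulls file-hash values out of a STIX 1.x XML document so they can be searched for on an image; the element name and file path used here are assumptions to verify against your own documents.

```python
# Hedged sketch (not Autopsy's STIX module): pull file-hash values out of a
# STIX 1.x XML document so they can be searched for on an image. Matching on
# local element names only, since namespace prefixes vary between producers;
# "indicators.xml" is a placeholder, and the CybOX element name used here
# (Simple_Hash_Value) is an assumption to verify against your documents.
import xml.etree.ElementTree as ET

def extract_hashes(stix_path):
    tree = ET.parse(stix_path)
    hashes = []
    for el in tree.getroot().iter():
        if el.tag.rsplit("}", 1)[-1] == "Simple_Hash_Value" and el.text:
            hashes.append(el.text.strip().lower())
    return hashes

for h in extract_hashes("indicators.xml"):
    print(h)
```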

Timeline Visualization in Autopsy (slides)

Jonathan Millman
Basis Technology

Timeline analysis can be an important digital forensics technique. At previous OSDFCons, there have been presentations about collecting temporal data from various sources, but few open source tools have been available to help visualize the results beyond text-based methods. This talk will cover a new Autopsy™ timeline module that focuses on visualizing large amounts of event data. The module provides several display techniques to help reduce information overload, such as description zooming and filtering. The timeline includes temporal data from file system activity and other artifacts, such as web activity and the registry.

A Differential Approach to Analysis of Malware in Memory (slides)

Dr. Vico Marziale
504ENSICS Labs

Detecting malware is difficult, and analyzing a detected piece of malware’s behavior is even more difficult. Techniques for analysis generally fall into one of three camps: static analysis of the malicious binary on disk, dynamic analysis as the binary executes, or a hybrid approach using a snapshot of physical RAM taken as the malware executes. As a result of our DARPA Cyber Fast Track (CFT) funded research, we extend this third approach. In this session we present a novel technique for leveraging multiple snapshots of physical RAM for malware detection and analysis. We will also present DAMM, a tool for differential analysis of malware in memory.

The techniques presented can be utilized in any size infrastructure where malware is a threat (everywhere). The presentation will contain two scenarios based on real world malware samples, demonstrating the technique in use in a manner that attendees will be able to directly apply to their own malware analysis efforts.
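
A minimal sketch of the differential idea (not DAMM itself): diff the process records extracted from two RAM snapshots and report what appeared or disappeared while the malware ran; the records below are hypothetical.

```python
# Minimal sketch of the differential idea (not DAMM itself): compare process
# records extracted from two RAM snapshots and report what appeared or
# disappeared while the malware ran. Records here are hypothetical
# (pid, name, ppid) tuples produced by a memory-analysis pass per snapshot.
def diff_snapshots(before, after):
    before_set, after_set = set(before), set(after)
    return {
        "new":  sorted(after_set - before_set),   # e.g., injected/spawned processes
        "gone": sorted(before_set - after_set),   # e.g., terminated processes
    }

snap_t0 = {(4, "System", 0), (480, "lsass.exe", 392), (1022, "explorer.exe", 988)}
snap_t1 = {(4, "System", 0), (480, "lsass.exe", 392), (1022, "explorer.exe", 988),
           (1940, "lsass.exe", 668)}              # suspicious extra lsass.exe
print(diff_snapshots(snap_t0, snap_t1))
```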

MEDS: Malware Evolution Discovery System (slides)

Antonio Cesar Vargas
NacoLabs, LLC

Malware, or malicious software, affects every computing device at our disposal, including personal computers, dedicated servers, and, more recently, mobile devices such as smartphones and tablets. The information stored on these devices makes them attractive targets for illegal financial gain by cybercriminals, and for corporate espionage or even strategic operations by government agencies. Yet traditional detection measures are increasingly ineffective at detecting the extensive number of malware variants. To make matters worse, these variants are becoming commodity products whose manufacture is facilitated by an underground industry that seeks to meet the demand for products that can bypass current anti-malware technologies. Consequently, most present-day malware is not new, since the development of new “from-scratch” malware is not economically viable for the underground malware industry. Instead, most malware found in the wild is a modification or feature upgrade of previously created malware.

This presentation frames the malware problem from the perspective of the evolutionary production of malware. The goal is to present the design of an architecture that allows researchers to discover generative malware, and to show an initial implementation of the Malware Evolution Discovery System (MEDS). MEDS supports the creation of phylogenetic trees of malware and attempts to predict generative malware by applying two models of supervised regression analysis to malware samples and their corresponding phylogenetic tree. Finally, the MEDS framework is made available as an open source project, providing an innovative tool that has previously been unavailable to the cybersecurity and digital forensics communities.
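
As a hedged sketch of the phylogenetic-tree idea (not MEDS itself), the snippet below clusters samples hierarchically on byte n-gram distance using SciPy; the sample bytes are placeholders for real binaries.

```python
# Illustrative sketch (not MEDS): build a simple malware "phylogeny" by
# hierarchically clustering samples on byte 4-gram Jaccard distance.
# The sample bytes are placeholders for real binaries.
from itertools import combinations
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

def ngrams(data, n=4):
    return {data[i:i + n] for i in range(len(data) - n + 1)}

def jaccard_distance(a, b):
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0

samples = {                      # hypothetical variants of one family
    "variant1": b"MZ\x90\x00 shared loader AAAA",
    "variant2": b"MZ\x90\x00 shared loader BBBB",
    "variant3": b"MZ\x90\x00 rewritten loader CC",
}
names = list(samples)
grams = {name: ngrams(data) for name, data in samples.items()}

dist = np.zeros((len(names), len(names)))
for i, j in combinations(range(len(names)), 2):
    dist[i, j] = dist[j, i] = jaccard_distance(grams[names[i]], grams[names[j]])

# Average-linkage clustering over the condensed distance matrix yields the tree;
# a regression model could then be trained on features of the tree and samples.
tree = linkage(squareform(dist), method="average")
print(dendrogram(tree, labels=names, no_plot=True)["ivl"])
```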

Fresh Produce: How We Can Integrate Our Forensic Tools Into Great Workflows Without Crazy File Formats (slides)

William Ballenthin
FireEye

As a forensic analyst driven by really large investigations, I have some issues with many popular forensic tools (open source included). Ignoring buggy scripts and broken interpretations, it bothers me that many tools are not designed with more than one workflow in mind. So, I’d like to share with you a novel way for tools to easily integrate at small, medium, and large scales: “user defined output formatting”. After defining the concept, I’ll demonstrate how it saves significant developer and investigator effort, and how it can lead to more complete forensic analysis. Next, we’ll walk through the implementation of “user defined output formatting” in a common forensic tool (such as RegRipper or Volatility), and you’ll be surprised how minimal the changes are. Finally, we’ll discuss the future of forensic tool workflows, and how we can all do our part to improve the integration of forensic tools everywhere. By the end, you’ll be sold that “user defined output formatting” is a neat idea, and ready to try it out during your next investigation.
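
A minimal sketch of what “user defined output formatting” might look like in practice (hypothetical records and templates, not the speaker's implementation): the tool emits plain records, and the analyst supplies the output template.

```python
# Minimal sketch of "user defined output formatting": the tool produces plain
# records (dicts) and the analyst supplies the output template, so the same
# tool drops straight into CSV pipelines, timelines, or ad-hoc reports without
# a new file format. Records and templates below are hypothetical.
RECORDS = [
    {"time": 1404000000, "host": "WS01",
     "key": r"HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Run", "value": "evil.exe"},
    {"time": 1404000360, "host": "WS01",
     "key": r"HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\RunOnce", "value": "b.dll"},
]

def emit(records, template):
    for rec in records:
        print(template.format(**rec))

# The user chooses the shape of the output, not the tool author:
emit(RECORDS, "{time}|REG|{host}||{key} -> {value}")          # TLN-style
emit(RECORDS, '"{time}","{host}","{key}","{value}"')          # CSV-style
```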

Next Generation Memory Forensics (slides)

The Volatility Development Team
The Volatility Foundation

The past year has seen a number of exciting advances in the field of memory forensics. In particular, we have seen expanded support for Windows 8 and robust support for Mac OS X. There has also been a lot of innovative research into extracting memory-resident artifacts from applications, including extracting cryptographic material from encryption programs and user activity from chat clients and web browsers. As the relatively young and exciting field of memory forensics continues to rapidly evolve, it also continues to change the way we approach all types of digital investigations.

In this presentation, we will highlight many of these advances within the latest 2.4 release of Volatility, including how investigators can incorporate these advanced capabilities into their own analysis efforts. We will also discuss where the field of memory forensics is headed, including the expanded focus from kernel artifacts to those left by applications. While current memory analysis research allows for deep recovery of kernel artifacts, such as lists of processes, kernel modules, and network connections, the majority of userland (process) analysis has historically focused on extracting memory regions to disk for analysis with strings and regular expressions. This presentation will discuss recent research efforts to analyze the memory-resident artifacts of popular applications commonly used on both Windows and Mac.

By the end of the presentation, attendees will know about the latest advances in memory forensics and how they can be rapidly utilized in the field. They will also know what to expect from future memory forensics research efforts, and how some of the current gaps in memory forensics analysis are being addressed.