
Builds on Digital Forensics I
Summary of tools: open source tools

Literature


  • Malware Analyst's Cookbook and DVD: Tools and Techniques for Fighting Malicious Code (Michael Ligh, Steven Adair, Blake Hartstein, Matthew Richard)
  • Real Digital Forensics: Computer Security and Incident Response (Keith J. Jones, Richard Bejtlich, Curtis W. Rose) (In library as "HIG Pensum 005.8 Jon")
  • Forensic Discovery (Dan Farmer and Wietse Venema) Available online!
  • NIST SP800-61 (Computer Security Incident Handling Guide) is recommended; NIST SP800-86 (Guide to Integrating Forensic Techniques into Incident Response) might also be a good read

Info
10 ECTS course (300 hours workload: 150 h project, 36 h lectures, 32 h lab and 82 h reading/exercises)
50% exam on 14 June 2013
50% project due 25 June 2013

S2_DF2_Visualization_tools_in_digital_forensics (3.5 MiB, sha256: 62875f4a81...20d1cc01c0)

A case exercise is planned after the two weeks of lab, where we are supposed to practice defending evidence in a court-like situation. The lectures will partly be held by external parties. We will get programming exercises after the lectures (nothing so far).

Lectures

Notes from reading

Real Digital Forensics: Computer Security and Incident Response:

  • Ch1/Ch2: Live forensics on Windows and Unix. System time/offset (time), network connections (netstat -an, look up ports on portsdb.org), processes associated with TCP/UDP ports (listening and connected), NetBIOS names (nbtstat -c), routing tables (netstat -rn), processes (pslist), using netcat/cryptcat to copy output to a server over the network (see the sketch after this list), services (psservice), scheduled jobs (at), open files (psfile), dumping processes (userdump <pid>), extracting logs, users logged in, loadable kernel modules and mounted file systems. Find the patch level, search for time stamps, look at the registry and find suspicious files.
  • Ch3/4/5: Network Based Evidence (NBE) and Network Security Monitoring (NSM). NSM consists of full content data, session data, alert data and statistical data. Should it be collected before or after an incident is detected? An example of a NOP sled (buffer overflow) is shown.
  • Ch6/7/8: Forensic duplication. What to bring: tools, chain of custody forms (source, destination and time of transfers), labels to mark collected evidence (case, identifier, content, acquired by and date), a boot medium with trusted software, evidence envelopes, anti-static bags, hub/switch. Evidence worksheets, agent notes, evidence custodian and access logs are also mentioned. Duplication with dd (flags conv=notrunc,noerror,sync), md5sum in binary mode, split, dd_rescue, dcfldd (with integrated hashing) and ODD, the Open Data Duplicator (ODESSA project); see the imaging sketch after this list. Begin analysis with file recovery (undelete) using The Sleuth Kit or a commercial tool, fls to look at the structure, losetup for loopback devices and icat to extract a file from the image via its inode/file ID (sketch below). It is important to use different tools because they find different deleted files.
  • Ch9: Analyzing the image. Get metadata in list form for import into a spreadsheet: MAC times, full path, size, MD5 hash (for known-file filtering) and file type from magic numbers (the magic database used by the file command); see the scripts using the md5 and file commands, and the sketch after this list. The NSRL from NIST is a good source of known file hashes. md5deep can also traverse a file tree and produce hashes. Do string searches and learn to use the grep command (page 240).
  • Ch10: Web history: Pasco (IE web history, index.dat) and Galleta (cookie data); see the sketch after this list.
  • Ch11: E-mail activity: eindeutig for Outlook Express .DBX files and munpack to decode MIME attachments, libPST for Outlook .PST files. No open source solution (at that time) for AOL, but Apple and Netscape use plaintext files and folders (MBOX format).
  • Ch12: Windows logs: the event log, application logs and the registry are mentioned. The registry has default, system, software and per-user ntuser.dat hives. Searching for MRU (most recently used) entries and URLs typed in are examples of use (old information).
  • Ch13/14/15: Analysis of unknown executable files. A simple C program is written to show what the simplest possible program looks like, in both Linux and Windows environments, and both build processes are explained. Binaries can be compiled stand-alone or dependent on system libraries. The analysis methods are divided into static "dead" code analysis and dynamic "live" analysis, which actually runs the code in a safe environment while monitoring what the executable does. For each platform a hostile unknown file is presented, along with the methods used to unpack it and to bypass password restrictions. Tools used on Linux include hexdump, nm, ldd, readelf, objdump, strace, ltrace and gdb (see the sketch after this list). For Windows the tools include the Visual C++ Toolkit, Cygwin, IDA (commercial), OllyDbg, Filemon and Regmon.
  • Ch16/17: Creating a bootable tool kit medium: move all dependencies (DLLs etc.) onto the medium, using Filemon to watch for disk access. Note that since Windows XP, prefetch always writes to disk. Create a script that runs the collection of information automatically, and make the medium bootable.
  • Ch18: Older portable devices such as PDAs (Palm, iPAQ).
  • Ch19/20: USB sticks and flash cards; undelete with The Sleuth Kit, fatback and foremost.
  • Ch21: E-mail headers
  • Ch22: DNS, and scripting large amounts of data into databases with Perl.
  • Appendix with some Perl and regex.
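
A few command sketches referenced from the notes above. First, the Ch1/Ch2 idea of piping live-response output over the network instead of writing to the suspect disk; the listener address 10.0.0.5, port 2222 and the D:\trusted\ path are made-up placeholders:

    # Forensic workstation: receive the output and hash it immediately
    nc -l -p 2222 > pslist_victim.txt
    md5sum -b pslist_victim.txt > pslist_victim.txt.md5

    # Victim (Windows): run only trusted binaries from the response medium
    D:\trusted\pslist.exe | D:\trusted\nc.exe 10.0.0.5 2222
    D:\trusted\netstat.exe -an | D:\trusted\nc.exe 10.0.0.5 2222

In practice each command needs its own listener, and cryptcat can replace netcat when an encrypted channel is wanted.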
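
For the Ch6/7/8 duplication notes, a minimal imaging sketch using the dd flags mentioned above and md5sum in binary mode; /dev/sdb and the file names are assumptions:

    # Hash the source device first, then duplicate and hash the image
    md5sum -b /dev/sdb > sdb.md5
    # noerror: continue past read errors; sync: pad bad blocks; notrunc: do not truncate the output
    dd if=/dev/sdb of=evidence.dd bs=4096 conv=notrunc,noerror,sync
    md5sum -b evidence.dd

    # dcfldd integrates the hashing into the copy itself
    dcfldd if=/dev/sdb of=evidence.dd hash=md5 hashlog=evidence.md5.txt

Note that if read errors forced dd to pad blocks, the image hash will differ from the source hash, which is one reason to let dcfldd log hashes during the copy.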
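
The Sleuth Kit recovery steps from the same chapters might look like this; the inode number (12345) and the partition offset (63 sectors) are placeholders:

    # List allocated and deleted entries with full paths (fls from The Sleuth Kit)
    fls -r -p evidence.dd > filelist.txt
    fls -r -d evidence.dd > deleted.txt       # -d: deleted entries only
    # (for a full-disk image, pass the partition offset to fls/icat with -o)

    # Extract a file by its inode / metadata address
    icat evidence.dd 12345 > recovered_file

    # Mount a partition from the image read-only via a loopback device
    losetup -o $((63 * 512)) /dev/loop0 evidence.dd
    mount -o ro /dev/loop0 /mnt/evidence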
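
A rough take on the Ch9 metadata listing: one CSV row per file with MAC times, path, size, MD5 and file type, assuming the duplicate is mounted read-only at /mnt/evidence:

    find /mnt/evidence -type f -print0 | while IFS= read -r -d '' f; do
        md5=$(md5sum "$f" | awk '{print $1}')
        ftype=$(file -b "$f")
        # %x/%y/%z = access/modify/change times, %n = name, %s = size in bytes
        printf '%s,%s,"%s"\n' "$(stat -c '%x,%y,%z,%n,%s' "$f")" "$md5" "$ftype"
    done > metadata.csv

    # md5deep produces recursive hash lists on its own (compare against NSRL sets)
    md5deep -r /mnt/evidence > hashes.txt

    # String search over the raw image, with byte offsets for follow-up
    strings -t d evidence.dd | grep -i "password"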
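
The Ch10 parsers are run directly on the extracted artifacts; a sketch with assumed file names (both tools print tab-delimited text on stdout):

    pasco index.dat > web_history.txt            # IE activity records
    galleta "user@example[1].txt" > cookies.txt  # IE cookie file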
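
Finally, for Ch13/14/15, the Linux tool chain on an unknown binary (suspect.bin is a placeholder; the dynamic steps belong in an isolated lab only):

    # Static ("dead") analysis
    file suspect.bin                  # file type and architecture
    strings -a suspect.bin | less     # embedded strings: paths, URLs, messages
    nm suspect.bin                    # symbol table (empty or fails if stripped)
    ldd suspect.bin                   # dynamic library dependencies
    readelf -h suspect.bin            # ELF header
    objdump -d suspect.bin | less     # disassembly

    # Dynamic ("live") analysis, only in a safe, isolated environment
    strace -f -o strace.log ./suspect.bin     # system calls
    ltrace -f -o ltrace.log ./suspect.bin     # library calls
    gdb ./suspect.bin                         # interactive debugging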