
- KAPE: A Detailed Exploration
Introduction: KAPE, written by Eric Zimmerman, is a powerful tool used in digital forensics and incident response. It can be driven through its graphical user interface (gkape) or via the command-line interface. Users typically run KAPE from the command prompt, providing the parameters that specify the artifacts they want to collect and the output location.

GUI Based: We'll walk through the process of using KAPE for evidence acquisition and processing.

Enable Targets: At the top left, spot number one, enter the Target source. For our example, we're choosing the C drive. For the Target destination, spot number two, we'll use C:\temp\T_out ("T_out" is a common naming convention for Target output).

Select KapeTriage Target: Next, select the KapeTriage target. This is a compound target that gathers various artifacts such as registry hives, event logs, and evidence of execution. (There are more than 220 targets in total; which ones to collect depends on the analyst/investigator.)

Enable Modules: At spot number three, check the box to enable the module side of KAPE. Specify a module destination, which is where parsed output will reside; for our example, C:\temp\M_out (module output).

Choose !EZParser Module: (depending on the analyst) Below that, we select the !EZParser module. This module runs all of Eric Zimmerman's tools against the data grabbed by the KapeTriage target, which simplifies parsing.

Select CSV as Output Format: At spot number four, choose CSV as the default output. Eric Zimmerman's tools commonly support CSV output.

Enable Debug Messages: At spot number five, it's advisable to enable debug messages. They produce more console output, but they are immensely helpful for troubleshooting issues during acquisition or processing.

Execute the Command: At spot number six, once you are satisfied with the configuration, click the "Execute" button. This initiates the command and begins the acquisition and processing of data.

Accessing Evidence:
• There are two main ways to access evidence: running KAPE on a live system or mounting a forensic image. It's recommended to use Arsenal Image Mounter for handling forensic images.
• The typical KAPE workflow uses the KapeTriage target with the !EZParser module; this combination covers a broad spectrum of common artifacts. As you become more comfortable, you can customize your own KAPE recipe to suit specific acquisition and processing needs.

Kape Targets: KAPE targets are collections of relevant files and directories, defined and customizable through YAML files hosted on GitHub in the KapeFiles repository. Targets can include files locked by the operating system while preserving original timestamps and metadata. Files locked by the OS are added to a secondary queue, visible in the console log; even if the console log indicates certain files weren't grabbed, they were added to the secondary queue and processed using raw disk reads to bypass operating system locks. The KAPE folder contains subfolders for targets, such as "Disabled", "Antivirus", and "Apps", each representing a different collection of artifacts.
Targets in the "Disabled" folder won't show up in Kape and cannot be used by it When examining a compound target like "Kape Triage," drilling down through associated targets in the Kape folder reveals the specific files and directories being captured Kape Modules: Kape modules serve as mechanisms to run command-line tools against collected files. They are predefined and customizable, grouping artifacts into categories. The category name becomes the output folder's name. Modules facilitate live response scenarios, offering multiple modules geared towards this purpose. Modules are responsible for processing collected artifacts, and they are grouped into categories, with each category defining the name of the output folder. Modules are highly customizable, allowing users to tailor them to their specific needs. Special programs and scripts can also be employed through modules. The Kape Modules folder, like the Targets folder, contains a "Disabled" subfolder. Placing modules here prevents them from appearing in Gkape or being used by kape. The "Bin" folder within the Modules directory is crucial, housing executables that modules call upon. This ensures that third-party tools, not shipped with Kape, are accessible for module execution. Using the EzParser module simplifies this process, as it seamlessly integrates with Eric Zimmerman's tools. The below Screenshot illustrates the process of examining the EzParser module, which then points to the EVTXecmd module. Each module specifies the binaries it uses, emphasizing the importance of organizing executables in the "Bin" folder for seamless module execution. If you prefer a user-friendly graphical interface, the GUI version of KAPE is an excellent choice. However, for those who appreciate the precision and control of the command line, KAPE also offers a robust command-line interface (CMD). A noteworthy feature of the GUI version is its automatic generation of command-line instructions based on the selections you make. As you navigate through the graphical interface and choose the specific options and artifacts you need, the corresponding command is seamlessly composed. This ensures a smooth transition between the user-friendly GUI and the powerful flexibility of the command line. For a quick and efficient workflow, take advantage of the visual cues provided by the GUI, and observe how the selected options translate into a well-structured command. Whether you opt for the ease of the GUI or the command-line precision, KAPE caters to both preferences, offering a versatile solution for digital forensics and incident response tasks." If you choose to enable only the target for collection, KAPE delivers raw forensic data—a comprehensive snapshot of the specified target. This raw data is invaluable for detailed analysis and investigation. On the other hand, for users seeking a more structured and parsed output, KAPE's modular capabilities come into play. By combining the selection of specific modules with the target, KAPE not only captures the raw data but also processes and organizes it into user-friendly formats such as CSV or TXT. This dual-output feature ensures that users have access to both the unfiltered raw data and the parsed, structured results. Integration Possibilities: While Kape itself doesn't integrate into Splunk directly, but the investigators can ingest CSVs into Splunk. Hash Sets and Cloud Data Collection: Kape allows excluding certain files with hash sets, it doesn't restrict the search to specific file types. 
This emphasizes KAPE's flexibility while outlining its approach to hash-based exclusions. Furthermore, KAPE can collect data from cloud storage services such as OneDrive, Google Drive, Dropbox, and Box, although legal considerations regarding search warrants and authorization for cloud data access still apply. Akash Patel
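For reference, here is a rough sketch of the command line that gkape composes for the workflow described above (switch names are taken from KAPE's documented options; verify them against your KAPE version before relying on this):

kape.exe --tsource C: --tdest C:\temp\T_out --target KapeTriage --mdest C:\temp\M_out --module !EZParser --mef csv --debug

Run from an elevated prompt, this should reproduce the GUI walkthrough: collect the KapeTriage target from C:, write the raw files to C:\temp\T_out, then run the !EZParser module over them and drop CSV output into C:\temp\M_out.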
- Examining SRUM with ESEDatabaseView
You can download the tool from the link below: https://www.nirsoft.net/utils/ese_database_view.html

Opening the SRUM Database with NirSoft's ESEDatabaseView
Using NirSoft's utility, you can open the SRUDB.dat ESE database and access its tables. In a typical Windows 10 setup, you'll find around 13 tables. By default, the MSysObjects table is displayed, sorted by the first column. We're focusing on the Windows Network Data Usage Monitor table, identified by the unique identifier {973F5D5C-1D90-4944-BE8E-24B94231A174}, which is consistent across Windows 8.1 and Windows 10.

Examining the Windows Network Data Usage Monitor Table
Once you've selected the Windows Network Data Usage Monitor table, you'll find entries detailing the system's network connections. Each entry features an "AppID" identifying the application that used the network during that time period. The AppID corresponds to the "IdIndex" field in the SruDbIdMapTable, which also reveals the drive and full path of the application executable via the "IdBlob" for each "IdIndex". Additionally, you'll find the "UserId", network interface (InterfaceLuid), network profile index (L2ProfileId), and bytes sent and received for each application during that time period.

Mapping Network Profiles
To map a network profile, start by identifying a network with a profile identifier (L2ProfileId), such as 268435461, then find the corresponding network name in the SOFTWARE registry hive. Here's how (a PowerShell sketch of the same lookup follows after this article):
1. Navigate to the \Microsoft\WlanSvc\Interfaces\{Key}\Profiles key. The last-write timestamp for each profile GUID is the first time this computer ever connected to that network; the last-write timestamp of that profile's "MetaData" subkey is the last time this computer connected to that network.
2. Look for profile identifiers and check the ProfileIndex key value to find the matching identifier.
3. Expand the matching profile identifier key and select the "MetaData" subkey.
4. Check the "Channel Hints" key value to reveal the network name corresponding to ProfileIndex 268435461.

By following these steps, you can gain valuable insight into the network connections a system made, the applications involved, and even the network names. This information can be pivotal in forensic investigations, shedding light on user activities and potentially uncovering malicious intent.

Conclusion
The SRUM database, explored with NirSoft's ESEDatabaseView, offers a comprehensive view of network usage data on a Windows system. By understanding how to navigate and interpret this data, digital forensic analysts can uncover critical insights that may be instrumental in their investigations. Akash Patel
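If you prefer to script the profile lookup described above, the following PowerShell sketch walks the WlanSvc keys and prints each ProfileIndex alongside the "Channel Hints" value (the key paths and value names are the ones mentioned in this article; the SSID extraction is a rough assumption, since "Channel Hints" is a binary value that embeds the network name):

$base = 'HKLM:\SOFTWARE\Microsoft\WlanSvc\Interfaces'
Get-ChildItem $base | ForEach-Object {
    Get-ChildItem (Join-Path $_.PSPath 'Profiles') -ErrorAction SilentlyContinue | ForEach-Object {
        $props = Get-ItemProperty $_.PSPath
        $meta  = Get-ItemProperty (Join-Path $_.PSPath 'MetaData') -ErrorAction SilentlyContinue
        $hints = $meta.'Channel Hints'
        # Keep only printable characters from the binary blob as a rough view of the SSID
        $ssid  = if ($hints) { ([Text.Encoding]::ASCII.GetString($hints) -replace '[^\x20-\x7e]', '') }
        [PSCustomObject]@{ ProfileGuid = $_.PSChildName; ProfileIndex = $props.ProfileIndex; ChannelHints = $ssid }
    }
}

Match the ProfileIndex column against the L2ProfileId value (for example 268435461) pulled from the SRUM table.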
- Unpacking SRUM: The Digital Forensics Goldmine in Windows
Updated on 31 Jan, 2025

Enter the System Resource Usage Monitor (SRUM): a treasure trove for digital forensic analysts.

The SRUM Database: A Wealth of Insights
The SRUM database serves as a goldmine of information for investigators, offering invaluable insights into user activities and system performance. Some of the most exciting pieces of information that SRUM can reveal include:
Applications Running: Details on what applications were active on the system during a specific hour.
User Account Information: Identification of the user account responsible for launching each application.
Network Bandwidth Usage: Insights into the amount of network bandwidth sent and received by each application.
Network Connections: Information on the networks the system was connected to, including dates, times, and connected networks.
*****************************************************************************************************
SRUM Database in Windows: How It Works and What You Need to Know
Windows 8: When SRUM was first introduced in Windows 8, it stored performance data for desktop applications, system utilities, services, and Windows Store (Metro) apps. Approximately every hour, or when the system was properly shut down or rebooted, this data was transferred to an Extensible Storage Engine (ESE) database file known as SRUDB.dat.
Windows 10 and 11: Data is no longer temporarily stored in the Windows Registry before being written to SRUDB.dat.

Q2: When and How SRUM Data is Written
On Windows 10 and 11, SRUM data is generally recorded every 60 minutes. However, testing has revealed that data is not always written on shutdown. For example, if a system is shut down twice within 10 minutes, the SRUM database might not update until a later reboot where the system remains powered on past the standard 60-minute mark. This delayed writing behavior can be misleading: when reviewing SRUM entries, you may find multiple records with the exact same timestamp. This does not mean the events occurred simultaneously; rather, it indicates that the system recorded them all at once when SRUM was last updated. The actual activities could have taken place at any point between two consecutive entries.

Q3: Analyzing SRUM Data for Patterns
To make sense of SRUM data, you can compare the timestamps of consecutive entries. If the interval between entries deviates significantly from the expected 60-minute period (with a margin of plus or minus 10 minutes), it might suggest that data was written due to a system shutdown rather than the usual scheduled update. A useful method for identifying anomalies is to import SRUM data into Excel and use the Conditional Formatting feature to highlight timestamps that fall outside the standard interval (a scripted version of this check follows at the end of this article).

Q4: Recovering Historical SRUM Data
SRUM is often backed up in Volume Shadow Copies, meaning forensic analysts can potentially retrieve older SRUM database snapshots if shadow copies are available.
*****************************************************************************************************
SRUM Database Integrity and Repair
Given that systems are often not cleanly shut down during incident response procedures, the SRUM database file may sometimes be in a "dirty" or corrupt state. Windows provides a built-in tool, esentutl, for diagnosing and repairing ESE databases; it can perform tasks like defragmentation, recovery, integrity checking, and repair. Additionally, deleted records from the SRUM database may be recoverable using a utility called "EseCarve".
To check the status of the SRUM database (from the \Windows\System32\sru\ directory):
esentutl /mh SRUDB.dat
To repair a corrupted SRUDB.dat:
esentutl /p SRUDB.dat

SRUM Registry Keys and Subkeys
Performance data collected via SRUM is initially stored in the Windows registry and then transferred to the SRUM database. The primary registry key is HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\SRUM, which contains three subkeys: Parameters, Telemetry, and Extensions. Each of these subkeys corresponds to tables in the SRUM database and contains temporary data.

Key Tables in the SRUM Database
Windows Network Data Usage Monitor ({973F5D5C-1D90-4944-BE8E-24B94231A174}): Records information about networks, applications, user SIDs, and total bytes sent and received by each application.
WPN SRUM Provider ({d10ca2fe-6fcf-4f6d-848e-b2e99266fa86}): Captures Windows push notifications for Windows applications, user SIDs, and push notification payload sizes.
Application Resource Usage Provider ({d10ca2fe-6fcf-4f6d-848e-b2e99266fa89}): Records the drive, directory, and full path of active applications, user SIDs, CPU cycle times, and bytes read and written.
Windows Network Connectivity Usage Monitor ({DD6636C4-8929-4683-974E-22C046A43763}): Identifies network connections, connection start times, connection durations, and interface types.
Energy Usage Provider ({fee4e14f-02a9-4550-b5ce-5fa2da202e37}): Provides battery charge level, design capacity, cycle count, and power source information.
Energy Estimation Provider (Windows 10) ({97C2CE28-A37B-4920-B1E9-8B76CD341EC5}): Offers a summary of historical battery status.
-------------------------------------------------------------------------------------------------------------
Forensic Challenges with SRUM Data
Delayed Writes: SRUM data is written approximately every 60 minutes. Shutdowns may prevent immediate updates to SRUDB.dat, so analysts should be cautious when interpreting timestamps.
Retention Period: Most SRUM data is retained for 30-60 days (in VSS). The Energy Usage LT table can store data for years. If a system is powered off for an extended period, older data may be purged on reboot.
Database Corruption: If a system crashes or is not properly shut down, SRUDB.dat may be left in a "dirty" state. Windows has a built-in tool, esentutl, for repairing SRUM databases.
-------------------------------------------------------------------------------------------------------------
Volume Shadow Copies: Older versions of SRUM can sometimes be recovered if Volume Shadow Copies (VSS) are available.

Conclusion
The SRUM database has revolutionized digital forensic investigations by offering a comprehensive view of system activities and performance metrics. As investigators continue to explore this rich data source, the potential for uncovering critical evidence and insights will only grow.
-------------------------------------------------Dean--------------------------------------------------
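As an alternative to the Excel conditional-formatting approach mentioned above, a short PowerShell sketch can flag entries whose gap from the previous record falls outside the expected 60 plus-or-minus 10 minute window (this assumes you have already exported a SRUM table to CSV with a column named TimeStamp; adjust the path and column name to match your export):

$rows = Import-Csv 'C:\Cases\SRUM\NetworkUsage.csv' | Sort-Object { [datetime]$_.TimeStamp }
for ($i = 1; $i -lt $rows.Count; $i++) {
    $gap = ([datetime]$rows[$i].TimeStamp) - ([datetime]$rows[$i - 1].TimeStamp)
    if ([math]::Abs($gap.TotalMinutes - 60) -gt 10) {
        # Outside the normal hourly write interval: possibly written at shutdown/reboot
        '{0}  gap of {1:N0} minutes' -f $rows[$i].TimeStamp, $gap.TotalMinutes
    }
}

Entries flagged by this check are candidates for the delayed-write behaviour described in Q2 and Q3 above.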
- Analyzing Recycle Bin Metadata with RBCmd and $I_Parse
When investigating deleted files on a Windows system, analyzing the Recycle Bin metadata can provide crucial insights. In this guide, we'll look at how to use Eric Zimmerman's RBCmd.exe and another tool called $I_Parse.exe to extract and analyze deleted file information.

Understanding Recycle Bin Metadata
Windows keeps metadata for deleted files in different formats depending on the version of the operating system:
INFO2 files (used in Windows XP)
$I files (used in Windows Vista and later)
These metadata files store details such as the original file name, the path before deletion, the deletion timestamp, and the file size.

Using RBCmd.exe for Analysis
RBCmd.exe is a command-line utility created by Eric Zimmerman that can parse Recycle Bin metadata from both XP and modern Windows systems.
Parsing a single file: To analyze a specific $I file, run the following command:
RBCmd.exe -f "C:\$Recycle.Bin\S-1-5-21-1094574232-2158178848-303877012-1001\$IZZOXEO.pdf"
Parsing an entire directory: If you need to analyze all $I files in a folder, use the -d option:
RBCmd.exe -d "C:\$Recycle.Bin\S-1-5-21-1094574232-2158178848-303877012-1001" --csv C:\Users\Akash's\Downloads
This will parse all $I files in the specified directory and save the results in a CSV file.
Output:
------------------------------------------------------------------------------------------------------------
Collecting Recycle Bin Artifacts with KAPE
KAPE (Kroll Artifact Parser and Extractor) is a powerful tool that can collect forensic artifacts, including Recycle Bin metadata files. Steps to collect Recycle Bin artifacts using KAPE:
Open KAPE.
Select the target for Recycle Bin collection.
Specify the output folder where the extracted files should be saved.
Run KAPE.
Once collected, you can use RBCmd.exe or $I_Parse.exe to analyze the extracted data (a hedged KAPE command line for this collection follows at the end of this article).

Using $I_Parse.exe
Another useful tool for parsing Recycle Bin metadata is $I_Parse.exe. While its usage is similar to RBCmd, it provides an alternative way to extract and analyze metadata from deleted files. The tool is very simple to use: point it at the directory where you collected the artifacts, choose a destination, and click Parse.
Output:

Conclusion
Analyzing Recycle Bin metadata is a crucial step in digital forensics. Using RBCmd.exe and $I_Parse.exe, you can quickly extract valuable information about deleted files. Additionally, KAPE simplifies the collection of these artifacts, making your forensic workflow more efficient.
-----------------------------------------------Dean-------------------------------------------------
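If you want to script the KAPE collection step described above rather than click through gkape, a command along these lines should work (the target name RecycleBin is taken from the public KapeFiles repository; confirm it exists in your installation's Targets list before use):

kape.exe --tsource C: --tdest C:\temp\T_out --target RecycleBin

The collected $I and $R files under C:\temp\T_out can then be fed to RBCmd.exe -d or to $I_Parse as shown earlier.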
- Windows Recycle Bin Forensics: Recovering Deleted Files
The Windows Recycle Bin is an important artifact in forensic investigations. When a user deletes a file using the graphical interface, it is not immediately erased. Instead, the file is moved to the Recycle Bin, where it remains until the user permanently deletes it or empties the Recycle Bin. This behavior makes it a great place to recover deleted files.
-------------------------------------------------------------------------------------------------------------
How the Recycle Bin Works
When a file is deleted, it is moved to a hidden system folder called $Recycle.Bin. Each user on the system has a separate folder within it, identified by their Security Identifier (SID). The deleted file is renamed, and metadata is stored alongside it. This metadata includes: the original file name, the original file location, and the time of deletion. Since Windows does not track file deletion timestamps at the file system level, the Recycle Bin metadata provides valuable forensic evidence.
-------------------------------------------------------------------------------------------------------------
Ways to Bypass the Recycle Bin
Some users may try to avoid the Recycle Bin by using methods such as:
Shift + Delete: permanently deletes a file without moving it to the Recycle Bin.
Command Prompt or PowerShell: deleting files from the command line bypasses the Recycle Bin.
Third-Party Tools: some applications delete files without sending them to the Recycle Bin.
Even with these methods, deleted files may still be recoverable using forensic tools.
-------------------------------------------------------------------------------------------------------------
Changes in Recycle Bin Architecture
Microsoft has modified the Recycle Bin over the years:
Windows XP and earlier: the Recycle Bin used a RECYCLER folder and an INFO2 database file to store metadata.
Windows Vista and later: the folder was renamed $Recycle.Bin, and metadata is now stored in separate $I files for each deleted item. This change prevents the metadata corruption issues that were common in older versions.
-------------------------------------------------------------------------------------------------------------
What Happens When the Recycle Bin Is Emptied?
When a user empties the Recycle Bin, all files and their metadata are removed. However, forensic tools can often recover them by:
File carving: searching for file remnants in unallocated space on the disk.
Recovering $I files: these metadata files might still be retrievable and can provide useful information.
-------------------------------------------------------------------------------------------------------------
Understanding $R and $I Files
Modern versions of Windows store each deleted file as two separate files:
$R files: contain the actual deleted data.
$I files: store metadata such as the original file name, location, and deletion timestamp.
By analyzing these files, forensic investigators can piece together details about deleted files and their original locations.
-------------------------------------------------------------------------------------------------------------
Conducting Recycle Bin Forensics
Locate the Recycle Bin folder: check $Recycle.Bin on all available drives (e.g., C:\, D:\).
Extract metadata: parse $I files to find relevant information.
Recover deleted files: copy $R files for further analysis.
Look for deleted evidence: if the Recycle Bin has been emptied, attempt file recovery using forensic tools.
-------------------------------------------------------------------------------------------------------------
Since $R files are the recoverable files themselves, they need no parsing, but $I files do; a tool for that is $I_Parse (a minimal parsing sketch follows after this article).

Conclusion
The Windows Recycle Bin is a goldmine of forensic evidence. While users can attempt to bypass it, forensic tools can often recover deleted files and metadata. By understanding the Recycle Bin's structure and metadata files, investigators can uncover valuable information during an investigation.
-------------------------------------------------Dean------------------------------------------------------
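For readers who want to see what $I_Parse and RBCmd actually decode, here is a minimal PowerShell sketch of the Windows 10/11 $I layout (an 8-byte version field of 2, the original size, a FILETIME deletion timestamp, a character count, then the UTF-16 path; the sample file name and path are hypothetical):

$bytes   = [System.IO.File]::ReadAllBytes('C:\Cases\RecycleBin\$IZZOXEO.pdf')
$version = [BitConverter]::ToInt64($bytes, 0)                     # 2 = Windows 10/11 format
$size    = [BitConverter]::ToInt64($bytes, 8)                     # original file size in bytes
$deleted = [DateTime]::FromFileTimeUtc([BitConverter]::ToInt64($bytes, 16))
if ($version -eq 2) {
    $chars = [BitConverter]::ToInt32($bytes, 24)                  # path length incl. null terminator
    $path  = [Text.Encoding]::Unicode.GetString($bytes, 28, ($chars - 1) * 2)
} else {
    # Vista through 8.1 used a fixed 520-byte path field starting at offset 24
    $path  = [Text.Encoding]::Unicode.GetString($bytes, 24, 520).TrimEnd([char]0)
}
[PSCustomObject]@{ DeletedOn = $deleted; SizeBytes = $size; OriginalPath = $path }

This is only a sketch for understanding the structure; for casework, stick to maintained tools such as RBCmd or $I_Parse.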
- Understanding and Managing Thumbnail Cache in Windows: Tools thumbcache_viewer_64
Introduction
The thumbnail cache in Windows is an essential feature that helps speed up the display of folders by storing thumbnail images. Tool: Thumbcache Viewer.

What is the Thumbnail Cache?
The thumbnail cache is a set of database files used by Windows to store thumbnail images of files and folders. This cache allows Windows to quickly display thumbnails without needing to regenerate them each time you open a folder.

Location of Thumbnail Cache Files
Windows 8, 10, and 11: the thumbnail cache files are stored in C:\Users\<username>\AppData\Local\Microsoft\Windows\Explorer (replace <username> with the actual Windows username). To access this folder, press Win + R to open the Run dialog, type %localappdata%\Microsoft\Windows\Explorer and press Enter.
Windows 7: the location is the same as in newer versions, C:\Users\<username>\AppData\Local\Microsoft\Windows\Explorer, reachable the same way via the Run dialog.

Types of Files in the Thumbnail Cache
In the Explorer folder, you will find several files, each representing different sizes and types of thumbnails. These include:
thumbcache_32.db: thumbnails of size 32x32 pixels.
thumbcache_96.db: thumbnails of size 96x96 pixels.
thumbcache_256.db: thumbnails of size 256x256 pixels.
thumbcache_1024.db: thumbnails of size 1024x1024 pixels.
thumbcache_idx.db: index file for the thumbnails.

Viewing Thumbnail Cache Files
To view the contents of these thumbnail cache files, you can use a tool like Thumbcache Viewer, a free tool that supports Windows 7 through Windows 10 thumbnails and lets you open and examine the thumbnail cache database files. Here's how to use it:
Download Thumbcache Viewer.
Install the tool.
Open the thumbnail cache files: launch Thumbcache Viewer and open the files located in the Explorer directory.
View thumbnails: the tool displays the thumbnails stored in the cache, allowing you to browse and inspect them.
(A quick PowerShell listing and copy of these files follows at the end of this article.)

Practical Uses: Forensics and Investigation
For forensic investigators, examining thumbnail cache files can reveal important information about files and images that were present on the system. Using tools like Thumbcache Viewer, investigators can recover thumbnails of deleted files, providing crucial evidence.
*************************
To know more, check out the article from the Thumbcache Viewer project itself: https://thumbcacheviewer.github.io/
*************************
Conclusion
The thumbnail cache in Windows is a useful feature that enhances the user experience by speeding up folder display. Knowing how to access, view, and manage these cache files can be beneficial for both everyday users and professionals. Tools like Thumbcache Viewer make it easy to inspect these files, and regular maintenance can help keep your system running smoothly. Akash Patel
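To grab the cache databases for offline review in Thumbcache Viewer, a quick PowerShell listing and copy such as the following can be used (paths as described above; the destination folder is an example and should be created first):

Get-ChildItem "$env:LOCALAPPDATA\Microsoft\Windows\Explorer" -Filter 'thumbcache_*.db' |
    Select-Object Name, Length, LastWriteTime

Copy-Item "$env:LOCALAPPDATA\Microsoft\Windows\Explorer\thumbcache_*.db" -Destination 'C:\Cases\Thumbcache\' -Force

Note that Explorer may hold some of these files open on a live system, so collection from a mounted image or via a raw-read collector such as KAPE is more reliable.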
- Automating Google Drive Forensics: Tools & Techniques
Investigating Google Drive for Desktop can be a time-consuming process, especially when dealing with protobuf-encoded metadata and cached files. Fortunately, open-source forensic tools like gMetaParse and DriveFS Sleuth make the job significantly easier.
-------------------------------------------------------------------------------------------------------------
1️⃣ Automating Metadata Extraction with gMetaParse
🔍 What is gMetaParse? Developed by a forensic researcher, gMetaParse is a Python-based tool that automates the extraction of metadata from Google Drive's metadata_sqlite_db database.
📌 Key Features of gMetaParse:
✅ Extracts metadata for all files and folders in Google Drive
✅ Identifies cached files stored locally
✅ Detects deleted (trashed) files
✅ Provides CSV, JSON, and GUI output
📍 Installation & Usage: gMetaParse is available as a Python script or pre-compiled .exe and can be run via the command line or with a graphical user interface (GUI).
🛠️ Step-by-Step: Running gMetaParse
1️⃣ Open a command prompt and navigate to the gMetaParse folder.
2️⃣ Run the following command:
gMetaParse.exe -f "C:\Users\Akash\AppData\Local\Google\DriveFS\\metadata_sqlite_db" -d "C:\Users\Akash\AppData\Local\Google\DriveFS\\content_cache" -o "C:\Users\Akash\Downloads\GoogleDriveFS.csv" -g
📌 Explanation:
✅ -f → points to the Google Drive metadata database
✅ -d → specifies the cache folder location
✅ -o → outputs results in CSV format
✅ -g → launches the GUI for interactive file browsing
-------------------------------------------------------------------------------------------------------------
2️⃣ Visualizing Google Drive Data with the gMetaParse GUI
📌 Why Use the GUI? While CSV/JSON outputs are useful for analysis, gMetaParse's graphical interface makes it easier to navigate large file structures and visually identify deleted or cached files.
🔍 Features of the gMetaParse GUI:
✅ Tree-structure visualization of Google Drive contents
✅ Color-coded files: 🟥 Red → deleted (trashed) files; 🟩 Green → cached (available locally)
✅ Detailed metadata view when clicking a file
📌 Forensic Use:
✅ Quickly identify deleted files and restore local copies
✅ Filter & search files using metadata
✅ Export all metadata for offline analysis
-------------------------------------------------------------------------------------------------------------
3️⃣ Extracting Google Drive Metadata with DriveFS Sleuth
🔍 What is DriveFS Sleuth? DriveFS Sleuth is an advanced Google Drive forensics tool developed by Amged Wageh and Ann Bransom. It specializes in decoding protobuf-encoded data and recovering MD5 hashes from Google Drive metadata.
📌 Key Features of DriveFS Sleuth:
✅ Parses metadata_sqlite_db, extracting file metadata, timestamps, and hashes
✅ Recovers MD5 hashes for locally stored files
✅ Extracts account information (Google email, username, settings)
✅ Provides interactive HTML reports
📍 Installation & Usage:
🛠️ Step-by-Step: Running DriveFS Sleuth
1️⃣ Install Python 3 (if not already installed).
2️⃣ Download DriveFS Sleuth from GitHub.
3️⃣ Run the following command:
python3 drivefs_sleuth.py /mnt/c/Users/Akash/AppData/Local/Google/DriveFS --html -o /mnt/c/Users/Akash/Downloads/GoogleDriveFS.html
-------------------------------------------------------------------------------------------------------------
4️⃣ Investigating Google Workspace Logs (Business & Enterprise)
🔍 Why Are Google Workspace Logs Important?
For enterprise environments, Google Workspace logs provide a detailed audit trail of user activity, including:
✅ File uploads, downloads, modifications
✅ File sharing (internal & external users)
✅ Deleted items & recovery attempts
✅ Login history & suspicious access attempts
📍 Accessing Google Workspace Logs:
1️⃣ Log in to the Google Admin Console (admin.google.com).
2️⃣ Navigate to Reports > Audit > Drive Log.
3️⃣ Filter logs based on event type, user, date range, or filename.
4️⃣ Export logs to CSV for offline analysis.
-------------------------------------------------------------------------------------------------------------
5️⃣ Filtering Google Workspace Logs for Investigation
📌 Key Log Categories & Event Names:
FileUploaded – user uploaded a new file
FileDownloaded – file downloaded from Google Drive
FileDeleted – file moved to trash
FileCopied – file duplicated within Drive
AnonymousLinkCreated – file shared externally via public link
FileViewed – file opened by the user
📌 Forensic Use:
✅ Identify suspicious file sharing (e.g., external link creation)
✅ Track deleted files & their recovery attempts
✅ Correlate file access with IP addresses & user accounts
-------------------------------------------------------------------------------------------------------------
6️⃣ Investigating File Sharing & External Access
📌 How to Identify External File Sharing:
✅ Look for AnonymousLinkCreated → indicates public file sharing
✅ Check IP addresses in logs → identify external access
✅ Cross-reference Google Drive metadata → find locally cached shared files
📍 Example: Investigating an External File Share
1️⃣ Search logs for AnonymousLinkCreated.
2️⃣ Identify which file was shared and by which user.
3️⃣ Check logs for FileDownloaded → determine whether the file was accessed externally.
4️⃣ Extract IP addresses & timestamps → track external access.
-------------------------------------------------------------------------------------------------------------
Conclusion
Google Drive forensics plays a crucial role in modern digital investigations, providing insights into file synchronization, access history, deletions, and metadata changes. By analyzing local artifacts, cloud logs, and sync databases, forensic analysts can reconstruct user activity and track evidence even after files have been deleted or modified. Understanding key artifacts such as Google Drive logs, SQLite databases, and API activity allows investigators to uncover who accessed what files, when, and from where: a critical aspect of forensic timelines.
🚀 Keep exploring, stay curious, and refine your forensic skills, because digital evidence is everywhere! 🔍
🎯 Next Up: Dropbox Forensics – Investigating Cloud Storage Security 🚀
-----------------------------------------------Dean-----------------------------------------------
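Before running gMetaParse or DriveFS Sleuth, it can be useful to sanity-check the database with the sqlite3 command-line shell, simply to confirm it is a readable SQLite file and to see which tables are present (this assumes sqlite3.exe is on your PATH; the per-account folder under DriveFS varies and is shown here as a placeholder):

sqlite3.exe "C:\Users\Akash\AppData\Local\Google\DriveFS\<account_folder>\metadata_sqlite_db" ".tables"

If the file opens and lists tables, the parsers above should handle it; if sqlite3 reports that it is encrypted or not a database, work from a clean copy of the DriveFS folder instead.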
- Detailed explanation of SPF, DKIM, DMARC, ARC
Updated on 28 January, 2025

Email security has always been a challenge because the Simple Mail Transfer Protocol (SMTP) wasn't built with security in mind. This makes it easy for cybercriminals to spoof email addresses and launch phishing, scam, or spam attacks. However, various email authentication mechanisms have been introduced to help verify senders and detect fraudulent messages. When analyzing an email header, you'll often see these security measures in action.
------------------------------------------------------------------------------------------------------------
Sender Policy Framework (SPF)
SPF helps verify whether an email was sent from an authorized mail server for a particular domain. You'll often find this in the header under the Received-SPF line. Think of SPF as a guest list for a party: only specific mail servers are allowed to send emails on behalf of a domain. If an email comes from an unauthorized source, it fails SPF, raising a red flag.
Received-SPF: pass (google.com: domain of n0459381b14-ceb4982011ad4618-nikopirosmani22===gmail.com@bounce.twitter.com designates 199.16.156.176 as permitted sender) client-ip=199.16.156.176;
Header Entry: Received-SPF: this header field indicates the outcome of SPF validation. A "pass" typically signifies a legitimate email, while a "fail" might indicate a potentially suspicious email.
Example: if an email is supposedly from outlook.com, the SPF record ensures it was actually sent by Microsoft's mail servers.
------------------------------------------------------------------------------------------------------------
DomainKeys Identified Mail (DKIM)
DKIM takes email authentication a step further by verifying both the sender and the integrity of the message content. It uses a digital signature, which is added to the email header by the sending server. If this signature is valid, it confirms two things: the email genuinely came from the stated domain, and the content wasn't tampered with in transit.
Header Entry: DKIM-Signature: this header field contains the DKIM signature and associated information. A successful DKIM validation usually results in a "pass" status.
------------------------------------------------------------------------------------------------------------
Authenticated Received Chain (ARC)
Emails often get forwarded through mailing lists, auto-forwarding, or relays. When that happens, SPF and DKIM checks may fail because the email's route has changed. That's where ARC comes in: ARC keeps track of authentication results at each hop, maintaining a chain of trust.
ARC-Message-Signature: i=1; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-2016081 h=feedback-id:message-id:precedence:list-unsubscribe:mime-version:subject:to:from:date:dkim-signature;
Every forwarding step is recorded in the email header, and each mail server in the chain signs the message with an ARC-Message-Signature. This way, even if SPF and DKIM fail due to forwarding, ARC can confirm that the original email was legitimate. Google was one of the first major email providers to adopt ARC, followed by Microsoft 365 and others.
------------------------------------------------------------------------------------------------------------
Domain-based Message Authentication, Reporting, and Conformance (DMARC)
DMARC builds on SPF and DKIM by letting domain owners specify what should happen if an email fails authentication checks.
The policy can be set to:
None (just monitor emails without blocking them)
Quarantine (send suspicious emails to the spam folder)
Reject (completely block failed emails from delivery)
Header Entry: dmarc: this header field displays the DMARC policy status, which can be "pass," "fail," "none," or other designated states. It also indicates policy actions like "p=REJECT" or "p=NONE."
------------------------------------------------------------------------------------------------------------
Verifying Email Authentication for Investigations
If you're investigating a suspicious email, checking SPF, DKIM, ARC, and DMARC records can help confirm its legitimacy. Here are some practical tools (a quick DNS-lookup sketch follows at the end of this article):
MxToolbox – checks SPF records and other email security details.
dkimpy (Python library) – validates DKIM and ARC signatures.
Metaspike Forensic Email Intelligence – automates email header analysis for forensic investigations.
Limitations of Email Authentication
While these security measures are powerful, they aren't foolproof. Here's what you should keep in mind:
Not all email providers use SPF, DKIM, and ARC.
DKIM and ARC signatures can expire when mail servers rotate their keys, making it impossible to validate old emails.
These authentication methods only apply to received emails, not emails in the sender's outbox.
Microsoft Outlook and Exchange may modify email headers, making DKIM validation difficult for emails stored in PST/OST files.
To ensure authenticity, collect emails in their original MIME format (EML, EMLX, or MBOX).
Implications for Digital Forensics
Enhanced Verification: SPF, DKIM, and DMARC provide digital forensic professionals with additional tools for email verification and authentication, enhancing the accuracy and reliability of forensic investigations.
Policy Interpretation: understanding DMARC policies can help investigators interpret email handling procedures and identify potential red flags or suspicious activities.
Privacy and Compliance: while these protocols enhance security, forensic professionals must also ensure that their methods align with privacy regulations like GDPR, respecting user consent and data protection rights.
Conclusion
SPF, DKIM, and DMARC have become integral components of modern email security, offering robust mechanisms for authentication, integrity, and policy enforcement. As these protocols continue to evolve, digital forensic professionals must stay updated with the latest trends and practices to effectively navigate the complexities of email-based investigations, ensuring both security and compliance in their endeavors. Akash Patel
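As a quick supplement to the tools listed above, the published SPF and DMARC policies for a sender's domain can be pulled straight from DNS with PowerShell (example.com is a placeholder for the domain taken from the suspicious email's From/Return-Path headers):

Resolve-DnsName -Type TXT -Name 'example.com' | Where-Object { $_.Strings -match 'v=spf1' } | Select-Object -ExpandProperty Strings
Resolve-DnsName -Type TXT -Name '_dmarc.example.com' | Select-Object -ExpandProperty Strings

The first command returns the domain's SPF record (the list of permitted senders); the second returns the DMARC record, including the p= policy (none, quarantine, or reject) described above.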
- Webmail Forensics / Mobile Email Forensics: A Critical Component of Digital Investigations
Introduction
Webmail forensics is a crucial aspect of digital investigations, especially in cases involving cybercrime, fraud, and eDiscovery. Understanding how webmail services operate, where data is stored, and how to extract and analyze it effectively is essential for forensic examiners.

Mobile Email Considerations
Many investigators overlook mobile email when acquiring evidence. While some smartphones sync with corporate mail servers and only maintain copies of emails, mobile devices can contain valuable messages that are difficult to retrieve elsewhere. Therefore, it is important to:
Verify how email-capable phones interact with mail servers.
Assess whether cloud acquisition is in scope.
Investigate Mobile Device Management (MDM) software logs, which may provide metadata for SMS/MMS, call logs, or device backups.

Mobile Email Backups
Smartphone backups can provide historical data, even if the mobile device is unavailable. Investigators should search for backup files:
Android backup files (.ab extension) or vendor-specific backups like Samsung Smart Switch, LG Bridge, and Huawei HiSuite.
iOS backups stored in locations such as:
C:\Users\<username>\AppData\Roaming\Apple Computer\MobileSync\Backup
C:\Users\<username>\Apple\MobileSync\Backup
These backups may contain email messages, contacts, and configuration files, aiding forensic analysis.

Windows "Phone Link" Application
Introduced in Windows 10 as "Your Phone" and rebranded as "Phone Link" in Windows 11, this application provides access to:
Call logs (calling.db)
Contacts (contacts.db)
SMS/MMS messages (phone.db)
Photos (photos.db)
Notifications (notifications.db)
These SQLite databases, stored under %UserProfile%\AppData\Local\Packages\Microsoft.YourPhone_8wekyb3d8bbwe\LocalCacheIndexed\\System\Database, can be extracted and analyzed using tools like KAPE (WindowsYourPhone.tkape) and SQLECmd (a hedged command example follows at the end of this article).

Webmail Investigation Steps
Identify email clients and services: determine what email clients exist on the system and whether the user relies on webmail services. Review system folder structures, check the Windows registry for installed email applications, and examine browser history, cookies, and cached files for webmail use.
Forensic acquisition of email data: acquire mail archives within the scope of authority. Extract both server mailboxes and local storage. Convert email archives into a consistent format, such as PST, for easier analysis while retaining original files for authenticity checks.
Email header and metadata analysis: extract and analyze email headers to trace the origin and integrity of messages. Validate email authenticity using DKIM/ARC signatures. Identify sender IP addresses and geolocation. Cross-reference timestamps for consistency.

Commercial and Open-Source Tools
Commercial tools like Metaspike Forensic Email Intelligence (FEI) provide extensive features, including:
SMTP/MAPI header parsing.
Email validation and timestamp extraction.
IP and domain intelligence.
Advanced searching and filtering of email archives.
Additionally, forensic tools like Autopsy's Your Phone Analyzer module can help parse mobile email artifacts.

Conclusion
Webmail forensics plays a vital role in digital investigations. By understanding how emails are stored, retrieved, and analyzed across devices, forensic examiners can uncover critical evidence. Utilizing both forensic best practices and specialized tools ensures thorough and accurate email investigations.
-------------------------------------------Dean-----------------------------------------------
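A hedged example of the collection-and-parse flow mentioned above, using the WindowsYourPhone KAPE target named in this article together with Eric Zimmerman's SQLECmd (the output paths are examples; SQLECmd needs its bundled map files to label the tables):

kape.exe --tsource C: --tdest C:\temp\T_out --target WindowsYourPhone
SQLECmd.exe -d C:\temp\T_out --csv C:\temp\M_out

The resulting CSVs should cover the calling.db, contacts.db, phone.db, photos.db, and notifications.db databases listed above, ready for review or timeline ingestion.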
- Webmail Forensics: Challenges, Techniques, and Investigation Tools
Update on 29 Jan, 2025

Webmail presents unique challenges for forensic investigations due to its cloud-based nature. Unlike traditional email clients that store messages locally, most webmail exists solely on servers operated by email service providers (ESPs). This lack of offline archives makes forensic analysis more complex unless the user has enabled offline storage via the POP or IMAP protocols. In such cases, emails can be retrieved from the user's email client using standard host-based forensic techniques. Otherwise, forensic investigators must rely on keyword searches, data carving, or legal requests to the ESP for email preservation and release.
--------------------------------------------------------------------------------------------------------
Challenges in Investigating Webmail
One of the biggest hurdles in webmail investigations is identifying whether webmail is being used and determining which accounts belong to the target. Web browser forensics can help uncover email activity by analyzing:
Browser history and cached data
Auto-complete databases
Saved passwords (if legally permissible)
Regular expression searches for email addresses
--------------------------------------------------------------------------------------------------------
Techniques for Webmail Collection
Google Takeout and Similar Tools
Many service providers offer tools for users to download their data, including:
Emails (stored in MBOX format)
Contacts, calendars, and bookmarks
Drive files, Chrome history, and passwords
For forensic investigations, this method requires the target's credentials and, if enabled, multi-factor authentication.
IMAP Synchronization
A simple yet effective way to collect webmail is through the IMAP protocol. This involves setting up an email client on a forensic workstation and synchronizing the target's mailbox. However, Outlook is not ideal for forensic collection as it modifies email headers, which can impact DKIM and ARC validation. IMAP is widely used for collecting emails from providers that lack dedicated APIs, including:
Outlook.com
Hotmail
Yahoo Mail
iCloud
AOL Mail
Forensic Email Collection Tools
Several specialized tools streamline webmail forensic investigations:
1. Magnet AXIOM
Supports cloud-based email collection from Google Workspace, Microsoft 365, iCloud, and more.
Uses API integration for forensic acquisition, requiring Super Admin privileges for enterprise accounts.
2. Metaspike Forensic Email Collector (FEC)
Supports Microsoft 365 via Exchange Web Services, Microsoft Graph API, and IMAP.
Captures Gmail, Google Workspace, and Microsoft webmail accounts.
Uses a unique Remote Authenticator to extract authentication tokens from a live system.
Provides IMAP server logs, useful for detecting message manipulation via internal sequence numbers and timestamps.
https://www.metaspike.com/software/
--------------------------------------------------------------------------------------------------------
Legal Requests for Webmail Data
Each major ESP and social media platform offers legal and law enforcement guides detailing how investigators can request user data. These documents, often restricted to law enforcement, provide valuable insights into:
Data retention policies
Available subscriber information
Logging details such as IP addresses used for account creation and access
Similar legal resources exist for Google, Facebook, and Microsoft.
Transparency reports from these providers give insight into the volume and nature of legal requests they receive.
--------------------------------------------------------------------------------------------------------
Browser Artifacts
Webmail services like Gmail, Yahoo Mail, and Outlook are often accessed through web browsers, leaving behind a wealth of forensic artifacts. These browser-based traces can provide valuable insights into user activity, making them a key source of evidence in digital investigations. Whether analyzing a potential email compromise or tracking user communications, forensic experts can uncover crucial details through browser history, cache, and memory analysis.
The Role of Browser Artifacts in Webmail Forensics
Since webmail is accessed through browsers, artifacts left behind in browser history, cookies, cache, and session data can reveal:
Webmail account names and providers – identifying which webmail services were used.
Email subject lines – some services, like Gmail, include the subject line of opened emails in the page title, making it easier to conduct deeper searches.
Folder structures and accessed emails – URL parameters and page titles often indicate which email folders were accessed (e.g., Inbox, Sent, Drafts, Trash).
Composed messages – identifying if and when new messages were composed can be crucial in cases of email compromise.
Search activity – users frequently search within their webmail, and these search terms can reveal important topics of interest or specific emails accessed.
Analyzing Browser History and Cache
Browser history is a primary source of forensic evidence, as it contains URLs, timestamps, and referrer data. Additionally, cached webmail data can contain valuable remnants, though modern dynamic web content has made these traces less common. A strategic approach is to filter browser cache files for relevant webmail domains and then manually examine them. JSON and XML formats are commonly used, so a viewer that supports these formats can help analyze extracted data. For instance:
Gmail cache files may contain a list of recent email contacts.
Yahoo Mail cache files have been found to store search terms used by the user, sometimes spanning multiple years.
A common technique is to filter search results by keywords like "mail" to identify relevant artifacts. Zero-byte files, which are often present, can be ignored to streamline the investigation.
--------------------------------------------------------------------------------------------------------
Memory Analysis for Webmail Artifacts
Capturing a system's memory can be one of the most effective ways to extract webmail data. While email content is rarely stored long-term in browser caches, it often remains in system memory while the session is active. Forensic tools like Magnet AXIOM (previously Internet Evidence Finder), Belkasoft, and AccessData specialize in carving webmail remnants out of memory images. These tools can recover:
Complete webmail messages
Email metadata
Session tokens and authentication data
--------------------------------------------------------------------------------------------------------
Webmail Forensics
Arsenal Recon has developed an open-source tool called GmailURLDecoder, designed to extract and decode Gmail URLs from forensic output files. This tool can reveal embedded timestamps and other key information, making it a valuable asset for investigators.
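One practical way to action the "regular expression searches for email addresses" idea from earlier in this article is a quick sweep of exported browser history and cache files for address-like strings (the folder below is an example export location; binary cache formats only partially yield to a text search, so treat this as triage rather than a complete answer):

Get-ChildItem 'C:\Cases\BrowserArtifacts' -Recurse -File |
    Where-Object { $_.Length -gt 0 } |
    Select-String -Pattern '[\w.+-]+@[\w-]+\.[\w.]+' -AllMatches |
    ForEach-Object { $_.Matches.Value } |
    Sort-Object -Unique

The resulting unique address list gives you candidate webmail accounts to pivot on in history, cache, and memory analysis.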
-------------------------------------------------------------------------------------------------------- Conclusion Webmail forensics is an essential aspect of modern digital investigations. By leveraging browser artifacts, cache data, and memory analysis, forensic experts can uncover valuable insights into email activity. While dynamic web content has reduced the amount of recoverable data in browser caches, careful search techniques and forensic tools can still reveal critical evidence. -------------------------------------------Dean---------------------------------------------------------
- Microsoft 365: Content Search, Unified Audit Logs, and Extracting Logs for Investigations
Updated on 29 Jan, 2025

Microsoft 365 Purview Compliance Manager offers a powerful Content Search feature that allows organizations to search across emails, Teams chats, SharePoint, OneDrive for Business, and even CoPilot usage. This tool is often the first stop when investigating emails and other online content.
Key Features of Content Search
Extensive search scope: covers emails, Teams chats, SharePoint, OneDrive, and CoPilot interactions.
Search refinement: filter results based on keywords, email addresses, and other parameters.
Preview and export: search results can be estimated, previewed, and ultimately exported.
Integration with eDiscovery: enables litigation holds and deeper investigative workflows.
Microsoft Purview Licensing and Access
The features available depend on the organization's Microsoft license:
E5 license: access to Premium eDiscovery tools, including Advanced Audit Logging.
Lower-tier licenses: access to Standard eDiscovery tools, which still provide audit log search capabilities.
--------------------------------------------------------------------------------------------------------
Exporting Mailboxes to .PST Format
Microsoft 365 allows the export of mailboxes via Content Search. Once a search is completed, results can be exported in .PST format for emails, while SharePoint and OneDrive content is exported in native formats.
Export Limitations
Maximum 2 TB of data per search per day.
Supports up to 100,000 mailboxes per export.
Individual .PST files are capped at 10 GB, with large searches split into multiple files.
A maximum of 10 exports can run simultaneously.
To perform an export, the user must be assigned the eDiscovery Manager role.
--------------------------------------------------------------------------------------------------------
Unified Audit Logs (UAL) and Their Importance
Microsoft 365 provides Unified Audit Logs (UAL) for tracking activity across Exchange Online, SharePoint Online, OneDrive for Business, and Azure AD. These logs help security teams investigate potential threats and track attacker activities.
Key Points About UAL:
Enabled by default (since 2019): previously, logging had to be manually enabled for each user.
Retention policy: 90 days by default; up to 1 year for Microsoft 365 E5 users; Azure AD logs are retained for up to 180 days (depending on the license).
Export format: logs are exported in JSON format and can be processed using third-party tools for extended retention.
-------------------------------------------------------------------------------------------------------
Auditing and Logging
Office 365 offers built-in auditing and APIs for Exchange Online, SharePoint Online, OneDrive for Business, and Azure AD. However, mailbox-level auditing is not always enabled by default. Here's how you can enable auditing for a user via PowerShell:
Set-Mailbox -Identity "Akash Patel" -AuditEnabled $true
When enabling logging, not all items are logged by default. You can chain multiple commands to set all available logging options for mailbox owner accounts:
Get-Mailbox -ResultSize Unlimited -Filter {RecipientTypeDetails -eq "UserMailbox"} | Set-Mailbox -AuditEnabled $true -AuditOwner "Create,HardDelete,MailboxLogin,Move,MoveToDeleteditems,SoftDelete,Update"
What to Keep in Mind
Logging limitations: logging in Office 365 has limitations, such as no logoff events and limited logging for non-admin accounts.
Log retrieval time: logs for SharePoint and OneDrive are typically available 15 minutes after the event, while Exchange Online and Azure AD logs may take between 30 minutes and 12 hours.
--------------------------------------------------------------------------------------------------------
One critical audit category is MailItemsAccessed, which logs when a user or attacker views emails. Initially restricted to admin users, it is now available for all tenants, though the rollout has been slow.
--------------------------------------------------------------------------------------------------------
Investigating Logs with PowerShell
The Search-UnifiedAuditLog PowerShell cmdlet is a powerful tool for log analysis, for example:
Search-UnifiedAuditLog -StartDate 29/01/2025 -EndDate 30/01/2025 -UserIds <UserPrincipalName> -Operations MailItemsAccessed
(A fuller export example follows at the end of this article.)
Log Availability:
SharePoint & OneDrive logs: available ~15 minutes after events.
Exchange Online & Azure AD logs: may take 30 minutes to 12 hours to appear.
--------------------------------------------------------------------------------------------------------
Extracting Microsoft 365 Audit Logs Efficiently
Extracting logs manually can be cumbersome due to limitations in Microsoft's interface. Fortunately, third-party tools simplify this process:
Microsoft-Extractor-Suite: I have created a detailed article on Microsoft-Extractor-Suite (it is enough to get you up and running, and it also covers Microsoft-Analyzer-Suite, another tool that helps with investigation): Streamlining Cloud Log Analysis with Free Tools: Microsoft-Extractor-Suite and Microsoft-Analyzer-Suite: https://www.cyberengage.org/post/streamlining-cloud-log-analysis-with-free-tools-microsoft-extractor-suite-and-microsoft-analyzer-su
Hawk (PowerShell-based investigation tool) (will create an article on this in the future). GitHub link: Hawk - O365 Intrusion Analysis
--------------------------------------------------------------------------------------------------------
Final Thoughts
Microsoft 365 Purview provides robust eDiscovery, search, and audit capabilities for compliance and security teams. Understanding how to effectively leverage these tools, alongside PowerShell and third-party utilities, can make investigations faster and more efficient. Ensure that audit logs are enabled and verify logging configurations to avoid surprises during critical incidents!
------------------------------------------------------Dean-----------------------------------
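Building on the cmdlet shown above, a slightly fuller sketch that pages through a larger result set and saves it for offline analysis might look like this (the UPN and output path are placeholders; date handling and result-size limits vary by tenant):

Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) -UserIds user@example.com -Operations MailItemsAccessed -SessionCommand ReturnLargeSet -ResultSize 5000 | Export-Csv 'C:\Cases\UAL_MailItemsAccessed.csv' -NoTypeInformation

For anything beyond a quick look, the Microsoft-Extractor-Suite and Hawk tools mentioned above handle paging and export far more gracefully.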
- Leveraging Compliance Search in Microsoft Exchange for Email Investigations
Microsoft Exchange offers powerful tools for searching, archiving, and reviewing emails. One of these tools, Compliance Search, is designed for eDiscovery but is equally effective for tracking suspicious emails, investigating malware incidents, or responding to security breaches.

What is Compliance Search?
Compliance Search first appeared in Exchange 2013. It provides a highly granular way to conduct email investigations by leveraging Exchange's built-in indexing system. This indexing allows for efficient searches across email contents, including attachments, subject lines, and metadata.
For on-premises Exchange servers, there is no limit to the number of mailboxes that can be searched, but each individual search is restricted to a maximum of 500 mailboxes and 50 GB of data. In Microsoft 365, different limits may apply.
What Can You Search?
Email messages (including body text and metadata)
Attachments (except encrypted files or unsupported formats)
Contacts and calendar entries
Deduplication options (to avoid duplicate search results)
Compliance Search in Action
New-ComplianceSearch -name "Legal Case 280" -ExchangeLocation "Operations" -ContentMatchQuery "'Query' AND 'Akash'"
In Office 365, a GUI is provided within the Compliance Center for easier execution. (A short sketch for starting and checking this search follows at the end of this article.)
Exchange 2010: The Predecessor to Compliance Search
Before Compliance Search, Exchange 2010 relied on "Multi-Mailbox Search." While less refined than Compliance Search, it offered advanced searching capabilities within a designated Discovery Management user group, which allowed specific users to conduct advanced searches across the Exchange domain.
Compliance Search in Microsoft 365
For Microsoft 365 Exchange Online, Compliance Search is integrated into the Microsoft Purview interface, offering additional features such as:
Expanded search capabilities (including Teams, OneDrive, SharePoint, and even CoPilot AI prompts)
Keyword statistics (helping refine search terms and estimate matching results)
Litigation holds (preventing deletion of identified emails, including future messages related to a case)
This makes Compliance Search a crucial tool for legal teams, cybersecurity analysts, and IT administrators when handling data retention, incident response, and regulatory compliance.
References
[1] Use Compliance Search to Search All Mailboxes in Exchange 2016: https://learn.microsoft.com/en-us/exchange/policy-and-compliance/ediscovery/compliance-search?view=exchserver-2019&redirectedfrom=MSDN
[2] New-ComplianceSearch: https://learn.microsoft.com/en-us/powershell/module/exchange/new-compliancesearch?view=exchange-ps&redirectedfrom=MSDN
-------------------------------------------Dean--------------------------------------------------
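To round out the New-ComplianceSearch example above, the search still has to be started and checked before anything is exported or purged. A minimal sketch of that lifecycle, run from an Exchange Online / Security & Compliance PowerShell session, with the purge step left commented out because it is destructive:

Start-ComplianceSearch -Identity "Legal Case 280"
Get-ComplianceSearch -Identity "Legal Case 280" | Select-Object Name, Status, Items
# Only after the results have been reviewed and approved:
# New-ComplianceSearchAction -SearchName "Legal Case 280" -Purge -PurgeType SoftDelete

The Items count from Get-ComplianceSearch is a useful sanity check on your ContentMatchQuery before exporting results or placing holds.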







