
- Box Cloud Storage Forensic Investigations: Logs, Cached Files, and Metadata Analysis
Box is one of the most forensic-friendly cloud storage applications, offering extensive logging, locally cached files, and SQLite databases that track user activity and file metadata. This makes it a goldmine for forensic investigators looking to analyze user interactions, deleted files, and cloud-stored documents.
-------------------------------------------------------------------------------------------------------------
1️⃣ Box Local Artifacts: Logs, Databases, and Cache Files
🔍 Where Does Box Store Metadata and Logs Locally?

| File/Database | Location | Purpose |
|---|---|---|
| sync.db | %LocalAppData%\Box\Box\data\ | Tracks both locally cached & cloud-only files |
| streemsfs.db | %LocalAppData%\Box\Box\data\ | Lists cached files stored locally |
| metrics.db | %LocalAppData%\Box\Box\data\ | Stores the Box login email & authentication details |
| Box_Streem logs | %LocalAppData%\Box\Box\logs\ | Detailed user activity logs (file access, sync) |
| .cache | %UserProfile%\Box\ | Stores offline & temporary cached files |

📌 Forensic Use: ✅ Recover cached files, including cloud-only items accessed offline ✅ Track user logins and Box authentication details ✅ Extract timestamps and SHA1 hashes for forensic verification
-------------------------------------------------------------------------------------------------------------
2️⃣ Understanding Box's NTFS Reparse Points & Virtual File System
🔍 How Does Box Handle Cloud-Only Files?
Box uses NTFS reparse points to create a virtualized file system, meaning: files appear local, but the actual content may be in the cloud. Interacting with a file triggers a real-time download. Box Drive contents won't be visible in traditional forensic imaging.
📍 Forensic Implications: 🔸 Standard disk imaging won't capture Box cloud files if they are not cached. 🔸 Investigators must access live systems or parse Box databases to extract metadata. 🔸 Offline files can be recovered from the cache directory.
-------------------------------------------------------------------------------------------------------------
3️⃣ Extracting Metadata from Box SQLite Databases
🔍 1. Analyzing sync.db: The Box File Tracker
Note: Deleted items in the Box Drive trash folder are not tracked in this database.
📍 Located at: %LocalAppData%\Box\Box\data\sync.db

| Column | Description |
|---|---|
| name | Original filename |
| checksum | SHA1 hash of the file |
| size | File size (bytes) |
| content_created_at | File creation time (Unix epoch) |
| content_updated_at | Last modification time (Unix epoch) |
| parent_item_id | Parent folder, which can be cross-referenced with the box_id field to find the folder name |

Note: Timestamps do not appear to update as expected when interfacing with Box on the website via a browser. For example, content_created_at and content_updated_at are both set to the original file modification time when the file is added via the browser. When using Windows File Explorer to interact with files, timestamps update as expected.
Useful field from the local_item table: inode, a universal ID assigned to the file, which can be useful for quickly matching records against other databases such as streemsfs.db.
📌 Forensic Use: ✅ Identify cloud-only files and locally stored files ✅ Verify file integrity using SHA1 checksums ✅ Correlate file timestamps with user activity
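To pull these fields quickly during triage, the sqlite3 module that ships with Python is enough. The snippet below is a minimal sketch rather than a definitive parser: the database path is a placeholder, and because the table name holding these columns is not documented here and can vary between Box Drive versions, the script discovers a table that exposes the columns listed above instead of assuming one.

```python
import sqlite3

# Placeholder path: point this at the collected copy of sync.db.
DB = r"C:\cases\Box\data\sync.db"

WANTED = {"name", "checksum", "size", "content_created_at", "content_updated_at"}

con = sqlite3.connect(DB)
con.row_factory = sqlite3.Row

# Find whichever table carries the documented columns, then dump its rows
# with the Unix-epoch timestamps converted to readable UTC strings.
for (table,) in con.execute("SELECT name FROM sqlite_master WHERE type='table'"):
    cols = {row[1] for row in con.execute(f"PRAGMA table_info({table})")}
    if WANTED <= cols:
        for row in con.execute(
            f"SELECT name, checksum, size, "
            f"datetime(content_created_at, 'unixepoch') AS created, "
            f"datetime(content_updated_at, 'unixepoch') AS updated "
            f"FROM {table}"
        ):
            print(dict(row))
con.close()
```

Working on a copied database rather than the live file avoids lock issues and preserves the original evidence.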
🔍 2. Analyzing streemsfs.db: Tracking Locally Cached Files
Note: Both offline and online (cloud) items are tracked in this database, but deleted items in the Box Drive trash folder are not.
📍 Located at: %LocalAppData%\Box\Box\data\streemsfs.db
Important tables to look for: fsnodes and cachefiles.

fsnodes columns:

| Column | Description |
|---|---|
| name | Cached file name |
| createdAtTimestamp | File creation time |
| modifiedAtTimestamp | Last modified time |
| accessedAtTimestamp | Last accessed time (when the file was opened from Box Drive) |
| markForOffline | Files marked for permanent offline use |
| inodeId | Identifier used to determine parent folders and as a foreign key to the cachefiles table |
| parentInodeId | inodeId of the parent folder |
| folderFetchTimestamp | When folder content was last synchronized with the cloud |

Note: Timestamps do not appear to update as expected when interfacing with Box on the website via a browser. For example, createdAtTimestamp, modifiedAtTimestamp, and accessedAtTimestamp are all set to the original file modification time, and accessedAtTimestamp does not update when the file is accessed solely via the browser. When using Windows File Explorer, timestamps update as expected in the database, with the exception of the access timestamp.

cachefiles columns:

| Column | Description |
|---|---|
| cacheDataId | Filename of the locally saved file within the Box cache folder |
| size | File size of the cached file (in bytes) |
| inodeId | Identifier used as a foreign key to the fsnodes table |
| age | Time the file was cached (Unix epoch); a value of 0 means it has not yet been cached |

Correlating these tables entry by entry is a tedious task to do manually, so we will use a freely available script called streemBOXlite. This is particularly helpful when many files are cached on the system. Run the script with Python 3.
Command: akash@DESKTOP-DCLRDM4:/mnt/c/Users/Admin/Downloads$ python3 streemBOXlite.py -p /mnt/d/streem/ -c -o /mnt/c/Users/Admin/Downloads -v -q
Output:
📌 Forensic Use: ✅ Determine which files were accessed and stored locally ✅ Track offline file synchronization with the Box cloud ✅ Recover deleted or previously cached files
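For a smaller number of records, the correlation that streemBOXlite automates can also be sketched directly against streemsfs.db with Python's sqlite3 module. This is only an illustrative query built from the column names listed above; verify them against the actual schema of the evidence copy, since field names can shift between Box Drive releases, and the Unix-epoch interpretation of the timestamp columns is an assumption here.

```python
import sqlite3

# Placeholder path: point this at the collected copy of streemsfs.db.
DB = r"C:\cases\Box\data\streemsfs.db"

con = sqlite3.connect(DB)
con.row_factory = sqlite3.Row

# Join fsnodes metadata with cachefiles entries via inodeId, keeping only
# items that have actually been cached (age of 0 means not yet cached).
query = """
SELECT f.name,
       f.parentInodeId,
       datetime(f.modifiedAtTimestamp, 'unixepoch') AS modified,
       datetime(f.accessedAtTimestamp, 'unixepoch') AS accessed,
       f.markForOffline,
       c.cacheDataId,
       c.size,
       datetime(c.age, 'unixepoch') AS cached_at
FROM fsnodes AS f
JOIN cachefiles AS c ON c.inodeId = f.inodeId
WHERE c.age != 0
"""
for row in con.execute(query):
    print(dict(row))
con.close()
```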
🔍 3. Extracting User Login & Activity from metrics.db
📍 Located at: %LocalAppData%\Box\Box\data\metrics.db

| Column | Description |
|---|---|
| payload | Stores the Box login email & user authentication data |

📌 Forensic Use: ✅ Identify Box accounts linked to the system ✅ Correlate login activity with user behavior ✅ Investigate unauthorized access to enterprise Box accounts
-------------------------------------------------------------------------------------------------------------
4️⃣ Investigating Box Log Files (Box_Streem logs)
🔍 Box logs detailed user actions, including: ✅ File uploads & downloads ✅ Folder structure changes ✅ Synchronization errors ✅ Authentication attempts
📍 Log Location: %LocalAppData%\Box\Box\logs\
📌 Forensic Use: ✅ Track when files were accessed or modified ✅ Reconstruct a user file activity timeline ✅ Identify anomalous file access patterns
-------------------------------------------------------------------------------------------------------------
5️⃣ Recovering Deleted & Cached Files from Box Drive
🔍 How to Recover Deleted Files from Box?
Locally deleted files → found in the Recycle Bin
Box cloud trash → retains deleted files for 30-120 days
Cached files → found in %UserProfile%\Box\.cache
📍 Forensic Strategy: 1️⃣ Extract file metadata from sync.db & streemsfs.db 2️⃣ Search .cache for offline versions of deleted files 3️⃣ Check Box logs for file deletion records 4️⃣ Retrieve deleted files from the Box cloud if enterprise logs are available
📌 Forensic Use: ✅ Recover previously cached Box files even if deleted from the cloud ✅ Identify sensitive documents removed from the system ✅ Correlate file deletion with user activity logs
-------------------------------------------------------------------------------------------------------------
🚀 Summary: Why Box is a Goldmine for Forensics
✔ Tracks file hashes (SHA1), timestamps, and offline/online status
✔ Maintains detailed logs for file access, sync, and user activity
✔ Stores locally cached files even if they are deleted from the cloud
✔ Allows forensic reconstruction of user interactions with cloud storage
As organizations increasingly rely on Box for cloud storage and collaboration, understanding Box forensics is essential for digital investigations. Box provides detailed activity logs, file versioning, and sharing records, which can help forensic analysts track user actions, detect unauthorized access, and reconstruct file history.
🔍 Stay proactive, test forensic scenarios, and refine your analysis techniques, because every digital action leaves a trace! 🚀
- Investigating Dropbox Forensics
Dropbox has long been a challenging cloud storage service to investigate due to encrypted databases, hidden caches, and complex storage mechanisms. However, recent changes in Dropbox's architecture have introduced unencrypted metadata sources, making forensic analysis more effective.
🚀 Key Topics Covered: ✅ Locating and analyzing Dropbox metadata & configuration files ✅ Recovering deleted files from cache and database records ✅ Investigating Dropbox sync activity and user file interactions ✅ Extracting evidence from SQLite databases & JSON logs
-------------------------------------------------------------------------------------------------------------
1️⃣ Locating Dropbox Artifacts on Windows
📌 Primary Dropbox Data Locations

| Artifact | Location | Purpose |
|---|---|---|
| Local Dropbox folder | %UserProfile%\Dropbox\ | Stores synced files |
| Configuration files | %UserProfile%\AppData\Local\Dropbox\info.json | Contains Dropbox settings & sync path |
| Cache folder | %UserProfile%\Dropbox\.dropbox.cache\ | Stores recently deleted & cloud-only files |
| Sync databases | %UserProfile%\AppData\Local\Dropbox\instance1\ | Tracks file sync activity |
| Registry keys | SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\SyncRootManager\Dropbox | Identifies sync location & settings |

📌 Forensic Use: ✅ Identify Dropbox usage even if uninstalled ✅ Recover deleted files from the cache folder ✅ Find local & cloud-only files
-------------------------------------------------------------------------------------------------------------
2️⃣ Extracting Dropbox Configuration Details
The info.json file, located at %UserProfile%\AppData\Local\Dropbox\, stores: ✅ Sync folder path (customized storage location) ✅ Dropbox Team info (enterprise accounts) ✅ Subscription type (Basic, Plus, Business, Enterprise)
📌 How to extract data: 1️⃣ Open the file with a JSON viewer 2️⃣ Search for the path, is_team, and subscription_type fields
📌 Forensic Use: ✅ Verify Dropbox usage & account type ✅ Identify business accounts with enhanced logging ✅ Locate all synced files on disk
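Because info.json is plain JSON, a few lines of Python can pull the same details without a dedicated viewer. This is a rough sketch only: the exact key layout varies between Dropbox versions and account types (personal vs. business), so the field names mentioned above are treated as values to search for rather than a fixed schema, and the file path is a placeholder.

```python
import json
from pathlib import Path

# Placeholder path: adjust to the mounted image or live system being examined.
info = Path(r"C:\Users\someuser\AppData\Local\Dropbox\info.json")

data = json.loads(info.read_text(encoding="utf-8"))

# info.json is keyed by account type (e.g. "personal", "business"); dump each
# account section and highlight the fields discussed above when they exist.
for account_type, details in data.items():
    print(f"[{account_type}]")
    if isinstance(details, dict):
        for field in ("path", "host", "is_team", "subscription_type"):
            if field in details:
                print(f"  {field}: {details[field]}")
```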
3️⃣ Recovering Deleted & Cloud-Only Files
🔍 The .dropbox.cache Folder
📍 Location: %UserProfile%\Dropbox\.dropbox.cache\
🔍 Purpose: ✅ A hidden folder present in the root of the user's Dropbox folder ✅ Can contain copies of deleted files not yet purged from the local file store ✅ Caches cloud-only files accessed recently ✅ Cleared automatically every 3 days
📌 How to recover files: 1️⃣ Check file headers to identify file types 2️⃣ Use forensic tools (e.g., FTK Imager) to analyze deleted file remnants 3️⃣ Correlate timestamps with Dropbox logs to determine deletion events
-------------------------------------------------------------------------------------------------------------
4️⃣ Investigating File Sync & Modification History
🔍 The aggregation.dbx Database
📍 Location: %UserProfile%\AppData\Local\Dropbox\instance1\
✅ Tracks previous file updates to Dropbox storage ✅ Stores full path, timestamp, and user attribution
📌 Forensic Use: ✅ Identify files recently added or modified ✅ Determine who edited a file via the snapshot table (edited_by_me field) ✅ Recover deleted or renamed files
🛠 Parsing the Database: 1️⃣ Open with a SQLite viewer 2️⃣ Extract the recent table 3️⃣ Convert the JSON entries for easy reading
-------------------------------------------------------------------------------------------------------------
5️⃣ Extracting File Metadata & Starred Items
🔍 The home.db Database
📍 Location: %UserProfile%\AppData\Local\Dropbox\instance1\
📌 Key Tables:

| Table | Fields | Purpose |
|---|---|---|
| recents | server_path, timestamp | Last updated files |
| starred_items | server_path, is_starred, timestamp | Files marked as "important" |
| sfj_resources | server_path, server_fetch_timestamp | Tracks the last sync from the cloud |

📌 Forensic Use: ✅ Track starred files (user-marked important files) ✅ Determine the last files synced from the cloud ✅ Recover previous versions of files
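The home.db tables above can also be dumped with a short sqlite3 script instead of a GUI viewer. Treat this as a sketch built only from the table and field names listed in this section; confirm them against the actual schema of the instance being examined, and note that the Unix-epoch interpretation of the timestamp columns is an assumption.

```python
import sqlite3

# Placeholder path: point this at the collected copy of home.db.
DB = r"C:\cases\Dropbox\instance1\home.db"

con = sqlite3.connect(DB)
con.row_factory = sqlite3.Row

# Recently updated files, most recent first.
for row in con.execute(
    "SELECT server_path, datetime(timestamp, 'unixepoch') AS ts "
    "FROM recents ORDER BY timestamp DESC"
):
    print("recent:", row["server_path"], row["ts"])

# Files the user flagged as important.
for row in con.execute(
    "SELECT server_path, is_starred, datetime(timestamp, 'unixepoch') AS ts "
    "FROM starred_items WHERE is_starred = 1"
):
    print("starred:", row["server_path"], row["ts"])

con.close()
```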
6️⃣ Investigating Dropbox Sync History
🔍 The sync_history.db Database
📍 Location: %UserProfile%\AppData\Local\Dropbox\instance1\
✅ Records uploads, downloads, deletions, and modifications ✅ Tracks changes made locally vs. changes made from the cloud
📌 Key Fields in the sync_history Table:

| Field | Purpose |
|---|---|
| file_event_type | Type of action (add, delete, edit) |
| direction | Upload = local → cloud, Download = cloud → local |
| local_path | Full file path |
| timestamp | Time of last activity |
| other_user | "1" indicates the file is owned by another user |

📌 Forensic Use: ✅ Identify whether a file was deleted locally or via the cloud ✅ Track external file sharing & downloads ✅ Determine whether files were modified before deletion
-------------------------------------------------------------------------------------------------------------
7️⃣ Recovering Hidden Dropbox Files
🔍 The nucleus.sqlite3 Database
📍 Location: %UserProfile%\AppData\Local\Dropbox\instance1\sync
✅ Stores names of local & cloud-only files ✅ Tracks synced & unsynced files
📌 Key Tables:

| Table | Field | Purpose |
|---|---|---|
| local_tree | value | Files currently synced locally |
| synced_tree | value | Mirrors local_tree but with extra metadata |
| remote_tree | value | Tracks cloud-only files (not synced) |

📌 Forensic Use: ✅ Identify files stored only in the cloud ✅ Recover filenames of deleted cloud files ✅ Determine the last known location of missing files
-------------------------------------------------------------------------------------------------------------
8️⃣ Extracting Thumbnails of Deleted Dropbox Images
🔍 The tray-thumbnails.db Database
📍 Location: %UserProfile%\AppData\Local\Dropbox\machine_storage
✅ Stores references to image files once present in Dropbox ✅ Includes metadata on deleted images
📌 Key Fields:

| Field | Purpose |
|---|---|
| file_name | Name of the image file |
| timestamp | Time the thumbnail was created |

📌 Forensic Use: ✅ Recover filenames of deleted images ✅ Identify when images were last accessed or modified ✅ Correlate with file sync logs for evidence reconstruction
-------------------------------------------------------------------------------------------------------------
Extracting Icon Information
🔍 The icon.db Database
📍 Location: %UserProfile%\AppData\Local\Dropbox\instance1\
✅ Stores generated icon information, including full file paths.
📌 Key Fields:

| Field | Purpose |
|---|---|
| file_name | Full file path |
| created_time | Likely the creation time of the icon, not the time an item was added to the store (Unix epoch time) |

-------------------------------------------------------------------------------------------------------------
9️⃣ Investigating Dropbox Enterprise & Team Accounts
🔍 Dropbox Business & Enterprise accounts offer extended logging and audit trails. ✅ Track file sharing, modifications, and deletions ✅ Identify file downloads & external access
📌 Forensic Use: ✅ Monitor suspicious file transfers within teams ✅ Track shared links & external file access ✅ Recover deleted files from extended retention policies
🛠 How to Access Dropbox Business Logs: 1️⃣ Log in to the Dropbox Admin Console 2️⃣ Navigate to Reports > Activity Logs 3️⃣ Filter logs by event type (file downloaded, shared, deleted, etc.) 4️⃣ Export logs in CSV format for offline analysis
-------------------------------------------------------------------------------------------------------------
🔎 Summary & Forensic Workflow
✅ Step 1: Identify the Dropbox installation (check info.json, registry keys, and the instance1 folder).
✅ Step 2: Extract file metadata (home.db, aggregation.dbx).
✅ Step 3: Recover deleted files (.dropbox.cache, sync_history.db).
✅ Step 4: Track cloud-only & unsynced files (nucleus.sqlite3).
✅ Step 5: Extract icon information (icon.db).
✅ Step 6: Analyze Dropbox Business logs for enterprise investigations.
We will explore more about Dropbox in the next article (Dropbox Forensic Investigations: Logs, Activity Tracking, and External Sharing), so stay tuned! See you in the next one.
- Dropbox Forensic Investigations: Logs, Activity Tracking, and External Sharing
Dropbox presents significant challenges for forensic investigations due to encrypted databases, limited endpoint logs, and obfuscated external IPs. However, with the right approach, investigators can extract valuable metadata, user activity records, and external sharing reports.
🚀 Key Topics Covered: ✅ Extracting Dropbox metadata from local databases ✅ Using SQLECmd to automate SQLite analysis ✅ Tracking user actions via cloud activity logs ✅ Investigating file sharing and external access
--------------------------------------------------------------------------------------------------------
1️⃣ Dropbox Local Artifacts: Databases & Metadata Files
🔍 Where Does Dropbox Store Metadata Locally?

| File/Database | Location | Purpose |
|---|---|---|
| info.json | %UserProfile%\AppData\Local\Dropbox\ | Dropbox configuration & sync folder location |
| .dropbox.cache | %UserProfile%\Dropbox\ | Cached & staged file versions |
| aggregation.dbx | %UserProfile%\AppData\Local\Dropbox\instance<#> | Recent file updates (JSON format) |
| home.db | %UserProfile%\AppData\Local\Dropbox\instance<#> | Tracks Dropbox file changes (Server File Journal) |
| sync_history.db | %UserProfile%\AppData\Local\Dropbox\instance<#> | Upload/download activity |
| nucleus.sqlite3 | %UserProfile%\AppData\Local\Dropbox\instance<#>\sync | List of local & cloud-only files |

📌 Forensic Use: ✅ Identify Dropbox folder locations & linked accounts ✅ Recover deleted/staged files from .dropbox.cache ✅ Reconstruct file modification history using home.db
--------------------------------------------------------------------------------------------------------
2️⃣ Automating Dropbox Analysis with SQLECmd
🔍 What is SQLECmd? SQLECmd is an open-source forensic tool created by Eric Zimmerman to automate SQLite database parsing. It utilizes map files to identify Dropbox, Google Drive, and other forensic databases, automatically extracting file activity, timestamps, and metadata. In this example, gkape was first used to extract all Dropbox-related files.
📍 Example: Running SQLECmd on Dropbox Data
SQLECmd.exe -d "C:\Users\Akash's\Incident response Dropbox" --csv .
📌 How It Works: 🔹 -d : Specifies the directory to scan (the Dropbox data folder) 🔹 --csv . : Saves results as CSV files in the current directory
📌 Forensic Use: ✅ Quickly extract metadata from Dropbox SQLite databases ✅ Identify synced, modified, and deleted files ✅ Analyze file movement within Dropbox folders
--------------------------------------------------------------------------------------------------------
3️⃣ Dropbox Logging: Free vs. Business Tiers
🔍 Comparing Activity Logs Across Dropbox Tiers

| Feature | Basic (Free) | Dropbox Business |
|---|---|---|
| File add/edit/delete logs | ❌ No logs | ✅ Yes |
| File download & upload logs | ❌ No logs | ✅ Yes |
| User login & session history | ✅ Limited | ✅ Full IP & geolocation |
| External file sharing reports | ❌ No | ✅ Yes |
| Export logs to CSV | ❌ No | ✅ Yes |
| API access for logs | ❌ No | ✅ Yes |

📌 Forensic Use: ✅ Track file modifications & deletion history ✅ Identify suspicious logins based on IP & location ✅ Monitor shared links for data exfiltration
--------------------------------------------------------------------------------------------------------
4️⃣ Accessing Dropbox Logs via the Admin Console
🔍 Steps to Retrieve Logs: 1️⃣ Log in to the Dropbox Admin Console 2️⃣ Navigate to Reports > Activity Logs 3️⃣ Use filters to narrow results by user, file, folder, or event type 4️⃣ Click "Create Report" to export logs in CSV format
📌 Forensic Use: ✅ Track who accessed or modified sensitive files ✅ Identify suspicious external IP addresses ✅ Monitor deleted files & restoration attempts
--------------------------------------------------------------------------------------------------------
5️⃣ Investigating IP Addresses & Geolocation Data
🔍 Analyzing IP Logs for Unauthorized Access: Dropbox logs user IP addresses and device locations, which can help track unauthorized logins.
⚠ Limitations: Dropbox obfuscates some external IP addresses, making it difficult to identify non-employee access.
6️⃣ Tracking External File Sharing & Anonymous Links
🔍 Dropbox Business "External Sharing" Report: Dropbox tracks files shared outside the organization, but free users lack visibility into external recipients.
7️⃣ Advanced Filtering for Dropbox Logs
🔍 Filtering Logs for Specific Investigations: Dropbox allows filtering logs by various criteria, improving forensic analysis.
Key Filters for Investigation:

| Filter | Use Case |
|---|---|
| Date range | Identify activity before & after an incident |
| User | Track a specific employee's Dropbox usage |
| File/folder name | Find modifications to critical documents |
| Event type | Focus on file downloads, sharing, or deletions |

-------------------------------------------------------------------------------------------------------------
Before leaving, I want to point out that in forensics not everything is a piece of cake; there are limitations, and Dropbox is no exception, so let's talk about them.
Understanding Dropbox Event Logging: All Dropbox users, regardless of their plan, have access to basic event logging through the "Events" section. However, users with Business or Advanced Business plans have access to more extensive logging, which is particularly valuable in forensic investigations.
What Does Dropbox Log? Administrators of Advanced Business plans can track detailed user activity, including: ✔ File-level events – adding, downloading, editing, moving, renaming, and deleting files. ✔ Sharing actions – shared folder creation, shared link creation, Paper doc sharing, and permission changes. ✔ Access tracking – internal and external interactions with shared files and folders.
These logs can be exported in CSV format, allowing investigators to filter data more effectively and analyze additional fields, such as IP addresses. Logs can be retained for years, making them a valuable resource for forensic analysis. However, new event entries may take up to 24 hours to appear.
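Once an activity log has been exported to CSV from the Admin Console, the filtering described above can also be scripted. The snippet below is only a sketch: the filename and the column names (occurred_at, event_type, actor, ip_address) are hypothetical placeholders, since Dropbox's export headers vary by report type, so adjust them to match the actual CSV.

```python
import pandas as pd

# Hypothetical export; adjust the filename and column names to the real report.
log = pd.read_csv("dropbox_activity_export.csv", parse_dates=["occurred_at"])

# Narrow to the incident window and to events that suggest data leaving the tenant.
window = log[(log["occurred_at"] >= "2025-01-01") & (log["occurred_at"] <= "2025-01-31")]
suspects = window[window["event_type"].isin(["file_downloaded", "shared_link_created"])]

# Summarise by user and source IP to spot unusual access.
print(suspects.groupby(["actor", "ip_address"]).size().sort_values(ascending=False))
```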
Limitations and Blind Spots in Dropbox Logging While Dropbox's cloud logging is valuable, it is important to recognize its limitations : 🔹 Limited endpoint visibility – Actions performed on locally stored Dropbox files may not be logged . For example, if a user copies a file from the Dropbox folder to their desktop or an external USB device, Dropbox may not record this activi ty. 🔹 Synchronization tracking challenges – While Dropbox logs when an unauthorized devic e connects and authenticates, it does not always track what files were synchronized to that device. 🔹 Difficulty reconstructing deleted files – Dropbox logs make it challenging to determine what files were once in a folder after they are deleted. However, Dropbox's versioning feature can sometimes help retrieve previous versions of a file. Due to these blind spots, forensic investigators should not rely solely on cloud logs . Instead, combining cloud logs with endpoint forensic analysis (such as examining sync databases and local metadata) provides a more complete picture. Best Practices for Dropbox Forensics Since breaches and data theft are inevitable , proactive measures are necessary: ✔ Test forensic scenarios – Simulating real-world incidents can help determine the exact scope of logging available in your environment. ✔ Export and analyze logs regularly – Using CSV exports allows deeper filtering and historical tracking. ✔ Correlate with endpoint forensics – Combining Dropbox logs with local forensic evidence (if available) can help bridge information gaps. While Dropbox logging isn't perfect , it is still a crucial tool for digital investigations . By understanding its capabilities and limitations, forensic analysts can make informed decisions when investigating incidents involving Dropbox. ------------------------------------------------------------------------------------------------------- Conclusion Dropbox forensics is a crucial aspect of modern investigations, as cloud storage plays a key role in how users store, access, and share files. By analyzing local sync folders, logs, SQLite databases, and API activity , forensic analysts can reconstruct file movements, modifications, deletions, and access history with precision . As cloud storage becomes an integral part of personal and corporate data management, the ability to track and analyze Dropbox activity is essential for digital forensics, cybersecurity, and incident response . Staying updated on Dropbox forensic techniques ensures that investigators can effectively follow digital trails and uncover critical evidence. 🚀 Keep exploring, stay curious, and refine your forensic skills—because digital evidence is everywhere! 🔍 🎯 Next Up: Box Forensics – Investigating Cloud Storage Security 🚀
- KAPE: A Detailed Exploration
Introduction: KAPE can be used through a graphical user interface (GUI) or via the command-line interface. Users typically run KAPE from the command prompt, providing it with the necessary parameters to specify the artifacts they want to collect and the output location.
GUI Based: We'll walk through the process of using KAPE for evidence acquisition and processing. KAPE, written by Eric Zimmerman, is a powerful tool used in digital forensics and incident response.
Enable Targets: At the top left, spot number one, you need to enter the Target source. For our example, we're choosing the C drive. For the Target destination, spot number two, we'll use C:\temp\T_out ("T_out" is a common naming convention for Target output).
Select the KapeTriage Target: At spot number two, we are selecting KapeTriage. This is a compound Target that gathers various artifacts like registry hives, event logs, and evidence of execution. (There are more than 220 targets in total; what to collect depends on the analyst/investigator.)
Enable Modules: At spot number three, check the box to enable the module side of KAPE. Specify a module destination, which is where parsed output will reside. For our example, C:\temp\M_out (module output).
Choose the !EZParser Module (depends on the analyst): Below that, we are selecting the !EZParser module. This module runs all of Eric Zimmerman's tools against the data grabbed by the KapeTriage Target. This combination simplifies parsing using the Easy Parser tool.
Select CSV as Output Format: At spot number four, choose CSV as the default output. Eric Zimmerman's tools commonly support CSV output.
Enable Debug Messages: At spot number five, it's advisable to enable debug messages. While this outputs more messages to the console, they are immensely helpful for troubleshooting issues during acquisition or processing.
Execute the Command: At spot number six, once you have satisfied all the necessary configurations, click the "Execute" button. This initiates the command and begins the acquisition and processing of data.
Accessing Evidence:
• There are two main ways to access evidence: running KAPE on a live system or mounting a forensic image. It's recommended to use Arsenal Image Mounter for handling forensic images.
• The typical KAPE workflow involves using the KapeTriage Target and the !EZParser module. This combination covers a broad spectrum of common artifacts. As you become more comfortable, you can customize your own KAPE recipe to suit specific acquisition and processing needs.
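For reference, the GUI selections above translate into a single command line; gkape composes and displays the equivalent command as options are selected. The line below approximates the command for this walkthrough using common KAPE switches; it is an illustration rather than verbatim gkape output, so verify the switch names against kape.exe --help for the version in use.
Command: kape.exe --tsource C: --tdest C:\temp\T_out --target KapeTriage --mdest C:\temp\M_out --module !EZParser --mef csv --debug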
Kape Targets: KAPE targets are collections of relevant files and directories, defined and customizable through YAML files hosted on GitHub in the KapeFiles repository. Targets can collect files locked by the operating system while preserving original timestamps and metadata: files locked by the OS are added to a secondary queue, visible in the console log. Even if the console log indicates certain files weren't grabbed, they were added to the secondary queue and processed using raw disk reads to bypass operating system locks. The KAPE folder contains subfolders for targets, such as "Disabled", "Antivirus", and "Apps", each representing a different collection of artifacts.
Targets in the "Disabled" folder won't show up in KAPE and cannot be used by it. When examining a compound target like "KapeTriage," drilling down through the associated targets in the KAPE folder reveals the specific files and directories being captured.
Kape Modules: KAPE modules serve as mechanisms to run command-line tools against collected files. They are predefined and customizable, and they group artifacts into categories, with each category name becoming the name of the output folder. Several modules are geared towards live response scenarios. Modules are highly customizable, allowing users to tailor them to their specific needs, and special programs and scripts can also be employed through modules. The KAPE Modules folder, like the Targets folder, contains a "Disabled" subfolder; placing modules there prevents them from appearing in gkape or being used by KAPE. The "Bin" folder within the Modules directory is crucial, housing executables that modules call upon. This ensures that third-party tools not shipped with KAPE are accessible for module execution. Using the !EZParser module simplifies this process, as it seamlessly integrates with Eric Zimmerman's tools. The screenshot below illustrates the process of examining the !EZParser module, which then points to the EvtxECmd module. Each module specifies the binaries it uses, emphasizing the importance of organizing executables in the "Bin" folder for seamless module execution.
If you prefer a user-friendly graphical interface, the GUI version of KAPE (gkape) is an excellent choice. However, for those who appreciate the precision and control of the command line, KAPE also offers a robust command-line interface. A noteworthy feature of the GUI version is its automatic generation of command-line instructions based on the selections you make: as you navigate through the graphical interface and choose the specific options and artifacts you need, the corresponding command is composed for you. This ensures a smooth transition between the user-friendly GUI and the flexibility of the command line. Whether you opt for the ease of the GUI or command-line precision, KAPE caters to both preferences, offering a versatile solution for digital forensics and incident response tasks.
If you choose to enable only the target for collection, KAPE delivers raw forensic data: a comprehensive snapshot of the specified target, invaluable for detailed analysis and investigation. For users seeking a more structured and parsed output, KAPE's modular capabilities come into play: by combining specific modules with the target, KAPE not only captures the raw data but also processes and organizes it into user-friendly formats such as CSV or TXT. This dual-output approach gives users access to both the unfiltered raw data and the parsed, structured results.
Integration Possibilities: While KAPE itself doesn't integrate with Splunk directly, investigators can ingest its CSV output into Splunk.
Hash Sets and Cloud Data Collection: KAPE allows excluding certain files with hash sets, but it doesn't restrict the search to specific file types.
This emphasizes KAPE's flexibility while outlining its approach to hash-based exclusions. Furthermore, KAPE can collect data from cloud storage services such as OneDrive, Google Drive, Dropbox, and Box, but legal considerations regarding search warrants and authorization apply to any cloud data access. Akash Patel
- Examining SRUM with ESEDatabaseView
You can download the tool from the link below: https://www.nirsoft.net/utils/ese_database_view.html
Opening the SRUM Database with NirSoft
Using NirSoft's utilities, you can open the SRUDB.dat ESE database to access its tables. In a typical Windows 10 setup, you'll find around 13 tables. By default, the MSysObjects table is displayed, sorted by the first column. We're focusing on the Windows Network Data Usage Monitor table, identified by the unique identifier {973F5D5C-1D90-4944-BE8E-24B94231A174}, which is consistent across Windows 8.1 and Windows 10.
Examining the Windows Network Data Usage Monitor Table
Once you've selected the Windows Network Data Usage Monitor table, you'll find entries detailing the system's network connections. Each entry features an "AppId," identifying the application using the network during that time period. The AppId corresponds to the "IdIndex" field in the SruDbIdMapTable. That table also reveals the drive and full path of the application executable via the "IdBlob" for each "IdIndex." Additionally, you'll find the "UserId," network interface (InterfaceLuid), network profile index (L2ProfileId), and bytes sent and received for each application during that time period.
Mapping Network Profiles
To map a network profile, start by identifying a network with a profile identifier (L2ProfileId), such as 268435461. Then navigate to the SOFTWARE registry hive to find the corresponding network name. Here's how:
1. Navigate to the \Microsoft\WlanSvc\Interfaces\{GUID}\Profiles key. The last write timestamp for each profile GUID is the first time this computer ever connected to that network; the last write timestamp for the "MetaData" registry subkey is the last time this computer ever connected to that network.
2. Look for profile identifiers and check the ProfileIndex key value to find the matching identifier.
3. Expand the matching profile identifier key and select the "MetaData" subkey.
4. Check the "Channel Hints" key value to reveal the network name corresponding to ProfileIndex 268435461.
By following these steps, you can gain valuable insights into the network connections made by a system, the applications involved, and even the network names. This information can be pivotal in forensic investigations, shedding light on user activities and potentially uncovering malicious intent.
Conclusion
The SRUM database, when explored using NirSoft's utilities, offers a comprehensive view of network usage data on a Windows system. By understanding how to navigate and interpret this data, digital forensic analysts can uncover critical insights that may be instrumental in their investigations. Akash Patel
- Unpacking SRUM: The Digital Forensics Goldmine in Windows
Updated on 31 Jan, 2025
Enter the System Resource Usage Monitor (SRUM): a treasure trove for digital forensic analysts.
The SRUM Database: A Wealth of Insights
The SRUM database serves as a goldmine of information for investigators, offering invaluable insights into user activities and system performance. Some of the most exciting pieces of information that SRUM can reveal include:
Applications Running: Details on which applications were active on the system during a specific hour.
User Account Information: Identification of the user account responsible for launching each application.
Network Bandwidth Usage: Insights into the amount of network bandwidth sent and received by each application.
Network Connections: Information on the networks the system was connected to, including dates and times.
*****************************************************************************************************
SRUM Database in Windows: How It Works and What You Need to Know
Q1: How is SRUM data stored?
Windows 8: When SRUM was first introduced in Windows 8, it stored performance data for desktop applications, system utilities, services, and Windows Store (Metro) apps. Approximately every hour, or when the system was properly shut down or rebooted, this data was transferred to an Extensible Storage Engine (ESE) database file known as SRUDB.dat.
Windows 10 and 11: Data is no longer temporarily stored in the Windows Registry before being written to SRUDB.dat.
Q2: When and How SRUM Data is Written
On Windows 10 and 11, SRUM data is generally recorded every 60 minutes. However, testing has revealed that data is not always written on shutdown. For example, if a system is shut down twice within 10 minutes, the SRUM database might not update until a later reboot where the system remains powered on past the standard 60-minute mark. This delayed writing behavior can be misleading: when reviewing SRUM entries, you may find multiple records with the exact same timestamp. This does not mean the events occurred simultaneously; rather, it indicates that the system recorded them all at once when SRUM was last updated. The actual activities could have taken place at any point between two consecutive entries.
Q3: Analyzing SRUM Data for Patterns
To make sense of SRUM data, you can compare the timestamps of consecutive entries. If the interval between entries deviates significantly from the expected 60-minute period (with a margin of plus or minus 10 minutes), it might suggest that data was written due to a system shutdown rather than the usual scheduled update. A useful method for identifying anomalies is to import SRUM data into Excel and use the Conditional Formatting feature to highlight timestamps that fall outside the standard interval.
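The same interval check can be scripted instead of done in Excel. The sketch below assumes the SRUM table has already been exported to CSV (for example with ESEDatabaseView or a SRUM parser) and that the record timestamp column is named TimeStamp; both the filename and the column name are placeholders to adjust to the actual export.

```python
import pandas as pd

# Hypothetical export of one SRUM table; adjust filename and column names.
df = pd.read_csv("srum_app_resource_usage.csv", parse_dates=["TimeStamp"])

df = df.sort_values("TimeStamp")
df["delta_minutes"] = df["TimeStamp"].diff().dt.total_seconds() / 60

# Flag intervals outside the expected 60 +/- 10 minute cadence; these often
# line up with shutdowns or delayed writes rather than the hourly flush.
anomalies = df[(df["delta_minutes"] < 50) | (df["delta_minutes"] > 70)]
print(anomalies[["TimeStamp", "delta_minutes"]])
```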
Q4: Recovering Historical SRUM Data
SRUM is often backed up in Volume Shadow Copies, meaning forensic analysts can potentially retrieve older SRUM database snapshots if shadow copies are available.
*****************************************************************************************************
SRUM Database Integrity and Repair
Given that systems are often not cleanly shut down during incident response procedures, the SRUM database file may sometimes be in a "dirty" or corrupt state. Windows provides a built-in tool, esentutl, for diagnosing and repairing ESE databases. It can perform tasks like defragmentation, recovery, integrity checking, and repair. Additionally, deleted records from the SRUM database may be recoverable using a utility called "EseCarve".
To check the status of the SRUM database (run from the Windows\System32\sru\ directory): esentutl /mh SRUDB.dat
To repair a corrupted SRUDB.dat: esentutl /p SRUDB.dat
SRUM Registry Keys and Subkeys
Performance data collected via SRUM is initially stored in the Windows registry and then transferred to the SRUM database. The primary registry key is HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\SRUM, which contains three subkeys: Parameters, Telemetry, and Extensions. Each of these subkeys corresponds to tables in the SRUM database and contains temporary data.
Key Tables in the SRUM Database

| Table | GUID | Contents |
|---|---|---|
| Windows Network Data Usage Monitor | {973F5D5C-1D90-4944-BE8E-24B94231A174} | Records information about networks, applications, user SIDs, and total bytes sent and received by each application |
| WPN SRUM Provider | {D10CA2FE-6FCF-4F6D-848E-B2E99266FA86} | Captures Windows push notifications for Windows applications, user SIDs, and push notification payload sizes |
| Application Resource Usage Provider | {D10CA2FE-6FCF-4F6D-848E-B2E99266FA89} | Records the drive, directory, and full path of active applications, user SIDs, CPU cycle times, and bytes read and written |
| Windows Network Connectivity Usage Monitor | {DD6636C4-8929-4683-974E-22C046A43763} | Identifies network connections, connection start times, connection durations, and interface types |
| Energy Usage Provider | {FEE4E14F-02A9-4550-B5CE-5FA2DA202E37} | Provides battery charge level, design capacity, cycle count, and power source information |
| Energy Estimation Provider (Windows 10) | {97C2CE28-A37B-4920-B1E9-8B76CD341EC5} | Offers a summary of historical battery status |

-------------------------------------------------------------------------------------------------------------
Forensic Challenges with SRUM Data
Delayed Writes: SRUM data is written approximately every 60 minutes, and shutdowns may prevent immediate updates to SRUDB.dat, so analysts should be cautious when interpreting timestamps.
Retention Period: Most SRUM data is retained for 30-60 days (in VSS); the Energy Usage LT table can store data for years. If a system is powered off for an extended period, older data may be purged on reboot.
Database Corruption: If a system crashes or is not properly shut down, SRUDB.dat may be left in a "dirty" state. Windows has a built-in tool, esentutl, for repairing SRUM databases.
-------------------------------------------------------------------------------------------------------------
Volume Shadow Copies: Older versions of SRUM can sometimes be recovered if Volume Shadow Copies (VSS) are available.
Conclusion
The SRUM database has revolutionized digital forensic investigations by offering a comprehensive view of system activities and performance metrics. As investigators continue to explore this rich data source, the potential for uncovering critical evidence and insights will only grow.
-------------------------------------------------Dean--------------------------------------------------
- Analyzing Recycle Bin Metadata with RBCmd and $I_Parse
When investigating deleted files on a Windows system, analyzing the Recycle Bin metadata can provide crucial insights. In this guide, we’ll look at how to use Eric Zimmerman’s RBCmd.exe and another tool called $I_Parse.exe to extract and analyze deleted file information . Understanding Recycle Bin Metadata Windows keeps metadata for deleted files in different formats depending on the version of the operating system: INFO2 files (used in Windows XP) $I files (used in Windows Vista and later) These metadata files store details such as: Original file name Path before deletion Deletion timestamp File size Using RBCmd.exe for Analysis RBCmd.exe is a command-line utility created by Eric Zimmerman that can parse Recycle Bin metadata from both XP and modern Windows systems. Parsing a Single File To analyze a specific $I file, run the following command: RBCmd.exe -f "C:\$Recycle.Bin\S-1-5-21-1094574232-2158178848-303877012-1001\$IZZOXEO.pdf" Parsing an Entire Directory If you need to analyze all $I files in a folder, use the -d option: RBCmd.exe -d "C:\$Recycle.Bin\S-1-5-21-1094574232-2158178848-303877012-1001" --csv C:\Users\Akash's\Downloads This will parse all $I files in the specified directory and save the results in a CSV file. Output: ------------------------------------------------------------------------------------------------------------ Collecting Recycle Bin Artifacts with KAPE KAPE (Kroll Artifact Parser and Extractor) is a powerful tool that can collect forensic artifacts, including Recycle Bin metadata files. Steps to Collect Recycle Bin Artifacts Using KAPE: Open KAPE . Select the Target Module for Recycle Bin collection. Specify the output folder where the extracted files should be saved. Run KAPE Once collected, you can use RBCmd.exe or $I_Parse.exe to analyze the extracted data. Using $I_Parse.exe Another useful tool for parsing Recycle Bin metadata is $I_Parse.exe . While its usage is similar to RBCmd, it provides an alternative way to extract and analyze metadata from deleted files. Example Tool is very simple to use mention directory where you collected artifact and destination and click parse. Output: Conclusion Analyzing Recycle Bin metadata is a crucial step in digital forensics. Using RBCmd.exe and $I_Parse.exe , you can quickly extract valuable information about deleted files. Additionally, KAPE simplifies the collection of these artifacts, making your forensic workflow more efficient. -----------------------------------------------Dean-------------------------------------------------
- Windows Recycle Bin Forensics: Recovering Deleted Files
The Windows Recycle Bin is an important artifact in forensic investigations . When a user deletes a file using the graphical interface, it is not immediately erased. Instead, the file is moved to the Recycle Bin, where it remains until the user permanently deletes it or empties the Recycle Bin. This behavior makes it a great place to recover deleted files. ------------------------------------------------------------------------------------------------------------- How the Recycle Bin Works When a file is deleted, it is moved to a hidden system folder called $Recycle.Bin . Each user on the system has a separate folder within it, identified by their Security Identifier (SID) . The deleted file is renamed, and metadata is stored alongside it. This metadata includes: The original file name The original file location The time of deletion Since Windows does not track file deletion timestamps at the file system level, the Recycle Bin metadata provides valuable forensic evidence ------------------------------------------------------------------------------------------------------------- Ways to Bypass the Recycle Bin Some users may try to avoid the Recycle Bin by using methods such as: Shift + Delete : This permanently deletes a file without moving it to the Recycle Bin. Command Prompt or PowerShell : Deleting files from the command line bypasses the Recycle Bin. Third-Party Tools: Some applications delete files without sending them to the Recycle Bin. Even with these methods, deleted files may still be recoverable using forensic tools. ------------------------------------------------------------------------------------------------------------- Changes in Recycle Bin Architecture Microsoft has modified the Recycle Bin over the years: Windows XP and earlier: The Recycle Bin used a RECYCLER folder and an INFO2 database file to store metadata. Windows Vista and later: The folder was renamed $Recycle.Bin, and metadata is now stored in separate $I files for each deleted item . This change prevents metadata corruption issues that were common in older versions. ------------------------------------------------------------------------------------------------------------- What Happens When the Recycle Bin Is Emptied? When a user empties the Recycle Bin, all files and their metadata are removed. However, forensic tools can often recover them by: File carving: Searching for file remnants in unallocated space on the disk . Recovering $I files: These metadata files might still be retrievable and can provide useful information. ------------------------------------------------------------------------------------------------------------- Understanding $R and $I Files Modern versions of Windows store each deleted file as two separate files: $R files: These contain the actual deleted data . $I files: These store metadata such as the original file name, location, and deletion timestamp. By analyzing these files, forensic investigators can piece together details about deleted files and their original locations. ------------------------------------------------------------------------------------------------------------- Conducting Recycle Bin Forensics Locate the Recycle Bin Folder: Check $Recycle.Bin on all available drives (e.g., C:\, D:\) . Extract Metadata: Parse $I files to find relevant information . Recover Deleted Files: Copy $R files for further analysi s. Look for Deleted Evidence: If the Recycle Bin has been emptied, attempt file recovery using forensic tools. 
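Tools such as RBCmd and $I_Parse handle this parsing for you, but the $I format itself is simple enough to decode by hand, which is handy for spot-checking tool output. The sketch below is a minimal, unofficial parser based on the publicly documented $I layout (8-byte version, 8-byte original size, 8-byte FILETIME deletion time, then the original path); the input folder path is a placeholder for a collected Recycle Bin export.

```python
import struct
from datetime import datetime, timedelta, timezone
from pathlib import Path

def parse_i_file(path):
    """Parse a $I metadata file (Windows Vista and later)."""
    data = Path(path).read_bytes()
    version, file_size, filetime = struct.unpack_from("<QQQ", data, 0)
    # FILETIME: 100-nanosecond intervals since 1601-01-01 UTC.
    deleted = datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=filetime / 10)
    if version == 2:   # Windows 10+: explicit path length (in characters) at offset 24
        (name_len,) = struct.unpack_from("<I", data, 24)
        original_path = data[28:28 + name_len * 2].decode("utf-16-le").rstrip("\x00")
    else:              # Vista to 8.1: fixed 520-byte UTF-16 path field at offset 24
        original_path = data[24:24 + 520].decode("utf-16-le").split("\x00", 1)[0]
    return {"original_path": original_path, "size": file_size, "deleted_utc": deleted}

# Example: walk a collected Recycle Bin folder and print each record.
for i_file in Path(r"C:\cases\RecycleBin_export").glob("$I*"):
    print(parse_i_file(i_file))
```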
-------------------------------------------------------------------------------------------------------------
Since $R files contain the recoverable data itself, they need no parsing; the $I files do, and the parsing tool used for that here is $I_Parse.
Conclusion
The Windows Recycle Bin is a goldmine of forensic evidence. While users can attempt to bypass it, forensic tools can often recover deleted files and metadata. By understanding the Recycle Bin's structure and metadata files, investigators can uncover valuable information during an investigation.
-------------------------------------------------Dean------------------------------------------------------
- Understanding and Managing Thumbnail Cache in Windows: Tools thumbcache_viewer_64
Introduction
The thumbnail cache in Windows is an essential feature that helps speed up the display of folders by storing thumbnail images. Tool used: Thumbcache Viewer.
What is the Thumbnail Cache?
The thumbnail cache is a set of database files used by Windows to store thumbnail images of files and folders. This cache allows Windows to quickly display thumbnails without needing to regenerate them each time you open a folder.
Location of Thumbnail Cache Files
Windows 8, 10, and 11: In these versions, the thumbnail cache files are stored in C:\Users\<username>\AppData\Local\Microsoft\Windows\Explorer (replace <username> with the actual Windows username). To access this folder: press Win + R to open the Run dialog, type %localappdata%\Microsoft\Windows\Explorer, and press Enter.
Windows 7: The location is the same as in newer versions: C:\Users\<username>\AppData\Local\Microsoft\Windows\Explorer. To access this folder: press Win + R to open the Run dialog, type %localappdata%\Microsoft\Windows\Explorer, and press Enter.
Types of Files in the Thumbnail Cache
In the Explorer folder, you will find several files, each representing different sizes and types of thumbnails. These include:
thumbcache_32.db : Thumbnails of size 32x32 pixels.
thumbcache_96.db : Thumbnails of size 96x96 pixels.
thumbcache_256.db : Thumbnails of size 256x256 pixels.
thumbcache_1024.db : Thumbnails of size 1024x1024 pixels.
thumbcache_idx.db : Index file for the thumbnails.
Viewing Thumbnail Cache Files
To view the contents of these thumbnail cache files, you can use a tool like Thumbcache Viewer, which allows you to open and examine the thumbnail cache database files.
Using Thumbcache Viewer
Thumbcache Viewer is a free tool that supports Windows 7 to Windows 10 thumbnails. Here's how to use it:
Download Thumbcache Viewer.
Install the tool.
Open the thumbnail cache files: launch Thumbcache Viewer and open the thumbnail cache files located in the Explorer directory.
View thumbnails: the tool will display the thumbnails stored in the cache, allowing you to browse and inspect them.
Practical Uses: Forensics and Investigation
For forensic investigators, examining thumbnail cache files can reveal important information about files and images that were present on the system. Using tools like Thumbcache Viewer, investigators can recover thumbnails of deleted files, providing crucial evidence.
*************************
To know more, check out the article from the Thumbcache Viewer project itself: https://thumbcacheviewer.github.io/
*************************
Conclusion
The thumbnail cache in Windows is a useful feature that enhances the user experience by speeding up folder display. Knowing how to access, view, and manage these cache files can be beneficial for both everyday users and professionals. Tools like Thumbcache Viewer make it easy to inspect these files, and regular maintenance can help keep your system running smoothly. Akash Patel
- Automating Google Drive Forensics: Tools & Techniques
Investigating Google Drive for Desktop can be a time-consuming process, especially when dealing with protobuf-encoded metadata and cached files. Fortunately, open-source forensic tools like gMetaParse and DriveFS Sleuth make the job significantly easier.
-------------------------------------------------------------------------------------------------------------
1️⃣ Automating Metadata Extraction with gMetaParse
🔍 What is gMetaParse? Developed by a forensic researcher, gMetaParse is a Python-based tool that automates the extraction of metadata from Google Drive's metadata_sqlite_db database.
📌 Key Features of gMetaParse: ✅ Extracts metadata for all files and folders in Google Drive ✅ Identifies cached files stored locally ✅ Detects deleted (trashed) files ✅ Provides CSV, JSON, and GUI output
📍 Installation & Usage: gMetaParse is available as a Python script or a pre-compiled .exe. It can be run via the command line or with a graphical user interface (GUI).
🛠️ Step-by-Step: Running gMetaParse
1️⃣ Open a command prompt and navigate to the gMetaParse folder.
2️⃣ Run the following command:
gMetaParse.exe -f "C:\Users\Akash\AppData\Local\Google\DriveFS\\metadata_sqlite_db" -d "C:\Users\Akash\AppData\Local\Google\DriveFS\\content_cache" -o "C:\Users\Akash\Downloads\GoogleDriveFS.csv" -g
📌 Explanation: ✅ -f → Points to the Google Drive metadata database ✅ -d → Specifies the cache folder location ✅ -o → Outputs results in CSV format ✅ -g → Launches the GUI for interactive file browsing
-------------------------------------------------------------------------------------------------------------
2️⃣ Visualizing Google Drive Data with the gMetaParse GUI
📌 Why Use the GUI? While CSV/JSON outputs are useful for analysis, gMetaParse's graphical interface makes it easier to navigate large file structures and visually identify deleted or cached files.
🔍 Features of the gMetaParse GUI: ✅ Tree-structure visualization of Google Drive contents ✅ Color-coded files: 🟥 Red → Deleted (trashed) files 🟩 Green → Cached (available locally) ✅ Detailed metadata view when clicking a file
📌 Forensic Use: ✅ Quickly identify deleted files and restore local copies ✅ Filter & search files using metadata ✅ Export all metadata for offline analysis
-------------------------------------------------------------------------------------------------------------
3️⃣ Extracting Google Drive Metadata with DriveFS Sleuth
🔍 What is DriveFS Sleuth? DriveFS Sleuth is an advanced Google Drive forensics tool developed by Amged Wageh and Ann Bransom. It specializes in decoding protobuf-encoded data and recovering MD5 hashes from Google Drive metadata.
📌 Key Features of DriveFS Sleuth: ✅ Parses metadata_sqlite_db, extracting file metadata, timestamps, and hashes ✅ Recovers MD5 hashes for locally stored files ✅ Extracts account information (Google email, username, settings) ✅ Provides interactive HTML reports
📍 Installation & Usage:
🛠️ Step-by-Step: Running DriveFS Sleuth
1️⃣ Install Python 3 (if not already installed).
2️⃣ Download DriveFS Sleuth from GitHub.
3️⃣ Run the following command:
python3 drivefs_sleuth.py /mnt/c/Users/Akash/AppData/Local/Google/DriveFS --html -o /mnt/c/Users/Akash/Downloads/GoogleDriveFS.html
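Before reaching for the dedicated tools above, a quick way to see what the protobuf-heavy metadata_sqlite_db actually exposes is to dump its schema with Python's sqlite3 module. This is only a reconnaissance sketch; table layouts differ between Google Drive for Desktop versions, so no column names are assumed here, and the path (including the per-account folder) is a placeholder. Work on a copied database, not the live, locked file.

```python
import sqlite3

# Placeholder path: substitute the real per-account DriveFS folder.
DB = r"C:\cases\DriveFS\<account_id>\metadata_sqlite_db"

con = sqlite3.connect(DB)

# List every table and its row count to see where the interesting data lives.
for (table,) in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
):
    (count,) = con.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    print(f"{table}: {count} rows")

# Print the full schema so column names can be confirmed before deeper queries.
for (sql,) in con.execute("SELECT sql FROM sqlite_master WHERE type='table'"):
    print(sql)

con.close()
```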
-------------------------------------------------------------------------------------------------------------
4️⃣ Investigating Google Workspace Logs (Business & Enterprise)
🔍 Why Are Google Workspace Logs Important? For enterprise environments, Google Workspace logs provide a detailed audit trail of user activity, including: ✅ File uploads, downloads, and modifications ✅ File sharing (internal & external users) ✅ Deleted items & recovery attempts ✅ Login history & suspicious access attempts
📍 Accessing Google Workspace Logs: 1️⃣ Log in to the Google Admin Console (admin.google.com). 2️⃣ Navigate to Reports > Audit > Drive Log. 3️⃣ Filter logs based on event type, user, date range, or filename. 4️⃣ Export logs to CSV for offline analysis.
-------------------------------------------------------------------------------------------------------------
5️⃣ Filtering Google Workspace Logs for Investigation
📌 Key Log Categories & Event Names:

| Event Name | Description |
|---|---|
| FileUploaded | User uploaded a new file |
| FileDownloaded | File downloaded from Google Drive |
| FileDeleted | File moved to trash |
| FileCopied | File duplicated within Drive |
| AnonymousLinkCreated | File shared externally via public link |
| FileViewed | File opened by the user |

📌 Forensic Use: ✅ Identify suspicious file sharing (e.g., external link creation) ✅ Track deleted files & their recovery attempts ✅ Correlate file access with IP addresses & user accounts
-------------------------------------------------------------------------------------------------------------
6️⃣ Investigating File Sharing & External Access
📌 How to Identify External File Sharing: ✅ Look for AnonymousLinkCreated → indicates public file sharing ✅ Check IP addresses in logs → identify external access ✅ Cross-reference Google Drive metadata → find locally cached shared files
📍 Example: Investigating an External File Share: 1️⃣ Search logs for AnonymousLinkCreated 2️⃣ Identify which file was shared and by which user 3️⃣ Check logs for FileDownloaded → determine whether the file was accessed externally 4️⃣ Extract IP addresses & timestamps → track external access
-------------------------------------------------------------------------------------------------------------
Conclusion
Google Drive forensics plays a crucial role in modern digital investigations, providing insights into file synchronization, access history, deletions, and metadata changes. By analyzing local artifacts, cloud logs, and sync databases, forensic analysts can reconstruct user activity and track evidence even after files have been deleted or modified. Understanding key artifacts such as Google Drive logs, SQLite databases, and API activity allows investigators to uncover who accessed which files, when, and from where, a critical aspect of forensic timelines.
🚀 Keep exploring, stay curious, and refine your forensic skills, because digital evidence is everywhere! 🔍
🎯 Next Up: Dropbox Forensics – Investigating Cloud Storage Security 🚀
-----------------------------------------------Dean-----------------------------------------------
- Detailed explanation of SPF, DKIM, DMARC, ARC
Updated on 28 January, 2025
Email security has always been a challenge because the Simple Mail Transfer Protocol (SMTP) wasn't built with security in mind. This makes it easy for cybercriminals to spoof email addresses and launch phishing, scam, or spam attacks. However, various email authentication mechanisms have been introduced to help verify senders and detect fraudulent messages. When analyzing an email header, you'll often see these security measures in action.
------------------------------------------------------------------------------------------------------------
Sender Policy Framework (SPF)
SPF helps verify whether an email was sent from a mail server authorized for a particular domain. You'll often find this in the header under the Received-SPF line. Think of SPF as a guest list for a party: only specific mail servers are allowed to send emails on behalf of a domain. If an email comes from an unauthorized source, it fails SPF, raising a red flag.
Received-SPF: pass (google.com: domain of n0459381b14-ceb4982011ad4618-nikopirosmani22===gmail.com@bounce.twitter.com designates 199.16.156.176 as permitted sender) client-ip=199.16.156.176;
Header Entry: Received-SPF: This header field indicates the outcome of SPF validation. A "pass" typically signifies a legitimate email, while a "fail" might indicate a potentially suspicious email.
Example: if an email is supposedly from outlook.com, the SPF check confirms it was actually sent by Microsoft's mail servers.
------------------------------------------------------------------------------------------------------------
DomainKeys Identified Mail (DKIM)
DKIM takes email authentication a step further by verifying both the sender and the integrity of the message content. It uses a digital signature, which is added to the email header by the sending server. If this signature is valid, it confirms two things:
The email genuinely came from the stated domain.
The content wasn't tampered with in transit.
Header Entry: DKIM-Signature: This header field contains the DKIM signature and associated information. A successful DKIM validation usually results in a "pass" status.
------------------------------------------------------------------------------------------------------------
Authenticated Received Chain (ARC)
Emails often get forwarded through mailing lists, auto-forwarding, or relays. When that happens, SPF and DKIM checks may fail because the email's route has changed. That's where ARC comes in. ARC keeps track of authentication results at each hop, maintaining a chain of trust.
ARC-Message-Signature: i=1; a=rsa-sha256; c=relaxed/relaxed; d=google.com; s=arc-2016081 h=feedback-id:message-id:precedence:list-unsubscribe:mime-version:subject:to:from:date:dkim-signature;
Every forwarding step is recorded in the email header, and each mail server in the chain signs the message with an ARC-Message-Signature. This way, even if SPF and DKIM fail due to forwarding, ARC can confirm that the original email was legitimate. Google was one of the first major email providers to adopt ARC, followed by Microsoft 365 and others.
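Before moving on to DMARC, here is a minimal sketch of how these results can be reviewed programmatically during an investigation. It uses only the Python standard library to pull the authentication-related headers out of a raw MIME message; the filename suspicious.eml is a placeholder. The optional dkimpy call (the same library mentioned under the verification tools later in this article) only succeeds while the signing domain still publishes the matching key in DNS.
# Print SPF/DKIM/ARC related headers from an .eml file, then optionally re-verify DKIM.
import email
from email import policy

AUTH_HEADERS = ("Received-SPF", "Authentication-Results",
                "DKIM-Signature", "ARC-Authentication-Results",
                "ARC-Message-Signature", "ARC-Seal")

with open("suspicious.eml", "rb") as fh:   # placeholder filename
    raw = fh.read()

msg = email.message_from_bytes(raw, policy=policy.default)

for name in AUTH_HEADERS:
    for value in msg.get_all(name, []):
        print(f"{name}: {value}\n")

# Optional: cryptographic DKIM re-verification with dkimpy (pip install dkimpy).
try:
    import dkim
    print("DKIM verification:", "pass" if dkim.verify(raw) else "fail")
except ImportError:
    print("dkimpy not installed; skipping signature verification")
Note that the DMARC outcome discussed next typically appears inside the Authentication-Results header printed by this sketch.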
------------------------------------------------------------------------------------------------------------
Domain-based Message Authentication, Reporting, and Conformance (DMARC)
DMARC builds on SPF and DKIM by letting domain owners specify what should happen if an email fails authentication checks. The policy can be set to:
None (just monitor emails without blocking them)
Quarantine (send suspicious emails to the spam folder)
Reject (completely block failed emails from delivery)
Header Entry: dmarc: This header field displays the DMARC policy status, which can be "pass," "fail," "none," or other designated states. It also indicates policy actions such as "p=REJECT" or "p=NONE."
------------------------------------------------------------------------------------------------------------
Verifying Email Authentication for Investigations
If you're investigating a suspicious email, checking SPF, DKIM, ARC, and DMARC records can help confirm its legitimacy. Here are some practical tools:
MxToolbox – Checks SPF records and other email security details.
dkimpy (Python library) – Validates DKIM and ARC signatures.
Metaspike Forensic Email Intelligence – Automates email header analysis for forensic investigations.
Limitations of Email Authentication
While these security measures are powerful, they aren't foolproof. Here's what you should keep in mind:
Not all email providers use SPF, DKIM, and ARC.
DKIM and ARC signatures can expire when mail servers rotate their keys, making it impossible to validate old emails.
These authentication methods only apply to received emails, not emails in the sender's outbox.
Microsoft Outlook and Exchange may modify email headers, making DKIM validation difficult for emails stored in PST/OST files. To preserve authenticity, collect emails in their original MIME format (EML, EMLX, or MBOX).
Implications for Digital Forensics
Enhanced Verification: SPF, DKIM, and DMARC give digital forensic professionals additional tools for email verification and authentication, improving the accuracy and reliability of investigations.
Policy Interpretation: Understanding DMARC policies helps investigators interpret how a domain expects failing mail to be handled and identify potential red flags or suspicious activity.
Privacy and Compliance: While these protocols enhance security, forensic professionals must also ensure that their methods align with privacy regulations such as GDPR, respecting user consent and data protection rights.
Conclusion
SPF, DKIM, and DMARC have become integral components of modern email security, offering robust mechanisms for authentication, integrity, and policy enforcement. As these protocols continue to evolve, digital forensic professionals must stay up to date with the latest trends and practices to navigate the complexities of email-based investigations, ensuring both security and compliance.
Akash Patel
- Webmail Forensics / Mobile Email Forensics: A Critical Component of Digital Investigations
Introduction
Webmail forensics is a crucial aspect of digital investigations, especially in cases involving cybercrime, fraud, and eDiscovery. Understanding how webmail services operate, where data is stored, and how to extract and analyze it effectively is essential for forensic examiners.
Mobile Email Considerations
Many investigators overlook mobile email when acquiring evidence. While some smartphones sync with corporate mail servers and only maintain copies of emails, mobile devices can contain valuable messages that are difficult to retrieve elsewhere. Therefore, it is important to:
Verify how email-capable phones interact with mail servers.
Assess whether cloud acquisition is in scope.
Investigate Mobile Device Management (MDM) software logs, which may provide metadata for SMS/MMS, call logs, or device backups.
Mobile Email Backups
Smartphone backups can provide historical data, even if the mobile device is unavailable. Investigators should search for backup files:
Android backup files (.ab extension) or vendor-specific backups such as Samsung Smart Switch, LG Bridge, and Huawei HiSuite.
iOS backups stored in locations such as:
C:\Users\\AppData\Roaming\Apple Computer\MobileSync\Backup
C:\Users\\Apple\MobileSync\Backup
These backups may contain email messages, contacts, and configuration files, aiding forensic analysis.
Windows "Phone Link" Application
Introduced in Windows 10 as "Your Phone" and rebranded as "Phone Link" in Windows 11, this application provides access to:
Call logs (calling.db)
Contacts (contacts.db)
SMS/MMS messages (phone.db)
Photos (photos.db)
Notifications (notifications.db)
These SQLite databases, stored under %UserProfile%\AppData\Local\Packages\Microsoft.YourPhone_8wekyb3d8bbwe\LocalCache\Indexed\\System\Database, can be extracted and analyzed using tools like KAPE (WindowsYourPhone.tkape) and SQLECmd. A read-only triage sketch for these databases follows below.
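The sketch below is a minimal way to triage the Phone Link databases once they have been copied out of the live Packages folder into a case directory (CASE_DIR is a placeholder, not a real path). It deliberately assumes nothing about the table schemas, which vary between application versions; it simply opens each .db file read-only and lists tables with their row counts. Remember to copy any -wal and -shm sidecar files alongside each database.
# Read-only triage of exported Phone Link SQLite databases (case directory is hypothetical).
import sqlite3
from pathlib import Path

CASE_DIR = Path(r"C:\Cases\PhoneLink")  # placeholder export of the Database folder

for db_path in sorted(CASE_DIR.glob("*.db")):
    print(f"\n== {db_path.name} ==")
    con = sqlite3.connect(db_path.as_uri() + "?mode=ro", uri=True)  # never modify evidence
    try:
        tables = [r[0] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
        for table in tables:
            count = con.execute(f'SELECT COUNT(*) FROM "{table}"').fetchone()[0]
            print(f"  {table}: {count} rows")
    finally:
        con.close()
For full parsing, the KAPE target and SQLECmd maps mentioned above remain the more complete option; this sketch is only a quick sanity check of what each database contains.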
Webmail Investigation Steps
Identify Email Clients and Services:
Determine what email clients exist on the system and whether the user relies on webmail services.
Review system folder structures.
Check the Windows registry for installed email applications.
Examine browser history, cookies, and cached files for webmail use.
Forensic Acquisition of Email Data:
Acquire mail archives within the scope of authority.
Extract both server mailboxes and local storage.
Convert email archives into a consistent format, such as PST, for easier analysis while retaining the original files for authenticity checks.
Email Header and Metadata Analysis:
Extract and analyze email headers to trace the origin and integrity of messages.
Validate email authenticity using DKIM/ARC signatures.
Identify sender IP addresses and geolocation.
Cross-reference timestamps for consistency.
Commercial and Open-Source Tools
Commercial tools like Metaspike Forensic Email Intelligence (FEI) provide extensive features, including:
SMTP/MAPI header parsing.
Email validation and timestamp extraction.
IP and domain intelligence.
Advanced searching and filtering of email archives.
Additionally, forensic tools like Autopsy's Your Phone Analyzer module can help parse mobile email artifacts.
Conclusion
Webmail forensics plays a vital role in digital investigations. By understanding how emails are stored, retrieved, and analyzed across devices, forensic examiners can uncover critical evidence. Utilizing both forensic best practices and specialized tools ensures thorough and accurate email investigations.
-------------------------------------------Dean-----------------------------------------------







