
- Understanding Lateral Movement in Cyber Attacks:
In the realm of cybersecurity, one of the most concerning aspects of an attack campaign is the stealthy progression through a network toward critical data and assets. This maneuver, known as "lateral movement," is a technique attackers use to navigate networks, evade detection, and gain access to valuable information. Identifying and preventing lateral movement is crucial to fortifying network defenses and safeguarding sensitive data from compromise.

What is Lateral Movement?
Lateral movement is akin to a strategic chess game for cyber attackers. Once they breach an initial entry point, they proceed methodically across the network, seeking out the key assets that are the ultimate objectives of their attack. Irregular peer-to-peer communication within a network can serve as a vital indicator of lateral movement attempts.

Background
Lateral movement attacks involve unauthorized connections from one Windows host to another using valid stolen credentials. Typically, a compromised system serves as the source host, infiltrated through means such as a spear-phishing attack. Once a system is compromised, attackers escalate privileges and extract credentials stored on it to access other resources.

Credential Theft and Misuse
Attackers employ specialized tools to capture credentials, including NT hashes and Kerberos tickets, from compromised systems. The stolen credentials are then used to access additional resources within the network through techniques like pass-the-hash or pass-the-ticket.

Detecting Lateral Movements
Detecting lateral movement requires meticulous monitoring of Windows events to identify unauthorized account usage from or to unusual systems. This entails maintaining a comprehensive list of expected user-workstation combinations and promptly flagging any deviations from established norms.

NTLM Lateral Movements Detection
NTLM lateral movements leave distinct traces in Windows event logs. Events such as 4648 (logon attempted with explicit credentials), 4776 (NTLM credential validation), and 4624 (successful logon) provide valuable insight into anomalous logon attempts, authentication packages, and workstation usage, serving as key indicators of potential lateral movement.

Kerberos Lateral Movements Detection
Similarly, Kerberos lateral movements can be detected by closely monitoring events such as 4768 (TGT requested), 4769 (service ticket requested), and 4624. By scrutinizing service names, client addresses, and logon types, cybersecurity professionals can swiftly identify suspicious activity indicative of lateral movement attempts.

Main Accounts to Monitor
In addition to Domain Administrator accounts, it is imperative to monitor other critical accounts such as service accounts, rarely used accounts, and business-critical accounts. By keeping a vigilant eye on these accounts, organizations can fortify their defenses against lateral movement attacks.

Additional Events to Monitor
Reference materials such as NSA guidelines offer supplementary insight into additional events to monitor for detecting various types of cyber attacks, including lateral movement. By leveraging these resources, organizations can further enhance their detection capabilities and bolster their overall cybersecurity posture.

Techniques and Tools Leveraged in Lateral Movement
Attackers employ a range of techniques and tools to execute lateral movement within networks. Here are some commonly used methods:

Remote Access Services: Any combination of hardware and software that enables remote access to tools or information on a network. Protocols such as SSH, Telnet, RDP, and VNC give attackers the means to traverse networks laterally.
Windows Management Instrumentation Command-line (WMIC): WMIC offers a terminal interface through which administrators can run scripts for computer management; however, it can also be abused as a vector for post-compromise lateral movement.
PsExec: Developed as an alternative to conventional remote access services, PsExec can run processes under the Windows SYSTEM account, making it a favored tool for attackers seeking privilege escalation.
Windows PowerShell: Microsoft's framework for task automation and configuration management. The PowerShell Empire toolkit bundles a large set of prebuilt attack modules, making PowerShell a potent tool for lateral movement.

Securing Against Lateral Movement
Mitigating the risks associated with lateral movement demands:
- Addressing vulnerabilities such as insecure passwords
- Employing strong authentication methods and regularly updating passwords
- Regularly auditing network activity
- Monitoring for irregularities in peer-to-peer communication

"Explore a meticulously compiled dossier spotlighting event log entries, registry modifications, and file creations or changes linked to lateral movement. This comprehensive file examines the nuances of lateral movement occurrences, shedding light on both the origins and destinations of these actions. Its categorized sections unveil crucial details surrounding lateral movement scenarios, offering invaluable insights into their dynamics."

Akash Patel
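As a rough illustration of the baseline approach described above (maintaining expected user-workstation combinations and flagging deviations), here is a minimal Python sketch. The event records and baseline below are fabricated stand-ins for already-parsed event-log data (e.g. event IDs 4624/4648); they are not the output of any specific tool.

```python
# Assumed baseline of expected user -> workstation combinations.
BASELINE = {
    "alice": {"WKS-ALICE"},
    "svc_backup": {"SRV-BACKUP01"},
}

def flag_anomalous_logons(events, baseline):
    """Return events where the account logged on from an unexpected workstation."""
    anomalies = []
    for ev in events:
        expected = baseline.get(ev["account"], set())
        if ev["workstation"] not in expected:
            anomalies.append(ev)
    return anomalies

# Illustrative parsed log records (event IDs from the article).
events = [
    {"event_id": 4624, "account": "alice", "workstation": "WKS-ALICE"},
    {"event_id": 4624, "account": "alice", "workstation": "SRV-FILE01"},    # unusual
    {"event_id": 4648, "account": "svc_backup", "workstation": "WKS-BOB"},  # unusual
]

for ev in flag_anomalous_logons(events, BASELINE):
    print(ev["event_id"], ev["account"], "->", ev["workstation"])
```

In practice the baseline would be built from historical log data, and the events would come from an EVTX parser or a SIEM export rather than hard-coded dictionaries.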
- Collecting Email Evidence from Network-Based Servers
Collecting email evidence from mail servers can be challenging due to factors such as server location, criticality to business operations, and the use of shared-hosting or cloud facilities.

1) Full or Logical Disk Image of the Server
Challenges: Difficult to obtain for highly utilized, critical servers.
Method: Live imaging is often the only viable option.
Considerations: Requires specialized tools capable of live imaging, and carries a risk of disrupting business operations if not handled carefully.

2) Export of Individual Mailboxes in Their Entirety
Method: Export each mailbox to create a backup or a PST file.
Considerations: Efficiency: suitable for collecting specific user data. Completeness: ensures all mailbox data is captured. Tools: the Exchange Management Shell or third-party utilities can be used for mailbox export.

3) Specialized Applications for Searching, Filtering, and Extracting Messages
Method: Utilize forensic tools designed for email extraction and analysis.
Considerations: Precision: allows targeted searches based on criteria. Flexibility: filters extract only relevant messages or data. Compatibility: ensure the tool supports the server's email platform.

Backup and Recovery
Windows Server Backup (WSB) uses a plugin named "WSBExchange.exe" for Exchange-aware backups. It leverages the Volume Shadow Copy Service to run backups in the background, checks Exchange database consistency, flushes transaction logs, marks databases as backed up, and stores backups as Virtual Hard Disk (VHD) files.

Instructions for backing up Exchange 2007 or 2010:
1. Start Windows Server Backup.
2. Click "Backup Once" in the Actions pane to launch the Backup Once Wizard.
3. Choose backup options: select "Different options" and proceed, then opt for Full server (recommended) or Custom to specify volumes.
4. Specify the backup destination: choose a location and configure access control settings.
5. Advanced options: select VSS full backup.
6. Review and confirm: confirm the backup settings and start the backup process.
7. Monitor progress on the backup progress page.
8. Close the wizard once the backup operation is complete.

Conclusion: When collecting email evidence from network-based servers, it's crucial to choose the right method based on the server's characteristics, business needs, and the investigation's requirements. Whether it's live imaging, mailbox exports, or specialized forensic tools, each approach has its advantages and challenges. Additionally, leveraging server backups such as Windows Server Backup can provide a reliable and efficient way to capture Exchange data while ensuring data integrity and compliance with backup and disaster-recovery plans.

Akash Patel
- Unveiling System Secrets with WinPmem (memory acquisition tool)
Exploring WinPmem
WinPmem is a robust memory acquisition tool designed specifically for Windows environments. Its primary function is to capture the contents of a system's physical memory, offering a snapshot of the system's state at a particular moment. This is invaluable for uncovering running processes, identifying malicious activity, and piecing together the puzzle of a security incident.

Key Features of WinPmem
Kernel-level operation: WinPmem operates at the kernel level, enabling it to access and acquire the contents of physical memory directly.
Memory analysis: the acquired memory image provides a treasure trove of information, including details about running processes, network connections, and other volatile artifacts crucial for investigations.
Forensic insights: analysts use memory analysis to uncover evidence of malware, unauthorized access, and other security incidents that may not be visible through traditional disk-based forensics.

Capturing a Memory Image with WinPmem
Use either of the following commands (both work):
WinPmem.exe -o C:\Forensics\MemoryImage.raw
WinPmem.exe MemoryImage.raw
In my experience, the tool can capture images with .raw, .img, and .mem extensions. In this example, WinPmem captures the memory image and saves it as "MemoryImage.raw" in the "C:\Forensics" directory.

Understanding the Command
The WinPmem.exe executable initiates the tool; the -o flag is followed by the desired output path where the memory image file will be stored. Tools such as Autopsy and Volatility can then be used to analyze the image.

Conclusion
WinPmem stands as a powerful ally for digital forensics experts, providing a window into a system's soul through the lens of its memory. By incorporating this tool into investigative workflows, analysts can unravel the mysteries hidden within a system, contributing to a more comprehensive understanding of security incidents.

Akash Patel
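The acquisition step above can also be scripted, together with a hash of the resulting image for integrity documentation. This is a hedged sketch: the WinPmem executable path and output location are assumptions, and the `-o` invocation simply mirrors the command shown in the article.

```python
import hashlib
import os
import subprocess

def sha256_of_file(path, chunk_size=1024 * 1024):
    """Hash a (potentially very large) image file in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def acquire_memory(winpmem_exe, output_path):
    """Run WinPmem as shown in the article, then hash the acquired image."""
    subprocess.run([winpmem_exe, "-o", output_path], check=True)
    return sha256_of_file(output_path)

# Only meaningful on a live Windows system with WinPmem present
# (the tool path below is a hypothetical example location).
if os.path.exists(r"C:\Tools\WinPmem.exe"):
    print(acquire_memory(r"C:\Tools\WinPmem.exe", r"C:\Forensics\MemoryImage.raw"))
```

Recording the hash immediately after acquisition gives you a reference value for chain-of-custody documentation and later verification.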
- Email Storage: Server vs. Workstation
Determining the location of email data—whether on a server or a workstation—is a pivotal first step for forensic investigators.

Email Storage Locations
1. Server-based storage: in corporate settings, the email server typically hosts the most recent email traffic, while workstations often store older messages or synchronized copies of mailboxes. Email archives may also turn up in unexpected locations on workstations due to varying IT policies or system administrator oversights.
2. Workstation-based storage: workstations often hold offline or archived email data, particularly older messages that are no longer actively synchronized with the server. Limited IT controls on workstations can result in archives being stored outside intended locations, complicating forensic analysis.

Approaches for Email Analysis
Advanced indexing and filtering: narrow the scope to relevant messages.
Threading and clustering: facilitate focused investigation.
Deleted-message recovery: retrieve soft-deleted messages within retention periods.
Multi-account access: review multiple user accounts comprehensively.
Deduplication: eliminate duplicate messages to streamline review.

Recommended Tools
Forensic suites: X-Ways, EnCase, FTK.
Dedicated email tools: SysTools Mail Examiner, Aid4Mail, Emailchemy, Logikcull.

Example: Microsoft Exchange
Market leader: predominantly used in corporate enterprises, often deployed on standalone or virtualized servers.
Storage structure:
Exchange 2007: uses .EDB database files, often located at C:\Program Files\Microsoft\Exchange Server\Mailbox\First Storage Group\Mailbox Database.edb.
Prior to Exchange 2007: comprises .EDB and .STM files, both essential for forensic analysis.
.log files: vital for data recovery, capturing transactions before they are committed to the .EDB.
eseutil tool: enables log replay and data import into .EDB files for recovery and analysis.
Storage groups: newer Exchange databases can be segmented into multiple storage groups, each containing several database files.

Acquisition and Collaboration
Collaboration with the server administrator is essential for comprehensive data acquisition. Mailboxes can also be exported to .PST format as an alternative data source.

Conclusion
Understanding email storage nuances—server-based or workstation-based—is indispensable for forensic investigators. Collaboration with server administrators and specialized tools can significantly enhance the efficiency and thoroughness of email forensic investigations.

Akash Patel
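Since archives often end up outside their intended locations, a quick sweep of a mounted image for common email-archive extensions is a useful triage step. The extension list below is illustrative, not exhaustive:

```python
import os

# Common email-archive extensions (illustrative subset)
EMAIL_EXTENSIONS = {".pst", ".ost", ".edb", ".stm", ".eml", ".mbox", ".nsf"}

def find_email_archives(root):
    """Yield paths under `root` whose extension suggests an email archive."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in EMAIL_EXTENSIONS:
                yield os.path.join(dirpath, name)

# Example usage against a mounted drive image (path is hypothetical):
# for path in find_email_archives(r"E:\mounted_image"):
#     print(path)
```

Extension sweeps miss renamed files, so pair this with signature-based searches when completeness matters.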
- Demystifying Email Encryption and Forensic Analysis
Email remains a primary communication tool, handling a vast amount of sensitive information daily. Understanding email encryption and the intricacies of email clients is therefore vital for both privacy-conscious users and forensic investigators.

1. Individual Message Encryption
Public-key protocols: Secure MIME (S/MIME) and Pretty Good Privacy/MIME (PGP/MIME) are commonly used for individual message encryption.
End-to-end encryption: these protocols ensure that only the sender and recipient can decrypt the message, enhancing security.
File extensions: look for .PGP (PGP) or .P7M (S/MIME) extensions as indicators of encrypted content.

2. Client-Side Encryption
Local archives: email clients such as Outlook and Lotus Notes support encryption of locally stored archives.
Enterprise environments: centralized key servers can facilitate S/MIME encryption, aiding recovery efforts.

3. Network-Based Mail Encryption
TLS/SSL (Transport Layer Security/Secure Sockets Layer) encrypts email in transit without hindering forensic investigation of stored messages.

4. Office 365 Encryption
Transparent encryption aims to make email encryption seamless for end users within the Office 365 ecosystem.

Common Traits of Email Clients and Investigative Considerations
1. File structure: index, message, and folder files are crucial for organizing and accessing email data. Copy all mail directories during export for comprehensive data recovery.
2. Message storage: messages are often stored as text, which makes it possible to locate archives with search tools and to review corrupted archives in a text editor.
3. Access control: email access requires authentication and is restricted to client identities. Tools like Mail Pass View can aid in recovering passwords for popular email clients.
4. Data recovery: email archives often hide messages marked as deleted, which may require alternate viewers to review. Traditional forensic techniques can also recover entire deleted email archives.

Outlook Specifics
File format: all email data is stored in a single .pst file.
Binary obfuscation: Outlook includes default encryption options for added security.
Deleted messages: remain accessible until compaction or cleanup, offering extended recovery opportunities.

Conclusion
Understanding email encryption and the traits of various email clients is crucial for effective digital communication and forensic investigations. Whether you're a user aiming to enhance data privacy or an investigator analyzing email data, this knowledge empowers you to navigate the complexities with confidence. Stay tuned for more articles on cybersecurity and digital privacy!

Akash Patel
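A practical triage complement to the Outlook notes above: PST/OST files begin with the four signature bytes `!BDN`, so you can verify candidate archives and scan raw data (for example, carved unallocated space) for that signature. This is a minimal sketch for triage only; a signature hit does not prove a valid, recoverable archive.

```python
PST_SIGNATURE = b"!BDN"  # magic bytes at offset 0 of .pst/.ost files

def looks_like_pst(path):
    """Quick check: does the file start with the PST signature?"""
    with open(path, "rb") as fh:
        return fh.read(4) == PST_SIGNATURE

def find_pst_signatures(blob):
    """Return offsets of the PST signature inside a raw byte buffer,
    e.g. a chunk read from unallocated space."""
    offsets, start = [], 0
    while (idx := blob.find(PST_SIGNATURE, start)) != -1:
        offsets.append(idx)
        start = idx + 1
    return offsets

print(find_pst_signatures(b"xx!BDNyy!BDN"))  # [2, 8]
```

In a real case you would follow a signature hit by parsing the surrounding header before attempting recovery.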
- Navigating the Email Clients, Features of Modern Email Clients, Corrupted Email Archives
What is an email client? An email client, often simply called an "email program" or "email software," is an application that enables users to send, receive, organize, and manage email messages. Essentially, it provides an interface for users to interact with their email accounts hosted on email servers.

Identifying Email Clients
1. Review installed programs: start by examining the system's installed programs. The Windows registry can be a treasure trove, even revealing references to previously uninstalled email clients.
2. Internet search: for unfamiliar email clients, a simple internet search can shed light on their file types and archive structures.

Storing Email Data
1. Flat-text archives: many email clients use flat-text archives, making bit-level keyword searches a fruitful endeavor, whether the data is in allocated or unallocated disk space.
2. Exported email files: don't overlook exported emails, such as Thunderbird's .EML files, which might contain crucial information.

Common Email Clients to Consider: The Bat!, Poco, Pegasus, FoxMail, IncrediMail, AOL.

Features of Modern Email Clients
1. Comprehensive data storage: modern email clients often store emails, calendar entries, contacts, and tasks within a unified archive.
2. Integration with productivity tools: enhanced with features such as appointment scheduling and task lists, modern email clients function as comprehensive productivity suites.

Calendar entries: offer insights into a person's activities. Look for .ICS files, the common format for calendar data; orphan .ICS files in temporary directories can provide evidence.
Address books: formats such as .WAB, .PAB, .VCF, .MAB, and .NNT are common; text-based formats are easier to search and analyze.
Task lists: may reside within calendar files in SQLite format with an .SDB extension. Importing these files into a forensic workstation enables detailed analysis.

Corrupted Email Archives
Common causes: corruption can result from client issues, large archives, or out-of-sync files.
Recovery options: tools like scanpst.exe can repair corruption; third-party tools are also available, though their trustworthiness varies.
Best practices: always document the tools used and run them on copies of the evidence.

Conclusion
Understanding the intricacies of email client data storage is paramount for forensic investigators. By employing the strategies, considerations, and best practices outlined in this guide, investigators can navigate the challenges posed by diverse email clients effectively.

Akash Patel
- Deep Dive into Additional Email Header Fields in Digital Forensics
In our previous exploration of email headers, we delved into the most common and widely recognized fields, such as Message-ID and Received. However, the email header is a multifaceted entity, rich with additional fields that can offer further insight into an email's journey and integrity.

X-Originating-IP (removed by many webmail providers over privacy concerns)
Purpose: this optional field reveals the IP address of the computer from which the original email was sent.
Authentication and integrity: while this field can be spoofed, doing so requires control over the originating Mail Transfer Agent (MTA). If the field is missing, the "Received" chain may still contain endpoint-originating information, providing a fallback for tracing the source.

X-Forwarded-For
Purpose: indicates that the email was forwarded from another source, possibly through load-balancing or proxy servers.
Authentication and integrity: can help identify the infrastructure or route the email took before reaching its final destination.

X-Barracuda-Apparent-Source-IP
Purpose: unique to Barracuda devices, this optional field records the apparent source IP address.
Authentication and integrity: helps identify whether the email passed through a Barracuda device, potentially revealing security filtering or processing.

Authentication and Integrity Across Fields
Spoofing risks: many of these fields, including X-Originating-IP and X-Forwarded-For, can be spoofed, but doing so requires control over the MTA or specific devices in the email's path.
Validation: while these fields can be valuable, validation is crucial. Cross-referencing with other headers, using forensic tools, and understanding the typical behavior of MTAs and devices can help verify the authenticity of these fields.

Conclusion
While the landscape of email headers is vast and ever-evolving, these additional fields provide a deeper layer of insight for digital forensic professionals. Despite challenges such as spoofing and the need for meticulous validation, the information embedded in these headers offers invaluable opportunities for tracing, validation, and enhanced forensic analysis.
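Python's standard-library email parser can pull these routing fields out of a raw message for cross-referencing. The sample message below is fabricated for illustration (RFC 5737 documentation addresses); real messages will have far longer Received chains:

```python
from email import message_from_string

# Fabricated sample message using documentation IP ranges.
RAW = """\
Received: from mail.example.org (mail.example.org [203.0.113.7])
Received: from client.example.org ([198.51.100.23])
X-Originating-IP: [198.51.100.23]
From: sender@example.org
To: recipient@example.net
Subject: test

body
"""

def extract_routing_fields(raw_message):
    """Collect the header fields discussed above from one raw message."""
    msg = message_from_string(raw_message)
    return {
        "x_originating_ip": msg.get("X-Originating-IP"),
        "x_forwarded_for": msg.get("X-Forwarded-For"),
        # Each hop prepends its Received header, so the list reads
        # newest-hop first; the last entry is closest to the origin.
        "received": msg.get_all("Received") or [],
    }

fields = extract_routing_fields(RAW)
print(fields["x_originating_ip"])  # [198.51.100.23]
print(len(fields["received"]))     # 2
```

Walking the Received chain from the bottom up, and comparing it against X-Originating-IP, is one concrete way to apply the cross-referencing advice above.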
- Important Update: Temporary Pause in Blog Updates
Dear readers and followers, I hope this message finds you well. I wanted to take a moment to share an important update regarding the blog. Due to unforeseen circumstances, I have not been able to publish new blog posts since April 5th. Please rest assured that this pause is temporary: I am actively working to resolve the issues at hand and will be back as soon as possible with fresh and engaging topics for you to enjoy. I understand that you might be looking forward to our regular updates, and I sincerely apologize for any inconvenience this may cause. Your patience and understanding during this time are greatly appreciated. In the meantime, I encourage you to explore our archive of past blog posts. There's a wealth of information, tips, and insights waiting for you there. Thank you once again for your continued support and understanding. I look forward to reconnecting with you all very soon with new and exciting content. Stay tuned, and take care! Akash Patel
- Solid-State Drives (SSDs): Acquisition, Analysis, and Best Practices
Introduction: Solid-state drives (SSDs) have revolutionized data storage with their speed, reliability, and lack of moving parts. However, their unique characteristics pose challenges for forensic investigators and analysts.

Understanding SSDs: SSDs use non-volatile NAND flash memory for data storage, providing faster access times and improved reliability compared to traditional hard drives. Non-volatility allows flash SSDs to retain their contents through a sudden power loss.

Limited Writes and NAND Flash Quality: SSD reliability is directly affected by the number of writes to the NAND flash memory. Frequent writes can lead to data corruption and reduce the lifespan of the drive. Consumer-grade SSDs often use lower-quality NAND flash, making them more susceptible to wear from repeated writes.

Wear Leveling: Wear leveling is a technique used to distribute write and erase cycles evenly across the SSD's memory cells. When data is modified, it is moved to a new location and the original location is marked for erasure. This prevents certain memory cells from wearing out faster than others.

Drive Trimming (TRIM): TRIM improves SSD performance and lifespan by informing the drive which data blocks are no longer in use, allowing the SSD to reclaim them.

Effects on Forensic Analysis: Wear leveling can alter the physical location of data on the SSD, making it challenging to recover specific sectors or data remnants such as file slack. TRIM operations can eliminate data remnants and reduce the effectiveness of traditional techniques like file carving.

Prefetch and ReadyBoost: Prefetch and ReadyBoost, designed to improve system performance by caching frequently accessed data, may be disabled or enabled depending on the SSD configuration. Microsoft has started enabling Prefetch and ReadyBoost by default on SSDs due to their improved performance, which may affect forensic analysis and investigation techniques.

Acquisition of Data from SSDs: Acquiring data from SSDs requires careful consideration of power-loss concerns and data-collection methods.

1. Power-loss concerns: Cutting power to a running SSD can cause serious problems, potentially triggering data modifications during recovery. Traditional shutdown processes can also trigger drive-optimization activities, affecting data integrity.

2. Impact on data collection: Cutting power to an SSD may not be the best option for ensuring proper data collection. The repair operations an SSD initiates during power-loss recovery can include trimming and wear leveling, which affect the integrity of the data. Simply powering off the system with a normal shutdown can also trigger drive-optimization activities, further complicating collection.

3. Live-acquisition considerations: Some experts suggest that live imaging of the system may be the best approach for acquiring data from SSDs. Leaving the SSD running for extended periods, even idle, can potentially alter the data. Live acquisition, similar to imaging memory, may offer better control over the data and reduce the risk of unintended modifications by the SSD.

4. Recommended recovery procedures: In case of a drive failure due to power loss, follow the recovery guidelines provided by manufacturers such as Crucial. The recovery process involves completing a power cycle, which may take about an hour, and is typically performed on a laptop or desktop by connecting the SSD to the SATA power connector:
1. With the drive connected and sitting idle, power on the computer and wait 20 minutes. Don't use the computer during this process.
2. Power the computer down and disconnect the drive from the power connector for 30 seconds.
3. Reconnect the drive and repeat steps 1 and 2 one more time.
4. Reconnect the drive normally and boot the computer to the operating system.
5. If the drive's firmware is not up to date, update it.

5. Write blocking and analysis: Standard write blockers prevent accidental writes from the connected operating system, but the SSD's own controller may still perform wear leveling and trimming whenever the drive is powered. Using a write blocker for imaging is recommended to preserve drive integrity, but prolonged analysis of an SSD connected via a write blocker increases the window for controller-initiated drive-management operations, potentially compromising data integrity.

Will disk defragmentation be disabled by default on SSDs?
Answer: Yes. SSDs do not benefit from defragmentation the way traditional mechanical hard drives do; in fact, defragmentation causes unnecessary wear without providing any performance improvement.

Will SuperFetch be disabled on SSDs?
Answer: It depends. Newer versions of Windows, such as Windows 8 and Windows 10, typically keep SuperFetch enabled on SSDs, while Windows 7 systems may disable it when an SSD is detected. SuperFetch improves performance by preloading frequently used applications into memory, but on SSDs it is less necessary because of the faster read/write speeds.

Does the Windows Search Indexer operate differently on SSDs?
Answer: No. The Search Indexer creates and maintains a database of file and folder information to enable quick file searches, and it works the same way on SSDs as on traditional hard drives; only the access times differ.

What should you do if the hash does not match on the first attempt to image an SSD?
Answer: Keep the original image and image the drive again. The most likely reason for the mismatch is wear leveling or TRIM activity occurring after the initial hash was generated. By comparing the original and subsequent images, you can identify the differences caused by wear leveling or TRIM, such as deleted files or changes in unallocated space. This comparison can help mitigate concerns over unmatched hashes when presenting evidence in legal proceedings.

Conclusion: Solid-state drives offer numerous benefits, but their unique characteristics present challenges for forensic investigators. By understanding SSD behavior, implementing proper acquisition techniques, and adhering to best practices, analysts can effectively acquire and analyze data from SSDs while maintaining data integrity and reliability.

Akash Patel
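The image-comparison step suggested above can be sketched in a few lines: hash both acquisitions, and if they differ, compare them chunk by chunk to localize where TRIM or wear leveling changed data. The 512-byte chunk size mirrors a classic sector size; the byte buffers below are tiny stand-ins for real drive images.

```python
import hashlib

def image_hash(data):
    """SHA-256 of an in-memory image buffer."""
    return hashlib.sha256(data).hexdigest()

def differing_chunks(image_a, image_b, chunk_size=512):
    """Return offsets where two equal-length images differ."""
    assert len(image_a) == len(image_b), "compare equal-length images"
    diffs = []
    for off in range(0, len(image_a), chunk_size):
        if image_a[off:off + chunk_size] != image_b[off:off + chunk_size]:
            diffs.append(off)
    return diffs

first = bytes(2048)                   # stand-in for the first acquisition
second = bytearray(first)
second[512:1024] = b"\xff" * 512      # a region "trimmed" between passes

if image_hash(first) != image_hash(bytes(second)):
    print(differing_chunks(first, bytes(second)))  # [512]
```

Documenting exactly which offsets changed between passes, and tying them to unallocated space or deleted files, is what lets you explain a hash mismatch credibly in court.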
- Digital Evidence: Techniques for Data Recovery and Analysis
In today's digital age, forensic investigators face the challenge of extracting valuable evidence from various storage devices, including solid-state drives (SSDs). With techniques like datastream carving, file carving, and parsing metadata, investigators can uncover crucial information for legal proceedings and investigations. Datastream Carving vs. File Carving: 1. Datastream Carving: Involves extracting small fragments of data from larger files. Useful for recovering valuable information, such as URLs and timestamps, from partially deleted files. Tools like Magnet Forensics' Internet Evidence Finder (IEF) facilitate the process by scanning for fragments and full files across storage devices. 2. File Carving: Focuses on recovering intact files from memory or unallocated space. Scans for known file headers and carves out files based on predicted lengths or known footers. Effective for recovering specific types of deleted files but may yield numerous false positives. Parsing Metadata in Files: Metadata embedded within files provides insights into their creation, modification, and history. Microsoft Office documents and picture files contain metadata such as author information, creation time, GPS Coordination, and camera details. Example : For Microsoft Office documents, metadata may include details such as author information, creation time, last print time, and even the version of Microsoft Office used to create the document. This information can help establish the origin and authenticity of the document, which is especially important in cases involving stolen or altered documents. Similarly, picture files contain metadata, which includes information about how the picture was taken. This data typically includes the original picture creation date, the type of camera used, and even GPS coordinates if the device has a built-in GPS. Tools like exiftool can parse metadata from files, uncovering valuable information for e-discovery cases and investigations. 
In e-discovery cases, requesting metadata can be crucial for building a comprehensive understanding of the evidence and ensuring a fair trial. Judges often grapple with the complexities of metadata requests, recognizing its potential to make or break a case. By leveraging tools like exiftool to parse metadata from files, investigators can uncover valuable information that may strengthen their legal arguments and provide clarity in complex litigation scenarios https://exiftool.org/ Recovering Deleted Files: Forensic analysis often involves recovering lost or deleted files from storage devices. Metadata layer extraction focuses on retrieving file properties, while unallocated space extraction scans for file headers and clusters. Tools like Photorec facilitate file recovery by scanning for file headers and attempting to reconstruct fragmented files. Using Photorec: Photorec is a versatile data recovery program that reads file headers and targets various media file types. It can recover files from hard drives or mounted drive images and has limited fragmentation handling capabilities. Photorec Sorter can help organize recovered files by extension for easier analysis. Output: Using Photorecsorter: Move the PhotoRec Sorter executable (PhotoRec_Sorter.exe) to the directory containing the "recup_dir" folders generated by PhotoRec. Execute PhotoRec_Sorter.exe from the same directory. Monitor the console output for any messages or errors during the sorting process. Once PhotoRec Sorter has finished execution, navigate through the "recup_dir" folders to ensure all files are properly sorted. Check for any files that may not have been sorted correctly and manually move them to the appropriate folders based on their file extensions. Conclusion: By leveraging techniques such as datastream carving, file carving, and metadata parsing, forensic investigators can extract valuable evidence from storage devices like SSDs. 
These techniques play a crucial role in e-discovery cases, legal proceedings, and criminal investigations, providing insights that can strengthen legal arguments and uncover hidden truths. Akash Patel
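The signature-based carving that PhotoRec-style tools perform can be sketched in a few lines: scan raw bytes for a known header, then cut at the matching footer. The JPEG markers below are real, but the scanning logic and the max_size cap are a simplified illustration of the technique, not PhotoRec's actual algorithm (which also handles predicted lengths and some fragmentation).

```python
# Minimal sketch of header/footer file carving over a raw disk image.
JPEG_HEADER = b"\xff\xd8\xff"   # JPEG start-of-image marker
JPEG_FOOTER = b"\xff\xd9"       # JPEG end-of-image marker

def carve_jpegs(raw, max_size=10 * 1024 * 1024):
    """Yield byte spans that look like complete JPEG files in raw data."""
    pos = 0
    while True:
        start = raw.find(JPEG_HEADER, pos)
        if start == -1:
            return                      # no more candidate headers
        end = raw.find(JPEG_FOOTER, start, start + max_size)
        if end == -1:
            pos = start + 1             # false positive or fragmented file:
            continue                    # skip this header and keep scanning
        yield raw[start:end + len(JPEG_FOOTER)]
        pos = end + len(JPEG_FOOTER)
```

This also makes the false-positive problem mentioned above concrete: any stray occurrence of the three header bytes in unrelated data will produce a candidate, which is why carvers validate or over-report.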
- Program Execution: UserAssist Registry Key || Shimcache/Amcache || BAM/DAM
1. UserAssist Key

Understanding the UserAssist Key:

The UserAssist key, located within the NTUSER.DAT hive of the Windows registry, contains valuable information about GUI program executions initiated by users. This key stores details such as the last run time, run count, name of the GUI application, focus time, and focus count for each program launched through Windows Explorer.

Analyzing UserAssist Data:

Forensic analysts can leverage the UserAssist key to uncover important details about program executions, including:
- Last Run Time (UTC): The timestamp indicating when a program was last executed by the user.
- Run Count: The number of times a program has been executed on the system.
- Name of GUI Application: The name or identifier of the GUI application launched by the user.
- Focus Time and Focus Count: Metrics indicating the total time an application has been in focus and the number of times it was re-focused in Windows Explorer.

Understanding GUIDs and Execution Modes:

Application launches are recorded under OS-specific GUID subkeys within the UserAssist key, distinguishing executable file executions from shortcut file executions. For example:
- Windows XP: GUIDs such as 5e6ab780 represent the Internet Toolbar, while 75048700 signifies Active Desktop.
- Windows 7 and higher: CEBFF6CD denotes executable file execution, while F4E57C4B indicates shortcut file execution.

By analyzing these GUIDs, forensic analysts can discern how users interact with applications, whether through direct executions or shortcut activations.
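As a rough sketch of how these artifacts are decoded: UserAssist value names are stored ROT13-encoded, and on Windows 7+ the value data is a binary blob holding the run count and last-run FILETIME. The offsets below (run count at byte 4, FILETIME at byte 60 of a 72-byte record) are assumptions drawn from published community research, so verify them against your own hives rather than treating this as the authoritative layout.

```python
import codecs
import struct
from datetime import datetime, timezone

# Seconds between the FILETIME epoch (1601-01-01) and the Unix epoch (1970-01-01).
EPOCH_DELTA = 11644473600

def parse_userassist(value_name, value_data):
    """Decode one UserAssist registry value (Windows 7+ layout assumed)."""
    # Value names are ROT13-encoded, e.g. "pzq.rkr" -> "cmd.exe".
    name = codecs.decode(value_name, "rot_13")
    run_count = struct.unpack_from("<I", value_data, 4)[0]      # assumed offset
    filetime = struct.unpack_from("<Q", value_data, 60)[0]      # assumed offset
    last_run = None
    if filetime:
        # FILETIME counts 100-nanosecond intervals since 1601; shift to Unix time.
        last_run = datetime.fromtimestamp(filetime / 10**7 - EPOCH_DELTA,
                                          tz=timezone.utc)
    return name, run_count, last_run
```

Tools like Registry Explorer do this parsing for you; the point here is only that the "encryption" is trivial ROT13 and the timestamps are ordinary FILETIMEs.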
2. Shimcache (Application Compatibility Cache) / Amcache Hive

Shimcache Purpose:
• Checks whether an application needs to be "shimmed" (compatibility properties applied) to run on the current OS or under older OS parameters.
• AppCompatCache tracks the executable file's last modification date, file path, and whether it was executed.
• Advanced: Applications are shimmed again (with an additional entry) if the file content is updated or the file is renamed. This is good for proving an application was moved, renamed, or even time-stomped (if the current file's modification time ≠ the ShimCache modification time).

Amcache Purpose:
• Part of the Application Experience Service.
• A newer AppCompat structure, full of additional information.

To understand these in depth, kindly go through my previous blogs linked below:

Blog Headline: Forensic Collection of Execution Evidence through AppCompatCache(Shimcache)/Amcache.hiv
Blog Link: https://www.cyberengage.org/post/forensic-collection-of-execution-evidence-through-appcompatcache-shimcache--amcache-hiv

Blog Headline: Shimcache/Amcache Analysis: Tool-->AppCompactCacheParser.exe/AmcacheParser.exe
Blog Link: https://www.cyberengage.org/post/shimcache-amcache-analysis-tool-appcompactcacheparser-exe-amcacheparser-exe

Blog Headline: Amcache.hiv Analysis: Tool--> Registry explorer
Blog Link: https://www.cyberengage.org/post/amcache-hiv-analysis-tool-registry-explorer

3. BAM/DAM

BAM and DAM record information about executed programs, including the path of the executable and the date/time of the last execution. The DAM is specifically found on systems with connected standby, a feature that allows Windows to remain powered on while the screen is turned off, similar to the standby mode on smartphones. The DAM helps manage desktop application access to extend battery life while ensuring that system processes can still function effectively. The BAM, on the other hand, is associated with a kernel-mode driver service introduced in Windows 10 version 1709.
While there is limited official information available about the BAM, forensic analysts have observed similarities between the information recorded in the BAM and DAM keys. Within these registry keys, you can find entries corresponding to various programs. Each entry contains details such as the full path of the executable and the timestamp of the last execution.

System Hive (BAM/DAM):
SYSTEM\CurrentControlSet\Services\bam\UserSettings\{SID}
SYSTEM\CurrentControlSet\Services\dam\UserSettings\{SID}

Akash Patel
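For reference, decoding a BAM/DAM entry is straightforward once exported: the value name is the executable path, and (on the Windows 10 systems analysts have examined) the value data begins with an 8-byte little-endian FILETIME recording the last execution. That layout is an assumption based on community research, consistent with the limited official documentation noted above, so confirm it on your own evidence.

```python
import struct
from datetime import datetime, timezone

# Seconds between the FILETIME epoch (1601-01-01) and the Unix epoch (1970-01-01).
EPOCH_DELTA = 11644473600

def bam_last_execution(value_data):
    """Decode the last-execution timestamp from a BAM/DAM registry value.

    Assumes the first 8 bytes of the value data are a little-endian
    64-bit Windows FILETIME (observed layout, not officially documented).
    """
    filetime = struct.unpack_from("<Q", value_data, 0)[0]
    return datetime.fromtimestamp(filetime / 10**7 - EPOCH_DELTA, tz=timezone.utc)
```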
- Part 4 - Important Registries related to System configuration overview
9. System Boot Autostart Programs:

NTUSER.DAT:
NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\Run
NTUSER.DAT\Software\Microsoft\Windows\CurrentVersion\RunOnce

Software Hive:
Software\Microsoft\Windows\CurrentVersion\RunOnce
Software\Microsoft\Windows\CurrentVersion\policies\Explorer\Run
Software\Microsoft\Windows\CurrentVersion\Run

System Hive:
SYSTEM\CurrentControlSet\Services

Service Start values:
0x0 (Hexadecimal) or 0 (Decimal): Boot start - the service starts during the system boot process.
0x1 (Hexadecimal) or 1 (Decimal): System start - the service starts during system initialization.
0x2 (Hexadecimal) or 2 (Decimal): Automatic start - the service starts automatically when the system starts.
0x3 (Hexadecimal) or 3 (Decimal): Manual start - the service must be started manually by the user or another program.
0x4 (Hexadecimal) or 4 (Decimal): Disabled - the service is disabled and cannot be started.

Key usefulness:
- Determine programs that will start automatically.
- Useful for finding malware that installs on boot, such as a rootkit.
- Look at when the key was last updated; generally this will be the last boot time of the system.

10. Shutdown Information:

Discover when the system was last shut down and how many times it was shut down successfully.

System Hive:
SYSTEM\CurrentControlSet\Control\Windows (Shutdown Time)
SYSTEM\CurrentControlSet\Control\Watchdog\Display (Shutdown Count)

CMD: reg query HKLM\SYSTEM\CurrentControlSet\Control\Windows

Notice the shutdown time is stored in hex, as a Windows 64-bit timestamp. Luckily, we can paste the value into Decode Date and press decode, and it will tell us the date stored at that location.

Akash Patel
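If you prefer not to reach for a GUI decoder, the hex blob that reg query prints for ShutdownTime is just an 8-byte little-endian Windows FILETIME, and a few lines of Python convert it. A minimal sketch (assuming the value has already been copied out as a hex string with the byte order reg query displays):

```python
import struct
from datetime import datetime, timezone

def decode_shutdown_time(reg_binary_hex):
    """Convert the ShutdownTime REG_BINARY hex string to a UTC datetime.

    The 8 bytes are a little-endian 64-bit FILETIME: 100-nanosecond
    intervals since 1601-01-01, so we rescale and shift to the Unix epoch.
    """
    raw = bytes.fromhex(reg_binary_hex)
    filetime = struct.unpack("<Q", raw)[0]
    return datetime.fromtimestamp(filetime / 10**7 - 11644473600, tz=timezone.utc)
```

The same conversion applies to any Windows 64-bit timestamp you encounter in the registry, which is why tools like Decode Date work across so many artifacts.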







