
Tracking Lateral Movement — Named Pipes, Scheduler, Services, Registry, and DCOM (Event IDs)


Hey — today we’re unpacking lateral movement.

Think of it like this: an attacker already got a foothold in your network and now wants to move sideways to more valuable systems.

In this article I’ll show you the common ways they do that, which Windows logs to watch, and practical detection steps you can take right now.


-------------------------------------------------------------------------------------------------------------

Why this matters

Once an attacker can move laterally, they can reach domain controllers, file servers, backup systems, or any asset you care about. Detecting lateral movement early can stop a breach from becoming a full-blown incident.


Start point: the logon event that often tells the story — Event ID 4624 (Logon Type 3)

When someone authenticates remotely to a Windows host (SMB, named pipes, remote service calls, PsExec, scheduled tasks, etc.), Windows commonly records Event ID 4624 with Logon Type = 3 (Network). That single event is often the first hint something happened on the target system.


What to watch for:
  • A strange source computer or IP doing a network logon to a host that normally doesn’t see it.

  • An account authenticating from an unexpected system (service accounts from a workstation, users from servers).

  • Unusual time-of-day for an account or a burst of network logons from the same source.


Quick detective mindset:
  1. Find 4624 / LogonType 3 on the target machine.

  2. Note the Account Name, Logon ID and Caller Computer / Source IP.

  3. Correlate with surrounding events: process creation, PowerShell logs, Sysmon network events, scheduled task events.

Don’t assume a 4624 = malicious. Many normal operations use this type of logon. Context is everything.
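
If you want to pull those fields quickly, here’s a minimal PowerShell sketch using Get-WinEvent — nothing vendor-specific, just the standard 4624 event data fields (TargetUserName, WorkstationName, IpAddress); the 24-hour window is an arbitrary example.

# Minimal sketch: recent 4624 Logon Type 3 events with the who/where fields
# pulled from the event XML. Run elevated on the target host; adjust the window.
$since = (Get-Date).AddDays(-1)
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4624; StartTime = $since } |
    ForEach-Object {
        $x = [xml]$_.ToXml()
        $d = @{}; $x.Event.EventData.Data | ForEach-Object { $d[$_.Name] = $_.'#text' }
        if ($d['LogonType'] -eq '3') {
            [pscustomobject]@{
                Time        = $_.TimeCreated
                Account     = $d['TargetUserName']
                LogonId     = $d['TargetLogonId']
                Workstation = $d['WorkstationName']
                SourceIp    = $d['IpAddress']
            }
        }
    } | Sort-Object Time

From there, the Logon ID is your pivot — follow it through the Security log to see what else that session touched.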

-------------------------------------------------------------------------------------------------------------

Common techniques that produce 4624 Type 3 (and what extra artifacts they leave)

Here are typical ways attackers use network logons and what you can look for around them.

  • Network share access (SMB, port 445)

    • Look for Event IDs related to file share access (if auditing enabled). Attackers often mount shares to copy tools or exfiltrate files.

  • Named pipes / RPC (port 135, 445)

    • Correlate network logs showing RPC or SMB with suspicious services/processes.

  • Remote scheduled tasks / Task Scheduler

    • Scheduled task creation events, task run events, or suspicious schtasks command lines in process creation logs.

  • Remote service execution (PsExec, sc.exe)

    • Process creation for psexec.exe, sc.exe remote service installs, or any service creation events. Check for Service Control Manager logs.

  • PowerShell remoting, WinRM (port 5985/5986)

    • PowerShell logs, WinRM session events, or Event ID 4648 (explicit credentials) near a 4624.

  • WMI remote execution (wmic /node)

    • WMI operation events, suspicious wmic command lines in process creation logs.



Analyze network connections and system activity — where to get signal

Successful TCP/UDP connections by themselves are usually not logged — they’re too noisy — but these sources can give you the visibility you need:

  • Sysmon (Event ID 3) — if you run Sysmon and enable network connection logging, you get process-to-remote-host mapping (gold).

  • Host-based firewall logs — Windows Filtering Platform and firewall logs can show successful connections (if enabled).

  • Security log: Event ID 5156 — Windows Filtering Platform allowed a connection.

  • EDR/XDR — many EDRs provide both process telemetry and network context; use process launch + network data together.


How to use them together:
  1. Start from the 4624 Type 3 event.

  2. Look at Sysmon or firewall logs to see which remote port and process were involved.

  3. Check process creation logs, PowerShell logs, and the Security log around the same timestamp for suspicious activity (scripts, encoded commands, task creation).
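
Here’s a rough sketch of steps 1–2, assuming Sysmon is installed with network connection logging enabled; the pivot timestamp is a made-up example — replace it with the time of the 4624 you’re chasing.

# Sketch: for a 4624 of interest, pull Sysmon network events (Event 3) within
# +/- 2 minutes to see which process owned the connection.
$pivot = Get-Date '2024-01-15 10:32:00'   # hypothetical time taken from a 4624 Type 3
$win   = New-TimeSpan -Minutes 2
Get-WinEvent -FilterHashtable @{
    LogName   = 'Microsoft-Windows-Sysmon/Operational'
    Id        = 3
    StartTime = $pivot - $win
    EndTime   = $pivot + $win
} | ForEach-Object {
    $x = [xml]$_.ToXml()
    $d = @{}; $x.Event.EventData.Data | ForEach-Object { $d[$_.Name] = $_.'#text' }
    [pscustomobject]@{
        Time    = $_.TimeCreated
        Image   = $d['Image']
        User    = $d['User']
        SrcIp   = $d['SourceIp']
        DstIp   = $d['DestinationIp']
        DstPort = $d['DestinationPort']
    }
}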



-------------------------------------------------------------------------------------------------------------

File shares — a favorite lateral movement highway

Mounting and using network shares is one of the easiest ways to move laterally or stage data.

Windows events for share auditing:

  • 5140 — network share was accessed (gives share name, server path, source IP, account). This event is created when the session is established — not for every file access.

  • 5145 — detailed file share auditing (records access to specific files/folders) — very informative but noisy. Use this only for very sensitive shares or short bursts of investigation.

  • 5142–5144 — share created/modified/deleted.


Important note about 5140: The Accesses field will often say ReadData (or ListDirectory) even if later the user wrote or deleted files. 5140 records the initial access granted when the share session started.


Practical steps:
  • If you can, enable strategic share auditing on sensitive file servers (5140 + selective 5145).

  • When you see 5140 for a suspicious source, follow the account’s Logon ID across the security log to find what else it touched.

  • Look for sudden creation of new shares (5142) or the use of admin shares (like C$) from non-admin hosts.
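
A minimal sketch of the first two steps, assuming share auditing is (or will be) enabled on the file server; the admin-share filter at the end is just one example of something worth flagging.

# Enable share auditing once on the server (elevated):
#   auditpol /set /subcategory:"File Share" /success:enable
# Then hunt 5140 events and flag administrative shares reached over the network.
Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 5140 } -MaxEvents 500 |
    ForEach-Object {
        $x = [xml]$_.ToXml()
        $d = @{}; $x.Event.EventData.Data | ForEach-Object { $d[$_.Name] = $_.'#text' }
        [pscustomobject]@{
            Time     = $_.TimeCreated
            Account  = $d['SubjectUserName']
            SourceIp = $d['IpAddress']
            Share    = $d['ShareName']
        }
    } | Where-Object { $_.Share -match '\\\\\*\\(C\$|ADMIN\$|IPC\$)' }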



-------------------------------------------------------------------------------------------------------------

Detection tips you can use today

  • Baseline normal: map which systems normally connect to your servers and from what accounts. Anything outside that baseline is suspicious.

  • Flag 4624 Type 3 where the Caller Computer is not in your baseline or the account is unexpected for that host.

  • Look for 4624 Type 3 followed quickly by process creation events launching admin tools (PsExec, wmiprvse making child processes, schtasks, sc.exe, net.exe).

  • Monitor for Service creation, scheduled task creation, or new remote services started from non-admin systems.

  • If you have Sysmon: monitor Image that opens network connections (Event 3) — e.g., cmd.exe, powershell.exe, wmic.exe, psexec.exe.

  • On file servers: enable Object Access > Audit File Share (ID 5140) for strategic monitoring; enable detailed file share (ID 5145) only for critical folders.
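
To turn the “4624 followed by admin tools” idea into something runnable, here’s a sketch that assumes process creation auditing (4688) with command-line logging is enabled; the pivot time and tool list are illustrative, and ParentProcessName is only populated on newer Windows builds.

# Sketch: process creations of common admin tools in the five minutes after a
# given network logon time.
$logonTime = Get-Date '2024-01-15 10:32:00'   # hypothetical pivot from a 4624 Type 3
$tools = 'psexec', 'sc.exe', 'schtasks', 'wmic', 'net.exe', 'powershell'
Get-WinEvent -FilterHashtable @{
    LogName = 'Security'; Id = 4688
    StartTime = $logonTime; EndTime = $logonTime.AddMinutes(5)
} | ForEach-Object {
    $x = [xml]$_.ToXml()
    $d = @{}; $x.Event.EventData.Data | ForEach-Object { $d[$_.Name] = $_.'#text' }
    foreach ($t in $tools) {
        if ($d['NewProcessName'] -like "*$t*") {
            [pscustomobject]@{
                Time    = $_.TimeCreated
                Process = $d['NewProcessName']
                Parent  = $d['ParentProcessName']
                CmdLine = $d['CommandLine']
            }
            break
        }
    }
}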



Common pitfalls — what to avoid when investigating

  • Assuming every 4624 Type 3 is bad — many business processes use network logons. Use context (account role, time, source host).

  • Relying on a single log source — attackers leave breadcrumbs in many places; correlate across logs.

  • Enabling noisy auditing wide-open — 5145 and verbose Sysmon network logging can generate mountains of data. Apply selectively or with filters.



-------------------------------------------------------------------------------------------------------------

Named pipes — what they are and why attackers love them

Think of a named pipe like a little memory-backed mailbox processes use to chat with each other. Pipes can be local (two processes on the same machine) or remote (a process on another machine reads/writes the pipe over SMB).

Windows exposes remote pipes through the special IPC$ share — so when you see IPC$ traffic, named pipes might be involved.


Why this matters to defenders:

  • Attackers use named pipes to hide communications inside normal SMB traffic (TCP 445). Instead of opening a weird port, they piggyback on SMB — which often looks routine in network telemetry.



Windows telemetry you can use

Named pipes are noisy in normal operation, so context is key. Useful events and sources:

  • Sysmon Event ID 17 — a named pipe was created by a process (gives you the process that created it).

  • Sysmon Event ID 18 — a named pipe connection occurred (shows which pipe was accessed and when).

  • Security / System logs — IPC$ access will show up as a network share access (similar to file share events).



How to use them together:
  1. Start with suspicious IPC$/SMB activity (or a 4624 network logon tied to a workstation).

  2. Look at Sysmon Event 18 around the same time to get the pipe name.

  3. Use Sysmon Event 17 to find which process created that pipe on the target host.

  4. Correlate with process creation, command lines, or network activity to decide whether it’s malicious.
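
A small sketch of steps 2–3, assuming Sysmon pipe events are enabled; the known-good pipe list is purely illustrative — build yours from your own baseline.

# Sketch: list pipe created (17) and pipe connected (18) events, flagging
# anything not on a small known-good list.
$knownGood = 'lsass', 'ntsvcs', 'scerpc', 'srvsvc', 'wkssvc'   # illustrative only
Get-WinEvent -FilterHashtable @{
    LogName = 'Microsoft-Windows-Sysmon/Operational'; Id = 17, 18
} -MaxEvents 1000 | ForEach-Object {
    $x = [xml]$_.ToXml()
    $d = @{}; $x.Event.EventData.Data | ForEach-Object { $d[$_.Name] = $_.'#text' }
    [pscustomobject]@{
        Time     = $_.TimeCreated
        EventId  = $_.Id
        PipeName = $d['PipeName']
        Image    = $d['Image']
    }
} | Where-Object {
    $pipe = ([string]$_.PipeName).TrimStart('\')
    $knownGood -notcontains $pipe
}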



-------------------------------------------------------------------------------------------------------------

Scheduled tasks — silent persistence and remote execution

Scheduled tasks are a favorite for attackers because they provide quiet persistence and can be created remotely — which often generates a network logon on the target (so the activity still shows up as the 4624 Type 3 behavior we talked about).


But Windows gives you great signals about tasks — if you enable the logging.


Key logs to turn on:
  • Task Scheduler / Operational (Microsoft-Windows-TaskScheduler/Operational) — excellent for creation and execution entries. Events here persist longer and are easier to hunt through than Security log entries.

  • Security log scheduled task events (when object auditing is enabled) — these include:

    • 4698 — scheduled task created

    • 4699 — scheduled task deleted

    • 4700 / 4701 — task enabled/disabled

    • 4702 — task updated


Task Scheduler operational events you’ll see:
  • 106 — task created (shows registering user and task name)

  • 200 / 201 — task executed / completed (these often contain the actual command path that ran)


Why remote task creation is important:
  • Tasks created remotely will usually be accompanied by a 4624 Type 3 logon on the host around the same timestamp. That pairing is a very useful signal to automate hunting on.
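
Here’s a rough sketch of that pairing hunt — query 4698 task creations (assuming the relevant object auditing is enabled), then check for a 4624 Type 3 in the two minutes before each one. The window and event caps are arbitrary; a SIEM join does this far more efficiently.

# Sketch: pair scheduled-task-created (4698) with a preceding network logon.
$created = Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4698 } -MaxEvents 200 -ErrorAction SilentlyContinue
$logons  = Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4624 } -MaxEvents 2000 -ErrorAction SilentlyContinue |
    Where-Object {
        ([xml]$_.ToXml()).Event.EventData.Data |
            Where-Object { $_.Name -eq 'LogonType' -and $_.'#text' -eq '3' }
    }
foreach ($t in $created) {
    $near = $logons | Where-Object {
        $_.TimeCreated -ge $t.TimeCreated.AddMinutes(-2) -and $_.TimeCreated -le $t.TimeCreated
    }
    if ($near) {
        [pscustomobject]@{
            TaskCreated   = $t.TimeCreated
            TaskEventText = ($t.Message -split "`n")[0]
            PairedLogonAt = ($near | Select-Object -First 1).TimeCreated
        }
    }
}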



-------------------------------------------------------------------------------------------------------------

When analysts think about lateral movement, scheduled tasks and malicious services often go hand-in-hand. Attackers use them to execute commands remotely, maintain persistence, and bypass login-based detection. Luckily, Windows leaves behind rich forensic artifacts — if you know where to look.

Task Scheduler v1.2 — Modern Task Artifacts (Vista and Later)

Starting with Windows Vista and Server 2008, Microsoft introduced a new scheduled task format (v1.2). The new tasks are XML-based, human-readable, and stored without any file extension.


Where to Find Them

  • C:\Windows\System32\Tasks — standard location for 64-bit task files

  • C:\Windows\SysWOW64\Tasks — rare; tasks created by 32-bit code (worth checking for anomalies)

Each file’s name matches the task name, and its contents describe who created it, what it runs, and under which account.


What Information You Get (from the XML)

Inside each XML file, key elements reveal attacker actions and context:

  • <RegistrationInfo> — shows the date/time and account that registered the task

  • <Author> — includes the hostname and username that created the task — crucial for spotting remote scheduling

  • <Triggers> — defines when and how often the task runs (e.g., once, hourly, at logon)

  • <Actions> — contains the command path or script executed

  • <Principals> — identifies the user account used to run the command



Why This Matters in Lateral Movement

Remote scheduled tasks are a common tactic. While event logs (like 4698 or 106) don’t clearly state whether a task was scheduled remotely, the Author tag in the XML file does. If you see another host’s name or a domain account in <Author>, it’s almost certainly remote.


Even if the attacker deletes the task, the XML file may remain or be recovered forensically from disk or Volume Shadow Copies. Better yet — if the same malicious task name appears across several systems, it can map attacker propagation across your network.
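
If you want to sweep those XML files yourself, here’s a minimal sketch that pulls the author, registration date, command, and run-as account from each task file (it needs admin rights to read the Tasks folder; tasks with non-Exec actions are simply skipped).

# Sketch: enumerate the modern task store and surface who created each task
# and what it runs. A remote hostname or unexpected account in Author stands out.
Get-ChildItem -Path 'C:\Windows\System32\Tasks' -Recurse -File |
    ForEach-Object {
        try { [xml]$task = Get-Content -Path $_.FullName -Raw } catch { return }
        [pscustomobject]@{
            TaskFile = $_.FullName
            Author   = $task.Task.RegistrationInfo.Author
            Date     = $task.Task.RegistrationInfo.Date
            Command  = $task.Task.Actions.Exec.Command
            RunAs    = $task.Task.Principals.Principal.UserId
        }
    } | Where-Object { $_.Command }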



Task Scheduler v1.0 — Legacy Artifacts (XP / 2003)

Older systems use .job files stored under C:\Windows\Tasks. These binary-format jobs contain:

  • Registration date/time

  • User account

  • Command path

  • Execution timestamp

They’re created by at.exe and schtasks.exe on XP/2003 systems. Even on newer operating systems, .job files may appear for backward compatibility, giving you a second artifact to pivot from if attackers forget to delete both versions.
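
The .job format is binary, so parsing it properly needs dedicated tooling, but even a quick presence-and-timestamp check gives you something to pivot on:

# Sketch: surface any legacy .job artifacts and their timestamps.
Get-ChildItem -Path 'C:\Windows\Tasks' -Filter '*.job' -ErrorAction SilentlyContinue |
    Select-Object Name, CreationTime, LastWriteTime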



-------------------------------------------------------------------------------------------------------------

Windows Services — Another Common Lateral Movement Vector

Just like scheduled tasks, services are used for persistence and remote execution — often seen when attackers deploy PsExec, SCShell, or custom service installers.


Key System Log Event IDs (Service Control Manager)

  • 7034 — service crashed unexpectedly; may reveal instability caused by injected malware

  • 7035 — service sent a Start/Stop control; traces the start/stop command

  • 7036 — service started or stopped; confirms the actual operation

  • 7040 — service start type changed; detects persistence via Boot/Auto-start configuration

  • 7045 — new service installed (Windows 2008 R2 and later); excellent signal for new service-based malware

EID 7045 is particularly powerful — each new service installation generates one. Even transient services (like PsExec) produce 7045 entries, making them easy to track across hosts.


Security Log — Event ID 4697

If “Audit Security System Extension” is enabled, Event ID 4697 will appear in the Security log for new service installations. While it may list SYSTEM as the account, it provides start type and correlates nicely with 7045 events.

Pro tip: Use both 4697 (Security) and 7045 (System) to get the who + when + what of any new service creation.


-------------------------------------------------------------------------------------------------------------


Abusing Windows Services

Services in Windows are like small background workers — they run quietly without user interaction, handling updates, drivers, or system tasks. But attackers love them because services can start code with high privileges and even survive reboots.

So when you see a “new service installed” event on a host you didn’t expect, your alarm bells should go off.

What to Hunt For

The two golden event IDs for service creation are:

  • EID 4697 (Security log) — “A service was installed in the system.”

  • EID 7045 (System log) — “A service was installed.”


Before Windows 10, these were clean, high-signal events. If one popped up, something new was created — and you’d investigate.


But then Microsoft introduced Per-User Services in Windows 10. These are lightweight, user-specific service instances that start when a user logs in — and unfortunately, they flood your logs with hundreds of “new” service events.

So now your logs might look like:

OneSyncSvc_52a78dec
WpnUserService_4g4y
BluetoothUserService_0a3c

Looks legit, right? That’s the problem.

Attackers can easily hide behind that chaos. For example, naming their malicious service something like:

OneSyncSvc_52a78dec

and blending right in.


Smarter Filtering

Don’t just filter out every service with an underscore — that’s dangerous. Instead:

  • Filter by ServiceFileName, not by service name.

  • Create an ignore list for known legitimate binaries, like:

    C:\Windows\System32\svchost.exe -k ASUSSystemAnalysis

  • Focus on EID 7045 (System log) instead of 4697 — it’s less noisy and usually doesn’t log those per-user services.


Bottom line: If you see an unexpected service installed and the binary path points to Temp, AppData, or a random directory — that’s your sign.
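
Putting that filtering advice into a sketch: hunt 7045 by ImagePath rather than service name, ignore a small allowlist (the svchost entry below is only an illustration — build yours from a known-good baseline), and always keep anything pointing at Temp or AppData.

# Sketch: new service installs (7045) filtered by binary path.
$ignore = @(
    'C:\Windows\System32\svchost.exe*'   # illustrative allowlist entry
)
Get-WinEvent -FilterHashtable @{ LogName = 'System'; Id = 7045 } -ErrorAction SilentlyContinue |
    ForEach-Object {
        $x = [xml]$_.ToXml()
        $d = @{}; $x.Event.EventData.Data | ForEach-Object { $d[$_.Name] = $_.'#text' }
        [pscustomobject]@{
            Time        = $_.TimeCreated
            ServiceName = $d['ServiceName']
            ImagePath   = $d['ImagePath']
            Account     = $d['AccountName']
        }
    } |
    Where-Object {
        $path    = $_.ImagePath
        $ignored = [bool]($ignore | Where-Object { $path -like $_ })
        (-not $ignored) -or ($path -match '\\Temp\\|\\AppData\\')
    }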

-------------------------------------------------------------------------------------------------------------


Remote Registry — The Sneaky Lateral Move

Here’s something you’ll see often in real intrusions: attackers using the reg command to make changes on another system’s registry.


Yup, the same reg add command you use locally can modify a remote machine if they have credentials and the Remote Registry service running.

🔍 What You’ll See in Logs

When this command runs, several artifacts light up:

  • Security 4624 (Logon Type 3) — network logon from the attacker system

  • Security 5140 — IPC$ share access (named pipe communication)

  • System 7036 — Remote Registry service start/stop

If you’re auditing file shares, 5140 is pure gold — it confirms the named pipe connection (like \PIPE\winreg). If not, 4624 + 7036 can still tell the story.

Registry Timestamp

The modified registry key also updates its LastWrite timestamp.

Example workflow:

  1. Spot suspicious remote logons (4624 Type 3).

  2. Check the timeframe in Registry Explorer.

  3. Sort by Last Write Time to see what changed.

  4. If a new value appears under Run with a weird executable path — that’s your persistence clue.
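
A quick sketch for step 4 — dump the usual Run/RunOnce values so you can eyeball the executable paths; for the Last Write Time comparison itself, stick with Registry Explorer as described above, since plain PowerShell doesn’t expose key timestamps.

# Sketch: enumerate autorun values in common Run keys.
$runKeys = @(
    'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Run',
    'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\RunOnce',
    'HKCU:\SOFTWARE\Microsoft\Windows\CurrentVersion\Run'
)
foreach ($key in $runKeys) {
    Get-ItemProperty -Path $key -ErrorAction SilentlyContinue | ForEach-Object {
        $_.PSObject.Properties |
            Where-Object { $_.Name -notmatch '^PS(Path|ParentPath|ChildName|Provider|Drive)$' } |
            ForEach-Object { [pscustomobject]@{ Key = $key; Name = $_.Name; Command = $_.Value } }
    }
}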



Analyst Tip

Attackers often choose subtle misspellings for filenames.

So when you see something like:

C:\Windows\System32\svchos1.exe

Ask yourself — since when did Windows start naming files like that?


-------------------------------------------------------------------------------------------------------------


DCOM Abuse — Old Tech, New Tricks

This is one of those “been around forever but still dangerous” technologies. DCOM lets one system create or control a code component on another system over the network.


Attackers use this for lateral movement — without dropping any new binaries.


What to Look For

When a COM object is instantiated remotely:

  • The DcomLaunch service spins up the associated process (e.g., mmc.exe).

  • That process becomes a child of the svchost.exe process that hosts the DcomLaunch service.


So if you’re threat hunting:
  • Look for processes like mmc.exe, excel.exe, or outlook.exe being launched by svchost.exe -k DcomLaunch.

  • Check the logs for EID 4624 (Type 3) — remote network logons.

  • Correlate with EID 4672 — “Special privileges assigned” (this usually means admin rights).

  • Then review process creation logs (Sysmon Event 1 or Security EID 4688) for binaries spawned right after.
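
Here’s a minimal sketch of the first bullet, assuming Sysmon process creation logging (Event 1); the same idea works with Security 4688 if parent process and command line are recorded.

# Sketch: processes whose parent command line points at the DcomLaunch svchost.
Get-WinEvent -FilterHashtable @{ LogName = 'Microsoft-Windows-Sysmon/Operational'; Id = 1 } -MaxEvents 2000 |
    ForEach-Object {
        $x = [xml]$_.ToXml()
        $d = @{}; $x.Event.EventData.Data | ForEach-Object { $d[$_.Name] = $_.'#text' }
        if ($d['ParentCommandLine'] -match 'DcomLaunch') {
            [pscustomobject]@{
                Time    = $_.TimeCreated
                Image   = $d['Image']
                CmdLine = $d['CommandLine']
                User    = $d['User']
            }
        }
    }

Seeing mmc.exe or an Office binary pop out of that list, right after a privileged network logon, is exactly the pattern this section describes.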



Common Noise (and How to Filter It)

Two DCOM-related system events are worth noting:

  • EID 10036 — DCOM error (can reveal attacker testing or failure).

  • EID 10016 — frequent in normal operations; only useful if you filter by user, SID, or time.


Pro tip: most malware devs test DCOM methods by trial and error. Those errors generate 10036s — if you see a cluster of them before a suspicious logon, you’ve probably caught them mid-experiment.

-------------------------------------------------------------------------------------------------------------

🧭 Quick Recap — Detection Summary

  • Service abuse — key events: 4697, 7045 — hunt for unexpected service installs and odd binary paths

  • Remote Registry — key events: 4624, 5140, 7036 — hunt for remote logons paired with Run key changes

  • DCOM abuse — key events: 4624, 4672, 4688, 10036 — hunt for the DcomLaunch svchost spawning child processes and privileged remote logons

-------------------------------------------------------------------------------------------------------------


Why This Matters

These techniques aren’t theoretical — they’re used daily by ransomware operators, red teams, and even internal IT tools. But the difference between legit and malicious is all about context. If you tune your detections around these three — services, registry, and DCOM — you’ll catch the kind of lateral movement that slips past surface-level monitoring.


-------------------------------------------------------------------------------------------------------------

— Dean

We’ll continue this in the next article.

 
 
 
