Active Directory Visualization for Blue Teams and Threat Hunters

As a network defender, it can be easy to attribute a certain degree of omnipotence to attackers. Advanced threats have an uncanny knack for figuring out how to move through an environment without regard for passwords, roles, permissions, or what “should” be possible. For example, here’s a relatively simple attack path that is present in many environments unless defenders have intentionally worked to mitigate it:

“First the attackers phished the summer intern’s account. They couldn’t access any other systems with this account, so they cracked the credentials for a privileged service account. Using that account, they moved laterally to the CEO’s laptop and stole the credentials needed to access her email. They then found a logged-on domain administrator and impersonated the account to dump the hashed credentials for everyone in the company.”

How could defenders anticipate that? How did the attacker manage to find the right accounts and systems?

The answer, in part, is the attacker’s ability to think of the environment as a graph. As Microsoft’s John Lambert pointed out, “Defenders think in lists. Attackers think in graphs. As long as this is true, attackers win.” In this post, we show defenders how to view their environment through the lens of an attacker. Specifically, we’ll use an Active Directory (AD) visualization tool called BloodHound to discover and mitigate dangerous attack paths before an attacker can leverage them. We’ll also look at what historical evidence may be available to determine whether a path has already been exploited.

The sections that follow describe three techniques attackers use to escalate privileges and move laterally (token stealing, Kerberoasting, and DCSync), what each looks like as a graph, and how defenders can mitigate the associated risks.

This post was written by Andrew Cook, Incident Response Practice Director at Praetorian, in collaboration with Staff Engineer Josh Abraham, who brings the attacker’s mindset.

Technique #1: Stealing Tokens from Logged-On Sessions

Querying the domain controller with BloodHound’s collector and then filtering the resulting graph for “Find Shortest Path to Domain Admin” produces output that might look something like this:

[Figure: BloodHound graph showing the shortest path from the compromised user John, through Win10-3 and Win10-5, to the Domain Admins group]

In this example, the attacker wants to move from a compromised unprivileged user (John) to the Domain Admins group. The graph shows one possible path that leverages token stealing or credential theft. Because John is a local administrator (“AdminTo”) on the host Win10-3, the attacker is able to use token stealing to impersonate Bob through Bob’s logged-on session (“HasSession”) on Win10-3. Alternatively, the attacker can use Mimikatz to search the system’s memory for Bob’s cleartext password. From there, the attacker can log on to Win10-5 as Bob and use the same technique to impersonate a member of the Domain Admins group. With this Domain Admin account, the attacker has achieved an important objective and is free to move through the rest of the domain.
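You don’t have to click through the BloodHound interface to ask this question; the same data can be queried directly from the Neo4j database BloodHound populates. Below is a minimal sketch, assuming the default Neo4j bolt endpoint on localhost and placeholder credentials and node names (JOHN@WINDOMAIN.LOCAL and DOMAIN ADMINS@WINDOMAIN.LOCAL stand in for your own environment):

# Sketch: ask BloodHound's Neo4j database for the shortest path from a
# compromised user to the Domain Admins group. Connection details and
# node names are placeholders for your own deployment.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # default Neo4j bolt endpoint
AUTH = ("neo4j", "bloodhound")  # placeholder credentials

QUERY = """
MATCH (u:User {name: $user}), (g:Group {name: $group}),
      p = shortestPath((u)-[*1..]->(g))
RETURN p
"""

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        result = session.run(QUERY,
                             user="JOHN@WINDOMAIN.LOCAL",
                             group="DOMAIN ADMINS@WINDOMAIN.LOCAL")
        for record in result:
            path = record["p"]
            # Print each hop as "NODE -[EDGE]-> NODE" so the path reads like the graph
            for rel in path.relationships:
                print(f"{rel.start_node['name']} -[{rel.type}]-> {rel.end_node['name']}")

Each hop it prints (AdminTo, HasSession, MemberOf, and so on) corresponds to an edge you would see in the graph above.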

What can defenders gain from visualizing this graph of their AD environment? They can start to understand the trust relationships and escalation paths available to an attacker, which dispels the illusion that user authentication alone keeps the domain secure. Demonstrating the ability to move from unprivileged to privileged accounts is powerful ammunition for implementing the best practices that cut off this attack path: disabling account delegation for privileged accounts, implementing tiered administration, and minimizing local administrators.

First, enable “Account is sensitive and cannot be delegated” on your Domain Admins and other privileged accounts. This option would have prevented the attacker from using the unprivileged Bob account to impersonate the privileged account. While it may break access to certain multi-tier business applications, privileged accounts shouldn’t need that access anyway.
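Auditing coverage of that setting is straightforward with an LDAP query: the checkbox corresponds to the NOT_DELEGATED bit (0x100000) of userAccountControl. Here’s a minimal sketch using the ldap3 Python library; the server name, credentials, and distinguished names are placeholders for your own domain:

# Sketch: find Domain Admins members that do NOT have
# "Account is sensitive and cannot be delegated" set.
# Server, credentials, and DNs below are placeholders.
from ldap3 import Server, Connection, NTLM

NOT_DELEGATED = 0x100000  # userAccountControl bit set by the checkbox
DA_DN = "CN=Domain Admins,CN=Users,DC=windomain,DC=local"

# The 1.2.840.113556.1.4.803 matching rule tests the bit server-side
search_filter = (
    f"(&(objectCategory=person)(objectClass=user)"
    f"(memberOf={DA_DN})"
    f"(!(userAccountControl:1.2.840.113556.1.4.803:={NOT_DELEGATED})))"
)

server = Server("dc01.windomain.local")
conn = Connection(server, user="WINDOMAIN\\auditor", password="***",
                  authentication=NTLM, auto_bind=True)
conn.search("DC=windomain,DC=local", search_filter,
            attributes=["sAMAccountName"])
for entry in conn.entries:
    print("Missing NOT_DELEGATED flag:", entry.sAMAccountName)

Note that a plain memberOf filter only catches direct group membership; to walk nested groups, swap in the LDAP_MATCHING_RULE_IN_CHAIN OID (1.2.840.113556.1.4.1941) on the memberOf clause.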

Next, implement tiered administration. Under Microsoft’s tiered administration model, the highly privileged Domain Administrator should never have logged in to the user workstation in the first place. Doing so violates the “clean source principle”: every system an account logs in to becomes a security dependency of that account, so a privileged account should only touch systems at least as trustworthy as itself. In practice, lower-privileged “Tier 2” administrator accounts manage lower-trust user systems, while Tier 0 administrators, like our Domain Administrator, never log in to them.

Finally, minimize the number of users who are local administrators of their own systems. This is hard for some organizations, but it mitigates many forms of lateral movement and sharply constrains the attacker’s options. In this case, it would have prevented the attacker from stealing any tokens on Win10-3 or moving laterally to Win10-5.
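BloodHound data is also a quick way to measure how much cleanup lies ahead: rank users by how many machines they administer and work down the list. A short sketch against the same Neo4j database, again with placeholder connection details (and note it only counts explicit AdminTo edges, not rights granted through group membership):

# Sketch: rank users by how many computers they are local admin on,
# using BloodHound's AdminTo edges. Connection details are placeholders.
from neo4j import GraphDatabase

QUERY = """
MATCH (u:User)-[:AdminTo]->(c:Computer)
RETURN u.name AS user, count(c) AS computers
ORDER BY computers DESC LIMIT 20
"""

with GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "bloodhound")) as driver:
    with driver.session() as session:
        for record in session.run(QUERY):
            print(f"{record['user']}: local admin on {record['computers']} computers")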

Technique #2: SPN Account Abuse (Kerberoast)

As we discussed in a previous post, Kerberoast is an offensive technique that takes advantage of Service Principal Names (SPNs). Any domain user can request a service ticket for any account with a registered SPN, and part of that ticket is encrypted with a key derived from the account’s password, which the attacker can take offline and crack. Generally, but not always, SPNs are registered to service accounts that aren’t intended to be used interactively by any one individual.

Attackers love Kerberoasting because it represents an opportunity to jump far across the graph. The example below shows an account named Backup that is a member of the Domain Admins group. This account also happens to have an SPN registered, so any domain user can request a ticket tied to this account’s password, attempt to crack it offline, and take control of a privileged account.

[Figure: BloodHound graph showing the Backup account in WINDOMAIN.LOCAL, a member of the Domain Admins group with a registered SPN]
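BloodHound surfaces these accounts, but you can also enumerate them straight from AD: any user object with a populated servicePrincipalName attribute is a Kerberoasting candidate. A minimal sketch with ldap3, again using a placeholder server, credentials, and base DN:

# Sketch: list user accounts with a registered SPN (Kerberoasting candidates).
# Server, credentials, and base DN are placeholders.
from ldap3 import Server, Connection, NTLM, SUBTREE

server = Server("dc01.windomain.local")
conn = Connection(server, user="WINDOMAIN\\auditor", password="***",
                  authentication=NTLM, auto_bind=True)

# samAccountType=805306368 restricts results to normal user accounts,
# excluding computer accounts (which always have SPNs).
conn.search("DC=windomain,DC=local",
            "(&(samAccountType=805306368)(servicePrincipalName=*))",
            search_scope=SUBTREE,
            attributes=["sAMAccountName", "servicePrincipalName", "memberOf"])

for entry in conn.entries:
    print(entry.sAMAccountName, entry.servicePrincipalName.values)

Any result that is also a member of a privileged group, like the Backup account above, deserves immediate attention.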

Kerberoasting tends to be a blind spot for defenders who don’t understand SPNs, especially in larger organizations where service accounts may be managed by another team. Using graph visualizations defensively is an opportunity to identify these high-risk accounts and, by removing the SPN, eliminate the chance to exploit them. But what if removing the SPN isn’t an option?

First, ensure any account with a registered SPN has a strong password of at least 25 characters that is rotated periodically. In some cases that might not be simple because the account is controlled by someone else. In those cases, try cracking the credentials yourself. Highlighting the vulnerability is the first step in getting it fixed, but it also gives you a specific threat hunting question: has anyone else tried to crack this weak password?

Defenders can detect historical evidence of Kerberoasting in the Windows events it generates on the domain controller. Specifically, Kerberoasting produces Event ID 4769 (“A Kerberos service ticket was requested”) with the weaker RC4 encryption type (“Ticket Encryption Type = 0x17”). Combine that with the risky account (“Backup”) we identified in the graph to narrow the search.

Here’s what that query looks like in Splunk:

sourcetype=WinEventLog:Security eventid=4769 username!=*$ ServiceName="Backup" Ticket_Encryption_Type=0x17

Legitimate requests are generally distinguishable from non-legitimate ones based on frequency. Legitimate requests tend to involve a related group of users routinely requesting downgraded tickets over a long period of time, likely as part of their business function or automation. A tell-tale sign of Kerberoasting is a burst of requests in a short period of time from a single user who has nothing to do with the service being requested. In these cases, investigate the possibility that the requesting user may have been compromised.
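That frequency heuristic is easy to prototype once the events are exported. The sketch below assumes a CSV export of the RC4 4769 events with hypothetical column names (time, username, service); the one-hour window and the distinct-service threshold are arbitrary starting points to tune against your own baseline, not a finished detection:

# Sketch: flag users who request RC4 service tickets for several distinct
# services in a short window -- a rough Kerberoasting heuristic.
# The CSV file name, column names, and thresholds are placeholders.
import csv
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(hours=1)
DISTINCT_SERVICE_THRESHOLD = 5

requests = defaultdict(list)  # username -> [(timestamp, service), ...]
with open("4769_rc4_events.csv", newline="") as f:
    for row in csv.DictReader(f):
        ts = datetime.fromisoformat(row["time"])
        requests[row["username"]].append((ts, row["service"]))

for user, events in requests.items():
    events.sort()
    # Slide a one-hour window over each user's requests
    for i, (start, _) in enumerate(events):
        services = {svc for ts, svc in events[i:] if ts - start <= WINDOW}
        if len(services) >= DISTINCT_SERVICE_THRESHOLD:
            print(f"Possible Kerberoasting: {user} requested {len(services)} "
                  f"distinct services within an hour of {start}")
            break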

Technique #3: DCSync

The last offensive technique we’ll look at is DCSync. It abuses the mechanism domain controllers use to legitimately replicate domain objects between themselves. With the right permissions, an attacker can impersonate a domain controller and request the hashed credentials of any user in the domain. It’s also a stealthy option: the attacker doesn’t need to run any malicious code on the domain controller and can selectively target the credentials of specific accounts.

What does DCSync look like in our AD graph? Take a look:

[Figure: BloodHound graph of the WINDOMAIN.LOCAL domain showing accounts with the GetChangesAll edge to the domain]

Performing a DCSync attack requires access to a principal with the “Replicating Directory Changes All” permission. In our graph, this shows up as an edge labeled “GetChangesAll.” In the bottom right of the example above, we see three non-administrator accounts that have this permission: DCSYNC_USER, TEST2, and TEST3. When our red teams see this, they see targets and will pivot to find a path to those accounts.

How does a defender benefit from seeing this graph of their environment? First, these risky accounts tend to fly under the radar, especially in complex organizations, so gather the list of accounts that can perform a DCSync. Next, eliminate the “Replicating Directory Changes All” permission wherever possible. Finally, look back through your domain controller logs to understand when these accounts may have performed a replication action.
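Pulling that list out of BloodHound’s database is a single query on the GetChangesAll edge the graph already showed us. A minimal sketch, with placeholder connection details and domain name:

# Sketch: list principals with the GetChangesAll edge to the domain,
# i.e. candidates for performing a DCSync. Connection details and the
# domain name are placeholders.
from neo4j import GraphDatabase

QUERY = """
MATCH (p)-[:GetChangesAll]->(d:Domain {name: $domain})
RETURN p.name AS principal, labels(p) AS kind
"""

with GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "bloodhound")) as driver:
    with driver.session() as session:
        for record in session.run(QUERY, domain="WINDOMAIN.LOCAL"):
            print(record["principal"], record["kind"])

Expect to see domain controller computer accounts and a handful of infrastructure accounts (an Azure AD Connect synchronization account is a common legitimate example); anything else deserves scrutiny. With that list in hand, the next step is to check the domain controller logs for replication activity by those accounts.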

Here’s what the query looks like in Splunk:

sourcetype=WinEventLog:Security eventid=4662 ("Replicating Directory Changes All" OR "1131f6ad-9c07-11d1-f79f-00c04fc2dcd2" OR "9923a32a-3607-11d2-b9be-0000f87a36b2" OR "1131f6ac-9c07-11d1-f79f-00c04fc2dcd2") (Account_Name="DCSYNC_USER" OR Account_Name="TEST2" OR Account_Name="TEST3")

It may still take some digging to understand the legitimate replication events in your environment, but once that baseline is established this becomes a good alert to monitor for.

Conclusion

Defenders can never hope to get ahead of attackers if attackers have a better understanding of the battlefield. By leveraging AD visualization tools like BloodHound, defenders can start to see their environment as attackers do. Once you see what they see, it becomes much easier to anticipate their attack paths and implement the appropriate countermeasures and controls.


About the Authors

Andrew Cook

Andrew leads Praetorian's IR & threat hunting service. His passion is helping clients prepare for, identify, & recover from malicious cyber activity.
