
Snaffler : A Tool For Pentesters To Help Find Delicious Candy

Snaffler is a tool for pentesters to help find delicious candy needles (creds mostly, but it’s flexible) in a bunch of horrible boring haystacks (a massive Windows/AD environment).

It might also be useful for other people doing other stuff, but it is explicitly NOT meant to be an “audit” tool.

What does it do?

Broadly speaking – it gets a list of Windows computers from Active Directory, then spreads out its snaffly appendages to them all to figure out which ones have file shares, and whether you can read them.

Then YET MORE snaffly appendages enumerate all the files in those shares and use LEARNED ARTIFACTUAL INTELLIGENCE for MACHINES to figure out which ones a grubby little hacker like you might want.

Actually it doesn’t do any ML stuff (yet), because doing that right would require training data, and that would require an enormous amount of time that we don’t have.

What does it look like?

Like this! (mostly, this screenshot is a few versions old now)

How do I use it?

If you “literally just run the EXE on a domain joined machine in the context of a domain user” (as people were instructed to do with Grouper2, immediately before they ran it with all the verbose/debug switches on so it screamed several hundred megabytes of stack traces at them) it will basically do nothing. This is our idea of a prank™ on people who don’t read README files, because we’re monsters.

HOWEVER… if you add the correct incantations, it will enable the aforementioned L.A.I.M. and the file paths where candy may be found will fall out.

The key incantations are:

-o Enables outputting results to a file. You probably want this if you’re not using -s. e.g. -o C:\users\thing\snaffler.log

-s Enables outputting results to stdout as soon as they’re found. You probably want this if you’re not using -o.

-v Controls verbosity level, options are Trace (most verbose), Degub (less verbose, less gubs), Info (less verbose still, default), and Data (results only). e.g. -v debug

-m Enables and assigns an output dir for snaffler to automatically take a copy of (or Snaffle… if you will) any found files that it likes.

-l Maximum size of files (in bytes) to Snaffle. Defaults to 10000000, which is about 10MB.

-i Disables computer and share discovery, requires a path to a directory in which to perform file discovery.

-n Disables computer discovery, takes a comma-separated list of hosts to do share and file discovery on.

-y TSV-formats the output.

-b Skips the LAIM rules that will find less-interesting stuff, tune it with a number between 0 and 3.

-f Limits Snaffler to finding file shares via DFS (Distributed File System) – this should be quite a bit sneakier than the default while still covering the biggest file shares in a lot of orgs.

-a Skips file enumeration, just gives you a list of listable shares on the target hosts.

-u Makes Snaffler pull a list of account names from AD, choose the ones that look most-interesting, and then use them in a search rule.

-d Domain to search for computers to search for shares on to search for files in. Easy.

-c Domain controller to query for the list of domain computers.

-r The maximum size file (in bytes) to search inside for interesting strings. Defaults to 500k.

-j How many bytes of context either side of found strings in files to show, e.g. -j 200

-z Path to a config file that defines all of the above, and much much more! See below for more details. Give it -z generate to generate a sample config file called .\default.toml.
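Putting the key switches together: a plausible first run might stream results to the console, log them to a file, and keep the output to results only. The domain, DC, and output path below are placeholders for illustration, not defaults:

```
snaffler.exe -s -o C:\temp\snaffler.log -v data -d contoso.local -c dc01.contoso.local
```

Here -s and -o together follow the advice above (use at least one of them), -v data trims the console to results only, and -d/-c pin the domain and domain controller rather than relying on autodiscovery.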

What does any of this log output mean?

This log entry should be read roughly from left to right as:

  • at 7:37ish
  • Snaffler found a file it thinks is worth your attention
  • it’s rated it “Red”, the second most-interesting level
  • it matched a rule named “KeepConfigRegexRed”
  • you can read it, but not modify it
  • the exact regex that was matched is that stuff in the red box
  • it’s 208kB
  • it was last modified on January 10th 2020 at quarter-to-four in the afternoon.
  • the file may be found at the path in purple

… and the rest of the line (in grey) is a little snippet of context from the file where the match was.

In this case we’ve found ASP.NET validationKey and decryptionKey values, which might let us RCE the web app via some deserialisation hackery. Hooray!

How does it decide which files are good and which files are boring?

The simple answer:

Each L.A.I.M. magic file finding method does stuff like:

  • Searching by exact file extension match, meaning that any file with an extension that matches the relevant wordlist will be returned. This is meant for file extensions that are almost always going to contain candy, e.g. .kdbx, .vmdk, .ppk, etc.
  • Searching by (case insensitive) exact filename match. This is meant for file names that are almost always going to contain candy, e.g. id_rsa, shadow, NTDS.DIT, etc.
  • Searching by exact file extension match (yet another wordlist) FOLLOWED BY ‘grepping’ the contents of any matching files for certain key words (yet yet another another wordlist). This is meant for file extensions that sometimes contain candy but where you know there’s likely to be a bunch of chaff to sift through. For example, web.config will sometimes contain database credentials, but will also often contain boring IIS config nonsense and no passwords. This will (for example) find anything ending in .config, then will grep through it for strings including but not limited to: connectionString, password, PRIVATE KEY, etc.
  • Searching by partial filename match (oh god more wordlists). This is mostly meant to find Jeff's Password File 2019 (Copy).docx or Privileged Access Management System Design - As-Built.docx or whatever, by matching any file where the name contains the substrings passw, handover, secret, secure, as-built, etc.
  • There’s also a couple of skip-lists to skip all files with certain extensions, or any file with a path containing a given string.
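The matching modes above can be sketched in a few lines of hypothetical, simplified Python. This is NOT Snaffler’s actual implementation (which is C#), and the wordlists are tiny illustrative samples rather than the real defaults:

```python
# Illustrative sketch only -- not Snaffler's code. Wordlists are tiny samples.
EXACT_EXTENSIONS = {".kdbx", ".vmdk", ".ppk"}        # almost always candy
EXACT_NAMES = {"id_rsa", "shadow", "ntds.dit"}       # almost always candy
GREP_EXTENSIONS = {".config", ".xml"}                # candy only if contents match
GREP_WORDS = ["connectionString", "password", "PRIVATE KEY"]
NAME_SUBSTRINGS = ["passw", "handover", "secret", "as-built"]

def classify(filename, contents=""):
    """Return which matching mode (if any) flags this file as interesting."""
    name = filename.lower()
    ext = "." + name.rsplit(".", 1)[-1] if "." in name else ""
    if ext in EXACT_EXTENSIONS:
        return "extension match"
    if name in EXACT_NAMES:
        return "filename match"
    if ext in GREP_EXTENSIONS and any(w in contents for w in GREP_WORDS):
        return "content match"
    if any(s in name for s in NAME_SUBSTRINGS):
        return "partial name match"
    return None
```

Note the ordering: cheap filename checks run before the expensive content “grep”, which is the same reason the real tool only greps inside files whose extensions already look promising.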

The real answer:

Snaffler uses a system of ‘classifiers’, which allow the end-user (you) to define (relatively) simple rules that can be combined and strung together and mangled however you see fit. It comes with a set of default classifiers, which you can see by either looking at the code or by having a look at the config file created by -z generate, so the best place to start with making your own is to edit those.

The defaults don’t have any rules that will look inside Office docs and PDFs, but you can see some examples in SnaffCore/DefaultRules/FileContentRules.cs that have been commented out. Just uncomment those before you compile and edit the regexen to suit your requirements. Be warned, this is a lot slower than looking inside good old fashioned text files, and a typical environment will have an absolute mountain of low-value Office docs and PDFs.
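For illustration only, a content-inspection rule using regex matching might look something like the following. This exact rule is not one of the shipped defaults and the patterns are examples; generate the default config with -z generate to see the real rule shapes:

```
[[Classifiers]]
EnumerationScope = "ContentsEnumeration"   # run against file contents
RuleName = "KeepContentRegexRedExample"    # hypothetical rule name
MatchAction = "Snaffle"
MatchLocation = "FileContentAsString"
WordListType = "Regex"                     # regex matching instead of Contains
WordList = ["aws_secret_access_key\\s*=", "-----BEGIN [A-Z ]*PRIVATE KEY-----"]
Triage = "Red"
```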

Here’s some annotated examples that will hopefully help to explain things better. If this seems very hard, you can just use our rules and they’ll probably find you some good stuff.

This is an example of a rule that will make Snaffler ignore all files and subdirectories below a dir with a certain name.

[[Classifiers]]
EnumerationScope = "DirectoryEnumeration" # This defines which phase of the discovery process we’re going to apply the rule.
# In this case, we’re looking at directories.
# Valid values include ShareEnumeration, DirectoryEnumeration, FileEnumeration, ContentsEnumeration
RuleName = "DiscardFilepathContains" # This can be whatever you want. We’ve been following a rough "MatchAction, MatchLocation,
# MatchType" naming scheme, but you can call it "Stinky" if you want. ¯\_(ツ)_/¯
MatchAction = "Discard" # What to do with things that match the rule. In this case, we want to discard anything that matches this rule.
# Valid options include: Snaffle (keep), Discard, Relay (example of this below), and CheckForKeys (example below).
MatchLocation = "FilePath" # What part of the file/dir/share to look at to check for a match. In this case we’re looking at the whole path.
# Valid options include: ShareName, FilePath, FileName, FileExtension, FileContentAsString, FileContentAsBytes,
# although obviously not all of these will apply in all EnumerationScopes.
WordListType = "Contains" # What matching logic to apply, valid options are: Exact, Contains, EndsWith, StartsWith, or Regex.
WordList = ["winsxs", "syswow64"] # A list of strings or regex patterns to use to match. If using regex patterns, WordListType must be Regex.
Triage = "Green" # If we find a match, what severity rating should we give it. Valid values are Black, Red, Yellow, Green. Gets ignored for Discard anyway.

This rule on the other hand will look at file extensions, and immediately discard any we don’t like.

In this case I’m mostly throwing away fonts, images, CSS, etc.

[[Classifiers]]
EnumerationScope = "FileEnumeration" # We’re looking at the actual files, not the shares or dirs or whatever.
RuleName = "DiscardExtExact" # just a name
MatchAction = "Discard" # We’re discarding these
MatchLocation = "FileExtension" # This time we’re only looking at the file extension part of the file’s name.
WordListType = "Exact" # and we only want exact matches.
WordList = [".bmp", ".eps", ".gif", ".ico", ".jfi", ".jfif", ".jif", ".jpe", ".jpeg", ".jpg", ".png", ".psd", ".svg", ".tif", ".tiff", ".webp", ".xcf", ".ttf", ".otf", ".lock", ".css", ".less"] # list of file extensions.

Here’s an example of a really simple rule for stuff we like and want to keep.

[[Classifiers]]
EnumerationScope = "FileEnumeration" # Still looking at files
RuleName = "KeepExtExactBlack" # Just a name
MatchAction = "Snaffle" # This time we are ‘snaffling’ these. This usually just means send it to the UI,
# but if you turn on the appropriate option it will also grab a copy.
MatchLocation = "FileExtension" # We’re looking at file extensions again
WordListType = "Exact" # With Exact Matches
WordList = [".kdbx", ".kdb", ".ppk", ".vmdk", ".vhdx", ".ova", ".ovf", ".psafe3", ".cscfg", ".kwallet", ".tblk", ".ovpn", ".mdf", ".sdf", ".sqldump"] # and a bunch of fun file extensions.
Triage = "Black" # these are all big wins if we find them, so we’re giving them the most severe rating.

This one is basically the same, but we’re looking at the whole file name. Simple!

[[Classifiers]]
EnumerationScope = "FileEnumeration"
RuleName = "KeepFilenameExactBlack"
MatchAction = "Snaffle"
MatchLocation = "FileName"
WordListType = "Exact"
WordList = ["id_rsa", "id_dsa", "NTDS.DIT", "shadow", "pwd.db", "passwd"]
Triage = "Black"

This one is a bit nifty, check this out…

[[Classifiers]]
EnumerationScope = "FileEnumeration" # we’re looking for files…
RuleName = "KeepCertContainsPrivKeyRed"
MatchLocation = "FileExtension" # specifically, ones with certain file extensions…
WordListType = "Exact"
WordList = [".der", ".pfx"] # specifically these ones

MatchAction = "CheckForKeys" # and any that we find, we’re going to parse them as x509 certs, and see if the file includes a private key!
Triage = "Red" # cert files aren’t very sexy, and you’ll get huge numbers of them in most wintel environments, but this check gives us a way better SNR!

OK, here’s where the REALLY powerful stuff comes in. We’ve got a pair of rules chained together here.

Files with extensions that match the first rule will be sent to the second rule, which will “grep” (i.e. String.Contains()) them for stuff in a specific wordlist.

You can chain these together as much as you like, although I imagine you’ll start to see some performance problems if you get too inception-y with it.

[[Classifiers]]
EnumerationScope = "FileEnumeration" # this one looks at files…
RuleName = "ConfigGrepExtExact"
MatchLocation = "FileExtension" # specifically the extensions…
WordListType = "Exact"
WordList = [".yaml", ".xml", ".json", ".config", ".ini", ".inf", ".cnf", ".conf"] # these ones.
MatchAction = "Relay" # Then any files that match are handed downstream…
RelayTarget = "KeepConfigGrepContainsRed" # To the rule with this RuleName!

[[Classifiers]]
RuleName = "KeepConfigGrepContainsRed" # which is this one! This is why following a naming convention really helps.
EnumerationScope = "ContentsEnumeration" # this one looks at file content!
MatchAction = "Snaffle" # it keeps files that match
MatchLocation = "FileContentAsString" # it’s looking at the contents as a string (rather than a byte array)
WordListType = "Contains" # it’s using simple matching
WordList = ["password=", " connectionString=\"", "sqlConnectionString=\"", "validationKey=", "decryptionKey=", "NVRAM config last updated"]
Triage = "Red"
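To make the chaining mechanics concrete, here’s a hypothetical, heavily simplified Python sketch of how the two rules above might evaluate. The real evaluator is Snaffler’s C# code and supports many more options; this just shows the Relay hand-off:

```python
# Simplified sketch of Relay chaining -- not Snaffler's actual evaluator.
RULES = {
    "ConfigGrepExtExact": {
        "MatchLocation": "FileExtension",
        "WordList": [".yaml", ".xml", ".json", ".config", ".ini"],
        "MatchAction": "Relay",
        "RelayTarget": "KeepConfigGrepContainsRed",
    },
    "KeepConfigGrepContainsRed": {
        "MatchLocation": "FileContentAsString",
        "WordList": ['password=', 'connectionString="', 'validationKey='],
        "MatchAction": "Snaffle",
        "Triage": "Red",
    },
}

def evaluate(rule_name, filename, contents):
    """Run a rule; follow Relay targets until something Snaffles or misses."""
    rule = RULES[rule_name]
    if rule["MatchLocation"] == "FileExtension":
        hit = any(filename.lower().endswith(w) for w in rule["WordList"])
    else:  # FileContentAsString with "Contains" matching
        hit = any(w in contents for w in rule["WordList"])
    if not hit:
        return None               # rule didn't match; nothing to report
    if rule["MatchAction"] == "Relay":
        return evaluate(rule["RelayTarget"], filename, contents)
    return rule["Triage"]         # Snaffle: report at this severity
```

The upstream rule never assigns a Triage of its own; anything it matches either survives the downstream content check (and gets reported Red) or is silently dropped, which is exactly the chaff-filtering behaviour described above.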

Who did you steal code from?

The share enumeration bits were snaffled (see what I did there?) from SharpShares, which was written by the exceedingly useful Dwight Hohnstein. (https://github.com/djhohnstein/SharpShares/) Dwight’s GitHub profile is like that amazing back aisle at a hardware store that has a whole bunch of tools that make you go “oh man I can’t wait til I have an excuse to try this one for realsies…” and you should definitely check it out.

While no code was taken (mainly cos it’s Ruby lol) we did steal a bunch of nifty ideas from plunder2 (http://joshstone.us/plunder2/)

Wordlists were also curated from those found in some other similar-ish tools like trufflehog, shhgit, gitrobber, and graudit.

R K
