Over this summer I have delivered a talk, entitled in my head “how to Word”, a number of times. It covers the stuff that gets you from someone who once wrote a CV in Word to someone who can spit out a 100-page document in 5 days.
Likely to be a thing I add to over time. The blog post is really to shamelessly link people to the pinned page:
When I am interviewing people for a job I use the line that “Pentesting is 50% technical and 50% consulting”. There is not much point in being able to do technical gymnastics if you cannot document it, explain it clearly, or deliver those notes to customers on time.
I was given instructions to start an office: find candidates, filter through them, and then pick a number of them. That is a lot of trust, and I wholeheartedly thank my employers for giving me the rope to do this.
I have never really seen becoming a manager as a career ambition. I always have, and most likely always will, call this “failing upwards”. Why? Put simply, being a Pentester is, for me, the greatest job on the planet. Any step away from doing that *every* day must simply be called “failing”, right?
Over the years I have assisted many in the industry by offering advice, training, and, most importantly (so I am told), my TIME.
During my decade or so I have seen those I helped go from entirely green to campaign-hardened professionals. Thinking back, they were straight out of university and unable to tell their “multiple simultaneous logins” from their “session hijacking” (a more common confusion than you might expect!). They are now doing exceptional things themselves.
Usually the committed people will get there on their own and it is not necessarily my influence. In some small way I hope they do even one thing better as a result though!
Currently, I still get to do a decent number of testing days, which is good. However, I see the time for that coming to an end relatively soon. The intake for the inaugural year seems set and now we just have to focus on starting them off correctly.
Guess I will have to live vicariously by assisting my new troop during their projects. To support all of this I have been creating training material and coming up with little speeches in my head.
Today I am really excited to start the newest member of the team off on the route to being professional and getting the job done. There are a lot of myths about the industry that seem to build up. Universities seem adept at teaching some things, but not at making “consultants”. Part of the programme will be to sort those out!
The goal is not to make people ready to pass some certification like CREST. It is to make them engaged with their job, confident in doing it, aware of where to get information, and able to deliver things that are useful to customers.
Remember these few points:
the report is the “product”, not how amazing your technical acrobatics are.
recommendations are what the customer pays for.
talk to the customer. Not all of them are the same; some might have a preferred risk system etc.
Take ownership of your deliverables, care about your name being on them, and TALK to the customer. These things will set you up the right way.
It is exceptionally difficult to keep the various content management systems up to date against the number of security patches that are released. However, many sites are powered by software such as WordPress or Drupal.
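Out of the box both of these platforms tend to announce themselves. As a quick, illustrative check (TARGETSITE is a placeholder, and the exact headers and markup vary with version and configuration), the usual tell-tales show up in the response headers and in the page body:

```
# Illustrative only: Drupal 7 typically sends an "X-Generator" response header,
# while WordPress typically prints a <meta name="generator"> tag in its HTML.
curl -skI https://TARGETSITE/ | grep -i '^x-generator'
curl -sk  https://TARGETSITE/ | grep -i 'name="generator"'
```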
A good site admin (one following the various “lockdown” guides) will take steps to stop version numbers being disclosed in HTTP response headers, or within returned content, as per recommendations like those linked to below:
As security professionals we also tend to recommend such steps, since it proactively engages customers in securing their site. However, the majority of attacks against publicly known issues are conducted as blind brute force. Real-world attackers simply do not bother to check for version information before they fire exploit code at you. What is it to them if their illegal activity causes your site to crash or get defaced?
It is just downright unprofessional for a tester to fire public exploits at a target and hope something sticks in the manner a real threat agent would. So, in choosing to “secure” your site this way, you may effectively only be masking problems that your rather expensive penetration testing provider would otherwise have located.
How do you ensure that your customer is not vulnerable when they have taken steps to obscure full version information? This was the question I had to answer last week.
To illustrate the workflow, consider the following steps:
You come across a target using Drupal
You observe version 7 in the HTTP response headers but are unable to obtain specific minor version information.
You turn to an established fingerprinting tool such as BlindElephant and point it at your target:
BlindElephant.py https://TARGETSITE/ drupal
Loaded /usr/local/lib/python2.7/dist-packages/blindelephant/dbs/drupal.pkl with 145 versions, 478 differentiating paths, and 434 version groups.
Starting BlindElephant fingerprint for version of drupal at https://TARGETSITE/
Hit https://TARGETSITE/INSTALL.txt
File produced no match. Error: Failed to reach a server: Not Found
Error: All versions ruled out!
This failed us because it could not retrieve the files it was checking for. The BlindElephant approach is (I believe) reliant on centrally maintaining a database of files to check for. To me that sounds like a lot more work than I am willing to put into life!
Then I was thinking to myself: “But cornerpirate, the site is powered by code which is entirely available on github.com, can’t we use the features of git to give a really robust answer?”. A few hours later, enter “git-version”:
You do offline reconnaissance against your newly downloaded Drupal 7 folder. This equates to “hunting for static content”:
Find a unique list of file extensions (inside the new ‘drupal’ directory): find . -type f | perl -ne 'print "$1\n" if m/\.([^.\/]+)$/' | sort -u
Review the output above to find anything static. For Drupal this will at least include *.txt, *.html, *.js, *.inc and *.sql files. There are potentially a few more in there.
Create a list of the file names for such static content: find . -name '*.inc' > inc-files.txt
Repeat for all interesting file types.
You now have a list of files you want to check for on the target site.
From here you need to try and download every single one of those files from your target site.
When you find a file simply download it and then use git-version to check which revision that file is at. Ideally you want to base your version on something which has hundreds of revisions. In the case of Drupal those *.inc files appear to be good candidates.
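To show the shape of that loop, here is a minimal sketch. It is not the ‘git-version’ tool itself (its command line is not reproduced here); it only approximates the same idea with plain git: hash each retrieved file as a git blob, then report which commits in a local clone of the upstream Drupal repository carry an identical copy. TARGETSITE, the ‘downloaded’ directory and the clone path are assumptions made for illustration:

```
#!/bin/bash
# Sketch only: probe the target for the static files listed earlier, then ask a
# local clone of upstream Drupal which commits ship an identical copy of each
# file we manage to retrieve. TARGETSITE, paths and directory names are
# illustrative assumptions.
TARGET="https://TARGETSITE"
REPO="./drupal"          # local clone of https://github.com/drupal/drupal
mkdir -p downloaded

while read -r path; do
    rel="${path#./}"                   # strip the leading "./" from the find output
    out="downloaded/${rel//\//_}"      # flatten the path into a local file name
    if curl -skf -o "$out" "${TARGET}/${rel}"; then
        echo "[+] Retrieved ${rel}"

        # Hash the retrieved file exactly as git stores it (content-addressed blob).
        blob=$(git hash-object "$out")

        # Walk every commit that touched this path upstream and report the ones
        # whose copy of the file has the same blob hash as our download.
        git -C "$REPO" log --format='%H %ad' --date=short -- "$rel" | \
        while read -r commit date; do
            if [ "$(git -C "$REPO" rev-parse "${commit}:${rel}" 2>/dev/null)" = "$blob" ]; then
                echo "    matches commit ${commit} (${date})"
            fi
        done
    fi
done < inc-files.txt
```

The blob comparison works because git hashes file content, so a byte-identical file always maps to the same blob hash regardless of which commit introduced it. Covering the other static extensions from earlier is just a matter of concatenating the per-extension lists into one input file.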
In my case the site allowed access to “bootstrap.inc” which I then passed as input into git-version:
Here we have a ‘mildly’ outdated site. The most recent revision is 1/637, where 637 is the total number of revisions to this file. As the target is at 9/637, there are 8 newer revisions.
If you visit the URL provided it will take you to the raw version where you can typically learn things from the commit message:
Drupal 7.41 as determined using git-version
Great success! That version of the file is literally md5 checksum identical to the version in release 7.41 of Drupal.
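If you want to reproduce that check, the comparison is trivial; the file names below are placeholders for the copy pulled from the target and the copy inside the official 7.41 release archive:

```
# Placeholder file names: the copy retrieved from the target versus the copy
# shipped in the drupal-7.41 release archive.
md5sum downloaded/includes_bootstrap.inc drupal-7.41/includes/bootstrap.inc
```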
Also note that, as it happens, the ‘bootstrap.inc’ file helpfully announces the version anyway. So in the case of Drupal 7, we could replace the entire ‘git-version’ tool workflow with:
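A minimal sketch of that replacement, assuming the default Drupal 7 layout where includes/bootstrap.inc defines a VERSION constant near the top of the file:

```
# Assumes the default Drupal 7 path and the VERSION constant defined in core's
# bootstrap.inc; one request and a grep replaces the whole workflow.
curl -sk https://TARGETSITE/includes/bootstrap.inc | grep "define('VERSION'"
```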