Captain’s Log: March 2021

Here is how I did in the new condensed table format.

Target / Summary
11k steps a day: I hurt my ankle. I do not know how; I cannot recall a slip or anything, I was just suddenly in a lot of pain while walking one day. So, at day 452 since I started going for 10k steps a day, my run ended. I needed to rest it up.
150 active minutes per week:
Week 1 – Yes
Week 2 – No – ankle injury and a sore throat, so I took the week off.
Week 3 – Ditto on the ankle.
Week 4 – Ditto on the ankle.
1 technical blog a month: I managed two short but useful posts that I will need to refer back to in future:
How to use letsencrypt with python HTTP services.
Solving a pentester’s pesky proxy problem.
Support my partner to exercise:
Week 1 – Yes
Week 2 – No – they actually smashed their goal and didn’t need me on the Saturday.
Week 3 – Yes
Week 4 – Yes
Record five songs: Since I was inactive I had more time, and I have actually made a song. Well, almost. It is not finished. I had a DAW to try to learn and I fancied trying some layers, because that is all new to me:

Whatever – version 3

The vocals will be re-recorded sometime when the kids aren’t asleep in the next room, the lyrics are already different on paper, and a third verse from a different protagonist is to be expected. But still, I have been happily listening to it on a loop today.
OSWE: I have not prioritised it this month.
Panic Attacks: A clear month really. A few dicey moments but no full-blown panic.

Other bits

  • Audiobooks 1 – carrying on with Stealing Light: Shoal, Book 1 by Gary Gibson. A good bit of Sci-Fi.
  • Television 1 – Star Trek Discovery (catching up on Seasons 1 and 2). My house loves Star Trek, so we watched this when it came out. We sat down to consider watching Season 3 and realised we could not remember a damn thing that happened. Literally nothing. So we decided to re-watch Seasons 1 and 2 first. It is definitely different to previous ST series, which is not necessarily good or bad in itself. I feel like I don’t really care about any of the characters, and the plot is too heavily reliant on the lead character instead of there being an ensemble to draw on. Rewatching it was like watching it for the first time for us, and it was definitely a chore at times. Then we finally got to Season 3. It was like a breath of fresh air. Suddenly it was way more exciting and we were happy to watch the next episode as soon as possible!
  • Television 2 – Away (Netflix) – I like a bit of sci-fi, but I also have a soft spot for content imagining our near future with basically the technology we have and an increased will to use it. This series was pretty decent in that category. Some good acting performances. A nice distraction. I wanted the next episode when the season ended. But so far it is likely to be something I entirely forget in a few weeks.

That is the log for March. It is sunny outside and I have some days off. Excellent.

Solving a pentester’s pesky proxy problem

I usually test web applications using Firefox because it uses its own proxy settings and is easy to configure with Burp. Chrome is then used for googling answers, shitposting on Twitter, etc., to ensure that such traffic is not logged by Burp. This should sound familiar to most pentesters.

This process falls down when you need to test a thick client/OS binary which uses only Internet Explorer’s proxy settings. Because Chrome also uses IE’s settings, you will now see all of your googling pop up in Burp.

IE’s proxy settings can be configured with PAC files. I have known this for a very long time. But I have never actually taken the leap to think “oh, that means I can tell it to only apply a proxy for the specific backend server the thick client uses” before. Proof, if more be needed, that I can be a pretty dull axe at times. I couldn’t chop a cucumber.

Here is a valid proxy configuration file:

function FindProxyForURL(url, host) {
    // use proxy for specific domains
    if (shExpMatch(host, "*.targetdomain.com"))
        return "PROXY localhost:8080";

    // by default use no proxy
    return "DIRECT";
}

Change the host pattern to match your target domain. Save this as a “.js” file someplace you can type the path to, and then import it into Internet Explorer’s proxy settings.

Revel in the freedom to live your best life on your terms.

Take care

Letsencrypt certificates for your python HTTP servers

Back in 2016 I blogged about how to do simple HTTP or HTTPS servers with python. You need to use these if you want to temporarily host files, and to investigate SSRF issues properly.

There my skills sat until, recently, the user agent making the SSRF request actually verified the certificate. How rude! So I needed to up my game and generate a proper certificate.

Here are some caveats to the guide which you need to be aware of before you proceed:

  • My OS was Kali Linux (Good for standardisation for penetration testers but won’t be applicable to every one of you legends who are reading this).
  • The server that I am using has its own DNS entry already configured. This is the biggest gotcha. You need to have a valid DNS record. Now is the time to buy a cheap domain! Or maybe investigate a domain allowing anyone to add an A record, such as “.tk”.

If you can point a DNS entry at your Kali server then you are going to be able to do this.
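
Before going any further, it is worth confirming that the record actually resolves to your server’s IP address. A quick check (dig is available on a default Kali install; <your_domain> is the same placeholder used below):

dig +short <your_domain>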

In one terminal:

mkdir /tmp/safespace
cd /tmp/safespace
python -m SimpleHTTPServer 80

NOTE: I created a new directory in “/tmp” to make sure that there is nothing sensitive in the folder I am about to share. If you share your “/home” folder you are going to have a bad time. While the HTTP listener is running anyone scanning your IP address will be able to list the contents of the folders and download anything you have saved.
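
If your Kali image has moved to Python 3 only, the equivalent built-in module is “http.server”, so the listener line becomes:

python3 -m http.server 80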

From another terminal you need to use “certbot”:

apt-get install certbot
certbot certonly
....
pick option 2
set the domain to <your_domain>
set the directory to /tmp/safespace

The “certbot certonly” command runs a wizard-like interface that asks you how to proceed. I have given you the options above. What this does is create a file in your /tmp/safespace folder that Letsencrypt can download on their end. This proves that you have write permissions to the web root of the server and allows them to trust that the request is legit.
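
If you would rather skip the wizard, the same webroot-based challenge can, as far as I know, be expressed non-interactively with certbot’s command line flags (<your_domain> is your placeholder as before):

certbot certonly --webroot -w /tmp/safespace -d <your_domain>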

The output of the “certbot certonly” command will list the location of your new TLS certificates. They will be here:

/etc/letsencrypt/live/<your_domain>/fullchain.pem
/etc/letsencrypt/live/<your_domain>/privkey.pem

You can go back to your first terminal and kill that HTTP listener. We will no longer be needing it! We have a proper TLS setup, so let’s go in a Rolls Royce, baby, yeaaaah!

You can use Python’s “twisted” HTTPS server as shown below:

python -m twisted web --https=443 --path=. -c /etc/letsencrypt/live/<your_domain>/fullchain.pem -k /etc/letsencrypt/live/<your_domain>/privkey.pem

That was it. I was able to browse to my new HTTPS listener and I had a shiny trusted certificate.
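
If you prefer the command line to a browser for that check, something like this (curl ships with Kali) should complete without certificate warnings and print the directory listing:

curl -v https://<your_domain>/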

Hope that helps

Captain’s Log: February 2021

Here is how I did in the new condensed table format.

Target / Summary
11k steps a day: 5 miles of steps. Every single day.
150 active minutes per week:
Week 1 – Yes
Week 2 – Yes
Week 3 – Yes
Week 4 – Yes
1 technical blog a month: I missed this in January (which seems mad when I had several drafts almost ready to go). So in February I put out two:

Verifying Insecure SSL/TLS protocols are enabled
Pentesting Electron Applications

On track for that. I have a few other ideas for coming months.
Support my partner to exercise:
Week 1 – Yes
Week 2 – Yes
Week 3 – Yes
Week 4 – Yes
Record five songs: I have not prioritised it this month.
OSWE: I have not prioritised it this month.
Panic Attacks: I never actually got to the state of disruptive panic this month, though I got pretty close. I had a fairly stressful project to work on at the same time as home schooling, while maintaining the daily and weekly exercise tasks. This had me doing some pretty long days, which definitely made me stressed. I just knew better when to go and take a lie down. All hail the hour nap.

Other bits

  • Audiobooks 1 – I completed Dune by Frank Herbert. This was a great listen and I thoroughly enjoyed it.
  • Audiobooks 2 – I have started Stealing Light: Shoal, Book 1 by Gary Gibson. This was a recommendation and so far I am not far into it. It is painting an interesting universe for mankind’s future. There is a species that pops up to other sentient species and goes: we can give you a bunch of technology and help you colonise this region of space, which is yours; in return you agree not to do X, Y and Z. Which creates a dependence on the Shoal corporation. So far it is some pretty decent sci-fi which shows that, despite the advancement of time, we remain just as seedy.
  • Television – I have loved South Park for years. I haven’t been keeping up with it for over a decade. With a bunch of episodes being added to Netflix I have started to catch up. They are masters at talking about issues in an interesting way. You don’t have to agree with everything they say to enjoy it. But it is worth remembering how great this show is (once we get past the initial seasons which were mostly fun but entirely juvenile).
  • Television – A while back I started watching Babylon 5 again when I realised I could get it on Amazon Prime (not free). I watched the show when I was clearly too young to understand it. For some reason I didn’t complete the full watch through even though I was enjoying it. I picked it up again and mid season there was an announcement that the content had been upgraded to HD. This makes it even more watchable. The writing is amazing and the cast do some fantastic things together. It is truly amazing that this was made pre-streaming to be as layered as it is. You had to wait a week for another episode, and they would sometimes not have a thing pay off for years. Now that you can watch this back-to-back I intend to and so should you :D.

That is the log for February.

Pentesting Electron Applications

I recently came across my first Electron application as a target. As is my habit, I took notes as I went, and here they are so that I am ready for the next time.

When targeting an “app” (a thick client or mobile application) I always want to:

  • Decompile it if possible – to enable source code review
  • Modify and recompile if possible – to enable me to add additional debugging to code or circumvent any client side controls.
  • Configure middling of communication channels – to enable testing of the server side API.

That is the general process for this kind of task, and this blog post covers how to do all three for Electron applications.

Extracting .asar Files on Windows

Going to the application’s folder, I found that it had a “resources” directory containing a couple of “.asar” files. A quick google uncovered that this is a tar-like archive format, as per the link below:

This is similar to APKs in the Android app testing realm. To get serious we need to extract these to have a proper look.

First get Node.js:

After installing this open a new command prompt and type “npm” to check that you have the Node Package Manager working right.
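
If you want something more explicit than just typing “npm”, these two commands should each print a version number, confirming Node.js and the package manager are both on the path:

node --version
npm --version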

Then install “asar”:

npm install --engine-strict asar

As a new user of npm I want to just marvel at the pretty colours for a second:

Oh NPM, you are gorgeous

Also an inbuilt check for vulnerabilities? NPM you are crushing it.. Just crushing it. So how come so many apps have outdated dependencies?

It turns out the above command – while gorgeous – did not put the “asar” command into the Windows path as intended. To do that I subsequently ran this command:

npm install -g asar

Then the command was found in the path:

Oh asar, I found you, I never knew I needed you 10 minutes ago
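
A quick sanity check from a fresh command prompt is to ask it for its version, which should print a version number rather than a “not recognised” error:

asar --version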

I was unclear as to how asar operates. Specifically, would it extract into the current folder in a messy way? To be safe, I created a new folder “app” and copied my target “app.asar” file into it before extracting:

mkdir app
copy app.asar app
cd app
asar e app.asar .
del app.asar

Note: that del is important because it stops you packing a massive file back into your modified version later.

I was glad I did copy the target into a folder because extraction did this:

Files Extracted to the Current Directory

Had I done the extraction a directory higher, the target application folder would have been messy.

With access to the source you can go looking for vulnerabilities in the code. I cannot really cover that for you since it will depend on your target.

Modifying the app

If you are halfway serious about finding vulnerabilities you will need to modify that source and incorporate your changes when running the target. You need to modify the code and then re-pack the “app.asar” file.

The best place to start is where the code begins its execution. That is in the “main.js” file within the “createWindow” function as shown:

Showing the createWindow function

I modified this to add a new debugging line as shown:

Added console.log debugging line

The point of this was to tell me that my modified version was running instead of the original. I started a command prompt in the “resources” folder and then executed these commands:

move app.asar app-original.asar
asar pack app app.asar

Note: that move command was about ensuring I have the original “app.asar” on tap if I needed to revert back to it. The pack command would otherwise overwrite the original file.
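
If you want to double-check that your edits actually made it into the repacked archive, asar can list the virtual file tree without extracting anything:

asar list app.asar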

It turns out that “console.log” writes to stdout, which means you can see the output if you run the target application from the command prompt. I prefer to create a .bat file so that I can view stdout and stderr easily in a temporary cmd window.

The contents of my bat file were:

target.exe
pause

The first line ran the target exe, and the second ensured that the cmd window would stay open in the event of the target being closed or crashing (until a key is pressed). I saved this in the folder with “target.exe” and called it “run.bat”. Instead of clicking on the exe I just double clicked on run.bat and got access to all of the debugging output.

Much success! The first line of output when I loaded the application was now:

[*] Injecting in here

You can now see if there are any client side controls you can disable.

Allow Burp Suite to proxy the application

You will want to be able to see the HTTP/HTTPS requests issued by your target application. To do that you need to add an Electron command line switch. If we look again at the “createWindow” function you can see two examples already:

createWindow function showing commandLine.appendSwitch

At line number 58 I added this switch which configured localhost port 8080 as the HTTP and HTTPS proxy:

electron_1.app.commandLine.appendSwitch('proxy-server', '127.0.0.1:8080');

Again I had to pack the app folder into app.asar as shown in the previous section.

Electron loads Chromium, and therefore you will need to install Burp’s CA certificate as per this URL:

Once you have done that you should see all the juicy HTTP/HTTPS traffic going into Burp.

Congratulations, you can now go after the server-side requests.

I have covered extracting the Electron application to let you see the code, how to modify the code, and how to middle the traffic. That is more than enough to get going with.

Happy hunting

Verifying Insecure SSL/TLS protocols are enabled

If a vulnerability scanner tells you that a website supports an insecure SSL/TLS protocol, it is still on you to verify that this is true. While it is becoming rarer, there are HTTPS services which allow a connection over an insecure protocol but, when you issue an HTTP request over it, respond to the user with a message like “Weak cryptography was in use!”.

I think this was an older IIS trait. But I am paranoid and in the habit of making damn sure that a service which offers, say, SSLv3 will actually handle HTTP requests over it. If there is a warning about poor cryptography then the chances are a user won’t be submitting passwords over it, which reduces the risk.

This blog post explains how to use openssl to connect using specific SSL/TLS protocols. It also provides a solution to a major gotcha of the process: the version of openssl that ships with your OS cannot make insecure connections.

Using openssl to connect with an old protocol

The best way to do this, for me, is to use “openssl s_client” to connect to the service. The command supports the flags “-ssl2”, “-ssl3” and “-tls1” to let you confirm each of those protocols. Say you want to verify that SSLv3 is enabled on a target; you would do this:

openssl s_client -connect <host>:<port> -ssl3 -quiet

If the server supports the protocol it will now say this on the line above where your cursor landed:

verify return:1

Your cursor will be paused there because you have established an SSL/TLS connection and now it is waiting for you to enter data.

Issuing an HTTP request through that connection

Having established your connection you will want to issue an HTTP request. A simple GET request should be sufficient:

GET / HTTP/1.1
Host: <host>

This simple request will return the homepage of any HTTP server. The subtle part is that there are invisible newline characters. In truth the above can be expressed in a single line as:

GET / HTTP/1.1\r\nHost: <host>\r\n\r\n

Here each “\r\n” represents a newline. The request ends with “\r\n\r\n” because HTTP wants a blank line after all the headers to indicate that the server should start responding.

To make this easier we can use “printf” to send the GET request directly through our openssl connection:

printf "GET / HTTP/1.1\r\nHost: <host>\r\n\r\n" | openssl s_client -connect <host>:<port> -ssl3 -quiet

If the server responds without warning the user about insecure cryptography then you have just confirmed it as vulnerable.

Major Caveat

Most operating systems rightly ship with secured versions of “openssl”. This means that they have been compiled to not support these insecure protocols. The reason for doing this is to protect users from exploitation! This is good for all of us: if fewer clients can even talk SSLv3, exploitation becomes far less likely.

To solve this problem I suggest you build your own copy of openssl, compiled to support the insecure protocols:

wget https://openssl.org/source/openssl-1.0.2k.tar.gz
tar -xvf openssl-1.0.2k.tar.gz
cd openssl-1.0.2k
./config --prefix=`pwd`/local --openssldir=/usr/lib/ssl enable-ssl2 enable-ssl3 enable-tls1 no-shared
make depend
make
make -i install

Doing it this way creates a local binary (in “openssl-1.0.2k/local/bin”) which is not installed over the system’s maintained one. This keeps the everyday “openssl” command in your PATH secure while giving you the opportunity to confirm vulnerable services.

This solution is adapted from the Gist available here:

For ease, I then choose to create an alias for the command as shown below:

alias "openssl-insecure"="/<path>/<to>/openssl-1.0.2k/local/bin/openssl"

I can now call the insecure version by typing “openssl-” and letting tab completion fill in the rest.
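
Putting it all together, the full check from earlier then looks something like this, with the placeholders swapped for your target as before:

printf "GET / HTTP/1.1\r\nHost: <host>\r\n\r\n" | openssl-insecure s_client -connect <host>:<port> -ssl3 -quiet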

Happy insecure protocol verifications to you all.

Captain’s Log: January 2021

A new year has begun.

Let us go back two years to 2019 for a moment. I was diagnosed with a medical condition that is prone to immobilise me periodically for around two weeks. There were a few bouts of that. I also had some unrelated chest infections, tonsillitis, and a run of bad health that lasted from the peak of summer until December 2019. I had been pretty miserable and had seen my weight increase due to the lack of mobility.

By Christmas Eve 2019 I had been on medication to control my condition long enough to get back to walking around. Sick of the misery, I set myself a goal of doing the 10k steps per day that Fitbit offers as a good minimum level. Through lockdowns this became an absolute mission, but I got it done.

I got through my self-set 2020 challenges, and now I have to ask what next?

I have set some health, lifestyle, and hobby goals for 2021.

Let’s get Spectabular!

I am going to introduce you to the table of the six labours, and one thing I need to track monthly:

Target / Rationale
11k steps a day: Slightly more than 2020. I have increased the number because this should take me over 4 million steps for the year, which sounds like a thing. Really I am aiming to do 5 miles a day, or greater than 1,825 miles in the year.
150 active minutes per week: This equates to 22 active minutes every single day. I will get there with a mix of jogging, cycling, using a kettlebell, using my stairs for step exercise, and anything else that comes to mind. This is the REAL target for 2021.
1 technical blog a month: The main focus of cornerpirate.com should always be technical content. As I will be posting my captain’s log entries once a month it makes sense to at least post something technical every month too.
Support my partner to exercise: I do not live in isolation. It is clear that my goals are only possible because of the love and support they provide. They created swathes of space for me by looking after the kids and putting up with me going out at all hours to mine steps. There should absolutely be quid pro quo on this. They want a Saturday morning every week to be child-free for a couple of hours to do some exercise themselves. Totally on board with that!
Record five songs: I can play guitar. I can sing reasonably badly. But that hasn’t stopped me increasingly putting out content, usually cover versions thrown together with little skill in a single take. I don’t even practice before hitting record. I do not know the lyrics; I am reading them on screen and playing the chords as they come up on some tab or other. Why not try writing some new songs myself and putting a little more effort into it? It is rare for me to get the time to do so, meaning that starting with five over a year should be achievable. I am looking forward to learning how to use the kit I have collected rather than going “fuck it, Audacity and the mic that I used for conference calls is sufficient”.
OSWE: I have pretty much no certifications to show I am even halfway relevant or decent as a penetration tester. There are techniques that I need time to practice which are right there in the course material for OSWE. I am committing here to buying the lab access and using it. It is likely that I will be harvesting some of the technical blog posts from the learning I do for this. I may or may not go for the certification, but I suspect the lab exposure and the cost of many, many evenings will be beneficial overall. If it looks like I can get a decent run at the exam then I might give it a go.
Panic Attacks: This is not a goal. This is just something I want to note whether or not it happened in a given month, gathering evidence as to their frequency, triggers, and impact. My logical brain says, based on 2020, that these happen after a run of poor nights’ sleep.
Six Labours and One to Track

I will complete this table each month to add a bit more brevity and structure to the rather chaotic approach used last year. Starting from February I should get away with much shorter blogs as a result.

How did I do in January?

Now into the meat.

Target / Summary
11k steps a day: 5 miles of steps. Every single day.
150 active minutes per week: Done.
1 technical blog a month: A couple of drafts in progress but the day job put this on the back burner.
Support my partner to exercise:
Week 1 – Yes
Week 2 – Yes
Week 3 – Yes
Week 4 – Yes
Week 5 – Yes (though contested by my partner, as I wasn’t fast enough getting a kid ready to leave the house on time).
Record five songs: I have not prioritised it this month.
OSWE: I have not prioritised it this month.
Panic Attacks: A clear month.

The Good

  • Board Games – I discovered that someone at the school had taught my eldest how to play chess. We had tried for years to play board games but their lack of patience and attitude to losing at anything was too explosive and caused fights. I bought a compendium of old board games and made the time to play just with them regularly. It has been great fun! I can see they have a logical brain when they are not exploding about losing. Seeing them mature is beautiful.
  • Music – Having set up the piano keyboard over the holidays, and procured a poster which shows the chord shapes, I have been able to belt out a few songs in the evenings. Since I have never had a lesson, or loads of time to do this before, it has been nice. I am happy mucking about like this. My goal clearly isn’t to master the instrument, simply to have fun, and this has been great.
  • Health – I have lost weight. I am just not terribly interested in the number this time. The focus is the exercise goals; maintain those and the results will continue. Or not. But I will be healthier regardless of my mass. In most things I find data and statistics comforting. Somehow weighing myself is a mixed bag, as it can be as demotivating as it is motivating. I am trying it differently this time and not going for weekly weigh-ins.
  • Audiobook 1 – I listened to Sandworm by Andy Greenberg. I found this very interesting, and it should be the kind of book you can recommend to non-infosec humans. It is well delivered. The pace is expertly done.
  • Audiobook 2 – From there I moved on to Dune by Frank Herbert. I would like to state for the record that I didn’t listen to Sandworm (which is a reference to Dune itself) and then decide to buy Dune to listen to. That would be far too logical! I had actually been meaning to read Dune for many years. I had bought it on Audible before I was recommended Sandworm. I let the recommendation jump the queue and was laughing when Sandworm was revealed as a reference to Dune. The audible production of Dune is excellent and is the first I have heard with additional music. It isn’t quite a radio play version but is definitely acted more than just read.

The Bad

  • Maybe not “bad” really. This one is actually bittersweet. One of my team decided to move on to pastures new to continue their career progression. This is always both a sad and a wonderful thing. Sad because you won’t be talking to them every day anymore, but also wonderful because everyone needs to eventually move on. They were definitely ready to do so. Even so, I will miss our wide-ranging phone calls that always started with “I have five minutes” and ended up, an hour later, with us both much happier for the exchange.

Captain’s Log: December 2020

This is the final Captain’s Log of 2020. I think I will keep doing this monthly. But go me: I have managed it for 12 straight months.

The Good

10k Step Challenge – I have plodded 10 thousand steps a day. Every single day for over a year. The vast majority of that in a locked down flat. It has been a grind at times. But I am happy I have done that.

While several people have been great about my tweeting about this, I have to give a special shout out to David Carson. In September I was at a low point as I was sick. I pithily put it out to the universe on Twitter that I was going to just lie down, but they dropped a dash of encouragement at the exact right moment. You can read the thread here:

Thanks to David I got the whole year done and so this video of the final steps on Christmas Eve is in part his responsibility:

150 active minutes a week challenge – December bit hard, so my active minutes basically fell by the wayside as I focused on taking care of my family and picking up some slack. We will get back to this after the chaos dies down in the new year. I have to set some new targets.

Audiobooks – A recommendation from Stephanie Hill over at Ascent Cyber was to check out Social Engineering: The Science of Human Hacking by @humanhacker. I have never been into social engineering. I see its value (which can be huge) and understand the basics. Listening to this audiobook has removed some of the fog of war from the social engineering map and has been a worthwhile use of an Audible credit. I would recommend it.

Weegiecast – I was invited by fuzz_sh and zephrfish to go onto their podcast WeegieCast. This was my first ever podcast. It was fun, though I think some bits of it are clunky now that I have listened back. I am new to being recorded saying stuff, so please forgive me. If you want to hear it then you can get to it via the tweet thread linked below:

Christmas – with the miserable 2020 coming to a close it is worth taking a moment to grab hold of anything that is good and pure. We are each here for whatever time we have in this life. With whatever skills we can learn. Within whatever capabilities we can train our bodies to deliver. Some start with a shittier deck of cards but the player can overcome the odds in some respects.

2020 has had many of us walking up to the edge and peering over into the oblivion. Christmas serves as a circuit breaker for me most years where I unplug. This year I went for it. I ate rather a lot of ALDI’s Christmas related sweets and chocolates (you really should go there as zee Germans really know their confections). I haven’t been to a work Christmas do in years as I cannot travel so actually the remote nature was a nice change of pace:

Whatever this view on Teams is we had a Christmas bash of sorts

Genuine Blog Posts – In December I dragged three actual blog posts out of my drafts folder:

I do enjoy blogging about technical things and that is the real mission of this blog overall. So December was actually a productive month for this site.

Music Time – I know I spam you all with this shit all the time but I do enjoy recording music even if it is on a mobile phone.

  • Rudolph (the Red Team junior) – I was asked by two groups of lovely people to make a sort of novelty Christmas song, which was actually pretty nice to be asked for! Neither actually panned out, but as it happened I delivered a report and had a spare 30 minutes to bash out a version of Rudolph the Red-Nosed Reindeer over lunch. I think it is notable for featuring me using several layers (not a hallmark of my silly ditties): two guitars with different tones, my mouth drum :D, vocals, and even jingle bells via my Christmas jumper’s embedded bells. Yes, the pun is “SamT” instead of “Santa/Samty Clause”. Sam T is our Director of Research and I love him:
  • Merry Whatever – Appropriate for New Year. A couple of points here. I have never had a piano lesson in my life. I got a keyboard about a year ago but quickly found out I had to hide the power cable, or the kids basically refused to do anything BUT discordant noise experiments at maximum volume. Therefore I haven’t actually had time to practice. I bought a poster with the common piano chords on it and fired into that early on Boxing Day to achieve this:

In 2021 I can only guarantee more stupid songs because I get a kick out of them at least.

The Bad

A panic attack (late on Christmas Eve) which ensured I was exhausted for Christmas Day. Nothing much to write home about here; I am getting ever better at spotting them coming, dealing with them in the moment, and recovering from them.

The kids didn’t sleep great so by the time I was downstairs with them at 7am I was a total mess of a human. Yelling at them for stupid things such as not eating their breakfasts etc. To be fair it is the biggest fight we have – around them not bloody eating. It just takes on a rather ludicrous dimension when you just want to get through the fucking meal to play games or do absolutely ANYTHING else. I’d look forward to a session of hammering molten nails through the tips of my fingers, if it just meant I didn’t have to fight over the next three bites of toast!

Seriously, this was a clusterfuck of a day. But after a full night’s sleep I declared Boxing Day to be Christmas mark 2, and we had an excellent time.

Highlights of the month

Panic attack and my behaviour aside, the Christmas break has been amazingly refreshing. I hadn’t had a computer turned on until right now, on New Year’s Eve, to finish up this post. I never “unplug” like this. This has been good.

Happy new year and we’ll meet again.

Firefox Add-Ons that you actually need

In this blog post I will introduce you to a few Firefox Add-Ons which are useful when assessing the security of web applications. There are many, many more Add-ons that people swear by but these ones help me out a lot.

To test a web application you are going to need a web browser. That browser will need to be passed through a local proxy such as OWASP’s ZAP, or PortSwigger’s Burp Suite Pro if you are on someone’s payroll. I suggest that you pick Firefox for this purpose and that you use a completely separate web browser for keeping up to date with Twitter, idling in Slack channels, etc.

*STOP* In addition to the main point of this post, let me park up in this lay-by and drop an anecdote on you.

Many moons ago (~2006 I think) I was helping a newbie start their career. I told them to use one web browser for testing and another for their browsing. They didn’t listen to that advice. So when they uploaded their test data for archiving it included their proxy logs. As I QAed their report I opened up the proxy logs to check some details and spotted that they included a whole raft of personal browsing, and therefore their password, which they reused on everything at the time.

I didn’t overly abuse that privileged information before the point was made that you need to keep things separate. Shout out to newbie who still newbs, though they never write or visit anymore. I still love you. Not least because every newbie since has had this anecdote told to them and it has rounded out the point nicely.

Anecdote dropped. Let’s discuss the four Add-Ons that help me out loads.

Multi Account Containers

URL: https://addons.mozilla.org/en-GB/firefox/addon/multi-account-containers/

This is amazing. You can set up containers, which keep completely separate browsing sessions within the same Firefox instance. This means you can set up one tab to log in as an admin-level user and another tab to operate as a standard user:

Configuring multiple containers

These containers are marked by the colour you have assigned them and display the name on the far right:

Loading a site in two containers showing the different user levels

This is a game changer honestly. I feel like the way I worked before was in a cave with no light. Now I can line up access control checks with improved ease and more efficiently test complicated logic. Absolutely brilliant.

A shout out to Chris who showed this one to me.

Web Developer Toolbar

URL: https://addons.mozilla.org/en-GB/firefox/addon/web-developer/

I have used this for a very, very long time. It is useful if you want to quickly view all JavaScript files loaded in the current page:

Viewing all JavaScript Files Quickly

You can achieve a lot of other useful things with it. My need for this has diminished slightly as the in-built console when you press F12 has improved over the years. But I still find it useful for collecting all the JavaScript.

Cookie Quick Manager

URL: https://addons.mozilla.org/en-US/firefox/addon/cookie-quick-manager/

Technically you can manipulate cookies using the Web Developer toolbar. I just find this Add-On’s interface much easier to use:

Using Cookie Manager to add a new cookie

When you just want to clear a cookie, or maybe try swapping a value with another user’s, this is quick and simple.

User-Agent Switcher and Manager

URL: https://addons.mozilla.org/en-GB/firefox/addon/user-agent-string-switcher/

Sometimes an application responds differently to different User-Agent strings. You can use a Burp match-and-replace rule, or you can use this add-on, which has the benefit of a massive list of built-in User-Agent strings.

You can also add a little bit to your User-Agent to differentiate your users, like this:

Add String to User-Agent

By applying the setting per container you can mark which level of user made the request. Now that I do this I have found it absolutely invaluable in keeping track of what I was doing.

When you view the requests in your local proxy you will instantly know which user level made each particular request. This is vital, particularly where an app issues lots of teeny tiny requests per minute and it is otherwise easy to lose track of which browser container was saying what.

I hope that has helped you. If you have any other Add-ons you think are vital please sling me a comment or a Tweet. I’d like to look into more.

Regards

API testing with Swurg for Burp Suite

Swurg is a Burp Extender designed to make it easy to parse Swagger documentation and create baseline requests. This is a function that penetration testers need if they are being asked to test an API.

Our ideal pre-requisites would be:

A Postman collection with environments configured and valid baseline requests ready to go. Ideally set up with any necessary pre- or post-request scripts to ensure that authentication tokens are updated where necessary.

— Every penetration tester

Not everyone works that way, so we often have to fall back to a Swagger JSON file. In the worst cases we get a PDF file with hundreds of pages of exposition, and from there we are punching uphill to even say hello to the target. That is a cost to the project and isn’t a great experience for your customers either.

If you are reading this post and you are somehow in charge of how you distribute API details to your customers, then I implore you NOT to rely on that massive PDF approach. This is for your sanity as much as for your customers’. Shorten your guides to explain how to authenticate and what API calls are required, in sequence, to achieve a specific workflow. Then, by providing living, breathing documentation which is generated from your code, you will rarely have to update the PDF. With the bonus that your documentation will be easier to interact with and accurate to the version of the code it was compiled against.

Anyway, you have come here to learn how to set up and start using Swurg.

A shout out and thank you to the creator Alexandre Teyar who saw a problem and fixed it. Not all heroes wear capes.

This extender is now in the Burp app store under the name “OpenAPI Parser” so you can install it the easy way.

But if you want to make any changes to this Extender, or to others in general, then the next few sections will be useful.

Check that you have Java in your Path

Open a command prompt and type:

java --version

If you get a warning that the command cannot be located then you need to:

  1. Ensure that you have a version of the JDK installed.
  2. That the path to the /bin folder for that JDK is in the environment’s PATH variable.

Note: after you have added something to the PATH variable you need to load a new command prompt for the change to take effect. There is probably a neat way to bring altered environment variables into the current cmd.exe session, but honestly? I have so rarely needed to set environment variables on Windows that I would not retain the command in memory anyway, so a restart suits me.

Installing Git Bash

I already had Git Bash installed but you might need it:

This has a binary installer which works fine, and I have nothing more to add, your honour.

Installing Gradle on Windows 10

There is a guide (link below) but it missed a few beats for Windows 10:

Step 1: download the latest binary-only release from here:

There is no installer for the binary release, so you have to do things manually. You will have a zip file. The guide tells you to extract it to “c:\gradle”. Installing binaries in the root of C:\ has historically been exploitable in Windows, leading to local privilege escalation. So I get nervous when I see this in an installation guide!

Usually “C:\Program Files\gradle” would be the location for an application to be installed. In Windows 10 you are going to need admin privileges to write to either of these locations. It is generally assumed that basically all developers have this but that is often not the case.

Based on the installation steps you should be able to unzip anywhere you have write access, such as “C:\Users\USERNAME\Desktop” or another location.

Having extracted the Zip you should add some environment variables:

  • GRADLE_HOME – set this to point to the folder you extracted. The location should be the parent folder of “/bin”.
  • JAVA_HOME – set this to point to the root folder of a JDK install. This is also going to be the parent folder of “/bin”.

Finally you need to add this to your PATH variable:

%GRADLE_HOME%\bin
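
If you prefer the command line to clicking through the System Properties dialog, setx can create the two user-level variables; the paths below are only examples, so point them at wherever you actually unzipped Gradle and installed your JDK:

REM example paths only - adjust to your own extract and JDK locations
setx GRADLE_HOME "C:\Users\USERNAME\Desktop\gradle-6.8"
setx JAVA_HOME "C:\Program Files\Java\jdk-11"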

If you ever upgrade to a newer version of Gradle (and given there is no installer I expect there is no automated update process), you just unzip the new version and change where GRADLE_HOME points, and your updated version will work.

Open yourself a new cmd prompt to ensure the env variables are applied. Type “gradle” and get your rewards:
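
A slightly more useful smoke test is “gradle -v”, which prints the Gradle version along with the JVM it picked up, confirming that both environment variables are being read:

gradle -v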

Now let’s get back to Swurg!

Building Swurg

The repository has excellent install instructions here:

But to tie it all together in my single post I’ll replicate what I needed to do.

I used git bash to clone the repository down and then gradle to build the jar:

git clone https://github.com/AresS31/swurg
cd swurg
gradle fatJar

That worked an absolute treat:

That process completes and leaves you a fresh new jar file in the “\build\libs” folder:

Installing Swurg in Burp

Use the “Extender” -> “Add” functionality to select the “swurg-all.jar”:

How to install a plugin manually

Using Swurg

You should now have a new tab and the opportunity to load a swagger file:

We have Swurg working away merrily here

If you load a valid Swagger file this will create a full list of endpoints that you can explore.

Right click on an endpoint and you have an excellent place to start launching things from:

Sending things to Burp tabs

That is definitely enough to get going with. In my case I had replaced my target host with localhost to keep things anonymous as to what I was testing.

This worked well for me and was probably worth the setup. I prefer this to Swagger-EZ, which I had been using in the past.

If we are honest, what we all want is a properly configured Postman collection, with fully configurable environment variables and pre/post-request scripts for things such as carrying the current Bearer token automatically into all subsequent requests.

In lieu of that, this is a reasonable starting point which is embedded where you want it, right into Burp Suite. If I were to make any changes to the Extender I would probably want an option to globally set the host name and base folder locations. One of those “if I ever get the time” projects.

Hope this helps someone.