
Captain’s Log: December 2020

This is the final Captain’s Log of 2020. I think I will keep doing this monthly. But, go me, I have managed this for 12 straight months.

The Good

10k Step Challenge – I have plodded 10,000 steps a day. Every single day for over a year. The vast majority of that in a locked-down flat. It has been a grind at times. But I am happy I have done it.

While several people have been great about my tweeting about this, I have to give a special shout out to David Carson. In September I was at a low point as I was sick. I pithily put it out to the universe on Twitter that I was going to just lie down, but they dropped a dash of encouragement at the exact right moment. You can read the thread here:

Thanks to David I got the whole year done and so this video of the final steps on Christmas Eve is in part his responsibility:

150 active minutes a week challenge – December bit hard, so my active minutes basically fell by the wayside as I focused on taking care of my family and picking up some slack. We will get back to this after the chaos dies down in the new year. I have to set some new targets.

Audiobooks – A recommendation from Stephanie Hill over at Ascent Cyber was to check out Social Engineering: The Science of Human Hacking written by @humanhacker. I have never been into social engineering. I see its value (which can be huge) and understand the basics. Listening to this audiobook has removed some of the fog of war from the social engineering map and has been a worthwhile use of an Audible credit. I would recommend it.

Weegiecast – I was invited by fuzz_sh and zephrfish to go on their podcast WeegieCast. This was my first ever podcast. It was fun, though having listened back to it I think some bits are clunky. I am new to being recorded saying stuff so please forgive me. If you want to hear it then you can get to the links via the tweet thread linked below:

Christmas – with the miserable 2020 coming to a close it is worth taking a moment to grab hold of anything that is good and pure. We are each here for whatever time we have in this life. With whatever skills we can learn. Within whatever capabilities we can train our bodies to deliver. Some start with a shittier deck of cards but the player can overcome the odds in some respects.

2020 has had many of us walking up to the edge and peering over into oblivion. Christmas serves as a circuit breaker for me most years where I unplug. This year I went for it. I ate rather a lot of ALDI’s Christmas-related sweets and chocolates (you really should go there as zee Germans really know their confections). I haven’t been to a work Christmas do in years as I cannot travel, so the remote nature was actually a nice change of pace:

Whatever this view on Teams is we had a Christmas bash of sorts

Genuine Blog Posts – In December I dragged three actual blog posts out of my drafts folder:

I do enjoy blogging about technical things and that is the real mission of this blog overall. So December was actually a productive month for this site.

Music Time – I know I spam you all with this shit all the time but I do enjoy recording music even if it is on a mobile phone.

  • Rudolph (the Red Team junior) – I was asked by two groups of lovely people to make a sort of novelty Christmas song, which was actually pretty good to be asked for! Neither actually panned out, but as it happened I delivered a report and had a spare 30 minutes to bash out a version of Rudolph the Red-Nosed Reindeer over lunch. I think it is notable for featuring me using several layers (not a hallmark of my silly ditties): two guitars with different tones, my mouth drum :D, vocals, and even jingle bells via my Christmas jumper’s embedded bells. Yes, the pun is “SamT” instead of “Santa/Samty Claus”. Sam T is our Director of Research and I love him:
  • Merry Whatever – Appropriate for New Year. A couple of points here. I have never had a piano lesson in my life. I got a keyboard about a year ago but quickly found out I had to hide the power cable or the kids basically refused to do anything BUT discordant noise experiments at maximum volume. Therefore I haven’t actually had time to practice. I bought a poster with the common piano chords on it and fired into that early on Boxing Day to achieve this:

In 2021 I can only guarantee more stupid songs because I get a kick out of them at least.

The Bad

A panic attack (late on Christmas Eve) ensured I was exhausted for Christmas Day. Nothing much to write home about here; I am getting ever better at spotting them coming, dealing with them in the moment, and recovering from them.

The kids didn’t sleep great, so by the time I was downstairs with them at 7am I was a total mess of a human. Yelling at them for stupid things such as not eating their breakfasts etc. To be fair it is the biggest fight we have – around them not bloody eating. It just takes on a rather ludicrous dimension when you just want to get through the fucking meal to play games or do absolutely ANYTHING else. I’d look forward to a session of hammering molten nails through the tips of my fingers if it just meant I didn’t have to fight over the next three bites of toast!

Seriously, this was a clusterfuck of a day. But after a full night’s sleep I declared Boxing Day as Christmas mark 2 and we had an excellent time.

Highlights of the month

Panic attack and my behaviour aside, the Christmas break has been amazingly refreshing. I hadn’t had a computer turned on until right now on New Year’s Eve to finish up this post. I never “unplug” like this. This has been good.

Happy new year and we’ll meet again.

Firefox Add-Ons that you actually need

In this blog post I will introduce you to a few Firefox Add-Ons which are useful when assessing the security of web applications. There are many, many more Add-ons that people swear by but these ones help me out a lot.

To test a web application you are going to need a web browser. That browser will need to be passed through a local proxy such as OWASP’s ZAP or PortSwigger’s Burp Suite Pro if you are on someone’s payroll. I suggest that you pick Firefox for this purpose and that you use a completely separate web browser for keeping up to date with Twitter, idling in Slack channels etc.

*STOP* In addition to the main point of this post, let me park up in this lay-by and drop an anecdote on you.

Many moons ago (~2006 I think) I was helping a newbie start their career. I told them to use one web browser for testing and another for their personal browsing. They didn’t listen to that advice. So when they uploaded their test data for archiving it included their proxy logs. As I QAed their report I opened up the proxy logs to check some details and spotted that they included a whole raft of personal browsing, and therefore their password, which they reused on everything at the time.

I didn’t overly abuse that privileged information before the point was made that you need to keep things separate. Shout out to newbie who still newbs, though they never write or visit anymore. I still love you. Not least because every newbie since has had this anecdote told to them and it has rounded out the point nicely.

Anecdote dropped. Let’s discuss the four Add-Ons that help me out loads.

Multi Account Containers

URL: https://addons.mozilla.org/en-GB/firefox/addon/multi-account-containers/

This is amazing. You can set up containers which are completely separate instances of Firefox. This means you can set up one tab to log in as an admin-level user and another tab to operate as a standard user:

Configuring multiple containers

These containers are marked by the colour you have assigned them and display the name on the far right:

Loading a site in two containers showing the different user levels

This is a game changer honestly. I feel like the way I worked before was in a cave with no light. Now I can line up access control checks with improved ease and more efficiently test complicated logic. Absolutely brilliant.

A shout out to Chris who showed this one to me.

Web Developer Toolbar

URL: https://addons.mozilla.org/en-GB/firefox/addon/web-developer/

I have used this for a very, very long time. It is useful if you want to quickly view all JavaScript files loaded in the current page:

Viewing all JavaScript Files Quickly

You can achieve a lot of other useful things with it. My need for this has diminished slightly as the in-built console when you press F12 has improved over the years. But I still find it useful for collecting all the JavaScript.

Cookie Quick Manager

URL – https://addons.mozilla.org/en-US/firefox/addon/cookie-quick-manager/

Technically you can manipulate cookies using the Web Developer toolbar. I just find this Add-On’s interface much easier to use:

Using Cookie Manager to add a new cookie

When you just want to clear a cookie, or maybe try swapping a value with another user this is quick and simple.

User-Agent Switcher and Manager

URL – https://addons.mozilla.org/en-GB/firefox/addon/user-agent-string-switcher/

Sometimes an application responds differently to different User-Agent strings. You can use a Burp match and replace rule, or you can use this add-on, which has the benefit of a massive list of built-in User-Agent strings.

You can also add a little bit to your User-Agent to differentiate your users like this:

Add String to User-Agent

By applying the setting to the container you can mark up which level of user made the request. Now that I do this I have found it absolutely invaluable in sorting out what I was doing.

When you view the requests in your local proxy you will instantly know which user level was making that particular request. This is vital particularly where apps issue lots of teeny tiny requests per minute and it is otherwise easy to lose track of which browser container was saying what.
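To illustrate the idea outside the browser, here is a minimal Python sketch of what a marked-up request looks like. The marker strings, base User-Agent, and URL are all made up for illustration; the add-on does the equivalent inside each Firefox container:

```python
import urllib.request

# Hypothetical per-container markers (invented names for illustration)
MARKERS = {"admin": "container-admin", "standard": "container-standard"}

def build_request(url, role):
    """Build a request whose User-Agent carries a per-role marker,
    mimicking what the add-on appends inside each container."""
    req = urllib.request.Request(url)
    base_ua = "Mozilla/5.0 (X11; Linux x86_64; rv:84.0) Gecko/20100101 Firefox/84.0"
    req.add_header("User-Agent", f"{base_ua} {MARKERS[role]}")
    return req

req = build_request("https://example.com/api/users", "admin")
print(req.get_header("User-agent"))  # ends with "container-admin"
```

In the proxy logs you then just search for the marker string to attribute every request to the right user level.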

I hope that has helped you. If you have any other Add-ons you think are vital please sling me a comment or a Tweet. I’d like to look into more.

Regards

API testing with Swurg for Burp Suite

Swurg is a Burp Extender designed to make it easy to parse Swagger documentation and create baseline requests. This is functionality that penetration testers need if they are being asked to test an API.

Our ideal pre-requisites would be:

A Postman collection with environments configured and valid baseline requests ready to go. Ideally set up with any necessary pre- or post-request scripts to ensure that authentication tokens are updated where necessary.

— Every penetration tester

Not everyone works that way, so we often have to fall back to a Swagger JSON file. In the worst cases we get a PDF file with 100s of pages of exposition, and from there we are punching uphill to even say hello to the target. That is a cost to the project and isn’t a great experience for your customers either.

If you are reading this post and you are somehow in charge of how you distribute API details to your customers, then I implore you NOT to rely on that massive PDF approach. This is for your sanity as much as for your customers’. Shorten your guides to explain how to authenticate and what API calls are required, in sequence, to achieve a specific workflow. Then, by providing living, breathing documentation which is generated from your code, you will rarely have to update the PDF. With the bonus that your documentation will be easier to interact with and accurate to the version of the code it was compiled against.

Anyway, you have come here to learn how to set up and start using Swurg.

A shout out and thank you to the creator Alexandre Teyar who saw a problem and fixed it. Not all heroes wear capes.

This extender is now in the Burp app store under the name “OpenAPI Parser” so you can install it the easy way.

But if you want to make any changes to the Extender or others in general then the next few sections will be useful.

Check that you have Java in your Path

Open a command prompt and type:

java --version

If you get a warning that the command cannot be located then you need to:

  1. Ensure that you have a version of the JDK installed.
  2. Ensure that the path to the /bin folder for that JDK is in the environment’s PATH variable.

Note: after you have added something to the PATH variable you need to load a new command prompt for the change to take effect. There is probably a neat way to bring altered environment variables into the current cmd.exe session, but honestly? I have so rarely needed to set environment variables on Windows that I would not retain the command in memory anyway, so a restart suits me.

Installing Git Bash

I already had Git Bash installed but you might need it:

This has a binary installer which works fine and I have nothing more to add, your honour.

Installing Gradle on Windows 10

There is a guide (link below) but it missed a few beats for Windows 10:

Step 1: download the latest binary-only release from here:

There is no installer for the binary release so you have to do things manually. You will have a zip file. It tells you to extract to “c:\gradle”. Installing binaries in the root of c:\ has historically been exploitable in Windows leading to local privilege escalations. So I get nervous when I see this in the installation guide!

Usually “C:\Program Files\gradle” would be the location for an application to be installed. In Windows 10 you are going to need admin privileges to write to either of these locations. It is generally assumed that basically all developers have this but that is often not the case.

Based on the installation steps you should be able to unzip anywhere you have write access, such as “C:\Users\USERNAME\Desktop” or another location.

Having extracted the Zip you should add some environment variables:

  • GRADLE_HOME – set this to point to the folder you extracted. The location should be the parent folder of “/bin”.
  • JAVA_HOME – set this to point to the root folder of a JDK install. This is also going to be the parent folder of “/bin”.

Finally you need to add this to your PATH variable:

%GRADLE_HOME%\bin
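If you prefer doing this from a prompt rather than the “Edit environment variables” GUI, something like the following works. The install paths and version numbers below are examples only, so adjust them to wherever you actually unzipped Gradle and installed your JDK:

```shell
:: Example paths only - point these at your actual Gradle and JDK folders.
setx GRADLE_HOME "C:\Users\USERNAME\Desktop\gradle-6.7.1"
setx JAVA_HOME "C:\Program Files\Java\jdk-15.0.1"
:: setx stores the value already expanded, so append the literal bin path here.
:: Remember: open a NEW cmd.exe afterwards for the changes to take effect.
setx PATH "%PATH%;C:\Users\USERNAME\Desktop\gradle-6.7.1\bin"
```

One caveat: because setx expands variables before writing them, the PATH entry above uses the literal bin path. If you want the %GRADLE_HOME%\bin indirection (so upgrades only need GRADLE_HOME changed), set PATH via the GUI instead.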

If you ever upgrade to a newer version of Gradle (and from the installer I expect there is not an automated update process) then you unzip the new version, change where GRADLE_HOME points to, and your updated version will work.

Open yourself a new cmd prompt to ensure the env variables are applied. Type “gradle” and get your rewards:

Now let’s get back to Swurg!

Building Swurg

The repository has excellent install instructions here:

But to tie it all together in my single post I’ll replicate what I needed to do.

I used git bash to clone the repository down and then gradle to build the jar:

git clone https://github.com/AresS31/swurg
cd swurg
gradle fatJar

That worked an absolute treat:

That process completes and leaves you with a fresh new jar file in the “\build\libs” folder:

Installing Swurg in Burp

Use the “Extender” -> “Add” functionality to select the “swurg-all.jar”:

How to install a plugin manually

Using Swurg

You should now have a new tab and the opportunity to load a swagger file:

We have Swurg working away merrily here

If you load a valid Swagger file this will create a full list of endpoints that you can explore.

Right click on an endpoint and you have an excellent place to start launching things from:

Sending things to Burp tabs

That is definitely enough to get going with. In my case I had replaced my target host with localhost to keep things anonymous as to what I was testing.

This worked well for me and was probably worth the setup. I prefer this to Swagger-EZ, which I had been using in the past.

If we are honest what we all want is a properly configured Postman collection which allows you to have fully configurable environment variables and run pre/post scripts for things such as taking the current Bearer token automatically into all subsequent requests.

In lieu of that this is a reasonable starting point which is embedded where you want it right into Burp suite. If I was to make any changes to the Extender I would probably want an option to globally set the host name and base folder locations. One of those “If I ever get the time” projects.

Hope this helps someone.

Preload or GTFO; Middling users over TCP 443.

Your website only has TCP 443 open and has a bulletproof TLS configuration. I hear you scream that I cannot middle your users to exploit them! On the surface of it you are correct. Let me lay out some basics, explain how we got here, and then show you that you are incorrect. We can middle your users (but it is unlikely).

Laying the basics about HTTP and HTTPS

The default port of the Internet is TCP 80, which is where requests prefixed with “http://” will go. This is a plain-text protocol and offers neither confidentiality nor integrity for data being sent between the client and server.

The default port for the “https://” protocol is TCP 443. This is an encrypted protocol with the “s” meaning “secure”.

As the Internet matured it became apparent that pretty much every request needed to be secured. An attacker using man-in-the-middle techniques can easily subvert plain-text communication channels. Any personal information being exchanged would be theirs to steal. They would also be able to alter server replies to serve either phishing or malware payloads straight into their victim’s browser.

This opened up a front in the cyber war to force encryption for every connection.

Question: What … all of them?

Answer:


Redirect to secure!

A common strategy has been to leave both TCP 80 and 443 open but to configure a redirect from 80 to 443. Any request over plain-text (http://) is immediately redirected to the secure site (https://).

The problem with this strategy is that the victim’s web browser will issue a plain-text request. If that attacker was there when they did this, then they could still compromise the victim. It only takes a single plain-text request and response to enable them to do so.
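As a sketch of what the server-side redirect amounts to (illustrative Python, not any particular web server’s implementation; the URLs are examples), the server takes the plain-text URL it was asked for and answers with a 301 pointing at the HTTPS equivalent:

```python
from urllib.parse import urlsplit, urlunsplit

def redirect_location(plain_url: str) -> str:
    """Given the plain-text URL a client requested over port 80,
    build the Location header value for a 301 redirect to HTTPS."""
    parts = urlsplit(plain_url)
    # Swap the scheme and drop any explicit :80 so the default 443 applies.
    host = parts.hostname or ""
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))

print(redirect_location("http://www.example.com:80/login?next=/account"))
# https://www.example.com/login?next=/account
```

The point stands, though: the first request and this 301 response both travel in plain text, and that exchange is the attacker’s window.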

Only offer secure

To get around this, savvy administrators make no compromises and simply disable TCP port 80. If a web browser attempts an “http://” request the port simply is not open. It cannot establish a TCP session and so will not send the plain-text HTTP request.

The downside of this is that the user might assume the target application is not online. They would go and try and find another domain to buy whatever it was they wanted. This is why redirecting to secure has been such a pervasive strategy. Vendors simply do not want to lose out on important traffic which can drive this quarter’s sales chart.

What is this HTTP Strict Transport Security (HSTS) stuff?

You can learn more about HSTS here:

My understanding is that HSTS was created to reduce the number of plain-text HTTP requests being issued. There are two modes of operation:

  1. A URL is added to a preload list which is then available to modern web browsers.
  2. An HTTP header (Strict-Transport-Security) is added to server responses which tells the web browser to redirect all “http://” to “https://” before issuing the request.

When a user types a URL into the address bar and hits enter the browser will check to see if the redirection must happen. Where required the redirect happens in memory on the user’s computer BEFORE the TCP connection is established.
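As a toy model of that browser-side behaviour (illustrative Python, not any real browser’s code; the host names are invented), both modes boil down to an in-memory rewrite before any connection is made:

```python
import time

PRELOAD = {"preloaded.example"}   # mode 1: list shipped with the browser
hsts_store = {}                   # mode 2: host -> expiry learned from headers

def note_header(host, max_age):
    """Record a Strict-Transport-Security header seen in a response."""
    hsts_store[host] = time.time() + max_age

def upgrade(url):
    """Rewrite http:// to https:// in memory, before any TCP connection."""
    scheme, _, rest = url.partition("://")
    host = rest.split("/")[0]
    if scheme == "http" and (host in PRELOAD or hsts_store.get(host, 0) > time.time()):
        return "https://" + rest
    return url

print(upgrade("http://preloaded.example/login"))  # upgraded via the preload list
note_header("learned.example", 31536000)
print(upgrade("http://learned.example/login"))    # upgraded via a remembered header
print(upgrade("http://unknown.example/login"))    # first visit: goes out in plain text
```

The last line is the crux of the difference between the two modes: a host that relies on the header alone still sees that one plain-text request on first contact.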

For strategy 1. the target site is in the preload list. A well behaved web browser will never issue a single “http://” request to the target site. The problem of middling the connection has been successfully resolved.

For strategy 2. we are arguably no better off than with the server redirecting from “http://” to “https://”. A single plain-text request will be issued. If the attacker is middling at that point they can alter the response as desired to exploit users.

However, strategy 2. is likely to lead to fewer plain-text requests overall since the browser will not request via “http://” until after an expiry date. Relying on “redirect to secure” alone will result in a single plain-text request per visit the user makes to the site. This increases the number of opportunities to middle the victim’s connection.

Gap Analysis

The reason for writing this blog was because I had an interesting conversation with a customer. They enabled only TCP 443 (https://). They saw this as sufficient and did not want to enable HSTS as recommended in my report. I was challenged to show an exploit route that could work or they would not bother.

Fortunately the edge case I am about to explain has been public knowledge for a long time. So I didn’t have to think too hard to add it in. I am just adding my voice to bounce that beach ball up again for visibility.

Exploit Steps

The exploit route is like this:

  1. An attacker must be able to middle the victim’s traffic.
    • Chances are this is on the same network as the victim.
    • For this reason mass exploitation of users is unlikely and the risk is small as a result.
    • Let’s proceed with the steps assuming that this attacker is ABSOLUTELY DETERMINED to exploit this one person.
  2. An attacker crafts a link and sends it to the victim to click on.
    • That link is: http://target:443.
  3. The victim clicks on the link and their browser dutifully establishes a TCP connection to port 443. Because the browser sees a service it can talk to, it fires off a plain-text “http://” request.
  4. The server then rejects the connection because it is expecting “https://”. However, the damage has already been done. Our attacker got the single request they needed for exploitation to occur.
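You can see why the browser behaves this way by pulling the crafted link apart. The scheme and the port are independent parts of a URL, as this little Python snippet shows:

```python
from urllib.parse import urlsplit

# The crafted link from step 2 above.
link = urlsplit("http://target:443")

print(link.scheme)  # "http" -> the browser will speak plain-text HTTP...
print(link.port)    # 443    -> ...but to the TCP port normally used for HTTPS
```

Nothing in the URL forces HTTP-the-protocol onto port 80; the scheme only sets the default port, and an explicit :443 overrides it.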

The following screenshot shows the Wireshark capture when this example URL was requested:

URL: http://www.cornerpirate.com:443
DNS lookup and then HTTP request being captured

The only requirement for this to work is that the targeted TCP port is open. It is most likely that 443 is used but you can do the same thing with any open TCP port.

What is the solution?

The optimal solution is to enable HSTS via the preload method. Even if your website only has HTTPS enabled.

Adding a site to the preload list can be done here:

All other solutions leave a victim’s web browser issuing at least a single HTTP request.

Unfortunately it takes time for a site to be added to the preload list. Therefore at the same time you should also enable the “Strict-Transport-Security” header as described:
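For illustration, here is a small Python sketch that builds the header value. The defaults mirror commonly recommended settings for preload submission (a max-age of at least a year, includeSubDomains, and the preload token), but check the preload site itself for the current requirements:

```python
def hsts_header(max_age=63072000, include_subdomains=True, preload=True):
    """Build a Strict-Transport-Security header value.
    Defaults reflect values commonly recommended for preload submission."""
    value = f"max-age={max_age}"
    if include_subdomains:
        value += "; includeSubDomains"
    if preload:
        value += "; preload"
    return "Strict-Transport-Security: " + value

print(hsts_header())
# Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
```

Your web server or framework will have its own place to set this response header; the sketch just shows what should come out on the wire.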

That is the famous belt and braces manoeuvre to reduce the chances of the world seeing your butt.

And you should definitely do as I say and not as I do:

Hope that helps

Captain’s Log: November 2020

The Good

  • 10k a day steps challenge – I have managed this every day again. That-is-11-months. Almost an entire freaking year. If I get to Christmas Eve I will have actually done something I said I would do. Which, in this whole crazy wreck of a year, is something to be celebrated.
  • 150 active minutes a week challenge – I hurt my thigh as I started running again at the end of October and needed to rest it up for a week or so. Then I banged into November with bad news (see “The Bad”) regarding my health. After the thigh issue cleared up I had an excellent run of it (pun gleefully intended). Most weekday mornings I would be out jogging before work and I got both fitter and thinner as a result. I ordered an exercise bike which was said to be “next day delivery” but which I then heard nothing more about. I ordered before England locked back down, so I was expecting to get a bike :(. Update: it eventually arrived 3 weeks later, but I haven’t had time to build it.
  • Eating well challenge – (see the rabbit food ^^^) I don’t think I really ate too badly before but lockdown definitely accelerated the amount of crap I was eating. After the bad news (see “The Bad”) I went back to tracking calories. The act of having to scan barcodes and weigh spinach is so damn annoying that I definitely eat less as a result. The first bit of weight loss for me always goes really well. So into the baggy clothes and feeling good part of the process. Back to sorta where I was pre-lockdown when I started this series of posts. It is paying off.
  • Audiobooks – I moved back to feeding my brain with Sapiens. It started by reminding me about The Naked Ape, which I read a good 20 years ago. I am intrigued by the speculation around what happened 70k years ago when suddenly one of several species of tool-using humans came to dominance. The theory is that there was a cognitive revolution after which one species was capable of more complex language, allowing both gossip and shared fantasies like religions. This allowed evolution through co-operation instead of time-consuming genetics. The most fascinating point was that this is why we have anxiety about various things that logically do not make sense. Genetically speaking we are not apex predators, but it turns out we are purely because of cognitive abilities. We get anxiety about things that would kill us on the plains of Africa. We have obesity because sweet things are great for survival (high calorie content) but rare in nature. If a chimp finds a ripe fig tree they immediately gorge the whole supply. Exactly how we cannot stop ourselves with a box of chocolates. Looking forward to where it is going.
  • Testing – I have done several testing projects this month. I learned lots of things. I found lots of things. This is always brilliant.
  • PS5 – There was a whole awful saga where I could rant about how crap the vendor I ordered from was. But it eventually arrived the day after launch, more because of the luck that I had a postal redirect set up than the effort of the vendor. It remained in its box until the 29th and then it was an expensive massive brick while it downloaded update upon update upon update. I haven’t really played it. The Spiderman game seems good.

The Bad

  • I was tentatively diagnosed with a liver disease – This was found as a result of blood tests I had ordered after weeks of feeling extra shitty following the house move. The results said I had a fatty liver, meaning that I now need to actively lose weight and eat right for a real reason. We do not know the extent of the problem until I get an ultrasound and other tests done. But the chances are this is extremely early stage and if I lose weight the problem will reverse. That’s the hope. So I have thrown myself into that.

Highlights of the month

Football – Scotland qualified for Euro 2020 through a delightful playoff win against Serbia. I honestly was calm throughout. I had no doubt we were going to do it and didn’t even waver when Serbia scored in the last minute. I just felt it was going to happen.

To be clear I have supported Scotland for a long time now and I have never once felt like that before. I have been hopeful, but always sort of knew it would implode. Because we had done the penalties so well in the previous game I just expected us to do it again when we had to play extra time.

InfoSec Community – The lovely people over at Ladies of London Hacking Society asked me to do a workshop on CVE bug hunting. Despite being an absolute fraud with only one CVE to my name I took that on. It seemed like everyone had a good time – me included. It was recorded here. I start at 31 minutes and 05 seconds if you just want to see my face:

That’s all folks.