Git directory server vulnerability

Do you use git to manage your site and/or server files? In my opinion, this is undoubtedly a good way to run things, but you need to make sure it’s secure. Just try going to yoursite.com/.git/config. If you haven’t secured your server properly, you will see the configuration file for your git repository. Not good, huh? Not only does this reveal lots of information about your code base, including where the upstream server is, but an attacker can often reconstruct the entire source from an exposed .git directory. That would allow them to see exactly how the site works and exploit it far more easily.

Now, the good news. It’s an easy fix!

Here are the two snippets you need for Nginx and Apache respectively to secure your server.

# Nginx
location ~ /\.git {
    return 404;
}

# Apache
RedirectMatch 404 /\.git

Note that I use a 404 error instead of a 403 Forbidden so that an attacker is fooled into assuming the path genuinely doesn’t exist. A 403, for example, would tell them that it does exist but is off-limits, which may encourage them to keep trying to gain access.
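To check the fix, request the config file again and make sure you get a 404 back. Swap in your own domain – the response in the comment is what you’re aiming for, not captured output:

curl -I https://yoursite.com/.git/config
# Expect: HTTP/1.1 404 Not Found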

Gold DoE Expedition

I recently undertook the expedition phase of my Gold Duke of Edinburgh in a Canadian open canoe. The team and I paddled from just outside Thetford all the way down to Cambridge on the River Thet, the Little Ouse, the Great Ouse and finally, the Cam.

For the expedition, we needed to have an “aim”. This could be anything from photographing the team at checkpoints to measuring water pH levels. My team opted to photograph wildlife along the way and, because of this, I took along my Nikon Coolpix P610 because it featured GPS – something I thought would be useful when it came to showing where the photos were taken! The camera also had a “logging” mode which recorded its location, speed and altitude every n seconds for x hours. I set it to every 30s for 12 hours each morning before we left so that I could map our precise route.

When we got back from the trip, I copied over all the GPS data and photos and set to work on building my presentation. I opted to build a website as I thought it would be the easiest way of showing all the data, maps and photos. I wrote a couple of Node.js scripts to munch the data into a form I could easily display on the site; you can have a look at those at GitHub.com/developius/data-munching. One program extracts the EXIF information from the pictures and the other collates the GPS points that were logged by the camera.
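For a flavour of the EXIF side, here’s a minimal sketch of the idea – the exif-parser npm package and the photo-points.json output file are my assumptions here, so see the repo above for the real scripts:

// Pull GPS coordinates out of each photo's EXIF data (sketch).
const fs = require('fs');
const path = require('path');
const exif = require('exif-parser');

const photoDir = process.argv[2] || './photos'; // hypothetical input directory

const points = fs.readdirSync(photoDir)
  .filter(f => /\.jpe?g$/i.test(f))
  .map(f => {
    const tags = exif.create(fs.readFileSync(path.join(photoDir, f))).parse().tags;
    return { file: f, lat: tags.GPSLatitude, lon: tags.GPSLongitude, time: tags.DateTimeOriginal };
  })
  .filter(p => p.lat !== undefined && p.lon !== undefined);

// Write a JSON file the map page can fetch and plot as markers.
fs.writeFileSync('photo-points.json', JSON.stringify(points, null, 2));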

To glue everything together, I opted for a Leaflet.js and OSM combination using a custom map layer from Mapbox, alongside Chart.js for the interactive graph at the bottom of the page which shows altitude in metres and mean and maximum speed in kph. I think it worked quite well, although I now regret not taking more photos – there are noticeably large gaps in the imagery on the map. The nicest part of the whole trip was definitely going through Cambridge itself; avoiding punters and various other waterborne hazards was a lot of fun!
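If you’re curious what that glue looks like, here’s a rough sketch of the Leaflet side. The tile URL, style id, token, element id and file names are all placeholders rather than the site’s real values:

// Set up the map over the rough expedition area with a Mapbox tile layer.
const map = L.map('map').setView([52.3, 0.4], 10);

L.tileLayer('https://api.mapbox.com/styles/v1/{id}/tiles/{z}/{x}/{y}?access_token={accessToken}', {
  id: 'mapbox/outdoors-v11',        // placeholder style id
  accessToken: 'YOUR_MAPBOX_TOKEN', // placeholder token
  attribution: '© Mapbox © OpenStreetMap contributors'
}).addTo(map);

// Draw the camera's logged route as a polyline...
fetch('route.json') // hypothetical output of the munging scripts
  .then(res => res.json())
  .then(points => L.polyline(points.map(p => [p.lat, p.lon])).addTo(map));

// ...and drop a marker with a photo popup at each picture's location.
fetch('photo-points.json')
  .then(res => res.json())
  .then(photos => photos.forEach(p =>
    L.marker([p.lat, p.lon])
      .bindPopup('<img src="photos/' + p.file + '" width="200">')
      .addTo(map)));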

The full source code (sporadic comments – sorry) is available at github.com/developius/DoE and you can view the site in all its glory at finnian.io/DoE.

Drones, Zeros and Cake

We all love drones. We all love cake. And we all love Raspberry Pi. What better way to spend an afternoon than to head over to my mate Ben’s house, borrow his faster internet and combine all of those things?

We started the day by wrapping a Pi Zero and camera module in excessive amounts of electrical tape and sticking it to Ben’s 250-class racing drone. Sounds cool, huh? Not only did it work amazingly well, it didn’t impact the performance of the quad at all. Pretty good for a 1 GHz fully featured Linux box!

[Image: Suitably padded Pi Zero – safe and sound before lift off]
As you can see from the image above, the camera module was orientated to give a bird’s-eye view from the drone’s perspective. The two ‘ears’ sticking up are the drone’s own antennae and the red one at the back is for the FPV feed.

The Pi was set up to record from the camera on boot, which caused some interesting problems when we tried to debug the software, because the preview from raspivid (which we’d forgotten to disable) was blocking us from seeing the GUI! Iteration time went from under 10 seconds to at least 10 minutes per cycle.
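For reference, the boot-time recording boiled down to something like this – a sketch, with the path and filename being assumptions on my part. The -n flag is the preview-disabling one we forgot:

# In /etc/rc.local, before 'exit 0': record indefinitely from boot.
raspivid -n -t 0 -o /home/pi/flight-$(date +%s).h264 &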

Before setting out for our first flight, we deemed it sensible to fill ourselves with confidence by consuming large amounts of cake.

[Image: If life gives you cake, eat it.]
Satisfied with our cake-eating efforts, we went outside for the maiden-flight-with-Pi. Everything went perfectly and we grabbed some great footage from the onboard Pi camera!

Ben’s FPV goggles have an extra output connection for an analogue display, so we hooked that up to a little TFT screen of mine so that spectators could watch the live stream from the drone.

[Image: It turns out that having an external TFT display strapped to your head has some unforeseen consequences – namely the inability to move, let alone walk in a straight line.]
We created a video compilation of the day’s events, featuring drones and cake, which you can view below!

Many thanks to Ben for putting up with me all weekend and allowing me to precariously strap awesome stuff to his drone. Stay tuned on his blog for more drone-related madness!

RStudio Server

My father, Ben Anderson, plays with numbers. As his Twitter bio says: “big data, small data, open data, any data”. He works with R a lot and has been persuading me to take a look at it. I’d held off until now because I’m all for analysing data in real time (primarily using delightful JS libraries such as Chart.js and D3.js). As far as I understood it, R is geared towards static data analysis and, because of that, can make full use of the hardware it runs on to optimise computations. Dad has an SSD in his Mac, which reduces the time to load data substantially, but he also makes use of the R package data.table. This library makes manipulating data ridiculously fast because it holds everything in RAM.

Because I like using real-time data, R wasn’t really something I thought I could make much use of. I was fine with creating beautiful little animated doughnut and bar charts with Chart.js and D3.js. However, these libraries are designed to display data, not to process it. R, on the other hand, is absurdly powerful in what it can do. Really, I wanted a way to use R to analyse data coming in from a Node.js application and then use Chart.js or D3 to visualise the results, because those libraries can handle updates to the data sets nicely without having to re-run R code just to get new output.
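One way to wire that up – and I stress this is just a sketch of the idea, with a hypothetical analyse.R script that reads JSON on stdin and writes JSON results to stdout – would be to shell out to Rscript from Node:

const { execFile } = require('child_process');

// Send a batch of points to R and get the analysis back as JSON.
function analyseWithR(points, callback) {
  const child = execFile('Rscript', ['analyse.R'], (err, stdout) => {
    if (err) return callback(err);
    callback(null, JSON.parse(stdout)); // ready for Chart.js/D3 to draw
  });
  child.stdin.write(JSON.stringify(points));
  child.stdin.end();
}

// Re-run on each new batch of real-time data, then push the results
// to the browser (e.g. over a WebSocket) for the charts to redraw.
analyseWithR([{ t: 1, value: 42 }], (err, result) => {
  if (err) throw err;
  console.log(result);
});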

For the last few months, Dad’s laptop has had a problem with its keyboard and mouse. In the end, we took it to the Apple Store and it’s currently away getting some TLC. In the mean (pun intended) time, Dad has been battling with a rather old Mac Mini which refuses to install half of the R packages. The temporary solution was to steal my mother’s computer! I saw what was going on and suggested he install R on his AWS server. We soon discovered that running R in a remote shell is not one of the cleanest and slickest experiences you’ll ever have, but it did the job.

The problem was that Dad likes to store all his code on GitHub (woo!), and committing from AWS just to see the output of the code wasn’t really a sensible way to work. Then we discovered RStudio Server. We both use RStudio already, so migrating to a server version didn’t sound like it would be too complicated. Once installed (it took less than half an hour), we were blown away by it. Even though it’s the open source version, it’s truly fantastic. Because it runs on Amazon’s infrastructure, installing packages and anything web-related is almost instantaneous, and the web interface is a complete clone of RStudio desktop, so there was nothing new to learn. It even comes with support for git out of the box! All our problems simply melted away.

It authenticates against the system’s own user accounts, which lets it securely log users in and give them access to their home directories and files. Possibly the best feature is that you can upload files to your account directly from the browser. No mucking around with FTP or SFTP – it’s as easy as pie to get your data where you want it.

I set up the server itself to run Nginx so that it can handle initial requests and then use a reverse proxy to forward all relevant traffic to RStudio Server. The reason for running Nginx in front of everything is that it’s then possible to set up other things on the server without impacting RStudio. An example is the redirect from dataknut.io/blog to Dad’s blog. Nginx is also ultra-lightweight and uses hardly any resources, so it’s the perfect choice for a site with low levels of traffic.
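The proxying itself is only a few lines of Nginx config – something along these lines. The port is RStudio Server’s default (8787), but the hostname and redirect target here are my guesses at the setup rather than the exact config:

# Nginx
server {
    listen 80;
    server_name dataknut.io;

    # Hand everything to RStudio Server on its default port
    location / {
        proxy_pass http://localhost:8787;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    # Other things can live alongside without touching RStudio,
    # e.g. the /blog redirect mentioned above (target assumed)
    location /blog {
        return 301 https://dataknut.wordpress.com/;
    }
}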

I’m still unconvinced about R’s ability to process real time data, as that’s not really what it’s designed for. If you have any suggestions about doing that, we’d both love to hear from you!

What’s next? I think I might build Dad a home page…

Lads’ day out!

Cyber Centurion competition at Bletchley Park

Today, the guys at SubjectRefresh and I competed in the Cyber Centurion Security Challenge at The National Museum of Computing at Bletchley Park.

The day started with an introduction by the organisers and a brief explanation of how the day would work. Then it was off to the marquee to get started securing the machines we were provided with. There were two Windows VMs (Server 2008 and 8.1) and one Ubuntu 14.04 image. The team delegated four people to work on the machines in the first part of the day and swapped two out at lunchtime.

By the end, we’d managed to find 66% of the vulnerabilities on the Ubuntu image and about 80% on each of the Windows VMs. This put us on par with most of the other teams, with the top five or so all having really close overall scores.

Our team bio is available at https://cybersecuritychallenge.org.uk/competitors/cybercenturion/ and you can find out more about the Finals at https://cybersecuritychallenge.org.uk/competition-final-at-bletchley-park/.

It was a fantastic day and we all had a lot of fun. To top it all off, we were featured on the ITV Anglia news this evening! http://www.itv.com/news/anglia/update/2016-04-26/competition-aims-to-find-next-generation-of-cyber-defence-experts/