Moving to Jekyll

I’ve been meaning to move away from Movable Type for a while; they no longer provide the “Open Source” variant, I’ve had some issues with the commenting side of things (more the fault of spammers than Movable Type itself) and there are a few minor niggles I wanted to resolve. Nothing has been particularly pressing about the move and I haven’t been blogging as much, so while I’ve been keeping an eye open for a replacement I haven’t put a lot of energy into the process. I have a little bit of time at present, so I asked around on IRC for suggestions. One was ikiwiki, which I use as part of helping maintain the SPI website (and think is fantastic for that); the other was Jekyll. Both are available as part of Debian Jessie.

Jekyll looked a bit fancier out of the box (I’m no web designer, so pre-canned themes help me a lot), so I decided to spend some time investigating it a bit more. I’d found a Movable Type to ikiwiki converter which provided a starting point for exporting from the SQLite3 DB I was using for MT. Most of my posts are in Markdown and the rest (mostly from my Blosxom days) are plain HTML, so there wasn’t any need to do any conversion on the actual content. A minor amount of poking convinced Jekyll to use the same URL format (permalink: /:year/:month/:title.html in the _config.yml did what I wanted) and I had to do a few bits of fix-up for some images that had been uploaded into MT, but overall it was fairly simple stuff.
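For reference, the relevant bit of _config.yml ends up looking something like this (only the permalink line comes from the setup described above; the Markdown engine line is purely illustrative):

# permalink line as described above; the engine choice is illustrative
markdown: kramdown
permalink: /:year/:month/:title.html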

Next I had to think about comments. My initial thought was to just ignore them for the moment; they weren’t really working that well on the MT install, so it’s not a huge loss. I then decided I should at least see what the options were. Google+ has the ability to embed in your site, so I had a play with that. It worked well enough, but I didn’t really want to force commenters into the Google ecosystem. Next up was Disqus, which I’ve seen used in various places. It seems to allow logins via various third parties, copes with threading and deals with the despamming. It was easy enough to integrate and play with, and while I was doing so I discovered that it could cope with importing comments. So I tweaked my conversion script to generate a WXR-based file of the comments. This then imported easily into Disqus (and I double-checked that the export system worked too).
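For anyone curious what that involves: WXR is the WordPress eXtended RSS format, essentially an RSS feed with the comments attached to each item in a WordPress namespace. A minimal sketch of generating one in Python follows; the data structures are invented for illustration and the exact element names Disqus expects should be checked against its import documentation:

import xml.etree.ElementTree as ET

WP = "http://wordpress.org/export/1.0/"
CONTENT = "http://purl.org/rss/1.0/modules/content/"
ET.register_namespace("wp", WP)
ET.register_namespace("content", CONTENT)

def comments_to_wxr(posts):
    # posts: a list of dicts with "title", "url", "body" and a "comments"
    # list of dicts with "author", "date" and "body" (hypothetical shape).
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["url"]
        ET.SubElement(item, "{%s}encoded" % CONTENT).text = post["body"]
        for c in post["comments"]:
            comment = ET.SubElement(item, "{%s}comment" % WP)
            ET.SubElement(comment, "{%s}comment_author" % WP).text = c["author"]
            ET.SubElement(comment, "{%s}comment_date" % WP).text = c["date"]
            ET.SubElement(comment, "{%s}comment_content" % WP).text = c["body"]
    return ET.tostring(rss, encoding="unicode")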

I’m sure the use of a third party to handle comments will put some people off, but given the ability to export I’m confident that if I really feel like dealing with despamming comments again at some point I can switch to something locally hosted. I do wish it didn’t require JavaScript, but again it’s a trade-off I’m willing to make at present.

Anyway. Thanks to Tollef for the pointer (and others who made various suggestions). Hopefully I haven’t broken (or produced a slew of “new” posts for) any of the feed readers pointed at my site (but you should update to use feed.xml rather than any of the others - I may remove them in the future once I see usage has died down).

(On the off chance it’s useful to someone else, the conversion script I ended up with is available. There’s a built-in Jekyll importer that may be a better move, but I liked ending up with a git repository containing a commit for each post.)
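The per-post-commit trick is simple enough to sketch; a hypothetical helper (not the actual script) that commits each converted post with its original publication date might look like:

import os
import subprocess

def commit_post(path, title, date):
    # date in a format git accepts, e.g. "2005-03-01 12:00:00 +0000";
    # setting both dates makes the history reflect the original posts.
    env = dict(os.environ, GIT_AUTHOR_DATE=date, GIT_COMMITTER_DATE=date)
    subprocess.run(["git", "add", path], check=True)
    subprocess.run(["git", "commit", "-m", "Add post: " + title],
                   check=True, env=env)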

Tracking a ship around the world

I moved back from the California Bay Area to Belfast a while back, and for various reasons it looks like I’m going to be here a while, so it made sense to have my belongings shipped over here. They haven’t quite arrived yet, and I’ll do another post about that process once they have, but I’ve been doing various tweets prefixed with “[shipping]” during the process. Various people I’ve spoken to (some of whom should know me better) thought this was happening manually. It wasn’t. If you care about how it was done, read on.

I’d been given details of the ship carrying my container, and searching for that turned up the excellent MarineTraffic, which let me see the current location of the ship. It turns out ships broadcast their location using AIS, and anyone with a receiver can see the info. Very cool, and I spent some time having a look at various bits of shipping around the UK out of interest. I also found the ship’s itinerary, which gave me some idea of where it would be calling and when. Step one was to start recording this data; it was time-sensitive and I wanted to be able to see historical data. I took the easy route and set up a cron job to poll the location and itinerary on an hourly basis and store the results. That meant I had the data over time, if my parsing turned out to miss something I could easily correct it, and I wasn’t hammering Marine Traffic while writing the parsing code.
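The polling side needs almost no code; something along these lines (with a placeholder URL and paths, not the real ones) run hourly from cron would do it:

import time
import urllib.request

# Run from cron with something like:
#   0 * * * * /usr/bin/python3 /home/user/bin/poll_ship.py
URL = "https://www.marinetraffic.com/..."  # placeholder for the ship's page

def poll():
    html = urllib.request.urlopen(URL).read()
    stamp = time.strftime("%Y%m%d%H%M")
    # Keep every raw page around; re-parsing later is then always possible.
    with open("/home/user/shipdata/status-%s.html" % stamp, "wb") as f:
        f.write(html)

if __name__ == "__main__":
    poll()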

Next I wanted to parse the results, store them in a more manageable format than the HTML, and be alerted when the ship docked somewhere or set off again. I’ve been trying to learn more Python rather than doing my default of turning to Perl for these things, and this seemed like a simple enough project to try it out on. Beautiful Soup turned up top in searches for HTML parsing in Python, so that formed the basis. Throwing the info into a database so I could do queries felt like the right move, so I used SQLite - if this had been more than a one-off I’d have considered looking at PostgreSQL and its GIS support. Finally, Tweepy made it very easy to tweet from Python in about 4 lines of code. The whole thing weighed in at only 175 lines of code, mostly around pulling the info out of the HTML, with a little to deal with checking for state changes against the current status and the last piece of info in the database.
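The shape of it was roughly as follows - a hedged sketch rather than the actual script, with the page selector, table columns and credentials all invented for illustration:

import sqlite3
import tweepy
from bs4 import BeautifulSoup

def current_status(html):
    # The real lookup depends on the page layout at the time; this id
    # is hypothetical.
    soup = BeautifulSoup(html, "html.parser")
    return soup.find(id="shipStatus").get_text(strip=True)

def tweet_if_changed(new_status, db="ship.db"):
    conn = sqlite3.connect(db)
    row = conn.execute(
        "SELECT status FROM status ORDER BY ts DESC LIMIT 1").fetchone()
    if row is None or row[0] != new_status:
        # The roughly-four-lines of Tweepy mentioned above:
        auth = tweepy.OAuthHandler("consumer-key", "consumer-secret")
        auth.set_access_token("access-token", "access-secret")
        api = tweepy.API(auth)
        api.update_status("[shipping] " + new_status)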

The pieces of information I chose to store were the time of the update (i.e. when the ship sent it, not when my script ran), reported area, reported state, the position + course, reported origin, reported destination and ETA. The fact this is all in a database makes it very easy to do a few queries on the data.
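A schema matching that description might look like the following (a hypothetical reconstruction; the actual column names and types in my script may differ):

CREATE TABLE status (
    ts          TEXT,  -- update time as reported by the ship
    area        TEXT,
    status      TEXT,
    lat         REAL,
    lon         REAL,
    course      REAL,
    speed       REAL,  -- knots
    origin      TEXT,
    destination TEXT,
    eta         TEXT
);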

How fast did the ship go?

sqlite> SELECT MAX(speed) FROM status;
MAX(speed)
21.9

What areas did it report?

sqlite> SELECT area FROM status GROUP BY area;
area
-
Atlantic North
California
Caribbean Sea
Celtic Sea
English Channel
Hudson River
Pacific North
Panama Canal

What statuses did we see?

sqlite> SELECT status FROM status GROUP BY status;
status
At Anchor
Moored
Stopped
Underway
Underway using Engine

Finally, having hourly data let me draw a map of where the ship went. The data isn’t complete, because the free AIS info depends on the ship being close enough to a receiving station. That means there were a few days in the North Atlantic without updates, for example. However, there’s enough to give a good idea of just how well-traveled my belongings are, and it gave me an excuse to play with OpenLayers.
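One simple way to get the stored positions onto such a map is to dump them from SQLite as GeoJSON, which OpenLayers can load; a sketch, assuming the hypothetical schema above:

import json
import sqlite3

def track_geojson(db="ship.db"):
    rows = sqlite3.connect(db).execute(
        "SELECT lon, lat FROM status ORDER BY ts").fetchall()
    return json.dumps({
        "type": "Feature",
        "properties": {"name": "ship track"},
        "geometry": {
            "type": "LineString",
            # GeoJSON wants [longitude, latitude] pairs.
            "coordinates": [[lon, lat] for lon, lat in rows],
        },
    })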

(Apologies if the zoom buttons aren’t working for you here; I think I need to kick the CSS in some manner I haven’t quite figured out yet.)

Cup!

I got a belated Christmas present today. Thanks Jo + Simon!

[Image: cup.jpg]

Automatic inline signing for mutt with RT

I spend a surprising amount of my time as part of keyring-maint telling people their requests are badly formed and asking them to fix them up so I can actually process them. The one that’s hardest to fault anyone on is that we require requests to be inline PGP signed (i.e. the same sort of output as you get with “gpg --clearsign”). That’s because RT does various pieces of unpacking[0] of MIME messages that mean PGP/MIME signatures that have passed through it are no longer verifiable. Daniel has pointed out that inline PGP is a bad idea and got as far as filing a request that RT handle PGP/MIME correctly (you need a login for that, but there’s a generic read-only one that’s easy to figure out), but until that happens the requirement stands when dealing with Debian’s RT instance. So today I finally added the following lines to my .muttrc rather than having to remember to switch Mutt to inline signing for this one special case:

send-hook . "unset pgp_autoinline; unset pgp_autosign"
send-hook rt.debian.org "set pgp_autosign; set pgp_autoinline"

i.e. by default turn off auto inlined PGP signatures, but when emailing anything at rt.debian.org turn them on.

(Most of the other things I tell people to fix are covered by the replacing keys page; I advise anyone requesting a key replacement to read that page. There’s even a helpful example request template at the bottom.)

[0] RT sticks a header on the plain text portion of the mail, rather than adding a new plain text part for the header if there are multiple parts (this is something Mailman handles better). It will also re-encode received mail into UTF-8, which I can understand, but Mutt by default tries to find an 8-bit encoding that can handle the mail (because that’s more efficient), which tends to mean it picks latin1.

Update: Apparently Mutt in Jessie and beyond doesn’t have the pgp_autosign option; you want crypt_autosign instead (and maybe crypt_autopgp but that defaults to yes so unless you’ve configured your setup to do S/MIME by default you should be fine). Thanks to Luca Capello for pointing this out.
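So on Jessie and later the hooks above would presumably become something like (untested here, based on Luca’s note):

send-hook . "unset pgp_autoinline; unset crypt_autosign"
send-hook rt.debian.org "set crypt_autosign; set pgp_autoinline"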

Back from DebConf 14

I previously forgot to mention that I was planning to attend DebConf14, having missed DebConf13. This year the conference was held in Portland, OR. This is a city I’ve been to many times before, and enjoy, but I hadn’t spent any time wandering around its city centre as a pedestrian. I have to say I really prefer DebConfs that are held in the middle of a city. It always seems a bit of a shame to travel some distance to somewhere new and spend all the time there in a conference venue. Plus these days I have the added lure of going out and playing Ingress in a new location. DebConf14 didn’t disappoint in these respects; the location was super easy to get to from the airport via public transportation, all of the evening social events were within reasonable walking distance (I’ll tend to default to walking when possible) and the talk venue/accommodation were close to each other and various eating + drinking options. Throw in the fact that Portland managed to produce some excellent weather (modulo my Ingress session on the last Saturday morning, when it rained on me) and it’s impossible to fault the physicalities of DebConf this year.

This year the conference format was a bit different; previous years have had a week-long DebCamp before the week of the conference itself. This year went for a 9 day talk schedule (Saturday -> Sunday) with various gaps of hacking time interspersed. I’ve found it hard to justify a full two weeks away in the past, so this setup worked a lot better from my viewpoint. Also I rarely go to DebConf with a predetermined list of things to do; the stuff I work on naturally falls out of talks I attend and informal discussions I have. Having hack time throughout the conference helped me avoid feeling I was having to trade off hacking vs talks.

Naturally enough a lot of my involvement at DebConf was around OpenPGP. Gunnar and I spent a fair bit of time getting Daniel up to speed with the keyring-maint team (Gunnar more than I, I’ll confess). We finally set a hard timeframe for freeing Debian of older 1024 bit keys. I was introduced to the Gnuk, which is a particularly interesting piece of open specification hardware with a completely Free software stack on top of it that implements the OpenPGP smartcard spec. Currently it’s limited to 2K keys, but it’s hoped that 4K support can be added (and I ended up spending a couple of hours after the closing talk hacking on the source and seeing how much needs to change for 4K support, aided by the very patient Niibe). These are the sort of things that really benefit from the face time that DebConf offers to the Debian project. I’ve said it before, but I think it’s worth saying again: Debian is a bit like a huge telecommuting organization, and it’s my opinion that any such organization should try and ensure its members actually spend some time together on a regular basis. It improves the ability to work remotely a hell of a lot if you can actually put a face to the entity you’re emailing / IRCing, and have some sort of idea where they’re coming from because you’ve spent some time with them, whether that’s in talks or over dinner or just casual hallway chats.

For once I also found myself considering alternative employment while at DebConf and it was incredibly useful to be able to have various conversations with both old friends and people who were there with an eye on recruitment. Thanks to all those whose ears I bent about the subject (and more on the outcome in a future post). Thank you also to the many people involved with the organization of DebConf; I’ve been on the periphery a few times over the years and it’s given me a glimpse into the amount of hard work all of the volunteers (be they global team, local organizing team, video team or just random volunteers) put into making DebConf one of my must-attend yearly conferences. If you’re at all involved in Debian and haven’t attended I strongly urge you to do so - I’ll see you all next year at DebConf15 in Heidelberg!
