Sat, 19 Jan 2008
I've started writing yet another blog, also short on content, but with more than this one. It's over at Bitclean and will feature music from We7 that sticks in my mind. It'll probably come in little spurts each time I find a batch of "interesting" content.
Last updated: 12:01, 19 Jan 2008
Fri, 03 Nov 2006
Could you spare two minutes to locate your house/workplace on an online map, and type in your postcode?
The downside is that you have to find your house on old maps. The upside is that you get to look at old maps.
Why? To help build a free-to-use database that links Postcode to Latitude/Longitude - something desperately needed in the UK.
Over the last month I've been working on a website to collect the locations of postcodes. Update: It is here
The idea is to use out-of-copyright OS maps, and to get people to locate their home on the map, and specify the postcode. Get enough people to do it (perhaps for their workplace too), and you will have a not-perfect-but-good-enough-for-most-purposes database that anybody will be free to use.
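The averaging idea behind such a crowd-sourced lookup can be sketched in a few lines of Python (all names here are invented for illustration, not the site's actual code): collect every point submitted for a postcode and average them, so errors by individual contributors wash out as more people take part.

```python
from collections import defaultdict

# postcode -> list of (lat, lon) points clicked by visitors
submissions = defaultdict(list)

def normalise(postcode):
    # "ox1 3qd" and "OX1 3QD" should map to the same key
    return postcode.upper().replace(" ", "")

def submit(postcode, lat, lon):
    submissions[normalise(postcode)].append((lat, lon))

def estimate(postcode):
    """Average all submissions for a postcode; more points, better estimate."""
    points = submissions[normalise(postcode)]
    lat = sum(p[0] for p in points) / len(points)
    lon = sum(p[1] for p in points) / len(points)
    return lat, lon
```

Not perfect, but with enough submissions per postcode the mean should be good enough for most purposes, which is exactly the bar being aimed at.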
Last updated: 13:07, 03 Nov 2006
Thu, 31 Aug 2006
The Art of SQL
A year ago I didn't know much about SQL. I knew that the keywords were INSERT, UPDATE, SELECT and DELETE. I knew that indexes made things faster, and I had been taught the theory behind what makes something 1st, 2nd or 3rd normal form, but that was it.
I saw that some of our sites were slow, and that this was down to bad SQL design and a lack of indexes. I soon learned enough SQL that, combined with a bit of a guess at how I would execute the queries if I were a computer, I was able to make them faster. Since then I have become more and more interested in making queries as efficient as possible. Sadly this can be very important yet difficult to test early in an application's life: with little data in the database, slow approaches taken by the query analyser are not noticeable.
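The index-or-scan difference described above is easy to reproduce. Here is a small sketch using SQLite from Python (the table and column names are made up): until an index exists on the filtered column, the planner has no choice but a full table scan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
cur.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                [("cust%d" % (i % 1000), i * 0.5) for i in range(50000)])

query = "SELECT * FROM orders WHERE customer = 'cust42'"

# Without an index the planner has to look at every row.
before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(before)   # the plan typically reports a SCAN of orders

cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# With the index it can jump straight to the matching rows.
after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(after)    # the plan typically reports SEARCH ... USING INDEX
```

With 50,000 rows either version returns quickly, which is exactly the trap: the scan only starts to hurt once the table has grown, long after development finished.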
Recently I have been reading 'The Art of SQL' by Stéphane Faroult. This is a treatment of database optimisation from the theory side, just the way that I have been taught about everything else. It doesn't really tell you what to do, but it explains why it all happens the way it does, and suggests ways of approaching the common problems. I have found it easy to read and not at all a chore. It isn't really a reference book, more one that you read from cover to cover, though you may wish to return to it from time to time when you encounter a problem that it discusses.
Last updated: 14:20, 31 Aug 2006
Tue, 21 Feb 2006
Thanks to some scripts from Dom, a couple of hours feeding CDs into my drives and a few hours playing with the Amazon web-services API, I now have a rather pretty summary of my music collection.
If you have any suggestions about how to improve it, let me know. And if you buy any from the Amazon links I'll get a few pence to put towards growing the collection.
Currently I'm thinking of some way of showing why there are duplicate images for the two CDs in a boxed set. Not sure what to do there without making the scripts more complicated and in need of more manual intervention.
Oh, and then I will try something similar for my DVD collection, but that will require more manual work, as there isn't really a CDDB for DVDs.
Last updated: 23:50, 21 Feb 2006
Fri, 17 Feb 2006
On my way to work today I was listening to episode one of The Mary Whitehouse Experience radio show. It is great to find old BBC Radio 4 comedy online as they are just the right length for my commute, and lovely to listen to. The Mark Steel Lectures went down well too. I think Jeremy Hardy Speaks to The Nation would be a good one next if I can find it.
Anyway, whilst listening to this "topical" satire from 1989, I realised quite how similar it was to now. They have the same Nick Ross impression that is still used on Dead Ringers (I wonder if it is done by the same guy), but there were also jokes about airline security (Lockerbie) and Muslim extremists (Rushdie).
I also hadn't realised that Mark Thomas and Jo Brand were involved in the old shows. It is a great bit of nostalgia, and fills a gap where the Now Show isn't there.
Last updated: 09:49, 17 Feb 2006
Sun, 15 Jan 2006
After creating a long list (which may feature here soon) of films that I wanted to watch but did not get around to seeing in the cinema, or that I didn't think were worth it at the time, I decided to sign up for one of those online DVD rental programmes that have sprung up over the last couple of years.
Dom had signed up for LoveFilm before. They had a programme where you paid £14.99 a month for 3 DVDs at a time, with a new one whenever you sent one back. This is probably a good system if you are a student or a film buff, but we didn't watch the films at a particularly great rate. Some of the films he got were a bit on the weird side, like Delicatessen, and you really have to be in the right mood to watch that. This led to them just sitting there, taking up one of the limited slots. I think we averaged watching just over one DVD a week.
Amazon's cheapest offering limits you to one DVD at a time and three rentals a month. At this level you don't feel obliged to watch DVDs when not in the mood. Having only one means that you don't get a lot of choice about what to watch, but it also forces you to watch some of the less frivolous DVDs (assuming that you put one in your basket in a moment of self-improvement). £5.99/month for up to three rentals (and three a month is an easy amount to watch without feeling pressured) means that the rentals work out at £2 each. This is less than Blockbuster charges for most DVDs, but you do have to plan in advance what you want, rather than picking a movie based on your mood. Update: They also give you 10% off the DVDs you buy whilst a member, so buying two seasons of Babylon 5 effectively paid for my first month's subscription.
Anyway, so far it is working, and I have quite a list yet to watch. Will I get bored of the rentals before or after I run out of films in the list? I'll try to remember to let you know (does anyone apart from Mike read this?).
Last updated: 22:46, 15 Jan 2006
There is something screwy in ColdFusion's locking. If you make a .cfm page containing the code below, you would expect that executing it would never take more than a shade over 15 seconds, as it can wait up to one second to acquire the lock, and then spends 14 seconds sleeping.
It appears that in some cases this is what happens: for example, if you request the page from two wget sessions. It will also show you the error message if you go to the URL and then (whilst waiting) hit refresh. Unfortunately, if you go to the URL in two different Firefox windows, the second window appears to wait the whole time for the lock to be released (so up to 28 seconds in total). I can't see why this would happen. Could it be something session related?
<cftry>
    <cflock name="testLock" timeout="1" throwOnTimeout="yes">
        Got lock, sleeping...<cfflush>
        <cfset thread = CreateObject("java", "java.lang.Thread")>
        <cfset thread.sleep(14000)>
        finished.
    </cflock>
    <cfcatch type="Lock">
        Another process already has the lock.
    </cfcatch>
</cftry>
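The behaviour I expect from that snippet can be sketched outside ColdFusion. This Python analogue (not how the CF engine actually works internally, just an illustration of the expected contract) shows a second thread giving up after its one-second timeout rather than waiting out the full sleep:

```python
import threading
import time

lock = threading.Lock()
results = {}

def worker(name, hold_seconds):
    # Mirror cflock timeout="1": wait at most one second for the lock.
    if not lock.acquire(timeout=1):
        results[name] = "timed out"   # analogue of the <cfcatch type="Lock"> branch
        return
    try:
        time.sleep(hold_seconds)      # stand-in for thread.sleep(14000)
        results[name] = "did work"
    finally:
        lock.release()

first = threading.Thread(target=worker, args=("first", 2))
second = threading.Thread(target=worker, args=("second", 2))
first.start()
time.sleep(0.2)          # make sure "first" already holds the lock
second.start()
first.join()
second.join()
print(results)           # second gives up after about a second
```

That is the wget behaviour; the Firefox two-window case, where the second request blocks for the whole hold, is precisely what this model says should not happen.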
Last updated: 16:51, 15 Jan 2006
Fri, 22 Jul 2005
It has been said recently that Jamie Oliver has done a lot for cookery. I'm not so sure. He might have done a good job with school dinners (though it will be a while before we see major changes here), but I am not sure that he has done a lot to make men cook.
Jamie's style of cooking is very much show cooking. It involves the more expensive ingredients, stuffed together with much ceremony and the chance of personal expression in the way you just rip up the coriander. There is also a heavy emphasis on cool gadgets and big shiny pans.
This is not cooking for every day, but for special occasions (and often the food isn't sufficiently impressive even for that; it is food for having your mates round, when pizza has got boring and you have loads of time).
The question is, does this sort of cooking by blokes really help much? It makes everything dirty but there isn't a way of show-washing-up. It isn't economical to use regularly, so only gets brought out for special occasions. It also raises expectations for the rest of the time.
Generally, then: though Jamie Oliver has encouraged men to cook, it isn't sustainable, everyday cooking, so there is not as much of a change as we might think to begin with.
This is based on discussions in the pub, so may cover ideas found elsewhere. I do not guarantee that all my thoughts are my own.
Last updated: 14:26, 22 Jul 2005
Sun, 29 May 2005
I have moved my website from Compsoc to Earth.li. This should leave more flexibility in the future, and will reduce my reliance on Compsoc, who appear to be low on sysadmins. If you have any bookmarks to the Compsoc pages, please update them to the new address.
Last updated: 12:24, 29 May 2005
Sat, 12 Mar 2005
I have had a problem where Flash wasn't showing any text on menus, or in the Flash settings panel.
I am running Debian, and it appears that the solution is to install gsfonts-x11, which provides the fonts that Flash uses.
Last updated: 09:24, 12 Mar 2005
Thu, 30 Dec 2004
There is a lot of movement from certain parts of the internet population towards publishing the details of their lives. This may include thoughts, photographs, todo lists, calendars, that sort of thing.
The majority of it is for the benefit of the publisher and a couple of friends. Some grow so large that thousands of people read their thoughts, and look at their holiday photographs.
There is some very good on-line software to organise this sort of thing for you, and most of it can be locked down so that only yourself, or selected friends can see the content if you so desire. Unfortunately it doesn't seem ideal to be uploading all your photos to a site on the internet, just so that you can organise them for yourself. It seems kinda wasteful. What you need is something that you can run locally, to organise your life, and optionally publish parts of it to the big wide interweb, possibly limited to sets of people.
The major pieces are there. Good blogging software can be run locally. Passable photo galleries exist in their millions. I'm not sure why I think it just isn't complete. I want to have Furl, Flickr, Blogger, Bloglines, Gmane and probably more all running on my machine where I have control of my data, and I want the ability to authorise people from some central authority to view the published content, so that they don't have to create yet another username and password, oh, and I want the moon on a stick. Maybe when I have all this I will be happy, or maybe it will just make me want more.
All this blog software, and integrating it, and adding RSS/Atom feeds of everything, looks like good fun. If only I had some content. This is part of the driver towards me having a separate, dynamic blog which I can do this with. The only thing is that, although I don't want to write my own system to blog with, I do want to have the fun of fiddling. I think I will try Blojsom, as it is a Java version (loosely based, I think) of Blosxom, which I am using at the moment for this site. This should give me some scope for adding extra features.
Last updated: 23:41, 30 Dec 2004
Wed, 29 Dec 2004
I think that my current idea of having a single site, that contains a blog as well as all the normal content, is not as good as I expected. It is nice to have everything generated from a common source, and that being nice, simple text files with very little markup.
Unfortunately the static nature of the website is just getting a little boring. I think that I will have to have a separate, more dynamic life-sharing site. This can be a place where I can play with cool toys, as well as write some of the less serious comments.
Not sure when this will happen, but I will see what I can do about having links between the sites, and keeping this one up to date about as often as I do now.
Last updated: 23:46, 29 Dec 2004
Sat, 20 Nov 2004
At the request of Mike and Robin in the pub, I have added RSS feeds. You can get a feed for each directory by just using index.rss.
It also appears that you can get a feed for each page (by using .rss rather than .html), but given that single pages only have a single item on, I am not sure how useful that is.
This will probably lead to a flurry of posts in the next few days, followed by nothing for months again, but at least now you will not have to check back for the updates, as your RSS aggregator will know all.
Last updated: 09:43, 20 Nov 2004
Tue, 02 Mar 2004
It appears that the police have noticed what everyone has been telling them for years. There are no new crimes for the Internet. They are just updated versions of old crimes like fraud and theft.
I'm not convinced about the comparison between DoS attacks and protection rackets. Though they can be used like that, at least some DoS attacks are more like spraying "Wanker" or "Bob waz Here" across the windows of a shop.
Now, considering that the head of the National Hi-Tech Crime Unit has said that there are no new crimes for the Internet, you start to wonder why they need to pass so many laws aimed directly at the Internet (though some of them have effects that leak into other means of communication).
This question is at least addressed at the bottom of the article, where it wonders whether or not DoS attacks are actually illegal, and mentions that officers sometimes have to bend the law to squeeze a perceived offence into something that will be prosecutable. One wonders why they don't catch the perpetrator under the old law, if they are just committing an old crime with new technology. The answer seems to be that the new laws relating to the Internet carry significantly stiffer sentences than the old crimes they replace. Some even move the offence from the civil into the criminal, like the new regulations on copyright that will move copying a tape for your mate into the same box as kidnapping people to work in a sweatshop producing copied CDs.
Last updated: 23:36, 02 Mar 2004
This is one of a series of articles on on-line fraud. They tell stories of supposedly clever businessmen falling for one of the many scams that happen over the Internet. Now, when these were rare and came by fax, I can see that some people might be caught. However, I receive up to 25 of these scams a day. I cannot believe that there are really that many bundles of 12.5 million DOLLARS needing to get out of Africa, or that many people interested in my help getting them out.
This guy seems not to understand quite the way that he has been conned. He has lost $200,000 to the conman, and yet still comments on how the conman was "always polite and considerate." It is amazing how polite someone will be to get that sort of money.
Last updated: 12:00, 02 Mar 2004
It appears that hackers are reverse-engineering the patches that come out of Microsoft in order to produce attacks against them. I can believe that; it is hard work to find a hole, particularly if you don't have a copy of the source (though people do manage it, as people report holes to Microsoft in the first place).
That said, the solution does not appear to be to withhold the source, or to withhold the patch. Sure, there would be fewer large virus outbreaks, but there would still be people able to take advantage of the holes that were there in the first place; they just won't make them well-known, and their method of entry will not be discovered.
This stinks of the articles that we got last month when Microsoft mislaid some of the Windows source code. There was a fit of people saying how terrible it would be for the source code to fall into the hands of hackers who would be able to use it to exploit the code. Well, maybe if the code were better written and security-audited in the first place, there would be less chance of them being able to exploit it. Wouldn't it be terrible if the source to the software that runs most of the Internet's infrastructure and servers got into the open? There would be a massive spate of attacks; the Internet would collapse? Well, no. Apache is open source, BIND is open source, Linux is, FreeBSD is; we can go on for a while. OK, so none of this software is exploit-proof, but there are no more exploits in this code, where the source is in the open, than in the Microsoft code that is kept safely away from the eyes of nasty hackers. Oh, and the software is patched much faster than the Microsoft holes (though they are getting better these days), and you can fix holes in old software without having to upgrade your infrastructure to the bleeding-edge version that has support, potentially breaking your custom apps. I have seen people not upgrade an insecure system because their vendor wouldn't support their database, for example, on the new platform.
Last updated: 12:00, 02 Mar 2004
Fri, 28 Nov 2003
On Tuesday, BBC Technology Pundit Bill Thompson arrived home from work to find that his Internet had broken.
According to Bill Thompson, "the entire NTL network had gone down". Hmm. Well, I have NTL and I was using it on Tuesday night. OK, DNS was a little broken, which also meant that their web caches were broken; oh, and you can't just not use the web caches, as they intercept outgoing traffic automatically. However, it wasn't the case that their whole network was down.
Oh, and the Bill Thompson article (at least at the current time), says that the cable broke at 16:00 on Monday.
OK, further down the article he makes some better points, like that companies that rely on their email servers should check not only that the servers will still work in the event of a power cut, but also that all of the infrastructure required to reach them will too. UPSs on switches, ADSL modems and the like are essential if you expect to be able to contact your running servers in the event of a power cut. Maybe more companies need to think about this, or maybe they are thinking about it, but only at low levels, and the budgets are not there until after the disaster. Hmm, maybe IT departments need to stage power cuts regularly.
Last updated: 13:59, 28 Nov 2003
Sun, 26 Oct 2003
This is another Bill Thompson article, and actually not too bad. This is a problem that it is worth publicising, and thankfully he does not propose some sort of really drastic method of "solving" it.
I can think of two ways to make it harder for spammers to spam the comments pages of your blog (besides my current solution of not allowing people to comment on it directly, but rather through email). I don't think that it is impossible to allow people to post automatically without allowing spam in.
- Use non-standard blog software: the spammers are likely to write software to post automatically to the standard packages, but if yours is different it will be more effort for them. For example, you want to name the fields something other than "Name" and "Comment".
- Ask humans to enter the text displayed in an image. At the moment it is probably sufficient for this to be in a clear font, so that you don't filter the old and partially sighted too, or even those not used to reading the same character set as you. When the spammers catch up with this in the inevitable arms race, then you might have to protect it from OCR applications.
Another possible suggestion would be to make it easier to mark posts as spam, so that they can be removed quickly. For example, unchecked messages could have a "this is spam" button (such as on the articles in Gmane) that removes them from view and puts them into a holding area where they can be checked by someone trusted. Marking a message as non-spam should be just as trivial, but restricted to someone trusted (for example the blog owner, and possibly regular posters). This just requires a bit of effort, and thought, to be put into the infrastructure to make it easy enough to use.
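That flag-and-hold flow is simple enough to sketch. A minimal Python illustration (the class and method names are mine, not from any real blog package): anyone can flag a comment out of view, but only someone trusted can restore it.

```python
class CommentQueue:
    def __init__(self):
        self.visible = []   # comments shown on the page
        self.held = []      # flagged comments awaiting a trusted decision

    def post(self, comment):
        self.visible.append(comment)

    def flag_as_spam(self, comment):
        # Anyone can flag: the comment disappears from view immediately
        # and sits in the holding area until someone trusted looks at it.
        self.visible.remove(comment)
        self.held.append(comment)

    def mark_not_spam(self, comment, trusted):
        # Restoring is just as trivial, but restricted to the blog owner
        # or another trusted user.
        if not trusted:
            raise PermissionError("only trusted users may restore comments")
        self.held.remove(comment)
        self.visible.append(comment)
```

The infrastructure effort is all in who counts as "trusted" and in making the flag button effortless; the data model itself is trivial.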
Last updated: 12:08, 26 Oct 2003
According to this article, experts are planning to create something called IPv6 that will enable us to have billions more IP addresses. They give the impression that this is a completely new thing that boffins are working on as we speak.
In fact there has been an RFC for IPv6 since at least 1995, eight years ago. The only problem has been one of adoption, and there have been many reasons why it has not caught on. One is the slow rate of porting applications to the new APIs for resolving these longer addresses. Another is a simple lack of demand, given that everyone now uses RFC 1918 addresses and simplifies their firewalling that way.
Maybe the BBC writing this article will tell more people about IPv6 and speed its adoption. It would be a lot easier if everyone used IPv6, and we could all address the individual computers behind a single ISP-supplied network connection. It would make thinking about firewalling more important for a lot of home users, so I am not overly convinced that it will be a good thing in the short term. Currently a lot of home PCs running older versions of Windows are protected by being NATed by ADSL routers and similar. This leaves them without an Internet accessible IP address, and so shields them from direct attack.
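The scale of the difference is easy to check with Python's ipaddress module (a quick sketch, nothing more): the three RFC 1918 private ranges that NATed home networks draw from hold under 18 million addresses between them, while a single standard IPv6 /64 subnet dwarfs that on its own.

```python
import ipaddress

# The RFC 1918 private ranges used behind NAT.
private = [ipaddress.ip_network(n)
           for n in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]
total_private = sum(net.num_addresses for net in private)
print(total_private)        # 17,891,328 addresses in total

# One IPv6 /64 subnet (2001:db8::/64 is the reserved documentation prefix).
one_subnet = ipaddress.ip_network("2001:db8::/64").num_addresses
print(one_subnet)           # 2**64 addresses in a single subnet
```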
Overall this article is probably a good thing, even if it does come about 5 years too late.
Last updated: 12:05, 26 Oct 2003
Mon, 20 Oct 2003
Apparently a cluster of 1100 Apple G5 computers requires "the same amount of electricity as 3,000 average sized homes." Now, it doesn't say exactly what an average-sized home uses, but this seems a little high.
I was under the impression that the CPUs used in the Macs were low-energy, efficient things, yet this works out at almost three average-sized homes' worth of energy for each computer. Don't most houses these days have a computer in them? Or are we averaging this out across all the homes, including mud huts in Africa? Maybe a computer will use half the energy of a home; I will assume this below.
OK, cooling is probably a large percentage of the energy usage. 1100 computers must generate some heat, and they go on about it in the article; however, I don't believe that it should take 5 watts of energy to move 1 watt of heat from a machine room into the atmosphere. If it does, that might explain why so much energy is currently used for air conditioning.
If the G5 is efficient and it really does take three homes per computer, I would hate to have built something like this with Intel CPUs; they must use even more unbelievable amounts of energy. Do supercomputers need this sort of power to get the same performance? 3,000 homes must be a small substation; you would not get much time out of your average UPS for this sort of load.
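The back-of-envelope arithmetic works out like this (the per-home figure is my own assumption, not from the article):

```python
computers = 1100
homes = 3000
homes_per_computer = homes / computers
print(round(homes_per_computer, 2))   # roughly 2.7 homes per machine

# Assuming an average home draws roughly 0.5 kW averaged over a day (my
# guess), the whole cluster would draw on the order of:
avg_home_kw = 0.5
cluster_kw = homes * avg_home_kw
print(cluster_kw)                     # 1500 kW for the cluster
```

At that assumed figure each G5 node would be drawing well over a kilowatt, cooling included, which is what makes the "3,000 homes" claim feel high.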
Maybe we should be looking into harnessing the waste heat from PCs, and using it in some sort of combined heat and power system. If we even got enough energy out of it to drive the air-con it would probably be a good thing. How could we do this? Thermocouples?
Last updated: 14:25, 20 Oct 2003