Trains forever

I’m sure that if I added up all the time I spent waiting on train platforms, I’d have plenty of time to get all my work done. I know, that’s it! The reason my project is behind is because it’s the trains’ fault for being late. Honest!

Automating web testing

We’re just starting to ramp up the amount of testing we do on our web apps at work, mostly using JUnit and some code coverage tools, all automated via CruiseControl.

I’m reasonably interested in getting some automated user acceptance testing going, and have been looking at Selenium. It seems a great way to generate tests from the real UI, rather than, say, writing code for them in HttpUnit or JMeter (which I know is really a load testing tool, but it could do the job at a push).

I’d like to be able to generate the tests using the Selenium IDE, which is a Firefox extension. My problem with Selenium is that it validates against real browsers, rather than against the HTTP requests and responses, and so won’t fit into our automated testing.

It’s possible to write Selenium tests as Java code (amongst other languages), but again these run against an actual browser, and the whole point of the IDE is that you shouldn’t have to be a developer to write acceptance tests.

Our CruiseControl machine is a headless server somewhere, so does this really mean that we can’t use Selenium?

I think that actually I’m just baffled that a tool might work this way, and not allow automation without a browser to hand. It rather looks as though we’re going to have to forget the idea of generating tests using the Selenium IDE and just code them up in HttpUnit (or Cactus, or HtmlUnit, or whatever).
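For what it’s worth, the HTTP request/response style of check that HttpUnit or HtmlUnit would give us properly can be hand-rolled in a few lines of plain Java, and it runs happily on a headless box. This is just a minimal sketch — the class name, URL and expected markup are all invented for illustration:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// A minimal, hand-rolled sketch of an HTTP-level acceptance check.
// The class name, URL and expected markup are invented for illustration.
public class LoginPageCheck {

    // The assertion logic, kept pure so it can be unit-tested on its own.
    static boolean bodyContains(String body, String marker) {
        return body != null && body.contains(marker);
    }

    // Fetch a page body over plain HTTP; no browser involved, so this
    // could run unattended on a headless CruiseControl machine.
    static String fetch(String urlString) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(urlString).openConnection();
        conn.setRequestMethod("GET");
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        if (args.length == 0) {
            return; // pass the app URL, e.g. http://localhost:8080/login
        }
        String body = fetch(args[0]);
        if (!bodyContains(body, "<h1>Log in</h1>")) {
            throw new AssertionError("login page heading missing");
        }
        System.out.println("login page check passed");
    }
}
```

The obvious drawback, of course, is that this is exactly the developer-written code that the Selenium IDE was supposed to save us from.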

War (on XML), what is it good for?

Were switching to XML a requirement for reaping compelling benefits, the public would indubitably have moved. So would the majority browser engine.

So if the majority browser engine did support XML, that presumably brings us to the question: what would those compelling benefits to the public be?

Aristotle has updated with a response to the question he thinks I asked of his post, which I didn’t. Rather than quote it all here, feel free to go and read it on his blog post.

My question is: what would those compelling features of XHTML 1.5 have been? What new features could there have been at the time? Would they have been compelling enough to encourage the move to XML?

Maybe I’m even asking (in a very broad sense) whether the changes which the WHATWG (and apparently TB-L) are suggesting are compelling, in and of themselves, to the public, now.

Opensearch in Firefox 2 and IE7

We recently rolled out the Google Search Appliance at work, replacing the ageing (and dreadful) htdig (wikipedia link since the website appears to be down).

I’d previously put together a Firefox search plugin for htdig, but it wasn’t very good, because the htdig search gave poor results and took ages to deliver them. Since we rolled out the GSA, I went to update the search plugin and found to my delight that Firefox 2 now uses OpenSearch for defining search plugins. So, ten minutes later I had a working plugin, which was great. A week later, when IE7 RC1 came out, I realised that IE had supported OpenSearch plugins for months, so I pointed it at my plugin, only to find that it didn’t work.

It turns out that IE doesn’t appear to support the nested Param element, although I freely admit I haven’t checked the OpenSearch 1.1 specs to see whether something changed from 1.0 (a first pass of draft 3 doesn’t seem to include the Param element, so I suppose the Firefox docs on the topic could just be out of date).
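To illustrate the difference (this is a hand-written sketch with invented names and URLs, not the actual plugin XML): inlining {searchTerms} in the Url template works in both browsers, whereas the Firefox-only variant passes the query via a nested Param element instead.

```xml
<!-- Works in both Firefox 2 and IE7: searchTerms inlined in the template -->
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>Example GSA</ShortName>
  <Description>Search the university website</Description>
  <Url type="text/html" method="GET"
       template="http://search.example.ac.uk/search?q={searchTerms}"/>
</OpenSearchDescription>
```

The Firefox-only shape replaces the Url element with something like this, which IE appears to ignore:

```xml
<Url type="text/html" method="GET"
     template="http://search.example.ac.uk/search">
  <Param name="q" value="{searchTerms}"/>
</Url>
```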

To install the plugin, go to the University’s GSA tools page. The working plugin XML looks like this, and the Firefox 2-only XML looks like this.

Things that annoy me about Firefox 2 UI

  • No close button on every tab
  • No close button at the right of the tab bar
  • highlight text, right-click, search uses the selected Search Box engine, not Google
  • scrolling tab bar
  • constant switching between “extensions” and “add-ons” between Firefox versions
  • atrocious options pane for Windows users (this is actually a Firefox 1.5+ problem)
  • massively slow first startup time without a splashscreen
  • terrible UI for session saving (text box has become a dropdown, with no clues)
  • ‘tabs’ and ‘feeds’ options tabs are poorly designed and laid out
  • spell-checker dictionary doesn’t come bundled with your localised build

More as they occur to me.

Wikis in the public sector

One of my colleagues, Marieke Guy, has just written a puntastically-titled article, Wiki or Won’t He? A Tale of Public Sector Wikis, in the UKOLN online periodical Ariadne, in which I get extensively quoted. I hadn’t realised when I tacked those few paragraphs onto the end of the short questionnaire that they were going to be quoted directly, so I apologise in advance for my poor use of English.

Tomorrow I’ll be talking at UKOLN’s national Wiki Workshop in Birmingham about evaluating wikis for rollout at universities. It’s only a 15-minute talk (most of the day is spent in discussion panels), and I have trouble keeping sentences that short, so I have no idea how I’m going to shut myself up. It should be a fun experience anyway.

XHTML hilarity

It’s funny to watch this Sending XHTML as text/html not-considered-harmful after all (which is wrong, of course) post get a load of comments almost a year later, but the best bit is saved for the comments, where the author of the article, Brad, responds to a comment by Sam:

Sam: how exactly is this:

I believe the biggest advantages to XHTML are its readability, uniformity, well-formedness as it pertains to authoring, and the consistency of the rendered DOM (which is also a result of any well-formed HTML document).

different from HTML 4.01 strict?

Very simply, XHTML is more aesthetically and logically pleasing than HTML.

Comedy genius.

Fixing the WAG54G

I’ve had my Linksys WAG54G (a wifi router and four-port switch) for about eighteen months, but over the past month or so it’s been consistently dropping and then reconnecting my wired network connections every few seconds, with Windows reporting that “A network cable was unplugged”. Replacing the network card would solve the problem for a few hours, after which it would start again.

It turns out that changing the connection type to 10Mbps (full duplex) solves the problem. I’m not really sure why; perhaps the hardware is too sensitive to noise on the line at higher transfer rates?
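I made the change through the adapter’s properties dialogue in Windows, but for the record, on a Linux client the equivalent would be something like this (assuming the interface is eth0 and ethtool is installed — adjust to taste):

```shell
# Force the NIC to 10 Mbps full duplex, disabling auto-negotiation
ethtool -s eth0 speed 10 duplex full autoneg off

# Confirm the negotiated link settings
ethtool eth0
```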