Render Conference 2016

Last week I was in Oxford for RenderConf.

It was brilliant.

Render Conference is two packed days of technical front-end talks from a seriously awesome line-up of speakers.

And when they said awesome, they meant it. Go check out the schedule!

My top three talks (in no particular order) were:

Val Head (@vlh): Designing meaningful animation (about web animation from Disney to CSS bezier curves)

Ashley G. Williams (@ag_dubs): If you wish to learn ES6/2015 from scratch, you must first invent the universe (Computer Science theory, Picasso and teaching)

Frederik Vanhoutte (@wblut): Rendering the obvious (Rainbows, categorisation, and Beethoven’s exploding head)

Between these three talks alone there was creativity, practical advice, inspiration and a call to change how you look at the world. Across the other 17 talks you could get everything from demoscene coding in WebGL to how to use the <canvas> element to print a new scarf. There was a really great range of presentations.

All the talks will apparently be available online shortly, and the videos will be on Vimeo. My photos, which are mostly of the nice things I saw in between talks because I was too busy listening, are on Flickr.

As well as being an exciting, inspirational conference though, the whole thing was just executed really, really well. Regular breaks, pastries and drinks, retro gaming (enjoying losing at videogames is my superpower), provided lunch, table tennis, skittles, 3D printing pens, hacking competitions and more meant there was always something to do and always someone interesting or cool to talk to.

There was a party at the venue on the first evening where the food even came to us – freshly-made burritos and pizzas from vans outside the venues complemented the wine, beers and alcohol-free options inside. Hats off to all the organisers!

I think my only complaint was that, given this was my first visit to Oxford, I was disappointed to find out that not every single building is a 1,000-year-old cathedral. Absolutely gutted.

Probably lucky though, I hear they have terrible wifi. See you next year, Render!

Pimoroni Flotilla

“Flotilla – Friendly Electronics for All” is the headline, which sounds pretty damn good to me, someone who is permanently afraid of blowing up his Arduino and other kit by connecting something the wrong way around.

So this kit – a ton of sensors and outputs, with a central dock for making them talk to each other, all working over USB – is perfect for me.

We bought the Mega Treasure Chest starter kit and it arrived the other day, but it’s early enough in the product’s life that there are very few instructions on how to use it just yet. There are three “secret” URLs for getting going (the whole thing is controlled by a Raspberry Pi, so you’ll need one of those too!):

  1. – instructions on setup are here
  2. – the web interface for controlling your flotilla is here
  3. – the two demos which do exist are here

We also used a first version of the “Getting Started” booklet to see us through the initial stages.

Once you’ve done those, you can watch the video for how to build your own robot, which you can control over your network using all the different sensors and controllers you get. My 6yo was able to put the whole thing together himself and learn how to control it predictably in under two hours, which was brilliant! We did actually run out of the plastic screws used to hold it all together – I think someone in the warehouse was tired and we ended up with fewer than expected – but we just about managed to get it all assembled and have been assured more are on their way!

The completed robot! (upside down!)

I was really glad to discover that the flotilla daemon runs on a Kano, which we’ve had for a couple of years now and which my son loves, because the forums had made me a bit nervous about which OSes were supported.

We’re really looking forward to the rest of the instructions arriving, but my son is already completely delighted with the fact that he has built, and can control, his own robot! I just want to know whether the flotilla daemon will run on a Pi Zero powered by a power bank, so we can put them on the back of the robot and cut the cord it currently needs to the regular Pi!

Estimating is good for your health

Matt Jukes has written a post called “Don’t do Agile. Be agile. Or something.” where he describes how estimating stories has not worked for them at the Office for National Statistics.

I think there is a more important reason to do estimation than just release planning, and that’s people management.

Over time, estimating allows you to create a sprint which is neither too daunting nor too trivial, and to build a good team cadence. A sprint with too much work in it can be depressing and ultimately slow down delivery; a sprint with too little runs the risk of falling foul of Parkinson’s law.

You could argue that if you have a good, committed team who are always up for the challenge of whatever’s put in front of them then it doesn’t matter – I’d argue the opposite: you’re at risk of burning out your team.

Velocity, paired with your observations of the team, will give you the data to be able to work out if they are maintaining a sustainable pace or not.

As much as I am not a fan of Pivotal Tracker, it’s taught me that the burndown is not as important as having a graphical view of historical velocity. Did the team deliver fewer points this sprint? Why? Was it story-related or human-related? If their velocity has been solid and steady for a long time, are they ready to step up, or do they need a rest? Most managers have a gut feel for these things, but having the data to back it up makes it easier to discuss in retrospectives and planning sessions.

Estimation is not always easy, but so long as you don’t put too much importance on getting the “right” estimate, and so long as it’s really treated like an estimate, it can be extremely valuable for both release planning and maintaining the long-term health of your team.

Brick wall

So I wanted to parse the feed I retrieved yesterday, and now that I know a bit more about how the package system works, I just searched NuGet and found SimpleFeedReader.

$ dnu install SimpleFeedReader

At this point Visual Studio Code prompts me that it needs to run the `restore` command, so I click the button and in theory SimpleFeedReader becomes available from my code.

In theory, anyway. In practice it doesn’t look like it works in .NET Core, so I decided to try to use the System.Xml.XPath package to parse the feed myself. But after a few hours of trying to include the dependencies in project.json correctly, I simply couldn’t do it, and ended up installing Visual Studio. I feel very disappointed about this – I was hoping I could treat .NET as just another language. I don’t need a specialised editor to be able to write small scripts in almost any other language, but the docs around .NET Core just aren’t there. Hopefully I will be able to use Visual Studio to work out what I was getting wrong.
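For what it’s worth, the parsing I was aiming for is only a few lines on the full framework. This is a sketch from memory rather than the code that failed under .NET Core – the file name is a placeholder, and the key detail is that Atom elements live in a namespace, which XPath needs mapped to a prefix:

```csharp
using System;
using System.Xml;

public class FeedTitles
{
    public static void Main()
    {
        var doc = new XmlDocument();
        doc.Load("feed.atom");  // placeholder: a local copy of the feed

        // Atom elements are in the Atom namespace, so map it to a prefix
        // that the XPath expression can use
        var ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("atom", "http://www.w3.org/2005/Atom");

        // Print the title of every entry in the feed
        foreach (XmlNode title in doc.SelectNodes("//atom:entry/atom:title", ns))
        {
            Console.WriteLine(title.InnerText);
        }
    }
}
```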

Running .NET code from Visual Studio Code

“Code” is the name of Microsoft’s new lightweight code editor. It’s been out since April but it seemed like this project was a good opportunity to use it properly.

It has a command palette much like the one in Sublime Text, but I couldn’t find a command to run my small .cs file.

Lots of the docs talk about using Grunt and Gulp to run your tasks, but that seemed like overkill for running a single command to build and run my code.

It turns out that you can configure your project.json with a list of commands which then appear in the palette.
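From memory it’s something like this sketch – the command name (“run” here) is whatever you want to appear in the palette, and the value (“ConsoleApp” here) is my placeholder for the application to execute, so don’t treat either as gospel:

```json
{
  "commands": {
    "run": "ConsoleApp"
  }
}
```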

A couple of small lines and now I can run my code directly from inside Code, hurray!

Retrieve an Atom feed in .NET Core

After yesterday’s adventure and recent forehead-smacking, this seemed like an appropriate, and small, goal.

After some Googling I found RestSharp, which bills itself as a “Simple REST and HTTP API Client for .NET”. Sounds good.

I tried to work out how to add this to my project.json, but couldn’t find any documentation on what it should look like, even after I remembered about things like NuGet.

So I guessed and typed dnu install restsharp, which seemed to fetch the right files. My project.json didn’t seem to have updated though, so I then did a dnu restore. This updated the file, although the restore might have been overkill – the delay may just have been my editor catching up.

I now have a small file which will retrieve an Atom file and dump it to screen.
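The file itself is only a few lines. It looks something like this sketch, using the RestSharp API of the time – the base URL and feed path here are placeholders, not a real feed:

```csharp
using System;
using RestSharp;

public class FeedFetcher
{
    public static void Main()
    {
        // Point the client at the site hosting the feed (placeholder URL)
        var client = new RestClient("https://example.com");

        // Request the Atom feed itself
        var request = new RestRequest("feed.atom", Method.GET);
        var response = client.Execute(request);

        // Dump the raw XML to the screen
        Console.WriteLine(response.Content);
    }
}
```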

Another small step tomorrow.

Getting started with .NET Core

It’s been a few years since I last used .NET, so I thought I’d give it a go. It was a slightly more eventful 30 minutes than I’d have liked, so I thought I’d write it up.

I started by trying to install Visual Studio, but half an hour later it was still under 50% done. Since I only wanted to write scripts rather than applications, I started looking for smaller getting-started guides and came across Microsoft’s guide to getting started with .NET Core. So I took a brief look at the new shape of the .NET stack, liked what I saw, and went back to the guide.

After installing the .NET Version Manager (dnvm) I had to restart PowerShell to get the changes to my PATH to take effect. I also had to open an Administrator PowerShell window and run set-executionpolicy remotesigned, because I hit the “Running scripts is disabled on this system” error message.

The code samples from the guide can’t simply be copied and pasted into an editor – each is actually a single line of code, with JavaScript applied to make it appear as though it is spread across multiple lines. This is pretty disappointing; I thought something was wrong with my local editor configuration until I hit view source on the page.

I was using Visual Studio Code, and at this point was also disappointed that SHIFT+ALT+F didn’t format the C# code for me, although it did format the JSON.

I’d closed my PowerShell window by this point, and when I opened a new one and ran dnu restore I got an error message about it not being on my path (The term ‘dnu’ is not recognized as the name of a cmdlet, function, script file, or operable program.). Running dnvm upgrade seems to have fixed this and I can now run the commands from both PowerShell and Command Prompt!

My “Hello World” did at least run first time. Phew!

I should now have the infrastructure to get properly going, but this was a much rougher intro than I was hoping for.

Why contracting developers are a giant pain in the ass

I have just read Why contracting developers refuse to go permanent. It says:

Working with legacy technology is something the developers I spoke to were not particularly fond of. They explained that these pieces of software are often built using outdated methodologies and poorly documented, if at all.

This is true, but my experience of dealing with several legacy projects which were written by contracting developers is like this:

“This one is in Rails from three years ago and is full of CSRF vulnerabilities! This one is in Ember! This one is hard-wired to a five-year-old version of WordPress!”

Yeah, it’s definitely the organisations which are the problem.

Bristol Mini Maker Faire

Today I took my 6- and 2-year-olds into town to look around the Bristol Mini Maker Faire. It was really good and comes highly recommended.

We saw the Nao and Baxter robots from Active8 Robotics, a laser cutter from Just Add Sharks in action making press-out catapults, Raspberry Pi-powered wheel and track robots from Dawn Robotics and plenty of other things.

The kids were particularly taken by a couple who had reverse-engineered children’s electronic toys – like the Furby, or basic motor-controlled toys – and rigged them up to simple push buttons to make them work.

We also liked seeing how Is Martin Running? works, and playing on the Makey Makey.

They were sadly a bit too young to make their own shonkbot or get a Petduino (it needs soldering), but watching the RepRapPro in action was a novelty, and we came away with some printed robot figurines which they treasured!

Many thanks to the chap from Ragworm who valiantly tried to explain to my 6-year-old how circuit boards are printed 🙂 it will no doubt be easier to comprehend when our Flotilla kit arrives!

There were lots of other homebrew displays from people who use the Bristol Hackspace, and I’m only sorry I can’t remember the names of their projects, but we all enjoyed it thoroughly!

Here’s looking forward to the next one!