What I’ve Been Doing

July 5, 2009

I’m about to post on the Guardian’s Activate09 conference, and as it’s been a while since I’ve written here, a bit of an update is in order.

For the last 18 months or so I’ve been working at Carphone Warehouse on their CRM application, focusing on performance, and making sure it can do more with less as the number of customers increases. The application (it’s Java-based) seems to be holding up well and I’m probably going to need a new challenge soon.

There have been a few other things going on, like a 3-month home renovation, and the frenetic pace of family life; what free time remains has been spent on a project which aims to do a better job of presenting up-to-date information about local businesses. It’s an increasingly crowded space for startups, but when I launch, which should be this year, I hope people will find it useful. As part of this I’ve been exploring new languages, including PHP, Ruby and Groovy.


December 1, 2008

A week or so ago I saw a post from James Governor on Twitter mentioning an “unconference” called Homecamp. Always keen to learn new things, I investigated and found a link to the Current Cost meter, which provides a way to monitor one’s home electricity usage easily. Thinking of the days before large electricity bills this struck a chord, so I ordered one and spent the tail end of last week turning lights and electronic devices off to see how low I could get the figure.

Homecamp08 took place on Saturday 29th November at Imperial College, and was organised by Chris Dalby, with help from Andy Stanford-Clark, Dale Lane, and James Governor.

The Current Cost Unit

Dale Lane started off by talking about the Current Cost meter, followed by Andy Stanford-Clark. Andy had already built his own DIY electricity monitoring solution, and then came across the Current Cost people, who were persuaded to add the serial port which has allowed people to do interesting things with the unit. It currently provides (every 6 seconds):

  • Temperature
  • Current Watts used
  • Hourly usage for the last 24 hours
  • Daily usage for the last 31 days

Code samples are available online in various languages to allow experimentation. The new unit, due out in December, will keep daily history for 90 days, and allow sensor readings on 9 input channels, as well as supporting a higher data rate.

Andy talked about other monitoring devices he has built, and how it is possible to build a monitoring system based on a low-power PC, such as the NSLU2 running Linux or a Viglen MPC, with a small message broker (such as RSMB) that supports the MQTT protocol; other devices can subscribe to published events and act on them (such as Twittering). Andy’s team have even equipped the Hursley bus to Twitter its location (how we could do with that on London buses).
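The publish/subscribe decoupling that makes this work can be sketched in a few lines of Python. To be clear, this is not MQTT itself (a real setup would use a broker such as RSMB and a proper MQTT client); it is just an in-process illustration of the topic-based pattern, and the topic name is invented.

```python
# In-process sketch of the publish/subscribe pattern that MQTT provides.
# A real deployment would use a broker such as RSMB over the network;
# this only illustrates the topic-based decoupling of the idea.

class TinyBroker:
    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver the message to everyone subscribed to this topic
        for callback in self.subscribers.get(topic, []):
            callback(topic, message)

broker = TinyBroker()
received = []

# A "Twittering" subscriber might post readings somewhere; here we just record them.
broker.subscribe("house/power", lambda topic, msg: received.append(msg))
broker.publish("house/power", "423 W")
print(received)  # ['423 W']
```

The point is that the meter-reading publisher knows nothing about who is listening; adding a new consumer (a Twitter poster, a logger) is just another `subscribe` call.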


So why should we bother about all this? Well, from a purely selfish perspective, a continuous load of one watt costs roughly £1 per year. So given that the meter costs around £30 (from Eco Gadget Shop) it doesn’t take long before it starts paying for itself. Having the meter constantly showing you how much you are paying (it is accurate to within around 10%) does encourage lower usage. And it shows you how much ambient energy your house is using when you are getting no benefit from it, i.e. from devices left running, or on standby. We aren’t yet at a point where electricity is sufficiently scarce that this is an issue, but it could be in the future.
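The payback arithmetic is easy to sketch. Using the rule of thumb above (one continuous watt costs roughly £1 per year) and the approximate £30 meter price, a hypothetical saving shows how quickly the unit pays for itself:

```python
# Back-of-envelope payback, using the post's rule of thumb:
# one watt of continuous load costs roughly £1 per year.
COST_PER_WATT_YEAR = 1.0   # £ per continuous watt per year (rule of thumb)
METER_COST = 30.0          # £, approximate price from Eco Gadget Shop

def payback_months(watts_saved):
    """Months for the meter to pay for itself, given a continuous saving."""
    saving_per_year = watts_saved * COST_PER_WATT_YEAR
    return 12 * METER_COST / saving_per_year

# Turning off 60 W of ambient/standby load (an invented but plausible figure):
print(round(payback_months(60), 1))  # 6.0 months
```

So even a modest reduction in standby load recovers the cost within a year.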

Dynamic Demand

Phoebe Bright and Joe Short

Phoebe Bright talked about dynamic pricing based on demand throughout the day. There is typically a peak between 6pm and 7pm in the evening, which means that generators in the UK need to work at low capacity and then ramp up their output for the peak. Ideally we would like to spread demand throughout the day. Joe Short, who is involved with Dynamic Demand, joined Phoebe and described how it is possible to evaluate how well the UK generators are coping with current demand by measuring the frequency of the mains supply.
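The frequency idea can be illustrated with a toy classifier. The 50 Hz nominal figure is correct for the UK, but the tolerance used below is purely illustrative, not an official grid limit:

```python
# Sketch of the idea behind Dynamic Demand: UK mains is nominally 50 Hz,
# and the frequency sags when demand outstrips generation (and rises when
# there is surplus). The tolerance here is illustrative only.
NOMINAL_HZ = 50.0

def grid_state(measured_hz, tolerance=0.05):
    if measured_hz < NOMINAL_HZ - tolerance:
        return "demand exceeds supply"   # a smart appliance could shed load here
    if measured_hz > NOMINAL_HZ + tolerance:
        return "supply exceeds demand"   # a good moment to run the freezer
    return "balanced"

print(grid_state(49.92))  # demand exceeds supply
print(grid_state(50.01))  # balanced
```

A dynamic-demand appliance applies exactly this kind of test locally, deferring its own consumption when the frequency dips.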


The IBM Hursley team have created their own network involving an element of friendly competition to compare and highlight energy usage. There was general consensus among the Homecamp attendees that social factors have a powerful impact on behaviour, more so than economic factors. (Pachube – pronounced “patch-bay” – was covered in a later session and builds on this idea.)


The majority of the home automation and monitoring solutions being used are home-built. A very useful potential component for this is the Arduino module, which Nicholas O’Leary took us through. The unit has a number of digital and analogue inputs and can have other boards piggy-backed on top (for example, to introduce Ethernet support). Nicholas has built an ambient orb based on the Arduino, which incorporates an illuminated sphere that provides constant feedback on the home’s energy consumption (green / amber / red, for example).
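As a rough illustration of the orb logic, here is a sketch in Python rather than Arduino code; the wattage thresholds are invented, since the post doesn’t give Nicholas’s actual values:

```python
# Illustrative mapping from current consumption to an ambient orb colour.
# The thresholds are made up for the example; on a real Arduino orb they
# would be tuned to the household's typical load.
def orb_colour(watts, amber_at=500, red_at=1500):
    if watts >= red_at:
        return "red"     # heavy load: kettle, oven, etc.
    if watts >= amber_at:
        return "amber"   # moderate load
    return "green"       # baseline / ambient load

print(orb_colour(300))   # green
print(orb_colour(2000))  # red
```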

Monitoring Gas, Water and Petrol Consumption

Other home-grown solutions involve monitoring gas based on light reflected from the read-out digits, the pulsing LED, or boiler flame height. Andy Stanford-Clark was able to get another water meter installed which has a magnetised needle whose rotations can then be counted. And petrol consumption can be monitored through the On-Board Diagnostics (OBD) connector.
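Turning counted needle rotations into consumption is simple arithmetic. A sketch, with a made-up litres-per-rotation calibration (a real meter’s figure would come from its data plate):

```python
# Converting counted water-meter needle rotations into consumption.
# The calibration constant is invented for illustration.
LITRES_PER_ROTATION = 0.5

def litres_used(rotations):
    return rotations * LITRES_PER_ROTATION

def flow_rate(rotations, seconds):
    """Average flow in litres per minute over the measured interval."""
    return litres_used(rotations) * 60.0 / seconds

print(litres_used(120))               # 60.0 litres
print(round(flow_rate(120, 300), 1))  # 12.0 L/min over 5 minutes
```

The same pulse-counting approach applies to the gas meter’s flashing LED: count pulses per interval, multiply by the meter’s calibration constant.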

Honourable mentions go to Ben Hardill for his water monitoring unit that would be suitable for a flat, to Nicholas O’Leary for rigging up his doorbell, and to Adrian McEwan who is working on a fridge monitor.

In summary, there was lots to think about, much of it new to me. I felt a sense of excitement arising from the progress that people had already made, and that through a combination of social, economic and technological factors it is possible for us to continue encouraging individual and collective energy reduction.


Onzo designs in-home energy displays and smart meter solutions that enable utilities to put the end user in control.

Rob Veck is taking retirement early in order to work on Green Home Diary.

Write-ups from Dale Lane, Phoebe Bright and Tom Taylor.




Getting Started with the Current Cost Meter

November 28, 2008

My Current Cost meter has been running for just over two days. Apart from the microwave, oven and kettle it seems to be an accumulation of other items (such as leaving lights on, or the PC running) that racks up the cost. This probably sounds obvious, but seeing a monthly cost displayed with a pound sign in front of it helps to reinforce the point.

I wanted to understand what data was available from the unit, and compare it with the feed that Andy Stanford-Clark is producing on Twitter, to get me thinking about the possibilities.

  1. I downloaded the Java code written by Dale Lane and imported it into Eclipse.
  2. Dale’s code needs a class that is available in Trent Jarvi’s RXTX package; version 2.1.7 for Windows is installed by copying rxtxcomm.jar into your Java Runtime’s \jre\lib\ext folder, and rxtxSerial.dll into \jre\bin.
  3. Windows wouldn’t install the USB-serial converter for the USB cable, but I found a link on getsatisfaction, also courtesy of Dale, to the Prolific drivers.

Now I can run the CurrentCostSample, which produces the following:

Stable Library
Native lib Version = RXTX-2.1-7
Java lib Version = RXTX-2.1-7
27/11/2008 7:00 = 2.0
27/11/2008 13:00 = 2.8
27/11/2008 23:00 = 2.0
27/11/2008 21:00 = 1.8


The unit actually returns the data in XML format, which Rich Cumbers has helpfully detailed.
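For example, a reading can be pulled apart with nothing more than a standard XML library. The sample document below is abbreviated and its element names are approximate (see Rich Cumbers’s write-up for the real format):

```python
# Parsing an (approximate) Current Cost reading with the Python standard
# library. The element names and structure are illustrative; consult the
# documented format for the real schema.
import xml.etree.ElementTree as ET

sample = """
<msg>
  <tmpr>21.5</tmpr>
  <ch1><watts>00423</watts></ch1>
</msg>
"""

root = ET.fromstring(sample)
temperature = float(root.findtext("tmpr"))   # degrees C
watts = int(root.findtext("ch1/watts"))      # current consumption

print(temperature, watts)  # 21.5 423
```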

For some fairly short-term gratification I plan to plug the unit into my Linux box that is currently used for recording TV, so that I can do more fine-grained / longer-term data capture and think about publishing some useful events (perhaps over Googletalk initially). I expect there will be plenty of ideas at Homecamp tomorrow. In the spirit of power reduction I have also bought an NSLU2 network storage device to run as a small server, which takes only about 4 watts; not set up yet, but on the to-do list.

If you are thinking of experimenting with the Current Cost unit, you might want to listen to Andy Stanford-Clark on the Tech Weekly or Automated Home podcasts. I also found Chris Dalby’s post pretty useful.

Down But Not Out

November 18, 2008

So, my last post was back in June. I’ve been spending my time with my head down at work, and trying out Twitter and Friendfeed. Out of hours I’ve also been getting up to speed with PHP and CodeIgniter, and Groovy. However the main priority has been catching up on sleep as I’ve been borrowing from that to work on these other things. There are still a number of things I want to write about here, so while I thought about closing the blog down I’ve decided to give it another go. More soon.

Being-Digital, Afternoon

June 12, 2008

This is the second part of my notes from Being Digital on 10th June (speakers list).


Panel members: Ashley McKorkle, Loic LeMeur by video, Chris Seth of Piczo, Andrew Scott of Rummble, Ankur Shah of Techlightenment and Jerome Touze of WAYN.

Ashley McKorkle at Being-Digital

Ashley McKorkle is a Mobile Futures Analyst at Intel, and talked about their Intel Atom chip which is optimised for Mobile Internet Devices (e.g. low power video codecs). Future mobile consumption will be more contextual and more immersive. At Intel they are thinking about how to help consumers answer questions like: “What is a good kebab shop?” or “Is this a dangerous neighbourhood?”

Loic LeMeur (Seesmic) then chipped in by recorded video to answer some questions Simon Grice had posed. His view:

  • Everything is social (it’s a way of thinking about software … currently limited to geeks but will become pervasive eventually)
  • Social software is becoming more decentralised; centralised comments (e.g. on blogs) are no longer the norm … you have to get your comments on Flickr, Facebook, Friendfeed; the most important thing is the conversation
  • Twitter is a model in terms of platform … anyone can grow things on top of Twitter by using their APIs; you can build on it, as well as get the data and put it somewhere else; openness in any future platform is essential
  • We’ve only scratched the surface with mobile location services; thanks to Twitter we know what our friends do but we don’t know where they are in real time (yet)
  • Another trend – to get the experience as human as possible, e.g. with video (which brings gestures and feelings) – this is an area that Seesmic is focusing on

There seemed to be a consensus among the panel that for a media business (at least) a social networking site is a destination and not just a feature. Ankur Shah suggested that users are happy for companies to have their data as long as they get something back (and the other panelists seemed to agree).

Predicted trends included (Andrew Scott) more adoption of mobile and location-based features, as well as portable social graphs; Chris Seth echoed the points on mobile and unlocking of data, and added vertical networks.


Panel: Brent Hoberman of mydeco.com, Adri Kraa of IKEA, Lisa Rodwell of moo.com, Richard Anson of Revoo and Jason Smith of shop.com.

Brent Hoberman talked about his new venture, mydeco.com, which is one of those “why hasn’t anyone thought of this before?” ideas. It allows users to create and share designs, and furnish their house according to a given design and budget. The home furnishing market has been highly fragmented, with the market leader having only 6% market share.

The consensus among the panel seemed to be that online retailing is threatening the high street (e.g. Dixons); the challenge for high street retailers is to build strong brands and use online to drive offline sales. Brent mentioned that with more and more user data, online shopping has the potential to get much better, while Richard Anson observed that a “buy locally” option on the web provides an opportunity for retailers to engage with customers in a different way (instead of purely on the basis of price).


Panel: Giles Palmer of Brandwatch, Jeff Kelisky ex-CEO of Multimap and now GM of Commercial Search at Microsoft, Kristofer Mansson of Silobreaker, Dominic Blackburn of 192.com, Ariela Freed of Jumptap, and Simon Grice.

In his presentation Giles Palmer observed that there is continued blurring going on between ads and other info, and between PR and Search Engine Optimisation. More metadata means more connections (between, for example, people, products and locations). Kristofer Mansson pointed out that keyword search is very limited, while Jeff Kelisky talked about what Microsoft is doing with real world (3D) search. There was also a discussion on linking real world items to online, and how we do this (barcodes?); 192.com are now geocoding news stories.

Simon Grice recounted an anecdote from Loic Lemeur who was looking to hire a raccoon (Seesmic’s logo) in San Francisco. Not having had much luck through more traditional channels, Loic posted a request on Twitter and received a number of replies. So perhaps Twitter has potential as a product search engine?


After each session and throughout the day a number of companies gave a one minute pitch on their company. I mostly didn’t follow up on these, but bumped into Joe Drumgoole who I’ve met before, and he gave me a demo of the latest version of Putplace (which allows you to Secure, Backup and Organise your data online). It looks like a useful product – I like the clean interface, file versioning, and icons showing in situ which of your files have been backed up (reminds me of Subversion).

I use ZYB for synchronising my N95 address book, and thought that YouGetItBack, Liquid Data, zoomorama, u-myx, Singtones, and KiWork looked interesting.

Being-Digital, Morning Session

June 11, 2008

So it’s been a few months since my last visit to a conference. Yesterday was Being Digital, organised by Tony Fish and Simon Grice, hosted at BAFTA on Piccadilly. There were various sessions throughout the day with different speakers, panel discussions and demos. Here are my takeaways:


being-digital Advertising Panel

The “old” model still applies, i.e. commercial reality requires eyeballs to generate a return. However, consumers are now in control as they use technology to filter incoming messages, and make personal product recommendations to each other.

Other quotes, paraphrased:

Andrew Gerrard of d-marketing

  • Large quantities of user-generated content mean that we need to filter this to identify the quality
  • Mass audiences are disappearing, and are becoming fragmented
  • Quoted from Lord Puttnam: “Technology can only ever serve as a bridge, never as a destination”

Helen Keegan

  • There is too much focus on metrics; advertising isn’t all about click-throughs
  • Good marketing begins and ends with good products; companies cannot now control what consumers say about them
  • Mobile adoption is increasing faster than computer adoption
  • The iPhone is a game changer for mobile internet advertising

Michael Bayler

  • Digital channels are currently more important as inbound channels (it is up to companies to develop a service proposition in response)
  • There are about twelve rings that you need to get through to reach the consumer (in the middle is “me” – I use technology defensively to filter marketing, followed by the “us” tribes)
  • Consumers want to tell stories about themselves

Kate Burns of DailyMotion

  • Advertisers don’t see that local advertising has critical mass
  • Media is fragmenting, and the consumer is in control
  • On the positive side, campaigns can be run with sub-10K budgets – more creativity is required (think Kate Modern)

Turlough Martin of WunderLoop

  • It’s all about giving people what they want, while balancing with privacy


For this session Wendy Grossman began by observing that your online identity can be separate from your real-world identity; it might involve your role (e.g. as a commentator in rec.tennis) or purchases you’ve made from Amazon.com. Wendy recounted an anecdote that Martina Navratilova had told during a press conference where someone was impersonating her online so she created an account and posted “She isn’t the real Martina Navratilova, I am”. Nobody believed her, but they believed Wendy who was known in that community.

Simon Willison did well to compress his usual 45 minute talk into less than 10 minutes; some key points:

  • OpenID lets you prove that you own a URL
  • Spammers have OpenIDs too, so you need to check the morality of your users
  • If you do want to put all your eggs / credentials in one basket you can go for a more heavyweight approach, such as TrustBearer OpenID which supports two-factor authentication involving a smart token or biometric device
  • OpenID 2.0 improves usability by avoiding the need for URLs
  • OpenID is decentralised

After Andy Thomas of Garlik, and Luke Razzell of iTogether, there was a discussion led by Simon Grice and the following observations were made:

  • In a show of hands, most of the audience used the same user name and password for the majority of their online accounts
  • OpenID potentially provides a mechanism for social networking sites to share data (e.g. contacts) with one another
  • Wendy wondered whether eBay could share a person’s reputation
  • SimonG: People rely on Twitter and SMS and there is no one dominant SMS provider
  • Microsoft is trying to separate different roles with Cardspace
  • SimonG: We are still bumbling along with username and password, but in 5 years time maybe it will happen
  • Andy: The solution needs to be relevant to consumers
  • SimonW: Identity is a big and vague problem; OpenID is a useful building block

Will McInnes moderated a panel discussion involving Nick Brown (A2A Group), Peter Miles (sub.tv), Ave Wrigley (ITN), Joe Drumgoole (Putplace) and Ivan Pope of Sniperoo (who recorded part of the session).

Key points for me were:

Peter Miles:

  • Is content FREE (i.e. little value) or free (where someone pays eventually)?
  • Distribution has been democratised
  • Branded content is the most popular form of all advertising formats with consumers (67% found it valuable or acceptable)


  • Content value can come from timeliness
  • People pay for convenience (that’s why we use iTunes) but they won’t pay for the same content over and over in different formats


  • People managed just fine before Hollywood, and are actually easily entertained
  • Content implies that it lives in a container; what are we filling? (Previously this was media-controlled channels.)
  • Predicts that in 20 years’ time there will be no more Hollywood blockbusters; people will remake their view of the world with fragments

In summary, Will McInnes predicts that 50% of big media companies will go bust. The future is in platforms for user-generated content. Content discovery is still a big problem – we need something like last.tv (last.fm are already working on this).


Moderated by Andrew Gill, the location panel included Tim Warr of Multimap/Microsoft, Richard Varham of Locomatrix, Judy Gibbons of Accel and Rob Hinchcliffe.

I didn’t take many notes from this session as I was too busy listening, except to note Tim Warr’s comment that we haven’t got the interface right yet for location-based (mobile) apps.

ThoughtWorks Keynote at QCon

March 13, 2008

Martin Fowler, ThoughtWorks’ Chief Scientist, and Jim Webber (SOA Evangelist) were speaking at QCon last night in a talk entitled “Does my Bus look big in this?” Martin Fowler’s books on software patterns, particularly Patterns of Enterprise Application Architecture, probably get the most use out of my collection, so I’m particularly interested in what he has to say.

I have personally experienced what I believe is a lot of unnecessary complexity in Enterprise software, and it is refreshing to see Martin and Jim cut through this to come up with a set of principles for effective (lightweight) delivery. I’m also pleased that ThoughtWorks are actively promoting (J)Ruby on Rails in an Enterprise context as this seems a natural successor to “traditional” Java development, and in my opinion they seem to be one of the most enlightened consultancies on how to deliver Enterprise software effectively. (They’re also agile.) Martin and Jim’s recommendations are as follows:

  • Adopt an agile approach with two main notions: (1) accept change as an inevitable part of the software process, and design processes accordingly; (2) the most important part of the process is the people.
    Various techniques are becoming more prevalent to support this notion (see Test-Driven Development: A J2EE Example, Pragmatic Unit Testing, Refactoring or Applying Domain-Driven Design and Patterns).
  • De-risk projects by beginning with a small system that works.
  • The web is middleware (and low risk): why bother with proprietary middleware when you can use the internet? Provide innovation at the edges, and “heavy hitting in the cloud”.


  • Services host business processes
  • Business people own our architecture
  • Prioritise and deliver services incrementally (it might be messy but at least it’s delivering business value)

The key to good software design isn’t to allow for everything you think you’ll want it to do (You Aren’t Gonna Need It), but to enable it to do things you haven’t thought of yet.

Some proposed alternative definitions:
SOA = databases wrapped in WSDLs being orchestrated
ESB = Erroneous Spaghetti Box

Martin and Jim don’t like ESBs … but if you really, really feel you need one, use a proxy server (e.g. Squid).