Pioneer One

05Jul10

A little while ago I wrote about how a TV series could be released on BitTorrent and paid for by fans. Well, somebody’s done it – check out Pioneer One.

I read about this last week on TechDirt, but it took me a few days to get around to downloading and watching it. My impression – absolutely brilliant. More please – I’ve donated already. I frankly don’t care how much this cost to make, the pilot reminded me of a good episode of X-files – great TV. If it only takes a few thousand of us to put our hands in our pockets for less than the cost of a DVD, and that gets us away from the mass produced tripe of ‘reality’ TV then this is my future of TV.

Now then… I don’t really want theme tunes or deleted scenes (I never really bother with DVD extras type stuff). Some cool merch would be a different matter altogether though.


I watched this film last week, but it’s taken me a while to find the time for a quick review. I guess I must like Michael Moore’s films, as I’ve watched most of them, though I know from experience to expect a certain political perspective that I might not entirely agree with.

I think he asked some important questions – questions that I’ve been asking myself and others for some time. Ultimately though there weren’t satisfactory answers. I really would like to know why, when ‘productivity’ has risen so much across the worldwide economy, we’re all having to work so much harder than a generation or so ago. Moore would have us believe that the richest 1% of society is somehow living the grand life whilst the remaining 99% toil away, but where’s the evidence? Isn’t the real problem that today’s workers are paying down the debts of the past (debts run up to cover post-war healthcare and social services) – or is that my rather UK-centric point of view? Or maybe it’s all really down to our obsession with liability (and hence the huge ‘elf and safety tax on society)?

and that’s what it all boils down to… there are taxes on society that exist now that weren’t a drag on Moore’s family when he was growing up in Flint. Perhaps Wall St is one of those taxes, but is it really the dominant cause? Convince me – he didn’t.

Moore makes a point that Wall St (and by extension the finance industry worldwide) is sucking up science and engineering talent and wasting it by engaging in a zero sum game of algo trading and derivatives structuring. This is a question that’s certainly worthy of deeper investigation than it receives in the film – is that talent really going to waste, or (like in the space programme[1]) are there spin-offs to other parts of the economy/society that have benefit? Most fundamentally, where are those banks hoovering up capital from? Why are businesses willing to pay fees (on transactions, or for structuring, or advisory services etc.) that maintain this status quo? If all the profits are from prop trading (as they have been for most iBanks since 2008) then who are the participants at the edge of the market who are (willingly?) feeding the beast?

Questions? Questions? Questions? – Moore asks many in this 2hr epic, but his answers are superficial and leave me with more questions.

One final question. Those people that we saw being booted off their farm (which is very sad) – what did they spend the money on (the money they got from whatever con man sold them the mortgage they ultimately couldn’t afford)? Clearly our love affair with capitalism goes right to the foundation of society, where people would rather borrow money to consume ‘stuff’ than hold a safe position on their family property.

[1] The Apollo programme was reported to Congress as $25.4Bn (1973 $s), but did it really cost the US public any more than a few tons of aluminium and a few good men? Many books and case studies have been written about the economic upside from the science spin-offs of the space race.

Postscript… I first heard the term ‘sub-prime’ in the summer of 2006 from a taxi driver. We spent some time talking about CDOs, how they were risk managed and the diffusion of complexity across the financial system that would inevitably lead to cascading failure. Neither of us put our money where our mouths were. But it goes to show that many people were lying later on when they said the failures were all a huge surprise. If a taxi driver and IT bod can see what’s happening then anybody can. I met a hedge fund manager the other day who said that he was swapping out traders for industry experts (e.g. people who understood the fundamentals of a business rather than just the numbers) – I suggested that he might hire some taxi drivers.


I touched upon this some time ago when I bought my new DVR. The question is: how are we going to achieve distribution of video around the home when analogue TV goes away?

The good old way

I spent part of Father’s Day setting up distribution in my in-laws’ new place. They live in a place where Freeview doesn’t reach yet, so their needs were simple – redistribution of the analogue TV signal (BBC1/2, ITV and C4) along with the output of their Sky+ box. This couldn’t be easier, as the Sky+ has an analogue in for the antenna and two outputs – one for the nearby TV and another specifically for the purpose of redistribution. All I needed to do was connect the redistribution output to an amplifier (luckily there was already a coax cable running back up to the loft) and then drop cables from the amp to the other TVs they wanted. Had I researched this properly before coming up with the bill of materials my father-in-law could even have bought a special type of amplifier with ‘bypass’ functionality that would facilitate a Sky ‘magic eye’ – a device that beams remote control commands back to the Sky box via the redistribution connection, which is great if you want to be able to change channels in another room.

But analogue is going away

In the next couple of years the analogue TV signal is being switched off across the UK. This means that there’s no immediate point in TV makers incorporating the electronics to receive analogue signals, and as I observed with my DVR many equipment makers are assuming that you’ll connect via a better quality connection than UHF RF (e.g. HDMI, Component, Scart etc.). Both ends of the analogue connection are disappearing – so what are the alternatives?

Wireless Senders

These are boxes (normally shipped as a pair – a sender and a receiver) that retransmit a composite video and audio signal over unlicensed spectrum (2.4GHz). In my experience they’re awful, which is pretty predictable – composite video is the lowest common denominator of video interconnects (and only one step of degradation better than UHF), and the 2.4GHz band is full of other stuff that you probably already have – WiFi, DECT phones, microwave ovens (though you’d hope they’re not leaking).

Wired Senders

I’ve seen devices that can take a VGA signal from one room to another over Cat 5 cabling. These seem to work OK, but the ones I’ve come across seem to be aimed at the business market rather than domestic use (with pricing set accordingly). I also suspect that they’re not properly network friendly – expecting a dedicated set of twisted pairs.

Media servers/players

I’m a big fan of media servers (and associated players like the Kiss DP-500 and its HD sibling the DP-600, which I’ve had for some time now [1]). These are fine for watching content libraries in different rooms, but there are some issues:

  • Building a legal content library is tough – video has been slow to follow music in moving away from formats with DRM and other encumbrances.
  • It’s not a multi room solution – even with multiple players there’s no way to achieve audio/video sync as you walk from room to room.
  • It can only deal with stale content rather than live events.

Slingbox

Slingbox make devices that can digitally encode a video signal and stream it to a receiver. Sadly these are crippled so that only one receiver can connect at a time, and it also seems that the dedicated receiver (the ‘SlingCatcher’) isn’t made any more. I also had concerns about picture quality on these, which were conceived to stream SD video over the Internet (though an HD version came along later) rather than high quality video around a house.

Pay up

You can of course just pay content distributors (like Sky) for multiple boxes for multiple rooms. This works fine for live events, and covers watching different recordings in different rooms, but doesn’t deal elegantly with watching the same recording in multiple rooms – time for some home network friendly features on those boxes perhaps (once the Hollywood DRM Stasi figure out a way to lock things down to their satisfaction).

and that’s about it

it seems that the easy ways of doing multi room TV are coming to an end, but I have some ideas…

  • HDMI redistribution – seems like a really good plan, at least that is until HDCP enters the frame. Sadly the whole point of HDCP is to limit HDMI to a conversation between one playback device and one viewing device. I keep wondering what happens when TVs with HDMI/HDCP get beyond their useful lives and hackers start doing their thing? If it ever happens then HDMI redistribution would be tricky cabling wise unless it can use a home network in the middle (which brings its own challenges with bandwidth).
  • Live (re)encoding – it’s been possible to do real time MPEG4 (or similar) encoding of SD video for some time on a single x86 core. With a bit of dedicated GPU assistance I’ve no doubt that HD would be feasible these days. All that would be needed then is players capable of doing streams rather than files (hardly a problem). This is basically Slingbox for the home (with maybe extensions over the Internet for remote viewing).
  • Get at the digital stream at its source. PC based DVB cards offer the opportunity to stream and/or record TV and serve it up to various ‘receivers’ (and I believe Windows Media Center pretty much does this). The problem is that such cards are only available for free to air broadcasts – no commercial satellite or cable (and likely no free to air HD if the BBC/Ofcom give in to Hollywood).
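The bandwidth worry in those last two ideas is easy to put rough numbers on. Here’s a quick sanity check in Python – all the bitrates are my own ballpark assumptions, not measurements:

```python
# Rough bandwidth sanity check for streaming video over a home network.
# All figures are ballpark assumptions, not measured numbers.

STREAM_MBPS = {
    "SD MPEG4": 2.5,           # assuming ~2-3 Mb/s for decent SD
    "HD MPEG4": 10.0,          # assuming ~8-12 Mb/s for broadcast-like HD
    "Raw 1080p (HDMI-ish)": 3000.0,  # uncompressed video is Gb/s territory
}

LINK_MBPS = {
    "HomePlug (typical)": 50.0,
    "100Mb Ethernet": 100.0,
    "Gigabit Ethernet": 1000.0,
}

def fits(stream: str, link: str, headroom: float = 0.5) -> bool:
    """True if the stream fits in the link with 50% headroom for other traffic."""
    return STREAM_MBPS[stream] <= LINK_MBPS[link] * headroom

for s in STREAM_MBPS:
    for l in LINK_MBPS:
        print(f"{s:22} over {l:20}: {'OK' if fits(s, l) else 'no'}")
```

The upshot: re-encoded MPEG4, even at HD bitrates, sails over a typical HomePlug link, while anything uncompressed needs dedicated cabling – which is why the HDMI redistribution idea struggles without an encoder in the middle.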

Any other thoughts? Please comment below.

[1] These days a 3rd generation games console like an Xbox 360 or PS3 can also double as a media player. Even the Wii can join the party if you hack it a little (with official support often rumoured but never released).

Update 1 Jul 2010 – and a few days after writing this along comes HDBaseT (thanks to The Register for flagging this up). No mention of multi room capabilities, but it looks like exactly the right enabling technology (and could probably be somehow combined with the StarTech stuff that Patrick mentions in comments below).

Update 7 Jul 2010 – the BBC has an article out saying that ‘Analogue TVs no longer sold by UK retailers‘. It lacks detail – how many of those TVs sold in May were capable of receiving an analogue signal (most I’d bet)? The trend was however already clear.


When I first heard about Pleo I knew that I had to get one. My son is a huge dinosaur fan, and a toy robot baby Camarasaurus was sure to be a huge hit. I ended up ordering two ‘first hatch‘ Pleos in the hope that I could turn a quick profit selling on the 2nd one on eBay, but in the end it went to live with @khairoun and @monadic (after he’d been so helpful bringing them back from the US as they weren’t available in the UK).

Xmas turned out to be a little tricky, as we were visiting family in Spain, but I managed to tuck Pleo into my hand luggage, and when the day came there were huge smiles all around. The pet dinosaur was the huge hit that I expected.

But there was trouble… UGobe had clearly built this to a price (albeit quite a steep one) and the huge compromise that they’d taken was using a NiMH battery pack. This took ages (3-4hrs) to charge, and at its best lasted about an hour of play time. I bought a second battery pack, but that only provided a temporary reprieve. It didn’t take long before the battery packs wouldn’t work at all. Most frustratingly the battery packs were simply 6 AAs bound together; if they’d used regular AAs then it would have been no problem keeping the beast going.

So Pleo stopped being a playful pet, and just became another (larger) plastic dinosaur amongst the herd.

When the recent 2.0 firmware came along I thought I’d try the upgrade. A fully charged battery pack managed to last just about long enough to run the upgrade process, but wasn’t supplying the juice to allow movement. So I started looking around for alternatives…

and discovered what UGobe should have put in there in the first place. A battery pack using Lithium Polymer cells from a firm called UCube. Of course these are impossible to get by normal means in the UK, but as usual eBay came to the rescue, and I ordered a battery and charger ‘egg’ from a seller in China.

With the new battery inside Pleo walks (and roars, and sings) again – better than ever :) It’s such a shame that UGobe didn’t use a battery like this in the first place. With a decent battery Pleo really is one of the coolest toys ever. I just need to teach my son a little more about batteries now – he’s now asking if we could get an even better battery that would make Pleo grow (and poo).


Yesterday the kids got a hand me down TV from their grandparents, which caused me to do a bit of reconfiguration of things in their play room. The HomePlug adaptor in the garage had died a few weeks back, so I’d swapped it for the one connected to their little used X-Box 360. Anyway, to make up for the absent HomePlug adaptor I found myself running a cable to another switch, which got me thinking that my home network is growing a bit on the big side…

Study: I have an 8 port Gig-E switch for my PC, his’n’hers laptops, B&W laser printer and a SIP ATA. The telephone wiring in my house was done using Cat-5, so I’ve set up a wall socket to use a pair for telephone and 2 pairs for ethernet running to…

Coat (comms) cupboard: This is where the telephone line comes into the house, so I have my ADSL router (connected via a Type A faceplate), a spur to the lounge (another repurposed ‘telephone’ line) and a HomePlug adaptor that goes to the…

Garage: I very much wanted a proper line out to the garage, but after many failed attempts to follow the same conduit as the power cable I gave up and used WiFi. The WiFi was never terribly reliable, so I switched to HomePlug a few years back, which has been much better (typically giving me about 50Mb/s to my backup box). To keep things simple I also use HomePlug to the…

Kids playroom: Where they have a PC each, a media player, two games consoles and a colour laser, collectively hung off a pair of 5 port switches. This just leaves the…

Lounge: Where I have a WiFi router, which has a switch in it too, one port connected back to the coat cupboard, and another running to a media player.

So.. that’s a total of 5 switches (including a WiFi router and ADSL router) and 3 HomePlug adaptors. I’ve not even begun to list the WiFi connected devices.

I wonder when home builders will start doing a better job of accommodating this type of stuff? I asked my builder if they could run some network ports (and surround sound cables) when they were doing the rest of the wiring some 8 years ago, but they weren’t interested in helping (the Cat 5 ‘telephone’ cables seem to have been an unintentional kindness).

So – what’s ‘normal’ these days – 24 ports? More? Less?


Kim Cameron has had lots of interesting things to say over the past few days about the security and privacy implications of harvesting MAC addresses in the wake of Google being somewhat caught out with their activities in this area.

Today though he has a piece where I think he’s crossed over the Chicken Little line. In the normal run of things I’d just leave a comment on his blog, but I can’t sign in – even after creating a personal information card on my new Windows 7 machine and dirtying myself with IE so that I could use the card selector. No wonder Information Cards are dying out there.

I wholeheartedly agree that there are issues to be resolved around devices such as WiFi routers and access points in people’s homes – MACs that they can’t change. But saying that people must stick with the MACs on their devices like laptops (and therefore can’t opt out of services that use MACs as keys) is simply not true. Many of my friends routinely use fake or random MACs (particularly when on the road and accessing WiFi networks that they can’t fully trust). Doing this is pretty much trivial for Linux and OS-X users, and for Windows there’s MacMakeup.

Of course we can’t all be DE:AD:BE:EF:CA:FE all at once, but the collision space is large.
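For the curious, here’s a minimal sketch of what a tool like MacMakeup is doing when it picks a random address: generating a 48-bit MAC with the ‘locally administered’ bit set (so it can’t clash with a vendor-assigned OUI) and the multicast bit clear. The helper name is mine:

```python
import random

def random_mac() -> str:
    """Generate a random 48-bit MAC that is locally administered and unicast.

    In the first octet, bit 0 (individual/group) must be 0 for a unicast
    address, and bit 1 (universal/local) is set to 1 to mark the address
    as locally administered rather than vendor assigned.
    """
    first = (random.randrange(256) | 0x02) & 0xFE
    octets = [first] + [random.randrange(256) for _ in range(5)]
    return ":".join(f"{o:02X}" for o in octets)

# With 46 freely choosable bits there are 2**46 (~7e13) addresses to
# pick from, so accidental collisions on one network are vanishingly rare.
print(random_mac())
```

Actually applying the address is then a one-liner with the OS tools (ifconfig on Linux/OS X, or MacMakeup on Windows).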


This is going to be one of those ‘to save me from repeating myself’ sort of blog posts, as I seem to have been frequently engaged in conversation recently about this topic.

Apple

Let’s start with the elephant in the room. Apple redefined the smartphone with the launch of the iPhone, and though I don’t have one myself I’ve had an iPod Touch since the early days so I’m no stranger to the platform. Many seem to be convinced that the iPad is going to work the same magic, but I’ll stick to what I said previously on this – I think the iPad will create a category but won’t necessarily dominate it.

Apple clearly has a head of steam in its app store, and seems for the time being to be the platform that developers will target first. I expect this to change though; as other platforms proliferate, and developers tire of the walled garden constantly having its walls moved against them, the talent and economics will stack up elsewhere.

Long term I expect that Apple as a mobile platform (in both smartphone and tablet form factors) will become much like Apple as a personal computer platform e.g. a design led premium product that leads on the simplicity of its user interface. Apple will do very well from this arrangement financially (as premium products bring with them enhanced margins), but the growth that has propelled them past Microsoft will stop sooner than many buying the stock today might hope.

Android

Android is the clear challenger to Apple’s crown, but there’s a lot more to it than that. I don’t presently own an Android device as I believe the platform is (or at least was until very recently) in a phase of such dramatic change that it was a sure road to buyer’s remorse – any Android handset bought today (on a typical 2 year mobile contract) will seem positively prehistoric before its upgrade time (just look at the G1!). Since Google Apps is where I live on my laptop I expect that I will get an Android device when my present handset is due an upgrade, but that’s almost a year away. I desperately hoped that the plan by Google to sell the Nexus 1 SIM free and without a contract would be something I could buy into, but the economics of mobile remain a strange beast (it seems a SIM-free handset plus a separate contract costs more than a subsidised handset with contract).

I don’t subscribe to some of the present arguments that Android will conquer the world because it has tethering[1] and the iPhone doesn’t etc. This is not a features war, and features can easily be copied between platforms. Android will conquer the world because it will out evolve the competition – it will suck developers and handset manufacturers and everybody else in the ecosystem in, and collectively they will push the platform forward faster than any single organisation could on its own. Google may be in the driving seat, but others will provide the pit crew, and the engine, and stickier tyres etc. Android will win the race because it’s more manoeuvrable.

The rest of the field

Having already declared what I expect to be the winner (at least in terms of long term shipment volumes and application footprint) it’s worth taking a look at what else is out there (and I’m inclined to agree with Tim Bray that ‘Two is not the cosmically correct number of viable mobile platforms’).

RIM

This seems to be the one that people keep overlooking in this debate, and that the techies always have a reason to discount, but that keeps on being there stronger than ever. BlackBerries have this perception of being business only, just for email, but that’s just not the case any more. The BlackBerry has a huge market share, and it’s not all business – ordinary people buy them as consumer devices because they don’t care that the browser sucks so long as they can email friends and use a (barely functional) Facebook application. I’ve read elsewhere that BlackBerry Messenger is a huge hit with US teens, though I’ve seen no first hand evidence of that. What I do know from my own experience is that they make great devices for symmetric communication (via email) – where you send stuff rather than just reading. Also the apps are getting towards ‘good enough’ – the recent Twitter app may not be as swish as TweetDeck on my iPod, but it does the job – I can read (and write) tweets on the hoof.

For some reason I can’t quite pin down the BlackBerry has fallen way behind on its browser, which always was clumsy and slow compared to the state of the art. It also seems to be execrable from a development point of view (and their app store is truly awful). This will be what kills RIM (unless it raises its game) – when the choice is between mobile optimised HTML5 and ‘there’s an app for that’ and their device does neither well. I chose my present BlackBerry because I wanted a keyboard (and knew I couldn’t get on with an iPhone style touch interface); when it comes to upgrade time I may be able to get an Android device that has a keyboard, or somebody might just have a touch screen that I can live with (unlike the Storm).

Palm

The Treo was perhaps the device that first defined the smartphone by colliding a PDA into a phone. I remember when I first saw the 600 and I was frankly amazed by how small but functional it was (and I put up with its flakiness for a whole year and a bit until the 650 came out shortly after my upgrade window had opened). Sadly Palm OS was limiting, and the diversion to making Windows Mobile devices didn’t help with development. When the Pre came along last year I could see ‘too little, too late’ writ large on the impending tombstone.

and then HP came along and saved the day, and I’ve been scratching my head ever since trying to figure out why. On one side there’s the cost (and inflexibility) of a WinTel approach to mobile/tablet, but why didn’t they just join the Android party? One thing’s for sure, the $1.2Bn that HP paid for Palm is just table stakes, and they’re going to have to throw down a lot more money if they want to seriously have a go at bringing WebOS into centre frame for developers and consumers.

Windows Mobile

There was a brief period a few years ago, before the iPhone was born, when Windows Mobile sucked just a bit less than everything else in the PDA inspired world of smartphones. During that period I bought an Orange M3100 (aka HTC TyTN), which wasn’t bad, but wasn’t great either.

These days the only positive vibe I hear about Windows Mobile comes from Microsoft partners who’ve been sucked into the Redmond spin machine. They’ve been shown shiny PowerPoint, and talked to by evangelists who are passionate and convincing. What I don’t see is a passionate user community. People (including MS employees) always seem to put up with Windows Mobile rather than loving it. I don’t believe that the desk bound .Net developer ecosystem magically translates into a huge pot of Windows Mobile developers (and great apps). MS are kind of caught between a rock and a hard place on this one – how do they monetise Windows Mobile without charging a per unit license, and yet who would want to pay that license fee when they can get Android? How does MS add the value that it’s trying to extract from its licensees?

Nokia

I used to love the way that Nokia phones just worked when others didn’t (in particular Motorola, but Ericsson were guilty too). I still use a 7210 as my travel phone as it’s unlocked and the battery runs for days on end (weeks when it was fresh). Nokia sadly seem to have lost their way. They still make great little phones, but they don’t make great little smartphones. The last Series xx device that I spent time with was just awful.

My bet is that Nokia will eventually join the Android pack, but that it will take a long time for them to get over the (emotional) sunk cost of where they are with Symbian. Open Source will not save them – an open source project without a grass roots community is always a worse place to be than a community without open source.

Others

There’s some other interesting stuff rattling around out there like INQ (and we don’t have to look too far in the rear view mirror for things like the Sidekick), but frankly you’d have to be mad these days to start something from scratch with such great open source platforms available.

What about the operator?

Android is shifting the balance of power in the mobile space, and this seems to be disrupting the operators as much as the incumbent platform plays. Eric Raymond has had a string of meaningful things to say about this in recent weeks (though he does fall into the same trap as others by fixating on specific features). I think he’s probably right – the operators can bitch and moan as much as they like about what’s happening here, but the market will route around them if they become an obstacle. Sean Park covers this superbly in his platforms, markets and bytes presentation at eComm 09. The real problem here is that the operators have been trying too desperately to have some kind of value add (that justifies the enormous cost of those spectrum licenses), but all the consumer wants is reliable service and accurate billing (the stuff that telcos are supposed to be good at). Which brings me to my endnote…

[1] Tethering is going to be a very controversial feature. On the consumer side I can completely understand people not wanting to carry around another device like a dongle or a MiFi, and tethering is the natural answer to that (provided your battery can hold out, or you’re happy to tether by wire). The mobile telcos seem less happy with this route, as it turns out that the ‘killer application’ for those expensive 3G licenses is data – plain dumb pipes – and the only way to monetise that data is to sell a separate contract for it. Selling a separate device along with that separate contract has been an integral part of the smoke and mirrors game of getting the consumer to pay for (mostly) the same thing twice.

Update 2 Jun – AT&T have just relaunched their smartphone tariffs, and tethering is now on the menu (in anticipation of iPhone OS 4). The price is $20/month, which just gets you the right to tether – no extra data! This seems pretty outrageous to me when compared against $10/GB for AT&T’s out of bundle usage, or the £15/15GB that I pay to Three for mobile data. So there we have it: AT&T think that the right to tether is worth around 2GB/month. I find this pretty disagreeable, but it will be interesting to see how consumers and other suppliers respond. I suspect that the operator that offers tethering with a bump to the data bundle will be the consumer’s choice (e.g. charge the extra $20, but give the extra 2GB of headroom with it).
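The arithmetic behind that 2GB figure is trivial, but worth writing down (using the prices quoted above):

```python
# Back-of-envelope arithmetic on the tethering tariff comparison.
att_tether_fee = 20.0   # $/month for the right to tether - no extra data
att_overage = 10.0      # $/GB for AT&T out-of-bundle data

# What the tether fee would buy as plain data at AT&T's own overage rate.
implied_gb = att_tether_fee / att_overage
print(f"AT&T price the right to tether at ~{implied_gb:.0f}GB/month of data")

# Compare with Three's bundle: GBP 15 for 15GB.
three_per_gb = 15.0 / 15.0
print(f"Three works out at about GBP {three_per_gb:.2f}/GB")
```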


BlackBerry OS 5

15May10

One of the side effects of getting a new machine was that I used BlackBerry Desktop for the first time in ages. When it started up I was offered an upgrade to the newer 5.x OS, which I went ahead with (luckily I wasn’t in a hurry). Here’s my experience of running the new OS on my 8900…

Icons

After the backup/upgrade/restore process was complete all of my icons were back to where they’d started life rather than where I’d left them, which then led to a little spell of hunt the app. The upgrade also reset my background. Pretty much everything else made it through unscathed, apart from…

SplashID

The main reason for plugging my BlackBerry into my new tablet was to sync stuff that I had saved in SplashID. This didn’t go well for me. When I found the app again and started it I was asked to set a password (rather than asked for a password), and all of my entries were garbled :( Luckily I was able to export what I had on the old netbook and import it onto the new desktop app, which then synced things back to my BlackBerry.

Twitter

I quite like the new Twitter app on the BlackBerry, but it went missing during the upgrade, meaning that I had to reinstall (at which point my BlackBerry said that the app was there, and that I was ‘upgrading’ to the same version). After reinstalling I had to set things up again, but that wasn’t too time consuming.

I still have one gripe about this app – why can’t it be launched without a network connection? It works fine for reading tweets when you lose the connection. One of the killer things about BlackBerry email is that it works pretty seamlessly between online and offline, and Twitter should be no different.

Battery consumption

In the last few weeks before the upgrade my battery life was becoming a real problem – I was struggling to make it through a busy day. This seemed pretty poor considering that when I got the machine a year ago it could go 2 days with ease and manage 3 at a stretch (and this is the primary advantage of having a 2.5G device rather than 3G). I blamed the Twitter app (as it was the thing that I’d installed most recently), and took care to have a USB charging cable with me.

Anyway, the post upgrade BlackBerry seems to be back to being able to cope with 2 days (even though I’m still running the Twitter app) so it seems that the new OS is doing some really smart things with power management.

Lock/unlock/standby

The good news – the spurious messages about standby mode that used to happen on pressing the mute button when locked seem to have gone away.

The bad news – unlocking seems to be less reliable than before. I sometimes need to press the unlock 2-3 times to get the unlock dialogue box to appear.

Performance

The clock mini icon still appears with annoying regularity, but maybe just a little less than before. Performance is certainly worse than when I ran without encryption, but encryption isn’t really something I consider optional for a device with my contacts and company email on it.

Overall

The main benefit of the upgrade is a return to decent battery life, and it was worth the (small amount of) trouble just for that. There are a few cosmetic touches that I could easily live without, and while some niggles have come, others have gone.

Amazingly it seems that very few BlackBerry users have gone down the upgrade route. I’d certainly recommend it if you’re on an older OS and having battery life issues.


This is my first follow up post after my first impressions, which went up ten days ago now. I’m using the X201 as my main machine on the road, at work and around the house, so it’s getting to the stage now where I know it reasonably well.

Pimping my ride

I’ve done a few upgrades in the last week or so:

  • An extra 2GB of RAM, bringing the total up to 4GB. I’d previously noted that 2GB was a bit stingy, and I had an extra 2GB on order when I last wrote. Of course when it arrived it immediately rubbed my face in the fact that 32bit Windows 7 really doesn’t deal well with any more than 3GB. Time for another upgrade then…
  • Windows 7 x64 (Ultimate). I’ve been a bit cautious about using x64, fearing issues with drivers (and particularly driver signing). That said, I’ve been running Server 2008 x64 on my garage box for a while now without any snags, and the recently arrived ThinkPad Edge machines I got for the office also had Windows 7 x64, so it was time for a change. I didn’t want to blow away the supplied system though, and 250GB was a bit of a squeeze, so for a bit of space (and speed)…
  • A Seagate Momentus 500GB 7200rpm. This drive is big and fast. I contemplated a large SSD, but felt that 256GB wouldn’t be large enough, and the price of 512GB SSDs is still stratospheric. Seagate claim that the Momentus only uses 0.05% more power than a regular 5400rpm drive, and it doesn’t seem like a power hog (but it does feel quick). It also works with the ‘airbag protection’ mechanism, which was at one time the sole preserve of Hitachi/IBM drives.

I think it needed all three upgrades to unshackle this machine’s capability, but it now really does feel fast and responsive.

Plugged in, not charging

I committed the cardinal sin of not waiting a full 12 hours for the battery to charge when I first got the machine, so I feel that I only have myself to blame for subsequent issues. But it was new, and shiny, and I needed it.

Anyway, battery management seemed to be a complete game of chance for the first week or so. The machine would regularly fail to charge, and battery life indicators would swing quite dramatically.

The update to Windows 7 x64 (or the clean install process) seemed to bring things back onto an even keel, and I had a bit more confidence that I could get 3, maybe even 4 hours out of the 8 cell battery. Then yesterday it basically refused to charge past 43%. I’ve now updated the BIOS, and it seems to be behaving, but this power management bugbear goes back to the early days of Vista, so I’m somewhat shocked that the shipping BIOS for a machine this new would still suffer.

Network management

I continue to be impressed by the GigE, which seems more useful than ever when paired up with a really fast drive.

The Lenovo comms utility seems to have been the only driver that has refused to install properly since the x64 upgrade. To be honest I’m not missing it as the default Windows 7 network management works well enough and doesn’t have me tearing my hair out at seemingly random radio on/off actions.

The bump is on the wrong side

After a bit of investigation I turned up an answer to my issue with ‘Display cannot switch to secondary portrait with this configuration’. It seems that it’s all to do with the position of the WiFi/3G antennae and their proximity to the body when the screen is used in tablet mode. Lenovo, I have a suggestion – put the antenna hump on the left hand side.

Resuming

Since the clean install/upgrade to Windows 7 x64, resume from sleep seems to be more reliable and robust than it was before, and it also no longer starts playing paused videos, which was a niggle before.

Video

I had a go at playing some HD (720p) video, which to be honest didn’t look that much more impressive than SD, but it barely tickled the available resources. There was certainly no need to go into a battery sapping energy management configuration (though it was nice to turn up the screen brightness, which seems to suffer more than on my old s10e – maybe because of the tablet sensor stuff acting as a filter).

I also tried some video transcoding, which seemed impressively fast, though Divx Plus doesn’t give FPS data like the old Dr Divx. With a fast network and drive it’s almost worth copying stuff to and from this machine for transcoding.

Overall

The upgrades that I did were relatively cheap (about £100) and have transformed the machine. It really does feel pretty top class now, and I’m looking forward to continued use as my main machine for work and travel.


Not Only SQL

07May10

No, or Not Only

One of the most sensible things to emerge from the recent no:sql(eu) event (which sadly I didn’t attend) was a statement that NOSQL should be expanded to Not Only SQL rather than No SQL. This is an interesting development, as there’s been lots of good stuff going on in the NOSQL world, but the debate has been polarised and driven off centre.

I could be wrong about this, but it seems to me that the movement would be better labelled NORDBMS, as it’s with relational databases that the problem lies, not the language used to query them.

The central point: SQL != RDBMS

Now I don’t have any particular complaint against the RDBMS per se, it’s just that the world has moved on from a place where an RDBMS can be used as the default mechanism for persistence. The RDBMS has its place, and that place isn’t everywhere for everything. In fact I’d go so far as to say that the rational architect wouldn’t choose an RDBMS from the plethora of available choices for all but a few projects starting from scratch today. That is, if it wasn’t for organisational (and ecosystem) biases, if it wasn’t for…

The cult of the DBA

DBAs are the self-appointed guardians of the modern firm’s ‘books and records’. They’re a conservative, safety-first sort of bunch, which is why they don’t really do new technology. And why should they, when the vendors that keep them supplied with T-shirts, track days and conferences to slack off to assure them that they’re on top of all this newfangled stuff like XML and object caching and data warehousing?

The cult of the DBA is also powerful, with its leaders firmly ensconced in senior positions within the firm and across the industry. There are rules around here, and we know who set them.

Leaving practicalities aside, when you ascend to layers 8 & 9 of the OSI model (politics and religion), then the essence of the NOSQL movement is ‘screw you DBAs, I’m going to do this myself, I don’t need you and your time consuming processes’.

The cult is not alone however, as the RDBMS is just a veneer on top of the real persistence and…

The priesthood of storage

These are the guys that add the extra zeroes to the end of your cost per TB. The guys that have to run their own private networks (with their own esoteric protocols) to make up for the fact that storage inherently has no security.

Of course the priesthood make out like they’re saints – these are the guys that work weekends, just in case that slip of a finger takes out the whole trading floor.

Together, the cult and the priesthood have established themselves in such a way that the typical enterprise developer has no choice – if you want your data to be secure (and who doesn’t) then you put it in here (the RDBMS) and we look after it there (an EMC DMX or similar).

The impedance cost

I have written before about impedance mismatches in data, so I’ll try not to repeat myself too much.

The trouble with the RDBMS is that relational form (and the set theory it’s based on) and, yes, the SQL query language itself often aren’t the optimal way of representing data. This is especially true if you manipulate that data in an object-oriented language (where you almost certainly represent it as objects) or exchange that data with others (using perhaps a text-based representation like XML). There are conversion costs in time, compute effort and fidelity when moving between representations. Common sense therefore suggests that if data can be stored in the same form that it’s manipulated then that’s a big win.
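To make that conversion cost concrete, here’s a minimal Python sketch (the `Order` class and the table shapes are purely illustrative, not any real schema or ORM) showing how naturally nested object data has to be flattened into separate relations for an RDBMS and reassembled on the way back out:

```python
from dataclasses import dataclass


@dataclass
class Order:
    order_id: int
    customer: str
    items: list  # nesting comes for free in the object representation


# In object form the structure matches how the code uses it:
order = Order(1, "ACME", [("widget", 2), ("gadget", 1)])

# To store this relationally you must flatten the nesting into two
# tables (rows here modelled as plain tuples):
orders_row = (order.order_id, order.customer)
order_items_rows = [
    (order.order_id, name, qty) for name, qty in order.items
]

# Reassembly on read – effectively the JOIN your data access layer
# runs every time the object is loaded back:
rebuilt = Order(
    orders_row[0],
    orders_row[1],
    [(name, qty) for (oid, name, qty) in order_items_rows
     if oid == orders_row[0]],
)
assert rebuilt == order  # round trip preserves the data, at a cost
```

The round trip is lossless here, but every read and write pays the flatten/reassemble toll – which is exactly the win a store that keeps data in its manipulated form avoids.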

This problem isn’t confined to the code, as it extends to the modelling domain too. Entity Relationship (ER) tools are great at helping you organise how you put things into an RDBMS, but I feel that’s where the story ends.

The training cost

A (perhaps specious) argument I’ve heard against new data management techniques and technologies is that ‘my developers know SQL’. Setting aside for a moment my view that SELECTs and JOINs are blunt instruments compared to the other tools in the bag, I do accept that fully exploring and understanding the tool bag is too much for the average developer – so if SQL is what they know then let’s take advantage of that. This is where things get interesting…

Bolt on (or bolt in) a SQL parser

Just because you have an unconventional data storage paradigm (e.g. anything that’s not an RDBMS) doesn’t mean that you can’t understand SQL. Mike Stonebraker clearly understood this when he chose StreamSQL as the way to go for complex event processing (CEP) queries with StreamBase. Others are starting to follow the lead. Sean Park recently pointed me at GenieDB, which looks like a very interesting hybrid of memcached and MySQL. My first reaction was that it’s a system that will let you have your cake and eat it (and that provides a reasonable path to eating LOTS of cake really quickly when that need arises). I hope to kick the tyres soon (and report back here on what I find).
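To illustrate the ‘bolt on a SQL parser’ idea, here’s a toy Python sketch (not how StreamBase or GenieDB actually work – the store layout and the supported grammar are entirely made up for illustration) that parses a trivially small SQL subset and evaluates it as a scan over a key-value store:

```python
import re

# Toy key-value "store": keys are "table:id", values are row dicts.
store = {
    "user:1": {"id": 1, "name": "alice", "city": "london"},
    "user:2": {"id": 2, "name": "bob", "city": "leeds"},
}


def tiny_select(sql):
    """Parse "SELECT col FROM table WHERE field = 'value'" and run it
    as a filtered scan over the key-value store."""
    m = re.match(
        r"SELECT (\w+) FROM (\w+) WHERE (\w+) = '([^']*)'", sql, re.I
    )
    if not m:
        raise ValueError("unsupported query")
    col, table, field, value = m.groups()
    return [
        row[col]
        for key, row in store.items()
        if key.startswith(table + ":") and str(row[field]) == value
    ]


print(tiny_select("SELECT name FROM user WHERE city = 'london'"))
# ['alice']
```

A real implementation would of course need a proper parser, planner and indexes, but the point stands: the familiar query language is a front end that can be decoupled from relational storage underneath.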

And in the cloud

This post wouldn’t be complete without a comment on the big news of the day – VMWare (Springsource) buying Gemstone, which is clearly a big bet in the NOSQL space. This will get interesting, as VMWare now have all of the pieces to make a complete Platform as a Service (PaaS) that can sit on top of VSphere (and other?) Infrastructure as a Service (IaaS). Since they can abstract and automate a lot of the traditional admin overhead out of the way, I think this torpedoes the cult of the DBA and the priesthood of storage – there simply isn’t a place for these people in a cloudy world (public, private, hybrid or whatever). And just to be clear, I’m not arguing that systems administration evaporates into the cloud (I’m with the very wise Simon Wardley on this one); but the discipline of systems administration changes – progress routes around obstacles, and the cult/priesthood certainly were obstacles. NOSQL has implications for developer productivity, systems administration workload and risk. None of these are trivial, but new and better choices are now on the table, and I sense that there’s more to come.