I mentioned in my August 2022 post that I’d write separately about this, so here goes…

The Google Developer Experts program is a global network of highly experienced technology experts, influencers, and thought leaders who have expertise in Google technologies, are active leaders in the space, natural mentors, and contribute to the wider developer and startup ecosystem.

Google Developers – Experts


Atsign has been working with the Dart & Flutter community since before I joined the company, and a bunch of the community leaders we’ve worked with as advisors are Google Developer Experts (or GDEs as they’re most commonly abbreviated). My colleague Anthony, who leads our developer relations efforts, thought it would be a good idea for us to have our own GDE (or GDEs), and initially suggested that I help mentor one of the team through the process. That plan didn’t survive contact with reality, though I’m hopeful that in due course we will grow more GDEs in the company.

Having taken a look at what was needed I ended up suggesting that I could apply myself. I’d been talking about Dart, writing about Dart, and generally trying to get my friends in the industry interested in Dart since shortly after starting with Atsign. So maybe I could do it.


I reached out to Majid Hajian for guidance through the process. He’s one of the most visible people in the community through his work on Flutter Vikings, and I’d had the pleasure of meeting Majid and speaking alongside him at last year’s Droidcon London.

His primary guidance to me was to catalogue everything I’d done in the community as preparation for the application process. So I pulled together a spreadsheet of every presentation, every appearance on webcasts, every blog post, every GitHub issue and pull request, and the analyst coverage I’d contributed to. Once that was ready Majid sent me a link to the application form.

Step 1 – Application form

The programme uses a platform from Advocu, and the ‘Eligibility Check’ begins with filling out a form detailing all those community activities. For each item there’s a measure of impact – how many people attended the talk, or watched the video, or read the blog post? Majid’s advice helped a lot here, but I didn’t have detailed/accurate numbers to hand for some of the items, which left me fretting a little about one particular post where I’d guessed it reached 250 people then subsequently learned it had been over 5000.

Talking to other GDEs (and aspiring GDEs) this seems to be the stage where most people trip up. They either don’t have enough quantifiable impact on the community, or their work is seen as too entry level and they’re directed to produce stuff that’s aimed at an intermediate/advanced audience.

Steps 2 & 3 – Community Interview and Product Interview

Having thankfully passed the eligibility check, the next steps were a couple of interviews: first with an existing GDE, and then with a Googler. They both went pretty much the same way, with the conversation focussing first on presentations I’d done in the past (on things like Dart in Docker on Arm) and then on new things I’m working on (like a comparison of Dart ahead-of-time [AOT] binaries with just-in-time [JIT] snapshots, and the little compiler bug I’d found on my adventures). My product interview also strayed into some discussion of the product roadmap, and areas where I saw frustration.

In both cases the interviews didn’t feel like a grilling designed to find weaknesses in my knowledge, but rather a chat between enthusiasts about areas of interest and how we can make things better for everybody.

Step 4 – Legal Sign Offs

The GDE programme exposes members to pre-release information, so naturally Google asks for confidentiality using a non-disclosure agreement (NDA). As NDAs go it’s one of the shortest, simplest and fairest I’ve seen, so I had no hesitation in signing (other than it needing to be redone to get my correct full legal name).

Step 5 – Onboarding

I missed my onboarding meeting due to a prior commitment, so as I write this I’m not quite fully onboarded to the programme. But I’ve received lots of invites to online groups etc., and also a whole bunch of materials that I can use to help with future presentations etc.

There are badges, like this one:

The community

Days after receiving the good news I was in Oslo for Flutter Vikings 2022, which was quite the gathering of the European Dart & Flutter community, with many of the GDEs from the region there. It was a great event, and I had a great time meeting folk and chatting about what we’re building at Atsign and how we’re using Dart at the heart of that. More on the event from Anthony in his writeup.


I’ve got some talks about Dart coming up at QCon San Francisco and QCon Plus, and I’m sure there will be more towards the end of the year and into 2023. Of course I’ll be keeping track of those in Advocu, the platform used to help decide whether people remain in the programme from one year to the next by showing continuing contribution to the community.

As I looked at other GDEs’ LinkedIn profiles to see how they present themselves, I saw that the most popular choice (and in fact the approach Google suggests) is to add it to the Volunteering section. GDE isn’t a certification or an honour or award. It’s a bunch of volunteers helping their communities grow and learn. I look forward to the future presentations, because I’ve enjoyed doing that stuff for a long time; but I also look forward to the behind-the-scenes stuff: mentoring, guiding, interviewing future GDEs, and helping the Google product team do the best job they can.

A footnote on imposter syndrome

Something that initially put me off applying for the programme is that I’m not the world’s greatest Dart programmer. There are probably half a dozen better Dart developers than me just at Atsign, and we’re a tiny startup. If the product interview was anything like the dreaded whiteboard sessions in most technical interviews then I was doomed to fail. But it quickly became clear that GDE is about developing talent, and being open to learning, rather than being the stereotypical ‘expert’ developer. It’s not necessarily about writing in Dart (or any of the other Google Developer tools) all day every day, it’s about community contribution, and that takes many forms beyond just writing code.

People who know me well may wonder why I didn’t apply to become a Google Cloud Platform (GCP) Expert. I’ve been using GCP since the launch of Compute Engine, I got the Professional Cloud Architect (PCA) certification (now expired), and I use GCP pretty much daily in my Atsign work. But I’m not an active member of that community; I don’t go to events and talk about GCP, and apart from some very specific product management calls it’s not a platform where I feel I’m pushing things forwards. It’s different with Dart: I may not be the best coder, but I’ve found a few things where I could make my mark, and I just have to remember that when the shadow of imposter syndrome looms.

PS anybody wanting to learn more about imposter syndrome and dealing with it should check out one of Jessica Rose’s talks on the topic.


Using SSH keys is already a big part of the git/GitHub experience, and now they can be used for signing commits, which saves having to deal with GPG keys.


For a while I’ve been signing my git commits with a GPG key (at least on my primary desktop), and GitHub has some nice UI candy to add a Verified badge for anybody looking there to see.

When I recently created the Atsign guide to GitHub, I even included a section recommending signed commits.

Git 2.35 introduced the ability to sign commits with SSH keys, which people are far more likely to have than GPG keys, and recently GitHub (finally) got functionality in place for people to register SSH keys as signing keys, and show those Verified badges.

First you need a newer git client

I mostly use Ubuntu 20.04 LTS for things, and that comes with git version 2.25, which is nowhere near new enough. Thankfully it’s pretty easy to add the git-core package archive and grab the latest stable git from that:

sudo add-apt-repository ppa:git-core/ppa
sudo apt update
sudo apt install git

That just worked for me, but if you run into trouble, take a look at this StackOverflow guide for some tips.

Now set up git to use an SSH key for signing

I found a few guides for this, but the one I shared with colleagues is Caleb Hearth’s ‘Signing Git Commits with Your SSH Key’.

The main point is to enable signing, and tell git to use SSH:

git config --global commit.gpgsign true
git config --global gpg.format ssh

Initially I did this without the --global flag to experiment on a single repo.

Git also needs to be told about your public key:

git config --global user.signingkey "ssh-ed25519 <your key id>"

All of the examples I’ve seen use shiny new (and short) ed25519 keys, but good old rsa keys can also be used.
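Rather than pasting the key text, recent git will also accept a path to the public key file here, which is a bit less error prone. A sketch, assuming the default ed25519 key location (adjust the path to suit your own setup):

```shell
# Alternative: point git at the public key file rather than pasting
# its contents (~/.ssh/id_ed25519.pub is just the default location):
git config --global user.signingkey ~/.ssh/id_ed25519.pub
```

Either form works; the file path just saves a copy/paste step when keys get rotated.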

What about my private key?

The line above uses a public key, and you also upload a public key to GitHub so it can display that Verified badge, but git needs your private key to do the signing, so there’s a little bit of key management magic happening behind the scenes.

Peeling back the magic, there’s an assumption that you’re running ssh-agent, which might be a fair assumption for folk who get deeply into doing stuff with SSH keys, but breaks on a lot of default setups. My WSL2 install of Ubuntu doesn’t have ssh-agent there by default, so I needed to run:

eval $(ssh-agent -s)

But that’s only a temporary fix for the life of that shell session. I’ve added the following to my .bashrc:

{ eval $(ssh-agent -s); ssh-add -q; } &>/dev/null

I’ve seen a suggestion (in ‘Using SSH-Agent the right way in Windows 10/11 WSL2’) that the keychain tool should be used for this, but my fairly simple setup is just fine without it.

The GitHub bit

GitHub just needs the SSH public key (just like when setting up SSH keys for authentication), except the key is added as a signing key:
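GitHub does the verification for its web UI, but git can verify signatures locally too (for git log --show-signature), given an allowed signers file mapping identities to public keys. A sketch, where the email address is a placeholder and the paths are just the conventional defaults:

```shell
# Create a signing key first if you don't already have one
# (the paths below are just the conventional defaults):
mkdir -p ~/.ssh
[ -f ~/.ssh/id_ed25519.pub ] || ssh-keygen -t ed25519 -N '' -f ~/.ssh/id_ed25519

# Allowed signers file: one "<principal> <keytype> <key>" line per identity.
# you@example.com is a placeholder for the email on your commits.
echo "you@example.com $(cut -d' ' -f1,2 ~/.ssh/id_ed25519.pub)" > ~/.ssh/allowed_signers
git config --global gpg.ssh.allowedSignersFile ~/.ssh/allowed_signers
```

With that in place, git log --show-signature reports a good signature on your own commits as well as GitHub showing the badge.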

But wait, there’s more…

I’m pretty sick of seeing messages like this:

$ git push
fatal: The current branch cpswan-anotherbranch has no upstream branch.
To push the current branch and set the remote as upstream, use

    git push --set-upstream origin cpswan-anotherbranch

and because I always just copy/paste the correct line from those error messages, I’ve never learned the actual incantation to get it right first time. I extra hate it when software knows exactly what I’m trying to do, refuses to do it, scolds me and tells me what I should be typing.

Thankfully recent git can now be configured to just get on with it:

git config --global --add --bool push.autoSetupRemote true


Setting up GPG was enough of a hassle that I’d only done it on my desktop, but setting up SSH signing is easy enough that I’ll be happy to do it everywhere that I use git. There are a few extra hoops to jump through to make everything seamless, but they’ll get smoothed out over time.

August 2022



After July’s heatwave brought record temperatures there was little let-up for most of August, with lots of hot, dry weather. We frequently had to pick walking routes that kept the boys in the shade, and when the weather did finally break we were treated to a ‘false autumn’.

Hot, damn hot

In last November’s post I wrote about getting a heat pump installed. It did a great job of keeping the open plan kitchen/dining/living room warm during the winter months. The impact on the electricity bill was about 50kWh per month, which at 2022 winter prices wasn’t bad compared to running the log burner.

Having air conditioning in my office and bedroom has been a game changer, and doesn’t seem to have had a big hit on electricity usage. The A/C was also quiet enough not to disrupt my office setup (in stark contrast to the noise of the portable A/C I used last year).


Something that should offset any extra electricity used by A/C is the installation of rooftop photo-voltaic (PV) panels. I got them from Infinity Renewables through a group buy scheme organised by my local council. When the sun is shining the 16 panels seem to push out about 3kW, and I’m using an iBoost to divert excess into my hot water tank. Sadly the data logger on the inverter isn’t working, so I don’t yet have a good handle on how much energy I’m generating and what impact that’s having on my electricity consumption from the grid.

When I first signed up for the scheme I expressed interest in a battery to go with the panels. But the model that Infinity supply can’t do backup power, so I decided not to bother with that. Now might be the time to join the queue for a Tesla Powerwall.

Google Developer Expert

I’ll write a separate post about this when I find a little more time…

Just in time for Flutter Vikings I got the good news that I’d joined the ranks of the Google Developer Experts (GDE) program, which I hope gives me opportunities to do more with the Dart and Flutter community. Many thanks to Majid Hajian for shepherding me through the process, and for everybody else involved for helping me along the way.


$daughter0 got her A-Level results, and they were (more than) good enough for her to get the place she wanted on a Chemistry course at Bath, so she’ll be off to University at the end of next month. The house is already abuzz with planning and organising for the move, and I suspect things will be eerily quiet come October.

Book of Mormon

$wife and I went to see Book of Mormon years ago, and loved it. But we didn’t take the kids that time as they were a bit too young. For a birthday day out we decided to go back, with the kids this time, and the show was hilarious again. We got seats in a box, which might sound fancy (and actually was fancy in terms of comfort); but they were actually the cheapest seats in the house (sold as ‘restricted view’) with all four of us costing less than a single stalls seat. I didn’t feel I missed anything due to a ‘restricted view’ and really liked the extra comfort and privacy of the box – definitely something to consider for the next show.

Saying goodbye to a friend

July brought the very sad and sudden departure of my friend and InfoQ colleague Alex Blewitt, and I joined a group of InfoQ folk to say goodbye at his funeral. The service was very well attended, with many of his former colleagues joining family and friends. The celebrant did a wonderful job of explaining the full richness of a life that I only caught glimpses of at InfoQ editors’ dinners and Twitter threads. Alex’s humour shone through to the end, with his choice of Monty Python’s ‘Always Look on the Bright Side of Life’ playing to conclude the service.

I’ll miss you Alex. QCons won’t be the same without you there.

July 2022



Milo’s been with us for a whole year now, which meant he got to join Max this time on our family holiday in the Lake District:


We returned to Keeper’s Cottage on the Graythwaite Estate, as it was brilliant last year, and it was just as good this time around.


Our trip coincided with the hottest days ever in England, with temperatures in the South going past 40C. Thankfully it didn’t get quite so hot in the North West, but it was still well past 30C, which for me (and the dogs) is uncomfortably hot. We spent the mornings down by the lake so they could paddle and cool off, and thankfully the big old stone cottage did a great job of keeping cool for the afternoons. It was generally bedtime before the temperature outside was lower than inside, making it safe to open the windows again.

Unfortunately for $daughter0 she needed to catch us up in the Lakes, as her school trip to Zambia overlapped with the start of our week in the cottage. Plan A, to join us by train, failed decisively with the 1230 from Euston never really going anywhere due to a track side fire. If only we’d planned a little less lie in time she’d have made it on the 1130. So she trudged back home (delayed some more at Gatwick by another fire) and then had to drive up. The silver lining was that $son0 joined her for the drive, so we ended up with the whole gang in the cottage.


The Lakes were unusually quiet. I first noticed (and commented on) this on one of the hot mornings down at Windermere; but it carried on like that after temperatures returned to normal. Hiking up and down Helvellyn I counted 20 people when I’d usually expect to see hundreds. It was spooky. The two prevailing theories seem to be:

  1. People are choosing to holiday abroad now that they can again (evidence – the huge queues at Dover at the end of the week as school terms ended)
  2. The cost of living crisis means people can’t afford to travel and do stuff in the National Parks

My sense is that it’s a bit of both, which had me worried about the tourist economy, but there also seems to be a demographic aspect to it. Dinner at The Swan Hotel on Friday night was apparently rammed, and the towns weren’t that quiet. So it seems that the older folk who potter about, buy pub meals, and stay in nice hotels might be out in roughly their usual force; whilst the younger folk (and families) aren’t showing up.


I took the old boots I had to get refurbished last year, and the new boots I’d bought to stand in for them. No falling apart problems at all this time around, though I didn’t find the Vibram soles of the Meindl Merans particularly confidence inspiring on wet rocks over Striding Edge, and I did slip a few times on the stony descent to Wythburn Church.


After a good experience with the OS Maps app last year I subscribed again this year. It was great to see that all my saved maps etc. were just there and ready once more once my subscription was active.

Steam Deck

I pre-ordered my Steam Deck on launch day (16 Jul 2021) whilst in the Lakes last year, and it arrived just in time for the trip back this year. The device itself is everything I hoped it would be – a perfect way of playing PC games on the move (or sat on a sofa). On $son0’s recommendation I bought Portal and Portal 2 to try it out, and ended up playing my way through them on the ‘hot day’ afternoons in the cottage. I’ve also dabbled with Untitled Goose Game, but I’ve yet to try a AAA title like No Man’s Sky (which I bought my gaming laptop to play).

Valve have done a great job of integrating the device into their gaming ecosystem, so stuff ‘just works’. But I’m also impressed with how open it is. If you want to use the Steam Deck as a computer then it’s just a matter of adding a USB-C hub and a keyboard (and mouse or trackpad, and screen).

I suspect the Steam Deck will be joining me on future holidays (and maybe also business trips) as it’s a great way to keep entertained without taking up too much space.


Details of how the Labour party used an ersatz SIGINT operation against its own (prospective) members and supporters have come to light in the Forde Inquiry. This provides confirmation that the National Executive Committee (NEC) were excluding people for political expression on social media that didn’t align with their chosen ideology.


Almost six years ago I wrote The Surveillance Party to describe how the UK Labour Party ran an ersatz SIGINT operation to block people from becoming supporters, and then voting in their upcoming leadership election. I had to speculate back then about some of the details, because (like with proper SIGINT) everything was cloaked in secrecy.

But now we have the Forde Report (pdf)[1,2], which has a section on the ‘2015-2016 “validation” exercises’ (starting at page 40). This tells us who was doing the work, and reveals some high level details of how it was executed.

Who did it?

I was wrong in speculating that it would be a law firm or ‘big 4’ consulting firm. C2.14 tells us that the work actually went to a group of casual staff employed from the ranks of Labour Students. This might have been cheaper, but meant that some of those involved inevitably lacked the ideological purity to complete the task at hand without question or concern.

What were they looking for?

C2.17 of the report mentions ‘1,959 “flagged phrases”‘ and ’35 abusive phrases which included the word “Blairite” or “Blair”‘.

I wrote another post, ‘Racist, abusive or foul language’, looking at my tweets from the day referenced in my supporter application refusal. Since we don’t have the full list of phrases there’s still some need for speculation here, but in my case I’d expect it was ‘neoliberal’ or my RT of something with ‘#ChickenCoup’ that did it for me.

What were the consequences?

C2.20 notes that ‘there had been just under 4,000 “total actions by the NEC which includes all Supporter Rejections, Membership rejections, Auto Exclusions and Administrative Suspensions”, and that 1,024 of those actions had been against existing members’.

I was a bit miffed about not being able to participate in a democratic process I’d paid £25 to engage in. But those 1024, they’re ANGRY. I was at a screening of Ithaka the other day, which had Q&A afterwards, and one of the folk asking a question opened by describing himself as ‘an exiled member of the Labour party’. Labour was part of their identity, and taking that away from them has caused harm that they’re not ashamed to be noisy about.

Looking at the past, present, and future


Going back over all this had me revisiting why I’d even applied for a vote. I think the main thing was frustration at the Brexit referendum, and wanting to have some kind of say in the future of the country by helping to choose the leader of the opposition (or LOTO, as the report frequently has it).

The really mad thing here is that a process to exclude left wing ‘trots’ got me even though I’d not even made up my mind at that stage whether I’d cast my vote for or against ‘Jezza’.


It’s amusing to contrast what happened with Labour in 2016 with what’s happening with the Conservatives in 2022. Another party, another leadership election that involves members. The Tories don’t seem to care at all about entryism, they’re just happy to take the money. Of course the Tories are in the shape they’re in following years of entryism from ‘Kippers‘; which is why I have no interest in voting this time around as both candidates are awful.


It seems likely that this sort of shenanigans will happen more going forward. For Labour the lesson from Forde on this point will simply be that they have to ensure that their criteria for exclusion are properly promulgated. Other parties in other places will learn from what happened, and whilst we might hope the lesson will be “that’s a dystopian nightmare, let’s not do that”, I fear it will actually be “this is how we keep the membership pure”. This is not good for democracy or the conversations needed to facilitate it.


It’s interesting to finally see behind the curtain, and confirm that what I thought was happening is mostly what was happening. I was wrong on some speculative details, but right on the broad sweep of what was happening. The report spends some pages on this because what was happening wasn’t good for the party or the people it should serve; but perhaps swerves from calling out the full horror or providing more comprehensive resources (like the ‘flagged phrases’ lists).

I’m not optimistic that Labour in particular, and political parties in general, are headed in the right direction on this topic.


[1] Thanks to Craig Murray for drawing my attention to the report in his ‘The Forde Report and the Labour Right‘. I’d missed it myself amongst the news being saturated with climate disaster, the Tory performative awfulness contest, and the Lionesses kicking their way towards the Euro 2022 Final.
[2] Of course the report is published as a pdf, rather than something that can be usefully searched, indexed and cross referenced like a web page. ‘Never attribute to malice what can be explained by incompetence’, and there are plenty of (old) organisations out there that ply their trade in electronic versions of dead tree publications. But come on… it really feels like they didn’t want this stuff to be linkable.

June 2022



The boys got their first taste of ‘doggy daycare’ at a local kennels, and seemed to get on with it OK.

They even got a report card:

They also got ‘done’ this month, which led to a day or two of feeling pretty sorry for themselves:

Electromagnetic Field

I only found out about Electromagnetic Field after it had happened in 2016, and loved it in 2018. When it came to ticket sales in 2020 my mouse pointer was hovering over the ‘buy’ button, and then I convinced myself it wouldn’t be happening. And of course it didn’t happen that year, but what I didn’t consider was that the 2020 ticket buyers would get first dibs on 2022 tickets. So… I persuaded Atsign to sponsor, which guaranteed tickets for my colleague Gary and me.

The weather wasn’t quite as kind as 2018, and the arcade wasn’t quite as filled with my favourites; but I still had an amazing time there, and I’m really looking forward to 2024 and maybe a retro meetup village.

As I was getting ready to leave, the BBC coverage of the Platinum Jubilee was hitting fever pitch. I hoped the camp might provide some escape, and it really did: I didn’t see any jubilee stuff at all. It’s a place full of people living in the future, and it seems that the future doesn’t have a British Monarchy.

Regatta Karuna 6 Tent

For EMF 2018 I bought a new family tent, but given the amazing weather it wasn’t really tested.

EMF 2022 was just its second outing, and the first night revealed its one big flaw. The layout has two pods at one end, and one pod at the other, with one set perpendicular to the other. That means that if there’s any slope then one set of pods is head up on the incline, and the other is rolling down the hill. Sadly the Eastnor site doesn’t offer much level space, which was fine for those of us at one end, and pretty terrible at the other end. I mostly LOVE this tent, but this particular aspect is perhaps a major design flaw.

Slopes aside the tent was great. We got rained on, and stayed dry. It was spacious, comfortable, easy to set up, easy to take down, easy to dry out later.

School’s out

$son0 finished his 2nd year at university early on in the month, and $daughter0 finished her exams towards the end, so that’s us done with ‘school’ (where us Brits don’t use that word for University). Another one of those seminal moments of parenthood passes by.

May 2022



May saw the last of this year’s beautiful bluebells, and the boys enjoyed running around amongst them.

Back to Barcelona

After Mobile World Congress (MWC) in March I returned to Barcelona for the more focussed Internet of Things (IoT) Solutions World Congress. It was a much smaller event, but for me the conversations were much better, perhaps because this time our partner ZARIOT had a stand right on the main drag, and everybody there was into IoT stuff.

Although the event was smaller, Barcelona seemed pretty much as busy (certainly in terms of hotel space); likely due to the Integrated Systems Europe (ISE) event happening in much of the rest of the Fira Gran Via venue.

I stayed at Plaça d’Espanya again, though this time in a hotel rather than an apartment. The Occidental Barcelona 1929 was nice, and felt relaxed; and I find the location a good compromise between getting to the venue, and getting around the rest of Barcelona.


Sadly the IoT event meant that I missed most of Devoxx UK, but I was back in time for my talk on the final day. It was good to catch up with friends there, but also nice to see an audience that wasn’t the usual suspects I see everywhere else. I wish I’d had more time to hang out and get to know new people. Maybe next year?

MAME/RC2014 stuff

The RC2014 Mini stuff I put into MAME back in February has been torn out, and I’m not sad about it. The new RC2014 MAME Driver by Miodrag Milanović is pretty much everything I’d dreamed of doing, and it was a pleasure to collaborate with him on documenting what it can do.

The GraphQL Way



From a short time using GraphQL APIs I sense that there’s a ‘GraphQL Way’ for how things should be: a set of promises that the technology makes to its users. But those promises are frequently being broken, or at least undermined, as people rush to create GraphQL end points without perhaps investing enough time and effort into how things should work. The causes of this woe are mostly things we’ve seen before, as previous waves of new stuff have swept through the industry; so of course the remedies are much the same as ever, starting with building a picture of what good looks like.


I’ve been using a few GraphQL APIs over the past year or so, and many that I’ve encountered fall short of the expectations set by the Introduction to GraphQL. This leads me to think that there is a ‘GraphQL Way’, but maybe it’s not yet easy enough for API developers to follow it.

Recent conversations with James Governor, Fintan Ryan, Paul Johnston and others such as this Twitter thread make me think that there’s a need to better articulate what good looks like. This post is intended to get the conversation started. I’m not a GraphQL expert, just somebody who has to use APIs and often finds themselves thinking ‘this could be better, this should be better’.


Colossal Cave Adventure

That’s how I often feel when trying to navigate a GraphQL API. The thing that you want should be through the next door, but after going through 3-4 doors you’ve circled back in the graph to the place you were a few steps back, with no sign of the grail of enlightenment.

The promises

GraphQL makes a number of implicit promises:

1. Ask for just the data you need, get back just the data you need

GraphQL is pretty good at keeping this promise, which makes it great for its original purpose of more efficient communication between mobile apps and their backend services. It’s less chatty and less verbose, both good things when you’re worried about latency, data consumption and network flakiness.

Previous APIs developed a nasty habit of serving up giant JSON documents, expecting them to be cut down to size by the client; whilst GraphQL lets you just get the bits that are needed.
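As a sketch of what that looks like in practice (against a hypothetical schema, with made up field names), the client spells out exactly the fields it wants, and the response mirrors that shape with nothing extra:

```graphql
# Hypothetical schema: fetch one user's name plus the titles of
# their three most recent posts, and nothing else comes back.
query {
  user(id: "42") {
    name
    posts(last: 3) {
      title
    }
  }
}
```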

2. Self describing

GraphQL APIs can serve up their own schema, which should be enough to help users of the API find what they need. They have the map, so it should be easy to find the treasure.

That’s fine, provided that the map makes sense to people other than those who made it. If the schema is just another way of shipping your org chart then it could be dysfunctional. If it’s full of unexplained internal references then it might be incomprehensible. Self describing only works if the description makes sense (to a newbie with no other context to grab onto). James Scott explores this in ‘Is GraphQL really self-documenting?’.

Michelle Garrett recently talked at QCon London (link to follow once they get around to publishing) about the lengths taken by the Twitter GraphQL team to curate a schema that makes sense. This stuff doesn’t just happen, it takes work, and care – governance.

The good bit is that we’ve come a LONG way from Web Services Description Language (WSDL) and Universal Description Discovery and Integration (UDDI), and this time it’s clear that self describing can work (beyond standards org slide decks and out in the real world).
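Mechanically, the self-description comes from introspection: any client can ask an endpoint about its own types with a standard query like this, which is also why good schema descriptions matter so much:

```graphql
# Standard introspection: list every type the endpoint exposes,
# along with any descriptions the schema authors provided.
query {
  __schema {
    types {
      name
      description
    }
  }
}
```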

3. Filtering

GraphQL APIs should allow a request to specify a filter, but many implementations don’t actually provide that capability. This ends up breaking part of the first promise, as you now need to ask for everything and do the filtering locally. At least (thanks to promise 1) GraphQL queries specify which nodes/fields you want back, so you’re not having to filter through all of everything, just some of everything. It’s not pretty though, and can quickly run into needing pagination where previous REST-based approaches spared you that.
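What the promise looks like when it is kept is server-side filter arguments, something like the sketch below (again a hypothetical schema; the shape and naming of filter arguments varies between implementations):

```graphql
# Hypothetical schema: the server applies the filter, so the client
# isn't left fetching everything and sifting through it locally.
query {
  orders(filter: { status: SHIPPED, after: "2022-01-01" }) {
    id
    total
  }
}
```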

The causes

Why do the promises get broken?

New shiny

Many of the GraphQL implementations popping up are a first attempt for the teams involved. They’re inexperienced, don’t necessarily know what good looks like, and can easily walk into minefields without noticing the warning signs.


Resume Driven Development bedevils the entire industry, and because GraphQL is the new shiny there are an awful lot of new GraphQL end points getting created to bolster CVs and LinkedIn profiles.

REST inertia

Where there’s an existing REST API there’s a risk that a new(er) GraphQL API will try to repeat the same story in a slightly changed dialect. The promises of GraphQL don’t come true by adding a new end point, they have to be underpinned by some (re)thinking about design and purpose.

Not used internally

If you’re creating a new API just for external customers, and it’s not being used internally, then something is wrong. Badly wrong. And you likely won’t be getting the immediate feedback to make things right, just lots of whinging and moaning off in the distance.

Making it better

There are of course ways to help ensure GraphQL implementations live up to the promises:

Documentation, Samples and Examples

My colleagues hear me say those words almost every day, and I say them because good documentation, samples and examples are generally the difference between something that’s accessible, easy to use and widely adopted, and something that’s none of those things.

Even if it’s obvious to you, the creator of the new super smart self describing GraphQL end point, how every last drop of valuable data can be wrung out of it on demand, some samples and examples of that brilliance will help us less sharp people get to our ‘aha’ moments more quickly and easily.

Pave the golden path

There’s a lot of good stuff out there already to help people implement their GraphQL services, and corresponding good stuff to help people consume those services. But maybe there’s scope for more help with design considerations, the questions of ongoing schema governance, and helping people keep the promises of GraphQL.

Tooling might help here. But also pausing for a moment, thinking, running some experiments and getting feedback. Asking what good looks like, and charting a path there. Proper software engineering.


GraphQL APIs are coming into the mainstream, and as they do so they’re often breaking the promises made by the underlying technology and its proponents. That’s happening for a slew of fairly predictable reasons, which is why the remedies also look so familiar. We can do better, and we should do better.

Generally, MAME emulations that would use a terminal make use of the built-in terminal emulation. So you can just start MAME and get going with a screen like this:

But that has some limitations, such as not being able to copy from it or paste into it. Also, if you want to use the MAME display (layout) for some blinkenlights or similar, then there has to be some other way of connecting a terminal.

Of course, there’s a way to deal with that. Running:

rc2014 rc2014bp5 -bus:1 micro -bus:1:micro:rs232 null_modem -bitb socket.localhost:1234

This tells MAME:

  • rc2014 is starting my cut down emulator that I compiled whilst waiting for the 0.244 release (after which it can just be mame).
  • rc2014bp5 selects the 5 slot backplane
  • -bus:1 micro puts an RC2014 Micro card into the first slot
  • -bus:1:micro:rs232 null_modem configures the RS232 port on the RC2014 Micro to use null modem connectivity (rather than the default terminal)
  • -bitb socket.localhost:1234 then connects that null modem (serial) port to (TCP) port 1234 on localhost.

MAME itself will show No screens attached to the system:

But now I can attach a terminal emulator such as TeraTerm or PuTTY (using the ‘Other’ connection type) to port 1234:

MAME should be configured for serial at 115200 baud:

And the terminal settings might take some fiddling with local echo etc.
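The socket end point doesn’t have to be a terminal emulator; any TCP client can attach, which is handy for scripted checks. Below is a minimal Python client of that shape. To keep it self-contained it talks to a stand-in echo server on an ephemeral port, rather than assuming a running MAME is listening on 1234.

```python
import socket
import threading

# Stand-in for MAME's -bitb socket listener: accept one connection
# and echo whatever arrives straight back.
def echo_once(srv):
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

srv = socket.create_server(("127.0.0.1", 0))  # port 0 = pick a free port
port = srv.getsockname()[1]                   # would be 1234 for real MAME
threading.Thread(target=echo_once, args=(srv,), daemon=True).start()

# The client side: connect and exchange bytes, much as PuTTY/TeraTerm would.
with socket.create_connection(("127.0.0.1", port)) as s:
    s.sendall(b"PRINT 2+2\r\n")
    reply = s.recv(1024)
print(reply)
```

Against an actual MAME session the reply would be whatever BASIC (or SCM) sends back down the emulated serial port, so line endings and echo settings may need the same fiddling as a terminal emulator.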


COM ports

It’s also possible to connect to a COM port using something like:

rc2014 rc2014bp5 -bus:1 micro -bus:1:micro:rs232 null_modem -bitb COM6

Here I’m using com0com to provide a pair of COM ports on Windows (COM4 and COM6). I’ve found that the terminal emulator should be connected first (to COM4) so that MAME doesn’t hang when trying to connect to COM6. Again make sure that you have matching (fastest) baud rates in MAME and the terminal emulator.

This approach seems to be more reliable for cursor movement within the terminal emulation :)

Miodrag Milanović has created a new RC2014 driver for MAME, and it’s very comprehensive, offering a full range of systems, backplanes and boards. This post is intended to be a quick tour of how to use it.

At the time of writing the new driver hasn’t yet made it to a MAME release, but it should show up in 0.244. For now I’ve built a standalone rc2014.exe so that I can try things out. The command line to do that was:

make SOURCES=src/mame/drivers/rc2014.cpp SUBTARGET=rc2014 -j

I can then see a list of systems:

[MINGW64] C:\Users\Chris\git\github.com\cpswan\mame>rc2014 -lb
Source file:         Name:            Parent:
rc2014.cpp           rc2014
rc2014.cpp           rc2014bp5        rc2014
rc2014.cpp           rc2014bp8        rc2014
rc2014.cpp           rc2014bppro      rc2014
rc2014.cpp           rc2014cl2        rc2014
rc2014.cpp           rc2014mini       rc2014
rc2014.cpp           rc2014minicpm    rc2014
rc2014.cpp           rc2014pro        rc2014
rc2014.cpp           rc2014zed        rc2014
rc2014.cpp           rc2014zedp       rc2014

There are a LOT of ROMs to ‘PokeROM’ for this, which can be listed with:

rc2014 -listcrc

With the ROMs in place, let’s fire it up…

RC2014 Mini

Since I started my real life RC2014 journey with the Mini, I’ll repeat that:

rc2014 rc2014mini

The system boots up into 32K BASIC:

Hitting Scroll Lock then Tab brings up the options, and from there Machine Configuration can be selected:

Switching the ROM to Small Computer Monitor (SCM) is simply a matter of hitting the right arrow until SCM is selected, then going to Reset Machine:

At this stage a program can be typed in, whether that’s in BASIC or using the Monitor.

RC2014 Mini with CP/M

In real life it’s not long until you start adding things to your RC2014, and MAME brings instant gratification with the ability to try out expansion options.

First I need the image from ‘CPM 128MB in transient apps.zip’, which I’ve renamed A128.IMG. I can then start the emulation with the disk image mounted:

rc2014 rc2014minicpm -hard A128.IMG

Typing cpm into the SCM prompt starts CP/M:

The essential utilities are there:

RC2014 Pro

It’s fairly simple to step up to one of the bigger RC2014s. The only major change needed is to grab a CP/M image that supports the SIO/2 serial board, which I’ve renamed to S128.IMG:

rc2014 rc2014pro -hard S128.IMG

The big difference here is that the Slot Devices menu can be played with:

This provides lots of opportunities to play with different boards, and fiddle with the config of them.

Whilst many of the official RC2014 boards are there already, there’s also lots of scope to replicate the huge variety of other boards out there (or create new functionality from scratch).

Running Zork

One of the points of retro hardware emulation is to enjoy retro software, particularly games, and Zork was one of the defining games of the era.

With the Zork binaries in hand CPMtools can be used to copy them (in this case to the G: drive):

cpmcp -f rc2014g S128.IMG ~/zork/Z. 0:

NB: I’m using a patched diskdefs file there to provide the definition of rc2014g

Then start up CP/M, switch to G: and run zork1:


RC2014 Zed Pro with RomWBW

The largest RC2014 systems such as the Zed and Zed Pro use the 512K ROM 512K RAM board with Wayne Warthen’s RomWBW.

Grab the hd_combo.img file from the binaries directory inside the v3.0.1 release package, and mount it as a Compact Flash card:

rc2014 rc2014zedp -bus:5 cf -hard hd_combo.img

This allows the system to boot up with lots of goodies installed:


This has been a very brief introduction to the new MAME RC2014 driver, and there’s a lot to explore, loads more hardware configurations to try out, a ton of cards (and similar systems) yet to be emulated, and endless possibilities bringing together the variety of the RC2014 ecosystem with the breadth of emulated components in MAME. I’m looking forward to tinkering, and seeing what happens next.