Releasing code as a public service

Earlier this week, the BBC took the opportunity to showcase its newest app at the Radio Festival. BBC Sounds is an audio platform that unifies the BBC's audio ecosystem, gathering all the live broadcasts, podcasts and music mixes of its 57 radio stations (9 national + 8 world/regional + 40 local) in a single place. While showing off the new app, the BBC's director of radio and education, James Purnell, claimed that the British public broadcaster is developing its own "public service algorithm".

The pervasiveness of algorithms in our daily lives is immense, but the invisibility cloak under which each one of them imposes itself on our routines is what makes them so insidious. By now, we know that echo chambers are one of the pernicious sides of being given just what someone (or something) knows we will probably like. As the internet keeps feeding us algorithmic suggestions of content, the results range from a somewhat innocent narrowing of our musical taste to a much more serious case of completely false perceptions of the world. Nonetheless, it is important to note that these bubbles can offer a safe harbour to people who feel outcast, often giving them a sense of belonging to a community.

Since it is conceived and made by human beings, technology is obviously prone to being faulty. Over and over again, algorithms have proven to be susceptible to bias. When training data is selected and framed by humans, the deep-learning process is easily skewed, and both the maths and the machines get caught up in our own distorted views of the world. The inequalities of our physical societies are replicated in the digital sphere. As such, algorithms are also permeable to prejudiced mindsets such as racism and misogyny, to name a few. This is not easy to fix, but there are mechanisms that can be put in place to mitigate the effects of such defects.
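
To see how easily a skew in the data becomes a skew in the output, consider a deliberately simplified sketch: a naive popularity-based recommender faithfully reproduces whatever imbalance its listening log carries. Everything here (the log, the genres, the function) is made up for illustration; it is not how BBC Sounds or any real recommender works.

```python
from collections import Counter

# Hypothetical listening log: the historical data over-represents
# one genre ("drama"), so a naive popularity-based recommender
# inherits that skew wholesale.
listening_log = [
    "drama", "drama", "drama", "drama", "drama", "drama",
    "news", "news",
    "world-music",  # under-represented in the data, not in reality
]

def recommend(log, k=2):
    """Recommend the k most-played genres: whatever bias the data
    carries is reproduced straight into the output."""
    return [genre for genre, _ in Counter(log).most_common(k)]

print(recommend(listening_log))  # ['drama', 'news'] -- "world-music" never surfaces
```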

As demands for more transparency roar from nearly every corner of society, technology offers new possibilities for decision-makers and public stakeholders to demonstrate their commitment to the people they serve. While open source code is not a magical formula for transparency, it is one of the stepping stones towards a better understanding of how technology impacts our lives, as it allows others to scrutinize the code that hides behind our screens and devices. The UK's Public Whip project, for instance, is a clear example of how technology (and open code) can be used to increase transparency between institutions and citizens.

In this context, I'd argue that, despite the BBC's intention of developing an algorithm that "informs and educates", this algorithm will only be a de facto public service if it is open source. That is the only way to publicly audit whether the machines are learning biases and reproducing them straight into our ears. We already know that the gender pay gap is real at the BBC and that, in 2018, 96 of its 100 best-paid employees were white. Who can guarantee that such inequalities will not be replicated? Additionally, if taxpayers are funding the BBC, on what grounds can they be denied access to an algorithm that profiles their audio taste? (See also the Public Money, Public Code campaign.)
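
To make the idea of a public audit concrete, here is one hypothetical shape it could take: with the code and data open, anyone could compare how often each category is recommended against how present it is in the catalogue, and flag the gaps. The function and figures below are invented for illustration only, not a description of any BBC system.

```python
from collections import Counter

def exposure_audit(recommendations, catalogue):
    """Compare each category's share of recommendations against its
    share of the catalogue; large gaps flag a possible learned bias.
    All names and data here are hypothetical."""
    rec_counts = Counter(recommendations)
    cat_counts = Counter(catalogue)
    total_rec, total_cat = len(recommendations), len(catalogue)
    report = {}
    for category in cat_counts:
        rec_share = rec_counts.get(category, 0) / total_rec
        cat_share = cat_counts[category] / total_cat
        report[category] = round(rec_share - cat_share, 2)  # positive = over-exposed
    return report

# A category at -0.2 is recommended far less than its catalogue
# presence would suggest -- exactly the kind of gap open code lets
# anyone check for themselves.
print(exposure_audit(
    recommendations=["drama"] * 8 + ["news"] * 2,
    catalogue=["drama"] * 5 + ["news"] * 3 + ["world-music"] * 2,
))  # {'drama': 0.3, 'news': -0.1, 'world-music': -0.2}
```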

The BBC already has nearly 600 repositories of open source software on GitHub, so opening up its code is nothing new for the broadcasting house.

More than commendable, BBC Sounds is evidence that the broadcasting powerhouse is in tune with the people it serves, especially those who seem to be drifting away from traditional radio: catering for their needs as technology evolves and anticipating the shock of the future. However, just as technology evolves, what we perceive as public service also takes on new shapes. Open source is one of them.