Make the world less shit. NOW.
Technology is often seen as somehow objective, especially by people who don't know much about how it's made.
But like everything that involves human decision-making, it's riddled with biases.
In search of profit (or sometimes innocent simplification) we may choose, for example, to reduce human friendship, with all its nuance, to a boolean: a friendship is either approved or it isn't.
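To make the point concrete, here is a minimal sketch (the class names and fields are hypothetical, not from any real social network) contrasting the boolean model of friendship with a slightly richer one:

```ruby
# Illustrative only: a social network that collapses friendship
# to a single boolean, discarding every nuance of the relationship.
FlatFriendship = Struct.new(:approved)

# A (still simplified) richer model that retains some of that nuance:
# degree of closeness, the context of the relationship, and whether
# the connection is mutual.
RichFriendship = Struct.new(:closeness, :context, :reciprocal)

flat = FlatFriendship.new(true)
rich = RichFriendship.new(0.7, :colleagues, false)

puts flat.approved   # the only question the flat model can answer
puts rich.closeness  # the richer model keeps degrees of connection
```

Neither model is "correct"; the point is that the choice between them is a design decision, not an objective fact about friendship.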
I want to tell stories which will make it clear how we shape technologies with our belief and value systems.
Among them is the story of a London university that built a neural network to handle the first round of admissions, presuming it to be objective and based on logic. It was later discovered to have inherited all the biases of the people who had done that job in previous academic years, because their decisions were what the system was based on.
Or another story, of the biggest encyclopaedia ever created, one that removes barriers to entry and truly democratises knowledge. Only it doesn't quite achieve that despite the ambition, because its contributors are largely a self-selected group that leaves huge swathes of society uninvolved.
How we look at these biases will be crucial in building a better world, one where we acknowledge and address the issues we build into technology in the first place. The decisions we make in designing the tools of tomorrow are necessarily political and I want to leave you examining your own.
Chris: I think you're right, I will try to show how others approached this successfully, or what kinds of approaches might work in different situations.
If this is a story-based talk, it'd be good to include some examples of places where technology has been used in a way that you consider free of bias. It'd be nice to leave with some concrete things I could do better in my own applications, rather than just a series of cautionary tales/rants.
@tomstuart: There is no specific Ruby angle. And yes, the stories are cautionary, but I am hoping to use them to illustrate how to look at your own decision making to avoid the same mistakes.
Very interesting — just think of all those security holes left in systems because they don't have real-life parallels that people can imagine.
Does object-oriented development stress human understanding of isolation over efficiency?
This sounds interesting and I’d like to hear the stories. It sounds like they’re mostly cautionary in nature — is there anything we can do to proactively avoid these pitfalls, other than simply being aware that they exist?
Is there a specific Ruby angle? (It’s okay if the answer’s “no”, because this sounds like something that should be interesting to technical people in general, but if there is a Ruby angle it’d be worth bringing it out in the proposal.)