Last month, the UK government and devolved administrations demonstrated spectacularly why these issues matter when they applied an algorithm to already standardised school grades and ended up entrenching inequalities even further. This happened partly because, human nature being what it is, governments in the UK wanted to take a blunt approach to fix a problem that wasn’t there (I’ll talk more about this in a future post), and partly because the tool they used, an algorithm, inevitably takes on the biases of its creators.
If we agree that everyone in a society needs to be included, to be treated equally or equitably (depending on their situation), and to have access to the best education, healthcare, job opportunities, housing, transport and other infrastructure, then we need to better articulate the kind of world we want to live in.
Of course I’m going to tell you to envisage the future you want to make happen – just like a strategic framework or theory of change – that’s my job! But in this case, the effect that a bias-entrenching algorithm had on students’ grades, and on their life opportunities, underscores how important it is to have a well-articulated vision of the social impact you are trying to achieve, and to ensure that your systems, processes and tools are all aligned with that vision. I’ll go into this in more detail in a later post too, but for now: what kind of world do you want to live in, and how would you articulate that vision?