
The ‘right to explanation’ and accountability maximisation

>Mern:

GDPR Article 22, on automated decision-making, read together with Articles 13, 14 and 15, presents a combined ‘Right to Explanation’. This could be a good tool for counteracting some of these issues: it could plausibly be interpreted as engaging corporations and controllers with a responsibility to explain and show users the decision-making process behind content personalisation, thereby enabling individuals to critically evaluate the information they encounter and understand the forces shaping their perceptions – or at least to take the information they receive with a grain of salt. But the law as it stands imposes no concrete obligation to do this with intrinsic regard for autonomy; this is just one possible interpretation of a group of articles, and the law arguably leans towards implying a different sort of explanation.

Present debate about what the Right to Explanation means exactly revolves around what can be interpreted from the text of the GDPR. Selbst and Powles, in their article “Meaningful Information and the Right to Explanation” (International Data Privacy Law, Volume 7, Issue 4), stitch together a definition of the right as follows:

“When an individual is subject to ‘a decision based solely on automated processing’ that ‘produces legal effects … or similarly significantly affects him or her’, the GDPR creates rights to ‘meaningful information about the logic involved’.” 

They argue that ‘meaningful’ should be read from the perspective of the data subject – a human with, presumably, no particular technical expertise – and that “meaningful information” under GDPR Articles 13–15 must prioritise the data subject’s understanding and ability to act. Meaningful information should therefore be readily comprehensible to a non-technical individual. Further, a minimum functional threshold should be met, ensuring the information empowers the data subject to exercise their rights: explanations of automated decisions should, for instance, enable individuals to assess potential discrimination. On this view, the right to explanation is to be interpreted functionally and flexibly, with the core objective of facilitating the data subject’s ability to exercise their legal rights. But this reading is not cast in stone, and the text of the GDPR as written leaves room for interpretations that undermine the meaningfulness of the information involved.

To push the thought further: a straightforward interpretation of the Right to Explanation, outside of Selbst and Powles’ definition, can still open the door to overly technical explanations that prioritise explaining the machine learning model itself (its overall logic or its specifics) to an individual subject of that model, instead of focusing on structuring accountability. The language of the text – particularly “the logic involved” – is ambiguous, and can conceivably be read as a requirement to explain the logic of the whole machine learning model rather than the reasoning behind its individual predictions. That treats the explanation as an instrument by which the individual is expected to achieve autonomy for themselves, rather than imposing an intrinsic requirement for autonomy. Note that, under the intrinsic paradigm, the discussion risks being undermined by the need to balance an otherwise amorphous concept like autonomy against concrete, well-defended and well-established interests in, for example, trade secrecy.

Insofar as governing algorithmic decision-making goes, the choices that developers make during the machine learning process shape the form the model ultimately takes and the effect it will have on individuals over time. The Right to Explanation should therefore be considered in the wider context of the GDPR’s trend towards strengthening data protection (and by extension, individual autonomy) as a fundamental right, and should seek to maximise accountability on the part of the controller. The requirement that an institution explain how its model works is not, on its own, an adequate constraint on the power of that institution, per Gillis and Simons in their article “Explanation < Justification: GDPR and the Perils of Privacy” (Journal of Law & Innovation):

“The [interpretation] mistakenly characterises a challenge of institutional justification as a challenge of algorithmic explanation. Focusing on the requirement of those with power to inform subjects as to what the rules are, intentionally or not, distracts from the higher-order question of what the rules should be.”

Ultimately, for the GDPR to be an effective mechanism for enforcing human rights in this way, the ‘right to explanation’ should focus on putting the controller in a position of accountability – ideally accountability to data subjects, but also accountability more generally. This is why, beyond ensuring that justifications are fit for this purpose, justifications should also be directed towards institutions with the systemic power to affect controllers at the macro level, not solely towards individual subjects. An interpretation of the articles following Selbst and Powles’ advice, or Gillis and Simons’ 2b category, might therefore be a more viable approach: explanations taken beyond simply why a content feed shows what it does or how it works, and instead drawing attention to the underlying power corporations have to control who sees what, and why. Ideally, these explanations would cover not just choices about how machine learning models are integrated into a service from a design, user-interface and decision-making perspective, but also the principles governing those choices and the aims – commercial or otherwise – of the controller in deploying them. A fundamental flaw in this reasoning, however, is that Article 22 covers only decisions ‘based solely’ on automated processing, meaning that even perfunctory human involvement can take a decision outside the provision entirely. The legislation should instead specify that human involvement must be meaningful before it disqualifies a decision.

