Sometimes we destroy more data than we’d like, says columnist Tom Grossfeld. Even in mental health care, sensitive data is now being collected. “Teach data science students that some things, like psychological suffering, are too complex and elusive to be quantified.”
Companies and managers find it increasingly important to capture reality in data. Tilburg University responded quickly by opening the Jheronimus Academy of Data Science (JADS) in 2016, where data science students are prepared for our data-driven society. And with success: my friends who studied data science were bombarded with offers from big companies long before they graduated.
I often think about what the German-Korean philosopher Byung-Chul Han – never averse to a strong one-liner – wrote in his book Psychopolitics (2015): that belief in dataism comes with an insistence on a “second Enlightenment.” That sounds like an exaggeration, but it is not. Of course, part of reality can be converted into data. But part of it cannot. In all our excitement, we seem to forget the latter.
What is currently happening in mental health care is an example of that mindset. Since 1 July, practitioners have been obliged to complete questionnaires dealing with highly sensitive social and mental problems – such as self-harm, suicidal thoughts and behavioral problems – and share them with the Dutch Healthcare Authority (NZa). The data is then pseudonymized (and therefore still traceable) and fed into an algorithm.
The results are shared with health insurers, who hope to predict how much care particularly “complex” patients will need. This efficiency gain is supposed to reduce waiting lists.
Measuring psychological suffering
It is audacious: trying to capture psychological suffering – our inner life, the essence of what moves us as human beings – in data. There is something compulsive about it. It is as if our neoliberal, technocratic society cannot accept that what happens in mental health care, especially within the psychotherapeutic consulting room, is not measurable, actionable or controllable.
The complexity of our unconscious mental life, our suffering, our unique human life history, is compressed into a package of data.
Yet research has long shown that this is simply not possible. The same method has already been tried in Australia, New Zealand and Great Britain, always with the same result: the predictive value of the algorithm gets stuck at twenty percent.
At the same time, this blind faith in data has serious consequences: the consulting room is no longer a free and safe space for patients. Psychologists and psychiatrists are struck at the core of their profession: they are forced to break medical confidentiality.
JADS is an institute the university can be proud of. But I think it is also necessary to teach students that not every social problem needs to be tackled with technology, and that some matters – psychological suffering, for example – are too complex and elusive to be quantified.
If we fail to do so, I see data science becoming a risk to society rather than a useful tool.