
Apple’s walled garden for AI tends to keep the Cupertino company out of critical media’s crosshairs. So it’s a bit of a shock when a story breaks demonstrating that the house that Jobs and Woz built is just as dangerous and unhinged as the rest of big tech when it comes to machine learning.

Exhibit A: The Wall Street Journal today reported that Apple’s developing an AI-driven system that uses biometric data obtained from iPhone users to diagnose mental illness and autism.

The big idea here is that, much like the Apple Watch’s ability to detect cardiac distress, your iPhone would be able to process data in the background during your normal device usage and determine whether you’re experiencing mental distress. Or, apparently, autism. The two are not necessarily related.

Up front: Let’s not spend 1,000 words explaining the problem here. You can’t detect autism with an iPhone. You just can’t.

Autistic individuals exist along a mental and emotional spectrum just like allistic individuals do. There's no single test for autism, and doctors don't diagnose it against a static checklist. It takes a lot of discussion and observation for a physician to arrive at a diagnosis.

A Google Scholar search turns up a staggering number of research articles, papers, and books on the difficulty of diagnosing mental illness and autism.

Furthermore, autism isn’t a disease that can be cured or a form of mental illness. Many autistic people consider themselves perfectly healthy and find the idea that they need to be “treated” offensive.

Background: But the real problem for Apple is much simpler: AI cannot detect mental illness or autism. Full stop.

It can be trained to look for signs that researchers believe could be associated with those conditions, but even then we're talking about facial expressions, how quickly you type, and how your voice sounds when you speak. These are data streams that vary wildly across demographics, and validating them as diagnostic inputs for deep learning algorithms is far beyond what short-term studies can establish.
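To make the problem concrete, here's a minimal synthetic sketch in Python. Everything in it is invented for illustration: the feature names, the effect sizes, and the prevalence numbers. It shows how a classifier trained on behavioral proxies like typing speed and vocal pitch can end up scoring demographic group membership rather than any underlying condition:

```python
# Hypothetical sketch: all data is synthetic, all effect sizes invented.
# The proxies (typing speed, vocal pitch) differ by demographic group,
# and the "condition" label carries no signal beyond group prevalence.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Demographic group (0 or 1) shifts both proxy features.
group = rng.integers(0, 2, size=n)
typing_speed = rng.normal(60, 10, size=n) - 8 * group   # wpm, shifted by group
vocal_pitch = rng.normal(150, 20, size=n) + 25 * group  # Hz, shifted by group

# Label prevalence differs by group, but given the group the features
# tell you nothing about the label.
p_condition = np.where(group == 1, 0.25, 0.10)
label = rng.random(n) < p_condition

X = np.column_stack([typing_speed, vocal_pitch])
X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, label, group, test_size=0.3, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]

# The model "detects" the condition only by inferring group membership
# from the proxies, and assigns one group systematically higher risk.
for g in (0, 1):
    mask = g_te == g
    print(f"group {g}: mean predicted risk = {proba[mask].mean():.2f}, "
          f"actual prevalence = {y_te[mask].mean():.2f}")
```

The features here contain no diagnostic information at all, yet the model confidently hands one demographic group a higher risk score, which is exactly the failure mode a short-term study is unlikely to catch.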

Lukewarm take: It makes no sense to develop this product. Sure, a feature being in research and development is no guarantee it will ever ship. But this one is flawed from inception. Simply put, there's no scientific basis for this product. It's snake oil.

When an Apple product detects a potentially harmful heart condition, it's detecting an event. That event either did or did not happen. But when AI tries to detect something subjective, such as signs of autism, it's not looking for a single diagnostic event. It's trying to find arbitrary evidence of autism hidden in data. That's just not how mental illness and autism are diagnosed.
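For illustration only, here's a hedged Python contrast (the threshold and the weights are made up): detecting an event means checking whether an objective measurement crossed a defined line, while a behavioral "risk score" is just arbitrary arithmetic over proxies with no diagnostic event behind it:

```python
# Event detection: an objective measurement either crosses a clinically
# defined threshold or it doesn't. The label is grounded in the event.
def detect_tachycardia(heart_rate_bpm: float, threshold: float = 120.0) -> bool:
    return heart_rate_bpm >= threshold

# "Detecting autism" from behavior has no such event: there is no single
# measurement that settles the question, only a score over proxies whose
# link to a diagnosis is assumed, not observed. Weights are arbitrary.
def behavioral_risk_score(typing_wpm: float, pitch_hz: float) -> float:
    return 0.01 * (60 - typing_wpm) + 0.002 * (pitch_hz - 150)

print(detect_tachycardia(135))         # True: the event happened
print(behavioral_risk_score(55, 170))  # 0.09: a number, not a diagnosis
```

The first function answers a question about the world; the second just reports where a person sits on an axis someone chose to draw.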

The ramifications of misdiagnosing mental illness or autism can be devastating. The stigma against those with mental illness is often exacerbated by the compound stigma of seeking therapy.

And the stigma against those with autism has led to some truly vile things. Such bigotry stands at the heart of the pro-disease movement (commonly called "anti-vaxxers"). Giving the general public a device they believe can detect autism or mental illness is certain to result in harm.

Here’s hoping this one never makes it out of the lab. 
