Will future psychologists be…robots?


That, at least, is what certain experts predict. After all, it is the natural next step after social companion robots, a technology that is gaining popularity as more people attempt to fill the void in their lives with a non-human “human.”

The concept of a robot psychologist might seem off-putting to some, but the technology is already here, and it is really only a matter of time before someone decides to use it to create a walking, talking, interactive humanoid that will heal us of our mental and emotional hurts.

Talk about a shrink without any kinks.

Or perhaps the kink is even bigger than we can possibly fathom. An article in Psychology Today discussed the potential benefits and risks that may emerge from an automated mental health industry.

The concept, as unusual and disconcerting as it initially appears, does make sense. Some psychological services, such as the administration of tests, could be powered by AI technology. This is not difficult to imagine: mental health assessments could easily be administered on a smart device. That, in turn, would lower barriers to adoption such as cost, limited access and availability, and the perceived stigma of seeking help. Patients would only have to open their phones to find out whether they are unwell.

Recent estimates from Statista put the average price of an app in the Apple App Store at about 89 cents. That is significantly less than an average session with a psychologist, which can cost hundreds of dollars. Patients also have to wait for an appointment with their doctor, as opposed to merely opening an app.

Using an app would also lessen the fear of appearing “abnormal.” Mental health is still a taboo subject for a lot of people. Accessing an app would remove much of that fear, and users could talk freely about their problems with an unknown entity.

Have you seen the gaping hole yet?

An AI psychologist, even one that wasn’t confined to an app but presented itself in humanoid form, may compromise confidentiality and privacy. Regular readers of Natural News know how much we’ve reported on the dangers of hacking and how easily technology can be exploited for nefarious purposes. Technology is notoriously imperfect when it comes to maintaining confidentiality and privacy.

Users may feel comfortable getting mental health support from a non-judgmental device (or “person”), but they may also be unwittingly giving away personal information.

Even if these risks were minimized, there would still be issues of bias and quality of service. Artificial intelligence is (still) as inherently biased as the humans who created it. Sources of that bias include data set size, data structure, and the degree of objectivity in the data itself.

There are also concerns that an AI psychologist would actually increase feelings of loneliness and isolation in an already disconnected (yet “connected”) world. Psychologists warn that high-risk patients may feel more misunderstood if they share their innermost feelings with a machine rather than a fellow human being.

Then again, who is to say that machines won’t soon be developing human emotions in the future? (Related: Sad robot: Expert says that robots could become so life-like that they will develop mental illnesses too.)

What does the horizon of humanity look like, in this landscape of technological advances? Take a gander at our potential future at Robotics.news.

Sources include:

PsychologyToday.com

MedicalFuturist.com

MarketWatch.com


