A newly published study suggests that the photos an individual posts to social media can show a trained professional evidence of whether that person is depressed.
Harvard University’s Andrew Reece and the University of Vermont’s Christopher Danforth conducted the research, and according to their 34-page published study, “photos taken, edited, and shared to Instagram can show signs of depression through filters, faces, and colors.”
The two are using a computer, or artificial intelligence, to detect a person’s depression, and the research gauges a person’s depression level in order to later help with their treatment.
How it works: the computers are trained to detect signals in posted pictures, such as hue, saturation, and algorithmic face detection, all before someone has been diagnosed.
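To give a rough idea of what “signals like hue and saturation” means in practice, here is a minimal sketch, not the researchers’ actual code, of how average color features could be pulled from a photo’s pixels. The `color_features` function and the sample pixel values are hypothetical, and real pipelines would read pixels from image files with an imaging library.

```python
import colorsys

def color_features(pixels):
    """Average hue, saturation, and brightness (HSV value) for a
    list of (r, g, b) pixels with components in the 0-255 range."""
    hues, sats, vals = [], [], []
    for r, g, b in pixels:
        # colorsys expects components in 0.0-1.0
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        hues.append(h)
        sats.append(s)
        vals.append(v)
    n = len(pixels)
    return {
        "hue": sum(hues) / n,
        "saturation": sum(sats) / n,
        "brightness": sum(vals) / n,
    }

# Hypothetical sample data: a dark bluish patch vs. a bright warm patch.
dark_blue = [(10, 20, 80)] * 4
bright_warm = [(240, 200, 120)] * 4

# The darker, bluer patch scores lower on brightness, the kind of
# difference a classifier could use as one feature among many.
print(color_features(dark_blue)["brightness"])    # ≈ 0.31
print(color_features(bright_warm)["brightness"])  # ≈ 0.94
```

On its own, a single number like this proves nothing about a person’s mood; the study combined many such features across thousands of photos before drawing any conclusions.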
These detections are extremely important: they are designed to help a depressed person get treatment, and the discoveries won’t be visible to untrained eyes scrolling through a friend’s posts at any given moment.
“We foresee a more mature version of this tool being used more in the context of screening and assessment, rather than treatment,” Danforth and Reece explained in an email. “The algorithm we used looks for complex, systemic patterns across many data points to infer clues about individual psychology. If someone posts a dark, bluish photo to Instagram, it shouldn’t necessarily be a red flag for their therapist – that person could just like photos of whales, or blueberries.”
The two published their research on arxiv.org, where they explained: “Using Instagram data from 166 individuals, we applied machine learning tools to successfully identify markers of depression. Statistical features were computationally extracted from 43,950 participant Instagram photos, using color analysis, metadata components, and algorithmic face detection.”
When your friend posts a picture with a darker, grayer, bluer kind of hue, you don’t think they’re depressed and feeling dark; you just see a filtered picture, right?
“Human ratings exhibited extremely low correlation with computational features,” Danforth and Reece said. “In other words, the bluer, darker, grayer photos weren’t the sad photos, according to the people we asked. So while our computational approach is capable of screening for depression, it uses different information signals than those captured by human judgments of what makes a photo sad.”
Their study also shows that a person suffering from any of a range of mental disorders might feel more comfortable speaking to an “AI,” or artificial intelligence, before they ever speak to a trained professional doctor.
These disorders range from post-traumatic stress disorder to anxiety, and that’s why they’re using the AI for the initial screening before turning to live doctors.
“The intuition and experience of human physicians are still very important, especially in mental health care settings, and machine learning methods seem to work best when they complement, rather than replace, human knowledge,” they said. “In that sense, ‘augmenting’, rather than ‘displacing’, might be a better way to describe the new approaches to human-computer interaction that are advancing health care.”