Why #Facebook and others profit from Emotion
Facebook is in the news again, along with other networks. Why? The claim is that what people see and read on these sites does too much harm, first to individuals and then to groups of those individuals. Let's face it: in news, negative has always sold better than positive. Before social media, phones, or computers, it was fires, shootings, disasters, divorces, wars, UFO sightings, and other such topics that sold magazines and newspapers. Occasionally a moon landing or a big lottery win made the front page, and when it did, it was quickly replaced. Such is the morbid nature of the average person's curiosity.
The same goes for conspiracies. Again, before computers were common, the JFK assassination was popular because conspiracy theories prevailed around it, and it still is. Roswell, UFOs, and the death of Diana were similar. Everyone wants to prove that their favored conspiracy is the right one. Social media supports this desire perfectly.
We have probably all heard of "rabbit holes". Facebook in particular is exceptional at creating them. A person clicks on one article or conspiracy, and the "algorithm" immediately fires another at them for consumption. It begins to appear that such negative stories dominate the news when they do not.
If you click on a positive story or just a personal interest or pastime, you will again receive dozens of similar stories. This might be a good thing if it makes you feel good. Facebook does not know and does not care.
As we all know, both promote more clicking, and each new page provides more potential advertising revenue for Facebook. Is this breaking the law? Not yet. Zuckerberg claims that it is all a matter of free speech.
Here is the problem: what these algorithms can NOT do, and likely never will, is assess the emotion the reader feels, whether positive or negative. Facebook would claim that the reader consumes more of the articles in the rabbit hole presumably because he or she wants to. That is the core problem. A click is not always an indicator that you like the content. You might actually hate it, but hate is a negative emotion and, as noted above, negativity sells more. People are psychologically drawn to it.
When the reader - especially a young one - is already feeling bad about something like their self-image, the rabbit hole can turn into a grave. People with pre-existing feelings of rage begin to act upon that rage.
Despite false claims to the contrary, that was at the heart of January 6th and other violent uprisings.
Algorithms cannot feel or measure emotion. The question is whether their creators and their top executives can, or even want to. If an algorithm is capable of judging that a second article is similar to the first, surely it can return a different article with an opposing view or topic instead. Maintain a taboo list of subjects: self-image, depression, drug availability, guns, and so many more. Do NOT return similar articles on those topics. It would be a start. Why not try it?
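The "taboo list" idea above can be sketched in a few lines of code. This is a minimal illustration only; the names (`TABOO_TOPICS`, `recommend`, the article fields) are hypothetical and do not reflect any real platform's system, which would be vastly more complex.

```python
# Hypothetical sketch of the "taboo list" idea: never pile on more
# articles about sensitive topics, and prefer a different topic over
# repeating the one just clicked. All names here are illustrative.

TABOO_TOPICS = {"self-image", "depression", "drug availability", "guns"}

def recommend(candidates, last_clicked_topic):
    """Return candidate articles with taboo topics removed, ordering
    articles on a *different* topic ahead of more of the same."""
    safe = [a for a in candidates if a["topic"] not in TABOO_TOPICS]
    different = [a for a in safe if a["topic"] != last_clicked_topic]
    same = [a for a in safe if a["topic"] == last_clicked_topic]
    return different + same

articles = [
    {"title": "Coping stories", "topic": "depression"},
    {"title": "Local bake sale", "topic": "community"},
    {"title": "Diet trends", "topic": "self-image"},
    {"title": "Gardening tips", "topic": "hobbies"},
]

# The depression and self-image articles are filtered out entirely,
# and the community article is ranked ahead of more hobby content.
print(recommend(articles, "hobbies"))
```

The point is not that this exact filter is the answer, but that a system already able to measure "similar to what you just clicked" could just as easily be pointed the other way.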
Mark, you are by all measures extremely rich. Take a page from Bill Gates and start to give back.
#thebrewsterblock