Tamar Charney is Program Director of Michigan Radio, one of the biggest and best public radio stations, covering a large chunk of the Great Lakes State. In today’s guest post, Tamar explores the other side of algorithms, and just what we may be missing when we let technology make our content choices. By the way, this post contains no gluten and is totally fat free. – FJ
Facebook has decided I have a weight problem – a big weight problem.
It’s been helpfully suggesting diet pills, plus-sized swimsuits with tummy-control panels, and affirming articles about body image as I apparently struggle with the motivation to battle my obesity.
The thing is I’m petite, not plus
On one hand, I find it perversely funny that all-knowing Facebook somehow got the wrong idea about my weight. But on the other hand, it is a disturbing glimpse into the future, as more of what we see and read online is customized for us by computer algorithms.
Facebook and many other media outlets are increasingly relying on systems that predict what interests you, based on your online behavior and other information they’ve gleaned about you. Even if you and I have all the same Facebook friends, what you see on Facebook is different from what I see. Major media outlets like Huffington Post and The New York Times are also headed in this direction. At some point I wouldn’t be surprised if Michigan Radio does the same thing.
Because it works
The more customized we make things for you, the more you use our stuff. Sounds smart, right?
Until the algorithms get it wrong, or don’t quite get it right, which might be even more subversive.
It didn’t take much for me to realize Facebook guessed my weight grossly wrong. But what about the more subtle mistakes? The errors that edit out stories that would actually interest us, where it isn’t obvious to us that anything is amiss. Slowly and subtly, what we see is based on an incorrect assumption, until slowly and subtly our view of the world is shifted.
Sins of algorithmic omission
Michigan Radio’s underlying editorial philosophy for our state news coverage is that we can all better understand the challenges and successes in our own lives and communities when we can see how they fit into the bigger statewide story. What happens in one place tends to happen somewhere else, too.
At some point, perhaps we’ll know you live in Grand Rapids and like stories about science, or that you live in Ann Arbor and like to go fishing. But then will you still see the stories about Detroit and the DIA’s art collection? Or will you even find out about a new law that will affect your kids if the algorithms somehow didn’t know you had children? Will you still get that full picture that connects the dots between our lives and the bigger picture?
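To make that concrete, here is a toy sketch – purely hypothetical, nothing any real newsroom or social network actually runs – of how a feed might score stories against an inferred profile and quietly drop the rest. Every name and number in it is made up for the example:

stories = [
    {"title": "New diet pill hits the market", "topics": {"weight loss"}},
    {"title": "DIA art collection saved", "topics": {"detroit", "arts"}},
    {"title": "New law changes school funding", "topics": {"education", "politics"}},
]

# What the algorithm thinks it knows about me – including the wrong guess
inferred_profile = {"weight loss": 0.9, "arts": 0.2}

def personalized_feed(stories, profile, cutoff=0.5):
    # keep only stories whose best-matching topic scores above the cutoff
    return [s["title"] for s in stories
            if max((profile.get(t, 0.0) for t in s["topics"]), default=0.0) >= cutoff]

print(personalized_feed(stories, inferred_profile))
# Only the diet-pill story survives; the school-funding story I'd actually
# care about never shows up, and nothing tells me it was filtered out.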
Ultimately, will we care as much about each other and our neighbors when everything is tailored to our lives and whims?
Facebook doesn’t fit me right
I’ve been inadvertently learning about the challenges you might deal with if you are overweight. I have a new appreciation for what that is like.
But, what am I not seeing in my Facebook feed as it presumes my interests and battle with obesity? I have no idea. Facebook is making those decisions about what I might want to know about, not me.
I don’t know how to drop those extra 200 Facebook pounds and see the world again from a thinner perspective. But it is going to take more than pills and a tummy control swimsuit to make up for an unbalanced media diet and not enough mental exercise.
Mikel Ellcessor says
With the attention on Upworthy, Eli Pariser’s latest project, the work he did to document the above is forgotten. Tamar is right to flag this, and anyone interested must add The Filter Bubble to their reading list: https://www.thefilterbubble.com/
Fred Jacobs says
Mikel, thanks. This looks like a worthy project, and one that we should be looking at. Tamar raises some issues, and it looks like Pariser is on it. Appreciate you bringing it to our attention.
Tamar Charney says
Thanks Mikel!
Ken Dardis says
What’s described above is a good example of what radio faces: a long lag between when items become newsworthy and when radio starts noticing they exist.
This concept of filtering is a very old subject. I wrote about it in January 2012, after watching a video of Eli Pariser giving a TED Talk. (What Mikel Ellcessor is referencing in his comment is a short article by Eli Pariser on this subject.) I strongly suggest viewing the 9-minute video, though. In simple form, it outlines the problems of filtering: https://www.youtube.com/watch?v=bOE1HFEL8XA&feature=youtu.be
As for what an algorithm is: few in radio have ever seen one in its purest form. This one is from Google; it’s one of hundreds used in delivering your search returns, and it pertains to how PageRank is assigned (affecting where your website’s page sits in a search return). I pulled it off an early Google course on Search Engine Optimization:
PageRank is defined as follows:

We assume page A has pages T1…Tn which point to it (i.e., are citations). The parameter d is a damping factor which can be set between 0 and 1. We usually set d to 0.85. There are more details about d in the next section. Also, C(A) is defined as the number of links going out of page A. The PageRank of a page A is given as follows:

PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

Note that the PageRanks form a probability distribution over web pages, so the sum of all web pages’ PageRanks will be one.

PageRank or PR(A) can be calculated using a simple iterative algorithm, and corresponds to the principal eigenvector of the normalized link matrix of the web. Also, a PageRank for 26 million web pages can be computed in a few hours on a medium size workstation. There are many other details which are beyond the scope of this paper.
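If you want to see what that “simple iterative algorithm” looks like in practice, here is a rough Python sketch – not Google’s actual code, just the quoted formula applied to a made-up three-page web:

def pagerank(links, d=0.85, iterations=50):
    # links maps each page to the list of pages it links to
    pages = list(links)
    pr = {page: 1.0 for page in pages}  # start every page at PR = 1
    out_count = {page: len(links[page]) or 1 for page in pages}  # C(A), guarding against zero
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # sum PR(Ti)/C(Ti) over every page Ti that links to this page
            incoming = sum(pr[src] / out_count[src] for src in pages if page in links[src])
            new_pr[page] = (1 - d) + d * incoming
        pr = new_pr
    return pr

# Hypothetical three-page web: A and B link to each other, and both link to C.
graph = {"A": ["B", "C"], "B": ["A", "C"], "C": []}
print(pagerank(graph))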
Fred Jacobs says
Ken, thanks for taking the time to take us through the innards of the “filter bubble.” The Pariser TED video is amazing, and worth our time. The shift from human gatekeepers (broadcasters) to algorithmic ones (Google, Facebook, etc.) is at the heart of this conversation and its importance to how we receive and process news and information. Much appreciated.
Tamar Charney says
Ken, I appreciate the comments and the under-the-hood look. As these practices are used by more and more news sites, it is important that the public understand what is going on.
Jeff Schmidt says
We’ve all experienced the horrible mismatch of “if you like this, you must like that” recommendation engines, poorly targeted ads, and the ads that follow you around the instant you express any interest in the subject matter at all. It’s exactly as wonky as it appears.
But based on my observation of how digital audio has evolved (going from an absolute horror in the 70s to near-perfect replication of sound today), I can’t help but wonder if today’s algorithms are just like those 1970s drum machines everyone laughed at. The ones that evolved into all the mainstream sounds heard on EVERY pop music station today.
I can easily imagine algorithms “growing up” and in fact becoming our “best friends” – collating the big data of our every decision into a matrix that can mimic “knowing” who we truly are. I know that sounds Kurzweilian – but I personally don’t think it’s that far-fetched. In fact, I can imagine busy people using such tools to organize their lives – so of course they can be more productive at work. 😉
Jeff Schmidt says
sorry for the misspellings – I was replying on my phone. 🙂
Tamar Charney says
I agree there are a lot of ways this can be helpful. What is currently missing is some level of transparency about the assumptions being used to tailor content and recommendations.
Bob Bellin says
I don’t think I’ve ever looked at a Facebook ad so I have no idea what their algorithm has in store for me. And I kinda don’t care. I always default to latest posts in chronological order so I’m not subjected to their editorializing about what I want to see.
But now I’m curious 🙂
Bob Bellin says
Chubbies Shorts, three online gambling/casino sites, and minimalist shoes.
Fred Jacobs says
See – the algorithm works! 🙂
Ken Dardis says
Bob: It doesn’t matter whether you’ve used Facebook. The types of filtering described are being used by hundreds (thousands?) of websites, be they social media, retail outlets, or news sources.
This is an extension of the new thinking: people (especially youth) don’t feel threatened by having their movements tracked. Think of it as the equivalent of when people were building bomb shelters in their backyards. That was something kept hush-hush, because you didn’t want the neighborhood pounding at your door if/when the big blast came. Today, nobody cares. That’s the difference between the thought process of today and when we older folks were growing up.
If you’re online, except under rare circumstances, your movements are being tracked and used to some company’s advantage.
Bob Bellin says
Of course they do. Every time I pull out my smartphone, someone owns my thoughts.