
Understanding Facebook's News Feed
by Reid Goldsborough

Link-Up Digital

Have you noticed that you don’t see all of your Facebook friends’ posts in your news feed? Wonder why that happens?

In a nutshell, it’s because you don’t interact enough with that person’s posts, or that person doesn’t interact enough with yours, by liking or commenting on them. The fuller explanation has to do with algorithms.

Algorithms are central to our digital lives, but most of us have no clue what they are. An algorithm, in short, is a set of instructions to solve a problem. They’re at the core of computer programs.

Corporations and government agencies use algorithms to target online ads, trade stocks, price insurance, and identify potential terrorists. Algorithms can determine whether we get into college, get hired, get promoted, get a mortgage, or get a car loan.

Maybe the best-known algorithm today is Google’s PageRank, which ranks the pages returned by a Web search according to the number of other pages that link to them and the number of pages that link to those linking pages. Google is forever tweaking it.
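
To make the idea concrete, here’s a heavily simplified sketch, in Python, of the kind of calculation PageRank performs. The tiny three-page “Web,” the damping factor, and the function name are illustrative assumptions for this column, not Google’s actual implementation, which remains a closely guarded secret.

    # Heavily simplified PageRank sketch. A page's score depends on the
    # scores of the pages linking to it, divided among each linker's
    # outbound links. The damping factor and the toy graph are
    # assumptions for illustration only, not Google's real code.

    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        rank = {page: 1.0 / len(pages) for page in pages}
        for _ in range(iterations):
            new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
            for page, outgoing in links.items():
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
            rank = new_rank
        return rank

    # A tiny made-up Web: pages A and C link to B, and B links back to A.
    toy_web = {"A": ["B"], "B": ["A"], "C": ["B"]}
    print(pagerank(toy_web))  # B, with the most incoming links, scores highest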

The ever-changing algorithm behind Facebook’s news feed, called EdgeRank, is also crucial in today’s social media-infused world. It determines what you see when you check Facebook.

The more Facebook friends you have, the more important EdgeRank is. One frequently cited statistic is that on average only 35% of any given Facebook user’s friends will see any given post.

When you post to Facebook by answering the question, “What’s on your mind?” or by otherwise updating your status, you’re posting to your timeline. Depending on your privacy settings, your Facebook friends can post to your timeline as well.

What you and your friends post to your respective timelines may or may not show up in each other’s news feeds. Facebook’s EdgeRank algorithm has a lot to say about this.

Facebook is largely mum about EdgeRank, keeping many details private for competitive reasons. But based on what is publicly known, as well as how EdgeRank behaves, Facebook builds your news feed mostly from three factors: Affinity Score, Edge Weight, and Time Decay.

Affinity Score measures how much you interact with a Facebook friend and how much that person interacts with you, based on the reaction options Facebook provides: “like,” “love,” “haha,” “wow,” “sad,” and “angry.”

If you frequently emote over friends’ posts and they do the same with yours, you’ll each see more of the other’s posts. But if there’s little or no interaction, a friend’s posts may disappear from your news feed, and vice versa.

Edge Weight measures how frequently people comment on or share one another’s posts. Sharing a post, whether it’s an originally written note or a link to a news article, video, or other content from someone else, will increase its Edge Weight. On the other hand, copying and pasting the link won’t.

Like Affinity Score, Edge Weight increases the chances of friends’ posts showing up in your news feed.

Time Decay, as the name implies, measures how old a post is. Yesterday’s news is less likely to show up in your feed than today’s, and the older a post gets, the less likely it is to appear.
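
Putting the three factors together, outside analyses have commonly described EdgeRank as multiplying affinity, weight, and time decay for each “edge” (each interaction a post receives) and summing the results, with higher-scoring posts more likely to appear in your news feed. Here’s a rough Python sketch of that publicly described idea; the decay rate, the sample numbers, and the function names are illustrative assumptions, since Facebook hasn’t published its actual formula.

    # Rough sketch of the publicly described EdgeRank idea: each "edge"
    # (an interaction a post has received) contributes
    # affinity * weight * time decay, and the contributions are summed.
    # The decay rate and the sample numbers are made-up assumptions;
    # Facebook has never published its real formula or weights.

    def time_decay(hours_old, rate=0.1):
        # Older interactions count for less; yesterday's news fades.
        return 1.0 / (1.0 + rate * hours_old)

    def edgerank_score(edges):
        """edges: list of (affinity, weight, hours_old) tuples for a post."""
        return sum(affinity * weight * time_decay(hours_old)
                   for affinity, weight, hours_old in edges)

    # A fresh comment from a close friend outranks an older "like"
    # from someone you rarely interact with.
    close_friend_post = edgerank_score([(0.9, 2.0, 2)])     # score 1.5
    distant_friend_post = edgerank_score([(0.2, 1.0, 30)])  # score 0.05
    print(close_friend_post > distant_friend_post)  # True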

Facebook also incorporates other more subtle factors into EdgeRank. If you like, comment on, or share posts that are pro-Republican, you’ll see more posts from Republicans, and if you do so with posts that are pro-Democratic, you’ll see more posts from Democrats.

This is both a boon and a bane. Facebook is trying to provide you with what you like. But this creates an echo chamber effect, with people being exposed to content that reinforces their existing beliefs.

Facebook has made headlines lately, both before and after the last presidential election, over accusations that it has made fake news easier to spread. It has been trying to counter those accusations and to fight the phenomenon itself.

Even though many people get much of their news from Facebook, the company contends it’s not a news site. Initially it tried tweaking its EdgeRank news feed algorithm to cut down on fake news, without much success, according to reports. Then it debuted a “false news” reporting feature.

Most recently, Facebook said it planned to launch an educational tool about fake news. Clicking it directs you to the Facebook Help Center, which displays information “on how to spot false news, such as checking the URL of the site, investigating the source and looking for other reports on the topic,” according to Facebook.


Reid Goldsborough is a syndicated columnist and author of the book Straight Talk About the Information Superhighway. He can be reached at reidgoldsborough@gmail.com or reidgold.com.

