SAN FRANCISCO: Content identified as misleading or problematic was mistakenly prioritized in users’ Facebook feeds recently, due to a software bug that took six months to fix, according to tech site The Verge.

Facebook disputed the report, which was published Thursday, saying that it “vastly overstated what this bug was because ultimately it had no meaningful, long-term impact on problematic content,” according to Joe Osborne, a spokesman for parent company Meta.

But the bug was serious enough for a group of Facebook employees to draft an internal report referring to a “massive ranking failure” of content, The Verge reported.

In October, the employees noticed that some content which had been marked as questionable by external media – members of Facebook’s third-party fact-checking program – was nevertheless being favored by the algorithm to be widely distributed in users’ News Feeds.

“Unable to find the root cause, the engineers watched the surge subside a few weeks later and then flare up repeatedly until the ranking issue was fixed on March 11,” The Verge reported.

But according to Osborne, the bug affected “only a very small number of views” of content.

That’s because “the overwhelming majority of posts in Feed are not eligible to be down-ranked in the first place,” Osborne explained, adding that other mechanisms designed to limit views of “harmful” content remained in place, “including other demotions, fact-checking labels and violating content removals.”

AFP currently works with Facebook’s fact-checking program in more than 80 countries and 24 languages. Under the program, which started in December 2016, Facebook pays to use fact checks from around 80 organizations, including media outlets and specialized fact checkers, on its platform, WhatsApp and Instagram.

Content rated “false” is downgraded in news feeds so fewer people will see it. If someone tries to share that post, they are presented with an article explaining why it is misleading.

Those who still choose to share the post receive a notification with a link to the article. No posts are taken down.

Fact checkers are free to choose how and what they wish to investigate.
