Understanding Makes the Mind Lazy
Earlier this week, Instagram felt obliged to try to quash a rumor circulating on the platform that the company’s algorithms ensured only 7% of any given account’s followers would see that account’s posts. It wasn’t the first time; rumors like this have recurred ever since platforms switched to algorithmic sorting. They speak to a conflict of interest between platforms and the people who rely on them for distribution: platforms don’t trust users not to game their algorithms, and users, in turn, don’t trust the algorithms not to cheat them out of attention. Algorithms seem almost designed to foster a low-trust environment in which, paradoxically, any conspiracy becomes easier to credit: there is no basis for believing anything that isn’t convenient or self-serving.