Not everyone on the internet is a jerk, though many believe otherwise
PNAS Nexus

Americans tend to believe that online spaces are far more hostile than they actually are. Many assume that nearly half of people on major platforms regularly post cruel, aggressive, or abusive comments. In reality, truly severe online toxicity is much rarer.
One striking example is Reddit,
where Americans estimate that 43% of users post highly toxic comments, even
though research shows the real figure is closer to just 3%. This gap between
perception and reality can quietly fuel a broader sense of pessimism about
other people and about society as a whole.
To better understand this disconnect, researchers Angela Y.
Lee, Eric Neumann, and their colleagues surveyed 1,090 American adults using
the online research platform CloudResearch Connect. The goal was to compare
what people believe about harmful online behavior with actual data collected in
previous large-scale studies of social media platforms.
The results showed that people dramatically overestimate how common toxic behavior is. On Reddit, participants believed toxic commenters were roughly 13 times more common than they actually are. A similar pattern appeared on Facebook: participants guessed that 47% of users share false or misleading news stories, even though existing research puts the real figure at about 8.5%. In other words, people assume that misinformation and harmful content make up a far larger share of social media feeds than they actually do.
Recognizing Toxic Content Does Not Fix the Misbelief
Interestingly, this inflated perception was not simply due to confusion about what counts as toxic content. In a signal detection task (a type of psychological test that measures how accurately people can pick out specific signals amid noise), many participants correctly recognized examples of toxic online posts. Even so, they still believed that a large share of users regularly produce such content.
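To make the idea concrete, accuracy in a signal detection task is often summarized with a sensitivity index, d' (d-prime): the difference between the standardized hit rate and the standardized false-alarm rate. The sketch below is purely illustrative and is not the authors' analysis; the function, the correction applied to the rates, and the example counts are assumptions made for demonstration.

    from statistics import NormalDist

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # Sensitivity index d' from a signal detection task.
        # "Signal" trials here would be genuinely toxic posts and "noise"
        # trials non-toxic posts; d' measures how well a rater separates
        # the two. A small correction keeps the rates away from 0 and 1
        # so the inverse-normal transform stays defined.
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        z = NormalDist().inv_cdf  # inverse of the standard normal CDF
        return z(hit_rate) - z(fa_rate)

    # Hypothetical rater: correctly flags 18 of 20 toxic posts and wrongly
    # flags 3 of 20 non-toxic posts, giving a d' of about 2.1.
    print(round(d_prime(hits=18, misses=2, false_alarms=3, correct_rejections=17), 2))

A high d' like the one in this made-up example captures the paper's point: a person can be quite good at recognizing toxic posts and still wildly overestimate how many people write them.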
This suggests that the problem is not an inability to spot harmful behavior, but a mistaken belief about how widespread it is. People may remember extreme posts more vividly or encounter them more often because social media algorithms amplify attention-grabbing content, leading them to assume that such behavior is the norm.
How Correcting the Misperception Changes Attitudes
The researchers also tested whether changing these beliefs
could influence how people feel about society. In an experiment, participants
were shown accurate information about how rare severe online toxicity actually
is. Afterward, many reported feeling more optimistic and less concerned that
society is in moral decline. They were also less likely to believe that most
Americans are comfortable with harmful or aggressive online behavior.
According to the authors, people often confuse a very small but extremely vocal group of users with the majority. A limited number of highly active accounts produce most toxic and harmful content, creating the illusion that it reflects widespread attitudes. Recognizing this distinction may help reduce the negative emotional effects associated with social media and could improve social cohesion by reminding people that most users are not behaving badly online.