We study learning via shared news. Each period, agents receive first-hand information of the same quantity and quality and can share it with friends. Some friends (possibly few) share selectively, generating heterogeneous news diets across agents, akin to echo chambers. Agents are aware of selective sharing and update beliefs by Bayes’ rule. Contrary to standard learning results, we show that beliefs can diverge in this environment, leading to polarization. This requires that (i) agents hold misperceptions (even minor ones) about friends’ sharing and (ii) information quality is sufficiently low.
Polarization can worsen when agents’ social connections expand. When the quantity of
first-hand information becomes large, agents can hold opposite extreme beliefs, resulting
in severe polarization. We find that news aggregators can curb polarization caused by
news sharing. Our results hold without media bias or fake news, so eliminating them is not sufficient to reduce polarization. When fake news is added, it can lead to polarization, but only through misperceived selective sharing. We apply our theory to shed light on the evolution of public opinion about climate change in the US.