Jah-Wren Ryel writes with news that a few CS folks are working on a way to present opposing viewpoints without angering the reader. From the article: "Computer scientists have discovered a way to number-crunch an individual's own preferences to recommend content from others with opposing views. The goal? To burst the 'filter bubble' that surrounds us with people we like and content that we agree with. A recent example of the filter bubble at work: two people who googled the term 'BP.' One received links to investment news about BP, while the other received links to the Deepwater Horizon oil spill, presumably as a result of some personalization algorithm." From the paper's abstract: "We found that recommending topically relevant content from authors with opposite views in a baseline interface had a negative emotional effect. We saw that our organic visualization design reverts that effect. We also observed significant individual differences linked to evaluation of recommendations. Our results suggest that organic visualization may revert the negative effects of providing potentially sensitive content."
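The core idea, "topically relevant content from authors with opposite views," can be sketched in a few lines. This is a hypothetical illustration, not the researchers' actual algorithm: it assumes each item carries a crude topic profile and a signed stance score, and ranks items that match the user's topics while opposing the user's stance. All names and the scoring formula are made up for the example.

```python
# Hypothetical sketch (not the paper's method): recommend items that are
# topically close to the user's interests but take the opposite stance.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Item:
    title: str
    topic_vec: dict   # term -> weight, a crude topic profile
    stance: float     # -1.0 .. +1.0, the author's position on the issue

def cosine(a: dict, b: dict) -> float:
    # Cosine similarity between two sparse term-weight vectors.
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = sqrt(sum(w * w for w in a.values()))
    nb = sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def opposing_view_score(user_topics: dict, user_stance: float, item: Item) -> float:
    relevance = cosine(user_topics, item.topic_vec)    # same topic?
    opposition = max(0.0, -user_stance * item.stance)  # opposite sign of stance?
    return relevance * opposition  # high only if relevant AND opposing

def recommend(user_topics: dict, user_stance: float, items: list, k: int = 1) -> list:
    ranked = sorted(items,
                    key=lambda it: opposing_view_score(user_topics, user_stance, it),
                    reverse=True)
    return ranked[:k]

user_topics = {"bp": 1.0, "oil": 0.8, "spill": 0.5}
items = [
    Item("BP stock rallies", {"bp": 1.0, "oil": 0.6, "stock": 0.9}, stance=+0.8),
    Item("Deepwater Horizon fallout", {"bp": 0.9, "oil": 0.7, "spill": 0.8}, stance=-0.9),
    Item("Gardening tips", {"soil": 1.0, "plants": 0.9}, stance=-0.5),
]
# A user with a pro-BP stance (+0.7) is shown the relevant opposing piece.
print(recommend(user_topics, +0.7, items)[0].title)
```

The multiplicative score is the key design choice here: off-topic content scores zero no matter how contrarian it is, which mirrors the paper's emphasis on recommendations that are both topically relevant and opposing.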