Recommendation algorithms profoundly shape users' attention and information consumption on social media platforms. This study introduces a computational intervention that mitigates two key algorithmic biases by influencing the recommendation process. We tackle interest bias, the algorithm's tendency to recommend non-news, non-political content to the majority of users, and ideological bias, its tendency to direct politically interested and more strongly partisan users to one-sided, congenial content. Employing a sock-puppet experiment alongside a month-long randomized experiment involving 2,142 frequent YouTube users, we investigate whether nudging the algorithm, by playing videos from quality, verified, and ideologically balanced news channels in the background, increases recommendations of, and user consumption of, quality news content. We additionally test whether providing balanced news input to the algorithm promotes diverse and cross-cutting news recommendations and consumption. We find that nudging the algorithm significantly and durably increases both recommendations and consumption of news, and that it minimizes ideological biases in recommendations and consumption, particularly among conservative users. In fact, recommendations exert stronger effects on users’ exposure than users’ exposure exerts on subsequent recommendations. In contrast, nudging the users themselves has no observable effect on news consumption. Increased news consumption, however, does not affect democratic attitudes (e.g., knowledge, participation, polarization, misperceptions), adding to growing evidence of the limited effects of on-platform exposure. The intervention does not adversely affect user engagement on YouTube, demonstrating its potential for real-world implementation. These findings underscore the profound influence that platform recommender algorithms wield over users' attention and information exposure.