I don't think it is fake, but the computer models that are employed are very tricky, and computer models are the basis of climate science. They involve millions of grid cells representing the atmosphere, and the physical properties of each cell rest on assumptions about the energy flows and physics within it. We would like smaller cells, but we lack the computing power, and we have no idea whether the cells should be uniform in size.
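To make the grid idea concrete, here is a minimal sketch of how a gridded physical model works. This is not a climate model — the "physics" is just heat diffusion on a small periodic grid, and the grid size and diffusion coefficient are made-up parameters — but it shows the structure: each cell is updated from its neighbors according to an assumed physical rule.

```python
import numpy as np

n = 50                         # n x n grid; real models use millions of cells
temp = np.zeros((n, n))
temp[n // 2, n // 2] = 100.0   # a single hot spot

alpha = 0.1                    # diffusion coefficient (an assumed parameter)
for _ in range(100):
    # Each cell relaxes toward the average of its four neighbors
    # (periodic boundaries via np.roll).
    neighbors = (np.roll(temp, 1, 0) + np.roll(temp, -1, 0) +
                 np.roll(temp, 1, 1) + np.roll(temp, -1, 1))
    temp = temp + alpha * (neighbors - 4 * temp)

# Heat spreads outward; total energy is conserved on the periodic grid.
```

Every choice here — cell size, time step, the update rule itself — is an assumption, which is exactly the point about real climate models: the physics inside each cell is parameterized, not observed.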
It is one way to control for all the other variables, but those of us who do econometrics know the danger of overfitting a model: what explains past history may not predict the future. And of course, with computer models one way to overfit is to choose assumptions that make the model fit the data.
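The overfitting point can be shown in a few lines. This toy example (all numbers invented, nothing to do with any actual climate model) fits "past history" with a simple model and a heavily parameterized one, then checks both against data neither fit has seen: the flexible model hugs the past more tightly and predicts the future worse.

```python
import numpy as np

rng = np.random.default_rng(0)

x_train = np.linspace(0, 1, 20)                  # the observed past
y_train = 2.0 * x_train + rng.normal(0, 0.2, 20)
x_test = np.linspace(1, 2, 20)                   # the unseen future
y_test = 2.0 * x_test + rng.normal(0, 0.2, 20)

def errors(degree):
    """In-sample and out-of-sample mean squared error of a polynomial fit."""
    c = np.polyfit(x_train, y_train, degree)
    mse = lambda x, y: float(np.mean((np.polyval(c, x) - y) ** 2))
    return mse(x_train, y_train), mse(x_test, y_test)

in_lo, out_lo = errors(1)    # simple model: 2 parameters
in_hi, out_hi = errors(15)   # flexible model: 16 parameters

# The degree-15 fit explains the past better (in_hi <= in_lo)
# but extrapolates far worse (out_hi > out_lo).
```

A model with enough free assumptions can always be tuned to match the historical record; that is precisely why matching the past is weak evidence about the future.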
As the models improve, we should see the variance in predicted outcomes decline. Instead, the spread in predicted outcomes is wider than ever (5.6°F, roughly three times the observed temperature rise in the 20th century).
Most of the models overshot the expected global warming in the early 2000s. There were good explanations for the overshoot, but those factors were not included in the existing models. This provides fuel for the deniers.
Many of the qualitative forecasts outline the possible risks we face: warmer temperatures, more drought in some areas, less ice, and so on.
It seems to be a giant insurance problem: the high-end predictions may come true, so what can we do to mitigate that risk?