Imagine there is someone on the other side of the room. You can see and hear them normally, but, in addition, there is a camera trained on them, and a microphone to pick up what they're saying. The audiovisual material is then presented on a screen, and you're watching them on the screen, with headphones on. So far so good: the picture and the sound should, in theory, be the same as if you were looking directly at them, watching and listening.
Now imagine that the picture and the sound are being manipulated by software on the computer, but you don't know this is happening. You assume that the picture and the sound accurately reflect reality, but they don't.
This is a metaphor for the 'personal reality filter' through which we process incoming material. We think we're receiving reality directly, whereas the brain is already filtering what we see and hear, processing it according to algorithms generated in the past, and presenting us with an altered version.
Unless we recognise that this is the case, we will continue to perceive the problem as lying outside us, when the problem is actually the programming that takes neutral reality and turns it into pictures of heaven and hell.
The good news is that reality has not gone anywhere. It is available as soon as we take off the headphones and move away from the screen.