The two key subjects in modern physics today are quantum physics and general relativity. Quantum physics gives us a picture of what's happening on the smallest scales (atoms, electrons, protons etc.) while general relativity describes the most massive things in the universe (galaxies, black holes etc.). At the moment the two theories contradict one another; when we manage to marry the two together we'll be part of the way to having a model which describes everything in physics.

Quantum theory is all about UNCERTAINTY; the closer we look into the microscopic world, the more "misty" our view becomes. We're taught at school that the electrons in an atom orbit the nucleus (that is, go around it in a circular path) like the Moon orbits the Earth. This is not really a very accurate view of what happens. In quantum theory we instead imagine a "cloud" of probability around the nucleus: where the cloud is thickest is where an electron is most likely to be found.

Most people assume that if you know where a particle is and how fast it's going, you can predict exactly where it will be in the future. In the seventeenth century Sir Isaac Newton laid down the laws which let you do exactly that, and those laws remained accepted until the first half of the twentieth century, when quantum mechanics was first worked out. The result of that work was that you cannot predict anything exactly; you can only give probabilities for each possible outcome. What's even stranger is that nature stops you from knowing too much in the first place: if you know exactly where a particle (an electron, for instance) is, you cannot possibly know how fast it's going, and if you know exactly how fast it's going, you cannot possibly know whereabouts it is. It's like a conspiracy! But conspiracies in nature are just laws of physics; this one is called Heisenberg's uncertainty principle, and its precise form is written out below. Naturally, you don't worry about these things in everyday life; they only show up on a microscopic scale.

Here's one demonstration. If you fire an electron at a screen with two narrow, closely spaced slits in it, common sense tells you that if it gets through the screen it must go through one or the other of the slits. You can't tell which, because putting a detector at a slit disturbs the electron; there is no way to detect which slit it goes through without disturbing it so badly that the effect described below is destroyed, and that, of course, makes the experiment pointless! Now, if we put a photographic plate behind the slits and send one electron at a time towards them, we get what's called an INTERFERENCE PATTERN. If you throw two stones into a pond at the same time you get an interesting wave pattern as the two sets of ripples interfere with each other; that is interference, and an interference pattern can only be created by two or more wave sources. So if each electron goes through only one slit at a time, how was the interference pattern (which requires two sources) created? The answer has to be that each electron went through both slits at the same time! It interferes with itself! Half of the "probability cloud" of a particular electron goes through one slit and half through the other; only when the electron hits the plate does the "position probability cloud" shrink to a small point on the film, and at that moment we become certain of its position. A small numerical sketch of this is given below.

Everything is made from particles like electrons, so everything has its own probability cloud! Everything exists in many different positions, states, speeds etc. at once. Only when we make an observation does the thing we're observing take on definite values.
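Since the "conspiracy" above is appealed to without being stated, here is the standard textbook form of Heisenberg's uncertainty principle (the equation is an addition for the curious reader, not part of the original argument):

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]

Here Δx is the uncertainty in the particle's position, Δp the uncertainty in its momentum (mass times velocity), and ℏ is the reduced Planck constant, roughly 1.05 x 10^-34 joule-seconds. Squeeze Δx down and Δp must grow, and vice versa; and because ℏ is so absurdly small, the trade-off is completely invisible for everyday objects like footballs and cars.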
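And here is a minimal numerical sketch of the two-slit experiment, written in Python purely as an illustration (it is not from the article, and the wavelength, slit separation and screen distance are arbitrary made-up numbers). The quantum rule is: add the complex amplitudes for the two paths, then square the magnitude; the "common sense" one-slit-or-the-other rule squares each path separately and adds the results:

```python
# Two-slit sketch: each slit contributes a complex amplitude at each point
# on the screen; the quantum probability is |a1 + a2|^2, while the "one
# slit or the other" picture would give |a1|^2 + |a2|^2.
import cmath

WAVELENGTH = 1.0         # arbitrary units; only ratios to the geometry matter
SLIT_SEPARATION = 5.0    # made-up distance between the two slits
SCREEN_DISTANCE = 100.0  # made-up distance from the slits to the plate

def amplitude(slit_y, screen_y):
    """Unit-magnitude complex amplitude for the path slit -> screen point."""
    path = ((screen_y - slit_y) ** 2 + SCREEN_DISTANCE ** 2) ** 0.5
    phase = 2 * cmath.pi * path / WAVELENGTH
    return cmath.exp(1j * phase)

for screen_y in range(-20, 21, 2):
    a1 = amplitude(+SLIT_SEPARATION / 2, screen_y)
    a2 = amplitude(-SLIT_SEPARATION / 2, screen_y)
    quantum = abs(a1 + a2) ** 2              # amplitudes add, THEN square
    classical = abs(a1) ** 2 + abs(a2) ** 2  # square each path, then add
    print(f"y={screen_y:+3d}  quantum={quantum:4.2f}  classical={classical:4.2f}")
```

Run it and the quantum column swings between about 0 and 4 (bright and dark fringes across the screen) while the classical column stays flat at 2. The fringes appear only because both paths contribute amplitude at once, which is exactly the sense in which each electron "goes through both slits".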
This all sounds very strange, but the thing is, it works! Most modern technology, from the laser in a CD player to the transistors in a PC, depends on this theory. The world really is that strange.