At a very basic level, let's say you have an update function that gets called every frame; call it update(). If you know the framerate, you know exactly how many times update() will be called per second. Take the two following snippets, which I wrote off the top of my head (in no language in particular):
Code:
function update()
{
    position.x = position.x + 0.1; // move 0.1 units every frame
}
Code:
function update()
{
    speed = 6;                              // units per second
    position.x = position.x + speed*deltaT; // deltaT = seconds the previous frame took
}
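(For reference, deltaT in snippet 2 is just how long the previous frame took. Here's a minimal sketch of the kind of main loop that measures it and passes it in; the Player struct and the function names are made up for illustration, not from any particular engine.)
Code:
#include <chrono>

// Hypothetical game object, just for illustration.
struct Player { float x = 0.0f; };

// Frame-rate independent update: speed is in units per second,
// deltaT is how long the previous frame took, in seconds.
void update(Player& player, float deltaT)
{
    const float speed = 6.0f; // units per second
    player.x = player.x + speed * deltaT;
}

int main()
{
    Player player;
    auto previous = std::chrono::steady_clock::now();

    while (true) // a real loop would have an exit condition
    {
        auto now = std::chrono::steady_clock::now();
        float deltaT = std::chrono::duration<float>(now - previous).count();
        previous = now;

        update(player, deltaT);
        // input, physics, rendering, etc. would go here
    }
}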
At 60fps, the two snippets at the top result in exactly the same effect: deltaT is 1/60 of a second, so 6 * (1/60) = 0.1 units per frame, and either way the object moves 6 units to the right every second.
However, snippet 1 applies a fixed value per frame, while snippet 2 applies a rate of change per second. If you know your framerate (typical of console games, where there are no graphics options and a straightforward 30 or 60fps target), then snippet 1 is fine. So you get a lot of console devs who get into the habit of programming like this. Problem is, if you run the above at 120fps, snippet 1 actually moves you at double the speed of snippet 2.
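To make the doubling concrete, here's a quick throwaway sketch (again a rough C++ sketch, names made up) that runs both update rules for one simulated second, first at 60fps and then at 120fps:
Code:
#include <cstdio>

int main()
{
    for (int fps : {60, 120})
    {
        float deltaT = 1.0f / fps;  // seconds per frame
        float perFrameX = 0.0f;     // snippet 1: fixed 0.1 units every frame
        float perSecondX = 0.0f;    // snippet 2: 6 units per second, scaled by deltaT

        for (int frame = 0; frame < fps; ++frame) // one simulated second
        {
            perFrameX  += 0.1f;
            perSecondX += 6.0f * deltaT;
        }

        std::printf("%d fps: per-frame moved %.1f units, per-second moved %.1f units\n",
                    fps, perFrameX, perSecondX);
    }
}
At 60fps both end up at 6 units after a second; at 120fps the per-frame version ends up at 12 while the per-second version still lands on 6.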
PC lets you do whatever you want, including run at arbitrary frame rates. So when devs port over, if they don't change everything to account for that, you get these kinds of problems.