The Runge-Kutta algorithm is the magic formula behind most of the simulations shown on this web site.
The Runge-Kutta algorithm lets us solve a differential equation numerically (i.e., approximately). Consider the single-variable problem
x' = f(t,x)
with initial condition x(t0) = x0. Suppose xn is our approximation at time tn. The Runge-Kutta formula advances the solution to time tn+1 = tn + h, where h is a small time step:
xn+1 = xn + (h/6)(k1 + 2k2 + 2k3 + k4)
where
k1 = f(tn, xn)
k2 = f(tn + h/2, xn + (h/2)k1)
k3 = f(tn + h/2, xn + (h/2)k2)
k4 = f(tn + h, xn + h k3)
To run the simulation, we simply start with x0 and use the formula above to compute x1, then plug x1 back in to compute x2, and so on.
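The single-variable formulas above fit in a few lines of code. Here is a minimal Python sketch (the function and variable names are mine, not taken from the site's source code), using x' = x as a test problem because its exact solution, e^t, is known:

```python
def rk4_step(f, t, x, h):
    """Advance x from time t to t + h with one Runge-Kutta step."""
    k1 = f(t, x)
    k2 = f(t + h/2, x + (h/2)*k1)
    k3 = f(t + h/2, x + (h/2)*k2)
    k4 = f(t + h, x + h*k3)
    return x + (h/6)*(k1 + 2*k2 + 2*k3 + k4)

# Example: x' = x with x(0) = 1, whose exact solution is x(t) = e^t.
f = lambda t, x: x
t, x, h = 0.0, 1.0, 0.01
for _ in range(100):          # 100 steps of size 0.01 reach t = 1
    x = rk4_step(f, t, x, h)
    t += h
print(x)  # very close to e = 2.71828...
```

Note how small the error is even with a modest step size; that fourth-order accuracy is what the next paragraph is referring to.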
The Runge-Kutta algorithm is known to be very accurate and well-behaved for a wide range of problems. See the references for more information on why this is such a good method, as well as where it can be expected to fail.
The Runge-Kutta algorithm extends easily to a set of coupled first-order differential equations. You wind up with essentially the same formulas shown above, but all the variables (except for time) are vectors.
To give an idea of how this works, here is an example with the vector notation expanded: instead of the compact vector form, we write out every component. Suppose there are only two variables, x and y:
x' = f(t,x,y)
y' = g(t,x,y)
Again we let
xn+1 = xn + (h/6)(k1 + 2k2 + 2k3 + k4)
yn+1 = yn + (h/6)(j1 + 2j2 + 2j3 + j4)
where
k1 = f(tn, xn, yn)
j1 = g(tn, xn, yn)
k2 = f(tn + h/2, xn + (h/2)k1, yn + (h/2)j1)
j2 = g(tn + h/2, xn + (h/2)k1, yn + (h/2)j1)
k3 = f(tn + h/2, xn + (h/2)k2, yn + (h/2)j2)
j3 = g(tn + h/2, xn + (h/2)k2, yn + (h/2)j2)
k4 = f(tn + h, xn + h k3, yn + h j3)
j4 = g(tn + h, xn + h k3, yn + h j3)
Given starting values x0 and y0, we use the formulas above to compute x1 and y1, then repeat the process to get x2 and y2, and so on.
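Written out in code, the two-variable scheme looks like this. This is a sketch with my own names, using the simple harmonic oscillator x' = y, y' = -x as a test problem because its exact solution is known (x = cos t, y = -sin t):

```python
import math

def rk4_step2(f, g, t, x, y, h):
    """One Runge-Kutta step for the pair x' = f(t,x,y), y' = g(t,x,y)."""
    k1 = f(t, x, y)
    j1 = g(t, x, y)
    k2 = f(t + h/2, x + (h/2)*k1, y + (h/2)*j1)
    j2 = g(t + h/2, x + (h/2)*k1, y + (h/2)*j1)
    k3 = f(t + h/2, x + (h/2)*k2, y + (h/2)*j2)
    j3 = g(t + h/2, x + (h/2)*k2, y + (h/2)*j2)
    k4 = f(t + h, x + h*k3, y + h*j3)
    j4 = g(t + h, x + h*k3, y + h*j3)
    return (x + (h/6)*(k1 + 2*k2 + 2*k3 + k4),
            y + (h/6)*(j1 + 2*j2 + 2*j3 + j4))

# Example: x' = y, y' = -x with x(0) = 1, y(0) = 0.
# The exact solution is x = cos(t), y = -sin(t).
f = lambda t, x, y: y
g = lambda t, x, y: -x
t, x, y, h = 0.0, 1.0, 0.0, math.pi/100
for _ in range(100):               # 100 steps reach t = pi
    x, y = rk4_step2(f, g, t, x, y, h)
    t += h
print(x, y)  # close to -1.0 and 0.0
```

The key point is that every k and j at a given stage is computed from the values of the *previous* stage, so the updates for x and y stay in lockstep.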
More generally, suppose we have any number of variables:
x' = f(t, x, y, z, ...)
y' = g(t, x, y, z, ...)
z' = p(t, x, y, z, ...)
...
(The third function is named p rather than h to avoid confusion with the step size h.)
You will have as many equations as there are variables. The algorithm is similar to that shown for the one-variable version. The details can be confusing and easy to get wrong, so I recommend looking at source code. A simplified version of the source code is available here.
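Since the linked source code may differ in its details, here is a hedged Python sketch of the general vector version: the state is a list of variables and a single derivs function returns the list of their derivatives. The names are mine; the oscillator from the two-variable example reappears as a 2-component state:

```python
import math

def rk4_step_vec(derivs, t, state, h):
    """One Runge-Kutta step for a list of variables.

    derivs(t, state) must return [x', y', z', ...] for state [x, y, z, ...].
    """
    def shifted(c, k):  # state + c*k, component by component
        return [s + c*ki for s, ki in zip(state, k)]
    k1 = derivs(t, state)
    k2 = derivs(t + h/2, shifted(h/2, k1))
    k3 = derivs(t + h/2, shifted(h/2, k2))
    k4 = derivs(t + h, shifted(h, k3))
    return [s + (h/6)*(a + 2*b + 2*c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

# Simple harmonic oscillator x' = y, y' = -x as a 2-component state vector.
derivs = lambda t, s: [s[1], -s[0]]
state, t, h = [1.0, 0.0], 0.0, math.pi/100
for _ in range(100):               # integrate to t = pi
    state = rk4_step_vec(derivs, t, state, h)
    t += h
print(state)  # close to [-1.0, 0.0]
```

Collecting all the derivatives into one function is what makes the code identical for 2, 3, or 100 variables: only the length of the state list changes.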