Quantum calculus, sometimes called calculus without limits, is equivalent to traditional infinitesimal calculus without the notion of limits. It defines "q-calculus" and "h-calculus", where h ostensibly stands for Planck's constant while q stands for quantum. The two parameters are related by the formula

$q = e^{ih} = e^{2\pi i \hbar}$

where $\hbar = \frac{h}{2\pi}$ is the reduced Planck constant.
In the q-calculus and h-calculus, differentials of functions are defined as

$d_q(f(x)) = f(qx) - f(x)$

and

$d_h(f(x)) = f(x + h) - f(x)$

respectively. Derivatives of functions are then defined as fractions, by the q-derivative

$D_q(f(x)) = \frac{d_q(f(x))}{d_q(x)} = \frac{f(qx) - f(x)}{(q - 1)\,x}$

and by the h-derivative

$D_h(f(x)) = \frac{d_h(f(x))}{d_h(x)} = \frac{f(x + h) - f(x)}{h}.$
In the limit, as h goes to 0, or equivalently as q goes to 1, these expressions take on the form of the derivative of classical calculus.
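To make the limiting behaviour concrete, the following minimal sketch (Python; the test function $f(x) = x^3$ and the sample point are illustrative choices, not taken from the text) compares the q-derivative and the h-derivative with the classical derivative as q approaches 1 and h approaches 0.

```python
# Sketch: q-derivative and h-derivative of f(x) = x**3 at x = 2, compared with
# the classical derivative f'(2) = 3 * 2**2 = 12. The test function and sample
# point are illustrative assumptions.

def q_derivative(f, x, q):
    """D_q f(x) = (f(qx) - f(x)) / ((q - 1) * x), for x != 0 and q != 1."""
    return (f(q * x) - f(x)) / ((q - 1) * x)

def h_derivative(f, x, h):
    """D_h f(x) = (f(x + h) - f(x)) / h, for h != 0."""
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 3
x = 2.0
classical = 3 * x ** 2

for q, h in [(1.5, 1.0), (1.1, 0.1), (1.01, 0.01), (1.001, 0.001)]:
    print(f"q = {q:<6}  D_q f = {q_derivative(f, x, q):9.5f}    "
          f"h = {h:<6}  D_h f = {h_derivative(f, x, h):9.5f}    "
          f"classical = {classical}")
# Both columns approach 12 as q -> 1 and h -> 0.
```

For $f(x) = x^n$ the q-derivative evaluates in closed form to $(1 + q + \dots + q^{n-1})\,x^{n-1}$, which tends to the classical $n x^{n-1}$ as $q \to 1$, consistent with the numbers printed above.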
A function $F(x)$ is a q-antiderivative of $f(x)$ if $D_q F(x) = f(x)$. The q-antiderivative (or q-integral) is denoted by

$\int f(x)\, d_q x$

and an expression for $F(x)$ can be found from the formula

$\int f(x)\, d_q x = (1 - q)\, x \sum_{j=0}^{\infty} q^j f(q^j x),$

which is called the Jackson integral of $f(x)$. For $0 < q < 1$, the series converges to a function $F(x)$ on an interval $(0, A]$ if $|f(x) x^{\alpha}|$ is bounded on the interval $(0, A]$ for some $0 \le \alpha < 1$.
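As a check on the definition, the sketch below (Python; the integrand, the value of q, and the truncation level are illustrative assumptions, not taken from the text) truncates the Jackson series and verifies numerically that applying the q-derivative to the resulting $F(x)$ recovers $f(x)$.

```python
# Sketch: approximate the Jackson integral
#   F(x) = (1 - q) * x * sum_{j >= 0} q**j * f(q**j * x)
# by a partial sum, then check that D_q F(x) = f(x).

def jackson_integral(f, x, q, terms=200):
    """Partial sum of the Jackson integral of f from 0 to x, for 0 < q < 1."""
    return (1 - q) * x * sum(q ** j * f(q ** j * x) for j in range(terms))

def q_derivative(g, x, q):
    """D_q g(x) = (g(qx) - g(x)) / ((q - 1) * x)."""
    return (g(q * x) - g(x)) / ((q - 1) * x)

f = lambda x: x ** 2   # illustrative integrand; its Jackson series converges on (0, A]
q = 0.9
F = lambda x: jackson_integral(f, x, q)

for x in (0.5, 1.0, 2.0):
    print(f"x = {x}:  D_q F(x) = {q_derivative(F, x, q):.6f}   f(x) = {f(x):.6f}")
# D_q F(x) reproduces f(x) up to truncation error; for this integrand the exact
# q-antiderivative is x**3 / (1 + q + q**2), which tends to x**3 / 3 as q -> 1.
```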