For a vector-valued function \(\mathbf{f}\), the Jacobian is the matrix of partial derivatives of the elements of the function's result with respect to the elements of its argument.
\[ \mathbf{J}(\mathbf{f}) = \begin{pmatrix} \frac{\partial f_0}{\partial x_0} & \frac{\partial f_0}{\partial x_1} & \dots & \frac{\partial f_0}{\partial x_n}\\ \frac{\partial f_1}{\partial x_0} & \frac{\partial f_1}{\partial x_1} & \dots & \frac{\partial f_1}{\partial x_n}\\ \vdots & \vdots & \ddots & \vdots\\ \frac{\partial f_m}{\partial x_0} & \frac{\partial f_m}{\partial x_1} & \dots & \frac{\partial f_m}{\partial x_n}\\ \end{pmatrix} \]
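As a minimal sketch of this definition, we can approximate the Jacobian numerically with central finite differences; the function and step size here are illustrative assumptions, not taken from the text.

```python
def jacobian(f, x, h=1e-6):
    """Approximate the m-by-n Jacobian of a vector-valued f at x
    by central finite differences with step h."""
    fx = f(x)
    m, n = len(fx), len(x)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        # perturb the j-th argument up and down by h
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        fp, fm = f(xp), f(xm)
        for i in range(m):
            J[i][j] = (fp[i] - fm[i]) / (2.0 * h)
    return J

# Example: f(x, y) = (x*y, x + y) has Jacobian [[y, x], [1, 1]].
J = jacobian(lambda v: [v[0] * v[1], v[0] + v[1]], [2.0, 3.0])
```

Row \(i\), column \(j\) of the result approximates \(\partial f_i/\partial x_j\), matching the layout of the matrix above.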
The determinant of a square Jacobian is used in integration by substitution, where it scales the volume element under a change of variables, and is often written
\[ \left|\mathbf{J}(\mathbf{f})\right| = \left|\frac{\partial\left(f_0,f_1,\dots,f_n\right)}{\partial\left(x_0,x_1,\dots,x_n\right)}\right| \]
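To illustrate the determinant, the sketch below approximates it for the familiar polar-to-Cartesian map, whose Jacobian determinant is \(r\); the specific map and tolerances are my own illustrative choices.

```python
import math

def jacobian_det2(f, x, h=1e-6):
    """Approximate the determinant of the 2-by-2 Jacobian of f at x,
    using central finite differences for each partial derivative."""
    def d(i, j):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        return (f(xp)[i] - f(xm)[i]) / (2.0 * h)
    return d(0, 0) * d(1, 1) - d(0, 1) * d(1, 0)

# Polar to Cartesian: (r, theta) -> (r cos theta, r sin theta)
# has Jacobian determinant r, here evaluated at r = 2.
det = jacobian_det2(lambda v: [v[0] * math.cos(v[1]),
                               v[0] * math.sin(v[1])],
                    [2.0, 0.5])
```

This is the factor that turns \(\mathrm{d}x\,\mathrm{d}y\) into \(r\,\mathrm{d}r\,\mathrm{d}\theta\) in double integrals.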
If the function \(\mathbf{f}\) is itself the vector of partial derivatives of a scalar-valued function with respect to its arguments, that is, its gradient, then the Jacobian of \(\mathbf{f}\) is the Hessian of that function.
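The gradient-to-Hessian relationship can be sketched numerically: differencing a finite-difference gradient gives the Jacobian of the gradient, which should match the analytic Hessian. The test function \(g(x,y)=x^2y+y^3\) is an assumption for illustration.

```python
def grad(g, x, h=1e-4):
    """Central-difference gradient of a scalar-valued g at x."""
    def at(j, s):
        y = list(x); y[j] += s * h
        return g(y)
    return [(at(j, 1) - at(j, -1)) / (2.0 * h) for j in range(len(x))]

def hessian(g, x, h=1e-4):
    """Hessian of g at x, computed as the Jacobian of its gradient."""
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for j in range(n):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        gp, gm = grad(g, xp, h), grad(g, xm, h)
        for i in range(n):
            H[i][j] = (gp[i] - gm[i]) / (2.0 * h)
    return H

# g(x, y) = x^2 * y + y^3 has Hessian [[2y, 2x], [2x, 6y]],
# which at (1, 2) is [[4, 2], [2, 12]].
H = hessian(lambda v: v[0] ** 2 * v[1] + v[1] ** 3, [1.0, 2.0])
```

Note that the result is symmetric, as a Hessian of a twice continuously differentiable function must be.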