Mean Vector

  • The mean vector represents the central point, or expected value, of the distribution.
  • In d dimensions the mean vector simply stacks the mean of each variable into a single vector, e.g. μ = (μ₁, μ₂, …, μ_d).
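As a minimal sketch (with made-up sample data), the mean vector of a multi-dimensional data set is just the per-dimension mean, stacked into one vector:

```python
import numpy as np

# Hypothetical data set: each row is one sample, each column one variable.
rng = np.random.default_rng(0)
samples = rng.normal(loc=[2.0, -1.0], scale=1.0, size=(10_000, 2))

# The mean vector stacks the per-dimension means into a single vector.
mu = samples.mean(axis=0)
print(mu)  # close to [2.0, -1.0]
```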

Covariance Matrix

  • The covariance matrix Σ of a multivariate Gaussian distribution captures how each pair of variables changes together (covaries). It provides information on the extent to which two variables are linearly related and on the scale of each variable.
  • The covariance matrix holds variances along the diagonal and covariances off the diagonal. Variance measures how much a variable's values are spread out around its mean, and covariance measures how much two variables vary from their means together.
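To make the diagonal/off-diagonal split concrete, here is a hedged sketch with two synthetic, deliberately correlated variables; the diagonal of the estimated matrix holds their variances, the off-diagonal their covariance:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two correlated variables: y depends linearly on x plus noise.
x = rng.normal(size=10_000)
y = 0.8 * x + rng.normal(scale=0.5, size=10_000)
data = np.stack([x, y], axis=1)

# Sample covariance matrix; rowvar=False treats columns as variables.
cov = np.cov(data, rowvar=False)

# Diagonal entries approximate Var(x) = 1 and Var(y) = 0.8^2 + 0.5^2 = 0.89;
# the off-diagonal entry approximates Cov(x, y) = 0.8.
print(cov)
```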

Diagonal Covariance Matrix

  • A diagonal Gaussian distribution is a special case in which the variables are assumed to be independent of one another, so all off-diagonal covariances are zero and only the per-variable variances remain.

Example: Here are examples of a general covariance matrix and a diagonal covariance matrix for a 3-dimensional Gaussian distribution:

General Covariance Matrix
[[1.0, 0.5, 0.3],
 [0.5, 1.0, 0.6],
 [0.3, 0.6, 1.0]]

In this general covariance matrix, the diagonal elements (1.0, 1.0, 1.0) represent the variances of each individual variable, and the off-diagonal elements (e.g., 0.5, 0.3, 0.6) represent the covariances between each pair of variables. These covariances indicate that there is some degree of linear relationship between the variables.
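A quick sanity check, sketched in NumPy: any valid covariance matrix must be symmetric and positive semi-definite (all eigenvalues non-negative), and the example matrix above passes both tests:

```python
import numpy as np

sigma = np.array([[1.0, 0.5, 0.3],
                  [0.5, 1.0, 0.6],
                  [0.3, 0.6, 1.0]])

# A valid covariance matrix is symmetric ...
assert np.allclose(sigma, sigma.T)

# ... and positive semi-definite: no negative eigenvalues.
eigvals = np.linalg.eigvalsh(sigma)
print(eigvals.min() >= 0)  # True
```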

Diagonal Covariance Matrix
[[1.0, 0.0, 0.0],
 [0.0, 1.0, 0.0],
 [0.0, 0.0, 1.0]]

In the diagonal covariance matrix, all off-diagonal elements are zero, indicating that there is no linear relationship between the different variables—they are independent. The diagonal still contains the variances for each variable, identical to the general covariance matrix. This simplification often makes computations easier and is a common assumption when the independence of variables is a reasonable approximation.
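As a sketch of that independence claim: when we draw samples using the diagonal covariance matrix above, the empirical covariances between different dimensions come out near zero (the simulation setup here is illustrative, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(2)
mu = np.zeros(3)
diag_sigma = np.diag([1.0, 1.0, 1.0])

# With a diagonal covariance, sampling the joint distribution is equivalent
# to sampling each dimension from its own independent 1-D Gaussian.
joint = rng.multivariate_normal(mu, diag_sigma, size=50_000)
emp_cov = np.cov(joint, rowvar=False)

# Empirical off-diagonal covariances are close to zero.
off_diag = emp_cov[~np.eye(3, dtype=bool)]
print(np.abs(off_diag).max() < 0.05)  # True
```

This simplification is also what makes the diagonal case computationally cheap: the determinant is just the product of the diagonal entries, and inversion reduces to element-wise reciprocals.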