# L01 Intro Probability theory
## Readings ([reference material](/fy9b5mX0Rui-Sii3wHTncg))
- Kittel Chapter 1
- Maxwell distribution, Kittel pages 391-394
## Notes
[**Lecture 01 class notes**](https://web.physics.utah.edu/~rogachev/7310/L01%20Intro%20Prob%20Theory.pdf)
Statistical mechanics uses probabilities to describe systems of many particles. Many-particle systems can't be described easily with either QM or CM alone. However, laws emerge that govern such systems, built from the concepts of entropy, temperature, and free energy. Tons of observations have led us to the fundamental assumption of statistical mechanics:
#### Fundamental assumption
"In equilibrium, all accessible (degenerate) states of a closed system are equally probable". This is also a formulation of the second law of Thermodynamics.
### Laws of Thermodynamics
##### 1st law (conservation of energy - thermo)
The total energy in a system remains constant, although it may be converted from one form to another. Often expressed as:
$\Delta U=Q-W$,
where $\Delta U$ denotes the change in the internal energy of a closed system (for which heat or work through the system boundary are possible, but matter transfer is not possible), $Q$ denotes the quantity of energy supplied to the system as heat, and $W$ denotes the amount of thermodynamic work done by the system on its surroundings.
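For instance (numbers invented just for illustration): if a gas absorbs $Q = 100$ J of heat while doing $W = 40$ J of work on its surroundings, then
$\Delta U = Q - W = 100 \text{ J} - 40 \text{ J} = +60 \text{ J}$,
so the internal energy rises by 60 J.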
##### 2nd law (entropy maximum, heat flow)
"The entropy of the universe tends toward a maximum". A system assumes a configuration of maximum entropy at thermodynamic equilibrium.
For closed systems:
$\frac{dS}{dt} = \frac{\dot Q}{T}+\dot S_{i}$ with $\dot{S_i} \ge 0$
For open systems:
$\frac{dS}{dt} = \frac{\dot Q}{T}+\dot{S}+\dot S_{i}$ with $\dot{S_i} \ge 0$,
where $\dot{Q}$ is the heat flow into the system, $T$ is the temperature at the point where the heat enters the system, and $\dot{S}$ is the flow of entropy into the system associated with the flow of matter entering it. It should not be confused with the time derivative of the entropy.
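A classic sanity check (standard textbook example, not from the notes): let an amount of heat $Q$ flow from a hot reservoir at $T_h$ to a cold one at $T_c$. The total entropy change is
$\Delta S = \frac{Q}{T_c} - \frac{Q}{T_h} = Q\,\frac{T_h - T_c}{T_h T_c} \ge 0$,
which is non-negative precisely because heat flows from hot to cold.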
### Kinetic Theory of Gases
- molecular collisions entail an equalization of temperatures and hence a tendency towards equilibrium
- gases should, over time, tend toward the Maxwell–Boltzmann distribution
#### Entropy
The entropy of an isolated system in thermal equilibrium containing an amount of energy $E$ is:
$S = k_{\mathrm{B}}\ln\left[\Omega\left(E\right)\right]$
where $\Omega\left(E\right)$ is the number of quantum states in a small interval between $E$ and $E + \delta E$.
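A minimal numeric sketch (my own toy, in the spirit of Kittel's binary model systems): for $N$ spin-$\frac{1}{2}$ particles with half up, $\Omega = \binom{N}{N/2}$, and log-gamma keeps $\ln\Omega$ from overflowing:
```python=
from math import lgamma

k_B = 1.380649e-23  # Boltzmann constant, J/K

def ln_binomial(n, k):
    """ln C(n, k) via log-gamma, so huge multiplicities never overflow."""
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

# toy model: N spin-1/2 particles; multiplicity of the half-up macrostate
N = 10**22
S = k_B * ln_binomial(N, N // 2)  # S = k_B ln(Omega)
print(S)  # ~ k_B * N * ln(2) ~ 0.1 J/K
```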
### Probability Theory
A **statistical variable** ($x$) may be discrete or continuous. We'll often use [conjugate variables](/@OptXFinite-phd-notes/statmech-dictionary#canonically-conjugate-variables) $p$ and $q$ (usually momentum and position, respectively).
$\Omega$ is the **multiplicity** of a system.
#### Axioms of Probability theory
1. Positivity: $P(E) \ge 0$ for any event $E$
2. Additivity: $P(A \text{ or } B) = P(A) + P(B)$, [if A and B are disconnected events](https://www.skytowner.com/explore/probability_axioms#Additivity)
3. Normalization: $P(S) = 1$, i.e. the random variable must have some outcome in the sample space $S$
#### Probability density function (continuous systems)
In the case of a continuous variable, the probability of being at any single point is 0, since a continuous range contains uncountably many points.
$\rho_x(x)dx$ is the probability that the variable $x$ lies between $x$ and $x + dx$. The density has dimension equal to the number of parameter infinitesimals tacked onto the end, e.g. $\rho(x, y)dxdy$ has $x$ and $y$ dimensions (and units(!), which must be normalized out).
Notice that these have dimension and that they must be normalized such that
$\int \int \rho(x,y)dxdy = 1$
Normally we have a parameter space like $x,y,z,p_x,p_y,p_z$. Say from this we want the probability of finding a particle between heights $z$ and $z + dz$. Then we integrate out everything else:
$\rho_z(z) = \int \int \int \int \int \rho(x, y, z, p_x, p_y, p_z)\,dx\,dy\,dp_x\,dp_y\,dp_z$
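A quick symbolic check of marginalization (my own 2D toy rather than the full 6D case): integrate a unit 2D Gaussian over $y$ and recover the familiar 1D density:
```python=
from sympy import symbols, exp, pi, integrate, simplify, oo

x, y = symbols('x y', real=True)

# toy joint density: unit 2D Gaussian, normalized so the double integral is 1
rho = exp(-(x**2 + y**2) / 2) / (2 * pi)

# marginalize out y to get rho_x(x)
rho_x = simplify(integrate(rho, (y, -oo, oo)))
print(rho_x)  # sqrt(2)*exp(-x**2/2)/(2*sqrt(pi)), i.e. e^(-x^2/2)/sqrt(2*pi)
print(integrate(rho_x, (x, -oo, oo)))  # 1 -- the marginal is still normalized
```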
##### Means / Average values
$<x> = \bar{x} = \int \int x\rho(x,y)dxdy$
##### Covariance
Measure of the joint variability between two variables
$Cov(x,y) = <xy> - <x><y>$
##### Variance
Measure of the variability/spread of a variable. It is the expectation of the squared deviation of a random variable from its population mean
$Var(x) = \sigma^2 = <(x - \bar{x})^2> = <x^2> - <x>^2$
##### Standard deviation
$\sigma = \sqrt{\sigma^2}$
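These identities are easy to sanity-check numerically (my own sketch, assuming NumPy), replacing $< \cdot >$ with sample averages:
```python=
import numpy as np

rng = np.random.default_rng(0)

# correlated toy data: y leans on x, so Cov(x, y) should come out near 0.5
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)

var_x = (x**2).mean() - x.mean()**2            # Var(x) = <x^2> - <x>^2
cov_xy = (x * y).mean() - x.mean() * y.mean()  # Cov(x,y) = <xy> - <x><y>
sigma = np.sqrt(var_x)                         # standard deviation

print(var_x, np.var(x))                        # the two variance routes agree
print(cov_xy, np.cov(x, y, bias=True)[0, 1])   # ditto for covariance
print(sigma, np.std(x))
```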
##### Extending PDF to discrete systems (I mean why not?)
Enter the Dirac delta, babeyy.
For example, dice would then go like this:
$\rho = \sum\limits_{n=1}^6 \frac{1}{6}\delta(x-n)$
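SymPy integrates these deltas directly; a quick check (my own sketch) that the die density is normalized and has mean $7/2$:
```python=
from sympy import symbols, DiracDelta, Rational, integrate, oo

x = symbols('x', real=True)

# a fair die as a sum of weighted delta spikes at x = 1..6
rho = sum(Rational(1, 6) * DiracDelta(x - n) for n in range(1, 7))

print(integrate(rho, (x, -oo, oo)))      # 1  (normalized)
print(integrate(x * rho, (x, -oo, oo)))  # 7/2  (the mean <x>)
```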
##### Change of variables
Say y can be written as some function $y(x)$. Then we can change between $\rho_x(x)$ and $\rho_y(y)$ via:
$\rho_x(x)dx = \rho_y(y)dy$
$\hspace{2cm}$ $\downarrow$
$\rho_y(y) = \rho_x(x(y))\left|\frac{dy}{dx}\right|^{-1} = \rho_x(x(y))\left|J\right|$,
where $J = \frac{dx}{dy}$ is the [Jacobian](https://en.wikipedia.org/wiki/Jacobian_matrix_and_determinant) (a matrix in the multivariate case, whose determinant we take).
For example, moving from polar to cartesian coordinates (See [HW 1 p 3](/@OptXFinite-PhD-Notes/statmech-HW1#3) for more on this):
$\rho_{polar}(\varphi, r) = \rho_{cart}(x(r, \varphi), y(r, \varphi)) \left|\frac{\partial (x,y)}{\partial (\varphi, r)}\right| = \rho_{cart}(x(r, \varphi), y(r, \varphi)) \begin{vmatrix}\frac{\partial x}{\partial r} & \frac{\partial x}{\partial \varphi} \\ \frac{\partial y}{\partial r} & \frac{\partial y}{\partial \varphi}\end{vmatrix}$
Here's a shortcut for computing Jacobians:
```python=
from sympy import Matrix, symbols, sin, cos, simplify, pprint

r, varphi, theta = symbols('r varphi theta')

# spherical -> cartesian coordinate map
x = r*sin(theta)*cos(varphi)
y = r*sin(theta)*sin(varphi)
z = r*cos(theta)

X = Matrix([x, y, z])           # new coordinates
Y = Matrix([r, varphi, theta])  # old coordinates

J = X.jacobian(Y)         # matrix of partials d(x,y,z)/d(r,varphi,theta)
pprint(J)                 # pretty-print the Jacobian matrix
print(simplify(J.det()))  # its determinant
```
outputs the $3\times 3$ Jacobian matrix and its simplified determinant, $-r^{2}\sin\theta$ (negative only because of the $(r, \varphi, \theta)$ column ordering; the volume element uses $|J| = r^{2}\sin\theta$).
:heart:
<p style="font-family: cursive">
<em>God bless the SymPy contributors</em>
</p>
:heart:
#### Maxwell-Boltzmann Distribution
For a system containing a large number of identical, non-interacting, non-relativistic classical particles in thermodynamic equilibrium, the fraction of the particles within an infinitesimal element of the three-dimensional velocity space $d^{3}v$, centered on a velocity vector of magnitude $v$, is:
$f(v)~d^{n}v=\left({\frac {m}{2\pi kT}}\right)^{n/2}\,e^{-{\frac {mv^{2}}{2kT}}}~d^{n}v$,
where $n$ is the number of dimensions ($n = 3$ here).
Example: a particle in a box, given by a Maxwell distribution (really only applicable for a large number of particles; this just illustrates the idea):
$\rho(x,y,z,v_x,v_y,v_z) = \frac{1}{V}(\frac{m}{2\pi kT})^{3/2}e^{-mv^2/2kT}$
$\rho_v(v_x, v_y, v_z) = \int_V \rho\,dx\,dy\,dz \rightarrow \rho_{v_x}(v_x) = \int \int \rho_v\,dv_y\,dv_z$, or, switching to spherical velocity coordinates, $\rho_{sph}(v, \theta, \varphi) = \rho_v(v_x, v_y, v_z)\begin{vmatrix}\frac{\partial(v_x, v_y, v_z)}{\partial(v, \theta, \varphi)}\end{vmatrix}$
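Carrying this through symbolically (my own sketch; the spherical-velocity Jacobian determinant is $v^2\sin\theta$), integrating out the angles yields the Maxwell speed distribution:
```python=
from sympy import symbols, sin, exp, pi, integrate, simplify, oo, Rational

m, k, T, v, theta, phi = symbols('m k T v theta phi', positive=True)

# velocity-space Maxwell-Boltzmann density (box volume already integrated out)
rho_v = (m / (2*pi*k*T))**Rational(3, 2) * exp(-m*v**2 / (2*k*T))

# change to spherical velocity coordinates: |J| = v^2 sin(theta)
rho_sph = rho_v * v**2 * sin(theta)

# marginalize over the angles to get the speed distribution f(v)
f_v = simplify(integrate(rho_sph, (theta, 0, pi), (phi, 0, 2*pi)))
print(f_v)  # equivalent to 4*pi*v**2*(m/(2*pi*k*T))**(3/2)*exp(-m*v**2/(2*k*T))

# sanity check: f(v) is normalized on v in [0, oo)
print(integrate(f_v, (v, 0, oo)))  # 1
```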
{"metaMigratedAt":"2023-06-17T19:44:21.237Z","metaMigratedFrom":"YAML","title":"L01 Intro Probability theory","breaks":true,"contributors":"[{\"id\":\"d318ff1b-6411-4b00-bf12-ab0485a7fb22\",\"add\":8800,\"del\":1952}]","description":"Kittel Chapter 1"}