Mathematics

Trigonometry Review

More content coming in the near future 🙂

SOHCAHTOA

For any right triangle (i.e., a triangle with a $90^\circ$ angle between two of its segments), we can apply the following equations to determine the length of segments that comprise the triangle or the degree of the acute angles in the triangle.

 


SOH: \begin{equation*} \sin(\theta) = \frac{\mathrm{opposite}}{\mathrm{hypotenuse}} \end{equation*}

CAH: \begin{equation*} \cos(\theta) = \frac{\mathrm{adjacent}}{\mathrm{hypotenuse}} \end{equation*}

TOA: \begin{equation*} \tan(\theta) = \frac{\mathrm{opposite}}{\mathrm{adjacent}} \end{equation*}

Equation for a Circle

A circle is defined as the set of points lying at a fixed distance from some center point. If we set the center of a circle with radius $r$ to be the origin of a 2D Cartesian coordinate system, then we can use the Pythagorean theorem to find any point $P = (r_x, r_y)$ that lies along the perimeter of our circle: $r_x^2 + r_y^2 = r^2$

If we want to know the $r_x$ and $r_y$ coordinates associated with some $0 \leq \theta \leq 2\pi$ radians, we can use our first two SOHCAHTOA equations:

  • $r_x= r \cos(\theta)$
  • $r_y = r \sin(\theta)$

Aside: to convert between radians and degrees, use the equality $\pi \, \mathrm{rad} = 180^\circ$

  • Thus, $1 \, \mathrm{rad} = (\frac{180}{\pi})^\circ \approx 57.2958^\circ$

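
These relations are easy to check numerically. A minimal sketch in Python (the function names `circle_point` and `rad_to_deg` are my own):

```python
import math

def circle_point(r, theta):
    """Return (r_x, r_y) on a circle of radius r at angle theta (in radians)."""
    return r * math.cos(theta), r * math.sin(theta)

def rad_to_deg(theta):
    """Convert radians to degrees using the equality pi rad = 180 degrees."""
    return theta * 180 / math.pi

# A point at pi/2 rad (90 degrees) on the unit circle sits at (0, 1),
# and any point on the circle satisfies r_x^2 + r_y^2 = r^2
x, y = circle_point(1, math.pi / 2)
```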

The Unit Circle



Probability

Counting Techniques

  1. Multiplication rule
    • Suppose there are $r$ items in a set
    • Also, let's say there are $n_1$ possibilities for the 1st item, $n_2$ possibilities for the 2nd item, …, and $n_r$ possibilities for the $r$th item
      • Then, the total number of possibilities across all $r$ items is $(n_1)(n_2) \cdots (n_r)$
  2. Permutations
    • A permutation is an ordered set of $r$ items selected from a (larger) set of $n$ items without replacement
    • Note: these are a special case of the multiplication rule
      • Specifically, we use permutations when the order in which we choose items from a set is relevant
    • Theorem: the number of permutations of $n$ distinct items selected $r$ at a time without replacement is… \begin{equation} \frac{n!}{(n-r)!} \end{equation}
  3. Combinations 
    • A combination is an unordered set of $r$ items taken from a (larger) set of $n$ objects without replacement
    • Theorem: the number of combinations of $r$ items taken without replacement from a (larger) set of $n$ distinct objects is… \begin{equation} \binom{n}{r} = \frac{n!}{(n-r)!r!} \end{equation}
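
The permutation and combination formulas above can be sanity-checked with Python's standard library (`math.perm` and `math.comb` implement them directly; here the factorial formulas are written out for comparison):

```python
import math

n, r = 5, 3

# Permutations: n! / (n - r)!  ->  ordered selections without replacement
n_perm = math.factorial(n) // math.factorial(n - r)

# Combinations: n! / ((n - r)! r!)  ->  unordered selections without replacement
n_comb = math.factorial(n) // (math.factorial(n - r) * math.factorial(r))
```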

Random Variables

Two different kinds of random variables:

  1. Discrete random variables: characterized by a probability mass function
  2. Continuous random variables: characterized by a probability density function

 

CORRECTION: As pointed out by adityaguharoy, there are actually random variables/distributions that are a mix of discrete and continuous (e.g., a distribution whose cumulative distribution function has both jumps and continuous segments). Hopefully I'll remember to update this post with more details 🙂


Linear Algebra: Introduction

The quintessential linear algebra problem will ask for the solution of a set of linear equations.

  • Example: Find the solution $(x,y)$ for the linear system

$\begin{matrix}
3x-y=2 \\
2x+3y=4
\end{matrix}$

  • There are two ways we can interpret the solution to these equations:
    • The point(s) at which these equations intersect when plotted as a line in $\mathbb{R}^2$
      • For our example, we can rewrite each equation in our system as a function $f(x)=y$ and plot each function on $\mathbb{R}^2$ to visualize this interpretation of the solution
    • If we rewrite the system as a vector equation, the solution becomes the set of scalar values that are multiplied with respective column vectors on the left hand side of the vector equation so that we obtain the right hand side
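
As a quick numerical check of the example above (reading the second equation as $2x + 3y = 4$), a 2×2 system can be solved directly with Cramer's rule; a minimal sketch, with a function name of my own choosing:

```python
def solve_2x2(a, b, c, d, e, f):
    """Solve the system  a*x + b*y = e,  c*x + d*y = f  via Cramer's rule."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("system has no unique solution")
    x = (e * d - b * f) / det
    y = (a * f - e * c) / det
    return x, y

# 3x - y = 2  and  2x + 3y = 4  ->  (x, y) = (10/11, 8/11)
x, y = solve_2x2(3, -1, 2, 3, 2, 4)
```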

Don't worry, I will be adding more content to this page in the near future

Sources

  • Linear Algebra and its Applications, 3rd Edition. Gilbert Strang (1986)
Mathematics, Neuroscience

Markov Chains

Introduction – Toothpaste Brand Example

Consider the following scenario:

  • Brand T is a toothpaste company that controls 20% of the toothpaste market
  • A market researcher predicts the following effects of an ad campaign:
    • A consumer using brand T will continue using brand T with a 90% probability
    • A consumer not using brand T will switch to brand T with a 70% probability
  • For any given customer in the toothpaste market, let…
    • $T$ denote a state of using brand T
    • $T'$ denote the state of using a toothpaste brand other than T

Definitions:

  • A transition diagram is a weighted directed graph whose nodes denote the various states a system can exist in and whose edges denote the probability of the system transitioning from one state to another after some time step $\Delta t$
    • Transition diagram describing our toothpaste brand example:

      Figure 1: transition diagram for the toothpaste brand example
  • A transition probability matrix $\mathrm{P}$ is an $n \times n$ matrix whose $(i,j)$ entry gives the probability that a “sample” in the $i$th state of an $n$-state system will transition to its $j$th state
    • Importantly, the entries in each row of $\mathrm{P}$ must sum to 1
    • Here is the transition probability matrix describing our toothpaste brand example:

      Figure 2: $\mathrm{P} = \begin{bmatrix} 0.9 & 0.1 \\ 0.7 & 0.3 \end{bmatrix}$
  • An initial state distribution matrix $\mathrm{S_0}$ for an $n$-state system is an $n$-dimensional row vector whose $i$th entry denotes the percentage of “samples” that are in state $i$ at time $t=0$
    • Here is the initial state distribution matrix for our toothpaste example:

      Figure 3: $\mathrm{S_0} = \begin{bmatrix} 0.2 & 0.8 \end{bmatrix}$
    • The 1st entry indicates that 20% of customers in the toothpaste market use brand T (state $T$) and the 2nd entry indicates that 80% use brand T' (state $T'$)
  • A probability tree gives the probability a “sample” will transition to some state after some time
    • For our toothpaste example, let's construct a probability tree that tells us the probability a person will use brand T vs. T' after one month

      Figure 4: probability tree for the first month of the toothpaste example
    • In order to determine the probability a customer is using brand T after one month, we need to take the sum of the products of the transition probabilities along every path that ends in state T (Fig 5)
      • Here, there are two possible state transition sequences that end in state T (highlighted in orange and green):
        Figure 5: probability tree with the two paths ending in state T highlighted
        • The orange path gives the probability a customer will use toothpaste brand T for the entire month ⇒ $P(T \rightarrow T) = (0.2)(0.9) = 0.18$
        • The green path gives the probability a customer uses brand T' at the beginning of the month, then switches to brand T by the end of the month ⇒ $P(T' \rightarrow T) = (0.8)(0.7) = 0.56$
      • Now, we can sum the probabilities of each possible transition sequence ending in state $T$ in order to determine the total probability a customer will be using brand T by the end of the month: $P(T) = P(T \rightarrow T) + P(T' \rightarrow T) = 0.18 + 0.56 = 0.74$
        • There is a 74% chance a random customer in the toothpaste market will be using brand T by the end of the month
      • Since we only have two possible states in our system, subtracting $P(T)$ from 1 gives the probability a random customer will be using brand T' by the end of the month: $P(T') = 1 - P(T) = 1 - 0.74 = 0.26$
        • Importantly, we can construct a state distribution matrix giving the probabilities for the toothpaste brand a randomly sampled customer will be using after one month: $\mathrm{S_1} = \begin{bmatrix} 0.74 & 0.26 \end{bmatrix}$
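
The tree computation above amounts to summing the products along each path; a minimal check in Python (variable names are my own):

```python
# Probabilities from the toothpaste brand example
p_T0 = 0.2        # fraction of customers initially using brand T
p_stay = 0.9      # P(T -> T) over one month
p_switch = 0.7    # P(T' -> T) over one month

# Sum the path products for every path in the tree that ends in state T
p_T1 = p_T0 * p_stay + (1 - p_T0) * p_switch   # 0.18 + 0.56 = 0.74
p_not_T1 = 1 - p_T1                            # the only other state: 0.26
```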

Theorem: $\mathrm{S_i} \cdot \mathrm{P} = \mathrm{S_{i+1}}$

  • Specifically, this says that the product of the state distribution matrix for the $i$th time step and the corresponding transition probability matrix returns the state distribution matrix for the $(i+1)$th time step
  • Let's check and make sure this holds true for our example:

$\mathrm{S_0} \cdot \mathrm{P} = \begin{bmatrix} 0.2 & 0.8 \end{bmatrix} \cdot \begin{bmatrix} 0.9 & 0.1 \\ 0.7 & 0.3 \end{bmatrix} = \begin{bmatrix} (0.2)(0.9) + (0.8)(0.7) & (0.2)(0.1) + (0.8)(0.3) \end{bmatrix} = \begin{bmatrix} 0.74 & 0.26 \end{bmatrix} =  \mathrm{S_{1}}$

  • Results agree with our probability tree!

Assuming $\mathrm{P}$ remains valid, we can determine the expected state distribution matrix for any time step (i.e., month)

  • State distribution matrix for second time step:

$\mathrm{S_1} \cdot \mathrm{P} = \begin{bmatrix} 0.74 & 0.26 \end{bmatrix} \cdot \begin{bmatrix} 0.9 & 0.1 \\ 0.7 & 0.3 \end{bmatrix}$

$= \begin{bmatrix} (0.74)(0.9) + (0.26)(0.7) & (0.74)(0.1) + (0.26)(0.3) \end{bmatrix}$

$= \begin{bmatrix} 0.848 & 0.152 \end{bmatrix} =  \mathrm{S_{2}}$

  • State distribution matrix for third time step

$\mathrm{S_2} \cdot \mathrm{P} = \begin{bmatrix} 0.848 & 0.152 \end{bmatrix} \cdot \begin{bmatrix} 0.9 & 0.1 \\ 0.7 & 0.3 \end{bmatrix}$

$= \begin{bmatrix} (0.848)(0.9) + (0.152)(0.7) & (0.848)(0.1) + (0.152)(0.3) \end{bmatrix}$

$= \begin{bmatrix} 0.8696 & 0.1304 \end{bmatrix} =  \mathrm{S_{3}}$
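
The repeated matrix products above are easy to automate; a minimal sketch in pure Python (the helper name `step` is my own):

```python
def step(S, P):
    """One time step: multiply row vector S by transition matrix P."""
    return [sum(S[i] * P[i][j] for i in range(len(S)))
            for j in range(len(P[0]))]

# Transition probability matrix and S_0 for the toothpaste example
P = [[0.9, 0.1],
     [0.7, 0.3]]
S = [0.2, 0.8]

history = [S]
for _ in range(3):      # compute S_1, S_2, S_3
    S = step(S, P)
    history.append(S)
```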

Regular Markov Chains: Stationary Matrices and Steady State Markov Chains

Example: assume a company initially has 10% of the market share

  • Under the advertising campaign, the transition probability matrix is given by $\mathrm{P} = \begin{bmatrix} 0.8 & 0.2 \\ 0.6 & 0.4 \end{bmatrix}$
    • Notations:
      • $A = $ state where a customer is using brand A
      • $A' =$ state where a customer is using a brand other than A
  • Question: what happens to the company's market share over a long period of time (assuming $\mathrm{P}$ continues to be valid)?
    • Solution:

$ \mathrm{S_0} = \begin{bmatrix} 0.1 & 0.9 \end{bmatrix}$

$ \mathrm{S_1} = \mathrm{S_0} \cdot \mathrm{P} = \begin{bmatrix} 0.1 & 0.9 \end{bmatrix} \cdot \begin{bmatrix} 0.8 & 0.2 \\ 0.6 & 0.4 \end{bmatrix} = \begin{bmatrix} 0.62 & 0.38 \end{bmatrix}$

$ \mathrm{S_2} = \mathrm{S_1} \cdot \mathrm{P} = \begin{bmatrix} 0.62 & 0.38 \end{bmatrix} \cdot \begin{bmatrix} 0.8 & 0.2 \\ 0.6 & 0.4 \end{bmatrix} = \begin{bmatrix} 0.724 & 0.276 \end{bmatrix}$

$ \mathrm{S_3} = \mathrm{S_2} \cdot \mathrm{P} = \begin{bmatrix} 0.7448 & 0.2552 \end{bmatrix}$

$ \mathrm{S_4} = \mathrm{S_3} \cdot \mathrm{P} = \begin{bmatrix} 0.74896 & 0.25104 \end{bmatrix}$

$ \mathrm{S_5} = \mathrm{S_4} \cdot \mathrm{P} = \begin{bmatrix} 0.749792 & 0.250208\end{bmatrix}$

$ \mathrm{S_6} = \mathrm{S_5} \cdot \mathrm{P} = \begin{bmatrix} 0.7499584 & 0.2500416\end{bmatrix}$

  • Here, the state distribution matrices $\mathrm{S_i}$ get closer and closer to $\begin{bmatrix} 0.75 & 0.25 \end{bmatrix}$ as $i \rightarrow \infty$
    • Moreover, if we take the dot product between $\mathrm{S} = \begin{bmatrix} 0.75 & 0.25 \end{bmatrix}$ and $\mathrm{P}$, we get $\mathrm{S}$:
      • $ \mathrm{S} = \mathrm{S} \cdot \mathrm{P} = \begin{bmatrix} 0.75 & 0.25 \end{bmatrix} \cdot \begin{bmatrix} 0.8 & 0.2 \\ 0.6 & 0.4 \end{bmatrix} = \begin{bmatrix} 0.75 & 0.25 \end{bmatrix}$
        • No change occurs!
          • The matrix $\mathrm{S}$ is called a stationary matrix and the system is said to be at steady state
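
The approach to the stationary matrix can be reproduced by iterating $\mathrm{S_{i+1}} = \mathrm{S_i} \cdot \mathrm{P}$ until the distribution stops changing; a minimal sketch using the same transition matrix (helper name is my own):

```python
def step(S, P):
    """One time step: multiply row vector S by transition matrix P."""
    return [sum(S[i] * P[i][j] for i in range(len(S)))
            for j in range(len(P[0]))]

P = [[0.8, 0.2],
     [0.6, 0.4]]
S = [0.1, 0.9]   # S_0: the company starts with 10% of the market

# Iterate until successive state distribution matrices agree very closely
while True:
    S_next = step(S, P)
    if max(abs(a - b) for a, b in zip(S, S_next)) < 1e-10:
        break
    S = S_next

# S has now (numerically) converged to the stationary matrix [0.75, 0.25]
```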

Questions:

  1. Does every Markov chain have a unique stationary matrix?
  2. If a Markov chain does have a unique stationary matrix, will the successive state matrices always approach this stationary matrix?

Answers:

  • Generally, the answer to both questions is no
  • However, if a Markov chain is a regular Markov chain, then the answer to both questions is yes


Chemistry, Mathematics, Physics

Special Relativity – Part 1

Luminiferous ether

Mid-nineteenth century: mainstream science believed light was a (mechanical) wave traveling through a medium called the luminiferous ether

Mid-nineteenth-century definition of a wave:

  • Wave = a disturbance traveling through a medium
    • Note: this is essentially what our current definition of a mechanical wave is today
    • Medium = the material or substance through which a wave is traveling
  • Example 1: a dewdrop falling into a pond (Figure 1)
    Figure 1: ripples spreading from a dewdrop landing in a pond
    • Here, the medium is the water in the pond
    • The initial disturbance is the dewdrop landing in the water
      • The water particles initially disturbed by the dewdrop further disturb the positions of surrounding water particles
      • This disturbance is further propagated throughout the medium (i.e., the pond)
  • Example 2: sound waves from clapping
    • Medium = air
    • Initial disturbance = compression of air molecules
      • The compressed air molecules collide with one another and generate sound waves
  • Note: both of these examples are mechanical waves

We knew from research such as Young's double-slit experiment (1801) that light has wave-like properties

  • Specifically, it showed one of the hallmark signs of wave behavior: interference
  • Some unanswered questions of the mid-19th century:
    • How could they define light in terms of its wave-like properties?
      • Note: they were trying to define light in terms of mechanical waves (they didn't know about electromagnetic waves back then)
    • They theorized that light (e.g., light traveling from the Sun to the Earth) could be explained as a disturbance propagating through a medium
      • People called this medium the luminiferous ether
    • Big question: does the luminiferous ether exist?

Luminiferous ether = the medium through which light (supposedly) propagates

  • One major goal of mainstream science back then was to detect/validate the existence of this medium
  • Note: if there is a luminiferous ether, the Earth must be traveling fast relative to it
    • Not only is the Earth rotating on its own axis, but it is also traveling along an elliptical orbit around the Sun at $\approx 30 \, \mathrm{km/s}$
    • Moreover, the Sun is estimated to orbit the center of the galaxy at $\approx 200 \, \mathrm{km/s}$
    • As far as our galaxy is concerned, we don't really know what it's doing; we just know it's moving
      • Most scientists theorize our galaxy is rotating around a black hole
    • Take-home message: if the luminiferous ether exists, Earth's position should be constantly changing relative to it
      • Reasoning behind this:
        • The odds of us being stationary relative to such a medium are essentially zero
        • We should either be moving relative to the ether or the ether should be moving relative to us
          • Thus, we should be able to detect some sort of "ether wind" or "current" associated with the luminiferous ether
  • Aside: waves propagate faster in the direction in which the current is moving
    • Example: a dewdrop falling into a stream with a flowing current (Figure 2)
      Figure 2: ripples from a dewdrop in a stream, spreading faster in the direction of the current
      • Here, the medium is the water in the stream and the initial disturbance is the dewdrop falling into the stream
      • Propagation of the medium's distortion (i.e., the waves/ripples in the stream) occurs more quickly in the direction of the current (i.e., movement to the left)

 

The Michelson-Morley Experiment:

Experiment Background

Assuming there did exist a luminiferous ether, let $\overrightarrow{s}$ be the velocity of its ether wind

  • From our discussion on wave propagation speed and currents, light propagated in the same direction as $\overrightarrow{s}$ should travel at a faster velocity than light propagated in the $-\overrightarrow{s}$ direction
  • For a while, no one could figure out how to test this because the tools/technology did not yet exist to detect differences in velocities near the speed of light (thus, any differences they expected to find were immeasurable)

Eventually, Michelson and Morley designed an experiment that was able to work around this issue using wave interference

  • Recall: interference is a hallmark behavior of waves
  • Instead of attempting to measure the speed of light emitted in different directions, they split light into two different directions, recombined them, and observed the interference patterns
    • They reasoned that if light emitted in different directions traveled at different speeds, then different interference patterns would result
    • However, this isn't what happened!
  • No matter how they oriented the apparatus (no matter the time of day and/or year), they always observed the same interference pattern
    • Conclusion: the luminiferous ether doesn't seem to affect light waves ⇒ breakdown of the idea of a luminiferous ether and/or an "absolute" inertial frame of reference through which light travels
  • It has been called one of "the most famous failed experiments"
    • Note: there were other experiments besides this one at the time that were also causing people to question the existence of a luminiferous ether

As it turns out, no matter the reference frame, light always travels at a constant speed!

 
