My study path towards Linear Algebra:
- Stephen Boyd, Lieven Vandenberghe, Introduction to Applied Linear Algebra
- Gilbert Strang, Linear Algebra (MIT 18.06, online)
Since I am now resolved to lean toward computational design rather than traditional architectural design, I feel compelled to review this material. I am revising my notes into digital format.
The following libraries are required to run the Jupyter notebooks:
- Matplotlib
- NumPy
- Python
- SymPy
📌 vector
It is a finite sequence of numbers / values. Normally it looks something like this: $$ \begin{bmatrix}1\\2\\3\end{bmatrix} \text{ or } \begin{pmatrix}1\\2\\3\end{pmatrix} $$
📌 elements/entries/coefficients/components
These four names all refer to the same thing: an individual element of a vector.
📌 size/dimension/length
These are also synonyms; they all refer to the number of elements in a vector.
📌scalar and vector
Scalars can be real or complex. Therefore, we also have real vectors and complex vectors. In most cases, "vector" refers to a real vector.
📌block or stacked vector
Sometimes it is very convenient to concatenate or stack vectors.
e.g. $$ a=\begin{bmatrix}a_1\\\vdots\\a_m\end{bmatrix}, b=\begin{bmatrix}b_1\\\vdots\\b_n\end{bmatrix}, c=\begin{bmatrix}c_1\\\vdots\\c_p\end{bmatrix}, \\ A=\begin{bmatrix}a_1\\\vdots\\a_m\\b_1\\\vdots\\b_n\\c_1\\\vdots\\c_p\end{bmatrix} $$
```python
import numpy as np

a = np.array([[1], [2], [3]])
b = np.array([[4], [5], [6]])
c = np.array([[7], [8], [9]])

# Stack the column vectors a, b, c into one block vector A.
A = np.vstack([a, b, c])
```
📌subvector/slice
We call a vector formed from a contiguous range of the entries of another vector a subvector or slice, e.g. $a_{r:s}=(a_r,\ldots,a_s)$.
📌 colon notation and index range
📌 indexing
When you see $a_i$, it could mean either:
- the $i$-th vector in a collection of vectors, or
- the $i$-th element of a vector.

Therefore, for clarity, we use:
- $a_i$ for the $i$-th vector, and
- $(a_i)_j$ for the $j$-th element of the $i$-th vector.
📌 zero vector
A zero vector has all elements equal to zero; it is usually written simply as $0$, with its size determined from context.
📌 unit vector
A unit vector has all elements equal to zero except one, which is equal to one. $$ e_1 = \begin{bmatrix}1\\0\\0\end{bmatrix}, e_2 = \begin{bmatrix}0\\1\\0\end{bmatrix}, e_3 = \begin{bmatrix}0\\0\\1\end{bmatrix} $$
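As a quick NumPy sketch (assuming the notebook setup above), the columns of the identity matrix give the standard unit vectors:

```python
import numpy as np

n = 3
# The columns of the identity matrix are the standard unit vectors e_1, ..., e_n.
I = np.eye(n)
e1, e2, e3 = I[:, 0], I[:, 1], I[:, 2]
print(e1, e2, e3)  # [1. 0. 0.] [0. 1. 0.] [0. 0. 1.]
```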
📌 Ones Vector
The ones vector of size $n$, written $\mathbf{1}_n$ (or just $\mathbf{1}$ when the size is clear from context), has all elements equal to one.
📌 Sparsity
If many of a vector's elements are zero, we say the vector is sparse.
The number of nonzero elements in a vector $x$ is denoted $\mathbf{nnz}(x)$.
In machine learning or computer science, a vector is often seen as a list of values rather than in its mathematical meaning.
```cpp
#include <vector>
// ...
std::vector<int> ListInt;
```
📌 Displacement and Location
A vector can represent either a location or a displacement.
📌 Colors
Colors can be represented as vectors, e.g. as $(R, G, B)$ values.
📌 Quantities
An $n$-vector can represent the amounts or quantities of $n$ different resources or products held (or required) by some entity.
📌 Value Across Population
An $n$-vector can give the values of some quantity across a population of $n$ individuals or entities.
📌 Image
This has a much more "machine learning" feel. If each pixel is represented by a vector, e.g.
$$
\text{Pixel}_{i,j}=\begin{bmatrix}0.8\\0.2\\0.3\end{bmatrix}
$$
then an image can be represented by stacking all of its pixel values into one long vector.
Prerequisite: the vectors have to be the same size. $$ \begin{bmatrix}0\\7\\3\end{bmatrix}+ \begin{bmatrix}1\\2\\0\end{bmatrix}= \begin{bmatrix}1\\9\\3\end{bmatrix} $$
📌 commutative $$ a+b = b+a $$
📌 associative $$ (a+b)+c = a+(b+c) $$
📌 Displacement
📌 Vector as a list of value
There are lots of examples of these.
Nothing special. $$ (-2)\begin{bmatrix}0\\7\\3\end{bmatrix}=\begin{bmatrix}0\\-14\\-6\end{bmatrix} $$
📌 commutative
📌 Displacement
Scale the displacement.
- scalars: the coefficients $\beta_1,\ldots,\beta_m$
- vectors: $a_1,\ldots,a_m$
- linear combination: $\beta_1 a_1+\beta_2 a_2+\cdots+\beta_m a_m$
📌 Linear combination of unit vectors
Nothing special, just a linear combination using the unit vectors as the basis: any $n$-vector $b$ can be written as $b = b_1e_1+b_2e_2+\cdots+b_ne_n$.
📌 Special linear combinations
- sum, when all coefficients are $1$
- mean / average, when all coefficients are $1/m$
- affine combination, when the coefficients sum to one: $\beta_1+\cdots+\beta_m=1$
- convex combination / mixture / weighted average, when it is an affine combination and the coefficients are also nonnegative
📌 Displacement
📌 Line and Segment
This is a good example of an affine combination.
Use $c=(1-\theta)a+\theta b$: as $\theta$ ranges over all real numbers, $c$ traces the line through $a$ and $b$; restricting $\theta\in[0,1]$ gives the segment between them. See the sketch below.
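A minimal NumPy sketch of this affine combination; the points `a`, `b` and the values of `theta` are made-up examples:

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([4.0, 0.0])

# (1 - theta) * a + theta * b is an affine combination of a and b.
# theta in [0, 1] gives points on the segment between a and b;
# theta outside [0, 1] gives other points on the line through a and b.
for theta in [0.0, 0.25, 0.5, 1.0, 1.5]:
    c = (1 - theta) * a + theta * b
    print(theta, c)
```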
📌 Equation $$ a^Tb = a_1b_1+a_2b_2+\cdots+a_nb_n $$ There are other notations for the inner (dot) product: $$ \langle a,b\rangle , \langle a|b\rangle , (a,b),\ a\cdot b $$
$$ (a+b)^T(c+d) = a^Tc+a^Td+b^Tc+b^Td $$ In short, why does the above work? It is easy to see:
- $(a+b)$ is a vector
- $(c+d)$ is also a vector
- $(a+b)$ is transposed into $(a+b)^T$, i.e. a row vector
- $(a+b)^T(c+d)$, a row vector times a column vector, is a scalar (if their dimensions agree), and it expands by distributing the products.
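A quick numerical sanity check of this expansion, with arbitrary example vectors:

```python
import numpy as np

a, b = np.array([1.0, 2.0, 3.0]), np.array([0.0, -1.0, 2.0])
c, d = np.array([2.0, 2.0, 1.0]), np.array([1.0, 0.0, -3.0])

lhs = (a + b) @ (c + d)                      # (a+b)^T (c+d)
rhs = a @ c + a @ d + b @ c + b @ d          # expanded form
print(lhs, rhs, np.isclose(lhs, rhs))        # the two agree
```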
📌 sum
$\mathbf{1}^Tx = x_1+x_2+\cdots+x_n$
📌 average / mean
$(\mathbf{1}/n)^Tx = (x_1+\cdots+x_n)/n$
📌 sum of squares
$x^Tx = x_1^2+\cdots+x_n^2$
📌 co-occurrence
If $a$ and $b$ are 0-1 vectors recording whether each of $n$ items occurs in two sets, then $a^Tb$ is the total number of co-occurrences:
$$
a=(0,1,1,1,1,1,1),\quad b= (1,0,1,0,1,0,0),\quad a^Tb = 2
$$
meaning there are 2 items (the 3rd and the 5th) that occur in both sets.
📌 weights / features / scores
📌 document sentiment analysis
Each word in a dictionary is given a sentiment value: $-1$ = negative, $0$ = neutral, $1$ = positive, e.g. "sad" is $-1$. Collect these values in a vector $t$, and let $x$ be the word-count vector of a document.
Then a measure of the document's sentiment is: $$ t^Tx $$
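A small sketch of this measure in NumPy; the 3-word dictionary, the sentiment vector `t`, and the word-count vector `x` are made-up examples:

```python
import numpy as np

# hypothetical dictionary: ["happy", "sad", "table"]
t = np.array([1, -1, 0])   # sentiment of each word: +1, -1, 0
x = np.array([2, 5, 1])    # word counts of a document over the same dictionary

sentiment = t @ x          # t^T x
print(sentiment)           # 2*1 + 5*(-1) + 1*0 = -3, i.e. overall negative
```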
📌 computer representation of numbers and vectors
Real numbers are stored on a computer in floating point format, typically using 64 bits (8 bytes) per number, so an $n$-vector takes roughly $8n$ bytes.
📌 floating point operation
When a computer carries out an arithmetic operation, the result is rounded to the nearest floating point number. The very small error is called round-off error. Now you understand why you can't compare floats for exact equality: the left-hand side and right-hand side of an identity are sometimes not exactly equal, even though the error between them is extremely small.
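A classic illustration in Python; instead of exact equality, compare within a tolerance (e.g. `math.isclose` or `np.allclose`):

```python
import math
import numpy as np

print(0.1 + 0.2 == 0.3)                 # False: round-off error
print(0.1 + 0.2)                        # 0.30000000000000004

# Compare within a small tolerance instead of exact equality.
print(math.isclose(0.1 + 0.2, 0.3))     # True
print(np.allclose(np.array([0.1]) + 0.2, 0.3))  # True
```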
📌 flop counts and complexity
flop = floating point operation (an addition, subtraction, multiplication, or division of two floating point numbers)
complexity has 2 meanings:
1. In theoretical computer science, `complexity` means the number of flops of the best method to carry out the computation, usually expressed in big-$O$ notation.
2. In this book, it means the number of flops required by a specific method.
📌 complexity of vector operation
- $n$ flops:
  - scalar-vector multiplication and division of an $n$-vector, e.g. $aV$
  - vector addition and subtraction of $n$-vectors, e.g. $P+V$
- $2n$ flops:
  - inner product of $n$-vectors, e.g. $P^TV = P_1V_1+\cdots+P_nV_n$: the multiplications take $n$ flops and the additions take $n-1$, but for simplicity we count it as $2n$ flops.
📌 complexity of sparse vector operation
Suppose $x, y$ are sparse vectors and $a$ is a scalar.
- $ax$ takes $\mathbf{nnz}(x)$ flops.
- $x+y$ takes
  - no more than $\min\{\mathbf{nnz}(x),\mathbf{nnz}(y)\}$ flops,
  - 0 flops if the nonzero entries of $x$ and $y$ do not overlap.
- $x^Ty$ takes
  - no more than $2\min\{\mathbf{nnz}(x),\mathbf{nnz}(y)\}$ flops,
  - 0 flops if their nonzero entries do not overlap.
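A small counting sketch with NumPy; here `np.count_nonzero` plays the role of $\mathbf{nnz}$ (for genuinely large sparse vectors one would use SciPy's sparse formats instead):

```python
import numpy as np

x = np.array([0.0, 3.0, 0.0, 0.0, -1.0, 0.0])
y = np.array([2.0, 0.0, 0.0, 0.0,  5.0, 0.0])

nnz_x = np.count_nonzero(x)   # nnz(x) = 2
nnz_y = np.count_nonzero(y)   # nnz(y) = 2

# Rough flop bounds from the notes above:
print("a*x costs about", nnz_x, "flops")
print("x+y costs at most", min(nnz_x, nnz_y), "flops")
print("x^T y costs at most", 2 * min(nnz_x, nnz_y), "flops")
```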
This chapter is mainly about linear functions and affine functions.
📌 The symbol $f:\mathbb{R}^n\to\mathbb{R}$
Use $f:\mathbb{R}^n\to\mathbb{R}$ to denote a function that maps a real $n$-vector to a real number (scalar).
📌 What is a function?
Therefore, $f(x)$ denotes the scalar value that the function $f$ gives for the $n$-vector argument $x$.
📌 What is it?
An inner product function is like:
$$
f(x)=a^Tx = a_1x_1+a_2x_2+\cdots+a_nx_n
$$
Since it is an inner product function, it can be read as a weighted sum of the entries of $x$, with weights $a_1,\ldots,a_n$.
📌 Superposition & Linearity
Suppose $x, y$ are $n$-vectors and $\alpha, \beta$ are scalars. A function $f$ satisfies superposition if $$ f(\alpha x+\beta y)=\alpha f(x)+\beta f(y). $$ A function that satisfies superposition is called linear.
📌 Super Formal Definition of Linear
The function $f$ is linear if it satisfies both:
- Homogeneity: $f(\alpha x)=\alpha f(x)$, emphasizing scaling
- Additivity: $f(x+y)=f(x)+f(y)$, emphasizing adding

A quick numeric check is sketched below.
📌 Inner Product Representation of a Linear Function
The logic goes both ways:

| Hypothesis/Conclusion | Relation | Hypothesis/Conclusion |
|---|---|---|
| A function is defined as the inner product of its argument with some fixed vector, $f(x)=a^Tx$. | 🔁 | The function is linear. |

We therefore say that a function is linear if and only if it can be expressed as $f(x)=a^Tx$ for some fixed vector $a$.
📌 Average
The average of an $n$-vector, $f(x)=(x_1+\cdots+x_n)/n=(\mathbf{1}/n)^Tx$, is a linear function.
📌 Maximum
The maximum of an $n$-vector, $f(x)=\max_k x_k$, is not a linear function (for $n\ge 2$).
📌 What is it?
An affine function is (a linear function) + (a CONSTANT):
$$
f:\mathbb{R}^n\to\mathbb{R},\qquad f(x)=a^Tx+b
$$
📌 Constraint on Superposition
For a linear function there is no constraint on superposition.
For an affine function there IS a constraint on superposition.
Superposition requires: $$ f(\alpha x+\beta y)=\alpha f(x)+\beta f(y) $$ And an affine function is: $$ f(x)=a^Tx+b $$ Expanding both sides, the constant term is $b$ on the left and $(\alpha+\beta)b$ on the right: $$ b=(\alpha+\beta)b $$ So the only way the two sides agree (for $b\neq 0$) is that $$ \alpha+\beta=1 $$ 🤔 How can we take advantage of this property?
OK, an affine function satisfies superposition if and only if the coefficients sum to one, $\alpha+\beta=1$.
📌 What for?
🎯 Application: for scalar-valued functions of $n$ variables, the first-order Taylor approximation gives an affine function that approximates the function near a given point.
📌 What is Taylor Approximation?
Suppose that $f:\mathbb{R}^n\to\mathbb{R}$ is differentiable. The first-order Taylor approximation of $f$ near the point $z$ is
$$
\hat{f}(x)=f(z)+\frac{\partial f}{\partial x_1}(z)(x_1-z_1)+\cdots+\frac{\partial f}{\partial x_n}(z)(x_n-z_n)
$$
- $\frac{\partial f}{\partial x_i}(z)$ denotes the partial derivative of $f$ with respect to its $i$-th argument, evaluated at the $n$-vector $z$.
- $\hat{f}$: the hat is a hint that it is an approximation of the function $f$.
- $\hat{f}(x;z)$ is sometimes written with a second argument to show the point $z$ at which the approximation is developed.
- $f(z)$: the first term of the Taylor approximation is a constant, while the others can be seen as the contributions to the change (from $f(z)$) due to the changes in the components of $x$ (from $z$).
📌 Affine Function
Apparently, since there is always a constant term, the Taylor approximation is an affine function of $x$; it can be written compactly as $\hat{f}(x)=f(z)+\grad f(z)^T(x-z)$, where:
- $f(z)$ is a constant: the value of the function when $x=z$
- $\grad f(z)$ is an $n$-vector, the gradient of $f$ at the point $z$: $\grad f(z)=\left(\frac{\partial f}{\partial x_1}(z),\ldots,\frac{\partial f}{\partial x_n}(z)\right)$
- $(x-z)$ is the deviation/perturbation of $x$ with respect to $z$
📌 What does it mean by approximation?
A simple example, for a function of one variable ($n=1$):
(figure) A function $f$ of one variable, and the first-order Taylor approximation $\hat{f}(x)=f(z)+f'(z)(x-z)$ at the point $z$.
📌 Example
Consider the function $f(x)=x_1+\exp(x_2-x_1)$, which is not linear. Find its first-order Taylor approximation near $z=(1,2)$.
- First, take the partial derivatives of this function: $\grad f(x)=\begin{bmatrix}1-\exp(x_2-x_1)\\ \exp(x_2-x_1)\end{bmatrix}$
- Second, substitute $z_1=1, z_2=2$; since $\exp(1)=e=2.7183$, the gradient $\grad f(z)$ at $z=(1,2)$ is:
$$ \grad f(z)=\begin{bmatrix} -1.7183\\ 2.7183 \end{bmatrix} $$
- $f(z)$ at $z=(1,2)$ is $1+e=3.7183$.
- The Taylor approximation therefore is:
$$ \hat{f}(x)=3.7183-1.7183(x_1-1)+2.7183(x_2-2) $$
How to measure whether it is a good approximation:

| $x$ | $f(x)$ | $\hat{f}(x)$ | $\|f(x)-\hat{f}(x)\|$ |
|---|---|---|---|
| (1.00, 2.00) | 3.7183 | 3.7183 | 0.0000 |
| (0.96, 1.98) | 3.7332 | 3.7326 | 0.0005 |
| (1.10, 2.11) | 3.8456 | 3.8455 | 0.0001 |
| (0.85, 2.05) | 4.1701 | 4.1119 | 0.0582 |
| (1.25, 2.41) | 4.4399 | 4.4032 | 0.0367 |
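A small sketch reproducing this example in NumPy; `f`, `grad_f`, and the sample points are taken from the worked example above:

```python
import numpy as np

f = lambda x: x[0] + np.exp(x[1] - x[0])

def grad_f(x):
    # partial derivatives of f, evaluated at x
    return np.array([1 - np.exp(x[1] - x[0]), np.exp(x[1] - x[0])])

z = np.array([1.0, 2.0])
f_hat = lambda x: f(z) + grad_f(z) @ (x - z)   # first-order Taylor approximation at z

for x in [(1.00, 2.00), (0.96, 1.98), (1.10, 2.11), (0.85, 2.05), (1.25, 2.41)]:
    x = np.array(x)
    print(x, round(f(x), 4), round(f_hat(x), 4), round(abs(f(x) - f_hat(x)), 4))
```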
📌 What is it? $$ \hat{y} = x^T\beta+v $$
- $x$, an $n$-vector, a.k.a. the feature vector
  - $x_1,x_2,\ldots,x_n$ are the regressors
- $\beta$, an $n$-vector: the weight vector / coefficient vector
- $v$, a scalar: the offset / intercept
- $\hat{y}$: the $\hat{\ }$ symbol indicates it is an estimate / prediction
- $y$: the dependent variable / output label
Special Indication:
- $\beta, v$ are both parameters for tuning this regression model.
- $\beta$ is a weight vector which indicates how strongly each input affects the prediction:
  - e.g. if $\beta_{13}$ is bigger, $x_{13}$ has a more significant effect on the prediction
  - e.g. if $\beta_{27}$ is smaller, $x_{27}$ has less effect on the prediction
📌 Simplified Regression Model Notation
Although I think this notation is kind of useless, here is its equation.
$$
\hat{y}=x^T\beta+v=\begin{bmatrix}1\\x\end{bmatrix}^T\begin{bmatrix}v\\\beta\end{bmatrix}
$$
The dimension of $\begin{bmatrix}1\\x\end{bmatrix}$ is $n+1$.
The dimension of $\begin{bmatrix}v\\\beta\end{bmatrix}$ is also $n+1$.
Literally no difference from the left-hand side.
📌 House price regression model
Suppose:
- $y$: the actual selling price of the house
- $x_1$: the house area
- $x_2$: the number of bedrooms
- $\hat{y}$: the estimated price
Then we have: $$ \hat{y}=x^T\beta+v=\beta_1x_1+\beta_2x_2+v $$ As a specific numerical example, consider the regression model parameters: $$ \beta=(148.73,-18.85), \quad v=54.40 $$
- $\beta_1>0$: easy to understand, house area :arrow_up_small:, price :arrow_up_small:
- $\beta_2<0$: this is harder to understand; maybe more bedrooms within the same area reads as lower-end (public rental) housing?
- $v>0$: when $x_1=0, x_2=0$ we get $\hat{y}=v=54.40$, which can be interpreted as the value of the lot (the land price)
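A minimal sketch of the prediction in NumPy, using the example parameters above; the input house `x` below is a made-up example:

```python
import numpy as np

beta = np.array([148.73, -18.85])   # weight vector from the example above
v = 54.40                           # offset

x = np.array([0.846, 1.0])          # hypothetical house: area x1 = 0.846, x2 = 1 bedroom
y_hat = beta @ x + v                # predicted price, x^T beta + v
print(round(y_hat, 2))
```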
📌 Euclidean Norm(L-2 Norm)
Suppose:
- $x$ is an $n$-vector
- $\norm{x}$ denotes its (Euclidean) norm: $\norm{x}=\sqrt{x_1^2+x_2^2+\cdots+x_n^2}$

Since this is nothing but the square root of the inner product of the vector with itself, we can also write:
$$
\norm{x}=\sqrt{x^Tx}
$$
Another notation of Euclidean Norm:
$$
\norm{x}_2
$$
The subscript $2$ is a reminder that the entries of $x$ are raised to the second power.
📌 Why do we use the norm?
:star: For measuring magnitude
The double bar notation indicates that the norm of a vector is a (numerical) measure of its magnitude (not considering orientation). Therefore, we can say a vector is small if its norm is a small number, and large if its norm is a large number.
📌 Properties of Norm
:one: Nonnegative homogeneity: $\norm{\beta x}=|\beta|\norm{x}$
:two: Triangle inequality (subadditivity): $\norm{x+y}\le\norm{x}+\norm{y}$
:three: Nonnegativity: $\norm{x}\ge 0$
:four: Definiteness: $\norm{x}=0$ only if $x=0$
📌 general norm
Any real-valued function of an $n$-vector that satisfies the four properties above is called a (general) norm.
There are a lot of concepts closely related to norm.
📌RMS(root-mean-square) $$ \bold{rms}(x)=\sqrt{\frac{x_1^2+\cdots+x_n^2}{n}}=\frac{\norm{x}}{\sqrt{n}} $$
The key idea of RMS is that it lets you compare (the typical entry size of) vectors of different dimensions. 🌟
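A small NumPy sketch computing the Euclidean norm and the RMS value (`np.linalg.norm` returns the L-2 norm by default); the vector `x` is a made-up example:

```python
import numpy as np

x = np.array([1.0, -2.0, 2.0, 0.0])

norm_x = np.linalg.norm(x)          # Euclidean (L-2) norm, sqrt(x^T x)
rms_x = norm_x / np.sqrt(len(x))    # rms(x) = ||x|| / sqrt(n)

print(norm_x)   # 3.0
print(rms_x)    # 1.5
```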
📌 Norm of a sum
Suppose you have 2 vectors $x$ and $y$. Then $$ \norm{x+y}=\sqrt{\norm{x}^2+2x^Ty+\norm{y}^2} $$
📌 Norm of a block vector
Suppose you have a block vector $d=(a,b,c)$; then $\norm{d}^2=\norm{a}^2+\norm{b}^2+\norm{c}^2$.
This is often read in reverse: the norm of a stacked vector is the norm of the vector formed from the norms of the subvectors. $$ \norm{(a,b,c)}=\sqrt{\norm{a}^2+\norm{b}^2+\norm{c}^2}=\norm{(\norm{a},\norm{b},\norm{c})} $$
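A quick numerical check of this identity, with arbitrary subvectors:

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([3.0])
c = np.array([0.0, -4.0, 1.0])

stacked = np.concatenate([a, b, c])                 # the block vector (a, b, c)
lhs = np.linalg.norm(stacked)
rhs = np.linalg.norm([np.linalg.norm(a),
                      np.linalg.norm(b),
                      np.linalg.norm(c)])
print(lhs, rhs, np.isclose(lhs, rhs))
```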
📌 Chebyshev Inequality
If $k$ of the entries of an $n$-vector $x$ satisfy $|x_i|\ge a$ (with $a>0$), then $k\le\norm{x}^2/a^2$: a vector cannot have too many entries that are large relative to its norm.