Added ch 14

This commit is contained in:
chris-admin 2025-09-13 16:17:35 +02:00
parent 82f8d8e906
commit f5bc43c59b


@ -122,6 +122,8 @@ Take all node embeddings that are in the neighbourhood and do similar steps as th
## Polynomial Filters
Each polynomial filter is order invariant: its output does not depend on the ordering chosen for the nodes
### Graph Laplacian
Let's set an order over nodes of a graph, where $A$ is the adjacency matrix:
@ -195,4 +197,97 @@ $$
> So this shows that the degree of the polynomial decides the maximum number of hops
> included during the filtering stage, as if it were defining a [kernel](./../7-Convolutional-Networks/INDEX.md#filters)
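
To make the hop interpretation concrete, here is a minimal NumPy sketch (the 4-node path graph and all names are illustrative assumptions): the entry $(L^{i})_{v,u}$ can be non-zero only when $u$ is within $i$ hops of $v$, so a degree-$d$ polynomial in $L$ mixes information from at most $d$ hops away.

```python
import numpy as np

# Path graph 0-1-2-3 (illustrative example): adjacency, degrees, Laplacian L = D - A.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# (L^i)[v, u] can be non-zero only if u is within i hops of v,
# so a degree-d polynomial in L mixes information from at most d hops.
for i in range(1, 4):
    reach = np.linalg.matrix_power(L, i) != 0
    print(f"degree {i}: node 0 sees nodes {np.flatnonzero(reach[0]).tolist()}")
```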
### ChebNet
The polynomial in ChebNet becomes:
$$
\begin{aligned}
p_{\vec{w}}(L) &= \sum_{i = 1}^{d} w_{i} T_{i}(\tilde{L}) \\
T_{i}(\cos\theta) &= \cos(i\theta) \\
\tilde{L} &= \frac{2L}{\lambda_{\max}(L)} - I_{n}
\end{aligned}
$$
- $T_{i}$ is the Chebyshev polynomial of the first kind
- $\tilde{L}$ is a rescaled version of $L$: dividing by its largest eigenvalue (and subtracting $I_{n}$)
keeps its eigenvalues in the range $[-1, 1]$. Moreover, $L$ has no negative eigenvalues, so it is
positive semi-definite
These polynomials are more stable: unlike plain powers of $L$, they do not explode at higher orders, since every $T_{i}$ stays bounded in $[-1, 1]$ on the rescaled spectrum
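
As a rough sketch of how such a filter could be evaluated (function and variable names are assumptions, not from the notes), the Chebyshev terms can be built with the recurrence $T_{0}(x) = 1$, $T_{1}(x) = x$, $T_{i}(x) = 2x\,T_{i-1}(x) - T_{i-2}(x)$ applied to the rescaled Laplacian:

```python
import numpy as np

def chebnet_filter(L, weights):
    """Evaluate p_w(L) = sum_{i=1}^{d} w_i T_i(L_tilde) on the Laplacian L.

    `weights` stands for the learned w_1, ..., w_d (here just a plain array);
    all names are illustrative, not taken from any library.
    """
    n = L.shape[0]
    lam_max = np.linalg.eigvalsh(L).max()
    L_tilde = 2.0 * L / lam_max - np.eye(n)       # spectrum rescaled into [-1, 1]

    # Chebyshev recurrence: T_0 = I, T_1 = L_tilde, T_i = 2 L_tilde T_{i-1} - T_{i-2}
    T_prev, T_curr = np.eye(n), L_tilde
    filt = np.zeros_like(L)
    for w in weights:                             # terms i = 1, ..., d
        filt += w * T_curr
        T_prev, T_curr = T_curr, 2.0 * L_tilde @ T_curr - T_prev
    return filt
```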
### Embedding Computation
<!-- TODO: Read PDF 14 Anelli from 81 to 83 -->
## Other Methods
- <span style="color:skyblue">Learnable parameters</span>
- <span style="color:orange">Embeddings of node v</span>
- <span style="color:violet">Embeddings of neighbours of v</span>
### Graph Convolutional Networks
$$
\textcolor{orange}{h_{v}^{(k)}} =
\textcolor{skyblue}{f^{(k)}} \left(
\underbrace{\textcolor{skyblue}{W^{(k)}} \cdot
\frac{
\sum_{u \in \mathcal{N}(v)} \textcolor{violet}{h_{u}^{(k-1)}}
}{
|\mathcal{N}(v)|
}}_{\text{mean of previous neighbour embeddings}} + \underbrace{\textcolor{skyblue}{B^{(k)}} \cdot
\textcolor{orange}{h_{v}^{(k - 1)}}}_{\text{previous embeddings}}
\right) \forall v \in V
$$
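
A minimal NumPy sketch of this update, assuming dense matrices, row-wise embeddings and a tanh non-linearity (the shapes, names and activation are illustrative choices, not part of the notes):

```python
import numpy as np

def gcn_layer(A, H_prev, W, B, f=np.tanh):
    """One GCN update for all nodes at once.

    A: (n, n) adjacency matrix, H_prev: (n, d_in) previous embeddings,
    W, B: (d_in, d_out) learnable matrices, f: non-linearity.
    Shapes, names and the tanh choice are illustrative assumptions.
    """
    deg = np.clip(A.sum(axis=1, keepdims=True), 1, None)  # |N(v)|, guarded against isolated nodes
    neigh_mean = (A @ H_prev) / deg                        # mean of previous neighbour embeddings
    return f(neigh_mean @ W + H_prev @ B)                  # add the self term, then apply f
```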
### Graph Attention Networks
$$
\textcolor{orange}{h_{v}^{(k)}} =
\textcolor{skyblue}{f^{(k)}} \left(
\textcolor{skyblue}{W^{(k)}} \cdot \left[
\underbrace{
\sum_{u \in \mathcal{N}(v)} \alpha^{(k-1)}_{v,u}
\textcolor{violet}{h_{u}^{(k-1)}}
}_{\text{weighted mean of previous neighbour embeddings}} +
\underbrace{\alpha^{(k-1)}_{v,v}
\textcolor{orange}{h_{v}^{(k-1)}}}_{\text{previous embeddings}}
\right] \right) \forall v \in V
$$
where
$$
\alpha^{(k)}_{v,u} = \frac{
\textcolor{skyblue}{A^{(k)}}(
\textcolor{orange}{h_{v}^{(k)}},
\textcolor{violet}{h_{u}^{(k)}}
)
}{
\sum_{w \in \mathcal{N}(v)} \textcolor{skyblue}{A^{(k)}}(
\textcolor{orange}{h_{v}^{(k)}},
\textcolor{violet}{h_{w}^{(k)}}
)
} \forall (v, u) \in E
$$
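
A sketch of this attention-based update in NumPy, where `attn_scores` stands in for the learnable scoring function $A^{(k)}$ (its actual form, e.g. a small MLP, is left abstract) and, following the usual GAT convention, $v$ is included alongside its neighbours so that $\alpha^{(k)}_{v,v}$ is well defined; all names and shapes are illustrative:

```python
import numpy as np

def gat_layer(A, H_prev, W, attn_scores, f=np.tanh):
    """One attention-based update, node by node (sketch, not an optimised implementation).

    `attn_scores(h_v, h_u)` plays the role of the learnable scoring function A^{(k)}.
    """
    n, d_out = H_prev.shape[0], W.shape[1]
    H_new = np.zeros((n, d_out))
    for v in range(n):
        idx = np.flatnonzero(A[v]).tolist() + [v]          # neighbours of v, plus v itself
        scores = np.array([attn_scores(H_prev[v], H_prev[u]) for u in idx])
        alpha = scores / scores.sum()                      # normalised attention coefficients
        weighted = sum(a * H_prev[u] for a, u in zip(alpha, idx))
        H_new[v] = f(W.T @ weighted)                       # W applied to the weighted combination
    return H_new
```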
### Graph Sample and Aggregate (GraphSAGE)
<!-- TODO: See PDF 14 Anelli from 98 to 102 -->
### Graph Isomorphism Network (GIN)
$$
\textcolor{orange}{h_{v}^{(k)}} =
\textcolor{skyblue}{f^{(k)}}
\left(
\sum_{u \in \mathcal{N}(v)}
\textcolor{violet}{h_{u}^{(k - 1)}} +
(
1 +
\textcolor{skyblue}{\epsilon^{(k)}}
) \cdot \textcolor{orange}{h_{v}^{(k - 1)}}
\right)
\forall v \in V
$$
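
A corresponding sketch for the GIN update, where `f` plays the role of the learnable $f^{(k)}$ (an MLP in the original GIN paper) and all names are illustrative:

```python
import numpy as np

def gin_layer(A, H_prev, eps, f):
    """One GIN update: sum of neighbour embeddings plus (1 + eps) times the self embedding.

    `f` plays the role of the learnable f^{(k)}; names and shapes are illustrative.
    """
    neigh_sum = A @ H_prev                       # sum_{u in N(v)} h_u^{(k-1)} for every v
    return f(neigh_sum + (1.0 + eps) * H_prev)   # scaled self term, then the learnable map
```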