## Graph Thinking
https://hackmd.io/@teacher144123/SyuvT5Lsv
---
## Graph Representation Learning ch1,3
- Presenter: Dragon Chen
- Date: 12/04
---
## Intro
- Title: Graph Representation Learning ch1,3
- Website: [here](https://www.cs.mcgill.ca/~wlh/grl_book/) (a PDF is also available on qsirch)

----
## What is a Graph?
Review your data structures course <!-- .element: class="fragment" -->
Loading... <!-- .element: class="fragment" -->
**A collection of nodes**<!-- .element: class="fragment" -->
**connected by edges**<!-- .element: class="fragment" -->
----

- Zachary's karate club dataset
----
Basic types:
- undirected
- directed
- weighted
----
### Recap: Adjacency Matrix

An undirected graph has a **symmetric** adjacency matrix
A weighted graph stores **edge weights** instead of 0/1 entries
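A minimal NumPy sketch (the 4-node graph here is made up):

```python
import numpy as np

# Hypothetical undirected graph on 4 nodes with edges (0,1), (0,2), (2,3)
A = np.zeros((4, 4))
for u, v in [(0, 1), (0, 2), (2, 3)]:
    A[u, v] = A[v, u] = 1        # undirected => symmetric matrix

assert (A == A.T).all()

# Weighted variant: entries store edge weights instead of 0/1
W = np.zeros((4, 4))
for u, v, w in [(0, 1, 0.5), (0, 2, 2.0), (2, 3, 1.5)]:
    W[u, v] = W[v, u] = w
```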
----
Advanced kinds of graphs:
- multi-relation
- heterogeneous
- multiplex
- ...
----
### Multi-Relation

Edges can have different types
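With edge types, the adjacency matrix extends to an adjacency tensor $A \in \mathbb{R}^{|V| \times |R| \times |V|}$. A toy sketch (the nodes and relations here are made up):

```python
import numpy as np

num_nodes, num_relations = 3, 2          # hypothetical sizes
A = np.zeros((num_nodes, num_relations, num_nodes))

# A[u, tau, v] = 1 means edge (u, tau, v) exists
A[0, 0, 1] = 1                           # node 0 --relation 0--> node 1
A[1, 1, 2] = 1                           # node 1 --relation 1--> node 2
```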
Note:
Source: Zero-shot Ingredient Recognition by Multi-Relational Graph Convolutional Network, AAAI 2020
----
### Heterogeneous

Nodes have types, and certain edges only connect nodes of certain types.
Note:
Source: Graph Classification in Heterogeneous Networks
---
## Machine Learning is coming
----
### Task Types
- Node Classification
- Relation Prediction
- Clustering and Community Detection
- Graph Classification, Regression and Clustering
----
- **Node Classification**
Classify nodes into categories
e.g. use a paper's abstract and citations to predict its subject category
<font color='red'>Node classification on graphs is usually **semi-supervised learning**, since building the graph requires all nodes, labeled and unlabeled</font><!-- .element: class="fragment" -->
- **Relation Prediction**
Predict missing edges between nodes.
e.g. predict whether two Facebook users know each other in real life
----
- **Clustering and Community Detection**
Clustering nodes
- **Graph Classification, Regression and Clustering**
Predict properties of an entire graph.
e.g. predict toxicity (regression) from a molecule graph
---
## Node Embedding
Encode nodes as **low-dimensional vectors** that summarize their position in the graph and the structure of their local neighborhood
----
### Encoder-Decoder

$DEC(ENC(u), ENC(v)) \approx S[u, v]$
$DEC(z_u, z_v) \approx S[u, v]$
$S[u, v]$ is a similarity value, e.g. 1 if $u$ and $v$ are connected, 0 otherwise
**The choices of ENC, DEC, and similarity S define different algorithms**<!-- .element: class="fragment" -->
----
### Encoder first
$z_u = ENC(u) = Z[u]$
$Z \in \mathbb{R}^{|V| \times d}$
Each row of $Z$ contains the embedding of one node
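A minimal sketch of this "shallow" lookup encoder, paired with an inner-product decoder as one common choice of DEC (the sizes here are made up):

```python
import numpy as np

num_nodes, d = 34, 8                       # e.g. karate club, 8-dim embeddings
Z = np.random.randn(num_nodes, d) * 0.1    # the trainable embedding matrix

def enc(u):
    return Z[u]                            # shallow encoder: a row lookup

def dec(z_u, z_v):
    return z_u @ z_v                       # one common DEC: inner product

score = dec(enc(0), enc(33))               # approximates S[0, 33] after training
```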
----
### Algorithms

There are two main kinds of approaches
- Factorization-based
- Random walk embeddings
----
### Laplacian Eigenmaps
$DEC(z_u, z_v) = ||z_u - z_v||^2_2$
$Loss = \sum_{(u, v) \in D} DEC(z_u, z_v)\cdot S[u, v]$
It builds on ideas from spectral clustering.
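In one standard variant, minimizing this loss (under orthogonality constraints) reduces to taking the smallest non-trivial eigenvectors of the graph Laplacian $L = D - A$. A minimal NumPy sketch, assuming an unweighted symmetric $A$ (the toy graph is made up):

```python
import numpy as np

def laplacian_eigenmaps(A, d):
    """Embeddings from the d smallest non-trivial eigenvectors of L = D - A."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    return eigvecs[:, 1:d + 1]             # skip the trivial constant eigenvector

# Toy graph: two triangles joined by a single edge
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1
Z = laplacian_eigenmaps(A, d=2)            # 6 x 2 node embeddings
```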
----
### DeepWalk

$DEC(z_u, z_v) \approx P(v|u)$
$Loss = \sum_{(u, v) \in D} -\log(DEC(z_u, z_v))$
**Uses the probability of visiting $v$ on a random walk starting from $u$.**
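A sketch of the random-walk half of DeepWalk (the graph and hyperparameters here are made up); the walks are then fed to a word2vec-style skip-gram model to fit the $z_u$:

```python
import random

def random_walk(adj, start, length):
    """One uniform random walk over an adjacency-list graph."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = adj[walk[-1]]
        if not neighbors:
            break
        walk.append(random.choice(neighbors))
    return walk

# Hypothetical adjacency list
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
walks = [random_walk(adj, u, length=10) for u in adj for _ in range(5)]
# In DeepWalk these walks play the role of "sentences" for skip-gram training.
```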
Note:
Source: Billion-scale Commodity Embedding for E-commerce Recommendation in Alibaba
---

### Thank you for listening 🐧
{"metaMigratedAt":"2023-06-15T16:35:34.282Z","metaMigratedFrom":"YAML","title":"Graph Representation Learning ch1,3","breaks":true,"slideOptions":"{\"theme\":\"serif\",\"transition\":\"slide\"}","contributors":"[{\"id\":\"1155b8f4-1581-4900-adcf-1b4abbb2c7fd\",\"add\":4317,\"del\":860}]"}