## Graph Thinking

https://hackmd.io/@teacher144123/SyuvT5Lsv

---

## Graph Representation Learning ch1,3

- Presenter: Dragon Chen
- Date: 12/04

---

## Intro

- Title: Graph Representation Learning ch1-3
- Website: [here](https://www.cs.mcgill.ca/~wlh/grl_book/) (a PDF is also available on qsirch)

![](https://i.imgur.com/cLGjzRU.png =300x)

----

## What is a Graph?

Review your data structures course <!-- .element: class="fragment" -->

Loading... <!-- .element: class="fragment" -->

**A collection of nodes**<!-- .element: class="fragment" -->

**connected by some edges**<!-- .element: class="fragment" -->

----

![](https://i.imgur.com/Bvcz2u1.png)

- Zachary's karate club dataset

----

Basic types:

- undirected
- directed
- weighted

----

### Recap: Adjacency Matrix

![](https://i.imgur.com/1Df57Ew.png)

An undirected graph has a **symmetric** adjacency matrix

A weighted graph stores **edge weights** instead of 0/1 entries

(a small NumPy sketch is in the appendix)

----

Advanced kinds of graphs:

- multi-relational
- heterogeneous
- multiplex
- ...

----

### Multi-Relational

![](https://i.imgur.com/UCpa3Zf.png =1000x)

Edges have different types

Note: Source: Zero-shot Ingredient Recognition by Multi-Relational Graph Convolutional Network, AAAI 2020

----

### Heterogeneous

![](https://i.imgur.com/XbfzprS.png =800x)

Certain edges only connect nodes of certain types.

Note: Source: Graph Classification in Heterogeneous Networks

---

## Machine Learning is coming

----

### Task types

- Node Classification
- Relation Prediction
- Clustering and Community Detection
- Graph Classification, Regression, and Clustering

----

- **Node Classification**
  Classify nodes into categories,
  e.g. use a paper's abstract and citations to classify its category
  <font color='red'>Node classification is usually treated as **semi-supervised learning**, because all nodes are needed to build the graph</font><!-- .element: class="fragment" -->
- **Relation Prediction**
  Predict an edge between nodes,
  e.g. whether you actually know a Facebook friend in real life

----

- **Clustering and Community Detection**
  Cluster nodes into communities
- **Graph Classification, Regression, and Clustering**
  Predict properties of entire graphs,
  e.g. predict a molecule's toxicity (regression) from its graph

Note: molecule (分子)

---

## Node Embedding

Encode nodes as **low-dimensional vectors** that summarize their graph position and local structure

----

### Encoder-Decoder

![](https://i.imgur.com/2EYoPF6.png =800x)

$DEC(ENC(u), ENC(v)) = S[u, v]$

$DEC(z_u, z_v) = S[u, v]$

$S[u, v]$ is a similarity value, e.g. 1 if connected, 0 if not

**The choices of ENC, DEC, and similarity define the different algorithms**<!-- .element: class="fragment" -->

----

### Encoder first

$z_u = ENC(u) = Z[u]$

$Z \in \mathbb{R}^{|V| \times d}$

Each row of $Z$ contains the embedding of a given node (see the appendix for a sketch)

----

### Algorithms

![](https://i.imgur.com/ob6A6pc.png)

There are two kinds of approaches:

- factorization-based
- random-walk embeddings

----

### Laplacian Eigenmaps

$DEC(z_u, z_v) = ||z_u - z_v||^2_2$

$Loss = \sum_{(u, v) \in D} DEC(z_u, z_v) \cdot S[u, v]$

It builds on ideas from spectral clustering. (sketch in the appendix)

----

### DeepWalk

![](https://i.imgur.com/YSI6B1w.png =1000x)

$DEC(z_u, z_v) = P(v|u)$

$Loss = \sum_{(u, v) \in D} -\log(DEC(z_u, z_v))$

**Uses the probability of visiting $v$ on a random walk starting from $u$.** (sketch in the appendix)

Note: Source: Billion-scale Commodity Embedding for E-commerce Recommendation in Alibaba

---

![](https://i.imgur.com/Wm4aGq0.jpg =300x)

### Thank you for listening 🐧
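---

## Appendix: Code Sketches

----

### Adjacency Matrix in NumPy

A minimal sketch of the adjacency-matrix recap, assuming a hypothetical 4-node edge list with made-up weights: an undirected matrix comes out symmetric, and a weighted variant stores weights instead of 1s.

```python
import numpy as np

# Hypothetical 4-node undirected graph as an edge list.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
weights = [0.5, 2.0, 1.0, 3.0]  # made-up edge weights

n = 4
A = np.zeros((n, n))
W = np.zeros((n, n))
for (u, v), w in zip(edges, weights):
    A[u, v] = A[v, u] = 1  # undirected: mirror every edge
    W[u, v] = W[v, u] = w  # weighted: store the weight instead of 1

assert np.array_equal(A, A.T)  # undirected => symmetric
```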
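----

### Shallow Encoder Sketch

A sketch of the embedding-lookup encoder $z_u = Z[u]$, with an inner-product decoder as one possible similarity choice (the decoder and the sizes, 34 nodes and $d = 16$, are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, d = 34, 16                 # e.g. the karate club has 34 nodes
Z = rng.normal(size=(num_nodes, d))   # Z in R^{|V| x d}, one row per node

def enc(u):
    # Shallow encoder: a plain lookup, z_u = Z[u].
    return Z[u]

def dec(z_u, z_v):
    # One common decoder choice: inner-product similarity.
    return z_u @ z_v

score = dec(enc(0), enc(33))  # similarity score between nodes 0 and 33
```

In practice $Z$ is the trainable parameter: the loss from the chosen decoder/similarity pair is minimized with respect to these rows.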
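----

### Laplacian Eigenmaps Sketch

A sketch under the usual spectral-clustering setup: minimizing $\sum_{(u, v)} ||z_u - z_v||^2_2 \cdot S[u, v]$ (with scale constraints) is solved by the bottom non-trivial eigenvectors of the graph Laplacian; here `S` is assumed to be a symmetric similarity matrix such as the adjacency matrix.

```python
import numpy as np

def laplacian_eigenmaps(S, d):
    # S: symmetric similarity (e.g. adjacency) matrix.
    D = np.diag(S.sum(axis=1))
    L = D - S                       # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    # Drop the trivial constant eigenvector (eigenvalue ~0);
    # the next d eigenvectors give the node embeddings.
    return vecs[:, 1:d + 1]
```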
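----

### DeepWalk-style Sketch

A minimal sketch of the random-walk decoder: $(u, v)$ training pairs come from short random walks, and $DEC(z_u, z_v)$ is a softmax over inner products approximating $P(v|u)$. Real DeepWalk trains this with skip-gram and negative sampling; the dense softmax here is only for illustration.

```python
import numpy as np

def random_walk(adj_list, start, length, rng):
    # Uniform random walk of fixed length from `start`.
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(adj_list[walk[-1]]))
    return walk

def dec(Z, u, v):
    # DEC(z_u, z_v) = exp(z_u . z_v) / sum_k exp(z_u . z_k) ~ P(v|u)
    scores = Z @ Z[u]
    scores -= scores.max()  # for numerical stability
    probs = np.exp(scores) / np.exp(scores).sum()
    return probs[v]

def loss(Z, pairs):
    # Sum of -log DEC(z_u, z_v) over walk co-occurrence pairs.
    return sum(-np.log(dec(Z, u, v)) for u, v in pairs)
```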
{"metaMigratedAt":"2023-06-15T16:35:34.282Z","metaMigratedFrom":"YAML","title":"Graph Representation Learning ch1,3","breaks":true,"slideOptions":"{\"theme\":\"serif\",\"transition\":\"slide\"}","contributors":"[{\"id\":\"1155b8f4-1581-4900-adcf-1b4abbb2c7fd\",\"add\":4317,\"del\":860}]"}
    538 views