Denken

Graph Neural Network – Getting Started – 1.0

Posted on November 3, 2022 (updated January 31, 2023) by Aritra Sen

In this new blog post series we will talk about Graph Neural Networks, which, according to the ‘State of AI Report 2021’, is one of the hottest fields of AI research. This series will be mostly implementation oriented; however, the required theory will be covered as much as possible, and resources to understand the concepts in detail will be provided.

Nowadays we can see graph data everywhere: in medicine and biology (molecules and DNA–protein interactions can be represented as graphs), in social networks, on the web (the World Wide Web itself is one huge graph), and in navigation systems such as Google Maps and Uber, which use graphs to solve road-network and routing problems. Recommendation systems have also started using graphs as the primary object to provide optimal recommendations.

To begin with, a graph is a collection of nodes (nodes are also called vertices) and edges; an example is given below –

Graph

V (Vertices) = {0, 1, 2, 3}

E (Edges) = {(0,1), (1,2), (2,3), (0,3)}

G (Graph) = {V, E}
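This example graph can be built in plain Python as an adjacency list (a dictionary mapping each node to its neighbours) –

```python
# Build the example graph G = {V, E} as an adjacency list.
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (0, 3)]

adj_list = {v: [] for v in V}
for u, v in E:
    adj_list[u].append(v)
    adj_list[v].append(u)  # undirected: store both directions

print(adj_list)
# {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
```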

Furthermore, we will talk about different types of graphs and how graphs are generally represented.
Graphs are generally classified as below –

Directed graph:
A graph made up of a set of vertices connected by directed edges, where each edge points from one vertex to another.

Undirected graph:
A graph whose edges do not have any direction is called an undirected graph.

An adjacency matrix (a 2D matrix) is mostly used to represent a graph, where rows and columns denote vertices. The values of this matrix are defined as Aij = 1 if there is an edge from vertex i to vertex j, and Aij = 0 otherwise. For an undirected graph the matrix is symmetric (Aij = Aji).

Adjacency Matrix Representation
Adj Matrix for Directed and Undirected graph
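The adjacency matrices for the directed and undirected versions of the example graph can be built from the edge list like this –

```python
import numpy as np

V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (0, 3)]

# Directed: A[i][j] = 1 only for an edge i -> j
A_directed = np.zeros((len(V), len(V)), dtype=int)
for i, j in E:
    A_directed[i, j] = 1

# Undirected: both directions are set, so the matrix is symmetric
A_undirected = A_directed + A_directed.T

print(A_undirected)
```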

Example:
Representing the caffeine molecule as a graph –

Molecules can be thought of as a group of atoms held together by chemical bonds. Atoms correspond to different chemical elements such as carbon (C), oxygen (O), nitrogen (N) or hydrogen (H). Bonds between atoms can also be of different types, such as single bonds or double bonds.

We can represent the molecule with an undirected graph that has a node label matrix, which is a one-hot representation of the atom type of each node, along with an edge label matrix, where each row is a one-hot representation of the associated edge type (single or double bond), as shown below –

Caffeine Molecule Representation (Source : Sebastian Raschka)
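As a small sketch of this idea (using a tiny hypothetical fragment rather than the full caffeine molecule), the one-hot node label and edge label matrices can be built like this –

```python
import numpy as np

# Assumed atom vocabulary: columns are C, O, N, H (in that order).
atom_types = ["C", "O", "N", "H"]
atoms = ["C", "O", "H", "H"]  # hypothetical fragment, one row per atom

# Node label matrix: one-hot atom type per node
node_labels = np.zeros((len(atoms), len(atom_types)), dtype=int)
for row, atom in enumerate(atoms):
    node_labels[row, atom_types.index(atom)] = 1

# Edge label matrix: one-hot bond type per edge (single or double)
bond_types = ["single", "double"]
bonds = ["double", "single", "single"]  # hypothetical bonds, one row per edge
edge_labels = np.zeros((len(bonds), len(bond_types)), dtype=int)
for row, bond in enumerate(bonds):
    edge_labels[row, bond_types.index(bond)] = 1

print(node_labels)
print(edge_labels)
```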

GNNs can be used to solve multiple machine learning tasks with different kinds of prediction problems, as shown below –

Different Prediction Tasks (Source : DeepFindr – YouTube)

Why is working with graphs different from traditional machine learning problems?

1. Graphs exist in non-Euclidean space (neither 2D nor 3D), which makes graph data harder to analyze or visualize.

2. Graphs do not have any fixed form; they are dynamic in nature. Two graphs that look different can share the same adjacency matrix, and the same graph can be represented by different adjacency matrices (by varying the ordering of the nodes). For these reasons, while designing a solution for a graph problem, we have to build in a strict prior of permutation invariance (meaning the ordering of the nodes does not affect the output).
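The node-ordering point can be verified directly: relabelling the nodes of the earlier 4-node cycle with a permutation matrix P gives a different adjacency matrix (A' = P A Pᵀ) for the same graph, while graph-level properties such as the degree distribution stay unchanged. The particular permutation below is just an arbitrary example –

```python
import numpy as np

# Adjacency matrix of the 4-node cycle graph from earlier
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

# Relabel the nodes with a permutation matrix P: A' = P @ A @ P.T
perm = [2, 0, 3, 1]                # arbitrary reordering of the nodes
P = np.eye(4, dtype=int)[perm]
A_perm = P @ A @ P.T

print(np.array_equal(A, A_perm))   # False: same graph, different matrix
# Graph invariants such as the degree distribution are unchanged:
print(sorted(A.sum(axis=0)) == sorted(A_perm.sum(axis=0)))  # True
```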

To address the issues mentioned above, different types of GNN architectures have been developed, such as –

GCNs (Graph Convolutional Networks)
GAT (Graph Attention Network)

In the next post we will go through the very basics of the message passing technique in GCNs and will try to implement a graph machine learning problem.

Thanks for reading; please comment if you have any questions.

Category: Machine Learning, Python

