Graph-of-Models - Literature Review 3 - and they call it LLM+KG

Bro, hear me out. I feel like my Graph-of-Models dream is falling apart. There is a field called LLM+KG that seems very close to my idea, so I think it's worth seeing what it is.

After a day, I figured out that I can build my idea by extending from this field, to make the most efficient use of the input data and of the methods that handle the logic.

Surveys

What is LLM+KG?

LLM, or Large Language Models, are all the hype nowadays with the appearance of ChatGPT, Gemini, and Bing AI (I heard about some shit called “vibe-coding” based on Claude, but don’t pay attention to it yet). To me it’s just a bunch of tensorflow files and json files eyes roll.

KG, or Knowledge Graph, is a structure that represents entities and their descriptions, for better data integration and insights (Hogan et al., 2021).
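To make that concrete: at its simplest, a KG is just a set of (head, relation, tail) triples you can query. A minimal sketch (the facts below are made-up examples, not from any real KG):

```python
# A toy knowledge graph as a set of (head, relation, tail) triples.
# The facts below are just illustrative examples.
kg = {
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "field", "Physics"),
    ("Warsaw", "capital_of", "Poland"),
}

def neighbors(entity):
    """All (relation, tail) facts whose head is the given entity."""
    return {(r, t) for (h, r, t) in kg if h == entity}

print(sorted(neighbors("Marie Curie")))  # → [('born_in', 'Warsaw'), ('field', 'Physics')]
```

Real KGs add typed entities, schemas, and descriptions on top, but the triple store is the core.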

Why LLM+KG?

Based on the survey (Pan et al., 2024), the combination of LLMs and KGs is very promising: each can compensate for the other's weaknesses with its strengths.

According to the authors, there are three types of frameworks developing in this direction:

  1. KG-enhanced LLMs: incorporate KGs during pre-training to boost the power of LLMs
  2. LLM-augmented KGs: use the power of LLMs for various KG tasks
  3. Synergized LLMs + KG: bidirectional reasoning that enhances both

After surfing for a while, with XiaoxinHe/Awesome-Graph-LLM as my main resource, I picked three more works to research: (Cheng et al., 2024), (Zhang & Soh, 2024), and (Jiang et al., 2024). Together with (Du et al., 2024), which I already talked about in the previous post, let's make a table to compare the works:

Table: Comparison between 4 chosen works.

| Metric | GaCLLM (Du et al., 2024) | ReaDi (Cheng et al., 2024) | EDC (Zhang & Soh, 2024) | KG-FIT (Jiang et al., 2024) |
|---|---|---|---|---|
| Problem & Methodology | GCN-based LLM adaptation | Structured reasoning refinement | LLM-to-Graph + Retrieval-Augmented Generation | Hierarchical graph fine-tuning |
| Code Structure | No code | Well-structured | Structured, but dataset issues | Decent structure, nothing remarkable |
| Reputation | 15 citations | 23 citations / 12 GitHub stars | 52 citations / 122 GitHub stars | 11 citations / 112 GitHub stars |
| Last Update | 404 Not Found | ~1 year ago | 11 months ago | 9 months ago |

So in this post, we will focus on edc (Zhang & Soh, 2024) and KG-FIT (Jiang et al., 2024).

edc

In its current state, edc (Zhang & Soh, 2024) is likely an LLM-augmented KGs work, but I believe there is potential to make it a Synergized LLMs + KG one. The purpose of this work is to automatically create KGs, making them practical for real-world applications. It proposes a framework with three steps: Extract-Define-Canonicalize.

  • Extract: take information from the dataset and convert it into relation triplets [Object A, Relationship of A and B, Object B].
  • Define: write a definition for each component of the schema.
  • Canonicalize: use the schema definitions from the Define step to standardize the triplets.
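The three steps above chain together into a pipeline. Here is a hedged sketch of how I picture the flow, pieced together from the step names; `llm` stands in for any chat-model call, and every prompt and helper name is my own guess, not code from the edc repository:

```python
# Hedged sketch of the Extract-Define-Canonicalize flow. `llm` is a
# stand-in for any chat-model call; prompts and helper names here are
# my own guesses, not code from the edc repository.

def extract(llm, text):
    """Extract: turn raw text into open [subject, relation, object] triplets."""
    return llm(f"Extract relation triplets from: {text}")

def define(llm, triplets):
    """Define: write a one-sentence definition for each relation that appeared."""
    relations = {rel for (_, rel, _) in triplets}
    return {rel: llm(f"Define the relation '{rel}' in one sentence.")
            for rel in relations}

def canonicalize(llm, triplets, definitions):
    """Canonicalize: map each open relation onto a standardized schema
    relation, guided by the definitions from the Define step."""
    result = []
    for (subj, rel, obj) in triplets:
        canonical = llm(f"Given {definitions[rel]!r}, pick the canonical "
                        f"schema relation for '{rel}'.")
        result.append((subj, canonical, obj))
    return result

def edc_pipeline(llm, text):
    triplets = extract(llm, text)
    definitions = define(llm, triplets)
    return canonicalize(llm, triplets, definitions)
```

Because each stage only talks to the next through plain data (triplets and definition strings), you could plug a different model into each stage.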

The Schema Retriever of this work is what impressed me. It's a trained model specialized in extracting the schema components relevant to the input text. The work is divided step by step, and in each step you can use a different model. That's challenging, but also a great chance to tailor-make and optimize the workflow.
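To get a feel for what "retrieving schema components relevant to input text" means, here is a toy stand-in. The real Schema Retriever is a trained embedding model; I'm substituting plain bag-of-words cosine similarity, and the schema entries below are made up for illustration:

```python
from collections import Counter
from math import sqrt

# Toy stand-in for a schema retriever: rank schema relations by how
# well their definitions match the input text, using bag-of-words
# cosine similarity instead of a trained embedding model.

def vectorize(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve_schema(text, schema_defs, top_k=2):
    """Return the top_k schema relations whose definitions best match the text."""
    query = vectorize(text)
    ranked = sorted(schema_defs,
                    key=lambda rel: cosine(query, vectorize(schema_defs[rel])),
                    reverse=True)
    return ranked[:top_k]

schema = {
    "employed_by": "a person works for a company or organization",
    "located_in": "a place is situated inside a larger place",
    "married_to": "two people are spouses",
}
print(retrieve_schema("Alice works for Acme", schema, top_k=1))  # → ['employed_by']
```

Swap the similarity function for a trained embedder and this becomes the shape of the real thing: text in, relevant schema slice out.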

KG-FIT

KG-FIT, or Knowledge Graph Fine-Tuning (Jiang et al., 2024), has a different mindset from edc (Zhang & Soh, 2024). It focuses on fine-tuning, with a powerful LLM, to improve the KG. The steps are less complex, and the input-data processing step is also easier for devs. My concerns are the graph traversal and the size of the LLM, as my resources aren't enough to fine-tune a large model.
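To sketch the rough idea as I read it: start the entity embeddings from LLM text embeddings, then fine-tune them on the KG's own triples. Everything below is my own toy (the "LLM embedding" is faked with seeded random vectors, and I use a simple TransE-style objective, score(h, r, t) = -||h + r - t||²), not the paper's code:

```python
import random

# Toy sketch: initialize entity embeddings from (faked) LLM text
# embeddings, then fine-tune them on KG triples with a TransE-style
# objective, score(h, r, t) = -||h + r - t||^2.

DIM = 8

def llm_embed(name):
    """Stand-in for an LLM text embedding of an entity name."""
    rng = random.Random(name)
    return [rng.uniform(-1.0, 1.0) for _ in range(DIM)]

def finetune(triples, lr=0.05, epochs=200):
    entities = {e for h, _, t in triples for e in (h, t)}
    relations = {r for _, r, _ in triples}
    E = {e: llm_embed(e) for e in entities}   # LLM-initialized entities
    R = {r: [0.0] * DIM for r in relations}
    for _ in range(epochs):
        for h, r, t in triples:
            # gradient step on ||h + r - t||^2: pull h + r toward t
            diff = [E[h][i] + R[r][i] - E[t][i] for i in range(DIM)]
            for i in range(DIM):
                E[h][i] -= lr * diff[i]
                R[r][i] -= lr * diff[i]
                E[t][i] += lr * diff[i]
    return E, R

def score(E, R, h, r, t):
    """Higher is better; 0 is a perfect fit."""
    return -sum((E[h][i] + R[r][i] - E[t][i]) ** 2 for i in range(DIM))
```

Even this toy shows where my resource worry comes from: the real thing does this over a whole hierarchy of entities with embeddings from a large model, not an 8-dimensional list.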

Maybe because I have already fallen for edc, I investigated this work less.


I feel like I can integrate edc into my next action :>

References

  1. Knowledge graphs
    Aidan Hogan, Eva Blomqvist, Michael Cochez, and 8 more authors
    ACM Computing Surveys (CSUR), Sep 2021
  2. Unifying large language models and knowledge graphs: A roadmap
    Shirui Pan, Linhao Luo, Yufei Wang, and 3 more authors
    IEEE Transactions on Knowledge and Data Engineering, Sep 2024
    Call me when necessary: LLMs can efficiently and faithfully reason over structured environments
    Sitao Cheng, Ziyuan Zhuang, Yong Xu, and 8 more authors
    arXiv preprint arXiv:2403.08593, Sep 2024
    Extract, define, canonicalize: An LLM-based framework for knowledge graph construction
    Bowen Zhang and Harold Soh
    arXiv preprint arXiv:2404.03868, Sep 2024
    KG-FIT: Knowledge graph fine-tuning upon open-world knowledge
    Pengcheng Jiang, Lang Cao, Cao Danica Xiao, and 3 more authors
    Advances in Neural Information Processing Systems, Sep 2024
  6. Large language model with graph convolution for recommendation
    Yingpeng Du, Ziyan Wang, Zhu Sun, and 6 more authors
    arXiv preprint arXiv:2402.08859, Sep 2024


