xiaoxinhe / awesome-graph-llm
A collection of AWESOME things about Graph-Related LLMs.
License: MIT License
I want to share this recent work:
MindMap: Knowledge Graph Prompting Sparks Graph of Thoughts in Large Language Models
https://github.com/agiresearch/InstructGLM
Would you mind adding this official GitHub code link to (arXiv 2023.08) Natural Language is All a Graph Needs?
Thank you so much.
Thank you for sharing your insightful paper list!
We would like to introduce another paper that complements this topic: "GIMLET: A Unified Graph-Text Model for Instruction-Based Molecule Zero-Shot Learning." This paper integrates the graph modality into the language model and employs an instruction-based learning approach to address graph property prediction tasks. We would be delighted if our suggestion contributes to the enrichment of this awesome paper list.
Dear authors,
Thank you so much for making this great repo! Could you please add these three papers? The first two are about representation learning with large pretrained language models on graphs associated with textual information, while the last is about pretraining large language models on text-rich networks.
[1] Heterformer: A Transformer Architecture for Node Representation Learning on Heterogeneous Text-Rich Networks. KDD 2023.
[2] Edgeformers: Graph-Empowered Transformers for Representation Learning on Textual-Edge Networks. ICLR 2023.
[3] Patton: Language Model Pretraining on Text-rich Networks. ACL 2023.
Thanks!
Hi Xiaoxin,
We have just posted a new position paper on arXiv, "Integrating Graphs with Large Language Models: Methods and Prospects", which discusses the current paradigms and open questions regarding the integration of graphs and LLMs.
Could you please kindly include this paper in your repo? Much appreciated!
Here is the link to the paper:
https://arxiv.org/abs/2310.05499
Warmest Regards,
Yizhen Zheng
Hi Xiaoxin,
We have recently published a paper titled "Efficient Tuning and Inference for Large Language Models on Textual Graphs". This paper introduces an efficient training and inference algorithm specifically designed for large language models (LLMs) on text-attributed graphs.
We would greatly appreciate it if you could include this paper in your repository, specifically in the "Node Classification" section.
Best,
Yun
(arXiv 2024.02) Similarity-based Neighbor Selection for Graph LLMs [paper]
This paper belongs in the node classification section.
Hi, thanks for this awesome repo! We would like to introduce a related work on using the graph structure of general tasks to enhance complex reasoning with LLMs: "Thought Propagation: An Analogical Approach to Complex Reasoning with Large Language Models". We hope our suggestion can enrich this awesome paper list.