
About Me

  • I'm Zhang-Each
    • A graduate student at Zhejiang University, majoring in Computer Science
    • Studying Knowledge Graphs (KG) and Natural Language Processing (NLP) in the ZJU-KG lab
  • Taking open courses released by Stanford/MIT/CMU
  • Blog: link here
  • Notebook: link here
  • Personal Page: link here

Publications

  • Knowledge Graph Completion with Pre-trained Multimodal Transformer and Twins Negative Sampling. (First Author, Accepted by KDD 2022 Undergraduate Consortium, ArXiv)
  • Tele-Knowledge Pre-training for Fault Analysis. (Accepted by ICDE 2023 Industry Track, ArXiv)
  • Modality-Aware Negative Sampling for Multi-modal Knowledge Graph Embedding. (Accepted by IJCNN 2023, ArXiv)
  • CausE: Towards Causal Knowledge Graph Embedding. (Accepted by CCKS 2023, ArXiv)
  • MACO: A Modality Adversarial and Contrastive Framework for Modality-missing Multi-modal Knowledge Graph Completion. (Accepted by NLPCC 2023, ArXiv)
  • Unleashing the Power of Imbalanced Modality Information for Multi-modal Knowledge Graph Completion. (Accepted by COLING 2024, ArXiv)
  • NativE: Multi-modal Knowledge Graph Completion in the Wild. (Accepted by SIGIR 2024, ArXiv)
  • Knowledgeable Preference Alignment for LLMs in Domain-specific Question Answering. (Accepted by ACL 2024 Findings, ArXiv)

Preprints

  • Making Large Language Models Perform Better in Knowledge Graph Completion. (ArXiv)
  • Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey. (ArXiv)
  • MyGO: Discrete Modality Information as Fine-Grained Tokens for Multi-modal Knowledge Graph Completion. (ArXiv)
  • Multi-domain Knowledge Graph Collaborative Pre-training and Prompt Tuning for Diverse Downstream Tasks. (ArXiv)
  • Mixture of Modality Knowledge Experts for Robust Multi-modal Knowledge Graph Completion. (ArXiv)

Stats

Zhang-Each's GitHub Stats (profile stats card)

Pinned Repositories

  1. CourseNoteOfZJUSE

    Course notes, past exam papers, and course experience write-ups for ZJU-SE

    413 stars · 79 forks

  2. zjukg/KG-MM-Survey

    Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey

    362 stars · 18 forks

  3. zjukg/MyGO

    [Paper][AAAI 2025] (MyGO) Tokenization, Fusion, and Augmentation: Towards Fine-grained Multi-modal Entity Representation

    Python · 224 stars · 4 forks

  4. zjukg/KnowPAT

    [Paper][ACL 2024 Findings] Knowledgeable Preference Alignment for LLMs in Domain-specific Question Answering

    Python · 189 stars · 17 forks

  5. zjukg/KoPA

    [Paper][ACM MM 2024] Making Large Language Models Perform Better in Knowledge Graph Completion

    Python · 154 stars · 8 forks

  6. zjukg/NATIVE

    [Paper][SIGIR 2024] NativE: Multi-modal Knowledge Graph Completion in the Wild

    Python · 28 stars · 1 fork