Infinity is a high-throughput, low-latency serving engine for text embeddings, reranking models, CLIP, CLAP, and ColPali
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
💭 Aspect-Based-Sentiment-Analysis: Transformer & Explainable ML (TensorFlow)
NLP for humans. A fast and easy-to-use natural language processing (NLP) toolkit for whatever NLP task you can imagine.
Fast, non-autoregressive grammatical error correction using BERT. Code and pre-trained models for the paper "Parallel Iterative Edit Models for Local Sequence Transduction": www.aclweb.org/anthology/D19-1435.pdf (EMNLP-IJCNLP 2019)
Simple State-of-the-Art BERT-Based Sentence Classification with Keras / TensorFlow 2. Built with HuggingFace's Transformers.
Pretrained BERT model for cybersecurity text, trained to capture cybersecurity knowledge
ColBERT humor dataset for the task of humor detection, containing 200,000 jokes and news headlines
COVID-19 Question Dataset from the paper "What Are People Asking About COVID-19? A Question Classification Dataset"
This repository contains the PyTorch implementation of the baseline models from the paper "Utterance-level Dialogue Understanding: An Empirical Study"
BERT semantic search engine for searching COVID-19 research literature, runnable in Google Colab
A repository of basic machine learning material I am learning. More to come...
Automated Essay Scoring using BERT
Topic clustering library built on Transformer embeddings and cosine similarity. Compatible with all BERT-based transformers from Hugging Face (see the embedding-similarity sketch after this list).
Text similarity, semantic vectors, text vectors, text-similarity, similarity, sentence-similarity, BERT, SimCSE, BERT-Whitening, Sentence-BERT, PromCSE, SBERT
ELECTRA model pre-trained on a Vietnamese corpus
Recommendation engine framework based on Wikipedia data
Bilingual term extractor
Finetuning BERT in PyTorch for sentiment analysis.
Code for loading the SenseBERT model, described in our ACL 2020 paper.
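Several of the projects above (the topic clustering library, the text-similarity toolkit, and the COVID-19 semantic search engine) share the same core pattern: encode texts into BERT embeddings, then rank candidates by cosine similarity. Below is a minimal sketch of that pattern, assuming the sentence-transformers package and the all-MiniLM-L6-v2 checkpoint; both are illustrative choices, not tied to any specific repository in this list.

# Minimal BERT-embedding similarity sketch.
# Package and model name are assumptions for illustration; any BERT-style
# sentence encoder that produces dense vectors would work the same way.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

corpus = [
    "Pre-trained language models for Vietnamese",
    "Automated essay scoring using BERT",
    "Semantic search over coronavirus research papers",
]
query = "searching scientific literature about COVID-19"

# Encode texts into dense vectors, then score the corpus against the query.
corpus_emb = model.encode(corpus, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)
scores = util.cos_sim(query_emb, corpus_emb)[0]  # one cosine score per corpus entry

# Print corpus entries from most to least similar to the query.
for text, score in sorted(zip(corpus, scores.tolist()), key=lambda p: -p[1]):
    print(f"{score:.3f}  {text}")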