# bert4ms

A MindSpore implementation of Transformers.

## Installation

### Install from source

```bash
git clone https://github.com/lvyufeng/bert4ms
cd bert4ms
python setup.py install
```

## Quick Start

```python
from cybertron import BertTokenizer, BertModel
from cybertron import compile_model

tokenizer = BertTokenizer.load('bert-base-uncased')
model = BertModel.load('bert-base-uncased')

# get tokenized inputs
inputs = tokenizer("hello world")

# compile model
compile_model(model, inputs)

# run model inference
outputs = model(inputs)
```
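
The structure of `outputs` depends on cybertron's `BertModel`; assuming it mirrors Hugging Face's `BertModel` and returns the sequence output followed by the pooled `[CLS]` output, the results can be unpacked as in this sketch (an assumption, not the confirmed API):

```python
# Assumption: outputs = (sequence_output, pooled_output), Hugging Face style.
sequence_output, pooled_output = outputs[0], outputs[1]
print(sequence_output.shape)  # e.g. (batch_size, seq_len, hidden_size)
print(pooled_output.shape)    # e.g. (batch_size, hidden_size)
```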

## Why bert4ms?

MindSpore already provides implementations of SOTA models in its ModelZoo, but all of those checkpoints are trained from scratch rather than converted from the original releases, so they are not faithful to the original pretrained weights. Since Transformers has become a convenient toolkit for research and industry tasks, I developed this tool to transfer both the checkpoints and the accompanying code from Hugging Face to MindSpore. You can use it just like Transformers to develop your own pretrained or finetuned models.
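
For example, a converted checkpoint can serve directly as the backbone of a downstream model. The sketch below is illustrative only, not part of the library: it reuses the load-by-name API from Quick Start, assumes `BertModel` returns the pooled `[CLS]` output as its second result (as in Hugging Face's `BertModel`), and `BertClassifier` is a hypothetical name.

```python
# A minimal fine-tuning sketch. Assumptions: the Quick Start load API, a
# Hugging Face-style (sequence_output, pooled_output) return value, and a
# hidden size of 768 for bert-base. `BertClassifier` is a hypothetical class.
import mindspore.nn as nn
from cybertron import BertModel

class BertClassifier(nn.Cell):
    def __init__(self, backbone, num_classes=2):
        super().__init__()
        self.backbone = backbone
        # bert-base hidden size is 768
        self.classifier = nn.Dense(768, num_classes)

    def construct(self, inputs):
        outputs = self.backbone(inputs)
        # Assumes outputs[1] is the pooled [CLS] representation.
        return self.classifier(outputs[1])

backbone = BertModel.load('bert-base-uncased')  # weights converted from Hugging Face
model = BertClassifier(backbone, num_classes=2)
```

From here the classifier can be trained with MindSpore's usual training utilities, exactly as you would fine-tune any other `nn.Cell`.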