Hi, I was looking for a good Chinese/Japanese tokenizer in Go and stumbled across this one.
Based on the release history, it looks like this library has been in use for quite a while, but it's still at v0. Is there any reason not to issue an official v1 release?
It would also be nice to see quality metrics in the README, if you have any, e.g. a comparison against data like https://universaldependencies.org/