Source: sentencepiece
Section: science
Priority: optional
Maintainer: Debian Science Maintainers
Uploaders: TSUCHIYA Masatoshi, Kentaro Hayashi
Build-Depends: debhelper-compat (= 13),
 protobuf-compiler,
 libprotobuf-dev,
 dh-python,
 python3-all-dev,
 quilt,
 cmake,
 python3-setuptools
Standards-Version: 4.5.0
Homepage: https://github.com/google/sentencepiece
Vcs-Browser: https://salsa.debian.org/science-team/sentencepiece
Vcs-Git: https://salsa.debian.org/science-team/sentencepiece.git
Rules-Requires-Root: no

Package: sentencepiece
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}
Description: Unsupervised text tokenizer and detokenizer
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.

Package: libsentencepiece0
Section: libs
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}
Description: Library files of SentencePiece
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.

Package: libsentencepiece-dev
Section: libdevel
Architecture: any
Depends: libsentencepiece0 (= ${binary:Version}), ${misc:Depends}
Description: Header files of SentencePiece
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.

Package: python3-sentencepiece
Section: python
Architecture: any
Depends: ${shlibs:Depends}, ${misc:Depends}, ${python3:Depends}
Description: SentencePiece binding for Python3
 SentencePiece is an unsupervised text tokenizer/detokenizer mainly
 designed for Neural Network-based text generation systems where the
 vocabulary size is predetermined prior to the neural model training.
 .
 python3-sentencepiece is its binding for Python3.