toxicity classification module
v0.1.10
A module for predicting the toxicity of messages in Russian and English.
from toxicityclassifier import *

classifier = ToxicityClassificatorV1()
print(classifier.predict(text))          # (0 or 1, probability)
print(classifier.get_probability(text))  # probability
print(classifier.classify(text))         # 0 or 1

Weight for classification (if probability >= weight => 1, else 0)
classifier.weight = 0.5
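The thresholding rule above can be sketched in plain Python. This is a hypothetical illustration of the described behavior (probability >= weight maps to 1, otherwise 0), not the library's actual implementation:

```python
# Hypothetical sketch of the weight threshold described above,
# not the library's internal code.
def classify(probability: float, weight: float = 0.5) -> int:
    """Return 1 (toxic) if probability >= weight, else 0."""
    return 1 if probability >= weight else 0

print(classify(0.7))       # 1
print(classify(0.3))       # 0
print(classify(0.5, 0.5))  # 1 (the boundary counts as toxic)
```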
Weight for language detection (English or Russian): if the share of Russian text is >= language_weight, the Russian model is used, otherwise the English one.
classifier.language_weight = 0.5
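The language routing described above can be sketched as follows. This is an assumed reconstruction (counting the share of Cyrillic characters among letters), since the README does not specify how the Russian share is measured; the function name `detect_model` is hypothetical:

```python
# Hypothetical sketch of the language_weight routing described above;
# the library's real detection logic may differ.
def detect_model(text: str, language_weight: float = 0.5) -> str:
    """Route to the Russian model if the Cyrillic share of letters
    is >= language_weight, otherwise to the English model."""
    letters = [ch for ch in text if ch.isalpha()]
    if not letters:
        return "english"
    cyrillic = sum(
        "а" <= ch.lower() <= "я" or ch.lower() == "ё" for ch in letters
    )
    return "russian" if cyrillic / len(letters) >= language_weight else "english"

print(detect_model("привет мир"))   # russian
print(detect_model("hello world"))  # english
```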