# Model Card for mBERT Manipulative Language Detector

This model detects manipulative language in Chinese and English text (e.g., covert control, emotional blackmail, and verbal manipulation) and is intended for scenarios such as social psychology research, text screening, and content moderation.
## Model Details

- Developed by: LilithHu
- Finetuned from: google-bert/bert-base-multilingual-cased
- Languages: Chinese, English
- License: other
- Model type: text classification (binary classifier: manipulative / non-manipulative)
## Uses

### Direct Use

- Given an input text, the model returns whether the text contains manipulative language.
- It can be called directly through the Hugging Face Inference API or a Streamlit web UI (a minimal sketch follows below).
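The following is a minimal sketch of the kind of Streamlit front end mentioned above; the file name, widget labels, and caching choice are illustrative assumptions, not the authors' actual app.

```python
# streamlit_app.py — illustrative only; run with `streamlit run streamlit_app.py`
import streamlit as st
from transformers import pipeline

@st.cache_resource  # load the model once per session
def load_classifier():
    return pipeline("text-classification",
                    model="LilithHu/mbert-manipulative-detector")

st.title("Manipulative Language Detector")
text = st.text_area("Enter Chinese or English text:")
if st.button("Classify") and text:
    prediction = load_classifier()(text)[0]
    st.write(f"Label: {prediction['label']} (score: {prediction['score']:.3f})")
```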
### Intended Users

- NLP researchers
- Content moderation practitioners
- Psychology researchers
- Developers of social platforms or dialogue systems
### Out-of-Scope Use

This model is not suitable for:

- Legal adjudication
- Medical diagnosis
- High-risk commercial activities such as targeted marketing
- Judging other people's motives, personality, or emotions
## Bias, Risks and Limitations

Please note:

- Model outputs are not facts; the model classifies text only according to patterns learned from its training data.
- Judgments about what counts as manipulative language involve a degree of subjectivity and cultural bias.
- The model should not be used to pass judgment on specific individuals, their emotions, or the legitimacy of their behavior.
### Recommendations

Users should combine model output with human judgment and interpret the meaning of a text across multiple modalities and channels. Model predictions should not be trusted blindly and should be treated as an auxiliary tool.
## How to Use

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hugging Face Hub
classifier = pipeline("text-classification", model="LilithHu/mbert-manipulative-detector")

# Returns a list of {"label": ..., "score": ...} dictionaries
result = classifier("我爱你")
print(result)
```
You can also call the model from the terminal:

```bash
curl -X POST https://api-inference.huggingface.co/models/LilithHu/mbert-manipulative-detector \
  -H "Authorization: Bearer <your_hf_token>" \
  -H "Content-Type: application/json" \
  -d '{"inputs": "我爱你"}'
```
## Training Details

### Training Data

- CDial-GPT/toy_valid
- thu-coai/esconv and cdconv datasets
- A self-constructed Chinese manipulative-language corpus (not publicly released)
### Training Procedure

- Training platform: Google Colab, GPU: T4
- Epochs: 3
- Batch size: 32
- Optimizer: AdamW
- LR: 2e-5
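As a rough guide, a fine-tune with the hyperparameters above could be set up with the `transformers` Trainer roughly as follows; the dataset handling here uses toy placeholder examples and is an assumption, not the authors' actual training script.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "google-bert/bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)

# Toy placeholder data standing in for the corpora listed above (label 1 = manipulative).
train_data = Dataset.from_dict({
    "text": ["如果你真的在乎我，就不会这样对我。", "Thanks, see you tomorrow!"],
    "label": [1, 0],
})
train_data = train_data.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)

args = TrainingArguments(
    output_dir="mbert-manipulative-detector",
    num_train_epochs=3,
    per_device_train_batch_size=32,
    learning_rate=2e-5,  # AdamW is the Trainer's default optimizer
)

trainer = Trainer(model=model, args=args, train_dataset=train_data)
trainer.train()
```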
## Evaluation
| Metric | Score |
|---|---|
| Accuracy | 0.** |
| Precision | 0.** |
| Recall | 0.** |
| F1-score | 0.** |
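For reference, the four metrics above can be computed on a held-out split roughly as follows; this is a sketch assuming scikit-learn and 0/1 label arrays, not the authors' evaluation script.

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def evaluation_report(y_true, y_pred):
    """Accuracy, precision, recall and F1 for the binary manipulative/non-manipulative task."""
    precision, recall, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="binary")
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }

# Toy example: one true positive, two true negatives, one false negative
print(evaluation_report([0, 1, 1, 0], [0, 1, 0, 0]))
```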
## Environmental Impact

- Training time: approximately 3 hours on a Google Colab GPU (T4)
- Estimated carbon emissions: < 2 kg CO2eq
## Disclaimer

- This model is intended for research and educational use and must not be used as the basis for legal, moral, medical, or commercial decisions.
- Predictions are for reference only; users assume all risks arising from use.
- Do not use the model for malicious attacks, public-opinion manipulation, or misleading others.
## Model Card Authors
LilithHu
## Contact

For feedback or suggestions, please contact the author by leaving a message on Hugging Face.
## Citation

```bibtex
@misc{LilithHu2025,
  title  = {mBERT Manipulative Language Detector},
  author = {LilithHu},
  year   = {2025},
  url    = {https://huggingface.co/LilithHu/mbert-manipulative-detector}
}
```