AP6621 committed · verified
Commit 6af3a70 · Parent(s): 06ad31b

Update README.md

Files changed (1):
  1. README.md +122 -3
README.md CHANGED
@@ -1,3 +1,122 @@
- ---
- license: mit
- ---
+ ---
+ language:
+ - en
+ tags:
+ - gpt2
+ - keras-nlp
+ - text-generation
+ - bhagavad-gita
+ license: mit
+ datasets:
+ - custom
+ pipeline_tag: text-generation
+ ---
+
+ # 📖 Fine-tuned GPT-2 on the Bhagavad Gita
+
+ This repository contains a **fine-tuned GPT-2 model** trained on the English meanings of the verses of the *Bhagavad Gita*.
+ It generates text inspired by the teachings and style of the Bhagavad Gita.
+
+ ---
+
+ ## 🧾 Model Card
+
+ | Attribute | Details |
+ |------------------|---------|
+ | **Base Model** | [GPT-2 (124M)](https://huggingface.co/openai-community/gpt2) |
+ | **Architecture** | Decoder-only Transformer |
+ | **Framework** | TensorFlow / KerasNLP |
+ | **Dataset** | Bhagavad Gita (English meanings) |
+ | **Languages** | English |
+ | **Dataset Size** | ~700 verses |
+ | **Tokenizer** | GPT-2 tokenizer (inherited vocabulary) |
+
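+ Since the vocabulary is inherited unchanged from GPT-2, a quick sanity check is to load the tokenizer and confirm its size (a minimal sketch, assuming the checkpoint loads with `transformers` as in the Usage section below):
+
+ ```python
+ from transformers import AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained("AP6621/Bhagawatgitagpt")
+ print(len(tokenizer))  # expected: 50257, GPT-2's standard BPE vocab size
+ ```
+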
+ ---
+
+ ## 📊 Training Details
+
+ - **Epochs**: 3
+ - **Batch Size**: 8
+ - **Learning Rate**: 3e-5
+ - **Optimizer**: AdamW
+ - **Loss Function**: Causal language modeling (cross-entropy)
+ - **Preprocessing**: Cleaned, tokenized, and formatted into text sequences (a training sketch follows below)
+
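+ A minimal KerasNLP sketch of this training setup (the `verse_ds` pipeline below is a hypothetical stand-in for the cleaned verse data, which is not included in this repository):
+
+ ```python
+ import keras
+ import keras_nlp
+ import tensorflow as tf
+
+ # Hypothetical stand-in for the cleaned, formatted verse strings.
+ verse_ds = tf.data.Dataset.from_tensor_slices([
+     "Arjuna asked: How may one attain steady wisdom?",
+ ]).batch(8)  # Batch Size: 8
+
+ # GPT-2 (124M) preset; the attached preprocessor tokenizes raw strings
+ # and builds the causal-LM labels automatically.
+ gpt2_lm = keras_nlp.models.GPT2CausalLM.from_preset("gpt2_base_en")
+
+ gpt2_lm.compile(
+     loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
+     optimizer=keras.optimizers.AdamW(learning_rate=3e-5),
+     weighted_metrics=["accuracy"],
+ )
+ gpt2_lm.fit(verse_ds, epochs=3)
+ ```
+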
+ ---
+
+ ## 🚀 Usage
+
+ You can load and run the model using Hugging Face `transformers`:
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ # Load tokenizer & model
+ tokenizer = AutoTokenizer.from_pretrained("AP6621/Bhagawatgitagpt")
+ model = AutoModelForCausalLM.from_pretrained("AP6621/Bhagawatgitagpt")
+
+ # Generate text (pad_token_id silences the warning GPT-2 gives for
+ # lacking a pad token)
+ inputs = tokenizer("Arjuna asked:", return_tensors="pt")
+ outputs = model.generate(**inputs, max_length=100, do_sample=True,
+                          top_p=0.9, temperature=0.8,
+                          pad_token_id=tokenizer.eos_token_id)
+
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
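+
+ Equivalently, the high-level `pipeline` helper bundles tokenization, generation, and decoding into a single call:
+
+ ```python
+ from transformers import pipeline
+
+ # A "text-generation" pipeline forwards sampling kwargs to model.generate()
+ generator = pipeline("text-generation", model="AP6621/Bhagawatgitagpt")
+ result = generator("Arjuna asked:", max_length=100, do_sample=True,
+                    top_p=0.9, temperature=0.8)
+ print(result[0]["generated_text"])
+ ```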
+
+ ---
+
+ ## ✨ Example
+
+ **Prompt**:
+ ```
+ Arjuna asked:
+ ```
+
+ **Generated Response**:
+ *"O Arjuna, when the mind is detached from desires, the soul attains peace and wisdom. Such a person sees all beings with equal vision."*
+
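+ Since `do_sample=True` draws tokens randomly, output differs between runs. To make a sampled generation repeatable, fix the seed with the standard `transformers` utility before calling `generate`:
+
+ ```python
+ from transformers import set_seed
+
+ set_seed(42)  # seeds Python, NumPy, and torch RNGs for repeatable sampling
+ ```
+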
+ ---
+
+ ## ⚖️ Limitations & Bias
+
+ - Trained on a small dataset (~700 verses), so it may **repeat phrases** or **hallucinate content**
+ - Not an official or scholarly translation
+ - Should not be used as a **religious authority**
+
+ ---
+
+ ## ✅ Intended Use
+
+ ✔️ Educational experiments
+ ✔️ Creative/spiritual-inspired text generation
+ ✔️ AI-assisted storytelling
+
+ ❌ Not for religious/spiritual authority
+ ❌ Not for official translations
+ ❌ Not for sensitive decision-making
+
+ ---
+
+ ## 📌 Citation
+
+ If you use this model, please cite:
+
+ ```bibtex
+ @misc{gpt2-bhagavadgita,
+   title        = {Fine-tuned GPT-2 on Bhagavad Gita Dataset},
+   author       = {AP6621},
+   year         = {2025},
+   publisher    = {Hugging Face},
+   howpublished = {\url{https://huggingface.co/AP6621/Bhagawatgitagpt}}
+ }
+ ```
+
+ ---
+
+ ## 🙏 Acknowledgements
+
+ - [OpenAI](https://huggingface.co/openai-community) for GPT-2
+ - Public Bhagavad Gita dataset
+ - Hugging Face community for tools & inspiration
+
+ ---
+