---
license: cc-by-nc-4.0
datasets:
- iamtarun/python_code_instructions_18k_alpaca
language:
- en
base_model: Pacific-Prime/pacific-prime-code
tags:
- code
- python
- i64
- complexity-deep
- sft
pipeline_tag: text-generation
library_name: transformers
---
# Pacific-Prime: Python Node
A **pure Python specialist** fine-tuned from Pacific-Prime Code (I64 architecture, 1.5B parameters).
## Skills
- Python basics & standard library
- Algorithms & data structures
- Object-oriented programming
- Decorators & generators
- List comprehensions
- File I/O & error handling
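The skill areas above can be illustrated with a small snippet of the kind of idiomatic Python the model is trained to produce. This is a hand-written sketch for illustration, not model output; the `memoize` decorator and `evens` generator are hypothetical names chosen for the example:

```python
from functools import wraps

def memoize(fn):
    """Decorator: cache results keyed by positional arguments."""
    cache = {}

    @wraps(fn)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]

    return wrapper

@memoize
def fib(n):
    """Naive recursion made fast by the memoizing decorator."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def evens(limit):
    """Generator yielding even numbers below limit, one at a time."""
    for i in range(limit):
        if i % 2 == 0:
            yield i

# List comprehension consuming the generator lazily
squares = [x * x for x in evens(10)]
```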
## Training
- **Architecture**: I64 (Complexity-Deep)
- **Parameters**: 1.5B
- **Base model**: pacific-prime-code (checkpoint epoch 70)
- **Method**: Full SFT (no LoRA)
- **Dataset**: python_code_instructions_18k_alpaca (18K samples)
- **Epochs**: 1000
- **Max context**: 4096 tokens
## Inference with vLLM-I64
Use our custom vLLM engine with native I64 support:
👉 **[vllm-i64](https://github.com/Complexity-ML/vllm-i64)**
```bash
git clone https://github.com/Complexity-ML/vllm-i64.git
cd vllm-i64
pip install -e .
```
```python
from vllm import LLM, SamplingParams

# Load the model with the I64-enabled vLLM engine
model = LLM(model="Pacific-Prime/python-node")
params = SamplingParams(temperature=0.7, max_tokens=4096)

# The model expects the "User: ...\nAssistant:" prompt format
prompt = "User: Write a Python function to find the longest common subsequence of two strings.\nAssistant:"
output = model.generate([prompt], params)
print(output[0].outputs[0].text)
```
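For reference, a typical dynamic-programming answer to the example prompt above looks like the following. This is a hand-written sketch of the expected style of solution, not actual model output:

```python
def longest_common_subsequence(a: str, b: str) -> str:
    """Return one longest common subsequence of two strings via DP."""
    m, n = len(a), len(b)
    # dp[i][j] = length of the LCS of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])

    # Backtrack through the table to recover the subsequence itself
    out = []
    i, j = m, n
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i -= 1
            j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))
```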
## Serve Your Own I64 Model
Trained your own I64 model with [complexity-deep](https://github.com/Complexity-ML/complexity-deep)? Serve it with vllm-i64:
```python
from vllm import LLM, SamplingParams

# Point vLLM at a local I64 checkpoint directory
model = LLM(model="/path/to/your/i64-model")
params = SamplingParams(temperature=0.7, max_tokens=4096)
output = model.generate(["User: Hello!\nAssistant:"], params)
print(output[0].outputs[0].text)
```
## Links
- [Complexity Framework](https://github.com/Complexity-ML/complexity-framework) — ML framework for building & training I64 models
- [Complexity-Deep](https://github.com/Complexity-ML/complexity-deep) — Training framework & architecture
- [vllm-i64](https://github.com/Complexity-ML/vllm-i64) — Inference engine for I64 models
## License
CC BY-NC 4.0