Instructions for using deepcode-ai/deepcode-base with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use deepcode-ai/deepcode-base with Transformers:
```python
# Load model directly
from transformers import MultiModalityCausalLM

model = MultiModalityCausalLM.from_pretrained("deepcode-ai/deepcode-base", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
Create processor_config.json
processor_config.json (added, +9 -0):

```json
{
  "add_special_token": false,
  "ignore_id": -100,
  "image_tag": "<image_placeholder>",
  "mask_prompt": true,
  "num_image_tokens": 576,
  "processor_class": "VLChatProcessor",
  "sft_format": "deepseek"
}
```
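The added config can be sanity-checked without downloading anything: a minimal sketch using Python's standard `json` module, with the file contents embedded verbatim from the diff above (the interpretation in the comments — e.g. that `num_image_tokens` is the number of placeholder positions each image expands to — is an assumption, not documented behavior):

```python
import json

# Contents of processor_config.json, copied from the diff above.
config_text = """
{
  "add_special_token": false,
  "ignore_id": -100,
  "image_tag": "<image_placeholder>",
  "mask_prompt": true,
  "num_image_tokens": 576,
  "processor_class": "VLChatProcessor",
  "sft_format": "deepseek"
}
"""

config = json.loads(config_text)

# Presumably: each image is expanded into this many placeholder tokens.
assert config["num_image_tokens"] == 576
# -100 is the conventional label id ignored by PyTorch's cross-entropy loss,
# which matches mask_prompt=true (prompt tokens excluded from the loss).
assert config["ignore_id"] == -100
print(config["processor_class"])  # → VLChatProcessor
```

Loading the processor itself goes through the class named in `processor_class` (`VLChatProcessor`), which ships with the model repository rather than with the core `transformers` package.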