Instructions to use bigscience/bloom-intermediate with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use bigscience/bloom-intermediate with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("bigscience/bloom-intermediate", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
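For text generation, a causal-LM head is the more useful entry point than the bare `AutoModel`. The following is a minimal sketch, not an official recipe: the prompt and `max_new_tokens` value are arbitrary, `dtype="auto"` is spelled `torch_dtype` in older Transformers releases, and this checkpoint is very large, so loading it needs substantial memory (or sharded/offloaded loading via `accelerate`).

```python
# Sketch: generate text with bigscience/bloom-intermediate (assumptions noted above).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigscience/bloom-intermediate"

# Tokenizer and model weights are fetched from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
# dtype="auto" picks the dtype stored in the checkpoint; on older
# Transformers versions this argument is named torch_dtype.
model = AutoModelForCausalLM.from_pretrained(model_id, dtype="auto")

# Encode an example prompt and generate a short continuation.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```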