Working with machine learning models doesn’t have to mean installing huge toolchains locally.
### 🧪 Why Colab?
- Free GPUs and TPUs
- No local setup
- Easy sharing with friends or teammates
- Built-in Google Drive integration (see the sketch after this list)
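For example, a few lines are enough to confirm the GPU runtime is active and to mount Drive (a minimal sketch; `drive.mount` uses Colab's standard mount point):

```python
import torch
from google.colab import drive

# Confirm the runtime has a GPU attached (Runtime > Change runtime type > GPU).
print("GPU available:", torch.cuda.is_available())

# Mount Google Drive so the notebook can read and write files under /content/drive.
drive.mount("/content/drive")
```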
### 🤗 Why Hugging Face?
- Pretrained models for text, vision, audio
- Easy-to-use APIs (quick example after this list)
- Community-driven model hub
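The same `pipeline` API covers many of these tasks with a single call. Here's a minimal sketch using sentiment analysis; omitting the model name lets the library fall back to whatever default checkpoint it resolves for the task:

```python
from transformers import pipeline

# No model specified: the library picks its default sentiment-analysis checkpoint.
sentiment = pipeline("sentiment-analysis")
print(sentiment("Running models in Colab is painless."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```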
## 🚀 Minimal working example
Here’s a Hugging Face model running in Colab (paste this into a code cell):
```python
from transformers import pipeline

# GPT-2 text generation: continue the prompt with up to 30 new tokens.
pipe = pipeline("text-generation", model="gpt2")
response = pipe("Hugo is a static site generator that", max_new_tokens=30)
print(response[0]["generated_text"])
```
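The first run downloads the GPT-2 weights, so expect a short wait before any text appears. Colab runtimes usually ship with `transformers` preinstalled; if yours doesn't, install it in a cell first:

```python
!pip install transformers
```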