## Overview
Hugging Face Transformers is a widely used open-source library for working with pre-trained transformer models. This guide covers installation, basic usage through the `pipeline` API, common NLP tasks, and fine-tuning.
## Installation

```bash
pip install transformers torch
```
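A quick way to confirm the install is to print the library version (`transformers.__version__` is a standard attribute):

```python
import transformers

print(transformers.__version__)  # e.g. 4.40.0
```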
## Loading a Pre-trained Model

```python
from transformers import pipeline

# Text classification
classifier = pipeline("sentiment-analysis")
result = classifier("I love using Hugging Face!")
print(result)  # [{'label': 'POSITIVE', 'score': 0.9998}]
```
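When no model is named, the pipeline falls back to a default checkpoint and prints a warning. Pinning an explicit checkpoint makes results reproducible; the one below is the standard English sentiment model. Pipelines also accept a list of inputs:

```python
from transformers import pipeline

# Pin a specific checkpoint instead of relying on the pipeline default
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Passing a list returns one result dict per input
results = classifier(["I love using Hugging Face!", "This is terrible."])
for r in results:
    print(r["label"], round(r["score"], 4))
```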
## Text Generation

```python
generator = pipeline("text-generation", model="gpt2")
output = generator("The future of AI is", max_length=50)  # max_length counts prompt tokens too
print(output[0]['generated_text'])
```
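Generation can be tuned with sampling parameters that the pipeline forwards to the model's `generate` method. A short sketch using common arguments (`max_new_tokens`, `do_sample`, `temperature`, `num_return_sequences` are all standard):

```python
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

outputs = generator(
    "The future of AI is",
    max_new_tokens=40,       # counts only newly generated tokens, unlike max_length
    do_sample=True,          # sample instead of greedy decoding
    temperature=0.8,         # values below 1.0 sharpen the distribution
    num_return_sequences=2,  # return two candidate continuations
)
for out in outputs:
    print(out["generated_text"])
```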
## Named Entity Recognition

```python
ner = pipeline("ner", aggregation_strategy="simple")  # replaces the deprecated grouped_entities=True
text = "Apple was founded by Steve Jobs in Cupertino."
entities = ner(text)
# [{'entity_group': 'ORG', 'word': 'Apple', ...}, ...]
```
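Each grouped entity is a dict with `entity_group`, `word`, `score`, `start`, and `end` keys, so the results can be post-processed directly:

```python
for ent in entities:
    print(f"{ent['entity_group']:>5}  {ent['word']:<12} score={ent['score']:.3f}")
# Expected groups: ORG (Apple), PER (Steve Jobs), LOC (Cupertino)
```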
## Fine-tuning a Model

```python
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

training_args = TrainingArguments(
    output_dir="./results",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    evaluation_strategy="epoch",  # renamed to eval_strategy in recent transformers releases
)

# train_data and eval_data must be tokenized datasets (see the sketch below)
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_data,
    eval_dataset=eval_data,  # required when evaluating every epoch
)
trainer.train()
```
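The `Trainer` expects tokenized datasets. A minimal sketch of how `train_data` and `eval_data` could be built with the `datasets` library, using IMDB as a stand-in for your own data:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # Pad/truncate so every example fits the model's input size
    return tokenizer(batch["text"], padding="max_length", truncation=True)

dataset = load_dataset("imdb")
tokenized = dataset.map(tokenize, batched=True)

train_data = tokenized["train"].shuffle(seed=42).select(range(2000))  # small subset for a quick run
eval_data = tokenized["test"].shuffle(seed=42).select(range(500))
```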
## Key Resources

- [Hugging Face Documentation](https://huggingface.co/docs/transformers)
- [Model Hub](https://huggingface.co/models) - 400,000+ pre-trained models
- [Datasets Library](https://huggingface.co/docs/datasets)