Few-shot learning
Definition
Few-shot learning aims to adapt quickly from a small number of labeled examples (e.g. 1–5 per class). Meta-learning methods (e.g. MAML) train models to be good at few-shot adaptation.
It sits between transfer learning (more target data) and zero-shot learning (no target examples). LLMs do few-shot learning implicitly via in-context examples in the prompt; classical few-shot methods use episodic meta-training (e.g. MAML) so the model learns to adapt from a support set.
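To make the meta-learning idea concrete, here is a minimal sketch of first-order MAML (FOMAML) on a toy 1D regression family. The linear model, task distribution, and hyperparameters are illustrative assumptions, not the original MAML setup; full MAML would also backpropagate through the inner-loop updates.

```python
import numpy as np

# Hedged sketch of first-order MAML on toy sine-wave regression tasks.
# Model: y_hat = w*x + b (deliberately simple); tasks differ in amplitude
# and phase. All names and hyperparameters are illustrative assumptions.
rng = np.random.default_rng(0)

def sample_task():
    """Sample a regression task: y = a * sin(x + p)."""
    return rng.uniform(0.5, 2.0), rng.uniform(0, np.pi)

def task_batch(a, p, n):
    x = rng.uniform(-3, 3, size=n)
    return x, a * np.sin(x + p)

def loss_grad(params, x, y):
    """MSE loss and its gradient for the linear model y_hat = w*x + b."""
    w, b = params
    err = w * x + b - y
    return np.mean(err ** 2), np.array([np.mean(2 * err * x), np.mean(2 * err)])

inner_lr, outer_lr = 0.05, 0.01
meta_params = np.zeros(2)  # [w, b]

for step in range(200):            # outer (meta) loop over sampled tasks
    a, p = sample_task()
    xs, ys = task_batch(a, p, 5)   # support set: 5 labeled points
    xq, yq = task_batch(a, p, 10)  # query set
    # Inner loop: adapt a copy of the meta-parameters on the support set.
    adapted = meta_params.copy()
    for _ in range(3):
        _, g = loss_grad(adapted, xs, ys)
        adapted -= inner_lr * g
    # First-order meta-update: apply the query-set gradient at the adapted
    # parameters directly to the meta-parameters (second derivatives ignored).
    _, gq = loss_grad(adapted, xq, yq)
    meta_params -= outer_lr * gq
```

The point of the outer loop is that the meta-parameters are optimized for post-adaptation query loss, not for direct loss, so a few inner gradient steps on a new task's support set go a long way.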
How it works
Each task has a support set (a few labeled examples, e.g. 1–5 per class) and a query set (examples to predict).
- Adapt: the model uses the support set to adapt (e.g. compute class prototypes, or take a few gradient steps in MAML).
- Predict: the adapted model predicts labels for the query set.
- Episodic training: sample many few-shot tasks from a meta-train set; for each, adapt on the task's support set and optimize so that predictions on its query set improve.
At test time, the model receives a new task's support set and predicts on its query set. For LLMs, "adapt" is simply conditioning on the support examples in the prompt (in-context few-shot).
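The adapt/predict steps of one episode can be sketched with prototype-based adaptation, in the spirit of Prototypical Networks. The synthetic episode generator and the identity "embedding" are assumptions for brevity; a real system would embed inputs with a trained encoder before averaging.

```python
import numpy as np

# Hedged sketch of one N-way K-shot episode with class prototypes.
# The embedding is the identity here; real prototypical networks embed
# inputs with a learned encoder first. Names are illustrative assumptions.
rng = np.random.default_rng(1)

def make_episode(n_way=3, k_shot=2, n_query=4, dim=5):
    """Sample a synthetic N-way K-shot episode: support + query sets."""
    centers = rng.normal(size=(n_way, dim)) * 3
    support_x = np.concatenate(
        [c + rng.normal(scale=0.3, size=(k_shot, dim)) for c in centers])
    support_y = np.repeat(np.arange(n_way), k_shot)
    query_x = np.concatenate(
        [c + rng.normal(scale=0.3, size=(n_query, dim)) for c in centers])
    query_y = np.repeat(np.arange(n_way), n_query)
    return support_x, support_y, query_x, query_y

def adapt(support_x, support_y, n_way):
    """Adapt: one prototype per class = mean of its support embeddings."""
    return np.stack([support_x[support_y == c].mean(0) for c in range(n_way)])

def predict(prototypes, query_x):
    """Predict: assign each query point to its nearest prototype."""
    d = np.linalg.norm(query_x[:, None, :] - prototypes[None, :, :], axis=-1)
    return d.argmin(axis=1)

sx, sy, qx, qy = make_episode()
protos = adapt(sx, sy, n_way=3)
preds = predict(protos, qx)
accuracy = (preds == qy).mean()
```

Episodic training repeats this loop over many sampled tasks, updating the encoder so that query accuracy after adaptation improves.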
Use cases
Few-shot learning applies when you have only a handful of examples per class or task (including in-context LLM prompts).
- Classifying rare classes with only a few labeled examples
- LLM in-context learning (e.g. 1–5 examples in the prompt)
- Rapid adaptation in robotics or personalization with minimal data
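For the LLM case, "adaptation" is just prompt construction. A minimal sketch of assembling an in-context few-shot prompt, assuming a simple sentiment-classification template (the examples, labels, and template wording are illustrative; the model API call is omitted):

```python
# Hedged sketch: build a few-shot (in-context) prompt for an LLM.
# Support examples play the role of the support set; the query is the
# unlabeled input the model should complete. All text is illustrative.
support = [
    ("The battery died after an hour.", "negative"),
    ("Setup was effortless and fast.", "positive"),
    ("Screen is bright and crisp.", "positive"),
]

def build_prompt(support, query):
    """Concatenate labeled support examples, then the unlabeled query."""
    lines = ["Classify the sentiment of each review."]
    for text, label in support:
        lines.append(f"Review: {text}\nSentiment: {label}")
    lines.append(f"Review: {query}\nSentiment:")  # model completes the label
    return "\n\n".join(lines)

prompt = build_prompt(support, "The keyboard keys stick constantly.")
```

No weights change here: the "support set" conditions the model purely through the context window, which is what makes in-context few-shot so cheap to deploy.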