Pretraining vs Fine-tuning

This slide compares pretraining and fine-tuning across data requirements, training objective, compute needs, advantages, and disadvantages. Pretraining demands massive, diverse data and very high compute to build broad general knowledge; fine-tuning uses smaller, task-specific data and moderate compute to reach high task accuracy, at the cost of some overfitting risk.
| Aspect | Pretraining | Fine-tuning |
| --- | --- | --- |
| Data Requirements | Massive, diverse data | Smaller, task-specific data |
| Training Objective | General knowledge | Specific tasks |
| Compute Needs | Very high | Moderate |
| Advantages | Broad capabilities | High task accuracy |
| Disadvantages | Costly, less specialized | Overfitting risk |
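The compute asymmetry in the table can be sketched with a deliberately tiny toy: a one-parameter linear model "pretrained" with many SGD steps on a large, broadly distributed dataset, then "fine-tuned" with a small step budget on a nearby task. This is an illustrative stdlib-only sketch, not a real LLM pipeline; the datasets, step counts, and learning rate are all made-up assumptions chosen so the effect is visible.

```python
import random

def sgd(w, data, steps, lr=0.01):
    """Plain SGD on squared error for the 1-parameter model y = w * x."""
    for _ in range(steps):
        x, y = random.choice(data)
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

def mse(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

random.seed(0)

# "Pretraining" data: large and diverse, drawn from the general relation y ~= 3x.
pretrain_data = [(x, 3.0 * x + random.gauss(0, 0.1))
                 for x in (random.uniform(-1, 1) for _ in range(5000))]

# "Fine-tuning" data: small and task-specific, from a nearby relation y ~= 3.2x.
task_data = [(x, 3.2 * x + random.gauss(0, 0.1))
             for x in (random.uniform(-1, 1) for _ in range(20))]

w_pretrained = sgd(0.0, pretrain_data, steps=5000)   # very high compute, general fit
w_finetuned  = sgd(w_pretrained, task_data, steps=100)  # moderate compute, task fit
w_scratch    = sgd(0.0, task_data, steps=100)           # same small budget, no pretraining

# With an equal (small) fine-tuning budget, starting from pretrained weights
# reaches much lower task loss than training from scratch.
print(mse(w_finetuned, task_data) < mse(w_scratch, task_data))
```

The small task set also hints at the overfitting row: with only 20 noisy points, a higher-capacity model fine-tuned too long would start fitting the noise, which is why fine-tuning runs typically use few steps and early stopping.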