Model Weights
The collection of all learned parameter values in a neural network. Model weights are what you download when you get a pre-trained model — they encode everything the model learned.
Why It Matters
Model weights are the actual "brain" of an AI model. Open-weight models publish these files for anyone to download, while closed-weight models keep them proprietary.
Example
Downloading the roughly 140 GB of weight files for Llama 2 70B, which contain 70 billion floating-point numbers (about two bytes each at 16-bit precision) representing everything the model learned during training.
Think of it like...
Like a brain scan that captures the exact state of every neural connection — the weights are the physical encoding of the model's knowledge.
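The 140 GB figure in the example above follows directly from parameter count times bytes per number. A minimal back-of-the-envelope check (the helper function is illustrative, not from any library):

```python
# Approximate on-disk size of a weight file: parameter count x bytes per value.
PARAMS = 70_000_000_000  # Llama 2 70B: 70 billion parameters

def weight_file_gb(num_params: int, bytes_per_param: int) -> float:
    """Rough size in gigabytes (1 GB = 10^9 bytes)."""
    return num_params * bytes_per_param / 1e9

fp32_gb = weight_file_gb(PARAMS, 4)  # 32-bit floats: 4 bytes each
fp16_gb = weight_file_gb(PARAMS, 2)  # 16-bit floats: 2 bytes each

print(f"fp32: {fp32_gb:.0f} GB")  # fp32: 280 GB
print(f"fp16: {fp16_gb:.0f} GB")  # fp16: 140 GB, matching the example above
```

The same arithmetic explains why quantized versions of the same model (8-bit or 4-bit weights) are proportionally smaller downloads.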
Related Terms
Parameter
Any learnable value in a machine learning model that is adjusted during training. Parameters include weights and biases in neural networks. Model size is often described by parameter count.
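How a parameter count is tallied can be sketched for a small fully connected network: each layer contributes one weight per connection plus one bias per output unit. The layer sizes below are illustrative, not taken from any real model:

```python
# Count parameters (weights + biases) in a small fully connected network.
layer_sizes = [784, 128, 10]  # input -> hidden -> output (illustrative)

total = 0
for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]):
    total += fan_in * fan_out  # weight matrix: one value per connection
    total += fan_out           # bias vector: one value per output unit

print(total)  # 784*128 + 128 + 128*10 + 10 = 101770
```

Scaling the same tally up is how figures like "70 billion parameters" are computed for large models.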
Open Source AI
AI models and tools released with open licenses that allow anyone to use, modify, and distribute them. Open-source AI democratizes access and enables community-driven improvement.
Closed Source AI
AI models where the architecture, weights, and training details are proprietary and not publicly available. Users access them only through APIs or products controlled by the developer.
Pre-training
The initial phase of training a model on a large, general-purpose dataset before specializing it for specific tasks. Pre-training gives the model broad knowledge and capabilities.
Fine-Tuning
The process of taking a pre-trained model and further training it on a smaller, domain-specific dataset to specialize its behavior for a particular task or domain. Fine-tuning adjusts the model's weights to improve performance on the target task.
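The weight-adjustment step at the heart of fine-tuning can be sketched with a one-parameter model: start from a "pre-trained" value and nudge it with gradient descent on a small dataset. The model (y = w * x), the data, and the learning rate are all illustrative:

```python
# Minimal sketch of fine-tuning: a pre-trained weight is further adjusted
# by gradient descent on a small, domain-specific dataset.
w = 2.0  # pretend this weight was learned during pre-training

# Small "domain" dataset whose true relationship is y = 3x.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]

lr = 0.01
for _ in range(200):  # a few fine-tuning steps
    # Gradient of mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # the weight shifts toward the new task

print(round(w, 2))  # converges toward 3.0
```

Real fine-tuning does the same thing across billions of weights at once, often with most layers frozen so only a subset of the weights is adjusted.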