
Commit 37ccf1e

arthurdouillard authored and committed
Add docs about baselines and pretrained models.
PiperOrigin-RevId: 494986022 Change-Id: I2cdce164b17b0c6113da62ab0ea02e07a3f0612a
1 parent 6de783b commit 37ccf1e

File tree

1 file changed: +21 −1 lines


README.md

Lines changed: 21 additions & 1 deletion
@@ -209,7 +209,27 @@ Then, we launch the example learner:
 Note that the stream `DEBUG` is already specified in the config
 `./experiments_jax/config/example.py`.

-## 4. Code paths
+## 4. Baselines
+
+We provide several baselines, defined in the `learners/` directory, with configurations
+in the `configs/` directory. Note that the same approach may have multiple configurations.
+
+As a reminder, to run configuration `configs/X.py`, do `./launch_local.sh jax X.py`.
+
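The launch convention above can be exercised for each baseline config in one loop; this sketch uses `echo` so it runs without the launcher script itself (drop the `echo` to actually launch):

```shell
# Print the launch command for each baseline config, assuming the
# `./launch_local.sh jax X.py` convention described above.
for cfg in finetuning_ind.py finetuning_prev.py finetuning_dknn.py; do
  echo "./launch_local.sh jax ${cfg}"
done
```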
+We provide the following baselines:
+- **Independent**, in `configs/finetuning_ind.py`, where each dataset is learned by an independent model.
+- **Previous**, in `configs/finetuning_prev.py`, where we learn each dataset sequentially and initialize its parameters from the parameter vector learned on the previous task.
+- **Dynamic**, in `configs/finetuning_dknn.py`, where the initialization of task T is chosen among the models trained on the dataset most similar to the current one. This baseline performs hyperparameter tuning while learning the task, following the protocol described in our tech report.
+
+Variants are also provided, such as the cheaper configuration `configs/cheap_finetuning_dknn.py`, which uses a smaller network and fewer hyperparameter-search trials. These variants are the best entry point for people with access to only one or a few GPUs.
+
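The "most similar past task" choice behind the Dynamic baseline can be sketched as a nearest-neighbour lookup. This is a minimal illustration, not the repository's implementation: it assumes each task is summarized by a feature vector and that cosine similarity is the metric; the names `cosine` and `pick_init_task` are hypothetical.

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def pick_init_task(current_embedding, past_embeddings):
    """Return the index of the most similar past task,
    whose model would seed the initialization of the new task."""
    sims = [cosine(current_embedding, e) for e in past_embeddings]
    return max(range(len(sims)), key=sims.__getitem__)

past = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
print(pick_init_task([0.6, 0.8], past))  # → 2
```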
+It is also possible to run a pretrained model on the Nevis stream. First, train
+your own pretrained model; for ImageNet, for example, run the configuration `configs/pretrain_imagenet.py`. Collect the resulting checkpoint (see the configuration file for where it is saved).
+Then, use this checkpoint with `configs/finetuning_ind_pretrained.py`.
+
+## 5. Code paths

 The code is structured as follows:
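The pretrain-then-finetune handoff added above can be sketched as a save/restore of a parameter dictionary. This is a toy illustration under stated assumptions, not the repository's checkpoint format: it assumes a checkpoint is a pickled dict of named parameters, and the key names here are hypothetical.

```python
import os
import pickle
import tempfile

# "Pretraining" phase: save the resulting parameters as a checkpoint
# (stand-in for the checkpoint written by configs/pretrain_imagenet.py).
pretrained = {"backbone/conv1": [0.1, 0.2], "head/logits": [0.0, 0.0]}
ckpt_path = os.path.join(tempfile.mkdtemp(), "pretrain_imagenet.pkl")
with open(ckpt_path, "wb") as f:
    pickle.dump(pretrained, f)

# Finetuning phase: restore the backbone but drop the task-specific head,
# which is reinitialized for each dataset in the stream.
with open(ckpt_path, "rb") as f:
    restored = pickle.load(f)
init_params = {k: v for k, v in restored.items() if not k.startswith("head/")}
print(sorted(init_params))  # → ['backbone/conv1']
```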
