Issue while loading the trained checkpoint #55
I am sorry about that; I think at some point there was a version incompatibility between hydra/omegaconf and the checkpointing. Since you are already constructing the module with its parameters, as long as you do not need to update them with the ones stored in the checkpoint, you can comment out the failing line

checkpoint[cls.CHECKPOINT_HYPER_PARAMS_KEY].update(kwargs)

in "/home//virtualenv/luke/lib/python3.8/site-packages/pytorch_lightning/core/saving.py" (line 157 in your traceback), and it should load without issues. I know it's an ugly hack, but it's what I suggest for reloading your checkpoint until I find a proper fix.
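If you would rather not edit the installed package, an alternative is to skip load_from_checkpoint entirely and restore only the weights, since the module is already constructed with the right config, tokenizer, and model. A minimal untested sketch, assuming a standard Lightning checkpoint that stores the weights under "state_dict":

```python
import torch

# Load the Lightning checkpoint and restore the weights directly,
# bypassing load_from_checkpoint and its hyperparameter update.
checkpoint = torch.load(conf.checkpoint_path, map_location="cpu")
pl_module.load_state_dict(checkpoint["state_dict"])
```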
Thanks for the quick fix. I tried several different things to reload the checkpoints but was not successful. If I find a solution, I will also post it here. For now, this quick fix works.
I have a similar issue. Specifically, it tries to open a path that does not exist: I want to use the model saved at timestamp '2024-05-16/18-27-53', yet it sends me to something created at timestamp '2024-05-23/18-38-08'. Do you know what is wrong?
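In case it is the same root cause: Hydra changes the working directory to a fresh timestamped output directory on every run, so a relative checkpoint path resolves inside the new run's directory rather than the one you trained in. A minimal sketch of the usual workaround, assuming the checkpoint path comes from the Hydra config:

```python
from hydra.utils import to_absolute_path

# Resolve the checkpoint path against the original working directory
# instead of Hydra's per-run output directory (outputs/<date>/<time>).
checkpoint_path = to_absolute_path(conf.checkpoint_path)
```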
Hi,
I am trying to test the trained model by loading the checkpoint, but it shows the following error:
Traceback (most recent call last):
  File "test.py", line 119, in main
    train(conf)
  File "test.py", line 101, in train
    pl_module = pl_module.load_from_checkpoint(checkpoint_path=conf.checkpoint_path,config=config, tokenizer = tokenizer, model = model)
  File "/home//virtualenv/luke/lib/python3.8/site-packages/pytorch_lightning/core/saving.py", line 157, in load_from_checkpoint
    checkpoint[cls.CHECKPOINT_HYPER_PARAMS_KEY].update(kwargs)
  File "/usr/lib/python3.8/_collections_abc.py", line 832, in update
    self[key] = other[key]
omegaconf.errors.ConfigKeyError: Key 'config' is not in struct
    full_key: config
    reference_type=Optional[Dict[Union[str, Enum], Any]]
    object_type=dict

Set the environment variable HYDRA_FULL_ERROR=1 for a complete stack trace.
I came across this issue: #47, but the answers did not help.
I tried converting the config to a struct using OmegaConf, but it still does not work.
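Would patching the hyperparameters stored inside the checkpoint itself be the right approach? The struct-mode DictConfig that rejects the 'config' key seems to live in the checkpoint file, not in my own config. An untested sketch of what I mean, assuming Lightning stores them under the "hyper_parameters" key:

```python
import torch
from omegaconf import OmegaConf

# The update() that fails operates on the hparams saved in the
# checkpoint, so relax struct mode there before loading.
checkpoint = torch.load(conf.checkpoint_path, map_location="cpu")
hparams = checkpoint.get("hyper_parameters")
if OmegaConf.is_config(hparams):
    OmegaConf.set_struct(hparams, False)  # allow update() to add new keys
torch.save(checkpoint, conf.checkpoint_path + ".patched")
```

and then pointing load_from_checkpoint at the patched file.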