Assign different weights to different style layers #65
@DmitryUlyanov Currently a single weight is assigned to all style layers during the training process. Is it possible to assign a different weight to each style layer?
For example, if "style_layers" = "relu1_2,relu2_2,relu3_2,relu4_2", is it possible to use "style_weight" = "2,4,5,3" so that the four layers (relu1_2, relu2_2, relu3_2, relu4_2) are weighted 2, 4, 5, and 3, respectively, during training? Thanks.
Actually, the relu layers are just the activation layers you include in the overall image reconstruction, and you cannot set a weight for each of them independently. Alternatively, you can experiment with different relu layers, e.g. relu1_1, relu2_1, relu3_2, relu4_3, relu5_2, etc.
@Vladkryvoruchko Thanks for your response. I asked this question because the similar "fast-neural-style" project by Justin Johnson does allow specifying individual weights for different style layers. Its documentation explains the parameter as follows: -style_weights: Weights to use for style reconstruction terms. Either a single number, in which case the same weight is used for all style reconstruction terms, or a comma-separated list of weights of the same length as -style_layers.
Hey @michaelhuang74, it is not possible with this code; to be honest, I cut it for simplicity. Did you find cases where these weights are important?
@DmitryUlyanov Thanks for the response. I asked this question out of curiosity. Sometimes I found it difficult to get good results by simply excluding some layers, so I thought it might help to include those layers but with lower weights.
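For readers wanting to prototype the idea discussed in this thread, here is a minimal sketch of a per-layer weighted style loss. It is not the repo's code; the function names (`gram_matrix`, `weighted_style_loss`) and the use of NumPy arrays as stand-ins for feature maps are illustrative assumptions. The style loss at each layer is the mean squared difference between Gram matrices, and each layer's term is scaled by its own weight (e.g. 2, 4, 5, 3 for relu1_2, relu2_2, relu3_2, relu4_2):

```python
import numpy as np

def gram_matrix(feat):
    # feat: (C, H, W) feature map; flatten spatially and normalize
    c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return (f @ f.T) / (c * h * w)

def weighted_style_loss(feats, target_feats, layer_weights):
    # Sum of per-layer Gram-matrix MSEs, each scaled by that layer's weight.
    # Passing equal weights reproduces the single-weight behavior of this repo.
    total = 0.0
    for f, t, w in zip(feats, target_feats, layer_weights):
        g, g_target = gram_matrix(f), gram_matrix(t)
        total += w * np.mean((g - g_target) ** 2)
    return total

# Illustrative usage with random "features" for four style layers
rng = np.random.default_rng(0)
style_feats = [rng.standard_normal((8, 16, 16)) for _ in range(4)]
gen_feats = [rng.standard_normal((8, 16, 16)) for _ in range(4)]
loss = weighted_style_loss(gen_feats, style_feats, [2, 4, 5, 3])
```

Identical feature maps give zero loss, so the weights only change how strongly each layer's mismatch contributes to the gradient, not the optimum itself.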