
Assign different weights to different style layers #65

Open
michaelhuang74 opened this issue Jan 19, 2017 · 4 comments

Comments

@michaelhuang74

@DmitryUlyanov Currently, a single weight is assigned to all style layers during training. Is it possible to assign different weights to different style layers?

For example, if "style_layers" = "relu1_2,relu2_2,relu3_2,relu4_2", is it possible to use "style_weight" = "2,4,5,3" so that the weights of the four layers (relu1_2,relu2_2,relu3_2,relu4_2) are 2, 4, 5, 3, respectively, in training? Thanks.
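The weighting scheme described above can be sketched in a few lines. This is a hypothetical illustration (the actual texture_nets code accepts only a single `style_weight`); the function names `parse_layer_weights` and `weighted_style_loss` are made up for this example, and the per-layer losses would in practice come from Gram-matrix differences:

```python
def parse_layer_weights(style_weight_str, style_layers):
    """Parse a comma-separated weight string into one weight per style layer."""
    weights = [float(w) for w in style_weight_str.split(",")]
    assert len(weights) == len(style_layers), "need one weight per style layer"
    return dict(zip(style_layers, weights))

def weighted_style_loss(per_layer_losses, layer_weights):
    """Combine per-layer style losses using the individual layer weights."""
    return sum(layer_weights[name] * loss
               for name, loss in per_layer_losses.items())

layers = ["relu1_2", "relu2_2", "relu3_2", "relu4_2"]
layer_weights = parse_layer_weights("2,4,5,3", layers)

# Dummy per-layer loss values standing in for real Gram-matrix losses.
losses = {"relu1_2": 0.1, "relu2_2": 0.2, "relu3_2": 0.3, "relu4_2": 0.4}
total = weighted_style_loss(losses, layer_weights)  # 2*0.1 + 4*0.2 + 5*0.3 + 3*0.4
```

With a single shared weight, all four terms would instead be scaled identically, which is what the current code does.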

@Vladkryvoruchko

Vladkryvoruchko commented Jan 19, 2017

Actually, the relu layers are just activation layers that you include in the overall image reconstruction, and you cannot set a weight for each of them independently. Alternatively, you can experiment with different combinations of relu layers, like relu1_1,relu2_1,relu3_2,relu4_3,relu5_2, etc., and see the output results.
It can even be just relu1_1,relu4_4.

@michaelhuang74
Author

@Vladkryvoruchko Thanks for your response.

I asked this question because the similar "fast-neural-style" project by Justin Johnson allows specifying individual weights for different style layers. The following is the explanation of the parameter in fast-neural-style.

-style_weights: Weights to use for style reconstruction terms. Either a single number, in which case the same weight is used for all style reconstruction terms, or a comma-separated list of weights of the same length as -style_layers.
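The behavior described for `-style_weights` (a single number broadcast to all layers, or a list matched one-to-one against `-style_layers`) can be sketched as follows. This is a hedged reimplementation of the documented behavior, not the actual fast-neural-style (Lua) code; the function name is hypothetical:

```python
def expand_style_weights(style_weights_str, style_layers):
    """Expand -style_weights as documented: a single number applies to all
    style layers; otherwise the comma-separated list must have the same
    length as -style_layers."""
    parts = [float(w) for w in style_weights_str.split(",")]
    if len(parts) == 1:
        return parts * len(style_layers)
    if len(parts) != len(style_layers):
        raise ValueError("-style_weights must be a single number or match "
                         "the length of -style_layers")
    return parts

layers = ["relu1_2", "relu2_2", "relu3_2", "relu4_2"]
expand_style_weights("5.0", layers)      # one weight broadcast to all layers
expand_style_weights("2,4,5,3", layers)  # one weight per layer
```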

@DmitryUlyanov
Owner

Hey @michaelhuang74, it is not possible with this code; to be honest, I cut it for simplicity. Did you find any cases where these weights are important?

@michaelhuang74
Author

@DmitryUlyanov Thanks for the response. I asked this question out of curiosity. Sometimes I found it difficult to get good results by simply excluding some layers, so I thought it might be good to include those layers but with lower weights.
