Question about GAN loss #3

Open
Nevermetyou65 opened this issue Nov 30, 2021 · 0 comments

Comments


Nevermetyou65 commented Nov 30, 2021

Hello Sir, thanks for a very useful repository about GANs.

Would you mind clarifying something about the GAN loss? It's about the sign of the value returned by d_loss_fn and g_loss_fn in this snippet.

import tensorflow as tf

def get_loss_fn():
    def d_loss_fn(real_logits, fake_logits):
        # discriminator loss: maximize log D(x) + log(1 - D(G(z))), i.e. minimize the negative mean
        return -tf.reduce_mean(tf.math.log(real_logits + 1e-10) + tf.math.log(1. - fake_logits + 1e-10))

    def g_loss_fn(fake_logits):
        # non-saturating generator loss: minimize -log D(G(z))
        return -tf.reduce_mean(tf.math.log(fake_logits + 1e-10))

    return d_loss_fn, g_loss_fn

Is this just binary cross-entropy? If it is, can I use the loss defined in DCGAN instead?
And did you put the minus sign there so that the returned value is positive, am I correct?
I find that your implementation is the closest to the one in the papers, but it differs slightly in the sign of the value.
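For reference, here is a rough sketch of what I mean by the DCGAN-style binary cross-entropy loss. This is only my understanding, written with tf.keras.losses.BinaryCrossentropy and assuming the discriminator outputs sigmoid probabilities in (0, 1); the function names are my own.

import tensorflow as tf

# Sketch of the DCGAN-style loss using Keras BCE (assumes probabilities, not raw logits).
bce = tf.keras.losses.BinaryCrossentropy(from_logits=False)

def d_loss_bce(real_output, fake_output):
    # real samples get label 1, fake samples get label 0
    real_loss = bce(tf.ones_like(real_output), real_output)
    fake_loss = bce(tf.zeros_like(fake_output), fake_output)
    return real_loss + fake_loss

def g_loss_bce(fake_output):
    # generator wants fakes to be classified as real (label 1)
    return bce(tf.ones_like(fake_output), fake_output)

If I understand correctly, this should give the same positive values as your d_loss_fn and g_loss_fn, up to the epsilon handling.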
