Loss function - can we modify it?

Are we allowed to modify the loss function completely, or can we only tune its parameters (beta, wx, wy)?

Hello @gru.dam
Thank you for your question.
Unfortunately, completely overhauling the loss function is not allowed. However, its parameters, such as beta, wx, and wy, provide a way to fine-tune the behavior and characteristics of the loss function within the defined framework.
Team Xeek
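
To make the scope concrete, below is a minimal, hypothetical sketch of how parameters like beta, wx, and wy could enter an L1-style loss in PyTorch. The class name, the use of torch.nn.SmoothL1Loss, and the (batch, 2) x/y layout are illustrative assumptions only, not the challenge's actual loss definition.

```python
import torch
import torch.nn as nn

class WeightedSmoothL1Loss(nn.Module):
    """Hypothetical sketch: an L1-style loss exposing beta, wx, wy as tunable knobs."""

    def __init__(self, beta: float = 1.0, wx: float = 1.0, wy: float = 1.0):
        super().__init__()
        # beta controls where SmoothL1Loss switches between L2-like and L1-like behavior
        self.base = nn.SmoothL1Loss(beta=beta, reduction="none")
        self.wx = wx  # weight on the x component
        self.wy = wy  # weight on the y component

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # pred/target assumed shape (batch, 2): column 0 = x, column 1 = y
        per_element = self.base(pred, target)
        weighted = self.wx * per_element[:, 0] + self.wy * per_element[:, 1]
        return weighted.mean()

# Example usage with random tensors, just to show the knobs being tuned
pred, target = torch.randn(8, 2), torch.randn(8, 2)
criterion = WeightedSmoothL1Loss(beta=0.5, wx=1.0, wy=2.0)
print(criterion(pred, target))
```

The point of the sketch is that changing beta, wx, and wy reshapes how errors are penalized (e.g. weighting y errors more heavily) without swapping out the loss itself.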

Are we allowed to use torch.nn.MSELoss or any other loss alongside, or instead of, torch.nn.L1Loss?


Hello @jc138691

Unfortunately, that is not allowed either. The loss function is meant to be tuned only through the parameters beta, wx, and wy.

Team Xeek


So basically, the only things we can influence are the hyperparameters, the initial model parameters, and the exact shape of the Encoder and Decoder. Is that right?

Hello @gru.dam

Totally correct!
Team Xeek

Thank you for clarifying and confirming.
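
For completeness, here is a minimal sketch of the kind of Encoder/Decoder shape choices that, per the exchange above, are left to participants. The layer sizes, depths, and layer types below are placeholders, not the challenge's actual template.

```python
import torch.nn as nn

class Encoder(nn.Module):
    """Illustrative encoder; its width, depth, and layer types are the free choices."""
    def __init__(self, in_dim: int = 64, hidden: int = 128, latent: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, latent),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Illustrative decoder mirroring the encoder; its exact shape is likewise open."""
    def __init__(self, latent: int = 16, hidden: int = 128, out_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, z):
        return self.net(z)

# The two halves can be chained however the competition harness expects,
# e.g. model = nn.Sequential(Encoder(), Decoder())
```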