MinMaxScaler produces NaN for constant values
Transformations based on the MinMaxScaler (e.g. `ZeroToOne`) produce NaN values (division by zero) if the variable has constant values (min == max). In CAE, this only raises a warning during scaling, but later produces invalid losses (`loss=nan.0`) during training. CVAE fails altogether:
```
---> 16 cae.fit(datamodule, name_run='', max_epochs=5, accelerator='cpu', flag_wandb=False)

File ~\CODE\aixd\src\aixd\mlmodel\architecture\cond_ae_model.py:675, in CondAEModel.fit(self, datamodule, name_run, max_epochs, callbacks, loggers, accelerator, flag_early_stop, criteria, flag_wandb, wandb_entity, **kwargs)
    666 # Create a model_trainer object and fit the model
    667 self.model_trainer = pl.Trainer(
    668     accelerator=accelerator,
    669     max_epochs=max_epochs,
    (...)
    673     **kwargs,
    674 )
--> 675 self.model_trainer.fit(self, datamodule=datamodule)
    677 if flag_wandb:
    678     import wandb

File ..\Anaconda3\envs\aixd-dev\lib\site-packages\pytorch_lightning\trainer\trainer.py:538, in Trainer.fit(self, model, train_dataloaders, val_dataloaders, datamodule, ckpt_path)
    536 self.state.status = TrainerStatus.RUNNING
    537 self.training = True
--> 538 call._call_and_handle_interrupt(
    539     self, self._fit_impl, model, train_dataloaders, val_dataloaders, datamodule, ckpt_path
    540 )
...
        [nan, nan, nan],
        [nan, nan, nan],
        [nan, nan, nan],
        [nan, nan, nan],
        [nan, nan, nan]], grad_fn=<AddmmBackward0>)
```
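The failure mode can be reproduced outside the library with a minimal sketch (the function names below are illustrative, not the aixd API): a plain min-max scaling yields 0/0 = NaN on a constant column, and one common guard is to replace a zero range with 1 so constant columns map to 0 instead of NaN.

```python
import numpy as np

def minmax_scale(x):
    """Plain min-max scaling to [0, 1]; produces NaN on constant columns."""
    lo = x.min(axis=0)
    rng = x.max(axis=0) - lo
    return (x - lo) / rng  # 0/0 -> NaN when min == max

def minmax_scale_guarded(x):
    """Guarded variant: constant columns are mapped to 0 instead of NaN."""
    lo = x.min(axis=0)
    rng = x.max(axis=0) - lo
    rng = np.where(rng == 0, 1.0, rng)  # avoid division by zero
    return (x - lo) / rng

const = np.array([[3.0], [3.0], [3.0]])  # constant variable (min == max)
print(minmax_scale(const))          # all NaN
print(minmax_scale_guarded(const))  # all zeros
```

A guard like this (or rejecting constant columns outright with a hard error instead of a warning) would keep the NaNs from silently propagating into the loss.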
Related to #161 (closed)