Grid-search results over activation-layer pairs (torch.nn.ReLU / torch.nn.GELU) and learning rates. Each row gives the model's reported "success" triple verbatim; the log does not define the three values.

activations     lr      success (three values, as logged)
(ReLU, ReLU)    0.0001  (0.44952931636286714, 0.6824383880407573, 0.788915135916511)
(ReLU, ReLU)    0.0005  (0.5080210132919649, 0.7299298381694461, 0.8241018227973064)
(ReLU, ReLU)    0.001   (0.5215950357860593, 0.7354299615696506, 0.826111483270458)
(ReLU, ReLU)    0.005   (0.5230758382399605, 0.7383563092761697, 0.8298840038077777)
(ReLU, ReLU)    0.01    (0.5206783485526919, 0.7364171632055847, 0.8278390861333428)
(ReLU, ReLU)    0.05    (0.12682015301625357, 0.29884003807777737, 0.45160949123858546)
(ReLU, GELU)    0.0001  (0.44251313330747805, 0.6765504354264359, 0.7860240454112752)
(ReLU, GELU)    0.0005  (0.5103127313753835, 0.7293304657476289, 0.8237492507844727)
(ReLU, GELU)    0.001   (0.5211366921693756, 0.7379332228607693, 0.8288968021718436)
(ReLU, GELU)    0.005   (0.5246271550964284, 0.739942883333921, 0.8305538906321617)
(ReLU, GELU)    0.01    (0.5214892641822092, 0.7391319677044036, 0.8297077178013609)
(ReLU, GELU)    0.05    (0.1655325600253852, 0.3544759017029228, 0.495469449635088)
(GELU, ReLU)    0.0001  (0.44706131227303175, 0.6806755279765893, 0.7906427387793957)
(GELU, ReLU)    0.0005  (0.5120050770369848, 0.7312343546169305, 0.8229735923562388)
(GELU, ReLU)    0.001   (0.5179282868525896, 0.7381800232697528, 0.8289673165744104)
(GELU, ReLU)    0.005   (0.5234636674540775, 0.7421640870147728, 0.8307654338398618)
(GELU, ReLU)    0.01    (0.5197264041180412, 0.7384268236787364, 0.8286500017628601)
(GELU, ReLU)    0.05    (0.12551563656876918, 0.29757077883157634, 0.45034023199238443)
(GELU, GELU)    0.0001  (0.4493530303564503, 0.683284560871558, 0.7907837675845292)
(GELU, GELU)    0.0005  (0.5151077107499207, 0.733808130310616, 0.8255121108486408)
(GELU, GELU)    0.001   (0.5195148609103409, 0.7389204244967035, 0.8294961745936608)
(GELU, GELU)    0.005   (0.5214892641822092, 0.7401896837429045, 0.8302365758206114)
(GELU, GELU)    0.01    (0.5198674329231746, 0.7398371117300708, 0.8258294256601911)
(GELU, GELU)    0.05    (0.3762648520960406, 0.6283538412720798, 0.7500617001022459)

For every activation pair, all three values peak at learning rate 0.005, and every pair degrades sharply at 0.05, with (GELU, GELU) degrading by far the least.
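For context, below is a minimal sketch of the kind of grid search that could have emitted lines like these. Only the swept grid (ReLU/GELU pairs, six learning rates) and the print format are taken from the log itself; the model shape, the synthetic data, the Adam optimizer, the training length, and the top-k reading of the "success" triple are all assumptions made purely for illustration.

# Sketch of a sweep that would produce log lines in the format above.
# Everything except the swept grid and the print format is an assumption.
import itertools

import torch
import torch.nn as nn


def make_model(act1, act2):
    # Two hidden layers whose activation classes are the swept pair.
    return nn.Sequential(
        nn.Linear(32, 64), act1(),
        nn.Linear(64, 64), act2(),
        nn.Linear(64, 10),
    )


def evaluate(model, x, y):
    # Hypothetical stand-in for the "success" triple; the log never defines
    # the three values (top-1/2/3 accuracy is one plausible reading).
    with torch.no_grad():
        topk = model(x).topk(3, dim=1).indices   # (N, 3) predicted labels
        hits = topk == y.unsqueeze(1)            # (N, 3) boolean matches
        return tuple(
            hits[:, :k].any(dim=1).float().mean().item() for k in (1, 2, 3)
        )


if __name__ == "__main__":
    # Synthetic stand-in data; the real dataset is not shown in the log.
    x, y = torch.randn(512, 32), torch.randint(0, 10, (512,))
    loss_fn = nn.CrossEntropyLoss()
    for act1, act2 in itertools.product([nn.ReLU, nn.GELU], repeat=2):
        for lr in (0.0001, 0.0005, 0.001, 0.005, 0.01, 0.05):
            model = make_model(act1, act2)
            opt = torch.optim.Adam(model.parameters(), lr=lr)
            for _ in range(200):                 # short illustrative loop
                opt.zero_grad()
                loss_fn(model(x), y).backward()
                opt.step()
            # Printing the activation *classes* (not instances) reproduces the
            # "<class 'torch.nn.modules.activation.ReLU'>" strings in the log.
            print(f"Model with activation layers ({act1}, {act2}) "
                  f"and learning rate {lr} had success of {evaluate(model, x, y)}")

The exact numbers will of course differ from the log, since the data, model, training schedule, and seed here are invented; the sketch only shows how such a sweep and its log format fit together.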