CustomDropout is plain dropout — its learning curve came out identical to nn.Dropout's.
ChannelDropout is the modified version: it drops entire channels at once instead of individual activations.
As for performance... dropping both into a simple CNN, neither did particularly well, hmm...
import torch
import torch.nn as nn

class CustomDropout(nn.Module):
    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        if self.training:
            # Inverted dropout: zero each element with probability p,
            # then scale survivors by 1/(1-p) so the expected activation
            # is unchanged between train and eval.
            mask = (torch.rand_like(x) > self.p).float()
            return x * mask / (1 - self.p)
        else:
            return x
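As a quick sanity check (my addition, not from the original post), one can verify the two properties that make this match nn.Dropout: in eval mode it is the identity, and in train mode surviving values are scaled to 1/(1-p) so the mean stays close to the input's.

```python
import torch
import torch.nn as nn

class CustomDropout(nn.Module):
    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        if self.training:
            # Inverted dropout: mask then rescale by 1/(1-p).
            mask = (torch.rand_like(x) > self.p).float()
            return x * mask / (1 - self.p)
        return x

torch.manual_seed(0)
x = torch.ones(10000)
drop = CustomDropout(p=0.5)

# Eval mode: identity, exactly like nn.Dropout.
drop.eval()
assert torch.equal(drop(x), x)

# Train mode: outputs are either 0 or 1/(1-p) = 2.0,
# and the mean stays near the input mean of 1.0.
drop.train()
y = drop(x)
print(y.unique().tolist())
print(y.mean().item())
```

With 10,000 elements the empirical mean lands very close to 1.0, which is why the learning curve tracks nn.Dropout's.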
class ChannelDropout(nn.Module):
    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        if self.training:
            # Drop whole channels: a mask of shape (1, C, 1, 1) is broadcast
            # over the batch and spatial dims, so every sample in the batch
            # shares the same set of dropped channels.
            drop = (torch.rand(1, x.shape[1], 1, 1, device=x.device) > self.p).float()
            return x * drop / (1 - self.p)
        else:
            return x
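For comparison (my addition, not from the original post): PyTorch already ships nn.Dropout2d, which also zeroes whole channels but draws an independent mask per sample, i.e. effectively shape (N, C, 1, 1) rather than the (1, C, 1, 1) mask above that is shared across the batch. A minimal sketch of that behavior:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.ones(4, 8, 2, 2)  # (N, C, H, W)

# nn.Dropout2d: each sample gets its own channel mask.
d2d = nn.Dropout2d(p=0.5).train()
y = d2d(x)

# Within one sample, a channel is either fully kept (scaled to 2.0,
# summing to 2*2*2.0 = 8.0 over H x W) or fully zeroed.
per_channel = y.sum(dim=(2, 3))  # (N, C)
assert set(per_channel.unique().tolist()) <= {0.0, 8.0}

# Unlike the (1, C, 1, 1) mask above, the kept/dropped pattern
# can differ from sample to sample.
print(per_channel)
```

Whether the batch-shared mask of ChannelDropout hurts training versus per-sample masks is exactly the kind of thing the simple-CNN experiment above would surface.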
