# 100 Days of Deep Learning - Part A: Week 1, Day 3: NIN


## Introduction

Network In Network (NIN) is a paper from ICLR 2014. It improves on the traditional CNN design: it greatly reduces the number of parameters while further raising accuracy on datasets such as CIFAR-10 and CIFAR-100, and it had a significant influence on later model designs.

## Innovations

1. Mlpconv layer: a convolution followed by an MLP, implemented as $1\times1$ convolutions
2. Global Average Pooling (GAP)
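Global Average Pooling replaces the large fully connected classifier head: each channel's spatial map is averaged down to a single value, so the pooling layer itself has no learnable parameters. A minimal sketch (the batch size and feature-map shape here are illustrative, not from the paper):

```python
import torch
import torch.nn as nn

# Hypothetical final feature map: batch of 2, 1000 channels, 6x6 spatial.
feat = torch.randn(2, 1000, 6, 6)

# Global Average Pooling: average each channel's 6x6 map to one value.
gap = nn.AdaptiveAvgPool2d(1)
out = gap(feat).flatten(1)   # shape (2, 1000): one score per channel, zero parameters

print(out.shape)  # torch.Size([2, 1000])
```

By contrast, a fully connected layer mapping the same flattened $6\times6\times1000$ features to 1000 outputs would need tens of millions of weights; GAP contributes none, which is where much of NIN's parameter saving comes from.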

## Network Structure

The CCCP (cascaded cross-channel parametric pooling) layers are $1\times1$ convolutions; a kernel entry of $1\times1\times96$ means 96 filters of size $1\times1$:

| Layer | Input | $1\times1$ kernels | Output |
|-------|-------|--------------------|--------|
| CCCP1 | $55\times55\times96$ | $1\times1\times96$ | $55\times55\times96$ |
| CCCP2 | $55\times55\times96$ | $1\times1\times96$ | $55\times55\times96$ |
| CCCP3 | $27\times27\times256$ | $1\times1\times256$ | $27\times27\times256$ |
| CCCP4 | $27\times27\times256$ | $1\times1\times256$ | $27\times27\times256$ |
| CCCP5 | $13\times13\times384$ | $1\times1\times384$ | $13\times13\times384$ |
| CCCP6 | $13\times13\times384$ | $1\times1\times384$ | $13\times13\times384$ |
| CCCP7 | $6\times6\times1024$ | $1\times1\times1024$ | $6\times6\times1024$ |
| CCCP8 | $6\times6\times1024$ | $1\times1\times1000$ | $6\times6\times1000$ |
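The shape behavior in the table is easy to check: a $1\times1$ convolution never changes the spatial size, and only the number of filters changes the channel count. A quick sketch for CCCP1 and CCCP8:

```python
import torch
import torch.nn as nn

# CCCP1 from the table: 55x55x96 input, 96 1x1 kernels -> 55x55x96 output.
x = torch.randn(1, 96, 55, 55)
cccp1 = nn.Conv2d(96, 96, kernel_size=1, stride=1)
y = cccp1(x)
print(y.shape)  # torch.Size([1, 96, 55, 55]) -- spatial size unchanged

# CCCP8 maps channels to the class count: 1024 -> 1000, spatial size kept.
cccp8 = nn.Conv2d(1024, 1000, kernel_size=1, stride=1)
z = cccp8(torch.randn(1, 1024, 6, 6))
print(z.shape)  # torch.Size([1, 1000, 6, 6])
```

After CCCP8, global average pooling over the $6\times6$ maps yields the 1000 class scores directly.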

The NIN idea can also be conveniently incorporated into other network architectures.

## Source Code

```python
import torch
import torch.nn as nn

NUM_CLASSES = 10  # CIFAR-10

class NIN(nn.Module):
    def __init__(self, num_classes=NUM_CLASSES):
        super(NIN, self).__init__()
        self.features = nn.Sequential(
            # Block 1: a spatial conv followed by two 1x1 "mlpconv" convs.
            # The leading spatial conv of each block was missing from the
            # original listing; the kernel sizes and paddings here are
            # plausible reconstructions for 32x32 inputs, not the author's
            # exact values.
            nn.Conv2d(3, 64, kernel_size=5, stride=1, padding=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=1, stride=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=1, stride=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2),   # 32x32 -> 16x16

            # Block 2
            nn.Conv2d(64, 192, kernel_size=5, stride=1, padding=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(192, 192, kernel_size=1, stride=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(192, 192, kernel_size=1, stride=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2),   # 16x16 -> 8x8

            # Block 3: global average pooling instead of fully connected layers.
            nn.Conv2d(192, 384, kernel_size=3, stride=1, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=1, stride=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=1, stride=1),
            nn.ReLU(inplace=True),
            # Kernel size changed from 4 to 8 so the 8x8 maps pool to 1x1
            # and the flattened size matches the classifier below.
            nn.AvgPool2d(kernel_size=8),   # 8x8 -> 1x1
        )

        self.classifier = nn.Linear(256, num_classes)

    def forward(self, x):
        x = self.features(x)
        x = x.view(x.size(0), -1)   # flatten to (N, 256)
        x = self.classifier(x)
        return x


if __name__ == "__main__":
    # Smoke test on a random CIFAR-sized batch.
    model = NIN()
    logits = model(torch.randn(4, 3, 32, 32))
    print(logits.shape)  # torch.Size([4, 10])
```


Best accuracy: 76.570%

Related paper: *Batch-normalized Maxout Network in Network*

## Key Concepts

A $1\times1$ convolution performs a multiply-accumulate over the elements along the channel dimension at each spatial position.
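This can be verified directly: at every pixel, a $1\times1$ convolution is just a weight matrix applied to that pixel's channel vector. A small sketch (the tensor sizes are arbitrary):

```python
import torch
import torch.nn as nn

# out[:, :, i, j] = W @ x[:, :, i, j] + b for a 1x1 convolution.
x = torch.randn(1, 3, 4, 4)               # 3 input channels, 4x4 spatial
conv = nn.Conv2d(3, 2, kernel_size=1)     # 2 output channels

y = conv(x)

# Manual multiply-accumulate over the channel dimension at pixel (0, 0):
W = conv.weight.view(2, 3)                # (out_channels, in_channels)
b = conv.bias
manual = W @ x[0, :, 0, 0] + b

assert torch.allclose(y[0, :, 0, 0], manual, atol=1e-5)
```

So a stack of $1\times1$ convolutions with nonlinearities is exactly an MLP shared across all spatial positions, which is why the mlpconv layer is implemented this way.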