Lightweight Convolutional Neural Networks (LWT-CNNs) generally refer to CNNs with relatively few parameters. They have attracted considerable research attention and developed rapidly over the last decade, owing to advantages such as feasible embedded deployment, real-time inference, and lower communication overhead.
The first lightweight CNN dates back to SqueezeNet, proposed at ICLR 2017.
The figure below shows the development of lightweight CNNs over the years.
The following table summarizes the structure of the core module in each lightweight network. The usage of batch normalization (BN) and activation functions (act) is indicated for each module. In the Structure column, k is the kernel size, nc↓/nc↑/nc= denote that the layer decreases, increases, or preserves the channel count, and pw/dw/gconv stand for pointwise, depthwise, and grouped convolution.
| LWT-CNN | Module | Structure | BN | act |
|---|---|---|---|---|
| SqueezeNet | Fire | squeeze (k=1, nc↓) + expand (k=1, nc↑ and k=3, nc↑, concat) | ✗ | ReLU |
| Xception | SeparableConv2d block | act + SeparableConv2d (pw k=1, nc↑ + dw k>1, nc=) + BN | ✓ | ✓ |
| MobileNet | Block | dw (k=3, nc=) + pw (k=1, nc↑) | ✓ | ✓ |
| ShuffleNet | ShuffleUnit | gconv (k=1, nc↓) + channel shuffle + dwconv (k=3, nc=) + gconv (k=1, nc↑) + add/concat | ✓ / − / ✓ / ✓ / ✗ | ReLU / − / ✗ / ✗ / ReLU |
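To make the parameter savings behind these modules concrete, here is a minimal sketch in plain Python. The helpers `conv_params` and `channel_shuffle` are illustrative, not from any library: the first counts convolution weights (bias omitted) to compare a standard 3×3 convolution against a MobileNet-style depthwise + pointwise pair, and the second mimics ShuffleNet's channel shuffle via the usual reshape-transpose-flatten trick on a list of channel indices.

```python
def conv_params(k, c_in, c_out, groups=1):
    """Weight count of a k x k convolution from c_in to c_out channels.

    With grouped convolution, each output channel only sees c_in // groups
    inputs; bias terms are omitted for simplicity.
    """
    return k * k * (c_in // groups) * c_out


def channel_shuffle(channels, groups):
    """Reorder channels as in ShuffleNet: reshape to (groups, n // groups),
    transpose, and flatten, so grouped convolutions can mix information
    across groups."""
    per_group = len(channels) // groups
    return [channels[g * per_group + i]
            for i in range(per_group)
            for g in range(groups)]


# Standard 3x3 convolution, 64 -> 128 channels (example sizes).
standard = conv_params(3, 64, 128)                  # 73,728 weights

# MobileNet block: depthwise 3x3 (groups = channels) + pointwise 1x1.
depthwise = conv_params(3, 64, 64, groups=64)       # 576 weights
pointwise = conv_params(1, 64, 128)                 # 8,192 weights
separable = depthwise + pointwise                   # 8,768 weights

print(f"standard: {standard}, separable: {separable}, "
      f"ratio: {standard / separable:.1f}x")

# Channel shuffle on 6 channels in 2 groups: [0,1,2 | 3,4,5] -> interleaved.
print(channel_shuffle(list(range(6)), 2))           # [0, 3, 1, 4, 2, 5]
```

For these example sizes the depthwise-separable pair uses roughly 8× fewer weights than the standard convolution, which is the core reason the modules in the table above are "lightweight".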