All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps stay unchanged, so all convolutions within a dense block use stride 1. Pooling layers are inserted between dense blocks for additional downsampling.
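The channel-growth behavior described above can be sketched as follows. This is a toy illustration, not a real DenseNet layer: `conv3x3_same` here is a stand-in that only mixes channels per pixel (a random linear map followed by ReLU) so the example stays dependency-light, but it preserves the key property that height and width never change, which is what makes channel-wise concatenation valid at every layer.

```python
import numpy as np

def conv3x3_same(x, out_channels, rng):
    # Stand-in for a BN + ReLU + stride-1 "same" convolution:
    # a random per-pixel channel mix that preserves H and W.
    c, h, w = x.shape
    weights = rng.standard_normal((out_channels, c))
    y = np.tensordot(weights, x, axes=([1], [0]))  # (out_channels, h, w)
    return np.maximum(y, 0.0)  # ReLU

def dense_block(x, num_layers, growth_rate, rng):
    # Each layer sees the concatenation of all earlier feature maps
    # along the channel axis; spatial size never changes, so the
    # concatenation is always shape-compatible.
    features = x
    for _ in range(num_layers):
        new = conv3x3_same(features, growth_rate, rng)
        features = np.concatenate([features, new], axis=0)
    return features

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8, 8))  # 16 channels, 8x8 spatial size
out = dense_block(x, num_layers=4, growth_rate=12, rng=rng)
print(out.shape)  # channels grow by the growth rate per layer: 16 + 4*12 = 64
```

Note how the channel count grows linearly with depth (here by a growth rate of 12 per layer) while the 8x8 spatial size is untouched; in a real network, the pooling between dense blocks is what reduces it.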