All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the feature maps remain unchanged, so the convolutions in a dense block all use stride 1. Pooling layers are inserted between dense blocks.
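As a minimal sketch of why this works, the snippet below (using numpy and an assumed NCHW layout, with a hypothetical growth rate of 32 channels) applies the standard convolution output-size formula to show that a 3×3 convolution with stride 1 and padding 1 preserves height and width, which is exactly what makes channel-wise concatenation inside the dense block legal:

```python
import numpy as np

def conv_out_hw(h, w, kernel=3, stride=1, padding=1):
    # Standard convolution output-size formula:
    # out = floor((in + 2*padding - kernel) / stride) + 1
    return ((h + 2 * padding - kernel) // stride + 1,
            (w + 2 * padding - kernel) // stride + 1)

# With kernel=3, stride=1, padding=1, spatial size is preserved.
assert conv_out_hw(32, 32) == (32, 32)

x = np.zeros((1, 64, 32, 32))             # NCHW input to the block
new_features = np.zeros((1, 32, 32, 32))  # hypothetical conv output (growth rate 32)

# Because H and W match, concatenating along the channel axis is valid.
dense_out = np.concatenate([x, new_features], axis=1)
print(dense_out.shape)  # (1, 96, 32, 32)
```

With stride 2 the convolution would halve the spatial dimensions and the concatenation would fail, which is why downsampling in DenseNet-style architectures is delegated to the pooling layers between blocks instead.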