The output of a convolutional layer is normally passed through the ReLU activation function to introduce non-linearity into the model. ReLU takes the feature map and replaces all of the negative values with zero. A VGG block stacks several 3x3 convolutions, each padded by 1 so the spatial size of the feature map is preserved, before a pooling layer reduces the resolution.
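As a minimal sketch of these two ideas, the snippet below builds a VGG-style block in PyTorch: each 3x3 convolution with padding 1 is followed by a ReLU, and a 2x2 max-pool closes the block. The helper name `vgg_block` and the channel/size choices are illustrative assumptions, not part of the original text.

```python
import torch
import torch.nn as nn

def vgg_block(in_channels, out_channels, num_convs=2):
    """Stack of 3x3 convolutions (padding 1), each followed by ReLU,
    ending with a 2x2 max-pool that halves the spatial resolution."""
    layers = []
    for _ in range(num_convs):
        layers.append(nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1))
        layers.append(nn.ReLU(inplace=True))  # replace negative activations with zero
        in_channels = out_channels
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

# Illustrative usage: a feature map passes through the block.
x = torch.randn(1, 3, 64, 64)            # batch of one 3-channel 64x64 input
block = vgg_block(in_channels=3, out_channels=64)
y = block(x)
print(y.shape)                            # torch.Size([1, 64, 32, 32])
print((y < 0).any().item())               # False: ReLU zeroed all negative values
```

Note how padding of 1 keeps the 64x64 resolution unchanged through the convolutions; only the max-pool at the end halves it to 32x32.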