
Deep learning for NeuroImaging in Python.

Note

This page is a reference documentation. It only explains the class signature, and not how to use it. Please refer to the gallery for the big picture.

class nidl.volume.backbones.densenet3d.DenseNet(growth_rate: int = 32, block_config: tuple[int, int, int, int] = (3, 12, 24, 16), num_init_features: int = 64, bn_size: int = 4, in_channels: int = 1, n_embedding: int = 512, memory_efficient: bool = False)[source]

Bases: Module

3D DenseNet architecture adapted from Huang et al. 2018. See https://doi.org/10.48550/arXiv.1608.06993 for details.

Parameters:

growth_rate : int, default=32

how many filters to add at each layer (k in the paper).

block_config : (int, int, int, int), default=(3, 12, 24, 16)

how many layers in each pooling block (4 blocks in total).

num_init_features : int, default=64

number of filters to learn in the first convolution layer.

bn_size : int, default=4

multiplicative factor for number of bottleneck layers (i.e. bn_size * k features in the bottleneck layer).

in_channels : int, default=1

number of channels in the input volume.

n_embedding : int, default=512

the size of the embedding space.

memory_efficient : bool, default=False

if True, uses checkpointing, which is much more memory efficient but slower. See https://arxiv.org/pdf/1707.06990.pdf for details.

Initialize internal Module state, shared by both nn.Module and ScriptModule.
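A minimal usage sketch, not part of the reference itself: it assumes the backbone accepts 5D tensors of shape (batch, in_channels, depth, height, width) and returns one embedding of size n_embedding per volume; the 64x64x64 spatial size is an arbitrary choice for illustration.

import torch
from nidl.volume.backbones.densenet3d import DenseNet

# Instantiate the 3D DenseNet backbone with its default configuration.
model = DenseNet(in_channels=1, n_embedding=512)

# Dummy batch of four single-channel volumes (assumed NCDHW layout).
x = torch.randn(4, 1, 64, 64, 64)

embeddings = model(x)      # call the instance, not forward(), so hooks run
print(embeddings.shape)    # expected: torch.Size([4, 512])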

forward(x)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance itself rather than forward() directly, since the former takes care of running the registered hooks while the latter silently ignores them.
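A short sketch, using a hypothetical toy module rather than DenseNet, of why the instance call is preferred: forward hooks registered on a module fire when the instance is called, but not when forward() is invoked directly.

import torch
import torch.nn as nn

class Toy(nn.Module):
    def forward(self, x):
        return x * 2

toy = Toy()
toy.register_forward_hook(lambda module, inputs, output: print("hook fired"))

_ = toy(torch.ones(1))          # prints "hook fired": hooks are run
_ = toy.forward(torch.ones(1))  # silent: forward() bypasses registered hooks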
