braindecode.modules.MLP#
- class braindecode.modules.MLP(in_features: int, hidden_features=None, out_features=None, activation=<class 'torch.nn.modules.activation.GELU'>, drop=0.0, normalize=False)[source]#
Multilayer Perceptron (MLP) with GELU activation and optional dropout.
Also known as a fully connected feedforward network, an MLP is a sequence of non-linear parametric functions

\[h_{i+1} = a_{i+1}(h_i W_{i+1}^T + b_{i+1}), \quad i = 0, \dots, L - 1,\]

over feature vectors \(h_i\), with the input and output feature vectors \(x = h_0\) and \(y = h_L\), respectively. The non-linear functions \(a_i\) are called activation functions. The trainable parameters of an MLP are its weights \(W_i\) and biases \(b_i\).
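The layer recurrence above can be sketched in plain NumPy. This is an illustrative re-implementation, not braindecode's code: the `gelu` tanh approximation, the choice to leave the final layer linear, and all variable names are assumptions made for the example.

```python
import numpy as np

def gelu(x):
    # tanh approximation of GELU (assumption: used here for illustration)
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def mlp_forward(x, weights, biases):
    # h_{i+1} = a_{i+1}(h_i W_{i+1}^T + b_{i+1});
    # the last layer is kept linear, a common convention for MLP heads
    h = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        h = h @ W.T + b
        if i < len(weights) - 1:  # activation on all but the final layer
            h = gelu(h)
    return h

rng = np.random.default_rng(0)
in_features, hidden_features, out_features, batch = 4, 8, 2, 3
weights = [rng.standard_normal((hidden_features, in_features)),
           rng.standard_normal((out_features, hidden_features))]
biases = [np.zeros(hidden_features), np.zeros(out_features)]

x = rng.standard_normal((batch, in_features))   # input feature vectors h_0
y = mlp_forward(x, weights, biases)             # output feature vectors h_L
print(y.shape)  # (3, 2)
```

Each weight matrix is stored as `(out, in)` so that `h @ W.T` matches the \(h_i W_{i+1}^T\) term in the equation, mirroring the PyTorch `nn.Linear` convention.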