dlpy.layers.Dense

class dlpy.layers.Dense(n, name=None, act='relu', fcmp_act=None, init=None, std=None, mean=None, truncation_factor=None, init_bias=None, dropout=None, include_bias=None, src_layers=None, **kwargs)

Bases: dlpy.layers.Layer

Fully connected layer.
Parameters:
- n : int
Specifies the number of neurons.
- name : string, optional
Specifies the name of the layer.
- act : string, optional
Specifies the activation function.
Valid Values: AUTO, IDENTITY, LOGISTIC, SIGMOID, TANH, RECTIFIER, RELU, SOFTPLUS, ELU, LEAKY, FCMP
Default: AUTO
- fcmp_act : string, optional
Specifies the FCMP activation function for the layer.
- init : string, optional
Specifies the initialization scheme for the layer.
Valid Values: XAVIER, UNIFORM, NORMAL, CAUCHY, XAVIER1, XAVIER2, MSRA, MSRA1, MSRA2
Default: XAVIER
- std : float, optional
Specifies the standard deviation value when the init parameter is set to NORMAL.
- mean : float, optional
Specifies the mean value when the init parameter is set to NORMAL.
- truncation_factor : float, optional
Specifies the truncation threshold (truncationFactor x std) when the init parameter is set to NORMAL.
- init_bias : float, optional
Specifies the initial bias for the layer.
- dropout : float, optional
Specifies the dropout rate.
Default: 0
- include_bias : bool, optional
Specifies whether to include bias neurons (included by default).
- src_layers : iter-of-Layers, optional
Specifies the layers directed to this layer.
Returns:
- Dense
__init__(n, name=None, act='relu', fcmp_act=None, init=None, std=None, mean=None, truncation_factor=None, init_bias=None, dropout=None, include_bias=None, src_layers=None, **kwargs)
Initialize self. See help(type(self)) for accurate signature.
Methods

- __init__(n[, name, act, fcmp_act, init, …]) : Initialize self.
- count_instances()
- format_name([block_num, local_count]) : Format the name of the layer
- get_number_of_instances()
- to_model_params() : Convert the model configuration to CAS action parameters