dlpy.blocks.Bidirectional

class dlpy.blocks.Bidirectional(n, n_blocks=1, rnn_type='gru', output_type='samelength', dropout=0.2, max_output_length=None, src_layers=None, name=None)

Bases: object
Bidirectional RNN layers
Parameters: - n : int or list of int
Specifies the number of neurons in the recurrent layer. If n_blocks=1, then n should be an int. If n_blocks > 1, then n can be an int or a list of ints to indicate the number of neurons in each block.
- n_blocks : int, optional
Specifies the number of bidirectional recurrent layer blocks.
Default: 1
- rnn_type : string, optional
Specifies the type of the RNN layer.
Default: GRU
Valid Values: RNN, LSTM, GRU
- output_type : string, optional
Specifies the output type of the recurrent layer.
Default: SAMELENGTH
Valid Values: ENCODING, SAMELENGTH, ARBITRARYLENGTH
- max_output_length : int, optional
Specifies the maximum number of tokens to generate when output_type is set to ARBITRARYLENGTH.
- dropout : float, optional
Specifies the dropout rate.
Default: 0.2
- src_layers : list, optional
Specifies the list of source layers for the layer.
- name : string, optional
Specifies the layer name. If not specified, 'RNN' is used.
Returns: Bidirectional
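A minimal construction sketch follows; the neuron counts and names are illustrative, and the parameter combinations simply follow the descriptions above:

```python
from dlpy.blocks import Bidirectional

# Two stacked bidirectional GRU blocks; n is a list because n_blocks > 1,
# giving 128 neurons per direction in the first block and 64 in the second.
bi_gru = Bidirectional(n=[128, 64], n_blocks=2, rnn_type='gru',
                       output_type='samelength', dropout=0.2, name='bi_gru')

# A single-block bidirectional LSTM encoder; n is a plain int when n_blocks=1.
bi_lstm = Bidirectional(n=256, rnn_type='lstm', output_type='encoding',
                        name='bi_lstm')
```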
__init__(n, n_blocks=1, rnn_type='gru', output_type='samelength', dropout=0.2, max_output_length=None, src_layers=None, name=None)

Initialize self. See help(type(self)) for accurate signature.
Methods

__init__(n[, n_blocks, rnn_type, …])    Initialize self.
add_layers()    Add layers for the block.
compile([block_num])    Convert the options into DLPy layer definitions.
get_last_layers()    Return the last two layers, if they exist.
get_layers()    Return the list of layers.
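A rough sketch of the methods above. In typical use the block is assumed to be added to a DLPy Sequential model, which handles compilation; the direct calls below, with block_num=1 chosen arbitrarily, only illustrate the documented methods:

```python
# Assumes the bi_gru block from the construction example above.
bi_gru.compile(block_num=1)          # convert the options into DLPy layer definitions

layers = bi_gru.get_layers()         # full list of generated layers
last_two = bi_gru.get_last_layers()  # last two layers, if they exist, e.g. to pass
                                     # as src_layers of a following block
```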