pipefitter.estimator.NeuralNetwork

class pipefitter.estimator.NeuralNetwork(acts='tanh', annealing_rate=1e-06, direct=False, error_func=None, hiddens=9, lasso=0, learning_rate=0.001, max_iters=10, max_time=0, ridge=0, seed=0.0, std='midrange', optimization='lbfgs', num_tries=10, target=None, nominals=None, inputs=None)

Bases: pipefitter.base.BaseEstimator

Neural Network

Parameters:

acts : string, optional

Specifies the activation function for the neurons on each hidden layer. Valid values are ‘identity’, ‘logistic’, ‘sin’, ‘softplus’, and ‘tanh’.

annealing_rate : float, optional

Specifies the annealing rate for stochastic gradient descent (SGD).

direct : bool, optional

Specifies whether to use an architecture that extends the multilayer perceptron (MLP) with direct connections between the input layer and the output layer.

error_func : string, optional

Specifies the error function to train the network. Valid values are ‘normal’ and ‘entropy’.

hiddens : int or list-of-ints, optional

Specifies the number of hidden neurons for each hidden layer in the feedforward model.

lasso : float, optional

Specifies the L1 regularization parameter; the value must be nonnegative.

learning_rate : float, optional

Specifies the learning rate for stochastic gradient descent (SGD).

max_iters : int, optional

Specifies the maximum number of iterations allowed for optimization.

max_time : int, optional

Specifies the maximum time (in seconds) allowed for optimization.

ridge : float, optional

Specifies the L2 regularization parameter; the value must be nonnegative.

seed : float, optional

Specifies the random number seed used to initialize the network weights.

std : string, optional

Specifies the standardization to use on the interval variables. Valid values are ‘midrange’, ‘none’, and ‘std’.

optimization : string, optional

Specifies the optimization technique. Valid values are ‘lbfgs’ and ‘sgd’.

num_tries : int, optional

Specifies the number of times the network is trained using different random initial weights.

target : string, optional

The target variable.

nominals : string or list of strings, optional

The nominal variables.

inputs : string or list of strings, optional

The input variables.

Returns:

NeuralNetwork

Examples

>>> nn = NeuralNetwork(target='Origin',
...                    inputs=['MPG_City', 'MPG_Highway', 'Length',
...                            'Weight', 'Type', 'Cylinders'],
...                    nominals=['Type', 'Cylinders', 'Origin'])
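The five valid values of acts correspond to standard activation functions. A minimal pure-Python sketch of each, shown only to illustrate the choices (this is not pipefitter's implementation):

```python
import math

def identity(x):
    # Pass-through activation: output equals input.
    return x

def logistic(x):
    # Sigmoid activation mapping real values into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    # Smooth approximation of max(0, x).
    return math.log1p(math.exp(x))

# Mapping from the valid `acts` values to the standard functions.
activations = {
    'identity': identity,
    'logistic': logistic,
    'sin': math.sin,
    'softplus': softplus,
    'tanh': math.tanh,
}
```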
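The lasso and ridge parameters weight L1 and L2 penalties on the network weights. Conceptually, the penalized training objective looks like the following sketch (an illustration of the two parameters' roles, not pipefitter's internal objective):

```python
def penalized_loss(base_loss, weights, lasso=0.0, ridge=0.0):
    """Add L1 (lasso) and L2 (ridge) penalties to a base loss.

    Illustrative only: shows how the `lasso` and `ridge`
    parameters enter the objective, not pipefitter's code.
    """
    l1 = lasso * sum(abs(w) for w in weights)   # L1 penalty term
    l2 = ridge * sum(w * w for w in weights)    # L2 penalty term
    return base_loss + l1 + l2
```

With both parameters at their default of 0, the objective reduces to the unpenalized error function.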
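The std options standardize the interval input variables before training. A plain-Python sketch, assuming 'midrange' maps values into [-1, 1] around the midpoint of the range and 'std' centers by the mean and scales by the standard deviation (the exact formulas are assumptions, not pipefitter internals):

```python
import statistics

def standardize(values, method='midrange'):
    # Sketch of the interval-variable standardizations named by `std`.
    if method == 'none':
        return list(values)
    if method == 'midrange':
        # Center on the midpoint of the range, scale by half the range.
        lo, hi = min(values), max(values)
        mid, half_range = (hi + lo) / 2.0, (hi - lo) / 2.0
        return [(v - mid) / half_range for v in values]
    if method == 'std':
        # Center on the mean, scale by the sample standard deviation.
        mean = statistics.mean(values)
        sd = statistics.stdev(values)
        return [(v - mean) / sd for v in values]
    raise ValueError("method must be 'midrange', 'none', or 'std'")
```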
__init__(acts='tanh', annealing_rate=1e-06, direct=False, error_func=None, hiddens=9, lasso=0, learning_rate=0.001, max_iters=10, max_time=0, ridge=0, seed=0.0, std='midrange', optimization='lbfgs', num_tries=10, target=None, nominals=None, inputs=None)

Methods

__init__([acts, annealing_rate, direct, ...])
fit(table, *args, **kwargs) Fit function for neural network
get_combined_params(*args, **kwargs) Merge all parameters and verify that they are valid
get_filtered_params(*args, **kwargs) Merge parameters whose keys belong to self
get_param(*names) Return a copy of the requested parameters
get_params(*names) Return a copy of the requested parameters
has_param(name) Does the parameter exist?
set_param(*args, **kwargs) Set one or more parameters
set_params(*args, **kwargs) Set one or more parameters
transform(table, *args, **kwargs) Transform function for transformer

Attributes

param_defs
static_params