dlpy.lr_scheduler.FixedLR
class dlpy.lr_scheduler.FixedLR(learning_rate=0.001)
    Bases: dlpy.lr_scheduler._LRScheduler
Fixed learning rate scheduler. The learning rate is held constant for the entire training run.
Parameters:
    learning_rate : double, optional
        Specifies the learning rate for the deep learning algorithm.
        Default: 0.001
__init__(learning_rate=0.001)
    Initialize self. See help(type(self)) for accurate signature.
Methods
__init__([learning_rate])
    Initialize self.
clear()
get(k[, d])
items()
keys()
pop(k[, d])
    If key is not found, d is returned if given, otherwise KeyError is raised.
popitem()
    Remove and return some (key, value) pair as a 2-tuple; but raise KeyError if D is empty.
setdefault(k[, d])
update([E, ]**F)
    If E is present and has a .keys() method, does: for k in E: D[k] = E[k]. If E is present and lacks a .keys() method, does: for (k, v) in E: D[k] = v. In either case, this is followed by: for k, v in F.items(): D[k] = v.
values()
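
Example

The following is a minimal usage sketch. Only the FixedLR constructor is documented on this page; the MomentumSolver and Optimizer classes and their parameters are assumed to come from dlpy.model, as in typical DLPy training code, and are not confirmed by this page.

    from dlpy.lr_scheduler import FixedLR
    from dlpy.model import Optimizer, MomentumSolver  # assumed DLPy classes

    # Keep the learning rate constant at 0.001 for the whole run.
    fixed_lr = FixedLR(learning_rate=0.001)

    # FixedLR is dict-like (see the Methods list above), so its
    # settings can be inspected like a mapping.
    for key, value in fixed_lr.items():
        print(key, value)

    # Hand the scheduler to a solver, then wrap the solver in an Optimizer
    # (solver/optimizer parameters here are illustrative).
    solver = MomentumSolver(lr_scheduler=fixed_lr, momentum=0.9)
    optimizer = Optimizer(algorithm=solver, mini_batch_size=128, max_epochs=20)
    # model.fit(data=train_tbl, optimizer=optimizer)  # requires a CAS session and a built model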