Source: lua-torch-optim
Section: interpreters
Priority: optional
Maintainer: Debian Science Maintainers
Uploaders: Mo Zhou
Build-Depends: debhelper (>= 11),
               dh-lua,
# lua-torch-torch7 is not a real B-D, but an explicit runtime dependency
               lua-torch-torch7,
Standards-Version: 4.1.4
Homepage: https://github.com/torch/optim
Vcs-Browser: https://salsa.debian.org/science-team/lua-torch-optim
Vcs-Git: https://salsa.debian.org/science-team/lua-torch-optim.git

Package: lua-torch-optim
Architecture: all
Multi-Arch: foreign
Depends: ${misc:Depends},
         lua5.1 | luajit,
         lua-torch-torch7,
         lua-torch-xlua,
XB-Lua-Versions: ${lua:Versions}
Description: Numeric Optimization Package for Torch Framework
 This package contains several optimization routines and a logger for Torch.
 .
 The following algorithms are provided:
  * Stochastic Gradient Descent
  * Averaged Stochastic Gradient Descent
  * L-BFGS
  * Conjugate Gradients
  * AdaDelta
  * AdaGrad
  * Adam
  * AdaMax
  * FISTA with backtracking line search
  * Nesterov's Accelerated Gradient method
  * RMSprop
  * Rprop
  * CMAES
 All these algorithms are designed to support batch optimization as well as
 stochastic optimization. It is up to the user to construct an objective
 function that represents the batch, mini-batch, or single sample on which
 to evaluate the objective.
 .
 This package also provides logging and live plotting capabilities via the
 `optim.Logger()` function. Live logging is essential for monitoring network
 accuracy and the cost function during training and testing, for spotting
 under- and over-fitting, for early stopping, or simply for monitoring the
 health of the current optimization task.
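
The description notes that the user constructs the objective function; every
routine in optim shares the same closure contract: a function that takes the
parameter tensor and returns the loss value and its gradient. Below is a
minimal sketch (not shipped in the package) using optim.sgd on a toy
quadratic; the target values, learning rate, and iteration count are
arbitrary choices for illustration:

    require 'torch'
    require 'optim'

    -- Toy objective f(x) = ||x - target||^2, minimized at x == target.
    local target = torch.Tensor{1, 2, 3}
    local x = torch.zeros(3)

    -- The closure every optim routine expects: given x, return the loss
    -- and the gradient of the loss with respect to x.
    local feval = function(x)
       local diff = x - target
       local fx   = diff:dot(diff)   -- loss value
       local dfdx = diff * 2         -- gradient
       return fx, dfdx
    end

    local config = {learningRate = 0.1}
    for i = 1, 100 do
       optim.sgd(feval, x, config)   -- updates x in place
    end

The same feval could evaluate a full batch, a mini-batch, or a single
sample; optim does not care how the loss and gradient are produced.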
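
The `optim.Logger()` facility mentioned in the description is used roughly
as follows. This is an illustrative sketch: the file name 'loss.log' and the
placeholder loss values are made up, and logger:plot() assumes gnuplot is
installed for the live plot:

    require 'optim'

    -- One row is appended to the log file per :add() call.
    local logger = optim.Logger('loss.log')
    logger:setNames{'training loss', 'test loss'}

    for epoch = 1, 10 do
       -- Placeholder values; in practice these come from the training loop.
       local trainLoss = 1.0 / epoch
       local testLoss  = 1.2 / epoch
       logger:add{trainLoss, testLoss}
    end

    logger:style{'-', '-'}   -- gnuplot line styles, one per column
    logger:plot()            -- refresh the live plot

Watching the two curves side by side during training is what makes it
possible to spot under- and over-fitting early, as the description says.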