optimizer function (Machine Learning)
up:: Machine Learning
The optimizer function runs each time the model makes another guess. It adjusts the model's weights to ==minimise the loss reported by the loss function==.
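To make that concrete, here is a minimal sketch of what one optimizer, plain gradient descent, does: it repeatedly nudges a weight against the gradient of the loss. The data, starting weight, and learning rate are illustrative assumptions.

```python
# A minimal sketch of an optimizer: plain gradient descent on a single
# weight w, minimising a mean-squared-error loss. The data, starting
# weight, and learning rate are illustrative assumptions.
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # true relationship: y = 2x
w = 0.0                # the model's current guess for the weight
learning_rate = 0.1

for step in range(20):
    # loss = mean((w*x - y)^2), so d(loss)/dw = mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= learning_rate * grad   # step against the gradient to reduce the loss

print(w)  # converges towards 2.0
```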
Example in TensorFlow.js
The `adam` optimiser is effective and requires no configuration beyond its defaults.
```js
// Prepare the model for training.
model.compile({
  optimizer: tf.train.adam(),        // Adam optimizer with default settings
  loss: tf.losses.meanSquaredError,  // loss function to minimise
  metrics: ['mse'],
});
```
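Adam works well out of the box because it keeps running estimates of each gradient's first and second moments and adapts a separate step size for every parameter.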
Example in Python (Keras)
Example using `mean_squared_error` for the loss and stochastic gradient descent (`sgd`) for the optimizer.
```python
# Add the loss and optimizer to the model.
model.compile(optimizer='sgd', loss='mean_squared_error')
```
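To see the optimizer at work end to end, here is a usage sketch of the compiled model on a toy dataset (learning y = 2x - 1). The data, model shape, and epoch count are illustrative assumptions.

```python
import numpy as np
from tensorflow import keras

# Toy dataset for the relationship y = 2x - 1 (illustrative assumption).
xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0]).reshape(-1, 1)
ys = np.array([-3.0, -1.0, 1.0, 3.0, 5.0, 7.0]).reshape(-1, 1)

# A single-unit linear model is enough for this data.
model = keras.Sequential([keras.Input(shape=(1,)), keras.layers.Dense(1)])
model.compile(optimizer='sgd', loss='mean_squared_error')

# Each epoch, the optimizer adjusts the weights to reduce the loss.
model.fit(xs, ys, epochs=500, verbose=0)
print(model.predict(np.array([[10.0]])))  # approaches 19
```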