The optimizer function runs each time the model makes another guess. Its job is to ==minimize the loss reported by the loss function== by adjusting the model's parameters before the next guess.
Example in TensorFlow.js
The Adam optimizer is effective and needs no extra configuration; its default settings work well.
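A minimal sketch of what this could look like in TensorFlow.js, using the standard layers API; the single-unit model, the toy linear data (y = 2x - 1), and the epoch count are illustrative assumptions rather than part of the original notes:

```js
import * as tf from '@tensorflow/tfjs';

async function run() {
  // A single dense unit learning the toy relationship y = 2x - 1.
  const model = tf.sequential();
  model.add(tf.layers.dense({units: 1, inputShape: [1]}));

  // Pass the adam optimizer by name; its defaults need no further tuning.
  model.compile({optimizer: 'adam', loss: 'meanSquaredError'});

  // Six example points from y = 2x - 1.
  const xs = tf.tensor2d([-1, 0, 1, 2, 3, 4], [6, 1]);
  const ys = tf.tensor2d([-3, -1, 1, 3, 5, 7], [6, 1]);

  await model.fit(xs, ys, {epochs: 200});

  // Should print a value close to 19.
  model.predict(tf.tensor2d([10], [1, 1])).print();
}

run();
```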
Example in Python
An example using mean_squared_error for the loss and stochastic gradient descent (sgd) for the optimizer is sketched below.
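A matching sketch in Python with Keras; as above, the single-unit model, the toy data, and the epoch count are illustrative assumptions, while the mean_squared_error loss and sgd optimizer come straight from the notes:

```python
import numpy as np
from tensorflow import keras

# The same toy problem: a single dense unit learning y = 2x - 1.
model = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(units=1),
])

# Stochastic gradient descent for the optimizer, mean squared error for the loss,
# both passed by name.
model.compile(optimizer='sgd', loss='mean_squared_error')

# Six example points from y = 2x - 1, shaped as (samples, features).
xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0]).reshape(-1, 1)
ys = np.array([-3.0, -1.0, 1.0, 3.0, 5.0, 7.0]).reshape(-1, 1)

model.fit(xs, ys, epochs=500, verbose=0)

# Should print a value close to 19.
print(model.predict(np.array([[10.0]]), verbose=0))
```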