# Intro

SciModel is similar to Keras' Model, and is designed to make scientific model creation effortless. The inputs are Variable objects, and the outputs are Target objects (constraints such as Data). As an example:

```python
from sciann import Variable, Functional, SciModel, Data

x = Variable('x')
y = Variable('y')

Fxy = Functional('Fxy', [x, y],
                 hidden_layers=[10, 20, 10],
                 activation='tanh')

model = SciModel([x, y], Data(Fxy))
```


SciModel can be trained by calling model.train:

```python
training_history = model.train([X_data, Y_data], Fxy_data)
```


The training_history object records the loss at each epoch, as well as other training parameters. Check the Keras documentation for more details.

The SciModel object also provides functionality such as predict and save.


### SciModel

```python
sciann.models.model.SciModel(inputs=None, targets=None, loss_func='mse', optimizer='adam', load_weights_from=None, plot_to_file=None)
```


Configures the model for training.

Arguments

• inputs: Main variables (also called inputs, or independent variables) of the network, xs. They should all be of type Variable.
• targets: List of all targets (also called outputs, or dependent variables) to be satisfied during training. Expected list members are:
  • Entries of type Constraint, such as Data, Tie, etc.
  • Entries of type Functional:
    • A single Functional is treated as a Data constraint. The object can be a Functional or any derivative of a Functional. An example is a PDE that is supposed to be zero.
    • A tuple of (Functional, Functional) is treated as a Constraint of type Tie.
  • To impose more complex types of constraints, or to impose a constraint partially on a specific part of the region, use the Data or Tie classes from Constraint.
• loss_func: Defaulted to "mse" ("mean_squared_error"). It can be a string from the supported loss functions, i.e. "mse" or "mae". Alternatively, you can define your own loss function and pass the function handle (check the Keras documentation for more information).
• optimizer: Defaulted to the "adam" optimizer. It can be one of the Keras accepted optimizers, e.g. "adam". You can also pass an optimizer instance to control its parameters:

  • optimizer = k.optimizers.RMSprop(lr=0.01, rho=0.9, epsilon=None, decay=0.0)
  • optimizer = k.optimizers.SGD(lr=0.001, momentum=0.0, decay=0.0, nesterov=False)
  • optimizer = k.optimizers.Adam(lr=0.01, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)

Check the Keras documentation for further details.

• load_weights_from: (file_path) Instantiate state of the model from a previously saved state.

• plot_to_file: A string file name to output the network architecture.

Raises

• ValueError: Raised if inputs are not of type Variable, or if targets are not of type Functional, (Functional, data), or (Functional, Functional).

### train

```python
train(x_true, y_true, weights=None, target_weights=None, batch_size=64, epochs=100, learning_rate=0.001, shuffle=True, callbacks=None, stop_after=None, save_weights_to=None, save_weights_freq=0, default_zero_weight=0.0)
```


Performs the training on the model.

Arguments

• x_true: List of Xs associated with the targets of Y. Expecting a list of np.ndarray of size (N, 1) each, with N as the sample size.
• y_true: List of true Ys associated with the targets defined during model setup. Expecting the same size as the list of targets defined in SciModel.
  • To impose the targets at specific Xs only, pass a tuple of (ids, y_true) for that target.
• weights: (np.ndarray) A global sample weight to be applied to samples. Expecting an array of shape (N, 1), with N as the sample size. Default value is one, i.e. all samples are considered equally important.
• target_weights: (list) A weight for each target defined in y_true.
• batch_size: (Integer) or None. Number of samples per gradient update. If unspecified, batch_size defaults to 2^6 = 64.
• epochs: (Integer) Number of epochs to train the model. Defaulted to 100. An epoch is an iteration over the entire x and y data provided.
• learning_rate: (Tuple/List) (epochs, lrs). Expects a tuple/list with a list of epochs and a list of learning rates; the rate is linearly interpolated between entries. Defaulted to 0.001 with no decay. Example: learning_rate = ([0, 100, 1000], [0.001, 0.0005, 0.00001])
• shuffle: (Boolean) Whether to shuffle the training data. Default value is True.
• callbacks: List of keras.callbacks.Callback instances.
• stop_after: Stop training early after this many epochs without improvement. Defaulted to epochs.
• save_weights_to: (file_path) Path to save the state of the model at the end of training.
• save_weights_freq: (Integer) Save weights every N epochs. Defaulted to 0 (no periodic saving).
• default_zero_weight: A small number to substitute for zero sample weights.
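The (ids, y_true) tuple form for partial targets can be illustrated in plain NumPy. This is only a sketch of the idea, not SciANN's implementation: the loss for that target is evaluated only at the rows selected by ids.

```python
import numpy as np

# Illustrative sketch (plain NumPy, not SciANN): imposing a data target
# only at selected sample ids, as in the (ids, y_true) tuple form.
N = 10
y_pred = np.linspace(0.0, 1.0, N).reshape(N, 1)   # stand-in for a network output

ids = np.array([0, 3, 7])                 # samples where the target applies
y_true = np.zeros((len(ids), 1))          # target values at those samples only

# The error is computed only on the selected rows; all other samples
# contribute nothing to this target's loss.
mse_at_ids = np.mean((y_pred[ids] - y_true) ** 2)
```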

Returns

A Keras 'History' object after performing fitting.
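The learning_rate schedule above can be sketched with np.interp, since the documentation states the rate is linearly interpolated between the listed (epoch, lr) entries. The helper lr_at is hypothetical, for illustration only:

```python
import numpy as np

# Sketch of learning_rate = (epochs, lrs): the rate at any given epoch is
# a linear interpolation between the listed anchor points.
epochs = [0, 100, 1000]
lrs = [0.001, 0.0005, 0.00001]

def lr_at(epoch):
    """Learning rate at a given epoch (illustrative helper, not SciANN API)."""
    return float(np.interp(epoch, epochs, lrs))
```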

### predict

```python
predict(xs, batch_size=None, verbose=0, steps=None)
```


Predict output from network.

Arguments

• xs: List of Xs associated with the model's inputs. Expecting a list of np.ndarray of size (N, 1) each, with N as the sample size.
• batch_size: Defaulted to None. Check the Keras documentation for more information.
• verbose: Defaulted to 0. Check the Keras documentation for more information.
• steps: Defaulted to None. Check the Keras documentation for more information.

Returns

List of numpy arrays sized according to the network outputs.

Raises

• ValueError: Raised if the number of xs is different from the number of model inputs.

### loss_functions

```python
loss_functions(method='mse')
```


loss_functions returns the callable object used to evaluate the loss.

Arguments

• method: String. One of:
  • "mse" for Mean Squared Error, or
  • "mae" for Mean Absolute Error, or
  • "se" for Squared Error, or
  • "ae" for Absolute Error.

Returns

Callable function that gets (y_true, y_pred) as the input and returns the loss value as the output.

Raises

• ValueError: Raised if anything other than "mse", "mae", "se", or "ae" is passed.
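The four methods can be sketched in plain NumPy. This is an illustration, not SciANN's implementation; in particular, treating "se" and "ae" as unreduced sums is an assumption based on their names:

```python
import numpy as np

# Illustrative NumPy versions of the supported loss strings. The "se"/"ae"
# variants are assumed here to sum rather than average the errors.
def loss_function(method='mse'):
    losses = {
        'mse': lambda y_true, y_pred: np.mean((y_true - y_pred) ** 2),
        'mae': lambda y_true, y_pred: np.mean(np.abs(y_true - y_pred)),
        'se':  lambda y_true, y_pred: np.sum((y_true - y_pred) ** 2),
        'ae':  lambda y_true, y_pred: np.sum(np.abs(y_true - y_pred)),
    }
    if method not in losses:
        raise ValueError(f"Unsupported loss: {method!r}")
    return losses[method]
```

Each returned callable takes (y_true, y_pred) and returns a scalar loss, matching the signature described above.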