Preface
Most Keras classes implement the call method.
The parent class's __call__ invokes call().
Therefore, almost all of the models / network layers mentioned below can be called directly as functions once they are defined.
e.g.:

```python
model_object(inputs)   # a Model instance called like a function
layer_object(inputs)   # a Layer instance called like a function
```
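For instance, a minimal sketch of calling a layer object as a function; the layer size and the dummy input are illustrative assumptions:

```python
from tensorflow import keras
import numpy as np

layer = keras.layers.Dense(4, activation='relu')   # hypothetical layer with 4 units
x = np.ones((2, 3), dtype='float32')               # dummy batch: 2 samples, 3 features
y = layer(x)                                        # calling the layer object runs its call()
print(y.shape)                                      # (2, 4)
```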
We can also subclass these classes as templates to implement our own models and layers (see the Custom Model / Custom Layer sections below).
Import
from tensorflow import keras
metrics (running statistics)
Keras provides container interfaces for various metrics,
such as binary / multiclass cross-entropy loss containers, MSE and MAE loss containers, accuracy containers, and so on.
Take the Accuracy container as an example (pseudo code):
```python
acc_meter = keras.metrics.Accuracy()            # create the metric container

for epoch in range(epochs):
    for step, (x, y) in enumerate(batches):
        y_predict = ...                          # model prediction for this batch
        acc_meter.update_state(y, y_predict)     # every batch thrown in is accumulated by the container
        if step % 100 == 0:                      # print every 100 steps (a threshold you choose)
            print(acc_meter.result().numpy())    # running accuracy over all data accumulated so far
    acc_meter.reset_states()                     # clear the container so the next epoch starts from scratch
```
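For a concrete (non-pseudo) run of the same container, a small hedged example; the label values are made up purely for illustration:

```python
m = keras.metrics.Accuracy()
m.update_state([1, 2, 3, 4], [0, 2, 3, 4])   # 3 of the 4 predictions match
print(m.result().numpy())                    # 0.75
m.reset_states()                             # back to an empty container
```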
Activation function + loss function + optimizer
Import method:
```python
keras.activations.relu()                  # activation function: relu as an example; there are many more
keras.losses.categorical_crossentropy()   # loss function: cross entropy as an example; there are many more
keras.optimizers.SGD()                    # optimizer: stochastic gradient descent as an example
keras.callbacks.EarlyStopping()           # callback: "stop training early when a condition is met" as an example
```
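These can also be instantiated with configuration parameters; a brief sketch, where the parameter values are illustrative assumptions only:

```python
opt = keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)         # hyperparameters chosen for illustration
loss_fn = keras.losses.CategoricalCrossentropy(from_logits=True)     # class form of the cross-entropy loss
early_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=3)
```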
Sequential (inherits from Model)
Model definition method
Definition method 1:

```python
model = keras.models.Sequential([layer_1, layer_2, ...])   # pass the layer list directly
```

Definition method 2:

```python
model = keras.models.Sequential()
model.add(layer_1)   # first network layer
model.add(layer_2)   # second network layer
```
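For concreteness, a minimal sketch of both styles with real layers; the layer sizes and the 28x28 input shape are illustrative assumptions:

```python
# Style 1: pass the layer list to the constructor
model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10, activation='softmax'),
])

# Style 2: add layers one by one
model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=(28, 28)))
model.add(keras.layers.Dense(128, activation='relu'))
model.add(keras.layers.Dense(10, activation='softmax'))
```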
Model related callback configuration
```python
import os

logdir = 'callbacks'
if not os.path.exists(logdir):
    os.mkdir(logdir)
output_model_file = os.path.join(logdir, 'mymodel.h5')

callbacks = [
    keras.callbacks.TensorBoard(logdir),                                        # write TensorBoard logs
    keras.callbacks.ModelCheckpoint(output_model_file, save_best_only=True),    # save the (best) model
    keras.callbacks.EarlyStopping(patience=5, min_delta=1e-3),
    # EarlyStopping: the validation metric is expected to improve each epoch.
    # If the improvement is smaller than min_delta, wait patiently for up to 5 epochs;
    # if it still has not improved after those 5 epochs, training stops early.
]
# The callbacks list defined here is passed to fit(); see "Model related metric configuration" below.
```
Model related metric configuration (loss, optimizer, accuracy, etc.)
Note: the following properties can be given either as strings or as instantiated objects imported as described above.
```python
model.compile(
    loss="sparse_categorical_crossentropy",   # loss function, given here as a string
    optimizer=keras.optimizers.SGD(),         # given as an instantiated object, so parameters can be passed
    metrics=['accuracy'],                     # this metric will be printed during fit()
)
# compile() only configures the model; no real training happens here.

model.fit(
    x, y,
    epochs=10,                                # repeat training for 10 epochs
    validation_data=(x_valid, y_valid),       # the held-out validation set (loss and val metrics are printed during fit)
    validation_freq=5,                        # validate once every 5 epochs; optional, the default is 1
    callbacks=callbacks,                      # the callback list from "Model related callback configuration" above
)
# fit() is where the real training happens.
```
Model validation & Testing
Generally, we first split the data into three parts (the same data cannot serve both training and evaluation; think of how seeing the exam questions in advance counts as cheating):
- Training set: (the bulk of the data, used to fit the model)
- Test set: (used only after the model has finished all training)
- Validation set: (used during the training process)
Note 1: (how to split?)
1. The split must keep (x, y) paired. If implemented by hand, it requires random shuffling, zip, and similar operations.
2. We can instead use scikit-learn's train_test_split() method (called twice to get three parts); a sketch follows the link below.
3. tf.split() can also be used to implement the split manually.

For concrete splitting examples, refer to the previous article: https://segmentfault.com/a/11...
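A minimal sketch of the two-call split with train_test_split; the variable names and split ratios here are illustrative assumptions:

```python
from sklearn.model_selection import train_test_split

# first call: carve out the test set
x_train_all, x_test, y_train_all, y_test = train_test_split(x, y, test_size=0.2, random_state=7)
# second call: carve the validation set out of the remaining training data
x_train, x_valid, y_train, y_valid = train_test_split(x_train_all, y_train_all, test_size=0.25, random_state=7)
```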
Note 2: (why do we need a validation set in addition to a test set?)
- The test set is used to evaluate the final model after training (with its parameters fixed), and prediction returns the predicted values!
- The validation set is evaluated alongside the training process (it monitors training as it goes).
The code is as follows:
```python
loss, accuracy = model.evaluate(x_test, y_test)   # evaluation: note that it returns loss and metric values
target = model.predict(x_test)                    # testing: note that it returns the predicted values!
```
Available parameters
```python
model.trainable_variables   # returns all trainable variables in the model
# Typical usage: zip(gradients, model.trainable_variables) during manual gradient descent
```
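A minimal sketch of that usage with tf.GradientTape, assuming `import tensorflow as tf`, an already built `model` that outputs probabilities, and one batch `(x, y)` with integer labels:

```python
import tensorflow as tf

optimizer = keras.optimizers.SGD(learning_rate=0.01)   # learning rate chosen for illustration

with tf.GradientTape() as tape:
    y_pred = model(x, training=True)                                            # forward pass
    loss = tf.reduce_mean(keras.losses.sparse_categorical_crossentropy(y, y_pred))

grads = tape.gradient(loss, model.trainable_variables)                          # one gradient per trainable variable
optimizer.apply_gradients(zip(grads, model.trainable_variables))                # the zip mentioned above
```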
Custom Model
Model can also be subclassed: inherit from it, implement the corresponding methods, and you can easily define your own model.
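A minimal sketch of such a subclass; the layer sizes are illustrative assumptions:

```python
class MyModel(keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = keras.layers.Dense(64, activation='relu')
        self.dense2 = keras.layers.Dense(10, activation='softmax')

    def call(self, inputs):
        # the forward pass; this is what model(inputs) runs
        x = self.dense1(inputs)
        return self.dense2(x)

model = MyModel()
```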
Custom Layer
In the same way as Model, Layer can be subclassed: inherit from it, implement the corresponding methods, and you can just as easily define your own network layer.
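A minimal sketch of a subclassed layer that reimplements a plain dense layer, assuming `import tensorflow as tf`:

```python
import tensorflow as tf

class MyDense(keras.layers.Layer):
    def __init__(self, units):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        # weights are created lazily once the input shape is known
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer='random_normal', trainable=True)
        self.b = self.add_weight(shape=(self.units,),
                                 initializer='zeros', trainable=True)

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b
```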
Model saving and loading
Method 1: the ModelCheckpoint callback described earlier
Method 2: save only the weights (the saved file alone does not reproduce the whole model)
Saving:

```python
model = keras.Sequential([...])
...
model.fit(...)
model.save_weights('weights.ckpt')
```
Loading:

Suppose we are in another file (the saved weights must of course be copied into the local directory first):

```python
model = keras.Sequential([...])   # this model must be built exactly the same as the one defined at save time
model.load_weights('weights.ckpt')
model.evaluate(...)
model.predict(...)
```
Method 3: save the whole model (the restored model is fully identical)

Saving:

```python
model = keras.Sequential([...])
...
model.fit(...)
model.save('model.h5')   # note the change: this time it is save(), not save_weights()
```
Loading: (load directly; there is no need to rebuild the model)

Suppose we are in another file (copy the saved model into the local directory first):

```python
model = keras.models.load_model('model.h5')   # load_model lives under keras.models
model.evaluate(...)
model.predict(...)
```
Method 4: export for use from other languages (production deployment)
Saving: (using the tf.saved_model module)

```python
model = keras.Sequential([...])
...
model.fit(...)
tf.saved_model.save(model, 'export_dir')   # the second argument is the export directory
```
Loading: (using the tf.saved_model module)

```python
model = tf.saved_model.load('export_dir')
```