From 076b131efb1d72c14f5e750abda12ced9fe44e56 Mon Sep 17 00:00:00 2001 From: Damoon Date: Thu, 27 Feb 2025 16:35:13 +0300 Subject: [PATCH 1/5] moe layer --- examples/vision/ipynb/mnist_moe.ipynb | 624 + examples/vision/md/mnist_moe.md | 22871 ++++++++++++++++ examples/vision/mnist_moe.py | 424 + .../examples/audio/vocal_track_separation.md | 921 + 4 files changed, 24840 insertions(+) create mode 100644 examples/vision/ipynb/mnist_moe.ipynb create mode 100644 examples/vision/md/mnist_moe.md create mode 100644 examples/vision/mnist_moe.py create mode 100644 templates/examples/audio/vocal_track_separation.md diff --git a/examples/vision/ipynb/mnist_moe.ipynb b/examples/vision/ipynb/mnist_moe.ipynb new file mode 100644 index 0000000000..0d4c4a0ad4 --- /dev/null +++ b/examples/vision/ipynb/mnist_moe.ipynb @@ -0,0 +1,624 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "# MoE for MNIST\n", + "\n", + "**Author:** [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/)
\n", + "**Date created:** 2025/02/27<br>
\n", + "**Last modified:** 2025/02/27<br>
\n", + "**Description:** Showcasing concepts related to Mixture of Experts (MoE)." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "# Introduction\n", + "\n", + "In this example, we implement an adaptation of the Mixture of Experts (MoE) architecture\n", + "([Shazeer et al.](https://arxiv.org/abs/1701.06538)).\n", + "The idea is to use conditional computation to increase model capacity without increasing computation.\n", + "Experts are identically structured blocks within a layer, each trained to specialize in a different part of the input space.\n", + "At each forward pass, a gating network selects a subset of experts to apply to the input.\n", + "\n", + "The components to implement are:\n", + "- Gating network: A dense layer that outputs a probability distribution over the experts.\n", + "- MoE layer: A layer that applies a different expert to each input in the batch, together with a loss function that encourages specialization among the experts.\n", + "- Model: A simple model that uses the MoE layer.\n", + "\n", + "In this example, we will first implement a linear MoE layer and then a CNN-based MoE layer. Lastly, we will combine the two using an abstract implementation to showcase its capabilities."
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "## Imports" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "import numpy as np\n", + "import keras\n", + "from keras import layers, models\n", + "import tensorflow as tf\n", + "from tensorflow.keras import backend as K" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "### Data Preparation" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "# Model / data parameters\n", + "num_classes = 10\n", + "input_shape = (28, 28, 1)\n", + "\n", + "# Load the data and split it between train and test sets\n", + "(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()\n", + "\n", + "# Scale images to the [0, 1] range\n", + "x_train = x_train.astype(\"float32\") / 255\n", + "x_test = x_test.astype(\"float32\") / 255\n", + "# Make sure images have shape (28, 28, 1)\n", + "x_train = np.expand_dims(x_train, -1)\n", + "x_test = np.expand_dims(x_test, -1)\n", + "print(\"x_train shape:\", x_train.shape)\n", + "print(x_train.shape[0], \"train samples\")\n", + "print(x_test.shape[0], \"test samples\")\n", + "\n", + "\n", + "# convert class vectors to binary class matrices\n", + "y_train = keras.utils.to_categorical(y_train, num_classes)\n", + "y_test = keras.utils.to_categorical(y_test, num_classes)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "## Constants" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "NUM_EXPERTS = 5\n", + "TOP_K = 3\n", + "BATCH_SIZE = 128\n", + "NUM_EPOCHS = 20\n", + "LEARNING_RATE = 0.001\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + },
"source": [ + "## Base architecture\n", + "\n", + "The most basic [MNIST classifier](https://keras.io/examples/vision/mnist_convnet/) consists of a stack of convolutional layers followed by a dense layer. In this tutorial, we will first replace the dense layer with a MoE layer. Then do the same for convolutional layers." + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "model = keras.Sequential(\n", + " [\n", + " keras.Input(shape=input_shape),\n", + " layers.Conv2D(32, kernel_size=(3, 3), activation=\"relu\"),\n", + " layers.MaxPooling2D(pool_size=(2, 2)),\n", + " layers.Conv2D(64, kernel_size=(3, 3), activation=\"relu\"),\n", + " layers.MaxPooling2D(pool_size=(2, 2)),\n", + " layers.Flatten(),\n", + " layers.Dropout(0.5),\n", + " layers.Dense(num_classes, activation=\"softmax\"),\n", + " ]\n", + ")\n", + "\n", + "model.summary()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "# Linear MoE using Dense layers\n", + "\n", + "For this layer, we will create multiple dense layers that will be used as experts. A simple gating network will then select, at each step, which experts should be utilized for the current input. We will keep track of the number of times each expert is used. The outputs of the selected experts will then be combined using a weighted sum."
+ ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "\n", + "class LinearMoE(layers.Layer):\n", + " def __init__(\n", + " self,\n", + " hidden_size,\n", + " num_experts=NUM_EXPERTS,\n", + " top_k=TOP_K,\n", + " ):\n", + " super(LinearMoE, self).__init__()\n", + "\n", + " # Initialize experts\n", + " self.experts = [\n", + " layers.Dense(\n", + " hidden_size,\n", + " kernel_initializer=tf.keras.initializers.RandomNormal(\n", + " mean=0.0, stddev=0.001\n", + " ),\n", + " bias_initializer=\"zeros\",\n", + " )\n", + " for _ in range(num_experts)\n", + " ]\n", + " # Initialize gating network\n", + " self.gating_network = layers.Dense(\n", + " num_experts,\n", + " kernel_initializer=tf.keras.initializers.RandomNormal(\n", + " mean=0.0, stddev=0.001\n", + " ),\n", + " bias_initializer=\"zeros\",\n", + " )\n", + "\n", + " self.num_experts = num_experts\n", + " self.top_k = top_k\n", + " # Keep track of how many times each expert is used\n", + " self.expert_usage_count = tf.Variable(\n", + " tf.zeros((num_experts,), dtype=tf.float32)\n", + " )\n", + "\n", + " def call(self, x):\n", + " # Get gating weights\n", + " gating_weights = self.gating_network(x)\n", + "\n", + " # Get the top k experts based on the gating weights\n", + " top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k)\n", + "\n", + " # Count usage of each expert symbolically\n", + " updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32)\n", + " # Use tf.tensor_scatter_nd_add to increment the usage count\n", + " self.expert_usage_count.assign(\n", + " tf.tensor_scatter_nd_add(\n", + " self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates\n", + " )\n", + " )\n", + "\n", + " # Get outputs from only the top-k experts\n", + " top_k_expert_outputs = tf.stack(\n", + " [\n", + " self.experts[expert_index](x)\n", + " for expert_index in top_k_indices.numpy()[0]\n", + " ],\n", + " 
axis=1,\n", + " ) # Stack outputs along axis 1\n", + "\n", + " # Combine outputs using top-k weights\n", + " combined_output = tf.einsum(\"ijk,ij->ik\", top_k_expert_outputs, top_k_weights)\n", + "\n", + " return combined_output\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "Output of the top 3 experts out of 10 for one layer of MoE:" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "sample_data = tf.random.uniform((1, 10))\n", + "linear_mode = LinearMoE(32, 10, 3)\n", + "linear_mode(sample_data)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "## Routing Collapse\n", + "\n", + "Routing collapse is a problem that occurs with MoE layers. The route terminology refers to the selection process of which expert to use for a given input.\n", + "\n", + "Route collapse happens when a routing model, early in training, starts favoring just a few experts because they perform slightly better due to random starting conditions. 
This leads to most examples being sent to these experts, leaving others unused and reducing the model\u2019s overall capacity.\n", + "\n", + "The code below demonstrates the randomness of expert selection:" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "\n", + "def check_expert_usage(runs):\n", + " # Running the layer multiple times to show randomness of expert selection\n", + " for i in range(runs):\n", + " sample_data = tf.random.uniform((1, 10))\n", + " linear_mode = LinearMoE(10, 5)\n", + " _ = linear_mode(sample_data)\n", + " print(f\"Run {i}, Expert usage: {linear_mode.expert_usage_count.numpy()}\")\n", + "\n", + "\n", + "check_expert_usage(4)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "### Adding loss functions to prevent route collapse\n", + "To fix this, we add auxiliary losses (importance and load losses), an idea borrowed from [Shazeer et al.](https://arxiv.org/abs/1701.06538), to ensure all experts get used evenly.\n", + "\n", + "The importance_loss calculates how much the usage of each expert (tracked in batch_importance_sum) deviates from the average usage (mean_importance) by using mean squared error, aiming to balance expert utilization. This helps prevent route collapse by discouraging the model from overloading a few experts, instead promoting an even distribution of examples across all experts to maintain diverse and effective routing.\n", + "\n", + "#### Load losses:\n", + " - Diversity loss: Diversity loss helps prevent route collapse by encouraging the routing model to evenly distribute examples across all experts, rather than favoring just a few due to their initial performance. 
It does this by maximizing the entropy of the gating weights, ensuring balanced expert utilization and improving the model's overall capacity.\n", + " - Overflow loss: The batch_overflow_sum measures how much the usage of experts exceeds a set capacity by applying ReLU to the difference between usage_counts (how many examples each expert handles) and batch_capacity (the allowed limit), then summing the excesses. This helps prevent route collapse by penalizing situations where certain experts are overused, encouraging a more even spread of examples across all experts to keep the model's capacity balanced." + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "\n", + "class LinearMoE(layers.Layer):\n", + " def __init__(\n", + " self,\n", + " hidden_size,\n", + " num_experts=NUM_EXPERTS,\n", + " top_k=TOP_K,\n", + " ):\n", + " super(LinearMoE, self).__init__()\n", + "\n", + " # Initialize experts\n", + " self.experts = [\n", + " layers.Dense(\n", + " hidden_size,\n", + " kernel_initializer=tf.keras.initializers.RandomNormal(\n", + " mean=0.0, stddev=0.001\n", + " ),\n", + " bias_initializer=\"zeros\",\n", + " )\n", + " for _ in range(num_experts)\n", + " ]\n", + " # Initialize gating network\n", + " self.gating_network = layers.Dense(\n", + " num_experts, # Match output to num_experts\n", + " kernel_initializer=tf.keras.initializers.RandomNormal(\n", + " mean=0.0, stddev=0.001\n", + " ),\n", + " bias_initializer=\"zeros\",\n", + " )\n", + "\n", + " self.num_experts = num_experts\n", + " self.top_k = top_k\n", + " # Keep track of how many times each expert is used as a layer weight\n", + " self.expert_usage_count = tf.Variable(\n", + " tf.zeros((num_experts,), dtype=tf.float32)\n", + " )\n", + "\n", + " self.batch_capacity = BATCH_SIZE // num_experts\n", + "\n", + " def _diversity_loss(self, weights):\n", + " entropy = -K.sum(weights * K.log(weights + 1e-10), axis=1)\n", + " 
self.diversity_loss = -K.mean(entropy)\n", + "\n", + " def _importance_loss(self, gating_weights):\n", + " batch_importance_sum = K.sum(gating_weights, axis=0)\n", + " mean_importance = K.mean(batch_importance_sum)\n", + " self.importance_loss = K.mean(\n", + " K.square(\n", + " batch_importance_sum\n", + " - mean_importance * tf.ones_like(batch_importance_sum)\n", + " )\n", + " )\n", + "\n", + " def call(self, x):\n", + " # Get gating weights and normalize\n", + " gating_weights = self.gating_network(x)\n", + " gating_weights = K.softmax(gating_weights) # Ensure weights are probabilities\n", + " self._diversity_loss(gating_weights)\n", + " self._importance_loss(gating_weights)\n", + "\n", + " # Get the top k experts based on the gating weights\n", + " top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k)\n", + "\n", + " # Count usage of each expert symbolically\n", + " updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32)\n", + " # Use tf.tensor_scatter_nd_add to increment the usage count\n", + " self.expert_usage_count.assign(\n", + " tf.tensor_scatter_nd_add(\n", + " self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates\n", + " )\n", + " )\n", + "\n", + " # Calculate overflow using updated usage count\n", + " self.batch_overflow_sum = K.sum(\n", + " K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity)\n", + " )\n", + "\n", + " # Compute all expert outputs\n", + " expert_outputs = tf.stack(\n", + " [expert(x) for expert in self.experts], axis=1\n", + " ) # Shape: (batch_size, num_experts, hidden_size)\n", + "\n", + " # Gather the top-k expert outputs using top_k_indices\n", + " batch_size = tf.shape(x)[0]\n", + " batch_indices = tf.expand_dims(\n", + " tf.range(batch_size), 1\n", + " ) # Shape: (batch_size, 1)\n", + " batch_indices = tf.tile(\n", + " batch_indices, [1, self.top_k]\n", + " ) # Shape: (batch_size, top_k)\n", + "\n", + " # Create indices for gathering\n", + " indices = 
tf.stack(\n", + " [batch_indices, top_k_indices], axis=2\n", + " ) # Shape: (batch_size, top_k, 2)\n", + " top_k_expert_outputs = tf.gather_nd(\n", + " expert_outputs, indices\n", + " ) # Shape: (batch_size, top_k, hidden_size)\n", + "\n", + " # Combine outputs using top-k weights\n", + " combined_output = tf.reduce_sum(\n", + " top_k_expert_outputs * tf.expand_dims(top_k_weights, axis=-1), axis=1\n", + " )\n", + "\n", + " return combined_output\n", + "\n", + " def compute_total_loss(self, load_balance_coef=0.01):\n", + " return load_balance_coef * (\n", + " self.diversity_loss + self.batch_overflow_sum + self.importance_loss\n", + " )\n", + "" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "## MNIST classification with MoE" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "\n", + "class MoEModel(keras.Model):\n", + " def __init__(self, input_shape, num_classes, num_experts=NUM_EXPERTS, top_k=TOP_K):\n", + " super(MoEModel, self).__init__()\n", + "\n", + " # Define the convolutional block\n", + " self.conv_block = keras.Sequential(\n", + " [\n", + " layers.Conv2D(32, kernel_size=(3, 3), activation=\"relu\"),\n", + " layers.MaxPooling2D(pool_size=(2, 2)),\n", + " layers.Conv2D(64, kernel_size=(3, 3), activation=\"relu\"),\n", + " layers.MaxPooling2D(pool_size=(2, 2)),\n", + " layers.Flatten(),\n", + " layers.Dropout(0.5),\n", + " ]\n", + " )\n", + "\n", + " # MoE classifier\n", + " self.moe_classifier = LinearMoE(\n", + " hidden_size=num_classes, num_experts=num_experts, top_k=top_k\n", + " )\n", + "\n", + " # Softmax layer\n", + " self.softmax = layers.Softmax()\n", + "\n", + " def call(self, inputs, training=False):\n", + " conv_flatten = self.conv_block(inputs)\n", + " moe_output = self.moe_classifier(conv_flatten)\n", + " outputs = self.softmax(moe_output)\n", + " return outputs\n", + "\n", + " def train_step(self, 
data):\n", + " x, y = data # Unpack input data and labels\n", + "\n", + " with tf.GradientTape() as tape:\n", + " y_pred = self(x, training=True)\n", + " classification_loss = self.compute_loss(x, y, y_pred)\n", + " moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)\n", + " total_loss = classification_loss + moe_loss\n", + "\n", + " # Compute gradients\n", + " gradients = tape.gradient(total_loss, self.trainable_variables)\n", + "\n", + " # Update weights\n", + " self.optimizer.apply_gradients(\n", + " zip(gradients, self.trainable_variables)\n", + " ) # Update metrics (e.g., accuracy)\n", + " self.compiled_metrics.update_state(y, y_pred)\n", + " # Return a dict of metrics for monitoring\n", + " return {\n", + " \"loss\": total_loss,\n", + " \"moe_loss\": moe_loss,\n", + " **{m.name: m.result() for m in self.metrics},\n", + " }\n", + "\n", + " def test_step(self, data):\n", + " x, y = data\n", + " y_pred = self(x, training=False)\n", + " classification_loss = self.compute_loss(x, y, y_pred)\n", + " moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)\n", + " total_loss = classification_loss + moe_loss\n", + "\n", + " self.compiled_metrics.update_state(y, y_pred)\n", + " return {\n", + " \"loss\": total_loss,\n", + " \"moe_loss\": moe_loss,\n", + " **{m.name: m.result() for m in self.metrics},\n", + " }\n", + "\n", + "\n", + "# Instantiate and compile the model\n", + "inputs = keras.Input(shape=input_shape)\n", + "model = MoEModel(\n", + " input_shape=input_shape, num_classes=num_classes, num_experts=6, top_k=4\n", + ")\n", + "\n", + "model.compile(\n", + " optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE),\n", + " loss=keras.losses.CategoricalCrossentropy(), # Assumes one-hot encoded labels\n", + " metrics=[\"accuracy\"],\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "### Training" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + 
"metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "history = model.fit(\n", + " x_train,\n", + " y_train,\n", + " batch_size=BATCH_SIZE,\n", + " epochs=NUM_EPOCHS,\n", + " validation_data=(x_test, y_test),\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "### Evaluation" + ] + }, + { + "cell_type": "code", + "execution_count": 0, + "metadata": { + "colab_type": "code" + }, + "outputs": [], + "source": [ + "score = model.evaluate(x_test, y_test, verbose=0)\n", + "print(\"Test loss:\", score[0])\n", + "print(\"Test accuracy:\", score[1])" + ] + } + ], + "metadata": { + "accelerator": "GPU", + "colab": { + "collapsed_sections": [], + "name": "mnist_moe", + "private_outputs": false, + "provenance": [], + "toc_visible": true + }, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.0" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} \ No newline at end of file diff --git a/examples/vision/md/mnist_moe.md b/examples/vision/md/mnist_moe.md new file mode 100644 index 0000000000..9165321e16 --- /dev/null +++ b/examples/vision/md/mnist_moe.md @@ -0,0 +1,22871 @@ +# MoE for MNIST + +**Author:** [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/)
+**Date created:** 2025/02/27<br>
+**Last modified:** 2025/02/27<br>
+**Description:** Showcasing concepts related to Mixture of Experts (MoE). + + + [**View in Colab**](https://colab.research.google.com/github/keras-team/keras-io/blob/master/examples/vision/ipynb/mnist_moe.ipynb) [**GitHub source**](https://github.com/keras-team/keras-io/blob/master/examples/vision/mnist_moe.py) + + + +# Introduction + +In this example, we implement an adaptation of the Mixture of Experts (MoE) architecture +([Shazeer et al.](https://arxiv.org/abs/1701.06538)). +The idea is to use conditional computation to increase model capacity without increasing computation. +Experts are identically structured blocks within a layer, each trained to specialize in a different part of the input space. +At each forward pass, a gating network selects a subset of experts to apply to the input. + +The components to implement are: +- Gating network: A dense layer that outputs a probability distribution over the experts. +- MoE layer: A layer that applies a different expert to each input in the batch, together with a loss function that encourages specialization among the experts. +- Model: A simple model that uses the MoE layer. + +In this example, we will first implement a linear MoE layer and then a CNN-based MoE layer. Lastly, we will combine the two using an abstract implementation to showcase its capabilities.
+ +--- +## Imports + + +```python +import numpy as np +import keras +from keras import layers, models +import tensorflow as tf +from tensorflow.keras import backend as K +``` + +### Data Preparation + + +```python +# Model / data parameters +num_classes = 10 +input_shape = (28, 28, 1) + +# Load the data and split it between train and test sets +(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data() + +# Scale images to the [0, 1] range +x_train = x_train.astype("float32") / 255 +x_test = x_test.astype("float32") / 255 +# Make sure images have shape (28, 28, 1) +x_train = np.expand_dims(x_train, -1) +x_test = np.expand_dims(x_test, -1) +print("x_train shape:", x_train.shape) +print(x_train.shape[0], "train samples") +print(x_test.shape[0], "test samples") + + +# convert class vectors to binary class matrices +y_train = keras.utils.to_categorical(y_train, num_classes) +y_test = keras.utils.to_categorical(y_test, num_classes) +``` + +<div class="k-default-codeblock">
+``` +x_train shape: (60000, 28, 28, 1) +60000 train samples +10000 test samples + +``` +
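The one-hot step above is easy to verify in isolation. A minimal NumPy sketch of what `keras.utils.to_categorical` produces for integer labels (toy labels, illustrative names):

```python
import numpy as np

def to_one_hot(labels, num_classes):
    # Build a (len(labels), num_classes) matrix with a single 1.0 per row,
    # mirroring what keras.utils.to_categorical returns for integer labels.
    one_hot = np.zeros((len(labels), num_classes), dtype="float32")
    one_hot[np.arange(len(labels)), labels] = 1.0
    return one_hot

labels = np.array([3, 0, 9])  # toy MNIST-style class ids (illustrative)
encoded = to_one_hot(labels, 10)
print(encoded.shape)  # (3, 10)
```

Each row sums to one, which is what the categorical cross-entropy loss used later expects.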
+--- +## Constants + + +```python +NUM_EXPERTS = 5 +TOP_K = 3 +BATCH_SIZE = 128 +NUM_EPOCHS = 20 +LEARNING_RATE = 0.001 + +``` + +--- +## Base architecture + +The most basic [MNIST classifier](https://keras.io/examples/vision/mnist_convnet/) consists of a stack of convolutional layers followed by a dense layer. In this tutorial, we will first replace the dense layer with a MoE layer. Then do the same for convolutional layers. + + +```python +model = keras.Sequential( + [ + keras.Input(shape=input_shape), + layers.Conv2D(32, kernel_size=(3, 3), activation="relu"), + layers.MaxPooling2D(pool_size=(2, 2)), + layers.Conv2D(64, kernel_size=(3, 3), activation="relu"), + layers.MaxPooling2D(pool_size=(2, 2)), + layers.Flatten(), + layers.Dropout(0.5), + layers.Dense(num_classes, activation="softmax"), + ] +) + +model.summary() +``` + + +
Model: "sequential"
+
+ + + + +
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
+┃ Layer (type)                     Output Shape                  Param # ┃
+┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
+│ conv2d (Conv2D)                 │ (None, 26, 26, 32)     │           320 │
+├─────────────────────────────────┼────────────────────────┼───────────────┤
+│ max_pooling2d (MaxPooling2D)    │ (None, 13, 13, 32)     │             0 │
+├─────────────────────────────────┼────────────────────────┼───────────────┤
+│ conv2d_1 (Conv2D)               │ (None, 11, 11, 64)     │        18,496 │
+├─────────────────────────────────┼────────────────────────┼───────────────┤
+│ max_pooling2d_1 (MaxPooling2D)  │ (None, 5, 5, 64)       │             0 │
+├─────────────────────────────────┼────────────────────────┼───────────────┤
+│ flatten (Flatten)               │ (None, 1600)           │             0 │
+├─────────────────────────────────┼────────────────────────┼───────────────┤
+│ dropout (Dropout)               │ (None, 1600)           │             0 │
+├─────────────────────────────────┼────────────────────────┼───────────────┤
+│ dense (Dense)                   │ (None, 10)             │        16,010 │
+└─────────────────────────────────┴────────────────────────┴───────────────┘
+
+ + + + +
 Total params: 34,826 (136.04 KB)
+
+ + + + +
 Trainable params: 34,826 (136.04 KB)
+
+ + + + +
 Non-trainable params: 0 (0.00 B)
+
+ + + +# Linear MoE using Dense layers + +For this layer, we will create multiple dense layers that will be used as experts. A simple gating network will then select, at each step, which experts should be utilized for the current input. We will keep track of the number of times each expert is used. The outputs of the selected experts will then be combined using a weighted sum. + + +```python + +class LinearMoE(layers.Layer): + def __init__( + self, + hidden_size, + num_experts=NUM_EXPERTS, + top_k=TOP_K, + ): + super(LinearMoE, self).__init__() + + # Initialize experts + self.experts = [ + layers.Dense( + hidden_size, + kernel_initializer=tf.keras.initializers.RandomNormal( + mean=0.0, stddev=0.001 + ), + bias_initializer="zeros", + ) + for _ in range(num_experts) + ] + # Initialize gating network + self.gating_network = layers.Dense( + num_experts, + kernel_initializer=tf.keras.initializers.RandomNormal( + mean=0.0, stddev=0.001 + ), + bias_initializer="zeros", + ) + + self.num_experts = num_experts + self.top_k = top_k + # Keep track of how many times each expert is used + self.expert_usage_count = tf.Variable( + tf.zeros((num_experts,), dtype=tf.float32) + ) + + def call(self, x): + # Get gating weights + gating_weights = self.gating_network(x) + + # Get the top k experts based on the gating weights + top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + + # Count usage of each expert symbolically + updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) + # Use tf.tensor_scatter_nd_add to increment the usage count + self.expert_usage_count.assign( + tf.tensor_scatter_nd_add( + self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates + ) + ) + + # Get outputs from only the top-k experts + top_k_expert_outputs = tf.stack( + [ + self.experts[expert_index](x) + for expert_index in top_k_indices.numpy()[0] + ], + axis=1, + ) # Stack outputs along axis 1 + + # Combine outputs using top-k weights + combined_output = 
tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights) + + return combined_output + +``` + +Output of the top 3 experts out of 10 for one layer of MoE: + + +```python +sample_data = tf.random.uniform((1, 10)) +linear_mode = LinearMoE(32, 10, 3) +linear_mode(sample_data) +``` + + + + +
+``` + + +``` +
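The select-and-mix step performed inside `call` can be illustrated without TensorFlow. A minimal NumPy sketch of top-k gating and the weighted sum (toy gate values and constant expert outputs, not the layer's actual weights):

```python
import numpy as np

def top_k_gate(gating_logits, expert_outputs, k):
    # gating_logits: (num_experts,); expert_outputs: (num_experts, hidden).
    # Pick the k largest gate values, then mix the matching expert outputs.
    top_k_indices = np.argsort(gating_logits)[-k:][::-1]
    top_k_weights = gating_logits[top_k_indices]
    # Weighted sum over the selected experts, as the einsum does in `call`.
    return top_k_weights @ expert_outputs[top_k_indices], top_k_indices

gates = np.array([0.1, 0.5, 0.2, 0.15, 0.05])  # toy gate values (illustrative)
outputs = np.arange(5)[:, None] * np.ones((5, 4))  # expert i emits all-i vectors
mixed, chosen = top_k_gate(gates, outputs, k=3)
print(sorted(chosen.tolist()))  # [1, 2, 3]: the three largest gates win
```

Only the `k` selected experts contribute to the output, which is what makes the computation conditional: the remaining experts are skipped entirely for this input.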
+--- +## Routing Collapse + +Routing collapse is a problem that occurs with MoE layers. The route terminology refers to the selection process of which expert to use for a given input. + +Route collapse happens when a routing model, early in training, starts favoring just a few experts because they perform slightly better due to random starting conditions. This leads to most examples being sent to these experts, leaving others unused and reducing the model’s overall capacity. + +The code below demonstrates the randomness of expert selection: + + +```python + +def check_expert_usage(runs): + # Running the layer multiple times to show randomness of expert selection + for i in range(runs): + sample_data = tf.random.uniform((1, 10)) + linear_mode = LinearMoE(10, 5) + _ = linear_mode(sample_data) + print(f"Run {i}, Expert usage: {linear_mode.expert_usage_count.numpy()}") + + +check_expert_usage(4) +``` + +<div class="k-default-codeblock">
+``` +Run 0, Expert usage: [1. 0. 1. 1. 0.] +Run 1, Expert usage: [0. 1. 1. 0. 1.] +Run 2, Expert usage: [1. 1. 0. 1. 0.] +Run 3, Expert usage: [1. 0. 1. 1. 0.] + +``` +
+### Adding loss functions to prevent route collapse +To fix this, we add auxiliary losses (importance and load losses), an idea borrowed from [Shazeer et al.](https://arxiv.org/abs/1701.06538), to ensure all experts get used evenly. + +The importance_loss calculates how much the usage of each expert (tracked in batch_importance_sum) deviates from the average usage (mean_importance) by using mean squared error, aiming to balance expert utilization. This helps prevent route collapse by discouraging the model from overloading a few experts, instead promoting an even distribution of examples across all experts to maintain diverse and effective routing. + +#### Load losses: + - Diversity loss: Diversity loss helps prevent route collapse by encouraging the routing model to evenly distribute examples across all experts, rather than favoring just a few due to their initial performance. It does this by maximizing the entropy of the gating weights, ensuring balanced expert utilization and improving the model's overall capacity. + - Overflow loss: The batch_overflow_sum measures how much the usage of experts exceeds a set capacity by applying ReLU to the difference between usage_counts (how many examples each expert handles) and batch_capacity (the allowed limit), then summing the excesses. This helps prevent route collapse by penalizing situations where certain experts are overused, encouraging a more even spread of examples across all experts to keep the model's capacity balanced.
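The three auxiliary terms can be checked numerically. A NumPy approximation of the diversity, importance, and overflow computations (toy gating matrices, illustrative only; the layer itself computes these with Keras backend ops):

```python
import numpy as np

def aux_losses(gating_weights, usage_counts, batch_capacity):
    # gating_weights: (batch, num_experts) softmax probabilities.
    # Diversity: negative mean entropy, so minimizing it spreads the gates.
    entropy = -np.sum(gating_weights * np.log(gating_weights + 1e-10), axis=1)
    diversity_loss = -np.mean(entropy)
    # Importance: squared deviation of per-expert traffic from its mean.
    batch_importance = gating_weights.sum(axis=0)
    importance_loss = np.mean((batch_importance - batch_importance.mean()) ** 2)
    # Overflow: penalize experts handling more than batch_capacity examples.
    overflow = np.sum(np.maximum(usage_counts - batch_capacity, 0.0))
    return diversity_loss, importance_loss, overflow

uniform = np.full((4, 5), 0.2)       # perfectly balanced gates
collapsed = np.eye(5)[[0, 0, 0, 0]]  # every example routed to expert 0
d_u, i_u, _ = aux_losses(uniform, np.zeros(5), 2)
d_c, i_c, _ = aux_losses(collapsed, np.array([4.0, 0, 0, 0, 0]), 2)
print(d_u < d_c and i_u < i_c)  # True: balanced routing scores lower on both
```

Minimizing the diversity term maximizes gate entropy, while the importance term pushes per-expert traffic toward the mean; together with the overflow penalty they discourage collapse onto a handful of experts.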
+ + +```python + +class LinearMoE(layers.Layer): + def __init__( + self, + hidden_size, + num_experts=NUM_EXPERTS, + top_k=TOP_K, + ): + super(LinearMoE, self).__init__() + + # Initialize experts + self.experts = [ + layers.Dense( + hidden_size, + kernel_initializer=tf.keras.initializers.RandomNormal( + mean=0.0, stddev=0.001 + ), + bias_initializer="zeros", + ) + for _ in range(num_experts) + ] + # Initialize gating network + self.gating_network = layers.Dense( + num_experts, # Match output to num_experts + kernel_initializer=tf.keras.initializers.RandomNormal( + mean=0.0, stddev=0.001 + ), + bias_initializer="zeros", + ) + + self.num_experts = num_experts + self.top_k = top_k + # Keep track of how many times each expert is used as a layer weight + self.expert_usage_count = tf.Variable( + tf.zeros((num_experts,), dtype=tf.float32) + ) + + self.batch_capacity = BATCH_SIZE // num_experts + + def _diversity_loss(self, weights): + entropy = -K.sum(weights * K.log(weights + 1e-10), axis=1) + self.diversity_loss = -K.mean(entropy) + + def _importance_loss(self, gating_weights): + batch_importance_sum = K.sum(gating_weights, axis=0) + mean_importance = K.mean(batch_importance_sum) + self.importance_loss = K.mean( + K.square( + batch_importance_sum + - mean_importance * tf.ones_like(batch_importance_sum) + ) + ) + + def call(self, x): + # Get gating weights and normalize + gating_weights = self.gating_network(x) + gating_weights = K.softmax(gating_weights) # Ensure weights are probabilities + self._diversity_loss(gating_weights) + self._importance_loss(gating_weights) + + # Get the top k experts based on the gating weights + top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + + # Count usage of each expert symbolically + updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) + # Use tf.tensor_scatter_nd_add to increment the usage count + self.expert_usage_count.assign( + tf.tensor_scatter_nd_add( + self.expert_usage_count, 
tf.reshape(top_k_indices, [-1, 1]), updates + ) + ) + + # Calculate overflow using updated usage count + self.batch_overflow_sum = K.sum( + K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) + ) + + # Compute all expert outputs + expert_outputs = tf.stack( + [expert(x) for expert in self.experts], axis=1 + ) # Shape: (batch_size, num_experts, hidden_size) + + # Gather the top-k expert outputs using top_k_indices + batch_size = tf.shape(x)[0] + batch_indices = tf.expand_dims( + tf.range(batch_size), 1 + ) # Shape: (batch_size, 1) + batch_indices = tf.tile( + batch_indices, [1, self.top_k] + ) # Shape: (batch_size, top_k) + + # Create indices for gathering + indices = tf.stack( + [batch_indices, top_k_indices], axis=2 + ) # Shape: (batch_size, top_k, 2) + top_k_expert_outputs = tf.gather_nd( + expert_outputs, indices + ) # Shape: (batch_size, top_k, hidden_size) + + # Combine outputs using top-k weights + combined_output = tf.reduce_sum( + top_k_expert_outputs * tf.expand_dims(top_k_weights, axis=-1), axis=1 + ) + + return combined_output + + def compute_total_loss(self, load_balance_coef=0.01): + return load_balance_coef * ( + self.diversity_loss + self.batch_overflow_sum + self.importance_loss + ) + +``` + +--- +## MNIST classification with MoE + + +```python + +class MoEModel(keras.Model): + def __init__(self, input_shape, num_classes, num_experts=NUM_EXPERTS, top_k=TOP_K): + super(MoEModel, self).__init__() + + # Define the convolutional block + self.conv_block = keras.Sequential( + [ + layers.Conv2D(32, kernel_size=(3, 3), activation="relu"), + layers.MaxPooling2D(pool_size=(2, 2)), + layers.Conv2D(64, kernel_size=(3, 3), activation="relu"), + layers.MaxPooling2D(pool_size=(2, 2)), + layers.Flatten(), + layers.Dropout(0.5), + ] + ) + + # MoE classifier + self.moe_classifier = LinearMoE( + hidden_size=num_classes, num_experts=num_experts, top_k=top_k + ) + + # Softmax layer + self.softmax = layers.Softmax() + + def call(self, inputs, 
training=False): + conv_flatten = self.conv_block(inputs) + moe_output = self.moe_classifier(conv_flatten) + outputs = self.softmax(moe_output) + return outputs + + def train_step(self, data): + x, y = data # Unpack input data and labels + + with tf.GradientTape() as tape: + y_pred = self(x, training=True) + classification_loss = self.compute_loss(x, y, y_pred) + moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01) + total_loss = classification_loss + moe_loss + + # Compute gradients + gradients = tape.gradient(total_loss, self.trainable_variables) + + # Update weights + self.optimizer.apply_gradients( + zip(gradients, self.trainable_variables) + ) # Update metrics (e.g., accuracy) + self.compiled_metrics.update_state(y, y_pred) + # Return a dict of metrics for monitoring + return { + "loss": total_loss, + "moe_loss": moe_loss, + **{m.name: m.result() for m in self.metrics}, + } + + def test_step(self, data): + x, y = data + y_pred = self(x, training=False) + classification_loss = self.compute_loss(x, y, y_pred) + moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01) + total_loss = classification_loss + moe_loss + + self.compiled_metrics.update_state(y, y_pred) + return { + "loss": total_loss, + "moe_loss": moe_loss, + **{m.name: m.result() for m in self.metrics}, + } + + +# Instantiate and compile the model +inputs = keras.Input(shape=input_shape) +model = MoEModel( + input_shape=input_shape, num_classes=num_classes, num_experts=6, top_k=4 +) + +model.compile( + optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE), + loss=keras.losses.CategoricalCrossentropy(), # Assumes one-hot encoded labels + metrics=["accuracy"], +) +``` + +### Training + + +```python +history = model.fit( + x_train, + y_train, + batch_size=BATCH_SIZE, + epochs=NUM_EPOCHS, + validation_data=(x_test, y_test), +) +``` + +
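Before the training log, it may help to see the routing and auxiliary-loss arithmetic from `LinearMoE` in isolation. Below is a minimal NumPy sketch of the top-k gating step, the weighted combination of expert outputs, and the diversity/importance penalties; the toy shapes and random seed are illustrative assumptions, not values from the example above.

```python
import numpy as np

# Toy sizes for illustration only (not the example's real hyperparameters).
batch_size, num_experts, hidden_size, top_k = 4, 3, 5, 2

rng = np.random.default_rng(0)
gating_logits = rng.normal(size=(batch_size, num_experts))

# Softmax over experts, as produced by the gating network.
exp = np.exp(gating_logits - gating_logits.max(axis=1, keepdims=True))
gating_weights = exp / exp.sum(axis=1, keepdims=True)

# Top-k selection (the NumPy analogue of tf.math.top_k).
top_k_indices = np.argsort(gating_weights, axis=1)[:, ::-1][:, :top_k]
top_k_weights = np.take_along_axis(gating_weights, top_k_indices, axis=1)

# Pretend every expert already produced an output for every input.
expert_outputs = rng.normal(size=(batch_size, num_experts, hidden_size))

# Gather the selected experts' outputs and combine them with their
# gating weights, mirroring the tf.gather_nd + weighted-sum step.
selected = np.take_along_axis(
    expert_outputs, top_k_indices[:, :, None], axis=1
)  # shape: (batch_size, top_k, hidden_size)
combined = (selected * top_k_weights[:, :, None]).sum(axis=1)  # (4, 5)

# Auxiliary losses computed from the same gating weights, mirroring
# _diversity_loss and _importance_loss above.
entropy = -(gating_weights * np.log(gating_weights + 1e-10)).sum(axis=1)
diversity_loss = -entropy.mean()
batch_importance_sum = gating_weights.sum(axis=0)
importance_loss = (
    (batch_importance_sum - batch_importance_sum.mean()) ** 2
).mean()
```

The `np.take_along_axis` calls play the role of the `tf.gather_nd` indexing in the layer: for a given input, only the `top_k` selected expert outputs contribute to the combined result.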
+
```
Epoch 1/20

/opt/homebrew/Caskroom/miniforge/base/envs/keras-io/lib/python3.11/site-packages/keras/src/backend/tensorflow/trainer.py:642: UserWarning: `model.compiled_metrics()` is deprecated. Instead, use e.g.:
for metric in self.metrics:
    metric.update_state(y, y_pred)
  return self._compiled_metrics_update_state(
```
```
469/469 ━━━━━━━━━━━━━━━━━━━━ 9s 18ms/step - accuracy: 0.7956 - loss: 0.1000 - moe_loss: 1204.5302 - val_loss: 0.1000 - val_moe_loss: 2798.7275
```
+``` +Epoch 2/20 + +``` +
```
469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9686 - loss: 0.1000 - moe_loss: 4004.5132 - val_loss: 0.1000 - val_moe_loss: 5598.7266
```
+``` +Epoch 3/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9785 - loss: 0.1000 - moe_loss: 6804.5000 - val_loss: 0.1000 - val_moe_loss: 8398.7275 + + +
+``` +Epoch 4/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 17ms/step - accuracy: 0.9822 - loss: 0.1000 - moe_loss: 9604.4932 - val_loss: 0.1000 - val_moe_loss: 11198.7256 + + +
+``` +Epoch 5/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9843 - loss: 0.1000 - moe_loss: 12404.4883 - val_loss: 0.1000 - val_moe_loss: 13998.7246 + + +
+``` +Epoch 6/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 15204.4854 - val_loss: 0.1000 - val_moe_loss: 16798.7246 + + +
+``` +Epoch 7/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 9s 19ms/step - accuracy: 0.9884 - loss: 0.1000 - moe_loss: 18004.4863 - val_loss: 0.1000 - val_moe_loss: 19598.7227 + + +
+``` +Epoch 8/20 + +``` +
+ 353/469 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20504.9785 + +
+``` + +``` +
+ 356/469 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20512.6582 + +
+``` + +``` +
+ 359/469 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20520.3398 + +
+``` + +``` +
+ 362/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20528.0195 + +
+``` + +``` +
+ 365/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20535.6992 + +
+``` + +``` +
+ 368/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20543.3789 + +
+``` + +``` +
+ 371/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20551.0586 + +
+``` + +``` +
+ 374/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20558.7383 + +
+``` + +``` +
+ 377/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20566.4180 + +
+``` + +``` +
+ 380/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20574.0996 + +
+``` + +``` +
+ 383/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20581.7793 + +
+``` + +``` +
+ 386/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20589.4590 + +
+``` + +``` +
+ 389/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20597.1387 + +
+``` + +``` +
+ 392/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20604.8184 + +
+``` + +``` +
+ 393/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20607.3789 + +
+``` + +``` +
+ 394/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20609.9395 + +
+``` + +``` +
+ 395/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20612.4980 + +
+``` + +``` +
+ 398/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20620.1797 + +
+``` + +``` +
+ 401/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20627.8594 + +
+``` + +``` +
+ 404/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20635.5391 + +
+``` + +``` +
+ 407/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20643.2188 + +
+``` + +``` +
+ 410/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20650.8984 + +
+``` + +``` +
+ 413/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20658.5781 + +
+``` + +``` +
+ 416/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20666.2578 + +
+``` + +``` +
+ 419/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20673.9395 + +
+``` + +``` +
+ 422/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20681.6191 + +
+``` + +``` +
+ 425/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20689.2988 + +
+``` + +``` +
+ 428/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20696.9785 + +
+``` + +``` +
+ 431/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20704.6582 + +
+``` + +``` +
+ 434/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20712.3379 + +
+``` + +``` +
+ 437/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20720.0195 + +
+``` + +``` +
+ 440/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20727.6992 + +
+``` + +``` +
+ 443/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20735.3789 + +
+``` + +``` +
+ 446/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20743.0586 + +
+``` + +``` +
+ 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20750.7383 + +
+``` + +``` +
+ 452/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20758.4180 + +
+``` + +``` +
+ 455/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20766.0996 + +
+``` + +``` +
+ 458/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20773.7793 + +
+``` + +``` +
+ 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20781.4590 + +
+``` + +``` +
+ 464/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20789.1387 + +
+``` + +``` +
+ 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20796.8184 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 10s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20804.4824 - val_loss: 0.1000 - val_moe_loss: 22398.7227 + + +
+``` +Epoch 9/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 10s 21ms/step - accuracy: 0.9908 - loss: 0.1000 - moe_loss: 23604.4824 - val_loss: 0.1000 - val_moe_loss: 25198.7246 + + +
+``` +Epoch 10/20 + +``` +
+ 202/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25718.4160 + +
+``` + +``` +
+ 205/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25726.0957 + +
+``` + +``` +
+ 208/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25733.7773 + +
+``` + +``` +
+ 211/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25741.4570 + +
+``` + +``` +
+ 214/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25749.1367 + +
+``` + +``` +
+ 217/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25756.8164 + +
+``` + +``` +
+ 220/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25764.4961 + +
+``` + +``` +
+ 223/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25772.1758 + +
+``` + +``` +
+ 226/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25779.8574 + +
+``` + +``` +
+ 229/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25787.5371 + +
+``` + +``` +
+ 232/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25795.2168 + +
+``` + +``` +
+ 235/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25802.8965 + +
+``` + +``` +
+ 238/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25810.5762 + +
+``` + +``` +
+ 241/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25818.2559 + +
+``` + +``` +
+ 244/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25825.9355 + +
+``` + +``` +
+ 247/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25833.6172 + +
+``` + +``` +
+ 250/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25841.2969 + +
+``` + +``` +
+ 253/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25848.9766 + +
+``` + +``` +
+ 256/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25856.6562 + +
+``` + +``` +
+ 259/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25864.3359 + +
+``` + +``` +
+ 262/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25872.0176 + +
+``` + +``` +
+ 265/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25879.6973 + +
+``` + +``` +
+ 268/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25887.3770 + +
+``` + +``` +
+ 271/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25895.0566 + +
+``` + +``` +
+ 274/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25902.7363 + +
+``` + +``` +
+ 277/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25910.4160 + +
+``` + +``` +
+ 280/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25918.0977 + +
+``` + +``` +
+ 283/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25925.7773 + +
+``` + +``` +
+ 286/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25933.4570 + +
+``` + +``` +
+ 288/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25938.5762 + +
+``` + +``` +
+ 289/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25941.1367 + +
+``` + +``` +
+ 290/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25943.6973 + +
+``` + +``` +
+ 293/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25951.3770 + +
+``` + +``` +
+ 296/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25959.0566 + +
+``` + +``` +
+ 299/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25966.7363 + +
+``` + +``` +
+ 302/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25974.4160 + +
+``` + +``` +
+ 305/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25982.0977 + +
+``` + +``` +
+ 308/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25989.7773 + +
+``` + +``` +
+ 311/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 25997.4570 + +
+``` + +``` +
+ 314/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26005.1367 + +
+``` + +``` +
+ 317/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26012.8164 + +
+``` + +``` +
+ 320/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26020.4961 + +
+``` + +``` +
+ 323/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26028.1777 + +
+``` + +``` +
+ 326/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26035.8574 + +
+``` + +``` +
+ 329/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26043.5371 + +
+``` + +``` +
+ 332/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26051.2168 + +
+``` + +``` +
+ 335/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26058.8965 + +
+``` + +``` +
+ 338/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26066.5781 + +
+``` + +``` +
+ 341/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26074.2578 + +
+``` + +``` +
+ 344/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26081.9375 + +
+``` + +``` +
+ 347/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26089.6172 + +
+``` + +``` +
+ 350/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26097.2969 + +
+``` + +``` +
+ 353/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26104.9766 + +
+``` + +``` +
+ 356/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26112.6562 + +
+``` + +``` +
+ 359/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26120.3379 + +
+``` + +``` +
+ 362/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26128.0176 + +
+``` + +``` +
+ 365/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26135.6973 + +
+``` + +``` +
+ 368/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26143.3770 + +
+``` + +``` +
+ 371/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26151.0566 + +
+``` + +``` +
+ 374/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26158.7363 + +
+``` + +``` +
+ 377/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26166.4180 + +
+``` + +``` +
+ 380/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26174.0977 + +
+``` + +``` +
+ 383/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26181.7773 + +
+``` + +``` +
+ 385/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26186.8965 + +
+``` + +``` +
+ 388/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26194.5762 + +
+``` + +``` +
+ 391/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26202.2578 + +
+``` + +``` +
+ 394/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26209.9375 + +
+``` + +``` +
+ 397/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26217.6172 + +
+``` + +``` +
+ 400/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26225.2969 + +
+``` + +``` +
+ 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26232.9766 + +
+``` + +``` +
+ 406/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26240.6562 + +
+``` + +``` +
+ 409/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26248.3379 + +
+``` + +``` +
+ 412/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26256.0176 + +
+``` + +``` +
+ 415/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26263.6973 + +
+``` + +``` +
+ 418/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26271.3770 + +
+``` + +``` +
+ 421/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26279.0566 + +
+``` + +``` +
+ 424/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26286.7363 + +
+``` + +``` +
+ 427/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26294.4180 + +
+``` + +``` +
+ 430/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26302.0977 + +
+``` + +``` +
+ 433/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26309.7773 + +
+``` + +``` +
+ 436/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26317.4570 + +
+``` + +``` +
+ 439/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26325.1367 + +
+``` + +``` +
+ 442/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26332.8164 + +
+``` + +``` +
+ 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26340.4980 + +
+``` + +``` +
+ 448/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26348.1777 + +
+``` + +``` +
+ 451/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26355.8574 + +
+``` + +``` +
+ 453/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26360.9766 + +
+``` + +``` +
+ 456/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26368.6562 + +
+``` + +``` +
+ 459/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26376.3379 + +
+``` + +``` +
+ 462/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26384.0176 + +
+``` + +``` +
+ 465/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26391.6973 + +
+``` + +``` +
+ 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26396.8164 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 11s 23ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26404.4805 - val_loss: 0.1000 - val_moe_loss: 27998.7227 + + +
+``` +Epoch 11/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 12s 25ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 29204.4805 - val_loss: 0.1000 - val_moe_loss: 30798.7227 + + +
+``` +Epoch 12/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 14s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 32004.4805 - val_loss: 0.1000 - val_moe_loss: 33598.7227 + + +
+``` +Epoch 13/20 + +``` +
+``` + +``` +
+ 81/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33808.6562 + +
+``` + +``` +
+ 83/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33813.7734 + +
+``` + +``` +
+ 85/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33818.8945 + +
+``` + +``` +
+ 87/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33824.0156 + +
+``` + +``` +
+ 89/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33829.1367 + +
+``` + +``` +
+ 91/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33834.2539 + +
+``` + +``` +
+ 93/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33839.3750 + +
+``` + +``` +
+ 95/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33844.4961 + +
+``` + +``` +
+ 97/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33849.6133 + +
+``` + +``` +
+ 99/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33854.7344 + +
+``` + +``` +
+ 101/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33859.8555 + +
+``` + +``` +
+ 103/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33864.9766 + +
+``` + +``` +
+ 105/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33870.0938 + +
+``` + +``` +
+ 107/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33875.2148 + +
+``` + +``` +
+ 109/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33880.3359 + +
+``` + +``` +
+ 111/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33885.4531 + +
+``` + +``` +
+ 113/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33890.5742 + +
+``` + +``` +
+ 115/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33895.6953 + +
+``` + +``` +
+ 117/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33900.8164 + +
+``` + +``` +
+ 119/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33905.9336 + +
+``` + +``` +
+ 121/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33911.0547 + +
+``` + +``` +
+ 123/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33916.1758 + +
+``` + +``` +
+ 125/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33921.2930 + +
+``` + +``` +
+ 127/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33926.4141 + +
+``` + +``` +
+ 129/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33931.5352 + +
+``` + +``` +
+ 131/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33936.6562 + +
+``` + +``` +
+ 133/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33941.7734 + +
+``` + +``` +
+ 135/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33946.8945 + +
+``` + +``` +
+ 137/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33952.0156 + +
+``` + +``` +
+ 139/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33957.1367 + +
+``` + +``` +
+ 141/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33962.2539 + +
+``` + +``` +
+ 143/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33967.3750 + +
+``` + +``` +
+ 145/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33972.4961 + +
+``` + +``` +
+ 147/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33977.6172 + +
+``` + +``` +
+ 149/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33982.7344 + +
+``` + +``` +
+ 151/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33987.8555 + +
+``` + +``` +
+ 153/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33992.9766 + +
+``` + +``` +
+ 155/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33998.0938 + +
+``` + +``` +
+ 157/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34003.2148 + +
+``` + +``` +
+ 159/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34008.3359 + +
+``` + +``` +
+ 161/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34013.4570 + +
+``` + +``` +
+ 163/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34018.5742 + +
+``` + +``` +
+ 165/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34023.6953 + +
+``` + +``` +
+ 167/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34028.8164 + +
+``` + +``` +
+ 169/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34033.9336 + +
+``` + +``` +
+ 171/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34039.0547 + +
+``` + +``` +
+ 173/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34044.1758 + +
+``` + +``` +
+ 175/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34049.2969 + +
+``` + +``` +
+ 177/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34054.4141 + +
+``` + +``` +
+ 179/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34059.5352 + +
+``` + +``` +
+ 181/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34064.6562 + +
+``` + +``` +
+ 183/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34069.7734 + +
+``` + +``` +
+ 185/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34074.8945 + +
+``` + +``` +
+ 186/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34077.4531 + +
+``` + +``` +
+ 187/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34080.0156 + +
+``` + +``` +
+ 188/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34082.5742 + +
+``` + +``` +
+ 190/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34087.6953 + +
+``` + +``` +
+ 192/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34092.8164 + +
+``` + +``` +
+ 194/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34097.9336 + +
+``` + +``` +
+ 196/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34103.0547 + +
+``` + +``` +
+ 198/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34108.1758 + +
+``` + +``` +
+ 200/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34113.2930 + +
+``` + +``` +
+ 202/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34118.4141 + +
+``` + +``` +
+ 204/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34123.5352 + +
+``` + +``` +
+ 206/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34128.6562 + +
+``` + +``` +
+ 208/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34133.7734 + +
+``` + +``` +
+ 210/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34138.8945 + +
+``` + +``` +
+ 212/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34144.0156 + +
+``` + +``` +
+ 214/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34149.1367 + +
+``` + +``` +
+ 216/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34154.2539 + +
+``` + +``` +
+ 218/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34159.3750 + +
+``` + +``` +
+ 220/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34164.4961 + +
+``` + +``` +
+ 222/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34169.6133 + +
+``` + +``` +
+ 224/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34174.7344 + +
+``` + +``` +
+ 226/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34179.8555 + +
+``` + +``` +
+ 228/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34184.9766 + +
+``` + +``` +
+ 230/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34190.0938 + +
+``` + +``` +
+ 232/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34195.2148 + +
+``` + +``` +
+ 234/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34200.3359 + +
+``` + +``` +
+ 236/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34205.4531 + +
+``` + +``` +
+ 238/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34210.5742 + +
+``` + +``` +
+ 240/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34215.6953 + +
+``` + +``` +
+ 242/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34220.8164 + +
+``` + +``` +
+ 244/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34225.9336 + +
+``` + +``` +
+ 246/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34231.0547 + +
+``` + +``` +
+ 248/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34236.1758 + +
+``` + +``` +
+ 250/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34241.2969 + +
+``` + +``` +
+ 252/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34246.4141 + +
+``` + +``` +
+ 254/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34251.5352 + +
+``` + +``` +
+ 256/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34256.6562 + +
+``` + +``` +
+ 258/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34261.7734 + +
+``` + +``` +
+ 260/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34266.8945 + +
+``` + +``` +
+ 262/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34272.0156 + +
+``` + +``` +
+ 263/469 ━━━━━━━━━━━━━━━━━━━━ 6s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34274.5742 + +
+``` + +``` +
+ 265/469 ━━━━━━━━━━━━━━━━━━━━ 6s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34279.6953 + +
+``` + +``` +
+ 267/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34284.8164 + +
+``` + +``` +
+ 269/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34289.9336 + +
+``` + +``` +
+ 271/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34295.0547 + +
+``` + +``` +
+ 273/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34300.1758 + +
+``` + +``` +
+ 275/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34305.2930 + +
+``` + +``` +
+ 277/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34310.4141 + +
+``` + +``` +
+ 279/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34315.5352 + +
+``` + +``` +
+ 281/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34320.6562 + +
+``` + +``` +
+ 283/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34325.7734 + +
+``` + +``` +
+ 285/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34330.8945 + +
+``` + +``` +
+ 287/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34336.0156 + +
+``` + +``` +
+ 289/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34341.1328 + +
+``` + +``` +
+ 291/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34346.2539 + +
+``` + +``` +
+ 293/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34351.3750 + +
+``` + +``` +
+ 295/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34356.4961 + +
+``` + +``` +
+ 297/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34361.6133 + +
+``` + +``` +
+ 299/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34366.7344 + +
+``` + +``` +
+ 301/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34371.8555 + +
+``` + +``` +
+ 303/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34376.9727 + +
+``` + +``` +
+ 305/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34382.0938 + +
+``` + +``` +
+ 307/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34387.2148 + +
+``` + +``` +
+ 309/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34392.3359 + +
+``` + +``` +
+ 311/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34397.4531 + +
+``` + +``` +
+ 313/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34402.5742 + +
+``` + +``` +
+ 315/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34407.6953 + +
+``` + +``` +
+ 317/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34412.8125 + +
+``` + +``` +
+ 319/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34417.9336 + +
+``` + +``` +
+ 321/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34423.0547 + +
+``` + +``` +
+ 323/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34428.1758 + +
+``` + +``` +
+ 325/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34433.2930 + +
+``` + +``` +
+ 327/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34438.4141 + +
+``` + +``` +
+ 329/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34443.5352 + +
+``` + +``` +
+ 331/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34448.6562 + +
+``` + +``` +
+ 333/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34453.7734 + +
+``` + +``` +
+ 335/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34458.8945 + +
+``` + +``` +
+ 337/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34464.0156 + +
+``` + +``` +
+ 339/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34469.1328 + +
+``` + +``` +
+ 341/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34474.2539 + +
+``` + +``` +
+ 343/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34479.3750 + +
+``` + +``` +
+ 345/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34484.4961 + +
+``` + +``` +
+ 347/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34489.6133 + +
+``` + +``` +
+ 349/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34494.7344 + +
+``` + +``` +
+ 351/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34499.8555 + +
+``` + +``` +
+ 353/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34504.9727 + +
+``` + +``` +
+ 355/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34510.0938 + +
+``` + +``` +
+ 357/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34515.2148 + +
+``` + +``` +
+ 359/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34520.3359 + +
+``` + +``` +
+ 361/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34525.4531 + +
+``` + +``` +
+ 363/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34530.5742 + +
+``` + +``` +
+ 365/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34535.6953 + +
+``` + +``` +
+ 367/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34540.8164 + +
+``` + +``` +
+ 369/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34545.9336 + +
+``` + +``` +
+ 371/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34551.0547 + +
+``` + +``` +
+ 373/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34556.1758 + +
+``` + +``` +
+ 375/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34561.2930 + +
+``` + +``` +
+ 377/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34566.4141 + +
+``` + +``` +
+ 379/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34571.5352 + +
+``` + +``` +
+ 381/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34576.6562 + +
+``` + +``` +
+ 383/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34581.7734 + +
+``` + +``` +
+ 385/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34586.8945 + +
+``` + +``` +
+ 387/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34592.0156 + +
+``` + +``` +
+ 389/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34597.1328 + +
+``` + +``` +
+ 391/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34602.2539 + +
+``` + +``` +
+ 393/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34607.3750 + +
+``` + +``` +
+ 395/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34612.4961 + +
+``` + +``` +
+ 397/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34617.6133 + +
+``` + +``` +
+ 399/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34622.7344 + +
+``` + +``` +
+ 401/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34627.8555 + +
+``` + +``` +
+ 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34632.9727 + +
+``` + +``` +
+ 405/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34638.0938 + +
+``` + +``` +
+ 407/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34643.2148 + +
+``` + +``` +
+ 409/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34648.3359 + +
+``` + +``` +
+ 411/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34653.4531 + +
+``` + +``` +
+ 413/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34658.5742 + +
+``` + +``` +
+ 415/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34663.6953 + +
+``` + +``` +
+ 417/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34668.8164 + +
+``` + +``` +
+ 419/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34673.9336 + +
+``` + +``` +
+ 421/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34679.0547 + +
+``` + +``` +
+ 423/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34684.1758 + +
+``` + +``` +
+ 425/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34689.2930 + +
+``` + +``` +
+ 427/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34694.4141 + +
+``` + +``` +
+ 429/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34699.5352 + +
+``` + +``` +
+ 431/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34704.6562 + +
+``` + +``` +
+ 433/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34709.7734 + +
+``` + +``` +
+ 435/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34714.8945 + +
+``` + +``` +
+ 437/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34720.0156 + +
+``` + +``` +
+ 439/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34725.1328 + +
+``` + +``` +
+ 441/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34730.2539 + +
+``` + +``` +
+ 443/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34735.3750 + +
+``` + +``` +
+ 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34740.4961 + +
+``` + +``` +
+ 447/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34745.6133 + +
+``` + +``` +
+ 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34750.7344 + +
+``` + +``` +
+ 451/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34755.8555 + +
+``` + +``` +
+ 453/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34760.9727 + +
+``` + +``` +
+ 455/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34766.0938 + +
+``` + +``` +
+ 457/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34771.2148 + +
+``` + +``` +
+ 459/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34776.3359 + +
+``` + +``` +
+ 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34781.4531 + +
+``` + +``` +
+ 463/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34786.5742 + +
+``` + +``` +
+ 465/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34791.6953 + +
+``` + +``` +
+ 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34796.8164 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34801.9336 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 15s 32ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34804.4766 - val_loss: 0.1000 - val_moe_loss: 36398.7227 + + +
+``` +Epoch 14/20 + +``` +
+``` + +``` +
+ 135/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36746.8945 + +
+``` + +``` +
+ 137/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36752.0156 + +
+``` + +``` +
+ 139/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36757.1328 + +
+``` + +``` +
+ 141/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36762.2539 + +
+``` + +``` +
+ 143/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36767.3750 + +
+``` + +``` +
+ 145/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36772.4961 + +
+``` + +``` +
+ 147/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36777.6133 + +
+``` + +``` +
+ 149/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36782.7344 + +
+``` + +``` +
+ 151/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36787.8555 + +
+``` + +``` +
+ 153/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36792.9727 + +
+``` + +``` +
+ 155/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36798.0938 + +
+``` + +``` +
+ 157/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36803.2148 + +
+``` + +``` +
+ 159/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36808.3359 + +
+``` + +``` +
+ 161/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36813.4531 + +
+``` + +``` +
+ 163/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36818.5742 + +
+``` + +``` +
+ 165/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36823.6953 + +
+``` + +``` +
+ 167/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36828.8125 + +
+``` + +``` +
+ 169/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36833.9336 + +
+``` + +``` +
+ 171/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36839.0547 + +
+``` + +``` +
+ 173/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36844.1758 + +
+``` + +``` +
+ 175/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36849.2930 + +
+``` + +``` +
+ 177/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36854.4141 + +
+``` + +``` +
+ 179/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36859.5352 + +
+``` + +``` +
+ 181/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36864.6523 + +
+``` + +``` +
+ 183/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36869.7734 + +
+``` + +``` +
+ 185/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36874.8945 + +
+``` + +``` +
+ 187/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36880.0156 + +
+``` + +``` +
+ 189/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36885.1328 + +
+``` + +``` +
+ 191/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36890.2539 + +
+``` + +``` +
+ 193/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36895.3750 + +
+``` + +``` +
+ 195/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36900.4922 + +
+``` + +``` +
+ 197/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36905.6133 + +
+``` + +``` +
+ 198/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36908.1758 + +
+``` + +``` +
+ 200/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36913.2930 + +
+``` + +``` +
+ 202/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36918.4141 + +
+``` + +``` +
+ 204/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36923.5352 + +
+``` + +``` +
+ 206/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36928.6523 + +
+``` + +``` +
+ 208/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36933.7734 + +
+``` + +``` +
+ 209/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36936.3359 + +
+``` + +``` +
+ 211/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36941.4531 + +
+``` + +``` +
+ 213/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36946.5742 + +
+``` + +``` +
+ 215/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36951.6953 + +
+``` + +``` +
+ 217/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36956.8125 + +
+``` + +``` +
+ 219/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36961.9336 + +
+``` + +``` +
+ 221/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36967.0547 + +
+``` + +``` +
+ 222/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36969.6133 + +
+``` + +``` +
+ 224/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36974.7344 + +
+``` + +``` +
+ 225/469 ━━━━━━━━━━━━━━━━━━━━ 26s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36977.2930 + +
+``` + +``` +
+ 226/469 ━━━━━━━━━━━━━━━━━━━━ 26s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36979.8555 + +
+``` + +``` +
+ 227/469 ━━━━━━━━━━━━━━━━━━━━ 26s 109ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36982.4141 + +
+``` + +``` +
+ 228/469 ━━━━━━━━━━━━━━━━━━━━ 26s 109ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36984.9766 + +
+``` + +``` +
+ 229/469 ━━━━━━━━━━━━━━━━━━━━ 27s 113ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36987.5352 + +
+``` + +``` +
+ 230/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36990.0938 + +
+``` + +``` +
+ 231/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36992.6562 + +
+``` + +``` +
+ 232/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36995.2148 + +
+``` + +``` +
+ 233/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36997.7734 + +
+``` + +``` +
+ 235/469 ━━━━━━━━━━━━━━━━━━━━ 26s 111ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37002.8945 + +
+``` + +``` +
+ 236/469 ━━━━━━━━━━━━━━━━━━━━ 25s 111ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37005.4531 + +
+``` + +``` +
+ 238/469 ━━━━━━━━━━━━━━━━━━━━ 25s 111ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37010.5742 + +
+``` + +``` +
+ 240/469 ━━━━━━━━━━━━━━━━━━━━ 25s 110ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37015.6953 + +
+``` + +``` +
+ 242/469 ━━━━━━━━━━━━━━━━━━━━ 24s 110ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37020.8164 + +
+``` + +``` +
+ 244/469 ━━━━━━━━━━━━━━━━━━━━ 24s 109ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37025.9336 + +
+``` + +``` +
+ 246/469 ━━━━━━━━━━━━━━━━━━━━ 24s 108ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37031.0547 + +
+``` + +``` +
+ 248/469 ━━━━━━━━━━━━━━━━━━━━ 23s 108ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37036.1758 + +
+``` + +``` +
+ 250/469 ━━━━━━━━━━━━━━━━━━━━ 23s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37041.2930 + +
+``` + +``` +
+ 252/469 ━━━━━━━━━━━━━━━━━━━━ 23s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37046.4141 + +
+``` + +``` +
+ 254/469 ━━━━━━━━━━━━━━━━━━━━ 22s 106ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37051.5352 + +
+``` + +``` +
+ 256/469 ━━━━━━━━━━━━━━━━━━━━ 22s 106ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37056.6562 + +
+``` + +``` +
+ 258/469 ━━━━━━━━━━━━━━━━━━━━ 22s 105ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37061.7734 + +
+``` + +``` +
+ 260/469 ━━━━━━━━━━━━━━━━━━━━ 21s 105ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37066.8945 + +
+``` + +``` +
+ 262/469 ━━━━━━━━━━━━━━━━━━━━ 21s 104ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37072.0156 + +
+``` + +``` +
+ 264/469 ━━━━━━━━━━━━━━━━━━━━ 21s 104ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37077.1328 + +
+``` + +``` +
+ 266/469 ━━━━━━━━━━━━━━━━━━━━ 20s 103ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37082.2539 + +
+``` + +``` +
+ 268/469 ━━━━━━━━━━━━━━━━━━━━ 20s 103ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37087.3750 + +
+``` + +``` +
+ 270/469 ━━━━━━━━━━━━━━━━━━━━ 20s 102ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37092.4961 + +
+``` + +``` +
+ 272/469 ━━━━━━━━━━━━━━━━━━━━ 19s 101ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37097.6133 + +
+``` + +``` +
+ 274/469 ━━━━━━━━━━━━━━━━━━━━ 19s 101ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37102.7344 + +
+``` + +``` +
+ 276/469 ━━━━━━━━━━━━━━━━━━━━ 19s 100ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37107.8555 + +
+``` + +``` +
+ 279/469 ━━━━━━━━━━━━━━━━━━━━ 18s 100ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37115.5352 + +
+``` + +``` +
+ 282/469 ━━━━━━━━━━━━━━━━━━━━ 18s 99ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37123.2148 + +
+``` + +``` +
+ 285/469 ━━━━━━━━━━━━━━━━━━━━ 18s 98ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37130.8945 + +
+``` + +``` +
+ 288/469 ━━━━━━━━━━━━━━━━━━━━ 17s 97ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37138.5742 + +
+``` + +``` +
+ 291/469 ━━━━━━━━━━━━━━━━━━━━ 17s 96ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37146.2539 + +
+``` + +``` +
+ 294/469 ━━━━━━━━━━━━━━━━━━━━ 16s 95ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37153.9336 + +
+``` + +``` +
+ 297/469 ━━━━━━━━━━━━━━━━━━━━ 16s 95ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37161.6133 + +
+``` + +``` +
+ 300/469 ━━━━━━━━━━━━━━━━━━━━ 15s 94ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37169.2930 + +
+``` + +``` +
+ 304/469 ━━━━━━━━━━━━━━━━━━━━ 15s 93ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37179.5352 + +
+``` + +``` +
+ 308/469 ━━━━━━━━━━━━━━━━━━━━ 14s 92ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37189.7734 + +
+``` + +``` +
+ 312/469 ━━━━━━━━━━━━━━━━━━━━ 14s 91ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37200.0156 + +
+``` + +``` +
+ 316/469 ━━━━━━━━━━━━━━━━━━━━ 13s 90ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37210.2539 + +
+``` + +``` +
+ 320/469 ━━━━━━━━━━━━━━━━━━━━ 13s 89ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37220.4961 + +
+``` + +``` +
+ 323/469 ━━━━━━━━━━━━━━━━━━━━ 12s 88ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37228.1758 + +
+``` + +``` +
+ 326/469 ━━━━━━━━━━━━━━━━━━━━ 12s 88ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37235.8555 + +
+``` + +``` +
+ 329/469 ━━━━━━━━━━━━━━━━━━━━ 12s 87ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37243.5352 + +
+``` + +``` +
+ 332/469 ━━━━━━━━━━━━━━━━━━━━ 11s 86ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37251.2148 + +
+``` + +``` +
+ 335/469 ━━━━━━━━━━━━━━━━━━━━ 11s 86ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37258.8945 + +
+``` + +``` +
+ 338/469 ━━━━━━━━━━━━━━━━━━━━ 11s 85ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37266.5742 + +
+``` + +``` +
+ 342/469 ━━━━━━━━━━━━━━━━━━━━ 10s 84ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37276.8164 + +
+``` + +``` +
+ 346/469 ━━━━━━━━━━━━━━━━━━━━ 10s 84ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37287.0547 + +
+``` + +``` +
+ 350/469 ━━━━━━━━━━━━━━━━━━━━ 9s 83ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37297.2969 + +
+``` + +``` +
+ 353/469 ━━━━━━━━━━━━━━━━━━━━ 9s 82ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37304.9766 + +
+``` + +``` +
+ 357/469 ━━━━━━━━━━━━━━━━━━━━ 9s 81ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37315.2148 + +
+``` + +``` +
+ 360/469 ━━━━━━━━━━━━━━━━━━━━ 8s 81ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37322.8945 + +
+``` + +``` +
+ 363/469 ━━━━━━━━━━━━━━━━━━━━ 8s 80ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37330.5742 + +
+``` + +``` +
+ 367/469 ━━━━━━━━━━━━━━━━━━━━ 8s 80ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37340.8164 + +
+``` + +``` +
+ 370/469 ━━━━━━━━━━━━━━━━━━━━ 7s 79ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37348.4961 + +
+``` + +``` +
+ 373/469 ━━━━━━━━━━━━━━━━━━━━ 7s 79ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37356.1758 + +
+``` + +``` +
+ 377/469 ━━━━━━━━━━━━━━━━━━━━ 7s 78ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37366.4141 + +
+``` + +``` +
+ 381/469 ━━━━━━━━━━━━━━━━━━━━ 6s 77ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37376.6562 + +
+``` + +``` +
+ 384/469 ━━━━━━━━━━━━━━━━━━━━ 6s 77ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37384.3359 + +
+``` + +``` +
+ 387/469 ━━━━━━━━━━━━━━━━━━━━ 6s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37392.0156 + +
+``` + +``` +
+ 390/469 ━━━━━━━━━━━━━━━━━━━━ 6s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37399.6953 + +
+``` + +``` +
+ 392/469 ━━━━━━━━━━━━━━━━━━━━ 5s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37404.8164 + +
+``` + +``` +
+ 395/469 ━━━━━━━━━━━━━━━━━━━━ 5s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37412.4961 + +
+``` + +``` +
+ 398/469 ━━━━━━━━━━━━━━━━━━━━ 5s 75ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37420.1758 + +
+``` + +``` +
+ 401/469 ━━━━━━━━━━━━━━━━━━━━ 5s 75ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37427.8555 + +
+``` + +``` +
+ 404/469 ━━━━━━━━━━━━━━━━━━━━ 4s 74ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37435.5352 + +
+``` + +``` +
+ 407/469 ━━━━━━━━━━━━━━━━━━━━ 4s 74ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37443.2148 + +
+``` + +``` +
+ 411/469 ━━━━━━━━━━━━━━━━━━━━ 4s 73ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37453.4531 + +
+``` + +``` +
+ 414/469 ━━━━━━━━━━━━━━━━━━━━ 4s 73ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37461.1367 + +
+``` + +``` +
+ 417/469 ━━━━━━━━━━━━━━━━━━━━ 3s 73ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37468.8164 + +
+``` + +``` +
+ 421/469 ━━━━━━━━━━━━━━━━━━━━ 3s 72ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37479.0547 + +
+``` + +``` +
+ 425/469 ━━━━━━━━━━━━━━━━━━━━ 3s 72ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37489.2930 + +
+``` + +``` +
+ 429/469 ━━━━━━━━━━━━━━━━━━━━ 2s 71ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37499.5352 + +
+``` + +``` +
+ 433/469 ━━━━━━━━━━━━━━━━━━━━ 2s 70ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37509.7734 + +
+``` + +``` +
+ 437/469 ━━━━━━━━━━━━━━━━━━━━ 2s 70ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37520.0156 + +
+``` + +``` +
+ 440/469 ━━━━━━━━━━━━━━━━━━━━ 2s 70ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37527.6953 + +
+``` + +``` +
+ 444/469 ━━━━━━━━━━━━━━━━━━━━ 1s 69ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37537.9336 + +
+``` + +``` +
+ 448/469 ━━━━━━━━━━━━━━━━━━━━ 1s 69ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37548.1758 + +
+``` + +``` +
+ 452/469 ━━━━━━━━━━━━━━━━━━━━ 1s 68ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37558.4141 + +
+``` + +``` +
+ 454/469 ━━━━━━━━━━━━━━━━━━━━ 1s 68ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37563.5352 + +
+``` + +``` +
+ 457/469 ━━━━━━━━━━━━━━━━━━━━ 0s 68ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37571.2148 + +
+``` + +``` +
+ 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37581.4531 + +
+``` + +``` +
+ 465/469 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37591.6953 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37601.9336 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 32s 67ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37604.4766 - val_loss: 0.1000 - val_moe_loss: 39198.7227 + + +
+``` +Epoch 15/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40404.4766 - val_loss: 0.1000 - val_moe_loss: 41998.7227 + + +
+``` +Epoch 16/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9936 - loss: 0.1000 - moe_loss: 43204.4766 - val_loss: 0.1000 - val_moe_loss: 44798.7227 + + +
+``` +Epoch 17/20 + +``` +
+``` + +``` +
+ 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45950.7344 + +
+``` + +``` +
+ 452/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45958.4141 + +
+``` + +``` +
+ 455/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45966.0938 + +
+``` + +``` +
+ 458/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45973.7734 + +
+``` + +``` +
+ 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45981.4531 + +
+``` + +``` +
+ 464/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45989.1328 + +
+``` + +``` +
+ 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45996.8125 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 46004.4766 - val_loss: 0.1000 - val_moe_loss: 47598.7227 + + +
+``` +Epoch 18/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 9s 19ms/step - accuracy: 0.9939 - loss: 0.1000 - moe_loss: 48804.4766 - val_loss: 0.1000 - val_moe_loss: 50398.7227 + + +
+``` +Epoch 19/20 + +``` +
+ 298/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51164.1719 + +
+``` + +``` +
+ 301/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51171.8555 + +
+``` + +``` +
+ 304/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51179.5352 + +
+``` + +``` +
+ 307/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51187.2148 + +
+``` + +``` +
+ 310/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51194.8945 + +
+``` + +``` +
+ 313/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51202.5742 + +
+``` + +``` +
+ 316/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51210.2539 + +
+``` + +``` +
+ 319/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51217.9336 + +
+``` + +``` +
+ 322/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51225.6133 + +
+``` + +``` +
+ 325/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51233.2930 + +
+``` + +``` +
+ 328/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51240.9727 + +
+``` + +``` +
+ 331/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51248.6523 + +
+``` + +``` +
+ 334/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51256.3320 + +
+``` + +``` +
+ 337/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51264.0117 + +
+``` + +``` +
+ 340/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51271.6953 + +
+``` + +``` +
+ 343/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51279.3750 + +
+``` + +``` +
+ 346/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51287.0547 + +
+``` + +``` +
+ 349/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51294.7344 + +
+``` + +``` +
+ 352/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51302.4141 + +
+``` + +``` +
+ 355/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51310.0938 + +
+``` + +``` +
+ 358/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51317.7734 + +
+``` + +``` +
+ 361/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51325.4531 + +
+``` + +``` +
+ 364/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51333.1328 + +
+``` + +``` +
+ 367/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51340.8125 + +
+``` + +``` +
+ 370/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51348.4922 + +
+``` + +``` +
+ 373/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51356.1719 + +
+``` + +``` +
+ 376/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51363.8516 + +
+``` + +``` +
+ 379/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51371.5352 + +
+``` + +``` +
+ 382/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51379.2148 + +
+``` + +``` +
+ 385/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51386.8945 + +
+``` + +``` +
+ 388/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51394.5742 + +
+``` + +``` +
+ 391/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51402.2539 + +
+``` + +``` +
+ 394/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51409.9336 + +
+``` + +``` +
+ 397/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51417.6133 + +
+``` + +``` +
+ 400/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51425.2930 + +
+``` + +``` +
+ 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51432.9727 + +
+``` + +``` +
+ 406/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51440.6523 + +
+``` + +``` +
+ 409/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51448.3320 + +
+``` + +``` +
+ 412/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51456.0117 + +
+``` + +``` +
+ 415/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51463.6914 + +
+``` + +``` +
+ 418/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51471.3711 + +
+``` + +``` +
+ 421/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51479.0547 + +
+``` + +``` +
+ 424/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51486.7344 + +
+``` + +``` +
+ 427/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51494.4141 + +
+``` + +``` +
+ 430/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51502.0938 + +
+``` + +``` +
+ 433/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51509.7734 + +
+``` + +``` +
+ 436/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51517.4531 + +
+``` + +``` +
+ 439/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51525.1328 + +
+``` + +``` +
+ 442/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51532.8125 + +
+``` + +``` +
+ 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51540.4922 + +
+``` + +``` +
+ 448/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51548.1719 + +
+``` + +``` +
+ 451/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51555.8516 + +
+``` + +``` +
+ 454/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51563.5312 + +
+``` + +``` +
+ 457/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51571.2109 + +
+``` + +``` +
+ 459/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51576.3320 + +
+``` + +``` +
+ 460/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51578.8945 + +
+``` + +``` +
+ 463/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51586.5742 + +
+``` + +``` +
+ 466/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51594.2539 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51601.9297 + +
+``` + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 10s 22ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51604.4766 - val_loss: 0.1000 - val_moe_loss: 53198.7227 + + +
+``` +Epoch 20/20 + +``` +
+ 469/469 ━━━━━━━━━━━━━━━━━━━━ 11s 24ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 54404.4766 - val_loss: 0.1000 - val_moe_loss: 55998.7227
+```
+
+### Evaluation
+
+```python
+score = model.evaluate(x_test, y_test, verbose=0)
+print("Test loss:", score[0])
+print("Test accuracy:", score[1])
+```
+
+``` +Test loss: tf.Tensor(0.10000026, shape=(), dtype=float32) +Test accuracy: {'accuracy': } + +``` +
\ No newline at end of file
diff --git a/examples/vision/mnist_moe.py b/examples/vision/mnist_moe.py
new file mode 100644
index 0000000000..8b1911511d
--- /dev/null
+++ b/examples/vision/mnist_moe.py
@@ -0,0 +1,424 @@
+"""
+Title: MoE for MNIST
+Author: [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/)
+Date created: 2015/06/19
+Last modified: 2020/04/21
+Description: Showcasing concepts related to Mixture of Experts (MoE).
+Accelerator: GPU
+"""
+
+"""
+# Introduction
+
+In this example, we implement an adaptation of the Mixture of Experts (MoE) architecture
+([Shazeer et al.](https://arxiv.org/abs/1701.06538)).
+The idea is to use conditional computation to increase model capacity without increasing computation.
+Experts are identical blocks within a layer, each trained to specialize in a different part of the input space.
+At each forward pass, a gating network selects a subset of experts to apply to the input.
+
+The components to implement are:
+- Gating network: A dense layer that outputs a probability distribution over the experts.
+- MoE layer: A layer that applies a different expert to each input in the batch, along with a loss function that ensures specialization among the experts.
+- Model: A simple model that uses the MoE layer.
+
+In this example, we will first implement a linear MoE layer and then a CNN-based MoE layer. Lastly, we will combine the two using an abstract implementation to showcase its capacities.
+"""
+
+"""
+## Imports
+"""
+
+import numpy as np
+import keras
+from keras import layers, models
+import tensorflow as tf
+from tensorflow.keras import backend as K
+
+"""
+### Data Preparation
+"""
+
+# Model / data parameters
+num_classes = 10
+input_shape = (28, 28, 1)
+
+# Load the data and split it between train and test sets
+(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
+
+# Scale images to the [0, 1] range
+x_train = x_train.astype("float32") / 255
+x_test = x_test.astype("float32") / 255
+# Make sure images have shape (28, 28, 1)
+x_train = np.expand_dims(x_train, -1)
+x_test = np.expand_dims(x_test, -1)
+print("x_train shape:", x_train.shape)
+print(x_train.shape[0], "train samples")
+print(x_test.shape[0], "test samples")
+
+
+# Convert class vectors to binary class matrices
+y_train = keras.utils.to_categorical(y_train, num_classes)
+y_test = keras.utils.to_categorical(y_test, num_classes)
+
+"""
+## Constants
+
+"""
+NUM_EXPERTS = 5
+TOP_K = 3
+BATCH_SIZE = 128
+NUM_EPOCHS = 20
+LEARNING_RATE = 0.001
+
+
+"""
+## Base architecture
+
+The most basic [MNIST classifier](https://keras.io/examples/vision/mnist_convnet/) consists of a stack of convolutional layers followed by a dense layer. In this tutorial, we will first replace the dense layer with a MoE layer, and then do the same for the convolutional layers.
+"""
+
+model = keras.Sequential(
+    [
+        keras.Input(shape=input_shape),
+        layers.Conv2D(32, kernel_size=(3, 3), activation="relu"),
+        layers.MaxPooling2D(pool_size=(2, 2)),
+        layers.Conv2D(64, kernel_size=(3, 3), activation="relu"),
+        layers.MaxPooling2D(pool_size=(2, 2)),
+        layers.Flatten(),
+        layers.Dropout(0.5),
+        layers.Dense(num_classes, activation="softmax"),
+    ]
+)
+
+model.summary()
+
+"""
+# Linear MoE using Dense layers
+
+For this layer, we will create multiple dense layers that will be used as experts. A simple gating network will then select, at each step, which experts should be utilized for the current input.
We will keep track of the number of times each expert is used. Then the selected experts will be combined using a weighted sum.
+"""
+
+
+class LinearMoE(layers.Layer):
+    def __init__(
+        self,
+        hidden_size,
+        num_experts=NUM_EXPERTS,
+        top_k=TOP_K,
+    ):
+        super(LinearMoE, self).__init__()
+
+        # Initialize experts
+        self.experts = [
+            layers.Dense(
+                hidden_size,
+                kernel_initializer=tf.keras.initializers.RandomNormal(
+                    mean=0.0, stddev=0.001
+                ),
+                bias_initializer="zeros",
+            )
+            for _ in range(num_experts)
+        ]
+        # Initialize gating network
+        self.gating_network = layers.Dense(
+            num_experts,  # Match output size to the number of experts
+            kernel_initializer=tf.keras.initializers.RandomNormal(
+                mean=0.0, stddev=0.001
+            ),
+            bias_initializer="zeros",
+        )
+
+        self.num_experts = num_experts
+        self.top_k = top_k
+        # Keep track of how many times each expert is used
+        self.expert_usage_count = tf.Variable(
+            tf.zeros((num_experts,), dtype=tf.float32)
+        )
+
+    def call(self, x):
+        # Get gating weights
+        gating_weights = self.gating_network(x)
+
+        # Get the top k experts based on the gating weights
+        top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k)
+
+        # Count usage of each expert symbolically
+        updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32)
+        # Use tf.tensor_scatter_nd_add to increment the usage count
+        self.expert_usage_count.assign(
+            tf.tensor_scatter_nd_add(
+                self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates
+            )
+        )
+
+        # Get outputs from only the top-k experts
+        # (eager-only indexing, for demonstration purposes)
+        top_k_expert_outputs = tf.stack(
+            [
+                self.experts[expert_index](x)
+                for expert_index in top_k_indices.numpy()[0]
+            ],
+            axis=1,
+        )  # Stack outputs along axis 1
+
+        # Combine outputs using top-k weights
+        combined_output = tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights)
+
+        return combined_output
+
+
+"""
+Output of the top 3 experts out of 10 for one layer of MoE:
+"""
+sample_data = tf.random.uniform((1, 10))
+linear_mode = LinearMoE(32, 10, 3)
+linear_mode(sample_data)
+
+"""
+## Routing Collapse
+
+Routing collapse is a common failure mode of MoE layers. The "routing" terminology refers to the process of selecting which expert to use for a given input.
+
+Route collapse happens when the routing model, early in training, starts favoring just a few experts because they perform slightly better due to random initialization. This leads to most examples being sent to these experts, leaving the others unused and reducing the model's overall capacity.
+
+The code below demonstrates the randomness of expert selection:
+"""
+
+
+def check_expert_usage(runs):
+    # Run the layer multiple times to show the randomness of expert selection
+    for i in range(runs):
+        sample_data = tf.random.uniform((1, 10))
+        linear_mode = LinearMoE(10, 5)
+        _ = linear_mode(sample_data)
+        print(f"Run {i}, Expert usage: {linear_mode.expert_usage_count.numpy()}")
+
+
+check_expert_usage(4)
+
+"""
+### Adding loss functions to prevent route collapse
+To fix this, we add extra penalty terms (importance and load losses), ideas borrowed from [Shazeer et al.](https://arxiv.org/abs/1701.06538), to ensure all experts get used evenly.
+
+The importance loss measures how much the usage of each expert (tracked in batch_importance_sum) deviates from the average usage (mean_importance), using mean squared error, aiming to balance expert utilization. This helps prevent route collapse by discouraging the model from overloading a few experts, instead promoting an even distribution of examples across all experts to maintain diverse and effective routing.
+
+#### Load losses
+ - Diversity loss: helps prevent route collapse by encouraging the routing model to distribute examples evenly across all experts, rather than favoring just a few due to their initial performance. It does this by maximizing the entropy of the gating weights, ensuring balanced expert utilization and improving the model's overall capacity.
+ - Overflow loss: The batch_overflow_sum measures how much the usage of experts exceeds a set capacity by applying ReLU to the difference between usage_counts (how many examples each expert handles) and batch_capacity (the allowed limit), then summing the excesses. This helps prevent route collapse by penalizing situations where certain experts are overused, encouraging a more even spread of examples across all experts to keep the model's capacity balanced. +""" + + +class LinearMoE(layers.Layer): + def __init__( + self, + hidden_size, + num_experts=NUM_EXPERTS, + top_k=TOP_K, + ): + super(LinearMoE, self).__init__() + + # Initialize experts + self.experts = [ + layers.Dense( + hidden_size, + kernel_initializer=tf.keras.initializers.RandomNormal( + mean=0.0, stddev=0.001 + ), + bias_initializer="zeros", + ) + for _ in range(num_experts) + ] + # Initialize gating network + self.gating_network = layers.Dense( + num_experts, # Match output to num_experts + kernel_initializer=tf.keras.initializers.RandomNormal( + mean=0.0, stddev=0.001 + ), + bias_initializer="zeros", + ) + + self.num_experts = num_experts + self.top_k = top_k + # Keep track of how many times each expert is used as a layer weight + self.expert_usage_count = tf.Variable( + tf.zeros((num_experts,), dtype=tf.float32) + ) + + self.batch_capacity = BATCH_SIZE // num_experts + + def _diversity_loss(self, weights): + entropy = -K.sum(weights * K.log(weights + 1e-10), axis=1) + self.diversity_loss = -K.mean(entropy) + + def _importance_loss(self, gating_weights): + batch_importance_sum = K.sum(gating_weights, axis=0) + mean_importance = K.mean(batch_importance_sum) + self.importance_loss = K.mean( + K.square( + batch_importance_sum + - mean_importance * tf.ones_like(batch_importance_sum) + ) + ) + + def call(self, x): + # Get gating weights and normalize + gating_weights = self.gating_network(x) + gating_weights = K.softmax(gating_weights) # Ensure weights are probabilities + 
self._diversity_loss(gating_weights) + self._importance_loss(gating_weights) + + # Get the top k experts based on the gating weights + top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + + # Count usage of each expert symbolically + updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) + # Use tf.tensor_scatter_nd_add to increment the usage count + self.expert_usage_count.assign( + tf.tensor_scatter_nd_add( + self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates + ) + ) + + # Calculate overflow using updated usage count + self.batch_overflow_sum = K.sum( + K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) + ) + + # Compute all expert outputs + expert_outputs = tf.stack( + [expert(x) for expert in self.experts], axis=1 + ) # Shape: (batch_size, num_experts, hidden_size) + + # Gather the top-k expert outputs using top_k_indices + batch_size = tf.shape(x)[0] + batch_indices = tf.expand_dims( + tf.range(batch_size), 1 + ) # Shape: (batch_size, 1) + batch_indices = tf.tile( + batch_indices, [1, self.top_k] + ) # Shape: (batch_size, top_k) + + # Create indices for gathering + indices = tf.stack( + [batch_indices, top_k_indices], axis=2 + ) # Shape: (batch_size, top_k, 2) + top_k_expert_outputs = tf.gather_nd( + expert_outputs, indices + ) # Shape: (batch_size, top_k, hidden_size) + + # Combine outputs using top-k weights + combined_output = tf.reduce_sum( + top_k_expert_outputs * tf.expand_dims(top_k_weights, axis=-1), axis=1 + ) + + return combined_output + + def compute_total_loss(self, load_balance_coef=0.01): + return load_balance_coef * ( + self.diversity_loss + self.batch_overflow_sum + self.importance_loss + ) + + +""" +## MNIST classification with MoE +""" + + +class MoEModel(keras.Model): + def __init__(self, input_shape, num_classes, num_experts=NUM_EXPERTS, top_k=TOP_K): + super(MoEModel, self).__init__() + + # Define the convolutional block + self.conv_block = 
keras.Sequential(
+            [
+                layers.Conv2D(32, kernel_size=(3, 3), activation="relu"),
+                layers.MaxPooling2D(pool_size=(2, 2)),
+                layers.Conv2D(64, kernel_size=(3, 3), activation="relu"),
+                layers.MaxPooling2D(pool_size=(2, 2)),
+                layers.Flatten(),
+                layers.Dropout(0.5),
+            ]
+        )
+
+        # MoE classifier
+        self.moe_classifier = LinearMoE(
+            hidden_size=num_classes, num_experts=num_experts, top_k=top_k
+        )
+
+        # Softmax layer
+        self.softmax = layers.Softmax()
+
+    def call(self, inputs, training=False):
+        conv_flatten = self.conv_block(inputs)
+        moe_output = self.moe_classifier(conv_flatten)
+        outputs = self.softmax(moe_output)
+        return outputs
+
+    def train_step(self, data):
+        x, y = data  # Unpack input data and labels
+
+        with tf.GradientTape() as tape:
+            y_pred = self(x, training=True)
+            classification_loss = self.compute_loss(x, y, y_pred)
+            moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)
+            total_loss = classification_loss + moe_loss
+
+        # Compute gradients
+        gradients = tape.gradient(total_loss, self.trainable_variables)
+
+        # Update weights
+        self.optimizer.apply_gradients(zip(gradients, self.trainable_variables))
+
+        # Update metrics (e.g., accuracy)
+        self.compiled_metrics.update_state(y, y_pred)
+        # Return a dict of metrics for monitoring
+        return {
+            "loss": total_loss,
+            "moe_loss": moe_loss,
+            **{m.name: m.result() for m in self.metrics},
+        }
+
+    def test_step(self, data):
+        x, y = data
+        y_pred = self(x, training=False)
+        classification_loss = self.compute_loss(x, y, y_pred)
+        moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)
+        total_loss = classification_loss + moe_loss
+
+        self.compiled_metrics.update_state(y, y_pred)
+        return {
+            "loss": total_loss,
+            "moe_loss": moe_loss,
+            **{m.name: m.result() for m in self.metrics},
+        }
+
+
+# Instantiate and compile the model
+model = MoEModel(
+    input_shape=input_shape, num_classes=num_classes, num_experts=6, top_k=4
+)
+
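To build intuition for the auxiliary losses before training, the NumPy sketch below (toy values only, mirroring the formulas used in `LinearMoE` above) recomputes the importance, diversity, and overflow terms for a random batch of gating weights:

```python
import numpy as np

rng = np.random.default_rng(0)
batch, num_experts, top_k = 32, 5, 3
capacity = batch // num_experts  # per-expert capacity, as in batch_capacity

logits = rng.normal(size=(batch, num_experts))
gate = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)  # softmax

# Importance loss: squared deviation of each expert's total gate mass from the mean
importance = gate.sum(axis=0)
importance_loss = np.mean((importance - importance.mean()) ** 2)

# Diversity loss: negative mean entropy of the gating distribution
entropy = -(gate * np.log(gate + 1e-10)).sum(axis=1)
diversity_loss = -entropy.mean()

# Overflow: how far each expert's number of top-k assignments exceeds its capacity
top_idx = np.argsort(gate, axis=1)[:, -top_k:]
counts = np.bincount(top_idx.ravel(), minlength=num_experts)
overflow = np.maximum(counts - capacity, 0).sum()

print(importance_loss, diversity_loss, overflow)
```

A perfectly balanced router would drive the importance loss toward zero and the entropy toward its maximum; the overflow term penalizes any expert that receives more than `capacity` assignments in a batch.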
+model.compile(
+    optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE),
+    loss=keras.losses.CategoricalCrossentropy(),  # Assumes one-hot encoded labels
+    metrics=["accuracy"],
+)
+
+"""
+### Training
+"""
+history = model.fit(
+    x_train,
+    y_train,
+    batch_size=BATCH_SIZE,
+    epochs=NUM_EPOCHS,
+    validation_data=(x_test, y_test),
+)
+
+"""
+### Evaluation
+"""
+
+# Use return_dict=True: the custom test_step returns a dict, so positional
+# indexing into the results list would pick up moe_loss instead of accuracy
+score = model.evaluate(x_test, y_test, verbose=0, return_dict=True)
+print("Test loss:", score["loss"])
+print("Test accuracy:", score["accuracy"])
diff --git a/templates/examples/audio/vocal_track_separation.md b/templates/examples/audio/vocal_track_separation.md
new file mode 100644
index 0000000000..af44162d78
--- /dev/null
+++ b/templates/examples/audio/vocal_track_separation.md
@@ -0,0 +1,921 @@
+# Vocal Track Separation with Encoder-Decoder Architecture
+
+**Author:** [Joaquin Jimenez](https://github.com/johacks/)
+**Date created:** 2024/12/10
+**Last modified:** 2024/12/10
+**Description:** Train a model to separate vocal tracks from music mixtures. + + +
ⓘ This example uses Keras 3
+ [**View in Colab**](https://colab.research.google.com/github/keras-team/keras-io/blob/master/examples/audio/ipynb/vocal_track_separation.ipynb) [**GitHub source**](https://github.com/keras-team/keras-io/blob/master/examples/audio/vocal_track_separation.py) + + + +--- +## Introduction + +In this tutorial, we build a vocal track separation model using an encoder-decoder +architecture in Keras 3. + +We train the model on the [MUSDB18 dataset](https://doi.org/10.5281/zenodo.1117372), +which provides music mixtures and isolated tracks for drums, bass, other, and vocals. + +Key concepts covered: + +- Audio data preprocessing using the Short-Time Fourier Transform (STFT). +- Audio data augmentation techniques. +- Implementing custom encoders and decoders specialized for audio data. +- Defining appropriate loss functions and metrics for audio source separation tasks. + +The model architecture is derived from the TFC_TDF_Net model described in: + +W. Choi, M. Kim, J. Chung, D. Lee, and S. Jung, “Investigating U-Nets with various +intermediate blocks for spectrogram-based singing voice separation,” in the 21st +International Society for Music Information Retrieval Conference, 2020. + +For reference code, see: +[GitHub: ws-choi/ISMIR2020_U_Nets_SVS](https://github.com/ws-choi/ISMIR2020_U_Nets_SVS). + +The data processing and model training routines are partly derived from: +[ZFTurbo/Music-Source-Separation-Training](https://github.com/ZFTurbo/Music-Source-Separation-Training/tree/main). + +--- +## Setup + +Import and install all the required dependencies. 
+ + +```python +!pip install -qq audiomentations soundfile ffmpeg-binaries +!pip install -qq "keras==3.7.0" +!sudo -n apt-get install -y graphviz >/dev/null 2>&1 # Required for plotting the model +``` + + +```python +import glob +import os + +os.environ["KERAS_BACKEND"] = "jax" # or "tensorflow" or "torch" + +import random +import subprocess +import tempfile +import typing +from os import path + +import audiomentations as aug +import ffmpeg +import keras +import numpy as np +import soundfile as sf +from IPython import display +from keras import callbacks, layers, ops, saving +from matplotlib import pyplot as plt +``` + +--- +## Configuration + +The following constants define configuration parameters for audio processing +and model training, including dataset paths, audio chunk sizes, Short-Time Fourier +Transform (STFT) parameters, and training hyperparameters. + + +```python +# MUSDB18 dataset configuration +MUSDB_STREAMS = {"mixture": 0, "drums": 1, "bass": 2, "other": 3, "vocals": 4} +TARGET_INSTRUMENTS = {track: MUSDB_STREAMS[track] for track in ("vocals",)} +N_INSTRUMENTS = len(TARGET_INSTRUMENTS) +SOURCE_INSTRUMENTS = tuple(k for k in MUSDB_STREAMS if k != "mixture") + +# Audio preprocessing parameters for Short-Time Fourier Transform (STFT) +N_SUBBANDS = 4 # Number of subbands into which frequencies are split +CHUNK_SIZE = 65024 # Number of amplitude samples per audio chunk (~4 seconds) +STFT_N_FFT = 2048 # FFT points used in STFT +STFT_HOP_LENGTH = 512 # Hop length for STFT + +# Training hyperparameters +N_CHANNELS = 64 # Base channel count for the model +BATCH_SIZE = 3 +ACCUMULATION_STEPS = 2 +EFFECTIVE_BATCH_SIZE = BATCH_SIZE * (ACCUMULATION_STEPS or 1) + +# Paths +TMP_DIR = path.expanduser("~/.keras/tmp") +DATASET_DIR = path.expanduser("~/.keras/datasets") +MODEL_PATH = path.join(TMP_DIR, f"model_{keras.backend.backend()}.keras") +CSV_LOG_PATH = path.join(TMP_DIR, f"training_{keras.backend.backend()}.csv") +os.makedirs(DATASET_DIR, exist_ok=True) 
+os.makedirs(TMP_DIR, exist_ok=True) + +# Set random seed for reproducibility +keras.utils.set_random_seed(21) +``` + +
+``` +WARNING: All log messages before absl::InitializeLog() is called are written to STDERR +E0000 00:00:1734318393.806217 81028 cuda_dnn.cc:8310] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered +E0000 00:00:1734318393.809885 81028 cuda_blas.cc:1418] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered + +``` +
+--- +## MUSDB18 Dataset + +The MUSDB18 dataset is a standard benchmark for music source separation, containing +150 full-length music tracks along with isolated drums, bass, other, and vocals. +The dataset is stored in .mp4 format, and each .mp4 file includes multiple audio +streams (mixture and individual tracks). + +### Download and Conversion + +The following utility function downloads MUSDB18 and converts its .mp4 files to +.wav files for each instrument track, resampled to 16 kHz. + + +```python + +def download_musdb18(out_dir=None): + """Download and extract the MUSDB18 dataset, then convert .mp4 files to .wav files. + + MUSDB18 reference: + Rafii, Z., Liutkus, A., Stöter, F.-R., Mimilakis, S. I., & Bittner, R. (2017). + MUSDB18 - a corpus for music separation (1.0.0) [Data set]. Zenodo. + """ + ffmpeg.init() + from ffmpeg import FFMPEG_PATH + + # Create output directories + os.makedirs((base := out_dir or tempfile.mkdtemp()), exist_ok=True) + if path.exists((out_dir := path.join(base, "musdb18_wav"))): + print("MUSDB18 dataset already downloaded") + return out_dir + + # Download and extract the dataset + download_dir = keras.utils.get_file( + fname="musdb18", + origin="https://zenodo.org/records/1117372/files/musdb18.zip", + extract=True, + ) + + # ffmpeg command template: input, stream index, output + ffmpeg_args = str(FFMPEG_PATH) + " -v error -i {} -map 0:{} -vn -ar 16000 {}" + + # Convert each mp4 file to multiple .wav files for each track + for split in ("train", "test"): + songs = os.listdir(path.join(download_dir, split)) + for i, song in enumerate(songs): + if i % 10 == 0: + print(f"{split.capitalize()}: {i}/{len(songs)} songs processed") + + mp4_path_orig = path.join(download_dir, split, song) + mp4_path = path.join(tempfile.mkdtemp(), split, song.replace(" ", "_")) + os.makedirs(path.dirname(mp4_path), exist_ok=True) + os.rename(mp4_path_orig, mp4_path) + + wav_dir = path.join(out_dir, split, path.basename(mp4_path).split(".")[0]) + 
os.makedirs(wav_dir, exist_ok=True) + + for track in SOURCE_INSTRUMENTS: + out_path = path.join(wav_dir, f"{track}.wav") + stream_index = MUSDB_STREAMS[track] + args = ffmpeg_args.format(mp4_path, stream_index, out_path).split() + assert subprocess.run(args).returncode == 0, "ffmpeg conversion failed" + return out_dir + + +# Download and prepare the MUSDB18 dataset +songs = download_musdb18(out_dir=DATASET_DIR) +``` + +
+``` +MUSDB18 dataset already downloaded + +``` +
+### Custom Dataset + +We define a custom dataset class to generate random audio chunks and their corresponding +labels. The dataset does the following: + +1. Selects a random chunk from a random song and instrument. +2. Applies optional data augmentations. +3. Combines isolated tracks to form new synthetic mixtures. +4. Prepares features (mixtures) and labels (vocals) for training. + +This approach allows creating an effectively infinite variety of training examples +through randomization and augmentation. + + +```python + +class Dataset(keras.utils.PyDataset): + def __init__( + self, + songs, + batch_size=BATCH_SIZE, + chunk_size=CHUNK_SIZE, + batches_per_epoch=1000 * ACCUMULATION_STEPS, + augmentation=True, + **kwargs, + ): + super().__init__(**kwargs) + self.augmentation = augmentation + self.vocals_augmentations = [ + aug.PitchShift(min_semitones=-5, max_semitones=5, p=0.1), + aug.SevenBandParametricEQ(-9, 9, p=0.25), + aug.TanhDistortion(0.1, 0.7, p=0.1), + ] + self.other_augmentations = [ + aug.PitchShift(p=0.1), + aug.AddGaussianNoise(p=0.1), + ] + self.songs = songs + self.sizes = {song: self.get_track_set_size(song) for song in self.songs} + self.batch_size = batch_size + self.chunk_size = chunk_size + self.batches_per_epoch = batches_per_epoch + + def get_track_set_size(self, song: str): + """Return the smallest track length in the given song directory.""" + sizes = [len(sf.read(p)[0]) for p in glob.glob(path.join(song, "*.wav"))] + if max(sizes) != min(sizes): + print(f"Warning: {song} has different track lengths") + return min(sizes) + + def random_chunk_of_instrument_type(self, instrument: str): + """Extract a random chunk for the specified instrument from a random song.""" + song, size = random.choice(list(self.sizes.items())) + track = path.join(song, f"{instrument}.wav") + + if self.chunk_size <= size: + start = np.random.randint(size - self.chunk_size + 1) + audio = sf.read(track, self.chunk_size, start, dtype="float32")[0] + audio_mono = 
np.mean(audio, axis=1) + else: + # If the track is shorter than chunk_size, pad the signal + audio_mono = np.mean(sf.read(track, dtype="float32")[0], axis=1) + audio_mono = np.pad(audio_mono, ((0, self.chunk_size - size),)) + + # If the chunk is almost silent, retry + if np.mean(np.abs(audio_mono)) < 0.01: + return self.random_chunk_of_instrument_type(instrument) + + return self.data_augmentation(audio_mono, instrument) + + def data_augmentation(self, audio: np.ndarray, instrument: str): + """Apply data augmentation to the audio chunk, if enabled.""" + + def coin_flip(x, probability: float, fn: typing.Callable): + return fn(x) if random.uniform(0, 1) < probability else x + + if self.augmentation: + augmentations = ( + self.vocals_augmentations + if instrument == "vocals" + else self.other_augmentations + ) + # Loudness augmentation + audio *= np.random.uniform(0.5, 1.5, (len(audio),)).astype("float32") + # Random reverse + audio = coin_flip(audio, 0.1, lambda x: np.flip(x)) + # Random polarity inversion + audio = coin_flip(audio, 0.5, lambda x: -x) + # Apply selected augmentations + for aug_ in augmentations: + aug_.randomize_parameters(audio, sample_rate=16000) + audio = aug_(audio, sample_rate=16000) + return audio + + def random_mix_of_tracks(self) -> dict: + """Create a random mix of instruments by summing their individual chunks.""" + tracks = {} + for instrument in SOURCE_INSTRUMENTS: + # Start with a single random chunk + mixup = [self.random_chunk_of_instrument_type(instrument)] + + # Randomly add more chunks of the same instrument (mixup augmentation) + if self.augmentation: + for p in (0.2, 0.02): + if random.uniform(0, 1) < p: + mixup.append(self.random_chunk_of_instrument_type(instrument)) + + tracks[instrument] = np.mean(mixup, axis=0, dtype="float32") + return tracks + + def __len__(self): + return self.batches_per_epoch + + def __getitem__(self, idx): + # Generate a batch of random mixtures + batch = [self.random_mix_of_tracks() for _ in 
range(self.batch_size)] + + # Features: sum of all tracks + batch_x = ops.sum( + np.array([list(track_set.values()) for track_set in batch]), axis=1 + ) + + # Labels: isolated target instruments (e.g., vocals) + batch_y = np.array( + [[track_set[t] for t in TARGET_INSTRUMENTS] for track_set in batch] + ) + + return batch_x, ops.convert_to_tensor(batch_y) + + +# Create train and validation datasets +train_ds = Dataset(glob.glob(path.join(songs, "train", "*"))) +val_ds = Dataset( + glob.glob(path.join(songs, "test", "*")), + batches_per_epoch=int(0.1 * train_ds.batches_per_epoch), + augmentation=False, +) +``` + +### Visualize a Sample + +Let's visualize a random mixed audio chunk and its corresponding isolated vocals. +This helps to understand the nature of the preprocessed input data. + + +```python + +def visualize_audio_np(audio: np.ndarray, rate=16000, name="mixup"): + """Plot and display an audio waveform and also produce an Audio widget.""" + plt.figure(figsize=(10, 6)) + plt.plot(audio) + plt.title(f"Waveform: {name}") + plt.xlim(0, len(audio)) + plt.ylabel("Amplitude") + plt.show() + # plt.savefig(f"tmp/{name}.png") + + # Normalize and display audio + audio_norm = (audio - np.min(audio)) / (np.max(audio) - np.min(audio) + 1e-8) + audio_norm = (audio_norm * 2 - 1) * 0.6 + display.display(display.Audio(audio_norm, rate=rate)) + # sf.write(f"tmp/{name}.wav", audio_norm, rate) + + +sample_batch_x, sample_batch_y = val_ds[None] # Random batch +visualize_audio_np(ops.convert_to_numpy(sample_batch_x[0])) +visualize_audio_np(ops.convert_to_numpy(sample_batch_y[0, 0]), name="vocals") +``` + + + +![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_12_0.png) + + + + + + + + + + + +![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_12_2.png) + + + + + + + + + +--- +## Model + +### Preprocessing + +The model operates on STFT representations rather than raw audio. 
We define a +preprocessing model to compute STFT and a corresponding inverse transform (iSTFT). + + +```python + +def stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): + """Compute the STFT for the input audio and return the real and imaginary parts.""" + real_x, imag_x = ops.stft(inputs, fft_size, sequence_stride, fft_size) + real_x, imag_x = ops.expand_dims(real_x, -1), ops.expand_dims(imag_x, -1) + x = ops.concatenate((real_x, imag_x), axis=-1) + + # Drop last freq sample for convenience + return ops.split(x, [x.shape[2] - 1], axis=2)[0] + + +def inverse_stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): + """Compute the inverse STFT for the given STFT input.""" + x = inputs + + # Pad back dropped freq sample if using torch backend + if keras.backend.backend() == "torch": + x = ops.pad(x, ((0, 0), (0, 0), (0, 1), (0, 0))) + + real_x, imag_x = ops.split(x, 2, axis=-1) + real_x = ops.squeeze(real_x, axis=-1) + imag_x = ops.squeeze(imag_x, axis=-1) + + return ops.istft((real_x, imag_x), fft_size, sequence_stride, fft_size) + +``` + +### Model Architecture + +The model uses a custom encoder-decoder architecture with Time-Frequency Convolution +(TFC) and Time-Distributed Fully Connected (TDF) blocks. They are grouped into a +`TimeFrequencyTransformBlock`, i.e. "TFC_TDF" in the original paper by Choi et al. + +We then define an encoder-decoder network with multiple scales. Each encoder scale +applies TFC_TDF blocks followed by downsampling, while decoder scales apply TFC_TDF +blocks over the concatenation of upsampled features and associated encoder outputs. + + +```python + +@saving.register_keras_serializable() +class TimeDistributedDenseBlock(layers.Layer): + """Time-Distributed Fully Connected layer block. + + Applies frequency-wise dense transformations across time frames with instance + normalization and GELU activation. 
+ """ + + def __init__(self, bottleneck_factor, fft_dim, **kwargs): + super().__init__(**kwargs) + self.fft_dim = fft_dim + self.hidden_dim = fft_dim // bottleneck_factor + + def build(self, *_): + self.group_norm_1 = layers.GroupNormalization(groups=-1) + self.group_norm_2 = layers.GroupNormalization(groups=-1) + self.dense_1 = layers.Dense(self.hidden_dim, use_bias=False) + self.dense_2 = layers.Dense(self.fft_dim, use_bias=False) + + def call(self, x): + # Apply normalization and dense layers frequency-wise + x = ops.gelu(self.group_norm_1(x)) + x = ops.swapaxes(x, -1, -2) + x = self.dense_1(x) + + x = ops.gelu(self.group_norm_2(ops.swapaxes(x, -1, -2))) + x = ops.swapaxes(x, -1, -2) + x = self.dense_2(x) + return ops.swapaxes(x, -1, -2) + + +@saving.register_keras_serializable() +class TimeFrequencyConvolution(layers.Layer): + """Time-Frequency Convolutional layer. + + Applies a 2D convolution over time-frequency representations and applies instance + normalization and GELU activation. + """ + + def __init__(self, channels, **kwargs): + super().__init__(**kwargs) + self.channels = channels + + def build(self, *_): + self.group_norm = layers.GroupNormalization(groups=-1) + self.conv = layers.Conv2D(self.channels, 3, padding="same", use_bias=False) + + def call(self, x): + return self.conv(ops.gelu(self.group_norm(x))) + + +@saving.register_keras_serializable() +class TimeFrequencyTransformBlock(layers.Layer): + """Implements TFC_TDF block for encoder-decoder architecture. + + Repeatedly apply Time-Frequency Convolution and Time-Distributed Dense blocks as + many times as specified by the `length` parameter. 
+ """ + + def __init__( + self, channels, length, fft_dim, bottleneck_factor, in_channels=None, **kwargs + ): + super().__init__(**kwargs) + self.channels = channels + self.length = length + self.fft_dim = fft_dim + self.bottleneck_factor = bottleneck_factor + self.in_channels = in_channels or channels + self.blocks = [] + + def build(self, *_): + # Add blocks in a flat list to avoid nested structures + for i in range(self.length): + in_channels = self.channels if i > 0 else self.in_channels + self.blocks.append(TimeFrequencyConvolution(in_channels)) + self.blocks.append( + TimeDistributedDenseBlock(self.bottleneck_factor, self.fft_dim) + ) + self.blocks.append(TimeFrequencyConvolution(self.channels)) + # Residual connection + self.blocks.append(layers.Conv2D(self.channels, 1, 1, use_bias=False)) + + def call(self, inputs): + x = inputs + # Each block consists of 4 layers: + # 1. Time-Frequency Convolution + # 2. Time-Distributed Dense + # 3. Time-Frequency Convolution + # 4. Residual connection + for i in range(0, len(self.blocks), 4): + tfc_1 = self.blocks[i](x) + tdf = self.blocks[i + 1](x) + tfc_2 = self.blocks[i + 2](tfc_1 + tdf) + x = tfc_2 + self.blocks[i + 3](x) # Residual connection + return x + + +@saving.register_keras_serializable() +class Downscale(layers.Layer): + """Downscale time-frequency dimensions using a convolution.""" + + conv_cls = layers.Conv2D + + def __init__(self, channels, scale, **kwargs): + super().__init__(**kwargs) + self.channels = channels + self.scale = scale + + def build(self, *_): + self.conv = self.conv_cls(self.channels, self.scale, self.scale, use_bias=False) + self.norm = layers.GroupNormalization(groups=-1) + + def call(self, inputs): + return self.norm(ops.gelu(self.conv(inputs))) + + +@saving.register_keras_serializable() +class Upscale(Downscale): + """Upscale time-frequency dimensions using a transposed convolution.""" + + conv_cls = layers.Conv2DTranspose + + +def build_model( + inputs, + n_instruments=N_INSTRUMENTS, 
+ n_subbands=N_SUBBANDS, + channels=N_CHANNELS, + fft_dim=(STFT_N_FFT // 2) // N_SUBBANDS, + n_scales=4, + scale=(2, 2), + block_size=2, + growth=128, + bottleneck_factor=2, + **kwargs, +): + """Build the TFC_TDF encoder-decoder model for source separation.""" + # Compute STFT + x = stft(inputs) + + # Split mixture into subbands as separate channels + mix = ops.reshape(x, (-1, x.shape[1], x.shape[2] // n_subbands, 2 * n_subbands)) + first_conv_out = layers.Conv2D(channels, 1, 1, use_bias=False)(mix) + x = first_conv_out + + # Encoder path + encoder_outs = [] + for _ in range(n_scales): + x = TimeFrequencyTransformBlock( + channels, block_size, fft_dim, bottleneck_factor + )(x) + encoder_outs.append(x) + fft_dim, channels = fft_dim // scale[0], channels + growth + x = Downscale(channels, scale)(x) + + # Bottleneck + x = TimeFrequencyTransformBlock(channels, block_size, fft_dim, bottleneck_factor)(x) + + # Decoder path + for _ in range(n_scales): + fft_dim, channels = fft_dim * scale[0], channels - growth + x = ops.concatenate([Upscale(channels, scale)(x), encoder_outs.pop()], axis=-1) + x = TimeFrequencyTransformBlock( + channels, block_size, fft_dim, bottleneck_factor, in_channels=x.shape[-1] + )(x) + + # Residual connection and final convolutions + x = ops.concatenate([mix, x * first_conv_out], axis=-1) + x = layers.Conv2D(channels, 1, 1, use_bias=False, activation="gelu")(x) + x = layers.Conv2D(n_instruments * n_subbands * 2, 1, 1, use_bias=False)(x) + + # Reshape back to instrument-wise STFT + x = ops.reshape(x, (-1, x.shape[1], x.shape[2] * n_subbands, n_instruments, 2)) + x = ops.transpose(x, (0, 3, 1, 2, 4)) + x = ops.reshape(x, (-1, n_instruments, x.shape[2], x.shape[3] * 2)) + + return keras.Model(inputs=inputs, outputs=x, **kwargs) + +``` + +--- +## Loss and Metrics + +We define: + +- `spectral_loss`: Mean absolute error in STFT domain. +- `sdr`: Signal-to-Distortion Ratio, a common source separation metric. 
+ + +```python + +def prediction_to_wave(x, n_instruments=N_INSTRUMENTS): + """Convert STFT predictions back to waveform.""" + x = ops.reshape(x, (-1, x.shape[2], x.shape[3] // 2, 2)) + x = inverse_stft(x) + return ops.reshape(x, (-1, n_instruments, x.shape[1])) + + +def target_to_stft(y): + """Convert target waveforms to their STFT representations.""" + y = ops.reshape(y, (-1, CHUNK_SIZE)) + y_real, y_imag = ops.stft(y, STFT_N_FFT, STFT_HOP_LENGTH, STFT_N_FFT) + y_real, y_imag = y_real[..., :-1], y_imag[..., :-1] + y = ops.stack([y_real, y_imag], axis=-1) + return ops.reshape(y, (-1, N_INSTRUMENTS, y.shape[1], y.shape[2] * 2)) + + +@saving.register_keras_serializable() +def sdr(y_true, y_pred): + """Signal-to-Distortion Ratio metric.""" + y_pred = prediction_to_wave(y_pred) + # Add epsilon for numerical stability + num = ops.sum(ops.square(y_true), axis=-1) + 1e-8 + den = ops.sum(ops.square(y_true - y_pred), axis=-1) + 1e-8 + return 10 * ops.log10(num / den) + + +@saving.register_keras_serializable() +def spectral_loss(y_true, y_pred): + """Mean absolute error in the STFT domain.""" + y_true = target_to_stft(y_true) + return ops.mean(ops.absolute(y_true - y_pred)) + +``` + +--- +## Training + +### Visualize Model Architecture + + +```python +# Load or create the model +if path.exists(MODEL_PATH): + model = saving.load_model(MODEL_PATH) +else: + model = build_model(keras.Input(sample_batch_x.shape[1:]), name="tfc_tdf_net") + +# Display the model architecture +model.summary() +img = keras.utils.plot_model(model, path.join(TMP_DIR, "model.png"), show_shapes=True) +display.display(img) +``` + + +
+Model: "tfc_tdf_net"
+
+ + + + +
+┏━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━┓
+┃ Layer (type)         Output Shape          Param #  Connected to      ┃
+┡━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━┩
+│ input_layer         │ (None, 65024)     │          0 │ -                 │
+│ (InputLayer)        │                   │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ stft (STFT)         │ [(None, 128,      │          0 │ input_layer[0][0] │
+│                     │ 1025), (None,     │            │                   │
+│                     │ 128, 1025)]       │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ expand_dims         │ (None, 128, 1025, │          0 │ stft[0][0]        │
+│ (ExpandDims)        │ 1)                │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ expand_dims_1       │ (None, 128, 1025, │          0 │ stft[0][1]        │
+│ (ExpandDims)        │ 1)                │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ concatenate         │ (None, 128, 1025, │          0 │ expand_dims[0][0… │
+│ (Concatenate)       │ 2)                │            │ expand_dims_1[0]… │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ split (Split)       │ [(None, 128,      │          0 │ concatenate[0][0] │
+│                     │ 1024, 2), (None,  │            │                   │
+│                     │ 128, 1, 2)]       │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ reshape (Reshape)   │ (None, 128, 256,  │          0 │ split[0][0]       │
+│                     │ 8)                │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ conv2d (Conv2D)     │ (None, 128, 256,  │        512 │ reshape[0][0]     │
+│                     │ 64)               │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 128, 256,  │    287,744 │ conv2d[0][0]      │
+│ (TimeFrequencyTran… │ 64)               │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ downscale           │ (None, 64, 128,   │     49,536 │ time_frequency_t… │
+│ (Downscale)         │ 192)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 64, 128,   │  1,436,672 │ downscale[0][0]   │
+│ (TimeFrequencyTran… │ 192)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ downscale_1         │ (None, 32, 64,    │    246,400 │ time_frequency_t… │
+│ (Downscale)         │ 320)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 32, 64,    │  3,904,512 │ downscale_1[0][0] │
+│ (TimeFrequencyTran… │ 320)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ downscale_2         │ (None, 16, 32,    │    574,336 │ time_frequency_t… │
+│ (Downscale)         │ 448)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 16, 32,    │  7,635,968 │ downscale_2[0][0] │
+│ (TimeFrequencyTran… │ 448)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ downscale_3         │ (None, 8, 16,     │  1,033,344 │ time_frequency_t… │
+│ (Downscale)         │ 576)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 8, 16,     │ 12,617,216 │ downscale_3[0][0] │
+│ (TimeFrequencyTran… │ 576)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ upscale (Upscale)   │ (None, 16, 32,    │  1,033,088 │ time_frequency_t… │
+│                     │ 448)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ concatenate_1       │ (None, 16, 32,    │          0 │ upscale[0][0],    │
+│ (Concatenate)       │ 896)              │            │ time_frequency_t… │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 16, 32,    │ 15,065,600 │ concatenate_1[0]… │
+│ (TimeFrequencyTran… │ 448)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ upscale_1 (Upscale) │ (None, 32, 64,    │    574,080 │ time_frequency_t… │
+│                     │ 320)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ concatenate_2       │ (None, 32, 64,    │          0 │ upscale_1[0][0],  │
+│ (Concatenate)       │ 640)              │            │ time_frequency_t… │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 32, 64,    │  7,695,872 │ concatenate_2[0]… │
+│ (TimeFrequencyTran… │ 320)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ upscale_2 (Upscale) │ (None, 64, 128,   │    246,144 │ time_frequency_t… │
+│                     │ 192)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ concatenate_3       │ (None, 64, 128,   │          0 │ upscale_2[0][0],  │
+│ (Concatenate)       │ 384)              │            │ time_frequency_t… │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 64, 128,   │  2,802,176 │ concatenate_3[0]… │
+│ (TimeFrequencyTran… │ 192)              │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ upscale_3 (Upscale) │ (None, 128, 256,  │     49,280 │ time_frequency_t… │
+│                     │ 64)               │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ concatenate_4       │ (None, 128, 256,  │          0 │ upscale_3[0][0],  │
+│ (Concatenate)       │ 128)              │            │ time_frequency_t… │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ time_frequency_tra… │ (None, 128, 256,  │    439,808 │ concatenate_4[0]… │
+│ (TimeFrequencyTran… │ 64)               │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ multiply (Multiply) │ (None, 128, 256,  │          0 │ time_frequency_t… │
+│                     │ 64)               │            │ conv2d[0][0]      │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ concatenate_5       │ (None, 128, 256,  │          0 │ reshape[0][0],    │
+│ (Concatenate)       │ 72)               │            │ multiply[0][0]    │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ conv2d_59 (Conv2D)  │ (None, 128, 256,  │      4,608 │ concatenate_5[0]… │
+│                     │ 64)               │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ conv2d_60 (Conv2D)  │ (None, 128, 256,  │        512 │ conv2d_59[0][0]   │
+│                     │ 8)                │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ reshape_1 (Reshape) │ (None, 128, 1024, │          0 │ conv2d_60[0][0]   │
+│                     │ 1, 2)             │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ transpose           │ (None, 1, 128,    │          0 │ reshape_1[0][0]   │
+│ (Transpose)         │ 1024, 2)          │            │                   │
+├─────────────────────┼───────────────────┼────────────┼───────────────────┤
+│ reshape_2 (Reshape) │ (None, 1, 128,    │          0 │ transpose[0][0]   │
+│                     │ 2048)             │            │                   │
+└─────────────────────┴───────────────────┴────────────┴───────────────────┘
+
+ + + + +
+ Total params: 222,789,634 (849.88 MB)
+
+ + + + +
+ Trainable params: 55,697,408 (212.47 MB)
+
+ + + + +
+ Non-trainable params: 0 (0.00 B)
+
+ + + + +
+ Optimizer params: 167,092,226 (637.41 MB)
+
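As a side note, the optimizer parameter count reported above is consistent with Adam keeping two moment tensors per trainable weight plus one accumulated-gradient buffer per weight (since `gradient_accumulation_steps` is set), with a couple of leftover scalars for bookkeeping such as the iteration counter. This quick cross-check is illustrative arithmetic, not code from the example:

```python
# Figures reported by model.summary() above
trainable = 55_697_408
optimizer_params = 167_092_226

# Two Adam moment tensors + one gradient-accumulation buffer per weight
slots = 3 * trainable
print(slots, optimizer_params - slots)  # 167092224, leaving 2 scalar variables
```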
+ + + + + +![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_20_6.png) + + + +### Compile and Train the Model + + +```python +# Compile the model +optimizer = keras.optimizers.Adam(5e-05, gradient_accumulation_steps=ACCUMULATION_STEPS) +model.compile(optimizer=optimizer, loss=spectral_loss, metrics=[sdr]) + +# Define callbacks +cbs = [ + callbacks.ModelCheckpoint(MODEL_PATH, "val_sdr", save_best_only=True, mode="max"), + callbacks.ReduceLROnPlateau(factor=0.95, patience=2), + callbacks.CSVLogger(CSV_LOG_PATH), +] + +if not path.exists(MODEL_PATH): + model.fit(train_ds, validation_data=val_ds, epochs=10, callbacks=cbs, shuffle=False) +else: + # Demonstration of a single epoch of training when model already exists + model.fit(train_ds, validation_data=val_ds, epochs=1, shuffle=False, verbose=2) +``` + +
+``` +2000/2000 - 490s - 245ms/step - loss: 0.2977 - sdr: 5.6497 - val_loss: 0.1720 - val_sdr: 6.0508 + +``` +
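Because the optimizer was built with `gradient_accumulation_steps=ACCUMULATION_STEPS`, gradients from consecutive batches are averaged and applied together, which is why the effective batch size is `BATCH_SIZE * ACCUMULATION_STEPS`. A minimal NumPy sketch of the mechanism, using a hypothetical one-parameter least-squares model rather than anything from this example:

```python
import numpy as np

rng = np.random.default_rng(0)
w, target_w = 0.0, 3.0
lr, accumulation_steps = 0.1, 2
accumulated = 0.0

for step in range(1, 9):
    x = rng.normal(size=3)  # one micro-batch of inputs
    # Gradient of mean((w * x - target_w * x) ** 2) with respect to w
    grad = np.mean(2.0 * (w - target_w) * x * x)
    accumulated += grad
    if step % accumulation_steps == 0:
        # Apply the averaged gradient only every N micro-batches, then reset
        w -= lr * accumulated / accumulation_steps
        accumulated = 0.0
print(w)  # moves toward target_w using averaged gradients
```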
+--- +## Evaluation + +Evaluate the model on the validation dataset and visualize predicted vocals. + + +```python +model.evaluate(val_ds, verbose=2) +y_pred = model.predict(sample_batch_x, verbose=2) +y_pred = prediction_to_wave(y_pred) +visualize_audio_np(ops.convert_to_numpy(y_pred[0, 0]), name="vocals_pred") +``` + +
+``` +200/200 - 8s - 41ms/step - loss: 0.1747 - sdr: 5.9374 + +1/1 - 4s - 4s/step + +``` +
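For intuition about the metric value above: SDR is a logarithmic energy ratio, so inverting the `10 * log10` shows that at roughly 6 dB the target signal carries about four times the energy of the residual error. A quick illustrative conversion:

```python
val_sdr_db = 5.9374  # validation SDR reported in the log above
energy_ratio = 10 ** (val_sdr_db / 10)
print(round(energy_ratio, 2))  # ~3.92: target energy vs. error energy
```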
+ +![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_24_2.png) + + + + + + + + + +--- +## Conclusion + +We built and trained a vocal track separation model using an encoder-decoder +architecture with custom blocks applied to the MUSDB18 dataset. We demonstrated +STFT-based preprocessing, data augmentation, and a source separation metric (SDR). + +**Next steps:** + +- Train for more epochs and refine hyperparameters. +- Separate multiple instruments simultaneously. +- Enhance the model to handle instruments not present in the mixture. + From ee5be7d1b58f040be0f0f67cdacb95c0f21fbbce Mon Sep 17 00:00:00 2001 From: Damoon Date: Thu, 27 Feb 2025 21:21:53 +0300 Subject: [PATCH 2/5] added by mistake --- examples/audio/vocal_track_separation.py | 673 ----------------------- 1 file changed, 673 deletions(-) delete mode 100644 examples/audio/vocal_track_separation.py diff --git a/examples/audio/vocal_track_separation.py b/examples/audio/vocal_track_separation.py deleted file mode 100644 index 25574e0ab8..0000000000 --- a/examples/audio/vocal_track_separation.py +++ /dev/null @@ -1,673 +0,0 @@ -""" -Title: Vocal Track Separation with Encoder-Decoder Architecture -Author: [Joaquin Jimenez](https://github.com/johacks/) -Date created: 2024/12/10 -Last modified: 2024/12/10 -Description: Train a model to separate vocal tracks from music mixtures. -Accelerator: GPU -""" - -""" -## Introduction - -In this tutorial, we build a vocal track separation model using an encoder-decoder -architecture in Keras 3. - -We train the model on the [MUSDB18 dataset](https://doi.org/10.5281/zenodo.1117372), -which provides music mixtures and isolated tracks for drums, bass, other, and vocals. - -Key concepts covered: - -- Audio data preprocessing using the Short-Time Fourier Transform (STFT). -- Audio data augmentation techniques. -- Implementing custom encoders and decoders specialized for audio data. 
-- Defining appropriate loss functions and metrics for audio source separation tasks. - -The model architecture is derived from the TFC_TDF_Net model described in: - -W. Choi, M. Kim, J. Chung, D. Lee, and S. Jung, “Investigating U-Nets with various -intermediate blocks for spectrogram-based singing voice separation,” in the 21st -International Society for Music Information Retrieval Conference, 2020. - -For reference code, see: -[GitHub: ws-choi/ISMIR2020_U_Nets_SVS](https://github.com/ws-choi/ISMIR2020_U_Nets_SVS). - -The data processing and model training routines are partly derived from: -[ZFTurbo/Music-Source-Separation-Training](https://github.com/ZFTurbo/Music-Source-Separation-Training/tree/main). -""" - -""" -## Setup - -Import and install all the required dependencies. -""" - -"""shell -pip install -qq audiomentations soundfile ffmpeg-binaries -pip install -qq "keras==3.7.0" -sudo -n apt-get install -y graphviz >/dev/null 2>&1 # Required for plotting the model -""" - -import glob -import os - -os.environ["KERAS_BACKEND"] = "jax" # or "tensorflow" or "torch" - -import random -import subprocess -import tempfile -import typing -from os import path - -import audiomentations as aug -import ffmpeg -import keras -import numpy as np -import soundfile as sf -from IPython import display -from keras import callbacks, layers, ops, saving -from matplotlib import pyplot as plt - -""" -## Configuration - -The following constants define configuration parameters for audio processing -and model training, including dataset paths, audio chunk sizes, Short-Time Fourier -Transform (STFT) parameters, and training hyperparameters. 
-""" - -# MUSDB18 dataset configuration -MUSDB_STREAMS = {"mixture": 0, "drums": 1, "bass": 2, "other": 3, "vocals": 4} -TARGET_INSTRUMENTS = {track: MUSDB_STREAMS[track] for track in ("vocals",)} -N_INSTRUMENTS = len(TARGET_INSTRUMENTS) -SOURCE_INSTRUMENTS = tuple(k for k in MUSDB_STREAMS if k != "mixture") - -# Audio preprocessing parameters for Short-Time Fourier Transform (STFT) -N_SUBBANDS = 4 # Number of subbands into which frequencies are split -CHUNK_SIZE = 65024 # Number of amplitude samples per audio chunk (~4 seconds) -STFT_N_FFT = 2048 # FFT points used in STFT -STFT_HOP_LENGTH = 512 # Hop length for STFT - -# Training hyperparameters -N_CHANNELS = 64 # Base channel count for the model -BATCH_SIZE = 3 -ACCUMULATION_STEPS = 2 -EFFECTIVE_BATCH_SIZE = BATCH_SIZE * (ACCUMULATION_STEPS or 1) - -# Paths -TMP_DIR = path.expanduser("~/.keras/tmp") -DATASET_DIR = path.expanduser("~/.keras/datasets") -MODEL_PATH = path.join(TMP_DIR, f"model_{keras.backend.backend()}.keras") -CSV_LOG_PATH = path.join(TMP_DIR, f"training_{keras.backend.backend()}.csv") -os.makedirs(DATASET_DIR, exist_ok=True) -os.makedirs(TMP_DIR, exist_ok=True) - -# Set random seed for reproducibility -keras.utils.set_random_seed(21) - -""" -## MUSDB18 Dataset - -The MUSDB18 dataset is a standard benchmark for music source separation, containing -150 full-length music tracks along with isolated drums, bass, other, and vocals. -The dataset is stored in .mp4 format, and each .mp4 file includes multiple audio -streams (mixture and individual tracks). - -### Download and Conversion - -The following utility function downloads MUSDB18 and converts its .mp4 files to -.wav files for each instrument track, resampled to 16 kHz. -""" - - -def download_musdb18(out_dir=None): - """Download and extract the MUSDB18 dataset, then convert .mp4 files to .wav files. - - MUSDB18 reference: - Rafii, Z., Liutkus, A., Stöter, F.-R., Mimilakis, S. I., & Bittner, R. (2017). 
- MUSDB18 - a corpus for music separation (1.0.0) [Data set]. Zenodo. - """ - ffmpeg.init() - from ffmpeg import FFMPEG_PATH - - # Create output directories - os.makedirs((base := out_dir or tempfile.mkdtemp()), exist_ok=True) - if path.exists((out_dir := path.join(base, "musdb18_wav"))): - print("MUSDB18 dataset already downloaded") - return out_dir - - # Download and extract the dataset - download_dir = keras.utils.get_file( - fname="musdb18", - origin="https://zenodo.org/records/1117372/files/musdb18.zip", - extract=True, - ) - - # ffmpeg command template: input, stream index, output - ffmpeg_args = str(FFMPEG_PATH) + " -v error -i {} -map 0:{} -vn -ar 16000 {}" - - # Convert each mp4 file to multiple .wav files for each track - for split in ("train", "test"): - songs = os.listdir(path.join(download_dir, split)) - for i, song in enumerate(songs): - if i % 10 == 0: - print(f"{split.capitalize()}: {i}/{len(songs)} songs processed") - - mp4_path_orig = path.join(download_dir, split, song) - mp4_path = path.join(tempfile.mkdtemp(), split, song.replace(" ", "_")) - os.makedirs(path.dirname(mp4_path), exist_ok=True) - os.rename(mp4_path_orig, mp4_path) - - wav_dir = path.join(out_dir, split, path.basename(mp4_path).split(".")[0]) - os.makedirs(wav_dir, exist_ok=True) - - for track in SOURCE_INSTRUMENTS: - out_path = path.join(wav_dir, f"{track}.wav") - stream_index = MUSDB_STREAMS[track] - args = ffmpeg_args.format(mp4_path, stream_index, out_path).split() - assert subprocess.run(args).returncode == 0, "ffmpeg conversion failed" - return out_dir - - -# Download and prepare the MUSDB18 dataset -songs = download_musdb18(out_dir=DATASET_DIR) - -""" -### Custom Dataset - -We define a custom dataset class to generate random audio chunks and their corresponding -labels. The dataset does the following: - -1. Selects a random chunk from a random song and instrument. -2. Applies optional data augmentations. -3. Combines isolated tracks to form new synthetic mixtures. -4. 
Prepares features (mixtures) and labels (vocals) for training. - -This approach allows creating an effectively infinite variety of training examples -through randomization and augmentation. -""" - - -class Dataset(keras.utils.PyDataset): - def __init__( - self, - songs, - batch_size=BATCH_SIZE, - chunk_size=CHUNK_SIZE, - batches_per_epoch=1000 * ACCUMULATION_STEPS, - augmentation=True, - **kwargs, - ): - super().__init__(**kwargs) - self.augmentation = augmentation - self.vocals_augmentations = [ - aug.PitchShift(min_semitones=-5, max_semitones=5, p=0.1), - aug.SevenBandParametricEQ(-9, 9, p=0.25), - aug.TanhDistortion(0.1, 0.7, p=0.1), - ] - self.other_augmentations = [ - aug.PitchShift(p=0.1), - aug.AddGaussianNoise(p=0.1), - ] - self.songs = songs - self.sizes = {song: self.get_track_set_size(song) for song in self.songs} - self.batch_size = batch_size - self.chunk_size = chunk_size - self.batches_per_epoch = batches_per_epoch - - def get_track_set_size(self, song: str): - """Return the smallest track length in the given song directory.""" - sizes = [len(sf.read(p)[0]) for p in glob.glob(path.join(song, "*.wav"))] - if max(sizes) != min(sizes): - print(f"Warning: {song} has different track lengths") - return min(sizes) - - def random_chunk_of_instrument_type(self, instrument: str): - """Extract a random chunk for the specified instrument from a random song.""" - song, size = random.choice(list(self.sizes.items())) - track = path.join(song, f"{instrument}.wav") - - if self.chunk_size <= size: - start = np.random.randint(size - self.chunk_size + 1) - audio = sf.read(track, self.chunk_size, start, dtype="float32")[0] - audio_mono = np.mean(audio, axis=1) - else: - # If the track is shorter than chunk_size, pad the signal - audio_mono = np.mean(sf.read(track, dtype="float32")[0], axis=1) - audio_mono = np.pad(audio_mono, ((0, self.chunk_size - size),)) - - # If the chunk is almost silent, retry - if np.mean(np.abs(audio_mono)) < 0.01: - return 
self.random_chunk_of_instrument_type(instrument) - - return self.data_augmentation(audio_mono, instrument) - - def data_augmentation(self, audio: np.ndarray, instrument: str): - """Apply data augmentation to the audio chunk, if enabled.""" - - def coin_flip(x, probability: float, fn: typing.Callable): - return fn(x) if random.uniform(0, 1) < probability else x - - if self.augmentation: - augmentations = ( - self.vocals_augmentations - if instrument == "vocals" - else self.other_augmentations - ) - # Loudness augmentation - audio *= np.random.uniform(0.5, 1.5, (len(audio),)).astype("float32") - # Random reverse - audio = coin_flip(audio, 0.1, lambda x: np.flip(x)) - # Random polarity inversion - audio = coin_flip(audio, 0.5, lambda x: -x) - # Apply selected augmentations - for aug_ in augmentations: - aug_.randomize_parameters(audio, sample_rate=16000) - audio = aug_(audio, sample_rate=16000) - return audio - - def random_mix_of_tracks(self) -> dict: - """Create a random mix of instruments by summing their individual chunks.""" - tracks = {} - for instrument in SOURCE_INSTRUMENTS: - # Start with a single random chunk - mixup = [self.random_chunk_of_instrument_type(instrument)] - - # Randomly add more chunks of the same instrument (mixup augmentation) - if self.augmentation: - for p in (0.2, 0.02): - if random.uniform(0, 1) < p: - mixup.append(self.random_chunk_of_instrument_type(instrument)) - - tracks[instrument] = np.mean(mixup, axis=0, dtype="float32") - return tracks - - def __len__(self): - return self.batches_per_epoch - - def __getitem__(self, idx): - # Generate a batch of random mixtures - batch = [self.random_mix_of_tracks() for _ in range(self.batch_size)] - - # Features: sum of all tracks - batch_x = ops.sum( - np.array([list(track_set.values()) for track_set in batch]), axis=1 - ) - - # Labels: isolated target instruments (e.g., vocals) - batch_y = np.array( - [[track_set[t] for t in TARGET_INSTRUMENTS] for track_set in batch] - ) - - return batch_x, 
ops.convert_to_tensor(batch_y) - - -# Create train and validation datasets -train_ds = Dataset(glob.glob(path.join(songs, "train", "*"))) -val_ds = Dataset( - glob.glob(path.join(songs, "test", "*")), - batches_per_epoch=int(0.1 * train_ds.batches_per_epoch), - augmentation=False, -) - -""" -### Visualize a Sample - -Let's visualize a random mixed audio chunk and its corresponding isolated vocals. -This helps to understand the nature of the preprocessed input data. -""" - - -def visualize_audio_np(audio: np.ndarray, rate=16000, name="mixup"): - """Plot and display an audio waveform and also produce an Audio widget.""" - plt.figure(figsize=(10, 6)) - plt.plot(audio) - plt.title(f"Waveform: {name}") - plt.xlim(0, len(audio)) - plt.ylabel("Amplitude") - plt.show() - # plt.savefig(f"tmp/{name}.png") - - # Normalize and display audio - audio_norm = (audio - np.min(audio)) / (np.max(audio) - np.min(audio) + 1e-8) - audio_norm = (audio_norm * 2 - 1) * 0.6 - display.display(display.Audio(audio_norm, rate=rate)) - # sf.write(f"tmp/{name}.wav", audio_norm, rate) - - -sample_batch_x, sample_batch_y = val_ds[None] # Random batch -visualize_audio_np(ops.convert_to_numpy(sample_batch_x[0])) -visualize_audio_np(ops.convert_to_numpy(sample_batch_y[0, 0]), name="vocals") - -""" -## Model - -### Preprocessing - -The model operates on STFT representations rather than raw audio. We define a -preprocessing model to compute STFT and a corresponding inverse transform (iSTFT). 
-""" - - -def stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): - """Compute the STFT for the input audio and return the real and imaginary parts.""" - real_x, imag_x = ops.stft(inputs, fft_size, sequence_stride, fft_size) - real_x, imag_x = ops.expand_dims(real_x, -1), ops.expand_dims(imag_x, -1) - x = ops.concatenate((real_x, imag_x), axis=-1) - - # Drop last freq sample for convenience - return ops.split(x, [x.shape[2] - 1], axis=2)[0] - - -def inverse_stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): - """Compute the inverse STFT for the given STFT input.""" - x = inputs - - # Pad back dropped freq sample if using torch backend - if keras.backend.backend() == "torch": - x = ops.pad(x, ((0, 0), (0, 0), (0, 1), (0, 0))) - - real_x, imag_x = ops.split(x, 2, axis=-1) - real_x = ops.squeeze(real_x, axis=-1) - imag_x = ops.squeeze(imag_x, axis=-1) - - return ops.istft((real_x, imag_x), fft_size, sequence_stride, fft_size) - - -""" -### Model Architecture - -The model uses a custom encoder-decoder architecture with Time-Frequency Convolution -(TFC) and Time-Distributed Fully Connected (TDF) blocks. They are grouped into a -`TimeFrequencyTransformBlock`, i.e. "TFC_TDF" in the original paper by Choi et al. - -We then define an encoder-decoder network with multiple scales. Each encoder scale -applies TFC_TDF blocks followed by downsampling, while decoder scales apply TFC_TDF -blocks over the concatenation of upsampled features and associated encoder outputs. -""" - - -@saving.register_keras_serializable() -class TimeDistributedDenseBlock(layers.Layer): - """Time-Distributed Fully Connected layer block. - - Applies frequency-wise dense transformations across time frames with instance - normalization and GELU activation. 
- """ - - def __init__(self, bottleneck_factor, fft_dim, **kwargs): - super().__init__(**kwargs) - self.fft_dim = fft_dim - self.hidden_dim = fft_dim // bottleneck_factor - - def build(self, *_): - self.group_norm_1 = layers.GroupNormalization(groups=-1) - self.group_norm_2 = layers.GroupNormalization(groups=-1) - self.dense_1 = layers.Dense(self.hidden_dim, use_bias=False) - self.dense_2 = layers.Dense(self.fft_dim, use_bias=False) - - def call(self, x): - # Apply normalization and dense layers frequency-wise - x = ops.gelu(self.group_norm_1(x)) - x = ops.swapaxes(x, -1, -2) - x = self.dense_1(x) - - x = ops.gelu(self.group_norm_2(ops.swapaxes(x, -1, -2))) - x = ops.swapaxes(x, -1, -2) - x = self.dense_2(x) - return ops.swapaxes(x, -1, -2) - - -@saving.register_keras_serializable() -class TimeFrequencyConvolution(layers.Layer): - """Time-Frequency Convolutional layer. - - Applies a 2D convolution over time-frequency representations and applies instance - normalization and GELU activation. - """ - - def __init__(self, channels, **kwargs): - super().__init__(**kwargs) - self.channels = channels - - def build(self, *_): - self.group_norm = layers.GroupNormalization(groups=-1) - self.conv = layers.Conv2D(self.channels, 3, padding="same", use_bias=False) - - def call(self, x): - return self.conv(ops.gelu(self.group_norm(x))) - - -@saving.register_keras_serializable() -class TimeFrequencyTransformBlock(layers.Layer): - """Implements TFC_TDF block for encoder-decoder architecture. - - Repeatedly apply Time-Frequency Convolution and Time-Distributed Dense blocks as - many times as specified by the `length` parameter. 
- """ - - def __init__( - self, channels, length, fft_dim, bottleneck_factor, in_channels=None, **kwargs - ): - super().__init__(**kwargs) - self.channels = channels - self.length = length - self.fft_dim = fft_dim - self.bottleneck_factor = bottleneck_factor - self.in_channels = in_channels or channels - - def build(self, *_): - self.blocks = [] - # Add blocks in a flat list to avoid nested structures - for i in range(self.length): - in_channels = self.channels if i > 0 else self.in_channels - self.blocks.append(TimeFrequencyConvolution(in_channels)) - self.blocks.append( - TimeDistributedDenseBlock(self.bottleneck_factor, self.fft_dim) - ) - self.blocks.append(TimeFrequencyConvolution(self.channels)) - # Residual connection - self.blocks.append(layers.Conv2D(self.channels, 1, 1, use_bias=False)) - - def call(self, inputs): - x = inputs - # Each block consists of 4 layers: - # 1. Time-Frequency Convolution - # 2. Time-Distributed Dense - # 3. Time-Frequency Convolution - # 4. Residual connection - for i in range(0, len(self.blocks), 4): - tfc_1 = self.blocks[i](x) - tdf = self.blocks[i + 1](x) - tfc_2 = self.blocks[i + 2](tfc_1 + tdf) - x = tfc_2 + self.blocks[i + 3](x) # Residual connection - return x - - -@saving.register_keras_serializable() -class Downscale(layers.Layer): - """Downscale time-frequency dimensions using a convolution.""" - - conv_cls = layers.Conv2D - - def __init__(self, channels, scale, **kwargs): - super().__init__(**kwargs) - self.channels = channels - self.scale = scale - - def build(self, *_): - self.conv = self.conv_cls(self.channels, self.scale, self.scale, use_bias=False) - self.norm = layers.GroupNormalization(groups=-1) - - def call(self, inputs): - return self.norm(ops.gelu(self.conv(inputs))) - - -@saving.register_keras_serializable() -class Upscale(Downscale): - """Upscale time-frequency dimensions using a transposed convolution.""" - - conv_cls = layers.Conv2DTranspose - - -def build_model( - inputs, - n_instruments=N_INSTRUMENTS, 
- n_subbands=N_SUBBANDS, - channels=N_CHANNELS, - fft_dim=(STFT_N_FFT // 2) // N_SUBBANDS, - n_scales=4, - scale=(2, 2), - block_size=2, - growth=128, - bottleneck_factor=2, - **kwargs, -): - """Build the TFC_TDF encoder-decoder model for source separation.""" - # Compute STFT - x = stft(inputs) - - # Split mixture into subbands as separate channels - mix = ops.reshape(x, (-1, x.shape[1], x.shape[2] // n_subbands, 2 * n_subbands)) - first_conv_out = layers.Conv2D(channels, 1, 1, use_bias=False)(mix) - x = first_conv_out - - # Encoder path - encoder_outs = [] - for _ in range(n_scales): - x = TimeFrequencyTransformBlock( - channels, block_size, fft_dim, bottleneck_factor - )(x) - encoder_outs.append(x) - fft_dim, channels = fft_dim // scale[0], channels + growth - x = Downscale(channels, scale)(x) - - # Bottleneck - x = TimeFrequencyTransformBlock(channels, block_size, fft_dim, bottleneck_factor)(x) - - # Decoder path - for _ in range(n_scales): - fft_dim, channels = fft_dim * scale[0], channels - growth - x = ops.concatenate([Upscale(channels, scale)(x), encoder_outs.pop()], axis=-1) - x = TimeFrequencyTransformBlock( - channels, block_size, fft_dim, bottleneck_factor, in_channels=x.shape[-1] - )(x) - - # Residual connection and final convolutions - x = ops.concatenate([mix, x * first_conv_out], axis=-1) - x = layers.Conv2D(channels, 1, 1, use_bias=False, activation="gelu")(x) - x = layers.Conv2D(n_instruments * n_subbands * 2, 1, 1, use_bias=False)(x) - - # Reshape back to instrument-wise STFT - x = ops.reshape(x, (-1, x.shape[1], x.shape[2] * n_subbands, n_instruments, 2)) - x = ops.transpose(x, (0, 3, 1, 2, 4)) - x = ops.reshape(x, (-1, n_instruments, x.shape[2], x.shape[3] * 2)) - - return keras.Model(inputs=inputs, outputs=x, **kwargs) - - -""" -## Loss and Metrics - -We define: - -- `spectral_loss`: Mean absolute error in STFT domain. -- `sdr`: Signal-to-Distortion Ratio, a common source separation metric. 
-""" - - -def prediction_to_wave(x, n_instruments=N_INSTRUMENTS): - """Convert STFT predictions back to waveform.""" - x = ops.reshape(x, (-1, x.shape[2], x.shape[3] // 2, 2)) - x = inverse_stft(x) - return ops.reshape(x, (-1, n_instruments, x.shape[1])) - - -def target_to_stft(y): - """Convert target waveforms to their STFT representations.""" - y = ops.reshape(y, (-1, CHUNK_SIZE)) - y_real, y_imag = ops.stft(y, STFT_N_FFT, STFT_HOP_LENGTH, STFT_N_FFT) - y_real, y_imag = y_real[..., :-1], y_imag[..., :-1] - y = ops.stack([y_real, y_imag], axis=-1) - return ops.reshape(y, (-1, N_INSTRUMENTS, y.shape[1], y.shape[2] * 2)) - - -@saving.register_keras_serializable() -def sdr(y_true, y_pred): - """Signal-to-Distortion Ratio metric.""" - y_pred = prediction_to_wave(y_pred) - # Add epsilon for numerical stability - num = ops.sum(ops.square(y_true), axis=-1) + 1e-8 - den = ops.sum(ops.square(y_true - y_pred), axis=-1) + 1e-8 - return 10 * ops.log10(num / den) - - -@saving.register_keras_serializable() -def spectral_loss(y_true, y_pred): - """Mean absolute error in the STFT domain.""" - y_true = target_to_stft(y_true) - return ops.mean(ops.absolute(y_true - y_pred)) - - -""" -## Training - -### Visualize Model Architecture -""" - -# Load or create the model -if path.exists(MODEL_PATH): - model = saving.load_model(MODEL_PATH) -else: - model = build_model(keras.Input(sample_batch_x.shape[1:]), name="tfc_tdf_net") - -# Display the model architecture -model.summary() -img = keras.utils.plot_model(model, path.join(TMP_DIR, "model.png"), show_shapes=True) -display.display(img) - -""" -### Compile and Train the Model -""" - -# Compile the model -optimizer = keras.optimizers.Adam(5e-05, gradient_accumulation_steps=ACCUMULATION_STEPS) -model.compile(optimizer=optimizer, loss=spectral_loss, metrics=[sdr]) - -# Define callbacks -cbs = [ - callbacks.ModelCheckpoint(MODEL_PATH, "val_sdr", save_best_only=True, mode="max"), - callbacks.ReduceLROnPlateau(factor=0.95, patience=2), - 
callbacks.CSVLogger(CSV_LOG_PATH), -] - -if not path.exists(MODEL_PATH): - model.fit(train_ds, validation_data=val_ds, epochs=10, callbacks=cbs, shuffle=False) -else: - # Demonstration of a single epoch of training when model already exists - model.fit(train_ds, validation_data=val_ds, epochs=1, shuffle=False, verbose=2) - -""" -## Evaluation - -Evaluate the model on the validation dataset and visualize predicted vocals. -""" - -model.evaluate(val_ds, verbose=2) -y_pred = model.predict(sample_batch_x, verbose=2) -y_pred = prediction_to_wave(y_pred) -visualize_audio_np(ops.convert_to_numpy(y_pred[0, 0]), name="vocals_pred") - -""" -## Conclusion - -We built and trained a vocal track separation model using an encoder-decoder -architecture with custom blocks applied to the MUSDB18 dataset. We demonstrated -STFT-based preprocessing, data augmentation, and a source separation metric (SDR). - -**Next steps:** - -- Train for more epochs and refine hyperparameters. -- Separate multiple instruments simultaneously. -- Enhance the model to handle instruments not present in the mixture. -""" From 790b008504e829d1246d80c476085fa39e0dd572 Mon Sep 17 00:00:00 2001 From: Damoon Date: Fri, 28 Feb 2025 01:45:13 +0300 Subject: [PATCH 3/5] Optimized the code, more details added --- examples/vision/ipynb/mnist_moe.ipynb | 235 +- examples/vision/md/mnist_moe.md | 22592 +----------------------- examples/vision/mnist_moe.py | 228 +- 3 files changed, 443 insertions(+), 22612 deletions(-) diff --git a/examples/vision/ipynb/mnist_moe.ipynb b/examples/vision/ipynb/mnist_moe.ipynb index 0d4c4a0ad4..931cb764c6 100644 --- a/examples/vision/ipynb/mnist_moe.ipynb +++ b/examples/vision/ipynb/mnist_moe.ipynb @@ -11,7 +11,7 @@ "**Author:** [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/)
\n", "**Date created:** 2015/06/19
\n", "**Last modified:** 2020/04/21
\n", - "**Description:** Showcasing concepts relates to Mixture of Experts (MoE)." + "**Description:** Simple MoE implementation for MNIST classification." ] }, { @@ -29,11 +29,10 @@ "At each forward pass, a gating network selects a subset of experts to apply to the input.\n", "\n", "The components to implement are:\n", + "\n", "- Gating network: A dense layer that outputs a probability distribution over the experts.\n", "- MoE layer: A layer that applies a different expert to each input in the batch. And a loss function that ensures specialization among the experts.\n", - "- Model: A simple model that uses the MoE layer.\n", - "\n", - "In this example, we will first implement a linear MoE layer and then a CNN-based MoE layer. Lastly we will combine the two using an abstract implementation to showcase its capacties." + "- Model: A simple model that uses the MoE layer." ] }, { @@ -120,7 +119,7 @@ "NUM_EXPERTS = 5\n", "TOP_K = 3\n", "BATCH_SIZE = 128\n", - "NUM_EPOCHS = 20\n", + "NUM_EPOCHS = 12\n", "LEARNING_RATE = 0.001\n", "" ] @@ -207,6 +206,7 @@ " mean=0.0, stddev=0.001\n", " ),\n", " bias_initializer=\"zeros\",\n", + " activation=\"softmax\",\n", " )\n", "\n", " self.num_experts = num_experts\n", @@ -216,33 +216,50 @@ " tf.zeros((num_experts,), dtype=tf.float32)\n", " )\n", "\n", - " def call(self, x):\n", - " # Get gating weights\n", - " gating_weights = self.gating_network(x)\n", + " def get_top_outputs(self, x, top_k_indices, top_k_weights):\n", + " batch_size = tf.shape(x)[0]\n", + " flat_indices = tf.reshape(top_k_indices, [-1])\n", + " repeated_x = tf.repeat(x, repeats=self.top_k, axis=0)\n", + "\n", + " # Compute outputs for unique experts\n", + " unique_expert_ids = tf.unique(flat_indices)[0] # Get unique expert indices\n", + " expert_outputs_dict = {}\n", + " for idx in unique_expert_ids:\n", + " mask = tf.equal(flat_indices, idx)\n", + " selected_inputs = tf.boolean_mask(repeated_x, mask)\n", + " expert_outputs_dict[idx.numpy()] = 
self.experts[idx](selected_inputs)\n", + "\n", + " # Gather outputs back into the correct shape\n", + " output_size = self.experts[0].compute_output_shape(input_shape=(None, 10))[-1]\n", + " flat_outputs = tf.zeros(\n", + " [batch_size * self.top_k, output_size], dtype=tf.float32\n", + " )\n", + " for idx in unique_expert_ids:\n", + " mask = tf.equal(flat_indices, idx)\n", + " indices = tf.where(mask)\n", + " flat_outputs = tf.tensor_scatter_nd_update(\n", + " flat_outputs, indices, expert_outputs_dict[idx.numpy()]\n", + " )\n", + " top_k_expert_outputs = tf.reshape(\n", + " flat_outputs, [batch_size, self.top_k, output_size]\n", + " )\n", "\n", - " # Get the top k experts based on the gating weights\n", - " top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k)\n", + " # Combine outputs using top-k weights\n", + " return tf.einsum(\"ijk,ij->ik\", top_k_expert_outputs, top_k_weights)\n", "\n", - " # Count usage of each expert symbolically\n", - " updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32)\n", - " # Use tf.tensor_scatter_nd_add to increment the usage count\n", + " def update_usage_counts(self, indices):\n", + " updates = tf.ones_like(tf.reshape(indices, [-1]), dtype=tf.float32)\n", " self.expert_usage_count.assign(\n", " tf.tensor_scatter_nd_add(\n", - " self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates\n", + " self.expert_usage_count, tf.reshape(indices, [-1, 1]), updates\n", " )\n", " )\n", "\n", - " # Get outputs from only the top-k experts\n", - " top_k_expert_outputs = tf.stack(\n", - " [\n", - " self.experts[expert_index](x)\n", - " for expert_index in top_k_indices.numpy()[0]\n", - " ],\n", - " axis=1,\n", - " ) # Stack outputs along axis 1\n", - "\n", - " # Combine outputs using top-k weights\n", - " combined_output = tf.einsum(\"ijk,ij->ik\", top_k_expert_outputs, top_k_weights)\n", + " def call(self, x):\n", + " gating_weights = self.gating_network(x)\n", + " top_k_weights, top_k_indices 
= tf.math.top_k(gating_weights, k=self.top_k)\n", + " combined_output = self.get_top_outputs(x, top_k_indices, top_k_weights)\n", + " self.update_usage_counts(top_k_indices)\n", "\n", " return combined_output\n", "" @@ -278,9 +295,12 @@ "source": [ "## Routing Collapse\n", "\n", - "Routing collapse is a problem that occurs with MoE layers. The route terminology refers to the selection process of which expert to use for a given input.\n", + "One common challenge with MoE architectures is \"routing collapse\". The \"route\" refers to the selection process of which expert to use for a given input where the model falls into a pattern of only using a small subset of experts. This happens because:\n", "\n", - "Route collapse happens when a routing model, early in training, starts favoring just a few experts because they perform slightly better due to random starting conditions. This leads to most examples being sent to these experts, leaving others unused and reducing the model\u2019s overall capacity.\n", + "1. Early in training, some experts may perform slightly better by chance\n", + "2. These better-performing experts get selected more frequently\n", + "3. With more practice, these experts improve further, creating a feedback loop\n", + "4. Other experts become neglected and never improve\n", "\n", "Code below demonstrates the randomness of expert selection:" ] @@ -312,14 +332,24 @@ "colab_type": "text" }, "source": [ - "### Adding loss functions to prevent route collapse\n", - "To fix this, the authors use extra rules (importance and load losses), ideas borrowed from [Shazeer et al.](https://arxiv.org/abs/1701.06538), to ensure all experts get used evenly.\n", + "### Load Balancing Solutions\n", + "\n", + "To prevent routing collapse, we implement three types of losses that were introduced in various MoE research:\n", + "\n", + "1. 
Diversity Loss: Encourages the gating network to use all experts by maximizing the entropy\n", + " of expert selection probabilities\n", + " [Shazeer et al., \"Outrageously Large Neural Networks\" (2017)](https://arxiv.org/abs/1701.06538)\n", "\n", - "The importance_loss calculates how much the usage of each expert (tracked in batch_importance_sum) deviates from the average usage (mean_importance) by using mean squared error, aiming to balance expert utilization. This helps prevent route collapse by discouraging the model from overloading a few experts, instead promoting an even distribution of examples across all experts to maintain diverse and effective routing.\n", + "2. Importance Loss: Ensures each expert handles a similar total amount of input across the batch\n", + " by penalizing deviations from the mean usage\n", + " [Lepikhin et al., \"GShard: Scaling Giant Models with Conditional Computation\" (2020)](https://arxiv.org/abs/2006.16668)\n", "\n", - "#### Load losses:\n", - " - Diversity loss: Diversity loss helps prevent route collapse by encouraging the routing model to evenly distribute examples across all experts, rather than favoring just a few due to their initial performance. It does this by maximizing the entropy of the gating weights, ensuring balanced expert utilization and improving the model's overall capacity.\n", - " - Overflow loss: The batch_overflow_sum measures how much the usage of experts exceeds a set capacity by applying ReLU to the difference between usage_counts (how many examples each expert handles) and batch_capacity (the allowed limit), then summing the excesses. This helps prevent route collapse by penalizing situations where certain experts are overused, encouraging a more even spread of examples across all experts to keep the model's capacity balanced." + "3. 
Overflow Loss: Prevents individual experts from being overloaded by penalizing usage above\n", + " a specified capacity threshold\n", + " [Fedus et al., \"Switch Transformers\" (2021)](https://arxiv.org/abs/2101.03961)\n", + "\n", + "These losses are combined with the main classification loss during training to ensure balanced expert utilization.\n", + "The combination of these techniques has proven effective in large-scale models like GShard and Switch Transformers." ] }, { @@ -358,6 +388,7 @@ " mean=0.0, stddev=0.001\n", " ),\n", " bias_initializer=\"zeros\",\n", + " activation=\"softmax\",\n", " )\n", "\n", " self.num_experts = num_experts\n", @@ -383,60 +414,52 @@ " )\n", " )\n", "\n", - " def call(self, x):\n", - " # Get gating weights and normalize\n", - " gating_weights = self.gating_network(x)\n", - " gating_weights = K.softmax(gating_weights) # Ensure weights are probabilities\n", - " self._diversity_loss(gating_weights)\n", - " self._importance_loss(gating_weights)\n", - "\n", - " # Get the top k experts based on the gating weights\n", + " # Replace the current get_top_outputs method with this vectorized version\n", + " def get_top_outputs(\n", + " self, x, gating_weights\n", + " ): # Changed to take gating_weights directly\n", + " \"\"\"Compute outputs from top-k experts.\"\"\"\n", " top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k)\n", "\n", - " # Count usage of each expert symbolically\n", - " updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32)\n", - " # Use tf.tensor_scatter_nd_add to increment the usage count\n", - " self.expert_usage_count.assign(\n", - " tf.tensor_scatter_nd_add(\n", - " self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates\n", - " )\n", - " )\n", - "\n", - " # Calculate overflow using updated usage count\n", - " self.batch_overflow_sum = K.sum(\n", - " K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity)\n", - " )\n", + " # Store indices and 
updates for usage count\n", + " self.indices = tf.reshape(top_k_indices, [-1, 1])\n", + " self.updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32)\n", "\n", - " # Compute all expert outputs\n", - " expert_outputs = tf.stack(\n", - " [expert(x) for expert in self.experts], axis=1\n", - " ) # Shape: (batch_size, num_experts, hidden_size)\n", - "\n", - " # Gather the top-k expert outputs using top_k_indices\n", + " # Compute expert outputs symbolically\n", + " expert_outputs = tf.stack([expert(x) for expert in self.experts], axis=1)\n", " batch_size = tf.shape(x)[0]\n", - " batch_indices = tf.expand_dims(\n", - " tf.range(batch_size), 1\n", - " ) # Shape: (batch_size, 1)\n", - " batch_indices = tf.tile(\n", - " batch_indices, [1, self.top_k]\n", - " ) # Shape: (batch_size, top_k)\n", - "\n", - " # Create indices for gathering\n", - " indices = tf.stack(\n", - " [batch_indices, top_k_indices], axis=2\n", - " ) # Shape: (batch_size, top_k, 2)\n", - " top_k_expert_outputs = tf.gather_nd(\n", - " expert_outputs, indices\n", - " ) # Shape: (batch_size, top_k, hidden_size)\n", + " batch_indices = tf.tile(tf.range(batch_size)[:, tf.newaxis], [1, self.top_k])\n", + " gather_indices = tf.stack([batch_indices, top_k_indices], axis=-1)\n", + " top_k_expert_outputs = tf.gather_nd(expert_outputs, gather_indices)\n", "\n", - " # Combine outputs using top-k weights\n", " combined_output = tf.reduce_sum(\n", - " top_k_expert_outputs * tf.expand_dims(top_k_weights, axis=-1), axis=1\n", + " top_k_expert_outputs * top_k_weights[:, :, tf.newaxis], axis=1\n", " )\n", + " return combined_output\n", + "\n", + " def update_usage_counts(self):\n", + " updates = tf.ones_like(tf.reshape(self.indices, [-1]), dtype=tf.float32)\n", + " self.expert_usage_count.assign(\n", + " tf.tensor_scatter_nd_add(\n", + " self.expert_usage_count, tf.reshape(self.indices, [-1, 1]), updates\n", + " )\n", + " )\n", + "\n", + " def call(self, x):\n", + " # Get gating weights and normalize\n", 
+ " gating_weights = self.gating_network(x)\n", + " # top_k_weights, top_k_indices = tf.nn.top_k(gating_weights, k=self.top_k)\n", + " combined_output = self.get_top_outputs(x, gating_weights)\n", + " self.update_usage_counts()\n", + " self._diversity_loss(gating_weights)\n", + " self._importance_loss(gating_weights)\n", "\n", " return combined_output\n", "\n", " def compute_total_loss(self, load_balance_coef=0.01):\n", + " self.batch_overflow_sum = K.sum(\n", + " K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity)\n", + " )\n", " return load_balance_coef * (\n", " self.diversity_loss + self.batch_overflow_sum + self.importance_loss\n", " )\n", @@ -462,7 +485,13 @@ "source": [ "\n", "class MoEModel(keras.Model):\n", - " def __init__(self, input_shape, num_classes, num_experts=NUM_EXPERTS, top_k=TOP_K):\n", + " def __init__(\n", + " self,\n", + " num_classes,\n", + " num_experts=NUM_EXPERTS,\n", + " top_k=TOP_K,\n", + " moe_loss_considered=True,\n", + " ):\n", " super(MoEModel, self).__init__()\n", "\n", " # Define the convolutional block\n", @@ -484,6 +513,7 @@ "\n", " # Softmax layer\n", " self.softmax = layers.Softmax()\n", + " self.moe_loss_considered = moe_loss_considered\n", "\n", " def call(self, inputs, training=False):\n", " conv_flatten = self.conv_block(inputs)\n", @@ -497,17 +527,21 @@ " with tf.GradientTape() as tape:\n", " y_pred = self(x, training=True)\n", " classification_loss = self.compute_loss(x, y, y_pred)\n", - " moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)\n", - " total_loss = classification_loss + moe_loss\n", + " if self.moe_loss_considered:\n", + " moe_loss = self.moe_classifier.compute_total_loss(\n", + " load_balance_coef=0.01\n", + " )\n", + " total_loss = classification_loss + moe_loss\n", + " else:\n", + " total_loss = classification_loss\n", "\n", " # Compute gradients\n", " gradients = tape.gradient(total_loss, self.trainable_variables)\n", "\n", " # Update weights\n", - " 
self.optimizer.apply_gradients(\n", - " zip(gradients, self.trainable_variables)\n", - " ) # Update metrics (e.g., accuracy)\n", - " self.compiled_metrics.update_state(y, y_pred)\n", + " self.optimizer.apply_gradients(zip(gradients, self.trainable_variables))\n", + " for metric in self.metrics:\n", + " metric.update_state(y, y_pred)\n", " # Return a dict of metrics for monitoring\n", " return {\n", " \"loss\": total_loss,\n", @@ -522,7 +556,8 @@ " moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)\n", " total_loss = classification_loss + moe_loss\n", "\n", - " self.compiled_metrics.update_state(y, y_pred)\n", + " for metric in self.metrics:\n", + " metric.update_state(y, y_pred)\n", " return {\n", " \"loss\": total_loss,\n", " \"moe_loss\": moe_loss,\n", @@ -532,9 +567,7 @@ "\n", "# Instantiate and compile the model\n", "inputs = keras.Input(shape=input_shape)\n", - "model = MoEModel(\n", - " input_shape=input_shape, num_classes=num_classes, num_experts=6, top_k=4\n", - ")\n", + "model = MoEModel(num_classes=num_classes, num_experts=5, top_k=3)\n", "\n", "model.compile(\n", " optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE),\n", @@ -566,6 +599,7 @@ " batch_size=BATCH_SIZE,\n", " epochs=NUM_EPOCHS,\n", " validation_data=(x_test, y_test),\n", + " verbose=0,\n", ")" ] }, @@ -590,6 +624,33 @@ "print(\"Test loss:\", score[0])\n", "print(\"Test accuracy:\", score[1])" ] + }, + { + "cell_type": "markdown", + "metadata": { + "colab_type": "text" + }, + "source": [ + "# Conclusion\n", + "\n", + "This example demonstrated how Mixture of Experts (MoE) can be used to increase model capacity without a proportional increase in computation cost. The key benefits are:\n", + "\n", + "1. Conditional Computation: Only a subset of experts (TOP_K=3 out of NUM_EXPERTS=5) process each input,\n", + " making the model more computationally efficient than a model that uses all parameters for every input.\n", + "\n", + "2. 
Specialized Processing: Each expert learns to handle different aspects of the input space,\n", + " allowing for more sophisticated processing without requiring a larger dense network.\n", + "\n", + "In our implementation, we:\n", + "1. Created a basic MoE layer using dense networks as experts\n", + "2. Implemented three types of load balancing losses to prevent routing collapse\n", + "3. Applied the MoE architecture to MNIST classification by replacing the final dense layer\n", + "4. Achieved comparable accuracy to the baseline model while using experts conditionally\n", + "\n", + "This approach is particularly valuable for large-scale models where computational efficiency\n", + "is crucial. The same principles demonstrated here are used in much larger language models\n", + "and other applications where model capacity needs to scale efficiently" + ] } ], "metadata": { diff --git a/examples/vision/md/mnist_moe.md b/examples/vision/md/mnist_moe.md index 9165321e16..2696176d0c 100644 --- a/examples/vision/md/mnist_moe.md +++ b/examples/vision/md/mnist_moe.md @@ -3,7 +3,7 @@ **Author:** [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/)
**Date created:** 2015/06/19
**Last modified:** 2020/04/21
-**Description:** Showcasing concepts relates to Mixture of Experts (MoE). +**Description:** Simple MoE implementation for MNIST classification. [**View in Colab**](https://colab.research.google.com/github/keras-team/keras-io/blob/master/examples/vision/ipynb/mnist_moe.ipynb) [**GitHub source**](https://github.com/keras-team/keras-io/blob/master/examples/vision/mnist_moe.py) @@ -19,12 +19,11 @@ Experts are identical blocks within a layer where each are trained to specialize At each forward pass, a gating network selects a subset of experts to apply to the input. The components to implement are: + - Gating network: A dense layer that outputs a probability distribution over the experts. - MoE layer: A layer that applies a different expert to each input in the batch. And a loss function that ensures specialization among the experts. - Model: A simple model that uses the MoE layer. -In this example, we will first implement a linear MoE layer and then a CNN-based MoE layer. Lastly we will combine the two using an abstract implementation to showcase its capacties. 
- --- ## Imports @@ -80,7 +79,7 @@ x_train shape: (60000, 28, 28, 1) NUM_EXPERTS = 5 TOP_K = 3 BATCH_SIZE = 128 -NUM_EPOCHS = 20 +NUM_EPOCHS = 12 LEARNING_RATE = 0.001 ``` @@ -188,6 +187,7 @@ class LinearMoE(layers.Layer): mean=0.0, stddev=0.001 ), bias_initializer="zeros", + activation="softmax", ) self.num_experts = num_experts @@ -197,33 +197,50 @@ class LinearMoE(layers.Layer): tf.zeros((num_experts,), dtype=tf.float32) ) - def call(self, x): - # Get gating weights - gating_weights = self.gating_network(x) + def get_top_outputs(self, x, top_k_indices, top_k_weights): + batch_size = tf.shape(x)[0] + flat_indices = tf.reshape(top_k_indices, [-1]) + repeated_x = tf.repeat(x, repeats=self.top_k, axis=0) + + # Compute outputs for unique experts + unique_expert_ids = tf.unique(flat_indices)[0] # Get unique expert indices + expert_outputs_dict = {} + for idx in unique_expert_ids: + mask = tf.equal(flat_indices, idx) + selected_inputs = tf.boolean_mask(repeated_x, mask) + expert_outputs_dict[idx.numpy()] = self.experts[idx](selected_inputs) + + # Gather outputs back into the correct shape + output_size = self.experts[0].compute_output_shape(input_shape=(None, 10))[-1] + flat_outputs = tf.zeros( + [batch_size * self.top_k, output_size], dtype=tf.float32 + ) + for idx in unique_expert_ids: + mask = tf.equal(flat_indices, idx) + indices = tf.where(mask) + flat_outputs = tf.tensor_scatter_nd_update( + flat_outputs, indices, expert_outputs_dict[idx.numpy()] + ) + top_k_expert_outputs = tf.reshape( + flat_outputs, [batch_size, self.top_k, output_size] + ) - # Get the top k experts based on the gating weights - top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + # Combine outputs using top-k weights + return tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights) - # Count usage of each expert symbolically - updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - # Use tf.tensor_scatter_nd_add to increment the usage count + 
def update_usage_counts(self, indices): + updates = tf.ones_like(tf.reshape(indices, [-1]), dtype=tf.float32) self.expert_usage_count.assign( tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates + self.expert_usage_count, tf.reshape(indices, [-1, 1]), updates ) ) - # Get outputs from only the top-k experts - top_k_expert_outputs = tf.stack( - [ - self.experts[expert_index](x) - for expert_index in top_k_indices.numpy()[0] - ], - axis=1, - ) # Stack outputs along axis 1 - - # Combine outputs using top-k weights - combined_output = tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights) + def call(self, x): + gating_weights = self.gating_network(x) + top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + combined_output = self.get_top_outputs(x, top_k_indices, top_k_weights) + self.update_usage_counts(top_k_indices) return combined_output @@ -244,14 +261,14 @@ linear_mode(sample_data)
``` ``` @@ -259,9 +276,12 @@ array([[ 7.8205157e-06, -9.6895346e-06, -1.3567289e-06, -1.1132732e-05, --- ## Routing Collapse -Routing collapse is a problem that occurs with MoE layers. The route terminology refers to the selection process of which expert to use for a given input. +One common challenge with MoE architectures is "routing collapse". The "route" refers to the selection process of which expert to use for a given input where the model falls into a pattern of only using a small subset of experts. This happens because: -Route collapse happens when a routing model, early in training, starts favoring just a few experts because they perform slightly better due to random starting conditions. This leads to most examples being sent to these experts, leaving others unused and reducing the model’s overall capacity. +1. Early in training, some experts may perform slightly better by chance +2. These better-performing experts get selected more frequently +3. With more practice, these experts improve further, creating a feedback loop +4. Other experts become neglected and never improve Code below demonstrates the randomness of expert selection: @@ -282,21 +302,31 @@ check_expert_usage(4)
``` -Run 0, Expert usage: [1. 0. 1. 1. 0.] -Run 1, Expert usage: [0. 1. 1. 0. 1.] -Run 2, Expert usage: [1. 1. 0. 1. 0.] -Run 3, Expert usage: [1. 0. 1. 1. 0.] +Run 0, Expert usage: [1. 1. 0. 0. 1.] +Run 1, Expert usage: [1. 1. 1. 0. 0.] +Run 2, Expert usage: [0. 1. 1. 0. 1.] +Run 3, Expert usage: [0. 1. 1. 1. 0.] ```
-### Adding loss functions to prevent route collapse -To fix this, the authors use extra rules (importance and load losses), ideas borrowed from [Shazeer et al.](https://arxiv.org/abs/1701.06538), to ensure all experts get used evenly. +### Load Balancing Solutions -The importance_loss calculates how much the usage of each expert (tracked in batch_importance_sum) deviates from the average usage (mean_importance) by using mean squared error, aiming to balance expert utilization. This helps prevent route collapse by discouraging the model from overloading a few experts, instead promoting an even distribution of examples across all experts to maintain diverse and effective routing. +To prevent routing collapse, we implement three types of losses that were introduced in various MoE research: -#### Load losses: - - Diversity loss: Diversity loss helps prevent route collapse by encouraging the routing model to evenly distribute examples across all experts, rather than favoring just a few due to their initial performance. It does this by maximizing the entropy of the gating weights, ensuring balanced expert utilization and improving the model's overall capacity. - - Overflow loss: The batch_overflow_sum measures how much the usage of experts exceeds a set capacity by applying ReLU to the difference between usage_counts (how many examples each expert handles) and batch_capacity (the allowed limit), then summing the excesses. This helps prevent route collapse by penalizing situations where certain experts are overused, encouraging a more even spread of examples across all experts to keep the model's capacity balanced. +1. Diversity Loss: Encourages the gating network to use all experts by maximizing the entropy + of expert selection probabilities + [Shazeer et al., "Outrageously Large Neural Networks" (2017)](https://arxiv.org/abs/1701.06538) + +2. 
Importance Loss: Ensures each expert handles a similar total amount of input across the batch + by penalizing deviations from the mean usage + [Lepikhin et al., "GShard: Scaling Giant Models with Conditional Computation" (2020)](https://arxiv.org/abs/2006.16668) + +3. Overflow Loss: Prevents individual experts from being overloaded by penalizing usage above + a specified capacity threshold + [Fedus et al., "Switch Transformers" (2021)](https://arxiv.org/abs/2101.03961) + +These losses are combined with the main classification loss during training to ensure balanced expert utilization. +The combination of these techniques has proven effective in large-scale models like GShard and Switch Transformers. ```python @@ -328,6 +358,7 @@ class LinearMoE(layers.Layer): mean=0.0, stddev=0.001 ), bias_initializer="zeros", + activation="softmax", ) self.num_experts = num_experts @@ -353,60 +384,52 @@ class LinearMoE(layers.Layer): ) ) - def call(self, x): - # Get gating weights and normalize - gating_weights = self.gating_network(x) - gating_weights = K.softmax(gating_weights) # Ensure weights are probabilities - self._diversity_loss(gating_weights) - self._importance_loss(gating_weights) - - # Get the top k experts based on the gating weights + # Replace the current get_top_outputs method with this vectorized version + def get_top_outputs( + self, x, gating_weights + ): # Changed to take gating_weights directly + """Compute outputs from top-k experts.""" top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) - # Count usage of each expert symbolically - updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - # Use tf.tensor_scatter_nd_add to increment the usage count - self.expert_usage_count.assign( - tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates - ) - ) - - # Calculate overflow using updated usage count - self.batch_overflow_sum = K.sum( - 
K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) - ) - - # Compute all expert outputs - expert_outputs = tf.stack( - [expert(x) for expert in self.experts], axis=1 - ) # Shape: (batch_size, num_experts, hidden_size) + # Store indices and updates for usage count + self.indices = tf.reshape(top_k_indices, [-1, 1]) + self.updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - # Gather the top-k expert outputs using top_k_indices + # Compute expert outputs symbolically + expert_outputs = tf.stack([expert(x) for expert in self.experts], axis=1) batch_size = tf.shape(x)[0] - batch_indices = tf.expand_dims( - tf.range(batch_size), 1 - ) # Shape: (batch_size, 1) - batch_indices = tf.tile( - batch_indices, [1, self.top_k] - ) # Shape: (batch_size, top_k) + batch_indices = tf.tile(tf.range(batch_size)[:, tf.newaxis], [1, self.top_k]) + gather_indices = tf.stack([batch_indices, top_k_indices], axis=-1) + top_k_expert_outputs = tf.gather_nd(expert_outputs, gather_indices) - # Create indices for gathering - indices = tf.stack( - [batch_indices, top_k_indices], axis=2 - ) # Shape: (batch_size, top_k, 2) - top_k_expert_outputs = tf.gather_nd( - expert_outputs, indices - ) # Shape: (batch_size, top_k, hidden_size) - - # Combine outputs using top-k weights combined_output = tf.reduce_sum( - top_k_expert_outputs * tf.expand_dims(top_k_weights, axis=-1), axis=1 + top_k_expert_outputs * top_k_weights[:, :, tf.newaxis], axis=1 + ) + return combined_output + + def update_usage_counts(self): + updates = tf.ones_like(tf.reshape(self.indices, [-1]), dtype=tf.float32) + self.expert_usage_count.assign( + tf.tensor_scatter_nd_add( + self.expert_usage_count, tf.reshape(self.indices, [-1, 1]), updates + ) ) + def call(self, x): + # Get gating weights and normalize + gating_weights = self.gating_network(x) + # top_k_weights, top_k_indices = tf.nn.top_k(gating_weights, k=self.top_k) + combined_output = self.get_top_outputs(x, gating_weights) + 
self.update_usage_counts() + self._diversity_loss(gating_weights) + self._importance_loss(gating_weights) + return combined_output def compute_total_loss(self, load_balance_coef=0.01): + self.batch_overflow_sum = K.sum( + K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) + ) return load_balance_coef * ( self.diversity_loss + self.batch_overflow_sum + self.importance_loss ) @@ -420,7 +443,13 @@ class LinearMoE(layers.Layer): ```python class MoEModel(keras.Model): - def __init__(self, input_shape, num_classes, num_experts=NUM_EXPERTS, top_k=TOP_K): + def __init__( + self, + num_classes, + num_experts=NUM_EXPERTS, + top_k=TOP_K, + moe_loss_considered=True, + ): super(MoEModel, self).__init__() # Define the convolutional block @@ -442,6 +471,7 @@ class MoEModel(keras.Model): # Softmax layer self.softmax = layers.Softmax() + self.moe_loss_considered = moe_loss_considered def call(self, inputs, training=False): conv_flatten = self.conv_block(inputs) @@ -455,17 +485,21 @@ class MoEModel(keras.Model): with tf.GradientTape() as tape: y_pred = self(x, training=True) classification_loss = self.compute_loss(x, y, y_pred) - moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01) - total_loss = classification_loss + moe_loss + if self.moe_loss_considered: + moe_loss = self.moe_classifier.compute_total_loss( + load_balance_coef=0.01 + ) + total_loss = classification_loss + moe_loss + else: + total_loss = classification_loss # Compute gradients gradients = tape.gradient(total_loss, self.trainable_variables) # Update weights - self.optimizer.apply_gradients( - zip(gradients, self.trainable_variables) - ) # Update metrics (e.g., accuracy) - self.compiled_metrics.update_state(y, y_pred) + self.optimizer.apply_gradients(zip(gradients, self.trainable_variables)) + for metric in self.metrics: + metric.update_state(y, y_pred) # Return a dict of metrics for monitoring return { "loss": total_loss, @@ -480,7 +514,8 @@ class MoEModel(keras.Model): 
         moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)
         total_loss = classification_loss + moe_loss
-        self.compiled_metrics.update_state(y, y_pred)
+        for metric in self.metrics:
+            metric.update_state(y, y_pred)
         return {
             "loss": total_loss,
             "moe_loss": moe_loss,
@@ -490,9 +525,7 @@ class MoEModel(keras.Model):

 # Instantiate and compile the model
 inputs = keras.Input(shape=input_shape)
-model = MoEModel(
-    input_shape=input_shape, num_classes=num_classes, num_experts=6, top_k=4
-)
+model = MoEModel(num_classes=num_classes, num_experts=5, top_k=3)

 model.compile(
     optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE),
@@ -511,22361 +544,42 @@ history = model.fit(
     batch_size=BATCH_SIZE,
     epochs=NUM_EPOCHS,
     validation_data=(x_test, y_test),
+    verbose=0,
 )
 ```
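The patch above wires `_diversity_loss` and `_importance_loss` into `compute_total_loss` without showing their bodies. As a point of reference, a minimal NumPy sketch of one common choice of importance loss, the squared coefficient of variation over per-expert importance from Shazeer et al.; the example's actual `_importance_loss` may be implemented differently:

```python
import numpy as np


def importance_loss(gating_weights):
    """Squared coefficient of variation of per-expert importance.

    gating_weights: (batch, num_experts) array of gate values.
    Importance of an expert is its total gate weight over the batch;
    the loss is zero when all experts receive equal total weight and
    grows as routing collapses onto a few experts.
    """
    importance = gating_weights.sum(axis=0)  # (num_experts,)
    # Small epsilon guards against division by zero for empty batches.
    return importance.var() / (importance.mean() ** 2 + 1e-10)
```

With uniform gates the loss is zero; if every input routes to a single expert it grows large, which is what the `load_balance_coef`-weighted penalty is meant to discourage.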
+### Evaluation
+
+```python
+score = model.evaluate(x_test, y_test, verbose=0)
+print("Test loss:", score[0])
+print("Test accuracy:", score[1])
+```
+
+```
+Test loss: tf.Tensor(0.9811909, shape=(), dtype=float32)
+Test accuracy: {'accuracy': }
+```
+
+# Conclusion
+
+This example demonstrated how Mixture of Experts (MoE) can be used to increase model capacity without a proportional increase in computational cost. The key benefits are:
+
+1. Conditional computation: only a subset of experts (TOP_K=3 out of NUM_EXPERTS=5) processes each input,
+   making the model more computationally efficient than one that uses all parameters for every input.
+
+2. Specialized processing: each expert learns to handle a different part of the input space,
+   allowing for more sophisticated processing without requiring a larger dense network.
+
+In our implementation, we:
+1. Created a basic MoE layer using dense networks as experts
+2. Implemented three types of load-balancing losses to prevent routing collapse
+3. Applied the MoE architecture to MNIST classification by replacing the final dense layer
+4. Achieved accuracy comparable to the baseline model while using experts conditionally
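The conditional-computation point in the conclusion can be made concrete with a small NumPy sketch of top-k gating: a softmax over expert logits, keep only the k largest gate values per input, renormalize, and combine expert outputs. The function names and linear experts here are illustrative, not the example's actual layer:

```python
import numpy as np


def top_k_gating(logits, k):
    """Sparse gate: keep each row's top-k softmax weights, renormalized.

    logits: (batch, num_experts) raw gating scores.
    Returns (batch, num_experts) weights with exactly k nonzeros per row
    (assuming no ties), summing to 1 per row.
    """
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = exp / exp.sum(axis=-1, keepdims=True)
    # k-th largest probability per row, used as the keep threshold
    kth = np.sort(probs, axis=-1)[:, -k][:, None]
    sparse = np.where(probs >= kth, probs, 0.0)
    return sparse / sparse.sum(axis=-1, keepdims=True)


def moe_forward(x, expert_weights, gating_logits, k):
    """Weighted sum of (linear) expert outputs under sparse gates."""
    gates = top_k_gating(gating_logits, k)  # (batch, E)
    outs = np.stack([x @ w for w in expert_weights])  # (E, batch, D)
    return np.einsum("be,ebd->bd", gates, outs)
```

In a real implementation only the selected experts would be evaluated, which is where the compute savings come from; this dense sketch evaluates all of them for clarity.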
- 35/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9738 - loss: 0.1000 - moe_loss: 5690.9282 - -
-``` - -``` -
- 39/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9740 - loss: 0.1000 - moe_loss: 5701.1680 - -
-``` - -``` -
- 43/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9742 - loss: 0.1000 - moe_loss: 5711.4087 - -
-``` - -``` -
- 47/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9742 - loss: 0.1000 - moe_loss: 5721.6470 - -
-``` - -``` -
- 51/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9743 - loss: 0.1000 - moe_loss: 5731.8843 - -
-``` - -``` -
- 54/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9743 - loss: 0.1000 - moe_loss: 5739.5645 - -
-``` - -``` -
- 58/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9744 - loss: 0.1000 - moe_loss: 5749.8052 - -
-``` - -``` -
- 61/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9745 - loss: 0.1000 - moe_loss: 5757.4844 - -
-``` - -``` -
- 65/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9746 - loss: 0.1000 - moe_loss: 5767.7251 - -
-``` - -``` -
- 69/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9747 - loss: 0.1000 - moe_loss: 5777.9648 - -
-``` - -``` -
- 73/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9748 - loss: 0.1000 - moe_loss: 5788.2041 - -
-``` - -``` -
- 77/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9749 - loss: 0.1000 - moe_loss: 5798.4434 - -
-``` - -``` -
- 81/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9750 - loss: 0.1000 - moe_loss: 5808.6831 - -
-``` - -``` -
- 84/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9751 - loss: 0.1000 - moe_loss: 5816.3623 - -
-``` - -``` -
- 88/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9752 - loss: 0.1000 - moe_loss: 5826.6025 - -
-``` - -``` -
- 92/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9752 - loss: 0.1000 - moe_loss: 5836.8413 - -
-``` - -``` -
- 96/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9753 - loss: 0.1000 - moe_loss: 5847.0811 - -
-``` - -``` -
- 100/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9753 - loss: 0.1000 - moe_loss: 5857.3213 - -
-``` - -``` -
- 104/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9754 - loss: 0.1000 - moe_loss: 5867.5610 - -
-``` - -``` -
- 108/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9755 - loss: 0.1000 - moe_loss: 5877.8013 - -
-``` - -``` -
- 111/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9755 - loss: 0.1000 - moe_loss: 5885.4810 - -
-``` - -``` -
- 115/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9756 - loss: 0.1000 - moe_loss: 5895.7212 - -
-``` - -``` -
- 119/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9756 - loss: 0.1000 - moe_loss: 5905.9614 - -
-``` - -``` -
- 122/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9757 - loss: 0.1000 - moe_loss: 5913.6421 - -
-``` - -``` -
- 126/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9758 - loss: 0.1000 - moe_loss: 5923.8813 - -
-``` - -``` -
- 129/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9758 - loss: 0.1000 - moe_loss: 5931.5615 - -
-``` - -``` -
- 132/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9758 - loss: 0.1000 - moe_loss: 5939.2412 - -
-``` - -``` -
- 136/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9759 - loss: 0.1000 - moe_loss: 5949.4810 - -
-``` - -``` -
- 140/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9759 - loss: 0.1000 - moe_loss: 5959.7207 - -
-``` - -``` -
- 144/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9760 - loss: 0.1000 - moe_loss: 5969.9600 - -
-``` - -``` -
- 148/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9760 - loss: 0.1000 - moe_loss: 5980.2007 - -
-``` - -``` -
- 152/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9761 - loss: 0.1000 - moe_loss: 5990.4404 - -
-``` - -``` -
- 156/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9761 - loss: 0.1000 - moe_loss: 6000.6802 - -
-``` - -``` -
- 160/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9762 - loss: 0.1000 - moe_loss: 6010.9199 - -
-``` - -``` -
- 164/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9762 - loss: 0.1000 - moe_loss: 6021.1602 - -
-``` - -``` -
- 168/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9763 - loss: 0.1000 - moe_loss: 6031.3994 - -
-``` - -``` -
- 172/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9763 - loss: 0.1000 - moe_loss: 6041.6392 - -
-``` - -``` -
- 173/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9763 - loss: 0.1000 - moe_loss: 6044.1992 - -
-``` - -``` -
- 176/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9764 - loss: 0.1000 - moe_loss: 6051.8794 - -
-``` - -``` -
- 179/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9764 - loss: 0.1000 - moe_loss: 6059.5596 - -
-``` - -``` -
- 183/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9765 - loss: 0.1000 - moe_loss: 6069.7998 - -
-``` - -``` -
- 187/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9765 - loss: 0.1000 - moe_loss: 6080.0405 - -
-``` - -``` -
- 191/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9765 - loss: 0.1000 - moe_loss: 6090.2808 - -
-``` - -``` -
- 195/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9766 - loss: 0.1000 - moe_loss: 6100.5200 - -
-``` - -``` -
- 199/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9766 - loss: 0.1000 - moe_loss: 6110.7603 - -
-``` - -``` -
- 203/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9767 - loss: 0.1000 - moe_loss: 6120.9995 - -
-``` - -``` -
- 207/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9767 - loss: 0.1000 - moe_loss: 6131.2402 - -
-``` - -``` -
- 211/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9768 - loss: 0.1000 - moe_loss: 6141.4800 - -
-``` - -``` -
- 215/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9768 - loss: 0.1000 - moe_loss: 6151.7197 - -
-``` - -``` -
- 219/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9769 - loss: 0.1000 - moe_loss: 6161.9600 - -
-``` - -``` -
- 223/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9769 - loss: 0.1000 - moe_loss: 6172.1992 - -
-``` - -``` -
- 227/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9769 - loss: 0.1000 - moe_loss: 6182.4390 - -
-``` - -``` -
- 231/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9770 - loss: 0.1000 - moe_loss: 6192.6792 - -
-``` - -``` -
- 235/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9770 - loss: 0.1000 - moe_loss: 6202.9194 - -
-``` - -``` -
- 239/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9771 - loss: 0.1000 - moe_loss: 6213.1592 - -
-``` - -``` -
- 243/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9771 - loss: 0.1000 - moe_loss: 6223.3989 - -
-``` - -``` -
- 246/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9771 - loss: 0.1000 - moe_loss: 6231.0786 - -
-``` - -``` -
- 250/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9772 - loss: 0.1000 - moe_loss: 6241.3188 - -
-``` - -``` -
- 253/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9772 - loss: 0.1000 - moe_loss: 6248.9990 - -
-``` - -``` -
- 256/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9772 - loss: 0.1000 - moe_loss: 6256.6792 - -
-``` - -``` -
- 260/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9773 - loss: 0.1000 - moe_loss: 6266.9189 - -
-``` - -``` -
- 264/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9773 - loss: 0.1000 - moe_loss: 6277.1587 - -
-``` - -``` -
- 267/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9773 - loss: 0.1000 - moe_loss: 6284.8384 - -
-``` - -``` -
- 270/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9773 - loss: 0.1000 - moe_loss: 6292.5186 - -
-``` - -``` -
- 273/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9773 - loss: 0.1000 - moe_loss: 6300.1987 - -
-``` - -``` -
- 276/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9774 - loss: 0.1000 - moe_loss: 6307.8789 - -
-``` - -``` -
- 279/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9774 - loss: 0.1000 - moe_loss: 6315.5586 - -
-``` - -``` -
- 282/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9774 - loss: 0.1000 - moe_loss: 6323.2388 - -
-``` - -``` -
- 286/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9774 - loss: 0.1000 - moe_loss: 6333.4790 - -
-``` - -``` -
- 290/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9775 - loss: 0.1000 - moe_loss: 6343.7188 - -
-``` - -``` -
- 294/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9775 - loss: 0.1000 - moe_loss: 6353.9590 - -
-``` - -``` -
- 298/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9775 - loss: 0.1000 - moe_loss: 6364.1992 - -
-``` - -``` -
- 302/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9776 - loss: 0.1000 - moe_loss: 6374.4390 - -
-``` - -``` -
- 305/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9776 - loss: 0.1000 - moe_loss: 6382.1191 - -
-``` - -``` -
- 309/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9776 - loss: 0.1000 - moe_loss: 6392.3589 - -
-``` - -``` -
- 313/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9776 - loss: 0.1000 - moe_loss: 6402.5991 - -
-``` - -``` -
- 317/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9777 - loss: 0.1000 - moe_loss: 6412.8389 - -
-``` - -``` -
- 321/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9777 - loss: 0.1000 - moe_loss: 6423.0786 - -
-``` - -``` -
- 325/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9777 - loss: 0.1000 - moe_loss: 6433.3184 - -
-``` - -``` -
- 329/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9778 - loss: 0.1000 - moe_loss: 6443.5581 - -
-``` - -``` -
- 333/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9778 - loss: 0.1000 - moe_loss: 6453.7983 - -
-``` - -``` -
- 336/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9778 - loss: 0.1000 - moe_loss: 6461.4780 - -
-``` - -``` -
- 340/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9778 - loss: 0.1000 - moe_loss: 6471.7178 - -
-``` - -``` -
- 344/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9779 - loss: 0.1000 - moe_loss: 6481.9580 - -
-``` - -``` -
- 348/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9779 - loss: 0.1000 - moe_loss: 6492.1978 - -
-``` - -``` -
- 352/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9779 - loss: 0.1000 - moe_loss: 6502.4375 - -
-``` - -``` -
- 356/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9779 - loss: 0.1000 - moe_loss: 6512.6777 - -
-``` - -``` -
- 360/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9780 - loss: 0.1000 - moe_loss: 6522.9180 - -
-``` - -``` -
- 364/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9780 - loss: 0.1000 - moe_loss: 6533.1577 - -
-``` - -``` -
- 367/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9780 - loss: 0.1000 - moe_loss: 6540.8379 - -
-``` - -``` -
- 371/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9780 - loss: 0.1000 - moe_loss: 6551.0776 - -
-``` - -``` -
- 375/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9780 - loss: 0.1000 - moe_loss: 6561.3174 - -
-``` - -``` -
- 379/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9781 - loss: 0.1000 - moe_loss: 6571.5576 - -
-``` - -``` -
- 383/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9781 - loss: 0.1000 - moe_loss: 6581.7974 - -
-``` - -``` -
- 387/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9781 - loss: 0.1000 - moe_loss: 6592.0371 - -
-``` - -``` -
- 391/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9781 - loss: 0.1000 - moe_loss: 6602.2773 - -
-``` - -``` -
- 395/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9781 - loss: 0.1000 - moe_loss: 6612.5176 - -
-``` - -``` -
- 398/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9782 - loss: 0.1000 - moe_loss: 6620.1973 - -
-``` - -``` -
- 402/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9782 - loss: 0.1000 - moe_loss: 6630.4375 - -
-``` - -``` -
- 405/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9782 - loss: 0.1000 - moe_loss: 6638.1172 - -
-``` - -``` -
- 409/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9782 - loss: 0.1000 - moe_loss: 6648.3569 - -
-``` - -``` -
- 413/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9782 - loss: 0.1000 - moe_loss: 6658.5972 - -
-``` - -``` -
- 416/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9782 - loss: 0.1000 - moe_loss: 6666.2769 - -
-``` - -``` -
- 419/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 6673.9570 - -
-``` - -``` -
- 423/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 6684.1973 - -
-``` - -``` -
- 426/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 6691.8770 - -
-``` - -``` -
- 429/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 6699.5571 - -
-``` - -``` -
- 433/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 6709.7969 - -
-``` - -``` -
- 437/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 6720.0366 - -
-``` - -``` -
- 441/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 6730.2764 - -
-``` - -``` -
- 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 6740.5166 - -
-``` - -``` -
- 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 6750.7563 - -
-``` - -``` -
- 453/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 6760.9961 - -
-``` - -``` -
- 457/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 6771.2363 - -
-``` - -``` -
- 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 6781.4766 - -
-``` - -``` -
- 465/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 6791.7163 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 6801.9536 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9785 - loss: 0.1000 - moe_loss: 6804.5000 - val_loss: 0.1000 - val_moe_loss: 8398.7275 - - -
-``` -Epoch 4/20 - -``` -
- - 1/469 ━━━━━━━━━━━━━━━━━━━━ 12s 26ms/step - accuracy: 0.9766 - loss: 0.1000 - moe_loss: 8403.8486 - -
-``` - -``` -
- 5/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 8414.1064 - -
-``` - -``` -
- 9/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9777 - loss: 0.1000 - moe_loss: 8424.3496 - -
-``` - -``` -
- 13/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9775 - loss: 0.1000 - moe_loss: 8434.5850 - -
-``` - -``` -
- 17/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9771 - loss: 0.1000 - moe_loss: 8444.8232 - -
-``` - -``` -
- 21/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9770 - loss: 0.1000 - moe_loss: 8455.0625 - -
-``` - -``` -
- 25/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9772 - loss: 0.1000 - moe_loss: 8465.3047 - -
-``` - -``` -
- 28/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9773 - loss: 0.1000 - moe_loss: 8472.9844 - -
-``` - -``` -
- 32/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9774 - loss: 0.1000 - moe_loss: 8483.2256 - -
-``` - -``` -
- 36/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9776 - loss: 0.1000 - moe_loss: 8493.4678 - -
-``` - -``` -
- 40/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9779 - loss: 0.1000 - moe_loss: 8503.7090 - -
-``` - -``` -
- 44/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9781 - loss: 0.1000 - moe_loss: 8513.9502 - -
-``` - -``` -
- 48/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9782 - loss: 0.1000 - moe_loss: 8524.1924 - -
-``` - -``` -
- 52/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9783 - loss: 0.1000 - moe_loss: 8534.4336 - -
-``` - -``` -
- 56/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9784 - loss: 0.1000 - moe_loss: 8544.6738 - -
-``` - -``` -
- 60/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9786 - loss: 0.1000 - moe_loss: 8554.9131 - -
-``` - -``` -
- 64/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9787 - loss: 0.1000 - moe_loss: 8565.1514 - -
-``` - -``` -
- 68/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9789 - loss: 0.1000 - moe_loss: 8575.3916 - -
-``` - -``` -
- 72/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9790 - loss: 0.1000 - moe_loss: 8585.6318 - -
-``` - -``` -
- 75/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9791 - loss: 0.1000 - moe_loss: 8593.3125 - -
-``` - -``` -
- 79/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9792 - loss: 0.1000 - moe_loss: 8603.5527 - -
-``` - -``` -
- 83/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9793 - loss: 0.1000 - moe_loss: 8613.7930 - -
-``` - -``` -
- 87/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9793 - loss: 0.1000 - moe_loss: 8624.0332 - -
-``` - -``` -
- 90/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9794 - loss: 0.1000 - moe_loss: 8631.7139 - -
-``` - -``` -
- 94/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9794 - loss: 0.1000 - moe_loss: 8641.9541 - -
-``` - -``` -
- 98/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9795 - loss: 0.1000 - moe_loss: 8652.1943 - -
-``` - -``` -
- 101/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9795 - loss: 0.1000 - moe_loss: 8659.8740 - -
-``` - -``` -
- 104/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9796 - loss: 0.1000 - moe_loss: 8667.5547 - -
-``` - -``` -
- 108/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9797 - loss: 0.1000 - moe_loss: 8677.7939 - -
-``` - -``` -
- 111/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9797 - loss: 0.1000 - moe_loss: 8685.4736 - -
-``` - -``` -
- 115/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9798 - loss: 0.1000 - moe_loss: 8695.7139 - -
-``` - -``` -
- 119/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9799 - loss: 0.1000 - moe_loss: 8705.9541 - -
-``` - -``` -
- 122/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9799 - loss: 0.1000 - moe_loss: 8713.6348 - -
-``` - -``` -
- 126/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9800 - loss: 0.1000 - moe_loss: 8723.8750 - -
-``` - -``` -
- 130/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9800 - loss: 0.1000 - moe_loss: 8734.1143 - -
-``` - -``` -
- 134/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9801 - loss: 0.1000 - moe_loss: 8744.3545 - -
-``` - -``` -
- 138/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9802 - loss: 0.1000 - moe_loss: 8754.5947 - -
-``` - -``` -
- 142/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9802 - loss: 0.1000 - moe_loss: 8764.8350 - -
-``` - -``` -
- 146/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9803 - loss: 0.1000 - moe_loss: 8775.0742 - -
-``` - -``` -
- 149/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9803 - loss: 0.1000 - moe_loss: 8782.7549 - -
-``` - -``` -
- 152/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9804 - loss: 0.1000 - moe_loss: 8790.4346 - -
-``` - -``` -
- 156/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9805 - loss: 0.1000 - moe_loss: 8800.6738 - -
-``` - -``` -
- 160/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9805 - loss: 0.1000 - moe_loss: 8810.9141 - -
-``` - -``` -
- 164/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9806 - loss: 0.1000 - moe_loss: 8821.1533 - -
-``` - -``` -
- 168/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9806 - loss: 0.1000 - moe_loss: 8831.3936 - -
-``` - -``` -
- 172/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9806 - loss: 0.1000 - moe_loss: 8841.6328 - -
-``` - -``` -
- 176/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9807 - loss: 0.1000 - moe_loss: 8851.8730 - -
-``` - -``` -
- 180/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9807 - loss: 0.1000 - moe_loss: 8862.1123 - -
-``` - -``` -
- 184/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9808 - loss: 0.1000 - moe_loss: 8872.3525 - -
-``` - -``` -
- 188/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9808 - loss: 0.1000 - moe_loss: 8882.5928 - -
-``` - -``` -
- 192/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9809 - loss: 0.1000 - moe_loss: 8892.8330 - -
-``` - -``` -
- 196/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9809 - loss: 0.1000 - moe_loss: 8903.0732 - -
-``` - -``` -
- 200/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9809 - loss: 0.1000 - moe_loss: 8913.3135 - -
-``` - -``` -
- 204/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9810 - loss: 0.1000 - moe_loss: 8923.5537 - -
-``` - -``` -
- 207/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9810 - loss: 0.1000 - moe_loss: 8931.2334 - -
-``` - -``` -
- 210/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9810 - loss: 0.1000 - moe_loss: 8938.9131 - -
-``` - -``` -
- 214/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9810 - loss: 0.1000 - moe_loss: 8949.1523 - -
-``` - -``` -
- 218/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9811 - loss: 0.1000 - moe_loss: 8959.3926 - -
-``` - -``` -
- 222/469 ━━━━━━━━━━━━━━━━━━━━ 4s 16ms/step - accuracy: 0.9811 - loss: 0.1000 - moe_loss: 8969.6318 - -
-``` - -``` -
- 226/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9811 - loss: 0.1000 - moe_loss: 8979.8721 - -
-``` - -``` -
- 229/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9811 - loss: 0.1000 - moe_loss: 8987.5518 - -
-``` - -``` -
- 233/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9812 - loss: 0.1000 - moe_loss: 8997.7920 - -
-``` - -``` -
- 236/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9812 - loss: 0.1000 - moe_loss: 9005.4717 - -
-``` - -``` -
- 240/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9812 - loss: 0.1000 - moe_loss: 9015.7119 - -
-``` - -``` -
- 244/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9812 - loss: 0.1000 - moe_loss: 9025.9521 - -
-``` - -``` -
- 248/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9813 - loss: 0.1000 - moe_loss: 9036.1914 - -
-``` - -``` -
- 252/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9813 - loss: 0.1000 - moe_loss: 9046.4316 - -
-``` - -``` -
- 255/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9813 - loss: 0.1000 - moe_loss: 9054.1113 - -
-``` - -``` -
- 258/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9813 - loss: 0.1000 - moe_loss: 9061.7910 - -
-``` - -``` -
- 262/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9813 - loss: 0.1000 - moe_loss: 9072.0312 - -
-``` - -``` -
- 266/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 9082.2715 - -
-``` - -``` -
- 269/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 9089.9512 - -
-``` - -``` -
- 273/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 9100.1914 - -
-``` - -``` -
- 277/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 9110.4307 - -
-``` - -``` -
- 280/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 9118.1113 - -
-``` - -``` -
- 284/469 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 9128.3516 - -
-``` - -``` -
- 288/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9815 - loss: 0.1000 - moe_loss: 9138.5908 - -
-``` - -``` -
- 292/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9815 - loss: 0.1000 - moe_loss: 9148.8311 - -
-``` - -``` -
- 296/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9815 - loss: 0.1000 - moe_loss: 9159.0713 - -
-``` - -``` -
- 300/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9815 - loss: 0.1000 - moe_loss: 9169.3105 - -
-``` - -``` -
- 304/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9816 - loss: 0.1000 - moe_loss: 9179.5508 - -
-``` - -``` -
- 307/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9816 - loss: 0.1000 - moe_loss: 9187.2305 - -
-``` - -``` -
- 311/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9816 - loss: 0.1000 - moe_loss: 9197.4707 - -
-``` - -``` -
- 314/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9816 - loss: 0.1000 - moe_loss: 9205.1504 - -
-``` - -``` -
- 318/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9816 - loss: 0.1000 - moe_loss: 9215.3906 - -
-``` - -``` -
- 322/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9816 - loss: 0.1000 - moe_loss: 9225.6309 - -
-``` - -``` -
- 326/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9817 - loss: 0.1000 - moe_loss: 9235.8711 - -
-``` - -``` -
- 329/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9817 - loss: 0.1000 - moe_loss: 9243.5508 - -
-``` - -``` -
- 332/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9817 - loss: 0.1000 - moe_loss: 9251.2314 - -
-``` - -``` -
- 336/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9817 - loss: 0.1000 - moe_loss: 9261.4707 - -
-``` - -``` -
- 340/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9817 - loss: 0.1000 - moe_loss: 9271.7109 - -
-``` - -``` -
- 344/469 ━━━━━━━━━━━━━━━━━━━━ 2s 16ms/step - accuracy: 0.9817 - loss: 0.1000 - moe_loss: 9281.9512 - -
-``` - -``` -
- 348/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 9292.1914 - -
-``` - -``` -
- 351/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 9299.8711 - -
-``` - -``` -
- 355/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 9310.1113 - -
-``` - -``` -
- 359/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 9320.3516 - -
-``` - -``` -
- 363/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 9330.5908 - -
-``` - -``` -
- 367/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 9340.8311 - -
-``` - -``` -
- 370/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9348.5107 - -
-``` - -``` -
- 374/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9358.7510 - -
-``` - -``` -
- 378/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9368.9912 - -
-``` - -``` -
- 381/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9376.6709 - -
-``` - -``` -
- 385/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9386.9111 - -
-``` - -``` -
- 389/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9397.1514 - -
-``` - -``` -
- 392/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9404.8311 - -
-``` - -``` -
- 396/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9415.0713 - -
-``` - -``` -
- 399/469 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step - accuracy: 0.9819 - loss: 0.1000 - moe_loss: 9422.7510 - -
-``` - -``` -
- 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9432.9912 - -
-``` - -``` -
- 406/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9440.6709 - -
-``` - -``` -
- 409/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9448.3506 - -
-``` - -``` -
- 413/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9458.5908 - -
-``` - -``` -
- 417/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9468.8311 - -
-``` - -``` -
- 421/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9479.0703 - -
-``` - -``` -
- 425/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9489.3105 - -
-``` - -``` -
- 429/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9499.5498 - -
-``` - -``` -
- 433/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 9509.7900 - -
-``` - -``` -
- 437/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9520.0303 - -
-``` - -``` -
- 440/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9527.7100 - -
-``` - -``` -
- 443/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9535.3896 - -
-``` - -``` -
- 447/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9545.6299 - -
-``` - -``` -
- 450/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9553.3096 - -
-``` - -``` -
- 453/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9560.9902 - -
-``` - -``` -
- 456/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9568.6699 - -
-``` - -``` -
- 460/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9578.9102 - -
-``` - -``` -
- 463/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9586.5898 - -
-``` - -``` -
- 466/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 9594.2695 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 17ms/step - accuracy: 0.9822 - loss: 0.1000 - moe_loss: 9604.4932 - val_loss: 0.1000 - val_moe_loss: 11198.7256 - - -
-``` -Epoch 5/20 - -``` -
- - 1/469 ━━━━━━━━━━━━━━━━━━━━ 12s 26ms/step - accuracy: 0.9688 - loss: 0.1000 - moe_loss: 11203.8506 - -
-``` - -``` -
- 5/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9796 - loss: 0.1000 - moe_loss: 11214.1025 - -
-``` - -``` -
- 8/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9802 - loss: 0.1000 - moe_loss: 11221.7900 - -
-``` - -``` -
- 11/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9807 - loss: 0.1000 - moe_loss: 11229.4717 - -
-``` - -``` -
- 15/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9807 - loss: 0.1000 - moe_loss: 11239.7148 - -
-``` - -``` -
- 18/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9809 - loss: 0.1000 - moe_loss: 11247.3945 - -
-``` - -``` -
- 21/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9810 - loss: 0.1000 - moe_loss: 11255.0732 - -
-``` - -``` -
- 25/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9813 - loss: 0.1000 - moe_loss: 11265.3115 - -
-``` - -``` -
- 29/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9813 - loss: 0.1000 - moe_loss: 11275.5498 - -
-``` - -``` -
- 33/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 11285.7881 - -
-``` - -``` -
- 37/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 11296.0273 - -
-``` - -``` -
- 40/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9814 - loss: 0.1000 - moe_loss: 11303.7070 - -
-``` - -``` -
- 42/469 ━━━━━━━━━━━━━━━━━━━━ 8s 19ms/step - accuracy: 0.9815 - loss: 0.1000 - moe_loss: 11308.8281 - -
-``` - -``` -
- 43/469 ━━━━━━━━━━━━━━━━━━━━ 9s 22ms/step - accuracy: 0.9815 - loss: 0.1000 - moe_loss: 11311.3887 - -
-``` - -``` -
- 46/469 ━━━━━━━━━━━━━━━━━━━━ 9s 21ms/step - accuracy: 0.9816 - loss: 0.1000 - moe_loss: 11319.0693 - -
-``` - -``` -
- 49/469 ━━━━━━━━━━━━━━━━━━━━ 8s 21ms/step - accuracy: 0.9817 - loss: 0.1000 - moe_loss: 11326.7500 - -
-``` - -``` -
- 53/469 ━━━━━━━━━━━━━━━━━━━━ 8s 21ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 11336.9912 - -
-``` - -``` -
- 56/469 ━━━━━━━━━━━━━━━━━━━━ 8s 21ms/step - accuracy: 0.9818 - loss: 0.1000 - moe_loss: 11344.6709 - -
-``` - -``` -
- 60/469 ━━━━━━━━━━━━━━━━━━━━ 8s 20ms/step - accuracy: 0.9820 - loss: 0.1000 - moe_loss: 11354.9111 - -
-``` - -``` -
- 64/469 ━━━━━━━━━━━━━━━━━━━━ 8s 20ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 11365.1504 - -
-``` - -``` -
- 67/469 ━━━━━━━━━━━━━━━━━━━━ 8s 20ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 11372.8301 - -
-``` - -``` -
- 70/469 ━━━━━━━━━━━━━━━━━━━━ 7s 20ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 11380.5098 - -
-``` - -``` -
- 74/469 ━━━━━━━━━━━━━━━━━━━━ 7s 20ms/step - accuracy: 0.9822 - loss: 0.1000 - moe_loss: 11390.7490 - -
-``` - -``` -
- 77/469 ━━━━━━━━━━━━━━━━━━━━ 7s 20ms/step - accuracy: 0.9822 - loss: 0.1000 - moe_loss: 11398.4287 - -
-``` - -``` -
- 81/469 ━━━━━━━━━━━━━━━━━━━━ 7s 19ms/step - accuracy: 0.9822 - loss: 0.1000 - moe_loss: 11408.6680 - -
-``` - -``` -
- 84/469 ━━━━━━━━━━━━━━━━━━━━ 7s 19ms/step - accuracy: 0.9822 - loss: 0.1000 - moe_loss: 11416.3486 - -
-``` - -``` -
- 87/469 ━━━━━━━━━━━━━━━━━━━━ 7s 19ms/step - accuracy: 0.9823 - loss: 0.1000 - moe_loss: 11424.0283 - -
-``` - -``` -
- 90/469 ━━━━━━━━━━━━━━━━━━━━ 7s 19ms/step - accuracy: 0.9823 - loss: 0.1000 - moe_loss: 11431.7080 - -
-``` - -``` -
- 94/469 ━━━━━━━━━━━━━━━━━━━━ 7s 19ms/step - accuracy: 0.9823 - loss: 0.1000 - moe_loss: 11441.9473 - -
-``` - -``` -
- 97/469 ━━━━━━━━━━━━━━━━━━━━ 7s 19ms/step - accuracy: 0.9823 - loss: 0.1000 - moe_loss: 11449.6270 - -
-``` - -``` -
- 100/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9824 - loss: 0.1000 - moe_loss: 11457.3076 - -
-``` - -``` -
- 104/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9824 - loss: 0.1000 - moe_loss: 11467.5479 - -
-``` - -``` -
- 108/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9824 - loss: 0.1000 - moe_loss: 11477.7871 - -
-``` - -``` -
- 112/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9825 - loss: 0.1000 - moe_loss: 11488.0273 - -
-``` - -``` -
- 116/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9825 - loss: 0.1000 - moe_loss: 11498.2666 - -
-``` - -``` -
- 120/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9825 - loss: 0.1000 - moe_loss: 11508.5078 - -
-``` - -``` -
- 124/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9826 - loss: 0.1000 - moe_loss: 11518.7480 - -
-``` - -``` -
- 127/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9826 - loss: 0.1000 - moe_loss: 11526.4277 - -
-``` - -``` -
- 131/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9827 - loss: 0.1000 - moe_loss: 11536.6680 - -
-``` - -``` -
- 134/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9827 - loss: 0.1000 - moe_loss: 11544.3477 - -
-``` - -``` -
- 138/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9827 - loss: 0.1000 - moe_loss: 11554.5879 - -
-``` - -``` -
- 142/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9828 - loss: 0.1000 - moe_loss: 11564.8271 - -
-``` - -``` -
- 146/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9828 - loss: 0.1000 - moe_loss: 11575.0674 - -
-``` - -``` -
- 150/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9829 - loss: 0.1000 - moe_loss: 11585.3066 - -
-``` - -``` -
- 153/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9829 - loss: 0.1000 - moe_loss: 11592.9873 - -
-``` - -``` -
- 157/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9830 - loss: 0.1000 - moe_loss: 11603.2266 - -
-``` - -``` -
- 161/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9830 - loss: 0.1000 - moe_loss: 11613.4668 - -
-``` - -``` -
- 165/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9830 - loss: 0.1000 - moe_loss: 11623.7061 - -
-``` - -``` -
- 169/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9831 - loss: 0.1000 - moe_loss: 11633.9463 - -
-``` - -``` -
- 172/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9831 - loss: 0.1000 - moe_loss: 11641.6270 - -
-``` - -``` -
- 175/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9831 - loss: 0.1000 - moe_loss: 11649.3066 - -
-``` - -``` -
- 179/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9832 - loss: 0.1000 - moe_loss: 11659.5459 - -
-``` - -``` -
- 182/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9832 - loss: 0.1000 - moe_loss: 11667.2256 - -
-``` - -``` -
- 185/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9832 - loss: 0.1000 - moe_loss: 11674.9062 - -
-``` - -``` -
- 189/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9833 - loss: 0.1000 - moe_loss: 11685.1455 - -
-``` - -``` -
- 193/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9833 - loss: 0.1000 - moe_loss: 11695.3857 - -
-``` - -``` -
- 197/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9833 - loss: 0.1000 - moe_loss: 11705.6260 - -
-``` - -``` -
- 201/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9834 - loss: 0.1000 - moe_loss: 11715.8662 - -
-``` - -``` -
- 204/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9834 - loss: 0.1000 - moe_loss: 11723.5459 - -
-``` - -``` -
- 208/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9834 - loss: 0.1000 - moe_loss: 11733.7861 - -
-``` - -``` -
- 211/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9834 - loss: 0.1000 - moe_loss: 11741.4658 - -
-``` - -``` -
- 215/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9835 - loss: 0.1000 - moe_loss: 11751.7061 - -
-``` - -``` -
- 218/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9835 - loss: 0.1000 - moe_loss: 11759.3857 - -
-``` - -``` -
- 221/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9835 - loss: 0.1000 - moe_loss: 11767.0654 - -
-``` - -``` -
- 225/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9835 - loss: 0.1000 - moe_loss: 11777.3057 - -
-``` - -``` -
- 229/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9836 - loss: 0.1000 - moe_loss: 11787.5459 - -
-``` - -``` -
- 233/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9836 - loss: 0.1000 - moe_loss: 11797.7861 - -
-``` - -``` -
- 236/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9836 - loss: 0.1000 - moe_loss: 11805.4658 - -
-``` - -``` -
- 240/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9836 - loss: 0.1000 - moe_loss: 11815.7051 - -
-``` - -``` -
- 243/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9837 - loss: 0.1000 - moe_loss: 11823.3857 - -
-``` - -``` -
- 246/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9837 - loss: 0.1000 - moe_loss: 11831.0654 - -
-``` - -``` -
- 249/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9837 - loss: 0.1000 - moe_loss: 11838.7461 - -
-``` - -``` -
- 253/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9837 - loss: 0.1000 - moe_loss: 11848.9854 - -
-``` - -``` -
- 257/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9837 - loss: 0.1000 - moe_loss: 11859.2256 - -
-``` - -``` -
- 260/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9838 - loss: 0.1000 - moe_loss: 11866.9053 - -
-``` - -``` -
- 263/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9838 - loss: 0.1000 - moe_loss: 11874.5859 - -
-``` - -``` -
- 267/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9838 - loss: 0.1000 - moe_loss: 11884.8252 - -
-``` - -``` -
- 270/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9838 - loss: 0.1000 - moe_loss: 11892.5059 - -
-``` - -``` -
- 274/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9838 - loss: 0.1000 - moe_loss: 11902.7451 - -
-``` - -``` -
- 278/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11912.9854 - -
-``` - -``` -
- 281/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11920.6650 - -
-``` - -``` -
- 285/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11930.9053 - -
-``` - -``` -
- 288/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11938.5850 - -
-``` - -``` -
- 291/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11946.2656 - -
-``` - -``` -
- 294/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11953.9453 - -
-``` - -``` -
- 298/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11964.1855 - -
-``` - -``` -
- 302/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9839 - loss: 0.1000 - moe_loss: 11974.4248 - -
-``` - -``` -
- 305/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 11982.1055 - -
-``` - -``` -
- 308/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 11989.7852 - -
-``` - -``` -
- 312/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12000.0254 - -
-``` - -``` -
- 315/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12007.7051 - -
-``` - -``` -
- 318/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12015.3848 - -
-``` - -``` -
- 321/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12023.0654 - -
-``` - -``` -
- 324/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12030.7451 - -
-``` - -``` -
- 327/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12038.4248 - -
-``` - -``` -
- 330/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12046.1055 - -
-``` - -``` -
- 333/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12053.7852 - -
-``` - -``` -
- 336/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12061.4648 - -
-``` - -``` -
- 340/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9840 - loss: 0.1000 - moe_loss: 12071.7051 - -
-``` - -``` -
- 344/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12081.9453 - -
-``` - -``` -
- 348/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12092.1846 - -
-``` - -``` -
- 352/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12102.4248 - -
-``` - -``` -
- 355/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12110.1055 - -
-``` - -``` -
- 359/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12120.3447 - -
-``` - -``` -
- 362/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12128.0254 - -
-``` - -``` -
- 365/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12135.7051 - -
-``` - -``` -
- 368/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12143.3848 - -
-``` - -``` -
- 372/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12153.6250 - -
-``` - -``` -
- 376/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12163.8652 - -
-``` - -``` -
- 380/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12174.1055 - -
-``` - -``` -
- 384/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12184.3447 - -
-``` - -``` -
- 387/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12192.0254 - -
-``` - -``` -
- 391/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12202.2646 - -
-``` - -``` -
- 395/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12212.5049 - -
-``` - -``` -
- 399/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9841 - loss: 0.1000 - moe_loss: 12222.7451 - -
-``` - -``` -
- 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12232.9854 - -
-``` - -``` -
- 407/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12243.2256 - -
-``` - -``` -
- 411/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12253.4658 - -
-``` - -``` -
- 414/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12261.1455 - -
-``` - -``` -
- 418/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12271.3857 - -
-``` - -``` -
- 421/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12279.0654 - -
-``` - -``` -
- 424/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12286.7451 - -
-``` - -``` -
- 427/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12294.4248 - -
-``` - -``` -
- 431/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12304.6650 - -
-``` - -``` -
- 434/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12312.3447 - -
-``` - -``` -
- 438/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12322.5850 - -
-``` - -``` -
- 441/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12330.2646 - -
-``` - -``` -
- 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12340.5049 - -
-``` - -``` -
- 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12350.7451 - -
-``` - -``` -
- 452/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12358.4248 - -
-``` - -``` -
- 456/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12368.6650 - -
-``` - -``` -
- 460/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12378.9053 - -
-``` - -``` -
- 463/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9842 - loss: 0.1000 - moe_loss: 12386.5850 - -
-``` - -``` -
- 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9843 - loss: 0.1000 - moe_loss: 12396.8252 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9843 - loss: 0.1000 - moe_loss: 12404.4883 - val_loss: 0.1000 - val_moe_loss: 13998.7246 - - -
-``` -Epoch 6/20 - -``` -
- - 1/469 ━━━━━━━━━━━━━━━━━━━━ 13s 29ms/step - accuracy: 0.9609 - loss: 0.1000 - moe_loss: 14003.8555 - -
-``` - -``` -
- 5/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9740 - loss: 0.1000 - moe_loss: 14014.0947 - -
-``` - -``` -
- 8/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9780 - loss: 0.1000 - moe_loss: 14021.7832 - -
-``` - -``` -
- 12/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9807 - loss: 0.1000 - moe_loss: 14032.0244 - -
-``` - -``` -
- 16/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9821 - loss: 0.1000 - moe_loss: 14042.2637 - -
-``` - -``` -
- 19/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9827 - loss: 0.1000 - moe_loss: 14049.9434 - -
-``` - -``` -
- 22/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9834 - loss: 0.1000 - moe_loss: 14057.6221 - -
-``` - -``` -
- 25/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9838 - loss: 0.1000 - moe_loss: 14065.3027 - -
-``` - -``` -
- 29/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9843 - loss: 0.1000 - moe_loss: 14075.5439 - -
-``` - -``` -
- 33/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9848 - loss: 0.1000 - moe_loss: 14085.7842 - -
-``` - -``` -
- 37/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9852 - loss: 0.1000 - moe_loss: 14096.0244 - -
-``` - -``` -
- 40/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9854 - loss: 0.1000 - moe_loss: 14103.7041 - -
-``` - -``` -
- 43/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9856 - loss: 0.1000 - moe_loss: 14111.3838 - -
-``` - -``` -
- 46/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9858 - loss: 0.1000 - moe_loss: 14119.0635 - -
-``` - -``` -
- 49/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9859 - loss: 0.1000 - moe_loss: 14126.7432 - -
-``` - -``` -
- 52/469 ━━━━━━━━━━━━━━━━━━━━ 7s 17ms/step - accuracy: 0.9860 - loss: 0.1000 - moe_loss: 14134.4229 - -
-``` - -``` -
- 55/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9861 - loss: 0.1000 - moe_loss: 14142.1035 - -
-``` - -``` -
- 59/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9862 - loss: 0.1000 - moe_loss: 14152.3428 - -
-``` - -``` -
- 62/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9863 - loss: 0.1000 - moe_loss: 14160.0225 - -
-``` - -``` -
- 65/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9864 - loss: 0.1000 - moe_loss: 14167.7021 - -
-``` - -``` -
- 68/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9865 - loss: 0.1000 - moe_loss: 14175.3818 - -
-``` - -``` -
- 71/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9865 - loss: 0.1000 - moe_loss: 14183.0625 - -
-``` - -``` -
- 74/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9866 - loss: 0.1000 - moe_loss: 14190.7422 - -
-``` - -``` -
- 77/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9866 - loss: 0.1000 - moe_loss: 14198.4219 - -
-``` - -``` -
- 80/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9867 - loss: 0.1000 - moe_loss: 14206.1016 - -
-``` - -``` -
- 83/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9867 - loss: 0.1000 - moe_loss: 14213.7812 - -
-``` - -``` -
- 86/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9868 - loss: 0.1000 - moe_loss: 14221.4619 - -
-``` - -``` -
- 89/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9868 - loss: 0.1000 - moe_loss: 14229.1416 - -
-``` - -``` -
- 92/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9869 - loss: 0.1000 - moe_loss: 14236.8223 - -
-``` - -``` -
- 95/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9869 - loss: 0.1000 - moe_loss: 14244.5029 - -
-``` - -``` -
- 98/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9869 - loss: 0.1000 - moe_loss: 14252.1826 - -
-``` - -``` -
- 102/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9869 - loss: 0.1000 - moe_loss: 14262.4229 - -
-``` - -``` -
- 106/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9869 - loss: 0.1000 - moe_loss: 14272.6621 - -
-``` - -``` -
- 109/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9870 - loss: 0.1000 - moe_loss: 14280.3428 - -
-``` - -``` -
- 112/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9870 - loss: 0.1000 - moe_loss: 14288.0225 - -
-``` - -``` -
- 115/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9870 - loss: 0.1000 - moe_loss: 14295.7021 - -
-``` - -``` -
- 118/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9870 - loss: 0.1000 - moe_loss: 14303.3818 - -
-``` - -``` -
- 121/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9871 - loss: 0.1000 - moe_loss: 14311.0625 - -
-``` - -``` -
- 124/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9871 - loss: 0.1000 - moe_loss: 14318.7422 - -
-``` - -``` -
- 127/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9871 - loss: 0.1000 - moe_loss: 14326.4229 - -
-``` - -``` -
- 130/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9871 - loss: 0.1000 - moe_loss: 14334.1035 - -
-``` - -``` -
- 133/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9871 - loss: 0.1000 - moe_loss: 14341.7832 - -
-``` - -``` -
- 136/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9871 - loss: 0.1000 - moe_loss: 14349.4629 - -
-``` - -``` -
- 140/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9871 - loss: 0.1000 - moe_loss: 14359.7031 - -
-``` - -``` -
- 143/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14367.3838 - -
-``` - -``` -
- 146/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14375.0635 - -
-``` - -``` -
- 150/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14385.3037 - -
-``` - -``` -
- 153/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14392.9844 - -
-``` - -``` -
- 156/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14400.6641 - -
-``` - -``` -
- 159/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14408.3447 - -
-``` - -``` -
- 163/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14418.5840 - -
-``` - -``` -
- 166/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14426.2637 - -
-``` - -``` -
- 169/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14433.9443 - -
-``` - -``` -
- 172/469 ━━━━━━━━━━━━━━━━━━━━ 5s 17ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 14441.6240 - -
-``` - -``` -
- 175/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14449.3037 - -
-``` - -``` -
- 179/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14459.5430 - -
-``` - -``` -
- 182/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14467.2236 - -
-``` - -``` -
- 185/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14474.9033 - -
-``` - -``` -
- 188/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14482.5830 - -
-``` - -``` -
- 191/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14490.2627 - -
-``` - -``` -
- 194/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14497.9434 - -
-``` - -``` -
- 197/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14505.6230 - -
-``` - -``` -
- 200/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14513.3037 - -
-``` - -``` -
- 204/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14523.5430 - -
-``` - -``` -
- 207/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14531.2236 - -
-``` - -``` -
- 210/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14538.9033 - -
-``` - -``` -
- 213/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14546.5840 - -
-``` - -``` -
- 216/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14554.2637 - -
-``` - -``` -
- 219/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14561.9434 - -
-``` - -``` -
- 222/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14569.6230 - -
-``` - -``` -
- 225/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14577.3037 - -
-``` - -``` -
- 228/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14584.9834 - -
-``` - -``` -
- 231/469 ━━━━━━━━━━━━━━━━━━━━ 4s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14592.6631 - -
-``` - -``` -
- 234/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14600.3428 - -
-``` - -``` -
- 237/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14608.0234 - -
-``` - -``` -
- 240/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14615.7031 - -
-``` - -``` -
- 243/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14623.3828 - -
-``` - -``` -
- 246/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14631.0625 - -
-``` - -``` -
- 249/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14638.7432 - -
-``` - -``` -
- 252/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14646.4229 - -
-``` - -``` -
- 255/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14654.1025 - -
-``` - -``` -
- 258/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14661.7832 - -
-``` - -``` -
- 261/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14669.4629 - -
-``` - -``` -
- 264/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14677.1426 - -
-``` - -``` -
- 267/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14684.8223 - -
-``` - -``` -
- 270/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14692.5029 - -
-``` - -``` -
- 273/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14700.1826 - -
-``` - -``` -
- 276/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14707.8623 - -
-``` - -``` -
- 279/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14715.5420 - -
-``` - -``` -
- 282/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14723.2227 - -
-``` - -``` -
- 285/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14730.9023 - -
-``` - -``` -
- 288/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14738.5820 - -
-``` - -``` -
- 291/469 ━━━━━━━━━━━━━━━━━━━━ 3s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14746.2627 - -
-``` - -``` -
- 294/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14753.9424 - -
-``` - -``` -
- 297/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14761.6221 - -
-``` - -``` -
- 300/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14769.3027 - -
-``` - -``` -
- 303/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14776.9824 - -
-``` - -``` -
- 306/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14784.6631 - -
-``` - -``` -
- 309/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14792.3428 - -
-``` - -``` -
- 312/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14800.0234 - -
-``` - -``` -
- 315/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14807.7031 - -
-``` - -``` -
- 318/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14815.3828 - -
-``` - -``` -
- 321/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14823.0625 - -
-``` - -``` -
- 324/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14830.7432 - -
-``` - -``` -
- 327/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14838.4229 - -
-``` - -``` -
- 330/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14846.1025 - -
-``` - -``` -
- 333/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14853.7832 - -
-``` - -``` -
- 336/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14861.4629 - -
-``` - -``` -
- 339/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14869.1426 - -
-``` - -``` -
- 342/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14876.8232 - -
-``` - -``` -
- 345/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14884.5029 - -
-``` - -``` -
- 348/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14892.1826 - -
-``` - -``` -
- 351/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14899.8623 - -
-``` - -``` -
- 354/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14907.5430 - -
-``` - -``` -
- 357/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14915.2227 - -
-``` - -``` -
- 360/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14922.9023 - -
-``` - -``` -
- 363/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14930.5820 - -
-``` - -``` -
- 367/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14940.8223 - -
-``` - -``` -
- 370/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14948.5029 - -
-``` - -``` -
- 373/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14956.1826 - -
-``` - -``` -
- 376/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14963.8623 - -
-``` - -``` -
- 379/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14971.5430 - -
-``` - -``` -
- 382/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14979.2227 - -
-``` - -``` -
- 385/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14986.9023 - -
-``` - -``` -
- 388/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 14994.5830 - -
-``` - -``` -
- 391/469 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step - accuracy: 0.9873 - loss: 0.1000 - moe_loss: 15002.2627 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9872 - loss: 0.1000 - moe_loss: 15204.4854 - val_loss: 0.1000 - val_moe_loss: 16798.7246 - - -
-``` -Epoch 7/20 - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 9s 19ms/step - accuracy: 0.9884 - loss: 0.1000 - moe_loss: 18004.4863 - val_loss: 0.1000 - val_moe_loss: 19598.7227 - - -
-``` -Epoch 8/20 - -``` -
- 350/469 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20497.2988 - -
-``` - -``` -
- 353/469 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20504.9785 - -
-``` - -``` -
- 356/469 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20512.6582 - -
-``` - -``` -
- 359/469 ━━━━━━━━━━━━━━━━━━━━ 2s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20520.3398 - -
-``` - -``` -
- 362/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9894 - loss: 0.1000 - moe_loss: 20528.0195 - -
-``` - -``` -
- 365/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20535.6992 - -
-``` - -``` -
- 368/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20543.3789 - -
-``` - -``` -
- 371/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20551.0586 - -
-``` - -``` -
- 374/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20558.7383 - -
-``` - -``` -
- 377/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20566.4180 - -
-``` - -``` -
- 380/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20574.0996 - -
-``` - -``` -
- 383/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20581.7793 - -
-``` - -``` -
- 386/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20589.4590 - -
-``` - -``` -
- 389/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20597.1387 - -
-``` - -``` -
- 392/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20604.8184 - -
-``` - -``` -
- 393/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20607.3789 - -
-``` - -``` -
- 394/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20609.9395 - -
-``` - -``` -
- 395/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20612.4980 - -
-``` - -``` -
- 398/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20620.1797 - -
-``` - -``` -
- 401/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20627.8594 - -
-``` - -``` -
- 404/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20635.5391 - -
-``` - -``` -
- 407/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20643.2188 - -
-``` - -``` -
- 410/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20650.8984 - -
-``` - -``` -
- 413/469 ━━━━━━━━━━━━━━━━━━━━ 1s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20658.5781 - -
-``` - -``` -
- 416/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20666.2578 - -
-``` - -``` -
- 419/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20673.9395 - -
-``` - -``` -
- 422/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20681.6191 - -
-``` - -``` -
- 425/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20689.2988 - -
-``` - -``` -
- 428/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20696.9785 - -
-``` - -``` -
- 431/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20704.6582 - -
-``` - -``` -
- 434/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20712.3379 - -
-``` - -``` -
- 437/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20720.0195 - -
-``` - -``` -
- 440/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20727.6992 - -
-``` - -``` -
- 443/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20735.3789 - -
-``` - -``` -
- 446/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20743.0586 - -
-``` - -``` -
- 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20750.7383 - -
-``` - -``` -
- 452/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20758.4180 - -
-``` - -``` -
- 455/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20766.0996 - -
-``` - -``` -
- 458/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20773.7793 - -
-``` - -``` -
- 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20781.4590 - -
-``` - -``` -
- 464/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20789.1387 - -
-``` - -``` -
- 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 19ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20796.8184 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 10s 20ms/step - accuracy: 0.9893 - loss: 0.1000 - moe_loss: 20804.4824 - val_loss: 0.1000 - val_moe_loss: 22398.7227 - - -
-``` -Epoch 9/20 - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 10s 21ms/step - accuracy: 0.9908 - loss: 0.1000 - moe_loss: 23604.4824 - val_loss: 0.1000 - val_moe_loss: 25198.7246 - - -
-``` -Epoch 10/20 - -``` -
- 199/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25710.7363 - -
-``` - -``` -
- 202/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25718.4160 - -
-``` - -``` -
- 205/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25726.0957 - -
-``` - -``` -
- 208/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25733.7773 - -
-``` - -``` -
- 211/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25741.4570 - -
-``` - -``` -
- 214/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25749.1367 - -
-``` - -``` -
- 217/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25756.8164 - -
-``` - -``` -
- 220/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25764.4961 - -
-``` - -``` -
- 223/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25772.1758 - -
-``` - -``` -
- 226/469 ━━━━━━━━━━━━━━━━━━━━ 5s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25779.8574 - -
-``` - -``` -
- 229/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25787.5371 - -
-``` - -``` -
- 232/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25795.2168 - -
-``` - -``` -
- 235/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25802.8965 - -
-``` - -``` -
- 238/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25810.5762 - -
-``` - -``` -
- 241/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25818.2559 - -
-``` - -``` -
- 244/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25825.9355 - -
-``` - -``` -
- 247/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25833.6172 - -
-``` - -``` -
- 250/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25841.2969 - -
-``` - -``` -
- 253/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25848.9766 - -
-``` - -``` -
- 256/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25856.6562 - -
-``` - -``` -
- 259/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25864.3359 - -
-``` - -``` -
- 262/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25872.0176 - -
-``` - -``` -
- 265/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25879.6973 - -
-``` - -``` -
- 268/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25887.3770 - -
-``` - -``` -
- 271/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25895.0566 - -
-``` - -``` -
- 274/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25902.7363 - -
-``` - -``` -
- 277/469 ━━━━━━━━━━━━━━━━━━━━ 4s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25910.4160 - -
-``` - -``` -
- 280/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25918.0977 - -
-``` - -``` -
- 283/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25925.7773 - -
-``` - -``` -
- 286/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25933.4570 - -
-``` - -``` -
- 288/469 ━━━━━━━━━━━━━━━━━━━━ 3s 21ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25938.5762 - -
-``` - -``` -
- 289/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25941.1367 - -
-``` - -``` -
- 290/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25943.6973 - -
-``` - -``` -
- 293/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25951.3770 - -
-``` - -``` -
- 296/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25959.0566 - -
-``` - -``` -
- 299/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25966.7363 - -
-``` - -``` -
- 302/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25974.4160 - -
-``` - -``` -
- 305/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25982.0977 - -
-``` - -``` -
- 308/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9914 - loss: 0.1000 - moe_loss: 25989.7773 - -
-``` - -``` -
- 311/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 25997.4570 - -
-``` - -``` -
- 314/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26005.1367 - -
-``` - -``` -
- 317/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26012.8164 - -
-``` - -``` -
- 320/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26020.4961 - -
-``` - -``` -
- 323/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26028.1777 - -
-``` - -``` -
- 326/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26035.8574 - -
-``` - -``` -
- 329/469 ━━━━━━━━━━━━━━━━━━━━ 3s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26043.5371 - -
-``` - -``` -
- 332/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26051.2168 - -
-``` - -``` -
- 335/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26058.8965 - -
-``` - -``` -
- 338/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26066.5781 - -
-``` - -``` -
- 341/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26074.2578 - -
-``` - -``` -
- 344/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26081.9375 - -
-``` - -``` -
- 347/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26089.6172 - -
-``` - -``` -
- 350/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26097.2969 - -
-``` - -``` -
- 353/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26104.9766 - -
-``` - -``` -
- 356/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26112.6562 - -
-``` - -``` -
- 359/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26120.3379 - -
-``` - -``` -
- 362/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26128.0176 - -
-``` - -``` -
- 365/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26135.6973 - -
-``` - -``` -
- 368/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26143.3770 - -
-``` - -``` -
- 371/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26151.0566 - -
-``` - -``` -
- 374/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26158.7363 - -
-``` - -``` -
- 377/469 ━━━━━━━━━━━━━━━━━━━━ 2s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26166.4180 - -
-``` - -``` -
- 380/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26174.0977 - -
-``` - -``` -
- 383/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26181.7773 - -
-``` - -``` -
- 385/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26186.8965 - -
-``` - -``` -
- 388/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26194.5762 - -
-``` - -``` -
- 391/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26202.2578 - -
-``` - -``` -
- 394/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26209.9375 - -
-``` - -``` -
- 397/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26217.6172 - -
-``` - -``` -
- 400/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26225.2969 - -
-``` - -``` -
- 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26232.9766 - -
-``` - -``` -
- 406/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26240.6562 - -
-``` - -``` -
- 409/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26248.3379 - -
-``` - -``` -
- 412/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26256.0176 - -
-``` - -``` -
- 415/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26263.6973 - -
-``` - -``` -
- 418/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26271.3770 - -
-``` - -``` -
- 421/469 ━━━━━━━━━━━━━━━━━━━━ 1s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26279.0566 - -
-``` - -``` -
- 424/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26286.7363 - -
-``` - -``` -
- 427/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26294.4180 - -
-``` - -``` -
- 430/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26302.0977 - -
-``` - -``` -
- 433/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26309.7773 - -
-``` - -``` -
- 436/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26317.4570 - -
-``` - -``` -
- 439/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26325.1367 - -
-``` - -``` -
- 442/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26332.8164 - -
-``` - -``` -
- 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26340.4980 - -
-``` - -``` -
- 448/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26348.1777 - -
-``` - -``` -
- 451/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26355.8574 - -
-``` - -``` -
- 453/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26360.9766 - -
-``` - -``` -
- 456/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26368.6562 - -
-``` - -``` -
- 459/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26376.3379 - -
-``` - -``` -
- 462/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26384.0176 - -
-``` - -``` -
- 465/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26391.6973 - -
-``` - -``` -
- 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 22ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26396.8164 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 11s 23ms/step - accuracy: 0.9913 - loss: 0.1000 - moe_loss: 26404.4805 - val_loss: 0.1000 - val_moe_loss: 27998.7227 - - -
-``` -Epoch 11/20 - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 12s 25ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 29204.4805 - val_loss: 0.1000 - val_moe_loss: 30798.7227 - - -
-``` -Epoch 12/20 - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 14s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 32004.4805 - val_loss: 0.1000 - val_moe_loss: 33598.7227 - - -
-``` -Epoch 13/20 - -``` -
- 77/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33798.4141 - -
-``` - -``` -
- 79/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33803.5352 - -
-``` - -``` -
- 81/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33808.6562 - -
-``` - -``` -
- 83/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33813.7734 - -
-``` - -``` -
- 85/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33818.8945 - -
-``` - -``` -
- 87/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33824.0156 - -
-``` - -``` -
- 89/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33829.1367 - -
-``` - -``` -
- 91/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33834.2539 - -
-``` - -``` -
- 93/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33839.3750 - -
-``` - -``` -
- 95/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33844.4961 - -
-``` - -``` -
- 97/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33849.6133 - -
-``` - -``` -
- 99/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33854.7344 - -
-``` - -``` -
- 101/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33859.8555 - -
-``` - -``` -
- 103/469 ━━━━━━━━━━━━━━━━━━━━ 10s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33864.9766 - -
-``` - -``` -
- 105/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33870.0938 - -
-``` - -``` -
- 107/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33875.2148 - -
-``` - -``` -
- 109/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33880.3359 - -
-``` - -``` -
- 111/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33885.4531 - -
-``` - -``` -
- 113/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33890.5742 - -
-``` - -``` -
- 115/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33895.6953 - -
-``` - -``` -
- 117/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33900.8164 - -
-``` - -``` -
- 119/469 ━━━━━━━━━━━━━━━━━━━━ 9s 27ms/step - accuracy: 0.9916 - loss: 0.1000 - moe_loss: 33905.9336 - -
-``` - -``` -
- 121/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33911.0547 - -
-``` - -``` -
- 123/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33916.1758 - -
-``` - -``` -
- 125/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33921.2930 - -
-``` - -``` -
- 127/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33926.4141 - -
-``` - -``` -
- 129/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33931.5352 - -
-``` - -``` -
- 131/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33936.6562 - -
-``` - -``` -
- 133/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33941.7734 - -
-``` - -``` -
- 135/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33946.8945 - -
-``` - -``` -
- 137/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33952.0156 - -
-``` - -``` -
- 139/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33957.1367 - -
-``` - -``` -
- 141/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33962.2539 - -
-``` - -``` -
- 143/469 ━━━━━━━━━━━━━━━━━━━━ 9s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33967.3750 - -
-``` - -``` -
- 145/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33972.4961 - -
-``` - -``` -
- 147/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9917 - loss: 0.1000 - moe_loss: 33977.6172 - -
-``` - -``` -
- 149/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33982.7344 - -
-``` - -``` -
- 151/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33987.8555 - -
-``` - -``` -
- 153/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33992.9766 - -
-``` - -``` -
- 155/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 33998.0938 - -
-``` - -``` -
- 157/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34003.2148 - -
-``` - -``` -
- 159/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34008.3359 - -
-``` - -``` -
- 161/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34013.4570 - -
-``` - -``` -
- 163/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34018.5742 - -
-``` - -``` -
- 165/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34023.6953 - -
-``` - -``` -
- 167/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34028.8164 - -
-``` - -``` -
- 169/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34033.9336 - -
-``` - -``` -
- 171/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34039.0547 - -
-``` - -``` -
- 173/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9918 - loss: 0.1000 - moe_loss: 34044.1758 - -
-``` - -``` -
- 175/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34049.2969 - -
-``` - -``` -
- 177/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34054.4141 - -
-``` - -``` -
- 179/469 ━━━━━━━━━━━━━━━━━━━━ 8s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34059.5352 - -
-``` - -``` -
- 181/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34064.6562 - -
-``` - -``` -
- 183/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34069.7734 - -
-``` - -``` -
- 185/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34074.8945 - -
-``` - -``` -
- 186/469 ━━━━━━━━━━━━━━━━━━━━ 7s 28ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34077.4531 - -
-``` - -``` -
- 187/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34080.0156 - -
-``` - -``` -
- 188/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34082.5742 - -
-``` - -``` -
- 190/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34087.6953 - -
-``` - -``` -
- 192/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34092.8164 - -
-``` - -``` -
- 194/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34097.9336 - -
-``` - -``` -
- 196/469 ━━━━━━━━━━━━━━━━━━━━ 8s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34103.0547 - -
-``` - -``` -
- 198/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34108.1758 - -
-``` - -``` -
- 200/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34113.2930 - -
-``` - -``` -
- 202/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34118.4141 - -
-``` - -``` -
- 204/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34123.5352 - -
-``` - -``` -
- 206/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34128.6562 - -
-``` - -``` -
- 208/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34133.7734 - -
-``` - -``` -
- 210/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34138.8945 - -
-``` - -``` -
- 212/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34144.0156 - -
-``` - -``` -
- 214/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34149.1367 - -
-``` - -``` -
- 216/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34154.2539 - -
-``` - -``` -
- 218/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34159.3750 - -
-``` - -``` -
- 220/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34164.4961 - -
-``` - -``` -
- 222/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34169.6133 - -
-``` - -``` -
- 224/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34174.7344 - -
-``` - -``` -
- 226/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34179.8555 - -
-``` - -``` -
- 228/469 ━━━━━━━━━━━━━━━━━━━━ 7s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34184.9766 - -
-``` - -``` -
- 230/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34190.0938 - -
-``` - -``` -
- 232/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34195.2148 - -
-``` - -``` -
- 234/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34200.3359 - -
-``` - -``` -
- 236/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34205.4531 - -
-``` - -``` -
- 238/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34210.5742 - -
-``` - -``` -
- 240/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34215.6953 - -
-``` - -``` -
- 242/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34220.8164 - -
-``` - -``` -
- 244/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34225.9336 - -
-``` - -``` -
- 246/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34231.0547 - -
-``` - -``` -
- 248/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34236.1758 - -
-``` - -``` -
- 250/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34241.2969 - -
-``` - -``` -
- 252/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34246.4141 - -
-``` - -``` -
- 254/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34251.5352 - -
-``` - -``` -
- 256/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34256.6562 - -
-``` - -``` -
- 258/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34261.7734 - -
-``` - -``` -
- 260/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34266.8945 - -
-``` - -``` -
- 262/469 ━━━━━━━━━━━━━━━━━━━━ 6s 29ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34272.0156 - -
-``` - -``` -
- 263/469 ━━━━━━━━━━━━━━━━━━━━ 6s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34274.5742 - -
-``` - -``` -
- 265/469 ━━━━━━━━━━━━━━━━━━━━ 6s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34279.6953 - -
-``` - -``` -
- 267/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34284.8164 - -
-``` - -``` -
- 269/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34289.9336 - -
-``` - -``` -
- 271/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34295.0547 - -
-``` - -``` -
- 273/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34300.1758 - -
-``` - -``` -
- 275/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34305.2930 - -
-``` - -``` -
- 277/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34310.4141 - -
-``` - -``` -
- 279/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34315.5352 - -
-``` - -``` -
- 281/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34320.6562 - -
-``` - -``` -
- 283/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34325.7734 - -
-``` - -``` -
- 285/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34330.8945 - -
-``` - -``` -
- 287/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34336.0156 - -
-``` - -``` -
- 289/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34341.1328 - -
-``` - -``` -
- 291/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34346.2539 - -
-``` - -``` -
- 293/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34351.3750 - -
-``` - -``` -
- 295/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34356.4961 - -
-``` - -``` -
- 297/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34361.6133 - -
-``` - -``` -
- 299/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34366.7344 - -
-``` - -``` -
- 301/469 ━━━━━━━━━━━━━━━━━━━━ 5s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34371.8555 - -
-``` - -``` -
- 303/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34376.9727 - -
-``` - -``` -
- 305/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34382.0938 - -
-``` - -``` -
- 307/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34387.2148 - -
-``` - -``` -
- 309/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34392.3359 - -
-``` - -``` -
- 311/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34397.4531 - -
-``` - -``` -
- 313/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34402.5742 - -
-``` - -``` -
- 315/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34407.6953 - -
-``` - -``` -
- 317/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34412.8125 - -
-``` - -``` -
- 319/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34417.9336 - -
-``` - -``` -
- 321/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34423.0547 - -
-``` - -``` -
- 323/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34428.1758 - -
-``` - -``` -
- 325/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34433.2930 - -
-``` - -``` -
- 327/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34438.4141 - -
-``` - -``` -
- 329/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34443.5352 - -
-``` - -``` -
- 331/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34448.6562 - -
-``` - -``` -
- 333/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34453.7734 - -
-``` - -``` -
- 335/469 ━━━━━━━━━━━━━━━━━━━━ 4s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34458.8945 - -
-``` - -``` -
- 337/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34464.0156 - -
-``` - -``` -
- 339/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34469.1328 - -
-``` - -``` -
- 341/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34474.2539 - -
-``` - -``` -
- 343/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34479.3750 - -
-``` - -``` -
- 345/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34484.4961 - -
-``` - -``` -
- 347/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34489.6133 - -
-``` - -``` -
- 349/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34494.7344 - -
-``` - -``` -
- 351/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34499.8555 - -
-``` - -``` -
- 353/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34504.9727 - -
-``` - -``` -
- 355/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34510.0938 - -
-``` - -``` -
- 357/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34515.2148 - -
-``` - -``` -
- 359/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34520.3359 - -
-``` - -``` -
- 361/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34525.4531 - -
-``` - -``` -
- 363/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34530.5742 - -
-``` - -``` -
- 365/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34535.6953 - -
-``` - -``` -
- 367/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34540.8164 - -
-``` - -``` -
- 369/469 ━━━━━━━━━━━━━━━━━━━━ 3s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34545.9336 - -
-``` - -``` -
- 371/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34551.0547 - -
-``` - -``` -
- 373/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34556.1758 - -
-``` - -``` -
- 375/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34561.2930 - -
-``` - -``` -
- 377/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34566.4141 - -
-``` - -``` -
- 379/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34571.5352 - -
-``` - -``` -
- 381/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34576.6562 - -
-``` - -``` -
- 383/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34581.7734 - -
-``` - -``` -
- 385/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34586.8945 - -
-``` - -``` -
- 387/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34592.0156 - -
-``` - -``` -
- 389/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34597.1328 - -
-``` - -``` -
- 391/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34602.2539 - -
-``` - -``` -
- 393/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34607.3750 - -
-``` - -``` -
- 395/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34612.4961 - -
-``` - -``` -
- 397/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34617.6133 - -
-``` - -``` -
- 399/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34622.7344 - -
-``` - -``` -
- 401/469 ━━━━━━━━━━━━━━━━━━━━ 2s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34627.8555 - -
-``` - -``` -
- 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34632.9727 - -
-``` - -``` -
- 405/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 34638.0938 - -
-``` - -``` -
- 407/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34643.2148 - -
-``` - -``` -
- 409/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34648.3359 - -
-``` - -``` -
- 411/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34653.4531 - -
-``` - -``` -
- 413/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34658.5742 - -
-``` - -``` -
- 415/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34663.6953 - -
-``` - -``` -
- 417/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34668.8164 - -
-``` - -``` -
- 419/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34673.9336 - -
-``` - -``` -
- 421/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34679.0547 - -
-``` - -``` -
- 423/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34684.1758 - -
-``` - -``` -
- 425/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34689.2930 - -
-``` - -``` -
- 427/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34694.4141 - -
-``` - -``` -
- 429/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34699.5352 - -
-``` - -``` -
- 431/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34704.6562 - -
-``` - -``` -
- 433/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34709.7734 - -
-``` - -``` -
- 435/469 ━━━━━━━━━━━━━━━━━━━━ 1s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34714.8945 - -
-``` - -``` -
- 437/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34720.0156 - -
-``` - -``` -
- 439/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34725.1328 - -
-``` - -``` -
- 441/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34730.2539 - -
-``` - -``` -
- 443/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34735.3750 - -
-``` - -``` -
- 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34740.4961 - -
-``` - -``` -
- 447/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34745.6133 - -
-``` - -``` -
- 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34750.7344 - -
-``` - -``` -
- 451/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34755.8555 - -
-``` - -``` -
- 453/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34760.9727 - -
-``` - -``` -
- 455/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34766.0938 - -
-``` - -``` -
- 457/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34771.2148 - -
-``` - -``` -
- 459/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34776.3359 - -
-``` - -``` -
- 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34781.4531 - -
-``` - -``` -
- 463/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34786.5742 - -
-``` - -``` -
- 465/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34791.6953 - -
-``` - -``` -
- 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34796.8164 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 0s 30ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34801.9336 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 15s 32ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 34804.4766 - val_loss: 0.1000 - val_moe_loss: 36398.7227 - - -
-``` -Epoch 14/20 - -``` -
- - 1/469 ━━━━━━━━━━━━━━━━━━━━ 23s 50ms/step - accuracy: 1.0000 - loss: 0.1000 - moe_loss: 36403.8555 - -
-``` - -``` -
- 133/469 ━━━━━━━━━━━━━━━━━━━━ 13s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36741.7734 - -
-``` - -``` -
- 135/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36746.8945 - -
-``` - -``` -
- 137/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36752.0156 - -
-``` - -``` -
- 139/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36757.1328 - -
-``` - -``` -
- 141/469 ━━━━━━━━━━━━━━━━━━━━ 12s 39ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36762.2539 - -
-``` - -``` -
- 143/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36767.3750 - -
-``` - -``` -
- 145/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36772.4961 - -
-``` - -``` -
- 147/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36777.6133 - -
-``` - -``` -
- 149/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9919 - loss: 0.1000 - moe_loss: 36782.7344 - -
-``` - -``` -
- 151/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36787.8555 - -
-``` - -``` -
- 153/469 ━━━━━━━━━━━━━━━━━━━━ 12s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36792.9727 - -
-``` - -``` -
- 155/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36798.0938 - -
-``` - -``` -
- 157/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36803.2148 - -
-``` - -``` -
- 159/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36808.3359 - -
-``` - -``` -
- 161/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36813.4531 - -
-``` - -``` -
- 163/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36818.5742 - -
-``` - -``` -
- 165/469 ━━━━━━━━━━━━━━━━━━━━ 11s 38ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36823.6953 - -
-``` - -``` -
- 167/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36828.8125 - -
-``` - -``` -
- 169/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36833.9336 - -
-``` - -``` -
- 171/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36839.0547 - -
-``` - -``` -
- 173/469 ━━━━━━━━━━━━━━━━━━━━ 11s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36844.1758 - -
-``` - -``` -
- 175/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36849.2930 - -
-``` - -``` -
- 177/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36854.4141 - -
-``` - -``` -
- 179/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36859.5352 - -
-``` - -``` -
- 181/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36864.6523 - -
-``` - -``` -
- 183/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36869.7734 - -
-``` - -``` -
- 185/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36874.8945 - -
-``` - -``` -
- 187/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36880.0156 - -
-``` - -``` -
- 189/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36885.1328 - -
-``` - -``` -
- 191/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36890.2539 - -
-``` - -``` -
- 193/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36895.3750 - -
-``` - -``` -
- 195/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36900.4922 - -
-``` - -``` -
- 197/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36905.6133 - -
-``` - -``` -
- 198/469 ━━━━━━━━━━━━━━━━━━━━ 10s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36908.1758 - -
-``` - -``` -
- 200/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36913.2930 - -
-``` - -``` -
- 202/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36918.4141 - -
-``` - -``` -
- 204/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36923.5352 - -
-``` - -``` -
- 206/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36928.6523 - -
-``` - -``` -
- 208/469 ━━━━━━━━━━━━━━━━━━━━ 9s 37ms/step - accuracy: 0.9920 - loss: 0.1000 - moe_loss: 36933.7734 - -
-``` - -``` -
- 209/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36936.3359 - -
-``` - -``` -
- 211/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36941.4531 - -
-``` - -``` -
- 213/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36946.5742 - -
-``` - -``` -
- 215/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36951.6953 - -
-``` - -``` -
- 217/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36956.8125 - -
-``` - -``` -
- 219/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36961.9336 - -
-``` - -``` -
- 221/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36967.0547 - -
-``` - -``` -
- 222/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36969.6133 - -
-``` - -``` -
- 224/469 ━━━━━━━━━━━━━━━━━━━━ 9s 38ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36974.7344 - -
-``` - -``` -
- 225/469 ━━━━━━━━━━━━━━━━━━━━ 26s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36977.2930 - -
-``` - -``` -
- 226/469 ━━━━━━━━━━━━━━━━━━━━ 26s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36979.8555 - -
-``` - -``` -
- 227/469 ━━━━━━━━━━━━━━━━━━━━ 26s 109ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36982.4141 - -
-``` - -``` -
- 228/469 ━━━━━━━━━━━━━━━━━━━━ 26s 109ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36984.9766 - -
-``` - -``` -
- 229/469 ━━━━━━━━━━━━━━━━━━━━ 27s 113ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36987.5352 - -
-``` - -``` -
- 230/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36990.0938 - -
-``` - -``` -
- 231/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36992.6562 - -
-``` - -``` -
- 232/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36995.2148 - -
-``` - -``` -
- 233/469 ━━━━━━━━━━━━━━━━━━━━ 26s 112ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 36997.7734 - -
-``` - -``` -
- 235/469 ━━━━━━━━━━━━━━━━━━━━ 26s 111ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37002.8945 - -
-``` - -``` -
- 236/469 ━━━━━━━━━━━━━━━━━━━━ 25s 111ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37005.4531 - -
-``` - -``` -
- 238/469 ━━━━━━━━━━━━━━━━━━━━ 25s 111ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37010.5742 - -
-``` - -``` -
- 240/469 ━━━━━━━━━━━━━━━━━━━━ 25s 110ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37015.6953 - -
-``` - -``` -
- 242/469 ━━━━━━━━━━━━━━━━━━━━ 24s 110ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37020.8164 - -
-``` - -``` -
- 244/469 ━━━━━━━━━━━━━━━━━━━━ 24s 109ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37025.9336 - -
-``` - -``` -
- 246/469 ━━━━━━━━━━━━━━━━━━━━ 24s 108ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37031.0547 - -
-``` - -``` -
- 248/469 ━━━━━━━━━━━━━━━━━━━━ 23s 108ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37036.1758 - -
-``` - -``` -
- 250/469 ━━━━━━━━━━━━━━━━━━━━ 23s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37041.2930 - -
-``` - -``` -
- 252/469 ━━━━━━━━━━━━━━━━━━━━ 23s 107ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37046.4141 - -
-``` - -``` -
- 254/469 ━━━━━━━━━━━━━━━━━━━━ 22s 106ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37051.5352 - -
-``` - -``` -
- 256/469 ━━━━━━━━━━━━━━━━━━━━ 22s 106ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37056.6562 - -
-``` - -``` -
- 258/469 ━━━━━━━━━━━━━━━━━━━━ 22s 105ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37061.7734 - -
-``` - -``` -
- 260/469 ━━━━━━━━━━━━━━━━━━━━ 21s 105ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37066.8945 - -
-``` - -``` -
- 262/469 ━━━━━━━━━━━━━━━━━━━━ 21s 104ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37072.0156 - -
-``` - -``` -
- 264/469 ━━━━━━━━━━━━━━━━━━━━ 21s 104ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37077.1328 - -
-``` - -``` -
- 266/469 ━━━━━━━━━━━━━━━━━━━━ 20s 103ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37082.2539 - -
-``` - -``` -
- 268/469 ━━━━━━━━━━━━━━━━━━━━ 20s 103ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37087.3750 - -
-``` - -``` -
- 270/469 ━━━━━━━━━━━━━━━━━━━━ 20s 102ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37092.4961 - -
-``` - -``` -
- 272/469 ━━━━━━━━━━━━━━━━━━━━ 19s 101ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37097.6133 - -
-``` - -``` -
- 274/469 ━━━━━━━━━━━━━━━━━━━━ 19s 101ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37102.7344 - -
-``` - -``` -
- 276/469 ━━━━━━━━━━━━━━━━━━━━ 19s 100ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37107.8555 - -
-``` - -``` -
- 279/469 ━━━━━━━━━━━━━━━━━━━━ 18s 100ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37115.5352 - -
-``` - -``` -
- 282/469 ━━━━━━━━━━━━━━━━━━━━ 18s 99ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37123.2148 - -
-``` - -``` -
- 285/469 ━━━━━━━━━━━━━━━━━━━━ 18s 98ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37130.8945 - -
-``` - -``` -
- 288/469 ━━━━━━━━━━━━━━━━━━━━ 17s 97ms/step - accuracy: 0.9921 - loss: 0.1000 - moe_loss: 37138.5742 - -
-``` - -``` -
- 291/469 ━━━━━━━━━━━━━━━━━━━━ 17s 96ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37146.2539 - -
-``` - -``` -
- 294/469 ━━━━━━━━━━━━━━━━━━━━ 16s 95ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37153.9336 - -
-``` - -``` -
- 297/469 ━━━━━━━━━━━━━━━━━━━━ 16s 95ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37161.6133 - -
-``` - -``` -
- 300/469 ━━━━━━━━━━━━━━━━━━━━ 15s 94ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37169.2930 - -
-``` - -``` -
- 304/469 ━━━━━━━━━━━━━━━━━━━━ 15s 93ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37179.5352 - -
-``` - -``` -
- 308/469 ━━━━━━━━━━━━━━━━━━━━ 14s 92ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37189.7734 - -
-``` - -``` -
- 312/469 ━━━━━━━━━━━━━━━━━━━━ 14s 91ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37200.0156 - -
-``` - -``` -
- 316/469 ━━━━━━━━━━━━━━━━━━━━ 13s 90ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37210.2539 - -
-``` - -``` -
- 320/469 ━━━━━━━━━━━━━━━━━━━━ 13s 89ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37220.4961 - -
-``` - -``` -
- 323/469 ━━━━━━━━━━━━━━━━━━━━ 12s 88ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37228.1758 - -
-``` - -``` -
- 326/469 ━━━━━━━━━━━━━━━━━━━━ 12s 88ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37235.8555 - -
-``` - -``` -
- 329/469 ━━━━━━━━━━━━━━━━━━━━ 12s 87ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37243.5352 - -
-``` - -``` -
- 332/469 ━━━━━━━━━━━━━━━━━━━━ 11s 86ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37251.2148 - -
-``` - -``` -
- 335/469 ━━━━━━━━━━━━━━━━━━━━ 11s 86ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37258.8945 - -
-``` - -``` -
- 338/469 ━━━━━━━━━━━━━━━━━━━━ 11s 85ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37266.5742 - -
-``` - -``` -
- 342/469 ━━━━━━━━━━━━━━━━━━━━ 10s 84ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37276.8164 - -
-``` - -``` -
- 346/469 ━━━━━━━━━━━━━━━━━━━━ 10s 84ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37287.0547 - -
-``` - -``` -
- 350/469 ━━━━━━━━━━━━━━━━━━━━ 9s 83ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37297.2969 - -
-``` - -``` -
- 353/469 ━━━━━━━━━━━━━━━━━━━━ 9s 82ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37304.9766 - -
-``` - -``` -
- 357/469 ━━━━━━━━━━━━━━━━━━━━ 9s 81ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37315.2148 - -
-``` - -``` -
- 360/469 ━━━━━━━━━━━━━━━━━━━━ 8s 81ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37322.8945 - -
-``` - -``` -
- 363/469 ━━━━━━━━━━━━━━━━━━━━ 8s 80ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37330.5742 - -
-``` - -``` -
- 367/469 ━━━━━━━━━━━━━━━━━━━━ 8s 80ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37340.8164 - -
-``` - -``` -
- 370/469 ━━━━━━━━━━━━━━━━━━━━ 7s 79ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37348.4961 - -
-``` - -``` -
- 373/469 ━━━━━━━━━━━━━━━━━━━━ 7s 79ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37356.1758 - -
-``` - -``` -
- 377/469 ━━━━━━━━━━━━━━━━━━━━ 7s 78ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37366.4141 - -
-``` - -``` -
- 381/469 ━━━━━━━━━━━━━━━━━━━━ 6s 77ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37376.6562 - -
-``` - -``` -
- 384/469 ━━━━━━━━━━━━━━━━━━━━ 6s 77ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37384.3359 - -
-``` - -``` -
- 387/469 ━━━━━━━━━━━━━━━━━━━━ 6s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37392.0156 - -
-``` - -``` -
- 390/469 ━━━━━━━━━━━━━━━━━━━━ 6s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37399.6953 - -
-``` - -``` -
- 392/469 ━━━━━━━━━━━━━━━━━━━━ 5s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37404.8164 - -
-``` - -``` -
- 395/469 ━━━━━━━━━━━━━━━━━━━━ 5s 76ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37412.4961 - -
-``` - -``` -
- 398/469 ━━━━━━━━━━━━━━━━━━━━ 5s 75ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37420.1758 - -
-``` - -``` -
- 401/469 ━━━━━━━━━━━━━━━━━━━━ 5s 75ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37427.8555 - -
-``` - -``` -
- 404/469 ━━━━━━━━━━━━━━━━━━━━ 4s 74ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37435.5352 - -
-``` - -``` -
- 407/469 ━━━━━━━━━━━━━━━━━━━━ 4s 74ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37443.2148 - -
-``` - -``` -
- 411/469 ━━━━━━━━━━━━━━━━━━━━ 4s 73ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37453.4531 - -
-``` - -``` -
- 414/469 ━━━━━━━━━━━━━━━━━━━━ 4s 73ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37461.1367 - -
-``` - -``` -
- 417/469 ━━━━━━━━━━━━━━━━━━━━ 3s 73ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37468.8164 - -
-``` - -``` -
- 421/469 ━━━━━━━━━━━━━━━━━━━━ 3s 72ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37479.0547 - -
-``` - -``` -
- 425/469 ━━━━━━━━━━━━━━━━━━━━ 3s 72ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37489.2930 - -
-``` - -``` -
- 429/469 ━━━━━━━━━━━━━━━━━━━━ 2s 71ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37499.5352 - -
-``` - -``` -
- 433/469 ━━━━━━━━━━━━━━━━━━━━ 2s 70ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37509.7734 - -
-``` - -``` -
- 437/469 ━━━━━━━━━━━━━━━━━━━━ 2s 70ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37520.0156 - -
-``` - -``` -
- 440/469 ━━━━━━━━━━━━━━━━━━━━ 2s 70ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37527.6953 - -
-``` - -``` -
- 444/469 ━━━━━━━━━━━━━━━━━━━━ 1s 69ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37537.9336 - -
-``` - -``` -
- 448/469 ━━━━━━━━━━━━━━━━━━━━ 1s 69ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37548.1758 - -
-``` - -``` -
- 452/469 ━━━━━━━━━━━━━━━━━━━━ 1s 68ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37558.4141 - -
-``` - -``` -
- 454/469 ━━━━━━━━━━━━━━━━━━━━ 1s 68ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37563.5352 - -
-``` - -``` -
- 457/469 ━━━━━━━━━━━━━━━━━━━━ 0s 68ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37571.2148 - -
-``` - -``` -
- 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37581.4531 - -
-``` - -``` -
- 465/469 ━━━━━━━━━━━━━━━━━━━━ 0s 67ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37591.6953 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 0s 66ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37601.9336 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 32s 67ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 37604.4766 - val_loss: 0.1000 - val_moe_loss: 39198.7227 - - -
-``` -Epoch 15/20 - -``` -
- - 1/469 ━━━━━━━━━━━━━━━━━━━━ 11s 26ms/step - accuracy: 0.9922 - loss: 0.1000 - moe_loss: 39203.8633 - -
-``` - -``` -
- 5/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9954 - loss: 0.1000 - moe_loss: 39214.0938 - -
-``` - -``` -
- 9/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9945 - loss: 0.1000 - moe_loss: 39224.3320 - -
-``` - -``` -
- 13/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9936 - loss: 0.1000 - moe_loss: 39234.5742 - -
-``` - -``` -
- 17/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39244.8125 - -
-``` - -``` -
- 21/469 ━━━━━━━━━━━━━━━━━━━━ 7s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39255.0547 - -
-``` - -``` -
- 25/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39265.2930 - -
-``` - -``` -
- 29/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9931 - loss: 0.1000 - moe_loss: 39275.5352 - -
-``` - -``` -
- 33/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9931 - loss: 0.1000 - moe_loss: 39285.7773 - -
-``` - -``` -
- 37/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9931 - loss: 0.1000 - moe_loss: 39296.0156 - -
-``` - -``` -
- 41/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39306.2539 - -
-``` - -``` -
- 45/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39316.4961 - -
-``` - -``` -
- 49/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39326.7344 - -
-``` - -``` -
- 53/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39336.9766 - -
-``` - -``` -
- 57/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39347.2148 - -
-``` - -``` -
- 61/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39357.4531 - -
-``` - -``` -
- 65/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39367.6953 - -
-``` - -``` -
- 69/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39377.9336 - -
-``` - -``` -
- 72/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39385.6133 - -
-``` - -``` -
- 76/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39395.8555 - -
-``` - -``` -
- 80/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39406.0938 - -
-``` - -``` -
- 84/469 ━━━━━━━━━━━━━━━━━━━━ 6s 16ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39416.3320 - -
-``` - -``` -
- 88/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39426.5742 - -
-``` - -``` -
- 92/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39436.8125 - -
-``` - -``` -
- 95/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39444.4922 - -
-``` - -``` -
- 98/469 ━━━━━━━━━━━━━━━━━━━━ 5s 16ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39452.1758 - -
-``` - -``` -
- 101/469 ━━━━━━━━━━━━━━━━━━━━ 6s 17ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39459.8555 - -
-``` - -``` -
- 102/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39462.4141 - -
-``` - -``` -
- 103/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39464.9727 - -
-``` - -``` -
- 105/469 ━━━━━━━━━━━━━━━━━━━━ 6s 18ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39470.0938 - -
-``` - -``` -
- 108/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39477.7734 - -
-``` - -``` -
- 111/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39485.4531 - -
-``` - -``` -
- 114/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39493.1328 - -
-``` - -``` -
- 117/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9934 - loss: 0.1000 - moe_loss: 39500.8125 - -
-``` - -``` -
- 120/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39508.4922 - -
-``` - -``` -
- 123/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39516.1719 - -
-``` - -``` -
- 126/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39523.8555 - -
-``` - -``` -
- 129/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39531.5352 - -
-``` - -``` -
- 132/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39539.2148 - -
-``` - -``` -
- 135/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39546.8945 - -
-``` - -``` -
- 138/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39554.5742 - -
-``` - -``` -
- 141/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39562.2539 - -
-``` - -``` -
- 144/469 ━━━━━━━━━━━━━━━━━━━━ 6s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39569.9336 - -
-``` - -``` -
- 147/469 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39577.6133 - -
-``` - -``` -
- 150/469 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39585.2930 - -
-``` - -``` -
- 153/469 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39592.9727 - -
-``` - -``` -
- 156/469 ━━━━━━━━━━━━━━━━━━━━ 5s 19ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39600.6523 - -
-``` - -``` -
- 159/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39608.3320 - -
-``` - -``` -
- 162/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39616.0117 - -
-``` - -``` -
- 165/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39623.6914 - -
-``` - -``` -
- 168/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39631.3750 - -
-``` - -``` -
- 172/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39641.6133 - -
-``` - -``` -
- 175/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39649.2930 - -
-``` - -``` -
- 179/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39659.5352 - -
-``` - -``` -
- 183/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39669.7734 - -
-``` - -``` -
- 187/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39680.0117 - -
-``` - -``` -
- 190/469 ━━━━━━━━━━━━━━━━━━━━ 5s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39687.6953 - -
-``` - -``` -
- 194/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39697.9336 - -
-``` - -``` -
- 198/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39708.1719 - -
-``` - -``` -
- 201/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39715.8555 - -
-``` - -``` -
- 204/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39723.5352 - -
-``` - -``` -
- 208/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39733.7734 - -
-``` - -``` -
- 212/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39744.0156 - -
-``` - -``` -
- 216/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39754.2539 - -
-``` - -``` -
- 219/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39761.9336 - -
-``` - -``` -
- 223/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39772.1758 - -
-``` - -``` -
- 226/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39779.8555 - -
-``` - -``` -
- 230/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39790.0938 - -
-``` - -``` -
- 234/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39800.3359 - -
-``` - -``` -
- 238/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39810.5742 - -
-``` - -``` -
- 242/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39820.8125 - -
-``` - -``` -
- 245/469 ━━━━━━━━━━━━━━━━━━━━ 4s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39828.4922 - -
-``` - -``` -
- 248/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39836.1758 - -
-``` - -``` -
- 252/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39846.4141 - -
-``` - -``` -
- 256/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39856.6523 - -
-``` - -``` -
- 259/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39864.3320 - -
-``` - -``` -
- 262/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39872.0156 - -
-``` - -``` -
- 265/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39879.6953 - -
-``` - -``` -
- 269/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39889.9336 - -
-``` - -``` -
- 273/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39900.1758 - -
-``` - -``` -
- 277/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39910.4141 - -
-``` - -``` -
- 281/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9932 - loss: 0.1000 - moe_loss: 39920.6523 - -
-``` - -``` -
- 285/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39930.8945 - -
-``` - -``` -
- 289/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39941.1328 - -
-``` - -``` -
- 292/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39948.8125 - -
-``` - -``` -
- 296/469 ━━━━━━━━━━━━━━━━━━━━ 3s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39959.0547 - -
-``` - -``` -
- 299/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39966.7344 - -
-``` - -``` -
- 303/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39976.9727 - -
-``` - -``` -
- 307/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39987.2148 - -
-``` - -``` -
- 311/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 39997.4531 - -
-``` - -``` -
- 315/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40007.6953 - -
-``` - -``` -
- 319/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40017.9336 - -
-``` - -``` -
- 323/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40028.1758 - -
-``` - -``` -
- 327/469 ━━━━━━━━━━━━━━━━━━━━ 2s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40038.4141 - -
-``` - -``` -
- 331/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40048.6523 - -
-``` - -``` -
- 335/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40058.8945 - -
-``` - -``` -
- 339/469 ━━━━━━━━━━━━━━━━━━━━ 2s 17ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40069.1328 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9933 - loss: 0.1000 - moe_loss: 40404.4766 - val_loss: 0.1000 - val_moe_loss: 41998.7227 - - -
-``` -Epoch 16/20 - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9936 - loss: 0.1000 - moe_loss: 43204.4766 - val_loss: 0.1000 - val_moe_loss: 44798.7227 - - -
-``` -Epoch 17/20 - -``` -
- 443/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45935.3711 - -
-``` - -``` -
- 446/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45943.0547 - -
-``` - -``` -
- 449/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45950.7344 - -
-``` - -``` -
- 452/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45958.4141 - -
-``` - -``` -
- 455/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45966.0938 - -
-``` - -``` -
- 458/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45973.7734 - -
-``` - -``` -
- 461/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45981.4531 - -
-``` - -``` -
- 464/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45989.1328 - -
-``` - -``` -
- 467/469 ━━━━━━━━━━━━━━━━━━━━ 0s 17ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 45996.8125 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 8s 18ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 46004.4766 - val_loss: 0.1000 - val_moe_loss: 47598.7227 - - -
-``` -Epoch 18/20 - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 9s 19ms/step - accuracy: 0.9939 - loss: 0.1000 - moe_loss: 48804.4766 - val_loss: 0.1000 - val_moe_loss: 50398.7227 - - -
-``` -Epoch 19/20 - -``` -
-``` - -``` -
- 298/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51164.1719 - -
-``` - -``` -
- 301/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51171.8555 - -
-``` - -``` -
- 304/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51179.5352 - -
-``` - -``` -
- 307/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51187.2148 - -
-``` - -``` -
- 310/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51194.8945 - -
-``` - -``` -
- 313/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51202.5742 - -
-``` - -``` -
- 316/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51210.2539 - -
-``` - -``` -
- 319/469 ━━━━━━━━━━━━━━━━━━━━ 3s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51217.9336 - -
-``` - -``` -
- 322/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51225.6133 - -
-``` - -``` -
- 325/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51233.2930 - -
-``` - -``` -
- 328/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51240.9727 - -
-``` - -``` -
- 331/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51248.6523 - -
-``` - -``` -
- 334/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 51256.3320 - -
-``` - -``` -
- 337/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51264.0117 - -
-``` - -``` -
- 340/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51271.6953 - -
-``` - -``` -
- 343/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51279.3750 - -
-``` - -``` -
- 346/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51287.0547 - -
-``` - -``` -
- 349/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51294.7344 - -
-``` - -``` -
- 352/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51302.4141 - -
-``` - -``` -
- 355/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51310.0938 - -
-``` - -``` -
- 358/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51317.7734 - -
-``` - -``` -
- 361/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51325.4531 - -
-``` - -``` -
- 364/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51333.1328 - -
-``` - -``` -
- 367/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51340.8125 - -
-``` - -``` -
- 370/469 ━━━━━━━━━━━━━━━━━━━━ 2s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51348.4922 - -
-``` - -``` -
- 373/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51356.1719 - -
-``` - -``` -
- 376/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51363.8516 - -
-``` - -``` -
- 379/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51371.5352 - -
-``` - -``` -
- 382/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51379.2148 - -
-``` - -``` -
- 385/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51386.8945 - -
-``` - -``` -
- 388/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51394.5742 - -
-``` - -``` -
- 391/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51402.2539 - -
-``` - -``` -
- 394/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51409.9336 - -
-``` - -``` -
- 397/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51417.6133 - -
-``` - -``` -
- 400/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51425.2930 - -
-``` - -``` -
- 403/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51432.9727 - -
-``` - -``` -
- 406/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51440.6523 - -
-``` - -``` -
- 409/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51448.3320 - -
-``` - -``` -
- 412/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51456.0117 - -
-``` - -``` -
- 415/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51463.6914 - -
-``` - -``` -
- 418/469 ━━━━━━━━━━━━━━━━━━━━ 1s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51471.3711 - -
-``` - -``` -
- 421/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51479.0547 - -
-``` - -``` -
- 424/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51486.7344 - -
-``` - -``` -
- 427/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51494.4141 - -
-``` - -``` -
- 430/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51502.0938 - -
-``` - -``` -
- 433/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51509.7734 - -
-``` - -``` -
- 436/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51517.4531 - -
-``` - -``` -
- 439/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51525.1328 - -
-``` - -``` -
- 442/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51532.8125 - -
-``` - -``` -
- 445/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51540.4922 - -
-``` - -``` -
- 448/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51548.1719 - -
-``` - -``` -
- 451/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51555.8516 - -
-``` - -``` -
- 454/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51563.5312 - -
-``` - -``` -
- 457/469 ━━━━━━━━━━━━━━━━━━━━ 0s 20ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51571.2109 - -
-``` - -``` -
- 459/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51576.3320 - -
-``` - -``` -
- 460/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51578.8945 - -
-``` - -``` -
- 463/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51586.5742 - -
-``` - -``` -
- 466/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51594.2539 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 0s 21ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51601.9297 - -
-``` - -``` -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 10s 22ms/step - accuracy: 0.9943 - loss: 0.1000 - moe_loss: 51604.4766 - val_loss: 0.1000 - val_moe_loss: 53198.7227 - - -
-``` -Epoch 20/20 - -``` -
- - 1/469 ━━━━━━━━━━━━━━━━━━━━ 18s 40ms/step - accuracy: 0.9844 - loss: 0.1000 - moe_loss: 53203.8633 - -
- 469/469 ━━━━━━━━━━━━━━━━━━━━ 11s 24ms/step - accuracy: 0.9942 - loss: 0.1000 - moe_loss: 54404.4766 - val_loss: 0.1000 - val_moe_loss: 55998.7227 - - -### Evaluation - - -```python -score = model.evaluate(x_test, y_test, verbose=0) -print("Test loss:", score[0]) -print("Test accuracy:", score[1]) -``` - -
-``` -Test loss: tf.Tensor(0.10000026, shape=(), dtype=float32) -Test accuracy: {'accuracy': } - -``` -
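The conditional-computation idea at the heart of the layer can be sketched without any framework: softmax the gate logits, keep the `TOP_K` largest probabilities, and renormalize them before mixing the chosen experts' outputs. The helper below is a plain-Python illustration of what `tf.math.top_k` plus the weighted combination do inside the layer's `call` method; the names `softmax` and `top_k_gate` are ours, not part of the example code.

```python
import math


def softmax(logits):
    # Numerically stable softmax over a list of gate logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def top_k_gate(logits, k):
    # Pick the k experts with the largest gate probabilities and
    # renormalize their weights so they sum to 1.
    probs = softmax(logits)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    return top, [probs[i] / norm for i in top]


# One input routed over NUM_EXPERTS=5 experts with TOP_K=3, as in the example.
indices, weights = top_k_gate([0.1, 2.0, 0.3, 1.5, -0.2], k=3)
print(indices)       # which experts this input is routed to
print(sum(weights))  # renormalized mixing weights sum to 1
```

Only the selected experts would run a forward pass on this input; the rest are skipped entirely, which is where the compute savings come from.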
\ No newline at end of file +This approach is particularly valuable for large-scale models where computational efficiency +is crucial. The same principles demonstrated here are used in much larger language models +and other applications where model capacity needs to scale efficiently diff --git a/examples/vision/mnist_moe.py b/examples/vision/mnist_moe.py index 8b1911511d..bcec4edf87 100644 --- a/examples/vision/mnist_moe.py +++ b/examples/vision/mnist_moe.py @@ -3,7 +3,7 @@ Author: [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/) Date created: 2015/06/19 Last modified: 2020/04/21 -Description: Showcasing concepts relates to Mixture of Experts (MoE). +Description: Simple MoE implementation for MNIST classification. Accelerator: GPU """ @@ -17,11 +17,10 @@ At each forward pass, a gating network selects a subset of experts to apply to the input. The components to implement are: + - Gating network: A dense layer that outputs a probability distribution over the experts. - MoE layer: A layer that applies a different expert to each input in the batch. And a loss function that ensures specialization among the experts. - Model: A simple model that uses the MoE layer. - -In this example, we will first implement a linear MoE layer and then a CNN-based MoE layer. Lastly we will combine the two using an abstract implementation to showcase its capacties. 
""" """ @@ -67,7 +66,7 @@ NUM_EXPERTS = 5 TOP_K = 3 BATCH_SIZE = 128 -NUM_EPOCHS = 20 +NUM_EPOCHS = 12 LEARNING_RATE = 0.001 @@ -126,6 +125,7 @@ def __init__( mean=0.0, stddev=0.001 ), bias_initializer="zeros", + activation="softmax", ) self.num_experts = num_experts @@ -135,33 +135,50 @@ def __init__( tf.zeros((num_experts,), dtype=tf.float32) ) - def call(self, x): - # Get gating weights - gating_weights = self.gating_network(x) + def get_top_outputs(self, x, top_k_indices, top_k_weights): + batch_size = tf.shape(x)[0] + flat_indices = tf.reshape(top_k_indices, [-1]) + repeated_x = tf.repeat(x, repeats=self.top_k, axis=0) + + # Compute outputs for unique experts + unique_expert_ids = tf.unique(flat_indices)[0] # Get unique expert indices + expert_outputs_dict = {} + for idx in unique_expert_ids: + mask = tf.equal(flat_indices, idx) + selected_inputs = tf.boolean_mask(repeated_x, mask) + expert_outputs_dict[idx.numpy()] = self.experts[idx](selected_inputs) + + # Gather outputs back into the correct shape + output_size = self.experts[0].compute_output_shape(input_shape=(None, 10))[-1] + flat_outputs = tf.zeros( + [batch_size * self.top_k, output_size], dtype=tf.float32 + ) + for idx in unique_expert_ids: + mask = tf.equal(flat_indices, idx) + indices = tf.where(mask) + flat_outputs = tf.tensor_scatter_nd_update( + flat_outputs, indices, expert_outputs_dict[idx.numpy()] + ) + top_k_expert_outputs = tf.reshape( + flat_outputs, [batch_size, self.top_k, output_size] + ) - # Get the top k experts based on the gating weights - top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + # Combine outputs using top-k weights + return tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights) - # Count usage of each expert symbolically - updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - # Use tf.tensor_scatter_nd_add to increment the usage count + def update_usage_counts(self, indices): + updates = 
tf.ones_like(tf.reshape(indices, [-1]), dtype=tf.float32) self.expert_usage_count.assign( tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates + self.expert_usage_count, tf.reshape(indices, [-1, 1]), updates ) ) - # Get outputs from only the top-k experts - top_k_expert_outputs = tf.stack( - [ - self.experts[expert_index](x) - for expert_index in top_k_indices.numpy()[0] - ], - axis=1, - ) # Stack outputs along axis 1 - - # Combine outputs using top-k weights - combined_output = tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights) + def call(self, x): + gating_weights = self.gating_network(x) + top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + combined_output = self.get_top_outputs(x, top_k_indices, top_k_weights) + self.update_usage_counts(top_k_indices) return combined_output @@ -176,9 +193,12 @@ def call(self, x): """ ## Routing Collapse -Routing collapse is a problem that occurs with MoE layers. The route terminology refers to the selection process of which expert to use for a given input. +One common challenge with MoE architectures is "routing collapse". The "route" refers to the process of selecting which expert handles a given input; routing collapse occurs when the model falls into a pattern of only using a small subset of experts. This happens because: -Route collapse happens when a routing model, early in training, starts favoring just a few experts because they perform slightly better due to random starting conditions. This leads to most examples being sent to these experts, leaving others unused and reducing the model’s overall capacity. +1. Early in training, some experts may perform slightly better by chance +2. These better-performing experts get selected more frequently +3. As they receive more training signal, these experts improve further, creating a feedback loop +4. 
Other experts become neglected and never improve Code below demonstrates the randomness of expert selection: """ @@ -196,14 +216,24 @@ def check_expert_usage(runs): check_expert_usage(4) """ -### Adding loss functions to prevent route collapse -To fix this, the authors use extra rules (importance and load losses), ideas borrowed from [Shazeer et al.](https://arxiv.org/abs/1701.06538), to ensure all experts get used evenly. +### Load Balancing Solutions -The importance_loss calculates how much the usage of each expert (tracked in batch_importance_sum) deviates from the average usage (mean_importance) by using mean squared error, aiming to balance expert utilization. This helps prevent route collapse by discouraging the model from overloading a few experts, instead promoting an even distribution of examples across all experts to maintain diverse and effective routing. +To prevent routing collapse, we implement three types of losses that were introduced in various MoE research: -#### Load losses - - Diversity loss: Diversity loss helps prevent route collapse by encouraging the routing model to evenly distribute examples across all experts, rather than favoring just a few due to their initial performance. It does this by maximizing the entropy of the gating weights, ensuring balanced expert utilization and improving the model's overall capacity. - - Overflow loss: The batch_overflow_sum measures how much the usage of experts exceeds a set capacity by applying ReLU to the difference between usage_counts (how many examples each expert handles) and batch_capacity (the allowed limit), then summing the excesses. This helps prevent route collapse by penalizing situations where certain experts are overused, encouraging a more even spread of examples across all experts to keep the model's capacity balanced. +1. 
Diversity Loss: Encourages the gating network to use all experts by maximizing the entropy + of expert selection probabilities + [Shazeer et al., "Outrageously Large Neural Networks" (2017)](https://arxiv.org/abs/1701.06538) + +2. Importance Loss: Ensures each expert handles a similar total amount of input across the batch + by penalizing deviations from the mean usage + [Lepikhin et al., "GShard: Scaling Giant Models with Conditional Computation" (2020)](https://arxiv.org/abs/2006.16668) + +3. Overflow Loss: Prevents individual experts from being overloaded by penalizing usage above + a specified capacity threshold + [Fedus et al., "Switch Transformers" (2021)](https://arxiv.org/abs/2101.03961) + +These losses are combined with the main classification loss during training to ensure balanced expert utilization. +The combination of these techniques has proven effective in large-scale models like GShard and Switch Transformers. """ @@ -234,6 +264,7 @@ def __init__( mean=0.0, stddev=0.001 ), bias_initializer="zeros", + activation="softmax", ) self.num_experts = num_experts @@ -259,60 +290,52 @@ def _importance_loss(self, gating_weights): ) ) - def call(self, x): - # Get gating weights and normalize - gating_weights = self.gating_network(x) - gating_weights = K.softmax(gating_weights) # Ensure weights are probabilities - self._diversity_loss(gating_weights) - self._importance_loss(gating_weights) - - # Get the top k experts based on the gating weights + # Replace the current get_top_outputs method with this vectorized version + def get_top_outputs( + self, x, gating_weights + ): # Changed to take gating_weights directly + """Compute outputs from top-k experts.""" top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) - # Count usage of each expert symbolically - updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - # Use tf.tensor_scatter_nd_add to increment the usage count - self.expert_usage_count.assign( - 
tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(top_k_indices, [-1, 1]), updates - ) - ) - - # Calculate overflow using updated usage count - self.batch_overflow_sum = K.sum( - K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) - ) - - # Compute all expert outputs - expert_outputs = tf.stack( - [expert(x) for expert in self.experts], axis=1 - ) # Shape: (batch_size, num_experts, hidden_size) + # Store indices and updates for usage count + self.indices = tf.reshape(top_k_indices, [-1, 1]) + self.updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - # Gather the top-k expert outputs using top_k_indices + # Compute expert outputs symbolically + expert_outputs = tf.stack([expert(x) for expert in self.experts], axis=1) batch_size = tf.shape(x)[0] - batch_indices = tf.expand_dims( - tf.range(batch_size), 1 - ) # Shape: (batch_size, 1) - batch_indices = tf.tile( - batch_indices, [1, self.top_k] - ) # Shape: (batch_size, top_k) - - # Create indices for gathering - indices = tf.stack( - [batch_indices, top_k_indices], axis=2 - ) # Shape: (batch_size, top_k, 2) - top_k_expert_outputs = tf.gather_nd( - expert_outputs, indices - ) # Shape: (batch_size, top_k, hidden_size) + batch_indices = tf.tile(tf.range(batch_size)[:, tf.newaxis], [1, self.top_k]) + gather_indices = tf.stack([batch_indices, top_k_indices], axis=-1) + top_k_expert_outputs = tf.gather_nd(expert_outputs, gather_indices) - # Combine outputs using top-k weights combined_output = tf.reduce_sum( - top_k_expert_outputs * tf.expand_dims(top_k_weights, axis=-1), axis=1 + top_k_expert_outputs * top_k_weights[:, :, tf.newaxis], axis=1 + ) + return combined_output + + def update_usage_counts(self): + updates = tf.ones_like(tf.reshape(self.indices, [-1]), dtype=tf.float32) + self.expert_usage_count.assign( + tf.tensor_scatter_nd_add( + self.expert_usage_count, tf.reshape(self.indices, [-1, 1]), updates + ) ) + def call(self, x): + # Get gating weights and 
normalize + gating_weights = self.gating_network(x) + # top_k_weights, top_k_indices = tf.nn.top_k(gating_weights, k=self.top_k) + combined_output = self.get_top_outputs(x, gating_weights) + self.update_usage_counts() + self._diversity_loss(gating_weights) + self._importance_loss(gating_weights) + return combined_output def compute_total_loss(self, load_balance_coef=0.01): + self.batch_overflow_sum = K.sum( + K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) + ) return load_balance_coef * ( self.diversity_loss + self.batch_overflow_sum + self.importance_loss ) @@ -324,7 +347,13 @@ def compute_total_loss(self, load_balance_coef=0.01): class MoEModel(keras.Model): - def __init__(self, input_shape, num_classes, num_experts=NUM_EXPERTS, top_k=TOP_K): + def __init__( + self, + num_classes, + num_experts=NUM_EXPERTS, + top_k=TOP_K, + moe_loss_considered=True, + ): super(MoEModel, self).__init__() # Define the convolutional block @@ -346,6 +375,7 @@ def __init__(self, input_shape, num_classes, num_experts=NUM_EXPERTS, top_k=TOP_ # Softmax layer self.softmax = layers.Softmax() + self.moe_loss_considered = moe_loss_considered def call(self, inputs, training=False): conv_flatten = self.conv_block(inputs) @@ -359,17 +389,21 @@ def train_step(self, data): with tf.GradientTape() as tape: y_pred = self(x, training=True) classification_loss = self.compute_loss(x, y, y_pred) - moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01) - total_loss = classification_loss + moe_loss + if self.moe_loss_considered: + moe_loss = self.moe_classifier.compute_total_loss( + load_balance_coef=0.01 + ) + total_loss = classification_loss + moe_loss + else: + total_loss = classification_loss # Compute gradients gradients = tape.gradient(total_loss, self.trainable_variables) # Update weights - self.optimizer.apply_gradients( - zip(gradients, self.trainable_variables) - ) # Update metrics (e.g., accuracy) - self.compiled_metrics.update_state(y, y_pred) + 
self.optimizer.apply_gradients(zip(gradients, self.trainable_variables)) + for metric in self.metrics: + metric.update_state(y, y_pred) # Return a dict of metrics for monitoring return { "loss": total_loss, @@ -384,7 +418,8 @@ def test_step(self, data): moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01) total_loss = classification_loss + moe_loss - self.compiled_metrics.update_state(y, y_pred) + for metric in self.metrics: + metric.update_state(y, y_pred) return { "loss": total_loss, "moe_loss": moe_loss, @@ -394,9 +429,7 @@ def test_step(self, data): # Instantiate and compile the model inputs = keras.Input(shape=input_shape) -model = MoEModel( - input_shape=input_shape, num_classes=num_classes, num_experts=6, top_k=4 -) +model = MoEModel(num_classes=num_classes, num_experts=5, top_k=3) model.compile( optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE), @@ -413,6 +446,7 @@ def test_step(self, data): batch_size=BATCH_SIZE, epochs=NUM_EPOCHS, validation_data=(x_test, y_test), + verbose=0, ) """ @@ -422,3 +456,25 @@ def test_step(self, data): score = model.evaluate(x_test, y_test, verbose=0) print("Test loss:", score[0]) print("Test accuracy:", score[1]) + +""" +# Conclusion + +This example demonstrated how Mixture of Experts (MoE) can be used to increase model capacity without a proportional increase in computation cost. The key benefits are: + +1. Conditional Computation: Only a subset of experts (TOP_K=3 out of NUM_EXPERTS=5) process each input, + making the model more computationally efficient than a model that uses all parameters for every input. + +2. Specialized Processing: Each expert learns to handle different aspects of the input space, + allowing for more sophisticated processing without requiring a larger dense network. + +In our implementation, we: +1. Created a basic MoE layer using dense networks as experts +2. Implemented three types of load balancing losses to prevent routing collapse +3. 
Applied the MoE architecture to MNIST classification by replacing the final dense layer +4. Achieved comparable accuracy to the baseline model while using experts conditionally + +This approach is particularly valuable for large-scale models where computational efficiency +is crucial. The same principles demonstrated here are used in much larger language models +and other applications where model capacity needs to scale efficiently. +""" From 542f560fdd3ae0fd4a333f91f31e09e25c68c0c4 Mon Sep 17 00:00:00 2001 From: Damoon Date: Mon, 3 Mar 2025 14:44:15 +0300 Subject: [PATCH 4/5] Converted to keras 3 with torch backend --- examples/audio/vocal_track_separation.py | 673 ++++++++++++++++++++++ examples/vision/ipynb/mnist_moe.ipynb | 685 ----------------------- examples/vision/md/mnist_moe.md | 585 ------------------- examples/vision/mnist_moe.py | 279 +++++---- 4 files changed, 831 insertions(+), 1391 deletions(-) create mode 100644 examples/audio/vocal_track_separation.py delete mode 100644 examples/vision/ipynb/mnist_moe.ipynb delete mode 100644 examples/vision/md/mnist_moe.md diff --git a/examples/audio/vocal_track_separation.py b/examples/audio/vocal_track_separation.py new file mode 100644 index 0000000000..ca16c35ab1 --- /dev/null +++ b/examples/audio/vocal_track_separation.py @@ -0,0 +1,673 @@ +""" +Title: Vocal Track Separation with Encoder-Decoder Architecture +Author: [Joaquin Jimenez](https://github.com/johacks/) +Date created: 2024/12/10 +Last modified: 2024/12/10 +Description: Train a model to separate vocal tracks from music mixtures. +Accelerator: GPU +""" + +""" +## Introduction + +In this tutorial, we build a vocal track separation model using an encoder-decoder +architecture in Keras 3. + +We train the model on the [MUSDB18 dataset](https://doi.org/10.5281/zenodo.1117372), +which provides music mixtures and isolated tracks for drums, bass, other, and vocals. 
+ +Key concepts covered: + +- Audio data preprocessing using the Short-Time Fourier Transform (STFT). +- Audio data augmentation techniques. +- Implementing custom encoders and decoders specialized for audio data. +- Defining appropriate loss functions and metrics for audio source separation tasks. + +The model architecture is derived from the TFC_TDF_Net model described in: + +W. Choi, M. Kim, J. Chung, D. Lee, and S. Jung, “Investigating U-Nets with various +intermediate blocks for spectrogram-based singing voice separation,” in the 21st +International Society for Music Information Retrieval Conference, 2020. + +For reference code, see: +[GitHub: ws-choi/ISMIR2020_U_Nets_SVS](https://github.com/ws-choi/ISMIR2020_U_Nets_SVS). + +The data processing and model training routines are partly derived from: +[ZFTurbo/Music-Source-Separation-Training](https://github.com/ZFTurbo/Music-Source-Separation-Training/tree/main). +""" + +""" +## Setup + +Import and install all the required dependencies. +""" + +"""shell +pip install -qq audiomentations soundfile ffmpeg-binaries +pip install -qq "keras==3.7.0" +sudo -n apt-get install -y graphviz >/dev/null 2>&1 # Required for plotting the model +""" + +import glob +import os + +os.environ["KERAS_BACKEND"] = "jax" # or "tensorflow" or "torch" + +import random +import subprocess +import tempfile +import typing +from os import path + +import audiomentations as aug +import ffmpeg +import keras +import numpy as np +import soundfile as sf +from IPython import display +from keras import callbacks, layers, ops, saving +from matplotlib import pyplot as plt + +""" +## Configuration + +The following constants define configuration parameters for audio processing +and model training, including dataset paths, audio chunk sizes, Short-Time Fourier +Transform (STFT) parameters, and training hyperparameters. 
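As a quick sanity check on these settings, the chunk size and STFT parameters imply the durations and shapes used throughout the example. The sketch below is not part of the patch; it simply redoes the arithmetic with the same constant values, assuming the common no-padding framing convention (`1 + (n - n_fft) // hop` frames), which may differ slightly from the exact `keras.ops.stft` output length.

```python
# Re-deriving the "~4 seconds" comment and the STFT frame count from the
# configuration constants (values copied from the example above).
SAMPLE_RATE = 16000  # tracks are resampled to 16 kHz during conversion
CHUNK_SIZE = 65024  # amplitude samples per training chunk
STFT_N_FFT = 2048
STFT_HOP_LENGTH = 512

chunk_seconds = CHUNK_SIZE / SAMPLE_RATE
n_frames = 1 + (CHUNK_SIZE - STFT_N_FFT) // STFT_HOP_LENGTH  # no-padding framing
print(chunk_seconds, n_frames)  # ~4.06 seconds, 124 frames per chunk
```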
+""" + +# MUSDB18 dataset configuration +MUSDB_STREAMS = {"mixture": 0, "drums": 1, "bass": 2, "other": 3, "vocals": 4} +TARGET_INSTRUMENTS = {track: MUSDB_STREAMS[track] for track in ("vocals",)} +N_INSTRUMENTS = len(TARGET_INSTRUMENTS) +SOURCE_INSTRUMENTS = tuple(k for k in MUSDB_STREAMS if k != "mixture") + +# Audio preprocessing parameters for Short-Time Fourier Transform (STFT) +N_SUBBANDS = 4 # Number of subbands into which frequencies are split +CHUNK_SIZE = 65024 # Number of amplitude samples per audio chunk (~4 seconds) +STFT_N_FFT = 2048 # FFT points used in STFT +STFT_HOP_LENGTH = 512 # Hop length for STFT + +# Training hyperparameters +N_CHANNELS = 64 # Base channel count for the model +BATCH_SIZE = 3 +ACCUMULATION_STEPS = 2 +EFFECTIVE_BATCH_SIZE = BATCH_SIZE * (ACCUMULATION_STEPS or 1) + +# Paths +TMP_DIR = path.expanduser("~/.keras/tmp") +DATASET_DIR = path.expanduser("~/.keras/datasets") +MODEL_PATH = path.join(TMP_DIR, f"model_{keras.backend.backend()}.keras") +CSV_LOG_PATH = path.join(TMP_DIR, f"training_{keras.backend.backend()}.csv") +os.makedirs(DATASET_DIR, exist_ok=True) +os.makedirs(TMP_DIR, exist_ok=True) + +# Set random seed for reproducibility +keras.utils.set_random_seed(21) + +""" +## MUSDB18 Dataset + +The MUSDB18 dataset is a standard benchmark for music source separation, containing +150 full-length music tracks along with isolated drums, bass, other, and vocals. +The dataset is stored in .mp4 format, and each .mp4 file includes multiple audio +streams (mixture and individual tracks). + +### Download and Conversion + +The following utility function downloads MUSDB18 and converts its .mp4 files to +.wav files for each instrument track, resampled to 16 kHz. +""" + + +def download_musdb18(out_dir=None): + """Download and extract the MUSDB18 dataset, then convert .mp4 files to .wav files. + + MUSDB18 reference: + Rafii, Z., Liutkus, A., Stöter, F.-R., Mimilakis, S. I., & Bittner, R. (2017). 
+ MUSDB18 - a corpus for music separation (1.0.0) [Data set]. Zenodo. + """ + ffmpeg.init() + from ffmpeg import FFMPEG_PATH + + # Create output directories + os.makedirs((base := out_dir or tempfile.mkdtemp()), exist_ok=True) + if path.exists((out_dir := path.join(base, "musdb18_wav"))): + print("MUSDB18 dataset already downloaded") + return out_dir + + # Download and extract the dataset + download_dir = keras.utils.get_file( + fname="musdb18", + origin="https://zenodo.org/records/1117372/files/musdb18.zip", + extract=True, + ) + + # ffmpeg command template: input, stream index, output + ffmpeg_args = str(FFMPEG_PATH) + " -v error -i {} -map 0:{} -vn -ar 16000 {}" + + # Convert each mp4 file to multiple .wav files for each track + for split in ("train", "test"): + songs = os.listdir(path.join(download_dir, split)) + for i, song in enumerate(songs): + if i % 10 == 0: + print(f"{split.capitalize()}: {i}/{len(songs)} songs processed") + + mp4_path_orig = path.join(download_dir, split, song) + mp4_path = path.join(tempfile.mkdtemp(), split, song.replace(" ", "_")) + os.makedirs(path.dirname(mp4_path), exist_ok=True) + os.rename(mp4_path_orig, mp4_path) + + wav_dir = path.join(out_dir, split, path.basename(mp4_path).split(".")[0]) + os.makedirs(wav_dir, exist_ok=True) + + for track in SOURCE_INSTRUMENTS: + out_path = path.join(wav_dir, f"{track}.wav") + stream_index = MUSDB_STREAMS[track] + args = ffmpeg_args.format(mp4_path, stream_index, out_path).split() + assert subprocess.run(args).returncode == 0, "ffmpeg conversion failed" + return out_dir + + +# Download and prepare the MUSDB18 dataset +songs = download_musdb18(out_dir=DATASET_DIR) + +""" +### Custom Dataset + +We define a custom dataset class to generate random audio chunks and their corresponding +labels. The dataset does the following: + +1. Selects a random chunk from a random song and instrument. +2. Applies optional data augmentations. +3. Combines isolated tracks to form new synthetic mixtures. +4. 
Prepares features (mixtures) and labels (vocals) for training. + +This approach allows creating an effectively infinite variety of training examples +through randomization and augmentation. +""" + + +class Dataset(keras.utils.PyDataset): + def __init__( + self, + songs, + batch_size=BATCH_SIZE, + chunk_size=CHUNK_SIZE, + batches_per_epoch=1000 * ACCUMULATION_STEPS, + augmentation=True, + **kwargs, + ): + super().__init__(**kwargs) + self.augmentation = augmentation + self.vocals_augmentations = [ + aug.PitchShift(min_semitones=-5, max_semitones=5, p=0.1), + aug.SevenBandParametricEQ(-9, 9, p=0.25), + aug.TanhDistortion(0.1, 0.7, p=0.1), + ] + self.other_augmentations = [ + aug.PitchShift(p=0.1), + aug.AddGaussianNoise(p=0.1), + ] + self.songs = songs + self.sizes = {song: self.get_track_set_size(song) for song in self.songs} + self.batch_size = batch_size + self.chunk_size = chunk_size + self.batches_per_epoch = batches_per_epoch + + def get_track_set_size(self, song: str): + """Return the smallest track length in the given song directory.""" + sizes = [len(sf.read(p)[0]) for p in glob.glob(path.join(song, "*.wav"))] + if max(sizes) != min(sizes): + print(f"Warning: {song} has different track lengths") + return min(sizes) + + def random_chunk_of_instrument_type(self, instrument: str): + """Extract a random chunk for the specified instrument from a random song.""" + song, size = random.choice(list(self.sizes.items())) + track = path.join(song, f"{instrument}.wav") + + if self.chunk_size <= size: + start = np.random.randint(size - self.chunk_size + 1) + audio = sf.read(track, self.chunk_size, start, dtype="float32")[0] + audio_mono = np.mean(audio, axis=1) + else: + # If the track is shorter than chunk_size, pad the signal + audio_mono = np.mean(sf.read(track, dtype="float32")[0], axis=1) + audio_mono = np.pad(audio_mono, ((0, self.chunk_size - size),)) + + # If the chunk is almost silent, retry + if np.mean(np.abs(audio_mono)) < 0.01: + return 
self.random_chunk_of_instrument_type(instrument) + + return self.data_augmentation(audio_mono, instrument) + + def data_augmentation(self, audio: np.ndarray, instrument: str): + """Apply data augmentation to the audio chunk, if enabled.""" + + def coin_flip(x, probability: float, fn: typing.Callable): + return fn(x) if random.uniform(0, 1) < probability else x + + if self.augmentation: + augmentations = ( + self.vocals_augmentations + if instrument == "vocals" + else self.other_augmentations + ) + # Loudness augmentation + audio *= np.random.uniform(0.5, 1.5, (len(audio),)).astype("float32") + # Random reverse + audio = coin_flip(audio, 0.1, lambda x: np.flip(x)) + # Random polarity inversion + audio = coin_flip(audio, 0.5, lambda x: -x) + # Apply selected augmentations + for aug_ in augmentations: + aug_.randomize_parameters(audio, sample_rate=16000) + audio = aug_(audio, sample_rate=16000) + return audio + + def random_mix_of_tracks(self) -> dict: + """Create a random mix of instruments by summing their individual chunks.""" + tracks = {} + for instrument in SOURCE_INSTRUMENTS: + # Start with a single random chunk + mixup = [self.random_chunk_of_instrument_type(instrument)] + + # Randomly add more chunks of the same instrument (mixup augmentation) + if self.augmentation: + for p in (0.2, 0.02): + if random.uniform(0, 1) < p: + mixup.append(self.random_chunk_of_instrument_type(instrument)) + + tracks[instrument] = np.mean(mixup, axis=0, dtype="float32") + return tracks + + def __len__(self): + return self.batches_per_epoch + + def __getitem__(self, idx): + # Generate a batch of random mixtures + batch = [self.random_mix_of_tracks() for _ in range(self.batch_size)] + + # Features: sum of all tracks + batch_x = ops.sum( + np.array([list(track_set.values()) for track_set in batch]), axis=1 + ) + + # Labels: isolated target instruments (e.g., vocals) + batch_y = np.array( + [[track_set[t] for t in TARGET_INSTRUMENTS] for track_set in batch] + ) + + return batch_x, 
ops.convert_to_tensor(batch_y) + + +# Create train and validation datasets +train_ds = Dataset(glob.glob(path.join(songs, "train", "*"))) +val_ds = Dataset( + glob.glob(path.join(songs, "test", "*")), + batches_per_epoch=int(0.1 * train_ds.batches_per_epoch), + augmentation=False, +) + +""" +### Visualize a Sample + +Let's visualize a random mixed audio chunk and its corresponding isolated vocals. +This helps to understand the nature of the preprocessed input data. +""" + + +def visualize_audio_np(audio: np.ndarray, rate=16000, name="mixup"): + """Plot and display an audio waveform and also produce an Audio widget.""" + plt.figure(figsize=(10, 6)) + plt.plot(audio) + plt.title(f"Waveform: {name}") + plt.xlim(0, len(audio)) + plt.ylabel("Amplitude") + plt.show() + # plt.savefig(f"tmp/{name}.png") + + # Normalize and display audio + audio_norm = (audio - np.min(audio)) / (np.max(audio) - np.min(audio) + 1e-8) + audio_norm = (audio_norm * 2 - 1) * 0.6 + display.display(display.Audio(audio_norm, rate=rate)) + # sf.write(f"tmp/{name}.wav", audio_norm, rate) + + +sample_batch_x, sample_batch_y = val_ds[None] # Random batch +visualize_audio_np(ops.convert_to_numpy(sample_batch_x[0])) +visualize_audio_np(ops.convert_to_numpy(sample_batch_y[0, 0]), name="vocals") + +""" +## Model + +### Preprocessing + +The model operates on STFT representations rather than raw audio. We define a +preprocessing model to compute STFT and a corresponding inverse transform (iSTFT). 
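Before looking at the helpers, it may help to see the layout they produce. The NumPy-only sketch below is a simplified stand-in for `ops.stft` (no window function, no padding, and the `stft_sketch` name is ours), meant only to show how framing plus a per-frame FFT yields a `(frames, freq_bins, 2)` tensor with the real and imaginary parts stacked as channels — the same structure the model consumes after dropping the last frequency bin.

```python
import numpy as np


def stft_sketch(signal, n_fft=2048, hop=512):
    # Slice the signal into overlapping frames, FFT each frame,
    # and stack real/imaginary parts as a trailing channel axis.
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop : i * hop + n_fft] for i in range(n_frames)])
    spec = np.fft.rfft(frames, axis=-1)  # shape: (n_frames, n_fft // 2 + 1)
    return np.stack([spec.real, spec.imag], axis=-1)


x = np.random.randn(65024).astype("float32")
print(stft_sketch(x).shape)  # (124, 1025, 2)
```

The numbers will not match the Keras transform bin-for-bin (windowing and padding differ), but the shape bookkeeping is the same.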
+""" + + +def stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): + """Compute the STFT for the input audio and return the real and imaginary parts.""" + real_x, imag_x = ops.stft(inputs, fft_size, sequence_stride, fft_size) + real_x, imag_x = ops.expand_dims(real_x, -1), ops.expand_dims(imag_x, -1) + x = ops.concatenate((real_x, imag_x), axis=-1) + + # Drop last freq sample for convenience + return ops.split(x, [x.shape[2] - 1], axis=2)[0] + + +def inverse_stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): + """Compute the inverse STFT for the given STFT input.""" + x = inputs + + # Pad back dropped freq sample if using torch backend + if keras.backend.backend() == "torch": + x = ops.pad(x, ((0, 0), (0, 0), (0, 1), (0, 0))) + + real_x, imag_x = ops.split(x, 2, axis=-1) + real_x = ops.squeeze(real_x, axis=-1) + imag_x = ops.squeeze(imag_x, axis=-1) + + return ops.istft((real_x, imag_x), fft_size, sequence_stride, fft_size) + + +""" +### Model Architecture + +The model uses a custom encoder-decoder architecture with Time-Frequency Convolution +(TFC) and Time-Distributed Fully Connected (TDF) blocks. They are grouped into a +`TimeFrequencyTransformBlock`, i.e. "TFC_TDF" in the original paper by Choi et al. + +We then define an encoder-decoder network with multiple scales. Each encoder scale +applies TFC_TDF blocks followed by downsampling, while decoder scales apply TFC_TDF +blocks over the concatenation of upsampled features and associated encoder outputs. +""" + + +@saving.register_keras_serializable() +class TimeDistributedDenseBlock(layers.Layer): + """Time-Distributed Fully Connected layer block. + + Applies frequency-wise dense transformations across time frames with instance + normalization and GELU activation. 
+ """ + + def __init__(self, bottleneck_factor, fft_dim, **kwargs): + super().__init__(**kwargs) + self.fft_dim = fft_dim + self.hidden_dim = fft_dim // bottleneck_factor + + def build(self, *_): + self.group_norm_1 = layers.GroupNormalization(groups=-1) + self.group_norm_2 = layers.GroupNormalization(groups=-1) + self.dense_1 = layers.Dense(self.hidden_dim, use_bias=False) + self.dense_2 = layers.Dense(self.fft_dim, use_bias=False) + + def call(self, x): + # Apply normalization and dense layers frequency-wise + x = ops.gelu(self.group_norm_1(x)) + x = ops.swapaxes(x, -1, -2) + x = self.dense_1(x) + + x = ops.gelu(self.group_norm_2(ops.swapaxes(x, -1, -2))) + x = ops.swapaxes(x, -1, -2) + x = self.dense_2(x) + return ops.swapaxes(x, -1, -2) + + +@saving.register_keras_serializable() +class TimeFrequencyConvolution(layers.Layer): + """Time-Frequency Convolutional layer. + + Applies a 2D convolution over time-frequency representations and applies instance + normalization and GELU activation. + """ + + def __init__(self, channels, **kwargs): + super().__init__(**kwargs) + self.channels = channels + + def build(self, *_): + self.group_norm = layers.GroupNormalization(groups=-1) + self.conv = layers.Conv2D(self.channels, 3, padding="same", use_bias=False) + + def call(self, x): + return self.conv(ops.gelu(self.group_norm(x))) + + +@saving.register_keras_serializable() +class TimeFrequencyTransformBlock(layers.Layer): + """Implements TFC_TDF block for encoder-decoder architecture. + + Repeatedly apply Time-Frequency Convolution and Time-Distributed Dense blocks as + many times as specified by the `length` parameter. 
+ """ + + def __init__( + self, channels, length, fft_dim, bottleneck_factor, in_channels=None, **kwargs + ): + super().__init__(**kwargs) + self.channels = channels + self.length = length + self.fft_dim = fft_dim + self.bottleneck_factor = bottleneck_factor + self.in_channels = in_channels or channels + + def build(self, *_): + self.blocks = [] + # Add blocks in a flat list to avoid nested structures + for i in range(self.length): + in_channels = self.channels if i > 0 else self.in_channels + self.blocks.append(TimeFrequencyConvolution(in_channels)) + self.blocks.append( + TimeDistributedDenseBlock(self.bottleneck_factor, self.fft_dim) + ) + self.blocks.append(TimeFrequencyConvolution(self.channels)) + # Residual connection + self.blocks.append(layers.Conv2D(self.channels, 1, 1, use_bias=False)) + + def call(self, inputs): + x = inputs + # Each block consists of 4 layers: + # 1. Time-Frequency Convolution + # 2. Time-Distributed Dense + # 3. Time-Frequency Convolution + # 4. Residual connection + for i in range(0, len(self.blocks), 4): + tfc_1 = self.blocks[i](x) + tdf = self.blocks[i + 1](x) + tfc_2 = self.blocks[i + 2](tfc_1 + tdf) + x = tfc_2 + self.blocks[i + 3](x) # Residual connection + return x + + +@saving.register_keras_serializable() +class Downscale(layers.Layer): + """Downscale time-frequency dimensions using a convolution.""" + + conv_cls = layers.Conv2D + + def __init__(self, channels, scale, **kwargs): + super().__init__(**kwargs) + self.channels = channels + self.scale = scale + + def build(self, *_): + self.conv = self.conv_cls(self.channels, self.scale, self.scale, use_bias=False) + self.norm = layers.GroupNormalization(groups=-1) + + def call(self, inputs): + return self.norm(ops.gelu(self.conv(inputs))) + + +@saving.register_keras_serializable() +class Upscale(Downscale): + """Upscale time-frequency dimensions using a transposed convolution.""" + + conv_cls = layers.Conv2DTranspose + + +def build_model( + inputs, + n_instruments=N_INSTRUMENTS, 
+ n_subbands=N_SUBBANDS, + channels=N_CHANNELS, + fft_dim=(STFT_N_FFT // 2) // N_SUBBANDS, + n_scales=4, + scale=(2, 2), + block_size=2, + growth=128, + bottleneck_factor=2, + **kwargs, +): + """Build the TFC_TDF encoder-decoder model for source separation.""" + # Compute STFT + x = stft(inputs) + + # Split mixture into subbands as separate channels + mix = ops.reshape(x, (-1, x.shape[1], x.shape[2] // n_subbands, 2 * n_subbands)) + first_conv_out = layers.Conv2D(channels, 1, 1, use_bias=False)(mix) + x = first_conv_out + + # Encoder path + encoder_outs = [] + for _ in range(n_scales): + x = TimeFrequencyTransformBlock( + channels, block_size, fft_dim, bottleneck_factor + )(x) + encoder_outs.append(x) + fft_dim, channels = fft_dim // scale[0], channels + growth + x = Downscale(channels, scale)(x) + + # Bottleneck + x = TimeFrequencyTransformBlock(channels, block_size, fft_dim, bottleneck_factor)(x) + + # Decoder path + for _ in range(n_scales): + fft_dim, channels = fft_dim * scale[0], channels - growth + x = ops.concatenate([Upscale(channels, scale)(x), encoder_outs.pop()], axis=-1) + x = TimeFrequencyTransformBlock( + channels, block_size, fft_dim, bottleneck_factor, in_channels=x.shape[-1] + )(x) + + # Residual connection and final convolutions + x = ops.concatenate([mix, x * first_conv_out], axis=-1) + x = layers.Conv2D(channels, 1, 1, use_bias=False, activation="gelu")(x) + x = layers.Conv2D(n_instruments * n_subbands * 2, 1, 1, use_bias=False)(x) + + # Reshape back to instrument-wise STFT + x = ops.reshape(x, (-1, x.shape[1], x.shape[2] * n_subbands, n_instruments, 2)) + x = ops.transpose(x, (0, 3, 1, 2, 4)) + x = ops.reshape(x, (-1, n_instruments, x.shape[2], x.shape[3] * 2)) + + return keras.Model(inputs=inputs, outputs=x, **kwargs) + + +""" +## Loss and Metrics + +We define: + +- `spectral_loss`: Mean absolute error in STFT domain. +- `sdr`: Signal-to-Distortion Ratio, a common source separation metric. 
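A small numeric example makes the SDR scale concrete. The `sdr_np` helper below is a hypothetical NumPy mirror of the metric defined next (10·log10 of target power over residual power, with an epsilon for stability); it is illustrative only. A prediction that is uniformly 10% low in amplitude leaves 1% of the power as residual, i.e. 20 dB.

```python
import numpy as np


def sdr_np(y_true, y_pred, eps=1e-8):
    # Signal-to-Distortion Ratio in dB: target power over residual power.
    num = np.sum(np.square(y_true)) + eps
    den = np.sum(np.square(y_true - y_pred)) + eps
    return 10 * np.log10(num / den)


y_true = np.ones(1000)
print(round(sdr_np(y_true, 0.9 * y_true), 2))  # 20.0 dB: residual power is 1% of target
```

Higher is better; for reference, published vocal-separation results on MUSDB18 are commonly in the single-digit dB range, since real mixtures are far harder than this toy case.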
+""" + + +def prediction_to_wave(x, n_instruments=N_INSTRUMENTS): + """Convert STFT predictions back to waveform.""" + x = ops.reshape(x, (-1, x.shape[2], x.shape[3] // 2, 2)) + x = inverse_stft(x) + return ops.reshape(x, (-1, n_instruments, x.shape[1])) + + +def target_to_stft(y): + """Convert target waveforms to their STFT representations.""" + y = ops.reshape(y, (-1, CHUNK_SIZE)) + y_real, y_imag = ops.stft(y, STFT_N_FFT, STFT_HOP_LENGTH, STFT_N_FFT) + y_real, y_imag = y_real[..., :-1], y_imag[..., :-1] + y = ops.stack([y_real, y_imag], axis=-1) + return ops.reshape(y, (-1, N_INSTRUMENTS, y.shape[1], y.shape[2] * 2)) + + +@saving.register_keras_serializable() +def sdr(y_true, y_pred): + """Signal-to-Distortion Ratio metric.""" + y_pred = prediction_to_wave(y_pred) + # Add epsilon for numerical stability + num = ops.sum(ops.square(y_true), axis=-1) + 1e-8 + den = ops.sum(ops.square(y_true - y_pred), axis=-1) + 1e-8 + return 10 * ops.log10(num / den) + + +@saving.register_keras_serializable() +def spectral_loss(y_true, y_pred): + """Mean absolute error in the STFT domain.""" + y_true = target_to_stft(y_true) + return ops.mean(ops.absolute(y_true - y_pred)) + + +""" +## Training + +### Visualize Model Architecture +""" + +# Load or create the model +if path.exists(MODEL_PATH): + model = saving.load_model(MODEL_PATH) +else: + model = build_model(keras.Input(sample_batch_x.shape[1:]), name="tfc_tdf_net") + +# Display the model architecture +model.summary() +img = keras.utils.plot_model(model, path.join(TMP_DIR, "model.png"), show_shapes=True) +display.display(img) + +""" +### Compile and Train the Model +""" + +# Compile the model +optimizer = keras.optimizers.Adam(5e-05, gradient_accumulation_steps=ACCUMULATION_STEPS) +model.compile(optimizer=optimizer, loss=spectral_loss, metrics=[sdr]) + +# Define callbacks +cbs = [ + callbacks.ModelCheckpoint(MODEL_PATH, "val_sdr", save_best_only=True, mode="max"), + callbacks.ReduceLROnPlateau(factor=0.95, patience=2), + 
callbacks.CSVLogger(CSV_LOG_PATH), +] + +if not path.exists(MODEL_PATH): + model.fit(train_ds, validation_data=val_ds, epochs=10, callbacks=cbs, shuffle=False) +else: + # Demonstration of a single epoch of training when model already exists + model.fit(train_ds, validation_data=val_ds, epochs=1, shuffle=False, verbose=2) + +""" +## Evaluation + +Evaluate the model on the validation dataset and visualize predicted vocals. +""" + +model.evaluate(val_ds, verbose=2) +y_pred = model.predict(sample_batch_x, verbose=2) +y_pred = prediction_to_wave(y_pred) +visualize_audio_np(ops.convert_to_numpy(y_pred[0, 0]), name="vocals_pred") + +""" +## Conclusion + +We built and trained a vocal track separation model using an encoder-decoder +architecture with custom blocks applied to the MUSDB18 dataset. We demonstrated +STFT-based preprocessing, data augmentation, and a source separation metric (SDR). + +**Next steps:** + +- Train for more epochs and refine hyperparameters. +- Separate multiple instruments simultaneously. +- Enhance the model to handle instruments not present in the mixture. +""" \ No newline at end of file diff --git a/examples/vision/ipynb/mnist_moe.ipynb b/examples/vision/ipynb/mnist_moe.ipynb deleted file mode 100644 index 931cb764c6..0000000000 --- a/examples/vision/ipynb/mnist_moe.ipynb +++ /dev/null @@ -1,685 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "# MoE for MNIST\n", - "\n", - "**Author:** [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/)
\n", - "**Date created:** 2015/06/19
\n", - "**Last modified:** 2020/04/21
\n", - "**Description:** Simple MoE implementation for MNIST classification." - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "# Introduction\n", - "\n", - "In this example, we implement an adaptation of the Mixture of Experts (MoE) architecture\n", - "([Shazeer et al.](https://arxiv.org/abs/1701.06538)).\n", - "The idea is to use conditional computation to increases model capacity without increasing computation.\n", - "Experts are identical blocks within a layer where each are trained to specialize in different parts of the input space.\n", - "At each forward pass, a gating network selects a subset of experts to apply to the input.\n", - "\n", - "The components to implement are:\n", - "\n", - "- Gating network: A dense layer that outputs a probability distribution over the experts.\n", - "- MoE layer: A layer that applies a different expert to each input in the batch. And a loss function that ensures specialization among the experts.\n", - "- Model: A simple model that uses the MoE layer." 
- ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "## Imports" - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "import numpy as np\n", - "import keras\n", - "from keras import layers, models\n", - "import tensorflow as tf\n", - "from tensorflow.keras import backend as K" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "### Data Prepration" - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "# Model / data parameters\n", - "num_classes = 10\n", - "input_shape = (28, 28, 1)\n", - "\n", - "# Load the data and split it between train and test sets\n", - "(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()\n", - "\n", - "# Scale images to the [0, 1] range\n", - "x_train = x_train.astype(\"float32\") / 255\n", - "x_test = x_test.astype(\"float32\") / 255\n", - "# Make sure images have shape (28, 28, 1)\n", - "x_train = np.expand_dims(x_train, -1)\n", - "x_test = np.expand_dims(x_test, -1)\n", - "print(\"x_train shape:\", x_train.shape)\n", - "print(x_train.shape[0], \"train samples\")\n", - "print(x_test.shape[0], \"test samples\")\n", - "\n", - "\n", - "# convert class vectors to binary class matrices\n", - "y_train = keras.utils.to_categorical(y_train, num_classes)\n", - "y_test = keras.utils.to_categorical(y_test, num_classes)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "## Constants" - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "NUM_EXPERTS = 5\n", - "TOP_K = 3\n", - "BATCH_SIZE = 128\n", - "NUM_EPOCHS = 12\n", - "LEARNING_RATE = 0.001\n", - "" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - 
"source": [ - "## Base architecture\n", - "\n", - "The most basic [MNIST classifier](https://keras.io/examples/vision/mnist_convnet/) consists of a stack of convolutional layers followed by a dense layer. In this tutorial, we will first replace the dense layer with a MoE layer. Then do the same for convolutional layers." - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "model = keras.Sequential(\n", - " [\n", - " keras.Input(shape=input_shape),\n", - " layers.Conv2D(32, kernel_size=(3, 3), activation=\"relu\"),\n", - " layers.MaxPooling2D(pool_size=(2, 2)),\n", - " layers.Conv2D(64, kernel_size=(3, 3), activation=\"relu\"),\n", - " layers.MaxPooling2D(pool_size=(2, 2)),\n", - " layers.Flatten(),\n", - " layers.Dropout(0.5),\n", - " layers.Dense(num_classes, activation=\"softmax\"),\n", - " ]\n", - ")\n", - "\n", - "model.summary()" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "# Linear MoE using Dense layers\n", - "\n", - "For this layer, we will create multiple dense layers that will be used as experts. Then a simple gating network will select at each step which exerts should be utilized for the current input. We will keep track of the number of times each expert is used. Then the selected experts will be combined using a weighted sum." 
- ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "\n", - "class LinearMoE(layers.Layer):\n", - " def __init__(\n", - " self,\n", - " hidden_size,\n", - " num_experts=NUM_EXPERTS,\n", - " top_k=TOP_K,\n", - " ):\n", - " super(LinearMoE, self).__init__()\n", - "\n", - " # Initialize experts\n", - " self.experts = [\n", - " layers.Dense(\n", - " hidden_size,\n", - " kernel_initializer=tf.keras.initializers.RandomNormal(\n", - " mean=0.0, stddev=0.001\n", - " ),\n", - " bias_initializer=\"zeros\",\n", - " )\n", - " for _ in range(num_experts)\n", - " ]\n", - " # Initialize gating network\n", - " self.gating_network = layers.Dense(\n", - " NUM_EXPERTS,\n", - " kernel_initializer=tf.keras.initializers.RandomNormal(\n", - " mean=0.0, stddev=0.001\n", - " ),\n", - " bias_initializer=\"zeros\",\n", - " activation=\"softmax\",\n", - " )\n", - "\n", - " self.num_experts = num_experts\n", - " self.top_k = top_k\n", - " # Keep track of how many times each expert is used\n", - " self.expert_usage_count = tf.Variable(\n", - " tf.zeros((num_experts,), dtype=tf.float32)\n", - " )\n", - "\n", - " def get_top_outputs(self, x, top_k_indices, top_k_weights):\n", - " batch_size = tf.shape(x)[0]\n", - " flat_indices = tf.reshape(top_k_indices, [-1])\n", - " repeated_x = tf.repeat(x, repeats=self.top_k, axis=0)\n", - "\n", - " # Compute outputs for unique experts\n", - " unique_expert_ids = tf.unique(flat_indices)[0] # Get unique expert indices\n", - " expert_outputs_dict = {}\n", - " for idx in unique_expert_ids:\n", - " mask = tf.equal(flat_indices, idx)\n", - " selected_inputs = tf.boolean_mask(repeated_x, mask)\n", - " expert_outputs_dict[idx.numpy()] = self.experts[idx](selected_inputs)\n", - "\n", - " # Gather outputs back into the correct shape\n", - " output_size = self.experts[0].compute_output_shape(input_shape=(None, 10))[-1]\n", - " flat_outputs = tf.zeros(\n", - " [batch_size * self.top_k, 
output_size], dtype=tf.float32\n", - " )\n", - " for idx in unique_expert_ids:\n", - " mask = tf.equal(flat_indices, idx)\n", - " indices = tf.where(mask)\n", - " flat_outputs = tf.tensor_scatter_nd_update(\n", - " flat_outputs, indices, expert_outputs_dict[idx.numpy()]\n", - " )\n", - " top_k_expert_outputs = tf.reshape(\n", - " flat_outputs, [batch_size, self.top_k, output_size]\n", - " )\n", - "\n", - " # Combine outputs using top-k weights\n", - " return tf.einsum(\"ijk,ij->ik\", top_k_expert_outputs, top_k_weights)\n", - "\n", - " def update_usage_counts(self, indices):\n", - " updates = tf.ones_like(tf.reshape(indices, [-1]), dtype=tf.float32)\n", - " self.expert_usage_count.assign(\n", - " tf.tensor_scatter_nd_add(\n", - " self.expert_usage_count, tf.reshape(indices, [-1, 1]), updates\n", - " )\n", - " )\n", - "\n", - " def call(self, x):\n", - " gating_weights = self.gating_network(x)\n", - " top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k)\n", - " combined_output = self.get_top_outputs(x, top_k_indices, top_k_weights)\n", - " self.update_usage_counts(top_k_indices)\n", - "\n", - " return combined_output\n", - "" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "Output of the top 3 experts out of 10 for one layer of MoE:" - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "sample_data = tf.random.uniform((1, 10))\n", - "linear_mode = LinearMoE(32, 10, 3)\n", - "linear_mode(sample_data)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "## Routing Collapse\n", - "\n", - "One common challenge with MoE architectures is \"routing collapse\". The \"route\" refers to the selection process of which expert to use for a given input where the model falls into a pattern of only using a small subset of experts. This happens because:\n", - "\n", - "1. 
Early in training, some experts may perform slightly better by chance\n",
-    "2. These better-performing experts get selected more frequently\n",
-    "3. With more practice, these experts improve further, creating a feedback loop\n",
-    "4. Other experts become neglected and never improve\n",
-    "\n",
-    "The code below demonstrates the randomness of expert selection:"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 0,
-   "metadata": {
-    "colab_type": "code"
-   },
-   "outputs": [],
-   "source": [
-    "\n",
-    "def check_expert_usage(runs):\n",
-    "    # Running the layer multiple times to show the randomness of expert selection\n",
-    "    for i in range(runs):\n",
-    "        sample_data = tf.random.uniform((1, 10))\n",
-    "        linear_mode = LinearMoE(10, 5)\n",
-    "        _ = linear_mode(sample_data)\n",
-    "        print(f\"Run {i}, Expert usage: {linear_mode.expert_usage_count.numpy()}\")\n",
-    "\n",
-    "\n",
-    "check_expert_usage(4)"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {
-    "colab_type": "text"
-   },
-   "source": [
-    "### Load Balancing Solutions\n",
-    "\n",
-    "To prevent routing collapse, we implement three types of losses that were introduced in various MoE research:\n",
-    "\n",
-    "1. Diversity Loss: Encourages the gating network to use all experts by maximizing the entropy\n",
-    "   of expert selection probabilities\n",
-    "   [Shazeer et al., \"Outrageously Large Neural Networks\" (2017)](https://arxiv.org/abs/1701.06538)\n",
-    "\n",
-    "2. Importance Loss: Ensures each expert handles a similar total amount of input across the batch\n",
-    "   by penalizing deviations from the mean usage\n",
-    "   [Lepikhin et al., \"GShard: Scaling Giant Models with Conditional Computation\" (2020)](https://arxiv.org/abs/2006.16668)\n",
-    "\n",
-    "3. 
Overflow Loss: Prevents individual experts from being overloaded by penalizing usage above\n", - " a specified capacity threshold\n", - " [Fedus et al., \"Switch Transformers\" (2021)](https://arxiv.org/abs/2101.03961)\n", - "\n", - "These losses are combined with the main classification loss during training to ensure balanced expert utilization.\n", - "The combination of these techniques has proven effective in large-scale models like GShard and Switch Transformers." - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "\n", - "class LinearMoE(layers.Layer):\n", - " def __init__(\n", - " self,\n", - " hidden_size,\n", - " num_experts=NUM_EXPERTS,\n", - " top_k=TOP_K,\n", - " ):\n", - " super(LinearMoE, self).__init__()\n", - "\n", - " # Initialize experts\n", - " self.experts = [\n", - " layers.Dense(\n", - " hidden_size,\n", - " kernel_initializer=tf.keras.initializers.RandomNormal(\n", - " mean=0.0, stddev=0.001\n", - " ),\n", - " bias_initializer=\"zeros\",\n", - " )\n", - " for _ in range(num_experts)\n", - " ]\n", - " # Initialize gating network\n", - " self.gating_network = layers.Dense(\n", - " num_experts, # Match output to num_experts\n", - " kernel_initializer=tf.keras.initializers.RandomNormal(\n", - " mean=0.0, stddev=0.001\n", - " ),\n", - " bias_initializer=\"zeros\",\n", - " activation=\"softmax\",\n", - " )\n", - "\n", - " self.num_experts = num_experts\n", - " self.top_k = top_k\n", - " # Keep track of how many times each expert is used as a layer weight\n", - " self.expert_usage_count = tf.Variable(\n", - " tf.zeros((num_experts,), dtype=tf.float32)\n", - " )\n", - "\n", - " self.batch_capacity = BATCH_SIZE // num_experts\n", - "\n", - " def _diversity_loss(self, weights):\n", - " entropy = -K.sum(weights * K.log(weights + 1e-10), axis=1)\n", - " self.diversity_loss = -K.mean(entropy)\n", - "\n", - " def _importance_loss(self, gating_weights):\n", - " 
batch_importance_sum = K.sum(gating_weights, axis=0)\n", - " mean_importance = K.mean(batch_importance_sum)\n", - " self.importance_loss = K.mean(\n", - " K.square(\n", - " batch_importance_sum\n", - " - mean_importance * tf.ones_like(batch_importance_sum)\n", - " )\n", - " )\n", - "\n", - " # Replace the current get_top_outputs method with this vectorized version\n", - " def get_top_outputs(\n", - " self, x, gating_weights\n", - " ): # Changed to take gating_weights directly\n", - " \"\"\"Compute outputs from top-k experts.\"\"\"\n", - " top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k)\n", - "\n", - " # Store indices and updates for usage count\n", - " self.indices = tf.reshape(top_k_indices, [-1, 1])\n", - " self.updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32)\n", - "\n", - " # Compute expert outputs symbolically\n", - " expert_outputs = tf.stack([expert(x) for expert in self.experts], axis=1)\n", - " batch_size = tf.shape(x)[0]\n", - " batch_indices = tf.tile(tf.range(batch_size)[:, tf.newaxis], [1, self.top_k])\n", - " gather_indices = tf.stack([batch_indices, top_k_indices], axis=-1)\n", - " top_k_expert_outputs = tf.gather_nd(expert_outputs, gather_indices)\n", - "\n", - " combined_output = tf.reduce_sum(\n", - " top_k_expert_outputs * top_k_weights[:, :, tf.newaxis], axis=1\n", - " )\n", - " return combined_output\n", - "\n", - " def update_usage_counts(self):\n", - " updates = tf.ones_like(tf.reshape(self.indices, [-1]), dtype=tf.float32)\n", - " self.expert_usage_count.assign(\n", - " tf.tensor_scatter_nd_add(\n", - " self.expert_usage_count, tf.reshape(self.indices, [-1, 1]), updates\n", - " )\n", - " )\n", - "\n", - " def call(self, x):\n", - " # Get gating weights and normalize\n", - " gating_weights = self.gating_network(x)\n", - " # top_k_weights, top_k_indices = tf.nn.top_k(gating_weights, k=self.top_k)\n", - " combined_output = self.get_top_outputs(x, gating_weights)\n", - " 
self.update_usage_counts()\n", - " self._diversity_loss(gating_weights)\n", - " self._importance_loss(gating_weights)\n", - "\n", - " return combined_output\n", - "\n", - " def compute_total_loss(self, load_balance_coef=0.01):\n", - " self.batch_overflow_sum = K.sum(\n", - " K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity)\n", - " )\n", - " return load_balance_coef * (\n", - " self.diversity_loss + self.batch_overflow_sum + self.importance_loss\n", - " )\n", - "" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "## MNIST classification with MoE" - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "\n", - "class MoEModel(keras.Model):\n", - " def __init__(\n", - " self,\n", - " num_classes,\n", - " num_experts=NUM_EXPERTS,\n", - " top_k=TOP_K,\n", - " moe_loss_considered=True,\n", - " ):\n", - " super(MoEModel, self).__init__()\n", - "\n", - " # Define the convolutional block\n", - " self.conv_block = keras.Sequential(\n", - " [\n", - " layers.Conv2D(32, kernel_size=(3, 3), activation=\"relu\"),\n", - " layers.MaxPooling2D(pool_size=(2, 2)),\n", - " layers.Conv2D(64, kernel_size=(3, 3), activation=\"relu\"),\n", - " layers.MaxPooling2D(pool_size=(2, 2)),\n", - " layers.Flatten(),\n", - " layers.Dropout(0.5),\n", - " ]\n", - " )\n", - "\n", - " # MoE classifier\n", - " self.moe_classifier = LinearMoE(\n", - " hidden_size=num_classes, num_experts=num_experts, top_k=top_k\n", - " )\n", - "\n", - " # Softmax layer\n", - " self.softmax = layers.Softmax()\n", - " self.moe_loss_considered = moe_loss_considered\n", - "\n", - " def call(self, inputs, training=False):\n", - " conv_flatten = self.conv_block(inputs)\n", - " moe_output = self.moe_classifier(conv_flatten)\n", - " outputs = self.softmax(moe_output)\n", - " return outputs\n", - "\n", - " def train_step(self, data):\n", - " x, y = data # Unpack input data 
and labels\n", - "\n", - " with tf.GradientTape() as tape:\n", - " y_pred = self(x, training=True)\n", - " classification_loss = self.compute_loss(x, y, y_pred)\n", - " if self.moe_loss_considered:\n", - " moe_loss = self.moe_classifier.compute_total_loss(\n", - " load_balance_coef=0.01\n", - " )\n", - " total_loss = classification_loss + moe_loss\n", - " else:\n", - " total_loss = classification_loss\n", - "\n", - " # Compute gradients\n", - " gradients = tape.gradient(total_loss, self.trainable_variables)\n", - "\n", - " # Update weights\n", - " self.optimizer.apply_gradients(zip(gradients, self.trainable_variables))\n", - " for metric in self.metrics:\n", - " metric.update_state(y, y_pred)\n", - " # Return a dict of metrics for monitoring\n", - " return {\n", - " \"loss\": total_loss,\n", - " \"moe_loss\": moe_loss,\n", - " **{m.name: m.result() for m in self.metrics},\n", - " }\n", - "\n", - " def test_step(self, data):\n", - " x, y = data\n", - " y_pred = self(x, training=False)\n", - " classification_loss = self.compute_loss(x, y, y_pred)\n", - " moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01)\n", - " total_loss = classification_loss + moe_loss\n", - "\n", - " for metric in self.metrics:\n", - " metric.update_state(y, y_pred)\n", - " return {\n", - " \"loss\": total_loss,\n", - " \"moe_loss\": moe_loss,\n", - " **{m.name: m.result() for m in self.metrics},\n", - " }\n", - "\n", - "\n", - "# Instantiate and compile the model\n", - "inputs = keras.Input(shape=input_shape)\n", - "model = MoEModel(num_classes=num_classes, num_experts=5, top_k=3)\n", - "\n", - "model.compile(\n", - " optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE),\n", - " loss=keras.losses.CategoricalCrossentropy(), # Assumes one-hot encoded labels\n", - " metrics=[\"accuracy\"],\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "### Training" - ] - }, - { - "cell_type": "code", - "execution_count": 
0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "history = model.fit(\n", - " x_train,\n", - " y_train,\n", - " batch_size=BATCH_SIZE,\n", - " epochs=NUM_EPOCHS,\n", - " validation_data=(x_test, y_test),\n", - " verbose=0,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "### Evaluation" - ] - }, - { - "cell_type": "code", - "execution_count": 0, - "metadata": { - "colab_type": "code" - }, - "outputs": [], - "source": [ - "score = model.evaluate(x_test, y_test, verbose=0)\n", - "print(\"Test loss:\", score[0])\n", - "print(\"Test accuracy:\", score[1])" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "colab_type": "text" - }, - "source": [ - "# Conclusion\n", - "\n", - "This example demonstrated how Mixture of Experts (MoE) can be used to increase model capacity without a proportional increase in computation cost. The key benefits are:\n", - "\n", - "1. Conditional Computation: Only a subset of experts (TOP_K=3 out of NUM_EXPERTS=5) process each input,\n", - " making the model more computationally efficient than a model that uses all parameters for every input.\n", - "\n", - "2. Specialized Processing: Each expert learns to handle different aspects of the input space,\n", - " allowing for more sophisticated processing without requiring a larger dense network.\n", - "\n", - "In our implementation, we:\n", - "1. Created a basic MoE layer using dense networks as experts\n", - "2. Implemented three types of load balancing losses to prevent routing collapse\n", - "3. Applied the MoE architecture to MNIST classification by replacing the final dense layer\n", - "4. Achieved comparable accuracy to the baseline model while using experts conditionally\n", - "\n", - "This approach is particularly valuable for large-scale models where computational efficiency\n", - "is crucial. 
The same principles demonstrated here are used in much larger language models\n", - "and other applications where model capacity needs to scale efficiently" - ] - } - ], - "metadata": { - "accelerator": "GPU", - "colab": { - "collapsed_sections": [], - "name": "mnist_moe", - "private_outputs": false, - "provenance": [], - "toc_visible": true - }, - "kernelspec": { - "display_name": "Python 3", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.7.0" - } - }, - "nbformat": 4, - "nbformat_minor": 0 -} \ No newline at end of file diff --git a/examples/vision/md/mnist_moe.md b/examples/vision/md/mnist_moe.md deleted file mode 100644 index 2696176d0c..0000000000 --- a/examples/vision/md/mnist_moe.md +++ /dev/null @@ -1,585 +0,0 @@ -# MoE for MNIST - -**Author:** [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/)
-**Date created:** 2015/06/19
-**Last modified:** 2020/04/21
-**Description:** Simple MoE implementation for MNIST classification.
-
-
-[**View in Colab**](https://colab.research.google.com/github/keras-team/keras-io/blob/master/examples/vision/ipynb/mnist_moe.ipynb)  [**GitHub source**](https://github.com/keras-team/keras-io/blob/master/examples/vision/mnist_moe.py)
-
-
-
-# Introduction
-
-In this example, we implement an adaptation of the Mixture of Experts (MoE) architecture
-([Shazeer et al.](https://arxiv.org/abs/1701.06538)).
-The idea is to use conditional computation to increase model capacity without increasing computation.
-Experts are identical blocks within a layer, each trained to specialize in a different part of the input space.
-At each forward pass, a gating network selects a subset of experts to apply to the input.
-
-The components to implement are:
-
-- Gating network: A dense layer that outputs a probability distribution over the experts.
-- MoE layer: A layer that applies a different expert to each input in the batch, plus a loss function that encourages specialization among the experts.
-- Model: A simple model that uses the MoE layer.
- ---- -## Imports - - -```python -import numpy as np -import keras -from keras import layers, models -import tensorflow as tf -from tensorflow.keras import backend as K -``` - -### Data Prepration - - -```python -# Model / data parameters -num_classes = 10 -input_shape = (28, 28, 1) - -# Load the data and split it between train and test sets -(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data() - -# Scale images to the [0, 1] range -x_train = x_train.astype("float32") / 255 -x_test = x_test.astype("float32") / 255 -# Make sure images have shape (28, 28, 1) -x_train = np.expand_dims(x_train, -1) -x_test = np.expand_dims(x_test, -1) -print("x_train shape:", x_train.shape) -print(x_train.shape[0], "train samples") -print(x_test.shape[0], "test samples") - - -# convert class vectors to binary class matrices -y_train = keras.utils.to_categorical(y_train, num_classes) -y_test = keras.utils.to_categorical(y_test, num_classes) -``` - -
-``` -x_train shape: (60000, 28, 28, 1) -60000 train samples -10000 test samples - -``` -
---- -## Constants - - -```python -NUM_EXPERTS = 5 -TOP_K = 3 -BATCH_SIZE = 128 -NUM_EPOCHS = 12 -LEARNING_RATE = 0.001 - -``` - ---- -## Base architecture - -The most basic [MNIST classifier](https://keras.io/examples/vision/mnist_convnet/) consists of a stack of convolutional layers followed by a dense layer. In this tutorial, we will first replace the dense layer with a MoE layer. Then do the same for convolutional layers. - - -```python -model = keras.Sequential( - [ - keras.Input(shape=input_shape), - layers.Conv2D(32, kernel_size=(3, 3), activation="relu"), - layers.MaxPooling2D(pool_size=(2, 2)), - layers.Conv2D(64, kernel_size=(3, 3), activation="relu"), - layers.MaxPooling2D(pool_size=(2, 2)), - layers.Flatten(), - layers.Dropout(0.5), - layers.Dense(num_classes, activation="softmax"), - ] -) - -model.summary() -``` - - -
Model: "sequential"
-
- - - - -
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
-┃ Layer (type)                     Output Shape                  Param # ┃
-┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
-│ conv2d (Conv2D)                 │ (None, 26, 26, 32)     │           320 │
-├─────────────────────────────────┼────────────────────────┼───────────────┤
-│ max_pooling2d (MaxPooling2D)    │ (None, 13, 13, 32)     │             0 │
-├─────────────────────────────────┼────────────────────────┼───────────────┤
-│ conv2d_1 (Conv2D)               │ (None, 11, 11, 64)     │        18,496 │
-├─────────────────────────────────┼────────────────────────┼───────────────┤
-│ max_pooling2d_1 (MaxPooling2D)  │ (None, 5, 5, 64)       │             0 │
-├─────────────────────────────────┼────────────────────────┼───────────────┤
-│ flatten (Flatten)               │ (None, 1600)           │             0 │
-├─────────────────────────────────┼────────────────────────┼───────────────┤
-│ dropout (Dropout)               │ (None, 1600)           │             0 │
-├─────────────────────────────────┼────────────────────────┼───────────────┤
-│ dense (Dense)                   │ (None, 10)             │        16,010 │
-└─────────────────────────────────┴────────────────────────┴───────────────┘
-
- - - - -
 Total params: 34,826 (136.04 KB)
-
- - - - -
 Trainable params: 34,826 (136.04 KB)
-
- - - - -
 Non-trainable params: 0 (0.00 B)
-
- - - -# Linear MoE using Dense layers - -For this layer, we will create multiple dense layers that will be used as experts. Then a simple gating network will select at each step which exerts should be utilized for the current input. We will keep track of the number of times each expert is used. Then the selected experts will be combined using a weighted sum. - - -```python - -class LinearMoE(layers.Layer): - def __init__( - self, - hidden_size, - num_experts=NUM_EXPERTS, - top_k=TOP_K, - ): - super(LinearMoE, self).__init__() - - # Initialize experts - self.experts = [ - layers.Dense( - hidden_size, - kernel_initializer=tf.keras.initializers.RandomNormal( - mean=0.0, stddev=0.001 - ), - bias_initializer="zeros", - ) - for _ in range(num_experts) - ] - # Initialize gating network - self.gating_network = layers.Dense( - NUM_EXPERTS, - kernel_initializer=tf.keras.initializers.RandomNormal( - mean=0.0, stddev=0.001 - ), - bias_initializer="zeros", - activation="softmax", - ) - - self.num_experts = num_experts - self.top_k = top_k - # Keep track of how many times each expert is used - self.expert_usage_count = tf.Variable( - tf.zeros((num_experts,), dtype=tf.float32) - ) - - def get_top_outputs(self, x, top_k_indices, top_k_weights): - batch_size = tf.shape(x)[0] - flat_indices = tf.reshape(top_k_indices, [-1]) - repeated_x = tf.repeat(x, repeats=self.top_k, axis=0) - - # Compute outputs for unique experts - unique_expert_ids = tf.unique(flat_indices)[0] # Get unique expert indices - expert_outputs_dict = {} - for idx in unique_expert_ids: - mask = tf.equal(flat_indices, idx) - selected_inputs = tf.boolean_mask(repeated_x, mask) - expert_outputs_dict[idx.numpy()] = self.experts[idx](selected_inputs) - - # Gather outputs back into the correct shape - output_size = self.experts[0].compute_output_shape(input_shape=(None, 10))[-1] - flat_outputs = tf.zeros( - [batch_size * self.top_k, output_size], dtype=tf.float32 - ) - for idx in unique_expert_ids: - mask = 
tf.equal(flat_indices, idx) - indices = tf.where(mask) - flat_outputs = tf.tensor_scatter_nd_update( - flat_outputs, indices, expert_outputs_dict[idx.numpy()] - ) - top_k_expert_outputs = tf.reshape( - flat_outputs, [batch_size, self.top_k, output_size] - ) - - # Combine outputs using top-k weights - return tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights) - - def update_usage_counts(self, indices): - updates = tf.ones_like(tf.reshape(indices, [-1]), dtype=tf.float32) - self.expert_usage_count.assign( - tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(indices, [-1, 1]), updates - ) - ) - - def call(self, x): - gating_weights = self.gating_network(x) - top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) - combined_output = self.get_top_outputs(x, top_k_indices, top_k_weights) - self.update_usage_counts(top_k_indices) - - return combined_output - -``` - -Output of the top 3 experts out of 10 for one layer of MoE: - - -```python -sample_data = tf.random.uniform((1, 10)) -linear_mode = LinearMoE(32, 10, 3) -linear_mode(sample_data) -``` - - - - -
-``` - - -``` -
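The `tf.einsum("ijk,ij->ik", ...)` call that combines the top-k expert outputs is simply a per-input weighted sum over the selected experts. As a standalone illustration with toy shapes (not tied to the layer above), the einsum is equivalent to broadcasting the weights and summing over the top-k axis:

```python
import numpy as np

# Toy shapes: batch of 2 inputs, top_k = 3 selected experts, output size 4.
top_k_expert_outputs = np.random.rand(2, 3, 4)  # (batch, top_k, output)
top_k_weights = np.array([[0.5, 0.3, 0.2], [0.6, 0.25, 0.15]])  # (batch, top_k)

# "ijk,ij->ik": scale each expert's output by its gating weight, sum over top_k.
combined = np.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights)

# Equivalent explicit form: broadcast weights over the output axis and reduce.
manual = (top_k_expert_outputs * top_k_weights[:, :, None]).sum(axis=1)
assert np.allclose(combined, manual)
```

The result has shape `(batch, output)`, one blended prediction per input.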
--- -## Routing Collapse - -One common challenge with MoE architectures is "routing collapse". The "route" refers to the gating network's choice of which experts to use for a given input; collapse occurs when the model settles into a pattern of using only a small subset of experts. This happens because: - -1. Early in training, some experts may perform slightly better by chance -2. These better-performing experts get selected more frequently -3. With more training, these experts improve further, creating a feedback loop -4. Other experts become neglected and never improve - -The code below demonstrates the randomness of expert selection: - - -```python - -def check_expert_usage(runs): - # Running the layer multiple times to show randomness of expert selection - for i in range(runs): - sample_data = tf.random.uniform((1, 10)) - linear_mode = LinearMoE(10, 5) - _ = linear_mode(sample_data) - print(f"Run {i}, Expert usage: {linear_mode.expert_usage_count.numpy()}") - - -check_expert_usage(4) -``` - 
-``` -Run 0, Expert usage: [1. 1. 0. 0. 1.] -Run 1, Expert usage: [1. 1. 1. 0. 0.] -Run 2, Expert usage: [0. 1. 1. 0. 1.] -Run 3, Expert usage: [0. 1. 1. 1. 0.] - -``` -
-### Load Balancing Solutions - -To prevent routing collapse, we implement three types of losses that were introduced in various MoE research: - -1. Diversity Loss: Encourages the gating network to use all experts by maximizing the entropy - of expert selection probabilities - [Shazeer et al., "Outrageously Large Neural Networks" (2017)](https://arxiv.org/abs/1701.06538) - -2. Importance Loss: Ensures each expert handles a similar total amount of input across the batch - by penalizing deviations from the mean usage - [Lepikhin et al., "GShard: Scaling Giant Models with Conditional Computation" (2020)](https://arxiv.org/abs/2006.16668) - -3. Overflow Loss: Prevents individual experts from being overloaded by penalizing usage above - a specified capacity threshold - [Fedus et al., "Switch Transformers" (2021)](https://arxiv.org/abs/2101.03961) - -These losses are combined with the main classification loss during training to ensure balanced expert utilization. -The combination of these techniques has proven effective in large-scale models like GShard and Switch Transformers. 
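Before wiring these terms into the layer, here is a minimal NumPy sketch of all three on a deliberately imbalanced toy batch of gating weights (the numbers are illustrative, not produced by the model):

```python
import numpy as np

# Toy gating output: 4 inputs routed over 3 experts, heavily favoring expert 0.
gating = np.array([
    [0.8, 0.1, 0.1],
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.9, 0.05, 0.05],
])

# 1. Diversity loss: negative mean entropy of the per-input gating
#    distribution, so minimizing the loss maximizes entropy.
entropy = -np.sum(gating * np.log(gating + 1e-10), axis=1)
diversity_loss = -np.mean(entropy)

# 2. Importance loss: squared deviation of each expert's total gating
#    weight from the mean total, penalizing unequal overall usage.
importance = gating.sum(axis=0)
importance_loss = np.mean((importance - importance.mean()) ** 2)

# 3. Overflow loss: penalize usage above a per-expert capacity.
capacity = len(gating) / gating.shape[1]  # 4 inputs / 3 experts
usage = (gating > 0.5).sum(axis=0)  # crude "selected" count per expert
overflow_loss = np.maximum(usage - capacity, 0.0).sum()
```

Here expert 0 is picked for every input, so the overflow and importance terms are both positive; a balanced router would drive them toward zero.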
- - -```python - -class LinearMoE(layers.Layer): - def __init__( - self, - hidden_size, - num_experts=NUM_EXPERTS, - top_k=TOP_K, - ): - super(LinearMoE, self).__init__() - - # Initialize experts - self.experts = [ - layers.Dense( - hidden_size, - kernel_initializer=tf.keras.initializers.RandomNormal( - mean=0.0, stddev=0.001 - ), - bias_initializer="zeros", - ) - for _ in range(num_experts) - ] - # Initialize gating network - self.gating_network = layers.Dense( - num_experts, # Match output to num_experts - kernel_initializer=tf.keras.initializers.RandomNormal( - mean=0.0, stddev=0.001 - ), - bias_initializer="zeros", - activation="softmax", - ) - - self.num_experts = num_experts - self.top_k = top_k - # Keep track of how many times each expert is used as a layer weight - self.expert_usage_count = tf.Variable( - tf.zeros((num_experts,), dtype=tf.float32) - ) - - self.batch_capacity = BATCH_SIZE // num_experts - - def _diversity_loss(self, weights): - entropy = -K.sum(weights * K.log(weights + 1e-10), axis=1) - self.diversity_loss = -K.mean(entropy) - - def _importance_loss(self, gating_weights): - batch_importance_sum = K.sum(gating_weights, axis=0) - mean_importance = K.mean(batch_importance_sum) - self.importance_loss = K.mean( - K.square( - batch_importance_sum - - mean_importance * tf.ones_like(batch_importance_sum) - ) - ) - - # Replace the current get_top_outputs method with this vectorized version - def get_top_outputs( - self, x, gating_weights - ): # Changed to take gating_weights directly - """Compute outputs from top-k experts.""" - top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) - - # Store indices and updates for usage count - self.indices = tf.reshape(top_k_indices, [-1, 1]) - self.updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - - # Compute expert outputs symbolically - expert_outputs = tf.stack([expert(x) for expert in self.experts], axis=1) - batch_size = tf.shape(x)[0] - batch_indices = 
tf.tile(tf.range(batch_size)[:, tf.newaxis], [1, self.top_k]) - gather_indices = tf.stack([batch_indices, top_k_indices], axis=-1) - top_k_expert_outputs = tf.gather_nd(expert_outputs, gather_indices) - - combined_output = tf.reduce_sum( - top_k_expert_outputs * top_k_weights[:, :, tf.newaxis], axis=1 - ) - return combined_output - - def update_usage_counts(self): - updates = tf.ones_like(tf.reshape(self.indices, [-1]), dtype=tf.float32) - self.expert_usage_count.assign( - tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(self.indices, [-1, 1]), updates - ) - ) - - def call(self, x): - # Get gating weights and normalize - gating_weights = self.gating_network(x) - # top_k_weights, top_k_indices = tf.nn.top_k(gating_weights, k=self.top_k) - combined_output = self.get_top_outputs(x, gating_weights) - self.update_usage_counts() - self._diversity_loss(gating_weights) - self._importance_loss(gating_weights) - - return combined_output - - def compute_total_loss(self, load_balance_coef=0.01): - self.batch_overflow_sum = K.sum( - K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) - ) - return load_balance_coef * ( - self.diversity_loss + self.batch_overflow_sum + self.importance_loss - ) - -``` - ---- -## MNIST classification with MoE - - -```python - -class MoEModel(keras.Model): - def __init__( - self, - num_classes, - num_experts=NUM_EXPERTS, - top_k=TOP_K, - moe_loss_considered=True, - ): - super(MoEModel, self).__init__() - - # Define the convolutional block - self.conv_block = keras.Sequential( - [ - layers.Conv2D(32, kernel_size=(3, 3), activation="relu"), - layers.MaxPooling2D(pool_size=(2, 2)), - layers.Conv2D(64, kernel_size=(3, 3), activation="relu"), - layers.MaxPooling2D(pool_size=(2, 2)), - layers.Flatten(), - layers.Dropout(0.5), - ] - ) - - # MoE classifier - self.moe_classifier = LinearMoE( - hidden_size=num_classes, num_experts=num_experts, top_k=top_k - ) - - # Softmax layer - self.softmax = layers.Softmax() - 
       self.moe_loss_considered = moe_loss_considered - - def call(self, inputs, training=False): - conv_flatten = self.conv_block(inputs) - moe_output = self.moe_classifier(conv_flatten) - outputs = self.softmax(moe_output) - return outputs - - def train_step(self, data): - x, y = data # Unpack input data and labels - - with tf.GradientTape() as tape: - y_pred = self(x, training=True) - classification_loss = self.compute_loss(x, y, y_pred) - if self.moe_loss_considered: - moe_loss = self.moe_classifier.compute_total_loss( - load_balance_coef=0.01 - ) - total_loss = classification_loss + moe_loss - else: - moe_loss = tf.constant(0.0) # Keep moe_loss defined for the metrics dict - total_loss = classification_loss - - # Compute gradients - gradients = tape.gradient(total_loss, self.trainable_variables) - - # Update weights - self.optimizer.apply_gradients(zip(gradients, self.trainable_variables)) - for metric in self.metrics: - metric.update_state(y, y_pred) - # Return a dict of metrics for monitoring - return { - "loss": total_loss, - "moe_loss": moe_loss, - **{m.name: m.result() for m in self.metrics}, - } - - def test_step(self, data): - x, y = data - y_pred = self(x, training=False) - classification_loss = self.compute_loss(x, y, y_pred) - moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01) - total_loss = classification_loss + moe_loss - - for metric in self.metrics: - metric.update_state(y, y_pred) - return { - "loss": total_loss, - "moe_loss": moe_loss, - **{m.name: m.result() for m in self.metrics}, - } - - -# Instantiate and compile the model -inputs = keras.Input(shape=input_shape) -model = MoEModel(num_classes=num_classes, num_experts=5, top_k=3) - -model.compile( - optimizer=keras.optimizers.Adam(learning_rate=LEARNING_RATE), - loss=keras.losses.CategoricalCrossentropy(), # Assumes one-hot encoded labels - metrics=["accuracy"], -) -``` - -### Training - - -```python -history = model.fit( - x_train, - y_train, - batch_size=BATCH_SIZE, - epochs=NUM_EPOCHS, - validation_data=(x_test, y_test), - verbose=0, -) -``` - 
-### Evaluation - - -```python -score = model.evaluate(x_test, y_test, verbose=0) -print("Test loss:", score[0]) -print("Test accuracy:", score[1]) -``` - -
-``` -Test loss: tf.Tensor(0.9811909, shape=(), dtype=float32) -Test accuracy: {'accuracy': } - -``` -
- -# Conclusion - -This example demonstrated how Mixture of Experts (MoE) can be used to increase model capacity without a proportional increase in computation cost. The key benefits are: - -1. Conditional Computation: Only a subset of experts (TOP_K=3 out of NUM_EXPERTS=5) process each input, - making the model more computationally efficient than a model that uses all parameters for every input. - -2. Specialized Processing: Each expert learns to handle different aspects of the input space, - allowing for more sophisticated processing without requiring a larger dense network. - -In our implementation, we: -1. Created a basic MoE layer using dense networks as experts -2. Implemented three types of load balancing losses to prevent routing collapse -3. Applied the MoE architecture to MNIST classification by replacing the final dense layer -4. Achieved comparable accuracy to the baseline model while using experts conditionally - -This approach is particularly valuable for large-scale models where computational efficiency -is crucial. The same principles demonstrated here are used in much larger language models -and other applications where model capacity needs to scale efficiently. diff --git a/examples/vision/mnist_moe.py b/examples/vision/mnist_moe.py index bcec4edf87..0b47e29edf 100644 --- a/examples/vision/mnist_moe.py +++ b/examples/vision/mnist_moe.py @@ -1,8 +1,8 @@ """ Title: MoE for MNIST Author: [Damoon Shahhosseini](https://www.linkedin.com/in/damoonsh/) -Date created: 2015/06/19 -Last modified: 2020/04/21 +Date created: 2025/02/28 +Last modified: 2025/02/28 Description: Simple MoE implementation for MNIST classification. 
Accelerator: GPU """ @@ -27,11 +27,14 @@ ## Imports """ +import os + +os.environ["KERAS_BACKEND"] = "torch" + import numpy as np import keras -from keras import layers, models -import tensorflow as tf -from tensorflow.keras import backend as K +from keras import layers +import torch """ ### Data Prepration @@ -66,7 +69,7 @@ NUM_EXPERTS = 5 TOP_K = 3 BATCH_SIZE = 128 -NUM_EPOCHS = 12 +NUM_EPOCHS = 1 LEARNING_RATE = 0.001 @@ -106,12 +109,12 @@ def __init__( top_k=TOP_K, ): super(LinearMoE, self).__init__() - + self.expert_output_size = hidden_size # Initialize experts self.experts = [ layers.Dense( hidden_size, - kernel_initializer=tf.keras.initializers.RandomNormal( + kernel_initializer=keras.initializers.RandomNormal( mean=0.0, stddev=0.001 ), bias_initializer="zeros", @@ -121,9 +124,7 @@ def __init__( # Initialize gating network self.gating_network = layers.Dense( NUM_EXPERTS, - kernel_initializer=tf.keras.initializers.RandomNormal( - mean=0.0, stddev=0.001 - ), + kernel_initializer=keras.initializers.RandomNormal(mean=0.0, stddev=0.001), bias_initializer="zeros", activation="softmax", ) @@ -131,62 +132,79 @@ def __init__( self.num_experts = num_experts self.top_k = top_k # Keep track of how many times each expert is used - self.expert_usage_count = tf.Variable( - tf.zeros((num_experts,), dtype=tf.float32) + self.expert_usage_count = self.add_weight( + shape=(self.num_experts,), + initializer="zeros", + trainable=False, + name="expert_usage_count", ) + self.batch_capacity = BATCH_SIZE / num_experts + + def build(self, input_shape): + super(LinearMoE, self).build(input_shape) + self.built = True def get_top_outputs(self, x, top_k_indices, top_k_weights): - batch_size = tf.shape(x)[0] - flat_indices = tf.reshape(top_k_indices, [-1]) - repeated_x = tf.repeat(x, repeats=self.top_k, axis=0) + batch_size = x.size(0) + output_size = self.expert_output_size - # Compute outputs for unique experts - unique_expert_ids = tf.unique(flat_indices)[0] # Get unique expert indices 
+ # Get outputs from top-k experts + top_k_expert_outputs = torch.zeros( + batch_size, self.top_k, output_size, device=x.device + ) + flat_indices = top_k_indices.view(-1).to(x.device) + repeated_x = x.repeat_interleave(self.top_k, dim=0) + + unique_expert_ids = flat_indices.unique(sorted=True) expert_outputs_dict = {} for idx in unique_expert_ids: - mask = tf.equal(flat_indices, idx) - selected_inputs = tf.boolean_mask(repeated_x, mask) - expert_outputs_dict[idx.numpy()] = self.experts[idx](selected_inputs) - - # Gather outputs back into the correct shape - output_size = self.experts[0].compute_output_shape(input_shape=(None, 10))[-1] - flat_outputs = tf.zeros( - [batch_size * self.top_k, output_size], dtype=tf.float32 + mask = (flat_indices == idx).to(x.device) + selected_inputs = repeated_x[mask] + expert_outputs_dict[idx.item()] = self.experts[idx](selected_inputs) + + flat_outputs = torch.zeros( + batch_size * self.top_k, output_size, device=x.device ) for idx in unique_expert_ids: - mask = tf.equal(flat_indices, idx) - indices = tf.where(mask) - flat_outputs = tf.tensor_scatter_nd_update( - flat_outputs, indices, expert_outputs_dict[idx.numpy()] - ) - top_k_expert_outputs = tf.reshape( - flat_outputs, [batch_size, self.top_k, output_size] - ) + mask = (flat_indices == idx).to(x.device) + flat_outputs[mask] = expert_outputs_dict[idx.item()].to(flat_outputs.device) - # Combine outputs using top-k weights - return tf.einsum("ijk,ij->ik", top_k_expert_outputs, top_k_weights) + top_k_expert_outputs = flat_outputs.view(batch_size, self.top_k, output_size) - def update_usage_counts(self, indices): - updates = tf.ones_like(tf.reshape(indices, [-1]), dtype=tf.float32) - self.expert_usage_count.assign( - tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(indices, [-1, 1]), updates - ) + return torch.einsum( + "ijk,ij->ik", top_k_expert_outputs.cpu(), top_k_weights.cpu() ) + def update_usage_counts(self, indices): + flat_indices = keras.ops.reshape(indices, 
[-1]) + one_hot = keras.ops.one_hot(flat_indices, self.num_experts) + updates = keras.ops.sum(one_hot, axis=0) + self.batch_usage_count = updates + self.expert_usage_count.assign_add(updates) + def call(self, x): gating_weights = self.gating_network(x) - top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) + top_k_weights, top_k_indices = torch.topk(gating_weights, self.top_k, dim=-1) combined_output = self.get_top_outputs(x, top_k_indices, top_k_weights) self.update_usage_counts(top_k_indices) - + self.batch_overflow_sum = torch.relu( + self.batch_usage_count - self.batch_capacity + ).sum() return combined_output + def compute_total_loss(self, load_balance_coef=0.01): + self.batch_overflow_sum = torch.sum( + torch.relu(self.expert_usage_count - self.batch_capacity) + ) + return load_balance_coef * ( + self.diversity_loss + self.batch_overflow_sum + self.importance_loss + ) + """ Output of the top 3 experts out of 10 for one layer of MoE: """ -sample_data = tf.random.uniform((1, 10)) +sample_data = torch.randn((1, 10)) linear_mode = LinearMoE(32, 10, 3) linear_mode(sample_data) @@ -207,7 +225,7 @@ def call(self, x): def check_expert_usage(runs): # Running the later multiple times to show randomness of expert selection for i in range(runs): - sample_data = tf.random.uniform((1, 10)) + sample_data = torch.randn((1, 10)) linear_mode = LinearMoE(10, 5) _ = linear_mode(sample_data) print(f"Run {i}, Expert usage: {linear_mode.expert_usage_count.numpy()}") @@ -245,12 +263,12 @@ def __init__( top_k=TOP_K, ): super(LinearMoE, self).__init__() - + self.expert_output_size = hidden_size # Initialize experts self.experts = [ layers.Dense( hidden_size, - kernel_initializer=tf.keras.initializers.RandomNormal( + kernel_initializer=keras.initializers.RandomNormal( mean=0.0, stddev=0.001 ), bias_initializer="zeros", @@ -259,82 +277,97 @@ def __init__( ] # Initialize gating network self.gating_network = layers.Dense( - num_experts, # Match output to num_experts 
- kernel_initializer=tf.keras.initializers.RandomNormal( - mean=0.0, stddev=0.001 - ), + NUM_EXPERTS, + kernel_initializer=keras.initializers.RandomNormal(mean=0.0, stddev=0.001), bias_initializer="zeros", activation="softmax", ) self.num_experts = num_experts self.top_k = top_k - # Keep track of how many times each expert is used as a layer weight - self.expert_usage_count = tf.Variable( - tf.zeros((num_experts,), dtype=tf.float32) + # Keep track of how many times each expert is used + self.expert_usage_count = self.add_weight( + shape=(self.num_experts,), + initializer="zeros", + trainable=False, + name="expert_usage_count", ) + self.batch_capacity = BATCH_SIZE / num_experts - self.batch_capacity = BATCH_SIZE // num_experts + def build(self, input_shape): + for expert in self.experts: + if not expert.built: + expert.build(input_shape) - def _diversity_loss(self, weights): - entropy = -K.sum(weights * K.log(weights + 1e-10), axis=1) - self.diversity_loss = -K.mean(entropy) - - def _importance_loss(self, gating_weights): - batch_importance_sum = K.sum(gating_weights, axis=0) - mean_importance = K.mean(batch_importance_sum) - self.importance_loss = K.mean( - K.square( - batch_importance_sum - - mean_importance * tf.ones_like(batch_importance_sum) - ) + if not self.gating_network.built: + self.gating_network.build(input_shape) + self.built = True + + def get_top_outputs(self, x, top_k_indices, top_k_weights): + batch_size = x.size(0) + output_size = self.expert_output_size + + # Get outputs from top-k experts + top_k_expert_outputs = torch.zeros( + batch_size, self.top_k, output_size, device=x.device ) + flat_indices = top_k_indices.view(-1).to(x.device) + repeated_x = x.repeat_interleave(self.top_k, dim=0) + + unique_expert_ids = flat_indices.unique(sorted=True) + expert_outputs_dict = {} + for idx in unique_expert_ids: + mask = (flat_indices == idx).to(x.device) + selected_inputs = repeated_x[mask] + expert_outputs_dict[idx.item()] = 
self.experts[idx](selected_inputs) - # Replace the current get_top_outputs method with this vectorized version - def get_top_outputs( - self, x, gating_weights - ): # Changed to take gating_weights directly - """Compute outputs from top-k experts.""" - top_k_weights, top_k_indices = tf.math.top_k(gating_weights, k=self.top_k) - - # Store indices and updates for usage count - self.indices = tf.reshape(top_k_indices, [-1, 1]) - self.updates = tf.ones_like(tf.reshape(top_k_indices, [-1]), dtype=tf.float32) - - # Compute expert outputs symbolically - expert_outputs = tf.stack([expert(x) for expert in self.experts], axis=1) - batch_size = tf.shape(x)[0] - batch_indices = tf.tile(tf.range(batch_size)[:, tf.newaxis], [1, self.top_k]) - gather_indices = tf.stack([batch_indices, top_k_indices], axis=-1) - top_k_expert_outputs = tf.gather_nd(expert_outputs, gather_indices) - - combined_output = tf.reduce_sum( - top_k_expert_outputs * top_k_weights[:, :, tf.newaxis], axis=1 + flat_outputs = torch.zeros( + batch_size * self.top_k, output_size, device=x.device ) - return combined_output + for idx in unique_expert_ids: + mask = (flat_indices == idx).to(x.device) + flat_outputs[mask] = expert_outputs_dict[idx.item()].to(flat_outputs.device) + + top_k_expert_outputs = flat_outputs.view(batch_size, self.top_k, output_size) - def update_usage_counts(self): - updates = tf.ones_like(tf.reshape(self.indices, [-1]), dtype=tf.float32) - self.expert_usage_count.assign( - tf.tensor_scatter_nd_add( - self.expert_usage_count, tf.reshape(self.indices, [-1, 1]), updates + return torch.einsum( + "ijk,ij->ik", top_k_expert_outputs.cpu(), top_k_weights.cpu() + ) + + def update_usage_counts(self, indices): + flat_indices = keras.ops.reshape(indices, [-1]) + one_hot = keras.ops.one_hot(flat_indices, self.num_experts) + updates = keras.ops.sum(one_hot, axis=0) + self.batch_usage_count = updates + self.expert_usage_count.assign_add(updates) + + def _importance_loss(self, weights): + 
 batch_importance_sum = torch.sum(weights, dim=0) + mean_importance = torch.mean(batch_importance_sum) + self.importance_loss = torch.mean( + torch.square( + batch_importance_sum + - mean_importance * torch.ones_like(batch_importance_sum) ) ) + def _diversity_loss(self, weights): + entropy = -torch.sum(weights * torch.log(weights + 1e-10), dim=1) + self.diversity_loss = -torch.mean(entropy) + def call(self, x): - # Get gating weights and normalize gating_weights = self.gating_network(x) - # top_k_weights, top_k_indices = tf.nn.top_k(gating_weights, k=self.top_k) - combined_output = self.get_top_outputs(x, gating_weights) - self.update_usage_counts() - self._diversity_loss(gating_weights) + top_k_weights, top_k_indices = torch.topk(gating_weights, self.top_k, dim=-1) + combined_output = self.get_top_outputs(x, top_k_indices, top_k_weights) + self.update_usage_counts(top_k_indices) self._importance_loss(gating_weights) + self._diversity_loss(gating_weights) return combined_output def compute_total_loss(self, load_balance_coef=0.01): - self.batch_overflow_sum = K.sum( - K.relu(tf.convert_to_tensor(self.expert_usage_count) - self.batch_capacity) + self.batch_overflow_sum = torch.sum( + torch.relu(self.expert_usage_count - self.batch_capacity) ) return load_balance_coef * ( self.diversity_loss + self.batch_overflow_sum + self.importance_loss ) @@ -384,30 +417,33 @@ def call(self, inputs, training=False): return outputs def train_step(self, data): - x, y = data # Unpack input data and labels - - with tf.GradientTape() as tape: - y_pred = self(x, training=True) - classification_loss = self.compute_loss(x, y, y_pred) - if self.moe_loss_considered: - moe_loss = self.moe_classifier.compute_total_loss( - load_balance_coef=0.01 - ) - total_loss = classification_loss + moe_loss - else: - total_loss = classification_loss - - # Compute gradients - gradients = tape.gradient(total_loss, self.trainable_variables) - - # Update weights - self.optimizer.apply_gradients(zip(gradients, 
self.trainable_variables)) + x, y = data + self.zero_grad() + + y_pred = self(x) + + if self.moe_loss_considered: + moe_loss = self.moe_classifier.compute_total_loss(load_balance_coef=0.01) + total_loss = self.compute_loss(x, y, y_pred) + moe_loss + else: + total_loss = self.compute_loss(x, y, y_pred) + + total_loss.backward() + + trainable_weights = [v for v in self.trainable_weights] + gradients = [v.value.grad for v in trainable_weights] + + with torch.no_grad(): + self.optimizer.apply(gradients, trainable_weights) + + # Update metrics for metric in self.metrics: metric.update_state(y, y_pred) + # Return a dict of metrics for monitoring return { - "loss": total_loss, - "moe_loss": moe_loss, + "loss": total_loss.item(), # Convert to Python number + "moe_loss": moe_loss.item() if self.moe_loss_considered else 0.0, **{m.name: m.result() for m in self.metrics}, } @@ -469,6 +505,7 @@ def test_step(self, data): allowing for more sophisticated processing without requiring a larger dense network. In our implementation, we: + 1. Created a basic MoE layer using dense networks as experts 2. Implemented three types of load balancing losses to prevent routing collapse 3. 
Applied the MoE architecture to MNIST classification by replacing the final dense layer From 76de5f49e010198a527b9874fb19e3a4e20e9810 Mon Sep 17 00:00:00 2001 From: Damoon Date: Wed, 5 Mar 2025 23:56:05 +0300 Subject: [PATCH 5/5] Revert changes to match upstream --- examples/audio/vocal_track_separation.py | 2 +- .../examples/audio/vocal_track_separation.md | 921 ------------------ 2 files changed, 1 insertion(+), 922 deletions(-) delete mode 100644 templates/examples/audio/vocal_track_separation.md diff --git a/examples/audio/vocal_track_separation.py b/examples/audio/vocal_track_separation.py index ca16c35ab1..25574e0ab8 100644 --- a/examples/audio/vocal_track_separation.py +++ b/examples/audio/vocal_track_separation.py @@ -670,4 +670,4 @@ def spectral_loss(y_true, y_pred): - Train for more epochs and refine hyperparameters. - Separate multiple instruments simultaneously. - Enhance the model to handle instruments not present in the mixture. -""" \ No newline at end of file +""" diff --git a/templates/examples/audio/vocal_track_separation.md b/templates/examples/audio/vocal_track_separation.md deleted file mode 100644 index af44162d78..0000000000 --- a/templates/examples/audio/vocal_track_separation.md +++ /dev/null @@ -1,921 +0,0 @@ -# Vocal Track Separation with Encoder-Decoder Architecture - -**Author:** [Joaquin Jimenez](https://github.com/johacks/)
-**Date created:** 2024/12/10
-**Last modified:** 2024/12/10
-**Description:** Train a model to separate vocal tracks from music mixtures. - - -
ⓘ This example uses Keras 3
- [**View in Colab**](https://colab.research.google.com/github/keras-team/keras-io/blob/master/examples/audio/ipynb/vocal_track_separation.ipynb) [**GitHub source**](https://github.com/keras-team/keras-io/blob/master/examples/audio/vocal_track_separation.py) - - - ---- -## Introduction - -In this tutorial, we build a vocal track separation model using an encoder-decoder -architecture in Keras 3. - -We train the model on the [MUSDB18 dataset](https://doi.org/10.5281/zenodo.1117372), -which provides music mixtures and isolated tracks for drums, bass, other, and vocals. - -Key concepts covered: - -- Audio data preprocessing using the Short-Time Fourier Transform (STFT). -- Audio data augmentation techniques. -- Implementing custom encoders and decoders specialized for audio data. -- Defining appropriate loss functions and metrics for audio source separation tasks. - -The model architecture is derived from the TFC_TDF_Net model described in: - -W. Choi, M. Kim, J. Chung, D. Lee, and S. Jung, “Investigating U-Nets with various -intermediate blocks for spectrogram-based singing voice separation,” in the 21st -International Society for Music Information Retrieval Conference, 2020. - -For reference code, see: -[GitHub: ws-choi/ISMIR2020_U_Nets_SVS](https://github.com/ws-choi/ISMIR2020_U_Nets_SVS). - -The data processing and model training routines are partly derived from: -[ZFTurbo/Music-Source-Separation-Training](https://github.com/ZFTurbo/Music-Source-Separation-Training/tree/main). - ---- -## Setup - -Import and install all the required dependencies. 
- - -```python -!pip install -qq audiomentations soundfile ffmpeg-binaries -!pip install -qq "keras==3.7.0" -!sudo -n apt-get install -y graphviz >/dev/null 2>&1 # Required for plotting the model -``` - - -```python -import glob -import os - -os.environ["KERAS_BACKEND"] = "jax" # or "tensorflow" or "torch" - -import random -import subprocess -import tempfile -import typing -from os import path - -import audiomentations as aug -import ffmpeg -import keras -import numpy as np -import soundfile as sf -from IPython import display -from keras import callbacks, layers, ops, saving -from matplotlib import pyplot as plt -``` - ---- -## Configuration - -The following constants define configuration parameters for audio processing -and model training, including dataset paths, audio chunk sizes, Short-Time Fourier -Transform (STFT) parameters, and training hyperparameters. - - -```python -# MUSDB18 dataset configuration -MUSDB_STREAMS = {"mixture": 0, "drums": 1, "bass": 2, "other": 3, "vocals": 4} -TARGET_INSTRUMENTS = {track: MUSDB_STREAMS[track] for track in ("vocals",)} -N_INSTRUMENTS = len(TARGET_INSTRUMENTS) -SOURCE_INSTRUMENTS = tuple(k for k in MUSDB_STREAMS if k != "mixture") - -# Audio preprocessing parameters for Short-Time Fourier Transform (STFT) -N_SUBBANDS = 4 # Number of subbands into which frequencies are split -CHUNK_SIZE = 65024 # Number of amplitude samples per audio chunk (~4 seconds) -STFT_N_FFT = 2048 # FFT points used in STFT -STFT_HOP_LENGTH = 512 # Hop length for STFT - -# Training hyperparameters -N_CHANNELS = 64 # Base channel count for the model -BATCH_SIZE = 3 -ACCUMULATION_STEPS = 2 -EFFECTIVE_BATCH_SIZE = BATCH_SIZE * (ACCUMULATION_STEPS or 1) - -# Paths -TMP_DIR = path.expanduser("~/.keras/tmp") -DATASET_DIR = path.expanduser("~/.keras/datasets") -MODEL_PATH = path.join(TMP_DIR, f"model_{keras.backend.backend()}.keras") -CSV_LOG_PATH = path.join(TMP_DIR, f"training_{keras.backend.backend()}.csv") -os.makedirs(DATASET_DIR, exist_ok=True) 
-os.makedirs(TMP_DIR, exist_ok=True) - -# Set random seed for reproducibility -keras.utils.set_random_seed(21) -``` - -
-``` -WARNING: All log messages before absl::InitializeLog() is called are written to STDERR -E0000 00:00:1734318393.806217 81028 cuda_dnn.cc:8310] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered -E0000 00:00:1734318393.809885 81028 cuda_blas.cc:1418] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered - -``` -
---- -## MUSDB18 Dataset - -The MUSDB18 dataset is a standard benchmark for music source separation, containing -150 full-length music tracks along with isolated drums, bass, other, and vocals. -The dataset is stored in .mp4 format, and each .mp4 file includes multiple audio -streams (mixture and individual tracks). - -### Download and Conversion - -The following utility function downloads MUSDB18 and converts its .mp4 files to -.wav files for each instrument track, resampled to 16 kHz. - - -```python - -def download_musdb18(out_dir=None): - """Download and extract the MUSDB18 dataset, then convert .mp4 files to .wav files. - - MUSDB18 reference: - Rafii, Z., Liutkus, A., Stöter, F.-R., Mimilakis, S. I., & Bittner, R. (2017). - MUSDB18 - a corpus for music separation (1.0.0) [Data set]. Zenodo. - """ - ffmpeg.init() - from ffmpeg import FFMPEG_PATH - - # Create output directories - os.makedirs((base := out_dir or tempfile.mkdtemp()), exist_ok=True) - if path.exists((out_dir := path.join(base, "musdb18_wav"))): - print("MUSDB18 dataset already downloaded") - return out_dir - - # Download and extract the dataset - download_dir = keras.utils.get_file( - fname="musdb18", - origin="https://zenodo.org/records/1117372/files/musdb18.zip", - extract=True, - ) - - # ffmpeg command template: input, stream index, output - ffmpeg_args = str(FFMPEG_PATH) + " -v error -i {} -map 0:{} -vn -ar 16000 {}" - - # Convert each mp4 file to multiple .wav files for each track - for split in ("train", "test"): - songs = os.listdir(path.join(download_dir, split)) - for i, song in enumerate(songs): - if i % 10 == 0: - print(f"{split.capitalize()}: {i}/{len(songs)} songs processed") - - mp4_path_orig = path.join(download_dir, split, song) - mp4_path = path.join(tempfile.mkdtemp(), split, song.replace(" ", "_")) - os.makedirs(path.dirname(mp4_path), exist_ok=True) - os.rename(mp4_path_orig, mp4_path) - - wav_dir = path.join(out_dir, split, path.basename(mp4_path).split(".")[0]) - 
os.makedirs(wav_dir, exist_ok=True) - - for track in SOURCE_INSTRUMENTS: - out_path = path.join(wav_dir, f"{track}.wav") - stream_index = MUSDB_STREAMS[track] - args = ffmpeg_args.format(mp4_path, stream_index, out_path).split() - assert subprocess.run(args).returncode == 0, "ffmpeg conversion failed" - return out_dir - - -# Download and prepare the MUSDB18 dataset -songs = download_musdb18(out_dir=DATASET_DIR) -``` - -
-``` -MUSDB18 dataset already downloaded - -``` -
-### Custom Dataset - -We define a custom dataset class to generate random audio chunks and their corresponding -labels. The dataset does the following: - -1. Selects a random chunk from a random song and instrument. -2. Applies optional data augmentations. -3. Combines isolated tracks to form new synthetic mixtures. -4. Prepares features (mixtures) and labels (vocals) for training. - -This approach allows creating an effectively infinite variety of training examples -through randomization and augmentation. - - -```python - -class Dataset(keras.utils.PyDataset): - def __init__( - self, - songs, - batch_size=BATCH_SIZE, - chunk_size=CHUNK_SIZE, - batches_per_epoch=1000 * ACCUMULATION_STEPS, - augmentation=True, - **kwargs, - ): - super().__init__(**kwargs) - self.augmentation = augmentation - self.vocals_augmentations = [ - aug.PitchShift(min_semitones=-5, max_semitones=5, p=0.1), - aug.SevenBandParametricEQ(-9, 9, p=0.25), - aug.TanhDistortion(0.1, 0.7, p=0.1), - ] - self.other_augmentations = [ - aug.PitchShift(p=0.1), - aug.AddGaussianNoise(p=0.1), - ] - self.songs = songs - self.sizes = {song: self.get_track_set_size(song) for song in self.songs} - self.batch_size = batch_size - self.chunk_size = chunk_size - self.batches_per_epoch = batches_per_epoch - - def get_track_set_size(self, song: str): - """Return the smallest track length in the given song directory.""" - sizes = [len(sf.read(p)[0]) for p in glob.glob(path.join(song, "*.wav"))] - if max(sizes) != min(sizes): - print(f"Warning: {song} has different track lengths") - return min(sizes) - - def random_chunk_of_instrument_type(self, instrument: str): - """Extract a random chunk for the specified instrument from a random song.""" - song, size = random.choice(list(self.sizes.items())) - track = path.join(song, f"{instrument}.wav") - - if self.chunk_size <= size: - start = np.random.randint(size - self.chunk_size + 1) - audio = sf.read(track, self.chunk_size, start, dtype="float32")[0] - audio_mono = 
np.mean(audio, axis=1) - else: - # If the track is shorter than chunk_size, pad the signal - audio_mono = np.mean(sf.read(track, dtype="float32")[0], axis=1) - audio_mono = np.pad(audio_mono, ((0, self.chunk_size - size),)) - - # If the chunk is almost silent, retry - if np.mean(np.abs(audio_mono)) < 0.01: - return self.random_chunk_of_instrument_type(instrument) - - return self.data_augmentation(audio_mono, instrument) - - def data_augmentation(self, audio: np.ndarray, instrument: str): - """Apply data augmentation to the audio chunk, if enabled.""" - - def coin_flip(x, probability: float, fn: typing.Callable): - return fn(x) if random.uniform(0, 1) < probability else x - - if self.augmentation: - augmentations = ( - self.vocals_augmentations - if instrument == "vocals" - else self.other_augmentations - ) - # Loudness augmentation - audio *= np.random.uniform(0.5, 1.5, (len(audio),)).astype("float32") - # Random reverse - audio = coin_flip(audio, 0.1, lambda x: np.flip(x)) - # Random polarity inversion - audio = coin_flip(audio, 0.5, lambda x: -x) - # Apply selected augmentations - for aug_ in augmentations: - aug_.randomize_parameters(audio, sample_rate=16000) - audio = aug_(audio, sample_rate=16000) - return audio - - def random_mix_of_tracks(self) -> dict: - """Create a random mix of instruments by summing their individual chunks.""" - tracks = {} - for instrument in SOURCE_INSTRUMENTS: - # Start with a single random chunk - mixup = [self.random_chunk_of_instrument_type(instrument)] - - # Randomly add more chunks of the same instrument (mixup augmentation) - if self.augmentation: - for p in (0.2, 0.02): - if random.uniform(0, 1) < p: - mixup.append(self.random_chunk_of_instrument_type(instrument)) - - tracks[instrument] = np.mean(mixup, axis=0, dtype="float32") - return tracks - - def __len__(self): - return self.batches_per_epoch - - def __getitem__(self, idx): - # Generate a batch of random mixtures - batch = [self.random_mix_of_tracks() for _ in 
range(self.batch_size)] - - # Features: sum of all tracks - batch_x = ops.sum( - np.array([list(track_set.values()) for track_set in batch]), axis=1 - ) - - # Labels: isolated target instruments (e.g., vocals) - batch_y = np.array( - [[track_set[t] for t in TARGET_INSTRUMENTS] for track_set in batch] - ) - - return batch_x, ops.convert_to_tensor(batch_y) - - -# Create train and validation datasets -train_ds = Dataset(glob.glob(path.join(songs, "train", "*"))) -val_ds = Dataset( - glob.glob(path.join(songs, "test", "*")), - batches_per_epoch=int(0.1 * train_ds.batches_per_epoch), - augmentation=False, -) -``` - -### Visualize a Sample - -Let's visualize a random mixed audio chunk and its corresponding isolated vocals. -This helps to understand the nature of the preprocessed input data. - - -```python - -def visualize_audio_np(audio: np.ndarray, rate=16000, name="mixup"): - """Plot and display an audio waveform and also produce an Audio widget.""" - plt.figure(figsize=(10, 6)) - plt.plot(audio) - plt.title(f"Waveform: {name}") - plt.xlim(0, len(audio)) - plt.ylabel("Amplitude") - plt.show() - # plt.savefig(f"tmp/{name}.png") - - # Normalize and display audio - audio_norm = (audio - np.min(audio)) / (np.max(audio) - np.min(audio) + 1e-8) - audio_norm = (audio_norm * 2 - 1) * 0.6 - display.display(display.Audio(audio_norm, rate=rate)) - # sf.write(f"tmp/{name}.wav", audio_norm, rate) - - -sample_batch_x, sample_batch_y = val_ds[None] # Random batch -visualize_audio_np(ops.convert_to_numpy(sample_batch_x[0])) -visualize_audio_np(ops.convert_to_numpy(sample_batch_y[0, 0]), name="vocals") -``` - - - -![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_12_0.png) - - - - - - - - - - - -![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_12_2.png) - - - - - - - - - ---- -## Model - -### Preprocessing - -The model operates on STFT representations rather than raw audio. 
We define a -preprocessing model to compute STFT and a corresponding inverse transform (iSTFT). - - -```python - -def stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): - """Compute the STFT for the input audio and return the real and imaginary parts.""" - real_x, imag_x = ops.stft(inputs, fft_size, sequence_stride, fft_size) - real_x, imag_x = ops.expand_dims(real_x, -1), ops.expand_dims(imag_x, -1) - x = ops.concatenate((real_x, imag_x), axis=-1) - - # Drop last freq sample for convenience - return ops.split(x, [x.shape[2] - 1], axis=2)[0] - - -def inverse_stft(inputs, fft_size=STFT_N_FFT, sequence_stride=STFT_HOP_LENGTH): - """Compute the inverse STFT for the given STFT input.""" - x = inputs - - # Pad back dropped freq sample if using torch backend - if keras.backend.backend() == "torch": - x = ops.pad(x, ((0, 0), (0, 0), (0, 1), (0, 0))) - - real_x, imag_x = ops.split(x, 2, axis=-1) - real_x = ops.squeeze(real_x, axis=-1) - imag_x = ops.squeeze(imag_x, axis=-1) - - return ops.istft((real_x, imag_x), fft_size, sequence_stride, fft_size) - -``` - -### Model Architecture - -The model uses a custom encoder-decoder architecture with Time-Frequency Convolution -(TFC) and Time-Distributed Fully Connected (TDF) blocks. They are grouped into a -`TimeFrequencyTransformBlock`, i.e. "TFC_TDF" in the original paper by Choi et al. - -We then define an encoder-decoder network with multiple scales. Each encoder scale -applies TFC_TDF blocks followed by downsampling, while decoder scales apply TFC_TDF -blocks over the concatenation of upsampled features and associated encoder outputs. - - -```python - -@saving.register_keras_serializable() -class TimeDistributedDenseBlock(layers.Layer): - """Time-Distributed Fully Connected layer block. - - Applies frequency-wise dense transformations across time frames with instance - normalization and GELU activation. 
- """ - - def __init__(self, bottleneck_factor, fft_dim, **kwargs): - super().__init__(**kwargs) - self.fft_dim = fft_dim - self.hidden_dim = fft_dim // bottleneck_factor - - def build(self, *_): - self.group_norm_1 = layers.GroupNormalization(groups=-1) - self.group_norm_2 = layers.GroupNormalization(groups=-1) - self.dense_1 = layers.Dense(self.hidden_dim, use_bias=False) - self.dense_2 = layers.Dense(self.fft_dim, use_bias=False) - - def call(self, x): - # Apply normalization and dense layers frequency-wise - x = ops.gelu(self.group_norm_1(x)) - x = ops.swapaxes(x, -1, -2) - x = self.dense_1(x) - - x = ops.gelu(self.group_norm_2(ops.swapaxes(x, -1, -2))) - x = ops.swapaxes(x, -1, -2) - x = self.dense_2(x) - return ops.swapaxes(x, -1, -2) - - -@saving.register_keras_serializable() -class TimeFrequencyConvolution(layers.Layer): - """Time-Frequency Convolutional layer. - - Applies a 2D convolution over time-frequency representations and applies instance - normalization and GELU activation. - """ - - def __init__(self, channels, **kwargs): - super().__init__(**kwargs) - self.channels = channels - - def build(self, *_): - self.group_norm = layers.GroupNormalization(groups=-1) - self.conv = layers.Conv2D(self.channels, 3, padding="same", use_bias=False) - - def call(self, x): - return self.conv(ops.gelu(self.group_norm(x))) - - -@saving.register_keras_serializable() -class TimeFrequencyTransformBlock(layers.Layer): - """Implements TFC_TDF block for encoder-decoder architecture. - - Repeatedly apply Time-Frequency Convolution and Time-Distributed Dense blocks as - many times as specified by the `length` parameter. 
- """ - - def __init__( - self, channels, length, fft_dim, bottleneck_factor, in_channels=None, **kwargs - ): - super().__init__(**kwargs) - self.channels = channels - self.length = length - self.fft_dim = fft_dim - self.bottleneck_factor = bottleneck_factor - self.in_channels = in_channels or channels - self.blocks = [] - - def build(self, *_): - # Add blocks in a flat list to avoid nested structures - for i in range(self.length): - in_channels = self.channels if i > 0 else self.in_channels - self.blocks.append(TimeFrequencyConvolution(in_channels)) - self.blocks.append( - TimeDistributedDenseBlock(self.bottleneck_factor, self.fft_dim) - ) - self.blocks.append(TimeFrequencyConvolution(self.channels)) - # Residual connection - self.blocks.append(layers.Conv2D(self.channels, 1, 1, use_bias=False)) - - def call(self, inputs): - x = inputs - # Each block consists of 4 layers: - # 1. Time-Frequency Convolution - # 2. Time-Distributed Dense - # 3. Time-Frequency Convolution - # 4. Residual connection - for i in range(0, len(self.blocks), 4): - tfc_1 = self.blocks[i](x) - tdf = self.blocks[i + 1](x) - tfc_2 = self.blocks[i + 2](tfc_1 + tdf) - x = tfc_2 + self.blocks[i + 3](x) # Residual connection - return x - - -@saving.register_keras_serializable() -class Downscale(layers.Layer): - """Downscale time-frequency dimensions using a convolution.""" - - conv_cls = layers.Conv2D - - def __init__(self, channels, scale, **kwargs): - super().__init__(**kwargs) - self.channels = channels - self.scale = scale - - def build(self, *_): - self.conv = self.conv_cls(self.channels, self.scale, self.scale, use_bias=False) - self.norm = layers.GroupNormalization(groups=-1) - - def call(self, inputs): - return self.norm(ops.gelu(self.conv(inputs))) - - -@saving.register_keras_serializable() -class Upscale(Downscale): - """Upscale time-frequency dimensions using a transposed convolution.""" - - conv_cls = layers.Conv2DTranspose - - -def build_model( - inputs, - n_instruments=N_INSTRUMENTS, 
- n_subbands=N_SUBBANDS, - channels=N_CHANNELS, - fft_dim=(STFT_N_FFT // 2) // N_SUBBANDS, - n_scales=4, - scale=(2, 2), - block_size=2, - growth=128, - bottleneck_factor=2, - **kwargs, -): - """Build the TFC_TDF encoder-decoder model for source separation.""" - # Compute STFT - x = stft(inputs) - - # Split mixture into subbands as separate channels - mix = ops.reshape(x, (-1, x.shape[1], x.shape[2] // n_subbands, 2 * n_subbands)) - first_conv_out = layers.Conv2D(channels, 1, 1, use_bias=False)(mix) - x = first_conv_out - - # Encoder path - encoder_outs = [] - for _ in range(n_scales): - x = TimeFrequencyTransformBlock( - channels, block_size, fft_dim, bottleneck_factor - )(x) - encoder_outs.append(x) - fft_dim, channels = fft_dim // scale[0], channels + growth - x = Downscale(channels, scale)(x) - - # Bottleneck - x = TimeFrequencyTransformBlock(channels, block_size, fft_dim, bottleneck_factor)(x) - - # Decoder path - for _ in range(n_scales): - fft_dim, channels = fft_dim * scale[0], channels - growth - x = ops.concatenate([Upscale(channels, scale)(x), encoder_outs.pop()], axis=-1) - x = TimeFrequencyTransformBlock( - channels, block_size, fft_dim, bottleneck_factor, in_channels=x.shape[-1] - )(x) - - # Residual connection and final convolutions - x = ops.concatenate([mix, x * first_conv_out], axis=-1) - x = layers.Conv2D(channels, 1, 1, use_bias=False, activation="gelu")(x) - x = layers.Conv2D(n_instruments * n_subbands * 2, 1, 1, use_bias=False)(x) - - # Reshape back to instrument-wise STFT - x = ops.reshape(x, (-1, x.shape[1], x.shape[2] * n_subbands, n_instruments, 2)) - x = ops.transpose(x, (0, 3, 1, 2, 4)) - x = ops.reshape(x, (-1, n_instruments, x.shape[2], x.shape[3] * 2)) - - return keras.Model(inputs=inputs, outputs=x, **kwargs) - -``` - ---- -## Loss and Metrics - -We define: - -- `spectral_loss`: Mean absolute error in STFT domain. -- `sdr`: Signal-to-Distortion Ratio, a common source separation metric. 
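-
-Before the implementation, here is a toy NumPy illustration of the SDR formula, i.e. 10·log10 of signal power over distortion power. The sine wave standing in for vocals and the helper name `sdr_np` are illustrative only, not part of the training code:
-
-```python
-import numpy as np
-
-
-def sdr_np(y_true, y_pred, eps=1e-8):
-    # 10 * log10(signal power / distortion power), with epsilon for stability
-    num = np.sum(np.square(y_true), axis=-1) + eps
-    den = np.sum(np.square(y_true - y_pred), axis=-1) + eps
-    return 10 * np.log10(num / den)
-
-
-t = np.linspace(0.0, 1.0, 16000, dtype="float32")
-clean = np.sin(2 * np.pi * 440.0 * t)  # stand-in for isolated vocals
-noisy = clean + 0.1 * np.random.default_rng(0).normal(size=t.shape).astype("float32")
-
-# A perfect estimate gives a very large SDR; the noisy one lands near 17 dB.
-print(float(sdr_np(clean, clean)), float(sdr_np(clean, noisy)))
-```
-
-Higher SDR is better: it grows without bound as the estimate approaches the reference, which is why the epsilon terms matter numerically.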
- - -```python - -def prediction_to_wave(x, n_instruments=N_INSTRUMENTS): - """Convert STFT predictions back to waveform.""" - x = ops.reshape(x, (-1, x.shape[2], x.shape[3] // 2, 2)) - x = inverse_stft(x) - return ops.reshape(x, (-1, n_instruments, x.shape[1])) - - -def target_to_stft(y): - """Convert target waveforms to their STFT representations.""" - y = ops.reshape(y, (-1, CHUNK_SIZE)) - y_real, y_imag = ops.stft(y, STFT_N_FFT, STFT_HOP_LENGTH, STFT_N_FFT) - y_real, y_imag = y_real[..., :-1], y_imag[..., :-1] - y = ops.stack([y_real, y_imag], axis=-1) - return ops.reshape(y, (-1, N_INSTRUMENTS, y.shape[1], y.shape[2] * 2)) - - -@saving.register_keras_serializable() -def sdr(y_true, y_pred): - """Signal-to-Distortion Ratio metric.""" - y_pred = prediction_to_wave(y_pred) - # Add epsilon for numerical stability - num = ops.sum(ops.square(y_true), axis=-1) + 1e-8 - den = ops.sum(ops.square(y_true - y_pred), axis=-1) + 1e-8 - return 10 * ops.log10(num / den) - - -@saving.register_keras_serializable() -def spectral_loss(y_true, y_pred): - """Mean absolute error in the STFT domain.""" - y_true = target_to_stft(y_true) - return ops.mean(ops.absolute(y_true - y_pred)) - -``` - ---- -## Training - -### Visualize Model Architecture - - -```python -# Load or create the model -if path.exists(MODEL_PATH): - model = saving.load_model(MODEL_PATH) -else: - model = build_model(keras.Input(sample_batch_x.shape[1:]), name="tfc_tdf_net") - -# Display the model architecture -model.summary() -img = keras.utils.plot_model(model, path.join(TMP_DIR, "model.png"), show_shapes=True) -display.display(img) -``` - - -
Model: "tfc_tdf_net"
-
- - - - -
┏━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━┓
-┃ Layer (type)         Output Shape          Param #  Connected to      ┃
-┡━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━┩
-│ input_layer         │ (None, 65024)     │          0 │ -                 │
-│ (InputLayer)        │                   │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ stft (STFT)         │ [(None, 128,      │          0 │ input_layer[0][0] │
-│                     │ 1025), (None,     │            │                   │
-│                     │ 128, 1025)]       │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ expand_dims         │ (None, 128, 1025, │          0 │ stft[0][0]        │
-│ (ExpandDims)        │ 1)                │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ expand_dims_1       │ (None, 128, 1025, │          0 │ stft[0][1]        │
-│ (ExpandDims)        │ 1)                │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ concatenate         │ (None, 128, 1025, │          0 │ expand_dims[0][0… │
-│ (Concatenate)       │ 2)                │            │ expand_dims_1[0]… │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ split (Split)       │ [(None, 128,      │          0 │ concatenate[0][0] │
-│                     │ 1024, 2), (None,  │            │                   │
-│                     │ 128, 1, 2)]       │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ reshape (Reshape)   │ (None, 128, 256,  │          0 │ split[0][0]       │
-│                     │ 8)                │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ conv2d (Conv2D)     │ (None, 128, 256,  │        512 │ reshape[0][0]     │
-│                     │ 64)               │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 128, 256,  │    287,744 │ conv2d[0][0]      │
-│ (TimeFrequencyTran…64)               │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ downscale           │ (None, 64, 128,   │     49,536 │ time_frequency_t… │
-│ (Downscale)         │ 192)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 64, 128,   │  1,436,672 │ downscale[0][0]   │
-│ (TimeFrequencyTran…192)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ downscale_1         │ (None, 32, 64,    │    246,400 │ time_frequency_t… │
-│ (Downscale)         │ 320)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 32, 64,    │  3,904,512 │ downscale_1[0][0] │
-│ (TimeFrequencyTran…320)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ downscale_2         │ (None, 16, 32,    │    574,336 │ time_frequency_t… │
-│ (Downscale)         │ 448)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 16, 32,    │  7,635,968 │ downscale_2[0][0] │
-│ (TimeFrequencyTran…448)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ downscale_3         │ (None, 8, 16,     │  1,033,344 │ time_frequency_t… │
-│ (Downscale)         │ 576)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 8, 16,     │ 12,617,216 │ downscale_3[0][0] │
-│ (TimeFrequencyTran…576)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ upscale (Upscale)   │ (None, 16, 32,    │  1,033,088 │ time_frequency_t… │
-│                     │ 448)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ concatenate_1       │ (None, 16, 32,    │          0 │ upscale[0][0],    │
-│ (Concatenate)       │ 896)              │            │ time_frequency_t… │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 16, 32,    │ 15,065,600 │ concatenate_1[0]… │
-│ (TimeFrequencyTran…448)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ upscale_1 (Upscale) │ (None, 32, 64,    │    574,080 │ time_frequency_t… │
-│                     │ 320)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ concatenate_2       │ (None, 32, 64,    │          0 │ upscale_1[0][0],  │
-│ (Concatenate)       │ 640)              │            │ time_frequency_t… │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 32, 64,    │  7,695,872 │ concatenate_2[0]… │
-│ (TimeFrequencyTran…320)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ upscale_2 (Upscale) │ (None, 64, 128,   │    246,144 │ time_frequency_t… │
-│                     │ 192)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ concatenate_3       │ (None, 64, 128,   │          0 │ upscale_2[0][0],  │
-│ (Concatenate)       │ 384)              │            │ time_frequency_t… │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 64, 128,   │  2,802,176 │ concatenate_3[0]… │
-│ (TimeFrequencyTran…192)              │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ upscale_3 (Upscale) │ (None, 128, 256,  │     49,280 │ time_frequency_t… │
-│                     │ 64)               │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ concatenate_4       │ (None, 128, 256,  │          0 │ upscale_3[0][0],  │
-│ (Concatenate)       │ 128)              │            │ time_frequency_t… │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ time_frequency_tra… │ (None, 128, 256,  │    439,808 │ concatenate_4[0]… │
-│ (TimeFrequencyTran…64)               │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ multiply (Multiply) │ (None, 128, 256,  │          0 │ time_frequency_t… │
-│                     │ 64)               │            │ conv2d[0][0]      │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ concatenate_5       │ (None, 128, 256,  │          0 │ reshape[0][0],    │
-│ (Concatenate)       │ 72)               │            │ multiply[0][0]    │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ conv2d_59 (Conv2D)  │ (None, 128, 256,  │      4,608 │ concatenate_5[0]… │
-│                     │ 64)               │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ conv2d_60 (Conv2D)  │ (None, 128, 256,  │        512 │ conv2d_59[0][0]   │
-│                     │ 8)                │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ reshape_1 (Reshape) │ (None, 128, 1024, │          0 │ conv2d_60[0][0]   │
-│                     │ 1, 2)             │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ transpose           │ (None, 1, 128,    │          0 │ reshape_1[0][0]   │
-│ (Transpose)         │ 1024, 2)          │            │                   │
-├─────────────────────┼───────────────────┼────────────┼───────────────────┤
-│ reshape_2 (Reshape) │ (None, 1, 128,    │          0 │ transpose[0][0]   │
-│                     │ 2048)             │            │                   │
-└─────────────────────┴───────────────────┴────────────┴───────────────────┘
-
- - - - -
 Total params: 222,789,634 (849.88 MB)
-
- - - - -
 Trainable params: 55,697,408 (212.47 MB)
-
- - - - -
 Non-trainable params: 0 (0.00 B)
-
- - - - -
 Optimizer params: 167,092,226 (637.41 MB)
-
- - - - - -![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_20_6.png) - - - -### Compile and Train the Model - - -```python -# Compile the model -optimizer = keras.optimizers.Adam(5e-05, gradient_accumulation_steps=ACCUMULATION_STEPS) -model.compile(optimizer=optimizer, loss=spectral_loss, metrics=[sdr]) - -# Define callbacks -cbs = [ - callbacks.ModelCheckpoint(MODEL_PATH, "val_sdr", save_best_only=True, mode="max"), - callbacks.ReduceLROnPlateau(factor=0.95, patience=2), - callbacks.CSVLogger(CSV_LOG_PATH), -] - -if not path.exists(MODEL_PATH): - model.fit(train_ds, validation_data=val_ds, epochs=10, callbacks=cbs, shuffle=False) -else: - # Demonstration of a single epoch of training when model already exists - model.fit(train_ds, validation_data=val_ds, epochs=1, shuffle=False, verbose=2) -``` - -
-``` -2000/2000 - 490s - 245ms/step - loss: 0.2977 - sdr: 5.6497 - val_loss: 0.1720 - val_sdr: 6.0508 - -``` -
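-
-The `gradient_accumulation_steps` argument used above means the optimizer effectively averages gradients over several micro-batches before applying a single update, giving an effective batch size of `BATCH_SIZE * ACCUMULATION_STEPS`. A minimal NumPy sketch of that idea on a toy linear model (synthetic data; this illustrates the math, not Keras internals):
-
-```python
-import numpy as np
-
-
-def grad_mse(w, x, y):
-    # d/dw mean((w * x - y)^2) = mean(2 * x * (w * x - y))
-    return np.mean(2 * x * (w * x - y))
-
-
-rng = np.random.default_rng(0)
-x = rng.normal(size=8)
-y = rng.normal(size=8)
-w = 0.5
-
-# One gradient on the full batch of 8...
-full_grad = grad_mse(w, x, y)
-
-# ...equals the average of gradients accumulated over 4 micro-batches of 2.
-micro_grads = [grad_mse(w, x[i : i + 2], y[i : i + 2]) for i in range(0, 8, 2)]
-acc_grad = np.mean(micro_grads)
-
-assert np.isclose(full_grad, acc_grad)
-```
-
-This is why accumulation lets us train with large effective batches while only ever holding `BATCH_SIZE` chunks in memory at once.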
---- -## Evaluation - -Evaluate the model on the validation dataset and visualize predicted vocals. - - -```python -model.evaluate(val_ds, verbose=2) -y_pred = model.predict(sample_batch_x, verbose=2) -y_pred = prediction_to_wave(y_pred) -visualize_audio_np(ops.convert_to_numpy(y_pred[0, 0]), name="vocals_pred") -``` - -
-``` -200/200 - 8s - 41ms/step - loss: 0.1747 - sdr: 5.9374 - -1/1 - 4s - 4s/step - -``` -
- -![png](/img/examples/audio/vocal_track_separation/vocal_track_separation_24_2.png) - - - - - - - - - ---- -## Conclusion - -We built and trained a vocal track separation model using an encoder-decoder -architecture with custom blocks applied to the MUSDB18 dataset. We demonstrated -STFT-based preprocessing, data augmentation, and a source separation metric (SDR). - -**Next steps:** - -- Train for more epochs and refine hyperparameters. -- Separate multiple instruments simultaneously. -- Enhance the model to handle instruments not present in the mixture. -
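-
-A practical follow-up is running the separator over full-length songs, which are longer than the fixed model input of 65024 samples (see the model summary). Below is a minimal sketch of chunked inference; `separate_long_audio` is a hypothetical helper, and with the trained model `separate_fn` would wrap `model.predict` followed by `prediction_to_wave`, while here an identity stand-in simply verifies the chunking round-trip:
-
-```python
-import numpy as np
-
-CHUNK = 65024  # model input length, from the summary above
-
-
-def separate_long_audio(audio, separate_fn, chunk_size=CHUNK):
-    """Split a long mono waveform into fixed-size chunks, run the separator
-    on each, and concatenate the results (zero-padding the last chunk, then
-    trimming back to the original length)."""
-    n = len(audio)
-    padded = np.pad(audio, (0, -n % chunk_size))
-    chunks = padded.reshape(-1, chunk_size)
-    out = np.concatenate([separate_fn(c) for c in chunks])
-    return out[:n]
-
-
-# Identity stand-in for the trained separator: the round-trip must be lossless.
-audio = np.random.default_rng(0).normal(size=100000).astype("float32")
-restored = separate_long_audio(audio, lambda c: c)
-assert np.array_equal(restored, audio)
-```
-
-Note that hard chunk boundaries can introduce audible seams; overlap-add with a crossfade window is a common refinement.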