
GitHub Repository: better-data-science/TensorFlow
Path: blob/main/003_TensorFlow_Classification.ipynb
Views: 47
Kernel: Python 3 (ipykernel)
import numpy as np
import pandas as pd

df = pd.read_csv('data/winequalityN.csv')
df.sample(5)
df.shape
(6497, 13)
df.isnull().sum()
type                     0
fixed acidity           10
volatile acidity         8
citric acid              3
residual sugar           2
chlorides                2
free sulfur dioxide      0
total sulfur dioxide     0
density                  0
pH                       9
sulphates                4
alcohol                  0
quality                  0
dtype: int64

Drop missing values:

df = df.dropna()
df.isnull().sum()
type                    0
fixed acidity           0
volatile acidity        0
citric acid             0
residual sugar          0
chlorides               0
free sulfur dioxide     0
total sulfur dioxide    0
density                 0
pH                      0
sulphates               0
alcohol                 0
quality                 0
dtype: int64

Encode string data:

df['type'].value_counts()
white    4870
red      1593
Name: type, dtype: int64
df['is_white_wine'] = [1 if typ == 'white' else 0 for typ in df['type']]
df.drop('type', axis=1, inplace=True)
df.head()
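The same encoding can be written without a list comprehension using `Series.map`; a minimal sketch on a toy column (the toy values are illustrative, mirroring the two wine types above):

```python
import pandas as pd

# Toy column standing in for df['type']
wine_type = pd.Series(['white', 'red', 'white'])

# Vectorized equivalent of the comprehension above:
# map each category to its indicator value
is_white = wine_type.map({'white': 1, 'red': 0})
print(is_white.tolist())  # [1, 0, 1]
```

`map` also makes unexpected categories easy to spot, since any value missing from the mapping becomes `NaN` rather than silently receiving 0.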

All data is numeric now:

df.dtypes
fixed acidity           float64
volatile acidity        float64
citric acid             float64
residual sugar          float64
chlorides               float64
free sulfur dioxide     float64
total sulfur dioxide    float64
density                 float64
pH                      float64
sulphates               float64
alcohol                 float64
quality                   int64
is_white_wine             int64
dtype: object

Convert to a binary classification problem

  • The quality score makes this a multiclass problem by default

  • We can turn it into a binary one by declaring wines at or above some quality threshold good and the rest bad

df['quality'].value_counts()
6    2820
5    2128
7    1074
4     214
8     192
3      30
9       5
Name: quality, dtype: int64
  • With a threshold of 6, we'll have 63.3% good wines and the rest bad

len(df[df['quality'] >= 6]) / len(df)
0.6329877765743462
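The list-comprehension conversion in the next cell can equivalently be written as a vectorized comparison; a minimal sketch on a toy Series (the cutoff of 6 matches the notebook's choice):

```python
import pandas as pd

# Toy quality column standing in for df['quality']
quality = pd.Series([3, 5, 6, 7, 9])

# 1 for "good" wines (quality >= 6), 0 otherwise
is_good = (quality >= 6).astype(int)
print(is_good.tolist())  # [0, 0, 1, 1, 1]
```

On a full DataFrame column this avoids the Python-level loop and stays readable as the threshold changes.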
df['is_good_wine'] = [1 if quality >= 6 else 0 for quality in df['quality']]
df.drop('quality', axis=1, inplace=True)
df.head()
df['is_good_wine'].value_counts()
1    4091
0    2372
Name: is_good_wine, dtype: int64
df.head()

Train/Test split

from sklearn.model_selection import train_test_split

X = df.drop('is_good_wine', axis=1)
y = df['is_good_wine']

X_train, X_test, y_train, y_test = train_test_split(
    X, y,
    test_size=0.2,
    random_state=42
)
X_train.shape, X_test.shape
((5170, 12), (1293, 12))
X_train.head()

Data scaling

  • Input features aren't on the same scale, so we'll fix it quickly:

from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)
X_train_scaled[:3]
array([[-0.86265684,  0.56588915,  0.22079121,  0.75048207,  0.07674805,  2.9415276 ,
         2.35882933,  0.788386  ,  0.94784355, -0.20357893, -1.66472797,  0.57094748],
       [ 0.99186667, -1.02945526,  2.92098728, -0.3929423 , -0.17512717, -0.08344183,
         0.21383119, -0.64578381,  0.38700578, -0.87653487,  1.63149383,  0.57094748],
       [-1.55810316, -0.72265826,  0.98238498,  0.14758559,  0.27265101,  0.5887736 ,
         1.25973937, -0.37229096, -0.17383198, -0.74194369, -0.62233304,  0.57094748]])
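The key detail above is that the scaler is fitted on the training split only and merely applied to the test split, so no test-set statistics leak into training. A small self-contained check of what `StandardScaler` does (toy data, not the wine features):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_tr = rng.normal(loc=10.0, scale=3.0, size=(100, 2))  # stand-in for X_train
X_te = rng.normal(loc=10.0, scale=3.0, size=(20, 2))   # stand-in for X_test

scaler = StandardScaler()
X_tr_s = scaler.fit_transform(X_tr)  # learns mean/std from the train split only
X_te_s = scaler.transform(X_te)      # reuses those train statistics

# Train columns come out standardized to mean ~0 and std ~1
print(np.allclose(X_tr_s.mean(axis=0), 0.0, atol=1e-9))  # True
print(np.allclose(X_tr_s.std(axis=0), 1.0, atol=1e-9))   # True
```

The test columns will be close to, but not exactly, mean 0 and std 1, which is expected: they were transformed with the training statistics.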

Model training

import tensorflow as tf
  • This architecture is arbitrary; no tuning went into choosing it

  • Use sigmoid as the activation function in the last layer when working with binary classification problems

  • Use binary_crossentropy as a loss function when working with binary classification problems

  • We'll track accuracy, precision, and recall and train for 100 epochs

tf.random.set_seed(42)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid')
])

model.compile(
    loss=tf.keras.losses.binary_crossentropy,
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.03),  # `lr` is deprecated
    metrics=[
        tf.keras.metrics.BinaryAccuracy(name='accuracy'),
        tf.keras.metrics.Precision(name='precision'),
        tf.keras.metrics.Recall(name='recall')
    ]
)

history = model.fit(X_train_scaled, y_train, epochs=100)
Metal device set to: Apple M1
Epoch 1/100
162/162 [==============================] - 2s 7ms/step - loss: 0.6260 - accuracy: 0.7044 - precision: 0.7356 - recall: 0.8338
Epoch 2/100
162/162 [==============================] - 1s 7ms/step - loss: 0.5383 - accuracy: 0.7298 - precision: 0.7659 - recall: 0.8268
...
Epoch 99/100
162/162 [==============================] - 1s 7ms/step - loss: 0.4048 - accuracy: 0.8244 - precision: 0.8794 - recall: 0.8381
Epoch 100/100
162/162 [==============================] - 1s 7ms/step - loss: 0.3995 - accuracy: 0.8215 - precision: 0.8791 - recall: 0.8332

Model performance visualization

import matplotlib.pyplot as plt
from matplotlib import rcParams

rcParams['figure.figsize'] = (18, 8)
rcParams['axes.spines.top'] = False
rcParams['axes.spines.right'] = False
plt.plot(np.arange(1, 101), history.history['loss'], label='Loss')
plt.plot(np.arange(1, 101), history.history['accuracy'], label='Accuracy')
plt.plot(np.arange(1, 101), history.history['precision'], label='Precision')
plt.plot(np.arange(1, 101), history.history['recall'], label='Recall')
plt.title('Evaluation metrics', size=20)
plt.xlabel('Epoch', size=14)
plt.legend();
[Figure: loss, accuracy, precision, and recall plotted over the 100 training epochs]
  • You could keep training the model, since accuracy, precision, and recall still appear to be improving slightly
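One common way to keep training while guarding against overfitting is to hold out a validation split and stop when it stops improving. A minimal sketch using Keras' `EarlyStopping` callback (the `patience` value and the commented `fit()` settings are illustrative, not from this notebook):

```python
import tensorflow as tf

# Stop once validation loss hasn't improved for 10 consecutive epochs,
# then roll the model back to the best weights seen so far
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor='val_loss',
    patience=10,
    restore_best_weights=True,
)

# Passed to fit() alongside a validation split, e.g.:
# model.fit(X_train_scaled, y_train, epochs=200,
#           validation_split=0.2, callbacks=[early_stop])
print(early_stop.patience)  # 10
```

With this in place you could raise `epochs` well past 100 and let the validation loss decide when to stop.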


Making predictions

predictions = model.predict(X_test_scaled)
predictions
array([[0.33089876], [0.31528813], [0.59943026], ..., [0.79872453], [0.3015677 ], [0.5736396 ]], dtype=float32)
  • These are probabilities - here's how to convert them to classes (threshold = 0.5)

prediction_classes = [1 if prob > 0.5 else 0 for prob in np.ravel(predictions)]
print(prediction_classes[:20])
[0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1]
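The 0.5 cutoff is a choice, not a requirement: sweeping the threshold trades precision against recall. A self-contained sketch with hand-made probabilities (toy values, not the model's actual outputs):

```python
import numpy as np

# Toy ground truth and predicted probabilities
y_true = np.array([0, 0, 1, 1, 1, 0, 1])
probs = np.array([0.2, 0.4, 0.6, 0.8, 0.55, 0.65, 0.3])

for threshold in (0.3, 0.5, 0.7):
    preds = (probs >= threshold).astype(int)
    tp = np.sum((preds == 1) & (y_true == 1))
    fp = np.sum((preds == 1) & (y_true == 0))
    fn = np.sum((preds == 0) & (y_true == 1))
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # Raising the threshold raises precision and lowers recall
    print(f'{threshold:.1f}: precision={precision:.2f} recall={recall:.2f}')
```

If false positives and false negatives have different costs for your problem, tuning this threshold on a validation set is often worth more than extra training epochs.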

Model evaluation

  • Evaluation on the test set:

loss, accuracy, precision, recall = model.evaluate(X_test_scaled, y_test)
loss, accuracy, precision, recall
41/41 [==============================] - 0s 6ms/step - loss: 0.5380 - accuracy: 0.7579 - precision: 0.8578 - recall: 0.7361
(0.5380053520202637, 0.7579273581504822, 0.857758641242981, 0.7361282110214233)
from sklearn.metrics import confusion_matrix

print(confusion_matrix(y_test, prediction_classes))
[[383  99]
 [214 597]]
  • Reading the matrix: 383 true negatives, 99 false positives, 214 false negatives, 597 true positives

  • Further evaluation:

from sklearn.metrics import accuracy_score, precision_score, recall_score

print(f'Accuracy: {accuracy_score(y_test, prediction_classes):.2f}')
print(f'Precision: {precision_score(y_test, prediction_classes):.2f}')
print(f'Recall: {recall_score(y_test, prediction_classes):.2f}')
Accuracy: 0.76
Precision: 0.86
Recall: 0.74
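Beyond accuracy, precision, and recall, `sklearn.metrics` also offers `f1_score` and `classification_report` for a per-class breakdown. A self-contained sketch on toy labels (not the test-set predictions above):

```python
from sklearn.metrics import classification_report, f1_score

# Toy labels standing in for y_test and prediction_classes
y_true = [0, 0, 1, 1, 1, 0, 1, 1]
y_pred = [0, 1, 1, 1, 0, 0, 1, 1]

# F1 is the harmonic mean of precision and recall
print(f'F1: {f1_score(y_true, y_pred):.2f}')  # F1: 0.80

# Per-class precision, recall, F1, and support in one table
print(classification_report(y_true, y_pred))
```

On an imbalanced split like this one (63.3% good wines), the per-class view is useful because accuracy alone can look decent while one class is predicted poorly.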