{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# RSNA Intracranial Hemorrhage Detection "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<b>Competition Overview</b><br/><br/>\n",
"Intracranial hemorrhage, bleeding that occurs inside the cranium, is a serious health problem requiring rapid and often intensive medical treatment. For example, intracranial hemorrhages account for approximately 10% of strokes in the U.S., where stroke is the fifth-leading cause of death. Identifying the location and type of any hemorrhage present is a critical step in treating the patient.\n",
"\n",
"Diagnosis requires an urgent procedure. When a patient shows acute neurological symptoms such as severe headache or loss of consciousness, highly trained specialists review medical images of the patient’s cranium to look for the presence, location and type of hemorrhage. The process is complicated and often time consuming."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<b>What am i predicting?</b><br/><br/>\n",
"In this competition our goal is to predict intracranial hemorrhage and its subtypes. Given an image the we need to predict probablity of each subtype. This indicates its a multilabel classification problem."
]
},
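{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make the multi-label setup concrete, here is an illustrative (purely hypothetical) example of the target vector and a prediction for a single image: each of the six labels gets its own independent 0/1 flag and its own predicted probability, which is why a per-label sigmoid output (rather than a softmax across labels) is the natural choice.\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"subtypes = ['any', 'epidural', 'intraparenchymal',\n",
"            'intraventricular', 'subarachnoid', 'subdural']\n",
"\n",
"# hypothetical ground truth for one image: a subdural bleed is present\n",
"y_true = np.array([1, 0, 0, 0, 0, 1], dtype=np.float32)\n",
"\n",
"# hypothetical model output: one independent probability per label\n",
"y_pred = np.array([0.91, 0.02, 0.05, 0.03, 0.10, 0.84], dtype=np.float32)\n",
"```"
]
},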
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<b>Competition Evaluation Metric</b><br/><br/>\n",
"Evaluation metric is weighted multi-label logarithmic loss. So for given image we need to predict probality for each subtype. There is also an any label, which indicates that a hemorrhage of ANY kind exists in the image. The any label is weighted more highly than specific hemorrhage sub-types.\n",
"\n",
"<b>Note:</b>The weights for each subtype for calculating weighted multi-label logarithmic loss is **not** given as part of the competition. We will be using binary cross entropy loss as weights are not available"
]
},
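{
"cell_type": "markdown",
"metadata": {},
"source": [
"Below is a minimal sketch of how a weighted multi-label log loss could be computed. The per-label weights here are **assumed for illustration only** (the official weights were not published); the notebook itself trains with plain binary cross-entropy.\n",
"\n",
"```python\n",
"import numpy as np\n",
"\n",
"def weighted_multilabel_log_loss(y_true, y_pred, weights, eps=1e-7):\n",
"    # y_true, y_pred: arrays of shape (n_images, 6); weights: one weight per label\n",
"    y_pred = np.clip(y_pred, eps, 1 - eps)\n",
"    per_label = -(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))\n",
"    # weighted average of the per-label mean losses (the 'any' column weighted higher)\n",
"    return np.average(per_label.mean(axis=0), weights=weights)\n",
"\n",
"# assumed example weights: 'any' counted twice as heavily as each specific subtype\n",
"example_weights = np.array([2, 1, 1, 1, 1, 1], dtype=np.float32)\n",
"```"
]
},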
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<b>Dataset Description</b>\n",
"\n",
"The dataset is divided into two parts\n",
"\n",
"1. Train\n",
"2. Test\n",
"\n",
"**1. Train**\n",
"Number of rows: 40,45,548 records.\n",
"Number of columns: 2\n",
"\n",
"Columns:\n",
"\n",
"**Id**: An image Id. Each Id corresponds to a unique image, and will contain an underscore.\n",
"\n",
"Example: ID_28fbab7eb_epidural. So the Id consists of two parts one is image file id ID_28fbab7eb and the other is sub type name\n",
"\n",
"**Label**: The target label whether that sub-type of hemorrhage (or any hemorrhage in the case of any) exists in the indicated image. 1 --> Exists and 0 --> Doesn't exist.\n",
"\n",
"**2. Test**\n",
"Number of rows: 4,71,270 records.\n",
"\n",
"Columns:\n",
"\n",
"**Id**: An image Id. Each Id corresponds to a unique image, and will contain an underscore.\n",
"\n",
"Example: ID_28fbab7eb_epidural. So the Id consists of two parts one is image file id ID_28fbab7eb and the other is sub type name"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"Using TensorFlow backend.\n"
]
}
],
"source": [
"import numpy as np\n",
"import pandas as pd\n",
"import pydicom\n",
"import os\n",
"import glob\n",
"import random\n",
"import cv2\n",
"import tensorflow as tf\n",
"from math import ceil, floor\n",
"from tqdm import tqdm\n",
"from imgaug import augmenters as iaa\n",
"import matplotlib.pyplot as plt\n",
"from math import ceil, floor\n",
"import keras\n",
"import keras.backend as K\n",
"from keras.callbacks import Callback, ModelCheckpoint\n",
"from keras.layers import Dense, Flatten, Dropout\n",
"from keras.models import Model, load_model\n",
"from keras.utils import Sequence\n",
"from keras.losses import binary_crossentropy\n",
"from keras.optimizers import Adam"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"# Random Seed\n",
"SEED = 42\n",
"np.random.seed(SEED)\n",
"\n",
"# some constants\n",
"TEST_SIZE = 0.06\n",
"HEIGHT = 256\n",
"WIDTH = 256\n",
"TRAIN_BATCH_SIZE = 32\n",
"VALID_BATCH_SIZE = 64\n",
"\n",
"# Train and Test folders\n",
"input_folder = '../input/rsna-intracranial-hemorrhage-detection/'\n",
"path_train_img = input_folder + 'stage_1_train_images/'\n",
"path_test_img = input_folder + 'stage_1_test_images/'"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>ID</th>\n",
" <th>Label</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>ID_63eb1e259_epidural</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>ID_63eb1e259_intraparenchymal</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>ID_63eb1e259_intraventricular</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>ID_63eb1e259_subarachnoid</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>ID_63eb1e259_subdural</td>\n",
" <td>0</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" ID Label\n",
"0 ID_63eb1e259_epidural 0\n",
"1 ID_63eb1e259_intraparenchymal 0\n",
"2 ID_63eb1e259_intraventricular 0\n",
"3 ID_63eb1e259_subarachnoid 0\n",
"4 ID_63eb1e259_subdural 0"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"train_df = pd.read_csv(input_folder + 'stage_1_train.csv')\n",
"train_df.head()"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>ID</th>\n",
" <th>Label</th>\n",
" <th>sub_type</th>\n",
" <th>file_name</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>ID_63eb1e259_epidural</td>\n",
" <td>0</td>\n",
" <td>epidural</td>\n",
" <td>ID_63eb1e259.dcm</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>ID_63eb1e259_intraparenchymal</td>\n",
" <td>0</td>\n",
" <td>intraparenchymal</td>\n",
" <td>ID_63eb1e259.dcm</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>ID_63eb1e259_intraventricular</td>\n",
" <td>0</td>\n",
" <td>intraventricular</td>\n",
" <td>ID_63eb1e259.dcm</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>ID_63eb1e259_subarachnoid</td>\n",
" <td>0</td>\n",
" <td>subarachnoid</td>\n",
" <td>ID_63eb1e259.dcm</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>ID_63eb1e259_subdural</td>\n",
" <td>0</td>\n",
" <td>subdural</td>\n",
" <td>ID_63eb1e259.dcm</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" ID Label sub_type file_name\n",
"0 ID_63eb1e259_epidural 0 epidural ID_63eb1e259.dcm\n",
"1 ID_63eb1e259_intraparenchymal 0 intraparenchymal ID_63eb1e259.dcm\n",
"2 ID_63eb1e259_intraventricular 0 intraventricular ID_63eb1e259.dcm\n",
"3 ID_63eb1e259_subarachnoid 0 subarachnoid ID_63eb1e259.dcm\n",
"4 ID_63eb1e259_subdural 0 subdural ID_63eb1e259.dcm"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extract subtype\n",
"train_df['sub_type'] = train_df['ID'].apply(lambda x: x.split('_')[-1])\n",
"# extract filename\n",
"train_df['file_name'] = train_df['ID'].apply(lambda x: '_'.join(x.split('_')[:2]) + '.dcm')\n",
"train_df.head()"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(4045572, 4)"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"train_df.shape"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(4045548, 4)"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# remove duplicates\n",
"train_df.drop_duplicates(['Label', 'sub_type', 'file_name'], inplace=True)\n",
"train_df.shape"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Number of train images availabe: 674258\n"
]
}
],
"source": [
"print(\"Number of train images availabe:\", len(os.listdir(path_train_img)))"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th>sub_type</th>\n",
" <th>any</th>\n",
" <th>epidural</th>\n",
" <th>intraparenchymal</th>\n",
" <th>intraventricular</th>\n",
" <th>subarachnoid</th>\n",
" <th>subdural</th>\n",
" </tr>\n",
" <tr>\n",
" <th>file_name</th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>ID_000039fa0.dcm</th>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_00005679d.dcm</th>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_00008ce3c.dcm</th>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_0000950d7.dcm</th>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_0000aee4b.dcm</th>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" <td>0</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
"sub_type any epidural intraparenchymal intraventricular \\\n",
"file_name \n",
"ID_000039fa0.dcm 0 0 0 0 \n",
"ID_00005679d.dcm 0 0 0 0 \n",
"ID_00008ce3c.dcm 0 0 0 0 \n",
"ID_0000950d7.dcm 0 0 0 0 \n",
"ID_0000aee4b.dcm 0 0 0 0 \n",
"\n",
"sub_type subarachnoid subdural \n",
"file_name \n",
"ID_000039fa0.dcm 0 0 \n",
"ID_00005679d.dcm 0 0 \n",
"ID_00008ce3c.dcm 0 0 \n",
"ID_0000950d7.dcm 0 0 \n",
"ID_0000aee4b.dcm 0 0 "
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"train_final_df = pd.pivot_table(train_df.drop(columns='ID'), index=\"file_name\", \\\n",
" columns=\"sub_type\", values=\"Label\")\n",
"train_final_df.head()"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(674258, 6)"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"train_final_df.shape"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [],
"source": [
"# Invalid image ID_6431af929.dcm\n",
"train_final_df.drop('ID_6431af929.dcm', inplace=True)"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Collecting efficientnet\n",
" Downloading https://files.pythonhosted.org/packages/97/82/f3ae07316f0461417dc54affab6e86ab188a5a22f33176d35271628b96e0/efficientnet-1.0.0-py3-none-any.whl\n",
"Requirement already satisfied: keras-applications<=1.0.8,>=1.0.7 in /opt/conda/lib/python3.6/site-packages (from efficientnet) (1.0.8)\n",
"Requirement already satisfied: scikit-image in /opt/conda/lib/python3.6/site-packages (from efficientnet) (0.16.1)\n",
"Requirement already satisfied: numpy>=1.9.1 in /opt/conda/lib/python3.6/site-packages (from keras-applications<=1.0.8,>=1.0.7->efficientnet) (1.16.4)\n",
"Requirement already satisfied: h5py in /opt/conda/lib/python3.6/site-packages (from keras-applications<=1.0.8,>=1.0.7->efficientnet) (2.9.0)\n",
"Requirement already satisfied: PyWavelets>=0.4.0 in /opt/conda/lib/python3.6/site-packages (from scikit-image->efficientnet) (1.0.3)\n",
"Requirement already satisfied: scipy>=0.19.0 in /opt/conda/lib/python3.6/site-packages (from scikit-image->efficientnet) (1.2.1)\n",
"Requirement already satisfied: networkx>=2.0 in /opt/conda/lib/python3.6/site-packages (from scikit-image->efficientnet) (2.4)\n",
"Requirement already satisfied: imageio>=2.3.0 in /opt/conda/lib/python3.6/site-packages (from scikit-image->efficientnet) (2.6.0)\n",
"Requirement already satisfied: pillow>=4.3.0 in /opt/conda/lib/python3.6/site-packages (from scikit-image->efficientnet) (5.4.1)\n",
"Requirement already satisfied: matplotlib!=3.0.0,>=2.0.0 in /opt/conda/lib/python3.6/site-packages (from scikit-image->efficientnet) (3.0.3)\n",
"Requirement already satisfied: six in /opt/conda/lib/python3.6/site-packages (from h5py->keras-applications<=1.0.8,>=1.0.7->efficientnet) (1.12.0)\n",
"Requirement already satisfied: decorator>=4.3.0 in /opt/conda/lib/python3.6/site-packages (from networkx>=2.0->scikit-image->efficientnet) (4.4.0)\n",
"Requirement already satisfied: cycler>=0.10 in /opt/conda/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet) (0.10.0)\n",
"Requirement already satisfied: kiwisolver>=1.0.1 in /opt/conda/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet) (1.1.0)\n",
"Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /opt/conda/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet) (2.4.2)\n",
"Requirement already satisfied: python-dateutil>=2.1 in /opt/conda/lib/python3.6/site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet) (2.8.0)\n",
"Requirement already satisfied: setuptools in /opt/conda/lib/python3.6/site-packages (from kiwisolver>=1.0.1->matplotlib!=3.0.0,>=2.0.0->scikit-image->efficientnet) (41.4.0)\n",
"Installing collected packages: efficientnet\n",
"Successfully installed efficientnet-1.0.0\n",
"Collecting iterative-stratification\n",
" Downloading https://files.pythonhosted.org/packages/9d/79/9ba64c8c07b07b8b45d80725b2ebd7b7884701c1da34f70d4749f7b45f9a/iterative_stratification-0.1.6-py3-none-any.whl\n",
"Requirement already satisfied: scikit-learn in /opt/conda/lib/python3.6/site-packages (from iterative-stratification) (0.21.3)\n",
"Requirement already satisfied: scipy in /opt/conda/lib/python3.6/site-packages (from iterative-stratification) (1.2.1)\n",
"Requirement already satisfied: numpy in /opt/conda/lib/python3.6/site-packages (from iterative-stratification) (1.16.4)\n",
"Requirement already satisfied: joblib>=0.11 in /opt/conda/lib/python3.6/site-packages (from scikit-learn->iterative-stratification) (0.13.2)\n",
"Installing collected packages: iterative-stratification\n",
"Successfully installed iterative-stratification-0.1.6\n"
]
}
],
"source": [
"# Install Efficient Net as it is not part of Keras\n",
"!pip install efficientnet\n",
"!pip install iterative-stratification"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [],
"source": [
"import efficientnet.keras as efn \n",
"from iterstrat.ml_stratifiers import MultilabelStratifiedShuffleSplit"
]
},
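{
"cell_type": "markdown",
"metadata": {},
"source": [
"Why iterative-stratification? With six correlated binary labels, a plain random train/validation split can leave rare labels such as epidural under-represented. A generic usage sketch of `MultilabelStratifiedShuffleSplit` follows; the exact split performed later in the notebook may differ.\n",
"\n",
"```python\n",
"# generic usage sketch (assumption: the split actually used later may differ)\n",
"from iterstrat.ml_stratifiers import MultilabelStratifiedShuffleSplit\n",
"\n",
"splitter = MultilabelStratifiedShuffleSplit(n_splits=1, test_size=TEST_SIZE,\n",
"                                            random_state=SEED)\n",
"train_idx, valid_idx = next(splitter.split(train_final_df, train_final_df))\n",
"train_split = train_final_df.iloc[train_idx]\n",
"valid_split = train_final_df.iloc[valid_idx]\n",
"```"
]
},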
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
"from IPython.display import HTML\n",
"\n",
"def create_download_link(title = \"Download CSV file\", filename = \"data.csv\"): \n",
" \"\"\"\n",
" Helper function to generate download link to files in kaggle kernel \n",
" \"\"\"\n",
" html = '<a href={filename}>{title}</a>'\n",
" html = html.format(title=title,filename=filename)\n",
" return HTML(html)"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [],
"source": [
"def get_dicom_field_value(val):\n",
" \"\"\"\n",
" Helper function to get value of dicom field in dicom file\n",
" \"\"\"\n",
" if type(val) == pydicom.multival.MultiValue:\n",
" return int(val[0])\n",
" else:\n",
" return int(val)\n",
"\n",
"def get_windowing(data):\n",
" \"\"\"\n",
" Helper function to extract meta data features in dicom file\n",
" return: window center, window width, slope, intercept\n",
" \"\"\"\n",
" dicom_fields = [data.WindowCenter, data.WindowWidth, data.RescaleSlope, data.RescaleIntercept]\n",
" return [get_dicom_field_value(x) for x in dicom_fields]\n",
"\n",
"\n",
"def get_windowed_image(image, wc, ww, slope, intercept):\n",
" \"\"\"\n",
" Helper function to construct windowed image from meta data features\n",
" return: windowed image\n",
" \"\"\"\n",
" img = (image*slope +intercept)\n",
" img_min = wc - ww//2\n",
" img_max = wc + ww//2\n",
" img[img<img_min] = img_min\n",
" img[img>img_max] = img_max\n",
" return img \n",
"\n",
"\n",
"def _normalize(img):\n",
" if img.max() == img.min():\n",
" return np.zeros(img.shape)\n",
" return 2 * (img - img.min())/(img.max() - img.min()) - 1\n",
"\n",
"def _read(path, desired_size=(224, 224)):\n",
" \"\"\"\n",
" Helper function to generate windowed image \n",
" \"\"\"\n",
" # 1. read dicom file\n",
" dcm = pydicom.dcmread(path)\n",
" \n",
" # 2. Extract meta data features\n",
" # window center, window width, slope, intercept\n",
" window_params = get_windowing(dcm)\n",
"\n",
" try:\n",
" # 3. Generate windowed image\n",
" img = get_windowed_image(dcm.pixel_array, *window_params)\n",
" except:\n",
" img = np.zeros(desired_size)\n",
"\n",
" img = _normalize(img)\n",
"\n",
" if desired_size != (512, 512):\n",
" # resize image\n",
" img = cv2.resize(img, desired_size, interpolation = cv2.INTER_LINEAR)\n",
" return img[:,:,np.newaxis]"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(128, 128, 1)"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"_read(path_train_img + 'ID_ffff922b9.dcm', (128, 128)).shape"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x7f821e7b95c0>"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAQUAAAD8CAYAAAB+fLH0AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzsnXeYVdW5/z/r7NPmTC8wdBhgKKKIoIC9R0W9JmrUxMpNYqLG+Eu50XhTb0zx5t7Um6jYjS3G2GvsURERREFAeu8D08tpe/3+eNfeZw4zwwzMDFLW93nmOWf2XnvttffZe623fl+ltcbCwsLCQ+CzHoCFhcW+BTspWFhYZMFOChYWFlmwk4KFhUUW7KRgYWGRBTspWFhYZMFOChYWFlnotUlBKXWmUmqJUmq5Uuqm3jqPhYVFz0L1RvCSUsoBlgKnA+uBD4Avaa0X9fjJLCwsehTBXup3MrBca70SQCn1KHAe0O6kEFYRHSW3l4ZisbeRLpHfcuygbSxbWgyAbol/lkOyAOqprtJa9+msXW9NCgOBda3+Xw9Mad1AKXU1cDVAlBhT1Km9NBSLvY3aaVMBmPWb25l22kUApBct/SyHZAG8qh9f05V2vTUpqHa2ZekpWusZwAyAAlViEzAOADR9Qeb9bUfK/2cMmEDTF0RSiFnFcb9Bb00K64HBrf4fBGzspXNZ7COIPfk+ACOfbLvNYv9Bb3kfPgAqlVIVSqkwcAnwTC+dy8LCogfRK5KC1jqllPom8DLgAPdorRf2xrksLCx6Fr2lPqC1fgF4obf6t7Cw6B3YiEYLC4ss2EnBwsIiC3ZSsLCwyIKdFCwsLLJgJwULC4ss2EnBwsIiC3ZSsLCwyIKdFCwsLLJgJwULC4ss2EnBwsIiC70W5mxx8CGQnw+AW18vG5Ri5UOHA5Cf1wxA/281s+aSQQAM/PXMvT9Ii05hJQULC4ssWEnBottwRo0AoGlkCQD1g+WxKrvjPYZ/+SMA4v8cBoC7tYorL5OE2TvKTgdgxPdm7c3hWnSCXiFu3V0UqBJt6dj2fQTGj+EfLzwAwKFPXQ9A5TczJCoqEgEgedyhAGy6Js7gCz8BoPm8yQCsP02Rs9EB4OVr/huArww5bi+M3uJV/fhcrfWRnbWz6oOFhUUWrPpg0SmO+igNwC19HwXCAKw8/w7ZeT4Mf/LrAFReJ1JD8LW5APSNHoVTVgpAztOzpc3T8PJGUSlmxyN7ZfwWuwcrKVhYWGTBSgoWHeKCxVsBKHUadtlu5RdEarjocLELLX5mNAAD/nsmadNm3Q+PAWDwLTM5Y8AEAP605t2eHrJFD8BKChYWFlmw3geLLFRdfTRzf3rbXj3n9LXHs3FqfaftLlq8GYDHxvbr9jk3PnkIg69cD0C6rq7b/e0P6Kr3waoPBwkC+fkwQkpxuB8vlm15eaQmjgRg+ZdCAIwdvYbbawYC8I2iDXtlbPcOeduvClL5wDUADL/pPX9/+qSJAHyl8B4AHmPPJ4XgULkHZw5dzMsPjAFgSJGUuXOvySO9eNke932gwKoPFhYWWbCSwkECt74ePlrUZltzmbgYrz3uVQAuzP+YIcGYadHza8b0tceLZNABll0hqsvx74ubs6G/Q9nC5qw2KhJBx/esYG1qjZQ4Pb1gIb+ZPC9r37RNJ+5RnwcarKRgYWGRBWtoPMgR7C/6+af/3R+Arxw+k1AgBcC82iEAnFm6AIB+wVqOizYCEAuEO+37whWn8fgIkUAWJ5oA+H/DjvGDl7oKz4XpYemdR6Ei4uysvOLD3errYEavGxqVUoOBB4B+gAvM0Fr/QSlVAvwNGAasBi7SWlfv6Xks9gyBXGM8a2zcZbvUJrHol75WAcB9oSm4G0R96PeeLBiPLu4LgGps5n9Xr23TR+QtmVieqXwpa7s3IQAMD4khM3HmUcDuTQo7Y9XZd/rfz2DCLlp2DcF+5QCkNm/pdl8HArqjPqSA72qtxwJTgeuUUocANwGvaa0rgdfM/xYWFvsJ9lhS0FpvAjaZ7/VKqcXAQOA84CTT7H7gTeDGbo3SoktYf/MxlM9JABD655wuHeOUixSw4zCRCsJBF7dRAbD9EFkz6i+Tz4uHL6E8VAvAH5acDED+QwUErpNttS+JQbAwkNPmPBElksIfbv8TEO3yNY2+9xqGIe7JAbPyu3zc7sBKCNnoEUOjUmoYcATwPlBuJgxv4ujbwTFXK6XmKKXmJNkzS7KFhUXPo9suSaVUHvAP4P9preuUUl06Tms9A5gBYmjs7jgOZpy1sAaAP73iEqxP7tax7vYdAKRzhwGQSARJ9xUj3kVTJLNxSGQ7ALc9cC7Nh4o0UHG3/M7OG+/z6d1iuzri6f8HwLvn/i8ASeCMu74PgBYKBX795QcYH951LkVrLJl+G0xvu33UfRLkVMF7bXdadAvd8j4opULAc8DLWuvfmm1LgJO01puUUv2BN7XWo3fVj/U+7BkmGDf7P16fCsDImz5EJxO71YczTn6a1edLinPRMhfMI7HpNJkccleI6D/0sY3oRvEipLdIslSwfz+Wf1OMlKUL5MCmPiKADrtwBc0nthXNd9f70BU81ZjHbZUje7zfAwm9TrKiRCS4G1jsTQgGzwBXmu9XAk/v6TksLCz2PvZYUlBKHQe8DSxAXJIANyN2hceAIcBa4Ita6x276stKCruPwndKWfSsrPIDb90zVuRgxVBWXi55DsEWs9GF4PHyc9XVicFw8MOiZeZ+upXkgGIA0hHRB5r7hth8jDxDw58Q1SU0+1PpqqkJAkZvcL0kalj25ykAlMyTNWnOf/VMAtbWtLhfLx98bI/0d6Ch1+MUtNbvAB0ZEOwbbmGxn8LmPuxnCL0pkYcr7x3GwLv3UEIYLHUXEkNKyN0oq3xsm6zkdUOCxOcIK/PgObLya0fm/vrx5cQLZeWPF8q2ouVJ8taIzcFpESOk29SUOVkrCQFABYM+bVvq1En+9vkJEVX+Y5jYR5q+INLE23++w2/jRTYGDh9L/p/FpvHY8Nf8/RtT8jjHpx0FQOSFD7pyOyx2gs19sLCwyIKVFPYTDJqVB8Dch8XSX74HUoIX+tx4mEgbOWvraTxR7AaxbaZNSuO0iBTQVC6PR0up/J/Mh3RYJIuyj8WMFFtdQ7DFBBXNmt/pGHQq5X9f+UWROuYnWvh+5Qlmq3hPYk+KNDHttRNwDQ8ESL0I9+PFVN8kUkPF9K8CsOqsu5hgKOZrvy6ELbXTRNpoTUNv0TnspLAfYNkDE1k2W0T0yj/uucqgjVgfSMgLnSqKUrZAxPu85RKVWHVYKcl8efEbKozonyOfOcsjlH0q+4rmiqtR50YJzpSXVUUlUtFt8ayW7eOYj+XFf7nPDACOveEG8pLy4k5fsgaAS/IlXeaMARNgnvS//HeiWoTqFUN/LPEJo96RPk8892oe+rM4wWqqZAKNbpNJZ/MNx9DvD7ZEXVdh1QcLC4ssWElhH4ZzyCgABvStoeCbVQCkd3XALpBatx6nWNyJDQMl7blhi
CJmDI2hgaICxMtcwjtkrQg2yuMRbJLP0sVJUjmyr3FMHwBir85HhaU/v7DsLrDy10dzX/FvABj50H8AMOLv7xEcJK7RS/KzA5taE6pUPij9v/TsQ5zx4+zsyOizs7mq/lvSr5a7FNooGaCLby7uBoHbwQcrKVhYWGTBSgr7MMb+dTkA8248gnTdym73lxorpCkNQ0x4iYbipaL/uyFZH7SjiZvch6JPZFufDyVXQaVd3KjYNtS7sqK7AMaGsPbHUtth0IlCeRY4dR11XxI7QO0F0kfwE8X0EacAMCKZyVt4fvbz7Y556Z3jfCIVPXehv92rPJWu2p7Z9mY24YoeL8SsJbPCfmj1zoQtFm1hJ4V9EA0XyYv0rgkeL3y1+1WZ3eOPYONx4mkIGSk/tsUlUSQveXOJiTxULqFamQz6vSUqi14vYjjpNKp1DALi0fDUh9hmUUWak9Knfmk41e9LX2WPGNKXkMuGG0xQnZFTQ8dtpyPilZWn3dOGSOWWqjHseEBUofxbZaJz3l2Q5dkAcOdLZGXDxUdzS5VMEP+5Us7zi+F2cugIVn2wsLDIgpUU9kFsOU8Ma2Uv7F4B1q3XHkPDUFmtR/5EUig99+CKi8OopLgiC5bJWpCOKJztJt5gmxxX8lGQ3M3GnOnKNt0skYo6lSJw+Fj5/qmoM25jIxjKNy+eob5ajJYVX/qYPKSdF6FYPSbAkOOF0u3lsc916bo80X/E374BwICmhcya8LjsfEQ+1qYauGzx5QAk/ir0aoUPioQ17D/f477oSQD88EsiPfyiS2c+OGElBQsLiyxYSWEfgueCHHGprPLO2Eqgczfk5hvEwFc3NkWwRmwDKlfIVzdfI9WVFn/hD/x4q+QEvPbJ0QCEG1yiq0wCa0geBeUWEt4h0kXNhDLp9/NCnlX+QZwdN4hU0Pe8tmxZQ/5gjI+t7A5q0jgZx1RZf0L18O2hr3RyRe1jxcW3d7hvSDCPe8f8FYDrv3KRbJwnWaTphUuo/LEwUscvkXyOZfdPpPJKywTdHuyksA9h2RViUR/5czHKdVbCbNWv5eV2TK2Ufv8KUPyxvOQNxwvhyMf/8RfTOsSt5fLSji+SSaRgTYqWocVZfW49IkSoQQyFKVMTxjNMPnL/H3edljxqGAAe+9bGUwqJbRH1xCNgqR+suPbdSwF48+Q/AvJCp7W0c9TuCa+bUuLVmNkygJsfEzamoS/KDVG5menUY7WedrmoICsfvLtHmKAPRFj1wcLCIgtWUtiH4BVVrb1EXJL5j7Z1Raqg/GQb/t9k0hHDbWM+tILF3xYj36ppMzo8z6HnS4HZjWcUsmaFqAY56436oMEVQYEhM6Rd6pChQMfkJZu+I5LH4MckbyFdXiT/P7ySdJVILk6ZpGPnj+rP2jzJkWjRGTqO3ZUQPPQPSp5Dn2Adsc3SX3CHSAXpT5e3aR98fS4gLNEjxhl+yoVL9ujcByqspGBhYZEFWzZuH0HqlEn+KubVYvDIUQFUSAKElv23GA4jgxuIhCRYp3ZdoTQqSLLytHs6Pdfpi88FYGzhZt7/gwQSlb0uq3x8VD+cNzo2wHkRigWPzPLHWv606PAbp2bnPgRiMZKTJWho+6EiHTQO0lwx7Q0Aflj2aZv+z/z0bLnOjXIPht8JrzxyLwBnjT4ekByLbdeIPeXDH2Wo3JYmRUI445/CKj36m5LK3W4xWqVY/oDYFEZePq/t/gMQvU7camFhcWDC2hT2EQRfn0vA8BG0lhA8rPqpUJdFh9QBkEo51MclbLmsQnTj2Uf8fZfnmBsXHoNfD/8HAN+5/psUPSd2jPhJIoGEtzR06AJ9eeNHnDEge9v6y0aSnto+V0Fi6liSBeIira0Uw4fOT1EREUaXxxpEwrn5g/O54BBZrZv+INmSBYPk0Vw9TXP2lHMAcOvX+333uc3kTfwoc76hQZGmfnb8kwA8Mvo02bFoeZsQaLQmtqBtJSsLOynsU+hIlXNKS0jlmOjCpQUABCvrGdtPiE5+NORZ0zJTCdpz1XmGOIBJEdk/5s5rARj63Ey2fEuMhLmbZCpoKi8hf1H2+b2IwjPPuxwh786g//92TF4SXbaFTVeY3IS4jN+pCfPLT84EoHmDjC1ns8NTq0UdGBCXl7fxOFEF3jj2z5xRKwVlBv0yMyl4mLZkGgAvjH6BGTXihr39QVFBmq6TmIRBL00i9pQUtqHVPQ6acIrVfxsPwLCLO2eOOhhg1QcLC4ssWElhH0K7BjGA0mJytsr8nThcVtCCWAsb6kX89iQAgLtrhU7k0Y0SvfjK2GfZGYPekIjF2sumMuCFTQBsPUmOaxgChYeKcdD9RAyBFU9dDcCoD2bv1vWk1q0nf53wK9YZRuiC1ZqWGhl3362iUlT/WwMsl4CtmhHiD71xgox7UDCPlrHNHZ/kPFOC7lP4yyLheRzyK5FenMrhch0FOaRPMEzQb2WMiv0fEpfrBdfJ5+PRYZ1SyR0MsJKChYVFFnqiwKwDzAE2aK3PUUpVAI8CJcCHwOVa690rcHgQQoXCHdaBXH5VX5Kloh8HNotxbEtjiFXT7mrT9u9XnA7AK0//tc2+kY9IiG/wWpE2+t/jsOQnEmjkNokuP+rrH/jlvuq+LO7H0ffIaqwR+wZAevsui375CNcbktiYrD+1IyEs9XBRRr2PRJKkmkSSKDtfCFqigUyh3L5ldR32n66TfUuTjRQ8IzaK9MliNMW4Vp2CAlRabCbNZ5maEC9+QLpayGF/8ep5AHz3wxd45pDSLl3XgYyeUB9uABYDBeb/W4Hfaa0fVUrdDnwF6Jm6YAcwAhWDSS9dkbXNi1dwEopUi1jxJx0l+RCra0va7eeldiYDD8cfK8xFn1QJxftTd83gqAe/A8BIE03pFBez/VxRHwpWiSjdUi4TUTQS6fJkABA/+yiSMXnZY5vks3h5Ejco3zecLBNFwT+LibQYQ6SSSeTdOkkGuzR/O5s3y8RVQMe44dAzqblJvpe9Kd6bTSaWod9jS3DN5BFslskhEI36qsKIx2Qydk51/YK7B3OUY7fUB6XUIOBs4C7zvwJOAUyyO/cDn+/OOSwsLPYuuisp/B74PmCqgVAK1GitPafwemBgN89xUGBnKQGg+QjJOUgUuKhSMULOe0fSq5dd0TXhq9YVI93FSy/khDLJBfjzBCm1dsx/f5cR94qLccflsqqWvrScogdEalh/s7gr+8wTUd6dNAbnY+lD5UhMhW5qRhvRXCdkxd1yvfQ18Mm1pKbIz1+4Rh6JnDcW+qnV0XHSf797PmTVzUcAUP+OuDDXJuTaP/rKW1TeuVOMQTtw6+tRKZFAUuvEdVl3vKR+3/79l/nxRf8OgPOvj2WsocyjH3hbjI+//fg0ZjwjktavRozv9JwHKrpTiv4cYKvWem7rze00bdf5rpS6Wik1Ryk1J0kHVncLC4u9ju5ICscC/6aUmgZEEZXv90CRUipopIVBwMb2DtZazwBmgOQ+dGMcBxycInHZbZ5imJPTGmeNrMxLpmckhOlrJRfg3iFvt+ljvQleOv61GwD4n2P/zt+3
Sdj7BZfJqhw7xGX1tw8DYNDrsnqnt21j/Q9kBdeGyzW6Vfa50RB1Z0v76HaRHoKvf4jTV2pAqKjQxwWb5Odcf/4QkkaGzFsn23JHDsFJid1g6NPCxLzjwgmkTHBhuELyJ9KL5cCLZ3+NyNHiruzfSZGnin+IvSMwTKSNI4cK7dvkSIill0kflR8YqSbeNm5z+G9djn1KrksdJdepP1jQpt2Bjj2WFLTWP9BaD9JaDwMuAV7XWl8KvAFcaJpdCTzd7VFaWFjsNfRG8NKNwKNKqVuAecDdvXCOAxpbLjoEgFRMVtfIDkX+OrdNO09COPFqCS4KNaTo+4tVALy3QKz3r54lPPGnvfgdxnxXvA9rvi2ei+YhSfqbSu6BdySUOTB+DMYBQOEy+VI1ocCMRzHgQQn08dx5gfx84odKgNKmqSIp5G2QcYdPr6J+hZwrkJL1p/j+T2n+/GQANh4rosghk1fxZIXYpr2w7N+PHQZAVCX53fJ/6/ymAS++9CgAw5/4umxYLqv+Zeok5p4v9+FL930NAD1vYZvj9QcLmL5GsnVvevQh4OC0LfTIpKC1fhN403xfCUzuiX4PZLSXHt14oTAehxrlpQo2y4s05MUaXnzh4TZ9VLwgFZf7FUi7rROjrFo1DIAHPyd8hpf8TEqzjX12OWsekH2JuKgDFfc4hF6bI+MpkBc/WZxDwRqZDJI5YiIKN5jScs3anwy8aMH44GK2j5PJoHipiOR1FfKyP3bofYydKJxu126QmIcVv4YNFxoRfbOMe/m2MkpGSh9b0xJD8dVCiabMC0T5fWVD+zexFbz4CYCV598BwLj3hPZtRzxGwFDEbf+5nLv088FMubtWnJKL7xO26pN++iYAv+r0zAcebESjhYVFFmzuw2eFMkOYaiSFwPgxbD1S5mjHRPeVfyCr2pLvRNvtYuyNUlNh+Xcl4KbvxM1MKN0AwM8uFxdcaYsE7Qx+voFlb0oW4fAnZWUMVlXjmtXSiwx0Zi2iyEgNnuEwNVBWYWfVZpInipHSmScu1Eg4RN9mcRmmY/I41YwUSWFsOOaP9S8DDbXcRvCqQd24RfIRltX3YeocqdkwtkwyPx+ueMM/1nEMqasZV7qujqAxJq47fxAA87/3F3bG/KniXjxq7pf40WbJi/hg4mMAVN5yjU9/54yskH6Xr6L0E7k3HtlL7aWiGhU+1P0qXfsLrKRgYWGRBSspfEZQjdmZf9snFuMaF2CqXHTz7WPFJfnrqY+1Of6ZxhirrxljOhOdf/PCviy/SQJ36r8o0sX2c019x58fxvBn38vqoz0yFR2P43pGxFFiNwg0SVBS4+Rh5L4nEoInWbCwzg9O8R6mQcZ4ecavJnDGJ9LuOyUi1TxaX8wtC4UDIb5MVv5QnUKbg5tOkXOvSoodoV4H6Vcobsod54oBtvChWaTXi6d7/veeaecqBHWuhDGn3AALqoUd5qUikX6mT3udd34nAVLuasm3CA4ehN4q4128UkLBQ+c0m3N2eJoDDnZS+IyQWi0+9GA/KXEWL1Y4CXmBXaMtNE2QB/KivNo2x39n9sWkB4nYPuhlk1/w5Ptw5KEARGrklR/xa3mh9cKF7UeRtQOPpUjViyitXCO+98vLqvLcFfz5lc8BcI8xKhasdolPlPEqMyvpEJR+Iv/MLxsGgCsffNA8jB2NooZEWzlg2jAptYOLll4MwP8c+jhff3U6AP8bkPFcP+Q13u4jqpA2Klxq3Xr00YfLwQkZb9INc7DBqg8WFhZZsJLCZ4xtZ4qInsiHRKmsliFT+m34/WJoPL3/uT5Zys+2iQjtrIwy6q+ywnl5E2rSOPh4KQARk4btLa5bvnUMpZ9IOLnHGg0Z450qlE8dDUOdSZWOiPpSN0Hcp+GaFASMjuN2VsxOMPLb2Qa62sum+sHw/WZn+oiYCMlR4+Sa/lYrnJSbEwW4pj5EzrYkO+OXVWJkvbmsbVbjmvfFCFlQ0ULAuHdXrJNr+c68KwheJP0O/SRzTP0wL7TSSCJue5H7BzaspGBhYZEFKyl8BggOHQwpL9DH4xvQBFvk5+g3U3T5TcfL6j3wmmZOGimReGunyTxe+ZPZYAJ2vApNA/40p0OilgF/XcyOs2VVLR0qbja3OJ/NUyXPIs+Unw/XpHCWiVFw85el37z1si+6qYH4yeJGDL7WOg+uc+hjRFcvWNlM7XDJQ1h/urEtJBRjfisMzzUJMQQmTeLF2sYS6qrFplBQIPcn1Krfv8+QCMSbb17CcfPPB+Cd8U8AULJQrCj/teZcKg8XY+LKWeLKTEc18X7GdnKEFMHVHy0iWi3XevRYkb7mbRy0W9d5IMBOCp8FHAd3h9APxTYb85+CsvkiHq89S0TY5EAR99ercvrOE0t65Tc/AGDNfx1NvL9hS/qaZAq1NiR6EZO6yRRbLSqgdOZmAFLrxHLfNOlISj+R/Y0DxboZSWufuchLbMJEA1YdWUKiQL73q+s8YUgddRjOZvEmrJ8iE0H57CbShpk6vN3wTha5frrzxtUSDLutVBKiNtQXEo7JfYkXiNEvE/0Aff/PZEndDC8d6kV9yrU095H+P104mKlHiFq1JpExcrr9pN/Vn5eJceg8TVMfeSWWVUuS17WH/AuA58guxHsgw6oPFhYWWbCSwl6EX+xl7XoC+bISNhoKmr4futRWyM+RKDYGuCYRofvPbMBZtBqAxbfLSqpicQ75vkQves65QH4+LceIihCqk1WwpVzE8di6RvhYjHGBw4SopW6oQzxfpJL89cYwGQpQdZgc48oH9eLOxw1r34246jxJXEpeP5HAFmnYd65IANsmyGqsHQg1yHV6lIvrTo1RsFLa1UrOFsGmAIGYrP+lH8g1zxsmN6ZqUyE40j5vo3TiFBf7ORgeDnv/yyyYkp0f8vH3Jcrx8P++lsTh0m/hZDFkVi0qQ63NMec34xg8iKCpT7GtXvaVhwyhpJUULCwsDlZYSWEvouVECSyKvr2ILRdJNGKexDCxY4xDzhZDXlooK2Ll12RlT0+oZPmMYQCohMgFRTMjpDZLnoBXbNWJQ6JQVulQnZjjgoYQNQbUXiQkK8lcaRMv1n5GZMMA0de3HpcCDHN0o6yupaMlYGnbhiKCeYZcZYGs7DqUJmpIWdMmzmfs0ZK+3fDzgaz6oqw7sTUynkSJS6hRtumAnFu5iqZT5N70nWWIYZ+RKMZCt4b6E0WkCP3zfTlPO/d20KWroW3leQCa+ms+XCUGxtIScbeGawOETUyY9pZGrSlcINc68Fsyjie2TTI7u05Yu7/DSgoWFhZZsJLCXsSGk4wz7apKlFnVItWyyuZu0FSdIt6GMT8RPbbpJHGVbTw+SHqTWfG3yDze5/aZ/Ga1BAaND0vWYeWD1/i1EgOGXiB/rfTfPCCXdEi+950pq15xSYwtk41efaKskGptEcE6OYc7TDwTJTmidIeGpLlxxEsA3Foi9SBr3+hHyWKRHmoq5fo+3WSCnSZFUIb2LFFgQrgjrm9HcfMMNVqDw5bJIpWURYXOPd94PJb8R4xQVMZR8bwxcqTTbcKcA31
KOWvalwHacE+kypIUfCj2nBFfXA3AokkOweflXOUvrjXjaGDDF8WAktws55o4oG39ygMddlLYC1j9cxHvh0+Rh2/JigFETKRcbIvJK0hoSt6WB3Hdb+RFbTBiebRPA+UPi7gee1JccDueG+VPBhXPSwxD4TpFyvjrUrnyEibzjDvxsCD9Z4pbUzvK//RYluKzpAiKKnNJG8ankeVVALw05vk213TaocJydNi2r1NXKy9c7XgxVqotMgj38CYwjEs0mRc65EJctnmqiNvo4LQYJuaIfAaqTZ2GcARt7lVgsCQ16Y1bcAZKwpLnLk2tXgtr248+DOclKF4q+96fL+njueWNvg83vU2uM5CXi2veiGBQJqzyiIxjW7s9H5iw6oOFhUUWrKTQi/Ai5RJ9ZNVZVy3iKmlFjll6St8Rt+K6Cwdz1XQRzZ/fLEa38yokMOj5O48nWmVSrSdL0NAHEzP/JiZ1AAAgAElEQVSVoEZfI3UL3MnjWH+KBAklik25NmNUDNcE2HRM1GyTVfvwE5dyTrFE7jlm2VzdUsoP+gr3Y5mT2+G1zU+IuH9C5XLeSouLM5wjK38ibapBFTRRbySEeJE8atHCOC0ml6EgVySX+nSUfGPw3HKa9FH4kARYVd4cYvmvJbgo2U8+A8tXoRpMgNJRIwDIaWhk5bdGmdFlisgCRGfmETKp6gVLRQpL9YWGwWYcx8v9Dr+1gESR3IcLhgqH4/oW85vRcem6Aw1WUrCwsMiClRR6AMGKoWw/VnTcwgczWYEbTpWVLbxdVp+WXBOmuyZI/0fF3bjkB+Ju+8G0J/jzshMBuGmMSAw3vnWRdDQuRV2lrLjvnP8n03uef56WzwkvwI6xIVr6yIobbJD53rMZtJSnCW+X1f3Kc18H2s8shDVAtoQwq0UknZfrD+OvCyV46rzR8wGIp4P07Su+vaqFEhqcO0JW1eraXM/uiYpKH47jEi4Qg2ptrXFrFqRp7itGypx8UxjIsxWsWoMzX3IwksZVm1sxlNSqNXKOSnHt5qwoZMC/TN7HV7OvSJ9czbpSWfEHvyYSw7qjA4w8UVynOxaJcTGUTPhVpgKGuGZUrgQ7baF9SrwDEXZS6AHocIgqyRMiXiAPcMMwTdp7EcrkQS94V16C/q9uZcnNMhmMPFys239YcjL5UWnnkaqccubvAfgkkc/IkLxoHgV6a2z7ingHkokggQ0mb8KLigyZWIAmh/hgeWn+o3QRAA/UlXNFgRjZbq8Rl8A3ijawyRSSOecWYYIuNwVbf/zSY4yetAmAASGJKHy25gifTn7Ya+IRqBkl526qDlOwRCaiOrMtUpokYOITEtXyoqlomsZBsi32nikjqzOZHPmmkEx0q6gbzZV9CJlJwSs2s+34cj9OYmfMn/wII1ZJxe3qUabc3RKIDRAVZfUwmUDzgX6zZOIpv1Du98QcmTjeZmL7nR+AsOqDhYVFFqyk0AOonlhGmbFtefH8eaOrUUYEDT7hFUQxov2MBoanTAbkqxJpd+Elb/GzPtkFSjxD30k5Lq3VhZ1x3GBJdX5l9ngCxisXLRUxuWWbSA45Gx36nSqr/DHzvgTApRWzecDYzxY1ibuv4s2zGfyybOs3V1yojeNl36VPXceLF/wvAKNCMrbnlYagqRORJ1JB80dyvZE0NA4296CsxR+vY+5L6Rxp31IWJF4s25oGmHs0WFKWU+vWU7BKjk1HpH2wMYUytHOFK6V9c2mA+C7SE3IrRPpKr5BGxUtcPh4l0tGA08TYG/z7MALvfJp13PZ0x/f9QIWVFCwsLLLQLUlBKVUE3AUcioSC/DuwBPgbMAxYDVykta7uoIv9GumTRM/M3Zwg+LYY3gpbRdp59QTW3CrLcdgExKx4aTiuCW5cfH2mXsEZAyZk9f/yxo92ef43DcXY9ris2ocetoatjbKyVdXIpzLuwdSEBtZsNfUbVor0cO9r0yibLxJL6C0p0a7uSvOv23ZV6S/bCHlr+UfcOk3GefsxsvL+/nEp8+aGIVUs9yNm6M1KYs2smiftBm0090oFSUyQylDBxTLu+kkinUSHlBHaJBGe6eWi3wcHDUQXSruECc7KqXKJntNxiNH8yY8AcNi71wJQsMYlUS9GiPMOlWt/etTpRAzXxF8WSZ2IOyb9deeuDnh0V334A/CS1vpCpVQYybu5GXhNa/1rpdRNwE1IfckDB4anMPyhxCqn6+raZUpedal4JPKj8rA2zBTrfKpQs/SK27LaDn/i61Ty/m4NQ9QKOGnEqwA8VF/Kj2fLC6lrjNXNDMxxXL44WnScJyNSH7FpRQFNn5MX7r2/zjG9ep+7j28UiRj+ja/KtVWnm5j09LflXHVi4Nsc0PQZJ/djXUDuR58PNclmmSXTRXJN28fKoxkYFaTPR4ZxyUwKqfUbwEQflxQKo1Pd8Bz+OOZRM5LW3EzZaDxMVJH+76SJrpUJbuYYiXVY/XnFMC33Rs2X+1czIdZOLwc29lh9UEoVACdgCshqrRNa6xrgPOB+0+x+4PPdHaSFhcXeQ3ckheFISPi9SqnDgbnADUC51noTgNZ6k1Kqb/eHuW/BqRS1QBtRsyMMv0OiBdc1yEqULJdlO39M2zTcym+2lRKO+IWIuvP+s21JtPZwaf52fhkTt2NzWub7MUPFuLiproBb+kqEpPfJ0V3q1se1G6byxnOiMnkRk8F6xbcvkIIsnqTgodiJ+cVevcKxx9/3PbaNECPojPPuBOBPk05jgAmo+Gi18EcGK6RNcayZmjqRuPq0M6bGQaIK1Q8JMDnSsYTg4akT5F5eWXgVOS+KFDB3tRh7oxuDuEbNyV8jv9VND1wFwGBmdtr3gYLuGBqDwETgNq31EUAjoip0CUqpq5VSc5RSc5LEuzEMCwuLnkR3JIX1wHqttbfEPY5MCluUUv2NlNAf2NrewVrrGcAMgAJV0tXiRZ85ggMH4Oa0LWHeHuqOGQZAygTDlb8vq+E7l/2tS+c68d9nA1Dx1NWs+vwMAL67SVbqJxdOYOVp97Q55oMp9wJw/LzLAFjxrkkFLnbhqC6d1sehs6SUe2KpBBRFtypM9jVhk14drtHM+KPYMe45VwKhZh/xd78PLxpyalT099nTf8sRr18HwJwmqXmRF4pz99BXABg773oAThsnxsuJeWu49MeS1v2n6+VanhuX8T2Ga2Vl/8X0R+kKvr/yAgB+dsgz3JQU9uexZdK/O1CxoXYYAH0/EkmldK78xl2rcnFgYI8lBa31ZmCdUmq02XQqsAh4BrjSbLsSeLpbI7SwsNir6K734XrgIeN5WAlMRyaax5RSXwHWAl/s5jn2KejCPNyPF3exsXwMfEdWHS/4BuDkfxcOhA0nyE9QQXbxV4DFkwyFO7M541rPXSnSRiUfmrLu2XijRVb1ukXCj+AZ4vOXOYy+9xoAlky/re2BrTD2DrFlxAyxS2GNfCZyM3RvOdtkHIE0BEy+QPrJMrmWzZJ8sOqsu5jbMgyATxOiIl5VsJWfT5F1osgRO0Oh08SDdWJL8Ahe02a9WhXvA/mykl9fLKHN129c47
tvw6+IR+U3Kz7H5029h/bwbouMt+ohsR9s/G4xxw4Wb8bKerlXU0pXU3uWiHXuApFG9Io1u7xXByK6NSlorT8Cjmxn16nd6XefhEnQSS9a2qXm1VcdTZ+X5aFrnCAP/N23/87szSP8ktRvqHgpc8x/rhSR+YSdcm8uXHEaEwulmMlb43N2ed4f/v7fAQj0M5GETTLuZJ6wK3eGU674CoOSMoltnpJ9rmiNJhXz+BiNHpHQfr/RGnnx+r5pZqKz4LoiGbdXqOWq8U/whVwxfq4yBXHu3XQcjxu36tNHrQbwq0RHnBQrCsVNOiKUiS70YjhO+qpMrqmHgzwzQgyHlSFRYwYHZWI57JlvMXas+DDL5ktex21LT+Cew8VJdt12YWyauW04J5aLm/mDtMfNePDBRjRaWFhkweY+dBHBQYZYMOj4abvtwSNWKX5gNtq4LtfIItnuSjf875K9N/BNzQnR9iMYvVUU4OZde0GZ9YM/ADDpjzcAkDZSh3Y0eWbYR30oKdmHlm3izU/EJFR5j8jtdePCpHIMXZsRBgIm8LBuaIBwnUgg8RJDULLaJRHMbl87si0t2vcNt+P8RAvjwzKoceG217e92WSS5ta12tdx/sGbd4lb85dVo7nxwauATB2Hpv4iuRzyx40k7hZxJlUq525Y6ZAcb2pBRCSgafmCQTySI47PIaYSViB+8HnGrKRgYWGRBaX1Z+8NLFAleorat80Qy/8qRCYjL5/XScsMvDLvzf8Qo9Ub4zKOGM9QNmCWEALcO+RtLlt9EgDvrRAJo3KgeHNbE6d6XAfXr/k8XyoXl+UFeRmqsEtWnQLA+5+Kuy+01ej3WqpQAbQUy1pQvLgZDIlrw0AhccnZmqSp3NRoyJd9jlksEwWKiDE6erUjkgUQNSQyxYvFFrH1SFnt68YnWHXmXZ3fqFbX1R5fxJ5i6kcXypeHythm3LEeQWzZR5qf3iI5Hq/UScblk29MgX5ysSVFMp6iX8i1qJkf99i4Piu8qh+fq7VuzwaYBTsptELTF6YAEHsyE1244mF5eaMfy8Mx8NauRbYFBw8i1V8mg5efaptUM+ot8douPVGMXSNen47bKNqcSmULcE5xnHDEMB8vKPS3Bw6TdOCmWjEIhraECJgCqm5EftdQbaaEm+dN+P6NQoH+uxWnoR8UcTlSJxOGG1TsGCNidaLQ0MV7TnoXcrZlF35pGuiSsyWb5Sluwk7yV8LAS8XY+kxlK4vqTjh/+ek8MfKVDvd3Fyd98nmiP5A4iYZfGK7GS2tYfa2wNk2/WHLFH15xFK7RgS4bKRNuVInu9PAvzqLg4Vnsz+jqpGDVBwsLiyxYSYG2KcqtU5iDA8U1ltrQiYVvJzSfN5l4gay479/acVzAn2vEXXnn7edSV2no20xZtVSR/F+4MEjSSNUpU5MhNawFt96UhmuQ82iVcUGGjUbR3EfalyyCxv6yb9TZywA4oXQZt//jLABi5vLSOcqPwPQMdg1DRQTI2RrwjYmeGzJR5BLZEcgaW8GKTJv6Yea7MWmrikafxbnuI4kPuPzf3uCHZdnkJj2JMyumoLtiMAw4BE09iaqT5XdpPl+ySMvzG9j6nGzr97v9Mw/CSgoWFhZ7BOuSbIXpa4833+r9bbsrIXiIFzpUn7Xr3AjIBPcMvv4+bviXBNHouCzHJR/Kctw4SPsGMt/FmA6gTOWktOFVCDYG0EFZrZtNSqH3f93wAIb7lU/elSpJH+UNh6GyguZsNZF8IfzaB66p1oQnHajM95ZykWKCDQGS+dLeK1MfajbnHBbIuAdHSvZmYFOMqhwxagaHi35/RGx1p/epO+iSlADgpkmtkyCnoge8T9nljKwgb6KJ4jx8rDTvamTrfgYrKVhYWGTBSgrAUf8pOQEl97bNP9hdpE82fAMFiqvGdd1a/Xb9aIJVpvbBFlmOqw+VpTe20fELtKZLxAsR3hAmUSL7Pcq1QAI/QKnRVD+KbpXPYIumTmgdCNUbd2KhxjGSRNw4NUKNmfM3jJD+o5uNzSIAqTzjkWgxRWhDmTDnyHZTI9JkRjb3hemnvQnAqKiENj++9Ug+rRKKjcZ6kU5yVaLD+7I13cjlF0iAV3tenK6g6QtTsjxKe4L08lXkGeYnt1s97fuwkwI9Mxl42HKkPOiBBDy4VJzj7Rddycazzx5N3hb5Xlcpj13uennbVBpSA+TFcbaZtG0nMxmopHyWfqLZdJy8tEWL5AXVht65qVxhWOFoKTMvdjxAzkfizvQ4I5v7aUJ1ZpJpkj5KFstLnooqakZlRysqV5GOeBVfzPgHy2PlFseZUy3pzoea6s0Bpf3JwCscm0Zx8xahQftl+fys/vs6udSMESvrlBtl8r7mP//BVQUSw/GQSWa6NH97G47LSfPkPr795zuo+qMkX10++nQZ6uABpJcsx6ItrPpgYWGRBeuS7GHUXzwVgIZBAepHSuDLqvNmdHrcEbdc65eR91bcvA2y0tUOD/guPdcxRj2dyTWIVJsApSB+una41hgLg5nIQ29bfYVpU62IlxojpXFlukH8UvRRE5QUNnbXYJMmUivtt04ybshcjSnjQLBB+kgMNOqAVuSXyApdv0OCh5TjgiGV1TGRQKYcsoI5aySl+ZVj/w+AitCuIxtfMoVrfzdy7C7bAdS+MJJZEx7P2laVbmTqY98FYNQ9QjaeXti5RLc/w7okLSws9gjWptDDKH5XXIyFRfl8WlnQaft/NEib+grtG/hyN8pqXGdqHOogBGXBJWEqo4dqlC9ZeMQkiRzIX+vlJmS3DzZCS5mxFSQ9l6O4MSETeBRIgQ6Z/SaUuXGACZluUOwYn23AdBKKhKFlz19t2o8WSSG5Npd6LQMJhEUqcJuDqIDHOy+f4UCKdJM8ip97VOpXVk4Wi+kLo19o976dGRM34+/a3ZuNwmnLOQOxN3iBamVOLuk8U4fjEqmHMexHXejsIICdFHoYqfWG0XijQ7Bmcqftb1k8DQAnAaFGU526RF7UtEjIRLdDkyFN8TwHbghy18u26sNMQtIC5ccxuIZM0ctHcMPQPFDUmbzlQb8Pz3PgGIk/UZr2VRC/ZkQ8kxodqg343wHcoCZcY8brFXj9RJK8Ajkaqs0jpsWSGejfAo3mwky5uR3xXPIXmYNNv0s+EnVixPrpXHioJKHdWt42tdx7yXc2MnYFoe0ytlSuXKhTVEi6pna3+znQYNUHCwuLLFhJoZegJo6lYNz2Tts1LJFMytg25ec3eO7BkGTvkiiEgHE7eqtatEpRZ4rZegbBSL1L/eDsVduLKGzpo8lZLz+3J4FoJ5NNGa0yRsK+mui67PoJ8b6mjPxWh4CRKJImgzLdL47aISfz4hTccCbCMdyYMUgCpLdGfCboeFjOs66mqA1VnGtEe1qCbGgWHWjSXCGHmTvpMY76obgn7/vxbwGoe3EEBWetoCMEckWN8SSKlzd+xNKrJCfliA8uAaDqvEMovr/n3NP7K6ykYGFhkQUrKfQSWspzaO5CyL1jKrSnYmRyCAz9WcCQo4Zr8I2KfsHYnEyuQU6VHNcwI
OAbGD03omcsVGnlGySTJjoy2KDQhmSlfpiJ629wfKOjJz2w1VCZ5Wifmi1VKF9CoTSYvIwmY5CMbDf/D8rE/gUbjS3EgXRU2mljU9BkuBv88Ubk4sLRJOsbRFJoahERp+LZr7HqFi/zVIKv3jv8Hz679X11EjH5yJgB/vndRrHUusd5toeMfaJmi9hAYgMUu6hmf9DATgq9hNjM5ThjjQ/9mI7b+UQmOcKjCBA21aQ9oxtK4gcAgualbO7rUrBC2rWI8RztZDwRUTNRVAtlJJEditRORNCBNDgmSSreT17CUFWQcI15gc0L6nkXgo3KV21Ui0wUyXSEfEO84nEieolaKpVRhTyrpdOSSaMOFYgukkgE/YnCbx2X/mNFjT7xSWGuJFCFZuVz8sLzgGw2Kw9etONVG+WztREy8E5bY+XIEZsBWNEygODgQQB+YtTBCKs+WFhYZMFKCr2EdHU1has7LzamSkXHCKyL+rkGXkShJ3JLQ/loHGqSlDY5OHFp19RP9mkHctdJw8aBJo7AqCcqDcl8WcnD1XKeRH4mGlElZVsgrnwVoaU0exxOs/JTsX11IOT4pCyeazJhkqvSJSmiq02uhjFoagfSOeakhnbO1QrXbPPyOJyYDKJ6YyF9KkUv2FglakQkDNXPGnbtcW1u6S5Rf8lU8y0jMRSGRQLRUZeao0VSyLOSgoWFhYWgW5KCUurbwFcRhXEBUjauP/AoUAJ8CFyute44N/YARu7azklW3GbzE0S1r1d7OQSeITHVyt7gSRPBZmg2EYpetGMyX5PMN/0aF59KZvoIGVegR8Qa2qF8qrWwoVRL52i0kQJ8OcUs7DoEEWN8bBxm0rYTipRxpXq2CA+hLSFfKvCuJR3TvrRRWCArdM2OXH+/Z2RN14oxQsXSbGsU66mbzKxhhauk4T+bpN3nYkk6wrI/T6HyOkmdLnzKSAi/zexftEVErYJFIZqk8h2R06RCVOjVuR32e6BijyUFpdRA4FvAkVrrQwEHuAS4Ffid1roSqAa+0hMDtbCw2Dvork0hCOQopZJADNgEnAJ82ey/H/gpsOuKpgcotNP5nHvsOCFRfXdBJeFt8nME0h7/mXy4YUgUeyu6sfrntqoNaQwDwWblByYFjS3B81okCjTR7V7ug+leQWxD9hgjOzSNRl33ApW8PtIhjdOcLQ1EqwI0l8vYvBBsb1yBhPJtFp4HQ6UlNBqgMEcGWZPIR+d63g+xQYSHNfjnqN4oRoqA8coEmyBUL5LCC7WHA/C52Bw6wrzzfs+UKsmI/PRrf2mzv2+BnKu+KZ/6U0TsGvCihDunOuz1wMUeTwpa6w1Kqf9BKks3A/8E5gI1WmvvXq4HBnZ7lPspUvmhTtucUCwFa9/Vo0j0kdsWrDWTg3m50lExAEImQjGZr9HmffZyE1Qa4mWmfkNT9gsabFJ+lGOoXl7KxkEZVSUokjwtZcoX5Z2EV5zWI2XJTAj+C9rYSjUw6o/nhpRzerqHOS6hSBvGqKoGUQuc/CRqvfhLk4Uy/nSdsV6mFZiYBdcPewj4KeGvPyT5JUtveItRoVzaQ2Egp93JwMP6bRKdkD4izeeGyyS9elVzh+0PdHRHfSgGzgMqgAFALnBWO03bJWxQSl2tlJqjlJqT5OCr12dhsa+iO+rDacAqrfU2AKXUE0iYTpFSKmikhUH4cWbZ0FrPAGaAkKwALP3LZMbeLEQX+3u2WiA3l+iG+k7bDQlJfkS4uIXUJglbdENedSeTN5DnEqky+QKt3IRejoFHh6YDyl/dfSuhv0JngpG8DMpAEsLmNicyhaf8duHt2fu0q4mbQCmPlCUVa+We9PItPLKVVjUhEmWeezZAXpGswomEycWoC0OBYYeu2ykJQmlImLUrmFlfQg2iA6VyRV/68i3f8w2vxSdKMNI745+gKxj9wx0A1E7sx+wBQh9XHlop15w8+Gzk3XFJrgWmKqViSikFnAosAt4ATBE/rgTahpxZWFjss+iOTeF9pdTjiNsxBcxDVv7ngUeVUreYbXd3tc9R186m83Cf/QM6HocVHZes9zAxIqtUKuHgRo0xzgTwePYBp1n5rj3PSOeGMiuyx7Ccimk/WMmDx2eSzsnUjmgYkgli8vR/7zNR7Porv1c7wpM6kvnalxA8F2kyH5Ieq3TKC2/2aksqPzvSC312oy6NpvZlaZlIUtubQ2gTNu2TwLa0CvWOmLyM2qC/LZVrXJbmgdFKmKgB6k2ORGfw3JmpVfI75a5aQ+4/ZF+g2NgZqg8+SaFb3get9U+An+y0eSXQObvIAQ6dSqFTnduu6115CWJ5cRpMvD+ewc68KCrl+Fb8UL0Rx0vTBJKZRCUQI56XV9CacxHk5fEIVzzjoxvRuK2MlN6nZ7j01RIzKTjNyo+K9FSbdI72JzFv8vDzHdxMvIQ2F+A0BXBKZOaqbciwOquER1Nvzm36cBoc0t5EmONdQIDGfoa0xbyzTiv1qHajBGs0ubIzFvDYX7Lx9TelyO8o2nou0tXV7R4DsP1rRwNQPwSG/ejAS7W2EY0WFhZZsLkPewE/3HoYALf0XdBm3wjDWlwca6ahWgyNykspVmY1LkwRaDRSgSEfia0L+mqDt0IHWzIruW7JXu0JgDaqRcZImIk78PIttEMmPsJb8b2wiQASooYfGoETVyS9FVx5PI8y7mSR28rQafIcIpqwY8ZoRJBkPFMCz+s3kC+GRJ0IEMw13zeKZBEv1eRUSbu0yfxM5KtM3IZZ6p5slGKxl+a3T3ZT+p4p0NtfIhpTmza3267NcXeKdFAKqCMk+UJ/bArkuvu/AmwlBQsLiyxYSaG3oBSYmhoP/0sIFW65sK2k4MEJuJT1FXKDHTUmCMdzK7rKN+yFakxEY6H2bQnaGOISGgLG2OcFNnkrb3SLon6MrLixlRkjnWcI9NqH6gJ+9KS/zSszl6+JbJPzJwpMm5DGMZRrfnl6Y3h0GgO+29QNG2NhIkDAWD+9YCSnIEm63ozJ2EncRvk/1KRItshjqjwuBzfDDeFfZzpzrYSk44HBju0Cae0SMUFcXZUQdgU99VAZx8yPu93XZw07KfQWWhXZ6feuebsv7KAtcHjJBj7YJgzGfpKUebhJBfyn3zMkBuLgmFkjbdiT3JiLYyaNVK73EmYiG50aw+JsjPOhBuW/VN5L5rTgJzh5qoXXxi1M4RqLvUcDH4hn4iW8RCfPC5EuSvlkLB5jlBt1/VuTMmpGuj6UUTO86EgTxZjKUwRCaXMNxkMR1aRyso2bicLMd49O/qQcPwSyDcbPupy+dd0T9bdedwy1o6WPym91r1blvgSrPlhYWGTBSgp7AfmrGjttUxxsojgqkX6bzErneJLClnCb6MV0TKONSO6YKEAvUhEy7M+eUbGpn+uToHhIxVqlVhvJQgcVOZulnUev5rkoSalW9GoC7bRyI8a8Fd3Ld8iQsng1HlST46sPQc+gWudkpI2db4xWuKZQjC9tBLX/3cutUCnlu1xPGbV0517aoM/dMXaMlovp91KnzdtF8tRaRv/U5GXsWRf7JKykYGFhkQUr
KewNBFSnTS4qnMM/VkkacH6hSAwNpmS7W5wiVO1VdTJGOgeCno3AW40DWtyAtJdD0DZ4SaUzKdle+bh4n5QfceRTtRnVO9DQakXPMRvjDoEGOTa0TVbeZB8RP4LbM7YCN2rWHw3RkAR1Ve2QknlB3coeYdyUqi7kjyGQJ+3TZp9LoE2QViqmiRiSlzsHv0tnyF2wicZ+gztttysM+qXyRQQVMkVz0+a+7MeuSSspWFhYZMFKCr0JZUhIF60GIKkNkYhy2jQdGYrQJ09sD9VN2VzsgWbHpylzPS9BK3efZ3VXKeWHHHtuxBZT3SlUG/D5Djy3ZTJP+56CeH9jXEgrP6Mx0OJlaXpuS03Q0MF5K3+wVb/RKq/OpKzybiSTK0HCy6TUNDQb94dHFpsCbfgZvLBsT0zRClwjNXjjcSMu6UhG8gBxrUbGdp5Ze/OW8dLH9h2Uzi9q3cVuQ8/5ZI+P3ZdhJ4VehFMiSTWp0SKmXr5abvejFa+3aRtSDuOKNgHwRr3Ug3ONf95JqsyLaYyLNDtoMyl4cQKBVjSFnkHSM8gFkpmkqnhh2u8rXCXnyFkjL1681M3kRuRli8BOreMbJkPVxrgZybwWYVNDwkuWiJdpP8LSj4SMK1ImFsF7ycO1iniRicMwSV4tfYw7NKT96tQ+yYsO4BgKjniJuc6SOAumPExneP6B4wDo1zQTZ7Xc7/1X0O8dWPXBwsIiC1ZS6CUEolHS2yUtOrhNxN1Hj4MAABV9SURBVNTZc0bJznYkBYB+ZqltaRajledqRGl0IDswKNis0MFsLketaEXBLPDciV7kIkimIoCbyhjKvIKxys24GF0jMnjnDtVlVAVPmgjVBvy+E2I39MvOaQdfmvHo29yoi9tgjIhe9qOTCZDyIyyDGckoaIysrbkfPeIXb6y3HfMAXcGAOwybc24u6arOCwAfjLCSgoWFRRaspNBLSE0ZS+CteQCklwm1V9/ZUviUL7Z/zEd1po5hnSntblx9yQLX18lDJgApHdG+wdDLgYhtCvjkrN6q6pGuNPdP47R45CdmX2siVt8eoXySlRZjLwgb+4EnAQCkoxluAy9YyZMUgl54dMj1JRs/R8FV6IA5tpWU4tk7Ag1erLI5kaN9N6wfCBUAN2IMqqaWxak5u7YMnL74XDm0aZ1026cPNHYeVHYwwk4KvQRn5kKcfuUApDZvASBvo5B+bEo10D+Y1+aYy8olJfeDT0bIcV6UoaMJmVJvPgFKWBOuNkZErwBMMJMC7UX6OXHPCKl8QpJ0K1IWL2U5OUxiI2ILcnyaeE809+IUojsULcawFzFU866j/dgCj4zFe4kDLSorJgJkMvNyMNJF4lJRdQF//86kLCQCvgriEbskStO+SvPG9P8xd6/t/WyNwGlSBs6LJ0hv27bL9gczrPpgYWGRBSsp9BJ0MkHdMcMAiD0hkkLwnU8AOPaNb7Hy9HvaHPNvuVLU4Qd9RKxVRttoaojgGt5Bx5QjCDYrPzchb50pSV+WIU3xuQvNyhvZHsjQtrVSGzypwdlgCExKNKG6bGul06oojFd81pMGgo0ZHkbv3H6bYGYcftxEqJVnX2XaeVKJt0w5vmFSZzgijVs2p6yJYaVixG1P4toZY9+9nCFa0tYDI4WtOb14WafHHaywkoKFhUUWrKTQi4g9ITn2LecIj230udkAVNyv4PSOj3ts0l0AnPPKt2RD0PV16LQpnBSpzujrnp7fGt7KmzSreLBJZcrRGaOiG9ZoY+zz7BPJglZ97UTLlsrRfnCRZ3RMR3Um/t+TTrzVPtGqtJ1n3GzOrEOebUE7GekiaCQKLygpWNJCwNC3pZLS/vyRH7dLbdcRhnxxAU65iF1WQugcdlLYC8j7WOrhxE+eCEDw9bm82yIP+rHRtsLauLDEMuf3kRqH8YVF/r7WlOleirMX3dc42CXa6HEzyjaf1Tmi/XTqpGFNciOaoJfM5JVuVIpkfnZ0oR93EMz06yUkJYrSftSkMrNB6yhKZdQdbzJzQ9pnZfZCslUqE5WZ8CYlz9HguBTnG7Vq5ItARs3qDNPGnGC+1YF7IAYk9w6s+mBhYZEFKynsBaTWiTus5nOSA1H6BvzoG1cD8Pp9d3V43NEDVgPwasMYn2gkut6kJxe4PkOyt3qH6gJ+XIIXi+CrAJFMPoRXOyKdysQAJApaGR+j3kHy4RWRSRa6vhShvdyKpCJQIpZItUUkHC9RK1yjSBm6SU+aSRZAsN7EPZhSceGagD8Or3qNR/fmOC7fG/EK0HUJ4bDfXQvAgLqZAAQHD/J/A4vOYSUFCwuLLHQqKSil7gHOAbZqrQ8120qAvwHDgNXARVrralNT8g/ANKAJuEpr/WHvDH3/Q5+HJMJRlZXCK3M7bX/HIAlmGrFkDCrsBTLJvlB9gJxtsprWSawT4VpF0gtk8lbalkzugxdc5JeMj7pEN8kj4JVcixdljJQhr3CsMTSGdwRo6ZdNuaa0Im0klkBOdip3KpYZh2+LcDTKc4k6bY2a6Zibta8g1sIFeXV0BV9fL5WbBvxmZtZ23WAjF3cHXZEU7gPO3GnbTcBrWutK4DXzP0gp+krzdzVwW88M08LCYm+hU0lBa/0vpdSwnTafB5xkvt8PvAncaLY/oLXWwCylVJFSqr/WelNPDXhvY/nvpgKw4uLbATjrrC/hfrx4j/pyW0ThTx99CM4bkqFX8dJXAVh1Zse2haWn3E3la9IuUWwIUJKKtCnz7tVUQCtfd/fqKvreilZ1Ebzw4uD2UKZWg8lbyKo56VG7m6fEaclkaXoIJBROtcnqNBJAwsu4bMW14PEuKFeRyjMn8IlYM3YIj2NBGzuFZ0/oDHPjCVZPzS4GGxxsckn2gj1h5a0ipSy7XNbBMwZM6PVz9hb21NBY7r3oWutNSnmxdwwE1rVqt95s228nhRGPG8vdxfLx4ouPdPsHD73/KQHzwI76d1PcdGPH7R0VYOVpEgE5/PGvA/KSNffLxCCASVX2AgP9eg9eMZZWuQxVGf5DzzDpqQjxPmmiW0zKtKcGmBwM5QZ8dcQzVqIzEY9e7Yh0vmGjrnJ8VcLbFt4W9FWJdJFhQo4EMoxLxdLZ+KEbADpVHTalxJf646MvAHdL1r705q27PLYn4U0GBwJ62vvQHkNpuw5ipdTViIpBlFgPD8PCwmJPsaeTwhZPLVBK9Qe8KXk90JoidxAdrIFa6xnADIAC1U5I3j4C9e5HbbbtbkHSneE2NaHHiXXQqZWV8MgfXcOcn3e+2qy88A4AKp77GqmYzMGRqkxZeC//IFGanZ7sJBTJYq/evHyEdjgZV2F+Ji/Cd0l6JRta0b0lTaqB10alFEEToOQbJLc7bcajPL7HWCaXAa9CVEkSZVyRv5zyJACX5Hdc8g2k7BvAV0+8VLravMbfF6yQ/IbUqjVtD+xhLL1tsvkmz8nx14kkF2P/rRi1py7JZ4Arzfcrgadbbb9CCaYCtfuzPcHC4mCE0nrXi7RS6hHEqFgGbAF+AjwFPAYMAdYCX9Ra7zAuyf9DvBVNwHSt9ZzOBlGgSvQUdWo
3LqP3UX2lGJJm/yqzmveEMSl1yiRAQp//bZEYH68rWrerQ3x49oVIlUdakDHYpYaIscCjPgvVOH5GpDZ6OzVhP1S6daZjosizQ2TbLALJjPHRI3iJl6WJrTeSQSR7fF5WY2u4OW6GLMXUi3QiaV4/7v8AGNKFrEeAM88WCUHPW+hv25uGRQ8vb8yWJPdlA+Or+vG5WusjO2vXFe/DlzrY1eYtNl6H6zof3v6H4vslZoBfZba1fqH3FN6x6ZMn8swhEtJx3cauTQqeKnHO0rMAqGnJ4S9jhNF4fFiSDW7cIg/pU88dDaWiKxQVSmRgdTpAIm7o2H1SFuUbDp1EtokoHZEoRcjkMuig9l9+b19LX9NXQqIgvXaARCwaVSGvSPSO9466h7xA55NBrSvtv3z8JehVC7P2OYeMwl29d6MWB83KjPmUK74CQIg9fxb2FdiIRgsLiyx0qj7sDewP6oMHfczh/PPx+7O29YjIqJRfvj44cAAAz3/wQvf7bYXZccmZdowF8cI3r8HZYayDbqaIjE+dZrIjEyVGzG8M+IZDj4exNUt0tMorZpvhVNR9RTpRntoRcDlyyFoAHq54o0vj/lO1GA6fP7yP9JtK+fucQ4Qh280JoecubHtwL6K16rAvqw0euqo+WEnBwsIiCzZLcjehZn5MkytKdywgkXwNLw0n78yV3etYawJRUdRTG8SLe/aUcwB4/v3nute3weRIdh35VWfczSG3SUZhvMQQmcQgaAKUPKkgUyRW+0SvKWMXcJqUH62YMnYGL/uRgCYvTySFgYVS0u2/Kp5qM45d4YzF5xD4roRb6tQif7szbjQA6VwTujm766Qr3cX+JiHsLqykYGFhkQVrU+gG9taK4ZSV8sL813qtf4CJcySOu3pjIQFTQcrPsCzwkiFaHWCEiJwNjl8zwnNlJgbIhrMPW8CP+0k1rL5O7m6NZ8ydIsFU3PoRblM2j0LqlEl+dazQPzv1ePcYVv7a5DdcIW7pyfO+SPHZ+w+9W1dtCnZS6AaqrpaHZO5PezZ2wYNncPTUCYCrl4qa0tV04j3BQ/WlANy55ngAttXLCx2PhwiYQi75uRIHsaMqn+9MfhWA64u7H0F45rkm/mBu2/iDxvH9AYi9u5R0TecVpnsa+1NMQnuwhkYLC4s9gpUUegC/Xf2eT7Y6fa2srhun1vdY/87YStL5xopnDGrJ0ybx+gN399g5PmtMGy+//85FX93jJlA9Ru5t6V3v7fVxeTgQjItWUrCwsNgjWJdkD+A7w472V5J7h7wNwInnXk302dk90n/rWgWpU01odXOaMwbJd3XEGACOvHv+btVD+Kzxk23jAJh1eAgQCUEF5ZFs+LxcW8FrSyl9Z9cZk72Jby3/1P9+9He/AUABsz6r4ewVWPWhh6Aikg300qpMymxvi5lVXxdDZ6hBfsPS9zbj5ouovX2C1IqoO1tISE4cttznfPws8UBdGY9MkhgDt1XVZzVJJoiWfsKtEXn+g70/uFYof09iIx4Y+i9A1MKeVAk/C1j1wcLCYo9gJYUeRtMXpgDw9p/v8LftNcPU5MPYNkky98J1phjrdokuTEcVeSY1GyU+/sU3FnHESHEjPjGya1yIu4PqdBPH3PU9AIb8bGab/cGhwsez+suDGfK8FIx153/apt3exLofHgPAomv/krV9fzUutoaVFCwsLPYIVlLoJSy7bxIrP5ftMvSpup7ce1RdziihfdM5YTYfVyzbEhmG58KVkpsQWSIBUrqxidS4CgCaBoobdOO5EqEYjKS4dKxEEL6/YxgAS+cOYdS9YghML1wi/Rr7io7HcUpLAIhPMH2Wh0hFRVLp84TkMnwWgUjtYe1Pj2Hx1QeehODBRjTuA1j798MAWHzsX7O27+tGq2C/cgB0sUlECotHIN4nRv0QSUD6/+2df2xV5RnHP19ahVE1UGTaiaHUNWNqJiXGUOUPMmeQjWmMZpORjA0TxmbUmWWOhky3/bFF2Sabc7KpjGQSJIJz2Ew75tyPGME5fzAmVIoss0QpbEgyTFwLz/543yP39La03N5z7m33fBJy73nPhfPtcw9Pn/d53/M8U58/BEDv2WfQVxfOKzZx7a0LlZjO3P1vbF8oGJOUt69GrnvtIADLJ+1/f2wsOYMEnz44jlMSHinkwEDJx4TZ3/kyAFPXVH658P+J/3xmDs+tXlM0PhYjhASPFBzHKQmPFCrAnW+8xBUTiv3x3FtCIrJu8+jtGVCtHFkc2v9tW1UcHTRtDnZvvmVs290jBcdxSsIjhUoxLmToO7oHLwk+86GQb5h+p+cbSmHPfSGX88b1xbmc+w5Pp/2iyXlLqii+JDnaGFdzUgeR9G94pSUvQaOLmin13Lw97Jr81MTi5U+3n08fHMcpkSEfnZa0FlgI9JjZxXFsFfBp4L/AXkJ7uHfiuTbgJuAYcKuZdWSkfWxx/FjRcljhk3p3nxOLfBS0623+wxcAaPpccRPcsc6R33wYgG2zNg36maatSwFoXvJSLprGCsOJFNYRekMWshW42Mw+BrwOtAFIuhC4Ebgo/p2fSqopm1rHcTJnOL0k/ySpsd/YbwsOtwE3xPfXAo+a2XvAPkldwGWAZ8pK4EBrKM46n1m8fXt4eu/Vr5/Ym79n3rrwpiB6ePrd8NzBN78bfkvWrx3dpt+/4nK2fuUeABpSzWeLo6OrFn0RgHF/fBmAZjxCKIVhJRqjU2hPpg/9zj0JbDSzRyT9BNhmZo/Ecw8DT5nZ4DEenmgsla57w9r73s8Wr72fjAs2hgpCjU/2jqg57kipbTgXgN13NJ7yz3DJPaEE/Lmrix/JdgambF2nT4aklUAfsD4ZGuBjA3odScuAZQATmDgSGY7jlJGSIwVJS4DlwJVm9m4cawMws+/F4w7gW2Z20hjWI4Uycll4MvPwXWFZ7oWWx8r2T7/ee5QH/zUXgOd7wqPQh49+gLPiEuDMyT0ALKx/FShPb4qmzV8a8zsN8yLTJUlJVwPfAK5JHEJkC3CjpPGSZgDNQHmqlzqOkwtDRgqSNgDzgLOBA8BdhNWG8SQleEMeYXn8/EpgKWFa8VUze2ooER4p5EuSi2i5tItNF/yuYjpmPLEMgDP31tLwA88NZI3vaHTKz7garDVMT96bHIqtnHa0j9o/7wDA+voqJs0ZGt/R6DhOSXgzGGf4HD+Gngv7AyYUDFc+1nTKiUcKjuOkcKfgOE4KdwqO46Rwp+A4Tgp3Co7jpHCn4DhOiqrYvCTpIHAUOFRpLYSdm67jBK4jzWjWMd3Mpg71oapwCgCSXhzObivX4TpcR7Y6fPrgOE4KdwqO46SoJqfw80oLiLiONK4jzZjXUTU5BcdxqoNqihQcx6kCqsIpSLpaUqekLkkrcrrm+ZKelbRL0t8l3RbH6yVtlbQnvubSW0xSjaSXJbXH4xmStkcdGyWdnoOGSZI2Sdod7dJaCXtIuj1+JzslbZA0IS97SForqUfSzoKxAW2gwI/jfbtD0uyMdayK380OSb+SNKngXFvU0Slp/kiuXXGnEPtC3A8sAC4EFsX+EVnTB3
zNzD4KzAFujtddATxjZs3AM/E4D24DdhUc3w3cG3UcJjTYyZofAU+b2UzgkqgnV3tIOg+4Fbg01gStIfQSycse6yjuczKYDRYQSg42E4oQP5Cxjnz6rZhZRf8ArUBHwXEb0FYBHb8GrgI6gYY41gB05nDtaYSb7eNAO6Eq9iGgdiAbZaThLGAfMc9UMJ6rPYDzgDeBekK9j3Zgfp72ABqBnUPZAPgZsGigz2Who9+564D18X3q/wzQAbSWet2KRwqcuAkSuuNYbsRq1S3AduAcM3sLIL5+MAcJq4E7gOPxeArwjpkl9c3ysEkTcBD4RZzGPCSpjpztYWb7ge8D/wTeAo4AfyV/exQymA0qee8uBZL6p2XVUQ1OYdi9IjK5uHQGsJlQZHbkNclP/fpJn87CriyVsEktMBt4wMxaCNvO85o6vU+cr18LzAA+BNQRwvT+VMOyWUXu3ZH0WxkO1eAUuoHzC46nkWqElh2STiM4hPVm9ngcPiCpIZ5vAHoylnEFcI2kfwCPEqYQq4FJkpJyeXnYpBvoNrOkycImgpPI2x6fAPaZ2UEz6wUeBy4nf3sUMpgNcr93Y7+VhcBii3OFcuuoBqfwF6A5ZpdPJyRMtmR9UUkCHgZ2mdkPC05tAZbE90sIuYbMMLM2M5tmZo2En/33ZrYYeJYTPTrz0PE28Kakj8ShK4HXyNkehGnDHEkT43eU6MjVHv0YzAZbgM/HVYg5wJFkmpEFufVbyTJpdAoJlU8Ssql7gZU5XXMuIcTaQehW+krUMYWQ9NsTX+tztMM8QicuCHP8F4Au4DFgfA7XnwW8GG3yBDC5EvYAvg3sBnYCvyT0GMnFHsAGQi6jl/Ab+KbBbEAI2++P9+3fCCsmWeroIuQOkvt1TcHnV0YdncCCkVzbdzQ6jpOiGqYPjuNUEe4UHMdJ4U7BcZwU7hQcx0nhTsFxnBTuFBzHSeFOwXGcFO4UHMdJ8T/z+Hm69fOSnwAAAABJRU5ErkJggg==\n",
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"plt.imshow(\n",
" _read(path_train_img + 'ID_ffff922b9.dcm', (128, 128))[:, :, 0]\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [],
"source": [
"# Augmentations\n",
"# Flip Left Right\n",
"# Cropping\n",
"sometimes = lambda aug: iaa.Sometimes(0.25, aug)\n",
"augmentation = iaa.Sequential([ \n",
" iaa.Fliplr(0.25),\n",
" sometimes(iaa.Crop(px=(0, 25), keep_size = True, \n",
" sample_independently = False)) \n",
" ], random_order = True)"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [],
"source": [
"# Train Data Generator\n",
"class TrainDataGenerator(keras.utils.Sequence):\n",
"\n",
" def __init__(self, dataset, labels, batch_size=16, img_size=(512, 512), img_dir = path_train_img, \\\n",
" augment = False, *args, **kwargs):\n",
" self.dataset = dataset\n",
" self.ids = dataset.index\n",
" self.labels = labels\n",
" self.batch_size = batch_size\n",
" self.img_size = img_size\n",
" self.img_dir = img_dir\n",
" self.augment = augment\n",
" self.on_epoch_end()\n",
"\n",
" def __len__(self):\n",
" return int(ceil(len(self.ids) / self.batch_size))\n",
"\n",
" def __getitem__(self, index):\n",
" indices = self.indices[index*self.batch_size:(index+1)*self.batch_size]\n",
" X, Y = self.__data_generation(indices)\n",
" return X, Y\n",
"\n",
" def augmentor(self, image):\n",
" augment_img = augmentation \n",
" image_aug = augment_img.augment_image(image)\n",
" return image_aug\n",
"\n",
" def on_epoch_end(self):\n",
" self.indices = np.arange(len(self.ids))\n",
" np.random.shuffle(self.indices)\n",
" \n",
" def __data_generation(self, indices):\n",
" X = np.empty((self.batch_size, *self.img_size, 3))\n",
" Y = np.empty((self.batch_size, 6), dtype=np.float32)\n",
" \n",
" for i, index in enumerate(indices):\n",
" ID = self.ids[index]\n",
" image = _read(self.img_dir + ID, self.img_size)\n",
" if self.augment:\n",
" X[i,] = self.augmentor(image)\n",
" else:\n",
" X[i,] = image \n",
" Y[i,] = self.labels.iloc[index].values \n",
" return X, Y\n",
" \n",
"class TestDataGenerator(keras.utils.Sequence):\n",
" def __init__(self, ids, labels, batch_size = 5, img_size = (512, 512), img_dir = path_test_img, \\\n",
" *args, **kwargs):\n",
" self.ids = ids\n",
" self.labels = labels\n",
" self.batch_size = batch_size\n",
" self.img_size = img_size\n",
" self.img_dir = img_dir\n",
" self.on_epoch_end()\n",
"\n",
" def __len__(self):\n",
" return int(ceil(len(self.ids) / self.batch_size))\n",
"\n",
" def __getitem__(self, index):\n",
" indices = self.indices[index*self.batch_size:(index+1)*self.batch_size]\n",
" list_IDs_temp = [self.ids[k] for k in indices]\n",
" X = self.__data_generation(list_IDs_temp)\n",
" return X\n",
"\n",
" def on_epoch_end(self):\n",
" self.indices = np.arange(len(self.ids))\n",
"\n",
" def __data_generation(self, list_IDs_temp):\n",
" X = np.empty((self.batch_size, *self.img_size, 3))\n",
" for i, ID in enumerate(list_IDs_temp):\n",
" image = _read(self.img_dir + ID, self.img_size)\n",
" X[i,] = image \n",
" return X"
]
},
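{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick illustrative sanity check of the generator (hypothetical usage, not part of the original training flow): build it from a few rows of `train_final_df` and confirm the batch shapes.\n",
"\n",
"```python\n",
"# illustrative sanity check (hypothetical usage, not in the original flow)\n",
"sample_df = train_final_df.head(8)\n",
"gen = TrainDataGenerator(sample_df, sample_df, batch_size=4,\n",
"                         img_size=(HEIGHT, WIDTH), augment=True)\n",
"X_batch, Y_batch = gen[0]\n",
"print(X_batch.shape, Y_batch.shape)  # expected: (4, 256, 256, 3) and (4, 6)\n",
"```"
]
},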
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As we have seen in EDA notebook that we have very few epidural subtypes so we need oversample this sub type"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Train Shape: (677018, 6)\n"
]
}
],
"source": [
"# Oversampling\n",
"epidural_df = train_final_df[train_final_df.epidural == 1]\n",
"train_final_df = pd.concat([train_final_df, epidural_df])\n",
"print('Train Shape: {}'.format(train_final_df.shape))"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>ID</th>\n",
" <th>Label</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>ID_28fbab7eb_epidural</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>ID_28fbab7eb_intraparenchymal</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>ID_28fbab7eb_intraventricular</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>ID_28fbab7eb_subarachnoid</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>ID_28fbab7eb_subdural</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" ID Label\n",
"0 ID_28fbab7eb_epidural 0.5\n",
"1 ID_28fbab7eb_intraparenchymal 0.5\n",
"2 ID_28fbab7eb_intraventricular 0.5\n",
"3 ID_28fbab7eb_subarachnoid 0.5\n",
"4 ID_28fbab7eb_subdural 0.5"
]
},
"execution_count": 20,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# load test set\n",
"test_df = pd.read_csv(input_folder + 'stage_1_sample_submission.csv')\n",
"test_df.head()"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(78545, 6)"
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extract subtype\n",
"test_df['sub_type'] = test_df['ID'].apply(lambda x: x.split('_')[-1])\n",
"# extract filename\n",
"test_df['file_name'] = test_df['ID'].apply(lambda x: '_'.join(x.split('_')[:2]) + '.dcm')\n",
"\n",
"test_df = pd.pivot_table(test_df.drop(columns='ID'), index=\"file_name\", \\\n",
" columns=\"sub_type\", values=\"Label\")\n",
"test_df.head()\n",
"\n",
"test_df.shape"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th>sub_type</th>\n",
" <th>any</th>\n",
" <th>epidural</th>\n",
" <th>intraparenchymal</th>\n",
" <th>intraventricular</th>\n",
" <th>subarachnoid</th>\n",
" <th>subdural</th>\n",
" </tr>\n",
" <tr>\n",
" <th>file_name</th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>ID_000012eaf.dcm</th>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_0000ca2f6.dcm</th>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_000259ccf.dcm</th>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_0002d438a.dcm</th>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>ID_00032d440.dcm</th>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
"sub_type any epidural intraparenchymal intraventricular \\\n",
"file_name \n",
"ID_000012eaf.dcm 0.5 0.5 0.5 0.5 \n",
"ID_0000ca2f6.dcm 0.5 0.5 0.5 0.5 \n",
"ID_000259ccf.dcm 0.5 0.5 0.5 0.5 \n",
"ID_0002d438a.dcm 0.5 0.5 0.5 0.5 \n",
"ID_00032d440.dcm 0.5 0.5 0.5 0.5 \n",
"\n",
"sub_type subarachnoid subdural \n",
"file_name \n",
"ID_000012eaf.dcm 0.5 0.5 \n",
"ID_0000ca2f6.dcm 0.5 0.5 \n",
"ID_000259ccf.dcm 0.5 0.5 \n",
"ID_0002d438a.dcm 0.5 0.5 \n",
"ID_00032d440.dcm 0.5 0.5 "
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"test_df.head()"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {
"scrolled": false
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://github.com/Callidior/keras-applications/releases/download/efficientnet/efficientnet-b0_weights_tf_dim_ordering_tf_kernels_autoaugment_notop.h5\n",
"16809984/16804768 [==============================] - 1s 0us/step\n",
"Model: \"model_1\"\n",
"__________________________________________________________________________________________________\n",
"Layer (type) Output Shape Param # Connected to \n",
"==================================================================================================\n",
"input_1 (InputLayer) (None, 256, 256, 3) 0 \n",
"__________________________________________________________________________________________________\n",
"stem_conv (Conv2D) (None, 128, 128, 32) 864 input_1[0][0] \n",
"__________________________________________________________________________________________________\n",
"stem_bn (BatchNormalization) (None, 128, 128, 32) 128 stem_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"stem_activation (Activation) (None, 128, 128, 32) 0 stem_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_dwconv (DepthwiseConv2D (None, 128, 128, 32) 288 stem_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_bn (BatchNormalization) (None, 128, 128, 32) 128 block1a_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_activation (Activation) (None, 128, 128, 32) 0 block1a_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_se_squeeze (GlobalAvera (None, 32) 0 block1a_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_se_reshape (Reshape) (None, 1, 1, 32) 0 block1a_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_se_reduce (Conv2D) (None, 1, 1, 8) 264 block1a_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_se_expand (Conv2D) (None, 1, 1, 32) 288 block1a_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_se_excite (Multiply) (None, 128, 128, 32) 0 block1a_activation[0][0] \n",
" block1a_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_project_conv (Conv2D) (None, 128, 128, 16) 512 block1a_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block1a_project_bn (BatchNormal (None, 128, 128, 16) 64 block1a_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_expand_conv (Conv2D) (None, 128, 128, 96) 1536 block1a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_expand_bn (BatchNormali (None, 128, 128, 96) 384 block2a_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_expand_activation (Acti (None, 128, 128, 96) 0 block2a_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_dwconv (DepthwiseConv2D (None, 64, 64, 96) 864 block2a_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_bn (BatchNormalization) (None, 64, 64, 96) 384 block2a_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_activation (Activation) (None, 64, 64, 96) 0 block2a_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_se_squeeze (GlobalAvera (None, 96) 0 block2a_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_se_reshape (Reshape) (None, 1, 1, 96) 0 block2a_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_se_reduce (Conv2D) (None, 1, 1, 4) 388 block2a_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_se_expand (Conv2D) (None, 1, 1, 96) 480 block2a_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_se_excite (Multiply) (None, 64, 64, 96) 0 block2a_activation[0][0] \n",
" block2a_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_project_conv (Conv2D) (None, 64, 64, 24) 2304 block2a_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2a_project_bn (BatchNormal (None, 64, 64, 24) 96 block2a_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_expand_conv (Conv2D) (None, 64, 64, 144) 3456 block2a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_expand_bn (BatchNormali (None, 64, 64, 144) 576 block2b_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_expand_activation (Acti (None, 64, 64, 144) 0 block2b_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_dwconv (DepthwiseConv2D (None, 64, 64, 144) 1296 block2b_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_bn (BatchNormalization) (None, 64, 64, 144) 576 block2b_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_activation (Activation) (None, 64, 64, 144) 0 block2b_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_se_squeeze (GlobalAvera (None, 144) 0 block2b_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_se_reshape (Reshape) (None, 1, 1, 144) 0 block2b_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_se_reduce (Conv2D) (None, 1, 1, 6) 870 block2b_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_se_expand (Conv2D) (None, 1, 1, 144) 1008 block2b_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_se_excite (Multiply) (None, 64, 64, 144) 0 block2b_activation[0][0] \n",
" block2b_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_project_conv (Conv2D) (None, 64, 64, 24) 3456 block2b_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_project_bn (BatchNormal (None, 64, 64, 24) 96 block2b_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_drop (FixedDropout) (None, 64, 64, 24) 0 block2b_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block2b_add (Add) (None, 64, 64, 24) 0 block2b_drop[0][0] \n",
" block2a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_expand_conv (Conv2D) (None, 64, 64, 144) 3456 block2b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_expand_bn (BatchNormali (None, 64, 64, 144) 576 block3a_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_expand_activation (Acti (None, 64, 64, 144) 0 block3a_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_dwconv (DepthwiseConv2D (None, 32, 32, 144) 3600 block3a_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_bn (BatchNormalization) (None, 32, 32, 144) 576 block3a_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_activation (Activation) (None, 32, 32, 144) 0 block3a_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_se_squeeze (GlobalAvera (None, 144) 0 block3a_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_se_reshape (Reshape) (None, 1, 1, 144) 0 block3a_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_se_reduce (Conv2D) (None, 1, 1, 6) 870 block3a_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_se_expand (Conv2D) (None, 1, 1, 144) 1008 block3a_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_se_excite (Multiply) (None, 32, 32, 144) 0 block3a_activation[0][0] \n",
" block3a_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_project_conv (Conv2D) (None, 32, 32, 40) 5760 block3a_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3a_project_bn (BatchNormal (None, 32, 32, 40) 160 block3a_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_expand_conv (Conv2D) (None, 32, 32, 240) 9600 block3a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_expand_bn (BatchNormali (None, 32, 32, 240) 960 block3b_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_expand_activation (Acti (None, 32, 32, 240) 0 block3b_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_dwconv (DepthwiseConv2D (None, 32, 32, 240) 6000 block3b_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_bn (BatchNormalization) (None, 32, 32, 240) 960 block3b_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_activation (Activation) (None, 32, 32, 240) 0 block3b_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_se_squeeze (GlobalAvera (None, 240) 0 block3b_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_se_reshape (Reshape) (None, 1, 1, 240) 0 block3b_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_se_reduce (Conv2D) (None, 1, 1, 10) 2410 block3b_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_se_expand (Conv2D) (None, 1, 1, 240) 2640 block3b_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_se_excite (Multiply) (None, 32, 32, 240) 0 block3b_activation[0][0] \n",
" block3b_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_project_conv (Conv2D) (None, 32, 32, 40) 9600 block3b_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_project_bn (BatchNormal (None, 32, 32, 40) 160 block3b_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_drop (FixedDropout) (None, 32, 32, 40) 0 block3b_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block3b_add (Add) (None, 32, 32, 40) 0 block3b_drop[0][0] \n",
" block3a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_expand_conv (Conv2D) (None, 32, 32, 240) 9600 block3b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_expand_bn (BatchNormali (None, 32, 32, 240) 960 block4a_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_expand_activation (Acti (None, 32, 32, 240) 0 block4a_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_dwconv (DepthwiseConv2D (None, 16, 16, 240) 2160 block4a_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_bn (BatchNormalization) (None, 16, 16, 240) 960 block4a_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_activation (Activation) (None, 16, 16, 240) 0 block4a_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_se_squeeze (GlobalAvera (None, 240) 0 block4a_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_se_reshape (Reshape) (None, 1, 1, 240) 0 block4a_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_se_reduce (Conv2D) (None, 1, 1, 10) 2410 block4a_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_se_expand (Conv2D) (None, 1, 1, 240) 2640 block4a_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_se_excite (Multiply) (None, 16, 16, 240) 0 block4a_activation[0][0] \n",
" block4a_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_project_conv (Conv2D) (None, 16, 16, 80) 19200 block4a_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4a_project_bn (BatchNormal (None, 16, 16, 80) 320 block4a_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_expand_conv (Conv2D) (None, 16, 16, 480) 38400 block4a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_expand_bn (BatchNormali (None, 16, 16, 480) 1920 block4b_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_expand_activation (Acti (None, 16, 16, 480) 0 block4b_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_dwconv (DepthwiseConv2D (None, 16, 16, 480) 4320 block4b_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_bn (BatchNormalization) (None, 16, 16, 480) 1920 block4b_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_activation (Activation) (None, 16, 16, 480) 0 block4b_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_se_squeeze (GlobalAvera (None, 480) 0 block4b_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_se_reshape (Reshape) (None, 1, 1, 480) 0 block4b_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_se_reduce (Conv2D) (None, 1, 1, 20) 9620 block4b_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_se_expand (Conv2D) (None, 1, 1, 480) 10080 block4b_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_se_excite (Multiply) (None, 16, 16, 480) 0 block4b_activation[0][0] \n",
" block4b_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_project_conv (Conv2D) (None, 16, 16, 80) 38400 block4b_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_project_bn (BatchNormal (None, 16, 16, 80) 320 block4b_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_drop (FixedDropout) (None, 16, 16, 80) 0 block4b_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4b_add (Add) (None, 16, 16, 80) 0 block4b_drop[0][0] \n",
" block4a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_expand_conv (Conv2D) (None, 16, 16, 480) 38400 block4b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_expand_bn (BatchNormali (None, 16, 16, 480) 1920 block4c_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_expand_activation (Acti (None, 16, 16, 480) 0 block4c_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_dwconv (DepthwiseConv2D (None, 16, 16, 480) 4320 block4c_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_bn (BatchNormalization) (None, 16, 16, 480) 1920 block4c_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_activation (Activation) (None, 16, 16, 480) 0 block4c_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_se_squeeze (GlobalAvera (None, 480) 0 block4c_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_se_reshape (Reshape) (None, 1, 1, 480) 0 block4c_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_se_reduce (Conv2D) (None, 1, 1, 20) 9620 block4c_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_se_expand (Conv2D) (None, 1, 1, 480) 10080 block4c_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_se_excite (Multiply) (None, 16, 16, 480) 0 block4c_activation[0][0] \n",
" block4c_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_project_conv (Conv2D) (None, 16, 16, 80) 38400 block4c_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_project_bn (BatchNormal (None, 16, 16, 80) 320 block4c_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_drop (FixedDropout) (None, 16, 16, 80) 0 block4c_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block4c_add (Add) (None, 16, 16, 80) 0 block4c_drop[0][0] \n",
" block4b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_expand_conv (Conv2D) (None, 16, 16, 480) 38400 block4c_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_expand_bn (BatchNormali (None, 16, 16, 480) 1920 block5a_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_expand_activation (Acti (None, 16, 16, 480) 0 block5a_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_dwconv (DepthwiseConv2D (None, 16, 16, 480) 12000 block5a_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_bn (BatchNormalization) (None, 16, 16, 480) 1920 block5a_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_activation (Activation) (None, 16, 16, 480) 0 block5a_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_se_squeeze (GlobalAvera (None, 480) 0 block5a_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_se_reshape (Reshape) (None, 1, 1, 480) 0 block5a_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_se_reduce (Conv2D) (None, 1, 1, 20) 9620 block5a_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_se_expand (Conv2D) (None, 1, 1, 480) 10080 block5a_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_se_excite (Multiply) (None, 16, 16, 480) 0 block5a_activation[0][0] \n",
" block5a_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_project_conv (Conv2D) (None, 16, 16, 112) 53760 block5a_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5a_project_bn (BatchNormal (None, 16, 16, 112) 448 block5a_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_expand_conv (Conv2D) (None, 16, 16, 672) 75264 block5a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_expand_bn (BatchNormali (None, 16, 16, 672) 2688 block5b_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_expand_activation (Acti (None, 16, 16, 672) 0 block5b_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_dwconv (DepthwiseConv2D (None, 16, 16, 672) 16800 block5b_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_bn (BatchNormalization) (None, 16, 16, 672) 2688 block5b_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_activation (Activation) (None, 16, 16, 672) 0 block5b_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_se_squeeze (GlobalAvera (None, 672) 0 block5b_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_se_reshape (Reshape) (None, 1, 1, 672) 0 block5b_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_se_reduce (Conv2D) (None, 1, 1, 28) 18844 block5b_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_se_expand (Conv2D) (None, 1, 1, 672) 19488 block5b_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_se_excite (Multiply) (None, 16, 16, 672) 0 block5b_activation[0][0] \n",
" block5b_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_project_conv (Conv2D) (None, 16, 16, 112) 75264 block5b_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_project_bn (BatchNormal (None, 16, 16, 112) 448 block5b_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_drop (FixedDropout) (None, 16, 16, 112) 0 block5b_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5b_add (Add) (None, 16, 16, 112) 0 block5b_drop[0][0] \n",
" block5a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_expand_conv (Conv2D) (None, 16, 16, 672) 75264 block5b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_expand_bn (BatchNormali (None, 16, 16, 672) 2688 block5c_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_expand_activation (Acti (None, 16, 16, 672) 0 block5c_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_dwconv (DepthwiseConv2D (None, 16, 16, 672) 16800 block5c_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_bn (BatchNormalization) (None, 16, 16, 672) 2688 block5c_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_activation (Activation) (None, 16, 16, 672) 0 block5c_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_se_squeeze (GlobalAvera (None, 672) 0 block5c_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_se_reshape (Reshape) (None, 1, 1, 672) 0 block5c_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_se_reduce (Conv2D) (None, 1, 1, 28) 18844 block5c_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_se_expand (Conv2D) (None, 1, 1, 672) 19488 block5c_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_se_excite (Multiply) (None, 16, 16, 672) 0 block5c_activation[0][0] \n",
" block5c_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_project_conv (Conv2D) (None, 16, 16, 112) 75264 block5c_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_project_bn (BatchNormal (None, 16, 16, 112) 448 block5c_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_drop (FixedDropout) (None, 16, 16, 112) 0 block5c_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block5c_add (Add) (None, 16, 16, 112) 0 block5c_drop[0][0] \n",
" block5b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_expand_conv (Conv2D) (None, 16, 16, 672) 75264 block5c_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_expand_bn (BatchNormali (None, 16, 16, 672) 2688 block6a_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_expand_activation (Acti (None, 16, 16, 672) 0 block6a_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_dwconv (DepthwiseConv2D (None, 8, 8, 672) 16800 block6a_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_bn (BatchNormalization) (None, 8, 8, 672) 2688 block6a_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_activation (Activation) (None, 8, 8, 672) 0 block6a_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_se_squeeze (GlobalAvera (None, 672) 0 block6a_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_se_reshape (Reshape) (None, 1, 1, 672) 0 block6a_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_se_reduce (Conv2D) (None, 1, 1, 28) 18844 block6a_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_se_expand (Conv2D) (None, 1, 1, 672) 19488 block6a_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_se_excite (Multiply) (None, 8, 8, 672) 0 block6a_activation[0][0] \n",
" block6a_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_project_conv (Conv2D) (None, 8, 8, 192) 129024 block6a_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6a_project_bn (BatchNormal (None, 8, 8, 192) 768 block6a_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_expand_conv (Conv2D) (None, 8, 8, 1152) 221184 block6a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_expand_bn (BatchNormali (None, 8, 8, 1152) 4608 block6b_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_expand_activation (Acti (None, 8, 8, 1152) 0 block6b_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_dwconv (DepthwiseConv2D (None, 8, 8, 1152) 28800 block6b_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_bn (BatchNormalization) (None, 8, 8, 1152) 4608 block6b_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_activation (Activation) (None, 8, 8, 1152) 0 block6b_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_se_squeeze (GlobalAvera (None, 1152) 0 block6b_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_se_reshape (Reshape) (None, 1, 1, 1152) 0 block6b_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block6b_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block6b_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_se_excite (Multiply) (None, 8, 8, 1152) 0 block6b_activation[0][0] \n",
" block6b_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_project_conv (Conv2D) (None, 8, 8, 192) 221184 block6b_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_project_bn (BatchNormal (None, 8, 8, 192) 768 block6b_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_drop (FixedDropout) (None, 8, 8, 192) 0 block6b_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6b_add (Add) (None, 8, 8, 192) 0 block6b_drop[0][0] \n",
" block6a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_expand_conv (Conv2D) (None, 8, 8, 1152) 221184 block6b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_expand_bn (BatchNormali (None, 8, 8, 1152) 4608 block6c_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_expand_activation (Acti (None, 8, 8, 1152) 0 block6c_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_dwconv (DepthwiseConv2D (None, 8, 8, 1152) 28800 block6c_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_bn (BatchNormalization) (None, 8, 8, 1152) 4608 block6c_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_activation (Activation) (None, 8, 8, 1152) 0 block6c_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_se_squeeze (GlobalAvera (None, 1152) 0 block6c_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_se_reshape (Reshape) (None, 1, 1, 1152) 0 block6c_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block6c_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block6c_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_se_excite (Multiply) (None, 8, 8, 1152) 0 block6c_activation[0][0] \n",
" block6c_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_project_conv (Conv2D) (None, 8, 8, 192) 221184 block6c_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_project_bn (BatchNormal (None, 8, 8, 192) 768 block6c_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_drop (FixedDropout) (None, 8, 8, 192) 0 block6c_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6c_add (Add) (None, 8, 8, 192) 0 block6c_drop[0][0] \n",
" block6b_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_expand_conv (Conv2D) (None, 8, 8, 1152) 221184 block6c_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_expand_bn (BatchNormali (None, 8, 8, 1152) 4608 block6d_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_expand_activation (Acti (None, 8, 8, 1152) 0 block6d_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_dwconv (DepthwiseConv2D (None, 8, 8, 1152) 28800 block6d_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_bn (BatchNormalization) (None, 8, 8, 1152) 4608 block6d_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_activation (Activation) (None, 8, 8, 1152) 0 block6d_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_se_squeeze (GlobalAvera (None, 1152) 0 block6d_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_se_reshape (Reshape) (None, 1, 1, 1152) 0 block6d_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block6d_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block6d_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_se_excite (Multiply) (None, 8, 8, 1152) 0 block6d_activation[0][0] \n",
" block6d_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_project_conv (Conv2D) (None, 8, 8, 192) 221184 block6d_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_project_bn (BatchNormal (None, 8, 8, 192) 768 block6d_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_drop (FixedDropout) (None, 8, 8, 192) 0 block6d_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block6d_add (Add) (None, 8, 8, 192) 0 block6d_drop[0][0] \n",
" block6c_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_expand_conv (Conv2D) (None, 8, 8, 1152) 221184 block6d_add[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_expand_bn (BatchNormali (None, 8, 8, 1152) 4608 block7a_expand_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_expand_activation (Acti (None, 8, 8, 1152) 0 block7a_expand_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_dwconv (DepthwiseConv2D (None, 8, 8, 1152) 10368 block7a_expand_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_bn (BatchNormalization) (None, 8, 8, 1152) 4608 block7a_dwconv[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_activation (Activation) (None, 8, 8, 1152) 0 block7a_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_se_squeeze (GlobalAvera (None, 1152) 0 block7a_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_se_reshape (Reshape) (None, 1, 1, 1152) 0 block7a_se_squeeze[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_se_reduce (Conv2D) (None, 1, 1, 48) 55344 block7a_se_reshape[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_se_expand (Conv2D) (None, 1, 1, 1152) 56448 block7a_se_reduce[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_se_excite (Multiply) (None, 8, 8, 1152) 0 block7a_activation[0][0] \n",
" block7a_se_expand[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_project_conv (Conv2D) (None, 8, 8, 320) 368640 block7a_se_excite[0][0] \n",
"__________________________________________________________________________________________________\n",
"block7a_project_bn (BatchNormal (None, 8, 8, 320) 1280 block7a_project_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"top_conv (Conv2D) (None, 8, 8, 1280) 409600 block7a_project_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"top_bn (BatchNormalization) (None, 8, 8, 1280) 5120 top_conv[0][0] \n",
"__________________________________________________________________________________________________\n",
"top_activation (Activation) (None, 8, 8, 1280) 0 top_bn[0][0] \n",
"__________________________________________________________________________________________________\n",
"avg_pool (GlobalAveragePooling2 (None, 1280) 0 top_activation[0][0] \n",
"__________________________________________________________________________________________________\n",
"dropout_1 (Dropout) (None, 1280) 0 avg_pool[0][0] \n",
"__________________________________________________________________________________________________\n",
"dense_1 (Dense) (None, 6) 7686 dropout_1[0][0] \n",
"==================================================================================================\n",
"Total params: 4,057,250\n",
"Trainable params: 4,015,234\n",
"Non-trainable params: 42,016\n",
"__________________________________________________________________________________________________\n"
]
}
],
"source": [
"base_model = efn.EfficientNetB0(weights = 'imagenet', include_top = False, \\\n",
" pooling = 'avg', input_shape = (HEIGHT, WIDTH, 3))\n",
"x = base_model.output\n",
"x = Dropout(0.125)(x)\n",
"output_layer = Dense(6, activation = 'sigmoid')(x)\n",
"model = Model(inputs=base_model.input, outputs=output_layer)\n",
"model.compile(optimizer = Adam(learning_rate = 0.0001), \n",
" loss = 'binary_crossentropy',\n",
" metrics = ['acc', tf.keras.metrics.AUC()])\n",
"model.summary()"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(636396, 40622)"
]
},
"execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# https://github.com/trent-b/iterative-stratification\n",
"# Mutlilabel stratification\n",
"splits = MultilabelStratifiedShuffleSplit(n_splits = 2, test_size = TEST_SIZE, random_state = SEED)\n",
"file_names = train_final_df.index\n",
"labels = train_final_df.values\n",
"# Lets take only the first split\n",
"split = next(splits.split(file_names, labels))\n",
"train_idx = split[0]\n",
"valid_idx = split[1]\n",
"submission_predictions = []\n",
"len(train_idx), len(valid_idx)"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [],
"source": [
"# train data generator\n",
"data_generator_train = TrainDataGenerator(train_final_df.iloc[train_idx], \n",
" train_final_df.iloc[train_idx], \n",
" TRAIN_BATCH_SIZE, \n",
" (WIDTH, HEIGHT),\n",
" augment = True)\n",
"\n",
"# validation data generator\n",
"data_generator_val = TrainDataGenerator(train_final_df.iloc[valid_idx], \n",
" train_final_df.iloc[valid_idx], \n",
" VALID_BATCH_SIZE, \n",
" (WIDTH, HEIGHT),\n",
" augment = False)"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(19888, 635)"
]
},
"execution_count": 27,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"len(data_generator_train), len(data_generator_val)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Competition evaluation metric is evaluated based on weighted log loss but we haven't given weights for each subtype but as per discussion from this thread https://www.kaggle.com/c/rsna-intracranial-hemorrhage-detection/discussion/109526#latest-630190 any has a wieght of 2 than other types below sample is taken from the discussion threas"
]
},
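{
"cell_type": "markdown",
"metadata": {},
"source": [
"Assuming those weights, $w = (2, 1, 1, 1, 1, 1)$ over (any, epidural, intraparenchymal, intraventricular, subarachnoid, subdural), the per-image value computed by `weighted_loss` / `weighted_log_loss_metric` below is the normalized weighted average of the per-class binary cross-entropies:\n",
"\n",
"$$\\text{loss} = -\\frac{1}{\\sum_k w_k}\\sum_{k=1}^{6} w_k\\left[\\,y_k \\log \\hat{y}_k + (1 - y_k)\\log(1 - \\hat{y}_k)\\,\\right]$$\n",
"\n",
"where $y_k$ is the ground-truth label and $\\hat{y}_k$ the predicted probability for class $k$; the final score is the mean of this quantity over all images."
]
},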
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [],
"source": [
"from keras import backend as K\n",
"\n",
"def weighted_log_loss(y_true, y_pred):\n",
" \"\"\"\n",
" Can be used as the loss function in model.compile()\n",
" ---------------------------------------------------\n",
" \"\"\"\n",
" \n",
" class_weights = np.array([2., 1., 1., 1., 1., 1.])\n",
" \n",
" eps = K.epsilon()\n",
" \n",
" y_pred = K.clip(y_pred, eps, 1.0-eps)\n",
"\n",
" out = -( y_true * K.log( y_pred) * class_weights\n",
" + (1.0 - y_true) * K.log(1.0 - y_pred) * class_weights)\n",
" \n",
" return K.mean(out, axis=-1)\n",
"\n",
"\n",
"def _normalized_weighted_average(arr, weights=None):\n",
" \"\"\"\n",
" A simple Keras implementation that mimics that of \n",
" numpy.average(), specifically for this competition\n",
" \"\"\"\n",
" \n",
" if weights is not None:\n",
" scl = K.sum(weights)\n",
" weights = K.expand_dims(weights, axis=1)\n",
" return K.sum(K.dot(arr, weights), axis=1) / scl\n",
" return K.mean(arr, axis=1)\n",
"\n",
"\n",
"def weighted_loss(y_true, y_pred):\n",
" \"\"\"\n",
" Will be used as the metric in model.compile()\n",
" ---------------------------------------------\n",
" \n",
" Similar to the custom loss function 'weighted_log_loss()' above\n",
" but with normalized weights, which should be very similar \n",
" to the official competition metric:\n",
" https://www.kaggle.com/kambarakun/lb-probe-weights-n-of-positives-scoring\n",
" and hence:\n",
" sklearn.metrics.log_loss with sample weights\n",
" \"\"\"\n",
" \n",
" class_weights = K.variable([2., 1., 1., 1., 1., 1.])\n",
" \n",
" eps = K.epsilon()\n",
" \n",
" y_pred = K.clip(y_pred, eps, 1.0-eps)\n",
"\n",
" loss = -( y_true * K.log( y_pred)\n",
" + (1.0 - y_true) * K.log(1.0 - y_pred))\n",
" \n",
" loss_samples = _normalized_weighted_average(loss, class_weights)\n",
" \n",
" return K.mean(loss_samples)\n",
"\n",
"\n",
"def weighted_log_loss_metric(trues, preds):\n",
" \"\"\"\n",
" Will be used to calculate the log loss \n",
" of the validation set in PredictionCheckpoint()\n",
" ------------------------------------------\n",
" \"\"\"\n",
" class_weights = [2., 1., 1., 1., 1., 1.]\n",
" \n",
" epsilon = 1e-7\n",
" \n",
" preds = np.clip(preds, epsilon, 1-epsilon)\n",
" loss = trues * np.log(preds) + (1 - trues) * np.log(1 - preds)\n",
" loss_samples = np.average(loss, axis=1, weights=class_weights)\n",
"\n",
" return - loss_samples.mean()"
]
},
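{
"cell_type": "markdown",
"metadata": {},
"source": [
"A quick, illustrative sanity check of `weighted_log_loss_metric` (not part of the original pipeline): with all-zero labels and constant 0.5 predictions every class contributes $\\log 2$, so the weighted average should also be $\\log 2 \\approx 0.693$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sanity check on dummy arrays (4 images, 6 classes)\n",
"_dummy_trues = np.zeros((4, 6))      # pretend no hemorrhage of any type is present\n",
"_dummy_preds = np.full((4, 6), 0.5)  # constant 0.5 predictions\n",
"weighted_log_loss_metric(_dummy_trues, _dummy_preds)  # expected ~0.693 (= log 2)"
]
},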
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [],
"source": [
"filepath=\"model.h5\"\n",
"checkpoint = ModelCheckpoint(filepath, monitor='val_loss', verbose=1, \\\n",
" save_best_only=True, mode='min')\n",
"\n",
"callbacks_list = [checkpoint]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For a single epoch we are going to train only last 5 layers of Efficient. Since we have a large number of images around 600k so its better to train the all the layers on the whole train dataset but due its high computation resources required to train we only goin to train last five layers on whole dataset and for rest of epochs we only train on a sample of dataset but will train all the layers."
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [],
"source": [
"train = False"
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {},
"outputs": [],
"source": [
"if train:\n",
" if not os.path.isfile('../input/orginal-087-eff/model.h5'):\n",
" for layer in model.layers[:-5]:\n",
" layer.trainable = False\n",
" model.compile(optimizer = Adam(learning_rate = 0.0001), \n",
" loss = 'binary_crossentropy',\n",
" metrics = ['acc'])\n",
"\n",
" model.fit_generator(generator = data_generator_train,\n",
" validation_data = data_generator_val,\n",
" epochs = 2,\n",
" callbacks = callbacks_list,\n",
" verbose = 1)"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [],
"source": [
"if train:\n",
" for base_layer in model.layers[:-1]:\n",
" base_layer.trainable = True\n",
"\n",
" model.load_weights('model.h5')\n",
"\n",
" model.compile(optimizer = Adam(learning_rate = 0.0004), \n",
" loss = 'binary_crossentropy',\n",
" metrics = ['acc'])\n",
" model.fit_generator(generator = data_generator_train,\n",
" validation_data = data_generator_val,\n",
" steps_per_epoch=len(data_generator_train)/6,\n",
" epochs = 10,\n",
" callbacks = callbacks_list,\n",
" verbose = 1)"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Collecting gdown\n",
" Downloading https://files.pythonhosted.org/packages/b0/b4/a8e9d0b02bca6aa53087001abf064cc9992bda11bd6840875b8098d93573/gdown-3.8.3.tar.gz\n",
"Requirement already satisfied: filelock in /opt/conda/lib/python3.6/site-packages (from gdown) (3.0.12)\n",
"Requirement already satisfied: requests in /opt/conda/lib/python3.6/site-packages (from gdown) (2.22.0)\n",
"Requirement already satisfied: six in /opt/conda/lib/python3.6/site-packages (from gdown) (1.12.0)\n",
"Requirement already satisfied: tqdm in /opt/conda/lib/python3.6/site-packages (from gdown) (4.36.1)\n",
"Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /opt/conda/lib/python3.6/site-packages (from requests->gdown) (1.24.2)\n",
"Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /opt/conda/lib/python3.6/site-packages (from requests->gdown) (3.0.4)\n",
"Requirement already satisfied: idna<2.9,>=2.5 in /opt/conda/lib/python3.6/site-packages (from requests->gdown) (2.8)\n",
"Requirement already satisfied: certifi>=2017.4.17 in /opt/conda/lib/python3.6/site-packages (from requests->gdown) (2019.9.11)\n",
"Building wheels for collected packages: gdown\n",
" Building wheel for gdown (setup.py) ... \u001b[?25ldone\n",
"\u001b[?25h Created wheel for gdown: filename=gdown-3.8.3-cp36-none-any.whl size=8850 sha256=ca7bf131547dd1503032ee6ec7567ff06fb7ddad8d44a32f00f874aadbd01a5e\n",
" Stored in directory: /tmp/.cache/pip/wheels/a7/9d/16/9e0bda9a327ff2cddaee8de48a27553fb1efce73133593d066\n",
"Successfully built gdown\n",
"Installing collected packages: gdown\n",
"Successfully installed gdown-3.8.3\n"
]
}
],
"source": [
"!pip install gdown"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading...\n",
"From: https://drive.google.com/uc?id=1kZmMCCBOWSjCZjz2XWaouDIj5gFn2D-q\n",
"To: /kaggle/working/model (4).h5\n",
"49.2MB [00:03, 14.6MB/s]\n"
]
}
],
"source": [
"!gdown https://drive.google.com/uc?id=1kZmMCCBOWSjCZjz2XWaouDIj5gFn2D-q"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {},
"outputs": [],
"source": [
"!cp \"model (4).h5\" model.h5"
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1228/1228 [==============================] - 856s 697ms/step\n"
]
},
{
"data": {
"text/plain": [
"(78592, 6)"
]
},
"execution_count": 37,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"model.load_weights('model.h5')\n",
"\n",
"preds = model.predict_generator(TestDataGenerator(test_df.index, None, VALID_BATCH_SIZE, \\\n",
" (WIDTH, HEIGHT), path_test_img), \n",
" verbose=1)\n",
"preds.shape"
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {},
"outputs": [],
"source": [
"from tqdm import tqdm"
]
},
{
"cell_type": "code",
"execution_count": 39,
"metadata": {},
"outputs": [],
"source": [
"cols = list(train_final_df.columns)"
]
},
{
"cell_type": "code",
"execution_count": 40,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"100%|█████████▉| 78545/78592 [00:01<00:00, 51807.96it/s]\n"
]
}
],
"source": [
"# We have preditions for each of the image\n",
"# We need to make 6 rows for each of file according to the subtype\n",
"ids = []\n",
"values = []\n",
"for i, j in tqdm(zip(preds, test_df.index.to_list()), total=preds.shape[0]):\n",
"# print(i, j)\n",
" # i=[any_prob, epidural_prob, intraparenchymal_prob, intraventricular_prob, subarachnoid_prob, subdural_prob]\n",
" # j = filename ==> ID_xyz.dcm\n",
" for k in range(i.shape[0]):\n",
" ids.append([j.replace('.dcm', '_' + cols[k])])\n",
" values.append(i[k]) "
]
},
{
"cell_type": "code",
"execution_count": 41,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>0</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>ID_000012eaf_any</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>ID_000012eaf_epidural</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>ID_000012eaf_intraparenchymal</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>ID_000012eaf_intraventricular</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>ID_000012eaf_subarachnoid</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" 0\n",
"0 ID_000012eaf_any\n",
"1 ID_000012eaf_epidural\n",
"2 ID_000012eaf_intraparenchymal\n",
"3 ID_000012eaf_intraventricular\n",
"4 ID_000012eaf_subarachnoid"
]
},
"execution_count": 41,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"df = pd.DataFrame(data=ids)\n",
"df.head()"
]
},
{
"cell_type": "code",
"execution_count": 42,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>ID</th>\n",
" <th>Label</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>ID_28fbab7eb_epidural</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>ID_28fbab7eb_intraparenchymal</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>ID_28fbab7eb_intraventricular</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>ID_28fbab7eb_subarachnoid</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>ID_28fbab7eb_subdural</td>\n",
" <td>0.5</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" ID Label\n",
"0 ID_28fbab7eb_epidural 0.5\n",
"1 ID_28fbab7eb_intraparenchymal 0.5\n",
"2 ID_28fbab7eb_intraventricular 0.5\n",
"3 ID_28fbab7eb_subarachnoid 0.5\n",
"4 ID_28fbab7eb_subdural 0.5"
]
},
"execution_count": 42,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sample_df = pd.read_csv(input_folder + 'stage_1_sample_submission.csv')\n",
"sample_df.head()"
]
},
{
"cell_type": "code",
"execution_count": 43,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>ID</th>\n",
" <th>Label</th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>0</th>\n",
" <td>ID_000012eaf_any</td>\n",
" <td>0.008506</td>\n",
" </tr>\n",
" <tr>\n",
" <th>1</th>\n",
" <td>ID_000012eaf_epidural</td>\n",
" <td>0.000114</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2</th>\n",
" <td>ID_000012eaf_intraparenchymal</td>\n",
" <td>0.001682</td>\n",
" </tr>\n",
" <tr>\n",
" <th>3</th>\n",
" <td>ID_000012eaf_intraventricular</td>\n",
" <td>0.000329</td>\n",
" </tr>\n",
" <tr>\n",
" <th>4</th>\n",
" <td>ID_000012eaf_subarachnoid</td>\n",
" <td>0.000926</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" ID Label\n",
"0 ID_000012eaf_any 0.008506\n",
"1 ID_000012eaf_epidural 0.000114\n",
"2 ID_000012eaf_intraparenchymal 0.001682\n",
"3 ID_000012eaf_intraventricular 0.000329\n",
"4 ID_000012eaf_subarachnoid 0.000926"
]
},
"execution_count": 43,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"df['Label'] = values\n",
"df.columns = sample_df.columns\n",
"df.head()"
]
},
{
"cell_type": "code",
"execution_count": 44,
"metadata": {},
"outputs": [],
"source": [
"df.to_csv('submission.csv', index=False)"
]
},
{
"cell_type": "code",
"execution_count": 45,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<a href=submission.csv>Download CSV file</a>"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 45,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"create_download_link(filename='submission.csv')"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.5"
}
},
"nbformat": 4,
"nbformat_minor": 1
}