{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In this example, we will use the tensorflow.keras package to create a Keras image classification application with the MobileNetV2 model, and then migrate the application to Cluster Serving step by step."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Original Keras application\n",
    "We first show the original Keras application, which downloads and preprocesses the data, then creates the MobileNetV2 model and runs prediction."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "import tensorflow as tf\n",
    "import os\n",
    "import PIL"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "'2.2.0'"
      ]
     },
     "execution_count": 2,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "tf.__version__"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Found 1000 images belonging to 2 classes.\n"
     ]
    }
   ],
   "source": [
    "# Obtain data from url:\"https://storage.googleapis.com/mledu-datasets/cats_and_dogs_filtered.zip\"\n",
    "zip_file = tf.keras.utils.get_file(origin=\"https://storage.googleapis.com/mledu-datasets/cats_and_dogs_filtered.zip\",\n",
    "                                   fname=\"cats_and_dogs_filtered.zip\", extract=True)\n",
    "\n",
    "# Find the directory of the validation set\n",
    "base_dir, _ = os.path.splitext(zip_file)\n",
    "test_dir = os.path.join(base_dir, 'validation')\n",
    "# Set image size to 160x160x3\n",
    "image_size = 160\n",
    "\n",
    "# Rescale all images by 1./255\n",
    "test_datagen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1./255)\n",
    "\n",
    "# Flow images from the directory into test_generator\n",
    "test_generator = test_datagen.flow_from_directory(\n",
    "                test_dir,\n",
    "                target_size=(image_size, image_size),\n",
    "                batch_size=1,\n",
    "                class_mode='binary')"
   ]
  },
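  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Optionally, you can check which label the generator assigned to each folder. This is only a sanity check and is not required for the rest of the tutorial; `class_indices` is a standard attribute of Keras directory iterators."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional sanity check: mapping from class folder name to label index,\n",
    "# e.g. {'cats': 0, 'dogs': 1} for this dataset.\n",
    "print(test_generator.class_indices)"
   ]
  },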
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create the base model from the pre-trained model MobileNet V2\n",
    "IMG_SHAPE = (160, 160, 3)\n",
    "model = tf.keras.applications.MobileNetV2(input_shape=IMG_SHAPE,\n",
    "                                          include_top=False,\n",
    "                                          weights='imagenet')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In Keras, the input can be an ndarray or a generator, so we could simply call `model.predict(test_generator)`. To keep the example small, we only feed the first record to the model."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[[[[0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.8406992  ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]]\n",
      "\n",
      "  [[0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.81465054\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.6572695\n",
      "    0.23970175]\n",
      "   [0.         0.         0.         ... 0.         1.2423501\n",
      "    0.8024192 ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]]\n",
      "\n",
      "  [[0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         5.185735\n",
      "    0.21723604]\n",
      "   [0.         0.         0.         ... 0.         4.6399093\n",
      "    0.40124178]\n",
      "   [0.3284886  0.         0.         ... 0.         5.295811\n",
      "    3.4133787 ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]]\n",
      "\n",
      "  [[0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.52712107\n",
      "    0.20341969]\n",
      "   [0.         0.         0.         ... 0.         0.8279238\n",
      "    0.42696333]\n",
      "   [0.         0.         0.         ... 0.         1.0344229\n",
      "    1.5225778 ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]]\n",
      "\n",
      "  [[0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    1.3237557 ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    1.3395147 ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]\n",
      "   [0.         0.         0.         ... 0.         0.\n",
      "    0.        ]]]]\n"
     ]
    }
   ],
   "source": [
    "prediction = model.predict(test_generator.next()[0])\n",
    "print(prediction)"
   ]
  },
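  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Since the model is created with `include_top=False`, the output is a feature map rather than class probabilities. A quick (optional) check of its shape, which should be `(1, 5, 5, 1280)` for a single 160x160 input:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional: the MobileNetV2 feature extractor returns a 4-D feature map\n",
    "# (batch, height, width, channels) because include_top=False.\n",
    "print(prediction.shape)"
   ]
  },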
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Great! The Keras application is now complete."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Export TensorFlow SavedModel\n",
    "Next, we migrate the application to Cluster Serving. The first step is to save the model in the SavedModel format."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "WARNING:tensorflow:From /home/user/anaconda3/envs/rec/lib/python3.6/site-packages/tensorflow/python/ops/resource_variable_ops.py:1817: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.\n",
      "Instructions for updating:\n",
      "If using Keras pass *_constraint arguments to layers.\n",
      "INFO:tensorflow:Assets written to: /tmp/transfer_learning_mobilenetv2/assets\n",
      "assets\tsaved_model.pb\tvariables\n"
     ]
    }
   ],
   "source": [
    "# Save the model to /tmp/transfer_learning_mobilenetv2 in SavedModel format\n",
    "model.save('/tmp/transfer_learning_mobilenetv2')\n",
    "! ls /tmp/transfer_learning_mobilenetv2"
   ]
  },
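  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Optionally, you can inspect the exported SavedModel with the `saved_model_cli` tool that ships with TensorFlow. This is only a sanity check of the serving signature (input and output tensor names and shapes); it is not required by Cluster Serving."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional sanity check: show the serving_default signature of the SavedModel.\n",
    "! saved_model_cli show --dir /tmp/transfer_learning_mobilenetv2 --tag_set serve --signature_def serving_default"
   ]
  },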
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Deploy Cluster Serving\n",
    "With the model prepared, we can deploy it on Cluster Serving.\n",
    "\n",
    "First, install Cluster Serving:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Requirement already satisfied: bigdl-serving in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (0.9.0)\n",
      "Requirement already satisfied: opencv-python in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from bigdl-serving) (4.5.1.48)\n",
      "Requirement already satisfied: httpx in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from bigdl-serving) (0.16.1)\n",
      "Requirement already satisfied: pyarrow in /home/user/.local/lib/python3.6/site-packages (from bigdl-serving) (1.0.1)\n",
      "Requirement already satisfied: redis in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from bigdl-serving) (3.5.3)\n",
      "Requirement already satisfied: pyyaml in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from bigdl-serving) (5.4.1)\n",
      "Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from httpx->bigdl-serving) (1.4.0)\n",
      "Requirement already satisfied: certifi in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from httpx->bigdl-serving) (2020.12.5)\n",
      "Requirement already satisfied: httpcore==0.12.* in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from httpx->bigdl-serving) (0.12.3)\n",
      "Requirement already satisfied: sniffio in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from httpx->bigdl-serving) (1.2.0)\n",
      "Requirement already satisfied: h11==0.* in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from httpcore==0.12.*->httpx->bigdl-serving) (0.12.0)\n",
      "Requirement already satisfied: contextvars>=2.1 in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from sniffio->httpx->bigdl-serving) (2.4)\n",
      "Requirement already satisfied: immutables>=0.9 in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from contextvars>=2.1->sniffio->httpx->bigdl-serving) (0.14)\n",
      "Requirement already satisfied: idna in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from rfc3986[idna2008]<2,>=1.3->httpx->bigdl-serving) (2.10)\n",
      "Requirement already satisfied: numpy>=1.13.3 in /home/user/anaconda3/envs/rec/lib/python3.6/site-packages (from opencv-python->bigdl-serving) (1.19.2)\n",
      "\u001b[33mWARNING: You are using pip version 20.3.3; however, version 21.0.1 is available.\n",
      "You should consider upgrading via the '/home/user/anaconda3/envs/rec/bin/python -m pip install --upgrade pip' command.\u001b[0m\n"
     ]
    }
   ],
   "source": [
    "! pip install bigdl-serving"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Cluster Serving has been properly set up.\n",
      "You did not specify BIGDL_VERSION, will download 0.9.0\n",
      "BIGDL_VERSION is 0.9.0\n",
      "BIGDL_VERSION is 0.12.1\n",
      "SPARK_VERSION is 2.4.3\n",
      "2.4\n",
      "--2021-02-07 10:01:46--  https://repo1.maven.org/maven2/com/intel/analytics/bigdl/bigdl-spark_2.4.3/0.9.0/bigdl-spark_2.4.3-0.9.0-serving.jar\n",
      "Resolving child-prc.intel.com (child-prc.intel.com)... You are installing Cluster Serving by pip, downloading...\n",
      "\n",
      "SIGHUP received.\n",
      "Redirecting output to ‘wget-log.2’.\n"
     ]
    }
   ],
   "source": [
    "# Go to a new directory and initialize the environment\n",
    "! mkdir cluster-serving\n",
    "os.chdir('cluster-serving')\n",
    "! cluster-serving-init"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "  2150K .......... .......... .......... .......... ..........  0% 27.0K 5h37m\r\n",
      "  2200K .......... .......... .......... .......... ..........  0% 33.6K 5h36m\r\n",
      "  2250K .......... .......... .......... .......... ..........  0% 27.3K 5h37m\r\n",
      "  2300K .......... .......... .......... .......... ..........  0% 30.3K 5h36m\r\n",
      "  2350K .......... .......... .......... .......... ..........  0% 29.7K 5h36m\r\n",
      "  2400K .......... .......... .......... .......... ..........  0% 23.7K 5h38m\r\n",
      "  2450K .......... .......... .......... .......... ..........  0% 23.4K 5h39m\r\n",
      "  2500K .......... .......... .......... .......... ..........  0% 23.4K 5h41m\r\n",
      "  2550K .......... .......... .......... .......... ..........  0% 22.3K 5h43m\r\n",
      "  2600K .......... .......... .......... ....."
     ]
    }
   ],
   "source": [
    "! tail wget-log.2"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# If you encounter a slow download like the one above, you can fetch the jar directly:\n",
    "# ! wget https://repo1.maven.org/maven2/com/intel/analytics/bigdl/bigdl-spark_2.4.3/0.9.0/bigdl-spark_2.4.3-0.9.0-serving.jar\n",
    "\n",
    "# If you downloaded with wget, or `ls` shows \"bigdl-xxx-serving.jar\", rename the jar with `mv *serving.jar bigdl.jar` after the download finishes."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "config.yaml  bigdl.jar\r\n"
     ]
    }
   ],
   "source": [
    "# After initialization finishes, check the directory\n",
    "! ls"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We set the model path in `config.yaml` as follows (configuration details are at [Cluster Serving Configuration](https://github.com/intel-analytics/bigdl/blob/master/docs/docs/ClusterServingGuide/ProgrammingGuide.md#2-configuration))."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "## BigDL Cluster Serving\n",
    "\n",
    "model:\n",
    "  # model path must be provided\n",
    "  path: /tmp/transfer_learning_mobilenetv2"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "## BigDL Cluster Serving\r\n",
      "\r\n",
      "model:\r\n",
      "  # model path must be provided\r\n",
      "  path: /tmp/transfer_learning_mobilenetv2\r\n",
      "  # name, default is serving_stream, you need to specify if running multiple servings\r\n",
      "  name:\r\n",
      "data:\r\n",
      "  # default, localhost:6379\r\n",
      "  src:\r\n"
     ]
    }
   ],
   "source": [
    "! head config.yaml"
   ]
  },
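  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If you prefer not to edit the file by hand, the model path can also be set programmatically. A minimal sketch, assuming PyYAML is available (it is pulled in by `bigdl-serving`) and that `config.yaml` keeps the `model`/`path` layout shown above:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sketch: set model.path in config.yaml with PyYAML instead of a text editor.\n",
    "# Note: round-tripping with safe_load/safe_dump drops the comments in the file.\n",
    "import yaml\n",
    "\n",
    "with open('config.yaml') as f:\n",
    "    conf = yaml.safe_load(f)\n",
    "\n",
    "conf['model']['path'] = '/tmp/transfer_learning_mobilenetv2'\n",
    "\n",
    "with open('config.yaml', 'w') as f:\n",
    "    yaml.safe_dump(conf, f, default_flow_style=False)"
   ]
  },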
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Start Cluster Serving\n",
    "\n",
    "Cluster Serving requires Flink and Redis to be installed, with the corresponding environment variables set; see the [Cluster Serving Installation Guide](https://github.com/intel-analytics/bigdl/blob/master/docs/docs/ClusterServingGuide/ProgrammingGuide.md#1-installation) for details.\n",
    "\n",
    "The Flink cluster must be running before Cluster Serving starts. If it is not, run the following to start a local Flink cluster."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Starting cluster.\n",
      "Starting standalonesession daemon on host my-PC.\n",
      "Starting taskexecutor daemon on host my-PC.\n"
     ]
    }
   ],
   "source": [
    "! $FLINK_HOME/bin/start-cluster.sh"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "After configuration, start Cluster Serving with `cluster-serving-start` (details are at [Cluster Serving Programming Guide](https://github.com/intel-analytics/bigdl/blob/master/docs/docs/ClusterServingGuide/ProgrammingGuide.md#3-launching-service))."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "model_path=\"/tmp/transfer_learning_mobilenetv2\"\n",
      "redis_timeout=\"5000\"\n",
      "Redis maxmemory is not set, using default value 8G\n",
      "redis server started, please check log in redis.log\n",
      "OK\n",
      "OK\n",
      "OK\n",
      "redis config maxmemory set to 8G\n",
      "OK\n",
      "OK\n",
      "SLF4J: Class path contains multiple SLF4J bindings.\n",
      "SLF4J: Found binding in [jar:file:/home/user/dep/flink-1.11.2/lib/bigdl-spark_2.4.3-0.9.0-SNAPSHOT-serving.jar!/org/slf4j/impl/StaticLoggerBinder.class]\n",
      "SLF4J: Found binding in [jar:file:/home/user/dep/flink-1.11.2/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]\n",
      "SLF4J: Found binding in [jar:file:/home/user/dep/flink-1.11.2/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]\n",
      "SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\n",
      "SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]\n",
      "log4j:WARN No appenders could be found for logger (org.apache.flink.client.cli.CliFrontend).\n",
      "log4j:WARN Please initialize the log4j system properly.\n",
      "log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.\n",
      "Starting new Cluster Serving job.\n",
      "Cluster Serving job submitted, check log in log-cluster_serving-serving_stream.txt\n",
      "To list Cluster Serving job status, use cluster-serving-cli list\n",
      "SLF4J: Class path contains multiple SLF4J bindings.\n",
      "SLF4J: Found binding in [jar:file:/home/user/dep/flink-1.11.2/lib/bigdl-spark_2.4.3-0.9.0-SNAPSHOT-serving.jar!/org/slf4j/impl/StaticLoggerBinder.class]\n",
      "SLF4J: Found binding in [jar:file:/home/user/dep/flink-1.11.2/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]\n",
      "SLF4J: Found binding in [jar:file:/home/user/dep/flink-1.11.2/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]\n",
      "SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\n",
      "SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]\n",
      "log4j:WARN No appenders could be found for logger (org.apache.flink.client.cli.CliFrontend).\n",
      "log4j:WARN Please initialize the log4j system properly.\n",
      "log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.\n",
      "[Full GC (Metadata GC Threshold)  32304K->20432K(1030144K), 0.0213821 secs]\n"
     ]
    }
   ],
   "source": [
    "! cluster-serving-start"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Prediction using Cluster Serving\n",
    "Next, we run prediction from the Cluster Serving Python client."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "redis group exist, will not create new one\n",
      "redis group exist, will not create new one\n"
     ]
    }
   ],
   "source": [
    "from bigdl.serving.client import InputQueue, OutputQueue\n",
    "input_queue = InputQueue()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Cluster Serving only supports NdArray as input, so we first take an ndarray from the generator (if you are unsure how to transform your input into an NdArray, see the [data transform guide](https://github.com/intel-analytics/bigdl/tree/master/docs/docs/ClusterServingGuide/OtherFrameworkUsers#data))."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "array([[[[0.12156864, 0.11764707, 0.10980393],\n",
       "         [0.12156864, 0.11764707, 0.10980393],\n",
       "         [0.11764707, 0.1137255 , 0.10588236],\n",
       "         ...,\n",
       "         [0.28627452, 0.29803923, 0.22352943],\n",
       "         [0.24705884, 0.25882354, 0.18431373],\n",
       "         [0.24705884, 0.24705884, 0.20000002]],\n",
       "\n",
       "        [[0.15686275, 0.15294118, 0.14509805],\n",
       "         [0.13725491, 0.13333334, 0.1254902 ],\n",
       "         [0.09803922, 0.09411766, 0.08627451],\n",
       "         ...,\n",
       "         [0.31764707, 0.3254902 , 0.27450982],\n",
       "         [0.31764707, 0.3254902 , 0.27058825],\n",
       "         [0.2784314 , 0.2784314 , 0.2392157 ]],\n",
       "\n",
       "        [[0.21960786, 0.21568629, 0.20784315],\n",
       "         [0.23137257, 0.227451  , 0.21960786],\n",
       "         [0.24705884, 0.24313727, 0.23529413],\n",
       "         ...,\n",
       "         [0.29411766, 0.29803923, 0.27450982],\n",
       "         [0.26666668, 0.27058825, 0.2392157 ],\n",
       "         [0.30588236, 0.30588236, 0.26666668]],\n",
       "\n",
       "        ...,\n",
       "\n",
       "        [[0.35686275, 0.3019608 , 0.15686275],\n",
       "         [0.38431376, 0.29803923, 0.14509805],\n",
       "         [0.36862746, 0.25490198, 0.12156864],\n",
       "         ...,\n",
       "         [0.1764706 , 0.08627451, 0.01568628],\n",
       "         [0.16862746, 0.08627451, 0.00392157],\n",
       "         [0.1764706 , 0.08627451, 0.03137255]],\n",
       "\n",
       "        [[0.30980393, 0.2784314 , 0.13333334],\n",
       "         [0.3529412 , 0.29411766, 0.14117648],\n",
       "         [0.3529412 , 0.26666668, 0.12156864],\n",
       "         ...,\n",
       "         [0.1764706 , 0.08627451, 0.01568628],\n",
       "         [0.17254902, 0.08235294, 0.01176471],\n",
       "         [0.18039216, 0.09019608, 0.03529412]],\n",
       "\n",
       "        [[0.30588236, 0.27450982, 0.13333334],\n",
       "         [0.33333334, 0.28627452, 0.12941177],\n",
       "         [0.3372549 , 0.26666668, 0.11764707],\n",
       "         ...,\n",
       "         [0.19607845, 0.09411766, 0.03529412],\n",
       "         [0.18039216, 0.07843138, 0.02745098],\n",
       "         [0.1764706 , 0.08627451, 0.03137255]]]], dtype=float32)"
      ]
     },
     "execution_count": 20,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "arr = test_generator.next()[0]\n",
    "arr"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Write to Redis successful\n",
      "redis group exist, will not create new one\n",
      "Write to Redis successful\n"
     ]
    }
   ],
   "source": [
    "# Use the async API to put and get: pass a name argument on enqueue, then use the same name to query the result\n",
    "input_queue.enqueue('my-input', t=arr)\n",
    "output_queue = OutputQueue()\n",
    "prediction = output_queue.query('my-input')\n",
    "# Use the sync API to predict: this blocks until the result is returned or the call times out\n",
    "prediction = input_queue.predict(arr)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "array([[[0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 1.3543907 ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 4.1898136 ,\n",
       "         0.        , 0.        ]],\n",
       "\n",
       "       [[0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 3.286649  , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 4.0817494 , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 3.3224926 , 0.        , ..., 1.4220613 ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 4.9100547 ,\n",
       "         0.        , 0.        ]],\n",
       "\n",
       "       [[0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 1.5577714 , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 1.767426  , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 2.3534465 , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.21401057,\n",
       "         0.        , 0.        ]],\n",
       "\n",
       "       [[0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.34797698, 0.        ],\n",
       "        [0.        , 1.4496232 , 0.        , ..., 0.        ,\n",
       "         1.6221215 , 0.        ],\n",
       "        [0.        , 0.6171873 , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ]],\n",
       "\n",
       "       [[0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         1.192298  , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ],\n",
       "        [0.        , 0.        , 0.        , ..., 0.        ,\n",
       "         0.        , 0.        ]]], dtype=float32)"
      ]
     },
     "execution_count": 24,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "prediction"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If everything works well, the resulting `prediction` should be exactly the same NdArray as the output of the original Keras model."
   ]
  },
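  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "You can verify this numerically. A minimal sketch, assuming the only difference is the batch dimension (the Keras output has shape `(1, 5, 5, 1280)` while the serving result above is printed without the leading batch axis) and allowing for small floating-point differences:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sketch: compare the Cluster Serving result with the original Keras output.\n",
    "import numpy as np\n",
    "\n",
    "keras_out = model.predict(arr)          # shape (1, 5, 5, 1280)\n",
    "serving_out = np.asarray(prediction)    # result returned by input_queue.predict(arr)\n",
    "\n",
    "# squeeze removes the batch dimension so the two shapes match before comparing\n",
    "print(np.allclose(np.squeeze(keras_out), np.squeeze(serving_out), atol=1e-5))"
   ]
  },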
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Don't forget to delete the model saved for this tutorial\n",
    "! rm -rf /tmp/transfer_learning_mobilenetv2"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This is the end of the tutorial. If you have any questions, you can raise an issue at [BigDL GitHub](https://github.com/intel-analytics/bigdl/issues)."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.10"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}