diff --git a/.gitignore b/.gitignore
index 1c9730a5ad57cd..b1dcfda07ba3ec 100644
--- a/.gitignore
+++ b/.gitignore
@@ -12,3 +12,4 @@
 Makefile
 *~
 bazel-*
+.ipynb_checkpoints
diff --git a/demo/introduction/linear.ipynb b/demo/introduction/linear.ipynb
new file mode 100644
index 00000000000000..de81d68a1a5ee7
--- /dev/null
+++ b/demo/introduction/linear.ipynb
@@ -0,0 +1,721 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {
+    "collapsed": true
+   },
+   "source": [
+    "# Paddle Python Trainer API Usage Notes\n",
+    "\n",
+    "The Paddle Python Trainer API is currently experimental; much of its design, including the interfaces, is still subject to change.\n",
+    "\n",
+    "## Development plan\n",
+    "\n",
+    "1. Expose enough APIs through Paddle's swig bindings to drive training, preferably at the GradientMachine level, so that users can customize the training process.\n",
+    "2. On top of the swig API, settle on the shape of the Python user interface.\n",
+    "3. Turn the swig API into a C-API, so that Paddle training can be driven from multiple languages.\n",
+    "\n",
+    "## Intended outcome\n",
+    "\n",
+    "1. Users can run training entirely from the Paddle Python library.\n",
+    "   1. Information produced during training is passed to the Python side with strong types, e.g.\n",
+    "      * accuracy\n",
+    "      * cost\n",
+    "      * pass_id, batch_id\n",
+    "   2. Decisions such as how often to run tests are left entirely to the user.\n",
+    "      * By default all test data is evaluated at the end of each pass, but the user may change this freely.\n",
+    "   3. Users are free to choose how parameters are updated.\n",
+    "      * Train different parts of the network on different data.\n",
+    "      * One set of data trains the left half of the network, another set trains the right half.\n",
+    "   4. More convenient multi-objective learning.\n",
+    "\n",
+    "## The Python-side user interface\n",
+    "\n",
+    "### End-to-end example\n",
+    "\n",
+    "The example below uses a simple linear regression to walk through the user interface. It regresses the equation y=w*x+b over the variables x and y, with w and b preset to 2 and 0.3; the w and b inside the neural network are learned.\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "First, import a few packages. The Python training interface reuses much of Paddle's existing logic:\n",
+    "\n",
+    "1. Network configuration reuses trainer_config_helpers.\n",
+    "2. 
Data transfer reuses the PyDataProvider2 format.\n",
+    "\n",
+    "The line `from py_paddle.trainer import *` imports the training functions on Paddle's Python side, and `import py_paddle.swig_paddle as api` imports the low-level interface exposed through swig."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "from paddle.trainer_config_helpers import *\n",
+    "from paddle.trainer.PyDataProvider2 import *\n",
+    "from py_paddle.trainer import *\n",
+    "import py_paddle.swig_paddle as api"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "@network(inputs={\n",
+    "    'x': dense_vector(1), 'y': dense_vector(1)\n",
+    "}, learning_rate=1e-3, batch_size=12)\n",
+    "def linear_network(x, y):\n",
+    "    y_predict = fc_layer(input=x, param_attr=ParamAttr(name='w'), size=1,\n",
+    "                         act=LinearActivation(), bias_attr=ParamAttr(name='b'))\n",
+    "    cost = regression_cost(input=y_predict, label=y)\n",
+    "    return cost"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The cell above defines the structure of the network Paddle will train. The definition is written as a plain Python function.\n",
+    "\n",
+    "`@network` is a decorator that turns the function below it into Paddle's neural network description (protobuf). Its arguments are:\n",
+    "\n",
+    "* inputs: the input data types, given as a dict whose keys are the function's parameter names (here x and y) and whose values are the corresponding data types. Both are dense vectors here.\n",
+    "  * For the available data types, see the input_types of PyDataProvider2's @provider.\n",
+    "* The remaining arguments are Paddle's optimization settings; see `settings`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {
+    "collapsed": false,
+    "scrolled": true
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Help on function settings in module paddle.trainer_config_helpers.optimizers:\n",
+      "\n",
+      "settings(*args, **kwargs)\n",
+      "    Set the optimization method, learning rate, batch size, and other training\n",
+      "    settings. The currently supported algorithms are SGD and Async-SGD.\n",
+      "    \n",
+      "    .. warning::\n",
+      "    \n",
+      "        Note that the 'batch_size' in PaddlePaddle is not equal to global\n",
+      "        training batch size. 
It represents the single training process's batch\n",
+      "        size. If you use N processes to train one model, for example use three\n",
+      "        GPU machines, the global batch size is N*'batch_size'.\n",
+      "    \n",
+      "    :param batch_size: batch size for one training process.\n",
+      "    :type batch_size: int\n",
+      "    :param learning_rate: learning rate for SGD\n",
+      "    :type learning_rate: float\n",
+      "    :param learning_method: The extension optimization algorithms of gradient\n",
+      "                            descent, such as momentum, adagrad, rmsprop, etc.\n",
+      "                            Note that it should be instance with base type\n",
+      "                            BaseSGDOptimizer.\n",
+      "    :type learning_method: BaseSGDOptimizer\n",
+      "    :param regularization: The regularization method.\n",
+      "    :type regularization: BaseRegularization\n",
+      "    :param is_async: Is Async-SGD or not. Default value is False.\n",
+      "    :type is_async: bool\n",
+      "    :param model_average: Model Average Settings.\n",
+      "    :type model_average: ModelAverage\n",
+      "    :param gradient_clipping_threshold: gradient clipping threshold. If gradient\n",
+      "                                        value larger than some value, will be\n",
+      "                                        clipped.\n",
+      "    :type gradient_clipping_threshold: float\n",
+      "\n"
+     ]
+    }
+   ],
+   "source": [
+    "help(settings) # run this line to print document of settings method."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "What linear_network defines is the computation graph of the neural network; its return value is the optimization target.\n",
+    "\n",
+    "Through the decorator `@network`, the function is wrapped into a Python class. We then declare a network description instance, `linear`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "linear = linear_network()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "This description instance contains Paddle's computation graph, the order of the network inputs, and so on. The next few blocks can be run by hand to expand their output."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Help on class NetworkConfigImpl in module py_paddle.trainer.network:\n",
+      "\n",
+      "class NetworkConfigImpl(NetworkConfig)\n",
+      " |  Method resolution order:\n",
+      " |      NetworkConfigImpl\n",
+      " |      NetworkConfig\n",
+      " |      __builtin__.object\n",
+      " |  \n",
+      " |  Methods defined here:\n",
+      " |  \n",
+      " |  __init__(self)\n",
+      " |  \n",
+      " |  input_order(self)\n",
+      " |  \n",
+      " |  input_types(self)\n",
+      " |  \n",
+      " |  network_graph(self)\n",
+      " |  \n",
+      " |  optimize_graph(self)\n",
+      " |  \n",
+      " |  ----------------------------------------------------------------------\n",
+      " |  Methods inherited from NetworkConfig:\n",
+      " |  \n",
+      " |  provider(self, **kwargs)\n",
+      " |  \n",
+      " |  ----------------------------------------------------------------------\n",
+      " |  Data descriptors inherited from NetworkConfig:\n",
+      " |  \n",
+      " |  __dict__\n",
+      " |      dictionary for instance variables (if defined)\n",
+      " |  \n",
+      " |  __weakref__\n",
+      " |      list of weak references to the object (if defined)\n",
+      "\n"
+     ]
+    }
+   ],
+   "source": [
+    "help(linear_network) # run this line to print document of linear_network"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "{'y': , 'x': }\n"
+     ]
+    }
+   ],
+   "source": [
+    "print linear.input_types()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "type: \"nn\"\n",
+      "layers {\n",
+      "  name: \"y\"\n",
+      "  type: \"data\"\n",
+      "  size: 1\n",
+      "  active_type: \"\"\n",
+      "}\n",
+      "layers {\n",
+      "  name: \"x\"\n",
+      "  type: \"data\"\n",
+      "  size: 1\n",
+      "  active_type: \"\"\n",
+      "}\n",
+      "layers {\n",
+      "  name: \"__fc_layer_0__\"\n",
+      "  type: \"fc\"\n",
+      "  size: 1\n",
+      "  active_type: \"\"\n",
+      "  inputs {\n",
+      "    input_layer_name: \"x\"\n",
+      "    input_parameter_name: \"w\"\n",
+      "  }\n",
+      "  bias_parameter_name: \"b\"\n",
+      "}\n",
+      "layers {\n",
+      "  name: \"__regression_cost_0__\"\n",
+      "  type: \"square_error\"\n",
+      "  size: 1\n",
+      "  active_type: \"\"\n",
+      "  inputs {\n",
+      "    input_layer_name: \"__fc_layer_0__\"\n",
+      "  }\n",
+      "  inputs {\n",
+      "    input_layer_name: \"y\"\n",
+      "  }\n",
+      "  coeff: 1.0\n",
+      "}\n",
+      "parameters {\n",
+      "  name: \"w\"\n",
+      "  size: 1\n",
+      "  initial_mean: 0.0\n",
+      "  initial_std: 1.0\n",
+      "  dims: 1\n",
+      "  dims: 1\n",
+      "  initial_strategy: 0\n",
+      "  initial_smart: true\n",
+      "}\n",
+      "parameters {\n",
+      "  name: \"b\"\n",
+      "  size: 1\n",
+      "  initial_mean: 0.0\n",
+      "  initial_std: 1.0\n",
+      "  dims: 1\n",
+      "  dims: 1\n",
+      "  initial_strategy: 0\n",
+      "  initial_smart: true\n",
+      "}\n",
+      "input_layer_names: \"y\"\n",
+      "input_layer_names: \"x\"\n",
+      "output_layer_names: \"__regression_cost_0__\"\n",
+      "sub_models {\n",
+      "  name: \"root\"\n",
+      "  layer_names: \"y\"\n",
+      "  layer_names: \"x\"\n",
+      "  layer_names: \"__fc_layer_0__\"\n",
+      "  layer_names: \"__regression_cost_0__\"\n",
+      "  input_layer_names: \"y\"\n",
+      "  input_layer_names: \"x\"\n",
+      "  output_layer_names: \"__regression_cost_0__\"\n",
+      "  is_recurrent_layer_group: false\n",
+      "}\n",
+      "\n"
+     ]
+    }
+   ],
+   "source": [
+    "print linear.network_graph() # Paddle neural network protobuf definition"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 8,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "configs = {\n",
+    "    'w': 2,\n",
+    "    'b': 0.3\n",
+    "}"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Next we set the parameters of the linear regression: for `y=w*x+b`, w and b are set to 2 and 0.3. This dict is used by the dataprovider."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "import random\n",
+    "\n",
+    "@linear.provider()\n",
+    "def process(*args, **kwargs):\n",
+    "    for i in xrange(2000):\n",
+    "        x = random.random()\n",
+    "        yield {'x': [x], 'y': [configs['w'] * x + configs['b']]}\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The next step is to declare the data reader (DataProvider), which is itself a function, `process`.\n",
+    "\n",
+    "The main idea behind Paddle's PyDataProvider2 is that the user only needs to care about how to read **one sample** **from one file** and yield it in one of the supported data formats; Paddle takes care of the rest, such as batching and shuffling.\n",
+    "\n",
+    "Declaring this DataProvider is also done with a decorator. Note that this decorator is actually **a method of the linear instance**.\n",
+    "\n",
+    "The function takes the same arguments as in PyDataProvider2: the first is settings, the second is filename. Since this process function does not actually use any of them, it accepts arbitrary arguments via `*args, **kwargs`.\n",
+    "\n",
+    "Samples are returned with yield, and each one must be a **dict**."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 10,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Help on class DataProvider in module paddle.trainer.PyDataProvider2:\n",
+      "\n",
+      "class DataProvider(__builtin__.object)\n",
+      " |  Methods defined here:\n",
+      " |  \n",
+      " |  __init__(self, file_list, **kwargs)\n",
+      " |  \n",
+      " |  ----------------------------------------------------------------------\n",
+      " |  Data descriptors defined here:\n",
+      " |  \n",
+      " |  __dict__\n",
+      " |      dictionary for instance variables (if defined)\n",
+      " |  \n",
+      " |  __weakref__\n",
+      " |      list of weak references to the object (if defined)\n",
+      "\n"
+     ]
+    }
+   ],
+   "source": [
+    "help(process)"
+   ]
+  },
+  {
+   "cell_type": "code",
"execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "runner = RunnerBuilder(network=linear).with_train_data(method=process).build()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "下一步是构造一个Runner,Runner是Python Trainer API中的最基础数据类型。它具有的操作是\n", + "\n", + "* 执行一个Pass。 run_one_pass。\n", + "* 增加一个Pass中的执行步骤,例如打印输出等等。\n", + "\n", + "RunnerBulder是一个简单的Runner生成器。他负责将Paddle的训练流程插入到Runner的执行步骤中。\n", + "\n", + "这里network传入linear对象,而训练数据的读取函数是process。调用build生成runner\n", + "\n", + "关于Runner的具体说明参考其他文档,或者注释。" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "learning_result = {\n", + " 'cost': [],\n", + " 'w': [],\n", + " 'b': []\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "我们声明一个learning_result字典,来保存训练过程中的数据,三个field分别保存每个pass后的误差,w值和b值。方便我们画图。" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [], + "source": [ + "with runner:\n", + " while True:\n", + " ctx = ContextWrapper(runner.run_one_pass())\n", + " learning_result['cost'].append(ctx.cost())\n", + " params = ctx.gradient_machine().getParameters()\n", + " for param in params:\n", + " learning_result[param.getName()].append(param.getBuf(api.PARAMETER_VALUE)[0])\n", + " \n", + " if abs(ctx.cost() - 0.0) < 1e-10:\n", + " # end training.\n", + " break" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "上面这个循环便是全部训练过程。\n", + "\n", + "第一行with runner,是指我要使用runner这个类来进行训练了。在使用某一个runner前,必须使用with,来初始化一些数据。同时目前Paddle只支持一个进程使用一个runner(Paddle的全局变量问题)。\n", + "\n", + "每一个run_one_pass()会返回一个当前的context,使用context wrapper可以更好(类型安全),更快(TODO 可以使用Cython优化)的访问Context。" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + 
"output_type": "stream", + "text": [ + "Help on ContextWrapper in module py_paddle.trainer.base_items object:\n", + "\n", + "class ContextWrapper(__builtin__.object)\n", + " | Strong typed wrapper to read/write context value.\n", + " | \n", + " | @TODO(yuyang18): Use Cython to implement this class, make it directly access\n", + " | a C struct.\n", + " | \n", + " | Methods defined here:\n", + " | \n", + " | __init__(self, context)\n", + " | \n", + " | batch_id(self, field_name='current_batch_id')\n", + " | \n", + " | batch_size(self, field_name='current_batch_size')\n", + " | :param field_name:\n", + " | :return:\n", + " | :rtype: int\n", + " | \n", + " | cost(self, field_name='current_cost')\n", + " | :param field_name:\n", + " | :return:\n", + " | :rtype: float\n", + " | \n", + " | get_field(self, field_name)\n", + " | \n", + " | get_field_with_type(self, field_name, tp)\n", + " | \n", + " | gradient_machine(self, field_name='gradient_machine')\n", + " | Get Gradient Machine\n", + " | :param field_name:\n", + " | :return:\n", + " | :rtype: api.GradientMachine\n", + " | \n", + " | in_args(self, field_name='in_args')\n", + " | :param field_name:\n", + " | :return:\n", + " | :rtype: api.Arguments\n", + " | \n", + " | increase_batch_id(self, field_name='current_batch_id')\n", + " | \n", + " | increase_pass_id(self, field_name='current_pass_id')\n", + " | \n", + " | pass_id(self, field_name='current_pass_id')\n", + " | \n", + " | reset_batch_id(self, field_name='current_batch_id')\n", + " | \n", + " | reset_pass_id(self, field_name='current_pass_id')\n", + " | \n", + " | set_batch_size(self, batch_size, field_name='current_batch_size')\n", + " | \n", + " | set_cost(self, cost, field_name='current_cost')\n", + " | \n", + " | set_field_with_type(self, field_name, value, tp, must_not_set=True)\n", + " | \n", + " | set_gradient_machine(self, machine, field_name='gradient_machine')\n", + " | \n", + " | set_in_args(self, in_args, field_name='in_args')\n", + " | \n", + " | 
set_updater(self, updater, field_name='updater')\n",
+      " |  \n",
+      " |  set_updater_callback(self, updater_callback, field_name='updater_callback')\n",
+      " |  \n",
+      " |  updater(self, field_name='updater')\n",
+      " |      Get Parameter Updater\n",
+      " |      :param field_name:\n",
+      " |      :return:\n",
+      " |      :rtype: api.ParameterUpdater\n",
+      " |  \n",
+      " |  updater_callback(self, field_name='updater_callback')\n",
+      " |      :param field_name:\n",
+      " |      :return:\n",
+      " |      :rtype: callable\n",
+      " |  \n",
+      " |  ----------------------------------------------------------------------\n",
+      " |  Data descriptors defined here:\n",
+      " |  \n",
+      " |  __dict__\n",
+      " |      dictionary for instance variables (if defined)\n",
+      " |  \n",
+      " |  __weakref__\n",
+      " |      list of weak references to the object (if defined)\n",
+      "\n"
+     ]
+    }
+   ],
+   "source": [
+    "help(ctx)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "In this training run we do not fix the number of passes; instead we exit once the cost drops below 1e-10.\n",
+    "\n",
+    "Along the way, the values of w and b are recorded for every pass.\n",
+    "\n",
+    "We can then plot them with matplotlib. The plotting itself needs no further explanation; it is standard matplotlib usage."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 15,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAhIAAAFkCAYAAAB1rtL+AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3X+UXWV97/H3Nz+A8CPhR0hCMhMUFSFRgYyKqVURSinq\nEqtV7qh13eJSqVgwttZi642C2BY1UYq5F/FeIUuZFosiXsUoKtQSflwzEvldhECAQEgEwo8EEpLn\n/rH3kZNhzsycPSez5+zzfq111snZ5zn7fDd7yHzy7Od5dqSUkCRJKmJC2QVIkqT2ZZCQJEmFGSQk\nSVJhBglJklSYQUKSJBVmkJAkSYUZJCRJUmEGCUmSVJhBQpIkFWaQkCRJhRUKEhFxWkSsiYgtEXF9\nRLxmiLbzIuLf8/Y7IuL0QdqcGRE3RsQTEbE+Ir4XEYcWqU2SJI2dpoNERJwMfBlYDBwFrAZWRMT0\nBh/ZE7gb+BTwUIM2bwD+BTga+CNgMvCTiJjSbH2SJGnsRLM37YqI64EbUkpn5K8DuB84L6V07jCf\nXQMsTSmdN0y76cAjwBtTSv/ZVIGSJGnMNNUjERGTgR7gZ7VtKUsiVwELW1jXvkACHm3hPiVJUotN\narL9dGAisH7A9vXAy1tRUN7D8RXgP1NKtzVocwBwAnAv8EwrvleSpA6xB/AiYEVK6Xej3VmzQaKR\nIOtBaIVlwDzg9UO0OQH4dou+T5KkTvQ+4JLR7qTZILER2A7MHLB9Bi/spWhaRJwPvAV4Q0qp0cBM\nyHoi+Na3vsXhhx8+2q8d1xYtWsTSpUvLLmNMdMqxepzV4nFWSycc5+2338773/9+yH+XjlZTQSKl\ntC0iVgHHAVfA7y9FHAcMOYByOHmIOAl4U0pp7TDNnwE4/PDDWbBgwWi+dtybNm1a5Y+xplOO1eOs\nFo+zWjrlOHMtGRpQ5NLGEuDiPFDcCCwim+J5EUBELAceSCl9On89mexSRQC7AXMi4gjgqZTS3Xmb\nZUAv8Hbg6Yio9XhsSik5BkKSpHGq6SCRUro0n555FtkljpuAE1JKG/ImXcBzdR+ZDfya58dQ/E3+\nuAY4Nt92av7+1QO+7i+A5c3WKEmSxkahwZYppWVkgyIHe+/YAa/vY5hppikll+qWJKkN+Qt8nOvt\n7S27hDHTKcfqcVaLx1ktnXKcrdT0ypbjQUQsAFatWrWqkwbFSJI0av39/fT09AD0pJT6R7s/eyQk\nSVJhBglJklSYQUKSJBVmkJAkSYUZJCRJUmEGCUmSVJhBQpIkFWaQkCRJhRkkJElSYQYJSZJUmEFC\nkiQVZpCQJEmFGSQkSVJhBglJklSYQUKSJBVmkJAkSYUZJCRJUmEGCUmSVJhBQpIkFdbWQWL79rIr\nkCSps7V1kHj00bIrkCSps7V1kNiwoewKJEnqbG0dJB55pOwKJEnqbAYJSZJUmEFCkiQVZpCQJEmF\nGSQkSVJhbR0knLUhSVK52jpI2CMhSVK52jpIbN4MTzxRdhWSJHWutg4SAA8+WHYFkiR1LoOEJEkq\nzCAhSZIKa+sgMW2aQUKSpDK1dZA48ECDhCRJZWrrIDFjhkFCkqQytX2QeOCBsquQJKlztX2QsEdC\nkqTytH2QWL8etm0ruxJJkjpToSAREadFxJqI2BIR10fEa4ZoOy8i/j1vvyMiTh/tPmtmzICU4OGH\nixyFJEkaraaDREScDHwZWAwcBawGVkTE9AYf2RO4G/gU8FCL9glkQQK8vCFJUlmK9EgsAi5IKS1P\nKd0BnApsBk4ZrHFK6VcppU+llC4FtrZinzUHHpg9GyQkSSpHU0EiIiYDPcDPattSSgm4ClhYpIDR\n7HPaNNh9d2duSJJUlmZ7JKYDE4H1A7avB2YVrKHwPiOgq8seCUmSyjKpRfsJILVoXyPe56JFi3js\nsWlccgnccUe2rbe3l97e3haXIklS++nr66Ovr2+nbZs2bWrpd
zQbJDYC24GZA7bP4IU9Crt8n0uX\nLuVLX1rAunVwxRUFv12SpIoa7B/X/f399PT0tOw7mrq0kVLaBqwCjqtti4jIX68sUsBo9zlnjpc2\nJEkqS5FLG0uAiyNiFXAj2YyLPYGLACJiOfBASunT+evJwDyySxW7AXMi4gjgqZTS3SPZ51BqQSKl\nbMyEJEkaO00HiZTSpfn6DmeRXY64CTghpbQhb9IFPFf3kdnAr3l+vMPf5I9rgGNHuM+G5syBLVvg\nscdg//2bPRpJkjQahQZbppSWAcsavHfsgNf3MYJLKEPtcyhdXdnzgw8aJCRJGmttfa8NyHokwHES\nkiSVoe2DxEEHZWMjDBKSJI29tg8Skyd7O3FJksrS9kECnAIqSVJZDBKSJKmwygQJb9wlSdLYq0SQ\n8MZdkiSVoxJBYs4c2LgRnn227EokSeoslQkSAOvWlVuHJEmdplJBwssbkiSNLYOEJEkqrBJBYupU\n2GsvZ25IkjTWKhEkIpy5IUlSGSoRJMBFqSRJKoNBQpIkFWaQkCRJhVUqSKxbBzt2lF2JJEmdo1JB\nYuvWbIVLSZI0NioTJLq6smcvb0iSNHYqEyRclEqSpLFXmSAxcyZMnGiQkCRpLFUmSEycCLNmGSQk\nSRpLlQkS4BRQSZLGWuWChPfbkCRp7FQqSHi/DUmSxlalgoSXNiRJGluVCxKPPw6bN5ddiSRJnaFy\nQQLslZAkaawYJCRJUmEGCUmSVFilgsRee8G++zoFVJKksVKpIAHO3JAkaSwZJCRJUmEGCUmSVJhB\nQpIkFVbJIPHQQ7B9e9mVSJJUfZULEl1dWYhYv77sSiRJqr7KBQnXkpAkaewYJCRJUmGVCxLTp8Pk\nyQYJSZLGQqEgERGnRcSaiNgSEddHxGuGaf/uiLg9b786Ik4c8P5eEXF+RNwfEZsj4taI+EiR2iZM\ngNmzDRKSJI2FpoNERJwMfBlYDBwFrAZWRMT0Bu0XApcAFwJHApcDl0fEvLpmS4E/Bt4LHAZ8BTg/\nIt7WbH3gFFBJksZKkR6JRcAFKaXlKaU7gFOBzcApDdqfAVyZUlqSUrozpbQY6Ac+VtdmIXBxSumX\nKaW1KaULyQLKawvUR1eX99uQJGksNBUkImIy0AP8rLYtpZSAq8jCwGAW5u/XWzGg/Urg7RExO/+e\nNwMvy9s1zR4JSZLGxqQm208HJgIDV2lYD7y8wWdmNWg/q+71XwFfBx6IiOeA7cCHUkrXNlkfYJCQ\nJGmsNBskGgkgjaL96cDRwNuAtcAbgWURsS6l9PNmi5kzB556Cp54AqZObfbTkiRppJoNEhvJegtm\nDtg+gxf2OtQ8PFT7iNgDOAc4KaX04/z9WyLiKOBvgIZBYtGiRUybNm2nbb29vXR19QJZr4RBQpLU\nqfr6+ujr69tp26ZNm1r6HU0FiZTStohYBRwHXAEQEZG/Pq/Bx64b5P3j8+0Ak/PHwB6N7QwzhmPp\n0qUsWLDgBdvvuSd7fvBBOPzwofYgSVJ19fb20tvbu9O2/v5+enp6WvYdRS5tLAEuzgPFjWSzOPYE\nLgKIiOXAAymlT+ftvwpcExGfAH4I9JIN2PwQQErpyYi4BvhiRDwD3AccA3wA+HiRg5o9O3t25oYk\nSbtW00EipXRpvmbEWWSXLG4CTkgpbcibdAHP1bW/LiJ6yS5fnAPcRXYZ47a63Z4M/CPwLWB/sjBx\nZkrp680fEuyxR7bCpQMuJUnatQoNtkwpLQOWNXjv2EG2XQZcNsT+HgE+WKSWRpy5IUnSrle5e23U\nGCQkSdr1DBKSJKkwg4QkSSqs0kFi/XrYtq3sSiRJqq7KBomuLkgJHnqo7EokSaquygaJOXOyZy9v\nSJK06xgkJElSYZUNEvvtly1MZZCQJGnXqWyQiHDmhiRJu1plgwQYJCRJ2tUqHSS6urxxlyRJu1Kl\ng4Q9EpIk7VodESRSKrsSS
ZKqqfJB4pln4LHHyq5EkqRqqnyQAC9vSJK0qxgkJElSYZUOEgcdlK0n\n4cwNSZJ2jUoHicmTYeZMeyQkSdpVKh0kwCmgkiTtSgYJSZJUmEFCkiQVZpCQJEmFVT5IdHXBxo3Z\nwlSSJKm1Kh8kamtJrFtXbh2SJFVRxwQJL29IktR6BglJklRY5YPE1Kmw994GCUmSdoXKBwlw5oYk\nSbtKRwSJri7vtyFJ0q7QEUGiuxvuv7/sKiRJqp6OCRJr15ZdhSRJ1dMxQeKhh2DbtrIrkSSpWjom\nSKTkolSSJLVaxwQJcJyEJEmtZpCQJEmFdUSQmDoVpk0zSEiS1GodESTAKaCSJO0KBglJklSYQUKS\nJBVmkJAkSYV1VJDYsAG2bCm7EkmSqqNQkIiI0yJiTURsiYjrI+I1w7R/d0TcnrdfHREnDtLm8Ij4\nfkQ8HhFPRcQNEdFVpL7B1KaAevMuSZJap+kgEREnA18GFgNHAauBFRExvUH7hcAlwIXAkcDlwOUR\nMa+uzUuAXwK3AW8EXgmcDTzTbH2NuJaEJEmtV6RHYhFwQUppeUrpDuBUYDNwSoP2ZwBXppSWpJTu\nTCktBvqBj9W1+Tzww5TSmSml36SU1qSU/m9KaWOB+gbVlfdtGCQkSWqdpoJEREwGeoCf1ballBJw\nFbCwwccW5u/XW1FrHxEBvBW4KyJ+HBHr88slJzVT23CmTIEDDzRISJLUSs32SEwHJgLrB2xfD8xq\n8JlZw7SfAewNfAr4EXA88D3guxHxhibrG5IzNyRJaq1JLdpPAKlg+1qYuTyldF7+599ExB+QXTb5\nZaOdLFq0iGnTpu20rbe3l97e3kHbGyQkSZ2kr6+Pvr6+nbZt2rSppd/RbJDYCGwHZg7YPoMX9jrU\nPDxM+43Ac8DtA9rcDrx+qGKWLl3KggULhin5ed3dcPXVI24uSVJbG+wf1/39/fT09LTsO5q6tJFS\n2gasAo6rbcvHOBwHrGzwsevq2+eOz7fX9vn/gJcPaHMocF8z9Q3HHglJklqryKWNJcDFEbEKuJFs\nFseewEUAEbEceCCl9Om8/VeBayLiE8APgV6yAZsfqtvnF4F/jYhfAr8ATgTeBrypQH0NdXfDpk3w\n5JOwzz6t3LMkSZ2p6emfKaVLgb8GzgJ+DbwKOCGltCFv0kXdwMuU0nVk4eHDwE3AO4GTUkq31bW5\nnGw8xN8CvyGbSvrO/LMt41oSkiS1VqHBlimlZcCyBu8dO8i2y4DLhtnnReS9GrtKfZCYN2/otpIk\naXgdc68NgNmzIQLWri27EkmSqqGjgsTkyXDQQV7akCSpVToqSIAzNyRJaqWOCxJz5xokJElqlY4L\nEvZISJLUOh0bJFIzC3pLkqRBdWSQ2LIFHn207EokSWp/HRkkwMsbkiS1gkFCkiQV1nFBYubMbD0J\ng4QkSaPXcUFiwgSYM8fVLSVJaoWOCxLgFFBJklrFICFJkgrryCDh6paSJLVGRwaJ7m548EHYsaPs\nSiRJam8dGyS2bYP168uuRJKk9taxQQK8vCFJ0mgZJCRJUmEdGST23x+mTDFISJI0Wh0ZJCKcAipJ\nUit0ZJAAg4QkSa3Q0UHCZbIlSRqdjg4S9khIkjQ6HR0kHnooW09CkiQV07FBYu5cSAnWrSu7EkmS\n2lfHBgnXkpAkafQMEqMMEtu3w5Yto69HkqR21LFBYp99YNq00QeJc8+Fo49uTU2SJLWbjg0S0JqZ\nGytWwM03w1NPtaYmSZLaiUFiFEFi2za48cbsz7ff3pqaJElqJwaJUQSJm256fnzErbe2piZJktqJ\nQWIUQWLlSth9d5gzB267rXV1SZLULiaVXUCZurthw4asV2HKlOY/f+218OpXw7772iMhSepMHd8j\nAfDAA81/NqUsSLz+9TBvnj0SkqTOZJCg2OWN++/PVsX8gz+A+fPh3nuduSFJ6jwdHSS6urL
nIkHi\n2muz54ULsx4JgDvuaE1dkiS1i44OElOmwIEHFgsSK1fCy14GM2bA4Ydn2xwnIUnqNB0dJKD4zI1r\nr80uawDsvTccfLDjJCRJnccgUSBIPPUUrF6dDbSsmT/fHglJUucxSBQIEjfcADt2PN8jAc7ckCR1\nJoNEgSCxcmW2dkRtbARkPRJr1sDTT7e2PkmSxrNCQSIiTouINRGxJSKuj4jXDNP+3RFxe95+dUSc\nOETbCyJiR0ScXqS2ZnV3w6ZN8MQTI//MtddmszUm1P3Xc+aGJKkTNR0kIuJk4MvAYuAoYDWwIiKm\nN2i/ELgEuBA4ErgcuDwi5g3S9h3Aa4EHm62rqGbXktixA667bufxEeDMDUlSZyrSI7EIuCCltDyl\ndAdwKrAZOKVB+zOAK1NKS1JKd6aUFgP9wMfqG0XEHOA84L3AcwXqKqTZIHHrrVnvRf34CIB99oG5\ncx0nIUnqLE0FiYiYDPQAP6ttSykl4CpgYYOPLczfr7eivn1EBLAcODelNKY35J49GyJGHiRWroSJ\nE+G1r33he87ckCR1mmZ7JKYDE4H1A7avB2Y1+MysEbT/O2BrSun8JusZtcmTszAx0iBx7bVw5JGw\n114vfM+ZG5KkTtOqu38GkIq0j4ge4HSy8RZNWbRoEdOmTdtpW29vL729vU3tp5mZGytXwlveMvh7\n8+fDkiWweTPsuWdTJUiS1HJ9fX309fXttG3Tpk0t/Y5mg8RGYDswc8D2Gbyw16Hm4WHa/yFwIHB/\ndoUDyHo9lkTEx1NKhzQqZunSpSxYsGDk1Tcw0iCxfj3cffcLB1rWzJuX3RX0jjugBWVJkjQqg/3j\nur+/n56enpZ9R1OXNlJK24BVwHG1bfn4huOAlQ0+dl19+9zx+XbIxka8Cjii7rEOOBc4oZn6ihpp\nkFiZH+HAgZY1tSmgjpOQJHWKIpc2lgAXR8Qq4EayWRx7AhcBRMRy4IGU0qfz9l8FromITwA/BHrJ\nBmx+CCCl9BjwWP0XRMQ24OGU0l0F6mtaLUiklA28bGTlyqxtbabHQM7ckCR1mqaDRErp0nzNiLPI\nLlncBJyQUtqQN+mibvpmSum6iOgFzskfdwEnpZSG+nXbzHiLUevuhi1b4NFH4YADGrerv1FXI/Pm\n2SMhSeochQZbppSWAcsavHfsINsuAy5rYv8Nx0XsCvVrSTQKEs88A6tWwXDjOOfPh+99r7X1SZI0\nXnX8vTZgZItSrVoFW7eOrEdizZps5oYkSVVnkABmzszWk1i7tnGblSuzKZ1HHDH0vubPf37mhiRJ\nVWeQILv51pw5Q/dIXHstHH00TBrmYlDtnhsOuJQkdQKDRG6oKaApZT0SjdaPqDd1arYvB1xKkjqB\nQSI3d27jIPHb38KGDcOPj6hxqWxJUqcwSOSG6pGoLUT1uteNbF/evEuS1CkMErnubnjwQdix44Xv\nrVyZhYP99hvZvubNg3vuydamkCSpygwSue5u2LYtu5/GQCNZiKqeMzckSZ3CIJFrtJbE449nlylG\nMtCyxpkbkqROYZDINQoS1+W3FmumR2LaNOjqcpyEJKn6DBK5/feHKVNeGCRWroQDD4SXvrS5/Tlz\nQ5LUCQwSuYjBZ27UxkcMdVfQwThzQ5LUCQwSdbq7d14m+7nn4IYbmhsfUePMDUlSJzBI1BnYI7F6\ndXbzrWbGR9TMn59NJb3zztbVJ0nSeGOQqDNwdcuVK2G33aCnp/l9zZuXPTtOQpJUZQaJOt3d8NBD\n2XoSkI2P6OmBPfZofl/TpmU3AnOchCSpygwSdbq7s4Wk1q3LXq9cWeyyRs38+fZISJKqzSBRp34t\nidqjyEDLmnnz7JGQJFWbQaJOfZCo3ahrtD0Sd98Nzzwz+tokSRqPDBJ19tknG9tQCxIveQnMnFl8\nf/PmOXNDklRtBokBalNAm71R12BqMze8vCFJqiqDxAD
d3dldO2+6aXTjIwD23Rdmz3bApSSpugwS\nA3R3w9VXw/bto++RAJfKliRVm0FigO7ubGnsqVOzEDBa3rxLklRlBokBajM3Fi6ECS34rzN/Pvz2\nt87ckCRVk0FigLlzs+fRjo+oqc3c+K//as3+JEkaTwwSAxx6KEyeDMcd15r9OXNDklRlk8ouYLyZ\nMwceeSSbcdEK++0HBx3kOAlJUjXZIzGIVoWIGmduSJKqyiAxBpy5IUmqKoPEGKjN3Hj22bIrkSSp\ntQwSY2D+/GyBK2duSJKqxiAxBpy5IUmqKoPEGHDmhiSpqgwSY2TePHskJEnVY5AYI/Pn2yMhSaoe\ng8QYmTcP7rrLmRuSpGoxSIwRZ25IkqrIIDFGajM3vLwhSaoSg8QY2X9/mDXLAZeSpGopFCQi4rSI\nWBMRWyLi+oh4zTDt3x0Rt+ftV0fEiXXvTYqIf46I30TEUxHxYERcHBEHFaltPHOpbElS1TQdJCLi\nZODLwGLgKGA1sCIipjdovxC4BLgQOBK4HLg8IvLOfvbMt38u39+fAi8Hvt9sbeOdN++SJFVNkR6J\nRcAFKaXlKaU7gFOBzcApDdqfAVyZUlqSUrozpbQY6Ac+BpBSeiKldEJK6bKU0l0ppRvz93oioqtA\nfeNWbebG1q1lVyJJUms0FSQiYjLQA/ysti2llICrgIUNPrYwf7/eiiHaA+wLJODxZuob75y5IUmq\nmmZ7JKYDE4H1A7avB2Y1+MysZtpHxO7APwGXpJSearK+cc2ZG5KkqmnVrI0g60EYVfuImAR8J3/v\no60pbfw44ACYPRt+8YuyK5EkqTUmNdl+I7AdmDlg+wxe2OtQ8/BI2teFiG7g2JH0RixatIhp06bt\ntK23t5fe3t7hPlqaj34UzjoLzjwT5s4tuxpJUpX19fXR19e307ZNmza19DsiG+LQxAcirgduSCmd\nkb8OYC1wXkrpi4O0/1dgSkrppLpt1wKrU0ofzV/XQsQhwJtTSo8OU8MCYNWqVatYsGBBU/WX7ckn\n4ZBD4B3vgAsvLLsaSVKn6e/vp6enB6AnpdQ/2v0VubSxBPhwRHwgIg4D/hfZFM6LACJieUR8oa79\nV4ETI+ITEfHyiPgs2YDN8/P2E4HLgAXA+4HJETEzf0wueFzj1j77ZL0R3/xmNoNDkqR21nSQSCld\nCvw1cBbwa+BVwAkppQ15ky7qBlKmlK4DeoEPAzcB7wROSindVtf+bfnzTcA64KH8eaiZHW3rL/8S\nZs6Ez3627EokSRqdZsdIAJBSWgYsa/DesYNsu4ys12Gw9veRzQTpGFOmwGc+k42XOPNMeMUryq5I\nkqRivNdGSU45BQ4+GP7H/yi7EkmSijNIlGS33bJLG9/7HvzqV2VXI0lSMQaJEr3//XDYYfAP/1B2\nJZIkFWOQKNHEidmaEitWwC9/WXY1kiQ1zyBRsne9C444IuuVaHJJD0mSSmeQKNmECfD5z8N//Af8\n9KdlVyNJUnMMEuPAW98Kr3sd/P3f2yshSWovBolxIALOOSebvXHFFWVXI0nSyBkkxoljj4U3vzlb\nqGrHjrKrkSRpZAwS48g558DNN8O//VvZlUiSNDIGiXFk4cJsvMTixfDcc2VXI0nS8AwS48zZZ2d3\nBV2+vOxKJEkankFinDnqKPizP4PPfQ6efbbsaiRJGppBYhw66yx44AG48MKyK5EkaWgGiXHo8MOz\n+3Cccw5s3lx2NZIkNWaQGKcWL4aNG+FrXyu7EkmSGjNIjFOHHAIf/CD80z9llzkkSRqPDBLj2OLF\nsPfecMwxcP/9ZVcjSdILGSTGsYMOgquvztaUMExIksYjg8Q49+IXZ2Fix44sTKxdW3ZFkiQ9zyDR\nBl70op3DxH33lVyQJEk5g0SbOPhguOaa7M/HHAP33ltmNZIkZQwSbWTu3CxMTJhgmJAkjQ8GiTbT\n3Z1d5pg0Cd70Jli
zpuyKJEmdzCDRhmphYrfdsp6Je+4puyJJUqcySLSprq4sTOy+exYm7r677Iok\nSZ3IINHG5syBX/wC9tjDMCFJKodBos3NmZP1TOy5ZzZm4q67yq5IktRJDBIVMHt2Fib23hsWLMju\nz/HMM2VXJUnqBAaJijjoILjuuuxGX5/5DMybB9/9LqRUdmWSpCozSFTIfvvBV74CN98Mhx8O73oX\nHHss3HRT2ZVJkqrKIFFBhx0GP/whXHklPPxwdrnjwx+GRx4puzJJUtUYJCrsT/4EfvObrJfiO9+B\nl70MvvQl2Lq17MokSVVhkKi4yZPh9NPht7+FP/9z+Lu/g/nz4fvfd/yEJGn0DBId4oAD4PzzYfVq\nOOQQeMc74Pjj4YYbyq5MktTODBIdZv58+PGP4Qc/gAcfhNe9LlvM6sor7aGQJDXPINGBIuBtb4Nb\nboHLLoMtW+Atb4GjjoJLLoHnniu7QklSuzBIdLCJE+Gd74Trr4ef/xxmzYL3vS8blPm1r8HmzWVX\nKEka7wwSIgLe/Obskkd/f3a54/TT4eCD4eyz4dFHy65QkjReGSS0k6OOgr6+7J4d73kPfOELMHcu\nfOITcP/9ZVcnSRpvDBLjXF9fXynfe8gh2eWN++6Dj38cvvnNLFAceSR88pPwk59kYytaqaxjHWse\nZ7V4nNXSKcfZSoWCREScFhFrImJLRFwfEa8Zpv27I+L2vP3qiDhxkDZnRcS6iNgcET+NiJcWqa1q\nyv6hnjEDPv95WLsWli+HV70Kvv1tOOGEbEnu447LbhK2ahXs2DG67yr7WMeKx1ktHme1dMpxtlLT\nQSIiTga+DCwGjgJWAysiYnqD9guBS4ALgSOBy4HLI2JeXZtPAR8DPgK8Fng63+duzdanXWOffbIF\nrZYvz6aN3nJLFiCmTMmCxqtfnYWOk0+Gb3wD7r237IolSWNhUoHPLAIuSCktB4iIU4G3AqcA5w7S\n/gzgypTSkvz14oj4Y7Lg8NG6NmenlH6Q7/MDwHrgHcClBWrULhSRrUcxf3522WPr1mzmx09/Cldd\nBR/5SNY7MWUK7LUX7LnnC58Hbtt7b3jggWzBrMMOg913L/soJUkj0VSQiIjJQA/whdq2lFKKiKuA\nhQ0+tpCsB6PeCuCkfJ+HALOAn9Xt84mIuCH/rEFinNttN3jjG7PH2WfD44/D1Vdnl0OefjqbRjrY\n84YNz79+/PHsBmNHHplNSz30UHjFK7LHK1+ZPR9ySPbeSGzdmu2z9pg8GaZNg6lTs+fJk3fpfxJJ\n6hjN9khH5wEbAAAH8ElEQVRMByaS9RbUWw+8vMFnZjVoPyv/80wgDdNmoD0Abr/99uErbnObNm2i\nv7+/7DKaNndu9mjGX/3VJj74wX7uvju7N8jdd2eDOjdtyt7fbbcsTLz0pTB9ehZCnnwyezzxRPb8\n1FPZ87PPDv1du++e9YLsvffzPSL1rydNykJL7TFpEkyYMPjrCROeHx+yY0e2QmjtMfB1SnDrrZv4\n5Cf7f//ZCROe31/9PmvPEdm+a8819dsbtRlOo/Yj3T7U99133yYuuGB8/Ow2+9+lGffdt4mvf318\nHOeu5HFWx9q1v//duUdLdphSGvEDOAjYARw9YPu5wMoGn3kWOHnAto8C6/I/LwS2AzMHtLkUuKTB\nPt9LFj58+PDhw4cPH8Ue720mAzR6NNsjsZH8l/6A7TN4YY9CzcPDtH8YiLzN+gFtft1gnyuA9wH3\nAs+MoG5JkpTZA3gR2e/SUWsqSKSUtkXEKuA44AqAiIj89XkNPnbdIO8fn28npbQmIh7O2/wm3+dU\n4Gjgaw3q+B3ZTBBJktS8la3aUZFZG0uAi/NAcSPZLI49gYsAImI58EBK6dN5+68C10TEJ4AfAr1k\nAzY/VLfPrwD/EBG/JetlOBt4APh+gfokSdIYaTpIpJQuzdeMOIvscsRNwAkppQ15k
y7gubr210VE\nL3BO/rgLOCmldFtdm3MjYk/gAmBf4JfAiSmlrcUOS5IkjYXIBy9KkiQ1zXttSJKkwgwSkiSpsLYM\nEs3eNKzdRMTiiNgx4HHb8J8c3yLiDRFxRUQ8mB/T2wdp0/Y3bxvuOCPim4Oc3x+VVW9REXFmRNwY\nEU9ExPqI+F5EHDqgze4R8bWI2BgRT0bEv0fEjLJqLmKEx3n1gPO5PSKWlVVzERFxan5TxU35Y2VE\n/End+21/LmFEx9n253Iw+c/xjohYUretJee07YJEszcNa2O3kA1mnZU//rDcclpiL7LBuaeRLYay\nkwrdvG3I48xdyc7nt3dsSmupNwD/QjZV+4+AycBPImJKXZuvkN2L513AG4HZwGVjXOdojeQ4E/B1\nnj+nBwF/O8Z1jtb9wKfIZtX1AD8Hvh8Rh+fvV+FcwvDHWYVzuZP8H9sfIvt9Wa8157QVq1qN5QO4\nHvhq3esgmyr6t2XX1sJjXAz0l13HLj7GHcDbB2xbByyqez0V2AK8p+x6W3yc3wS+W3Ztu+BYp+fH\n+4d15+9Z4E/r2rw8b/Pasutt1XHm234BLCm7tl1wrL8D/qKq53LgcVbxXAJ7A3cCx9YfWyvPaVv1\nSNTdNKz+Bl8JGOqmYe3qZXnX+N0R8a2I6C67oF0pIl7MIDdvA2o3b6uaY/Ju8jsiYllE7F92QS2w\nL9m/5h7NX/eQTTGvP6d3Amtp73M68Dhr3hcRGyLi5oj4woAei7YSERMi4r+RrRF0HRU9lwOOs36B\npsqcS7KFHX+QUvr5gO2vpkXntMiCVGUqctOwdnQ98N/JUuRBwGeB/4iIV6SUni6xrl1pFtlfzs3c\nvK1dXUnWfbgGeAnwj8CPImJhHozbTkQEWTfpf6bn14iZBWzNA2G9tj2nDY4T4NvAfWS9aq8iu//Q\nocCfjXmRoxARryALDnsAT5L9a/WOiDiKCp3LBsd5Z/52Jc4lQB6SjiQLDQPNpEXntN2CRCNB42vR\nbSelVL/++S0RcSPZD/Z7yLrFO0mlzi1ki7rVvbw1Im4G7gaOIet6bEfLgHmMbCxPO5/T2nG+vn5j\nSukbdS9vjWzZ/6si4sUppTVjWeAo3QEcQdbr8i5geUS8cYj27XouBz3OlNIdVTmXEdFFFnqPTylt\na+ajNHlO2+rSBsVuGtb2UkqbgP8C2m4GQxPqb95Wr9LnFrL7zZD9bLfl+Y2I84G3AMeklNbVvfUw\nsFtk986p15bndMBxPjRM8xvIfp7b6pymlJ5LKd2TUupPKf092eC8M6jYuRziOAfTlueS7HLUgcCq\niNgWEduANwFnRMRWsvO2eyvOaVsFiTxV1W4aBux007CW3YBkvImIvcm6wIf7y6tt5b9MazdvA3a6\neVtlzy38/l8OB9CG5zf/5XoS8OaU0toBb68iWy6//pweCswlv2lfuxjmOAdzFNm/6trunA4wAdid\nCp3LBmrHOZh2PZdXAa8ku7RxRP74FfCtuj9vowXntB0vbQx507AqiIgvAj8gu5wxB/gc2f/EfWXW\nNVoRsRdZqo980yERcQTwaErpfipy87ahjjN/LCYbI/Fw3u6fyXqcWnJL37GSz63vBd4OPB0Rtd6k\nTSmlZ1JKT0TE/waWRMRjZNeizwOuTSndWE7VzRvuOCPiEOC9wI/IRv8fQfb31DUppVvKqLmIiDiH\nbPzO/cA+wPvI/gX7x1U5lzD0cVblXALk4+l2Wn8oIp4GfpdSuj1/3ZpzWvbUlILTWT5K9otmC1ly\nenXZNbX4+PrIfoFuIRtBewnw4rLrasFxvYlsatH2AY//U9fms2SDnDaT/WJ9adl1t/I4yQZ3/Zgs\nRDwD3AP8T+DAsusucJyDHeN24AN1bXYnW4NhY/4X1XeAGWXX3srjJLtR4dXAhvzn9k6yAbR7l117\nk8f5jfzncUv+8/kT4NgqncvhjrMq53KIY/85d
VNbW3VOvWmXJEkqrK3GSEiSpPHFICFJkgozSEiS\npMIMEpIkqTCDhCRJKswgIUmSCjNISJKkwgwSkiSpMIOEJEkqzCAhSZIKM0hIkqTC/j+fyQN125NE\ncQAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "% matplotlib inline\n", + "import matplotlib.pyplot as plt\n", + "\n", + "plt.plot(\"cost\", data=learning_result)\n", + "plt.show()\n" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAgkAAAFkCAYAAACq4KjhAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3XuYXFWd7//3NwSIBIjILTBEIdyjM5EEEIQwjKMiIAF/\n4KVhPAo4KGbOORNHRcV58HKUHx4hohgELxhGadGZn0xHkKsMCXftJlwUwkUQJCYIgY6SREKyfn+s\n6kmnqb5UdXXturxfz1NPV+3au/Z3s5vUp9dee61IKSFJkjTQuKILkCRJjcmQIEmSyjIkSJKksgwJ\nkiSpLEOCJEkqy5AgSZLKMiRIkqSyDAmSJKksQ4IkSSrLkCBJksqqKCRExKcj4u6IWBURKyLipxGx\nzwi2e3dEPBgRayLi3og4uvqSJUlSPVTakjAL+AbwJuCtwObA9RHxqsE2iIhDgSuAbwNvBK4CroqI\naVVVLEmS6iJGM8FTROwAPAMckVK6dZB1fgRslVKa3W/ZHcA9KaWPVr1zSZI0pkbbJ+HVQAJWDrHO\nocCNA5ZdV1ouSZIa1PhqN4yIAL4G3JpS+s0Qq04GVgxYtqK0fLDP3h44CngCWFttjZIktaEJwO7A\ndSml50bzQVWHBGA+MA04rIptg9wCMZijgB9WU5QkSQLgFHKfwKpVFRIi4iLgGGBWSukPw6y+HNh5\nwLKdeGXrQn9PAPzgBz9g//33r6bEpjF37lzmzZtXdBljzuMszvr1sHr1xseLL5b/uXYt/OUvG3/2\nPdaufeWyZ56Zy1ZbzeOll+Cll+Dll2tT6/jxGx+bbZYfA5/3ve6/fNy4/Oh7PtJlwz1uvHEu73jH\nPCLy676f/Z8P/AkbX/d/9L3Xf/2+10M9yq0zcFnf66GW939v4POLLprL//yf817xOeV+9v+scp9f\n7nm57arZttr99HfeeXP51KfG5v/R4equ1/qPPfYgH//4P0Dpu3Q0Kg4JpYBwPPC3KaUnR7DJHcDf\nA1/vt+xtpeWDWQuw//77M2PGjEpLbCqTJk1q+WMEj7NaGzbACy/As8/CypXQ25sfq1ZtfD7wdd/z\nVavgT3/KAWAom28OW28NW22VH6961aaP7bbb+Lzv/Z/+dBIf/OAMttwSttwSttiC/35ebtkWW+T9\nbL754M/Hj6/8H82xNnv2JC65pPV/b7u6JnH66a1/nACXXz6JE09s7WPt6fnvp6O+XF9RSIiI+UAH\nMBt4MSL6Wgh6U0prS+ssAJ5OKX2m9N6FwC0R8THg6tL2M4F/HG3xUrNZvx6eeQaWLYM//AH++Mcc\nAJ57Lv8c+HzlyhwUytl22/yYNGnjzx12gD333PjeNtvkR//n/R/bbpu/xCv1m9/Apz89uv8Wkhpf\npS0JHyH3JfivActPBS4vPZ8CrO97I6V0R0R0AF8qPR4Bjh+ms6PUVFLa+J
fVZKKKRhQRW5ObpIf7R6mplb4ol7Pp\n+d2W3KO8Zc8v/Hfi354mO8elL83jgb9LKT054O1u8vDr/c/nPsBryc28TWWYYy3nAPJfY011TssY\nB2xJi53PMvqOs5xmPZc3An9NvtwwvfT4FfCDfs/XMcpz2miXGy4AFkSeafJuYC79Jo9qBRHxf4GF\n5EsMfwV8nvw/Z2eRddVCREwkp/EoLZoaEdOBlSmlp8hNY5+NiEfJ04B/kXz3yn8WUG7VhjrO0uMc\ncp+E5aX1ziO3Fo162tZ6Kd033gHMBl6MiL4WoN6U0tqU0qqI+C5wQUQ8T77u+3XgtpTS3cVUXZ3h\njjUipgInA9eQe8lPJ/9bdUtK6YEiaq5GRHyJ3F/mKWAb4BTyX55vb7HzOehxtsq5BCj1YdtkjJ2I\neBF4LqX0YOn16M9p0bdvlLmd46PkL5A15LRzYNE11fj4OslfjGvIvUyvAPYouq4aHdvfkm+vWT/g\n8b1+63yO3GFoNflLc6+i667lcZI7Sl1LDghrgd+SZ0Ldsei6KzzGcse3Hvgf/dbZkjy+wLOlf4B+\nAuxUdO21PlbypHX/Bfyx9Hu7lNwZdeuia6/wOL9T+n1cU/r9vB54Swuez0GPs1XO5RDH/gv63d5Z\ni3PqBE+SJKmshumTIEmSGoshQZIklWVIkCRJZRkSJElSWYYESZJUliFBkiSVZUiQJEllGRIkSVJZ\nhgRJklSWIUGSJJVlSJAkSWX9/3hRY2ZGhrTsAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plt.plot(\"w\", data=learning_result)\n", + "plt.show()\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAgkAAAFkCAYAAACq4KjhAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3XuYXVV9//H3N3cSJAgBUuUWRAJIQDIklQqIXORWQYI/\nwhSKghURWjAiRQShglwtoUVE6M8LjehUVEQo+EsFKVS5z3CLEMASCCCEhNagJNyS9ftjnZHJ5Ewy\n58yZ2efyfj3Pec6cffblu58dmM+stfbakVJCkiSpt2FFFyBJkuqTIUGSJJVlSJAkSWUZEiRJUlmG\nBEmSVJYhQZIklWVIkCRJZRkSJElSWYYESZJUliFBkiSVVXFIiIjdI+KGiHg+IlZGxMH92GbPiOiM\niNci4omI+ER15UqSpKFSTUvCOOBB4ERgrQ9+iIgtgX8HbgV2Av4Z+FZE7FvFsSVJ0hCJgTzgKSJW\nAh9LKd2whnUuAg5IKe3YY1kHMD6ldGDVB5ckSYNqKMYkfAC4pdeyucCuQ3BsSZJUpRFDcIyJwKJe\nyxYB60XE6JTS6703iIgNgf2Ap4HXBr1CSZKaxxhgS2BuSunlgexoKEJCOVF676uvYz/g+0NUiyRJ\nzehI4AcD2cFQhIQXgU16LdsYeCWl9EYf2zwNcM0117DddtsNYmnFmzVrFpdeemnRZQw6z7O5eJ7N\nxfNsLo899hhHHXUUlH6XDsRQhIS7gAN6LftIaXlfXgPYbrvtmDp16mDVVRfGjx/f9OcInmez8Tyb\ni+fZtAbcXV/NPAnjImKniHh/adFWpc+blb6/ICL+tccmVwLviYiLImJyRJwAfByYPdDiJUnS4Knm\n7oZdgAeATvKYgkuALuArpe8nApt1r5xSeho4CNiHPL/CLOBTKaXedzxIkqQ6UnF3Q0rpdtYQLlJK\nx/SxTVulx5IkScXx2Q0Fa29vL7qEIeF5NhfPs7l4nurLgGZcHCwRMRXo7OzsbLVBJpIkDUhXVxdt\nbW0AbSmlroHsy5YESZJUliFBkiSVZUiQJEllGRIkSVJZhgRJklSWIUGSJJVlSJAkSWUZEiRJUlmG\nBEmSVJYhQZIklWVIkCRJZRkSJElSWYYESZJUliFBkiSVVdch4fXXi65AkqTWVdchYcGCoiuQJKl1\n1XVI+O1vi65AkqTWZUiQJEllGRIkSVJZdR0Snnyy6AokSWpddR0SliyBl18uugpJklpTXYcEgEce\nKboCSZJaU12HhBEjDAmSJBWlrkPCpEmGBEmSilLXIWHrrQ0JkiQVpa5DwnvfC/PmwcqVRVciSVLr\nqeuQsPXW8Mc/wjPPFF2JJEmtp65Dwnvfm9/tcpAkaejVdUjYaCN45zsNCZIkFaGuQ0IETJliSJAk\nqQh1HRLAkCBJUlGqCgkRcWJELIiI5RFxd0RMW8O6IyLirIj4bWn9ByJiv/4ea8oUePxxeP31aiqV\nJEnVqjgkRMRM4BLgbGBn4CFgbkRM6GOT84BPAycC2wFXAT+NiJ36c7wpU2DFCnjssUorlSRJA1FN\nS8Is4KqU0pyU0nzgeGAZcGwf6x8FnJdSmptSejqldCVwM3BKfw62ww753S4HSZKGVkUhISJGAm3A\nrd3LUkoJuAXYtY/NRgO9OwuWA7v155jrrQdbbGFIkCRpqFXakjABGA4s6rV8ETCxj23mAp+PiK0j\n2xeYAfxZfw/q4EVJkoZere5uCCD18d3JwJPAfHKLwmXAd4AV/d25IUGSpKE3osL1l5B/uW/Sa/nG\nrN66AEBKaQkwIyJGARumlF6IiAuBBWs72KxZsxg/fjzPPw/PPw8HHABHH91Oe3t7hWVLktR8Ojo6\n6OjoWGXZ0qVLa7b/yEMKKtgg4m7gnpTSyaXPASwELkspfa0f248EHgX+LaX05T7WmQp0dnZ2MnXq\nVObNy60Jt98Oe+xRUbmSJLWUrq4u2traANpSSl0D2Vc13Q2zg
v4mIh4hzzi6J7lJsNFcAWxP/8bONPL17D7PD/Zc\nmFL6Vo+Pv4mIF4FbImJSSmnBUBZYA/OBncgtJocBcyJijzWs36jXs+x5ppTmN8P1jIhNyYF235TS\nm5VsSoXXs266G6jy4VGNLqW0FHgCaLhR/hV6kfwPtKWuL0DpfzxLaMBrHBGXAwcCe6aUftfjqxeB\nUZEfA99TQ17PXuf5wlpWv4f8b7nhrmdK6a2U0lMppa6U0hnkwW4n02TXcw3nWU4jXs82YCOgMyLe\njIg3gQ8BJ0fEG+RrNroW17NuQkIpDXU/PApY5eFRNXlQRT2KiHXJTdJr+x9TQyv9onyRVa/veuRR\n5U17feFPqX9DGuwal35xHgJ8OKW0sNfXneTp13tez22AzcnNvA1jLedZzs7kv8Ya6nr2ofthe01z\nPfuwpocKNuL1vAWYQu5u2Kn0uh+4psfPb1KD61lv3Q1rfHhUM4iIrwE3krsY3g18hfwfZ0eRddVC\nRIwjp/EoLdoqInYC/iel9Cy5eezMiPgt+THg55LvXvlZAeVWbU3nWXqdTR6T8GJpvYvIrUUDfmzr\nUCndN94OHAy8GhHdLUBLU0qvpZReiYhvA7Mj4n/J/b6XAb9OKd1bTNWVW9t5RsRWwF8BN5NHyO9E\n/v/U7SmleUXUXK2IOI88XuZZ4B3AkeS/Pj/SLNcT1nyezXI9S+PXVplfJyJeBV5OKT1W+lyb61n0\nLRxlbuk4gfwLZDk58exSdE01Pr8O8i/G5eSRpj8AJhVdV43O7UPkW2xW9Hp9p8c6/0AeMLSM/Etz\n66LrruV5kgdK/T9yQHgNeAr4JrBR0XVXeI7lzm8FcHSPdUaT5xhYUvqf0I+AjYuuvZbnSX4w3X8C\ni0v/Zh8nD0Rdt+jaqzjXb5X+PS4v/fv8D2CvZrqeazvPZrqeZc77l/S4tbNW19MHPEmSpLLqZkyC\nJEmqL4YESZJUliFBkiSVZUiQJEllGRIkSVJZhgRJklSWIUGSJJVlSJAkSWUZEiRJUlmGBEmSVJYh\nQZIklfX/Ae/RY9Mu9lZgAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plt.plot(\"b\", data=learning_result)\n", + "plt.show()\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "至此,一个简单的Python Trainer API使用说明写完了。" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 2", + "language": "python", + "name": "python2" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 2 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython2", + "version": "2.7.12" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/demo/introduction/linear.md b/demo/introduction/linear.md new file mode 100644 index 00000000000000..155b51be8d5323 --- /dev/null +++ b/demo/introduction/linear.md @@ -0,0 +1,207 @@ +# 
Paddle Python Trainer API Usage Notes
+
+Paddle's Python Trainer API is still at an experimental stage; much of its design, including the interfaces, may change.
+
+## Development cycle
+
+1. Use Paddle's swig interface to expose enough APIs to drive Paddle training, preferably at the GradientMachine level, so that users can customize the training process.
+2. On top of the swig API, decide what the Python user interface should look like.
+3. Turn the swig API into a C-API, so that Paddle training can be controlled from multiple languages.
+
+## Expected outcome
+
+1. Users can complete training entirely with the Paddle Python library.
+   1. Information produced during training can be passed to the Python side with strong types:
+      * accuracy
+      * cost
+      * pass_id, batch_id
+   2. Decisions such as how often to run evaluation during training are left entirely to the user.
+      * By default the whole test set is evaluated at the end of each pass, but the user may change this freely.
+   3. Users are completely free to choose how parameters are updated:
+      * train different parts of the network on different data,
+      * e.g. one data set trains the left half of the network, another trains the right half.
+   4. More convenient multi-objective learning.
+
+## Wrapping the Python-side user interface
+
+### End-to-end example
+
+A simple linear regression serves as the usage example for the user interface below. It regresses the equation y=w*x+b on the variables x and y, with w and b preset to 2 and 0.3; the network's w and b are learned from the data.
+
+First, import a few packages. The Python training interface reuses a lot of Paddle's existing logic:
+
+1. the network configuration reuses trainer_config_helpers,
+2. data transfer reuses the PyDataProvider2 format.
+
+`from py_paddle.trainer import *` imports Paddle's Python-side training functions, and `import py_paddle.swig_paddle as api` imports the low-level interface exposed through swig.
+
+```python
+from paddle.trainer_config_helpers import *
+from paddle.trainer.PyDataProvider2 import *
+from py_paddle.trainer import *
+import py_paddle.swig_paddle as api
+```
+
+```python
+@network(inputs={
+    'x': dense_vector(1), 'y': dense_vector(1)
+}, learning_rate=1e-3, batch_size=12)
+def linear_network(x, y):
+    y_predict = fc_layer(input=x, param_attr=ParamAttr(name='w'), size=1,
+                         act=LinearActivation(), bias_attr=ParamAttr(name='b'))
+    cost = regression_cost(input=y_predict, label=y)
+    return cost
+```
+
+The code above defines the structure of the network Paddle is to train, using an ordinary Python function.
+
+`@network` is a decorator that turns the function below it into Paddle's neural network description (protobuf). Its arguments are:
+
+* inputs: a dictionary of input data types, whose keys are the function's parameter names (here x and y) and whose values are the corresponding data types, dense vectors in this example.
+  * For the available data types, see the input_types of PyDataProvider2's @provider.
+* The remaining arguments are Paddle's optimization parameters; see `settings`.
+
+```python
+help(settings) # run this line to print document of settings method.
+```
+
+The body of linear_network defines the computation graph of the neural network; the return value is the optimization target.
+
+Through the `@network` decorator, this function is wrapped into a Python class. We can therefore declare a network description instance, `linear`.
+
+```python
+linear = linear_network()
+```
+
+This description instance contains Paddle's computation-graph information, the order of the network inputs, and so on. The next few blocks can be run by hand to inspect their output.
+
+```python
+help(linear_network) # run this line to print document of linear_network
+```
+
+```python
+print linear.input_types()
+```
+
+```python
+print linear.network_graph() # Paddle neural network protobuf definition
+```
+
+```python
+configs = {
+    'w': 2,
+    'b': 0.3
+}
+```
+
+Next we set the parameters of the linear regression: in `y=w*x+b`, w and b are set to 2 and 0.3. This dict is used by the data provider.
+
+```python
+import random
+
+@linear.provider()
+def process(*args, **kwargs):
+    for i in xrange(2000):
+        x = random.random()
+        yield {'x': [x], 'y': [configs['w'] * x + configs['b']]}
+
+```
+
+The next step declares the data reader (DataProvider), which is itself just a function, `process`.
+
+The main idea behind data reading in Paddle's PyDataProvider2 is that the user only needs to care about how to read **one sample** **from one file** and yield it in a supported format; everything else, such as grouping samples into batches and shuffling the data, is handled by Paddle.
+
+Declaring this DataProvider is again done with a decorator. Note that this decorator is actually **a method of the linear instance**.
+
+The function takes the same arguments as in PyDataProvider2: the first is settings, the second is filename. Since this process function does not actually use any of them, it accepts arbitrary arguments via `*args, **kwargs`.
+
+Samples are returned with yield, and they must be **dictionaries** here.
+
+```python
+help(process)
+```
+
+```python
+runner = RunnerBuilder(network=linear).with_train_data(method=process).build()
+```
+
+The next step builds a Runner, the most basic type in the Python Trainer API. It supports two operations:
+
+* run one pass: run_one_pass;
+* add extra steps to be executed within a pass, such as printing output.
+
+RunnerBuilder is a simple Runner factory. It takes care of inserting Paddle's training procedure into the Runner's execution steps.
+
+Here the linear object is passed as network, and process is the function that reads the training data; calling build produces the runner.
+
+For a detailed description of Runner, see the other documents or the inline comments.
+
+```python
+learning_result = {
+    'cost': [],
+    'w': [],
+    'b': []
+}
+```
+
+We declare a learning_result dict to record data during training; its three fields store the cost, the value of w, and the value of b after every pass, which makes plotting easy.
+
+```python
+with runner:
+    while True:
+        ctx = ContextWrapper(runner.run_one_pass())
+        learning_result['cost'].append(ctx.cost())
+        params = ctx.gradient_machine().getParameters()
+        for param in params:
+            learning_result[param.getName()].append(param.getBuf(api.PARAMETER_VALUE)[0])
+
+        if abs(ctx.cost() - 0.0) < 1e-10:
+            # end training.
+            break
+```
+
+The loop above is the entire training process.
+
+The first line, with runner, says that we are going to train with this runner. Before any runner is used, it must be entered with with, which initializes some internal state. Also, Paddle currently supports only one runner per process (a consequence of Paddle's global variables).
+
+Every run_one_pass() returns the current context. Accessing the context through the context wrapper is both safer (type-safe) and faster (TODO: can be optimized with Cython).
+
+```python
+help(ctx)
+```
+
+In this training run we do not fix the number of passes in advance; instead we stop as soon as the cost falls below 1e-10.
+
+At the same time, we record the values of w and b after every pass.
+
+Afterwards we can plot the results with matplotlib. The plotting itself is standard matplotlib usage and is not described further.
+
+```python
+% matplotlib inline
+import matplotlib.pyplot as plt
+
+plt.plot("cost", data=learning_result)
+plt.show()
+
+```
+
+```python
+plt.plot("w", data=learning_result)
+plt.show()
+
+```
+
+```python
+plt.plot("b", data=learning_result)
+plt.show()
+
+```
+
+This concludes a short introduction to the Python Trainer API.
diff --git a/demo/mnist/.gitignore b/demo/mnist/.gitignore
index 8bd9837523ccf9..2a1a321b10c4cd 100644
--- a/demo/mnist/.gitignore
+++ b/demo/mnist/.gitignore
@@ -5,3 +5,6 @@ plot.png
 train.log
 *pyc
 .ipynb_checkpoints
+*.w0
+*.wbias
+*.bin
diff --git a/demo/mnist/api_train.py b/demo/mnist/api_train.py
index f301da382ff8a5..8f5f5217a59a5c 100644
--- a/demo/mnist/api_train.py
+++ b/demo/mnist/api_train.py
@@ -6,199 +6,43 @@
 The user api could be simpler and carefully designed.
 """
-import py_paddle.swig_paddle as api
-from py_paddle import DataProviderConverter
+
 import paddle.trainer.PyDataProvider2 as dp
-import numpy as np
-import random
-from mnist_util import read_from_mnist
 from paddle.trainer_config_helpers import *
-
-def optimizer_config():
-    settings(
-        learning_rate=1e-4,
-        learning_method=AdamOptimizer(),
-        batch_size=1000,
-        model_average=ModelAverage(average_window=0.5),
-        regularization=L2Regularization(rate=0.5))
-
-
-def network_config():
-    imgs = data_layer(name='pixel', size=784)
-    hidden1 = fc_layer(input=imgs, size=200)
+import mnist_provider
+from py_paddle.trainer import *
+
+
+@network(
+    inputs={
+        'pixel': dp.dense_vector(784),
+        'label': dp.integer_value(10),
+    },
+    learning_rate=1e-4,
+    learning_method=AdamOptimizer(),
+    batch_size=1000,
+    model_average=ModelAverage(average_window=0.5),
+    regularization=L2Regularization(rate=0.5))
+def mnist_network(pixel, label):
+    hidden1 = fc_layer(input=pixel, size=200)
     hidden2 = fc_layer(input=hidden1, size=200)
     inference = fc_layer(input=hidden2, size=10, act=SoftmaxActivation())
-    cost = classification_cost(
-        input=inference, label=data_layer(
-            name='label', size=10))
-    outputs(cost)
-
-
-def init_parameter(network):
-    assert isinstance(network, api.GradientMachine)
-    for each_param in network.getParameters():
-        assert isinstance(each_param, api.Parameter)
-        array_size = len(each_param)
-        array = np.random.uniform(-1.0, 1.0, array_size).astype('float32')
-        each_param.getBuf(api.PARAMETER_VALUE).copyFromNumpyArray(array)
-
-
-def generator_to_batch(generator, batch_size):
-    ret_val = list()
-    for each_item in generator:
-        ret_val.append(each_item)
-        if len(ret_val) == batch_size:
-            yield ret_val
-            ret_val = list()
-    if len(ret_val) != 0:
-        yield ret_val
-
-
-class BatchPool(object):
-    def __init__(self, generator, batch_size):
-        self.data = list(generator)
-        self.batch_size = batch_size
-
-    def __call__(self):
-        random.shuffle(self.data)
-        for offset in xrange(0, len(self.data), self.batch_size):
-            limit = min(offset + self.batch_size, len(self.data))
-            yield self.data[offset:limit]
-
-
-def input_order_converter(generator):
-    for each_item in generator:
-        yield each_item['pixel'], each_item['label']
+    cost = classification_cost(input=inference, label=label)
+    return cost
 
 
 def main():
-    api.initPaddle("-use_gpu=false", "-trainer_count=4")  # use 4 cpu cores
-
-    # get enable_types for each optimizer.
-    # enable_types = [value, gradient, momentum, etc]
-    # For each optimizer(SGD, Adam), GradientMachine should enable different
-    # buffers.
-    opt_config_proto = parse_optimizer_config(optimizer_config)
-    opt_config = api.OptimizationConfig.createFromProto(opt_config_proto)
-    _temp_optimizer_ = api.ParameterOptimizer.create(opt_config)
-    enable_types = _temp_optimizer_.getParameterTypes()
-
-    # Create Simple Gradient Machine.
-    model_config = parse_network_config(network_config)
-    m = api.GradientMachine.createFromConfigProto(
-        model_config, api.CREATE_MODE_NORMAL, enable_types)
-
-    # This type check is not useful. Only enable type hint in IDE.
-    # Such as PyCharm
-    assert isinstance(m, api.GradientMachine)
-
-    # Initialize Parameter by numpy.
-    init_parameter(network=m)
-
-    # Create Local Updater. Local means not run in cluster.
-    # For a cluster training, here we can change to createRemoteUpdater
-    # in future.
-    updater = api.ParameterUpdater.createLocalUpdater(opt_config)
-    assert isinstance(updater, api.ParameterUpdater)
-
-    # Initialize ParameterUpdater.
-    updater.init(m)
-
-    # DataProvider Converter is a utility convert Python Object to Paddle C++
-    # Input. The input format is as same as Paddle's DataProvider.
-    converter = DataProviderConverter(
-        input_types=[dp.dense_vector(784), dp.integer_value(10)])
-
-    train_file = './data/raw_data/train'
-    test_file = './data/raw_data/t10k'
-
-    # start gradient machine.
-    # the gradient machine must be started before invoke forward/backward.
- # not just for training, but also for inference. - m.start() - - # evaluator can print error rate, etc. It is a C++ class. - batch_evaluator = m.makeEvaluator() - test_evaluator = m.makeEvaluator() - - # Get Train Data. - # TrainData will stored in a data pool. Currently implementation is not care - # about memory, speed. Just a very naive implementation. - train_data_generator = input_order_converter(read_from_mnist(train_file)) - train_data = BatchPool(train_data_generator, 512) - - # outArgs is Neural Network forward result. Here is not useful, just passed - # to gradient_machine.forward - outArgs = api.Arguments.createArguments(0) - - for pass_id in xrange(2): # we train 2 passes. - updater.startPass() - - for batch_id, data_batch in enumerate(train_data()): - # data_batch is input images. - # here, for online learning, we could get data_batch from network. - - # Start update one batch. - pass_type = updater.startBatch(len(data_batch)) - - # Start BatchEvaluator. - # batch_evaluator can be used between start/finish. - batch_evaluator.start() - - # forwardBackward is a shortcut for forward and backward. - # It is sometimes faster than invoke forward/backward separately, - # because in GradientMachine, it may be async. - m.forwardBackward(converter(data_batch), outArgs, pass_type) - - for each_param in m.getParameters(): - updater.update(each_param) - - # Get cost. We use numpy to calculate total cost for this batch. - cost_vec = outArgs.getSlotValue(0) - cost_vec = cost_vec.copyToNumpyMat() - cost = cost_vec.sum() / len(data_batch) - - # Make evaluator works. - m.eval(batch_evaluator) - - # Print logs. - print 'Pass id', pass_id, 'Batch id', batch_id, 'with cost=', \ - cost, batch_evaluator - - batch_evaluator.finish() - # Finish batch. - # * will clear gradient. - # * ensure all values should be updated. - updater.finishBatch(cost) - - # testing stage. use test data set to test current network. 
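The helpers being removed above (`generator_to_batch`, `BatchPool`) implement a common pool-shuffle-slice minibatching pattern that the new provider pipeline now handles internally. A self-contained sketch of that pattern, for reference:

```python
import random


def batch_pool(data, batch_size, shuffle=True):
    # Pool the data in memory, optionally shuffle once per call,
    # then yield fixed-size slices (the last one may be shorter).
    data = list(data)
    if shuffle:
        random.shuffle(data)
    for offset in range(0, len(data), batch_size):
        yield data[offset:offset + batch_size]


batches = list(batch_pool(range(10), 4, shuffle=False))
# batches == [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```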
- updater.apply() - test_evaluator.start() - test_data_generator = input_order_converter(read_from_mnist(test_file)) - for data_batch in generator_to_batch(test_data_generator, 512): - # in testing stage, only forward is needed. - m.forward(converter(data_batch), outArgs, api.PASS_TEST) - m.eval(test_evaluator) - - # print error rate for test data set - print 'Pass', pass_id, ' test evaluator: ', test_evaluator - test_evaluator.finish() - updater.restore() - - updater.catchUpWith() - params = m.getParameters() - for each_param in params: - assert isinstance(each_param, api.Parameter) - value = each_param.getBuf(api.PARAMETER_VALUE) - value = value.copyToNumpyArray() - - # Here, we could save parameter to every where you want - print each_param.getName(), value - - updater.finishPass() - - m.finish() + mnist = mnist_network() + runner = RunnerBuilder( + network=mnist, device_count=2).with_std_local_trainer( + method=mnist_provider.process, + file_list=['./data/raw_data/train']).with_std_tester( + method=mnist_provider.process, + file_list=['./data/raw_data/t10k']).build() + with runner: + for _ in xrange(2): + runner.run_one_pass() if __name__ == '__main__': diff --git a/demo/mnist/mnist_provider.py b/demo/mnist/mnist_provider.py index 888cfef1e7e3e1..4635833d36b9f2 100644 --- a/demo/mnist/mnist_provider.py +++ b/demo/mnist/mnist_provider.py @@ -1,5 +1,5 @@ from paddle.trainer.PyDataProvider2 import * -from mnist_util import read_from_mnist +import numpy # Define a py data provider @@ -8,5 +8,27 @@ 'label': integer_value(10)}, cache=CacheType.CACHE_PASS_IN_MEM) def process(settings, filename): # settings is not used currently. 
- for each in read_from_mnist(filename): - yield each + imgf = filename + "-images-idx3-ubyte" + labelf = filename + "-labels-idx1-ubyte" + f = open(imgf, "rb") + l = open(labelf, "rb") + + f.read(16) + l.read(8) + + # Define number of samples for train/test + if "train" in filename: + n = 60000 + else: + n = 10000 + + images = numpy.fromfile( + f, 'ubyte', count=n * 28 * 28).reshape((n, 28 * 28)).astype('float32') + images = images / 255.0 * 2.0 - 1.0 + labels = numpy.fromfile(l, 'ubyte', count=n).astype("int") + + for i in xrange(n): + yield {"pixel": images[i, :], 'label': labels[i]} + + f.close() + l.close() diff --git a/demo/mnist/mnist_util.py b/demo/mnist/mnist_util.py deleted file mode 100644 index 3fd88ae7edc821..00000000000000 --- a/demo/mnist/mnist_util.py +++ /dev/null @@ -1,30 +0,0 @@ -import numpy - -__all__ = ['read_from_mnist'] - - -def read_from_mnist(filename): - imgf = filename + "-images-idx3-ubyte" - labelf = filename + "-labels-idx1-ubyte" - f = open(imgf, "rb") - l = open(labelf, "rb") - - f.read(16) - l.read(8) - - # Define number of samples for train/test - if "train" in filename: - n = 60000 - else: - n = 10000 - - images = numpy.fromfile( - f, 'ubyte', count=n * 28 * 28).reshape((n, 28 * 28)).astype('float32') - images = images / 255.0 * 2.0 - 1.0 - labels = numpy.fromfile(l, 'ubyte', count=n).astype("int") - - for i in xrange(n): - yield {"pixel": images[i, :], 'label': labels[i]} - - f.close() - l.close() diff --git a/paddle/api/Arguments.cpp b/paddle/api/Arguments.cpp index 0cafbd896e2d88..41beed38a87601 100644 --- a/paddle/api/Arguments.cpp +++ b/paddle/api/Arguments.cpp @@ -137,6 +137,10 @@ void Arguments::setSlotSequenceDim(size_t idx, IVector* vec) throw(RangeError) { a.cpuSequenceDims = m->cast(vec->getSharedPtr()); } +float Arguments::sumCosts() const { + return paddle::Argument::sumCosts(m->outputs); +} + int64_t Arguments::getBatchSize(size_t idx) const throw(RangeError) { auto& a = m->getArg(idx); return a.getBatchSize(); diff 
--git a/paddle/api/CMakeLists.txt b/paddle/api/CMakeLists.txt index da6dad10cd8076..90c2a6f2e70aa2 100644 --- a/paddle/api/CMakeLists.txt +++ b/paddle/api/CMakeLists.txt @@ -38,7 +38,8 @@ configure_file( generate_python_api(python_swig_sources) -file(GLOB PY_PADDLE_PYTHON_FILES ${PROJ_ROOT}/paddle/py_paddle/*.py) +file(GLOB PY_PADDLE_PYTHON_FILES ${PROJ_ROOT}/paddle/py_paddle/*.py + ${PROJ_ROOT}/paddle/py_paddle/trainer/*.py) # TODO(yuyang18) : make wheel name calculated by cmake add_custom_command(OUTPUT ${PROJ_ROOT}/paddle/dist/.timestamp diff --git a/paddle/api/PaddleAPI.h b/paddle/api/PaddleAPI.h index 09c891871a5ca8..eb83671b0450a0 100644 --- a/paddle/api/PaddleAPI.h +++ b/paddle/api/PaddleAPI.h @@ -450,6 +450,8 @@ class Arguments { IVector* vec) throw(RangeError); void setSlotSequenceDim(size_t idx, IVector* vec) throw(RangeError); + float sumCosts() const; + private: static Arguments* createByPaddleArgumentVector(void* ptr); void* getInternalArgumentsPtr() const; @@ -548,6 +550,10 @@ class Parameter { size_t getSize() const; + bool save(const std::string& filename) const; + + bool load(const std::string& filename) const; + private: static Parameter* createFromRawPtr(void* ptr); static Parameter* createFromSharedPtr(void* ptr); diff --git a/paddle/api/Parameter.cpp b/paddle/api/Parameter.cpp index ddc00d8d1af4c5..9ca2c73cd7d0a5 100644 --- a/paddle/api/Parameter.cpp +++ b/paddle/api/Parameter.cpp @@ -58,3 +58,11 @@ size_t Parameter::getID() const { return m->getPtr()->getID(); } void Parameter::setValueUpdated() { m->getPtr()->setValueUpdated(); } size_t Parameter::getSize() const { return m->getPtr()->getSize(); } + +bool Parameter::save(const std::string& filename) const { + return m->getPtr()->save(filename); +} + +bool Parameter::load(const std::string& filename) const { + return m->getPtr()->load(filename); +} diff --git a/paddle/api/test/.gitignore b/paddle/api/test/.gitignore new file mode 100644 index 00000000000000..ef37ef416791c2 --- /dev/null +++ 
b/paddle/api/test/.gitignore @@ -0,0 +1,6 @@ +___fc_layer_0__.w0 +___fc_layer_0__.wbias +_hidden1.w0 +_hidden1.wbias +_hidden2.w0 +_hidden2.wbias diff --git a/paddle/api/test/testArguments.py b/paddle/api/test/testArguments.py index 8cabecd242fb4e..a04a805d7a64ef 100644 --- a/paddle/api/test/testArguments.py +++ b/paddle/api/test/testArguments.py @@ -22,6 +22,8 @@ def test_load_arguments(self): args = swig_paddle.Arguments.createArguments(1) args.setSlotValue(0, m) + self.assertAlmostEqual(27.0, args.sumCosts()) + mat = args.getSlotValue(0) assert isinstance(mat, swig_paddle.Matrix) np_mat = mat.toNumpyMatInplace() diff --git a/paddle/api/test/testGradientMachine.py b/paddle/api/test/testGradientMachine.py index b81eafa9673ca3..4b705f66eccd26 100644 --- a/paddle/api/test/testGradientMachine.py +++ b/paddle/api/test/testGradientMachine.py @@ -45,6 +45,7 @@ def test_create_gradient_machine(self): assert isinstance(val, swig_paddle.Vector) arr = numpy.full((len(val), ), 0.1, dtype="float32") val.copyFromNumpyArray(arr) + self.assertTrue(param.save(param.getName())) param_config = param.getConfig().toProto() assert isinstance(param_config, paddle.proto.ParameterConfig_pb2.ParameterConfig) @@ -92,6 +93,9 @@ def backward_callback(param_): self.assertTrue(self.isCalled) + for param in machine.getParameters(): + self.assertTrue(param.load(param.getName())) + def test_train_one_pass(self): conf_file_path = './testTrainConfig.py' trainer_config = swig_paddle.TrainerConfig.createFromTrainerConfigFile( diff --git a/paddle/py_paddle/__init__.py b/paddle/py_paddle/__init__.py index 5504d1d50c5233..8b42b894b9891d 100644 --- a/paddle/py_paddle/__init__.py +++ b/paddle/py_paddle/__init__.py @@ -19,6 +19,7 @@ 'paddle', 'DataProviderConverter', 'DataProviderWrapperConverter', # for deprecated usage. 
- 'loadParameterFile' + 'loadParameterFile', + 'trainer' ] util.monkeypatches() diff --git a/paddle/py_paddle/dataprovider_converter.py b/paddle/py_paddle/dataprovider_converter.py index 981d10afda2671..fe9cc554bd81da 100644 --- a/paddle/py_paddle/dataprovider_converter.py +++ b/paddle/py_paddle/dataprovider_converter.py @@ -12,10 +12,12 @@ # See the License for the specific language governing permissions and # limitations under the License. -import paddle.trainer.PyDataProvider2 as dp2 import collections +import itertools + +import paddle.trainer.PyDataProvider2 as dp2 + import swig_paddle -import numpy __all__ = ['DataProviderConverter'] @@ -26,6 +28,12 @@ def __init__(self, input_type, pos): assert isinstance(self.input_type, dp2.InputType) self.pos = pos + def pre_scan_loop(self, dat): + pass + + def finish_pre_scan(self, argument): + pass + def scan(self, dat): pass @@ -37,18 +45,24 @@ class DenseScanner(IScanner): def __init__(self, input_type, pos): IScanner.__init__(self, input_type, pos) self.__mat__ = None + self.__height__ = 0 + + def pre_scan_loop(self, dat): + self.__height__ += 1 + + def finish_pre_scan(self, argument): + self.__mat__ = swig_paddle.Matrix.createZero(self.__height__, + self.input_type.dim, False) + self.__height__ = 0 def scan(self, dat): - if self.__mat__ is None: - self.__mat__ = numpy.array([dat], dtype='float32') - else: - self.__mat__ = numpy.append(self.__mat__, [dat], axis=0) + assert isinstance(self.__mat__, swig_paddle.Matrix) + a = self.__mat__.toNumpyMatInplace() + a[self.__height__, ] = dat + self.__height__ += 1 def finish_scan(self, argument): - assert isinstance(argument, swig_paddle.Arguments) - assert isinstance(self.input_type, dp2.InputType) - m = swig_paddle.Matrix.createDenseFromNumpy(self.__mat__, True, False) - argument.setSlotValue(self.pos, m) + argument.setSlotValue(self.pos, self.__mat__) class SparseBinaryScanner(IScanner): @@ -146,7 +160,14 @@ def convert(self, dat, argument=None): ] for each_sample in dat: 
-            for each_step, scanner in zip(each_sample, scanners):
+            for each_step, scanner in itertools.izip(each_sample, scanners):
+                scanner.pre_scan_loop(each_step)
+
+        for scanner in scanners:
+            scanner.finish_pre_scan(argument)
+
+        for each_sample in dat:
+            for each_step, scanner in itertools.izip(each_sample, scanners):
                 scanner.scan(each_step)

         for scanner in scanners:
diff --git a/paddle/py_paddle/trainer/__init__.py b/paddle/py_paddle/trainer/__init__.py
new file mode 100644
index 00000000000000..9f80a62b0063f8
--- /dev/null
+++ b/paddle/py_paddle/trainer/__init__.py
@@ -0,0 +1,5 @@
+from .base import *
+from .base_items import *
+from .items import *
+from .network import *
+from .builder import *
diff --git a/paddle/py_paddle/trainer/base.py b/paddle/py_paddle/trainer/base.py
new file mode 100644
index 00000000000000..b3714d0a2bfb1f
--- /dev/null
+++ b/paddle/py_paddle/trainer/base.py
@@ -0,0 +1,244 @@
+"""
+Runner Base Classes.
+
+A Runner can be driven pass by pass. Within one pass it handles everything,
+invoking its items batch by batch.
+
+The Runner class helps us extract the complex logic in `Trainer.cpp`. A Runner
+contains several RunnerItems; each RunnerItem implements one aspect of the
+trainer logic.
+
+For example, the trainer logic could be:
+
+.. code-block: python
+
+    gradient_machine.startPass()
+    updater.startPass()
+    for each_batch in data:
+        gradient_machine.startBatch()
+        updater.startBatch()
+
+        gradient_machine.train()
+
+        updater.finishBatch()
+        gradient_machine.finishBatch()
+    updater.finishPass()
+    gradient_machine.finishPass()
+
+We can extract this logic into two RunnerItems, GradientMachineOperations and
+UpdaterOperations. It works just like a middleware framework.
+"""
+
+import functools
+
+__all__ = ['RunnerItem', 'Runner', 'RunnerContext']
+
+
+class RunnerContext(object):
+    pass
+
+
+class RunnerItem(object):
+    """
+    RunnerItem is an item in Runner.
Runner composes the
+    RunnerItems together and invokes the first RunnerItem's methods, passing
+    the next item's corresponding method as `next_callback`. If the current
+    item is the last one, a default next_callback is passed instead.
+
+    Context is a global object shared by items.
+    """
+
+    def __init__(self):
+        pass
+
+    def initialize(self, context, next_callback):
+        """
+        Initialize method. It will be invoked when the Runner starts to run.
+
+        :param context: a global object shared by items.
+        :type context: RunnerContext
+        :param next_callback: the next item's initialize method.
+        :type next_callback: callable
+        :return: None
+        :rtype: None
+        """
+        next_callback(context)
+
+    def finalize(self, next_callback):
+        """
+        Finalize method. It will be invoked when the Runner finishes running,
+        to clean up the RunnerItem's state.
+
+        :param next_callback: the next item's finalize method.
+        :type next_callback: callable
+        :return: None
+        :rtype: None
+        """
+        next_callback()
+
+    def on_pass_begin(self, next_callback):
+        """
+        Pass Begin Method. Invoked when a pass begins.
+
+        :param next_callback: the next item's on_pass_begin method.
+        :type next_callback: callable
+        :return: None
+        :rtype: None
+        """
+
+        next_callback()
+
+    def on_pass_end(self, next_callback):
+        """
+        Pass End Method. Invoked when a pass ends.
+
+        :param next_callback: the next item's on_pass_end method.
+        :type next_callback: callable
+        :return: None
+        :rtype: None
+        """
+        next_callback()
+
+    def on_batch_begin(self, next_callback):
+        """
+        Batch Begin Method. Invoked when a batch begins. Returns True if no
+        more batches can be processed.
+
+        :param next_callback: the next item's on_batch_begin method.
+        :type next_callback: callable
+        :return: True if no more batches can be processed.
+        :rtype: bool
+        """
+        return next_callback()
+
+    def on_batch_end(self, next_callback):
+        """
+        Batch End Method. Invoked when a batch ends. Returns True if no more
+        batches can be processed.
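The return-value contract described above (the batch hooks return True when no more batches can be processed) can be sketched standalone. The names below are simplified stand-ins for illustration, not the actual Paddle classes:

```python
# A data-providing "item": its on_batch_begin pulls the next batch and
# returns True when the data is exhausted, which stops the pass loop.
def make_data_item(batches):
    it = iter(batches)
    state = {}

    def on_batch_begin(next_callback):
        try:
            state['current'] = next(it)
        except StopIteration:
            return True  # no more batches: signal the runner to stop
        return next_callback()

    return on_batch_begin, state


on_batch_begin, state = make_data_item([10, 20, 30])
seen = []
while not on_batch_begin(lambda: False):
    seen.append(state['current'])
# seen == [10, 20, 30]
```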
+
+        :param next_callback: the next item's on_batch_end method.
+        :type next_callback: callable
+        :return: True if no more batches can be processed.
+        :rtype: bool
+        """
+        return next_callback()
+
+
+def default_next_callback(*args, **kwargs):
+    """
+    Default next_callback for the last RunnerItem.
+    """
+    return False
+
+
+class Runner(object):
+    """
+    A Runner contains many RunnerItems. Each item implements one aspect of the
+    Trainer/Tester job. Basic usage is shown below.
+
+    .. code-block: python
+
+        runner = Runner()
+
+        runner.add_item(ItemA())
+        runner.add_item(ItemB())
+
+        with runner:
+            runner.run_one_pass()
+    """
+
+    # Runner is on a hot path, so declare __slots__ explicitly for faster
+    # attribute access. Only instance attributes may be listed here; listing
+    # the methods as well would raise a ValueError at class creation time.
+    __slots__ = [
+        '__items__', '__begin_pass__', '__end_pass__', '__begin_batch__',
+        '__end_batch__', 'finalize', '__context__'
+    ]
+
+    def __init__(self):
+        self.__items__ = []
+        self.__begin_pass__ = None
+        self.__end_pass__ = None
+        self.__begin_batch__ = None
+        self.__end_batch__ = None
+        self.finalize = None
+
+        self.__context__ = RunnerContext()
+        self.__context__.runner = self
+
+    def add_item(self, item):
+        """
+        Add a runner item to the runner.
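The chain composition that `Runner.__initialize__` performs below is essentially a fold of `functools.partial` over the reversed item list, middleware-style. A minimal standalone sketch of that pattern, using simplified stand-in classes rather than the actual RunnerItem API:

```python
import functools


class Item(object):
    # Simplified stand-in for a RunnerItem: logs its name, then forwards.
    def __init__(self, name, log):
        self.name = name
        self.log = log

    def on_pass_begin(self, next_callback):
        self.log.append('begin:' + self.name)
        next_callback()


def compose(items):
    # Fold partials right-to-left, so calling the result runs the chain
    # in declaration order, each item invoking the next via next_callback.
    callback = lambda: None  # default callback for the last item
    for item in reversed(items):
        callback = functools.partial(item.on_pass_begin, next_callback=callback)
    return callback


log = []
begin_pass = compose([Item('updater', log), Item('machine', log)])
begin_pass()
# log == ['begin:updater', 'begin:machine']
```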
+ """ + assert isinstance(item, RunnerItem) + self.__items__.append(item) + + def __initialize__(self, parent=None): + if None not in [ + self.__begin_pass__, self.__end_pass__, self.__begin_batch__, + self.__end_batch__, self.finalize + ]: + return False + else: + assert len(self.__items__) != 0 + actual_init = default_next_callback + self.__begin_pass__ = default_next_callback + self.__end_pass__ = default_next_callback + self.__begin_batch__ = default_next_callback + self.__end_batch__ = default_next_callback + self.finalize = default_next_callback + + for chain in reversed(self.__items__): + assert isinstance(chain, RunnerItem) + actual_init = functools.partial( + chain.initialize, next_callback=actual_init) + self.__begin_pass__ = functools.partial( + chain.on_pass_begin, next_callback=self.__begin_pass__) + self.__end_pass__ = functools.partial( + chain.on_pass_end, next_callback=self.__end_pass__) + self.__begin_batch__ = functools.partial( + chain.on_batch_begin, next_callback=self.__begin_batch__) + self.__end_batch__ = functools.partial( + chain.on_batch_end, next_callback=self.__end_batch__) + self.finalize = functools.partial( + chain.finalize, next_callback=self.finalize) + + if parent is not None: + self.__context__.parent = parent + + actual_init(self.__context__) + return True + + def run_one_pass(self): + """ + Run one pass for runner. The parent argument will passed to context. + """ + + self.__begin_pass__() + exit_flag = False + while not exit_flag: + exit_flag = self.__begin_batch__() + if exit_flag: + break + exit_flag = self.__end_batch__() + self.__end_pass__() + return self.__context__ + + def __enter__(self): + """ + Implementation for with block. + :return: + """ + self.__initialize__() + + def __exit__(self, exc_type, exc_val, exc_tb): + """ + Implementation for with block. 
:param exc_type:
+        :param exc_val:
+        :param exc_tb:
+        :return:
+        """
+        self.finalize()
diff --git a/paddle/py_paddle/trainer/base_items.py b/paddle/py_paddle/trainer/base_items.py
new file mode 100644
index 00000000000000..a9b8fd51a5b50d
--- /dev/null
+++ b/paddle/py_paddle/trainer/base_items.py
@@ -0,0 +1,181 @@
+"""
+Some basic items.
+"""
+from .base import RunnerItem
+from .. import swig_paddle as api

+__all__ = ['init_runner_item', 'ContextWrapper', 'BaseRunnerItem']
+
+
+def init_runner_item():
+    def __impl__(func):
+        class __ImplItem__(RunnerItem):
+            __doc__ = func.__doc__
+
+            def __init__(self, **kwargs):
+                RunnerItem.__init__(self)
+                self.__kwargs__ = kwargs
+
+            def initialize(self, context, next_callback):
+                func(context=context, **self.__kwargs__)
+                next_callback(context=context)
+
+        return __ImplItem__
+
+    return __impl__
+
+
+class ContextWrapper(object):
+    """
+    Strongly typed wrapper to read/write context values.
+
+    @TODO(yuyang18): Use Cython to implement this class, make it directly
+    access a C struct.
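`init_runner_item` above turns a plain function into a RunnerItem-like class whose `initialize` runs the function once and then forwards to the next item. A simplified standalone sketch of the same decorator-factory pattern (hypothetical names, no Paddle base class):

```python
def init_item():
    # Decorator factory: wraps a function into an item-like class whose
    # initialize() calls the function once, then forwards to the chain.
    def decorator(func):
        class _Item(object):
            def __init__(self, **kwargs):
                self.kwargs = kwargs

            def initialize(self, context, next_callback):
                func(context=context, **self.kwargs)
                next_callback(context)

        return _Item

    return decorator


@init_item()
def set_flags(context, verbose=False):
    context['verbose'] = verbose


ctx = {}
item = set_flags(verbose=True)
item.initialize(ctx, lambda context: None)
# ctx == {'verbose': True}
```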
+ """ + + def __init__(self, context): + self.real_context = context + + def gradient_machine(self, field_name='gradient_machine'): + """ + Get Gradient Machine + :param field_name: + :return: + :rtype: api.GradientMachine + """ + return self.get_field_with_type( + field_name=field_name, tp=api.GradientMachine) + + def set_gradient_machine(self, machine, field_name='gradient_machine'): + self.set_field_with_type( + field_name=field_name, value=machine, tp=api.GradientMachine) + + def updater(self, field_name='updater'): + """ + Get Parameter Updater + :param field_name: + :return: + :rtype: api.ParameterUpdater + """ + return self.get_field_with_type( + field_name=field_name, tp=api.ParameterUpdater) + + def set_updater(self, updater, field_name='updater'): + self.set_field_with_type( + field_name=field_name, value=updater, tp=api.ParameterUpdater) + + def set_field_with_type(self, field_name, value, tp, must_not_set=True): + assert not must_not_set or not hasattr(self.real_context, field_name) + assert isinstance(value, tp) + setattr(self.real_context, field_name, value) + + def set_updater_callback(self, + updater_callback, + field_name='updater_callback'): + assert callable(updater_callback) + setattr(self.real_context, field_name, updater_callback) + + def updater_callback(self, field_name='updater_callback'): + """ + + :param field_name: + :return: + :rtype: callable + """ + cb = getattr(self.real_context, field_name, None) + assert callable(cb) + return cb + + def batch_size(self, field_name='current_batch_size'): + """ + + :param field_name: + :return: + :rtype: int + """ + return self.get_field_with_type(field_name=field_name, tp=int) + + def set_batch_size(self, batch_size, field_name='current_batch_size'): + self.set_field_with_type( + field_name=field_name, value=batch_size, tp=int, must_not_set=False) + + def cost(self, field_name='current_cost'): + """ + + :param field_name: + :return: + :rtype: float + """ + return 
self.get_field_with_type(field_name=field_name, tp=float) + + def set_cost(self, cost, field_name='current_cost'): + self.set_field_with_type( + field_name=field_name, value=cost, tp=float, must_not_set=False) + + def reset_batch_id(self, field_name='current_batch_id'): + self.set_field_with_type( + field_name=field_name, value=0, tp=int, must_not_set=False) + + def batch_id(self, field_name='current_batch_id'): + return self.get_field_with_type(field_name=field_name, tp=int) + + def increase_batch_id(self, field_name='current_batch_id'): + self.set_field_with_type( + field_name=field_name, + value=self.batch_id(field_name=field_name) + 1, + tp=int, + must_not_set=False) + + def reset_pass_id(self, field_name='current_pass_id'): + self.set_field_with_type( + field_name=field_name, value=0, tp=int, must_not_set=False) + + def pass_id(self, field_name='current_pass_id'): + return self.get_field_with_type(field_name=field_name, tp=int) + + def increase_pass_id(self, field_name='current_pass_id'): + self.set_field_with_type( + field_name=field_name, + value=self.pass_id(field_name=field_name) + 1, + tp=int, + must_not_set=False) + + def get_field(self, field_name): + field = getattr(self.real_context, field_name, None) + return field + + def get_field_with_type(self, field_name, tp): + field = self.get_field(field_name) + assert isinstance(field, + tp), "Field %s with type %s, should with type %s" % ( + field_name, type(field), tp) + return field + + def in_args(self, field_name='in_args'): + """ + + :param field_name: + :return: + :rtype: api.Arguments + """ + return self.get_field_with_type(field_name=field_name, tp=api.Arguments) + + def set_in_args(self, in_args, field_name='in_args'): + self.set_field_with_type( + field_name=field_name, + value=in_args, + tp=api.Arguments, + must_not_set=False) + + +class BaseRunnerItem(RunnerItem): + """ + :type context: ContextWrapper + """ + + def __init__(self): + super(BaseRunnerItem, self).__init__() + self.context = None + 
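At its core, `ContextWrapper` is a set of typed getattr/setattr helpers over a plain attribute bag, so a misspelled field or a wrong type fails fast with an assertion. A minimal standalone sketch of that idea:

```python
class Context(object):
    # Plain attribute bag, like RunnerContext.
    pass


def get_field_with_type(ctx, name, tp):
    # Fetch an attribute and assert its type, mirroring the
    # ContextWrapper.get_field_with_type accessor above.
    value = getattr(ctx, name, None)
    assert isinstance(value, tp), \
        "Field %s has type %s, expected %s" % (name, type(value), tp)
    return value


ctx = Context()
ctx.current_batch_size = 128
size = get_field_with_type(ctx, 'current_batch_size', int)
# size == 128
```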
+ def store_context(self, context): + self.context = ContextWrapper(context=context) diff --git a/paddle/py_paddle/trainer/builder.py b/paddle/py_paddle/trainer/builder.py new file mode 100644 index 00000000000000..ade63cc45de249 --- /dev/null +++ b/paddle/py_paddle/trainer/builder.py @@ -0,0 +1,97 @@ +from .base import * +from .items import * + + +class RunnerBuilder(object): + def __init__(self, network, use_gpu=False, device_count=1): + self.__runner__ = Runner() + self.__runner__.add_item(Counter()) + self.__network__ = network + self.__runner__.add_item( + set_device( + use_gpu=use_gpu, device_count=device_count)) + self.__runner__.add_item( + CreateGradientMachine(network=self.__network__)) + + self.__train_data__ = None + self.__updater__ = None + self.__gradient_machine__ = None + self.__init_param__ = None + self.__evaluate__ = [] + + def with_std_random_init_params(self): + self.__init_param__ = std_random_init_params() + return self + + def with_train_data(self, method, file_list=None, batch_size=None, + **kwargs): + if batch_size is None: + batch_size = self.__network__.optimize_graph().batch_size + + if file_list is None: + file_list = [None] + + self.__train_data__ = BasicPaddleTrainerDataProvider( + network=self.__network__, + method=method, + file_list=file_list, + batch_size=batch_size, + **kwargs) + return self + + def with_std_local_updater(self): + self.__updater__ = BasicLocalParameterUpdater(network=self.__network__) + return self + + def with_std_gradient_machine_ops(self): + self.__gradient_machine__ = BasicGradientMachineTrainOps() + return self + + def with_batch_evaluator(self, prefix=None): + self.__evaluate__ = [BatchEvaluate(prefix=prefix)] + return self + + def with_pass_evaluator(self, prefix=None): + self.__evaluate__ = [PassEvaluate(prefix=prefix)] + return self + + def with_std_tester(self, method, file_list, batch_size=None, **kwargs): + if batch_size is None: + batch_size = self.__network__.optimize_graph().batch_size + + # 
tester should be a evaluator, too + self.__evaluate__.append( + TestOnPassEnd( + network=self.__network__, + method=method, + file_list=file_list, + batch_size=batch_size)) + return self + + def with_std_param_saver(self): + self.__evaluate__.append(SaveParamsOnPassEnd()) + return self + + def with_std_local_trainer(self, **kwargs): + return self.with_std_random_init_params().with_train_data( + **kwargs).with_std_local_updater().with_std_gradient_machine_ops( + ).with_batch_evaluator().with_std_param_saver() + + def with_observer(self, on_batch_end=None, on_pass_end=None): + self.__evaluate__.append(BaseObserveItem(on_batch_end, on_pass_end)) + return self + + def build(self): + if self.__init_param__ is None: + self.with_std_random_init_params() + self.__runner__.add_item(self.__init_param__) + self.__runner__.add_item(self.__train_data__) + if self.__updater__ is None: + self.with_std_local_updater() + self.__runner__.add_item(self.__updater__) + if self.__gradient_machine__ is None: + self.with_std_gradient_machine_ops() + self.__runner__.add_item(self.__gradient_machine__) + for each in self.__evaluate__: + self.__runner__.add_item(each) + return self.__runner__ diff --git a/paddle/py_paddle/trainer/data_providers.py b/paddle/py_paddle/trainer/data_providers.py new file mode 100644 index 00000000000000..560a7f4017ea4e --- /dev/null +++ b/paddle/py_paddle/trainer/data_providers.py @@ -0,0 +1,64 @@ +from .. 
import DataProviderConverter
+import random
+
+__all__ = ['DataProvider', 'NaiveMemPooledDataProvider', 'NaiveDataProvider']
+
+
+class DataProvider(object):
+    # Only instance attributes belong in __slots__; listing the methods as
+    # well would raise a ValueError at class creation time.
+    __slots__ = [
+        '__method__', '__converter__', '__batch_size__', '__should_shuffle__'
+    ]
+
+    def __init__(self, method, input_types, batch_size, should_shuffle=True):
+        self.__method__ = method
+        self.__converter__ = DataProviderConverter(input_types)
+        self.__batch_size__ = batch_size
+        self.__should_shuffle__ = should_shuffle
+
+    def reset(self):
+        raise NotImplementedError()
+
+    def next(self):
+        raise NotImplementedError()
+
+
+class NaiveMemPooledDataProvider(DataProvider):
+    def __init__(self, method, input_types, batch_size, should_shuffle):
+        super(NaiveMemPooledDataProvider, self).__init__(
+            method=method,
+            input_types=input_types,
+            batch_size=batch_size,
+            should_shuffle=should_shuffle)
+        self.__pool__ = []
+        self.__idx__ = 0
+
+    def reset(self):
+        self.__pool__ = list(self.__method__())
+        if self.__should_shuffle__:
+            random.shuffle(self.__pool__)
+
+        self.__idx__ = 0
+
+    def next(self):
+        if self.__idx__ < len(self.__pool__):
+            end = min(self.__idx__ + self.__batch_size__, len(self.__pool__))
+            begin = self.__idx__
+            self.__idx__ = end
+            return self.__converter__(self.__pool__[begin:end]), end - begin
+        else:
+            raise StopIteration
+
+
+class NaiveDataProvider(NaiveMemPooledDataProvider):
+    def __init__(self, provider, input_types, batch_size, should_shuffle=True):
+        def __to_pool__():
+            for filename in provider.file_list:
+                for item in provider.generator(provider, filename):
+                    yield item
+
+        super(NaiveDataProvider, self).__init__(
+            method=__to_pool__,
+            input_types=input_types,
+            batch_size=batch_size,
+            should_shuffle=should_shuffle)
diff --git a/paddle/py_paddle/trainer/items.py b/paddle/py_paddle/trainer/items.py
new file mode 100644
index 00000000000000..5cb9db6811f2cd
--- /dev/null
+++ b/paddle/py_paddle/trainer/items.py
@@ -0,0 +1,422 @@
+from .data_providers
import * +from .base import RunnerItem, Runner +from .network import NetworkConfig +from .base_items import * +from .. import swig_paddle as api + +__all__ = [ + 'set_device', 'std_random_init_params', 'CreateGradientMachine', + 'BasicLocalParameterUpdaterOps', 'BasicLocalParameterUpdater', 'Counter', + 'BatchEvaluate', 'PassEvaluate', 'BasicGradientMachineTrainOps', + 'BasicGradientMachineTestOps', 'InheritGradientMachineUpdater', + 'TestOnPassEnd', 'BasicPaddleTrainerDataProvider', 'BasicDataProviderOps', + 'BasicPaddleTestDataProvider', 'SaveParamsOnPassEnd', 'BaseObserveItem' +] + + +@init_runner_item() +def set_device(use_gpu, device_count, **kwargs): + """ + Set Device Of PaddlePaddle. It will invoke api.initPaddle, so must + be the first RunnerItem. + + @TODO(yuyang18): Check current Paddle compiled with CUDA support and + Check the max device count in system. + :param use_gpu: True if using GPU. + :param device_count: CPU cores or GPU cards count. + :param kwargs: Not used. But just added for capacity + :return: None + """ + api.initPaddle('--use_gpu=%s' % repr(use_gpu), + '--trainer_count=%d' % device_count) + + +@init_runner_item() +def std_random_init_params(context, **kwargs): + """ + + :param context: + :type context: ContextWrapper + :return: + """ + ContextWrapper(context).gradient_machine().randParameters() + + +class CreateGradientMachine(BaseRunnerItem): + def __init__(self, network, local=True): + RunnerItem.__init__(self) + assert isinstance(network, NetworkConfig) + self.__network__ = network + self.__local__ = local + + def initialize(self, context, next_callback): + """ + + :param context: + :type context: ContextWrapper + :param next_callback: + :return: + """ + self.store_context(context) + # get enable_types for each optimizer. + # enable_types = [value, gradient, momentum, etc] + # For each optimizer(SGD, Adam), GradientMachine should enable different + # buffers. 
+ opt_config = api.OptimizationConfig.createFromProto( + self.__network__.optimize_graph()) + _temp_optimizer_ = api.ParameterOptimizer.create(opt_config) + enable_types = _temp_optimizer_.getParameterTypes() + + graph = self.__network__.network_graph() + + if self.__local__: + for param in graph.parameters: + if param.HasField("sparse_remote_update"): + param.sparse_remote_update = False + + self.context.set_gradient_machine( + api.GradientMachine.createFromConfigProto( + graph, api.CREATE_MODE_NORMAL, enable_types)) + next_callback(context) + self.context.gradient_machine().start() + + def finalize(self, next_callback): + self.context.gradient_machine().finish() + next_callback() + + +class BasicLocalParameterUpdaterOps(BaseRunnerItem): + """ + Basic Operations for local parameter updater. + + Invoke startPass/finishPass etc. + """ + + def __init__(self): + super(BasicLocalParameterUpdaterOps, self).__init__() + + def initialize(self, context, next_callback): + self.store_context(context) + next_callback(context) + + def on_pass_begin(self, next_callback): + self.context.updater().startPass() + next_callback() + + def on_batch_begin(self, next_callback): + self.context.updater().startBatch(self.context.batch_size()) + return next_callback() + + def on_batch_end(self, next_callback): + exit_flag = next_callback() + if not exit_flag: + self.context.updater().finishBatch(self.context.cost()) + return exit_flag + + def on_pass_end(self, next_callback): + self.context.updater().finishPass() + next_callback() + + +class BasicLocalParameterUpdater(BasicLocalParameterUpdaterOps): + """ + Create a basic local parameter updater. + """ + + def __init__(self, network): + super(BasicLocalParameterUpdater, self).__init__() + self.__network__ = network + + def initialize(self, context, next_callback): + self.store_context(context) + # Create Local Updater. Local means not run in cluster. + # For a cluster training, here we can change to createRemoteUpdater + # in future. 
+ opt_config = api.OptimizationConfig.createFromProto( + self.__network__.optimize_graph()) + self.context.set_updater( + api.ParameterUpdater.createLocalUpdater(opt_config)) + self.context.updater().init(self.context.gradient_machine()) + self.context.set_updater_callback(self.context.updater().update) + next_callback(context) + + +class Counter(BaseRunnerItem): + def __init__(self): + super(Counter, self).__init__() + + def initialize(self, context, next_callback): + self.store_context(context) + self.context.reset_batch_id() + self.context.reset_pass_id() + next_callback(context) + + def on_batch_end(self, next_callback): + ret = next_callback() + self.context.increase_batch_id() + return ret + + def on_pass_end(self, next_callback): + next_callback() + self.context.increase_pass_id() + + +class BaseEvaluate(BaseRunnerItem): + """ + Base Evaluate Item. Just create a evaluator, it can be used in derived + class. It will print some stats log during run_one_pass. + + :type __evaluator__: api.Evaluator + :param prefix: The prefix for stats log. + :type prefix: basestring + :type __prefix__: basestring + """ + + def __init__(self, prefix=None): + super(BaseEvaluate, self).__init__() + self.__evaluator__ = None + if prefix is None: + prefix = '' + self.__prefix__ = prefix + + def initialize(self, context, next_callback): + self.store_context(context) + next_callback(context) + self.__evaluator__ = self.context.gradient_machine().makeEvaluator() + + def finalize(self, next_callback): + next_callback() + self.__evaluator__ = None + + +class BatchEvaluate(BaseEvaluate): + """ + Print stats log on each batch end. 
+ """ + + def __init__(self, prefix=None): + BaseEvaluate.__init__(self, prefix) + + def on_batch_end(self, next_callback): + self.__evaluator__.start() + self.context.gradient_machine().eval(self.__evaluator__) + retv = next_callback() + print '%sPass=%d, Batch=%d Cost=%f, Eval:' % (self.__prefix__, + self.context.pass_id(), + self.context.batch_id(), + self.context.cost()), \ + self.__evaluator__ + self.__evaluator__.finish() + return retv + + +class PassEvaluate(BaseEvaluate): + """ + Print stats log on each pass end. + """ + + def __init__(self, prefix=None): + super(PassEvaluate, self).__init__(prefix=prefix) + + def on_pass_begin(self, next_callback): + next_callback() + self.__evaluator__.start() + + def on_batch_end(self, next_callback): + retv = next_callback() + self.context.gradient_machine().eval(self.__evaluator__) + return retv + + def on_pass_end(self, next_callback): + next_callback() + print '%sPass=%d Eval:' % (self.__prefix__, self.context.pass_id()), \ + self.__evaluator__ + self.__evaluator__.finish() + + +class BasicGradientMachineTrainOps(BaseRunnerItem): + """ + Forward/backward a gradient machine. + + :type __out_args__: api.Arguments + """ + + def __init__(self): + super(BasicGradientMachineTrainOps, self).__init__() + self.__out_args__ = api.Arguments.createArguments(0) + + def initialize(self, context, next_callback): + self.store_context(context) + next_callback(context) + + def on_batch_begin(self, next_callback): + # forwardBackward is a shortcut for forward and backward. + # It is sometimes faster than invoke forward/backward separately, + # because in GradientMachine, it may be async. 
+ + self.context.real_context.gradient_machine.forwardBackward( + self.context.in_args(), self.__out_args__, api.PASS_TRAIN) + + for each_param in self.context.gradient_machine().getParameters(): + self.context.real_context.updater_callback(each_param) + + self.context.set_cost(self.__out_args__.sumCosts() / + self.context.batch_size()) + return next_callback() + + +class BasicGradientMachineTestOps(BaseRunnerItem): + def __init__(self): + super(BasicGradientMachineTestOps, self).__init__() + self.__out_args__ = api.Arguments.createArguments(0) + + def initialize(self, context, next_callback): + self.store_context(context) + next_callback(context) + + def on_pass_begin(self, next_callback): + self.context.updater().apply() + next_callback() + + def on_batch_begin(self, next_callback): + self.context.gradient_machine().forward( + self.context.in_args(), self.__out_args__, api.PASS_TEST) + return next_callback() + + def on_pass_end(self, next_callback): + self.context.updater().restore() + next_callback() + + +class InheritGradientMachineUpdater(RunnerItem): + def __init__(self): + RunnerItem.__init__(self) + self.context = None + + def initialize(self, context, next_callback): + if context.parent is not None: # inherit from parent. 
+ context.gradient_machine = context.parent.gradient_machine + context.updater = context.parent.updater + self.context = context + next_callback(context) + + def on_pass_begin(self, next_callback): + if self.context.parent is not None: + self.context.current_pass_id = self.context.parent.current_pass_id + next_callback() + + def on_batch_begin(self, next_callback): + if self.context.parent is not None: + self.context.current_batch_id = self.context.parent.current_batch_id + return next_callback() + + +class BasicDataProviderOps(BaseRunnerItem): + """ + :type __provider__: DataProvider + """ + + def __init__(self): + super(BasicDataProviderOps, self).__init__() + self.__provider__ = None + + def on_pass_begin(self, next_callback): + self.__provider__.reset() + next_callback() + + def on_batch_begin(self, next_callback): + try: + in_args, batch_size = next(self.__provider__) + self.context.set_in_args(in_args) + self.context.set_batch_size(batch_size) + return next_callback() + except StopIteration: + return True + + +def data_provider_creator(is_train): + class __cls__(BasicDataProviderOps): + def __init__(self, network, method, file_list, batch_size, **kwargs): + super(__cls__, self).__init__() + self.__dataprovider__ = method( + file_list=file_list, + input_order=network.input_order(), + is_train=is_train, + **kwargs) + self.__input_types__ = [] + for data_layer_name in network.input_order(): + self.__input_types__.append(network.input_types()[ + data_layer_name]) + self.__batch_size__ = batch_size + + def initialize(self, context, next_callback): + self.store_context(context) + self.__provider__ = NaiveDataProvider( + provider=self.__dataprovider__, + input_types=self.__input_types__, + batch_size=self.__batch_size__, + should_shuffle=True if is_train else False) + next_callback(context) + + return __cls__ + + +BasicPaddleTrainerDataProvider = data_provider_creator(True) +BasicPaddleTestDataProvider = data_provider_creator(False) + + +class 
TestOnPassEnd(RunnerItem): + def __init__(self, **kwargs): + RunnerItem.__init__(self) + self.__test_runner__ = Runner() + self.__test_runner__.add_item(InheritGradientMachineUpdater()) + self.__test_runner__.add_item(BasicPaddleTestDataProvider(**kwargs)) + self.__test_runner__.add_item(BasicGradientMachineTestOps()) + self.__test_runner__.add_item(PassEvaluate(prefix='Test: ')) + + def initialize(self, context, next_callback): + next_callback(context) + self.__test_runner__.__initialize__(context) + + def on_pass_end(self, next_callback): + self.__test_runner__.run_one_pass() + next_callback() + + +class SaveParamsOnPassEnd(BaseRunnerItem): + def __init__(self): + super(SaveParamsOnPassEnd, self).__init__() + + def initialize(self, context, next_callback): + self.store_context(context) + next_callback(context) + + def on_pass_end(self, next_callback): + self.context.updater().catchUpWith() + params = self.context.gradient_machine().getParameters() + for param in params: + param.save(param.getName()) + + next_callback() + + +class BaseObserveItem(BaseRunnerItem): + def __init__(self, batch_end_callback=None, pass_end_callback=None): + super(BaseObserveItem, self).__init__() + self.__batch_end__ = batch_end_callback + self.__pass_end__ = pass_end_callback + + def initialize(self, context, next_callback): + self.store_context(context) + next_callback(context) + + def on_batch_end(self, next_callback): + r = next_callback() + if self.__batch_end__ is not None: + self.__batch_end__(self.context) + return r + + def on_pass_end(self, next_callback): + next_callback() + if self.__pass_end__ is not None: + self.__pass_end__(self.context) diff --git a/paddle/py_paddle/trainer/network.py b/paddle/py_paddle/trainer/network.py new file mode 100644 index 00000000000000..cdd3e54f0b3f8c --- /dev/null +++ b/paddle/py_paddle/trainer/network.py @@ -0,0 +1,131 @@ +from paddle.trainer_config_helpers import * +from paddle.trainer_config_helpers import inputs as ipts +import 
paddle.trainer.PyDataProvider2 as dp2 + +__all__ = ['NetworkConfig', 'network'] + + +class NetworkConfig(object): + """ + Base class for a neural network configuration. + + NOTE: this object only hold neural network's compute graph, not hold any + parameters. + """ + + def __init__(self): + pass + + def input_order(self): + """ + Input Order is the order of neural network's data layer. + + The gradient_machine's input arguments list should be the same order of + input order. + :return: list of data layer name. + :rtype: list + """ + raise NotImplemented() + + def input_types(self): + """ + Input types of each data layer. + :return: a dict. Key is data layer's name. Value is the type of this + data layer. + :rtype: dict + """ + raise NotImplemented() + + def network_graph(self): + """ + get the ModelConfig of this neural network. Return raw protobuf object. + :return: ModelConfig protobuf object. + :rtype: paddle.proto.ModelConfig_pb2.ModelConfig + """ + raise NotImplemented() + + def optimize_graph(self): + """ + get the OptimizationConfig of this neural network. Return raw protobuf + object. + :return: OptimizationConfig protobuf object. + :rtype: paddle.proto.TrainerConfig_pb2.OptimizationConfig + """ + raise NotImplemented() + + def provider(self, **kwargs): + return dp2.provider(input_types=self.input_types(), **kwargs) + + +def network(inputs, **opt_kwargs): + """ + A decorator for neural network method. It will wrap a method to a + NetworkConfig. + + .. code-block: python + + @network(inputs={'img': dense_vector(784), 'label':integer_value(10)}, + batch_size=1000, learning_rate=1e-3, learning_method=AdamOptimizer()) + def mnist_network(img, label): + hidden = fc_layer(input=img, size=200) + ... + cost = classification_cost(input=inference, label=label) + return cost + + mnist = mnist_network() + + :param inputs: input dictionary for this neural network. The key of this + dictionary is wrapped method parameter. Value is data type. 
+ :param opt_kwargs: Other arguments of this method are passed to optimizers. + :return: NetworkConfig Class. + :rtype: class + """ + + def __impl__(func): + class NetworkConfigImpl(NetworkConfig): + def __init__(self): + NetworkConfig.__init__(self) + self.__inputs__ = inputs + self.__network_graph__ = None + self.__optimize_graph__ = None + + def input_order(self): + return inputs.keys() + + def input_types(self): + return self.__inputs__ + + def network_graph(self): + if self.__network_graph__ is None: + + def __network_graph_func__(): + kwargs = dict() + lst = list() + for k in inputs: + v = inputs[k] + data = data_layer(name=k, size=v.dim) + kwargs[k] = data + lst.append(data) + ipts(*lst) + rst = func(**kwargs) + if not isinstance(rst, tuple): + rst = (rst, ) + outputs(*rst) + + self.__network_graph__ = parse_network_config( + __network_graph_func__) + return self.__network_graph__ + + def optimize_graph(self): + if self.__optimize_graph__ is None: + + def __optimize_graph_func__(): + settings(**opt_kwargs) + + self.__optimize_graph__ = parse_optimizer_config( + __optimize_graph_func__) + return self.__optimize_graph__ + + return NetworkConfigImpl + + return __impl__ diff --git a/paddle/scripts/docker/00_sshd b/paddle/scripts/docker/00_sshd new file mode 100755 index 00000000000000..ab19d0b5caed92 --- /dev/null +++ b/paddle/scripts/docker/00_sshd @@ -0,0 +1,2 @@ +#!/bin/bash +/usr/sbin/sshd -D diff --git a/paddle/scripts/docker/01_jupyter b/paddle/scripts/docker/01_jupyter new file mode 100755 index 00000000000000..c013dc8e18d5b5 --- /dev/null +++ b/paddle/scripts/docker/01_jupyter @@ -0,0 +1,2 @@ +#!/bin/bash +jupyter notebook /notes/ diff --git a/paddle/scripts/docker/Dockerfile b/paddle/scripts/docker/Dockerfile index b01de499bd1fbc..4269f7fb33ca6e 100644 --- a/paddle/scripts/docker/Dockerfile +++ b/paddle/scripts/docker/Dockerfile @@ -15,7 +15,7 @@ RUN apt-get update \ && apt-get clean -y RUN cd /usr/src/gtest && cmake . 
&& make && cp *.a /usr/lib RUN pip install -U BeautifulSoup docopt PyYAML pillow \ - sphinx sphinx_rtd_theme recommonmark + sphinx sphinx_rtd_theme recommonmark jupyter ARG WITH_AVX ARG WITH_DOC @@ -43,4 +43,16 @@ RUN echo 'root:root' | chpasswd RUN sed -ri 's/^PermitRootLogin\s+.*/PermitRootLogin yes/' /etc/ssh/sshd_config RUN sed -ri 's/UsePAM yes/#UsePAM yes/g' /etc/ssh/sshd_config EXPOSE 22 -CMD ["/usr/sbin/sshd", "-D"] + +# Jupyter Notebook directory. +RUN mkdir /notes/ +WORKDIR "/notes" +EXPOSE 8888 + +RUN mkdir -p /opt/run +COPY ./paddle/scripts/docker/jupyter_notebook_config.py /root/.jupyter/ +COPY ./paddle/scripts/docker/00_sshd /opt/run/ +COPY ./paddle/scripts/docker/01_jupyter /opt/run/ +COPY ./paddle/scripts/docker/run_all /opt/bin/ + +CMD ["/opt/bin/run_all"] diff --git a/paddle/scripts/docker/Dockerfile.gpu b/paddle/scripts/docker/Dockerfile.gpu index a68cc79b84271c..7ab1bdba8123d7 100644 --- a/paddle/scripts/docker/Dockerfile.gpu +++ b/paddle/scripts/docker/Dockerfile.gpu @@ -15,7 +15,7 @@ RUN apt-get update \ && apt-get clean -y RUN cd /usr/src/gtest && cmake . && make && cp *.a /usr/lib RUN pip install -U BeautifulSoup docopt PyYAML pillow \ - sphinx sphinx_rtd_theme recommonmark + sphinx sphinx_rtd_theme recommonmark jupyter ARG WITH_AVX ARG WITH_DOC @@ -43,4 +43,16 @@ RUN echo 'root:root' | chpasswd RUN sed -ri 's/^PermitRootLogin\s+.*/PermitRootLogin yes/' /etc/ssh/sshd_config RUN sed -ri 's/UsePAM yes/#UsePAM yes/g' /etc/ssh/sshd_config EXPOSE 22 -CMD ["/usr/sbin/sshd", "-D"] + +# Jupyter Notebook directory. 
+RUN mkdir /notes/ +WORKDIR "/notes" +EXPOSE 8888 + +RUN mkdir -p /opt/run +COPY ./paddle/scripts/docker/jupyter_notebook_config.py /root/.jupyter/ +COPY ./paddle/scripts/docker/00_sshd /opt/run/ +COPY ./paddle/scripts/docker/01_jupyter /opt/run/ +COPY ./paddle/scripts/docker/run_all /opt/bin/ + +CMD ["/opt/bin/run_all"] diff --git a/paddle/scripts/docker/build.sh b/paddle/scripts/docker/build.sh index ca3f1c3f1896fe..7edba3dd09cdc5 100755 --- a/paddle/scripts/docker/build.sh +++ b/paddle/scripts/docker/build.sh @@ -43,5 +43,7 @@ cp -rv /woboq/data $WOBOQ_OUT/../data -o $WOBOQ_OUT \ -p paddle:/paddle /woboq/indexgenerator/codebrowser_indexgenerator $WOBOQ_OUT - +cd /woboq +make clean +rm -rf /paddle/build trap : 0 diff --git a/paddle/scripts/docker/jupyter_notebook_config.py b/paddle/scripts/docker/jupyter_notebook_config.py new file mode 100644 index 00000000000000..9b615507dd26c1 --- /dev/null +++ b/paddle/scripts/docker/jupyter_notebook_config.py @@ -0,0 +1,27 @@ +# from https://github.com/tensorflow/tensorflow/blob/master/tensorflow/tools/docker/jupyter_notebook_config.py +# Copyright 2015 The TensorFlow Authors. All Rights Reserved. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# ============================================================================== +import os +from IPython.lib import passwd + +c.NotebookApp.ip = '*' +c.NotebookApp.port = int(os.getenv('PORT', 8888)) +c.NotebookApp.open_browser = False +c.MultiKernelManager.default_kernel_name = 'python2' + +# sets a password if PASSWORD is set in the environment +if 'PASSWORD' in os.environ: + c.NotebookApp.password = passwd(os.environ['PASSWORD']) + del os.environ['PASSWORD'] diff --git a/paddle/scripts/docker/run_all b/paddle/scripts/docker/run_all new file mode 100755 index 00000000000000..ed69e4123f920f --- /dev/null +++ b/paddle/scripts/docker/run_all @@ -0,0 +1,13 @@ +#!/bin/bash +# http://stackoverflow.com/questions/18805073/docker-multiple-entrypoints + +LOG=/var/log/all + +touch $LOG + +for a in /opt/run/* +do + $a >> $LOG & +done + +tail -f $LOG diff --git a/paddle/setup.py.in b/paddle/setup.py.in index 464ad632868bd1..9c07e191eb6a87 100644 --- a/paddle/setup.py.in +++ b/paddle/setup.py.in @@ -64,7 +64,7 @@ setup(name="py_paddle", extra_compile_args = extra_comps ) ], - packages=['py_paddle'], + packages=['py_paddle', 'py_paddle.trainer'], include_dirs = include_dirs, install_requires = [ 'numpy>=1.8.0', # The numpy is required.
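The `RunnerItem` classes in `items.py` above thread every event (`initialize`, `on_pass_begin`, `on_batch_end`, ...) through a `next_callback` chain, so each item can wrap work both before and after the rest of the chain runs, and a `True` return from `on_batch_end` asks the runner to stop. A minimal, self-contained sketch of that chaining pattern (illustrative only; `Item`, `CounterItem`, `StopAfter`, and this toy `Runner` are made up for the example and are not the Paddle API):

```python
class Item(object):
    """Base item: by default just delegate to the rest of the chain."""

    def on_batch_end(self, next_callback):
        return next_callback()


class CounterItem(Item):
    """Counts batches after the inner items have run (like Counter above)."""

    def __init__(self):
        self.batch_id = 0

    def on_batch_end(self, next_callback):
        exit_flag = next_callback()  # run the rest of the chain first
        self.batch_id += 1           # then do this item's own work
        return exit_flag


class StopAfter(Item):
    """Returns True after n batches, which asks the runner to stop."""

    def __init__(self, n):
        self.n = n

    def on_batch_end(self, next_callback):
        self.n -= 1
        next_callback()
        return self.n <= 0


class Runner(object):
    """Folds its items into one nested callback chain and runs it."""

    def __init__(self):
        self.items = []

    def add_item(self, item):
        self.items.append(item)

    def run(self):
        def chain(idx):
            # innermost callback simply reports "do not stop"
            if idx == len(self.items):
                return lambda: False
            return lambda: self.items[idx].on_batch_end(chain(idx + 1))

        step = chain(0)
        while not step():
            pass


counter = CounterItem()
runner = Runner()
runner.add_item(counter)
runner.add_item(StopAfter(3))
runner.run()
print(counter.batch_id)  # -> 3
```

Because every item sees both sides of `next_callback`, ordering matters: `CounterItem` increments only after the inner `StopAfter` has voted on the exit flag, which mirrors how `Counter` and `BasicLocalParameterUpdaterOps` cooperate in the real item list.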
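`NaiveMemPooledDataProvider` materializes its generator into an in-memory pool, optionally shuffles it, and then slices fixed-size batches until the pool is exhausted, with a possibly smaller final batch. A stripped-down sketch of that batching arithmetic (the class name `MemPooledBatches` is invented for the example, and the swig `DataProviderConverter` step is omitted):

```python
import random


class MemPooledBatches(object):
    """Illustrative stand-in: pool a generator's output, shuffle,
    then yield (batch, batch_size) pairs until StopIteration."""

    def __init__(self, method, batch_size, should_shuffle=True):
        self.method = method          # no-arg callable returning an iterable
        self.batch_size = batch_size
        self.should_shuffle = should_shuffle
        self.pool = []
        self.idx = 0

    def reset(self):
        self.pool = list(self.method())
        if self.should_shuffle:
            random.shuffle(self.pool)
        self.idx = 0

    def next(self):
        if self.idx >= len(self.pool):
            raise StopIteration
        # the last batch may be smaller than batch_size
        end = min(self.idx + self.batch_size, len(self.pool))
        batch = self.pool[self.idx:end]
        self.idx = end
        return batch, len(batch)


provider = MemPooledBatches(lambda: range(10), batch_size=4,
                            should_shuffle=False)
provider.reset()
sizes = []
try:
    while True:
        batch, n = provider.next()
        sizes.append(n)
except StopIteration:
    pass
print(sizes)  # -> [4, 4, 2]
```

This also shows why `BasicDataProviderOps.on_batch_begin` treats `StopIteration` as the end-of-pass signal: the provider itself has no notion of passes, only of an exhausted pool.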
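The `@network` decorator in `network.py` uses a less common idiom: the decorator returns a *class*, so the decorated function name becomes a `NetworkConfig` subclass and calling it instantiates a fresh configuration. A toy sketch of just that idiom (this `Config` class and its `describe` method are made up for illustration; the real decorator additionally builds the protobuf graphs):

```python
def network(inputs, **opt_kwargs):
    """Wrap a function into a configuration *class* (illustrative only)."""

    def __impl__(func):
        class Config(object):
            def input_order(self):
                # order of the data layers, taken from the inputs dict
                return list(inputs.keys())

            def input_types(self):
                return inputs

            def describe(self):
                # expose what was captured from the decorated function
                return func.__name__, opt_kwargs

        return Config

    return __impl__


@network(inputs={'x': 'dense_vector(1)', 'y': 'dense_vector(1)'},
         learning_rate=1e-3)
def linear_network(x, y):
    pass


net = linear_network()  # the decorated name is now a class; this instantiates
print(net.input_order())  # -> ['x', 'y']
```

One caveat of the real implementation worth noting: `input_order()` returns `inputs.keys()`, so the data-layer order depends on dict ordering; callers that care about the gradient machine's argument order should pass inputs in a deterministic order.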