{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Chinese Text Sentiment Analysis with paddlenlp and FastNLP\n",
"\n",
"This tutorial is part of the **`paddle examples` series of the `FastNLP v0.8` tutorials**. It shows how to combine the `paddlenlp` natural language processing library with `FastNLP` to solve a fairly simple sentiment-analysis task.\n",
"\n",
"1. Background: the PaddlePaddle NLP library ``paddlenlp`` and the language-understanding framework ``ERNIE``\n",
"\n",
"2. Preparation: processing the data with a ``tokenizer`` and building ``dataloader``s\n",
"\n",
"3. Training: loading a pretrained ``ERNIE`` model and training it with ``FastNLP``"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 1. Background: the PaddlePaddle NLP library paddlenlp and the language-understanding framework ERNIE\n",
"\n",
"#### 1.1 The PaddlePaddle NLP library paddlenlp\n",
"\n",
"``paddlenlp`` is a natural language processing library developed by Baidu on top of ``PaddlePaddle``. It bundles a number of datasets and NLP models, including Baidu's own language-understanding framework ``ERNIE``. In this tutorial we build on ``paddlenlp`` and use an ``ERNIE`` model to solve a Chinese sentiment-analysis task."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"2.3.3\n"
]
}
],
"source": [
"import sys\n",
"sys.path.append(\"../\")\n",
"\n",
"import paddle\n",
"import paddlenlp\n",
"from paddlenlp.transformers import AutoTokenizer\n",
"from paddlenlp.transformers import AutoModelForSequenceClassification\n",
"\n",
"print(paddlenlp.__version__)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 1.2 The language-understanding framework ERNIE\n",
"\n",
"``ERNIE (Enhanced Representation through kNowledge IntEgration)`` is Baidu's knowledge-enhanced, continual-learning framework for language understanding; by now it covers a family of pretrained models such as ``ERNIE 2.0``, ``ERNIE 3.0``, ``ERNIE-M`` and ``ERNIE-tiny``. ``ERNIE 1.0`` uses a ``Transformer Encoder`` as the backbone of its representations and introduces two improved ``mask`` strategies, one based on **phrases** and one based on **entities** (person names, organizations, etc.). In ``ERNIE``, a phrase or entity made up of several characters is treated as a single unit and masked as a whole during training, which lets the model implicitly learn knowledge dependencies and longer-range semantic dependencies and therefore generalize better.\n",
"\n",
"<img src=\"./figures/paddle-ernie-1.0-masking.png\" width=\"50%\" height=\"50%\" align=\"center\"></img>\n",
"\n",
"<img src=\"./figures/paddle-ernie-1.0-masking-levels.png\" width=\"70%\" height=\"70%\" align=\"center\"></img>\n",
"\n",
"``ERNIE 2.0`` introduces continual learning (``Continual Learning``): the model is first initialized with a simple task, and the parameters learned on one task are used to initialize the model for the next task. When training on a new task the model therefore retains what it has already learned, which improves its performance on the new task. ``ERNIE 2.0`` builds pre-training tasks at the lexical, syntactic and semantic levels, distinguishes them with different task ids, and achieved SOTA results on a total of 16 Chinese and English tasks.\n",
"\n",
"<img src=\"./figures/paddle-ernie-2.0-continual-pretrain.png\" width=\"70%\" height=\"70%\" align=\"center\"></img>\n",
"\n",
"``ERNIE 3.0`` fuses auto-regressive and auto-encoding networks in a single pre-training setup. The auto-encoding network follows ``ERNIE 2.0``'s multi-task learning to build pre-training tasks incrementally for continued language understanding, and adds knowledge-enhanced pre-training tasks; the auto-regressive network is based on ``Transformer-XL`` and supports language modeling over long texts. ``ERNIE 3.0`` achieved SOTA results on many natural language processing tasks.\n",
"\n",
"<img src=\"./figures/paddle-ernie-3.0-framework.png\" width=\"50%\" height=\"50%\" align=\"center\"></img>\n",
"\n",
"Next, we show how to use the ``paddle``-based ``ERNIE 1.0`` model in ``FastNLP`` for Chinese sentiment analysis."
]
},
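{
"cell_type": "markdown",
"metadata": {},
"source": [
"The difference between character-level masking and ERNIE's phrase/entity-level masking can be illustrated with a small, self-contained sketch (plain Python only; the token list and the entity span are made up for illustration, and this is not the actual ERNIE pre-training code):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import random\n",
"\n",
"random.seed(0)\n",
"\n",
"# a toy tokenized sentence; the first three characters form one entity (a city name)\n",
"tokens = [\"哈\", \"尔\", \"滨\", \"是\", \"黑\", \"龙\", \"江\", \"的\", \"省\", \"会\"]\n",
"entity_span = (0, 3)\n",
"\n",
"# BERT-style basic masking: every character is masked independently\n",
"basic = [t if random.random() > 0.15 else \"[MASK]\" for t in tokens]\n",
"\n",
"# ERNIE-style masking: the whole entity is masked as one unit\n",
"entity_level = list(tokens)\n",
"for i in range(*entity_span):\n",
"    entity_level[i] = \"[MASK]\"\n",
"\n",
"print(\"basic :\", basic)\n",
"print(\"entity:\", entity_level)"
]
},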
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2. Processing the data with a tokenizer and building dataloaders\n",
"\n",
"#### 2.1 Loading the Chinese dataset ChnSentiCorp\n",
"\n",
"``ChnSentiCorp`` is a Chinese sentence-level sentiment analysis dataset released by the Chinese Academy of Sciences. It contains reviews collected from the web in several domains such as hotels, movies and books, and every review carries one of two labels: negative (``0``) or positive (``1``), which makes it suitable for binary Chinese sentiment classification. With ``paddlenlp.datasets.load_dataset`` we can load the ``ChnSentiCorp`` splits and inspect their contents."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"训练集大小: 9600\n",
"{'text': '选择珠江花园的原因就是方便,有电动扶梯直接到达海边,周围餐馆、食廊、商场、超市、摊位一应俱全。酒店装修一般,但还算整洁。 泳池在大堂的屋顶,因此很小,不过女儿倒是喜欢。 包的早餐是西式的,还算丰富。 服务吗,一般', 'label': 1, 'qid': ''}\n",
"{'text': '15.4寸笔记本的键盘确实爽,基本跟台式机差不多了,蛮喜欢数字小键盘,输数字特方便,样子也很美观,做工也相当不错', 'label': 1, 'qid': ''}\n",
"{'text': '房间太小。其他的都一般。。。。。。。。。', 'label': 0, 'qid': ''}\n"
]
}
],
"source": [
"from paddlenlp.datasets import load_dataset\n",
"\n",
"train_dataset, val_dataset, test_dataset = load_dataset(\"chnsenticorp\", splits=[\"train\", \"dev\", \"test\"])\n",
"print(\"训练集大小:\", len(train_dataset))\n",
"for i in range(3):\n",
"    print(train_dataset[i])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 2.2 Processing the data\n",
"\n",
"As shown above, the raw dataset only contains Chinese text and a label, which the model cannot consume directly. Just as in English text classification, we need a ``tokenizer`` to split the text and convert it into numeric ids. We load the pretrained Chinese tokenizer ``ernie-1.0-base-zh``, wrap the tokenization in a function ``_process``, and then call the dataset's ``map`` function to tokenize every example. In the tokenizer call:\n",
"- ``max_length`` is the maximum sentence length;\n",
"- ``padding=\"max_length\"`` pads shorter results up to that maximum length;\n",
"- ``truncation=True`` truncates sentences that are too long.\n",
"\n",
"After this step, every example in the dataset has the same length."
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m[2022-06-22 21:31:04,168] [ INFO]\u001b[0m - We are using <class 'paddlenlp.transformers.ernie.tokenizer.ErnieTokenizer'> to load 'ernie-1.0-base-zh'.\u001b[0m\n",
"\u001b[32m[2022-06-22 21:31:04,171] [ INFO]\u001b[0m - Already cached /remote-home/shxing/.paddlenlp/models/ernie-1.0-base-zh/vocab.txt\u001b[0m\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'text': '选择珠江花园的原因就是方便,有电动扶梯直接到达海边,周围餐馆、食廊、商场、超市、摊位一应俱全。酒店装修一般,但还算整洁。 泳池在大堂的屋顶,因此很小,不过女儿倒是喜欢。 包的早餐是西式的,还算丰富。 服务吗,一般', 'label': 1, 'qid': '', 'input_ids': [1, 352, 790, 1252, 409, 283, 509, 5, 250, 196, 113, 10, 58, 518, 4, 9, 128, 70, 1495, 1855, 339, 293, 45, 302, 233, 554, 4, 544, 637, 1134, 774, 6, 494, 2068, 6, 278, 191, 6, 634, 99, 6, 2678, 144, 7, 149, 1573, 62, 12043, 661, 737, 371, 435, 7, 689, 4, 255, 201, 559, 407, 1308, 12043, 2275, 1110, 11, 19, 842, 5, 1207, 878, 4, 196, 198, 321, 96, 4, 16, 93, 291, 464, 1099, 10, 692, 811, 12043, 392, 5, 748, 1134, 10, 213, 220, 5, 4, 201, 559, 723, 595, 12043, 231, 112, 1114, 4, 7, 689, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'token_type_ids': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'attention_mask': [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}\n"
]
}
],
"source": [
"max_len = 128\n",
"model_checkpoint = \"ernie-1.0-base-zh\"\n",
"tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)\n",
"def _process(data):\n",
"    data.update(tokenizer(\n",
"        data[\"text\"],\n",
"        max_length=max_len,\n",
"        padding=\"max_length\",\n",
"        truncation=True,\n",
"        return_attention_mask=True,\n",
"    ))\n",
"    return data\n",
"\n",
"train_dataset.map(_process, num_workers=5)\n",
"val_dataset.map(_process, num_workers=5)\n",
"test_dataset.map(_process, num_workers=5)\n",
"\n",
"print(train_dataset[0])"
]
},
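{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check of the tokenized data, the ``input_ids`` can be mapped back to tokens (a minimal sketch; it assumes the tokenizer exposes the usual ``convert_ids_to_tokens`` method):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sample = train_dataset[0]\n",
"tokens = tokenizer.convert_ids_to_tokens(sample[\"input_ids\"])\n",
"\n",
"print(len(sample[\"input_ids\"]))  # should equal max_len\n",
"print(tokens[:10])               # [CLS] followed by the first characters of the review\n",
"print(tokens[-5:])               # trailing [PAD] tokens from padding=\"max_length\""
]
},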
{
"cell_type": "markdown",
"metadata": {},
"source": [
"With the processed datasets in hand, we can wrap them in a ``PaddleDataLoader`` for the training that follows. The ``PaddleDataLoader`` provided by ``FastNLP`` extends ``paddle.io.DataLoader``; see the corresponding documentation for details."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [],
"source": [
"from fastNLP.core import PaddleDataLoader\n",
"import paddle.nn as nn\n",
"\n",
"train_dataloader = PaddleDataLoader(train_dataset, batch_size=32, shuffle=True)\n",
"val_dataloader = PaddleDataLoader(val_dataset, batch_size=32, shuffle=False)\n",
"test_dataloader = PaddleDataLoader(test_dataset, batch_size=1, shuffle=False)"
]
},
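{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before defining the model, it can help to peek at a single batch to see what the dataloader yields (a minimal sketch; the exact set of keys simply mirrors the fields created by ``_process`` above):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"batch = next(iter(train_dataloader))\n",
"for key, value in batch.items():\n",
"    # numeric fields are batched into tensors of shape [batch_size, max_len]\n",
"    shape = value.shape if hasattr(value, \"shape\") else len(value)\n",
"    print(key, type(value).__name__, shape)"
]
},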
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3. Training: loading a pretrained ERNIE model and training it with FastNLP\n",
"\n",
"#### 3.1 Using a pretrained ERNIE model\n",
"\n",
"To perform text classification we first define the classification model. ``paddlenlp.transformers`` provides ``AutoModelForSequenceClassification``, which can load sequence-classification models with different pretrained weights. In ``FastNLP``, we define ``train_step`` and ``evaluate_step`` methods to specify the model's behaviour during training and evaluation respectively.\n",
"\n",
"- ``train_step`` takes the returned ``logits`` (of shape ``(batch_size, num_labels)``), computes the cross-entropy loss with ``CrossEntropyLoss``, and returns the ``loss`` in a dictionary. ``FastNLP`` also accepts a ``dataclass`` as the training result, but either way it must contain a key or member named **``loss``**.\n",
"- ``evaluate_step`` returns the ``logits`` together with the gold ``label`` in a dictionary.\n",
"\n",
"The parameters of both methods are a subset of the dictionary **keys** in the dataset; ``FastNLP`` matches them automatically and feeds the corresponding fields into the model."
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"\u001b[32m[2022-06-22 21:31:15,577] [ INFO]\u001b[0m - We are using <class 'paddlenlp.transformers.ernie.modeling.ErnieForSequenceClassification'> to load 'ernie-1.0-base-zh'.\u001b[0m\n",
"\u001b[32m[2022-06-22 21:31:15,580] [ INFO]\u001b[0m - Already cached /remote-home/shxing/.paddlenlp/models/ernie-1.0-base-zh/ernie_v1_chn_base.pdparams\u001b[0m\n"
]
}
],
"source": [
"import paddle.nn as nn\n",
"\n",
"class SeqClsModel(nn.Layer):\n",
"    def __init__(self, model_checkpoint, num_labels):\n",
"        super(SeqClsModel, self).__init__()\n",
"        self.model = AutoModelForSequenceClassification.from_pretrained(\n",
"            model_checkpoint,\n",
"            num_classes=num_labels,\n",
"        )\n",
"\n",
"    def forward(self, input_ids, attention_mask, token_type_ids):\n",
"        logits = self.model(input_ids, attention_mask=attention_mask, token_type_ids=token_type_ids)\n",
"        return logits\n",
"\n",
"    def train_step(self, input_ids, attention_mask, token_type_ids, label):\n",
"        logits = self(input_ids, attention_mask, token_type_ids)\n",
"        loss = nn.CrossEntropyLoss()(logits, label)\n",
"        return {\"loss\": loss}\n",
"\n",
"    def evaluate_step(self, input_ids, attention_mask, token_type_ids, label):\n",
"        logits = self(input_ids, attention_mask, token_type_ids)\n",
"        return {'pred': logits, 'target': label}\n",
"\n",
"model = SeqClsModel(model_checkpoint, num_labels=2)"
]
},
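{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before handing the model to the ``Trainer``, a single forward pass on one batch is a cheap way to confirm that the shapes line up (a minimal sketch; it assumes the batches produced above already contain paddle tensors):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"batch = next(iter(train_dataloader))\n",
"logits = model(batch[\"input_ids\"], batch[\"attention_mask\"], batch[\"token_type_ids\"])\n",
"print(logits.shape)  # expected: [32, 2], i.e. (batch_size, num_labels)\n",
"\n",
"out = model.train_step(batch[\"input_ids\"], batch[\"attention_mask\"], batch[\"token_type_ids\"], batch[\"label\"])\n",
"print(out[\"loss\"])   # a scalar cross-entropy loss"
]
},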
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 3.2 Setting the hyperparameters and training with Trainer\n",
"\n",
"We are now ready to train with ``FastNLP``'s ``Trainer``.\n",
"\n",
"First, to train the ``ERNIE`` model efficiently it pays to put the learning rate on a schedule. ``LinearDecayWithWarmup`` from ``paddlenlp`` increases the learning rate linearly from 0 during a warmup phase and then decays it linearly back to 0. In this tutorial we set the peak learning rate to ``5e-5`` and the warmup proportion to ``0.1``, and pass the resulting ``lr_scheduler`` to the ``AdamW`` optimizer.\n",
"\n",
"Second, we can hand the ``Trainer`` a number of ``Callback``s to customize behaviour beyond the basic training loop. In this tutorial we use the following two:\n",
"\n",
"- ``LRSchedCallback`` - since we use a ``Scheduler``, the ``lr_scheduler`` must be passed to this ``Callback`` so that it gets stepped during training.\n",
"- ``LoadBestModelCallback`` - this ``Callback`` monitors the ``'acc#accuracy'`` value in the evaluation results, saves the checkpoint with the highest accuracy seen during training, and loads it back into the model when training ends, which makes later testing and evaluation convenient.\n",
"\n",
"The ``Trainer`` also takes ``metrics`` to measure how well the model is doing. ``Accuracy`` computes the fraction of correct predictions from the predictions and gold labels it receives. Recall the return value of the model's ``evaluate_step``: the keys ``pred`` and ``target`` are exactly the parameter names of ``Accuracy.update``, so during evaluation ``FastNLP`` automatically matches the keys to the parameters and computes the accuracy; this is why the model is required to return a dictionary.\n",
"\n",
"``Accuracy`` reports three values: ``acc``, ``total`` and ``correct``, i.e. the accuracy, the number of evaluated examples and the number of correct predictions, which gives a direct view of how the model evolves during training. The monitor ``'acc#accuracy'`` passed to ``LoadBestModelCallback`` refers precisely to the ``acc`` result of the ``accuracy`` metric.\n",
"\n",
"Once everything is configured, calling ``run`` starts training and evaluation."
]
},
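{
"cell_type": "markdown",
"metadata": {},
"source": [
"The warmup-then-decay shape of ``LinearDecayWithWarmup`` can be seen by stepping a standalone scheduler a few times (a minimal sketch with a made-up total of 100 steps, independent of the Trainer below):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from paddlenlp.transformers import LinearDecayWithWarmup\n",
"\n",
"# peak lr 5e-5, 100 training steps, the first 10% of them used for linear warmup\n",
"demo_scheduler = LinearDecayWithWarmup(5e-5, 100, 0.1)\n",
"for step in range(1, 101):\n",
"    demo_scheduler.step()\n",
"    if step in (1, 5, 10, 50, 100):\n",
"        print(step, demo_scheduler.get_lr())"
]
},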
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"color: #7fbfbf; text-decoration-color: #7fbfbf\">[21:31:16] </span><span style=\"color: #000080; text-decoration-color: #000080\">INFO </span> Running evaluator sanity check for <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">2</span> batches. <a href=\"file://../fastNLP/core/controllers/trainer.py\"><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">trainer.py</span></a><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">:</span><a href=\"file://../fastNLP/core/controllers/trainer.py#631\"><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">631</span></a>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[2;36m[21:31:16]\u001b[0m\u001b[2;36m \u001b[0m\u001b[34mINFO \u001b[0m Running evaluator sanity check for \u001b[1;36m2\u001b[0m batches. \u001b]8;id=4641;file://../fastNLP/core/controllers/trainer.py\u001b\\\u001b[2mtrainer.py\u001b[0m\u001b]8;;\u001b\\\u001b[2m:\u001b[0m\u001b]8;id=822054;file://../fastNLP/core/controllers/trainer.py#631\u001b\\\u001b[2m631\u001b[0m\u001b]8;;\u001b\\\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"></pre>\n"
|
||
],
|
||
"text/plain": []
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">60</span> -----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m0\u001b[0m, Batch:\u001b[1;36m60\u001b[0m -----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.895833</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1075.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.895833\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1075.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">120</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m0\u001b[0m, Batch:\u001b[1;36m120\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.8975</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1077.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.8975\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1077.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">180</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m0\u001b[0m, Batch:\u001b[1;36m180\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.911667</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1094.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.911667\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1094.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">240</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m0\u001b[0m, Batch:\u001b[1;36m240\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.9225</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1107.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.9225\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1107.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">300</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m0\u001b[0m, Batch:\u001b[1;36m300\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.9275</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1113.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.9275\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1113.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">60</span> -----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m1\u001b[0m, Batch:\u001b[1;36m60\u001b[0m -----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.930833</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1117.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.930833\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1117.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">120</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m1\u001b[0m, Batch:\u001b[1;36m120\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.935833</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1123.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.935833\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1123.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">180</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m1\u001b[0m, Batch:\u001b[1;36m180\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.935833</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1123.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.935833\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1123.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">240</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m1\u001b[0m, Batch:\u001b[1;36m240\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.9375</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1125.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.9375\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1125.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">---------------------------- Eval. results on Epoch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1</span>, Batch:<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">300</span> ----------------------------\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"---------------------------- Eval. results on Epoch:\u001b[1;36m1\u001b[0m, Batch:\u001b[1;36m300\u001b[0m ----------------------------\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">{</span>\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"acc#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.941667</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"total#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1200.0</span>,\n",
|
||
" <span style=\"color: #000080; text-decoration-color: #000080; font-weight: bold\">\"correct#accuracy\"</span>: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">1130.0</span>\n",
|
||
"<span style=\"font-weight: bold\">}</span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[1m{\u001b[0m\n",
|
||
" \u001b[1;34m\"acc#accuracy\"\u001b[0m: \u001b[1;36m0.941667\u001b[0m,\n",
|
||
" \u001b[1;34m\"total#accuracy\"\u001b[0m: \u001b[1;36m1200.0\u001b[0m,\n",
|
||
" \u001b[1;34m\"correct#accuracy\"\u001b[0m: \u001b[1;36m1130.0\u001b[0m\n",
|
||
"\u001b[1m}\u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"color: #7fbfbf; text-decoration-color: #7fbfbf\">[21:34:28] </span><span style=\"color: #000080; text-decoration-color: #000080\">INFO </span> Loading best model from fnlp-ernie/<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">2022</span>-<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0</span> <a href=\"file://../fastNLP/core/callbacks/load_best_model_callback.py\"><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">load_best_model_callback.py</span></a><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">:</span><a href=\"file://../fastNLP/core/callbacks/load_best_model_callback.py#111\"><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">111</span></a>\n",
|
||
"<span style=\"color: #7fbfbf; text-decoration-color: #7fbfbf\"> </span> <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">6</span>-<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">22</span>-21_29_12_898095/best_so_far with <span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\"> </span>\n",
|
||
"<span style=\"color: #7fbfbf; text-decoration-color: #7fbfbf\"> </span> acc#accuracy: <span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">0.941667</span><span style=\"color: #808000; text-decoration-color: #808000\">...</span> <span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\"> </span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[2;36m[21:34:28]\u001b[0m\u001b[2;36m \u001b[0m\u001b[34mINFO \u001b[0m Loading best model from fnlp-ernie/\u001b[1;36m2022\u001b[0m-\u001b[1;36m0\u001b[0m \u001b]8;id=340364;file://../fastNLP/core/callbacks/load_best_model_callback.py\u001b\\\u001b[2mload_best_model_callback.py\u001b[0m\u001b]8;;\u001b\\\u001b[2m:\u001b[0m\u001b]8;id=763898;file://../fastNLP/core/callbacks/load_best_model_callback.py#111\u001b\\\u001b[2m111\u001b[0m\u001b]8;;\u001b\\\n",
|
||
"\u001b[2;36m \u001b[0m \u001b[1;36m6\u001b[0m-\u001b[1;36m22\u001b[0m-21_29_12_898095/best_so_far with \u001b[2m \u001b[0m\n",
|
||
"\u001b[2;36m \u001b[0m acc#accuracy: \u001b[1;36m0.941667\u001b[0m\u001b[33m...\u001b[0m \u001b[2m \u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"color: #7fbfbf; text-decoration-color: #7fbfbf\">[21:34:34] </span><span style=\"color: #000080; text-decoration-color: #000080\">INFO </span> Deleting fnlp-ernie/<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">2022</span>-<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">06</span>-<span style=\"color: #008080; text-decoration-color: #008080; font-weight: bold\">22</span>-21_29_12_8 <a href=\"file://../fastNLP/core/callbacks/load_best_model_callback.py\"><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">load_best_model_callback.py</span></a><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">:</span><a href=\"file://../fastNLP/core/callbacks/load_best_model_callback.py#131\"><span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\">131</span></a>\n",
|
||
"<span style=\"color: #7fbfbf; text-decoration-color: #7fbfbf\"> </span> 98095/best_so_far<span style=\"color: #808000; text-decoration-color: #808000\">...</span> <span style=\"color: #7f7f7f; text-decoration-color: #7f7f7f\"> </span>\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\u001b[2;36m[21:34:34]\u001b[0m\u001b[2;36m \u001b[0m\u001b[34mINFO \u001b[0m Deleting fnlp-ernie/\u001b[1;36m2022\u001b[0m-\u001b[1;36m06\u001b[0m-\u001b[1;36m22\u001b[0m-21_29_12_8 \u001b]8;id=430330;file://../fastNLP/core/callbacks/load_best_model_callback.py\u001b\\\u001b[2mload_best_model_callback.py\u001b[0m\u001b]8;;\u001b\\\u001b[2m:\u001b[0m\u001b]8;id=508566;file://../fastNLP/core/callbacks/load_best_model_callback.py#131\u001b\\\u001b[2m131\u001b[0m\u001b]8;;\u001b\\\n",
|
||
"\u001b[2;36m \u001b[0m 98095/best_so_far\u001b[33m...\u001b[0m \u001b[2m \u001b[0m\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"></pre>\n"
|
||
],
|
||
"text/plain": []
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
}
],
"source": [
"from fastNLP import LRSchedCallback, LoadBestModelCallback\n",
"from fastNLP import Trainer, Accuracy\n",
"from paddlenlp.transformers import LinearDecayWithWarmup\n",
"\n",
"n_epochs = 2\n",
"num_training_steps = len(train_dataloader) * n_epochs\n",
"lr_scheduler = LinearDecayWithWarmup(5e-5, num_training_steps, 0.1)\n",
"optimizer = paddle.optimizer.AdamW(\n",
"    learning_rate=lr_scheduler,\n",
"    parameters=model.parameters(),\n",
")\n",
"callbacks = [\n",
"    LRSchedCallback(lr_scheduler, step_on=\"batch\"),\n",
"    LoadBestModelCallback(\"acc#accuracy\", larger_better=True, save_folder=\"fnlp-ernie\"),\n",
"]\n",
"trainer = Trainer(\n",
"    model=model,\n",
"    driver=\"paddle\",\n",
"    optimizers=optimizer,\n",
"    device=0,\n",
"    n_epochs=n_epochs,\n",
"    train_dataloader=train_dataloader,\n",
"    evaluate_dataloaders=val_dataloader,\n",
"    evaluate_every=60,\n",
"    metrics={\"accuracy\": Accuracy()},\n",
"    callbacks=callbacks,\n",
")\n",
"trainer.run()"
]
},
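{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before inspecting individual predictions, the overall test-set accuracy can be computed with a plain ``Evaluator`` and the same ``Accuracy`` metric (a minimal sketch; it reuses ``model`` and ``test_dataloader`` from above and assumes ``Evaluator`` accepts ``metrics`` in the same way as ``Trainer``):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from fastNLP import Evaluator, Accuracy\n",
"\n",
"test_evaluator = Evaluator(\n",
"    model=model,\n",
"    dataloaders=test_dataloader,\n",
"    metrics={\"accuracy\": Accuracy()},\n",
"    driver=\"paddle\",\n",
"    device=0,\n",
")\n",
"test_evaluator.run()"
]
},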
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 3.3 Testing and evaluation\n",
"\n",
"Now that we have a well-performing ``ERNIE`` model, we can check its behaviour on the test set. ``FastNLP``'s ``Evaluator`` also supports fully custom evaluation functions: we initialize an ``Evaluator`` with ``test_dataloader`` and pass our function ``test_batch_step_fn`` as the ``evaluate_batch_step_fn`` argument, so that for each batch the ``Evaluator`` calls our ``test_batch_step_fn`` instead of the model's ``evaluate_step``. Here we only look at 5 examples and print each text with its predicted label."
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">text: ['这个宾馆比较陈旧了,特价的房间也很一般。总体来说一般']\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"text: ['这个宾馆比较陈旧了,特价的房间也很一般。总体来说一般']\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">labels: 0\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"labels: 0\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">text: ['怀着十分激动的心情放映,可是看着看着发现,在放映完毕后,出现一集米老鼠的动画片!开始\n",
|
||
"还怀疑是不是赠送的个别现象,可是后来发现每张DVD后面都有!真不知道生产商怎么想的,我想看的是猫\n",
|
||
"和老鼠,不是米老鼠!如果厂家是想赠送的话,那就全套米老鼠和唐老鸭都赠送,只在每张DVD后面添加一\n",
|
||
"集算什么??简直是画蛇添足!!']\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"text: ['怀着十分激动的心情放映,可是看着看着发现,在放映完毕后,出现一集米老鼠的动画片!开始\n",
|
||
"还怀疑是不是赠送的个别现象,可是后来发现每张DVD后面都有!真不知道生产商怎么想的,我想看的是猫\n",
|
||
"和老鼠,不是米老鼠!如果厂家是想赠送的话,那就全套米老鼠和唐老鸭都赠送,只在每张DVD后面添加一\n",
|
||
"集算什么??简直是画蛇添足!!']\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">labels: 0\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"labels: 0\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">text: ['还稍微重了点,可能是硬盘大的原故,还要再轻半斤就好了。其他要进一步验证。贴的几种膜气\n",
|
||
"泡较多,用不了多久就要更换了,屏幕膜稍好点,但比没有要强多了。建议配赠几张膜让用用户自己贴。'\n",
|
||
"]\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"text: ['还稍微重了点,可能是硬盘大的原故,还要再轻半斤就好了。其他要进一步验证。贴的几种膜气\n",
|
||
"泡较多,用不了多久就要更换了,屏幕膜稍好点,但比没有要强多了。建议配赠几张膜让用用户自己贴。'\n",
|
||
"]\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">labels: 0\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"labels: 0\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">text: ['交通方便;环境很好;服务态度很好 房间较小']\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"text: ['交通方便;环境很好;服务态度很好 房间较小']\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">labels: 1\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"labels: 1\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">text: ['不错,作者的观点很颠覆目前中国父母的教育方式,其实古人们对于教育已经有了很系统的体系\n",
|
||
"了,可是现在的父母以及祖父母们更多的娇惯纵容孩子,放眼看去自私的孩子是大多数,父母觉得自己的\n",
|
||
"孩子在外面只要不吃亏就是好事,完全把古人几千年总结的教育古训抛在的九霄云外。所以推荐准妈妈们\n",
|
||
"可以在等待宝宝降临的时候,好好学习一下,怎么把孩子教育成一个有爱心、有责任心、宽容、大度的人\n",
|
||
"。']\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"text: ['不错,作者的观点很颠覆目前中国父母的教育方式,其实古人们对于教育已经有了很系统的体系\n",
|
||
"了,可是现在的父母以及祖父母们更多的娇惯纵容孩子,放眼看去自私的孩子是大多数,父母觉得自己的\n",
|
||
"孩子在外面只要不吃亏就是好事,完全把古人几千年总结的教育古训抛在的九霄云外。所以推荐准妈妈们\n",
|
||
"可以在等待宝宝降临的时候,好好学习一下,怎么把孩子教育成一个有爱心、有责任心、宽容、大度的人\n",
|
||
"。']\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">labels: 1\n",
|
||
"</pre>\n"
|
||
],
|
||
"text/plain": [
|
||
"labels: 1\n"
|
||
]
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/html": [
|
||
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"></pre>\n"
|
||
],
|
||
"text/plain": []
|
||
},
|
||
"metadata": {},
|
||
"output_type": "display_data"
|
||
},
|
||
{
|
||
"data": {
|
||
"text/plain": [
|
||
"{}"
|
||
]
|
||
},
|
||
"execution_count": 14,
|
||
"metadata": {},
|
||
"output_type": "execute_result"
|
||
}
|
||
],
"source": [
"from fastNLP import Evaluator\n",
"def test_batch_step_fn(evaluator, batch):\n",
"    input_ids = batch[\"input_ids\"]\n",
"    attention_mask = batch[\"attention_mask\"]\n",
"    token_type_ids = batch[\"token_type_ids\"]\n",
"    logits = model(input_ids, attention_mask, token_type_ids)\n",
"    predict = logits.argmax().item()\n",
"    print(\"text:\", batch['text'])\n",
"    print(\"labels:\", predict)\n",
"\n",
"evaluator = Evaluator(\n",
"    model=model,\n",
"    dataloaders=test_dataloader,\n",
"    driver=\"paddle\",\n",
"    device=0,\n",
"    evaluate_batch_step_fn=test_batch_step_fn,\n",
")\n",
"evaluator.run(5)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3.7.13 ('fnlp-paddle')",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.13"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "31f2d9d3efc23c441973d7c4273acfea8b132b6a578f002629b6b44b8f65e720"
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}