awtk/tools/word_gen

Crawl web pages to build a suggestion (predictive-text) dictionary for the input method.

Generating the data

Run the following in this directory:

  • Prepare the dependencies:

npm install

  • Crawl web pages and generate words.json (edit maxURLS in the script to change the maximum number of pages crawled):

node gen_words_json.js

  • Generate the binary words.bin file (edit words.json beforehand if you want to adjust the word list):

node to_words_bin.js
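The maxURLS-style cap mentioned above can be pictured with a small sketch. This is not the actual gen_words_json.js code; crawlAll and fetchPage are hypothetical names, and only the page-count cap is the point:

```javascript
// Hypothetical sketch of how a maxURLS-style cap bounds a crawl.
// crawlAll and fetchPage are illustrative names, not gen_words_json.js APIs.
function crawlAll(seedURLs, fetchPage, maxURLS = 100) {
  const queue = [...seedURLs];
  const visited = new Set();
  const pages = [];
  // Stop once maxURLS pages have been fetched, even if links remain.
  while (queue.length > 0 && visited.size < maxURLS) {
    const url = queue.shift();
    if (visited.has(url)) continue; // skip pages we already fetched
    visited.add(url);
    const { text, links } = fetchPage(url);
    pages.push(text);
    queue.push(...links); // enqueue newly discovered links
  }
  return pages;
}
```

Raising the cap crawls more pages and yields a larger words.json, at the cost of a longer run.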

Using existing data

chinese_with_freq.txt was downloaded from https://github.com/ling0322/webdict.

If you don't want to generate the data yourself, you can use that file directly:

node to_json.js

Updating the data

Run the following from the awtk root directory:

cp tools/word_gen/words.bin demos/assets/default/raw/data/suggest_words_zh_cn.dat

If the target platform has no file system, also run the resource-update script:

python scripts/update_res.py all

Note:

getChunks in node_modules/segment/lib/module/DictTokenizer.js can run out of memory (OOM) on long input.

If you hit this, you can bound the recursion depth and cap chunks.length, e.g. at 5000 as in the patched version below:

// Patch for node_modules/segment/lib/module/DictTokenizer.js.
// Bounds recursion depth and the number of chunk combinations to
// keep the tokenizer from exhausting memory on long sentences.
let getChunksCallsNr = 0;

var getChunks = function (wordpos, pos, text) {
  var words = wordpos[pos] || [];
  var ret = [];

  // Abort when the recursion gets too deep.
  if (getChunksCallsNr > 150) {
    throw new Error('getChunks: recursion too deep');
  }

  getChunksCallsNr++;
  try {
    for (var i = 0; i < words.length; i++) {
      var word = words[i];
      var nextcur = word.c + word.w.length;
      if (!wordpos[nextcur]) {
        ret.push([word]);
      } else {
        var chunks = getChunks(wordpos, nextcur, text);
        // Cap the number of chunk combinations at 5000 to bound memory use.
        for (var j = 0; j < chunks.length && j < 5000; j++) {
          ret.push([word].concat(chunks[j]));
        }
      }
    }
  } finally {
    // Always restore the depth counter, even when a nested call throws.
    getChunksCallsNr--;
  }

  return ret;
};
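The essential pattern in the patch, a shared depth counter that is always restored even when a nested call throws, can be shown in isolation. guarded and MAX_DEPTH are illustrative names, not part of the segment library:

```javascript
// Illustration of the depth-guard pattern used in the patch above:
// abort past a fixed depth, and restore the counter in finally so a
// thrown error does not leave it permanently elevated.
let depth = 0;
const MAX_DEPTH = 150;

function guarded(n) {
  if (depth > MAX_DEPTH) {
    throw new Error('recursion too deep');
  }
  depth++;
  try {
    // Trivial recursive payload: counts down to zero.
    return n <= 0 ? 0 : 1 + guarded(n - 1);
  } finally {
    depth--; // runs on both normal return and throw
  }
}
```

Without the finally block, an aborted deep call would leave the counter above the limit, and every later call would fail immediately.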