1. BERT
A tool called bert-as-service makes it quick to obtain BERT word and sentence vector representations: you simply call the service and it returns the vectors you want.
2. Installing bert-as-service
1) Requirements:
Python >= 3.5, TensorFlow >= 1.10
Local environment: Windows 10, Python 3.7, TensorFlow 1.13.1
Server environment: CentOS, Python 3.6, TensorFlow 1.13.1
2) Install the server and client
Run:
pip install -U bert-serving-server bert-serving-client
(Or, via the Tsinghua PyPI mirror: pip install -i https://pypi.tuna.tsinghua.edu.cn/simple -U bert-serving-server bert-serving-client)
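A quick sanity check that the installed environment meets the requirements above (just a verification sketch; the version numbers are the ones stated earlier):

import sys
import tensorflow as tf
from bert_serving.client import BertClient  # should import without error

print(sys.version_info >= (3, 5))  # True if the Python requirement is met
print(tf.__version__)              # should print >= 1.10 (1.13.1 here)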
3. Starting the BERT service
1) Download a pretrained model
GitHub: https://github.com/google-research/bert/
- BERT-Large, Uncased (Whole Word Masking): 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Large, Cased (Whole Word Masking): 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Base, Uncased: 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Large, Uncased: 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Base, Cased: 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Large, Cased: 24-layer, 1024-hidden, 16-heads, 340M parameters
- BERT-Base, Multilingual Cased (New, recommended): 104 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Base, Multilingual Uncased (Orig, not recommended; use Multilingual Cased instead): 102 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
- BERT-Base, Chinese: Chinese Simplified and Traditional, 12-layer, 768-hidden, 12-heads, 110M parameters
Download the BERT-Base, Chinese model, unzip it, and place it under the root directory.
2) Start the service
After unzipping, run the following command to start the service, replacing the directory with your own unzipped path (-num_worker specifies the number of worker processes, i.e. how many requests can be served concurrently):
bert-serving-start -model_dir /Users/mantch/Downloads/chinese_L-12_H-768_A-12 -num_worker=4
運行后會看到如下結(jié)果:
ckpt_name = bert_model.ckpt
config_name = bert_config.json
cors = *
cpu = False
device_map = []
do_lower_case = True
fixed_embed_length = False
fp16 = False
gpu_memory_fraction = 0.5
graph_tmp_dir = None
http_max_connect = 10
http_port = None
mask_cls_sep = False
max_batch_size = 256
max_seq_len = 25
model_dir = ./chinese_L-12_H-768_A-12
no_position_embeddings = False
no_special_token = False
num_worker = 4
pooling_layer = [-2]
pooling_strategy = REDUCE_MEAN
port = 5555
port_out = 5556
prefetch_size = 10
priority_batch_size = 16
show_tokens_to_client = False
tuned_model_dir = None
verbose = False
xla = False
I:WORKER-1:[__i:gen:559]:ready and listening!
I:WORKER-0:[__i:gen:559]:ready and listening!
I:WORKER-2:[__i:gen:559]:ready and listening!
I:WORKER-3:[__i:gen:559]:ready and listening!
I:VENTILATOR:[__i:_ru:164]:all set, ready to serve request!
port = 5555 and port_out = 5556 are the service's port numbers; the server has started successfully.
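The server can also be started from Python instead of the CLI; the sketch below mirrors the bert-serving-start command above using the BertServer API shipped with bert-serving-server (same model path and flags as before):

from bert_serving.server import BertServer
from bert_serving.server.helper import get_args_parser

# Same flags as the CLI command, passed as argv-style strings
args = get_args_parser().parse_args([
    '-model_dir', '/Users/mantch/Downloads/chinese_L-12_H-768_A-12',
    '-num_worker', '4',
])
server = BertServer(args)
server.start()  # starts the service; workers log "ready and listening!" as above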
3) Call BERT from your own application
from bert_serving.client import BertClient

# Connect to the local server (default ports: 5555 in, 5556 out)
bc = BertClient(ip='localhost', check_version=False, check_length=False)
vec = bc.encode(['學(xué)習(xí)'])  # a (1, 768) ndarray under the default pooling settings
print(vec)
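With the defaults shown in the startup log (pooling_strategy = REDUCE_MEAN over pooling_layer = [-2]), every input sentence is encoded into one 768-dimensional vector, so the encodings can be compared directly. A small illustration using cosine similarity (the example sentences are arbitrary):

import numpy as np
from bert_serving.client import BertClient

bc = BertClient(ip='localhost', check_version=False, check_length=False)
vecs = bc.encode(['今天天氣很好', '今天天氣不錯', '我喜歡打籃球'])  # shape (3, 768)

def cosine(a, b):
    # Cosine similarity between two 1-D vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vecs[0], vecs[1]))  # similar sentences -> higher score
print(cosine(vecs[0], vecs[2]))  # unrelated sentence -> lower score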
Tips:
On Linux I initially installed the same Python version as on the local machine (3.7), which produced various errors:
Error 1: ImportError: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /root/anaconda3/en
Error 2: ImportError: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /root/anaconda3/envs/tensorflow11/lib/python3.7/site-packages/tensorflow/python/_pywrap_tensorflow_internal.so)
Switching Python to 3.6 and reinstalling TensorFlow fixed these errors, and the service started successfully.