CAPTCHA recognition: an attempt to crack the CAPTCHA of a real third-party website
Goal:
The CAPTCHA on liepin.com: wget https://passport.liepin.com/captcha/randomcode -O 001.jpg
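To collect a batch of these images for later manual labeling, a loop like the following will do (a rough sketch using requests; the count and output directory are arbitrary choices, not part of the original setup):

import os
import requests

URL = 'https://passport.liepin.com/captcha/randomcode'

def download_captchas(n=100, out_dir='./liepin_raw'):
    # every request returns a fresh random CAPTCHA; labels have to be added by hand afterwards
    if not os.path.exists(out_dir):
        os.makedirs(out_dir)
    for i in range(n):
        resp = requests.get(URL, timeout=10)
        with open('%s/%08d.jpg' % (out_dir, i), 'wb') as f:
            f.write(resp.content)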
Reference baseline:
https://blog.csdn.net/c2a2o2/article/details/75844775 // has an open-source GitHub repo, looks decent
Process log:
The baseline generates train and val data dynamically at runtime. That approach requires knowing the test data's character set and other properties in advance, which has little practical value, so I added the ability to read train and val data from files.
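The sample files follow the naming pattern index_LABEL.jpg (as visible in the logs below), so loading a directory boils down to something like the sketch here; the real loader additionally has to map label characters to class indices and batch the images (the helper name is hypothetical):

import os

def list_samples(data_dir):
    # filename convention: <index>_<label>.jpg, e.g. 00000002_tZXBdv.jpg
    samples = []
    for name in sorted(os.listdir(data_dir)):
        if not name.endswith('.jpg'):
            continue
        label = os.path.splitext(name)[0].split('_', 1)[1]
        samples.append((os.path.join(data_dir, name), label))
    return samples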
Added automatic resizing of the test data, so that the same height is kept throughout training/validation/testing, which is quite practical. The height must be uniform because num_feature = height * channel; everything is converted to grayscale now, so channel = 1.
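A minimal sketch of that resizing step (the fixed height value and the function name are illustrative, not the repo's actual code): keep the aspect ratio and only force the height.

import cv2

FIXED_HEIGHT = 60  # illustrative; must match the height the model was trained with

def load_and_resize(path, fixed_height=FIXED_HEIGHT):
    # read as grayscale so channel = 1 and num_feature = fixed_height * 1
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    h, w = img.shape
    # scale the width proportionally; only the height is forced to a uniform value
    new_w = max(1, int(w * fixed_height / float(h)))
    return cv2.resize(img, (new_w, fixed_height))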
Changed the default image format from png to jpg (with jpg supported and used as the default), which is more practical.
Found that generating data on the fly at runtime usually works better than reading it from files. Two possible explanations come to mind:
a. The train and val data generated on the fly have more randomness, which benefits deep learning?
b. The train and val data generated on the fly are all png, and a model trained on png simply works better? Designed an experiment to test this: train "reading randomly generated jpg images from files" against "generating training data directly on the fly" with the same amount of data (keeping the data volume equal rules out the "more randomness" explanation), then compare both on the same jpg-format validation set.
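For the file-reading side of this comparison, the generated samples just get re-encoded as jpg, e.g. with Pillow (illustrative; not necessarily the exact conversion used):

from PIL import Image

def save_as_jpg(png_path, jpg_path):
    # re-encode a generated png sample as jpg so both settings train on the same content
    Image.open(png_path).convert('L').save(jpg_path, 'JPEG')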
Result: confirmed that b above is the real cause: reading jpg files does not do as well as generating train samples on the fly (PNG format by default).
Reading from jpg files:
00000002_tZXBdv.jpg cost time: 0.786,
res: tZXBdv
00000003_7zVRNT.jpg cost time: 0.068,
res: 7zVRNT
00000001_ZLS9GY.jpg cost time: 0.066,
res: zlS9GY
00000009_jQyqBE.jpg cost time: 0.068,
res: jQyqE
00000007_uu5J.jpg cost time: 0.077,
res: u5J
00000008_qL8G.jpg cost time: 0.085,
res: qL8G
00000004_hc2UHX.jpg cost time: 0.091,
res: hc2UHX
00000006_48Ued.jpg cost time: 0.098,
res: 48Ued
00000000_oziN.jpg cost time: 0.094,
res: oZN
00000005_xRrWK.jpg cost time: 0.088,
res: XRrWK
00000010_7HEP.jpg cost time: 0.148, // an image of a different type, can be excluded
res: Jj
total acc:5/11=0.4545
Generating training samples on the fly:
00000002_tZXBdv.jpg cost time: 0.748,
res: tZxBdv
00000003_7zVRNT.jpg cost time: 0.073,
res: 7zVRNT
00000001_ZLS9GY.jpg cost time: 0.067,
res: zls9GY
00000009_jQyqBE.jpg cost time: 0.096,
res: jQyqBE
00000007_uu5J.jpg cost time: 0.083,
res: uu5J
00000008_qL8G.jpg cost time: 0.078,
res: qL8G
00000004_hc2UHX.jpg cost time: 0.073,
res: hc2UHX
00000006_48Ued.jpg cost time: 0.085,
res: 48Ued
00000000_oziN.jpg cost time: 0.076,
res: oziN
00000005_xRrWK.jpg cost time: 0.065,
res: xRrWK
00000010_7HEP.jpg cost time: 0.151, // an image of a different type, can be excluded
res: 5hib
total acc:8/11=0.7273
- After many rounds of experiments, a few takeaways:
- The more randomness during training, the better. In other words: with few samples, running more epochs does not help; the symptom is that the loss drops very low while validation accuracy stays low, i.e., the model has not found any general rule and has merely overfit the train set.
- Accuracy during training can jump abruptly (not only because "a sample counts as correct only when every character in it is right", since the jump is truly startling; that exact-match counting is sketched right after this list). Look at these three consecutive 100-iteration checkpoints:
seq 0: origin: [6, 15, 23, 6] decoded:[6, 15, 6]
seq 1: origin: [19, 28, 25, 35] decoded:[19, 26, 28]
seq 2: origin: [32, 27, 33, 9] decoded:[32, 33, 9]
seq 3: origin: [30, 34, 17, 17] decoded:[8, 13, 17]
seq 4: origin: [24, 26, 29, 24] decoded:[24, 29, 33]
accuracy: 0.00000
seq 0: origin: [6, 34, 18, 13] decoded:[29, 34, 13]
seq 1: origin: [36, 15, 32, 6] decoded:[36, 15, 6]
seq 2: origin: [16, 21, 2, 15] decoded:[16, 21, 15]
seq 3: origin: [29, 30, 11, 30] decoded:[29, 11, 30]
seq 4: origin: [10, 34, 26, 2] decoded:[10, 26, 34]
accuracy: 0.03125
seq 0: origin: [20, 5, 3, 28] decoded:[20, 5, 3, 28]
seq 1: origin: [3, 8, 20, 8] decoded:[3, 8, 20, 8]
seq 2: origin: [10, 25, 32, 28] decoded:[10, 25, 32, 28]
seq 3: origin: [2, 28, 26, 27] decoded:[2, 28, 26, 27]
seq 4: origin: [2, 33, 35, 18] decoded:[2, 33, 35, 18]
accuracy: 0.68750
- The jump of validation accuracy from 0 to non-zero usually happens when the training loss has come down to around (NUM_CLASS / number of characters); if the loss is ppx (perplexity), that is easy to understand.
- If the loss keeps dropping but accuracy hovers and stops improving, then what has been learned is not comprehensive enough (most likely the data lacks diversity, or the diversity is there but the model does not attend to it finely enough), so it cannot generalize; watch for train/val data mismatch and for overfitting.
- If the loss does not drop at all, then regardless of whether accuracy improves, the model or algorithm is simply wrong and has no traction: the chosen features are unrelated to the objective, or the way the model searches for a solution is wrong.
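For reference, the "accuracy" reported in these logs is plain whole-sequence exact matching, roughly like the helper below (illustrative, not the repo's actual implementation):

def seq_accuracy(origins, decodeds):
    # a sample only counts as correct when the whole decoded sequence matches the label
    hits = sum(1 for o, d in zip(origins, decodeds) if list(o) == list(d))
    return hits / float(len(origins))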
Tried a dataset found online: on this type2 dataset (http://www.pkbigdata.com/common/cmpt/%E9%AA%8C%E8%AF%81%E7%A0%81%E8%AF%86%E5%88%AB%E7%AB%9E%E8%B5%9B_%E8%B5%9B%E4%BD%93%E4%B8%8E%E6%95%B0%E6%8D%AE.html), accuracy hit 100% after just a short run. That competition has already closed; was it really this simple? Puzzling.
Added an export feature and wrote a client so the model can be served externally via TF Serving. This was mostly adapted from the sample I used when experimenting with TF Serving earlier, and there were few problems. One thing to note: export after testing is done, not right after training; otherwise the exported files will include things that are only needed at training time (e.g., warp-ctc), which is a hassle.
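A rough sketch of what the export amounts to (TF 1.x SavedModel style; the tensor variables and export path here are placeholders, not the repo's actual names), built from the inference graph only so nothing train-only such as warp-ctc gets pulled in:

import tensorflow as tf

def export_for_serving(sess, images_tensor, seq_len_tensor, logits_tensor,
                       export_dir='./export/1'):
    # build a SavedModel that TF Serving can load; only inference ops end up in it
    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'images': images_tensor, 'seq_len': seq_len_tensor},
        outputs={'logits': logits_tensor})
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'predict_captcha': signature})
    builder.save()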
Back to the goal. The liepin.com images still need manual labeling after downloading, so not much training data can be had in a short time; and its generation method and character set cannot be worked out, so it's hard to synthesize training data that mimics it directly. What to do?
- Just use a model that was trained very well on other data? -- You must be joking, absolutely not:
Restoring from /home/ML/image/lstm_ctc_ocr/output/lstm_ctc/lstm_ctc_iter_10000.ckpt... done
0_7HEP.jpg cost time: 0.723,
res: 53
10_DZHD.jpg cost time: 0.054,
res: IA
11_KFBA.jpg cost time: 0.060,
res: 5
12_9T8Y.jpg cost time: 0.061,
res: 68
13_NEYH.jpg cost time: 0.062,
res:
14_989X.jpg cost time: 0.051,
res: S
15_EA5P.jpg cost time: 0.052,
res: L
16_UVZA.jpg cost time: 0.053,
res: U3
17_758N.jpg cost time: 0.052,
res: 56
18_ZYC5.jpg cost time: 0.054,
res: I
19_D8DN.jpg cost time: 0.051,
res: I5
total acc:0/11=0.0000
- Take the model trained on generic data and fine-tune it with a small number of target images; would that help? Let's try.
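Here fine-tuning simply means restoring the checkpoint trained on the generic data and continuing training on the small labeled target set, roughly as below (a sketch assuming TF 1.x Saver-style checkpoints as in the baseline repo; target_batches is a hypothetical iterator of feed dicts):

import tensorflow as tf

def finetune_from_checkpoint(sess, train_op, loss, ckpt_path, target_batches, num_iters=2000):
    # restore the weights learned on the generic synthetic data ...
    saver = tf.train.Saver()
    saver.restore(sess, ckpt_path)
    # ... then keep training, but only on the small set of labeled target images
    for i in range(num_iters):
        _, l = sess.run([train_op, loss], feed_dict=next(target_batches))
        if i % 100 == 0:
            print('iter %d, loss %.4f' % (i, l))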
It didn't help: the loss dropped quickly and then could not go any lower, and accuracy never got off the ground (the dataset is small, so it overfits quickly, i.e., it knows the train set by heart while parts of val it simply doesn't understand at all):
seq 0: origin: [31, 32, 36, 11] decoded:[31, 10, 34]
seq 1: origin: [14, 9, 14, 24] decoded:[14, 12, 18, 24]
seq 2: origin: [8, 6, 9, 24] decoded:[8, 6, 14, 24]
seq 3: origin: [14, 36, 18, 14] decoded:[14, 11, 12, 1]
seq 4: origin: [36, 35, 13, 6] decoded:[36, 8, 13, 6]
accuracy: 0.00000
loss: 0.014991194 Wrote snapshot to: /home/ML/image/lstm_ctc_ocr/output/lstm_ctc/lstm_ctc_iter_2.ckpt
seq 0: origin: [10, 9, 10, 34] decoded:[10, 12]
seq 1: origin: [21, 16, 12, 11] decoded:[21, 18, 6, 11]
seq 2: origin: [8, 6, 9, 24] decoded:[8, 6, 14]
seq 3: origin: [24, 15, 35, 18] decoded:[24, 21, 32]
seq 4: origin: [15, 11, 6, 26] decoded:[15, 8]
accuracy: 0.00000
If you don't believe it, compare the results on the train set against those on the test set:
59_UXD7.jpg cost time: 0.057,
res: UXD7
61_C2HY.jpg cost time: 0.058,
res: C2HY
63_FYVH.jpg cost time: 0.065,
res: FYVH
88_8ZE8.jpg cost time: 0.071,
res: 8ZE8
97_75UN.jpg cost time: 0.065,
res: 75UN
total acc:90/90=1.0000
0_7HEP.jpg cost time: 0.705,
res: 7EB
10_DZHD.jpg cost time: 0.052,
res: DZ5U
11_KFBA.jpg cost time: 0.065,
res: KH5A
12_9T8Y.jpg cost time: 0.051,
res: 8THF
13_NEYH.jpg cost time: 0.054,
res: NKV
14_989X.jpg cost time: 0.051,
res: 987
15_EA5P.jpg cost time: 0.055,
res: E7
16_UVZA.jpg cost time: 0.052,
res: U9A
17_758N.jpg cost time: 0.060,
res: 75DN
18_ZYC5.jpg cost time: 0.053,
res: ZYC5
19_D8DN.jpg cost time: 0.058,
res: D8K
total acc:1/11=0.0909
- So there is no clever shortcut; only collecting more data of the same type as the target helps. How much is enough? Let's find out ~
- Fine-tuning on top of the base with 190 target images is still not enough:
0_7HEP.jpg cost time: 0.678,
res: 7E3
10_DZHD.jpg cost time: 0.055,
res: D5D
11_KFBA.jpg cost time: 0.050,
res: K3A
12_9T8Y.jpg cost time: 0.052,
res: 9T
13_NEYH.jpg cost time: 0.050,
res: NEY
14_989X.jpg cost time: 0.055,
res: 9B9
15_EA5P.jpg cost time: 0.052,
res: E57
16_UVZA.jpg cost time: 0.054,
res: U9A
17_758N.jpg cost time: 0.051,
res: 75N
18_ZYC5.jpg cost time: 0.053,
res: ZYC5
19_D8DN.jpg cost time: 0.052,
res: D5N
total acc:1/11=0.0909
- Training directly on the 190 images also fails:
0_7HEP.jpg cost time: 0.706,
res: 7H9
10_DZHD.jpg cost time: 0.055,
res: D2H3
11_KFBA.jpg cost time: 0.048,
res: K8
12_9T8Y.jpg cost time: 0.051,
res: 9TY
13_NEYH.jpg cost time: 0.049,
res: NH
14_989X.jpg cost time: 0.051,
res: 98X
15_EA5P.jpg cost time: 0.050,
res: EAT
16_UVZA.jpg cost time: 0.052,
res: UA
17_758N.jpg cost time: 0.053,
res: 785U
18_ZYC5.jpg cost time: 0.051,
res: ZY8
19_D8DN.jpg cost time: 0.054,
res: D8N
total acc:0/11=0.0000
So, keep producing data by hand. Data, data, data...
- Fine-tuning on top of the base with 490 images already looks respectable:
0_7HEP.jpg cost time: 0.702,
res: 7HED
10_DZHD.jpg cost time: 0.053,
res: DZHD
11_KFBA.jpg cost time: 0.051,
res: KF8A
12_9T8Y.jpg cost time: 0.049,
res: 9T8Y
13_NEYH.jpg cost time: 0.048,
res: NEYH
14_989X.jpg cost time: 0.049,
res: 989X
15_EA5P.jpg cost time: 0.050,
res: EA5P
16_UVZA.jpg cost time: 0.052,
res: UVZA
17_758N.jpg cost time: 0.049,
res: 75N
18_ZYC5.jpg cost time: 0.050,
res: ZYC5
19_D8DN.jpg cost time: 0.051,
res: D8DN
total acc:8/11=0.7273
- Training directly on the 490 images, there is still a clear gap:
0_7HEP.jpg cost time: 0.682,
res: 7HE9
10_DZHD.jpg cost time: 0.052,
res: DZHD
11_KFBA.jpg cost time: 0.051,
res: KA
12_9T8Y.jpg cost time: 0.048,
res: 9Y
13_NEYH.jpg cost time: 0.050,
res: NEYH
14_989X.jpg cost time: 0.052,
res: 989X
15_EA5P.jpg cost time: 0.050,
res: EA5P
16_UVZA.jpg cost time: 0.049,
res: UVA
17_758N.jpg cost time: 0.049,
res: 7CN
18_ZYC5.jpg cost time: 0.051,
res: ZYC5
19_D8DN.jpg cost time: 0.050,
res: D5N
total acc:5/11=0.4545
- Try adding another 200 training samples
--- No improvement; it actually got worse, which suggests that when the target data quality is poor, more of it just adds more noise.
0_7HEP.jpg cost time: 0.649,
res: 7HED
10_DZHD.jpg cost time: 0.057,
res: DZHD
11_KFBA.jpg cost time: 0.057,
res: KA
12_9T8Y.jpg cost time: 0.060,
res: 9T6Y
13_NEYH.jpg cost time: 0.061,
res: NEY3
14_989X.jpg cost time: 0.064,
res: 989X
15_EA5P.jpg cost time: 0.077,
res: EA5P
16_UVZA.jpg cost time: 0.089,
res: UVZA
17_758N.jpg cost time: 0.093,
res: 75BN
18_ZYC5.jpg cost time: 0.088,
res: ZYC5
19_D8DN.jpg cost time: 0.094,
res: D5DN
total acc:5/11=0.4545
- Stumbled on a site with simpler CAPTCHAs; accuracy should be higher there, right?
Namely CSDN's CAPTCHA: https://passport.csdn.net/ajax/verifyhandler.ashx
- With 500 samples, training directly on them doesn't work at all...
- First train a base model on auto-generated data, then fine-tune it with these 500; both the convergence speed and the results are ridiculously good...
90_WFPIp.jpg cost time: 0.794,
res: WFPIp
91_ptyAB.jpg cost time: 0.083,
res: ptyAB
92_NI3kj.jpg cost time: 0.105,
res: NI3kj
93_oLQHj.jpg cost time: 0.118,
res: oLQHj
94_4Nd2H.jpg cost time: 0.118,
res: 4Nd2H
95_q6hJ8.jpg cost time: 0.121,
res: q6hJ8
96_AB6yU.jpg cost time: 0.122,
res: AB6yU
97_Hqtfk.jpg cost time: 0.112,
res: Hqtfk
98_8Ixy6.jpg cost time: 0.083,
res: 8Ixy6
99_X6W2P.jpg cost time: 0.068,
res: X6W2P
total acc:10/10=1.0000
- Finally, a look at the actual network, since interviewers always like asking about this; as far as I can tell these are mostly copied from one another with few serious researchers behind them, so memorizing it probably isn't of much use:
# CNN feature extractor (VGG-style 3x3 convs); note the asymmetric 1x2 pooling in the
# later stages, and that the name 'pool2' is reused by the third pooling layer
(self.feed('data')
     .conv_single(3, 3, 64 ,1, 1, name='conv1',c_i=cfg.NCHANNELS)
     .max_pool(2, 2, 2, 2, padding='VALID', name='pool1')
     .conv_single(3, 3, 128 ,1, 1, name='conv2')
     .max_pool(2, 2, 2, 2, padding='VALID', name='pool2')
     .conv_single(3, 3, 256 ,1, 1, name='conv3_1')
     .conv_single(3, 3, 256 ,1, 1, name='conv3_2')
     .max_pool(1, 2, 1, 2, padding='VALID', name='pool2')
     .conv_single(3, 3, 512 ,1, 1, name='conv4_1', bn=True)
     .conv_single(3, 3, 512 ,1, 1, name='conv4_2', bn=True)
     .max_pool(1, 2, 1, 2, padding='VALID', name='pool3')
     .conv_single(2, 2, 512 ,1, 1, padding = 'VALID', name='conv5', relu=False)
     #.dropout(keep_prob = self.keep_prob, name = 'dropout_layer')
     # collapse the CNN feature map into a sequence of 512-d feature vectors
     .reshape_squeeze_layer(d = 512 , name='reshaped_layer'))
# stacked bidirectional LSTM over that sequence; its output ('logits') feeds the CTC loss/decoder
(self.feed('reshaped_layer','time_step_len')
     .bi_lstm(cfg.TRAIN.NUM_HID,cfg.TRAIN.NUM_LAYERS,name='logits'))
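At inference time the 'logits' from the bi_lstm layer are decoded with CTC; with TensorFlow's built-in decoder that looks roughly like this (a sketch with placeholder tensor names; training itself uses warp-ctc for the loss):

import tensorflow as tf

def ctc_decode(logits, seq_len):
    # logits: time-major [max_time, batch_size, num_classes + blank]; seq_len: [batch_size]
    decoded, _log_prob = tf.nn.ctc_beam_search_decoder(logits, seq_len, merge_repeated=False)
    # densify the best path; -1 marks padded positions
    return tf.sparse_tensor_to_dense(decoded[0], default_value=-1)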