[Date: January 30, 2020] It has been ten days since I came home. Out in the countryside I have no way to follow the national epidemic situation as a whole, and from what I can see, the relatives around me have relaxed and are not taking it seriously. This afternoon I watched Ang Lee's film Billy Lynn's Long Halftime Walk at home. I believe that at different moments, different people take different things away from it!
What I took away was this: "From birth until now I have not yet found the meaning of my life, or any sense of mission. What has kept me going is the hope that some day, when my family talks about me, they will be proud of me. And twenty-three years of life, together with the education I have received since the age of six, have taught me one thing: be an intelligent person, keep learning, and stay self-disciplined, for discipline is freedom." Perhaps before I turn thirty I will find my life's mission.
1. Sequential indexing
- Basic indexing
// IPython practice notes
In [4]: a = tf.ones([1, 5, 5, 3])
In [5]: a
Out[5]: <tf.Tensor: id=5, shape=(1, 5, 5, 3), dtype=float32, numpy=...>
In [6]: a[0][0]
Out[6]: <tf.Tensor: id=13, shape=(5, 3), dtype=float32, numpy=
array([[1., 1., 1.],
[1., 1., 1.],
[1., 1., 1.],
[1., 1., 1.],
[1., 1., 1.]], dtype=float32)>
In [7]: a[0][0][0]
Out[7]: <tf.Tensor: id=25, shape=(3,), dtype=float32, numpy=array([1., 1., 1.], dtype=float32)>
// a scalar
In [8]: a[0][0][0][2]
Out[8]: <tf.Tensor: id=41, shape=(), dtype=float32, numpy=1.0>
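TensorFlow's basic indexing follows NumPy semantics, so the chain a[0][0][0][2] above can be sanity-checked in plain NumPy. A minimal sketch (the array mirrors the tf.ones([1, 5, 5, 3]) from the notes):

```python
import numpy as np

# mirrors tf.ones([1, 5, 5, 3]) from the notes
a = np.ones((1, 5, 5, 3), dtype=np.float32)

first_image = a[0]            # drop the outermost dim -> shape (5, 5, 3)
first_row = a[0][0]           # -> shape (5, 3)
first_pixel = a[0][0][0]      # -> shape (3,)
last_channel = a[0][0][0][2]  # a scalar, shape ()
```

Each extra `[i]` peels off one leading dimension, exactly as in the transcript above.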
- NumPy-style indexing
// IPython practice notes
In [11]: a = tf.random.normal([4,28,28,3])
In [12]: a[1].shape
Out[12]: TensorShape([28, 28, 3])
In [13]: a[1, 2].shape
Out[13]: TensorShape([28, 3])
In [14]: a[1, 2, 3].shape
Out[14]: TensorShape([3])
// a scalar
In [15]: a[1, 2, 3, 2].shape
Out[15]: TensorShape([])
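Comma-separated indices do the same drilling-down in a single step. Since TF mirrors NumPy here, the shapes above can be checked with a NumPy sketch:

```python
import numpy as np

# random batch mirroring tf.random.normal([4, 28, 28, 3])
a = np.random.normal(size=(4, 28, 28, 3))

img = a[1]             # one index removes one leading dim -> (28, 28, 3)
row = a[1, 2]          # -> (28, 3)
pixel = a[1, 2, 3]     # -> (3,)
chan = a[1, 2, 3, 2]   # all four dims indexed -> a scalar, shape ()
```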
- [start:end]: start is the starting position, end is the stopping position; end itself is excluded
In [17]: a = tf.range(10)
In [18]: a
Out[18]: <tf.Tensor: id=81, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])>
// from idx=x to idx=y, excluding the end position
In [19]: a[0:10]
Out[19]: <tf.Tensor: id=85, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])>
In [20]: a[0:9]
Out[20]: <tf.Tensor: id=89, shape=(9,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8])>
// from idx=0 (the beginning) up to a given position
In [21]: a[:10]
Out[21]: <tf.Tensor: id=93, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])>
In [24]: a[:1]
Out[24]: <tf.Tensor: id=105, shape=(1,), dtype=int32, numpy=array([0])>
// from a given position to idx=N (the end)
In [22]: a[0:]
Out[22]: <tf.Tensor: id=97, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])>
In [23]: a[9:]
Out[23]: <tf.Tensor: id=101, shape=(1,), dtype=int32, numpy=array([9])>
// reverse indexing: the last element has index -1, the second-to-last has index -2
// from the second-to-last element to the end
In [25]: a[-2:]
Out[25]: <tf.Tensor: id=109, shape=(2,), dtype=int32, numpy=array([8, 9])>
// from the beginning up to the second-to-last element, exclusive
In [26]: a[:-2]
Out[26]: <tf.Tensor: id=113, shape=(8,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7])>
// from index -10 (the first element) up to the last element, excluding the last
In [28]: a[-10:-1]
Out[28]: <tf.Tensor: id=121, shape=(9,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8])>
// from index -10 (the first element) all the way to the end
In [30]: a[-10:10]
Out[30]: <tf.Tensor: id=129, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])>
// from the last element to the end
In [31]: a[-1:]
Out[31]: <tf.Tensor: id=133, shape=(1,), dtype=int32, numpy=array([9])>
// special form: [:] — with nothing on either side of the colon, every element along that dimension is taken
In [36]: b = tf.random.normal([4,28,28,3])
In [37]: b.shape
Out[37]: TensorShape([4, 28, 28, 3])
In [38]: b[0].shape
Out[38]: TensorShape([28, 28, 3])
In [39]: b[0, :, :, :].shape // equivalent to b[0].shape
Out[39]: TensorShape([28, 28, 3])
In [40]: b[0, 1, :, :].shape
Out[40]: TensorShape([28, 3])
In [41]: b[:, :, :, 2].shape
Out[41]: TensorShape([4, 28, 28])
In [42]: b[:, 1, :, :].shape
Out[42]: TensorShape([4, 28, 3])
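The [start:end] and bare-colon rules above are ordinary Python/NumPy slicing, which TF tensors follow as well. A small NumPy sketch of the common patterns (values chosen for illustration):

```python
import numpy as np

a = np.arange(10)   # [0..9], like tf.range(10)

head = a[:3]        # first three elements -> [0, 1, 2]
tail = a[-2:]       # last two elements -> [8, 9]
body = a[1:-1]      # drop first and last -> [1..8]

# a bare colon keeps a whole dimension
b = np.zeros((4, 28, 28, 3))
red = b[:, :, :, 0]  # channel 0 of every image -> shape (4, 28, 28)
```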
+ [start:end:step]: start is the starting position, end is the stopping position (exclusive), sampling one element every step
In [46]: a = tf.range(10)
In [47]: a
Out[47]: <tf.Tensor: id=177, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])>
// from start to end, sampling every 2 elements: the even numbers
In [50]: a[::2]
Out[50]: <tf.Tensor: id=189, shape=(5,), dtype=int32, numpy=array([0, 2, 4, 6, 8])>
In [48]: a[0::2]
Out[48]: <tf.Tensor: id=181, shape=(5,), dtype=int32, numpy=array([0, 2, 4, 6, 8])>
// starting at 1, sampling every 2 elements: the odd numbers
In [49]: a[1::2]
Out[49]: <tf.Tensor: id=185, shape=(5,), dtype=int32, numpy=array([1, 3, 5, 7, 9])>
// from start to end, sampling every 3 elements
In [51]: a[::3]
Out[51]: <tf.Tensor: id=193, shape=(4,), dtype=int32, numpy=array([0, 3, 6, 9])>
// reversed
In [52]: a[::-1]
Out[52]: <tf.Tensor: id=197, shape=(10,), dtype=int32, numpy=array([9, 8, 7, 6, 5, 4, 3, 2, 1, 0])>
// reversed, sampling every 2 elements
In [53]: a[::-2]
Out[53]: <tf.Tensor: id=201, shape=(5,), dtype=int32, numpy=array([9, 7, 5, 3, 1])>
// in [A:B:STEP], when STEP is negative, sampling runs from A back toward the start, or from the end back toward B, excluding B
In [55]: a[2::-1]
Out[55]: <tf.Tensor: id=209, shape=(3,), dtype=int32, numpy=array([2, 1, 0])>
In [54]: a[2::-2]
Out[54]: <tf.Tensor: id=205, shape=(2,), dtype=int32, numpy=array([2, 0])>
In [59]: a[:2:-1]
Out[59]: <tf.Tensor: id=225, shape=(7,), dtype=int32, numpy=array([9, 8, 7, 6, 5, 4, 3])>
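The step-slicing patterns above, reproduced in NumPy (which uses the same [start:end:step] semantics as TF):

```python
import numpy as np

a = np.arange(10)

evens = a[::2]    # every 2nd element from the start -> [0, 2, 4, 6, 8]
odds = a[1::2]    # every 2nd element starting at 1 -> [1, 3, 5, 7, 9]
rev = a[::-1]     # reversed -> [9, 8, ..., 0]
down = a[2::-1]   # from index 2 back to the start -> [2, 1, 0]
upper = a[:2:-1]  # from the end back toward index 2, exclusive -> [9..3]
```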
+ [...]: the ellipsis is shorthand for [:, :, :]; the number of dimensions it stands for is inferred automatically
In [60]: c = tf.random.normal([2, 4, 28, 28, 3])
In [61]: c[0].shape
Out[61]: TensorShape([4, 28, 28, 3])
In [62]: c[0, :, :, :, :].shape
Out[62]: TensorShape([4, 28, 28, 3])
In [63]: c[0,...].shape
Out[63]: TensorShape([4, 28, 28, 3])
In [64]: c[:, :, :, :, 0].shape
Out[64]: TensorShape([2, 4, 28, 28])
In [65]: c[..., 0].shape
Out[65]: TensorShape([2, 4, 28, 28])
In [66]: c[0, ..., 2].shape
Out[66]: TensorShape([4, 28, 28])
// to use the ellipsis in the middle, the [:] slots before and after it must be written out explicitly
In [67]: c[0, ..., 0, :].shape
Out[67]: TensorShape([4, 28, 3])
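The ellipsis works identically in NumPy; a quick sketch checking the shapes above:

```python
import numpy as np

c = np.zeros((2, 4, 28, 28, 3))

# '...' expands to as many ':' as needed to fill the remaining dims
s1 = c[0, ...]        # same as c[0, :, :, :, :] -> (4, 28, 28, 3)
s2 = c[..., 0]        # same as c[:, :, :, :, 0] -> (2, 4, 28, 28)
s3 = c[0, ..., 2]     # -> (4, 28, 28)
s4 = c[0, ..., 0, :]  # ellipsis in the middle, trailing ':' spelled out -> (4, 28, 3)
```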
2. Selective indexing
- tf.gather()
tf.gather(
params, // the data
indices, // the sequence of indices to gather
validate_indices=None,
axis=None, // the axis to gather along
batch_dims=0,
name=None
)
In [81]: a =tf.range(5)
In [82]: a
Out[82]: <tf.Tensor: id=309, shape=(5,), dtype=int32, numpy=array([0, 1, 2, 3, 4])>
In [83]: tf.gather(a, axis=0, indices=[2, 1, 4, 0])
Out[83]: <tf.Tensor: id=312, shape=(4,), dtype=int32, numpy=array([2, 1, 4, 0])>
data: [classes, students, subjects] — how to collect, from every class, the grades of given students in given subjects?
// in two steps
a = tf.random.normal([4, 35, 8])
aa = tf.gather(a, axis=1, indices=[several students]) // first pick the students (axis 1)
aaa = tf.gather(aa, axis=2, indices=[several subjects]) // then pick the subjects (axis 2)
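The two-step gather above can be made concrete. tf.gather picks whole slices along one axis, much like NumPy's np.take; the student and subject indices below are made up for illustration:

```python
import numpy as np

# grades tensor [classes, students, subjects], mirroring tf.random.normal([4, 35, 8])
a = np.random.normal(size=(4, 35, 8))

# step 1: in every class, pick students 3 and 7 (axis 1)
aa = np.take(a, [3, 7], axis=1)       # -> (4, 2, 8)
# step 2: from those, pick subjects 0, 2, and 5 (axis 2)
aaa = np.take(aa, [0, 2, 5], axis=2)  # -> (4, 2, 3)
```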
- tf.gather_nd()
data: [classes, students, subjects] — how to collect the grades of different students from different classes?
[class1_student1, class2_student2, class3_student3, class4_student4]
tf.gather_nd(
params, // the data
indices, // one joint index identifying a single element, or a list of joint indices identifying a set of elements
batch_dims=0,
name=None
)
In [88]: a.shape
Out[88]: TensorShape([4, 35, 8])
In [89]: tf.gather_nd(a, [0]).shape
Out[89]: TensorShape([35, 8])
In [93]: tf.gather_nd(a, [[0], [2]]).shape
Out[93]: TensorShape([2, 35, 8])
// a scalar
In [91]: tf.gather_nd(a, [0, 1, 2]).shape
Out[91]: TensorShape([])
// a vector
In [92]: tf.gather_nd(a, [[0, 1, 2]]).shape
Out[92]: TensorShape([1])
In [94]: tf.gather_nd(a, [[0, 1, 2], [1, 2, 1]]).shape
Out[94]: TensorShape([2])
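tf.gather_nd treats each innermost list as one joint index. The NumPy counterpart is advanced indexing with one index array per axis — note the transposition: the list of joint indices [[0, 1, 2], [1, 2, 1]] becomes the per-axis arrays [0, 1], [1, 2], [2, 1]. A sketch:

```python
import numpy as np

a = np.random.normal(size=(4, 35, 8))

# joint index [0, 1, 2] -> the single scalar a[0, 1, 2]
scalar = a[0, 1, 2]                # shape ()

# joint indices [[0, 1, 2], [1, 2, 1]] -> [a[0, 1, 2], a[1, 2, 1]]
pairs = a[[0, 1], [1, 2], [2, 1]]  # shape (2,)

# partial indices [[0], [2]] keep the remaining dims: stacked a[0], a[2]
classes = a[[0, 2]]                # shape (2, 35, 8)
```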
- tf.boolean_mask()
tf.boolean_mask(
tensor,
mask, // boolean mask; the elements where it is True are kept
axis=None, // the axis at which the mask starts
name='boolean_mask'
)
In [95]: a = tf.random.normal([4, 28, 28, 3])
In [96]: a.shape
Out[96]: TensorShape([4, 28, 28, 3])
In [97]: tf.boolean_mask(a, mask=[True, True, False, False]).shape
Out[97]: TensorShape([2, 28, 28, 3])
In [98]: tf.boolean_mask(a, mask=[True, True, False], axis = 3).shape
Out[98]: TensorShape([4, 28, 28, 2])
// note: this output corresponds to a redefined a (apparently tf.ones([2, 3, 4])); an N-D mask must match the first N dims, which get flattened
In [103]: tf.boolean_mask(a, mask=[[True, False, False], [False, True, True]])
Out[103]: <tf.Tensor: id=428, shape=(3, 4), dtype=float32, numpy=
array([[1., 1., 1., 1.],
[1., 1., 1., 1.],
[1., 1., 1., 1.]], dtype=float32)>
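tf.boolean_mask behaves like NumPy boolean-array indexing: a 1-D mask selects along one axis, and an N-D mask matches the first N dims and flattens them. A sketch of the three cases above:

```python
import numpy as np

a = np.random.normal(size=(4, 28, 28, 3))

# 1-D mask along axis 0: keep the first two samples
kept = a[np.array([True, True, False, False])]  # -> (2, 28, 28, 3)

# masking the last axis (tf.boolean_mask's axis=3) via '...'
chan = a[..., np.array([True, True, False])]    # -> (4, 28, 28, 2)

# a 2-D mask covers the first two dims and flattens them
b = np.ones((2, 3, 4))
mask = np.array([[True, False, False], [False, True, True]])
picked = b[mask]                                # 3 True entries -> (3, 4)
```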