A Detailed Look at TensorFlow dataset.shuffle, batch, and repeat
Let's go straight to a code example, with detailed comments!
import tensorflow as tf
import numpy as np

d = np.arange(0, 60).reshape([6, 10])

# Convert the array into a Dataset (one element per row)
data = tf.data.Dataset.from_tensor_slices(d)

# shuffle draws buffer_size samples from data in order and puts them in a buffer,
# then shuffles the samples inside that buffer. Whenever the buffer holds fewer
# than buffer_size samples, it is refilled in order from data up to buffer_size
# and shuffled again.
data = data.shuffle(buffer_size=3)

# Draw 4 samples from the buffer per batch
data = data.batch(4)

# Repeat the dataset, i.e. produce 2 epochs of data
data = data.repeat(2)

# Build an iterator over the dataset (TF 1.x API)
iters = data.make_one_shot_iterator()

# Each call fetches one batch from the iterator
batch = iters.get_next()

sess = tf.Session()
sess.run(batch)
# Once the dataset has been fully consumed, further calls raise OutOfRangeError
In [21]: d
Out[21]:
array([[ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9],
       [10, 11, 12, 13, 14, 15, 16, 17, 18, 19],
       [20, 21, 22, 23, 24, 25, 26, 27, 28, 29],
       [30, 31, 32, 33, 34, 35, 36, 37, 38, 39],
       [40, 41, 42, 43, 44, 45, 46, 47, 48, 49],
       [50, 51, 52, 53, 54, 55, 56, 57, 58, 59]])

In [22]: sess.run(batch)
Out[22]:
array([[ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9],
       [30, 31, 32, 33, 34, 35, 36, 37, 38, 39],
       [20, 21, 22, 23, 24, 25, 26, 27, 28, 29],
       [10, 11, 12, 13, 14, 15, 16, 17, 18, 19]])

In [23]: sess.run(batch)
Out[23]:
array([[40, 41, 42, 43, 44, 45, 46, 47, 48, 49],
       [50, 51, 52, 53, 54, 55, 56, 57, 58, 59]])
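(A side note: the snippet above uses the TF 1.x Session/iterator API. Under TF 2.x eager execution there is no Session or explicit iterator; the dataset is simply iterated in a loop. A minimal sketch of the equivalent pipeline, assuming TensorFlow 2.x is installed:)

# Minimal TF 2.x sketch of the same pipeline (eager execution, no Session needed)
import numpy as np
import tensorflow as tf

d = np.arange(0, 60).reshape([6, 10])
data = tf.data.Dataset.from_tensor_slices(d).shuffle(3).batch(4).repeat(2)

for b in data:          # iteration simply stops when the data is exhausted,
    print(b.numpy())    # instead of raising OutOfRangeError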
From the output above (In [22] and In [23]) we can see that:
shuffle fills its buffer with samples taken from the dataset in order (a small simulation of this buffer mechanism follows after this list);
when repeat comes after shuffle, one epoch of the dataset is fully consumed before the next epoch begins.
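To make the buffer mechanism concrete, here is a small pure-Python simulation of how a shuffle buffer of size 3 behaves. This is only a sketch of the idea, not TensorFlow's actual implementation:

import random

def buffered_shuffle(source, buffer_size=3, seed=None):
    # Yield elements of `source` using a shuffle buffer, similar in spirit to Dataset.shuffle
    rng = random.Random(seed)
    it = iter(source)
    buffer = []
    # Fill the buffer with the first buffer_size elements, in order
    for x in it:
        buffer.append(x)
        if len(buffer) >= buffer_size:
            break
    # Each output element is drawn at random from the buffer,
    # and the freed slot is refilled with the next source element in order
    for x in it:
        i = rng.randrange(len(buffer))
        yield buffer[i]
        buffer[i] = x
    # Drain whatever is left in the buffer
    rng.shuffle(buffer)
    yield from buffer

print(list(buffered_shuffle(range(6), buffer_size=3)))
# e.g. [1, 0, 2, 4, 3, 5] -- each element can only move within a window of roughly buffer_size

Because the buffer never holds more than buffer_size samples, a small buffer_size only shuffles weakly; in practice buffer_size is usually set close to the dataset size (or at least much larger than the batch size) when a thorough shuffle is needed.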
So what happens when repeat is applied before shuffle? Like this:
data = data.repeat(2)
data = data.shuffle(buffer_size=3)
data = data.batch(4)
In [25]: sess.run(batch)
Out[25]:
array([[10, 11, 12, 13, 14, 15, 16, 17, 18, 19],
       [20, 21, 22, 23, 24, 25, 26, 27, 28, 29],
       [ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9],
       [40, 41, 42, 43, 44, 45, 46, 47, 48, 49]])

In [26]: sess.run(batch)
Out[26]:
array([[50, 51, 52, 53, 54, 55, 56, 57, 58, 59],
       [ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9],
       [30, 31, 32, 33, 34, 35, 36, 37, 38, 39],
       [30, 31, 32, 33, 34, 35, 36, 37, 38, 39]])

In [27]: sess.run(batch)
Out[27]:
array([[10, 11, 12, 13, 14, 15, 16, 17, 18, 19],
       [50, 51, 52, 53, 54, 55, 56, 57, 58, 59],
       [20, 21, 22, 23, 24, 25, 26, 27, 28, 29],
       [40, 41, 42, 43, 44, 45, 46, 47, 48, 49]])
As you can see, this effectively copies the dataset first and then treats the two epochs as one single, larger dataset that keeps being shuffled and batched straight through. Note that the same row can even appear twice within one batch (e.g. the row starting with 30 in Out[26]).
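If what you want is per-epoch shuffling (no mixing of samples across epochs, but a different order each epoch), the usual pattern is to call shuffle before repeat and rely on shuffle's reshuffle_each_iteration argument, which defaults to True. A minimal sketch, reusing the same d array as above:

# shuffle before repeat: each epoch is reshuffled independently,
# and samples from different epochs never share a shuffle buffer
data = tf.data.Dataset.from_tensor_slices(d)
data = data.shuffle(buffer_size=6, reshuffle_each_iteration=True)  # buffer covers the whole dataset
data = data.repeat(2)
data = data.batch(4)

One thing to keep in mind with this ordering: because batch is applied after repeat, a batch can straddle the epoch boundary; batching before repeat (as in the first example) keeps epochs separate at the cost of a smaller final batch in each epoch.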