[Source Code Reading] 2: PyBrain buildNetwork

In PyBrain, networks are composed of Modules which are connected with Connections. You can think of a network as a directed acyclic graph, whose nodes are Modules and whose edges are Connections. This makes PyBrain very flexible, but such fine-grained control is not necessary in every case.

The buildNetwork Shortcut

Thus, there is a simple way to create networks, which is the buildNetwork shortcut:

from pybrain.tools.shortcuts import buildNetwork
net = buildNetwork(2, 3, 1)

This call returns a network with two input neurons, three hidden neurons, and a single output neuron. In PyBrain, these layers are Module objects, and they are already connected with FullConnection objects.
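Under the hood the result is a small DAG: layers hold activations, and FullConnections carry the weighted links between consecutive layers. The sketch below is not PyBrain code; it is a minimal pure-Python illustration of the structure buildNetwork(2, 3, 1) assembles (the weights are made up, whereas PyBrain initializes them randomly):

```python
import math

def full_connection(weights, inputs):
    # A FullConnection: every input neuron feeds every output neuron
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

def sigmoid_layer(values):
    # A SigmoidLayer squashes each neuron's net input into (0, 1)
    return [1.0 / (1.0 + math.exp(-v)) for v in values]

# 2 inputs -> 3 sigmoid hidden units -> 1 linear output
w_in_hidden = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]   # 3x2
w_hidden_out = [[0.7, 0.8, 0.9]]                     # 1x3

def activate(inputs):
    hidden = sigmoid_layer(full_connection(w_in_hidden, inputs))
    return full_connection(w_hidden_out, hidden)     # LinearLayer: identity

print(activate([1.0, 2.0]))
```

The real network's `net.activate([1, 2])` performs the same kind of forward pass through the module graph.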

The Source Code

__author__ = 'Tom Schaul and Thomas Rueckstiess'


from itertools import chain
import logging 
from sys import exit as errorexit
from pybrain.structure.networks.feedforward import FeedForwardNetwork
from pybrain.structure.networks.recurrent import RecurrentNetwork
from pybrain.structure.modules import BiasUnit, SigmoidLayer, LinearLayer, LSTMLayer
from pybrain.structure.connections import FullConnection, IdentityConnection

try:
    from arac.pybrainbridge import _RecurrentNetwork, _FeedForwardNetwork
except ImportError, e:
    logging.info("No fast networks available: %s" % e)


class NetworkError(Exception): pass


def buildNetwork(*layers, **options):
    """Build arbitrarily deep networks.
    
    `layers` should be a list or tuple of integers, that indicate how many 
    neurons the layers should have. `bias` and `outputbias` are flags to 
    indicate whether the network should have the corresponding biases; both
    default to True.
        
    To adjust the classes for the layers use the `hiddenclass` and  `outclass`
    parameters, which expect a subclass of :class:`NeuronLayer`.
    
    If the `recurrent` flag is set, a :class:`RecurrentNetwork` will be created, 
    otherwise a :class:`FeedForwardNetwork`.
    
    If the `fast` flag is set, faster arac networks will be used instead of the 
    pybrain implementations."""
    # options
    opt = {'bias': True,
           'hiddenclass': SigmoidLayer,
           'outclass': LinearLayer,
           'outputbias': True,
           'peepholes': False,
           'recurrent': False,
           'fast': False,
    }
    for key in options:
        if key not in opt.keys():
            raise NetworkError('buildNetwork unknown option: %s' % key)
        opt[key] = options[key]
    
    if len(layers) < 2:
        raise NetworkError('buildNetwork needs 2 arguments for input and output layers at least.')
        
    # Bind the right class to the Network name
    network_map = {
        (False, False): FeedForwardNetwork,
        (True, False): RecurrentNetwork,
    }
    try:
        network_map[(False, True)] = _FeedForwardNetwork
        network_map[(True, True)] = _RecurrentNetwork
    except NameError:
        if opt['fast']:
            raise NetworkError("No fast networks available.")
    if opt['hiddenclass'].sequential or opt['outclass'].sequential:
        if not opt['recurrent']:
            # CHECKME: a warning here?
            opt['recurrent'] = True
    Network = network_map[opt['recurrent'], opt['fast']]
    n = Network()
    # linear input layer
    n.addInputModule(LinearLayer(layers[0], name='in'))
    # output layer of type 'outclass'
    n.addOutputModule(opt['outclass'](layers[-1], name='out'))
    if opt['bias']:
        # add bias module and connection to out module, if desired
        n.addModule(BiasUnit(name='bias'))
        if opt['outputbias']:
            n.addConnection(FullConnection(n['bias'], n['out']))
    # arbitrary number of hidden layers of type 'hiddenclass'
    for i, num in enumerate(layers[1:-1]):
        layername = 'hidden%i' % i
        n.addModule(opt['hiddenclass'](num, name=layername))
        if opt['bias']:
            # also connect all the layers with the bias
            n.addConnection(FullConnection(n['bias'], n[layername]))
    # connections between hidden layers
    for i in range(len(layers) - 3):
        n.addConnection(FullConnection(n['hidden%i' % i], n['hidden%i' % (i + 1)]))
    # other connections
    if len(layers) == 2:
        # flat network, connection from in to out
        n.addConnection(FullConnection(n['in'], n['out']))
    else:
        # network with hidden layer(s), connections from in to first hidden and last hidden to out
        n.addConnection(FullConnection(n['in'], n['hidden0']))
        n.addConnection(FullConnection(n['hidden%i' % (len(layers) - 3)], n['out']))
    
    # recurrent connections
    if issubclass(opt['hiddenclass'], LSTMLayer):
        if len(layers) > 3:
            errorexit("LSTM networks with > 1 hidden layers are not supported!")
        n.addRecurrentConnection(FullConnection(n['hidden0'], n['hidden0']))

    n.sortModules()
    return n
    

def _buildNetwork(*layers, **options):
    """This is a helper function to create different kinds of networks.

    `layers` is a list of tuples. Each tuple can contain an arbitrary number of
    layers, each being connected to the next one with IdentityConnections. Due 
    to this, all layers have to have the same dimension. We call these tuples
    'parts.'
    
    Afterwards, the last layer of one tuple is connected to the first layer of 
    the following tuple by a FullConnection.
    
    If the keyword argument bias is given, BiasUnits are added additionally with
    every FullConnection. 

    Example:
    
        _buildNetwork(
            (LinearLayer(3),),
            (SigmoidLayer(4), GaussianLayer(4)),
            (SigmoidLayer(3),),
        )
    """
    bias = options['bias'] if 'bias' in options else False
    
    net = FeedForwardNetwork()
    layerParts = iter(layers)
    firstPart = iter(layerParts.next())
    firstLayer = firstPart.next()
    net.addInputModule(firstLayer)
    
    prevLayer = firstLayer
    
    for part in chain(firstPart, layerParts):
        new_part = True
        for layer in part:
            net.addModule(layer)
            # Pick class depending on whether we entered a new part
            if new_part:
                ConnectionClass = FullConnection
                if bias:
                    biasUnit = BiasUnit('BiasUnit for %s' % layer.name)
                    net.addModule(biasUnit)
                    net.addConnection(FullConnection(biasUnit, layer))
            else:
                ConnectionClass = IdentityConnection
            new_part = False
            conn = ConnectionClass(prevLayer, layer)
            net.addConnection(conn)
            prevLayer = layer
    net.addOutputModule(layer)
    net.sortModules()
    return net

Glossary

RecurrentNetwork: recurrent network
FeedForwardNetwork: feed-forward neural network

關(guān)鍵點(diǎn)解釋

1- def buildNetwork (layers, options)中“”與“”的意義徽诲。

def func(*args):print(args)

當(dāng)用func(1,2,3) 調(diào)用函數(shù)時(shí),參數(shù)args就是元組(1,2,3)

def func(**args):print(args)

當(dāng)用func(a=1,b=2) 調(diào)用函數(shù)時(shí),參數(shù)args將會(huì)是字典{'a':1,'b':2}

def func(*args1, **args2):
    print(args1)
    print(args2)

當(dāng)調(diào)用func(1,2, a=1,b=2)時(shí)蚜迅,打印:(1,2) {'a': 1, 'b': 2}
當(dāng)調(diào)用func(a=1,b=2)時(shí),打印:() {'a': 1, 'b': 2}
當(dāng)調(diào)用func(1,2)時(shí),打印:(1,2) {}

2 - The meaning of _ and __ in Python names.

_name (single leading underscore): a weak "internal use" marker. from M import * will not import names that start with an underscore, whether they are packages, modules, or members.

name_ (single trailing underscore): used only to avoid clashes with Python keywords (e.g. class_).

__name (double leading underscore): marks a class member as private; the interpreter mangles the name, so it cannot be called directly from outside.

__name__ (double leading and trailing underscores): reserved for "magic" objects and attributes that live in namespaces the user does not control, such as __name__, __doc__, __init__, __import__, and __file__. Never use this naming style for your own variables or functions.

Steps

S-1 Bind the right class to the Network name

Network = network_map[opt['recurrent'], opt['fast']]
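The two boolean flags form a tuple that indexes network_map, so selecting the implementation is a single dict lookup instead of nested ifs. The same pattern in isolation, with empty stand-in classes in place of the PyBrain and arac ones:

```python
class FeedForwardNetwork: pass
class RecurrentNetwork: pass
class _FeedForwardNetwork: pass   # stand-in for the fast arac class
class _RecurrentNetwork: pass

# key: (recurrent, fast) -> network class
network_map = {
    (False, False): FeedForwardNetwork,
    (True,  False): RecurrentNetwork,
    (False, True):  _FeedForwardNetwork,
    (True,  True):  _RecurrentNetwork,
}

Network = network_map[True, False]   # parentheses around the tuple key are optional
print(Network.__name__)              # -> RecurrentNetwork
```

This is also why the arac import failure is tolerable: the fast entries are simply absent from the dict, and the NameError branch raises only when 'fast' was actually requested.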

S-2 Init network

n = Network()
# linear input layer
n.addInputModule(LinearLayer(layers[0], name='in'))
# output layer of type 'outclass'
n.addOutputModule(opt['outclass'](layers[-1], name='out'))
# arbitrary number of hidden layers of type 'hiddenclass'
for i, num in enumerate(layers[1:-1]):
    layername = 'hidden%i' % i
    n.addModule(opt['hiddenclass'](num, name=layername))
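S-2 works because classes are first-class values in Python: opt['outclass'] holds a class object, and calling it with a size constructs the layer. A stand-in sketch (these toy classes only mimic the constructor signature of PyBrain's layers):

```python
class LinearLayer:
    def __init__(self, dim, name=None):
        self.dim, self.name = dim, name

class SigmoidLayer(LinearLayer):
    pass

opt = {'hiddenclass': SigmoidLayer, 'outclass': LinearLayer}

out = opt['outclass'](1, name='out')            # instantiate via the dict
hidden = opt['hiddenclass'](3, name='hidden0')
print(type(out).__name__, type(hidden).__name__)
```

Passing a different layer class (e.g. hiddenclass=TanhLayer) therefore changes the network's transfer function without any change to the construction loop.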

S-3 Connections among layers

# connections between hidden layers
for i in range(len(layers) - 3):
    n.addConnection(FullConnection(n['hidden%i' % i], n['hidden%i' % (i + 1)]))
# other connections
if len(layers) == 2:
    # flat network, connection from in to out
    n.addConnection(FullConnection(n['in'], n['out']))
else:
    # network with hidden layer(s), connections from in to first hidden and last hidden to out
    n.addConnection(FullConnection(n['in'], n['hidden0']))
    n.addConnection(FullConnection(n['hidden%i' % (len(layers) - 3)], n['out']))
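To make the indices concrete: for N layer sizes there are N - 2 hidden layers, so the hidden-to-hidden loop runs len(layers) - 3 times and the last hidden layer is named hidden&lt;len(layers) - 3&gt;. The helper below (not part of PyBrain) reproduces just that bookkeeping, returning the (source, target) pairs the code above wires up:

```python
def connection_names(layers):
    # (source, target) pairs, mirroring the loops in buildNetwork
    conns = []
    for i in range(len(layers) - 3):                  # hidden -> hidden
        conns.append(('hidden%i' % i, 'hidden%i' % (i + 1)))
    if len(layers) == 2:                              # flat network
        conns.append(('in', 'out'))
    else:
        conns.append(('in', 'hidden0'))
        conns.append(('hidden%i' % (len(layers) - 3), 'out'))
    return conns

print(connection_names((2, 4, 5, 3, 1)))
# -> [('hidden0', 'hidden1'), ('hidden1', 'hidden2'),
#     ('in', 'hidden0'), ('hidden2', 'out')]
print(connection_names((2, 1)))
# -> [('in', 'out')]
```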

S-4 Recurrent connections

# recurrent connections
if issubclass(opt['hiddenclass'], LSTMLayer):
    if len(layers) > 3:
        errorexit("LSTM networks with > 1 hidden layers are not supported!")
    n.addRecurrentConnection(FullConnection(n['hidden0'], n['hidden0']))

結(jié)語(yǔ)

PyBrain是Python實(shí)現(xiàn)人工神經(jīng)網(wǎng)絡(luò)的一個(gè)第三方庫(kù)洞拨,可以利用其快速構(gòu)建神經(jīng)網(wǎng)絡(luò),本次只是展開(kāi)其構(gòu)建神經(jīng)網(wǎng)絡(luò)的大體步驟负拟,接下來(lái)會(huì)對(duì)具體實(shí)現(xiàn)細(xì)節(jié)進(jìn)行詳細(xì)描述烦衣。

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請(qǐng)聯(lián)系作者
  • 序言:七十年代末,一起剝皮案震驚了整個(gè)濱河市掩浙,隨后出現(xiàn)的幾起案子花吟,更是在濱河造成了極大的恐慌,老刑警劉巖厨姚,帶你破解...
    沈念sama閱讀 206,968評(píng)論 6 482
  • 序言:濱河連續(xù)發(fā)生了三起死亡事件衅澈,死亡現(xiàn)場(chǎng)離奇詭異,居然都是意外死亡谬墙,警方通過(guò)查閱死者的電腦和手機(jī)今布,發(fā)現(xiàn)死者居然都...
    沈念sama閱讀 88,601評(píng)論 2 382
  • 文/潘曉璐 我一進(jìn)店門经备,熙熙樓的掌柜王于貴愁眉苦臉地迎上來(lái),“玉大人部默,你說(shuō)我怎么就攤上這事侵蒙。” “怎么了傅蹂?”我有些...
    開(kāi)封第一講書(shū)人閱讀 153,220評(píng)論 0 344
  • 文/不壞的土叔 我叫張陵纷闺,是天一觀的道長(zhǎng)。 經(jīng)常有香客問(wèn)我贬派,道長(zhǎng)急但,這世上最難降的妖魔是什么澎媒? 我笑而不...
    開(kāi)封第一講書(shū)人閱讀 55,416評(píng)論 1 279
  • 正文 為了忘掉前任搞乏,我火速辦了婚禮,結(jié)果婚禮上戒努,老公的妹妹穿的比我還像新娘请敦。我一直安慰自己,他們只是感情好储玫,可當(dāng)我...
    茶點(diǎn)故事閱讀 64,425評(píng)論 5 374
  • 文/花漫 我一把揭開(kāi)白布侍筛。 她就那樣靜靜地躺著,像睡著了一般撒穷。 火紅的嫁衣襯著肌膚如雪匣椰。 梳的紋絲不亂的頭發(fā)上,一...
    開(kāi)封第一講書(shū)人閱讀 49,144評(píng)論 1 285
  • 那天端礼,我揣著相機(jī)與錄音禽笑,去河邊找鬼。 笑死蛤奥,一個(gè)胖子當(dāng)著我的面吹牛佳镜,可吹牛的內(nèi)容都是我干的。 我是一名探鬼主播凡桥,決...
    沈念sama閱讀 38,432評(píng)論 3 401
  • 文/蒼蘭香墨 我猛地睜開(kāi)眼蟀伸,長(zhǎng)吁一口氣:“原來(lái)是場(chǎng)噩夢(mèng)啊……” “哼!你這毒婦竟也來(lái)了缅刽?” 一聲冷哼從身側(cè)響起啊掏,我...
    開(kāi)封第一講書(shū)人閱讀 37,088評(píng)論 0 261
  • 序言:老撾萬(wàn)榮一對(duì)情侶失蹤,失蹤者是張志新(化名)和其女友劉穎衰猛,沒(méi)想到半個(gè)月后迟蜜,有當(dāng)?shù)厝嗽跇?shù)林里發(fā)現(xiàn)了一具尸體,經(jīng)...
    沈念sama閱讀 43,586評(píng)論 1 300
  • 正文 獨(dú)居荒郊野嶺守林人離奇死亡腕侄,尸身上長(zhǎng)有42處帶血的膿包…… 初始之章·張勛 以下內(nèi)容為張勛視角 年9月15日...
    茶點(diǎn)故事閱讀 36,028評(píng)論 2 325
  • 正文 我和宋清朗相戀三年小泉,在試婚紗的時(shí)候發(fā)現(xiàn)自己被綠了芦疏。 大學(xué)時(shí)的朋友給我發(fā)了我未婚夫和他白月光在一起吃飯的照片。...
    茶點(diǎn)故事閱讀 38,137評(píng)論 1 334
  • 序言:一個(gè)原本活蹦亂跳的男人離奇死亡微姊,死狀恐怖酸茴,靈堂內(nèi)的尸體忽然破棺而出,到底是詐尸還是另有隱情兢交,我是刑警寧澤薪捍,帶...
    沈念sama閱讀 33,783評(píng)論 4 324
  • 正文 年R本政府宣布,位于F島的核電站配喳,受9級(jí)特大地震影響酪穿,放射性物質(zhì)發(fā)生泄漏。R本人自食惡果不足惜晴裹,卻給世界環(huán)境...
    茶點(diǎn)故事閱讀 39,343評(píng)論 3 307
  • 文/蒙蒙 一被济、第九天 我趴在偏房一處隱蔽的房頂上張望。 院中可真熱鬧涧团,春花似錦只磷、人聲如沸。這莊子的主人今日做“春日...
    開(kāi)封第一講書(shū)人閱讀 30,333評(píng)論 0 19
  • 文/蒼蘭香墨 我抬頭看了看天上的太陽(yáng)。三九已至阿迈,卻和暖如春元媚,著一層夾襖步出監(jiān)牢的瞬間,已是汗流浹背苗沧。 一陣腳步聲響...
    開(kāi)封第一講書(shū)人閱讀 31,559評(píng)論 1 262
  • 我被黑心中介騙來(lái)泰國(guó)打工刊棕, 沒(méi)想到剛下飛機(jī)就差點(diǎn)兒被人妖公主榨干…… 1. 我叫王不留,地道東北人崎页。 一個(gè)月前我還...
    沈念sama閱讀 45,595評(píng)論 2 355
  • 正文 我出身青樓鞠绰,卻偏偏與公主長(zhǎng)得像,于是被迫代替她去往敵國(guó)和親飒焦。 傳聞我的和親對(duì)象是個(gè)殘疾皇子蜈膨,可洞房花燭夜當(dāng)晚...
    茶點(diǎn)故事閱讀 42,901評(píng)論 2 345

推薦閱讀更多精彩內(nèi)容

  • 二零一三年翁巍,我離開(kāi)了屬于自己的地方,踏上了北漂的生活休雌。 我一直希望有一份只屬于我自己的愛(ài)情灶壶,簡(jiǎn)簡(jiǎn)單單...
  • 兒子13正值青春叛逆,脾氣死臭杈曲,我?guī)缀趺恐芷咛於荚诓煌5乜絾?wèn)自己:為啥結(jié)婚為啥生娃驰凛?!純屬自作孽不可活胸懈。以至于也經(jīng)...
    安妮拉閱讀 182評(píng)論 0 1
  • 這世上有一種東西是百害而無(wú)一利的——那就是發(fā)脾氣趣钱。教育孩子也是如此。 發(fā)脾氣是教育的最大死敵胚宦,脾氣越大首有,教育效果越...
    田馳弦上經(jīng)典閱讀 589評(píng)論 0 0
  • 因?yàn)榍皫滋焱砩嫌腥饲瞄T, 然后圈爸這幾天都下班很早 我健身完接我下班 好爸爸啊^_^
    圈媽閱讀 76評(píng)論 0 0
  • 有兩個(gè)人去挖鉆石枢劝,一個(gè)人堅(jiān)持不懈的挖著井联,雖然動(dòng)作慢點(diǎn),暫時(shí)還沒(méi)挖到鉆石您旁,但是他沒(méi)有放棄依然挖下去烙常,而另外一個(gè)人,已...
    玥萱兒閱讀 329評(píng)論 0 0