
  • Size: 12KB
    File type: .py
    Coins: 2
    Downloads: 1
    Posted: 2021-05-14
  • Language: Python
  • Tags: GAN

Resource description

Suitable for beginners in generative adversarial networks (GANs): a small experiment to practice on. Start with a small program and learn step by step. GANs are notoriously difficult to train, which makes this a challenging problem.
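The two input distributions the snippet defines can be sketched in plain NumPy, without TensorFlow: the "real" data is a Gaussian with mean 4 and standard deviation 0.5, and the generator's input is an evenly spaced grid with a small jitter. (The range value `8.0` below is an illustrative choice; the snippet passes it in as a constructor argument.)

```python
import numpy as np

def real_sample(n, mu=4.0, sigma=0.5):
    # the "real" data distribution: sorted Gaussian samples
    samples = np.random.normal(mu, sigma, n)
    samples.sort()
    return samples

def noise_sample(n, rng=8.0):
    # generator input: evenly spaced points over [-rng, rng],
    # jittered upward by at most 0.01
    return np.linspace(-rng, rng, n) + np.random.random(n) * 0.01

np.random.seed(42)
x = real_sample(1000)   # concentrated around 4
z = noise_sample(1000)  # spread across [-8, 8.01)
```

Stratified (grid-based) rather than purely random noise gives the generator evenly covered input space, which helps the toy 1-D GAN converge.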

Code snippet and file information

# Author: WangJuan
import argparse                   # parse command-line arguments and options
import numpy as np                # numpy: scientific computing, matrix operations
from scipy.stats import norm      # scipy: numerical computing
import tensorflow as tf
import matplotlib.pyplot as plt
from matplotlib import animation  # matplotlib: plotting
import seaborn as sns             # data visualization


sns.set(color_codes=True)  # sns.set(style="white", palette="muted", color_codes=True)
# set() configures the theme; the "muted" palette (soft colors) is commonly used
seed = 42  # fix the seed so the same random numbers are generated on every run
np.random.seed(seed)
tf.set_random_seed(seed)

class DataDistribution(object):  # the real data distribution (the blue curve)
    def __init__(self):
        self.mu = 4       # mean
        self.sigma = 0.5  # standard deviation

    def sample(self, N):
        samples = np.random.normal(self.mu, self.sigma, N)
        samples.sort()
        return samples

class GeneratorDistribution(object):  # input to the G network: a random noise distribution
    def __init__(self, range):
        self.range = range

    def sample(self, N):
        # uniformly spaced points plus random jitter in [0, 0.01)
        return np.linspace(-self.range, self.range, N) + \
            np.random.random(N) * 0.01
        '''
        samples = np.random.normal(4, 0.5, N)
        samples.sort()
        return samples
        '''

def linear(input, output_dim, scope=None, stddev=1.0):  # initialize w and b, then compute y = wx + b
    norm = tf.random_normal_initializer(stddev=stddev)  # Gaussian random initializer for w
    const = tf.constant_initializer(0.0)                # constant-0 initializer for b
    with tf.variable_scope(scope or 'linear'):  # variable scope is the value of `scope`; falls back to 'linear' when None
        w = tf.get_variable('w', [input.get_shape()[1], output_dim], initializer=norm)  # input.get_shape()[1] is the number of input columns
        b = tf.get_variable('b', [output_dim], initializer=const)
        return tf.matmul(input, w) + b

def generator(input, h_dim):  # generator network
    # h0 = tf.nn.tanh(linear(input, h_dim, 'g0'))
    # h0 = tf.nn.sigmoid(linear(input, h_dim, 'g0'))
    h0 = tf.nn.relu(linear(input, h_dim, 'g0'))  # works better
    # h1 = tf.nn.relu(linear(h0, h_dim, 'g1'))
    # h2 = linear(h1, 1, 'g2')
    # return h2
    # h0 = tf.nn.softplus(linear(input, h_dim, 'g0'))  # original
    h1 = linear(h0, 1, 'g1')  # original
    return h1  # original

def discriminator(input, h_dim):  # discriminator network
    h0 = tf.tanh(linear(input, h_dim * 2, 'd0'))  # output of the first layer
    h1 = tf.tanh(linear(h0, h_dim * 2, 'd1'))
    h2 = tf.tanh(linear(h1, h_dim * 2, scope='d2'))

    h3 = tf.sigmoid(linear(h2, 1, scope='d3'))  # sigmoid squashes the final output into (0, 1), so it can be read as the probability that the input is real
    return h3

def optimizer(loss, var_list, initial_learning_rate):  # learning rate decays over time
    decay = 0.95
    num_decay_steps = 150  # decay once every 150 iterations
    batch = tf.Variable(0)
    learning_rate = tf.train.exponential_decay(
        initial_learning_rate,
        batch,
        num_decay_steps,
        decay,
        staircase=True
    )
    optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(  # gradient descent minimizing loss over the variables in var_list
        loss,
        global_step=batch,
        var_list=var_list
    )
    return optimizer


class GAN(object):  # the model
    def __init__(self, data
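The schedule built in `optimizer` via `tf.train.exponential_decay` with `staircase=True` multiplies the initial rate by `decay` once per completed window of `num_decay_steps` steps. A minimal sketch of that formula (the initial rate 0.03 here is only an illustrative value; the snippet takes it as a parameter):

```python
def staircase_decay(initial_lr, step, decay=0.95, num_decay_steps=150):
    # tf.train.exponential_decay with staircase=True computes
    # initial_lr * decay ** floor(step / num_decay_steps)
    return initial_lr * decay ** (step // num_decay_steps)

lr0 = staircase_decay(0.03, 0)    # no decay before step 150
lr1 = staircase_decay(0.03, 150)  # 0.03 * 0.95
lr3 = staircase_decay(0.03, 450)  # 0.03 * 0.95 ** 3
```

With `staircase=False` the exponent would be the fractional `step / num_decay_steps`, giving a smooth rather than stepped decay.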
