
Generative Adversarial Networks (GANs) in 50 lines of code (PyTorch)

Source blog post: https://medium.com/@devnag/generative-adversarial-networks-gans-in-50-lines-of-code-pytorch-e81b79659e3f

PyTorch Install: http://pytorch.org/

The models play two distinct (literally, adversarial) roles. Given some real data set R, G is the generator, trying to create fake data that looks just like the genuine data, while D is the discriminator, getting data from either the real set or G and labeling the difference. Goodfellow’s metaphor (and a fine one it is) was that G was like a team of forgers trying to match real paintings with their output, while D was the team of detectives trying to tell the difference. (Except that in this case, the forgers G never get to see the original data — only the judgments of D. They’re like blind forgers.)


In the ideal case, both D and G would get better over time until G had essentially become a “master forger” of the genuine article and D was at a loss, “unable to differentiate between the two distributions.”

In practice, what Goodfellow had shown was that G would be able to perform a form of unsupervised learning on the original dataset, finding some way of representing that data in a (possibly) much lower-dimensional manner. And as Yann LeCun famously stated, unsupervised learning is the “cake” of true AI.


This powerful technique seems like it must require a metric ton of code just to get started, right? Nope. Using PyTorch, we can actually create a very simple GAN in under 50 lines of code. There are really only 5 components to think about:

  • R: The original, genuine data set
  • I: The random noise that goes into the generator as a source of entropy
  • G: The generator which tries to copy/mimic the original data set
  • D: The discriminator which tries to tell apart G’s output from R
  • The actual ‘training’ loop where we teach G to trick D and D to beware G.

1.) R: In our case, we’ll start with the simplest possible R — a bell curve. This function takes a mean and a standard deviation and returns a function which provides the right shape of sample data from a Gaussian with those parameters. In our sample code, we’ll use a mean of 4.0 and a standard deviation of 1.25.

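Here's that sampler, pulled from the full listing at the end of the post; it closes over mu and sigma and hands back a function that draws n samples at a time:

def get_distribution_sampler(mu, sigma):
    # Returns a function: n -> 1 x n tensor of draws from N(mu, sigma)
    return lambda n: torch.Tensor(np.random.normal(mu, sigma, (1, n)))  # Gaussian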

2.) I: The input into the generator is also random, but to make our job a little bit harder, let’s use a uniform distribution rather than a normal one. This means that our model G can’t simply shift/scale the input to copy R, but has to reshape the data in a non-linear way.

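The corresponding sampler from the listing is a one-liner:

def get_generator_input_sampler():
    # Uniform noise on [0, 1) -- deliberately NOT Gaussian
    return lambda m, n: torch.rand(m, n)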

3.) G: The generator is a standard feedforward graph — two hidden layers, three linear maps. We're using an ELU (exponential linear unit) because they're the new black, yo. G is going to get the uniformly distributed data samples from I and somehow mimic the normally distributed samples from R.

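From the listing (note that only the first activation is an ELU; the middle one is a sigmoid, and the final map is left linear so G's output isn't squashed):

class Generator(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(Generator, self).__init__()
        self.map1 = nn.Linear(input_size, hidden_size)
        self.map2 = nn.Linear(hidden_size, hidden_size)
        self.map3 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = F.elu(self.map1(x))
        x = F.sigmoid(self.map2(x))
        return self.map3(x)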

4.) D: The discriminator code is very similar to the generator's: a feedforward graph with two hidden layers and three linear maps. It's going to get samples from either R or G and will output a single scalar between 0 and 1, interpreted as 'fake' vs. 'real'. This is about as milquetoast as a neural net can get.

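From the listing; the final sigmoid squashes the output to (0, 1):

class Discriminator(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(Discriminator, self).__init__()
        self.map1 = nn.Linear(input_size, hidden_size)
        self.map2 = nn.Linear(hidden_size, hidden_size)
        self.map3 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = F.elu(self.map1(x))
        x = F.elu(self.map2(x))
        return F.sigmoid(self.map3(x))  # probability that the input batch is real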

5.) Finally, the training loop alternates between two modes: first training D on real data vs. fake data, with accurate labels (think of this as Police Academy); and then training G to fool D, with inaccurate labels (this is more like those preparation montages from Ocean’s Eleven). It’s a fight between good and evil, people.

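Here's the heart of it, condensed from the full listing below:

for epoch in range(num_epochs):
    for d_index in range(d_steps):
        # 1. Train D on real+fake, with accurate labels
        D.zero_grad()
        d_real_data = Variable(d_sampler(d_input_size))
        d_real_error = criterion(D(preprocess(d_real_data)), Variable(torch.ones(1)))  # ones = true
        d_real_error.backward()
        d_gen_input = Variable(gi_sampler(minibatch_size, g_input_size))
        d_fake_data = G(d_gen_input).detach()  # detach: don't train G on these labels
        d_fake_error = criterion(D(preprocess(d_fake_data.t())), Variable(torch.zeros(1)))  # zeros = fake
        d_fake_error.backward()
        d_optimizer.step()  # only D's parameters move

    for g_index in range(g_steps):
        # 2. Train G to fool D, with (deliberately) inaccurate labels
        G.zero_grad()
        gen_input = Variable(gi_sampler(minibatch_size, g_input_size))
        g_fake_data = G(gen_input)
        g_error = criterion(D(preprocess(g_fake_data.t())), Variable(torch.ones(1)))  # pretend it's all genuine
        g_error.backward()
        g_optimizer.step()  # only G's parameters move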

Even if you haven’t seen PyTorch before, you can probably tell what’s going on. In the first section, where D is trained, we push both types of data through D and apply a differentiable criterion to D’s guesses vs. the actual labels. That pushing is the ‘forward’ step; we then call ‘backward()’ explicitly in order to calculate gradients, which are then used to update D’s parameters in the d_optimizer step() call. G is used but isn’t trained here.

Then in the last section, we do the same thing for G — note that we also run G’s output through D (we’re essentially giving the forger a detective to practice on), but we do not optimize or change D at this step. We don’t want the detective D to learn the wrong labels. Hence, we only call g_optimizer.step().

And…that’s all. There’s some other boilerplate code but the GAN-specific stuff is just those 5 components, nothing else.


After a few thousand rounds of this forbidden dance between D and G, what do we get? The discriminator D gets good very quickly (while G slowly moves up), but once it gets to a certain level of power, G has a worthy adversary and begins to improve. Really improve.

Over 20,000 training rounds, the mean of G’s output overshoots 4.0 but then comes back in a fairly stable, correct range (left). Likewise, the standard deviation initially drops in the wrong direction but then rises up to the desired 1.25 range (right), matching R.

[Figure: the mean (left) and standard deviation (right) of G’s output over 20,000 training rounds, converging toward R’s 4.0 and 1.25]
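Those curves are nothing fancier than the two helpers from the listing, applied to G’s output at each print interval:

def extract(v):
    # Pull the raw floats out of a PyTorch Variable
    return v.data.storage().tolist()

def stats(d):
    # [mean, standard deviation] of a list of samples
    return [np.mean(d), np.std(d)]

so stats(extract(g_fake_data)) yields the [mean, std] pair being plotted.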

Ok, so the basic stats match R, eventually. How about the higher moments? Does the shape of the distribution look right? After all, you could certainly have a uniform distribution with a mean of 4.0 and a standard deviation of 1.25, but that wouldn’t really match R. Let’s show the final distribution emitted by G.

[Figure: histogram of the final distribution emitted by G]

Not bad. The left tail is a bit longer than the right, but the skew and kurtosis are, shall we say, evocative of the original Gaussian.
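If you’d rather have numbers than eyeballs, here’s a quick post-hoc check (a sketch, not part of the original 50 lines; it assumes scipy is installed and that g_fake_data is the last fake batch from the training loop):

import numpy as np
from scipy.stats import skew, kurtosis

samples = np.array(extract(g_fake_data))  # final batch of G's output
print(np.mean(samples), np.std(samples))  # should be near 4.0 and 1.25
print(skew(samples), kurtosis(samples))   # both near 0 for a true Gaussian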

G recovers the original distribution R nearly perfectly — and D is left cowering in the corner, mumbling to itself, unable to tell fact from fiction. This is precisely the behavior we want (see Figure 1 in Goodfellow). From fewer than 50 lines of code.

Goodfellow would go on to publish many other papers on GANs, including a 2016 gem describing some practical improvements, among them the minibatch discrimination method adapted here. And here’s a 2-hour tutorial he presented at NIPS 2016. For TensorFlow users, here’s a parallel post from Aylien on GANs.

Ok. Enough talk. Go look at the code.

#!/usr/bin/env python
# Generative Adversarial Networks (GAN) example in PyTorch.
# See related blog post at https://medium.com/@devnag/generative-adversarial-networks-gans-in-50-lines-of-code-pytorch-e81b79659e3f#.sch4xgsa9

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torch.autograd import Variable

# Data params
data_mean = 4
data_stddev = 1.25

# Model params
g_input_size = 1      # Random noise dimension coming into generator, per output vector
g_hidden_size = 50    # Generator complexity
g_output_size = 1     # Size of generated output vector
d_input_size = 100    # Minibatch size - cardinality of distributions
d_hidden_size = 50    # Discriminator complexity
d_output_size = 1     # Single dimension for 'real' vs. 'fake'
minibatch_size = d_input_size

d_learning_rate = 2e-4
g_learning_rate = 2e-4
optim_betas = (0.9, 0.999)
num_epochs = 30000
print_interval = 200
d_steps = 1  # 'k' steps in the original GAN paper. Can put the discriminator on higher training freq than generator
g_steps = 1

# ### Uncomment only one of these
#(name, preprocess, d_input_func) = ("Raw data", lambda data: data, lambda x: x)
(name, preprocess, d_input_func) = ("Data and variances", lambda data: decorate_with_diffs(data, 2.0), lambda x: x * 2)

print("Using data [%s]" % (name))

# ##### DATA: Target data and generator input data

def get_distribution_sampler(mu, sigma):
    return lambda n: torch.Tensor(np.random.normal(mu, sigma, (1, n)))  # Gaussian

def get_generator_input_sampler():
    return lambda m, n: torch.rand(m, n)  # Uniform-dist data into generator, _NOT_ Gaussian

# ##### MODELS: Generator model and discriminator model

class Generator(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(Generator, self).__init__()
        self.map1 = nn.Linear(input_size, hidden_size)
        self.map2 = nn.Linear(hidden_size, hidden_size)
        self.map3 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = F.elu(self.map1(x))
        x = F.sigmoid(self.map2(x))
        return self.map3(x)

class Discriminator(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(Discriminator, self).__init__()
        self.map1 = nn.Linear(input_size, hidden_size)
        self.map2 = nn.Linear(hidden_size, hidden_size)
        self.map3 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = F.elu(self.map1(x))
        x = F.elu(self.map2(x))
        return F.sigmoid(self.map3(x))

def extract(v):
    return v.data.storage().tolist()

def stats(d):
    return [np.mean(d), np.std(d)]

def decorate_with_diffs(data, exponent):
    mean = torch.mean(data.data, 1)
    mean_broadcast = torch.mul(torch.ones(data.size()), mean.tolist()[0][0])
    diffs = torch.pow(data - Variable(mean_broadcast), exponent)
    return torch.cat([data, diffs], 1)

d_sampler = get_distribution_sampler(data_mean, data_stddev)
gi_sampler = get_generator_input_sampler()
G = Generator(input_size=g_input_size, hidden_size=g_hidden_size, output_size=g_output_size)
D = Discriminator(input_size=d_input_func(d_input_size), hidden_size=d_hidden_size, output_size=d_output_size)
criterion = nn.BCELoss()  # Binary cross entropy: http://pytorch.org/docs/nn.html#bceloss
d_optimizer = optim.Adam(D.parameters(), lr=d_learning_rate, betas=optim_betas)
g_optimizer = optim.Adam(G.parameters(), lr=g_learning_rate, betas=optim_betas)

for epoch in range(num_epochs):
    for d_index in range(d_steps):
        # 1. Train D on real+fake
        D.zero_grad()

        #  1A: Train D on real
        d_real_data = Variable(d_sampler(d_input_size))
        d_real_decision = D(preprocess(d_real_data))
        d_real_error = criterion(d_real_decision, Variable(torch.ones(1)))  # ones = true
        d_real_error.backward()  # compute/store gradients, but don't change params

        #  1B: Train D on fake
        d_gen_input = Variable(gi_sampler(minibatch_size, g_input_size))
        d_fake_data = G(d_gen_input).detach()  # detach to avoid training G on these labels
        d_fake_decision = D(preprocess(d_fake_data.t()))
        d_fake_error = criterion(d_fake_decision, Variable(torch.zeros(1)))  # zeros = fake
        d_fake_error.backward()
        d_optimizer.step()  # Only optimizes D's parameters; changes based on stored gradients from backward()

    for g_index in range(g_steps):
        # 2. Train G on D's response (but DO NOT train D on these labels)
        G.zero_grad()

        gen_input = Variable(gi_sampler(minibatch_size, g_input_size))
        g_fake_data = G(gen_input)
        dg_fake_decision = D(preprocess(g_fake_data.t()))
        g_error = criterion(dg_fake_decision, Variable(torch.ones(1)))  # we want to fool, so pretend it's all genuine
        g_error.backward()
        g_optimizer.step()  # Only optimizes G's parameters

    if epoch % print_interval == 0:
        print("%s: D: %s/%s G: %s (Real: %s, Fake: %s) " % (epoch,
                                                            extract(d_real_error)[0],
                                                            extract(d_fake_error)[0],
                                                            extract(g_error)[0],
                                                            stats(extract(d_real_data)),
                                                            stats(extract(d_fake_data))))

Result:

ewan@ubuntu:~/Documents/gan/pytorch-generative-adversarial-networks$ python gan_pytorch.py
Using data [Data and variances]
0: D: 0.636019647121/0.687892377377 G: 0.692580163479 (Real: [4.0121619534492492, 1.3228379995364423], Fake: [0.36497069358825684, 0.0040907625909989871])
200: D: 2.92067015835e-05/0.474851727486 G: 1.00973010063 (Real: [4.0935744738578794, 1.3016500752040552], Fake: [-0.5716635638475418, 0.019948046232028654])
400: D: 0.0014917049557/0.502498149872 G: 0.943185687065 (Real: [4.198446000814438, 1.1262929992527102], Fake: [-0.21786054879426955, 0.0067362612730766476])
600: D: 6.4969262894e-06/0.384293109179 G: 1.15257537365 (Real: [3.8602226501703263, 1.3292726136430937], Fake: [-0.29857088595628739, 0.03924369275813562])
800: D: 1.84774467016e-06/0.211148008704 G: 1.67116880417 (Real: [4.0269100540876392, 1.2954351206409835], Fake: [-0.32296697288751602, 0.14901211840131676])
1000: D: 9.02455067262e-05/0.0219078511 G: 4.19585323334 (Real: [3.9491306754946707, 1.3613105655283608], Fake: [0.13110455054789782, 0.5252103421913964])
1200: D: 0.00441630883142/0.137605398893 G: 2.78980493546 (Real: [4.238747425079346, 1.1837142728845262], Fake: [2.3851456820964811, 0.69947230698573948])
1400: D: 0.291683584452/0.824121117592 G: 0.26126781106 (Real: [3.8486315739154815, 1.2074486225815622], Fake: [3.4868409335613251, 1.2438192602257458])
1600: D: 0.503275632858/1.08712184429 G: 0.628099560738 (Real: [3.7856648898124696, 1.1925325100947208], Fake: [3.9149187129735945, 1.5374543372663099])
1800: D: 0.992162883282/0.955306172371 G: 0.215137541294 (Real: [3.9097139459848402, 1.3729001379532129], Fake: [4.9751595187187192, 1.2850838287273094])
2000: D: 0.701098382473/0.634775817394 G: 0.389043629169 (Real: [3.9641699814796447, 1.1512756986625183], Fake: [5.0374661159515384, 1.5190411587235346])
2200: D: 0.510353624821/0.350295126438 G: 1.5988701582 (Real: [4.0406568145751951, 1.3612318676859239], Fake: [5.4763065743446351, 1.2736378899688456])
2400: D: 0.895085930824/0.400622785091 G: 0.922062814236 (Real: [3.8292097043991089, 1.1506111704583193], Fake: [4.5642045128345492, 1.7082890861364539])
2600: D: 0.802581310272/0.717123866081 G: 0.572393655777 (Real: [4.0654918360710148, 1.2552944260604222], Fake: [5.1286249160766602, 1.0479449058428656])
2800: D: 0.51098883152/0.489002883434 G: 0.842381119728 (Real: [4.0405197954177856, 1.136660175398452], Fake: [3.9549839448928834, 1.1751749984899784])
3000: D: 0.496278882027/0.97537201643 G: 0.753688693047 (Real: [4.0026307255029678, 1.2446167315972034], Fake: [3.2340782660245897, 1.2949288892421307])
3200: D: 0.696556508541/0.829834342003 G: 0.475445389748 (Real: [3.9983750417828561, 1.2828095340103229], Fake: [3.5434492731094362, 0.98673911467128028])
3400: D: 0.479906737804/0.477254271507 G: 1.2421528101 (Real: [4.1585888534784319, 1.2672863214247221], Fake: [3.3173918831348419, 1.156708995162234])
3600: D: 1.36562228203/0.508370876312 G: 0.550418972969 (Real: [4.0406067597866056, 1.1363201759386616], Fake: [4.4300824308395388, 1.0639278538481793])
3800: D: 0.538426816463/0.622343420982 G: 0.786149024963 (Real: [4.0097330248355867, 1.1609232820569348], Fake: [4.5179304122924808, 1.2347411732817635])
4000: D: 0.350504934788/0.361344873905 G: 0.728424191475 (Real: [3.7975878280401232, 1.2378775025626094], Fake: [4.3484812033176423, 1.4327683271077338])
4200: D: 0.912463009357/0.779066801071 G: 0.840294659138 (Real: [3.9861780107021332, 1.2293009498211762], Fake: [4.0718169224262235, 1.2044778720046834])
4400: D: 0.814347147942/0.794115483761 G: 0.889387726784 (Real: [3.9556436133384705, 1.1131208050960595], Fake: [3.6148070895671847, 1.1790021094109027])
4600: D: 0.637132883072/0.639598190784 G: 0.835896074772 (Real: [4.0807307386398319, 1.1590112689981971], Fake: [3.6376679444313051, 1.2540016088688517])
4800: D: 0.816388785839/0.629823803902 G: 0.6337043643 (Real: [4.1595975148677828, 1.2996693029809485], Fake: [4.0303308999538423, 1.3050560562935769])
5000: D: 1.38226401806/0.714248239994 G: 1.17240273952 (Real: [3.9217003214359285, 1.3408209709046912], Fake: [4.4204820060729979, 1.0378887480226417])
5200: D: 0.752707779408/0.432243227959 G: 0.735915839672 (Real: [4.033863249272108, 1.417255801501303], Fake: [3.7434970003366472, 1.4305561672741818])
5400: D: 0.672449588776/0.694190680981 G: 0.671269893646 (Real: [3.9849637061357499, 1.3054745436415693], Fake: [3.7987613070011137, 1.1584021967574571])
5600: D: 0.633513212204/0.678804934025 G: 0.736048042774 (Real: [3.8742538380622862, 1.1924929483627851], Fake: [4.0905960440635685, 1.0496450658176097])
5800: D: 0.954816102982/0.619474828243 G: 0.847522497177 (Real: [4.0848416697978971, 1.2377045321962332], Fake: [4.5059887909889218, 1.0769809353783582])
6000: D: 0.634225904942/0.653471052647 G: 0.402414888144 (Real: [3.9909452509880068, 1.2152347623325401], Fake: [3.9412865948677065, 1.2808620107297906])
6200: D: 0.733776032925/0.414616316557 G: 0.969770550728 (Real: [4.0096452310681343, 1.2858629342885464], Fake: [3.4776910370588303, 1.4216167469252254])
6400: D: 0.483776688576/0.456314682961 G: 0.42595911026 (Real: [4.16927042722702, 1.2557057135387499], Fake: [3.905275868177414, 1.3509040440658031])
6600: D: 1.06177055836/0.443961560726 G: 0.910483181477 (Real: [4.0327691116929056, 1.1752792712434861], Fake: [4.1322225379943847, 1.3041032842304898])
6800: D: 0.911615252495/0.851063728333 G: 0.822307884693 (Real: [4.0429812586307525, 1.0149434426406105], Fake: [4.181604235172272, 1.1091966315801844])
7000: D: 0.859644412994/0.819373309612 G: 0.683367550373 (Real: [4.0413902151584624, 1.2697299173474621], Fake: [3.6461249232292174, 1.1392232969008105])
7200: D: 0.697537004948/1.29639554024 G: 0.567749083042 (Real: [3.9289280462265013, 1.1476723124689931], Fake: [4.3612218284606934, 1.1698644305174593])
7400: D: 0.892510712147/0.93148213625 G: 1.18729686737 (Real: [3.9838603484630584, 1.10640478112829], Fake: [4.1228645443916321, 1.2695625804586594])
7600: D: 0.855136275291/0.683420717716 G: 0.87994658947 (Real: [4.1161885654926298, 1.1923004904972447], Fake: [3.6958885985612868, 1.3379389180110717])
7800: D: 0.549697399139/1.37823116779 G: 0.398991644382 (Real: [4.2173074555397037, 1.2371073094023581], Fake: [3.8741448554396629, 1.3837623378110455])
8000: D: 1.35398185253/0.410179078579 G: 0.527717351913 (Real: [3.9588229835033415, 1.3744496473744439], Fake: [3.9429207968711855, 1.3684983506717674])
8200: D: 0.700774013996/0.295857429504 G: 0.803082704544 (Real: [3.8515358114242555, 1.2566173136350174], Fake: [3.7108538401126863, 1.3342916614304938])
8400: D: 0.689352571964/0.590398311615 G: 0.698961615562 (Real: [3.965521250963211, 1.2231963456729893], Fake: [4.6866454958915709, 1.1286615282559416])
8600: D: 0.19632807374/0.604559898376 G: 0.812706291676 (Real: [3.8928249645233155, 1.3264703109197318], Fake: [3.918080286383629, 1.2016505045193488])
8800: D: 0.595732450485/0.572122216225 G: 0.738678693771 (Real: [3.7554583859443667, 1.2011572644775179], Fake: [3.8252914756536485, 1.1905187885079342])
9000: D: 0.232542961836/1.26930451393 G: 0.834500789642 (Real: [3.9203160056471824, 1.2725988502730134], Fake: [4.1613124001026156, 1.2681795442466237])
9200: D: 1.257376194/0.5735257864 G: 0.554405272007 (Real: [3.8860677522420883, 1.1041807259307903], Fake: [3.9102136331796644, 1.3811967247690093])
9400: D: 0.610212028027/0.538761377335 G: 0.558459818363 (Real: [4.0015355503559116, 0.99711450973270277], Fake: [3.8555663478374482, 1.1037480705144518])
9600: D: 0.702151358128/0.81621837616 G: 0.706716835499 (Real: [4.0513852632045744, 1.1984303669025829], Fake: [4.2933621263504032, 1.1478353305254103])
9800: D: 0.511451423168/0.670217812061 G: 0.873916983604 (Real: [3.935146123766899, 1.3218541944694313], Fake: [4.2863738107681275, 1.1362357473661524])
10000: D: 0.587130308151/0.764386773109 G: 0.714644312859 (Real: [4.0829932641983033, 1.1844677307174318], Fake: [4.2149634605646131, 1.1542778585504672])
10200: D: 0.454408079386/0.390097141266 G: 0.694087386131 (Real: [3.9480907583236693, 1.2586832917742197], Fake: [3.9525690937042235, 1.3555640918653922])
10400: D: 0.232991695404/0.377689123154 G: 0.839949011803 (Real: [3.9636431083083155, 1.2146210496905581], Fake: [4.0022356742620468, 1.0348462356745984])
10600: D: 0.887756228447/0.452646583319 G: 0.776298880577 (Real: [4.1107078218460087, 1.3061081296488184], Fake: [4.3001403945684435, 1.3191353715419794])
10800: D: 0.988030552864/0.472889751196 G: 2.00703763962 (Real: [4.1303015506267551, 1.2646447231333668], Fake: [4.2425211107730867, 1.2706986066792705])
11000: D: 0.962553679943/1.00584948063 G: 0.458068579435 (Real: [4.1017441129684444, 1.1564779436003478], Fake: [3.861787896156311, 1.2478181443952361])
11200: D: 0.404395908117/0.560545325279 G: 0.764987766743 (Real: [3.8819530367851258, 1.1290593525971337], Fake: [4.0393019503355028, 1.1760851438968263])
11400: D: 1.04482722282/0.170368790627 G: 0.979512214661 (Real: [4.0775347077846531, 1.1743573984958275], Fake: [4.4076948529481887, 1.1430737801156545])
11600: D: 0.767144262791/0.419019073248 G: 0.804197788239 (Real: [4.1507718646526337, 1.2935215526943189], Fake: [4.2565110635757444, 1.1195747875890809])
11800: D: 0.328228145838/0.192100420594 G: 0.694948136806 (Real: [4.2615561389923098, 1.3187283101366121], Fake: [3.7841238260269163, 1.2796545407667934])
12000: D: 0.939581632614/0.512252509594 G: 0.486280798912 (Real: [4.1770594882965089, 1.2492834466325793], Fake: [4.0997331076860428, 1.0701209918243111])
12200: D: 0.964525461197/0.397465586662 G: 1.45534229279 (Real: [3.9129967219382524, 1.3473476671217695], Fake: [4.3561846733093263, 1.1667221650406194])
12400: D: 0.516430974007/0.255626231432 G: 0.753806650639 (Real: [3.9942912605404852, 1.3623400447216258], Fake: [4.2171517282724382, 1.2046534326031684])
12600: D: 0.050210531801/0.567070662975 G: 0.887824892998 (Real: [3.9560802054405211, 1.3569670682588555], Fake: [3.6434229278564452, 1.2798963544271591])
12800: D: 0.566556215286/1.45121753216 G: 2.67591071129 (Real: [4.0868541407585148, 1.1440918337515926], Fake: [3.7308121472597122, 1.2567484994327229])
13000: D: 0.285438686609/1.26493763924 G: 0.714931368828 (Real: [4.0406689298152925, 1.2295255598171184], Fake: [4.1976348906755447, 1.2778464434389283])
13200: D: 0.420082330704/0.20268279314 G: 1.13221895695 (Real: [4.0006502330303189, 1.1790149224725006], Fake: [4.2336275362968445, 1.2803975596845565])
13400: D: 0.219869300723/0.733704686165 G: 1.4634616375 (Real: [3.8348834168910981, 1.240605849665303], Fake: [3.8208065938949587, 1.3042463825727604])
13600: D: 1.35286784172/0.161317944527 G: 2.29795908928 (Real: [4.0841373348236081, 1.2295542819596996], Fake: [4.0513113558292391, 1.2789595441318489])
13800: D: 0.188396275043/0.38589566946 G: 1.38826131821 (Real: [4.0228236329555509, 1.3524482715610078], Fake: [4.2307587480545044, 1.2042737228043698])
14000: D: 0.0101562952623/0.363918542862 G: 1.24292945862 (Real: [4.0695835274457934, 1.4484548400603423], Fake: [4.3588982570171355, 1.2305509242343933])
14200: D: 0.308517187834/0.687216579914 G: 0.831201374531 (Real: [4.1314239382743834, 1.2039768851618762], Fake: [4.3469831347465515, 1.1622408025070994])
14400: D: 1.05658388138/0.777651846409 G: 0.713593065739 (Real: [3.9307258637249469, 1.3932677098843045], Fake: [3.8781710839271546, 1.3920662615905985])
14600: D: 0.428974717855/0.430344074965 G: 0.865560889244 (Real: [4.2443156433105464, 1.4786604488020483], Fake: [3.9386759352684022, 1.2173706417721266])
14800: D: 0.358524769545/0.631785154343 G: 1.72760403156 (Real: [4.0897545439004901, 1.3611061267905207], Fake: [4.0185626268386843, 1.2011546705663261])
15000: D: 0.451200634241/0.451773911715 G: 1.10325527191 (Real: [3.9933083570003509, 1.0881706638388742], Fake: [3.902902855873108, 1.1771562868487595])
15200: D: 0.756480932236/0.419855684042 G: 0.942300021648 (Real: [4.1753564620018002, 1.3629881946025171], Fake: [3.8721090507507325, 1.189488508024922])
15400: D: 0.219109147787/0.190036550164 G: 2.20304942131 (Real: [3.9836783826351168, 1.4838718408508595], Fake: [3.9491609585285188, 1.1700151592543104])
15600: D: 1.01965582371/0.519556045532 G: 1.10594069958 (Real: [4.1213941669464109, 1.2398676800048194], Fake: [4.1908504700660707, 1.1195751576139747])
15800: D: 0.733263611794/0.697221815586 G: 0.84056687355 (Real: [4.0593542096018789, 1.1946663317303297], Fake: [4.3031868946552274, 1.0306412415157991])
16000: D: 0.400649875402/0.377974271774 G: 1.2899967432 (Real: [4.0140545344352718, 1.2630515897106358], Fake: [4.1656066524982451, 1.1779954377184654])
16200: D: 0.34089872241/0.265896707773 G: 1.11251270771 (Real: [4.0408088731765748, 1.3839176416694203], Fake: [4.0593357777595518, 1.2213436233279213])
16400: D: 0.00472234329209/0.513436615467 G: 1.63225841522 (Real: [4.1417997646331788, 1.2449733327544124], Fake: [3.7269023895263671, 1.1296458384504016])
16600: D: 0.756382524967/0.66779255867 G: 0.536718785763 (Real: [3.9379871004819869, 1.278594816781579], Fake: [3.8750299978256226, 1.2829775944385431])
16800: D: 0.879319548607/0.169020995498 G: 2.33787298203 (Real: [4.2075482982397077, 1.3725696551173026], Fake: [3.6744112837314606, 1.3225226221432227])
17000: D: 0.0482731573284/1.43823099136 G: 1.15067052841 (Real: [4.0404629743099214, 1.218948521692204], Fake: [4.0387165582180025, 1.2794767516999943])
17200: D: 2.88490628009e-05/0.57872825861 G: 0.495411038399 (Real: [3.9901529085636138, 1.4349120434336065], Fake: [4.0573103535175328, 1.1918079188127153])
17400: D: 0.231002807617/1.2511702776 G: 1.33606302738 (Real: [3.7472488379478452, 1.1658634335870959], Fake: [3.9354779303073881, 1.2931455406139682])
17600: D: 0.181431129575/0.149175107479 G: 2.51311731339 (Real: [4.1270963573455814, 1.312367798822683], Fake: [4.3470913958549495, 1.1818067904116243])
17800: D: 0.830040276051/0.415931969881 G: 1.57710897923 (Real: [3.99146986246109, 1.0836663745208763], Fake: [4.3325731372833252, 1.266683405420135])
18000: D: 0.20047518611/0.460676729679 G: 2.56421780586 (Real: [4.3388666504621503, 1.3881540592894346], Fake: [3.9820314025878907, 1.0436684747098013])
18200: D: 0.0659740716219/0.428199917078 G: 0.931035280228 (Real: [3.8892200005054476, 1.2217018988161374], Fake: [3.8822696304321287, 1.304586899060783])
18400: D: 0.791511416435/0.56503880024 G: 1.98549497128 (Real: [3.7894453473389147, 1.3567878969348022], Fake: [4.0909739780426024, 1.2361544714927677])
18600: D: 1.15297484398/0.102882102132 G: 1.85704553127 (Real: [4.2316720616817474, 1.2603607958456993], Fake: [3.7415710711479186, 1.311454258421634])
18800: D: 1.06078708172/0.366641134024 G: 0.914008259773 (Real: [3.9394708669185636, 1.2924449902046702], Fake: [3.9466111737489702, 1.137776845711856])
19000: D: 0.374139517546/0.448283135891 G: 0.701639294624 (Real: [3.9492650532722475, 1.2348435624999976], Fake: [3.7365686148405075, 1.215777672310739])
19200: D: 0.209440857172/0.522395193577 G: 0.707223057747 (Real: [3.8846979635953902, 1.2146658434075039], Fake: [4.1696245861053463, 1.2979841463522084])
19400: D: 0.15654887259/0.133351936936 G: 1.43907415867 (Real: [4.0292040088772776, 1.2291287794070285], Fake: [3.8498308193683624, 1.1121767482065514])
19600: D: 0.329566717148/0.222448319197 G: 0.429250627756 (Real: [3.7978928279876709, 1.1554982239517226], Fake: [3.5122534275054931, 1.2462801759237472])
19800: D: 0.0176634714007/0.480926275253 G: 0.39424943924 (Real: [4.0822606313228604, 1.2484518469881001], Fake: [4.5482089626789097, 1.1266585202489452])
20000: D: 0.45860773325/0.517112135887 G: 0.957448124886 (Real: [4.0875282829999922, 1.2310698313795749], Fake: [4.2767848205566406, 1.1186856033319335])
20200: D: 1.71172118187/0.240745082498 G: 0.314642876387 (Real: [3.8525538909435273, 1.2094100771830765], Fake: [3.6543397814035417, 1.2917598911679764])
20400: D: 0.583434104919/0.703361749649 G: 1.45571947098 (Real: [4.0388400733470915, 1.2267253073862441], Fake: [3.9019298100471498, 1.0292402192122965])
20600: D: 0.176266431808/0.55411952734 G: 0.962469100952 (Real: [4.0694609802961352, 1.2276659305759301], Fake: [3.9728190612792971, 1.1212652107309595])
20800: D: 1.17427504063/0.212535098195 G: 0.505771696568 (Real: [3.7983859290182589, 1.3565768879920506], Fake: [4.0766829651594163, 1.1742807548541911])
21000: D: 0.247546881437/0.242251947522 G: 2.533826828 (Real: [4.048124186992645, 1.2074367711533176], Fake: [3.8443934541940687, 1.0964556009967605])
21200: D: 0.000996549613774/1.77280521393 G: 0.741032421589 (Real: [3.8826335191726686, 1.3432952882949609], Fake: [4.0052364200353621, 1.0658632049377181])
21400: D: 0.0162861924618/0.202122434974 G: 0.640827775002 (Real: [3.949158318042755, 1.2312223613675215], Fake: [3.9677765011787414, 1.1984950273079937])
21600: D: 0.494586825371/0.368914216757 G: 1.73299539089 (Real: [4.2141097390651705, 1.3170628249721785], Fake: [3.9259325069189073, 1.2402090610341174])
21800: D: 1.72856020927/0.280478566885 G: 0.301942139864 (Real: [3.9425574642419816, 1.3421295277895979], Fake: [4.1370714265108113, 1.3135434962232824])
22000: D: 0.316263616085/0.425417006016 G: 4.6092467308 (Real: [3.9253722500801085, 1.1573266813219236], Fake: [3.7590440094470976, 1.2176312271677099])
22200: D: 1.70313096046/0.166758075356 G: 1.76803898811 (Real: [4.1788750314712528, 1.3796412025948377], Fake: [4.4896411395072935, 0.88890948354147137])
22400: D: 0.00245383195579/0.618139982224 G: 0.561835348606 (Real: [4.0531666296720505, 1.3030890495946361], Fake: [3.9800510057806968, 1.2769573713555427])
22600: D: 0.0456999950111/0.270536243916 G: 0.719259619713 (Real: [3.8036734467744826, 1.2489490089903446], Fake: [4.2525720745325089, 1.3061806069103183])
22800: D: 0.0318684391677/0.34651991725 G: 1.3301807642 (Real: [4.0768313544988635, 1.2930152979365797], Fake: [4.4993063497543337, 1.2277717696258752])
23000: D: 1.38112533092/0.656377196312 G: 0.700986683369 (Real: [4.0261077487468722, 1.1634786009859657], Fake: [4.1274698692560197, 1.1909195549188023])
23200: D: 0.7532761693/0.30048418045 G: 1.24321329594 (Real: [4.0255234652757643, 1.2277433432951119], Fake: [4.0463824319839476, 1.2493841122917879])
23400: D: 1.54497790337/0.524266302586 G: 1.88104653358 (Real: [4.1244187545776363, 1.2126284333800423], Fake: [4.0199511092901226, 1.4125067136876193])
23600: D: 0.838026106358/1.1139113903 G: 2.2735543251 (Real: [4.0352903008460999, 1.1687086536829701], Fake: [4.5685070466995237, 1.4508884769834012])
23800: D: 0.869914472103/0.160864800215 G: 1.42444908619 (Real: [4.1635012495517731, 1.1441051019240691], Fake: [4.1520407730340958, 1.2022442680490875])
24000: D: 0.0401677601039/0.240127012134 G: 1.21359109879 (Real: [4.0558859372138976, 1.1263029268841764], Fake: [3.8535136532783509, 0.99055012605544335])
24200: D: 0.444084912539/0.761975646019 G: 1.18176090717 (Real: [4.1462872040271757, 1.1670976588949802], Fake: [4.0291124176979061, 1.4000525541431663])
24400: D: 0.259448975325/0.206390738487 G: 0.850725114346 (Real: [4.2600694203376772, 1.3260391555100224], Fake: [4.7161277580261229, 1.3763624799621637])
24600: D: 0.821855664253/0.381440609694 G: 0.898442983627 (Real: [3.9929001557826997, 1.316718033939094], Fake: [3.659836998283863, 1.033547623133473])
24800: D: 0.869792580605/0.143853545189 G: 1.68244981766 (Real: [3.9503055346012115, 1.1980136516743376], Fake: [4.3753550618886949, 1.4268488751378543])
25000: D: 0.533834278584/0.944993913174 G: 1.35653877258 (Real: [3.8403973925113677, 1.1415226099240794], Fake: [4.3022644245624546, 1.277824404897737])
25200: D: 0.57686984539/1.21011674404 G: 0.49785476923 (Real: [4.1094828593730925, 1.0606124114518727], Fake: [3.8350191235542299, 1.1822398134788241])
25400: D: 1.30570268631/0.127069279552 G: 2.14658904076 (Real: [3.8440176880359651, 1.2759016439053388], Fake: [4.2303895175457003, 1.2478330871411345])
25600: D: 0.163877904415/0.356351107359 G: 1.50513041019 (Real: [3.9149920016527178, 1.3322359586431274], Fake: [4.5107577931880947, 1.37733363996175])
25800: D: 0.0257995054126/0.501479804516 G: 0.846267580986 (Real: [4.0328698861598973, 1.0891363228332751], Fake: [4.2062628841400143, 1.2707193105443095])
26000: D: 0.4208984375/0.45090213418 G: 1.24405300617 (Real: [4.0495267909765245, 1.3629959211491509], Fake: [3.881335927248001, 1.1534035700479874])
26200: D: 1.0977101326/0.260044932365 G: 0.274282753468 (Real: [4.0526520502567287, 1.1354404896569923], Fake: [3.7989616423845289, 1.3036229409468019])
26400: D: 0.836492598057/0.194570705295 G: 1.25769793987 (Real: [4.2580243301391603, 1.1229754918621602], Fake: [4.9420129108428954, 1.4595622988211396])
26600: D: 0.0381172671914/0.229116663337 G: 3.23367476463 (Real: [3.9871047949790954, 1.2891811878363044], Fake: [5.5130027627944944, 1.3531596753079107])
26800: D: 0.33750808239/0.0588937625289 G: 2.76632380486 (Real: [4.0901136839389798, 1.2240984948711151], Fake: [5.9970619964599612, 1.3296608494175821])
27000: D: 0.403919011354/0.025144957006 G: 5.00026988983 (Real: [3.9684947764873506, 1.1928812330565042], Fake: [5.5821900677680967, 1.5869340992569609])
27200: D: 1.26118826866/1.14945113659 G: 0.233536079526 (Real: [4.0953157800436024, 1.2000917970554563], Fake: [3.457775202393532, 1.2362199991432059])
27400: D: 0.842516124249/0.577941656113 G: 0.518706798553 (Real: [3.8673747038841246, 1.1826108239366226], Fake: [3.6999527400732042, 1.2050256827670227])
27600: D: 0.459548681974/0.516558885574 G: 1.69328427315 (Real: [4.0379843235015871, 1.267741160236167], Fake: [4.3069088852405546, 1.2883256614455194])
27800: D: 0.757292568684/0.295852422714 G: 0.82683211565 (Real: [3.6750951480865477, 1.1881818498282759], Fake: [4.3079475378990173, 1.3863961893145142])
28000: D: 1.0311729908/0.836829304695 G: 0.54562240839 (Real: [3.8109287106990815, 1.2699445078581264], Fake: [4.0800623488426204, 1.2420579399013889])
28200: D: 0.662180066109/0.698618113995 G: 0.430238395929 (Real: [3.8820258617401122, 1.3192879801078357], Fake: [3.8678512275218964, 1.2100339116659864])
28400: D: 0.857332766056/0.637849986553 G: 0.443328052759 (Real: [4.0044168281555175, 1.2977773729964786], Fake: [3.77621297955513, 1.10884790779666])
28600: D: 0.518617451191/0.676390469074 G: 0.824631929398 (Real: [3.9321113193035124, 1.189980080467403], Fake: [4.1412628889083862, 1.4110153520360829])
28800: D: 0.924657285213/0.57682287693 G: 0.867313206196 (Real: [3.8806186806410552, 1.2663798129949515], Fake: [3.7928846073150635, 0.96599856269415929])
29000: D: 0.681347727776/0.833830595016 G: 0.880895376205 (Real: [4.0122552135586735, 1.3382642859979685], Fake: [3.8699622356891634, 1.5246898233773196])
29200: D: 0.690975308418/0.571468651295 G: 0.539677977562 (Real: [3.9422134029865266, 1.2798402813873653], Fake: [3.4796924066543578, 1.0078584415562459])
29400: D: 0.600927650928/0.692537486553 G: 0.785535871983 (Real: [4.0494313037395475, 1.2729051468200046], Fake: [4.0457676327228542, 1.2121629628604733])
29600: D: 0.662378668785/0.552553355694 G: 0.665563106537 (Real: [3.8692034566402436, 1.1988600586203602], Fake: [4.3626180648803707, 1.3098951956607312])
29800: D: 0.844242811203/0.719559967518 G: 0.89226102829 (Real: [3.8751950478553772, 1.1053984789259368], Fake: [3.9671442759037019, 1.1584875699071935])