Commit: IROS update
cbschaff committed Sep 19, 2017
1 parent 9baca58 commit 7d65985
Showing 10 changed files with 365 additions and 8 deletions.
239 changes: 239 additions & 0 deletions maps/map3.txt
@@ -0,0 +1,239 @@
1000.00 700.00
25 25
0.00 700.00 0.00 0.00
0.00 0.00 1000.00 0.00
1000.00 0.00 1000.00 700.00
1000.00 700.00 0.00 700.00
210.86 0.00 210.86 167.43
210.86 167.43 135.93 167.43
135.93 167.43 25.14 167.43
25.14 167.43 114.07 167.43
114.07 167.43 114.07 0.00
0.00 230.00 155.71 230.00
212.86 230.00 212.86 404.29
212.86 404.29 0.00 404.29
274.29 0.00 274.29 60.00
274.29 60.00 304.29 60.00
320.00 60.00 320.00 2.86
320.00 2.86 321.43 0.00
341.43 60.00 371.43 60.00
371.43 60.00 371.43 0.00
370.00 60.00 408.57 60.00
424.29 60.00 424.29 0.00
440.43 60.00 470.73 60.00
470.73 60.00 470.73 0.00
470.73 60.00 502.05 60.00
516.19 60.00 516.19 0.00
533.36 60.00 562.65 60.00
562.65 60.00 562.65 0.00
563.67 60.00 595.06 60.00
609.12 60.00 609.12 0.00
274.29 80.00 274.29 153.51
274.29 153.51 325.27 153.51
325.27 153.51 325.27 80.00
325.27 80.00 288.90 80.00
325.27 80.00 364.67 80.00
325.27 153.51 394.97 153.51
394.97 153.51 394.97 80.00
406.08 80.00 436.39 80.00
436.39 80.00 436.39 153.51
436.39 153.51 394.97 153.51
436.39 80.00 472.75 80.00
436.39 153.51 482.85 153.51
482.85 153.51 482.85 80.00
492.95 80.00 525.28 80.00
525.28 80.00 525.28 153.51
525.28 153.51 482.85 153.51
525.28 80.00 547.50 80.00
525.28 153.51 567.14 153.51
567.14 153.51 567.14 80.00
583.57 80.00 609.12 80.00
609.12 80.00 609.12 153.51
609.12 153.51 567.14 153.51
274.29 153.51 274.29 215.72
274.29 215.72 307.14 215.72
325.27 215.72 325.27 153.51
344.29 215.72 381.43 215.72
381.43 215.72 381.43 153.51
381.43 215.72 420.00 215.72
437.14 215.72 437.14 153.51
460.00 215.72 490.00 215.72
490.00 215.72 490.00 153.51
490.00 215.72 518.57 215.72
531.43 215.72 531.43 153.51
548.57 215.72 575.71 215.72
575.71 215.72 575.71 153.51
575.71 215.72 591.95 215.72
609.12 215.72 609.12 153.51
665.71 0.00 665.71 32.86
665.71 52.86 665.71 211.43
665.71 230.00 665.71 377.15
665.71 377.15 1000.00 377.15
35.36 274.73 161.62 274.73
161.62 274.73 161.62 404.29
336.38 263.62 289.91 263.62
289.91 263.62 289.91 322.21
289.91 322.21 336.38 322.21
336.38 322.21 336.38 278.91
389.92 323.22 389.92 262.61
389.92 262.61 452.55 262.61
452.55 262.61 452.55 323.22
452.55 323.22 407.23 323.22
493.96 277.88 493.96 318.16
493.96 318.16 544.47 318.16
544.47 318.16 544.47 261.60
544.47 261.60 493.96 261.60
356.58 370.77 356.58 438.45
356.58 438.45 289.91 438.45
289.91 438.45 289.91 370.77
289.91 370.77 339.25 370.77
574.86 405.16 574.86 361.85
574.86 361.85 524.35 361.85
524.35 361.85 524.35 424.47
524.35 424.47 574.86 424.47
410.24 416.49 410.24 377.76
410.24 377.76 478.81 377.76
478.81 377.76 478.81 435.22
478.81 435.22 410.24 435.22
0.00 473.73 337.86 473.73
11.03 514.13 40.41 514.13
40.41 514.13 53.88 514.13
53.88 514.13 92.74 514.13
92.74 514.13 119.70 514.13
119.70 514.13 133.34 514.13
133.34 514.13 154.02 514.13
154.02 514.13 178.87 514.13
178.87 514.13 207.30 514.13
207.30 514.13 229.72 514.13
229.72 514.13 264.15 514.13
264.15 514.13 289.00 514.13
289.00 514.13 321.43 514.13
40.41 473.73 40.41 514.13
77.78 514.13 77.78 473.73
106.07 514.13 106.07 473.73
133.34 514.13 133.34 473.73
165.66 514.13 165.66 473.73
190.92 514.13 190.92 473.73
220.21 514.13 220.21 473.73
244.46 514.13 244.46 473.73
275.77 514.13 275.77 473.73
302.04 514.13 302.04 473.73
337.86 514.13 337.86 473.73
0.00 542.43 29.12 542.43
29.12 542.43 47.35 542.43
47.35 542.43 79.58 542.43
79.58 542.43 95.81 542.43
95.81 542.43 128.71 542.43
128.71 542.43 148.27 542.43
148.27 542.43 177.16 542.43
177.16 542.43 199.39 542.43
199.39 542.43 235.62 542.43
235.62 542.43 256.13 542.43
256.13 542.43 288.91 542.43
288.91 542.43 307.69 542.43
307.69 542.43 330.47 542.43
337.86 577.43 0.00 577.43
337.86 542.43 337.86 577.43
39.29 577.43 39.29 542.43
65.71 577.43 65.71 542.43
87.86 577.43 87.86 542.43
110.71 577.43 110.71 542.43
137.14 577.43 137.14 542.43
160.00 577.43 160.00 542.43
192.86 577.43 192.86 542.43
217.86 577.43 217.86 542.43
246.43 577.43 246.43 542.43
274.29 577.43 274.29 542.43
298.57 577.43 298.57 542.43
318.57 577.43 318.57 542.43
0.00 621.43 23.12 621.43
23.12 621.43 44.23 621.43
44.23 621.43 85.35 621.43
85.35 621.43 100.46 621.43
100.46 621.43 125.58 621.43
125.58 621.43 140.70 621.43
140.70 621.43 168.93 621.43
168.93 621.43 186.04 621.43
186.04 621.43 223.16 621.43
223.16 621.43 244.28 621.43
244.28 621.43 281.39 621.43
281.39 621.43 294.51 621.43
294.51 621.43 327.62 621.43
327.62 621.43 337.86 621.43
337.86 621.43 337.86 575.72
0.00 651.43 22.65 651.43
22.65 651.43 41.30 651.43
41.30 651.43 85.96 651.43
85.96 651.43 102.61 651.43
102.61 651.43 147.26 651.43
147.26 651.43 167.91 651.43
167.91 651.43 210.56 651.43
210.56 651.43 229.21 651.43
229.21 651.43 274.52 651.43
274.52 651.43 289.17 651.43
289.17 651.43 349.82 651.43
349.82 651.43 368.47 651.43
368.47 651.43 413.12 651.43
413.12 651.43 426.43 651.43
426.43 651.43 426.43 700.00
33.57 621.43 33.57 577.43
64.29 621.43 64.29 577.43
93.57 621.43 93.57 577.43
132.71 621.43 132.71 577.43
176.43 577.43 176.43 621.43
233.57 577.43 233.57 621.43
265.00 621.43 265.00 577.43
287.86 621.43 287.86 577.43
309.29 576.43 309.29 621.43
32.86 651.43 32.86 700.00
65.00 700.00 65.00 651.43
94.29 651.43 94.29 700.00
127.14 700.00 127.14 651.43
157.86 651.43 157.86 700.00
187.14 651.43 187.14 700.00
220.00 700.00 220.00 651.43
250.71 651.43 250.71 700.00
280.71 700.00 280.71 651.43
320.71 651.43 320.71 700.00
360.00 700.00 360.00 651.43
391.43 651.43 391.43 700.00
206.43 577.43 206.43 621.43
630.00 561.43 719.29 561.43
719.29 561.43 719.29 497.14
719.29 497.14 649.29 497.14
649.29 497.14 649.29 547.86
649.29 547.86 628.57 547.86
652.86 561.43 652.86 595.00
652.86 595.00 688.57 595.00
688.57 595.00 688.57 644.29
719.29 561.43 755.00 561.43
755.00 561.43 755.00 644.29
755.00 644.29 701.00 644.29
755.00 617.15 880.00 617.15
880.00 617.15 880.00 587.14
755.00 570.72 880.00 570.72
797.14 570.72 797.14 493.57
797.14 493.57 730.71 493.57
797.14 493.57 833.57 493.57
833.57 493.57 833.57 523.57
833.57 523.57 880.00 523.57
880.00 523.57 880.00 550.00
796.43 377.14 796.43 414.29
796.43 414.29 842.14 414.29
842.14 414.29 842.14 442.15
863.57 442.15 863.57 377.14
877.14 442.15 914.29 442.15
914.29 442.15 914.29 407.14
914.29 407.14 1000.00 407.14
914.29 442.15 914.29 472.86
914.29 472.86 987.43 472.86
1000.00 524.29 937.14 524.29
937.14 524.29 937.14 557.14
937.14 568.57 937.14 608.57
937.14 608.57 1000.00 608.57
743.57 644.29 743.57 682.14
820.00 617.15 820.00 700.00
880.00 617.15 880.00 660.00
880.00 660.00 910.00 660.00
910.00 660.00 910.00 672.00
910.00 672.00 910.00 700.00
2 changes: 1 addition & 1 deletion src/eval_model.py
@@ -81,7 +81,7 @@ def mprint(s):
# Eval Loop
mprint(stamp()+"Starting model evaluation:")
c = 0
-for i in xrange(n_batches):
+for i in range(n_batches):

# Run training step & get next batch
bdata = batcher.get_batch()
41 changes: 41 additions & 0 deletions src/experiments/anneal_map3.py
@@ -0,0 +1,41 @@
"""
Copyright (C) 2017 Charles Schaff, David Yunis, Ayan Chakrabarti,
Matthew R. Walter. See LICENSE.txt for details.
"""

# This experiment jointly learns beacon placement and localization on Map 3
# and anneals beacon regularization throughout training.

import models.inf as md
import models.trainable_beacon as smod
import models.prop1 as pmod

md.smod = smod
md.pmod = pmod

# Number of layer blocks in the neural net
md.nGroup = 6


# Map file.
MAPFILE = '../maps/map3.txt'

# Determine which checkpoints to keep
KEEPLAST = 1
KEEPEVERY= 1e6
SAVE_FREQ= 10000

# How often to display the loss and to compute loss under a hard beacon assignment
DISP_FREQ=25
HARD_FREQ=500

# Optimization hyperparams
MOM = 0.9
WEIGHT_DECAY=0.
BEACON_DECAY=.2
BEACON_ANNEAL=.25
BEACON_ANNEAL_FREQ=100000
BSZ=1000

LR = 0.01
MAX_ITER = 1e6
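The parameter names in this config suggest the beacon regularization weight starts at `BEACON_DECAY` and is shrunk by a factor of `BEACON_ANNEAL` every `BEACON_ANNEAL_FREQ` iterations. The sketch below illustrates that reading; the actual schedule lives in the training code, and "multiply every `BEACON_ANNEAL_FREQ` steps" is an assumption based only on the names.

```python
# Hedged sketch of how the annealing hyperparameters above might combine.
# This is an illustration of the presumed schedule, not the repository's
# actual implementation.

BEACON_DECAY = 0.2         # initial regularization weight
BEACON_ANNEAL = 0.25       # multiplicative shrink factor
BEACON_ANNEAL_FREQ = 100000  # iterations between shrink steps

def beacon_weight(iteration):
    """Presumed regularization weight at a given training iteration."""
    return BEACON_DECAY * BEACON_ANNEAL ** (iteration // BEACON_ANNEAL_FREQ)
```

Under this reading the weight would be 0.2 for the first 100k iterations, 0.05 for the next 100k, and so on toward zero over the 1e6-iteration run.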
39 changes: 39 additions & 0 deletions src/experiments/fixed_beacon_map3.py
@@ -0,0 +1,39 @@
"""
Copyright (C) 2017 Charles Schaff, David Yunis, Ayan Chakrabarti,
Matthew R. Walter. See LICENSE.txt for details.
"""

# This experiment learns to localize with our best performing fixed
# beacon layout on Map 3.

import models.inf as md
import models.beacon14 as smod
import models.prop1 as pmod

md.smod = smod
md.pmod = pmod

# Number of layer blocks in the neural net
md.nGroup = 6


# Map file.
MAPFILE = '../maps/map2.txt'

# Determine which checkpoints to keep
KEEPLAST = 1
KEEPEVERY= 1e6
SAVE_FREQ= 10000

# How often to display the loss and to compute loss under a hard beacon assignment
DISP_FREQ=25
HARD_FREQ=500

# Optimization hyperparams
MOM = 0.9
WEIGHT_DECAY=0.
BEACON_DECAY=0.
BSZ=1000

LR = 0.01
MAX_ITER = 1e6
38 changes: 38 additions & 0 deletions src/experiments/no_reg_map3.py
@@ -0,0 +1,38 @@
"""
Copyright (C) 2017 Charles Schaff, David Yunis, Ayan Chakrabarti,
Matthew R. Walter. See LICENSE.txt for details.
"""

# This experiment jointly learns beacon placement and localization on Map 3
# with no regularization on beacon placement.
import models.inf as md
import models.trainable_beacon as smod
import models.prop1 as pmod

md.smod = smod
md.pmod = pmod

# Number of layer blocks in the neural net
md.nGroup = 6


# Map file.
MAPFILE = '../maps/map3.txt'

# Determine which checkpoints to keep
KEEPLAST = 1
KEEPEVERY= 1e6
SAVE_FREQ= 10000

# How often to display the loss and to compute loss under a hard beacon assignment
DISP_FREQ=25
HARD_FREQ=500

# Optimization hyperparams
MOM = 0.9
WEIGHT_DECAY=0.
BEACON_DECAY=0.
BSZ=1000

LR = 0.01
MAX_ITER = 1e6
2 changes: 1 addition & 1 deletion src/gen_train_data.py
@@ -54,7 +54,7 @@ def gen_train_data(mapfile, nbatch):

mapfile = sys.argv[1]
if len(sys.argv) < 3:
-nbatch = 200
+nbatch = 2000
else:
nbatch = int(sys.argv[2])

2 changes: 1 addition & 1 deletion src/models/prop2.py
@@ -9,7 +9,7 @@
import numpy as np

WFAC = 0.1
-DMIN = 1.0
+DMIN = 0.001

P0 = 6.25e-4
zeta = 2.0
2 changes: 1 addition & 1 deletion src/models/prop3.py
@@ -9,7 +9,7 @@
import numpy as np

WFAC = 0.5
-DMIN = 1.0
+DMIN = 0.001

P0 = 6.25e-4
zeta = 2.0
Expand Down
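Both `prop2.py` and `prop3.py` lower `DMIN` from 1.0 to 0.001 alongside the constants `P0` and `zeta`, which read like the parameters of an inverse-power path-loss model. The sketch below shows the role a minimum-distance clamp typically plays in such a model; the exact formula in these files is an assumption, and `received_power` is a hypothetical name for illustration only.

```python
import numpy as np

# Hedged sketch of an inverse-power path-loss model using the constants from
# prop3.py. The repository's actual propagation formula may differ; this only
# illustrates why a DMIN clamp exists: it keeps the received power finite as
# the distance approaches zero.

P0 = 6.25e-4   # reference power constant
zeta = 2.0     # path-loss exponent
DMIN = 0.001   # minimum distance (the value changed in this commit)

def received_power(d):
    d = np.maximum(d, DMIN)  # clamp to avoid a singularity at d == 0
    return P0 / d ** zeta
```

Under this reading, shrinking `DMIN` from 1.0 to 0.001 lets power keep rising sharply at sub-unit distances instead of plateauing, which changes the gradient signal near a beacon.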