# mehtadushy/DeepNetStuff
### Bits and Bobs from Deepnet Implementations

Mostly SIGGRAPH stuff; thesis material to come later.

New Layers With CPU and GPU Implementations

L1 Loss Layer

https://github.com/mehtadushy/DeepNetStuff/blob/master/caffe/include/caffe/layers/l1_loss_layer.hpp

https://github.com/mehtadushy/DeepNetStuff/blob/master/caffe/src/caffe/layers/l1_loss_layer.cpp

https://github.com/mehtadushy/DeepNetStuff/blob/master/caffe/src/caffe/layers/l1_loss_layer.cu

Usage

Exactly like Caffe's Euclidean (L2) loss layer: two bottoms (prediction and ground truth) and one top (the scalar loss).
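A minimal prototxt sketch of what that usage looks like. The type string `L1Loss` and the blob names are assumptions for illustration; check the registration macro in `l1_loss_layer.cpp` for the exact layer type name.

```
layer {
  name: "my_l1_loss"
  type: "L1Loss"            # <- assumed registration name; verify in l1_loss_layer.cpp
  bottom: "result"
  bottom: "ground_truth"
  top: "loss_value"
  loss_weight: 1
}
```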

Structural Similarity (SSIM) Loss Layer (Perception based loss)

https://github.com/mehtadushy/DeepNetStuff/blob/master/caffe/include/caffe/layers/ssim_loss_layer.hpp

https://github.com/mehtadushy/DeepNetStuff/blob/master/caffe/src/caffe/layers/ssim_loss_layer.cpp

https://github.com/mehtadushy/DeepNetStuff/blob/master/caffe/src/caffe/layers/ssim_loss_layer.cu

Parameters for SSIM

https://github.com/mehtadushy/DeepNetStuff/blob/master/caffe/src/caffe/proto/caffe.proto
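As a rough sketch of the shape of that parameter message, inferred only from the usage example below (the message name, field numbers, and defaults here are illustrative guesses, not taken from the repo's `caffe.proto`):

```
// Illustrative sketch only -- consult caffe.proto in the repo
// for the real message name, field numbers, and defaults.
message SSIMLossParameter {
  optional uint32 kernel_size = 1 [default = 8];
  optional uint32 stride      = 2 [default = 8];
  optional float  c1          = 3 [default = 0.0001];
  optional float  c2          = 4 [default = 0.001];
}
```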

Usage

```
layer {
  name: "mylosslayer"
  type: "SSIMLoss"
  bottom: "result"
  bottom: "ground_truth"
  top: "loss_value"
  loss_weight: 1             # <- set to whatever you fancy
  ssim_loss_param {
    kernel_size: 8           # <- Kernel size is tied to the (circular) Gaussian's variance; the kernel encloses +/- 3*sigma
    stride: 8                # <- Equal strides in both dimensions
    c1: 0.0001               # <- Leave these as they are
    c2: 0.001                # <- Leave these as they are
  }
}
```
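For reference, the quantity computed per window is the standard SSIM index, with `c1` and `c2` stabilising the two divisions. A minimal NumPy sketch, using a uniform window for clarity (the layer itself presumably applies a Gaussian-weighted window of the configured `kernel_size` and `stride`, so its per-window values will differ):

```python
import numpy as np

def ssim_patch(x, y, c1=0.0001, c2=0.001):
    """SSIM between two equally sized patches, with a uniform window.

    c1 and c2 stabilise the divisions; the defaults mirror the
    values in the usage example above.
    """
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    num = (2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)
    den = (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    return num / den
```

Since SSIM equals 1 for identical patches and is smaller otherwise, a loss layer would typically minimise something like the mean of `1 - ssim_patch(...)` over all windows.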
