Dropout ----- it's said that from now on we won't XXXXX anymore
Posted by simple_wxl
The GPU and CPU implementations differ; what is posted here is the CPU version of dropout.
Let's go straight to the Caffe source. In the forward pass, a random mask is drawn from a Bernoulli distribution; at train time, the data is divided by the keep probability (i.e. multiplied by scale_), so the expected activation matches the test phase. In the backward pass, no new Bernoulli random numbers are drawn; the mask from the forward pass is reused. The scale is:

scale_ = 1. / (1. - threshold_);
template <typename Dtype>
void DropoutLayer<Dtype>::Forward_cpu(const vector<Blob<Dtype>*>& bottom,
    const vector<Blob<Dtype>*>& top) {
  const Dtype* bottom_data = bottom[0]->cpu_data();
  Dtype* top_data = top[0]->mutable_cpu_data();
  unsigned int* mask = rand_vec_.mutable_cpu_data();
  const int count = bottom[0]->count();
  if (this->phase_ == TRAIN) {
    // Create random numbers
    caffe_rng_bernoulli(count, 1. - threshold_, mask);
    for (int i = 0; i < count; ++i) {
      top_data[i] = bottom_data[i] * mask[i] * scale_;
    }
  } else {
    caffe_copy(bottom[0]->count(), bottom_data, top_data);
  }
}

template <typename Dtype>
void DropoutLayer<Dtype>::Backward_cpu(const vector<Blob<Dtype>*>& top,
    const vector<bool>& propagate_down,
    const vector<Blob<Dtype>*>& bottom) {
  if (propagate_down[0]) {
    const Dtype* top_diff = top[0]->cpu_diff();
    Dtype* bottom_diff = bottom[0]->mutable_cpu_diff();
    if (this->phase_ == TRAIN) {
      const unsigned int* mask = rand_vec_.cpu_data();
      const int count = bottom[0]->count();
      for (int i = 0; i < count; ++i) {
        bottom_diff[i] = top_diff[i] * mask[i] * scale_;
      }
    } else {
      caffe_copy(top[0]->count(), top_diff, bottom_diff);
    }
  }
}