-
histogram
Draws the histogram of an image: a statistical count first gives the number of bins, and the horizontal and vertical axes can be defined by the user; a minimal sketch follows this entry.
- 2011-01-01 22:09:12 Download
- Points: 1
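A minimal MATLAB sketch of the same idea, with an assumed example image and a user-chosen bin count (an illustration, not the uploaded code):

    % Count pixel intensities into a user-defined number of bins and plot them.
    img     = imread('cameraman.tif');       % assumed example grayscale image
    numBins = 32;                            % user-defined number of statistical blocks
    edges   = linspace(0, 255, numBins + 1); % bin edges over the 8-bit intensity range
    counts  = histcounts(double(img(:)), edges);
    bar(edges(1:end-1), counts, 'histc');    % histogram with a custom horizontal axis
    xlabel('Gray level'); ylabel('Pixel count');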
-
OFDM_simulation
A method of OFDM simulation using MATLAB; a minimal round-trip sketch follows this entry.
- 2006-07-10 10:45:53 Download
- Points: 1
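A minimal MATLAB sketch of one OFDM symbol round trip, assuming QPSK mapping, 64 subcarriers, a 16-sample cyclic prefix, and an ideal noiseless channel (illustrative assumptions, not the uploaded method):

    N  = 64;  cp = 16;
    bits = randi([0 1], 2*N, 1);
    sym  = (1 - 2*bits(1:2:end)) + 1j*(1 - 2*bits(2:2:end));  % QPSK symbols
    tx   = ifft(sym, N);                  % map subcarriers to the time domain
    tx   = [tx(end-cp+1:end); tx];        % prepend the cyclic prefix
    rx   = tx;                            % ideal channel for the sketch
    rx   = rx(cp+1:end);                  % remove the cyclic prefix
    hatS = fft(rx, N);                    % back to subcarrier symbols
    hatB = zeros(2*N, 1);
    hatB(1:2:end) = real(hatS) < 0;       % invert the QPSK mapping
    hatB(2:2:end) = imag(hatS) < 0;
    isequal(bits, hatB)                   % returns 1 (true): bits recovered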
-
Master-matlab-and-guidline
Mastering MATLAB, including comprehensive tutorials and a guide.
- 2012-04-24 22:33:42 Download
- Points: 1
-
Jakes-model
Two articles about the Jakes model, helpful for beginners.
- 2016-04-11 22:16:12 Download
- Points: 1
-
SloshingSquareTank2d
Source code that simulates sloshing in a 2D tank under sinusoidal excitation.
- 2013-03-11 23:23:46 Download
- Points: 1
-
PALM-PRINT
A palm print recognition project implemented in MATLAB.
- 2014-12-03 14:11:37 Download
- Points: 1
-
DiverMIMO
Documentation on multiple-input multiple-output (MIMO) systems.
- 2010-11-04 00:35:33 Download
- Points: 1
-
hough
Applies the Hough transform to images, typically to detect straight lines; a short usage sketch follows this entry.
- 2015-04-18 04:32:58 Download
- Points: 1
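A minimal usage sketch, assuming the Image Processing Toolbox and one of its example images (the uploaded package may work differently):

    img = imread('circuit.tif');
    bw  = edge(img, 'canny');                   % edge map feeds the transform
    [H, theta, rho] = hough(bw);                % standard Hough transform
    peaks = houghpeaks(H, 5);                   % strongest accumulator peaks
    lines = houghlines(bw, theta, rho, peaks);  % line segments in image space
    imshow(img); hold on;
    for k = 1:numel(lines)
        xy = [lines(k).point1; lines(k).point2];
        plot(xy(:,1), xy(:,2), 'LineWidth', 2); % overlay the detected lines
    end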
-
xj
Description: Speech enhancement with hard and soft thresholding; it runs directly and produces results, and can serve as a reference for speech enhancement algorithms. A sketch of the two threshold rules follows this entry.
- 2015-05-21 20:23:56 Download
- Points: 1
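A minimal sketch of the two threshold rules themselves, applied here to an arbitrary coefficient vector (the uploaded code presumably applies them to wavelet or spectral coefficients of the noisy speech):

    c = randn(1, 8);                      % stand-in for noisy transform coefficients
    T = 0.5;                              % threshold, often estimated from the noise level
    hard = c .* (abs(c) > T);             % hard threshold: keep or zero out
    soft = sign(c) .* max(abs(c) - T, 0); % soft threshold: shrink toward zero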
-
adaboost
The AdaBoost meta-algorithm is one of the most popular boosting ensemble methods; in plain terms, it trains classifiers serially and combines them with a weighted sum at the end.
The procedure is: every training example starts with the same weight, and the weights are normalized. After the first classifier is trained,
its weight alpha is computed and every example weight is updated; a second classifier is then trained in the same way, and so on,
until the error rate reaches 0 or the specified number of training rounds is reached. The final predicted label is the sign of the alpha-weighted sum of the individual classifiers' outputs.
The training process is therefore serial, and the example weights keep changing: misclassified examples gain weight while correctly classified examples lose weight.
AdaBoost is one popular boosting method; other ensemble methods include bagging and random forests.
Imbalanced data is usually handled by undersampling (discarding examples) or oversampling (repeatedly selecting certain examples); combining the two generally works best.
Examples far from the decision boundary can also be removed.
A minimal sketch of the training loop follows this entry.
- 2014-07-09 19:24:29 Download
- Points: 1
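A minimal MATLAB sketch of the serial training loop described above, using decision stumps as the weak learners on toy data (an illustrative assumption, not the uploaded code):

    X = [randn(20,2) + 1; randn(20,2) - 1];      % toy two-class 2-D data
    y = [ones(20,1); -ones(20,1)];               % labels in {-1, +1}
    [n, d] = size(X);
    w      = ones(n, 1) / n;                     % equal, normalized example weights
    T      = 10;                                 % maximum number of boosting rounds
    alpha  = zeros(T, 1);
    agg    = zeros(n, 1);                        % running alpha-weighted vote

    for t = 1:T
        % Find the decision stump (feature, threshold, polarity) with the
        % smallest weighted error under the current example weights.
        bestErr = inf;
        for j = 1:d
            for thr = unique(X(:,j))'
                for s = [1 -1]
                    pred = s * sign(X(:,j) - thr);  pred(pred == 0) = s;
                    err  = sum(w .* (pred ~= y));
                    if err < bestErr
                        bestErr = err;  bestPred = pred;
                    end
                end
            end
        end
        % Classifier weight and example-weight update: misclassified examples
        % gain weight, correctly classified ones lose weight.
        alpha(t) = 0.5 * log((1 - bestErr) / max(bestErr, 1e-16));
        w = w .* exp(-alpha(t) * y .* bestPred);
        w = w / sum(w);                          % keep the weights normalized
        % Stop early once the weighted vote classifies every training example.
        agg = agg + alpha(t) * bestPred;
        if all(sign(agg) == y),  break;  end
    end
    finalPred = sign(agg);                       % sign of the alpha-weighted sum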