-
upf_demos
Description: Demonstrates the differences between the following filters on the same problem (a minimal particle-filter sketch follows this entry):
1) Extended Kalman Filter (EKF)
2) Unscented Kalman Filter (UKF)
3) Particle Filter (PF)
4) PF with EKF proposal (PFEKF)
5) PF with UKF proposal (PFUKF)
- Downloaded: 2008-09-13 12:21:05
- Points: 1
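As a companion to this entry, here is a minimal bootstrap particle filter sketch in Python/NumPy, i.e. filter 3) from the list above. The scalar state-space model, noise levels and particle count are illustrative assumptions, not necessarily the benchmark used in upf_demos.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar nonlinear model (an assumption, not the package's exact benchmark):
#   x_k = 0.5*x_{k-1} + 25*x_{k-1}/(1+x_{k-1}^2) + 8*cos(1.2*k) + process noise
#   y_k = x_k^2 / 20 + measurement noise
def f(x, k):
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * k)

def h(x):
    return x**2 / 20.0

T, N = 50, 500                      # time steps, number of particles
q, r = 1.0, 1.0                     # process / measurement noise std-dev

# Simulate a ground-truth trajectory and noisy observations.
x_true, y_obs = np.zeros(T), np.zeros(T)
x = 0.1
for k in range(T):
    x = f(x, k) + q * rng.standard_normal()
    x_true[k] = x
    y_obs[k] = h(x) + r * rng.standard_normal()

# Bootstrap PF: propagate with the transition prior, weight by the likelihood, resample.
particles = rng.standard_normal(N)
x_est = np.zeros(T)
for k in range(T):
    particles = f(particles, k) + q * rng.standard_normal(N)      # predict
    logw = -0.5 * ((y_obs[k] - h(particles)) / r) ** 2            # log-likelihood weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    x_est[k] = np.sum(w * particles)                              # posterior-mean estimate
    particles = particles[rng.choice(N, size=N, p=w)]             # multinomial resampling

print("RMSE:", np.sqrt(np.mean((x_est - x_true) ** 2)))
```

The EKF/UKF-proposal variants differ only in how the predict step is replaced by a Gaussian proposal built from an EKF or UKF update per particle.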
-
detect_face
Face detection algorithm in RGB color space (a skin-color rule sketch follows below).
- Downloaded: 2011-12-16 05:35:19
- Points: 1
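A minimal sketch of the skin-color pre-filter commonly used as the first stage of RGB face detection. The thresholds are a widely quoted rule of thumb and are an assumption; the detect_face package may use a different classifier.

```python
import numpy as np

def skin_mask(img):
    """Boolean mask of likely skin pixels for a uint8 RGB image of shape (H, W, 3),
    using a common rule-of-thumb RGB skin classifier (assumed, not the package's rule)."""
    img = img.astype(np.int32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return ((r > 95) & (g > 40) & (b > 20) &
            (img.max(axis=-1) - img.min(axis=-1) > 15) &
            (np.abs(r - g) > 15) & (r > g) & (r > b))

# Usage: candidate face regions are connected components of the skin mask,
# typically filtered afterwards by size and aspect ratio.
demo = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)
print(skin_mask(demo))
```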
-
show_pso
Description: Demonstration program for the particle swarm optimization (PSO) algorithm, with a graphical interface (a minimal PSO sketch follows below).
- Downloaded: 2006-04-04 21:36:21
- Points: 1
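A minimal global-best PSO sketch in Python/NumPy, without the graphical interface. The objective function, swarm size and coefficients are illustrative assumptions, not the values used in show_pso.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):                       # toy objective to minimise
    return np.sum(x**2, axis=-1)

# Standard global-best PSO with inertia weight.
n_particles, dim, iters = 30, 2, 100
w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social coefficients (assumed)

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), sphere(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = sphere(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best position:", gbest, "best value:", sphere(gbest))
```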
-
conjugated-MUSIC
Conjugate MUSIC and smoothed MUSIC for direction-of-arrival (DOA) estimation (a plain-MUSIC sketch follows below).
- Downloaded: 2013-05-06 21:21:26
- Points: 1
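A minimal sketch of plain MUSIC on a uniform linear array, the common starting point for the conjugate and smoothed variants in this package. Array geometry, source angles and SNR are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Uniform linear array: M sensors at half-wavelength spacing, K narrowband sources.
M, K, snapshots = 8, 2, 200
true_doas = np.deg2rad([-20.0, 35.0])

def steering(theta):                 # M x len(theta) steering matrix
    m = np.arange(M)[:, None]
    return np.exp(1j * np.pi * m * np.sin(np.atleast_1d(theta)))

# Simulate snapshots: X = A s + noise.
A = steering(true_doas)
s = (rng.standard_normal((K, snapshots)) + 1j * rng.standard_normal((K, snapshots))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots)))
X = A @ s + noise

# MUSIC: project scan steering vectors onto the noise subspace of the sample covariance.
R = X @ X.conj().T / snapshots
eigval, eigvec = np.linalg.eigh(R)
En = eigvec[:, :M - K]               # noise subspace (smallest M-K eigenvalues)

scan = np.deg2rad(np.linspace(-90, 90, 721))
P = 1.0 / np.sum(np.abs(En.conj().T @ steering(scan)) ** 2, axis=0)   # pseudospectrum

# Pick the K largest local maxima of the pseudospectrum.
is_peak = (P[1:-1] > P[:-2]) & (P[1:-1] > P[2:])
peak_idx = np.where(is_peak)[0] + 1
top = peak_idx[np.argsort(P[peak_idx])[-K:]]
print("estimated DOAs (deg):", np.sort(np.rad2deg(scan[top])))
```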
-
BSpline
A simple self-written MATLAB program for B-spline basis functions, constructed by the convolution method (see the sketch below).
- Downloaded: 2012-05-16 17:09:28
- Points: 1
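A minimal sketch of the convolution construction of uniform B-spline basis functions: the degree-n basis is the (n+1)-fold convolution of the degree-0 box function. Grid spacing and degree are illustrative choices; this is not the package's code.

```python
import numpy as np

dt = 0.001
box = np.ones(int(1 / dt))                 # degree-0 basis: 1 on [0, 1)
B = box.copy()

degree = 3                                 # cubic B-spline basis
for _ in range(degree):
    B = np.convolve(B, box) * dt           # continuous convolution ≈ discrete conv * dt

print("support length ≈", len(B) * dt)     # degree-n basis is supported on [0, n+1): ≈ 4
print("integral       ≈", B.sum() * dt)    # each basis function integrates to 1
print("peak value     ≈", B.max())         # the cubic basis peaks at 2/3 at its centre
```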
-
Double_buck_PI_Control
Double_buck_PI_Control.rar: a MATLAB simulation file from the power electronics field; it has been verified and runs correctly (a simplified buck/PI sketch follows below).
- Downloaded: 2013-08-26 22:38:07
- Points: 1
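A minimal sketch of a single averaged (continuous-conduction-mode) buck converter regulated by a discrete PI voltage loop; the package's dual-buck topology and tuned controller are not reproduced. Component values, gains and time step are assumptions.

```python
import numpy as np

# Averaged buck model:  L diL/dt = d*Vin - vC,   C dvC/dt = iL - vC/R
Vin, L, C, R = 24.0, 100e-6, 470e-6, 5.0     # input voltage, inductor, capacitor, load (assumed)
Vref = 12.0
Kp, Ki = 0.002, 10.0                         # PI gains (assumed, deliberately low-bandwidth)
dt, T = 1e-6, 0.1                            # integration step, simulated time

iL, vC, integ = 0.0, 0.0, 0.0
for _ in range(int(T / dt)):
    # PI controller on the output-voltage error, duty cycle clamped to [0, 1].
    e = Vref - vC
    integ += e * dt
    d = np.clip(Kp * e + Ki * integ, 0.0, 1.0)

    # Averaged converter dynamics, forward-Euler step.
    iL += (d * Vin - vC) / L * dt
    vC += (iL - vC / R) / C * dt

print(f"steady-state output ≈ {vC:.3f} V (target {Vref} V)")
```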
-
Matlab-and-C-CPP-mixed-programming
MATLAB and C/C++ mixed programming, by Liu Wei.
- Downloaded: 2013-09-09 20:53:41
- Points: 1
-
fastICA
FastICA based on negentropy maximization, as a MATLAB program. The formulas used in the program are highlighted in the accompanying PDF file (a one-unit fixed-point sketch follows below).
- Downloaded: 2013-10-29 19:24:49
- Points: 1
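A minimal sketch of the one-unit FastICA fixed-point iteration with the tanh ("logcosh") nonlinearity after whitening, i.e. the negentropy-maximization update the description refers to. The mixing matrix and sources are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two non-Gaussian sources mixed linearly (illustrative data).
n = 5000
s = np.vstack([np.sign(np.sin(np.linspace(0, 40, n))),      # square-ish wave
               rng.laplace(size=n)])                         # Laplacian source
A = np.array([[1.0, 0.6], [0.4, 1.0]])
x = A @ s

# Centre and whiten the mixtures (a prerequisite of FastICA).
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)) @ E.T @ x        # whitening: cov(z) ≈ I

# One-unit fixed-point iteration:  w <- E[z g(w'z)] - E[g'(w'z)] w,  then normalise.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    wz = w @ z
    g, g_prime = np.tanh(wz), 1 - np.tanh(wz) ** 2
    w_new = (z * g).mean(axis=1) - g_prime.mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1) < 1e-8   # converged up to sign
    w = w_new
    if converged:
        break

print("recovered direction w =", w)
print("correlation with sources:", np.corrcoef(w @ z, s)[0, 1:])
```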
-
chaoliu
Power-flow calculation program for power systems (computes bus voltage magnitudes and phase angles, the active and reactive power injected into each line, and line losses); a small Gauss-Seidel sketch follows below.
- Downloaded: 2014-07-14 20:00:41
- Points: 1
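A minimal Gauss-Seidel AC power-flow sketch on a hypothetical 3-bus system, reporting voltage magnitudes/angles and total line losses as the description lists. The network data are illustrative assumptions, and the chaoliu program may use a different solution method (e.g. Newton-Raphson).

```python
import numpy as np

# Hypothetical 3-bus system: bus 0 = slack at 1.0 pu, buses 1-2 = PQ load buses.
lines = {(0, 1): 0.02 + 0.06j,      # series impedances, per unit (shunts ignored)
         (0, 2): 0.08 + 0.24j,
         (1, 2): 0.06 + 0.18j}

n = 3
Y = np.zeros((n, n), dtype=complex)  # bus admittance matrix
for (i, j), z in lines.items():
    y = 1 / z
    Y[i, j] -= y
    Y[j, i] -= y
    Y[i, i] += y
    Y[j, j] += y

S = np.array([0.0, -(0.5 + 0.2j), -(0.6 + 0.25j)])   # specified injections (loads negative)
V = np.ones(n, dtype=complex)                        # flat start; the slack bus stays 1.0 + 0j

for _ in range(100):                                 # Gauss-Seidel sweeps
    for i in range(1, n):
        sum_other = Y[i] @ V - Y[i, i] * V[i]
        V[i] = (np.conj(S[i]) / np.conj(V[i]) - sum_other) / Y[i, i]

print("|V| (pu):   ", np.round(np.abs(V), 4))
print("angle (deg):", np.round(np.degrees(np.angle(V)), 3))

# Line active-power losses: sending-end plus receiving-end injections into each line.
total_loss = 0.0
for (i, j), z in lines.items():
    I = (V[i] - V[j]) / z
    total_loss += ((V[i] - V[j]) * np.conj(I)).real
print("total line loss (pu):", round(total_loss, 5))
```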
-
1807.01622
Description: Deep neural networks excel at function approximation, yet they are typically trained from scratch for each new task. Bayesian methods, such as Gaussian processes (GPs), can exploit prior knowledge to perform fast inference at test time, but GPs are computationally expensive and it is hard to design suitable priors. This paper proposes a neural model, Conditional Neural Processes (CNPs), that combines the benefits of both. CNPs are inspired by the flexibility of stochastic processes such as GPs, but they are structured as neural networks and trained by gradient descent. After training, CNPs make accurate predictions from only a handful of observations and scale to complex functions and large datasets. The paper demonstrates the performance and versatility of the approach on a range of canonical machine learning tasks, including regression, classification and image completion. (A minimal forward-pass sketch of the CNP architecture follows below.)
- Downloaded: 2020-06-23 22:20:02
- Points: 1
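A minimal, untrained forward-pass sketch of the CNP architecture described in the abstract: an encoder MLP maps each context pair (x_c, y_c) to a representation, representations are mean-aggregated, and a decoder MLP maps the aggregate together with each target input to a Gaussian predictive mean and variance. Layer sizes and data are illustrative assumptions, and the gradient-descent training loop on the predictive log-likelihood is omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

def mlp(sizes):
    """Random-weight MLP with ReLU hidden layers (weights would be learned in practice)."""
    Ws = [rng.standard_normal((a, b)) / np.sqrt(a) for a, b in zip(sizes[:-1], sizes[1:])]
    def forward(x):
        for W in Ws[:-1]:
            x = np.maximum(x @ W, 0.0)
        return x @ Ws[-1]
    return forward

r_dim = 16
encoder = mlp([2, 64, 64, r_dim])        # input: concat(x_c, y_c) for 1-D x and y
decoder = mlp([r_dim + 1, 64, 64, 2])    # input: concat(r, x_t); output: (mean, log-variance)

# Context set observed from some unknown function, and target inputs to predict at.
x_context = np.array([[-1.0], [0.0], [1.5]])
y_context = np.sin(x_context)
x_target = np.linspace(-2, 2, 5)[:, None]

r_c = encoder(np.concatenate([x_context, y_context], axis=1))   # one r_c per context point
r = r_c.mean(axis=0)                                            # permutation-invariant aggregate
dec_in = np.concatenate([np.tile(r, (len(x_target), 1)), x_target], axis=1)
out = decoder(dec_in)
mean, var = out[:, 0], np.exp(out[:, 1])                        # heteroscedastic Gaussian predictions
print("predictive mean:", np.round(mean, 3))
print("predictive var :", np.round(var, 3))
```

Because the aggregation is a mean over per-point encodings, the prediction is invariant to the ordering of the context set and its cost is linear in the number of context and target points, which is the efficiency argument made in the abstract.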