-
CH2CH5CH7
Research on parameter identification with the least squares method; the programs implement the least squares algorithm step by step (see the sketch after this entry).
- 2008-05-13 22:02:16 download
- Points: 1
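The entry above names the method but not the model, so the following is only a minimal sketch of least squares parameter identification for an assumed linear model y = a*x + b fitted to synthetic noisy data; all names and values are hypothetical, not taken from the package.

```python
import numpy as np

# Synthetic data from the assumed model y = a*x + b plus measurement noise
rng = np.random.default_rng(0)
a_true, b_true = 2.0, -1.0
x = np.linspace(0.0, 10.0, 50)
y = a_true * x + b_true + 0.1 * rng.standard_normal(x.size)

# Regressor matrix and least squares estimate theta = argmin ||Phi*theta - y||^2
Phi = np.column_stack([x, np.ones_like(x)])
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)

print("estimated a, b:", theta)   # close to (2.0, -1.0)
```

Using `np.linalg.lstsq` rather than forming the normal equations explicitly avoids squaring the condition number of the regressor matrix.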
-
2242144_PDE_Galerkin (1)
Description: The Galerkin method, a numerical approximation method based on the weighted residual method, commonly used for differential equations (see the sketch after this entry).
- 2019-12-25 20:15:07 download
- Points: 1
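As an illustration only (not the code in the archive), here is a spectral Galerkin sketch for the boundary value problem -u'' = f on (0, 1) with u(0) = u(1) = 0, using sine basis functions so the stiffness matrix is diagonal; the test problem and function names are assumptions.

```python
import numpy as np

def galerkin_sine(f, n_modes=20, n_quad=400):
    """Approximate -u'' = f on (0,1), u(0)=u(1)=0, with a sine basis."""
    x = np.linspace(0.0, 1.0, n_quad)
    dx = x[1] - x[0]
    u = np.zeros_like(x)
    for k in range(1, n_modes + 1):
        phi = np.sin(k * np.pi * x)
        # Weighted-residual (Galerkin) condition: (u', phi_k') = (f, phi_k)
        load = np.sum(f(x) * phi) * dx        # simple quadrature for (f, phi_k)
        stiff = (k * np.pi) ** 2 / 2.0        # (phi_k', phi_k') = (k*pi)^2 / 2
        u += (load / stiff) * phi
    return x, u

# Example: f = pi^2 * sin(pi*x) has the exact solution u = sin(pi*x)
x, u = galerkin_sine(lambda x: np.pi**2 * np.sin(np.pi * x))
print("max error:", np.max(np.abs(u - np.sin(np.pi * x))))
```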
-
fahanshu
An exterior-point penalty function written in C++, used to estimate and experiment with the objective value (see the sketch after this entry).
- 2016-05-15 21:00:54 download
- Points: 1
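The archive is in C++; the sketch below shows the same exterior-point penalty idea in Python on an invented problem: minimize f(x) subject to g(x) <= 0 by minimizing f(x) + r * max(0, g(x))^2 for an increasing penalty weight r. The problem, penalty schedule, and names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: minimize (x0-2)^2 + (x1-1)^2  subject to  x0 + x1 <= 2
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
g = lambda x: x[0] + x[1] - 2.0            # feasible when g(x) <= 0

def penalized(x, r):
    # Exterior penalty: cost is added only when the constraint is violated
    return f(x) + r * max(0.0, g(x)) ** 2

x = np.array([0.0, 0.0])
for r in [1.0, 10.0, 100.0, 1000.0]:       # gradually increase the penalty weight
    x = minimize(penalized, x, args=(r,), method="Nelder-Mead").x

print("approximate constrained minimum:", x)   # tends toward (1.5, 0.5)
```

Raising r gradually, warm-starting each solve from the previous iterate, keeps the subproblems well conditioned; jumping straight to a huge r makes the penalized objective hard to minimize.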
-
jaya
Jaya algorithm source code (see the sketch after this entry).
- 2017-03-01 16:24:24 download
- Points: 1
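Since the entry gives no details, here is a minimal sketch of the Jaya update rule as published by Rao: each candidate moves toward the current best solution and away from the current worst using only random coefficients, with no algorithm-specific tuning parameters. The test function, population size, and bounds are assumptions.

```python
import numpy as np

def jaya(obj, bounds, pop_size=20, iters=200, seed=0):
    """Minimize obj over box bounds with the Jaya update rule."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.apply_along_axis(obj, 1, pop)
    for _ in range(iters):
        best, worst = pop[np.argmin(fit)], pop[np.argmax(fit)]
        r1 = rng.random((pop_size, dim))
        r2 = rng.random((pop_size, dim))
        # Move toward the best candidate and away from the worst one
        cand = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
        cand = np.clip(cand, lo, hi)
        cand_fit = np.apply_along_axis(obj, 1, cand)
        improved = cand_fit < fit                 # greedy acceptance
        pop[improved], fit[improved] = cand[improved], cand_fit[improved]
    return pop[np.argmin(fit)], fit.min()

# Example: the sphere function, minimum 0 at the origin
x, fx = jaya(lambda x: np.sum(x**2), (np.full(5, -5.0), np.full(5, 5.0)))
print(x, fx)
```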
-
EMC166
Crosstalk between multi-conductor transmission lines, solved via the BLT equations (see the sketch after this entry).
- 2013-03-07 13:34:32 download
- Points: 1
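The package solves the problem with the BLT equations; the sketch below is not a BLT solver but a much simpler weak-coupling estimate for an electrically short pair of lines (inductive plus capacitive coupling into the near end), with all termination and per-unit-length values invented.

```python
import numpy as np

# Driven-line terminations, victim-line terminations, and mutual parameters (assumed)
R_S, R_L = 50.0, 50.0          # source and load resistances of the driven line [ohm]
R_NE, R_FE = 50.0, 50.0        # near-end and far-end resistances of the victim line [ohm]
Lm, Cm = 0.2e-6, 10e-12        # mutual inductance [H/m] and capacitance [F/m]
length, V_S = 1.0, 1.0         # coupled length [m], source amplitude [V]

f = np.logspace(3, 7, 200)     # 1 kHz .. 10 MHz
jw = 1j * 2 * np.pi * f

# Driven-line current and voltage, assuming the line is electrically short
I_G = V_S / (R_S + R_L)
V_G = V_S * R_L / (R_S + R_L)

# Inductive plus capacitive coupling into the near-end termination
V_NE = ((R_NE / (R_NE + R_FE)) * jw * Lm * length * I_G
        + (R_NE * R_FE / (R_NE + R_FE)) * jw * Cm * length * V_G)

idx = np.argmin(np.abs(f - 1e6))
print("near-end crosstalk at 1 MHz [dB]:", 20 * np.log10(np.abs(V_NE[idx]) / V_S))
```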
-
step response
Step response of a vehicle (see the sketch after this entry).
- 2018-04-20 19:56:54 download
- Points: 1
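The vehicle model inside the package is unknown; as a hedged stand-in, the sketch below computes the step response of an assumed single-mass ride model m*x'' + c*x' + k*x = c*y' + k*y driven by a unit step in the road profile y, with illustrative parameter values.

```python
from scipy import signal
import matplotlib.pyplot as plt

# Assumed base-excited mass-spring-damper: X(s)/Y(s) = (c*s + k) / (m*s^2 + c*s + k)
m, c, k = 300.0, 1500.0, 20000.0          # kg, N*s/m, N/m (illustrative values)
sys = signal.TransferFunction([c, k], [m, c, k])

t, x = signal.step(sys)                   # response to a unit-step road input
plt.plot(t, x)
plt.xlabel("time [s]")
plt.ylabel("body displacement [m]")
plt.title("Step response of the assumed ride model")
plt.show()
```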
-
level-set-20120914
Level set method in Fortran; the basic governing equations are discretized and solved (see the sketch after this entry).
- 2012-09-28 13:22:30 download
- Points: 1
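The archive itself is Fortran; purely as an illustration of the method, here is a 1-D level set sketch in Python that moves the zero level set outward at constant speed F, using a first-order Godunov upwind discretization of phi_t + F*|phi_x| = 0.

```python
import numpy as np

# 1-D level set: phi_t + F*|phi_x| = 0; the interface is the zero level set of phi
n, F, dt, steps = 401, 1.0, 0.001, 100
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
phi = np.abs(x - 0.5) - 0.2          # interface initially at x = 0.3 and x = 0.7

for _ in range(steps):
    dm = np.diff(phi, prepend=phi[0]) / dx    # backward difference D^-
    dp = np.diff(phi, append=phi[-1]) / dx    # forward difference D^+
    # Godunov upwind gradient magnitude for F > 0 (outward motion)
    grad = np.sqrt(np.maximum(np.maximum(dm, 0.0)**2, np.minimum(dp, 0.0)**2))
    phi = phi - dt * F * grad

# The zero crossings should have moved outward by roughly F*dt*steps = 0.1
crossings = x[:-1][np.sign(phi[:-1]) != np.sign(phi[1:])]
print("interface near:", crossings)
```

The time step satisfies the CFL condition F*dt/dx < 1; a production level-set code would also periodically reinitialize phi to a signed distance function.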
-
harmonic
Power system harmonic power flow calculation program: using a decoupled algorithm, the harmonic voltage at each node is computed from the harmonic power emission value at each node (see the sketch after this entry).
- 2013-01-10 18:44:18 download
- Points: 1
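A hedged sketch of the decoupled idea on an invented 3-bus network: for each harmonic order h, build the nodal admittance matrix at h times the fundamental frequency and solve Y(h) V(h) = I(h) for the harmonic bus voltages, given assumed harmonic current injections. The topology, impedances, and injections are not from the package.

```python
import numpy as np

f1 = 50.0                      # fundamental frequency [Hz]
# Toy 3-bus network: series R-L branches (bus_from, bus_to, R [ohm], L [H]) - invented
branches = [(0, 1, 0.5, 5e-3), (1, 2, 0.4, 4e-3), (0, 2, 0.6, 6e-3)]
# Assumed harmonic current injections [A] at each bus for orders 5 and 7
injections = {5: np.array([0.0, 2.0, 1.0]), 7: np.array([0.0, 1.2, 0.6])}

def harmonic_voltages(h, I_h):
    """Solve Y(h) V = I for one harmonic order (decoupled from the others)."""
    w = 2 * np.pi * f1 * h
    Y = np.zeros((3, 3), dtype=complex)
    for i, j, R, L in branches:
        y = 1.0 / (R + 1j * w * L)     # branch admittance at this harmonic
        Y[i, i] += y
        Y[j, j] += y
        Y[i, j] -= y
        Y[j, i] -= y
    Y[0, 0] += 1e3                      # bus 0 tied to a stiff source (assumed)
    return np.linalg.solve(Y, I_h)

for h, I_h in injections.items():
    print(f"harmonic order {h}: |V| =", np.abs(harmonic_voltages(h, I_h)))
```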
-
bamu
A program written in MATLAB for analyzing a Yagi antenna; the number of elements can be changed, and it gives the current distribution on each dipole element and the radiation pattern (see the sketch after this entry).
- 2020-11-30 18:59:27 download
- Points: 1
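The MATLAB package presumably solves for the element currents itself; the Python sketch below covers only the final step, evaluating an array factor pattern from element positions and currents that are simply assumed here (all numbers are illustrative, not a real Yagi design).

```python
import numpy as np
import matplotlib.pyplot as plt

wavelength = 1.0
k = 2 * np.pi / wavelength

# Assumed element positions along the boom [wavelengths] and complex currents
# (reflector, driven element, two directors) - purely illustrative numbers
d = np.array([-0.25, 0.0, 0.31, 0.62]) * wavelength
I = np.array([0.5 * np.exp(1j * np.pi * 0.6),
              1.0,
              0.6 * np.exp(-1j * np.pi * 0.4),
              0.5 * np.exp(-1j * np.pi * 0.8)])

theta = np.linspace(0.0, 2 * np.pi, 721)              # angle from the boom axis
AF = np.abs(I @ np.exp(1j * k * np.outer(d, np.cos(theta))))

plt.polar(theta, AF / AF.max())
plt.title("Normalized array factor (illustrative currents)")
plt.show()
```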
-
共轭梯度法
Description: The conjugate gradient method lies between the steepest descent method and Newton's method. It uses only first-derivative information, yet it overcomes the slow convergence of steepest descent and avoids Newton's method's need to store, compute, and invert the Hessian matrix. The conjugate gradient method is not only one of the most useful methods for solving large linear systems but also one of the most effective algorithms for large-scale nonlinear optimization. Among optimization algorithms it is a very important one: it needs little storage, has finite-step convergence (on quadratic problems), is highly stable, and requires no external parameters (see the sketch after this entry).
- 2020-06-27 15:46:08 download
- Points: 1
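A minimal conjugate gradient sketch for a symmetric positive-definite system A x = b, illustrating the properties listed above: only first-order information, a handful of stored vectors, and termination in at most n steps in exact arithmetic. The test matrix is randomly generated for illustration.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A, storing only a few vectors."""
    n = b.size
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x                  # residual, equal to the negative gradient
    p = r.copy()                   # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # new direction, A-conjugate to the previous ones
        rs = rs_new
    return x

# Example: a random symmetric positive-definite system
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)      # SPD by construction
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```

In practice convergence on ill-conditioned systems is improved with a preconditioner, but the unpreconditioned form above is the basic algorithm the entry describes.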