-
MATLAB_linear_programming_algorithm_and_nonlinear_
A comparison of linear and nonlinear algorithm source code in MATLAB programming.
- Download: 2010-08-02 12:28:37
- Points: 1
-
program2
Source code for the MATLAB R2008 graphics and animation example tutorial.
- Download: 2010-08-25 10:24:48
- Points: 1
-
K_dis
Models correlated K-distributed sea clutter with the spherically invariant random process (SIRP) method and plots the probability density curve and the power spectrum (a minimal generation sketch follows this entry).
- Download: 2021-04-25 17:28:46
- Points: 1
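A minimal sketch of generating K-distributed clutter amplitude with the compound Gaussian (SIRP) form, assuming an illustrative shape parameter and the Statistics Toolbox function gamrnd; it is not taken from the K_dis package itself.

N   = 1e5;                                    % number of samples (assumed)
nu  = 1.5;                                    % K-distribution shape parameter (assumed)
b   = 1;                                      % scale parameter (assumed)
tau = gamrnd(nu, 1/nu, N, 1);                 % texture: unit-mean gamma power (Statistics Toolbox)
g   = (randn(N,1) + 1j*randn(N,1))/sqrt(2);   % speckle: unit-power complex Gaussian
r   = abs(sqrt(b*tau) .* g);                  % compound (SIRP) clutter amplitude
% Compare the empirical PDF with the theoretical K-distribution PDF
[pdf_emp, centers] = hist(r, 100);
pdf_emp = pdf_emp / trapz(centers, pdf_emp);
c = 2*sqrt(nu/b);
pdf_K = (2*c/gamma(nu)) * (c*centers/2).^nu .* besselk(nu-1, c*centers);
plot(centers, pdf_emp, 'b.', centers, pdf_K, 'r-');
legend('simulated', 'theoretical K PDF'); xlabel('amplitude'); ylabel('PDF');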
-
Time-series-of-the-wavelet-analysis
Detailed steps for wavelet analysis of a time series; helps you become familiar with the MATLAB Wavelet Toolbox quickly (a small decomposition sketch follows this entry).
- Download: 2013-12-08 13:45:37
- Points: 1
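A minimal sketch of a multilevel discrete wavelet decomposition of a time series with the Wavelet Toolbox functions wavedec and wrcoef; the toy signal, the db4 wavelet and the decomposition level are illustrative assumptions, not the steps from this package.

t = linspace(0, 1, 1024);
x = sin(2*pi*5*t) + 0.5*sin(2*pi*40*t) + 0.2*randn(size(t));   % toy time series
level = 4;                                 % decomposition level (assumed)
[C, L] = wavedec(x, level, 'db4');         % multilevel 1-D wavelet decomposition
A4 = wrcoef('a', C, L, 'db4', level);      % reconstructed level-4 approximation (trend)
D1 = wrcoef('d', C, L, 'db4', 1);          % reconstructed level-1 detail (high frequency)
subplot(3,1,1); plot(t, x);  title('original series');
subplot(3,1,2); plot(t, A4); title('approximation A4');
subplot(3,1,3); plot(t, D1); title('detail D1');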
-
CGMethod
Gives a conjugate gradient (CG) algorithm for solving a linear system with a symmetric positive definite matrix; it requires the matrix A, the vector b, and an initial probe vector x0 (a minimal CG sketch follows this entry).
- Download: 2013-11-24 17:50:31
- Points: 1
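A minimal sketch of the conjugate gradient iteration for A*x = b with A symmetric positive definite, using the A, b, x0 inputs named in the description; the example matrix and the stopping tolerance are assumptions, not the code shipped in CGMethod.

A  = gallery('lehmer', 10);          % example symmetric positive definite matrix (assumed)
b  = ones(10, 1);
x0 = zeros(10, 1);
x  = x0;
r  = b - A*x;                        % initial residual
p  = r;                              % initial search direction
tol = 1e-10;                         % stopping tolerance (assumed)
for k = 1:numel(b)
    Ap    = A*p;
    alpha = (r'*r) / (p'*Ap);        % exact line-search step length
    x     = x + alpha*p;
    rnew  = r - alpha*Ap;
    if norm(rnew) < tol, break; end
    beta  = (rnew'*rnew) / (r'*r);   % Fletcher-Reeves coefficient
    p     = rnew + beta*p;           % next A-conjugate direction
    r     = rnew;
end
disp(norm(A*x - b))                  % residual norm, should be near zero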
-
bpsk
Binary absolute phase modulation (BPSK) for a digital modulation experiment in communication principles (a small modulation and demodulation sketch follows this entry).
- Download: 2010-12-08 17:40:27
- Points: 1
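A minimal sketch of binary absolute phase (BPSK) modulation with coherent demodulation in base MATLAB; the bit rate, carrier frequency and sample rate are illustrative assumptions, not the parameters used in the experiment code.

Nb  = 100;                                % number of bits (assumed)
Rb  = 1e3;                                % bit rate in bit/s (assumed)
fc  = 2e3;                                % carrier frequency in Hz (assumed)
fs  = 32e3;                               % sample rate in Hz (assumed)
spb = fs/Rb;                              % samples per bit
bits = randi([0 1], 1, Nb);
sym  = 2*bits - 1;                        % map 0/1 to -1/+1
bb   = kron(sym, ones(1, spb));           % rectangular baseband pulses
t    = (0:numel(bb)-1)/fs;
s    = bb .* cos(2*pi*fc*t);              % absolute phase modulation: phase 0 or pi
y    = s .* cos(2*pi*fc*t);               % coherent demodulation: multiply by carrier
rx   = sum(reshape(y, spb, Nb), 1) > 0;   % integrate over each bit and decide
fprintf('bit errors: %d\n', sum(rx ~= bits));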
-
LMSand-RLS
MATLAB examples of the adaptive digital filter algorithms (LMS and RLS) commonly used in communications; adapt the code to your own needs after downloading (an LMS sketch follows this entry).
- Download: 2011-05-06 14:38:50
- Points: 1
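A minimal sketch of the LMS half of the pair, identifying an unknown FIR system; the filter length, step size and noise level are assumed values and the RLS variant is omitted, so this is not the code in the LMSand-RLS package.

N  = 2000;                                 % number of samples (assumed)
M  = 8;                                    % adaptive filter length (assumed)
mu = 0.01;                                 % LMS step size (assumed)
h  = [0.8 -0.4 0.2 0.1]';                  % "unknown" system to identify (assumed)
x  = randn(N, 1);                          % input signal
d  = filter(h, 1, x) + 0.01*randn(N, 1);   % desired signal = system output + noise
w  = zeros(M, 1);                          % adaptive filter weights
e  = zeros(N, 1);
for n = M:N
    xn   = x(n:-1:n-M+1);                  % most recent M input samples
    y    = w' * xn;                        % adaptive filter output
    e(n) = d(n) - y;                       % estimation error
    w    = w + mu * e(n) * xn;             % LMS weight update
end
plot(e.^2); xlabel('sample'); ylabel('squared error');   % learning curve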
-
Hopfield
Discrete Hopfield network.
- Download: 2015-01-26 15:17:09
- Points: 1
-
最速下降法 (steepest descent)
The steepest descent method is an iterative method that can be used to solve least squares problems, both linear and nonlinear. When solving for the model parameters of a machine learning algorithm, i.e. an unconstrained optimization problem, gradient descent is one of the most commonly used methods; the other common choice is the least squares method. To minimize a loss function, gradient descent iterates step by step toward the minimizing loss value and parameter values; conversely, to maximize a function, gradient ascent is used. In machine learning, two variants have been developed from basic gradient descent: stochastic gradient descent and batch gradient descent (a minimal batch gradient descent sketch follows this entry).
- Download: 2019-11-24 13:06:03
- Points: 1
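A minimal sketch of batch gradient descent for a linear least squares loss, following the description above; the toy data, learning rate and iteration count are illustrative assumptions, and the stochastic variant would use one randomly chosen sample per update instead of the full batch.

A      = [ones(100,1), randn(100,2)];             % design matrix with intercept (toy data)
w_true = [1; 2; -3];
y      = A*w_true + 0.1*randn(100,1);             % noisy observations
theta  = zeros(3,1);                              % initial parameters
lr     = 0.01;                                    % learning rate / step size (assumed)
for k = 1:2000
    grad  = (2/size(A,1)) * A' * (A*theta - y);   % gradient of the mean squared error
    theta = theta - lr*grad;                      % step along the negative gradient
end
disp([w_true theta])                              % compare true and estimated parameters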
-
ofdmsignal
ofdm_signal: an OFDM signal generated with MATLAB.
- Download: 2010-10-08 00:46:21
- Points: 1