
tengqie

Published: 2016-12-29    File size: 7KB
Download points: 1    Downloads: 8

Code description:

  DOA estimation using virtual array elements; a particle filter for multi-target tracking; calculation of grain growth. An entry-level program.
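
As an illustration of the particle-filter tracking technique mentioned in the description (the actual contents of tengqie.m are not shown on this page), a minimal bootstrap particle filter for a one-dimensional random-walk target could look like the following MATLAB sketch; all parameter values are assumed:

% Minimal bootstrap particle filter sketch (illustrative only, not tengqie.m):
% 1-D random-walk target with noisy position measurements.
N = 500;                          % number of particles (assumed)
T = 50;                           % number of time steps (assumed)
q = 0.5;                          % process noise std (assumed)
r = 1.0;                          % measurement noise std (assumed)

x_true = cumsum(q*randn(1,T));    % simulated true trajectory
z = x_true + r*randn(1,T);        % noisy measurements

particles = randn(1,N);           % initial particle cloud
x_est = zeros(1,T);
for t = 1:T
    particles = particles + q*randn(1,N);        % prediction step
    w = exp(-0.5*((z(t) - particles)/r).^2);     % measurement likelihood
    w = w / sum(w);                              % normalized weights
    c = cumsum(w);                               % multinomial resampling
    idx = zeros(1,N);
    for i = 1:N
        idx(i) = find(c >= rand, 1, 'first');
    end
    particles = particles(idx);
    x_est(t) = mean(particles);                  % posterior-mean estimate
end
plot(1:T, x_true, 'k-', 1:T, x_est, 'r--');
legend('true state', 'PF estimate');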

File list:

tengqie.m (12,229 bytes, 2016-12-15)

Download note: please do not use Xunlei (Thunder) to download. If a download fails, download again; re-downloading does not deduct points.

Post a comment

0 replies

Related downloads:

  • gpc52
    generalized predictive controller algorithm
    2012-07-31 08:31:35
    Points: 1
  • pso_3
    PSO-based MPPT controller for a solar panel
    2014-04-13 01:14:14
    Points: 1
  • zitai
    A program for rigid-body satellite attitude stabilization control. It covers three cases: low-thrust control, flywheel control, and combined low-thrust/flywheel control, and can stabilize the satellite attitude under disturbances.
    2021-05-06 17:19:55
    Points: 1
  • MIMO-OFDM Wireless Communications with MATLAB
    MIMO and OFDM simulation code for communications; a useful reference for communications modeling or graduation projects (MIMO-OFDM Wireless Communications with MATLAB).
    2018-05-25 22:09:55
    Points: 1
  • p50
    A fuzzy algorithm that is very simple to use
    2010-08-02 10:09:38
    Points: 1
  • TSP_Particle
    Solving the TSP with particle swarm optimization (PSO) in MATLAB.
    2010-01-17 21:37:20
    Points: 1
  • digtalsignalprowithmatlab
    A detailed introduction to digital signal processing methods using Matlab.
    2008-10-26 10:34:53
    Points: 1
  • 05C21_460
    A Fractional Order PID Tuning Algorithm for A Class of Fractional Order Plants
    2014-09-25 20:12:31
    Points: 1
  • 最速下降法 (steepest descent method)
    The steepest descent method is an iterative method that can be used to solve least-squares problems, both linear and nonlinear. When fitting the parameters of a machine-learning model, i.e., solving an unconstrained optimization problem, gradient descent is one of the most commonly used approaches; the other common one is the least-squares method. To minimize a loss function, gradient descent iterates step by step toward the minimum, yielding the minimized loss and the corresponding parameter values; conversely, to maximize an objective, gradient ascent is used instead. In machine learning, two variants have been developed from the basic method: stochastic gradient descent and batch gradient descent. (A minimal worked sketch of the method appears after this list.)
    2019-11-24 13:06:03
    Points: 1
  • umatcrystal_mod.f
    A User Material (UMAT) subroutine that models the deformation of single crystals which undergo plastic deformation
    2014-11-12 17:37:05
    Points: 1
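
Since the 最速下降法 entry above describes gradient descent only in words, here is a minimal MATLAB sketch of steepest descent applied to a small linear least-squares problem; the data matrix, step size, and iteration count are assumed for illustration, and this is not the code from that package:

% Steepest descent for f(x) = 0.5*||A*x - b||^2 (illustrative sketch).
A = [3 1; 1 2; 0 1];              % example data matrix (assumed)
b = [9; 8; 3];                    % example right-hand side (assumed)
x = zeros(2,1);                   % initial guess
alpha = 0.05;                     % fixed step size (assumed)
for k = 1:500
    g = A' * (A*x - b);           % gradient of the least-squares loss
    if norm(g) < 1e-8             % stop when the gradient is (nearly) zero
        break
    end
    x = x - alpha * g;            % step along the negative gradient
end
disp(x);                          % should be close to the solution A \ b

Flipping the sign of the update gives gradient ascent for maximization, and replacing the full gradient with a per-sample gradient gives the stochastic variant mentioned in the description.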