
tongxin

Published 2012-09-11 · File size: 6KB
Download points: 1 · Downloads: 4

Code description:

  This is a Simulink simulation model of a communication system; it can be used for bit error rate (BER) analysis.

Download notes: please do not use Xunlei (Thunder) to download. If the download fails, please download again; re-downloading does not deduct points.
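
For orientation, the kind of bit error rate measurement that the model above performs can also be sketched in plain MATLAB. The snippet below is only an illustrative stand-in, not the uploaded Simulink model: it assumes the Communications Toolbox (pskmod, pskdemod, awgn, biterr) and estimates the BER of BPSK over an AWGN channel across a range of Eb/N0 values.

    % Minimal BER sketch for a BPSK link over an AWGN channel.
    % Illustrative stand-in for the Simulink model, not the model itself;
    % assumes the Communications Toolbox (pskmod, pskdemod, awgn, biterr).
    nBits  = 1e5;                  % bits per Eb/N0 point
    EbN0dB = 0:2:10;               % Eb/N0 sweep in dB
    ber    = zeros(size(EbN0dB));

    for k = 1:numel(EbN0dB)
        bits = randi([0 1], nBits, 1);           % random source bits
        tx   = pskmod(bits, 2);                  % BPSK modulation
        rx   = awgn(tx, EbN0dB(k), 'measured');  % AWGN channel (SNR = Eb/N0 for BPSK at 1 bit/symbol)
        hat  = pskdemod(rx, 2);                  % hard-decision demodulation
        [~, ber(k)] = biterr(bits, hat);         % measured bit error rate
    end

    semilogy(EbN0dB, ber, 'o-'); grid on
    xlabel('E_b/N_0 (dB)'); ylabel('BER');
    title('Simulated BER of BPSK over AWGN');

For higher-order modulations, the same loop applies with a larger M in pskmod/pskdemod and the usual Eb/N0-to-SNR conversion of 10*log10(log2(M)) dB per symbol.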

Comments: 0 replies

Related downloads:

  • matlab
    A detailed introduction to the fundamentals of MATLAB; a simple, easy-to-understand primer.
    2009-12-23 00:58:09 · Download
    Points: 1
  • huise
    Grey prediction for statistics; can produce short-term forecasts; a relatively simple program.
    2013-09-13 13:15:02 · Download
    Points: 1
  • Pade
    The Padé approximant often gives a better approximation of a function than truncating its Taylor series, and it may still work where the Taylor series does not converge. For these reasons Padé approximants are used extensively in computer calculations (see the worked comparison after this list).
    2012-09-20 17:51:39 · Download
    Points: 1
  • digitizeit
    Description: several source programs concerning MATLAB and digital signals. Please download to see the details.
    2005-10-01 23:57:27 · Download
    Points: 1
  • inver
    Description: a MATLAB program for solving inverse problems.
    2011-03-06 10:09:38 · Download
    Points: 1
  • qicheqiziyouduchengxubijiao
    A vibration program adapted for a seven-degree-of-freedom vehicle model, compared against the vibration program in "Automotive Vibration Analysis". Everyone is welcome to download it for study and exchange.
    2013-12-22 15:18:06 · Download
    Points: 1
  • lmax
    Find local maxima in matlab
    2009-05-04 17:58:43 · Download
    Points: 1
  • Ex521_page234_Jamshedi
    Example 5.21 from the Jamshedi book.
    2012-05-22 19:06:16 · Download
    Points: 1
  • gsc-1.2
    A graph-cut-based segmentation program wrapped in MATLAB, using geodesic distance; the UI is friendly, and the code is easy to build on for further development.
    2011-11-17 21:24:59 · Download
    Points: 1
  • 最速下降法 (steepest descent method)
    Description: the steepest descent method is an iterative method that can be used to solve least-squares problems, both linear and nonlinear. When fitting the model parameters of a machine learning algorithm, i.e. solving an unconstrained optimization problem, gradient descent is one of the most commonly used methods; the other common choice is the least-squares method. To find the minimum of a loss function, gradient descent iterates step by step toward the minimum, yielding the minimized loss and the corresponding model parameters. Conversely, if the maximum of the loss function is needed, gradient ascent is used instead. In machine learning, two variants have been developed from the basic method: stochastic gradient descent and batch gradient descent. (See the sketch after this list.)
    2019-11-24 13:06:03 · Download
    Points: 1
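
To make the Pade entry above concrete, here is a small self-contained MATLAB comparison written for this page (it is not taken from that package): the [1/1] Padé approximant of log(1+x), R(x) = 2x/(2+x), built from the same local derivative information as the second-order Taylor polynomial, evaluated where the Taylor series converges slowly or not at all.

    % [1/1] Padé approximant of log(1+x):  R(x) = 2x / (2 + x),
    % compared with the second-order Taylor polynomial  T(x) = x - x^2/2.
    % Both use the same local data f(0), f'(0), f''(0); the Taylor series of
    % log(1+x) only converges for -1 < x <= 1, while the Padé approximant
    % remains usable well beyond that range.
    x       = [0.5 1 2 3 5];
    f       = log(1 + x);           % exact values
    pade11  = 2*x ./ (2 + x);       % [1/1] Padé approximant
    taylor2 = x - x.^2/2;           % truncated Taylor polynomial
    disp(table(x(:), f(:), pade11(:), taylor2(:), ...
        'VariableNames', {'x', 'log1px', 'Pade_1_1', 'Taylor_2'}))

At x = 3, for example, the Taylor polynomial gives -1.5 while the Padé approximant gives 1.2 against the true value log(4) ≈ 1.386, which is the behaviour the entry describes.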
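
Similarly, the steepest descent entry describes gradient descent in words only; the sketch below, written for this page rather than taken from that upload, applies the method to a linear least-squares problem with a fixed step size chosen from the Lipschitz constant of the gradient.

    % Steepest (gradient) descent on a linear least-squares problem:
    %   minimize J(w) = 0.5 * ||A*w - b||^2,   with gradient  A' * (A*w - b).
    % Illustrative sketch only; not the uploaded program.
    rng(0);
    A     = randn(100, 3);
    wTrue = [2; -1; 0.5];
    b     = A * wTrue + 0.01 * randn(100, 1);   % noisy observations

    w     = zeros(3, 1);            % initial guess
    alpha = 1 / norm(A)^2;          % step size 1/L, L = largest eigenvalue of A'*A
    for iter = 1:500
        g = A' * (A * w - b);       % gradient of J at the current iterate
        w = w - alpha * g;          % steepest-descent update
        if norm(g) < 1e-8           % stop when the gradient is numerically zero
            break
        end
    end
    disp([wTrue w])                 % recovered parameters next to the ground truth

With the step size bounded by 1/L the iteration is guaranteed to decrease J at every step; stochastic and batch variants mentioned in the entry differ only in how much of A and b is used to form each gradient.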