-
Latest Easy Language example for sending WPE packets
Source code, written in Easy Language (易语言), for sending packets captured with WPE. I had searched for this for quite a while; it uses a modular approach.
- 2020-12-01 download
- Points: 1
-
2D TDOA localization algorithm simulation program implemented in Matlab
A Matlab simulation program for a two-dimensional TDOA (time difference of arrival) localization algorithm; a minimal least-squares sketch follows below.
- 2020-12-07 download
- Points: 1
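For a feel of what such a simulation involves, here is a minimal Python sketch (not the Matlab program from this download) that solves a 2D TDOA problem by nonlinear least squares; the sensor layout, emitter position, and noise level are illustrative assumptions.

```python
# Minimal sketch, assuming range-difference measurements against a reference sensor.
import numpy as np
from scipy.optimize import least_squares

sensors = np.array([[0.0, 0.0],            # reference sensor first
                    [1000.0, 0.0],
                    [0.0, 1000.0],
                    [1000.0, 1000.0]])
source = np.array([420.0, 770.0])          # true emitter position (unknown in practice)

# Simulated TDOA data expressed as range differences w.r.t. the reference sensor,
# with 1 m of measurement noise.
ranges = np.linalg.norm(sensors - source, axis=1)
rdiff = (ranges[1:] - ranges[0]) + np.random.normal(0.0, 1.0, size=len(sensors) - 1)

def residuals(p):
    r = np.linalg.norm(sensors - p, axis=1)
    return (r[1:] - r[0]) - rdiff

estimate = least_squares(residuals, x0=np.array([500.0, 500.0])).x
print("estimated position:", estimate)
```

The structure carries over to the Matlab version: each TDOA measurement defines a hyperbola, and the estimator finds the point that best fits all of them.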
-
Matlab simulation code for energy detection, matched-filter detection, and cooperative detection
A study of spectrum-sensing techniques in cognitive radio, with the accompanying Matlab simulation code; see the energy-detection sketch below.
- 2020-12-04 download
- Points: 1
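As a companion to the entry above, the sketch below (Python rather than the archive's Matlab) implements the simplest of the three techniques, an energy detector, and estimates its detection probability by Monte Carlo; the sample count, SNR, and false-alarm target are assumed values.

```python
# Minimal energy-detector sketch, assuming unit-variance Gaussian noise and a
# Gaussian primary-user signal.
import numpy as np
from scipy.stats import norm

N = 1000            # samples per sensing window
snr_db = -10.0      # primary-user SNR at the detector
pfa = 0.01          # target false-alarm probability
trials = 2000

snr = 10 ** (snr_db / 10)
# Threshold from the Gaussian approximation of the noise-only statistic:
# T = mean(y^2) ~ N(1, 2/N) under H0.
thr = 1 + norm.ppf(1 - pfa) * np.sqrt(2.0 / N)

detections = 0
for _ in range(trials):
    noise = np.random.randn(N)
    signal = np.sqrt(snr) * np.random.randn(N)   # primary-user signal present (H1)
    T = np.mean((signal + noise) ** 2)           # energy statistic
    detections += T > thr

print("empirical Pd at %.1f dB SNR: %.3f" % (snr_db, detections / trials))
```

Matched-filter and cooperative detection build on the same hypothesis test, replacing the statistic with a correlation against the known signal or a fusion of several such decisions.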
-
A simple video player based on Qt
I have been learning Qt recently, and after studying it for a while I built an audio/video player on my own. This is the complete code for that player, with comments throughout. Its features include switching between videos, fast forward/rewind, volume control (by mouse or with the keyboard up/down keys), fullscreen toggling, a playlist, and opening local files. It covers a fair number of basic features, though not an overwhelming amount, so it is well suited to beginners; anyone interested is welcome to download it and have a look.
- 2020-12-03 download
- Points: 1
-
EC11 encoder + STM32 OLED multi-level menu display.zip
[Example overview] The EC11 rotary encoder switch has a push button and A/B quadrature outputs, and the rotation direction (clockwise or counter-clockwise) is determined from the phase relationship between the two channels. The encoder output raises an external interrupt that the STM32 decodes; this program uses software debouncing, and the resulting count directly drives a multi-level menu on a 0.96-inch OLED. A sketch of the quadrature-decoding logic follows below.
- 2021-11-08 00:30:59 download
- Points: 1
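To make the phase-based direction detection described above concrete, here is a small Python sketch of the quadrature-decoding logic; it is not the STM32 firmware from the archive (which runs in C from external interrupts), and the transition table and detent behaviour are stated assumptions that may differ from the actual part.

```python
# Minimal sketch of A/B quadrature decoding via a Gray-code transition table.
# +1 is taken as clockwise and -1 as counter-clockwise; a real encoder may be
# wired the other way round.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

class Ec11Decoder:
    def __init__(self):
        self.state = 0b00   # last stable AB reading
        self.count = 0      # encoder position / menu index

    def update(self, a: int, b: int) -> int:
        """Feed one debounced AB sample; returns -1, 0 or +1 for this step."""
        new = (a << 1) | b
        step = TRANSITIONS.get((self.state, new), 0)   # 0 = no change or a bounce
        self.state = new
        self.count += step
        return step

# One full clockwise detent: 00 -> 01 -> 11 -> 10 -> 00
dec = Ec11Decoder()
for a, b in [(0, 1), (1, 1), (1, 0), (0, 0)]:
    dec.update(a, b)
print("position:", dec.count)   # 4 quarter-steps per detent on many EC11 parts
```

In the STM32 version the equivalent of this `update` step would sit in the external-interrupt handler, with the software filter rejecting readings that do not settle.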
-
Computer Networks PPT slides by Han Ligang (韩立刚)
These are Han Ligang's slides for his computer networks course. They are quite detailed and work well as a reference for looking up related topics; his accompanying computer networks video lectures are also worth watching.
- 2020-12-12 download
- Points: 1
-
Introduction to the 袋鼠云 (DTStack) data middle platform solution
A solution for putting a data middle platform into practice.
- 2020-12-08 download
- Points: 1
-
PCM_FSK_ASK_DPSK simulation source code (Matlab implementation)
Code written for a communication principles lab course; the PCM part was adapted from resources found online, with some changes. It covers PCM encoding as well as the modulation and demodulation of FSK, ASK, and DPSK, and should help in understanding communication systems. The code is written in MATLAB; suggestions are welcome. A minimal modulation sketch follows below.
- 2020-12-01 download
- Points: 1
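As a quick illustration of the modulation side described above, the sketch below generates 2ASK, 2FSK, and 2DPSK waveforms in Python; it is not the course Matlab code, and the carrier frequencies, bit rate, and sample rate are arbitrary choices.

```python
# Minimal modulation sketch (no pulse shaping, no demodulation, no channel noise).
import numpy as np

fs, rb = 8000, 100                  # sample rate (Hz) and bit rate (bit/s)
f0, f1 = 1000, 2000                 # FSK tones; f1 doubles as the ASK/DPSK carrier
bits = np.random.randint(0, 2, 20)
spb = fs // rb                      # samples per bit
t = np.arange(len(bits) * spb) / fs
symbols = np.repeat(bits, spb)      # rectangular NRZ bit stream at sample rate

ask = symbols * np.cos(2 * np.pi * f1 * t)                      # on-off keying
fsk = np.cos(2 * np.pi * np.where(symbols == 1, f1, f0) * t)    # one tone per bit

# DPSK: differentially encode the bits, then map 0/1 to carrier phase 0/pi.
diff = np.cumsum(bits) % 2          # d[k] = d[k-1] XOR b[k], starting from 0
dpsk = np.cos(2 * np.pi * f1 * t + np.pi * np.repeat(diff, spb))

print(ask.shape, fsk.shape, dpsk.shape)
```

Demodulation in the Matlab code would reverse these steps: envelope or coherent detection for ASK, tone discrimination for FSK, and phase comparison between adjacent bits for DPSK.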
-
[PDF] Machine Learning: A Probabilistic Perspective (MLAPP), by Kevin P. Murphy
Complete edition with a table of contents; an essential machine-learning classic, and a hefty tome that takes real effort to work through. (The rest of the original listing was raw text extracted from the book's front matter and table of contents, covering the opening chapters: Introduction, Probability, Generative models for discrete data, and Gaussian models.)
- 2020-12-10 download
- Points: 1
-
STM32 oscilloscope source code: portable digital oscilloscope
A graduation project: source code for a portable digital oscilloscope based on STM32, together with its PC host software. Built on the ALIENTEK MiniSTM32 V3.0 board and developed with UCOSIII + EMWIN.
- 2020-06-30 download
- Points: 1