
Source Code for "C# Programming: 300 Classic Examples" (C#程序设计经典300例)

Published 2020-12-05
Download points: 1 | Downloads: 0

Code description:

The package is organized into four parts:

Part 1: Fundamentals
  • Chapter 1: The Development Environment
  • Chapter 2: Syntax Basics
  • Chapter 3: Program Flow
  • Chapter 4: Arrays and Collections
  • Chapter 5: String Handling
  • Chapter 6: Data Structures and Algorithms
  • Chapter 7: Classes and Structs
  • Chapter 8: Common Design Patterns

Part 2: Forms
  • Chapter 9: Mouse and Keyboard
  • Chapter 10: Using Forms
  • Chapter 11: Using Controls
  • Chapter 12: Using Components

Part 3: Applications
  • Chapter 13: Multithreaded Programming
  • Chapter 14: The File System
  • Chapter 15: Registry Techniques
  • Chapter 16: Database Techniques
  • Chapter 17: Accessing Office

Part 4: New Technologies
  • Chapter 18: GDI+ Drawing
  • Chapter 19: Custom Controls
  • Chapter 20: Images
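For a sense of what the examples cover, below is a minimal, hypothetical C# sketch in the spirit of the Part 1 chapters (program flow, arrays and collections, string handling). It is an illustration only and is not taken from the bundled source code.

    // Hypothetical sketch, not from the downloaded package: counts word frequencies
    // in a sentence using a Dictionary, a foreach loop, and basic string handling.
    using System;
    using System.Collections.Generic;

    class WordCountExample
    {
        static void Main()
        {
            string text = "the quick brown fox jumps over the lazy dog the end";

            // Split the sentence into words, dropping any empty entries.
            string[] words = text.Split(new char[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);

            // Count how many times each word occurs.
            Dictionary<string, int> counts = new Dictionary<string, int>();
            foreach (string word in words)
            {
                int n;
                counts.TryGetValue(word, out n);   // n stays 0 if the word is not yet present
                counts[word] = n + 1;
            }

            // Print each distinct word with its count.
            foreach (KeyValuePair<string, int> pair in counts)
            {
                Console.WriteLine(pair.Key + ": " + pair.Value);
            }
        }
    }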

Download note: please do not use Thunder (迅雷) to download. If a download fails, simply download again; re-downloading does not cost additional points.


0 replies

  • Explanation of LDA and PCA with a MATLAB demo
    A detailed explanation of the LDA and PCA feature dimensionality-reduction methods, demonstrated in MATLAB on a practical classification example, with scatter plots produced in MATLAB.
    Download: 2020-12-04
    Points: 1
  • ADI Analog-to-Digital Converter Application Notes, Volume 1 (PDF)
    ADI Analog-to-Digital Converter Application Notes, Volume 1 (ADI模数转换器应用笔记第一册.pdf).
    Download: 2020-12-11
    Points: 1
  • MATLAB program (with data): ranging-error distribution heat map for four UWB base stations and 25 test points
    Ultra-wideband setup with four base stations and 25 test points; plots a heat map of the ranging-error distribution in MATLAB (data included).
    Download: 2020-12-06
    Points: 1
  • MATLAB simulation examples of fuzzy control
    This file describes the application of fuzzy control in MATLAB in detail and contains many examples.
    Download: 2020-06-21
    Points: 1
  • Monte Carlo sampling vs. Latin hypercube sampling
    The differences between Monte Carlo sampling and Latin hypercube sampling, and the principles behind each.
    Download: 2020-12-04
    Points: 1
  • IEEE 14-bus power flow calculation
    A power flow program for the IEEE 14-bus system, mainly used to perform power flow calculations.
    Download: 2020-12-05
    Points: 1
  • Introductory sigma-delta derivation
    Describes the principles of the basic building blocks of a sigma-delta modulator and works through a simple derivation.
    Download: 2020-12-02
    Points: 1
  • Signal processing system based on LabVIEW
    A runnable signal processing system developed on the LabVIEW platform; the program code is openly provided.
    Download: 2020-12-09
    Points: 1
  • MATLAB cubic spline interpolation function
    A self-written MATLAB cubic spline interpolation function; compared against the built-in spline function, the resulting plots match.
    Download: 2020-12-11
    Points: 1
  • [PDF] Machine Learning: A Probabilistic Perspective (MLAPP), by Kevin Murphy
    Complete edition with table of contents; an essential machine-learning classic, and a hefty one to work through. Kevin P. Murphy, Machine Learning: A Probabilistic Perspective, The MIT Press, Cambridge, Massachusetts, 2012 (Adaptive Computation and Machine Learning series; ISBN 978-0-262-01802-9). The original description reproduced OCR'd front matter and the contents of Chapters 1-4 (Introduction; Probability; Generative models for discrete data; Gaussian models).
    Download: 2020-12-10
    Points: 1