Data Analysis: E-commerce Customer Review Analysis
A well-known e-commerce platform holds 200,000 customer reviews of water heaters. The goal is to mine one brand's user sentiment from this data, analyze that brand's product strengths and weaknesses in detail, and then distill the selling points of all the other water-heater brands.

Outline:
01. Project background
02. Product sales statistics
03. Haier water heater sentiment analysis
04. Selling-point analysis of other water-heater brands
05. Summary: combined selling-point analysis
06. Data sources and metric definitions

01 Project background
★ A well-known e-commerce platform holds 200,000 customer reviews of water heaters; from this data we want to analyze one brand's user sentiment, detail that brand's strengths and weaknesses, and distill the selling points of the other water-heater brands.
★ Online purchasing of large home appliances has matured, and online-store review systems have steadily improved, providing us with a solid accumulation of data.
★ Mining user sentiment with cutting-edge techniques offers important guidance for both production and sales.

02 Sales statistics
Total sales over the period covered are shown in the chart: Haier alone accounts for roughly half of overall sales.
[Chart: total water-heater sales by brand — Haier, Midea, Vanward, Galanz, Macro]
Year-by-year sales by brand show that Haier began selling through online stores earlier than the other brands, and its sales have consistently ranked among the highest.
[Chart: annual water-heater sales by year — Galanz, Haier, Midea, Vanward, Macro]

03 Haier water heater customer sentiment analysis
★ Overall satisfaction: judged by the star-rating distribution, customers rate Haier water heaters highly overall; "very satisfied" reviews account for 45%, while negative reviews account for about 24%. The main complaint behind the negative reviews is the cost of installation accessories.
[Chart: Haier review rating distribution — negative / poor / average / satisfied / very satisfied]
★ Review word frequency: service-related terms appear with high frequency in Haier reviews (quality, fees, service, brand, heating, installation fee, installation, insulation, delivery, accessories, value for money, after-sales, price, warranty, temperature, water output, power consumption, thermostat, and so on), so service and after-sales support are worth emphasizing at the point of sale.
[Word cloud: quality, price, service, brand, ...]
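The rating-distribution and word-frequency tallies described above can be sketched in a few lines. This is a minimal illustration on made-up data, not the project's actual pipeline: the `reviews` list, its field layout, and the English stand-in words are all assumptions.

```python
from collections import Counter

# Hypothetical (star_rating, review_text) pairs standing in for the
# 200,000 water-heater reviews described above.
reviews = [
    (5, "install service quality brand heating"),
    (4, "service delivery quality price"),
    (1, "installation fee accessories fee expensive"),
    (2, "installation fee accessories"),
    (5, "quality service after-sales brand"),
]

# Rating distribution: share of each star level, as in the satisfaction chart.
total = len(reviews)
rating_share = {stars: count / total
                for stars, count in Counter(s for s, _ in reviews).items()}

# Word frequency across all review texts, as in the word-frequency chart.
word_freq = Counter(w for _, text in reviews for w in text.split())

print(rating_share[5])           # share of 5-star ("very satisfied") reviews
print(word_freq.most_common(3))  # the most frequently mentioned terms
```

On real data the same two tallies would be fed by a tokenizer (for Chinese text, a segmenter such as jieba) rather than `str.split`.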
- Downloaded: 2020-12-12
- Credits: 1
Matlab Implementation of a Sparse Autoencoder (Deep Learning)
A Matlab implementation of sparse-autoencoder deep learning (sparse auto-encoding). Author: YiBin YU (yuyibintony@163.com), WuYi University — "Deep Learning, MATLAB Code for Sparse Autoencoder". The starter file train.m:

```matlab
%% CS294A/CS294W Programming Assignment Starter Code
%  This file contains code that helps you get started on the programming
%  assignment. You will need to complete the code in sampleIMAGES.m,
%  sparseAutoencoderCost.m and computeNumericalGradient.m. For the purpose
%  of completing the assignment, you do not need to change the code in
%  this file.

%%======================================================================
%% STEP 0: Here we provide the relevant parameter values that will allow
%  your sparse autoencoder to get good filters; you do not need to change
%  the parameters below.

visibleSize = 8*8;     % number of input units
hiddenSize = 25;       % number of hidden units
sparsityParam = 0.01;  % desired average activation of the hidden units
                       % (this was denoted by the Greek letter rho, which
                       % looks like a lower-case p, in the lecture notes)
lambda = 0.0001;       % weight decay parameter
beta = 3;              % weight of sparsity penalty term

%%======================================================================
%% STEP 1: Implement sampleIMAGES
%  After implementing sampleIMAGES, the display_network command should
%  display a random sample of patches from the dataset.

patches = sampleIMAGES;
display_network(patches(:, randi(size(patches, 2), 204, 1)), 8);
% randi produces a 204-element column vector of random indices into the
% patch set, so 204 patches are picked at random for display.

% Obtain random parameters theta
theta = initializeParameters(hiddenSize, visibleSize);

%%======================================================================
%% STEP 2: Implement sparseAutoencoderCost
%  You can implement all of the components (squared error cost, weight
%  decay term, sparsity penalty) in the cost function at once, but it may
%  be easier to do it step-by-step and run gradient checking (see STEP 3)
%  after each step. We suggest implementing the sparseAutoencoderCost
%  function using the following steps:
%
%  (a) Implement forward propagation in your neural network, and implement
%      the squared error term of the cost function. Implement
%      backpropagation to compute the derivatives. Then (using
%      lambda = beta = 0), run gradient checking to verify that the
%      calculations corresponding to the squared error cost term are
%      correct.
%
%  (b) Add in the weight decay term (in both the cost function and the
%      derivative calculations), then re-run gradient checking to verify
%      correctness.
%
%  (c) Add in the sparsity penalty term, then re-run gradient checking to
%      verify correctness.
%
%  Feel free to change the training settings when debugging your code.
%  (For example, reducing the training set size or the number of hidden
%  units may make your code run faster, and setting beta and/or lambda to
%  zero may be helpful for debugging.) However, in your final submission
%  of the visualized weights, please use the parameters we gave in Step 0
%  above.

[cost, grad] = sparseAutoencoderCost(theta, visibleSize, hiddenSize, ...
                                     lambda, sparsityParam, beta, patches);

%%======================================================================
%% STEP 3: Gradient Checking
%  Hint: if you are debugging your code, performing gradient checking on
%  smaller models and smaller training sets (e.g. using only 10 training
%  examples and 1-2 hidden units) may speed things up.

% First, let's make sure your numerical gradient computation is correct
% for a simple function. After you have implemented
% computeNumericalGradient.m, run the following:
checkNumericalGradient();
```
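The sparseAutoencoderCost function itself is left as the exercise; as orientation, the three cost components it combines (squared reconstruction error, weight decay, KL sparsity penalty) can be sketched in Python/NumPy. This is a cost-only sketch under assumed weight shapes, not the assignment's Matlab solution, and it omits the gradient computation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparse_autoencoder_cost(W1, b1, W2, b2, lam, rho, beta, x):
    """Cost of a one-hidden-layer autoencoder on data x (visibleSize x m).

    Assumed shapes: W1 is hiddenSize x visibleSize, W2 is
    visibleSize x hiddenSize; b1, b2 are the matching bias vectors.
    Mirrors the three terms described in STEP 2 of train.m.
    """
    m = x.shape[1]
    a2 = sigmoid(W1 @ x + b1[:, None])   # hidden-layer activations
    a3 = sigmoid(W2 @ a2 + b2[:, None])  # reconstruction of the input

    # (a) average squared reconstruction error
    sq_err = np.sum((a3 - x) ** 2) / (2 * m)

    # (b) weight decay over W1 and W2 (biases are not regularized)
    decay = (lam / 2) * (np.sum(W1 ** 2) + np.sum(W2 ** 2))

    # (c) KL-divergence sparsity penalty on the mean hidden activation
    rho_hat = a2.mean(axis=1)
    kl = np.sum(rho * np.log(rho / rho_hat)
                + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))

    return sq_err + decay + beta * kl
```

Setting `lam` and `beta` to zero reduces this to the plain squared-error term, which is exactly the debugging order step (a) of the assignment recommends.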
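STEP 3's checkNumericalGradient first validates the finite-difference routine on a simple function before trusting it on the autoencoder. The same idea sketched in Python/NumPy, with a quadratic test function and helper names of my own choosing (the Matlab exercise uses its own):

```python
import numpy as np

def compute_numerical_gradient(J, theta, eps=1e-4):
    """Central-difference approximation of the gradient of J at theta:
    grad_i ~ (J(theta + eps*e_i) - J(theta - eps*e_i)) / (2*eps)."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
    return grad

# Illustrative quadratic: h(x) = x1^2 + 3*x1*x2, whose exact gradient
# is (2*x1 + 3*x2, 3*x1); checked here at the point (4, 10).
h = lambda x: x[0] ** 2 + 3 * x[0] * x[1]
theta = np.array([4.0, 10.0])
numeric = compute_numerical_gradient(h, theta)
exact = np.array([2 * theta[0] + 3 * theta[1], 3 * theta[0]])
print(np.linalg.norm(numeric - exact))  # should be very small
```

Once the routine agrees with a known gradient, the same function is pointed at the autoencoder cost to verify the backpropagation derivatives, exactly as steps (a)-(c) above prescribe.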
- Downloaded: 2020-12-05
- Credits: 1