Translation results for "bayesian learning process"
bayesian learning process
Related sentences
  Similar matched sentence pairs
     ON THE LEARNING PROCESS OF THE NEOCOGNITRON
     Analysis of the Neocognitron learning algorithm
     THE LEARNING PROCESS AND ITS MECHANISM
     The learning process and its operating mechanism
     Study of MAS Learning Model Based on the Bayesian Process of the Single Agent
     Research on a multi-agent system learning model based on the single-agent Bayesian learning process
     Bayesian Inference of Process Capability
     Bayes inference of process capability
Bilingual example sentences with user-defined translations of "bayesian learning process"

Example sentences
To help you better understand how the query term and its translation are used in idiomatic English, we have prepared a large number of example sentences drawn from original English texts for your reference.
bayesian learning process

A system-based decision logic predicated on subjective and objective probabilities is developed incorporating the Bayesian learning process.

The fourth measure of risk is generated using a Bayesian learning process.

Classification is a central research area in machine learning, pattern recognition, and data mining. Incremental learning is an effective method for learning classification knowledge from massive data, especially when obtaining labeled training examples is costly. This paper first discusses the difference between Bayesian estimation and classical parameter estimation, and states the fundamental principle for incorporating prior knowledge into Bayesian learning. It then presents an incremental Bayesian learning model, which describes the Bayesian learning process of revising beliefs in light of prior knowledge and the information in new examples; choosing a Dirichlet prior distribution, the paper works through this process in detail.

The second part focuses on the incremental process itself. New examples arrive in one of two states: labeled or unlabeled. For labeled examples, updating the classification parameters is straightforward thanks to the conjugate Dirichlet distribution, so the key problem is learning from unlabeled examples. Unlike the method of Kamal Nigam, which learns from unlabeled examples using the EM algorithm, this paper focuses on which example to select next during learning. It measures classification loss with the 0-1 loss and selects the examples that minimize this loss. To improve the algorithm's performance, a pool-based technique is introduced: in each round, the classification loss is computed only for the examples in the pool. Because the basic operations in learning are incrementally updating the classification parameters and incrementally classifying test instances, approximate expressions are given for both.

To test the algorithm's efficiency, an experiment is run on the mushroom data set from the UCI repository. The initial training set contains 6 labeled examples; several unlabeled examples are then added. The final experimental results show that the algorithm is feasible and effective.

Classification has always been a core problem in machine learning, pattern recognition, and data mining research. Incremental learning is an effective way to learn classification knowledge from massive data, especially when obtaining large numbers of class-labeled samples is costly. This paper applies the naive Bayes method to incremental classification, proposes an incremental Bayesian learning model, and gives the incremental Bayesian inference process, including incrementally revising the classifier parameters and incrementally classifying test samples. Experimental results show that the algorithm is feasible and effective.
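The update-and-select loop the abstract describes can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the class and method names (`IncrementalNaiveBayes`, `select_from_pool`) are hypothetical, and a symmetric Dirichlet prior with alpha = 1 (Laplace smoothing) stands in for the paper's Dirichlet prior. Because the Dirichlet is conjugate to the multinomial likelihood, updating on a new labeled example reduces to incrementing counts, which is what makes the learning incremental.

```python
from collections import defaultdict

class IncrementalNaiveBayes:
    """Sketch of an incremental naive Bayes learner with a symmetric
    Dirichlet prior (alpha = 1, i.e. Laplace smoothing). Hypothetical
    names; examples are dicts mapping feature -> categorical value."""

    def __init__(self, alpha=1.0):
        self.alpha = alpha
        self.class_counts = defaultdict(float)   # N(c)
        self.feat_counts = defaultdict(float)    # N(feature = value, c)
        self.feature_values = defaultdict(set)   # observed values per feature

    def update(self, x, y):
        """Incremental update with one labeled example: conjugacy means
        the posterior update is just a count increment."""
        self.class_counts[y] += 1
        for f, v in x.items():
            self.feat_counts[(f, v, y)] += 1
            self.feature_values[f].add(v)

    def posterior(self, x):
        """Class posterior P(c | x) under the Dirichlet-smoothed estimates."""
        total = sum(self.class_counts.values())
        n_classes = len(self.class_counts)
        scores = {}
        for c, nc in self.class_counts.items():
            p = (nc + self.alpha) / (total + self.alpha * n_classes)
            for f, v in x.items():
                k = len(self.feature_values[f])  # arity of feature f
                p *= (self.feat_counts.get((f, v, c), 0.0) + self.alpha) \
                     / (nc + self.alpha * k)
            scores[c] = p
        z = sum(scores.values())
        return {c: s / z for c, s in scores.items()}

    def expected_01_loss(self, x):
        """Expected 0-1 loss of committing to the MAP label for x."""
        return 1.0 - max(self.posterior(x).values())

    def select_from_pool(self, pool):
        """Pool-based selection: pick the unlabeled example with the
        smallest expected classification loss."""
        return min(pool, key=self.expected_01_loss)
```

A short usage round, loosely modeled on the mushroom-style categorical features in the experiment: update on a couple of labeled examples, then score an unlabeled pool.

```python
clf = IncrementalNaiveBayes()
clf.update({"cap": "flat"}, "edible")
clf.update({"cap": "bell"}, "poisonous")
pool = [{"cap": "flat"}, {"cap": "bell"}]
best = clf.select_from_pool(pool)  # example with the smallest expected 0-1 loss
```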

 
© 2008 CNKI (China National Knowledge Infrastructure), China Academic Journals (CD Edition) Electronic Publishing House