Translation results for "optimal feature selection"
optimal feature selection
Related sentences
Bilingual example sentences in which "optimal feature selection" is translated as an undetermined term
     A technique for solving optimization problems is given by genetic algorithm (GA). GA is applied to the problem of the optimal feature selection.
Because genetic algorithms have been applied successfully to combinatorial optimization problems, a genetic algorithm was adopted for feature subset optimization.
Similar matched sentence pairs
     Feature:
Features of this paper:
     Research on Optimal Algorithms of Feature Selection
Research on optimization algorithms for feature selection
The Application of K-L Transformation on the Optimal Feature Descriptions of Debris
Application of the K-L transform to optimizing wear-debris feature parameters
     Optimal Feature Subset Selection of Decision Tables
Selection of optimal feature subsets from decision tables: a heuristic algorithm based on rough set theory
     AN ALGORITHM FOR THE OPTIMAL FEATURE SUBSET SELECTION
An algorithm for selecting an optimal feature set
Example sentences
To help you better understand and master the actual usage of the query term or its translation in idiomatic English, we have prepared a large number of example sentences drawn from original English texts for your reference.
  optimal feature selection
A novel, efficient near-optimal feature selection algorithm, which we call ratchet search, is also presented.

There has been very little research in the machine learning community on optimal feature selection.

The OCFS is an optimal feature selection approach designed according to the objective function of OC.

Optimal feature selection is achieved by maximizing or minimizing a criterion function.

In Section 3, we extend the existing approach to the two databases for optimal feature selection.



Learning from examples is widely studied in machine learning because it is a very effective cure for the bottleneck of knowledge acquisition. To discern positive and negative examples fully, feature subset selection plays a major role in learning from examples. The smaller the feature subset, the better it is for concept extraction, but optimal feature selection has been proved to be an NP-hard problem, and previous algorithms have many disadvantages. Based on extension-matrix theory, which has been used for heuristic algorithms, and rough set theory, which is especially suitable for the reduction of decision tables, we transform feature selection into an optimization problem and propose the corresponding models. The models are easy to understand and can be solved by existing software or genetic algorithms (GAs). The method is applied to concept extraction in KDD (Knowledge Discovery in Databases); the results are satisfactory and overcome some disadvantages of previous algorithms.

Feature selection is the key to learning from examples and directly determines the quality of the concepts obtained. Based on extension-matrix theory and rough set theory, the feature subset selection problem is transformed into a mathematical optimization problem, and the corresponding optimization models are proposed. These optimization models are easy to understand and can be solved with existing software, overcoming the shortcomings of many earlier feature selection algorithms. A worked example demonstrates the effectiveness of the method.
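The pair of abstracts above reduces feature subset selection to an optimization problem searched by a genetic algorithm. As a minimal sketch of how a GA explores subsets encoded as bit masks, assuming a toy separable fitness function (the function names, penalty weight, and fitness itself are illustrative, not the papers' actual models):

```python
# Hypothetical GA sketch for feature subset selection over bit masks.
import random

def fitness(mask, informative, alpha=0.1):
    # Toy objective: reward selected "informative" features,
    # penalize overall subset size (smaller subsets preferred).
    hits = sum(1 for i in informative if mask[i])
    return hits - alpha * sum(mask)

def ga_select(n_features, informative, pop_size=30, gens=40, seed=0):
    rng = random.Random(seed)
    # Random initial population of 0/1 masks.
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda m: fitness(m, informative), reverse=True)
        parents = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_features)       # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda m: fitness(m, informative))

best = ga_select(10, informative={0, 3, 7})
print([i for i, bit in enumerate(best) if bit])  # indices of selected features
```

A real application would replace `fitness` with a classifier-based or rough-set-based objective; the elitist truncation selection used here is only one of many possible GA designs.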

A method of feature selection is proposed: a between-class and within-class distance measurement criterion combined with a genetic algorithm (GA) for optimal feature selection. The results of a bearing-diagnosis example show that the method possesses excellent optimization properties, enhances diagnostic correctness, and decreases the false-alarm rate. The method has good prospects for BIT (built-in test) fault feature selection.

A fault feature selection method is proposed that uses the between-class/within-class distance as the feature evaluation criterion and exploits the strong search capability of genetic algorithms to solve the feature optimization problem. A bearing-diagnosis example shows that the method is good at finding optimal feature subsets, improves the diagnostic accuracy of the BIT system, and reduces its false-alarm rate, so it has promising applications in electromechanical BIT fault feature selection.
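The between-class/within-class distance criterion mentioned above can be illustrated as a Fisher-style scatter ratio computed per feature; this is a plausible reading of the criterion, not necessarily the papers' exact formulation:

```python
# Hypothetical per-feature score: between-class scatter / within-class scatter.
from statistics import mean, pvariance

def class_distance_score(values, labels):
    """Higher score = the feature's class means are far apart relative
    to the spread of each class (better class separability)."""
    classes = sorted(set(labels))
    groups = {c: [v for v, l in zip(values, labels) if l == c] for c in classes}
    overall = mean(values)
    between = sum(len(g) * (mean(g) - overall) ** 2 for g in groups.values())
    within = sum(len(g) * pvariance(g) for g in groups.values())
    return between / within if within else float("inf")

# A feature that separates the two classes scores higher than a noisy one.
labels = [0, 0, 0, 1, 1, 1]
separating = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
noisy = [1.0, 5.0, 3.0, 1.1, 4.9, 3.2]
print(class_distance_score(separating, labels) > class_distance_score(noisy, labels))
```

Ranking features by such a score (or feeding it to a GA as the fitness of a subset) is one common way to operationalize a distance-based evaluation criterion.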


Feature selection can remove redundant features existing in the fault feature set and thereby enhance diagnostic precision. There are generally two approaches to feature selection: the filter method and the wrapper method. A composite feature selection method is proposed that reaps the precision benefits of the wrapper method while keeping computational expense down with the filter method. First, the original features are filtered to form a feature subset that meets a required classification correctness rate; then the wrapper method is used to select an optimal feature subset. Genetic algorithms (GAs) provide a technique for solving optimization problems, and a GA is applied to the optimal feature selection problem. In data simulation and in an experiment on bearing fault feature selection, the composite method saves more computing time than the wrapper method while holding the classification accuracy. The method thus possesses excellent optimization properties and saves selection time, combining high correctness with high efficiency.

Feature selection methods mainly comprise filter methods and wrapper methods. To combine the computational simplicity of the filter method with the high precision of the wrapper method, a new composite filter-wrapper feature selection method is proposed. The method first uses a mutual-information-based filter to obtain a subset that satisfies a given precision requirement, and then applies the wrapper method to find the final optimized feature subset. Because genetic algorithms have been applied successfully to combinatorial optimization problems, a genetic algorithm is adopted for the feature subset search. In numerical simulation and in bearing fault feature selection, the new method saves substantial selection time while guaranteeing diagnostic precision. The composite method is good at finding optimal feature subsets and saves selection time, offering both high efficiency and high precision.
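The filter-then-wrapper composite described above might be sketched as follows, with a simple correlation ranking standing in for the mutual-information filter, a toy nearest-centroid resubstitution accuracy as the wrapper objective, and an exhaustive subset search replacing the GA for brevity; all names are illustrative:

```python
# Hypothetical sketch of a composite filter + wrapper feature selection.
from itertools import combinations

def filter_stage(X, y, keep):
    """Filter: rank features by |correlation| with y; keep the top `keep`."""
    def corr(col):
        n = len(y)
        mx, my = sum(col) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(col, y))
        vx = sum((a - mx) ** 2 for a in col) ** 0.5
        vy = sum((b - my) ** 2 for b in y) ** 0.5
        return abs(cov / (vx * vy)) if vx and vy else 0.0
    cols = list(zip(*X))
    return sorted(range(len(cols)), key=lambda i: corr(cols[i]), reverse=True)[:keep]

def wrapper_stage(X, y, candidates, accuracy):
    """Wrapper: exhaustively score subsets of the shortlisted features."""
    best, best_acc = (), -1.0
    for r in range(1, len(candidates) + 1):
        for subset in combinations(candidates, r):
            acc = accuracy(subset, X, y)
            if acc > best_acc:
                best, best_acc = subset, acc
    return set(best)

def centroid_accuracy(subset, X, y):
    # Toy wrapper objective: nearest-class-centroid accuracy on the
    # selected features (resubstitution, for illustration only).
    proj = [[row[i] for i in subset] for row in X]
    cents = {}
    for c in set(y):
        rows = [p for p, l in zip(proj, y) if l == c]
        cents[c] = [sum(col) / len(rows) for col in zip(*rows)]
    def predict(p):
        return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(p, cents[c])))
    return sum(predict(p) == l for p, l in zip(proj, y)) / len(y)

X = [[0.1, 9, 5], [0.2, 8, 5], [0.9, 1, 5], [1.0, 2, 5]]  # feature 2 is constant
y = [0, 0, 1, 1]
shortlist = filter_stage(X, y, keep=2)
print(wrapper_stage(X, y, shortlist, centroid_accuracy))
```

The cheap filter discards uninformative features (here the constant feature 2) so the expensive wrapper only searches a small candidate pool, which is the computational saving the abstract claims.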

 
© 2008 CNKI (China National Knowledge Infrastructure) · China Academic Journals (CD Edition) Electronic Publishing House