Browse Wiki & Semantic Web

http://dbpedia.org/resource/Decision_tree_pruning
http://dbpedia.org/ontology/abstract (zh) Pruning is a method in machine learning and search algorithms that reduces the size of a decision tree by removing nodes with weak discriminative power. Pruning lowers the complexity of the model, which reduces the risk of overfitting and thus the generalization error. In decision tree algorithms, a tree that is too large risks overfitting and will generalize poorly to new samples, while a tree that is too small cannot capture important structural information from the sample space. However, because it is hard to judge whether adding one more split node will significantly reduce the error, it is hard to decide when to stop growing the tree; this problem is known as the horizon effect. A common strategy is to let the tree grow until every leaf contains a sufficiently small number of samples, and then use pruning to remove the weakly discriminative nodes. The pruning process should reduce the size of the tree while keeping its accuracy under cross-validation from dropping. ,
(en) Pruning is a data compression technique in machine learning and search algorithms that reduces the size of decision trees by removing sections of the tree that are non-critical and redundant for classifying instances. Pruning reduces the complexity of the final classifier, and hence improves predictive accuracy by reducing overfitting. One of the questions that arises in a decision tree algorithm is the optimal size of the final tree. A tree that is too large risks overfitting the training data and generalizing poorly to new samples; a small tree might not capture important structural information about the sample space. However, it is hard to tell when a tree algorithm should stop, because it is impossible to tell whether the addition of a single extra node will dramatically decrease error. This problem is known as the horizon effect. A common strategy is to grow the tree until each node contains a small number of instances, then use pruning to remove nodes that do not provide additional information. Pruning should reduce the size of a learning tree without reducing predictive accuracy as measured by a cross-validation set. There are many techniques for tree pruning that differ in the measurement used to optimize performance. ,
(pt) In computer science, pruning is the removal of parts of a decision tree, or of neurons from a neural network. This may be done because they make no significant contribution to the accuracy or interpretability of the tree, reducing the tree's complexity and improving its generalization. A common strategy is to grow the tree until each node contains a small number of instances and then use pruning to remove the nodes that provide no additional information. ,
(de) Pruning is the English term for the cutting back (trimming) of trees and shrubs. In computer science, in the context of machine learning, the term is used for simplifying, shortening, and optimizing decision trees. The idea of pruning originally arose from the attempt to prevent overfitting in trees produced by induced learning. Overfitting denotes the undesired induction of noise into a tree; noise here means incorrect attribute values or class memberships that corrupt data sets and thus needlessly enlarge decision trees. Pruning cuts these unnecessary subtrees back out of the tree.
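The strategy the abstracts describe — grow the tree fully, then collapse splits whose removal does not hurt accuracy on a held-out validation set — is commonly called reduced-error pruning. A minimal sketch follows; the dict-based tree layout and the toy data are illustrative assumptions, not taken from the article:

```python
# Sketch of reduced-error pruning on a hand-built decision tree.
# Tree layout is an assumption for illustration: an internal node is
# {"feature": i, "threshold": t, "left": ..., "right": ...}; a leaf is
# {"label": c}.

from collections import Counter

def predict(node, x):
    """Route a sample down the tree until a leaf is reached."""
    while "label" not in node:
        node = node["left"] if x[node["feature"]] <= node["threshold"] else node["right"]
    return node["label"]

def accuracy(tree, X, y):
    return sum(predict(tree, x) == t for x, t in zip(X, y)) / len(y)

def majority_label(node):
    """Most common label among the leaves of a subtree."""
    labels, stack = [], [node]
    while stack:
        n = stack.pop()
        if "label" in n:
            labels.append(n["label"])
        else:
            stack += [n["left"], n["right"]]
    return Counter(labels).most_common(1)[0][0]

def prune(tree, node, X_val, y_val):
    """Bottom-up: collapse a split into a leaf whenever doing so does
    not lower accuracy on the held-out validation set."""
    if "label" in node:
        return
    prune(tree, node["left"], X_val, y_val)
    prune(tree, node["right"], X_val, y_val)
    before = accuracy(tree, X_val, y_val)
    saved = dict(node)                    # keep the split so we can undo
    node.clear()
    node["label"] = majority_label(saved)
    if accuracy(tree, X_val, y_val) < before:   # pruning hurt: undo it
        node.clear()
        node.update(saved)

# Toy example: the right-hand split separates two leaves with the same
# label, so it provides no additional information.
tree = {"feature": 0, "threshold": 5,
        "left": {"label": 0},
        "right": {"feature": 0, "threshold": 7,
                  "left": {"label": 1}, "right": {"label": 1}}}
prune(tree, tree, [[3], [6], [8]], [0, 1, 1])
# tree["right"] is now the single leaf {"label": 1}; the informative
# root split survives because collapsing it would drop accuracy.
```

The undo step is what makes this "reduced-error" pruning: a split is discarded only when the validation accuracy is provably no worse without it, matching the abstract's requirement that pruning not reduce accuracy on held-out data.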
http://dbpedia.org/ontology/thumbnail http://commons.wikimedia.org/wiki/Special:FilePath/Before_after_pruning.png?width=300 +
http://dbpedia.org/ontology/wikiPageExternalLink http://citeseer.ist.psu.edu/76752.html + , http://web.cs.iastate.edu/~honavar/kdd95_mdl.pdf + , http://www.math.tau.ac.il/~mansour/ml-course/scribe11.ps + , http://www.cp.eng.chula.ac.th/~boonserm/publication/ijcnn_kc2001.pdf + , http://www.cis.upenn.edu/~mkearns/papers/pruning.pdf +
http://dbpedia.org/ontology/wikiPageID 5462075
http://dbpedia.org/ontology/wikiPageLength 6756
http://dbpedia.org/ontology/wikiPageRevisionID 1085936310
http://dbpedia.org/ontology/wikiPageWikiLink http://dbpedia.org/resource/Category:Artificial_intelligence + , http://dbpedia.org/resource/Overfitting + , http://dbpedia.org/resource/Null-move_heuristic + , http://dbpedia.org/resource/Data_compression + , http://dbpedia.org/resource/Horizon_effect + , http://dbpedia.org/resource/Artificial_neural_network + , http://dbpedia.org/resource/Machine_learning + , http://dbpedia.org/resource/Cross-validation_%28statistics%29 + , http://dbpedia.org/resource/Alpha%E2%80%93beta_pruning + , http://dbpedia.org/resource/Statistical_classification + , http://dbpedia.org/resource/Category:Machine_learning + , http://dbpedia.org/resource/Decision_tree_learning + , http://dbpedia.org/resource/Search_algorithm + , http://dbpedia.org/resource/File:Before_after_pruning.png + , http://dbpedia.org/resource/Category:Decision_trees +
http://dbpedia.org/property/wikiPageUsesTemplate http://dbpedia.org/resource/Template:Tmath + , http://dbpedia.org/resource/Template:Cite_book + , http://dbpedia.org/resource/Template:Reflist + , http://dbpedia.org/resource/Template:Cite_journal +
http://purl.org/dc/terms/subject http://dbpedia.org/resource/Category:Machine_learning + , http://dbpedia.org/resource/Category:Decision_trees + , http://dbpedia.org/resource/Category:Artificial_intelligence +
http://www.w3.org/ns/prov#wasDerivedFrom http://en.wikipedia.org/wiki/Decision_tree_pruning?oldid=1085936310&ns=0 +
http://xmlns.com/foaf/0.1/depiction http://commons.wikimedia.org/wiki/Special:FilePath/Before_after_pruning.png +
http://xmlns.com/foaf/0.1/isPrimaryTopicOf http://en.wikipedia.org/wiki/Decision_tree_pruning +
owl:sameAs https://global.dbpedia.org/id/3NgV4 + , http://de.dbpedia.org/resource/Pruning + , http://fa.dbpedia.org/resource/%D9%87%D8%B1%D8%B3_%DA%A9%D8%B1%D8%AF%D9%86_%D8%AF%D8%B1%D8%AE%D8%AA_%D8%AC%D8%B3%D8%AA%D8%AC%D9%88 + , http://zh.dbpedia.org/resource/%E5%86%B3%E7%AD%96%E6%A0%91%E5%89%AA%E6%9E%9D + , http://dbpedia.org/resource/Decision_tree_pruning + , http://pt.dbpedia.org/resource/Poda_%28computa%C3%A7%C3%A3o%29 + , http://www.wikidata.org/entity/Q365738 +
rdfs:comment (pt) In computer science, pruning is the removal of parts of a decision tree, or of neurons from a neural network. This may be done because they make no significant contribution to the accuracy or interpretability of the tree, reducing the tree's complexity and improving its generalization. A common strategy is to grow the tree until each node contains a small number of instances and then use pruning to remove the nodes that provide no additional information. , (en) Pruning is a data compression technique in machine learning and search algorithms that reduces the size of decision trees by removing sections of the tree that are non-critical and redundant for classifying instances. Pruning reduces the complexity of the final classifier, and hence improves predictive accuracy by reducing overfitting. Pruning should reduce the size of a learning tree without reducing predictive accuracy as measured by a cross-validation set. There are many techniques for tree pruning that differ in the measurement used to optimize performance. , (de) Pruning is the English term for the cutting back (trimming) of trees and shrubs. In computer science, in the context of machine learning, the term is used for simplifying, shortening, and optimizing decision trees. , (zh) Pruning is a method in machine learning and search algorithms that reduces the size of a decision tree by removing nodes with weak discriminative power. Pruning lowers the complexity of the model, which reduces the risk of overfitting and thus the generalization error. In decision tree algorithms, a tree that is too large risks overfitting and will generalize poorly to new samples, while a tree that is too small cannot capture important structural information from the sample space. However, because it is hard to judge whether adding one more split node will significantly reduce the error, it is hard to decide when to stop growing the tree; this problem is known as the horizon effect. A common strategy is to let the tree grow until every leaf contains a sufficiently small number of samples, and then use pruning to remove the weakly discriminative nodes. The pruning process should reduce the size of the tree while keeping its accuracy under cross-validation from dropping.
rdfs:label Pruning , Poda (computação) , Decision tree pruning , 决策树剪枝
Properties that link here
http://dbpedia.org/resource/Pruning_%28disambiguation%29 + http://dbpedia.org/ontology/wikiPageDisambiguates
http://dbpedia.org/resource/Pruning_%28algorithm%29 + , http://dbpedia.org/resource/Pruning_%28decision_trees%29 + , http://dbpedia.org/resource/Decision-tree_pruning + , http://dbpedia.org/resource/Pruning_%28Algorithm%29 + , http://dbpedia.org/resource/Pruning_algorithm + , http://dbpedia.org/resource/Pruning_algorithms + http://dbpedia.org/ontology/wikiPageRedirects
http://dbpedia.org/resource/Pruning_%28disambiguation%29 + , http://dbpedia.org/resource/Game_tree + , http://dbpedia.org/resource/Pruning_%28algorithm%29 + , http://dbpedia.org/resource/Pruning_%28decision_trees%29 + , http://dbpedia.org/resource/Decision-tree_pruning + , http://dbpedia.org/resource/Pruning_%28Algorithm%29 + , http://dbpedia.org/resource/Pruning_algorithm + , http://dbpedia.org/resource/Pruning_algorithms + , http://dbpedia.org/resource/Search_tree_pruning + http://dbpedia.org/ontology/wikiPageWikiLink
http://en.wikipedia.org/wiki/Decision_tree_pruning + http://xmlns.com/foaf/0.1/primaryTopic
http://dbpedia.org/resource/Decision_tree_pruning + owl:sameAs