Decision tree pruning

Pruning is a method in machine learning and search algorithms that reduces the size of a decision tree by removing nodes with weak discriminative power. Pruning lowers the model's complexity and therefore reduces the risk of overfitting, which in turn reduces the generalization error. In decision tree algorithms, a tree that is too large risks overfitting and generalizes poorly to new samples, while a tree that is too small cannot extract important structural information from the sample space. However, because it is hard to judge whether adding one more split node will significantly reduce the error, it is also hard to judge when it is appropriate to stop growing the tree; this problem is known as the horizon effect. A common strategy is to let the tree grow until every leaf node contains a sufficiently small number of samples and then prune away the nodes with weak discriminative power. Pruning should reduce the size of the tree while ensuring that its accuracy under cross-validation does not decrease.
Pruning is a data compression technique in machine learning and search algorithms that reduces the size of decision trees by removing sections of the tree that are non-critical and redundant for classifying instances. Pruning reduces the complexity of the final classifier and hence improves predictive accuracy by reducing overfitting. One of the questions that arises in a decision tree algorithm is the optimal size of the final tree. A tree that is too large risks overfitting the training data and generalizing poorly to new samples, while a tree that is too small might not capture important structural information about the sample space. However, it is hard to tell when a tree algorithm should stop, because it is impossible to tell whether the addition of a single extra node will dramatically decrease error. This problem is known as the horizon effect. A common strategy is to grow the tree until each node contains a small number of instances, then use pruning to remove nodes that do not provide additional information. Pruning should reduce the size of a learning tree without reducing predictive accuracy as measured by a cross-validation set. There are many techniques for tree pruning, which differ in the measurement used to optimize performance.

In computer science, pruning is the removal of parts of a decision tree, or of neurons of a neural network, that make no significant contribution to the accuracy or interpretability of the tree; this reduces the tree's complexity and improves its generalization. A common strategy is to grow the tree until each node contains a small number of instances and then use pruning to remove the nodes that do not provide additional information.
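The grow-then-prune strategy described above can be sketched in code. The following is a minimal illustration of reduced-error pruning, one of the simplest pruning techniques: walk the tree bottom-up and collapse an internal node into a leaf whenever doing so does not reduce accuracy on a held-out validation set. The `Node` class, toy tree, and validation data are invented for this example, not taken from the source or any library.

```python
class Node:
    """A tiny binary decision tree node (illustrative only)."""
    def __init__(self, feature=None, threshold=None, left=None, right=None, label=None):
        self.feature = feature      # index of the feature tested at this node
        self.threshold = threshold  # go left if x[feature] <= threshold
        self.left = left
        self.right = right
        self.label = label          # majority class of training samples at this node

    def is_leaf(self):
        return self.feature is None

def predict(node, x):
    while not node.is_leaf():
        node = node.left if x[node.feature] <= node.threshold else node.right
    return node.label

def accuracy(tree, data):
    return sum(predict(tree, x) == y for x, y in data) / len(data)

def prune(tree, node, val_data):
    """Reduced-error pruning: bottom-up, greedy, scored on validation data."""
    if node.is_leaf():
        return
    prune(tree, node.left, val_data)
    prune(tree, node.right, val_data)
    before = accuracy(tree, val_data)
    # Tentatively collapse this node into a leaf that predicts its majority class.
    saved = (node.feature, node.threshold, node.left, node.right)
    node.feature = node.threshold = node.left = node.right = None
    if accuracy(tree, val_data) < before:
        # The collapse hurt validation accuracy: restore the subtree.
        node.feature, node.threshold, node.left, node.right = saved

# Toy tree whose right subtree has overfit an extra split; validation data reveals it.
tree = Node(feature=0, threshold=5, label=0,
            left=Node(label=0),
            right=Node(feature=0, threshold=7, label=1,
                       left=Node(label=1),
                       right=Node(label=0)))
val = [([3], 0), ([6], 1), ([8], 1)]
prune(tree, tree, val)
print("validation accuracy after pruning:", accuracy(tree, val))  # 1.0
```

After pruning, the spurious second split on the right has been collapsed into a single leaf, while the root split, which the validation set supports, is kept; this mirrors the "remove nodes that do not provide additional information" step above.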
Pruning is the English term for trimming (cutting back) trees and shrubs. In computer science, in the field of machine learning, the term is used for simplifying, shortening, and optimizing decision trees. The idea of pruning originally arose from the attempt to prevent so-called overfitting in trees produced by inductive learning. Overfitting denotes the undesired induction of noise into a tree; noise here means incorrect attribute values or class memberships that corrupt datasets and thus enlarge decision trees unnecessarily. By pruning the trees, the unnecessary subtrees are trimmed away again.