About: Decision stump

An Entity of Type: yago:WikicatDecisionTrees, within Data Space: dbpedia.demo.openlinksw.com, associated with source document(s):
http://dbpedia.demo.openlinksw.com/describe/?url=http%3A%2F%2Fdbpedia.org%2Fresource%2FDecision_stump

A decision stump is a machine learning model consisting of a one-level decision tree. That is, it is a decision tree with one internal node (the root) which is immediately connected to the terminal nodes (its leaves). A decision stump makes a prediction based on the value of just a single input feature. Such models are sometimes also called 1-rules. The term "decision stump" was coined in a 1992 ICML paper by Wayne Iba and Pat Langley.
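As a concrete illustration of this definition, the following minimal sketch (not from the source; function names and toy data are hypothetical) fits a stump on a single continuous feature: it tries every observed value as a threshold, with both leaf polarities, and keeps the split with the lowest training error.

    import numpy as np

    def fit_stump(x, y):
        # Fit a one-level decision tree on a single continuous feature x
        # with labels y in {-1, +1}: try each observed value as a threshold,
        # with either leaf polarity, and keep the lowest-error split.
        best_err, best_t, best_s = np.inf, None, None
        for t in np.unique(x):
            for s in (1, -1):
                pred = s * np.where(x <= t, 1, -1)
                err = np.mean(pred != y)
                if err < best_err:
                    best_err, best_t, best_s = err, t, s
        return best_t, best_s

    def predict_stump(x, t, s):
        # The entire model is one threshold test at the root.
        return s * np.where(x <= t, 1, -1)

    # Toy data: a threshold near 0 separates the classes perfectly.
    x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
    y = np.array([-1, -1, -1, 1, 1, 1])
    t, s = fit_stump(x, y)
    print(t, s, predict_stump(x, t, s))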

Attributes / Values
rdf:type
rdfs:label
  • Decision stump (en)
rdfs:comment
  • A decision stump is a machine learning model consisting of a one-level decision tree. That is, it is a decision tree with one internal node (the root) which is immediately connected to the terminal nodes (its leaves). A decision stump makes a prediction based on the value of just a single input feature. Such models are sometimes also called 1-rules. The term "decision stump" was coined in a 1992 ICML paper by Wayne Iba and Pat Langley. (en)
foaf:depiction
  • http://commons.wikimedia.org/wiki/Special:FilePath/Decision_stump.svg
dcterms:subject
Wikipage page ID
Wikipage revision ID
Link from a Wikipage to another Wikipage
sameAs
dbp:wikiPageUsesTemplate
thumbnail
has abstract
  • A decision stump is a machine learning model consisting of a one-level decision tree. That is, it is a decision tree with one internal node (the root) which is immediately connected to the terminal nodes (its leaves). A decision stump makes a prediction based on the value of just a single input feature. Such models are sometimes also called 1-rules. Depending on the type of the input feature, several variations are possible. For nominal features, one may build a stump which contains a leaf for each possible feature value, or a stump with two leaves, one of which corresponds to some chosen category and the other to all remaining categories. For binary features these two schemes are identical. A missing value may be treated as yet another category. For continuous features, a threshold value is usually selected, and the stump contains two leaves, for values below and above the threshold. More rarely, multiple thresholds may be chosen, in which case the stump contains three or more leaves. Decision stumps are often used as components (called "weak learners" or "base learners") in machine learning ensemble techniques such as bagging and boosting (a boosting sketch follows this attribute listing). For example, the Viola–Jones face detection algorithm employs AdaBoost with decision stumps as weak learners. The term "decision stump" was coined in a 1992 ICML paper by Wayne Iba and Pat Langley. (en)
gold:hypernym
prov:wasDerivedFrom
page length (characters) of wiki page
foaf:isPrimaryTopicOf
is Link from a Wikipage to another Wikipage of
is Wikipage redirect of
is Wikipage disambiguates of
is foaf:primaryTopic of
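
The abstract above notes that decision stumps are commonly used as weak learners in boosting. As a minimal illustration under assumed tooling, the sketch below trains AdaBoost over depth-1 trees (which are exactly decision stumps) with scikit-learn; the dataset and hyperparameters are arbitrary demo choices, and the `estimator` keyword assumes scikit-learn >= 1.2 (older releases name it `base_estimator`).

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Toy binary classification problem (arbitrary demo data).
    X, y = make_classification(n_samples=500, n_features=10, random_state=0)

    # A depth-1 DecisionTreeClassifier is exactly a decision stump.
    # AdaBoost reweights the training set and fits one stump per round,
    # then combines the stumps by a weighted vote.
    clf = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=1),  # the weak learner
        n_estimators=100,
        random_state=0,
    )
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))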