Mixture of experts (MoE) is a machine learning technique in which multiple expert networks (learners) are used to divide a problem space into homogeneous regions. It differs from ensemble techniques in that typically only one or a few expert models are run for a given input, rather than combining the results of all models. An example from computer vision is combining one neural network model for human detection with another for pose estimation.
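To make the routing idea concrete, the following is a minimal sketch of an MoE layer with a learned gating network that sends each input to only its top-k experts, assuming PyTorch; the class names (TinyExpert, TopKMoE), expert count, and layer sizes are illustrative assumptions, not taken from the text above.

```python
# Minimal mixture-of-experts sketch with top-k gating (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyExpert(nn.Module):
    """One expert: a small feed-forward network (hypothetical sizes)."""

    def __init__(self, d_in: int, d_hidden: int, d_out: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, d_hidden),
            nn.ReLU(),
            nn.Linear(d_hidden, d_out),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class TopKMoE(nn.Module):
    """Routes each input to its top-k experts and mixes their outputs."""

    def __init__(self, d_in: int, d_out: int, num_experts: int = 4, k: int = 1):
        super().__init__()
        self.experts = nn.ModuleList(
            TinyExpert(d_in, 2 * d_in, d_out) for _ in range(num_experts)
        )
        self.gate = nn.Linear(d_in, num_experts)  # learned gating network
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Score every expert, then keep only the top-k per example.
        scores = self.gate(x)                           # (batch, num_experts)
        topk_vals, topk_idx = scores.topk(self.k, dim=-1)
        weights = F.softmax(topk_vals, dim=-1)          # renormalize over the chosen k

        d_out = self.experts[0].net[-1].out_features
        out = x.new_zeros(x.shape[0], d_out)
        for slot in range(self.k):
            idx = topk_idx[:, slot]
            w = weights[:, slot].unsqueeze(-1)
            # Run each selected expert only on the rows routed to it.
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    out[mask] += w[mask] * expert(x[mask])
        return out


if __name__ == "__main__":
    moe = TopKMoE(d_in=8, d_out=3, num_experts=4, k=1)
    x = torch.randn(5, 8)
    print(moe(x).shape)  # torch.Size([5, 3])
```

With k=1 each input activates a single expert, which is what distinguishes this routing scheme from an ensemble that evaluates and averages every model.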