PlumX Metrics

Data fusion with entropic priors

Frontiers in Artificial Intelligence and Applications, ISSN: 1879-8314, Vol: 226, Pages: 107-114
2011
  • Citations: 14
  • Usage: 0
  • Captures: 6
  • Mentions: 0
  • Social Media: 0


Book Chapter Description

In classification problems, lack of knowledge of the prior distribution may make the application of Bayes' rule inadequate. Uniform or arbitrary priors often provide classification answers that, even in simple examples, end up contradicting our common sense about the problem. Entropic priors, determined via the application of the maximum entropy principle, seem to provide a much better answer and can be easily derived and applied to classification tasks when no more than the likelihood functions are available. In this paper we present an example in which the use of entropic priors is compared to the results of applying Dempster-Shafer theory. © 2011 The authors and IOS Press. All rights reserved.
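The chapter itself is not reproduced on this page. As a rough illustration of the idea summarized in the abstract, the sketch below assumes the common formulation of entropic priors in which each class receives a prior weight proportional to the exponential of the Shannon entropy of its likelihood function, after which ordinary Bayes' rule is applied; the function names and the toy likelihood values are illustrative, not taken from the chapter.

    import numpy as np

    def entropic_prior(likelihoods):
        # Assumed formulation: prior(c_i) proportional to exp(H_i), where H_i is the
        # entropy of the likelihood p(x | c_i). Rows of `likelihoods` sum to 1.
        L = np.asarray(likelihoods, dtype=float)
        H = -np.sum(np.where(L > 0, L * np.log(L), 0.0), axis=1)  # per-class entropy
        w = np.exp(H)
        return w / w.sum()

    def classify(likelihoods, x):
        # Bayes' rule with the entropic prior: posterior(c) proportional to prior(c) * p(x | c).
        prior = entropic_prior(likelihoods)
        post = prior * np.asarray(likelihoods, dtype=float)[:, x]
        return post / post.sum()

    # Toy example with two classes and three possible observations.
    lik = [[0.8, 0.1, 0.1],   # peaked likelihood -> lower entropy -> smaller prior weight
           [0.4, 0.3, 0.3]]   # flatter likelihood -> higher entropy -> larger prior weight
    print(entropic_prior(lik))
    print(classify(lik, x=0))

In this toy setting the flatter (less informative) likelihood receives the larger prior weight, which is the behavior the abstract attributes to entropic priors when no genuine prior knowledge is available.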
