Show simple item record

dc.contributor.author: Keroglou, C. (en)
dc.contributor.author: Hadjicostis, Christoforos N. (en)
dc.contributor.editor: Lennartson, B. (en)
dc.contributor.editor: Lesage, J.-J. (en)
dc.contributor.editor: Faure, J.-M. (en)
dc.contributor.editor: Cury, J.E.R. (en)
dc.creator: Keroglou, C. (en)
dc.creator: Hadjicostis, Christoforos N. (en)
dc.date.accessioned: 2019-04-08T07:46:27Z
dc.date.available: 2019-04-08T07:46:27Z
dc.date.issued: 2014
dc.identifier.uri: http://gnosis.library.ucy.ac.cy/handle/7/43781
dc.description.abstract: Given a sequence of observations, classification between two known hidden Markov models (HMMs) can be accomplished with a classifier that minimizes the probability of error (i.e., the probability of misclassification) by enforcing the maximum a posteriori probability (MAP) rule. For this MAP classifier, the a priori probability of error (before any observations are made) can be obtained, as a function of the length of the sequence of observations, by summing the probability of error over all possible observation sequences of the given length, which is a computationally expensive task. In this paper, we obtain an upper bound on the probability of error of the MAP classifier. Our results are based on a suboptimal decision rule that ignores the order in which observations occur and relies solely on the empirical frequencies with which different symbols appear. We describe necessary and sufficient conditions under which this bound on the probability of error decreases exponentially with the length of the observation sequence. Apart from the usefulness of the suboptimal rule in bounding the probability of misclassification, its numerous advantages (such as low computational complexity, reduced storage requirements, and potential applicability to distributed or decentralized decision schemes) could make it a useful alternative to the MAP rule for HMM classification in many applications. © IFAC. (en) [An illustrative sketch of both decision rules appears after this record.]
dc.publisher: IFAC Secretariat (en)
dc.source: IFAC Proceedings Volumes (IFAC-PapersOnline) (en)
dc.source.uri: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84945952274&doi=10.3182%2f20140514-3-FR-4046.00068&partnerID=40&md5=cc2487bf3d185068219d7a9c4a27d2b2
dc.subject: Errors (en)
dc.subject: Probability (en)
dc.subject: Markov processes (en)
dc.subject: Hidden Markov models (en)
dc.subject: Probability of errors (en)
dc.subject: Discrete event simulation (en)
dc.subject: A-priori probabilities (en)
dc.subject: Probability distributions (en)
dc.subject: Trellis codes (en)
dc.subject: Classification (en)
dc.subject: Classification (of information) (en)
dc.subject: Empirical frequencies (en)
dc.subject: Hidden Markov model (en)
dc.subject: Hidden Markov models (HMMs) (en)
dc.subject: Low computational complexity (en)
dc.subject: Maximum a posteriori probabilities (en)
dc.subject: Probability of misclassification (en)
dc.subject: Storage requirements (en)
dc.title: Hidden Markov model classification based on empirical frequencies of observed symbols (en)
dc.type: info:eu-repo/semantics/conferenceObject
dc.identifier.doi: 10.3182/20140514-3-FR-4046.00068
dc.description.volume: 9
dc.description.startingpage: 7
dc.description.endingpage: 12
dc.author.faculty: Πολυτεχνική Σχολή / Faculty of Engineering
dc.author.department: Τμήμα Ηλεκτρολόγων Μηχανικών και Μηχανικών Υπολογιστών / Department of Electrical and Computer Engineering
dc.type.uhtype: Conference Object (en)
dc.contributor.orcid: Hadjicostis, Christoforos N. [0000-0002-1706-708X]
dc.gnosis.orcid: 0000-0002-1706-708X
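
Illustrative sketch of the two decision rules from the abstract

The abstract contrasts the MAP rule, which uses the full ordered observation sequence, with a suboptimal rule that relies only on the empirical frequencies of the observed symbols. The Python sketch below is illustrative only: the MAP rule is implemented with the standard scaled forward algorithm, while the frequency-based rule here compares the empirical symbol histogram with each model's stationary output distribution using KL divergence, which is one plausible reading of the abstract rather than the paper's exact construction. All model parameters, the observation sequence, and the function names (forward_log_likelihood, map_classify, frequency_classify) are hypothetical.

import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Log P(obs | HMM) via the scaled forward algorithm.
    pi: (n,) initial distribution, A: (n, n) transitions, B: (n, m) emissions,
    obs: sequence of symbol indices."""
    alpha = pi * B[:, obs[0]]
    log_like = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_like += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_like

def map_classify(models, priors, obs):
    """MAP rule: pick the model maximizing prior times sequence likelihood."""
    scores = [np.log(p) + forward_log_likelihood(*m, obs)
              for m, p in zip(models, priors)]
    return int(np.argmax(scores))

def frequency_classify(models, obs, num_symbols):
    """Frequency-based rule (an assumption, not the paper's exact rule):
    compare the empirical symbol histogram with each model's stationary
    output distribution and pick the one closer in KL divergence."""
    counts = np.bincount(obs, minlength=num_symbols)
    emp = counts / counts.sum()
    scores = []
    for pi, A, B in models:
        # Stationary distribution of the state chain (left Perron vector of A).
        w, v = np.linalg.eig(A.T)
        stat = np.real(v[:, np.argmax(np.real(w))])
        stat = stat / stat.sum()
        out = stat @ B  # stationary distribution over output symbols
        mask = emp > 0
        kl = np.sum(emp[mask] * np.log(emp[mask] / out[mask]))
        scores.append(-kl)
    return int(np.argmax(scores))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two illustrative 2-state HMMs over a 3-symbol alphabet.
    hmm0 = (np.array([0.6, 0.4]),
            np.array([[0.7, 0.3], [0.2, 0.8]]),
            np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]))
    hmm1 = (np.array([0.5, 0.5]),
            np.array([[0.9, 0.1], [0.4, 0.6]]),
            np.array([[0.2, 0.2, 0.6], [0.6, 0.3, 0.1]]))
    obs = rng.integers(0, 3, size=50)  # placeholder observation sequence
    print("MAP decision:", map_classify([hmm0, hmm1], [0.5, 0.5], obs))
    print("Frequency-based decision:", frequency_classify([hmm0, hmm1], obs, 3))

Note that in this sketch the frequency-based rule needs only the symbol counts rather than the full trellis computation over the ordered sequence, which reflects the low computational and storage requirements highlighted in the abstract.
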


Files in this item


There are no files associated with this item.

