    • Type: Invention grant
    • Title: System and method for partitioning the feature space of a classifier in a pattern classification system
    • Publication number: US6058205A
    • Grant date: 2000-05-02
    • Application number: US781574
    • Filing date: 1997-01-09
    • Inventors: Lalit Rai Bahl, Peter Vincent deSouza, David Nahamoo, Mukund Padmanabhan
    • IPC: G06K 9/62; G06F 17/20
    • CPC: G06K 9/6282
Abstract: A system and method are provided that partition the feature space of a classifier by using hyperplanes to construct a binary decision tree, i.e. a hierarchical data structure, for obtaining the class probabilities of a particular feature vector. One objective in constructing the tree is to minimize the average entropy of the empirical class distributions at each successive node, so that the average entropy of the class distributions at the terminal nodes is minimized. First, a linear discriminant vector is computed that maximally separates the classes at a given node. A threshold is then chosen and applied to the value of each feature vector's projection onto this discriminant, such that all feature vectors whose projection is less than the threshold are assigned to one child node (say, the left child) and those whose projection is greater than or equal to the threshold are assigned to the right child. These two steps are repeated for each child node until the amount of data at a node falls below a predetermined threshold, at which point the node is classified as a terminal node (a leaf of the decision tree). After all non-terminal nodes have been processed, the final step is to store a class distribution with each terminal node. The class probabilities for a particular feature vector can then be obtained by traversing the tree top-down until the terminal node corresponding to that vector is reached. The advantage of the decision tree is that, in computing the class probabilities for the feature vector, only the small number of classes associated with that terminal node need be considered; alternatively, the required probabilities can be read directly from the stored distribution of that terminal node.
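The procedure described in the abstract is straightforward to prototype. Below is a minimal Python sketch, not the patented implementation: as assumptions, the discriminant is taken to be the leading multi-class Fisher/LDA direction, and the threshold is searched over a grid of candidate quantiles to minimize the weighted average entropy of the child class distributions. All names and parameters (build, class_probabilities, min_samples, n_thresholds) are illustrative, not from the patent.

import numpy as np

def _entropy(counts):
    # Shannon entropy (nats) of an empirical class-count vector.
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def _discriminant_direction(X, y):
    # Leading multi-class Fisher/LDA direction: top eigenvector of pinv(Sw) @ Sb.
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / np.linalg.norm(w)

def build(X, y, n_classes, min_samples=20, n_thresholds=16):
    counts = np.bincount(y, minlength=n_classes).astype(float)
    # Terminal node: too little data, or only one class remains.
    if len(y) < min_samples or np.count_nonzero(counts) <= 1:
        return {"dist": counts / counts.sum()}
    w = _discriminant_direction(X, y)
    proj = X @ w
    best = None
    # Pick the threshold minimizing the weighted average entropy of the two
    # child class distributions (the objective named in the abstract).
    for t in np.quantile(proj, np.linspace(0.1, 0.9, n_thresholds)):
        left = proj < t
        if not left.any() or left.all():
            continue  # degenerate split
        lc = np.bincount(y[left], minlength=n_classes).astype(float)
        rc = counts - lc
        h = (left.sum() * _entropy(lc) + (~left).sum() * _entropy(rc)) / len(y)
        if best is None or h < best[0]:
            best = (h, t, left)
    if best is None:
        return {"dist": counts / counts.sum()}
    _, t, left = best
    return {"w": w, "t": t,
            "left": build(X[left], y[left], n_classes, min_samples, n_thresholds),
            "right": build(X[~left], y[~left], n_classes, min_samples, n_thresholds)}

def class_probabilities(tree, x):
    # Top-down traversal to the terminal node for x; return its stored distribution.
    node = tree
    while "dist" not in node:
        node = node["left"] if x @ node["w"] < node["t"] else node["right"]
    return node["dist"]

# Toy usage: three Gaussian classes in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 1.0, size=(200, 2)) for m in ([0, 0], [4, 0], [0, 4])])
y = np.repeat(np.arange(3), 200)
tree = build(X, y, n_classes=3)
print(class_probabilities(tree, np.array([4.0, 0.0])))  # mass concentrates on class 1

Storing the empirical class distribution at each leaf makes lookup a pure top-down traversal, which is the efficiency the abstract points to: evaluating a feature vector touches only the handful of classes that survive at its terminal node, or simply reads the stored distribution.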