

Search for: All records

Creators/Authors contains: "Loffler, M."


  1. A decision tree recursively splits a feature space R^d and then assigns class labels based on the resulting partition. Decision trees have been part of the basic machine-learning toolkit for decades. A large body of work treats heuristic algorithms that compute a decision tree from training data, usually aiming to minimize the size of the resulting tree. In contrast, little is known about the complexity of the underlying computational problem of computing a minimum-size tree for the given training data. We study this problem with respect to the number d of dimensions of the feature space. We show that it can be solved in O(n^{2d+1} d) time, but that under reasonable complexity-theoretic assumptions a running time of f(d) · n^{o(d / log d)} is not achievable, where n is the number of training examples. The problem is solvable in (dR)^{O(dR)} · n^{1+o(1)} time if there are exactly two classes and R is an upper bound on the number of tree leaves labeled with the first class.
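The computational problem the abstract studies can be made concrete with a small sketch. The brute force below (not the paper's algorithm, and exponential-time, for illustration only) finds the minimum number of leaves of an axis-aligned decision tree that exactly classifies a tiny training set, trying every candidate threshold split in every dimension; the function and variable names are invented for this example.

```python
def thresholds(points, dim):
    """Candidate split thresholds in one dimension: midpoints between
    consecutive distinct coordinate values of the training points."""
    vals = sorted({p[dim] for p in points})
    return [(a + b) / 2 for a, b in zip(vals, vals[1:])]

def min_tree_size(data):
    """Minimum number of leaves of an axis-aligned decision tree that
    classifies `data` (a list of (point, label) pairs) without error.
    Exhaustive recursion over all splits -- exponential time, so only
    usable on toy inputs."""
    labels = {label for _, label in data}
    if len(labels) <= 1:
        return 1  # region is pure: a single leaf suffices
    d = len(data[0][0])
    best = float("inf")
    for dim in range(d):
        for t in thresholds([p for p, _ in data], dim):
            left = [(p, l) for p, l in data if p[dim] <= t]
            right = [(p, l) for p, l in data if p[dim] > t]
            if not left or not right:
                continue
            best = min(best, min_tree_size(left) + min_tree_size(right))
    return best

# A 2D XOR-style labeling needs 4 leaves with axis-aligned splits:
xor = [((0, 0), "A"), ((1, 1), "A"), ((0, 1), "B"), ((1, 0), "B")]
print(min_tree_size(xor))  # -> 4
```

Restricting attention to midpoints between consecutive coordinate values is safe because moving a threshold between two such values never changes which points fall on each side; the hardness results quoted above show that even this finite search space cannot be explored much faster than exhaustively when d grows.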