A General Dimension for Query Learning
J. L. Balcázar, J. Castro, D. Guijarro, J. Köbler, W. Lindner
Abstract:
We introduce a combinatorial dimension that characterizes the number
of queries needed to exactly (or approximately) learn concept classes
in various models. Our general dimension provides tight upper and lower
bounds on the query complexity for all sorts of queries, not only for
example-based queries as in previous works. As an application we show
that for learning DNF formulas, unspecified attribute value membership
and equivalence queries are no more powerful than standard membership
and equivalence queries. Further, in the approximate learning setting,
we use the general dimension to characterize the query complexity in
the statistical query model as well as in the learning by distances model.
Moreover, we derive nearly matching bounds on the number of statistical queries
needed to approximately learn DNF formulas.
PDF: gdim.pdf