Vapnik–Chervonenkis dimension

The Vapnik–Chervonenkis dimension (or VC dimension) is a measure of the capacity of a class of functions that can be learned by a statistical classification algorithm, defined as the cardinality of the largest set of points that the class can shatter. It is one of the core concepts in statistical learning theory and was originally defined by Vladimir Vapnik and Alexey Chervonenkis.
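
A class of binary-valued functions H shatters a finite set of points if every possible assignment of labels to those points is realized by some function in H; the VC dimension of H is the size of the largest set it shatters. As a minimal worked example (the threshold class H and the symbols h_a and a below are illustrative choices, not drawn from the sources cited here), take

\[
H = \{\, h_a : a \in \mathbb{R} \,\}, \qquad
h_a(x) =
\begin{cases}
1 & \text{if } x \ge a,\\
0 & \text{if } x < a.
\end{cases}
\]

Any single point \(\{x_1\}\) is shattered, since choosing \(a \le x_1\) labels it 1 and choosing \(a > x_1\) labels it 0. No pair \(x_1 < x_2\) can be shattered, because the labeling \(h(x_1) = 1,\ h(x_2) = 0\) is unattainable by any threshold. Hence the VC dimension of this class is 1.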

References

  • V. Vapnik and A. Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities. Theory of Probability and its Applications, 16(2):264–280, 1971.
  • A. Blumer, A. Ehrenfeucht, D. Haussler, and M. K. Warmuth. Learnability and the Vapnik–Chervonenkis dimension. Journal of the ACM, 36(4):929–965, 1989.