Computational Learning Theory

Issam El Naqa*, Jen-Tzung Chien

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

The conditions under which a task is learnable by a computer algorithm provide guidance both for understanding an algorithm's performance and for selecting the appropriate learning algorithm for a particular task. In this chapter, we present the main theoretical frameworks for machine learning algorithms: probably approximately correct (PAC) learning and the Vapnik–Chervonenkis (VC) dimension. In addition, we discuss emerging principles underlying deep learning. These frameworks allow us to answer questions such as which learning process to select, what the learning capacity of the selected algorithm is, and under which conditions successful learning is possible or impossible. Practical methods for selecting an appropriate model complexity are presented, using techniques based on information theory and statistical resampling.
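
For orientation, the classical PAC sample-complexity bound for a finite hypothesis class can be stated as follows; this is the standard textbook form, not necessarily the exact notation used in the chapter. A consistent learner over a finite hypothesis class H needs

    m \ge \frac{1}{\epsilon} \left( \ln|H| + \ln\frac{1}{\delta} \right)

i.i.d. training examples to guarantee, with probability at least 1 - \delta, a hypothesis whose true error is at most \epsilon. For infinite classes, a standard VC-style generalization bound replaces \ln|H| with a term in the VC dimension d: with probability at least 1 - \delta,

    R(h) \le \hat{R}(h) + \sqrt{\frac{d\left(\ln(2m/d) + 1\right) + \ln(4/\delta)}{m}}

where R(h) is the true risk and \hat{R}(h) the empirical risk on m samples.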
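
As a concrete illustration of complexity selection by statistical resampling, the short sketch below uses k-fold cross-validation to compare polynomial models of increasing degree. It is a minimal example assuming scikit-learn and a synthetic dataset; the chapter itself does not prescribe this particular code.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic regression data: noisy sine curve
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sin(X).ravel() + rng.normal(scale=0.3, size=100)

    # Score each candidate complexity (polynomial degree) by 5-fold CV;
    # the degree with the lowest CV error balances fit against capacity.
    for degree in range(1, 8):
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        mse = -cross_val_score(model, X, y, cv=5,
                               scoring="neg_mean_squared_error").mean()
        print(f"degree={degree}  cv_mse={mse:.4f}")

An information-theoretic criterion such as AIC (2k - 2 ln L̂, with k parameters and maximized likelihood L̂) plays the analogous role when resampling is too expensive.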

Original language: English
Title of host publication: Machine and Deep Learning in Oncology, Medical Physics and Radiology, Second Edition
Publisher: Springer International Publishing
Pages: 17-26
Number of pages: 10
ISBN (Electronic): 9783030830472
ISBN (Print): 9783030830465
DOIs
State: Published - 1 Jan 2022

Keywords

  • Deep learning
  • Information theory
  • PAC
  • Statistical learning
  • Statistical resampling
  • VC dimension
