## Abstract

Structural risk minimisation (SRM) is a general complexity-regularisation method which automatically selects the model complexity that approximately minimises the misclassification error probability of the empirical risk minimiser. It does so by adding a complexity penalty term ∊(m, k) to the empirical risk of the candidate hypotheses and then, for any fixed sample size m, minimising the sum with respect to the model complexity variable k. When learning multi-category classification there are M subsamples of sizes m_{i}, corresponding to the M pattern classes with a priori probabilities p_{i}, 1 ≤ i ≤ M. Using the usual representation of a multi-category classifier as M individual Boolean classifiers, the penalty becomes ∑^{M}_{i=1} p_{i}∊(m_{i}, k_{i}). If the m_{i} are given, then standard SRM applies directly by minimising the penalised empirical risk with respect to the k_{i}, i = 1,..., M. However, in situations where the total sample size ∑^{M}_{i=1} m_{i} needs to be minimal, one must also minimise the penalised empirical risk with respect to the variables m_{i}, i = 1,..., M. The obvious problem is that the empirical risk can only be defined after the subsamples (and hence their sizes) are known. Utilising an on-line stochastic gradient descent approach, this paper overcomes this difficulty and introduces a sample-querying algorithm which extends the standard SRM principle: it minimises the penalised empirical risk not only with respect to the k_{i}, as standard SRM does, but also with respect to the m_{i}, i = 1,..., M. The challenge is in defining a stochastic empirical criterion which, when minimised, yields a sequence of subsample-size vectors that asymptotically achieve the Bayes-optimal error convergence rate.
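The penalised criterion described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's actual algorithm: the concrete penalty form ∊(m, k) ≈ √(k·log m / m) (a typical VC-style bound) and the greedy per-class querying rule are assumptions made here purely for illustration.

```python
import math

def penalty(m, k):
    """Hypothetical VC-style complexity penalty: e(m, k) ~ sqrt(k * log(m) / m).
    Decreases in the subsample size m, increases in the model complexity k."""
    return math.sqrt(k * math.log(max(m, 2)) / m)

def srm_select(emp_risk, m, k_max):
    """Standard SRM for one Boolean classifier: given a subsample of size m,
    pick the complexity k minimising empirical risk plus penalty.
    emp_risk(k) is assumed to return the empirical risk of the best
    hypothesis of complexity k on that subsample."""
    return min(range(1, k_max + 1), key=lambda k: emp_risk(k) + penalty(m, k))

def query_next_class(p, m, k):
    """Illustrative greedy sample-querying step: request the next example
    from the class i whose weighted penalty term p[i] * e(m[i], k[i])
    would decrease the most if its subsample grew by one."""
    gains = [p[i] * (penalty(m[i], k[i]) - penalty(m[i] + 1, k[i]))
             for i in range(len(p))]
    return max(range(len(p)), key=lambda i: gains[i])
```

With equal class priors, the greedy rule above directs the next query to the class with the smaller subsample, since the penalty's marginal decrease is largest where data are scarcest; the paper's stochastic-gradient criterion plays the analogous role of driving the subsample-size vector toward the optimal allocation.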

Original language | English
---|---
Title of host publication | Algorithmic Learning Theory - 14th International Conference, ALT 2003, Proceedings
Editors | Ricard Gavalda, Klaus P. Jantke, Eiji Takimoto
Publisher | Springer Verlag
Pages | 205-220
Number of pages | 16
ISBN (Print) | 3540202919, 9783540202912
State | Published - 1 Jan 2003
Externally published | Yes
Event | 14th International Conference on Algorithmic Learning Theory, ALT 2003 - Sapporo, Japan. Duration: 17 Oct 2003 → 19 Oct 2003

### Publication series

Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
---|---
Volume | 2842
ISSN (Print) | 0302-9743
ISSN (Electronic) | 1611-3349

### Conference

Conference | 14th International Conference on Algorithmic Learning Theory, ALT 2003
---|---
Country/Territory | Japan
City | Sapporo
Period | 17/10/03 → 19/10/03

## ASJC Scopus subject areas

- Theoretical Computer Science
- General Computer Science