Abstract
A well-known definition of AI is based on McCarthy's terms weak and strong AI. An open question is the characterization of these terms, i.e., of the transition from weak to strong. Hardly any research results are known for this complex and important question. In this paper we investigate how the size and structure of a Neural Network (NN) limit the learnability of a training sample and can thus be used to discriminate between weak and strong AI (domains). Furthermore, the size of the training sample is a primary parameter for estimating the training effort in big-O notation. The number of training repetitions needed may also limit the tractability of learning and is investigated as well. The results are illustrated with an analysis of a feedforward NN and a training sample for a language with 1,000 words, including the effort for the training repetitions.
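The kind of effort estimate mentioned in the abstract can be illustrated with a minimal calculation. The sketch below counts the weights of a fully connected feedforward NN and models training effort as epochs × sample size × weight count; this cost model and the layer sizes are illustrative assumptions, not the paper's actual formula.

```python
def num_weights(layer_sizes):
    """Count weights (including biases) of a fully connected feedforward NN."""
    return sum((fan_in + 1) * fan_out
               for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]))

def training_effort(layer_sizes, sample_size, epochs):
    """Assumed cost model, O(epochs * sample_size * weights):
    one weight update per weight, per training example, per repetition."""
    return epochs * sample_size * num_weights(layer_sizes)

# Illustrative example: a small network over a 1,000-word vocabulary
# (layer sizes and epoch count are assumptions for the sketch).
layers = [1000, 100, 1000]  # input, hidden, output
effort = training_effort(layers, sample_size=1000, epochs=50)
print(f"weights = {num_weights(layers):,}, effort = {effort:,} updates")
```

Under these assumptions the 1,000-word example already yields roughly 2 × 10^5 weights, which shows how quickly the repetition count dominates the overall training effort.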
| Original language | English |
|---|---|
| Pages (from-to) | 338-346 |
| Number of pages | 9 |
| Journal | Advances in Artificial Intelligence and Machine Learning |
| Volume | 2 |
| Issue number | 2 |
| DOIs | |
| State | Published - 1 Jan 2022 |
| Externally published | Yes |
Keywords
- Betti numbers
- Dimension of NN
- NN
- Power of AI
- Training Repetitions of NN
- Training Sample
- Weak and strong AI
ASJC Scopus subject areas
- Artificial Intelligence