Abstract
Discrete random variables are essential ingredients in various artificial intelligence problems. These include estimating the probability of missing the deadline in a series-parallel schedule and assigning suppliers to tasks in a project so as to maximize the probability of meeting the overall project deadline. Solving such problems involves repetitive operations, such as summation, over random variables. However, these computations are NP-hard. Therefore, we explore methods for approximating random variables with a given support size and minimal Kolmogorov distance. We examine both the general problem of approximating a random variable and a one-sided version in which over-approximation is allowed but not under-approximation. We propose several algorithms and evaluate their performance through computational complexity analysis and empirical evaluation. All the presented algorithms are optimal in the sense that, given an input random variable and a requested support size, they return a new approximated random variable with the requested support size and minimal Kolmogorov distance from the input random variable. Our approximation algorithms offer useful estimations of probabilities in situations where exact computation is infeasible due to NP-hardness.
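As a rough illustration of the quantity being minimized, the sketch below computes the Kolmogorov (sup-norm) distance between the CDFs of a discrete random variable and a reduced-support candidate approximation. The distributions and variable names are hypothetical, and the approximation shown is hand-picked for the example rather than produced by the algorithms described in the article.

```python
def cdf(points, x):
    """Evaluate the CDF of a discrete random variable at x.

    `points` is a list of (value, probability) pairs sorted by value."""
    total = 0.0
    for v, p in points:
        if v <= x:
            total += p
        else:
            break
    return total


def kolmogorov_distance(px, py):
    """Sup-norm distance between the CDFs of two discrete random variables.

    Both CDFs are right-continuous step functions, so the supremum of their
    difference is attained at a support point of one of the two variables;
    it therefore suffices to check the union of the two supports."""
    xs = sorted({v for v, _ in px} | {v for v, _ in py})
    return max(abs(cdf(px, x) - cdf(py, x)) for x in xs)


# Hypothetical example: a 4-point random variable and a 2-point approximation.
original = [(1, 0.1), (2, 0.4), (3, 0.3), (4, 0.2)]
approx = [(2, 0.5), (4, 0.5)]  # support size reduced from 4 to 2
print(kolmogorov_distance(original, approx))  # 0.3
```

In a scheduling context, a distance of this kind bounds how much any probability estimate (e.g., the probability of meeting a deadline) computed from the approximation can deviate from the one computed from the original variable.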
| Original language | English |
|---|---|
| Article number | 104086 |
| Journal | Artificial Intelligence |
| Volume | 329 |
| DOIs | |
| State | Published - 1 Apr 2024 |
Keywords
- Data compression
- Deadline constraints
- Discrete random variables
- Kolmogorov approximation
- One-sided approximation
- Statistical distance measures
- Support size reduction
- Task scheduling
ASJC Scopus subject areas
- Language and Linguistics
- Linguistics and Language
- Artificial Intelligence