TY - GEN
T1 - Multi-Stage Active Sequential Hypothesis Testing with Clustered Hypotheses
AU - Vershinin, George
AU - Cohen, Asaf
AU - Gurewitz, Omer
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - We consider the problem where an active Decision-Maker (DM) is tasked to identify the true hypothesis using as few observations as possible while maintaining accuracy. The DM collects observations according to its determined actions and knows the distributions under each hypothesis. We propose a deterministic and adaptive multi-stage hypothesis-elimination strategy where the DM selects an action, applies it repeatedly, and discards hypotheses in light of its obtained observations. The DM selects actions based on maximal separation, expressed by the distance between the parameter vectors of the distributions under each hypothesis. Close distributions can be clustered, simplifying the search and significantly reducing the number of required observations. Our algorithm achieves vanishing Average Bayes Risk (ABR) as the error probability approaches zero, i.e., it is asymptotically optimal. Furthermore, we show that the ABR remains bounded as the number of hypotheses grows. Simulations are carried out to evaluate the algorithm's performance compared to another multi-stage hypothesis-elimination algorithm, where an improvement of several orders of magnitude in the mean number of required observations is observed.
AB - We consider the problem where an active Decision-Maker (DM) is tasked to identify the true hypothesis using as few observations as possible while maintaining accuracy. The DM collects observations according to its determined actions and knows the distributions under each hypothesis. We propose a deterministic and adaptive multi-stage hypothesis-elimination strategy where the DM selects an action, applies it repeatedly, and discards hypotheses in light of its obtained observations. The DM selects actions based on maximal separation, expressed by the distance between the parameter vectors of the distributions under each hypothesis. Close distributions can be clustered, simplifying the search and significantly reducing the number of required observations. Our algorithm achieves vanishing Average Bayes Risk (ABR) as the error probability approaches zero, i.e., it is asymptotically optimal. Furthermore, we show that the ABR remains bounded as the number of hypotheses grows. Simulations are carried out to evaluate the algorithm's performance compared to another multi-stage hypothesis-elimination algorithm, where an improvement of several orders of magnitude in the mean number of required observations is observed.
UR - https://www.scopus.com/pages/publications/105021996349
U2 - 10.1109/ISIT63088.2025.11195532
DO - 10.1109/ISIT63088.2025.11195532
M3 - Conference contribution
AN - SCOPUS:105021996349
T3 - IEEE International Symposium on Information Theory - Proceedings
BT - ISIT 2025 - 2025 IEEE International Symposium on Information Theory, Proceedings
PB - Institute of Electrical and Electronics Engineers
T2 - 2025 IEEE International Symposium on Information Theory, ISIT 2025
Y2 - 22 June 2025 through 27 June 2025
ER -