Abstract
Sample compression using ε-nets effectively reduces the number of labeled instances required for accurate classification with nearest neighbor algorithms. However, one-shot construction of an ε-net can be extremely challenging for large-scale distributed data sets. We explore two approaches to distributed sample compression: one in which a local ε-net is constructed for each data partition and the local nets are merged during an aggregation phase, and one in which a single backbone ε-net is constructed from one partition and aggregates the target label distributions of the remaining partitions. Both approaches are applied to the problem of malware detection in a complex, real-world data set of Android apps using the nearest neighbor algorithm. Examination of the compression rate, computational efficiency, and predictive power shows that the single-backbone ε-net attains favorable performance while achieving a compression rate of 99%.
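The two strategies described above can be illustrated with a short sketch. The code below is not the paper's implementation; it is a minimal illustration assuming Euclidean distance, a simple greedy ε-net construction, and hypothetical helper names (`build_eps_net`, `merged_eps_net`, `backbone_eps_net`).

```python
import numpy as np

def build_eps_net(points, eps):
    """Greedy ε-net sketch: keep a point only if it lies farther than eps
    from every point already selected (quadratic-time illustration)."""
    net_idx = []
    for i, p in enumerate(points):
        if all(np.linalg.norm(p - points[j]) > eps for j in net_idx):
            net_idx.append(i)
    return net_idx

def merged_eps_net(partitions, eps):
    """Approach 1: build a local ε-net per partition, then merge the local
    nets by re-running the construction on their union."""
    local_nets = [part[build_eps_net(part, eps)] for part in partitions]
    union = np.vstack(local_nets)
    return union[build_eps_net(union, eps)]

def backbone_eps_net(partitions, labels, eps, n_classes):
    """Approach 2: build a single backbone ε-net from the first partition and
    aggregate label counts from all partitions onto its representatives."""
    backbone = partitions[0][build_eps_net(partitions[0], eps)]
    counts = np.zeros((len(backbone), n_classes), dtype=int)
    for part, lab in zip(partitions, labels):
        # assign each point to its nearest backbone representative
        dists = np.linalg.norm(part[:, None, :] - backbone[None, :, :], axis=2)
        nearest = dists.argmin(axis=1)
        for rep, y in zip(nearest, lab):
            counts[rep, y] += 1
    return backbone, counts
```

In this sketch, the merged variant compresses every partition before aggregation, while the backbone variant keeps only one partition's net and turns the other partitions into per-representative label distributions, which is what enables the very high compression rate reported in the abstract.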
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 19976-19989 |
| Number of pages | 14 |
| Journal | Applied Intelligence |
| Volume | 53 |
| Issue number | 17 |
| DOIs | |
| State | Published - 1 Sep 2023 |
Keywords
- Big data
- Distributed machine learning
- Malware detection
- Nearest neighbors
- Sample compression
ASJC Scopus subject areas
- Artificial Intelligence