DISCONA: distributed sample compression for nearest neighbor algorithm

Jedrzej Rybicki, Tatiana Frenklach, Rami Puzis

Research output: Contribution to journal › Article › peer-review


Sample compression using an ε-net effectively reduces the number of labeled instances required for accurate classification with nearest neighbor algorithms. However, one-shot construction of an ε-net can be extremely challenging for large-scale distributed data sets. We explore two approaches for distributed sample compression: one where a local ε-net is constructed for each data partition and the nets are merged during an aggregation phase, and one where a single ε-net backbone is constructed from one partition and aggregates target label distributions from the other partitions. Both approaches are applied to the problem of malware detection in a complex, real-world data set of Android apps using the nearest neighbor algorithm. Examination of the compression rate, computational efficiency, and predictive power shows that a single ε-net backbone attains favorable performance while achieving a compression rate of 99%.
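The ε-net compression the abstract refers to can be illustrated with a simple greedy construction: a point is added to the net only if every point already kept lies farther than ε away, so the retained points cover the data set while pruning near-duplicates. The sketch below is illustrative only; the function name, greedy insertion order, and Euclidean metric are our assumptions, not necessarily the paper's exact algorithm or its distributed variants.

```python
import numpy as np

def greedy_epsilon_net(X, eps):
    """Greedy ε-net sketch: keep a point only if it is farther than
    eps from every previously kept point. Returns indices of the
    compressed sample. (Illustration; not the paper's exact method.)"""
    kept = []
    for i, x in enumerate(X):
        if all(np.linalg.norm(x - X[j]) > eps for j in kept):
            kept.append(i)
    return kept

# Toy data: five 1-D points, two tight clusters plus an outlier.
X = np.array([[0.0], [0.1], [1.0], [1.05], [2.0]])
net = greedy_epsilon_net(X, eps=0.5)
print(net)  # → [0, 2, 4]: five points compressed to three
```

A nearest neighbor classifier can then be trained on the kept indices alone; in the paper's distributed setting, such nets are either built per partition and merged, or one net serves as a backbone that accumulates label statistics from all partitions.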

Original language: English
Pages (from-to): 19976-19989
Number of pages: 14
Journal: Applied Intelligence
Issue number: 17
State: Published - 1 Sep 2023

Keywords

  • Big data
  • Distributed machine learning
  • Malware detection
  • Nearest neighbors
  • Sample compression

ASJC Scopus subject areas

  • Artificial Intelligence

