When Bots Take Over the Stock Market: Evasion Attacks Against Algorithmic Traders

Elior Nehemya, Yael Mathov, Asaf Shabtai, Yuval Elovici

Research output: Working paper/Preprint


Abstract

In recent years, machine learning has become prevalent in numerous tasks, including algorithmic trading. Stock market traders use learning models to predict the market's behavior and execute an investment strategy accordingly. However, learning models have been shown to be susceptible to input manipulations called adversarial examples. Yet the trading domain remains largely unexplored in the context of adversarial learning, mainly due to the rapid changes in the market, which impair the attacker's ability to create a real-time attack. In this study, we present a realistic scenario in which an attacker gains control of an algorithmic trading bot by manipulating the input data stream in real time. The attacker crafts a universal perturbation that is agnostic to the target model and the time of use, while also remaining imperceptible. We evaluate our attack on a real-world market data stream and target three different trading architectures. We show that our perturbation can fool the model on future, unseen data points, in both white-box and black-box settings. We believe these findings should serve as an alert to the finance community about the threats in this area and prompt further research on the risks associated with using automated learning models in the finance domain.
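The abstract describes a universal perturbation: a single, reusable modification to the input data stream that fools the trading model regardless of when it is applied. The sketch below illustrates the general idea of crafting such a targeted universal perturbation with gradient descent over historical data windows; the paper's actual algorithm, model architecture, data format, and all names used here (trading_model, windows, target_action, EPS) are assumptions, not the authors' method.

```python
# Illustrative sketch only: crafting one small perturbation (delta) that is
# reused on every future market-data window to push the bot toward a chosen action.
import torch
import torch.nn.functional as F

EPS = 0.01    # max per-feature change, keeping the perturbation imperceptible (assumed)
STEP = 1e-3   # gradient step size (assumed)
EPOCHS = 20   # passes over the historical windows (assumed)

def craft_universal_perturbation(trading_model, windows, target_action):
    """trading_model : maps a (T, F) market-data window to action logits (e.g., buy/hold/sell)
    windows          : tensor of historical windows, shape (N, T, F)
    target_action    : class index the attacker wants the bot to output
    """
    delta = torch.zeros_like(windows[0], requires_grad=True)
    for _ in range(EPOCHS):
        for w in windows:
            logits = trading_model((w + delta).unsqueeze(0))
            loss = F.cross_entropy(logits, torch.tensor([target_action]))
            loss.backward()
            with torch.no_grad():
                # step toward the target class, then project back into the
                # epsilon-ball so the perturbation stays small
                delta -= STEP * delta.grad.sign()
                delta.clamp_(-EPS, EPS)
            delta.grad.zero_()
    return delta.detach()

# At attack time, the same delta is added to every live window the bot receives:
# manipulated_window = live_window + delta
```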
Original language: English
DOIs
State: Published - 19 Oct 2020

Keywords

  • q-fin.TR
  • cs.CR
  • cs.LG

