OneSketch: learning high-level shape features from simple sketches

Eyal Reisfeld, Andrei Sharf

Research output: Contribution to journal › Article › peer-review

Abstract

Humans use simple sketches to convey complex concepts and abstract ideas concisely. A few abstract pencil strokes can carry a large amount of semantic information and serve as a meaningful representation for many applications. In this work, we explore the power of simple human strokes to capture high-level 2D shape semantics. For this purpose, we introduce OneSketch, a crowd-sourced dataset of abstract one-line sketches depicting high-level 2D object features. To construct the dataset, we formulate a human sketching task whose goal is to differentiate between objects with a single minimal stroke. While humans are rather successful at depicting high-level shape semantics and abstraction, we investigate whether deep neural networks can convey such traits. We introduce a neural network that learns meaningful shape features from our OneSketch dataset. Essentially, the model learns sketch-to-shape relations and encodes them in an embedding space that reveals distinctive shape features. We show that our network differentiates and retrieves 2D objects from very simple one-stroke sketches with good accuracy.
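The abstract describes retrieving 2D objects by comparing a sketch against shapes in a learned embedding space. The paper's actual architecture and similarity measure are not given here; the following is a purely illustrative sketch of embedding-space retrieval, assuming cosine similarity and hypothetical, hand-picked 3-D embeddings (the vectors, labels, and function names are not from the paper).

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(sketch_embedding, shape_embeddings):
    # Return the label of the shape whose embedding is closest
    # to the sketch embedding under cosine similarity.
    return max(shape_embeddings,
               key=lambda label: cosine_similarity(sketch_embedding,
                                                   shape_embeddings[label]))

# Hypothetical embeddings for illustration only; in practice these
# would be produced by the trained sketch and shape encoders.
shapes = {
    "chair": [0.9, 0.1, 0.0],
    "lamp":  [0.1, 0.9, 0.2],
}
sketch = [0.8, 0.2, 0.1]  # embedding of a one-stroke sketch
print(retrieve(sketch, shapes))  # prints "chair"
```

In such a setup, retrieval accuracy depends entirely on how well the encoders place one-stroke sketches near the shapes they depict, which is what the learned sketch-to-shape relations provide.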

Original language: English
Journal: Visual Computer
State: Accepted/In press - 1 Jan 2022

Keywords

  • High-level shape features
  • Partial representations
  • Sketch-based retrieval

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Computer Graphics and Computer-Aided Design
