Inkeraction: An Interaction Modality Powered by Ink Recognition and Synthesis

Lei Shi, R. Campbell, Peggy Chi, Maria Cirimele, Mike Cleron, Kirsten Climer, Chelsey Q Fleming, Ashwin Ganti, Philippe Gervais, Pedro Gonnet, T. A. Karim, Andrii Maksai, Chris Melancon, Rob Mickle, C. Musat, Palash Nandy, Xiaoyu Iris Qu, David Robishaw, Angad Singh, Mathangi Venkatesan

Published 2024 in International Conference on Human Factors in Computing Systems

ABSTRACT

Ink is a powerful medium for note-taking and creative tasks. Multi-touch devices and stylus input have made digital ink editable and searchable. To extend the capabilities of digital ink, we introduce Inkeraction, an interaction modality powered by ink recognition and synthesis. Inkeraction segments and classifies digital ink objects (e.g., handwriting and sketches), identifies relationships between them, and generates strokes in different writing styles. Inkeraction reshapes the design space for digital ink by enabling features that include: (1) assisting users in manipulating ink objects, (2) providing word-processor features such as spell checking, (3) automating repetitive writing tasks such as transcribing, and (4) bridging to generative-model features such as brainstorming. Feedback from two user studies with a total of 22 participants demonstrated that Inkeraction supported writing activities by enabling participants to write faster with fewer steps and to achieve better writing quality.
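The abstract describes a recognition pipeline: segment strokes into ink objects, classify each object (e.g., handwriting vs. sketch), then operate on the results. A minimal sketch of that data flow is shown below; the class names, the proximity-based segmentation, and the stroke-length classification rule are illustrative assumptions for exposition, not the paper's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class Stroke:
    """A single pen stroke as a list of (x, y) points."""
    points: list


@dataclass
class InkObject:
    """A group of strokes with a recognized label."""
    strokes: list
    label: str = "unknown"


def segment(strokes, gap=50.0):
    """Group strokes into objects by horizontal proximity (toy heuristic)."""
    objects, current, last_x = [], [], None
    for s in strokes:
        start_x = s.points[0][0]
        if last_x is not None and start_x - last_x > gap:
            objects.append(InkObject(current))
            current = []
        current.append(s)
        last_x = max(p[0] for p in s.points)
    if current:
        objects.append(InkObject(current))
    return objects


def classify(obj):
    """Toy rule: objects made of short strokes look like handwriting,
    long continuous strokes look like a sketch."""
    avg_points = sum(len(s.points) for s in obj.strokes) / len(obj.strokes)
    obj.label = "handwriting" if avg_points < 10 else "sketch"
    return obj
```

For example, two short strokes near the origin followed by one long stroke far to the right would segment into two objects, labeled `handwriting` and `sketch` respectively under this toy rule.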
PUBLICATION RECORD
- Publication year: 2024
- Venue: International Conference on Human Factors in Computing Systems
- Publication date: 2024-05-11
- Fields of study: Computer Science
- Source metadata: Semantic Scholar