Rochester Institute of Technology Research Study
Beyond the SmartWatch:
A Design Space Exploration of Near-Wrist Bioacoustic On-Skin Interactions
Interacting with a small smartwatch touchscreen is awkward. What if instead, you could perform gestures using your palm, hand, fingers, or forearm?
We are recruiting participants for a research study exploring futuristic on-skin gesture interactions for smartwatch wearables using an emerging technique called bioacoustic sensing. Your participation will help design and evaluate a new way to interact with technology.
Screener takes less than 5 minutes
Eligibility
Who Can Participate?
Please review the criteria below to see if you may be eligible for this study.
Inclusion Criteria
- Must be at least 18 years of age
- Must have vision (with or without correction) sufficient to read small text on smartwatch displays
- Must have full dexterity of both hands and all ten digits to perform the full set of gestures
- Must be able to attend in-person sessions at Rochester Institute of Technology
Exclusion Criteria
- Individuals under 18 years of age
- Individuals who lack full dexterity of both hands and all ten digits
Compensation
Compensation for Your Time
As a thank you for your participation, you will receive compensation for each phase of the study you complete.
Gesture Elicitation & Training
$20 reimbursement upon completion (~1.5 hours in person)
Model Validation
$10 reimbursement upon completion (30 min – 1 hour in person)
Prototype Evaluation
$10 reimbursement upon completion (~1 hour in person)
You do not need to participate in all three sessions. You will be compensated for each session you complete.
About This Research
About the Study
Learn more about what this research is about and who is behind it.
Study Purpose
This study designs and evaluates on-skin bioacoustic gesture interactions for smartwatch wearables. Using sensors that detect tiny vibrations, we are exploring how gestures on your palm, wrist, and forearm can provide a much larger and more intuitive interaction surface than a small touchscreen.
Who Is Conducting This Study
This study is being conducted by Sidney Grabosky, an M.S. Human-Computer Interaction student in the School of Information at Rochester Institute of Technology, under the supervision of Dr. Garreth Tigwell, Associate Professor.
IRB Approval
This study has been reviewed and approved by the Rochester Institute of Technology Institutional Review Board (IRB). Protocol #01112425. Your rights as a research participant are protected.
Study Phases
What to Expect
This study has three sessions. You may participate in one or more — here's what each involves.
Gesture Elicitation & Training
Use your creativity to help design on-skin gestures for smartwatch interactions, then participate in a machine learning data collection session. You'll wear an Apple Watch running prototype software throughout. Approximately 1.5 hours in person.
Model Validation
Follow instructions on a laptop application to perform a series of gestures, helping us validate whether the machine learning classifier can accurately recognize each gesture. Between 30 minutes and 1 hour in person.
Prototype Evaluation
Evaluate an Apple Watch prototype that uses on-skin bioacoustic gestures for common tasks like scrolling, zooming, and panning. You'll share your thoughts aloud and answer questions about comfort, intuitiveness, and social acceptability. Approximately 1 hour in person.
Ready to Participate?
Your participation will contribute to important academic research. Click below to begin the screening process.
Takes less than 5 minutes
Questions? Contact Sidney Grabosky at