Data@Hand

Data@Hand is a cross-platform smartphone app that facilitates visual data exploration leveraging both speech and touch interactions. Data visualization is a common way that mobile health apps enable people to explore their data on smartphones. However, due to smartphones’ limitations, such as small screen size and the lack of precise pointing input, these apps provide limited support for visual data exploration, with over-simplified time navigation, even though time is a primary dimension of self-tracking data. Data@Hand leverages the synergy of speech and touch: speech-based interaction takes little screen space, and natural language is flexible enough to cover the different ways of specifying dates and date ranges (e.g., “October 7th”, “Last Sunday”, “This month”). Currently, Data@Hand supports displaying Fitbit data (e.g., step count, heart rate, sleep, and weight) for navigation and temporal comparison tasks.
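
To give a concrete sense of that flexibility, the TypeScript sketch below (illustrative only, not the app’s actual parser; all names are hypothetical) shows how a few spoken expressions could be resolved into concrete date ranges for chart navigation.

// Hypothetical sketch: resolving a few spoken date expressions into date ranges.
// Data@Hand's real natural-language pipeline is richer; this only illustrates the idea.
type DateRange = { start: Date; end: Date };

const MONTHS = ["january", "february", "march", "april", "may", "june",
  "july", "august", "september", "october", "november", "december"];

function resolveDateUtterance(utterance: string, today: Date = new Date()): DateRange | null {
  const text = utterance.trim().toLowerCase();

  if (text === "this month") {
    // First through last day of the current month.
    const start = new Date(today.getFullYear(), today.getMonth(), 1);
    const end = new Date(today.getFullYear(), today.getMonth() + 1, 0);
    return { start, end };
  }

  if (text === "last sunday") {
    // Walk back to the most recent Sunday strictly before today.
    const day = new Date(today);
    do {
      day.setDate(day.getDate() - 1);
    } while (day.getDay() !== 0);
    return { start: day, end: day };
  }

  // e.g., "october 7th" -> a single-day range in the current year.
  const match = text.match(/^([a-z]+)\s+(\d{1,2})(?:st|nd|rd|th)?$/);
  if (match && MONTHS.includes(match[1])) {
    const day = new Date(today.getFullYear(), MONTHS.indexOf(match[1]), parseInt(match[2], 10));
    return { start: day, end: day };
  }

  return null; // Unrecognized expression.
}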

For more information about this project, please visit https://data-at-hand.github.io.

Related Research Paper (Describes the design and a user study)

Data@Hand: Fostering Visual Exploration of Personal Data on Smartphones Leveraging Speech and Touch Interaction
[Best Paper Honorable Mention Award]
Young-Ho Kim, Bongshin Lee, Arjun Srinivasan, and Eun Kyoung Choe
ACM CHI 2021 (PDF)

How to build & run

System Overview

Data@Hand is a stand-alone application that does not require a backend server. The app communicates with the Fitbit server and fetches the data locally on the device.
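
For illustration, a request of the kind the app issues directly from the device might look like the following TypeScript sketch. The endpoint path follows Fitbit’s public Web API; the function name and error handling are assumptions, not the app’s actual code.

// Illustrative only: fetch one day of step counts straight from the Fitbit Web API
// using an OAuth 2.0 access token; the response is processed locally on the device.
async function fetchDailySteps(accessToken: string, date: string): Promise<unknown> {
  const url = `https://api.fitbit.com/1/user/-/activities/steps/date/${date}/1d.json`;
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!response.ok) {
    throw new Error(`Fitbit API request failed with status ${response.status}`);
  }
  return response.json();
}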

Acquire Fitbit API Key

  1. Register an app on the Fitbit developer page https://dev.fitbit.com/apps/new.

    1. Select Client for OAuth 2.0 Application Type.
    2. Use a URL similar to edu.umd.hcil.data-at-hand://oauth2/redirect for the Callback URL. This URL will be used locally on your device.
  2. Data@Hand leverages Fitbit's Intraday API, for which you must explicitly request approval from Fitbit: https://dev.fitbit.com/build/reference/web-api/intraday-requests/.

  3. In the credentials directory in the repository, copy fitbit.example.json and rename the copy to fitbit.json.

  4. Fill in the information accordingly. You can find it under Manage My Apps on the Fitbit developer page.

{
  "client_id": "YOUR_FITBIT_ID", // <- OAuth 2.0 Client ID 
  "client_secret": "YOUR_FITBIT_SECRET", // <- Client Secret
  "redirect_uri": "YOUR_REDIRECT_URI" // <- Callback URL
}
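
As a rough illustration of how these three values are used, the sketch below builds a standard Fitbit OAuth 2.0 authorization URL. The function name and the scope list are assumptions for illustration, not the app’s actual auth module.

// Illustrative sketch: mapping the fitbit.json values onto Fitbit's standard
// OAuth 2.0 authorization request.
function buildFitbitAuthorizationUrl(clientId: string, redirectUri: string): string {
  const query = [
    ["response_type", "code"],     // authorization-code flow for the "Client" app type
    ["client_id", clientId],       // client_id from fitbit.json
    ["redirect_uri", redirectUri], // redirect_uri from fitbit.json (the Callback URL)
    ["scope", "activity heartrate sleep weight profile"],
  ]
    .map(([key, value]) => `${key}=${encodeURIComponent(value)}`)
    .join("&");
  return `https://www.fitbit.com/oauth2/authorize?${query}`;
}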

(Android Only) Acquire Microsoft Cognitive Speech API Key

  1. Register a Microsoft Cognitive Services Speech-to-Text resource (a free tier is available): https://azure.microsoft.com/en-us/services/cognitive-services/speech-to-text/.
  2. In the credentials directory in the repository, copy microsoft_cognitive_service_speech.example.json and rename the copy to microsoft_cognitive_service_speech.json.
  3. Fill in the information accordingly. You need the subscription ID and the region.
{
  "subscriptionId": "YOUR_SUBSCRIPTION_ID",
  "region": "YOUR_AZURE_REGION" // <- Depending on the region you set. e.g., "eastus"
}
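
For reference only, this is roughly how a subscription key and region configure the JavaScript Speech SDK (microsoft-cognitiveservices-speech-sdk); the app’s Android build uses its own speech module, so treat this as an illustration of what the two credential fields mean rather than the exact code path. It assumes subscriptionId holds the resource’s subscription key.

// Sketch with the JavaScript Speech SDK; assumes "subscriptionId" is the
// Speech resource's subscription key and "region" is e.g. "eastus".
import {
  AudioConfig,
  SpeechConfig,
  SpeechRecognizer,
} from "microsoft-cognitiveservices-speech-sdk";

function createSpeechRecognizer(subscriptionId: string, region: string): SpeechRecognizer {
  const speechConfig = SpeechConfig.fromSubscription(subscriptionId, region);
  speechConfig.speechRecognitionLanguage = "en-US";
  const audioConfig = AudioConfig.fromDefaultMicrophoneInput();
  return new SpeechRecognizer(speechConfig, audioConfig);
}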

(Optional) If you want to track exceptions, register Bugsnag.

  1. Create a Bugsnag project and get the API key: https://www.bugsnag.com/.
  2. In the credentials directory in the repository, copy bugsnag.example.json and rename the copy to bugsnag.json.
  3. Fill in the information accordingly.
{
  "api_key": "YOUR_BUGSNAG_API_KEY"
}
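
A minimal sketch of how such a key is typically consumed, using Bugsnag’s generic JavaScript client API; the repository’s own wiring may differ, and the React Native package usually reads the key from native config files instead.

// Sketch only, using the @bugsnag/js-style API: start Bugsnag once at app launch
// with the API key from bugsnag.json.
import Bugsnag from "@bugsnag/js";

export function initErrorReporting(apiKey: string): void {
  Bugsnag.start({ apiKey }); // Enables automatic exception reporting for the session.
}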

Compile Data@Hand

Install Node.js on your system.

Install react-native CLI:

> npm install -g @react-native-community/cli

Install dependencies (in the repository root, where package.json is located):

> npm i

Run on iOS:

If you have not used CocoaPods before, install it once:

> sudo gem install cocoapods

Install the iOS project dependencies:

> cd ios
> pod install

Run on iOS:

> react-native run-ios

Run on Android:

> react-native run-android

Third-party Services Used


Research Team Members

Young-Ho Kim (Website)
Postdoctoral Associate
University of Maryland, College Park
*Contact for code and implementation

Bongshin Lee (Website)
Sr. Principal Researcher
Microsoft Research

Arjun Srinivasan (Website)
Research Scientist
Tableau Research
*Arjun did this work while at Georgia Institute of Technology

Eun Kyoung Choe (Website)
Associate Professor
University of Maryland, College Park


Acknowledgment

This work was in part supported by National Science Foundation award #1753452 (CAREER: Advancing Personal Informatics through Semi-Automated and Collaborative Tracking).


License

Source Code

MIT License

Original Design Resources including Logos and Assets

CC BY 4.0