Objective and Self-reported Data Collection to Understand Screen Time Usage in Families

Master's Thesis

Introduction

  • The widespread adoption of digital technologies has sparked research interest among scholars in studying how people use them.
  • Self-reported data collection mechanisms are commonly used in behavioural and sociological studies, but their validity is debatable when studying technology use: we are poor at sensing how much time we spend on devices, and parents want to "look" good.

Research Questions

  • First, how can we objectively study the differences between self-reported screen time and logged screen time data, and how valid are the self-reports?

 

  • Second, how can we study this in the context of families, where the relationship between family members and digital devices is complex and devices are often shared?

 

  • Third, what factors affect how close the screen time estimates are to actual usage?

 

Methodology

Research methodology broken down for each research question

Comparing screen time data

  • For collecting the objective data, we will use a cross-platform Flutter app to collect ground-truth screen time data from the smartphone itself (a sketch of the underlying platform query follows).
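
As an illustrative sketch only, not the thesis app's actual code: the Android side of such a Flutter plugin could read per-app foreground time through the platform's UsageStatsManager API. The function name and the one-day window below are assumptions.

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context
import java.util.concurrent.TimeUnit

// Illustrative sketch: how the Android side of a Flutter plugin might read
// ground-truth usage. Requires the PACKAGE_USAGE_STATS special access, which
// the user must grant manually in system settings.
fun dailyScreenTimeMillis(context: Context): Long {
    val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
    val end = System.currentTimeMillis()
    val start = end - TimeUnit.DAYS.toMillis(1)

    // INTERVAL_DAILY returns per-app buckets; summing totalTimeInForeground
    // approximates the device's total screen time over the window.
    return usm.queryUsageStats(UsageStatsManager.INTERVAL_DAILY, start, end)
        .sumOf { it.totalTimeInForeground }
}
```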

 

 

For subjective screen time data, we created a survey of three questions (one possible response model is sketched after this list):

  • For how many hours does [member x] use their smartphone in a day? (response type: time)

  • What does [member x] mostly use their smartphone for? (work or leisure; this tests the hypothesis that a preconceived notion of what a family member usually does on their smartphone may be a relevant factor in how well family members estimate usage)

  • How many times in a day do you think [member x] picks up their smartphone?
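
A hedged sketch of how one response to these three items could be modelled; every name and unit below is an assumption for illustration, not the thesis's actual schema:

```kotlin
// Hypothetical record for one survey response about one family member.
enum class PrimaryUse { WORK, LEISURE }

data class ScreenTimeReport(
    val reporter: String,       // who filled in the survey
    val target: String,         // the [member x] being estimated
    val dailyUseMinutes: Int,   // Q1: estimated hours per day, stored as minutes
    val primaryUse: PrimaryUse, // Q2: work or leisure
    val pickupsPerDay: Int,     // Q3: estimated pickups per day
)
```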


  • Self-reported survey questions are tricky: participants mould their responses based on the wording of the question, the mode of answering it, and their demographics.

 

  • Ernala et al. (2020) found that open-ended questions have more error than closed-ended ones, but closed-ended questions decrease the accuracy of the data and are difficult to frame.


To overcome this problem, we set out to A/B test the question formats (a sketch of one possible assignment scheme follows).
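
The study's actual randomisation scheme is not detailed here, so the following is only a hypothetical sketch of a stable per-participant assignment; the enum values mirror the three formats compared in the results below:

```kotlin
// Hypothetical sketch of a stable A/B assignment: hashing the participant ID
// means each participant always sees the same question format.
enum class QuestionFormat { OPEN_ENDED, MCQ_SIX_OPTIONS, SLIDER_15_MIN }

fun assignFormat(participantId: String): QuestionFormat {
    val buckets = QuestionFormat.values()
    // floorMod keeps the index non-negative even when hashCode() is negative.
    return buckets[Math.floorMod(participantId.hashCode(), buckets.size)]
}
```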


Results

  1. Among the closed-ended multiple-choice variants, the majority of participants preferred the one with 6 options in 2-hour increments (52%).

  2. Among the slider variants, the majority of participants preferred a 15-minute step (76%).

  3. Overall, the preferred question type was the slider with a 15-minute step (47%), followed by the 6-option MCQ (43%) and the open-ended type (8%).


Family Context

For this, we decided to conduct the study with families in which every family member has a smartphone. The twist is that each family member is asked to report a screen time estimate for themselves as well as for every other family member; for a family of n members this gives us n^2 comparisons (see the sketch below).
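
A minimal sketch of those comparisons, assuming hypothetical data shapes (logged minutes per member and a reporter-by-target estimate matrix); none of these names come from the thesis:

```kotlin
// Each of the n family members estimates every member's screen time, so the
// estimates form an n x n matrix and yield n^2 (reporter, target) comparisons
// against the logged data.
data class Comparison(val reporter: String, val target: String, val errorMinutes: Int)

fun compareEstimates(
    loggedMinutes: Map<String, Int>,                 // target -> logged minutes
    estimateMinutes: Map<String, Map<String, Int>>,  // reporter -> (target -> estimate)
): List<Comparison> =
    estimateMinutes.flatMap { (reporter, row) ->
        row.map { (target, estimate) ->
            // Signed error: positive = over-estimate, negative = under-estimate.
            Comparison(reporter, target, estimate - (loggedMinutes[target] ?: 0))
        }
    }
```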


Factors affecting the estimates

We initially decided to use an experience jar probe to encourage family members to think about their own smartphone usage and that of others. We later pivoted to conducting a digital version of the study because of logistical issues in handling the probes.
