
2017 9th International Conference on Communication Systems and Networks (COMSNETS)

An Experimental Wearable IoT for Data-driven Management of Autism


Yan Shi1, Saptarshi Das1, Sarah Douglas2, and Subir Biswas1
Electrical and Computer Engineering1, Human Development and Family Studies2
Michigan State University, East Lansing, USA
Abstract—This paper reports early developments on a wearable IoT system that can be used for collecting quantified data about interactions among children with autism spectrum disorders (ASD) in classroom settings. The overall objective of the project is to perform data-driven detection, therapy, intervention, and progress monitoring for children with autism. Early intervention can help support the development of social skills for children with ASD. To perform intervention, teachers monitor the social interaction progress of children with ASD through observational methods, which can often be lengthy and subjective, therefore requiring extensive teacher training to ensure accuracy. Although new technologies have begun to emerge, no existing technology provides quantitative and automatically collected data about peer interactions for children with ASD. We address this critical need through a wearable sensor system that can provide real-time quantitative data about peer interactions in classroom settings. The long-term objective of the project is to use this system for data-driven detection, therapy, intervention, and progress monitoring for children with autism.

Index Terms—Autism; Internet of Things; Interaction Detection; Data-driven Therapy; Intervention.

I. INTRODUCTION

Autism and the Importance of Early Intervention: Children with autism spectrum disorders (ASD) have difficulty establishing, maintaining, and engaging in social interactions [1]. These impairments often lead to loneliness and a lack of friendship with peers. However, early intervention can improve social and developmental outcomes and reduce overall support needs [2]. To support the social skill development of children with ASD, educators must utilize research-based interventions, as required by federal law, and collect ongoing data to inform decision-making [3]. Thus, schools, including early childhood programs, play a vital role in providing intervention services to the nearly 500,000 children with ASD receiving public educational services [4]. With ASD on the rise, now impacting 1 in every 68 children in the United States [5], there is a critical need to develop tools that can effectively and efficiently monitor child social interaction.

Challenges Measuring Social Interaction: Teacher documentation of social skill development requires the use of observational methods [6] and is essential to monitoring and sharing student progress. Unfortunately, observational methods are time-consuming, subjective, and pull teachers away from instructional time with students. Furthermore, research suggests that teachers require extensive training to conduct accurate observations [7] and prefer assessment methods that are brief and do not detract from instructional time.

Technology to Measure Social Interaction: Technologies have begun to emerge to reduce the challenges early childhood educators face as they collect observational data about child social development. Mobile applications (apps) are now available to aid educators as they collect data in the classroom [8], but these still require direct observation and manual data entry from teachers, which can be time-consuming, subjective, and impractical in many early childhood classrooms [9]. Automated technologies have also emerged, such as the LENA system [10], which was developed to evaluate adult support within the language environment and provide information about language development. However, there is an urgent need for tools that can aid teachers in measuring peer interactions of children with ASD. Yet, to date there are no existing technologies that can provide fully automated and quantitative data about the peer interactions of children with ASD. The objective of this work is to bridge that technology gap using a wearable Internet of Things (IoT) for social interaction monitoring within classroom settings.

II. WEARABLE IOT FOR INTERACTION MONITORING

Using funding from NASA [11], the authors have developed a wearable system for measuring pair-wise human interaction between individuals. When two people, each wearing a nametag-sized sensor badge (i.e., the IoT device), interact, their interaction dynamics are tracked and quantitatively captured by the sensors for subsequent wireless upload to an access point connected to a notebook computer. The badge can be worn in various ways, including as a nametag or integrated within clothing, depending on the target population and application. The system provides multi-modal sensing data including: (a) face-to-face time, the amount of time two individuals face each other; (b) proximity, the physical distance between the two participants when they are facing each other; and (c) activity level, the physical movement of participants. Measurements are taken using ultrasound sonar for face-to-face time and relative proximity, and an accelerometer for physical activity level.

978-1-5090-4250-0/17/$31.00 ©2017 IEEE
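The three sensing modalities described in Section II suggest a simple per-badge sample record. The sketch below is a hypothetical layout with illustrative field names; the paper does not publish its actual data format:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BadgeSample:
    """One time-stamped reading from a sensor badge (hypothetical layout;
    the system's real on-air format is not published)."""
    badge_id: str                 # wearer identifier, e.g. "green"
    timestamp_s: float            # seconds since session start
    peer_id: Optional[str]        # badge detected in the ultrasound cone, if any
    distance_ft: Optional[float]  # ultrasound-ranged distance to that peer
    activity: float               # accelerometer-derived activity level

# Example: the green badge sees the yellow badge 2.5 ft away, with low movement.
sample = BadgeSample("green", 80.0, "yellow", 2.5, 0.4)
```

Records of this shape, uploaded to the access point, are what the base-station software would aggregate into the interaction timelines discussed below.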



Fig. 1: Wearable IoT system and its processing architecture. (The figure shows wearable sensor badges in a classroom environment. Sensor data processing produces an automated interaction timeline, while a video camera feed is manually coded into a second interaction timeline; the two are cross-validated. A base station collects the badge data for visualization and analysis, including per-dyad distance distributions.)

Data from these modalities are combined in software to understand pair-wise human interaction in real-time. For example, by combining the face-to-face proximity and physical movement data, we can identify the initiator and terminator of an interaction within a given dyad (i.e., a pair).

III. QUANTITATIVE APPROACH TO MONITORING

As shown in Fig. 1, the sensor system consists of sensor badges worn by child and teacher participants in the pocket of a custom-made T-shirt (i.e., the sensor shirt in Fig. 1) and a wireless base station (i.e., access point), which collects data from the badges. A PC-based software user interface provides real-time visualization of raw and various forms of processed data to be observed by the teacher in the classroom, as well as by others outside the classroom over the Internet. The user interface provides line and bar graphs of face-to-face time, proximity, and physical movements. Collected data are also archived for post-processing. The sensor system allows children to participate in the typical range of classroom activities while data about the interactions between participants are captured unobtrusively.

One notable innovation of the system is its ability to capture face-to-face time (i.e., the amount of time a dyad faces each other) and proximity (i.e., the distance between a dyad during face-to-face time), which are not available in existing commercial or research-grade devices. Using a 40 kHz ultrasound signal, each badge can detect all other badges that are within an approximately 220-degree cone directly in front. It can also detect the distance of those badges from the Time Difference of Arrival (TDOA) between a 400 MHz radio signal and the ultrasound signal.

The directionality pattern of the ultrasound sensors in the badge is illustrated in Fig. 2. It shows that a badge can detect all individuals within a 220-degree cone to the front of the individual. Since the badges are worn directly facing forward (see Fig. 1), the ultrasound sensor is able to capture true face-to-face time between the individuals in a dyad. When face-to-face time occurs within a dyad, the badges measure their pairwise distance. When two individuals are oriented face-to-face with a relative proximity less than a certain software-set threshold (e.g., 3 feet), an interaction is registered. By combining the activity levels of the individual badges with relative face-to-face time and proximity, it is possible to determine which individual within a dyad is responsible for initiating and terminating an interaction as defined above.

Considering the badges' small size (9 cm x 7 cm x 1.5 cm) and weight (65 grams), with no need for wired connectivity, they are practical and safe from a wearability standpoint. Placing the badge within T-shirt pockets in our preliminary study created no visible or reported discomfort for the preschool children. Since all the components are completely sealed within the badge, no electrical or mechanical hazards are posed to the study participants.
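The ranging and interaction-registration steps described in this section can be sketched as follows. The 3 ft threshold comes from the paper; the physical constants, the example TDOA value, and the initiator heuristic are illustrative assumptions, not the system's actual firmware:

```python
# Sketch of the badge-side logic: range a peer from the radio/ultrasound TDOA,
# register an interaction below the software-set proximity threshold, and
# attribute initiation to the more active badge at onset (illustrative heuristic).

SPEED_OF_SOUND_M_S = 343.0  # approximate ultrasound speed in air at room temp
FT_PER_M = 3.28084
THRESHOLD_FT = 3.0          # software-set interaction threshold (from the paper)

def tdoa_distance_ft(tdoa_s: float) -> float:
    """Distance from the time difference of arrival between the (effectively
    instantaneous) 400 MHz radio burst and the 40 kHz ultrasound pulse."""
    return tdoa_s * SPEED_OF_SOUND_M_S * FT_PER_M

def interaction_registered(face_to_face: bool, distance_ft: float) -> bool:
    """An interaction is registered when the dyad is face-to-face and closer
    than the software-set threshold."""
    return face_to_face and distance_ft < THRESHOLD_FT

def likely_initiator(activity_a: float, activity_b: float) -> str:
    """Illustrative heuristic: the badge showing more movement at onset is
    taken as the interaction's initiator."""
    return "A" if activity_a >= activity_b else "B"

# An ultrasound pulse arriving 2.5 ms after the radio burst is about 2.81 ft away,
# so a face-to-face dyad at that range registers as interacting.
d = tdoa_distance_ft(0.0025)
registered = interaction_registered(True, d)
```

The design choice worth noting is that the radio burst serves only as a shared time reference: because radio propagation delay is negligible over room-scale distances, the ultrasound arrival delay alone carries the range information.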


Real-time analytics for intervention: The sensor system can produce child-specific interaction analytics and reports in real-time so that a teacher can provide immediate and individualized intervention to children with ASD. Using information such as interaction duration, initiations, and terminations, individualized statistics can be created at different time-scales. This brings an unprecedented level of quantitative visibility to interactions in a classroom with many children with ASD, enabling scalable interventions. Although interventions will not be explored in this particular project, the analytics capabilities will be thoroughly leveraged in the future for realizing our long-term vision of more efficient interventions using real-time sensor data.

IV. EXPERIMENTAL SETTINGS

A study was conducted in a preschool classroom at an early childhood center in the Midwestern United States. The classroom was equipped with ceiling-mounted cameras for video recording purposes. The study involved a teacher in a preschool classroom with three-year-olds. Three children, who were recruited through the early childhood center director, also participated. Parental consent was obtained for all children, and each child provided verbal assent to participate before badges were placed on them. Two boys and one girl participated, all three years of age. Study dyads included child-child pairs to measure peer interaction and teacher-child pairs to measure child-teacher interaction. Relative face-to-face time and proximity data from the badges (worn in the pocket of a custom-made colored T-shirt) were collected for all the child-child and child-teacher dyads.

Fig. 3 provides a representative sample of data collected in the experiment from both video and the sensor. The data represents activity of the green-yellow dyad and their interactions as extracted from the data between 15 min and 24 min of a session. In Fig. 3, distance represents the distance reading between the sensors worn by the dyad. The on-screen time and interaction (HLI/LLI) are hand-coded from video data. The on-screen time bar extends from 15 to 21 min and approximately 22 to 24 min, indicating that both members of the dyad were viewable in the video during these time periods.

Fig. 3: Interaction data sample (green-yellow dyad). (The plot shows the dyad's sensor-measured distance over time together with the video-coded on-screen time and interaction bars, marking periods of agreement, disagreement, and times when the dyad was not on camera.)

It can be seen in Fig. 3 that from 17 min 0 sec to 17 min 40 sec, and from approximately 18 min 30 sec to 20 min 45 sec, the video recording and sensor data agree that the dyad was engaging in an interaction. From approximately 17 min 40 sec to 18 min 30 sec, and 22 min 15 sec to 23 min 15 sec, the dyad was engaging in an interaction with each other, but the sensor did not capture the interaction. Review of the video data during these time periods revealed that one of the dyad members was sitting on the floor with his knees folded against his chest, thereby blocking the line-of-sight of the sensors. Such blockages contribute to the sensor data inaccuracies reported in Table 1.

Table 1: Match rate between sensor data and video-coded data

Dyad          | S✓V✓ | S✗V✗ | S✓V✗ | S✗V✓ | Type          | Match
Yellow-Black  | 1521 |   33 |    0 |  258 | Teacher-Child | 86%
Green-Yellow  | 1880 |  595 |  400 |  101 | Child-Child   | 83%
Yellow-Orange |   22 | 1737 |  130 |  278 | Child-Child   | 81%
Green-Orange  |  587 |  877 |  216 |  155 | Child-Child   | 80%

The face-to-face proximity distribution for the same green-yellow dyad is shown in the bottom-right part of Fig. 1. These distribution graphs show representative ways of providing interaction data to teachers for possible data-driven interventions in the future.

An interaction for each dyad was also analyzed via video using Datavyu software [12] and a behavioral definition of interaction [13] (i.e., the dyad was within 3 feet of each other and talking, sharing items, playing together, trying to gain the attention of the other person in the dyad, or engaging in the same activity). Interaction instances extracted from the sensor data were compared with interactions observed from video coding. Percent matches are shown in Table 1.
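The match rates in the final column of Table 1 can be reproduced from the per-dyad sample counts, taking a "matched" sample to be one on which sensor and video coding agree (S✓V✓ or S✗V✗). A quick consistency check:

```python
# Per-dyad sample counts from Table 1, in the order
# (both report interaction, both report none, sensor only, video only).
table1 = {
    "Yellow-Black":  (1521, 33, 0, 258),
    "Green-Yellow":  (1880, 595, 400, 101),
    "Yellow-Orange": (22, 1737, 130, 278),
    "Green-Orange":  (587, 877, 216, 155),
}

def match_rate(both_yes, both_no, sensor_only, video_only):
    """Percentage of samples on which sensor and video coding agree."""
    total = both_yes + both_no + sensor_only + video_only
    return round(100 * (both_yes + both_no) / total)

for dyad, counts in table1.items():
    print(dyad, f"{match_rate(*counts)}%")
# Yellow-Black 86%, Green-Yellow 83%, Yellow-Orange 81%, Green-Orange 80%
```

These values agree with the reported match column, confirming the reconstruction of Table 1's column order.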

In Table 1, S✓ indicates that the sensor data reported an interaction and S✗ that the sensor reported no interaction; likewise, V✓ indicates that video coding reported an interaction, while V✗ indicates that video coding reported no interaction. The match rate is defined as the ratio of the number of matched samples (i.e., samples on which sensor and video agree) to the total sample count. The final column in Table 1 shows a consistent match of around 80% and up across all the dyads.

To gain further insight into the system's performance, the number of samples in each major classroom activity and the number of correctly classified samples (using video data as ground truth) are shown in Fig. 4. The primary reason for loss of accuracy was found to be sensor blockage, which prevents the line-of-sight ultrasound signal from reaching across the badges in a dyad. Such blockage most often occurred when a child's badge was at the same level as an object such as a table. This explains why the maximum loss of accuracy in Fig. 4 happened during activities involving tables. Upon analyzing the video, it was found that whenever a child did not sit upright, the sensor windows in the badge were temporarily blocked. After this study concluded, we identified a new sensor placement higher on the T-shirt to ameliorate the blockage problem in most instances.

V. SUMMARY AND ONGOING WORK

In summary, our preliminary study demonstrates the wearable IoT system's ability to monitor interactions for children with acceptable accuracy in a preschool classroom setting. We are now developing a data processing system for run-time detection of anomalies in children's interaction and behavior patterns, so that feedback can be provided to teachers and parents for timely interventions. A number of neural network and other classifier training mechanisms are being explored for this purpose. We are also working on developing the next-generation IoT hardware with a reduced form-factor for better clothing integration.

VI. REFERENCES

[1] Bellini, S., Peters, J. K., Benner, L., & Hopf, A. (2007). A meta-analysis of school-based social skills interventions for children with autism spectrum disorders. Remedial and Special Education, 28(3), 153-162.
[2] National Research Council. (2001). Educating Children with Autism. Washington, DC: National Academy Press.
[3] Koegel, L., Matos-Freden, R., Lang, R., & Koegel, R. (2012). Interventions for children with autism spectrum disorders in inclusive school settings. Cognitive and Behavioral Practice, 19(3), 401-412.
[4] U.S. Dept. of Education, National Center for Education Statistics. (2016). Digest of Education Statistics, 2014 (NCES 2016-006), Chapter 2.
[5] Christensen, D. L., Baio, J., Braun, K. V., Bilder, D., Charles, J., Constantino, J. N., . . . Yeargin-Allsopp, M. (2016). Prevalence and characteristics of autism spectrum disorder among children aged 8 years — Autism and Developmental Disabilities Monitoring Network, 11 sites, United States. MMWR Surveillance Summaries, 65(No. SS-3), 1-23. DOI: http://dx.doi.org/10.15585/mmwr.ss6503a1
[6] Boyd, B. A., Conroy, M. A., Asmus, J., & McKenney, E. (2011). Direct observation of peer-related social interaction: Outcomes for young children with autism spectrum disorders. Exceptionality, 19, 94-108.
[7] LeDoux, M. W., Yoder, N. N., & Hanes, B. (2010). The use of personal data assistants in early childhood assessment. Computers in the Schools, 27, 132-144.
[8] Marcu, G., Tassini, K., Carlson, Q., Goodwyn, J., Rivkin, G., Schaefer, K. J., Dey, A. K., & Kiesler, S. (2013, April). Why do they still use paper?: Understanding data collection and use in autism education. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 3177-3186).
[9] Milfort, R., & Greenfield, D. B. (2002). Teacher and observer ratings of Head Start children's social skills. Early Childhood Research Quarterly, 17(4), 581-595.
[10] Dykstra, J., Sabatos-DeVito, M. G., Irvin, D. W., Boyd, B. A., Hume, K. A., & Odom, S. L. (2012). Using the Language Environment Analysis (LENA) system in preschool classrooms with children with autism spectrum disorders. Autism, 17(6), 582-594.
[11] Baard, S., Kozlowski, S., DeShon, R., Biswas, S., Braun, M., Rench, Piolet, Y. (2012, April). Assessing team process dynamics using wearable sensors: An innovative methodology for team research. In Proceedings of the Annual Conference of the Society for Industrial and Organizational Psychology, San Diego, CA.
[12] Datavyu Team (2014). Datavyu: A video coding tool. Databrary Project, New York University. URL: http://datavyu.org
[13] Bauminger, N. (2002). The facilitation of social-emotional understanding and social interaction in high-functioning children with autism: Intervention outcomes. Journal of Autism and Developmental Disorders, 32(4), 283-298.

