Towards Large-Scale Data Mining for Data-Driven Analysis of Sign Languages

06/03/2020
by Boris Mocialov, et al.

Access to sign language data is far from adequate. We show that it is possible to collect such data from social networking services such as TikTok, Instagram, and YouTube by applying data filtering to enforce quality standards and by discovering patterns in the filtered data, making it easier to analyse and model. Using our data collection pipeline, we collect and examine interpretations of songs in both American Sign Language (ASL) and Brazilian Sign Language (Libras). We explore the differences and similarities between the two languages by looking at the co-dependence of the orientation and location phonological parameters.
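As a rough illustration of the two steps named above, the Python sketch below applies a quality filter to scraped video metadata and then measures the co-dependence of two phonological parameters with Cramér's V. It is a minimal sketch only: the metadata fields, thresholds, and orientation/location label sets are assumptions for illustration and are not taken from the paper's actual pipeline.

```python
# Illustrative sketch only: the metadata fields, quality thresholds, and
# phonological label values below are assumptions, not the paper's pipeline.
from collections import Counter
from itertools import product
import math


def passes_quality_filter(meta, min_height=480, min_seconds=10):
    """Hypothetical quality gate applied to scraped video metadata."""
    return meta.get("height", 0) >= min_height and meta.get("duration", 0) >= min_seconds


def cramers_v(pairs):
    """Cramér's V between two categorical variables given as (x, y) pairs."""
    xs = {x for x, _ in pairs}
    ys = {y for _, y in pairs}
    n = len(pairs)
    cell = Counter(pairs)                    # observed contingency counts
    row = Counter(x for x, _ in pairs)       # marginal counts of variable x
    col = Counter(y for _, y in pairs)       # marginal counts of variable y
    chi2 = 0.0
    for x, y in product(xs, ys):
        expected = row[x] * col[y] / n
        if expected:
            chi2 += (cell[(x, y)] - expected) ** 2 / expected
    return math.sqrt(chi2 / (n * (min(len(xs), len(ys)) - 1)))


# Hypothetical metadata record as a scraper might return it.
video_meta = {"height": 720, "duration": 95}
print("keep video:", passes_quality_filter(video_meta))

# Hypothetical (orientation, location) annotations for a handful of signs.
annotations = [
    ("palm_up", "chest"), ("palm_up", "chest"),
    ("palm_down", "neutral_space"), ("palm_down", "neutral_space"),
    ("palm_in", "chin"), ("palm_in", "chin"),
]
print(f"orientation/location co-dependence (Cramér's V): {cramers_v(annotations):.2f}")
```

Cramér's V is just one standard measure of association between two categorical variables; the paper may use a different statistic for the orientation and location parameters.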

Related research

Classification of Phonological Parameters in Sign Languages (05/24/2022)
Signers compose sign language phonemes that enable communication by comb...

YouTube-ASL: A Large-Scale, Open-Domain American Sign Language-English Parallel Corpus (06/27/2023)
Machine learning for sign languages is bottlenecked by data. In this pap...

Unsupervised Sign Language Phoneme Clustering using HamNoSys Notation (05/21/2022)
Traditionally, sign language resources have been collected in controlled...

Sign Language Recognition Using Temporal Classification (01/07/2017)
Devices like the Myo armband available in the market today enable us to ...

WLASL-LEX: a Dataset for Recognising Phonological Properties in American Sign Language (03/11/2022)
Signed Language Processing (SLP) concerns the automated processing of si...

Applying Text Mining to Protest Stories as Voice against Media Censorship (12/29/2018)
Data driven activism attempts to collect, analyze and visualize data to ...
