IDENTIFICATION OF SUICIDAL CONTENT IN TWITTER DATA STREAMS
Keywords:
Suicidal ideation detection; online social networks; social media platforms; suicide prevention; risk factors; warning signs; Twitter; natural language processing; behavioral features; textual features; martingale framework; change detection; machine learning classifiers; text-scoring approach; at-risk individuals.
Abstract
Identifying suicidal ideation in online social networks is a young field of study that faces several difficulties. According to recent studies, publicly accessible data dispersed across social media platforms contains useful markers for accurately identifying people who are suicidally inclined. The main obstacle to preventing suicide is understanding and identifying the many risk factors and warning signs that can precede the event. In this work, we propose a new system for detecting posts with suicide-related content and quantifying suicide warning signs for individual users of the social media platform Twitter. The method's primary innovation is its ability to automatically detect abrupt shifts in a user's online behaviour. We combine textual and behavioural features using natural language processing techniques and employ a martingale framework, which is widely used for change detection in data streams, to detect such shifts. Experiments demonstrate that, in contrast to conventional machine learning classification, our text-scoring approach successfully identifies warning signs in text. Likewise, the martingale framework reveals changes in online behaviour and shows potential for identifying behavioural shifts in at-risk individuals.
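The abstract does not spell out the martingale test itself, so the following is only a minimal sketch of how such a change detector can be wired up over a stream of per-post risk scores. It follows the randomized power martingale commonly used for change detection in data streams; the running-mean strangeness measure, the 0.92 exponent, the threshold of 10, and the toy score stream are illustrative assumptions, not the parameters or features used in the paper.

```python
import random

random.seed(0)      # reproducible run for this sketch

EPSILON = 0.92      # power-martingale exponent; a common choice in the change-detection literature
THRESHOLD = 10.0    # alarm level; by Doob's maximal inequality the false-alarm probability is at most 1/THRESHOLD


def strangeness(history, x):
    """How unusual the new score x is relative to past scores.

    Here it is simply the distance from the running mean -- an illustrative
    choice, not the feature-based strangeness a real system would use."""
    if not history:
        return 0.0
    return abs(x - sum(history) / len(history))


def detect_change(score_stream):
    """Run a randomized power martingale over a stream of per-post scores
    (e.g. daily suicide-risk text scores) and return the index at which an
    abrupt shift is flagged, or None if no change is detected."""
    history, past_strangeness = [], []
    martingale = 1.0
    for i, x in enumerate(score_stream):
        s = strangeness(history, x)
        # Randomized p-value: share of strangeness values (past and current,
        # ties broken at random) that are at least as large as the current one.
        greater = sum(1 for v in past_strangeness if v > s)
        equal = sum(1 for v in past_strangeness if v == s)
        p = (greater + random.random() * (equal + 1)) / (len(past_strangeness) + 1)
        p = max(p, 1e-12)                      # numerical guard against p == 0
        martingale *= EPSILON * p ** (EPSILON - 1)
        if martingale > THRESHOLD:
            return i                           # flagged change point
        history.append(x)
        past_strangeness.append(s)
    return None


# Toy stream: a long stretch of low-risk scores followed by a sustained jump.
# With these settings the alarm typically fires within a few dozen posts of the shift.
stream = [0.1] * 100 + [0.8] * 60
print(detect_change(stream))
```

The martingale grows quickly while incoming scores look unusual relative to the history and stays near one otherwise, which is what makes it suitable for flagging abrupt behavioural shifts in a user's posting stream.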