Twitter Sentiment Analysis Using BERT: A Transformer-Based NLP Approach


Authors: M.S.R. Naidu, Barri Kuvalaya, Bandaru Jyothika, Barle Hemanth Kumar, Amjuru Bhanuprakash

Abstract: This paper presents a transformer-based natural language processing approach to sentiment analysis of Twitter data using Bidirectional Encoder Representations from Transformers (BERT). Social media platforms produce large volumes of opinion-rich text that reflects public sentiment about societal issues, events, and products. Conventional sentiment analysis methods struggle with the informal language, contextual meaning, and semantic ambiguity found in tweets. To overcome these limitations, a pretrained BERT model is fine-tuned for multi-class sentiment classification. The accompanying notebook demonstrates an end-to-end pipeline comprising data preprocessing, tokenization, model training, evaluation, and result visualization. Experimental results show that contextual embeddings and attention mechanisms substantially improve sentiment classification accuracy over conventional approaches, confirming the usefulness of transformer-based models for social media opinion mining.
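The abstract's pipeline begins with data preprocessing before BERT tokenization. As a minimal sketch of what that preprocessing step might look like for tweets (the function name and exact normalization rules are assumptions, not taken from the paper; the later fine-tuning steps are outlined only in comments), one could write:

```python
import re

def preprocess_tweet(text: str) -> str:
    """Hypothetical tweet normalizer applied before BERT tokenization."""
    text = re.sub(r"http\S+|www\.\S+", "", text)  # strip URLs
    text = re.sub(r"@\w+", "@user", text)         # anonymize user mentions
    text = re.sub(r"#", "", text)                 # keep hashtag word, drop '#'
    text = re.sub(r"\s+", " ", text).strip()      # collapse whitespace
    return text

# Remaining pipeline stages described in the abstract (sketch only):
#   1. Tokenize cleaned tweets with a pretrained BERT tokenizer
#      (e.g., a WordPiece tokenizer producing input IDs and attention masks).
#   2. Fine-tune a pretrained BERT model with a multi-class
#      classification head on the labeled sentiment data.
#   3. Evaluate on a held-out split and visualize the results.

print(preprocess_tweet("Loving the new phone!! @TechGuru #awesome http://t.co/xyz"))
```

Running the cleaner on a sample tweet removes the URL, anonymizes the mention, and unwraps the hashtag, yielding `Loving the new phone!! @user awesome` for downstream tokenization.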

