Article Details


International Journal of Computer Science (IJCS) Published by SK Research Group of Companies (SKRGC)



This paper proposes learning sentiment-specific word embeddings, dubbed sentiment embeddings. Existing word embedding learning algorithms typically use only the contexts of words and ignore the sentiment of texts. This is problematic for sentiment analysis because words with similar contexts but opposite sentiment polarity, such as good and bad, are mapped to neighboring word vectors. The issue is addressed by encoding the sentiment information of texts (e.g., sentences and words) together with the contexts of words in sentiment embeddings. By combining context-level and sentiment-level evidence, the nearest neighbors in the sentiment embedding space are not only semantically similar but also tend to share the same sentiment polarity. To learn sentiment embeddings effectively, a number of neural networks with tailored loss functions are developed, and massive texts carrying sentiment signals such as emoticons are collected automatically as training data. Sentiment embeddings can be used naturally as word features for a variety of sentiment analysis tasks without feature engineering. They are applied to word-level sentiment analysis, sentence-level sentiment classification, and building sentiment lexicons. Experimental results show that sentiment embeddings consistently outperform context-based embeddings on several benchmark datasets for these tasks.
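The core idea above, a loss that scores a word on both its context fit and its sentiment polarity, can be illustrated with a toy sketch. This is a minimal assumption-laden illustration, not the paper's actual architecture: the tiny vocabulary, random vectors, and the `hybrid_loss` weighting by `alpha` are all invented here purely to show how a context ranking term and a sentiment term can be combined in one objective.

```python
import random

random.seed(0)
D = 4  # toy embedding dimension (illustrative only)

def vec():
    # small random vector, standing in for a learned embedding
    return [random.gauss(0, 0.1) for _ in range(D)]

# toy vocabulary with one embedding per word (hypothetical example words)
emb = {w: vec() for w in ["good", "bad", "movie", "the"]}
w_ctx = vec()   # scorer for the context objective
w_sent = vec()  # scorer for the sentiment objective

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hinge(x):
    # standard margin (hinge) loss: zero once the score clears margin 1
    return max(0.0, 1.0 - x)

def hybrid_loss(word, corrupt, polarity, alpha=0.5):
    """Weighted sum of two ranking losses, in the spirit of sentiment
    embeddings: the true word should outscore a corrupted word in context,
    and the sentiment score should match the text's polarity (+1 or -1)."""
    ctx = hinge(dot(emb[word], w_ctx) - dot(emb[corrupt], w_ctx))
    sent = hinge(polarity * dot(emb[word], w_sent))
    return alpha * ctx + (1 - alpha) * sent
```

For example, `hybrid_loss("good", "the", +1)` penalizes the embedding of "good" both when a corrupted context word scores higher and when its sentiment score disagrees with the positive label; gradient updates on such a loss are what push "good" and "bad" apart even though their contexts are similar.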





Text mining, sentiment analysis, sentiment classification, bag of words, feature-based sentiment.

  • Format Volume 5, Issue 1, No 28, 2017
  • Copyright All Rights Reserved ©2017
  • Year of Publication 2017
  • Authors G. Rajadurai, Mrs. D. Vinotha
  • Reference IJCS-293
  • Page No 1934-1940
