add emotion analysis, release version 1.2
huseinzol05 committed Jan 6, 2019
1 parent 06e0be1 commit 01ba1da
Showing 133 changed files with 18,580 additions and 7,146 deletions.
6 changes: 4 additions & 2 deletions README.rst
@@ -33,6 +33,8 @@ GPU version
Features
--------

- **Emotion Analysis**, from BERT, Fast-Text, Dynamic-Memory Network, Sparse-Chars,
Attention to build deep emotion analysis models.
- **Entities Recognition**, using the latest state-of-the-art CRF deep learning
  models to do Named Entity Recognition.
- **Language Detection**, using Multinomial, SGD, XGB, Fast-text N-grams deep learning to distinguish Malay, English, and Indonesian.
@@ -41,12 +43,12 @@ Features
- Num2Word
- **Part-of-Speech Recognition**, using the latest state-of-the-art CRF deep
  learning models to do POS Recognition.
- **Sentiment Analysis**, from BERT, Fast-Text, Dynamic-Memory Network,
- **Sentiment Analysis**, from BERT, Fast-Text, Dynamic-Memory Network, Sparse-Chars,
Attention to build deep sentiment analysis models.
- **Spell Correction**, using local Malaysian NLP research to
  auto-correct any Bahasa word.
- Stemmer
- **Subjectivity Analysis**, from BERT, Fast-Text, Dynamic-Memory Network,
- **Subjectivity Analysis**, from BERT, Fast-Text, Dynamic-Memory Network, Sparse-Chars,
Attention to build deep subjectivity analysis models.
- **Summarization**, using state-of-the-art skip-thought models to give precise
  summarization.
Binary file added accuracy/emotion-accuracy.png
28 changes: 28 additions & 0 deletions accuracy/emotion-template.js
@@ -0,0 +1,28 @@
option = {
  xAxis: {
    type: 'category',
    axisLabel: {
      interval: 0,
      rotate: 30
    },
    data: ['bahdanau','BERT','bidirectional','entity-network',
      'fast-text','fast-text-char','hierarchical','luong',
      'multinomial','xgb']
  },
  yAxis: {
    type: 'value',
    min:0.73,
    max:0.81
  },
  backgroundColor:'rgb(252,252,252)',
  series: [{
    data: [0.79,0.77,0.80,0.76,0.77,0.75,0.80,0.79,0.75,0.79],
    type: 'bar',
    label: {
      normal: {
        show: true,
        position: 'top'
      }
    },
  }]
};
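The template above presumably drives the rendered chart in ``accuracy/emotion-accuracy.png``. For readers who want to regenerate the figure locally, here is a minimal matplotlib sketch (not part of the commit) that re-plots the same labels and scores; it assumes matplotlib >= 3.4 for ``Axes.bar_label``, and the output filename is illustrative.

.. code:: python

    # Minimal sketch only: re-plots the same labels and scores with matplotlib.
    # Requires matplotlib >= 3.4 for Axes.bar_label; output filename is illustrative.
    import matplotlib.pyplot as plt

    models = ['bahdanau', 'BERT', 'bidirectional', 'entity-network',
              'fast-text', 'fast-text-char', 'hierarchical', 'luong',
              'multinomial', 'xgb']
    accuracy = [0.79, 0.77, 0.80, 0.76, 0.77, 0.75, 0.80, 0.79, 0.75, 0.79]

    fig, ax = plt.subplots(figsize=(8, 4))
    bars = ax.bar(models, accuracy)
    ax.set_ylim(0.73, 0.81)              # same range as the yAxis min/max above
    ax.bar_label(bars, fmt='%.2f')       # value labels on top, like label.show
    plt.xticks(rotation=30, ha='right')  # same rotation as axisLabel.rotate
    plt.tight_layout()
    plt.savefig('emotion-accuracy-matplotlib.png', dpi=150)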
279 changes: 276 additions & 3 deletions accuracy/models-accuracy.ipynb

Large diffs are not rendered by default.

205 changes: 203 additions & 2 deletions accuracy/models-accuracy.rst
@@ -432,6 +432,18 @@ Fast-text
    avg / total      0.73      0.73      0.73      2856
Fast-text-char
^^^^^^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

       negative      0.71      0.64      0.67      1303
       positive      0.72      0.78      0.75      1553

    avg / total      0.71      0.71      0.71      2856
Hierarchical
^^^^^^^^^^^^

@@ -499,7 +511,7 @@ Labels are,
.. image:: models-accuracy_files/models-accuracy_33_0.png
.. image:: models-accuracy_files/models-accuracy_34_0.png
   :width: 500px


@@ -628,7 +640,7 @@ sessions stored in
.. image:: models-accuracy_files/models-accuracy_42_0.png
.. image:: models-accuracy_files/models-accuracy_43_0.png
   :width: 500px


@@ -702,6 +714,18 @@ Fast-text
      macro avg      0.89      0.89      0.89      1993
   weighted avg      0.89      0.89      0.89      1993
Fast-text-char
^^^^^^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

       negative      0.88      0.88      0.88      1002
       positive      0.88      0.87      0.88       991

    avg / total      0.88      0.88      0.88      1993
Hierarchical
^^^^^^^^^^^^

@@ -758,3 +782,180 @@ XGB
      micro avg      0.85      0.85      0.85      1993
      macro avg      0.85      0.85      0.85      1993
   weighted avg      0.85      0.85      0.85      1993
Emotion Analysis
----------------

Trained on 80% of the dataset, tested on 20% of the dataset. All training
sessions are stored in
`session/emotion <https://github.com/huseinzol05/Malaya/tree/master/session/emotion>`__

.. code:: ipython3

    display(Image('emotion-accuracy.png', width=500))

.. image:: models-accuracy_files/models-accuracy_55_0.png
   :width: 500px
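The per-model tables below have the layout of scikit-learn's ``classification_report`` computed on the held-out 20% split. As an illustrative sketch of that evaluation flow only, with a toy corpus and a simple multinomial baseline standing in for Malaya's real dataset and deep models:

.. code:: python

    # Illustrative sketch: toy data and a simple sklearn baseline stand in for
    # Malaya's real corpus and deep models, to show how tables like the ones
    # below are produced (80/20 split + classification_report).
    from sklearn.model_selection import train_test_split
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import classification_report

    texts = ['saya marah', 'takut sangat', 'gembira hari ini',
             'sayang awak', 'sedih betul', 'terkejut saya'] * 20   # toy corpus
    labels = ['anger', 'fear', 'joy', 'love', 'sadness', 'surprise'] * 20

    train_X, test_X, train_Y, test_Y = train_test_split(
        texts, labels, test_size=0.2, random_state=42)             # 80% / 20%

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(train_X, train_Y)
    print(classification_report(test_Y, model.predict(test_X)))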


Bahdanau
^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.80      0.80      0.80      3827
           fear      0.77      0.78      0.78      3760
            joy      0.81      0.78      0.80      3958
           love      0.82      0.86      0.84      3099
        sadness      0.73      0.76      0.74      3119
       surprise      0.79      0.74      0.77      1940

    avg / total      0.79      0.79      0.79     19703
BERT
^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.73      0.83      0.78      3747
           fear      0.70      0.84      0.77      3789
            joy      0.74      0.80      0.77      3929
           love      0.82      0.76      0.79      3081
        sadness      0.82      0.60      0.69      3168
       surprise      0.85      0.63      0.72      1989

    avg / total      0.77      0.76      0.76     19703
Bidirectional
^^^^^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.81      0.80      0.81      3726
           fear      0.77      0.78      0.77      3806
            joy      0.83      0.81      0.82      3975
           love      0.86      0.83      0.85      2992
        sadness      0.75      0.78      0.77      3293
       surprise      0.77      0.79      0.78      1911

    avg / total      0.80      0.80      0.80     19703
Entity-network
^^^^^^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.82      0.72      0.77      3717
           fear      0.72      0.77      0.75      3743
            joy      0.77      0.74      0.76      4050
           love      0.81      0.81      0.81      2992
        sadness      0.71      0.74      0.72      3274
       surprise      0.72      0.80      0.76      1927

    avg / total      0.76      0.76      0.76     19703
Fast-text
^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.82      0.75      0.78      3754
           fear      0.71      0.81      0.75      3837
            joy      0.76      0.79      0.78      3844
           love      0.83      0.83      0.83      3065
        sadness      0.75      0.75      0.75      3241
       surprise      0.79      0.64      0.71      1962

    avg / total      0.77      0.77      0.77     19703
Fast-text-char
^^^^^^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.79      0.75      0.77      3803
           fear      0.73      0.73      0.73      3784
            joy      0.71      0.77      0.74      3872
           love      0.81      0.80      0.80      3052
        sadness      0.72      0.70      0.71      3205
       surprise      0.73      0.70      0.72      1987

    avg / total      0.75      0.74      0.75     19703
Hierarchical
^^^^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.81      0.79      0.80      3786
           fear      0.78      0.79      0.78      3754
            joy      0.81      0.82      0.82      3886
           love      0.85      0.84      0.85      3022
        sadness      0.76      0.80      0.78      3300
       surprise      0.81      0.75      0.78      1955

    avg / total      0.80      0.80      0.80     19703
Luong
^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.80      0.79      0.80      3774
           fear      0.78      0.75      0.77      3759
            joy      0.79      0.80      0.79      3944
           love      0.83      0.84      0.84      3033
        sadness      0.75      0.75      0.75      3272
       surprise      0.76      0.80      0.78      1921

    avg / total      0.79      0.79      0.79     19703
Multinomial
^^^^^^^^^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.72      0.82      0.77      3833
           fear      0.68      0.80      0.74      3802
            joy      0.68      0.84      0.75      3924
           love      0.85      0.71      0.78      2981
        sadness      0.81      0.67      0.73      3189
       surprise      0.80      0.36      0.50      1974

    avg / total      0.75      0.73      0.73     19703
XGB
^^^

.. code:: text

                precision    recall  f1-score   support

          anger      0.80      0.80      0.80      3769
           fear      0.79      0.76      0.78      3808
            joy      0.79      0.81      0.80      3913
           love      0.84      0.85      0.84      2998
        sadness      0.76      0.75      0.76      3250
       surprise      0.77      0.77      0.77      1965

    avg / total      0.79      0.79      0.79     19703
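In these tables, the ``avg / total`` row is consistent with a support-weighted average of the per-class scores (the older scikit-learn report format). A quick check against the XGB precision column above:

.. code:: python

    # Recompute the "avg / total" precision of the XGB table above as a
    # support-weighted average of the per-class precision values.
    precision = [0.80, 0.79, 0.79, 0.84, 0.76, 0.77]   # anger .. surprise
    support   = [3769, 3808, 3913, 2998, 3250, 1965]

    weighted = sum(p * s for p, s in zip(precision, support)) / sum(support)
    print(round(weighted, 2))   # -> 0.79, matching the avg / total row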
Binary file modified accuracy/models-accuracy_files/models-accuracy_22_0.png
Binary file modified accuracy/models-accuracy_files/models-accuracy_9_0.png
Binary file modified accuracy/sentiment-accuracy.png
28 changes: 28 additions & 0 deletions accuracy/sentiment-template.js
@@ -0,0 +1,28 @@
option = {
  xAxis: {
    type: 'category',
    axisLabel: {
      interval: 0,
      rotate: 30
    },
    data: ['bahdanau','BERT','bidirectional','entity-network',
      'fast-text','fast-text-char','hierarchical','luong',
      'multinomial','xgb']
  },
  yAxis: {
    type: 'value',
    min:0.65,
    max:0.75
  },
  backgroundColor:'rgb(252,252,252)',
  series: [{
    data: [0.67,0.69,0.67,0.71,0.73,0.71,0.67,0.66,0.73,0.69],
    type: 'bar',
    label: {
      normal: {
        show: true,
        position: 'top'
      }
    },
  }]
};
Binary file modified accuracy/subjectivity-accuracy.png
28 changes: 28 additions & 0 deletions accuracy/subjectivity-template.js
@@ -0,0 +1,28 @@
option = {
  xAxis: {
    type: 'category',
    axisLabel: {
      interval: 0,
      rotate: 30
    },
    data: ['bahdanau','BERT','bidirectional','entity-network',
      'fast-text','fast-text-char','hierarchical','luong',
      'multinomial','xgb']
  },
  yAxis: {
    type: 'value',
    min:0.81,
    max:0.9
  },
  backgroundColor:'rgb(252,252,252)',
  series: [{
    data: [0.83,0.84,0.85,0.88,0.89,0.88,0.84,0.82,0.89,0.85],
    type: 'bar',
    label: {
      normal: {
        show: true,
        position: 'top'
      }
    },
  }]
};
Empty file.
@@ -0,0 +1 @@
1.0
Empty file.
@@ -0,0 +1 @@
1.2
4 changes: 2 additions & 2 deletions docs/Accuracy.rst
@@ -1,4 +1,4 @@
Models accuracy
====================
Models Accuracy
============================

.. include:: models-accuracy.rst
6 changes: 5 additions & 1 deletion docs/Contribution.rst
@@ -4,10 +4,14 @@ Contribution
Rules
-----

Malaya follows standard code style such as PEP8; if you want to contribute, we use Black Mamba to prettify our code.
1. Malaya follows standard code style such as PEP8; if you want to contribute, we use Black Mamba to prettify our code.

You can install it from here: https://github.com/mohtar/blackmamba.

2. 100% TensorFlow, no Keras.

3. 3.3 < Python < 3.7
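A quick, illustrative way to check your interpreter against rule 3 (as written, strictly between 3.3 and 3.7):

.. code:: python

    # Illustrative check that the running interpreter satisfies 3.3 < Python < 3.7.
    import sys

    assert (3, 3) < sys.version_info[:2] < (3, 7), sys.version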

New code
--------

