# MLP Classifier
MLPClassifier stands for Multi-layer Perceptron classifier, and as the name suggests it is tied to neural networks. Unlike classification algorithms such as the Support Vector Machine or the Naive Bayes classifier, MLPClassifier relies on an underlying neural network to perform the classification task.

One thing it does share with the other Scikit-Learn classifiers, however, is the interface: using MLPClassifier is no more involved than using Support Vector Machines, Naive Bayes, or any other Scikit-Learn classifier.
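Because every Scikit-Learn estimator exposes the same `fit`/`predict` interface, swapping MLPClassifier for another classifier is a one-line change. A minimal sketch of this shared API (using the built-in Iris loader rather than the CSV used below):

```python
from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Only the constructor line differs; fit and score are identical.
for model in (GaussianNB(), MLPClassifier(max_iter=1000, random_state=0)):
    model.fit(X, y)
    print(type(model).__name__, model.score(X, y))
```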
## Implementing MLP with the Iris Dataset
```python
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd

from sklearn.neural_network import MLPClassifier      # the MLP model
from sklearn.model_selection import train_test_split  # train/test splitting
from sklearn.preprocessing import StandardScaler      # feature scaling (see note below)
from sklearn.metrics import accuracy_score            # evaluation
```
```python
# Load the Iris dataset from a public CSV, then split it into features
# (the four measurements) and the target (species).
dataset = pd.read_csv('https://raw.githubusercontent.com/mk-gurucharan/Classification/master/IrisDataset.csv')
X = dataset.iloc[:, :4].values
y = dataset['species'].values
dataset.head(5)
```
|   | sepal_length | sepal_width | petal_length | petal_width | species |
|---|---|---|---|---|---|
| 0 | 5.1 | 3.5 | 1.4 | 0.2 | setosa |
| 1 | 4.9 | 3.0 | 1.4 | 0.2 | setosa |
| 2 | 4.7 | 3.2 | 1.3 | 0.2 | setosa |
| 3 | 4.6 | 3.1 | 1.5 | 0.2 | setosa |
| 4 | 5.0 | 3.6 | 1.4 | 0.2 | setosa |
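As a quick sanity check (not part of the original walkthrough), the class balance can be confirmed before splitting; the standard Iris data has 50 samples of each species:

```python
# Count samples per species to confirm the classes are balanced.
print(dataset['species'].value_counts())
```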
```python
# Hold out 20% of the data for testing, then train an MLP with three
# hidden layers of 100 neurons each using plain SGD. verbose=10 prints
# the loss at every iteration; training stops early once the loss
# improves by less than tol for n_iter_no_change (default 10)
# consecutive iterations.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

clf = MLPClassifier(hidden_layer_sizes=(100, 100, 100), max_iter=1000, alpha=0.0001,
                    solver='sgd', verbose=10, random_state=21, tol=0.001)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)
```
```
Iteration 1, loss = 1.12973730
Iteration 2, loss = 1.11639685
Iteration 3, loss = 1.09787996
Iteration 4, loss = 1.07549633
Iteration 5, loss = 1.05135745
Iteration 6, loss = 1.02728888
Iteration 7, loss = 1.00441184
Iteration 8, loss = 0.98298418
Iteration 9, loss = 0.96319490
Iteration 10, loss = 0.94543301
Iteration 11, loss = 0.93033027
Iteration 12, loss = 0.91800240
Iteration 13, loss = 0.90810247
Iteration 14, loss = 0.90044487
Iteration 15, loss = 0.89326488
Iteration 16, loss = 0.88602333
Iteration 17, loss = 0.87854108
Iteration 18, loss = 0.87058700
Iteration 19, loss = 0.86225904
Iteration 20, loss = 0.85371627
Iteration 21, loss = 0.84499422
Iteration 22, loss = 0.83621079
Iteration 23, loss = 0.82735732
Iteration 24, loss = 0.81856322
Iteration 25, loss = 0.80981727
Iteration 26, loss = 0.80125235
Iteration 27, loss = 0.79283365
Iteration 28, loss = 0.78445814
Iteration 29, loss = 0.77620078
Iteration 30, loss = 0.76800976
Iteration 31, loss = 0.75992467
Iteration 32, loss = 0.75187788
Iteration 33, loss = 0.74380981
Iteration 34, loss = 0.73591304
Iteration 35, loss = 0.72825403
Iteration 36, loss = 0.72069747
Iteration 37, loss = 0.71329431
Iteration 38, loss = 0.70610259
Iteration 39, loss = 0.69911091
Iteration 40, loss = 0.69235636
Iteration 41, loss = 0.68576065
Iteration 42, loss = 0.67929336
Iteration 43, loss = 0.67292372
Iteration 44, loss = 0.66658686
Iteration 45, loss = 0.66025380
Iteration 46, loss = 0.65397368
Iteration 47, loss = 0.64785475
Iteration 48, loss = 0.64195413
Iteration 49, loss = 0.63623332
Iteration 50, loss = 0.63061549
Iteration 51, loss = 0.62506434
Iteration 52, loss = 0.61955669
Iteration 53, loss = 0.61408190
Iteration 54, loss = 0.60865462
Iteration 55, loss = 0.60325994
Iteration 56, loss = 0.59787436
Iteration 57, loss = 0.59252505
Iteration 58, loss = 0.58721977
Iteration 59, loss = 0.58195936
Iteration 60, loss = 0.57673761
Iteration 61, loss = 0.57156361
Iteration 62, loss = 0.56649723
Iteration 63, loss = 0.56156725
Iteration 64, loss = 0.55682547
Iteration 65, loss = 0.55225299
Iteration 66, loss = 0.54784055
Iteration 67, loss = 0.54356840
Iteration 68, loss = 0.53938798
Iteration 69, loss = 0.53525372
Iteration 70, loss = 0.53117226
Iteration 71, loss = 0.52714135
Iteration 72, loss = 0.52316411
Iteration 73, loss = 0.51923393
Iteration 74, loss = 0.51529829
Iteration 75, loss = 0.51134159
Iteration 76, loss = 0.50732992
Iteration 77, loss = 0.50323568
Iteration 78, loss = 0.49903110
Iteration 79, loss = 0.49492745
Iteration 80, loss = 0.49101120
Iteration 81, loss = 0.48724658
Iteration 82, loss = 0.48356128
Iteration 83, loss = 0.47983199
Iteration 84, loss = 0.47608169
Iteration 85, loss = 0.47231980
Iteration 86, loss = 0.46852659
Iteration 87, loss = 0.46473769
Iteration 88, loss = 0.46092211
Iteration 89, loss = 0.45711327
Iteration 90, loss = 0.45337553
Iteration 91, loss = 0.44978875
Iteration 92, loss = 0.44633139
Iteration 93, loss = 0.44293877
Iteration 94, loss = 0.43969799
Iteration 95, loss = 0.43656511
Iteration 96, loss = 0.43354625
Iteration 97, loss = 0.43064258
Iteration 98, loss = 0.42784033
Iteration 99, loss = 0.42509575
Iteration 100, loss = 0.42242366
Iteration 101, loss = 0.41980737
Iteration 102, loss = 0.41722346
Iteration 103, loss = 0.41467051
Iteration 104, loss = 0.41214278
Iteration 105, loss = 0.40964045
Iteration 106, loss = 0.40716862
Iteration 107, loss = 0.40472635
Iteration 108, loss = 0.40231261
Iteration 109, loss = 0.39992764
Iteration 110, loss = 0.39757269
Iteration 111, loss = 0.39524304
Iteration 112, loss = 0.39294053
Iteration 113, loss = 0.39066541
Iteration 114, loss = 0.38841274
Iteration 115, loss = 0.38618678
Iteration 116, loss = 0.38398829
Iteration 117, loss = 0.38181502
Iteration 118, loss = 0.37966743
Iteration 119, loss = 0.37754169
Iteration 120, loss = 0.37544044
Iteration 121, loss = 0.37336751
Iteration 122, loss = 0.37132001
Iteration 123, loss = 0.36929897
Iteration 124, loss = 0.36730079
Iteration 125, loss = 0.36532739
Iteration 126, loss = 0.36337921
Iteration 127, loss = 0.36145628
Iteration 128, loss = 0.35955253
Iteration 129, loss = 0.35766540
Iteration 130, loss = 0.35580060
Iteration 131, loss = 0.35395627
Iteration 132, loss = 0.35212905
Iteration 133, loss = 0.35032070
Iteration 134, loss = 0.34853027
Iteration 135, loss = 0.34675421
Iteration 136, loss = 0.34499006
Iteration 137, loss = 0.34324034
Iteration 138, loss = 0.34150736
Iteration 139, loss = 0.33978803
Iteration 140, loss = 0.33808163
Iteration 141, loss = 0.33639028
Iteration 142, loss = 0.33471319
Iteration 143, loss = 0.33305091
Iteration 144, loss = 0.33140300
Iteration 145, loss = 0.32976927
Iteration 146, loss = 0.32814734
Iteration 147, loss = 0.32653731
Iteration 148, loss = 0.32493870
Iteration 149, loss = 0.32335181
Iteration 150, loss = 0.32177661
Iteration 151, loss = 0.32021310
Iteration 152, loss = 0.31866103
Iteration 153, loss = 0.31711983
Iteration 154, loss = 0.31558744
Iteration 155, loss = 0.31406521
Iteration 156, loss = 0.31255268
Iteration 157, loss = 0.31104926
Iteration 158, loss = 0.30955653
Iteration 159, loss = 0.30807360
Iteration 160, loss = 0.30660014
Iteration 161, loss = 0.30513568
Iteration 162, loss = 0.30368237
Iteration 163, loss = 0.30223743
Iteration 164, loss = 0.30080106
Iteration 165, loss = 0.29937374
Iteration 166, loss = 0.29795551
Iteration 167, loss = 0.29654629
Iteration 168, loss = 0.29514631
Iteration 169, loss = 0.29375575
Iteration 170, loss = 0.29237519
Iteration 171, loss = 0.29100426
Iteration 172, loss = 0.28964148
Iteration 173, loss = 0.28828673
Iteration 174, loss = 0.28693945
Iteration 175, loss = 0.28559978
Iteration 176, loss = 0.28426814
Iteration 177, loss = 0.28294413
Iteration 178, loss = 0.28162752
Iteration 179, loss = 0.28031786
Iteration 180, loss = 0.27901601
Iteration 181, loss = 0.27772138
Iteration 182, loss = 0.27643391
Iteration 183, loss = 0.27515407
Iteration 184, loss = 0.27388154
Iteration 185, loss = 0.27261611
Iteration 186, loss = 0.27135788
Iteration 187, loss = 0.27010687
Iteration 188, loss = 0.26886286
Iteration 189, loss = 0.26762620
Iteration 190, loss = 0.26639662
Iteration 191, loss = 0.26517407
Iteration 192, loss = 0.26395855
Iteration 193, loss = 0.26274995
Iteration 194, loss = 0.26154783
Iteration 195, loss = 0.26035224
Iteration 196, loss = 0.25916311
Iteration 197, loss = 0.25798049
Iteration 198, loss = 0.25680466
Iteration 199, loss = 0.25563537
Iteration 200, loss = 0.25447254
Iteration 201, loss = 0.25331618
Iteration 202, loss = 0.25216651
Iteration 203, loss = 0.25102433
Iteration 204, loss = 0.24988895
Iteration 205, loss = 0.24875992
Iteration 206, loss = 0.24763733
Iteration 207, loss = 0.24652117
Iteration 208, loss = 0.24541155
Iteration 209, loss = 0.24430811
Iteration 210, loss = 0.24321110
Iteration 211, loss = 0.24212064
Iteration 212, loss = 0.24103627
Iteration 213, loss = 0.23995803
Iteration 214, loss = 0.23888598
Iteration 215, loss = 0.23782035
Iteration 216, loss = 0.23676109
Iteration 217, loss = 0.23570824
Iteration 218, loss = 0.23466176
Iteration 219, loss = 0.23362166
Iteration 220, loss = 0.23258821
Iteration 221, loss = 0.23156092
Iteration 222, loss = 0.23053979
Iteration 223, loss = 0.22952453
Iteration 224, loss = 0.22851525
Iteration 225, loss = 0.22751205
Iteration 226, loss = 0.22651466
Iteration 227, loss = 0.22552319
Iteration 228, loss = 0.22453752
Iteration 229, loss = 0.22355767
Iteration 230, loss = 0.22258366
Iteration 231, loss = 0.22161558
Iteration 232, loss = 0.22065333
Iteration 233, loss = 0.21969696
Iteration 234, loss = 0.21874637
Iteration 235, loss = 0.21780191
Iteration 236, loss = 0.21686365
Training loss did not improve more than tol=0.001000 for 10 consecutive epochs. Stopping.
```
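The SGD run above needs 236 iterations before early stopping kicks in. MLPs are sensitive to feature scale, which is presumably why StandardScaler was imported; a minimal sketch of the scaled variant (the exact losses and stopping point would differ from the log above):

```python
# Standardize features using statistics from the training split only,
# then train the same architecture on the scaled data.
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

clf_scaled = MLPClassifier(hidden_layer_sizes=(100, 100, 100), max_iter=1000,
                           alpha=0.0001, solver='sgd', random_state=21, tol=0.001)
clf_scaled.fit(X_train_scaled, y_train)
print(clf_scaled.score(X_test_scaled, y_test))  # may differ from the unscaled run
```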
```python
# Overall accuracy on the held-out test set.
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
```
```
Accuracy: 0.9666666666666667
```
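Accuracy alone hides which classes get confused with which. A short sketch of a per-class breakdown using other `sklearn.metrics` helpers:

```python
from sklearn.metrics import classification_report, confusion_matrix

# Rows of the confusion matrix are true classes, columns are predictions;
# the report adds per-class precision, recall, and F1.
print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))
```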
```python
# Side-by-side comparison of true and predicted labels.
df = pd.DataFrame({'Real Values': y_test, 'Predicted Values': y_pred})
df
```
|   | Real Values | Predicted Values |
|---|---|---|
| 0 | versicolor | versicolor |
| 1 | setosa | setosa |
| 2 | virginica | virginica |
| 3 | versicolor | versicolor |
| 4 | versicolor | versicolor |
| 5 | setosa | setosa |
| 6 | versicolor | versicolor |
| 7 | virginica | virginica |
| 8 | versicolor | virginica |
| 9 | versicolor | versicolor |
| 10 | virginica | virginica |
| 11 | setosa | setosa |
| 12 | setosa | setosa |
| 13 | setosa | setosa |
| 14 | setosa | setosa |
| 15 | versicolor | versicolor |
| 16 | virginica | virginica |
| 17 | versicolor | versicolor |
| 18 | versicolor | versicolor |
| 19 | virginica | virginica |
| 20 | setosa | setosa |
| 21 | virginica | virginica |
| 22 | setosa | setosa |
| 23 | virginica | virginica |
| 24 | virginica | virginica |
| 25 | virginica | virginica |
| 26 | virginica | virginica |
| 27 | virginica | virginica |
| 28 | setosa | setosa |
| 29 | setosa | setosa |
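The table shows a single mistake (row 8: versicolor predicted as virginica), which matches the 29/30 ≈ 0.967 accuracy. Finally, the per-iteration losses printed during training are also stored on the fitted model, so the matplotlib import can be put to use; a minimal sketch:

```python
# loss_curve_ holds the training loss recorded at each iteration,
# matching the verbose log printed by fit().
plt.plot(clf.loss_curve_)
plt.xlabel('Iteration')
plt.ylabel('Training loss')
plt.title('MLPClassifier training loss')
plt.show()
```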