Python Data Analysis Basics 72 - MLP (Multi-Layer Perceptron) - Multilayer Neural Network
코딩탕탕
2022. 11. 25. 13:07

# MLP (multilayer neural network)
# Practice with logic gates
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score
feature = np.array([[0, 0],[0, 1],[1, 0],[1, 1]])
print(feature)
# label = np.array([0, 0, 0, 1]) # and
# label = np.array([0, 1, 1, 1]) # or
label = np.array([0, 1, 1, 0]) # xor: with a single node (neuron), prediction fails
# model = MLPClassifier(hidden_layer_sizes=30, solver='adam', learning_rate_init=0.01).fit(feature, label)
# model = MLPClassifier(hidden_layer_sizes=30, solver='adam', learning_rate_init=0.1,
# max_iter=100, verbose=1).fit(feature, label)
model = MLPClassifier(hidden_layer_sizes=(10, 10, 10), solver='adam', learning_rate_init=0.1,
                      max_iter=100, verbose=1).fit(feature, label) # hidden_layer_sizes can be one layer of 30, or 10 units in each of 3 layers
pred = model.predict(feature)
print('pred :', pred)
print('acc :', accuracy_score(label, pred)) # acc varies with the number of iterations and the learning rate
<console>
[[0 0]
[0 1]
[1 0]
[1 1]]
Iteration 1, loss = 0.73538661
Iteration 2, loss = 0.69523413
Iteration 3, loss = 0.68004176
Iteration 4, loss = 0.65206359
Iteration 5, loss = 0.63316163
Iteration 6, loss = 0.61182013
Iteration 7, loss = 0.57523983
Iteration 8, loss = 0.52549316
Iteration 9, loss = 0.48167323
Iteration 10, loss = 0.43458616
pred : [0 1 1 0]
acc : 1.0
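The comment in the code above says that a single node cannot predict XOR well. A minimal sketch with sklearn's `Perceptron` (a single linear unit) makes this concrete; the class choice and variable names here are my own illustration, not from the original code:

```python
# Sketch: a single linear unit (Perceptron) cannot fit XOR,
# because XOR is not linearly separable.
import numpy as np
from sklearn.linear_model import Perceptron
from sklearn.metrics import accuracy_score

feature = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
label = np.array([0, 1, 1, 0])  # xor

linear_model = Perceptron(max_iter=1000, tol=1e-3).fit(feature, label)
acc_linear = accuracy_score(label, linear_model.predict(feature))
print('linear acc :', acc_linear)  # stays below 1.0 no matter how long we train
```

A hidden layer is what lets the MLP bend the decision boundary and reach acc 1.0 on XOR.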
The number of training iterations and the learning rate are closely related.
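One way to see that relationship is a small grid sweep over `learning_rate_init` and `max_iter`. The specific values below are illustrative choices, not from the original post; results will vary run to run unless `random_state` is fixed:

```python
# Sketch: sweep learning rate and iteration count on XOR and compare accuracy.
import warnings
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

warnings.filterwarnings('ignore')  # suppress ConvergenceWarning for small max_iter

feature = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
label = np.array([0, 1, 1, 0])  # xor

results = {}
for lr in (0.001, 0.01, 0.1):
    for iters in (10, 100, 1000):
        m = MLPClassifier(hidden_layer_sizes=(10, 10, 10), solver='adam',
                          learning_rate_init=lr, max_iter=iters,
                          random_state=0).fit(feature, label)
        results[(lr, iters)] = accuracy_score(label, m.predict(feature))

for (lr, iters), acc in sorted(results.items()):
    print(f'lr={lr}, max_iter={iters} -> acc={acc}')
```

A small learning rate needs more iterations to converge, while a large one can converge quickly (or overshoot), which is exactly why acc changed between the commented-out model configurations above.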