Prediction + Mini-batch + Error
The weights are the pretrained sample weights from the book (chap03).
```python
# coding: utf-8
import sys, os
sys.path.append(os.pardir)  # so that files in the parent directory can be imported
import numpy as np
import pickle
from dataset.mnist import load_mnist
from common.functions import sigmoid, softmax


def get_data():
    (x_train, t_train), (x_test, t_test) = load_mnist(normalize=True, flatten=True, one_hot_label=True)
    # print(len(t_train))
    # print(len(t_test))
    return x_train, t_train


def init_network():
    with open("sample_weight.pkl", 'rb') as f:
        network = pickle.load(f)
    return network


def predict(network, x):
    W1, W2, W3 = network['W1'], network['W2'], network['W3']
    b1, b2, b3 = network['b1'], network['b2'], network['b3']

    a1 = np.dot(x, W1) + b1
    z1 = sigmoid(a1)
    a2 = np.dot(z1, W2) + b2
    z2 = sigmoid(a2)
    a3 = np.dot(z2, W3) + b3
    y = softmax(a3)

    return y


def cross_entropy_error(y, t):
    if y.ndim == 1:  # indexing with a mask gives a [[...]] shape, but a plain 1-D input comes in as [...]
        t = t.reshape(1, t.size)
        y = y.reshape(1, y.size)
    # The reshape only matters because of the "/ y.shape[0]" below:
    # without it a 1-D input would be divided by 10 instead of 1.

    delta = 1e-7
    print(y.shape)
    print(y.ndim)
    # aa = np.array([1, 2])
    print(y)
    print(t)
    return -np.sum(t * np.log(y + delta)) / y.shape[0]


x, t = get_data()
network = init_network()
accuracy_cnt = 0  # left over from the accuracy-counting example; not used below

train_size = x.shape[0]  # should be 60,000
print(train_size)
batch_size = 1
batch_mask = np.random.choice(train_size, batch_size)

x_batch = x[batch_mask]
t_batch = t[batch_mask]

y = predict(network, x_batch)
# print(len(y))
print(batch_mask[0])
print(y[0])               # the prediction
print(t[batch_mask[0]])   # the answer label

result = cross_entropy_error(y, t_batch)
result = cross_entropy_error(
    np.array([1.00422942e-03, 1.34813308e-04, 1.90760253e-03, 9.74361539e-01,
              9.08520860e-07, 1.45274522e-02, 9.53066490e-07, 3.48790345e-04,
              7.43788807e-03, 2.75873637e-04]),
    np.array([0, 0, 0, 1, 0, 0, 0, 0, 0, 0]))
print(result)  # 0.0259727314115

aa = np.array([1, 2, 3, 4])
print(aa)
aa = aa.reshape(1, aa.size)
print(aa)
aa = aa.reshape(2, 2)
print(aa)
```
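Because `t` in the hard-coded example is one-hot with the 1 at index 3, the whole sum collapses to minus the log of the probability the network assigned to the correct class, which is exactly where the printed 0.0259727314115 comes from. The sketch below (not part of the original code, and reusing only the arrays hard-coded above) checks that value and shows that the same `/ y.shape[0]` division turns the sum into an average once the batch has more than one row.

```python
import numpy as np

y = np.array([1.00422942e-03, 1.34813308e-04, 1.90760253e-03, 9.74361539e-01,
              9.08520860e-07, 1.45274522e-02, 9.53066490e-07, 3.48790345e-04,
              7.43788807e-03, 2.75873637e-04])
t = np.array([0, 0, 0, 1, 0, 0, 0, 0, 0, 0])

# Only the index where t == 1 contributes, so the error is just -log of the
# probability given to the correct class (plus the small delta).
delta = 1e-7
print(-np.log(y[3] + delta))   # ~0.0259727, matching the value printed above

# With batch_size > 1 the same formula averages the per-sample errors,
# because the sum is divided by y.shape[0] (the number of rows).
batch_y = np.tile(y, (10, 1))  # pretend 10 identical predictions
batch_t = np.tile(t, (10, 1))
print(-np.sum(batch_t * np.log(batch_y + delta)) / batch_y.shape[0])  # same value
```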