Implementing a 3-Layer Neural Network

Forward propagation from layer 0 to layer 1:
```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))  # numpy broadcasting applies this elementwise

X = np.array([1.0, 0.5])
W1 = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]])
B1 = np.array([0.1, 0.2, 0.3])

A1 = np.dot(X, W1) + B1
print(A1)
Z1 = sigmoid(A1)
print(Z1)
```
Layer 0 → 1 computes A1 = X·W1 + B1, using sigmoid as the activation function (http://tastydarr.tistory.com/137).
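As a sanity check (a worked example, not part of the original post), the first element of A1 can be computed by hand and compared against `np.dot`:

```python
import numpy as np

X = np.array([1.0, 0.5])
W1 = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]])
B1 = np.array([0.1, 0.2, 0.3])

# By hand: A1[0] = 1.0*0.1 + 0.5*0.2 + 0.1 = 0.3
a1_0 = X[0] * W1[0, 0] + X[1] * W1[1, 0] + B1[0]
A1 = np.dot(X, W1) + B1
print(a1_0, A1[0])  # both ~0.3, up to float rounding
```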
Step-by-step implementation:
```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))  # numpy broadcasting applies this elementwise

def identity_function(x):
    # The output layer's activation is chosen to fit the problem:
    # e.g. identity for regression, sigmoid for binary classification,
    # softmax for multi-class classification.
    return x

# Layer 0 -> 1
X = np.array([1.0, 0.5])
W1 = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]])
B1 = np.array([0.1, 0.2, 0.3])

A1 = np.dot(X, W1) + B1
print(A1)
Z1 = sigmoid(A1)
print(Z1)

# Layer 1 -> 2
W2 = np.array([[0.1, 0.4], [0.2, 0.5], [0.3, 0.6]])
B2 = np.array([0.1, 0.2])

A2 = np.dot(Z1, W2) + B2
print(A2)
Z2 = sigmoid(A2)
print(Z2)

# Layer 2 -> output
W3 = np.array([[0.1, 0.3], [0.2, 0.4]])
B3 = np.array([0.1, 0.2])

A3 = np.dot(Z2, W3) + B3
print(A3)
Y = identity_function(A3)
print(Y)
```
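One way to see why these particular weight shapes were chosen (a small illustration, not from the original post): each `np.dot` only works if the inner dimensions match, so the shapes must chain across layers.

```python
import numpy as np

# Shapes must chain: (2,) x (2,3) -> (3,), (3,) x (3,2) -> (2,), (2,) x (2,2) -> (2,)
X = np.array([1.0, 0.5])
W1 = np.zeros((2, 3)); B1 = np.zeros(3)
W2 = np.zeros((3, 2)); B2 = np.zeros(2)
W3 = np.zeros((2, 2)); B3 = np.zeros(2)

Z1 = np.dot(X, W1) + B1
Z2 = np.dot(Z1, W2) + B2
Y = np.dot(Z2, W3) + B3
print(Z1.shape, Z2.shape, Y.shape)  # (3,) (2,) (2,)
```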
A slightly more programmatic version. To do this properly, init_network would initialize the weights and biases randomly (as when building a regression model in TensorFlow) and the values would then be found by gradient descent; same principle.
```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))  # numpy broadcasting applies this elementwise

def identity_function(x):
    # The output layer's activation is chosen to fit the problem:
    # e.g. identity for regression, sigmoid for binary classification,
    # softmax for multi-class classification.
    return x

def init_network():
    W1 = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]])
    B1 = np.array([0.1, 0.2, 0.3])
    W2 = np.array([[0.1, 0.4], [0.2, 0.5], [0.3, 0.6]])
    B2 = np.array([0.1, 0.2])
    W3 = np.array([[0.1, 0.3], [0.2, 0.4]])
    B3 = np.array([0.1, 0.2])
    network = [[W1, B1], [W2, B2], [W3, B3]]
    return network

def forward(network, x, out):
    for idx in range(len(network)):
        if idx == 0:
            a = np.dot(x, network[idx][0]) + network[idx][1]  # starts from x
            out.append(sigmoid(a))
            continue
        if idx == len(network) - 1:
            a = np.dot(out[-1], network[idx][0]) + network[idx][1]
            out.append(identity_function(a))  # ends with identity_function
            return out[-1]  # returning the whole list would be more flexible
        # bug fix: was network[1], which only worked by accident for 3 layers
        a = np.dot(out[-1], network[idx][0]) + network[idx][1]
        out.append(sigmoid(a))

network = init_network()
out = []

x = np.array([1.0, 0.5])
y = forward(network, x, out)
print(y)
```
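Following the random-initialization idea above, init_network could be sketched like this. This is a hypothetical variant (the function name, layer sizes, and Gaussian scale are my assumptions, not from the original): weights are drawn from a small Gaussian and biases start at zero, which is the usual starting point before gradient descent finds the real values. Actual training would also need a loss function and backpropagation, which are out of scope here.

```python
import numpy as np

def init_network_random(sizes=(2, 3, 2, 2), scale=0.01, seed=0):
    # Hypothetical sketch: small random weights, zero biases,
    # one [W, b] pair per layer transition, same layout as init_network above.
    rng = np.random.default_rng(seed)
    network = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        W = scale * rng.standard_normal((n_in, n_out))
        b = np.zeros(n_out)
        network.append([W, b])
    return network

network = init_network_random()
print([(W.shape, b.shape) for W, b in network])
# shapes match the hand-written network: (2,3), (3,2), (2,2)
```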