Wiredwisdom
Deep-learning point
`R^2 = 1 - SSE/SST = 1 - (Σ(y_target - y_predict)^2) / (Σ(y_target - ȳ)^2)`

Here SSE is the residual sum of squares, SST is the total sum of squares, and ȳ is the mean of the targets.
R² > 0.8: very good model
R² 0.6–0.8: decent model
R² 0.4–0.6: model with room for improvement
R² < 0.4: model needing further improvement
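The formula above can be computed directly. A minimal sketch (the function name `r_squared` and the sample arrays are my own, chosen for illustration):

```python
import numpy as np

def r_squared(y_target, y_predict):
    """R^2 = 1 - SSE/SST: the share of target variance the model explains."""
    sse = np.sum((y_target - y_predict) ** 2)        # residual sum of squares
    sst = np.sum((y_target - y_target.mean()) ** 2)  # total sum of squares about the mean
    return 1.0 - sse / sst

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])
print(r_squared(y_true, y_pred))  # ≈ 0.98, i.e. a "very good" fit by the table above
```

A perfect prediction gives R² = 1, and predicting the mean for every point gives R² = 0.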
Benefits of normalizing the input data
1. It prevents gradient explosion: when inputs are large, gradients can grow without bound and training diverges.
2. Effect through the chain rule
x = input
w = weight
z = w * x          # first linear step
y = sigmoid(z)     # activation function
L = (target - y)²  # loss function
`∂L/∂w = ∂L/∂y · ∂y/∂z · ∂z/∂w`
∂L/∂y = -2(target - y)  # derivative of the loss
∂y/∂z = y * (1-y)       # derivative of the sigmoid
∂z/∂w = x               # derivative of the linear step
`∂L/∂w = -2(target - y) · y(1-y) · x`
Assume target = 1 and y_predict = 0.5, so ∂y/∂z = 0.5 · (1 - 0.5) = 0.25:
# small input
x = 2    → gradient = -2 · (0.5) · 0.25 · 2    = -0.5  # stable value
# large input
x = 1000 → gradient = -2 · (0.5) · 0.25 · 1000 = -250  # very large value
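The worked numbers above can be reproduced with a few lines of Python (a sketch under the same assumption y = 0.5; the function name `gradient_wrt_w` is my own):

```python
# Chain-rule gradient for L = (target - y)^2 with y = sigmoid(w*x).
# y is fixed at 0.5 as in the text, so dy/dz = y*(1-y) = 0.25.
def gradient_wrt_w(x, y=0.5, target=1.0):
    dL_dy = -2.0 * (target - y)  # derivative of the loss
    dy_dz = y * (1.0 - y)        # derivative of the sigmoid
    dz_dw = x                    # derivative of the linear step
    return dL_dy * dy_dz * dz_dw

print(gradient_wrt_w(2))     # -0.5   : stable
print(gradient_wrt_w(1000))  # -250.0 : scales linearly with the raw input
```

Because ∂z/∂w = x appears as a factor, the gradient scales directly with the input magnitude, which is exactly why normalization tames it.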
3. Gradient saturation
For activations like the sigmoid, whose output converges to 0 or 1 at the extremes, a large input pushes the unit into the flat region; the output saturates, the gradient vanishes, and learning stalls.
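Saturation is visible in the sigmoid's derivative y·(1−y), which peaks at 0.25 when z = 0 and collapses toward zero for large |z|. A small sketch (function names are my own):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    y = sigmoid(z)
    return y * (1.0 - y)  # maximal (0.25) at z = 0, vanishes as |z| grows

for z in (0.0, 2.0, 10.0):
    print(f"z = {z:5.1f}  dy/dz = {sigmoid_grad(z):.2e}")
```

At z = 10 the gradient is already below 1e-4, so weights feeding a saturated unit barely update; keeping inputs normalized keeps z near the responsive region.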