
Implementing a Single-Layer Neural Network in PyTorch with softmax and sigmoid


7. Example: Implementing a Single-Layer Neural Network

This section walks through the full neural-network workflow using a multi-class classification example on the iris dataset. A typical training procedure involves several key steps: preparing the data, initializing the weights, choosing activation functions, running the forward pass, defining a loss function, computing the loss, backpropagating the error, and updating the parameters, repeated until the loss converges or a stopping condition is reached.
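One detail worth noting before the listing: it pairs F.log_softmax outputs with F.nll_loss. This split form is numerically equivalent to applying F.cross_entropy (or nn.CrossEntropyLoss) directly to the raw logits; a minimal sketch with made-up shapes illustrates the equivalence:

import torch
import torch.nn.functional as F

logits = torch.randn(8, 3)                # hypothetical batch of raw scores
target = torch.randint(0, 3, (8,))        # hypothetical class labels

loss_a = F.nll_loss(F.log_softmax(logits, dim=1), target)   # split form used in the listing
loss_b = F.cross_entropy(logits, target)                    # fused equivalent
print(torch.allclose(loss_a, loss_b))     # True

The fused form is the usual choice in current code; the split form used in this article makes the softmax step explicit, which suits a tutorial.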

import torch
from torch import sigmoid
import torch.nn.functional as F
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from torch.autograd import Variable
from torch.optim import SGD

N_FEATURE = 4
N_HIDDEN = 5
N_OUTPUT = 4   # iris has only 3 classes, so 3 would be the natural choice; the
               # original's 4 still trains because class 3 never appears as a target
N_ITERS = 1000
LR = 0.5

# Check whether a GPU is available
use_cuda = torch.cuda.is_available()
print("use_cuda: ", use_cuda)

# Load the dataset
iris = load_iris()
print(iris.keys())

# Preprocess the data
x = iris['data']
y = iris['target']
print('x.shape: ', x.shape)
print('y.shape: ', y.shape)
print(y)
x = torch.FloatTensor(x)
y = torch.LongTensor(y)
x = Variable(x)   # Variable is a no-op wrapper since PyTorch 0.4; kept for fidelity
y = Variable(y)

# Define the network class, inheriting from torch.nn.Module
class Net(torch.nn.Module):
    # Constructor
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        # Hidden layer
        self.hidden = torch.nn.Linear(n_feature, n_hidden)
        # Output layer
        self.predict = torch.nn.Linear(n_hidden, n_output)

    # Forward pass
    def forward(self, x):
        # Hidden layer with sigmoid activation
        x = sigmoid(self.hidden(x))
        # Output layer with log_softmax (multi-class classification)
        out = F.log_softmax(self.predict(x), dim=1)
        return out

# Instantiate the network
net = Net(n_feature=N_FEATURE, n_hidden=N_HIDDEN, n_output=N_OUTPUT)
print(net)

# If a GPU is available, move the data and the model onto it;
# call .cpu() to move them back to the CPU
if use_cuda:
    x = x.cuda()
    y = y.cuda()
    net = net.cuda()

# Define the optimizer: stochastic gradient descent (SGD) with learning rate lr=0.5
optimizer = SGD(net.parameters(), lr=LR)

# Train the network
px, py = [], []
for i in range(N_ITERS):
    # Forward pass: feed the whole dataset through the network
    prediction = net(x)
    # Compute the loss; nll_loss expects the log-probabilities from log_softmax
    loss = F.nll_loss(prediction, y)
    # Clear the gradients accumulated in the previous step
    optimizer.zero_grad()
    # Backpropagate the error
    loss.backward()
    # Update the parameters
    optimizer.step()
    # Print and record the current iteration index and loss
    print(i, " loss: ", loss.item())
    px.append(i)
    py.append(loss.item())

# Plot the loss curve
plt.figure(figsize=(6, 4), dpi=144)
plt.plot(px, py, 'r-', lw=1)
plt.yticks([t * 0.1 for t in range(16)])
plt.show()
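A side note on the .cuda() calls above: they work, but current PyTorch style is to pick a torch.device once and move tensors and modules with .to(device). A minimal sketch of that idiom (the nn.Linear here is a stand-in for the Net defined above):

import torch
from torch import nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
net = nn.Linear(4, 3).to(device)          # any module; stands in for Net above
x = torch.randn(8, 4, device=device)      # create tensors directly on the device
out = net(x)                              # runs on GPU when available, CPU otherwise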

Output of the full listing

use_cuda:  True
dict_keys(['data', 'target', 'target_names', 'DESCR', 'feature_names'])
x.shape:  (150, 4)
y.shape:  (150,)
[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
 2 2]
Net(
  (hidden): Linear(in_features=4, out_features=5, bias=True)
  (predict): Linear(in_features=5, out_features=4, bias=True)
)
0  loss:  1.3780320882797241
1  loss:  1.244009017944336
2  loss:  1.1953575611114502
3  loss:  1.1671373844146729
4  loss:  1.147063136100769
5  loss:  1.1312648057937622
6  loss:  1.1179096698760986
7  loss:  1.10591721534729
8  loss:  1.0945632457733154
9  loss:  1.0833168029785156
10  loss:  1.0717663764953613
...
990  loss:  0.08581560850143433
991  loss:  0.0774485245347023
992  loss:  0.08574181795120239
993  loss:  0.07738661766052246
994  loss:  0.08566807955503464
995  loss:  0.07732491940259933
996  loss:  0.08559456467628479
997  loss:  0.07726352661848068
998  loss:  0.08552153408527374
999  loss:  0.0772024393081665
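The listing tracks only the loss and never measures classification accuracy. A minimal follow-up sketch, assuming the net, x, and y variables from the session above (for a real evaluation you would hold out a test split, e.g. with sklearn.model_selection.train_test_split):

import torch

with torch.no_grad():                 # no gradients needed for evaluation
    log_probs = net(x)                # log-probabilities, shape (150, 4)
    pred = log_probs.argmax(dim=1)    # most likely class per sample
    acc = (pred == y).float().mean().item()
print("train accuracy: ", acc)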

Author: 金字塔下的小蜗牛
Source: 简书, https://www.jianshu.com/p/c2436d151744