
Warm-up: numpy

Created On: Dec 03, 2020 | Last Updated: Dec 03, 2020 | Last Verified: Nov 05, 2024

A third order polynomial, trained to predict \(y=\sin(x)\) from \(-\pi\) to \(\pi\) by minimizing squared Euclidean distance.

This implementation uses numpy to manually compute the forward pass, loss, and backward pass.

A numpy array is a generic n-dimensional array; it does not know anything about deep learning, gradients, or computational graphs, and is just a way to perform generic numeric computations.

Output:

99 1314.5233605892595
199 920.7899404155552
299 646.2350531527254
399 454.6360023944252
499 320.82946175059794
599 227.31728163862482
699 161.92100612152032
799 116.15751554567778
899 84.112973707987
999 61.6614652248286
1099 45.922305972995275
1199 34.88278103310294
1299 27.135653234785412
1399 21.696372122544016
1499 17.87567982823961
1599 15.190755279644566
1699 13.303191474101487
1799 11.97567116033004
1899 11.041682683805794
1999 10.384336902439395
Result: y = -0.04054381247397627 + 0.8470191878859463 x + 0.006994482303712554 x^2 + -0.09194756824704707 x^3
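The hand-written gradients used in the training loop below follow from the chain rule applied to the summed squared error; a brief derivation:

```latex
\hat{y}_i = a + b x_i + c x_i^2 + d x_i^3, \qquad
L = \sum_i \left(\hat{y}_i - y_i\right)^2

\frac{\partial L}{\partial \hat{y}_i} = 2\left(\hat{y}_i - y_i\right)

\frac{\partial L}{\partial a} = \sum_i 2\left(\hat{y}_i - y_i\right), \quad
\frac{\partial L}{\partial b} = \sum_i 2\left(\hat{y}_i - y_i\right) x_i, \quad
\frac{\partial L}{\partial c} = \sum_i 2\left(\hat{y}_i - y_i\right) x_i^2, \quad
\frac{\partial L}{\partial d} = \sum_i 2\left(\hat{y}_i - y_i\right) x_i^3
```

Each of these sums corresponds one-to-one to a `grad_*` line in the code.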

import numpy as np
import math

# Create random input and output data
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Randomly initialize weights
a = np.random.randn()
b = np.random.randn()
c = np.random.randn()
d = np.random.randn()

learning_rate = 1e-6
for t in range(2000):
    # Forward pass: compute predicted y
    # y = a + b x + c x^2 + d x^3
    y_pred = a + b * x + c * x ** 2 + d * x ** 3

    # Compute and print loss
    loss = np.square(y_pred - y).sum()
    if t % 100 == 99:
        print(t, loss)

    # Backprop to compute gradients of a, b, c, d with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Update weights
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f'Result: y = {a} + {b} x + {c} x^2 + {d} x^3')
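As a sanity check (not part of the original tutorial), the same cubic least-squares fit can be computed in closed form with `np.polyfit`; after 2000 gradient-descent steps the learned coefficients should land close to these reference values.

```python
import math

import numpy as np

# Same data as in the tutorial script
x = np.linspace(-math.pi, math.pi, 2000)
y = np.sin(x)

# Closed-form least-squares cubic fit. np.polyfit returns
# coefficients from the highest degree down, so unpack in reverse.
d_ref, c_ref, b_ref, a_ref = np.polyfit(x, y, 3)

# sin is odd on a symmetric interval, so the even-degree
# coefficients a_ref and c_ref are numerically zero.
print(f'Reference: y = {a_ref} + {b_ref} x + {c_ref} x^2 + {d_ref} x^3')
```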

**Total running time of the script:** (0 minutes 0.555 seconds)

Gallery generated by Sphinx-Gallery
