2.3 Linear Algebra

Published 2023-05-24 10:26:26 Author: AncilunKiang

2.3.1 Scalars

import torch

A scalar is represented by a tensor with just one element, and it supports the familiar arithmetic operations.

x = torch.tensor(3.0)
y = torch.tensor(2.0)
x, y, x+y, x*y, x/y, x**y
(tensor(3.), tensor(2.), tensor(5.), tensor(6.), tensor(1.5000), tensor(9.))

2.3.2 Vectors

A vector is represented by a tensor with exactly one axis. Like an ordinary Python array, it can be accessed by index, and its length can be obtained with Python's built-in len function.

x = torch.arange(4)
x, x[3], len(x), x.shape
(tensor([0, 1, 2, 3]), tensor(3), 4, torch.Size([4]))

2.3.3 Matrices

A matrix is represented by a tensor with two axes. Individual scalar elements can be accessed via a row index and a column index, and operations such as transposition are supported.

A = torch.arange(20).reshape(5, 4)
A, A.T
(tensor([[ 0,  1,  2,  3],
         [ 4,  5,  6,  7],
         [ 8,  9, 10, 11],
         [12, 13, 14, 15],
         [16, 17, 18, 19]]),
 tensor([[ 0,  4,  8, 12, 16],
         [ 1,  5,  9, 13, 17],
         [ 2,  6, 10, 14, 18],
         [ 3,  7, 11, 15, 19]]))
B = torch.tensor([[1, 2, 3], [2, 0, 4], [3, 4, 5]])  # a symmetric matrix
B, B.T, B == B.T
(tensor([[1, 2, 3],
         [2, 0, 4],
         [3, 4, 5]]),
 tensor([[1, 2, 3],
         [2, 0, 4],
         [3, 4, 5]]),
 tensor([[True, True, True],
         [True, True, True],
         [True, True, True]]))

2.3.4 Tensors

In this section, tensor refers to the algebraic object; my understanding is that an array with multiple axes is a tensor.

X = torch.arange(24).reshape(2, 3, 4)
X
tensor([[[ 0,  1,  2,  3],
         [ 4,  5,  6,  7],
         [ 8,  9, 10, 11]],

        [[12, 13, 14, 15],
         [16, 17, 18, 19],
         [20, 21, 22, 23]]])

2.3.5 Basic Properties of Tensor Arithmetic

Any element-wise unary operation leaves the shape of its operand unchanged. Likewise, given any two tensors of the same shape, any element-wise binary operation produces a tensor of that same shape.

Specifically, the element-wise product of two matrices is called the Hadamard product (mathematical notation \(\odot\)).

A = torch.arange(20, dtype=torch.float32).reshape(5, 4)
B = A.clone()  # allocates new memory and copies A; B = A alone would only create a reference
A, A+B, A*B
(tensor([[ 0.,  1.,  2.,  3.],
         [ 4.,  5.,  6.,  7.],
         [ 8.,  9., 10., 11.],
         [12., 13., 14., 15.],
         [16., 17., 18., 19.]]),
 tensor([[ 0.,  2.,  4.,  6.],
         [ 8., 10., 12., 14.],
         [16., 18., 20., 22.],
         [24., 26., 28., 30.],
         [32., 34., 36., 38.]]),
 tensor([[  0.,   1.,   4.,   9.],
         [ 16.,  25.,  36.,  49.],
         [ 64.,  81., 100., 121.],
         [144., 169., 196., 225.],
         [256., 289., 324., 361.]]))

Adding a scalar to a tensor, or multiplying a tensor by one, does not change the tensor's shape either; the scalar is combined with each element of the tensor.

a = 2
X = torch.arange(24).reshape(2, 3, 4)
a+X, (a*X).shape
(tensor([[[ 2,  3,  4,  5],
          [ 6,  7,  8,  9],
          [10, 11, 12, 13]],
 
         [[14, 15, 16, 17],
          [18, 19, 20, 21],
          [22, 23, 24, 25]]]),
 torch.Size([2, 3, 4]))

2.3.6 Reduction

By default, calling the sum function reduces a tensor along all of its axes, turning it into a single scalar. We can also specify an axis along which the tensor should be summed and reduced.

x = torch.arange(4, dtype=torch.float32)
x, x.sum()  # reduces to a scalar
(tensor([0., 1., 2., 3.]), tensor(6.))
A.shape, A.sum()  # works for tensors of any shape
(torch.Size([5, 4]), tensor(190.))
A_sum_axis0 = A.sum(axis=0)  # sum along axis 0, dropping that axis
A_sum_axis0, A_sum_axis0.shape
(tensor([40., 45., 50., 55.]), torch.Size([4]))
A_sum_axis1 = A.sum(axis=1)  # sum along axis 1, dropping that axis
A_sum_axis1, A_sum_axis1.shape
(tensor([ 6., 22., 38., 54., 70.]), torch.Size([5]))
A.sum(axis=[0, 1])  # sum along axes 0 and 1 together, equivalent to A.sum()
tensor(190.)

A quantity related to the sum is the mean, which can be obtained by dividing the sum by the number of elements, or by calling the mean function directly. Like the sum, the mean can also reduce the tensor along a specified axis.

A.mean(), A.sum()/A.numel()
(tensor(9.5000), tensor(9.5000))
A.mean(axis=0), A.sum(axis=0)/A.shape[0]
(tensor([ 8.,  9., 10., 11.]), tensor([ 8.,  9., 10., 11.]))

Sometimes we need a non-reducing sum: summing along an axis while keeping the number of axes unchanged, so that A/sum_A can then be computed via the broadcasting mechanism.

sum_A = A.sum(axis=1, keepdim=True)
sum_A, A/sum_A
(tensor([[ 6.],
         [22.],
         [38.],
         [54.],
         [70.]]),
 tensor([[0.0000, 0.1667, 0.3333, 0.5000],
         [0.1818, 0.2273, 0.2727, 0.3182],
         [0.2105, 0.2368, 0.2632, 0.2895],
         [0.2222, 0.2407, 0.2593, 0.2778],
         [0.2286, 0.2429, 0.2571, 0.2714]]))
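Because each row of A was divided by its own sum, every row of A/sum_A should now sum to 1; a quick sanity check (a minimal sketch, reusing A and sum_A from above):

torch.allclose((A / sum_A).sum(axis=1), torch.ones(5))  # expect True: each row is normalized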

We can also call the cumsum function to compute the cumulative sum of elements along an axis; with axis=0 it accumulates row by row.

A, A.cumsum(axis=0)
(tensor([[ 0.,  1.,  2.,  3.],
         [ 4.,  5.,  6.,  7.],
         [ 8.,  9., 10., 11.],
         [12., 13., 14., 15.],
         [16., 17., 18., 19.]]),
 tensor([[ 0.,  1.,  2.,  3.],
         [ 4.,  6.,  8., 10.],
         [12., 15., 18., 21.],
         [24., 28., 32., 36.],
         [40., 45., 50., 55.]]))

2.3.7 Dot Products

The dot product is the element-wise product summed up. It can be computed with the dot function, which is equivalent to performing the multiplication first and then summing.

y = torch.ones(4, dtype=torch.float32)
x, y, torch.dot(x, y), torch.sum(x * y)
(tensor([0., 1., 2., 3.]), tensor([1., 1., 1., 1.]), tensor(6.), tensor(6.))
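Note that torch.dot only accepts 1-D tensors; for such inputs the @ operator (torch.matmul) computes the same scalar, as this minimal sketch shows:

x @ y, torch.matmul(x, y)  # both tensor(6.), same as torch.dot(x, y)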

2.3.8 Matrix-Vector Products

The matrix-vector product takes the dot product of each row of the matrix with the vector. Note that the column dimension of the matrix (its number of columns) must match the dimension of the vector (its length).

A.shape, x.shape, A, x, torch.mv(A, x)
(torch.Size([5, 4]),
 torch.Size([4]),
 tensor([[ 0.,  1.,  2.,  3.],
         [ 4.,  5.,  6.,  7.],
         [ 8.,  9., 10., 11.],
         [12., 13., 14., 15.],
         [16., 17., 18., 19.]]),
 tensor([0., 1., 2., 3.]),
 tensor([ 14.,  38.,  62.,  86., 110.]))
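Since each output element is just the dot product of one row of A with x, a short sketch can verify torch.mv by hand (reusing A and x from above):

torch.allclose(torch.stack([torch.dot(row, x) for row in A]), torch.mv(A, x))  # expect True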

2.3.9 Matrix-Matrix Multiplication

Matrix-matrix multiplication takes the dot product of each row of one matrix with each column of the other. Note that the column dimension of the first matrix must match the row dimension of the second. Do not confuse matrix multiplication with the Hadamard product.

B = torch.ones(4, 3)
A, B, torch.mm(A, B)
(tensor([[ 0.,  1.,  2.,  3.],
         [ 4.,  5.,  6.,  7.],
         [ 8.,  9., 10., 11.],
         [12., 13., 14., 15.],
         [16., 17., 18., 19.]]),
 tensor([[1., 1., 1.],
         [1., 1., 1.],
         [1., 1., 1.],
         [1., 1., 1.]]),
 tensor([[ 6.,  6.,  6.],
         [22., 22., 22.],
         [38., 38., 38.],
         [54., 54., 54.],
         [70., 70., 70.]]))
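Equivalently, entry (i, j) of torch.mm(A, B) is the dot product of row i of A with column j of B. A naive double-loop sketch verifies this (the @ operator computes the same matrix product):

C = torch.stack([torch.stack([torch.dot(A[i], B[:, j])
                              for j in range(B.shape[1])])
                 for i in range(A.shape[0])])
torch.allclose(C, torch.mm(A, B)), torch.allclose(A @ B, torch.mm(A, B))  # expect (True, True)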

2.3.10 Norms

The norm of a vector expresses the vector's size, where size refers not to its dimensionality but to the magnitude of its components.

In linear algebra, a vector norm is a function \(f\) that maps a vector to a scalar. Given any vector \(x\), a norm satisfies the following three properties (spot-checked numerically in the sketch after the list):

  1. Scaling a vector by a constant factor \(\alpha\) scales its norm by the absolute value of the same factor:

\[ f(\alpha x)=|\alpha|f(x) \]

  2. The triangle inequality:

\[ f(x+y)\le f(x)+f(y) \]

  3. Non-negativity:

\[f(x)\ge 0 \]
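These three properties can be checked numerically on a single example. A minimal sketch, using illustrative vectors v and w and \(\alpha=-2\) (it demonstrates the properties on one case rather than proving them):

v = torch.tensor([3.0, -4.0])
w = torch.tensor([1.0, 2.0])
alpha = -2.0
torch.norm(alpha * v) == abs(alpha) * torch.norm(v)  # scaling: tensor(True)
torch.norm(v + w) <= torch.norm(v) + torch.norm(w)   # triangle inequality: tensor(True)
torch.norm(v) >= 0                                   # non-negativity: tensor(True)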

Norms look very much like measures of distance. In fact, the Euclidean distance is an \(L_2\) norm: for an \(n\)-dimensional vector \(x\) with elements \(x_1,\dots,x_n\), the \(L_2\) norm is the square root of the sum of the squares of its elements:

\[||x||_2=\sqrt{\sum^n_{i=1}x^2_i} \]

In the \(L_2\) norm, the subscript 2 is often omitted, i.e. \(||x||_2\) is equivalent to \(||x||\).

The norm function computes the \(L_2\) norm of a vector.

u = torch.tensor([3.0, -4.0])
torch.norm(u)
tensor(5.)
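This agrees with evaluating the formula by hand (a sketch):

torch.sqrt((u ** 2).sum())  # sqrt(9 + 16) = tensor(5.)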

Deep learning also frequently uses the \(L_1\) norm, defined as the sum of the absolute values of the vector's elements:

\[||x||_1=\sum^n_{i=1}|x_i| \]

Compared with the \(L_2\) norm, the \(L_1\) norm is less affected by outliers. It can be computed by combining the abs and sum functions.

torch.abs(u).sum()
tensor(7.)

Both the \(L_2\) norm and the \(L_1\) norm are special cases of the more general \(L_p\) norm:

\[||x||_p=(\sum^n_{i=1}|x_i|^p)^{1/p} \]
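As an example, the \(L_3\) norm of the vector u above can be computed either directly from the formula or by passing p to the norm function (a minimal sketch; torch.norm accepts a numeric p for 1-D inputs):

(u.abs() ** 3).sum() ** (1 / 3), torch.norm(u, p=3)  # both approximately tensor(4.4979)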

Analogous to the \(L_2\) norm of a vector, the Frobenius norm of a matrix \(X\in \mathbb{R}^{m\times n}\) is the square root of the sum of the squares of its elements:

\[||X||_F=\sqrt{\sum^m_{i=1}\sum^n_{j=1}x^2_{ij}} \]

The Frobenius norm satisfies all the properties of a vector norm; it is like the \(L_2\) norm of the matrix treated as a vector.

The norm function can also be called to compute the Frobenius norm of a matrix.

torch.norm(torch.ones((4, 9)))
tensor(6.)
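This matches the formula: for a 4×9 matrix of ones the sum of squared elements is 36, and \(\sqrt{36}=6\). A sketch of the direct computation:

X = torch.ones((4, 9))
torch.sqrt((X ** 2).sum())  # tensor(6.), same as torch.norm(X)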

Exercises

(1) Prove that the transpose of a matrix \(A\)'s transpose is \(A\), i.e. \((A^T)^T=A\).

A = torch.randn(3, 4)
A, A.T, A.T.T, A==A.T.T, torch.equal(A, A.T.T)
(tensor([[-0.4384, -0.5538, -2.5270,  1.3256],
         [-0.4584,  0.5911,  1.3676, -0.7333],
         [ 0.5668, -1.3604,  1.3320, -0.5259]]),
 tensor([[-0.4384, -0.4584,  0.5668],
         [-0.5538,  0.5911, -1.3604],
         [-2.5270,  1.3676,  1.3320],
         [ 1.3256, -0.7333, -0.5259]]),
 tensor([[-0.4384, -0.5538, -2.5270,  1.3256],
         [-0.4584,  0.5911,  1.3676, -0.7333],
         [ 0.5668, -1.3604,  1.3320, -0.5259]]),
 tensor([[True, True, True, True],
         [True, True, True, True],
         [True, True, True, True]]),
 True)

(2) Given two matrices \(A\) and \(B\), show that the sum of their transposes equals the transpose of their sum, i.e. \(A^T+B^T=(A+B)^T\).

A, B = torch.randn(3, 4), torch.randn(3, 4)
A, B, A.T, B.T, A.T+B.T, (A+B).T, A.T+B.T==(A+B).T, torch.equal(A.T+B.T, (A+B).T)
(tensor([[ 0.8526, -0.1816,  0.1884, -0.5057],
         [ 0.1776,  0.6299, -0.1878, -0.2197],
         [-0.3169, -0.6792,  1.4165, -0.8142]]),
 tensor([[ 1.3895,  0.9179, -1.6885,  0.7068],
         [-0.8290,  0.6529, -0.6209, -0.1764],
         [-1.5397,  0.3814,  0.0838,  0.5798]]),
 tensor([[ 0.8526,  0.1776, -0.3169],
         [-0.1816,  0.6299, -0.6792],
         [ 0.1884, -0.1878,  1.4165],
         [-0.5057, -0.2197, -0.8142]]),
 tensor([[ 1.3895, -0.8290, -1.5397],
         [ 0.9179,  0.6529,  0.3814],
         [-1.6885, -0.6209,  0.0838],
         [ 0.7068, -0.1764,  0.5798]]),
 tensor([[ 2.2421, -0.6513, -1.8566],
         [ 0.7363,  1.2828, -0.2978],
         [-1.5000, -0.8088,  1.5002],
         [ 0.2011, -0.3961, -0.2345]]),
 tensor([[ 2.2421, -0.6513, -1.8566],
         [ 0.7363,  1.2828, -0.2978],
         [-1.5000, -0.8088,  1.5002],
         [ 0.2011, -0.3961, -0.2345]]),
 tensor([[True, True, True],
         [True, True, True],
         [True, True, True],
         [True, True, True]]),
 True)

(3) Given any square matrix \(A\), is \(A+A^T\) always symmetric? Why?

A = torch.randn(4, 4)
A, A.T, A+A.T, A+A.T==(A+A.T).T, torch.equal(A+A.T, (A+A.T).T)
(tensor([[-0.1174, -0.9523,  2.9669, -1.2442],
         [ 0.3419, -0.7263,  1.0194, -0.0063],
         [-1.2912, -0.4803,  0.6785,  1.3618],
         [-0.0641,  0.9961, -2.2250,  1.8944]]),
 tensor([[-0.1174,  0.3419, -1.2912, -0.0641],
         [-0.9523, -0.7263, -0.4803,  0.9961],
         [ 2.9669,  1.0194,  0.6785, -2.2250],
         [-1.2442, -0.0063,  1.3618,  1.8944]]),
 tensor([[-0.2348, -0.6105,  1.6758, -1.3084],
         [-0.6105, -1.4527,  0.5390,  0.9898],
         [ 1.6758,  0.5390,  1.3571, -0.8632],
         [-1.3084,  0.9898, -0.8632,  3.7888]]),
 tensor([[True, True, True, True],
         [True, True, True, True],
         [True, True, True, True],
         [True, True, True, True]]),
 True)
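Yes. Combining exercises (1) and (2): \((A+A^T)^T=A^T+(A^T)^T=A^T+A=A+A^T\), so \(A+A^T\) equals its own transpose and is therefore always symmetric.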

(4) This section defined a tensor \(X\) of shape (2, 3, 4). What is the output of len(X)?

X = torch.arange(24).reshape(2, 3, 4)
X, len(X)
(tensor([[[ 0,  1,  2,  3],
          [ 4,  5,  6,  7],
          [ 8,  9, 10, 11]],
 
         [[12, 13, 14, 15],
          [16, 17, 18, 19],
          [20, 21, 22, 23]]]),
 2)

(5) For a tensor \(X\) of arbitrary shape, does len(X) always correspond to the length of a particular axis of \(X\)? What is that axis?

import random

for i in range(10):
    shape = []
    for j in range(random.randint(1, 9)):  # generate a random shape
        shape.append(random.randint(1, 9))
    X = torch.zeros(shape)
    print(f'shape is {X.shape}, len(X) is {len(X)}')

# This shows that len() reports the length of axis 0
shape is torch.Size([1, 9, 3, 3, 2, 7, 2, 5, 5]), len(X) is 1
shape is torch.Size([8, 5, 7, 1, 3, 5, 5]), len(X) is 8
shape is torch.Size([5, 6, 3, 6, 5, 4, 9, 3]), len(X) is 5
shape is torch.Size([2, 3, 8, 9]), len(X) is 2
shape is torch.Size([7, 3, 3, 3, 8]), len(X) is 7
shape is torch.Size([3, 6, 6, 8, 6, 4, 5]), len(X) is 3
shape is torch.Size([2, 2]), len(X) is 2
shape is torch.Size([4, 9]), len(X) is 4
shape is torch.Size([6, 2, 6, 8, 3]), len(X) is 6
shape is torch.Size([6, 1, 2, 5]), len(X) is 6
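One caveat: a 0-dimensional (scalar) tensor has no axis 0 at all, so len is undefined for it. A minimal sketch of this edge case:

try:
    len(torch.tensor(3.0))  # a 0-d tensor has no axis 0
except TypeError as e:
    print(e)  # PyTorch reports something like "len() of a 0-d tensor"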

(6) Run A/A.sum(axis=1) and see what happens. Can you analyze the reason?

A, A.sum(axis=1)
# With the square 4x4 A used here, A/A.sum(axis=1) actually runs, but it divides by the
# wrong (column-aligned) sums; with a non-square A, broadcasting fails outright (see the sketch below)
(tensor([[-0.1174, -0.9523,  2.9669, -1.2442],
         [ 0.3419, -0.7263,  1.0194, -0.0063],
         [-1.2912, -0.4803,  0.6785,  1.3618],
         [-0.0641,  0.9961, -2.2250,  1.8944]]),
 tensor([0.6530, 0.6286, 0.2688, 0.6014]))
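To see the failure explicitly, here is a minimal sketch with a non-square matrix (the 5×4 shape used earlier in this section); shape (5,) cannot broadcast against shape (5, 4), so PyTorch raises a RuntimeError:

M = torch.arange(20, dtype=torch.float32).reshape(5, 4)
try:
    M / M.sum(axis=1)  # (5, 4) vs (5,): broadcasting fails
except RuntimeError as e:
    print(e)  # reports something like "The size of tensor a (4) must match the size of tensor b (5) ..."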
'''
See 2.3.6 and exercise (2) of 1.2.1:
keep the summed dimension (keepdim=True), then the division works via broadcasting
'''
sum_A = A.sum(axis=1, keepdim=True)
A, sum_A, A.shape, sum_A.shape, A/sum_A
(tensor([[-0.1174, -0.9523,  2.9669, -1.2442],
         [ 0.3419, -0.7263,  1.0194, -0.0063],
         [-1.2912, -0.4803,  0.6785,  1.3618],
         [-0.0641,  0.9961, -2.2250,  1.8944]]),
 tensor([[0.6530],
         [0.6286],
         [0.2688],
         [0.6014]]),
 torch.Size([4, 4]),
 torch.Size([4, 1]),
 tensor([[-0.1798, -1.4585,  4.5439, -1.9056],
         [ 0.5439, -1.1555,  1.6217, -0.0101],
         [-4.8027, -1.7867,  2.5239,  5.0654],
         [-0.1067,  1.6564, -3.6999,  3.1501]]))

(7) Consider a tensor of shape (2, 3, 4). What are the shapes of the summation outputs along axes 0, 1, and 2?

X = torch.randn([2, 3, 4])
X, X.shape, X.sum(axis=0), X.sum(axis=0).shape, X.sum(axis=1), X.sum(axis=1).shape, X.sum(axis=2), X.sum(axis=2).shape
(tensor([[[-0.1514, -0.2254, -1.1703, -0.4737],
          [-0.0562,  1.1885,  0.2306,  0.5065],
          [ 0.2709,  1.2751, -2.2213, -0.7125]],
 
         [[ 0.3613,  0.7764,  0.8976,  0.1476],
          [-0.1609,  0.3369, -0.9397, -1.1766],
          [ 0.3401, -0.6927, -1.8565,  0.9088]]]),
 torch.Size([2, 3, 4]),
 tensor([[ 0.2099,  0.5510, -0.2727, -0.3261],
         [-0.2171,  1.5254, -0.7091, -0.6701],
         [ 0.6111,  0.5825, -4.0777,  0.1963]]),
 torch.Size([3, 4]),
 tensor([[ 0.0634,  2.2382, -3.1609, -0.6797],
         [ 0.5405,  0.4206, -1.8986, -0.1202]]),
 torch.Size([2, 4]),
 tensor([[-2.0207,  1.8694, -1.3877],
         [ 2.1828, -1.9403, -1.3002]]),
 torch.Size([2, 3]))

(8) Feed a tensor with 3 or more axes to the linalg.norm function and observe its output. What does this function compute for a tensor of arbitrary shape?

A, B, C = torch.randn([2, 3, 4]), torch.randn([2, 3, 4, 5]), torch.randn([2, 3, 4, 5, 6])
A, B, C, torch.norm(A), torch.norm(B), torch.norm(C)  # it computes the square root of the sum of squares of all elements, i.e. the L2 norm of the flattened tensor
(tensor(...), tensor(...), tensor(...),  # full printouts of the three random tensors elided
 tensor(4.9148),
 tensor(10.2012),
 tensor(26.9727))