
Understanding "backward": how to write the PyTorch function ".backward" from scratch?


I'm new to deep learning, and I've been trying to understand what PyTorch's '.backward()' does, since it does nearly all of the work there. So I'm trying to understand in detail what the backward function does, and I'd like to try writing its code step by step. Can you recommend any resources (books, videos, GitHub repositories) to get started coding this function? Thank you for your time, and I hope you can help me.

Solution

backward() computes the gradients with respect to (w.r.t.) the leaves of the graph. The grad() function is more general: it can compute the gradient w.r.t. any inputs (including leaves).
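For context, here is how the two show up in PyTorch itself. A minimal sketch (the tensor values here are arbitrary, chosen only for illustration):

import torch

x = torch.tensor(0.5, requires_grad=True)  # a graph leaf
h = x * x                                  # an intermediate (non-leaf) node
y = h.sin()

# backward() accumulates gradients into the .grad attribute of the leaves
y.backward(retain_graph=True)
print(x.grad)                              # dy/dx = cos(x^2) * 2x

# autograd.grad() can return the gradient w.r.t. any node, including non-leaves
(dy_dh,) = torch.autograd.grad(y, h)
print(dy_dh)                               # dy/dh = cos(x^2)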

I implemented a grad() function a while ago; you can have a look at it. It uses automatic differentiation (AD): each operation records its local partial derivatives in the operand nodes, and grad() then combines them with the chain rule.

import math

class ADNumber:

    def __init__(self, val, name=""):
        self.name = name
        self._val = val
        # (local partial derivative, consumer node) pairs, filled in as
        # this number is used in later operations
        self._children = []
        
    def __truediv__(self, other):
        new = ADNumber(self._val / other._val, name=f"{self.name}/{other.name}")
        self._children.append((1.0/other._val, new))              # d(x/y)/dx = 1/y
        other._children.append((-self._val/other._val**2, new))   # d(x/y)/dy = -x/y^2
        return new

    def __mul__(self, other):
        new = ADNumber(self._val*other._val, name=f"{self.name}*{other.name}")
        self._children.append((other._val, new))   # d(x*y)/dx = y
        other._children.append((self._val, new))   # d(x*y)/dy = x
        return new

    def __add__(self, other):
        if isinstance(other, (int, float)):  # promote scalars, so sin(x2)+1 works
            other = ADNumber(other, str(other))
        new = ADNumber(self._val+other._val, name=f"{self.name}+{other.name}")
        self._children.append((1.0, new))
        other._children.append((1.0, new))
        return new

    def __sub__(self,other):
        new = ADNumber(self._val-other._val,name=f"{self.name}-{other.name}")
        self._children.append((1.0,new))
        other._children.append((-1.0,new))
        return new
    
            
    @staticmethod
    def exp(self):
        # staticmethod so it can be used as a free function: exp(x)
        new = ADNumber(math.exp(self._val), name=f"exp({self.name})")
        self._children.append((math.exp(self._val), new))  # derivative of exp(x) is exp(x)
        return new

    @staticmethod
    def sin(self):
        new = ADNumber(math.sin(self._val), name=f"sin({self.name})")
        self._children.append((math.cos(self._val), new))  # derivative of sin(x) is cos(x)
        return new
    
    def grad(self, other):
        # d(self)/d(other), by the chain rule: sum over every node `child`
        # that consumes `other` of (d child / d other) * (d self / d child)
        if self == other:
            return 1.0
        result = 0.0
        for local_deriv, child in other._children:
            result += local_deriv * self.grad(child)
        return result
        
A = ADNumber # shortcuts
sin = A.sin
exp = A.exp

def print_childs(f, wrt):  # wrt: name of the node whose children we walk
    for e in f._children:
        print("child:", wrt, "->", e[1].name, "grad: ", e[0])
        print_childs(e[1], e[1].name)
        
    
x1 = A(1.5,name="x1")
x2 = A(0.5,name="x2")
f=(sin(x2)+1)/(x2+exp(x1))+x1*x2

print_childs(x2,"x2")
print("\ncalculated gradient for the function f with respect to x2:",f.grad(x2))

Out:

child: x2 -> sin(x2) grad:  0.8775825618903728
child: sin(x2) -> sin(x2)+1 grad:  1.0
child: sin(x2)+1 -> sin(x2)+1/x2+exp(x1) grad:  0.20073512936690338
child: sin(x2)+1/x2+exp(x1) -> sin(x2)+1/x2+exp(x1)+x1*x2 grad:  1.0
child: x2 -> x2+exp(x1) grad:  1.0
child: x2+exp(x1) -> sin(x2)+1/x2+exp(x1) grad:  -0.05961284871202578
child: sin(x2)+1/x2+exp(x1) -> sin(x2)+1/x2+exp(x1)+x1*x2 grad:  1.0
child: x2 -> x1*x2 grad:  1.5
child: x1*x2 -> sin(x2)+1/x2+exp(x1)+x1*x2 grad:  1.0

calculated gradient for the function f with respect to x2: 1.6165488003791766
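
If you want to cross-check the hand-rolled grad() against PyTorch, a minimal sketch (assuming torch is installed) computes the same derivative of the same function:

import torch

x1 = torch.tensor(1.5, requires_grad=True)
x2 = torch.tensor(0.5, requires_grad=True)
f = (torch.sin(x2) + 1) / (x2 + torch.exp(x1)) + x1 * x2

f.backward()
print(x2.grad)  # should print roughly 1.6165, matching grad() above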
