Tags: gradient, complex-numbers, autodiff, tensorflow
I am working with complex-valued neural networks.
For complex-valued neural networks, Wirtinger calculus is normally used, since the functions involved are non-holomorphic (by Liouville's theorem, a bounded entire function must be constant, so a useful bounded network cannot be holomorphic). The derivative is then defined via the Wirtinger operators.
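For reference, writing $z = x + iy$, the two Wirtinger derivatives are:

$$\frac{\partial f}{\partial z} \;=\; \frac{1}{2}\left(\frac{\partial f}{\partial x} - i\,\frac{\partial f}{\partial y}\right), \qquad \frac{\partial f}{\partial z^*} \;=\; \frac{1}{2}\left(\frac{\partial f}{\partial x} + i\,\frac{\partial f}{\partial y}\right).$$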
In Akira Hirose's book "Complex-Valued Neural Networks: Advances and Applications", Chapter 4, equation 4.9 defines:

Here the partial derivatives are, of course, also computed using Wirtinger calculus.
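For reference (and without claiming to reproduce the book's exact notation), the convention used throughout the CVNN literature for a real-valued error $E$ and a complex weight $w = w^R + i\,w^I$ is to take the gradient along the conjugate Wirtinger derivative:

$$\nabla_w E \;=\; 2\,\frac{\partial E}{\partial w^*} \;=\; \frac{\partial E}{\partial w^R} + i\,\frac{\partial E}{\partial w^I},$$

which is the direction of steepest ascent of $E$.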
Is this also how TensorFlow defines its gradients, or is it defined in some other way? I cannot find any good reference on the topic.
OK, so I asked this on an existing thread in github/tensorflow, and @charmasaur found the answer. The equation TensorFlow uses for the gradient is:

$$\nabla_z f \;=\; \overline{\frac{\partial f}{\partial z}} + \frac{\partial f}{\partial z^*},$$

where the partial derivatives with respect to $z$ and $z^*$ are the Wirtinger derivatives, so this is indeed Wirtinger calculus. For the case of a real-valued scalar function of one or more complex variables, the definition becomes:

$$\nabla_z f \;=\; 2\,\frac{\partial f}{\partial z^*} \;=\; \overline{2\,\frac{\partial f}{\partial z}} \;=\; \frac{\partial f}{\partial x} + i\,\frac{\partial f}{\partial y}.$$

This is indeed the definition used in complex-valued neural network (CVNN) applications (there, the function being differentiated is the loss/error function, which is indeed real-valued).
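As a quick sanity check, here is a minimal sketch (assuming TensorFlow 2.x in eager mode; the loss $|z|^2$ and the point $3 + 4i$ are just an example) of what the convention above predicts:

```python
import tensorflow as tf

# Real-valued loss f(z) = |z|^2 = x^2 + y^2 at z = 3 + 4j.
# Under the convention above, the gradient should be
# df/dx + i*df/dy = 2x + 2i*y, i.e. 2*conj(df/dz) = 2*z.
z = tf.Variable(tf.complex(3.0, 4.0))

with tf.GradientTape() as tape:
    loss = tf.math.real(z * tf.math.conj(z))  # |z|^2, kept explicitly real

grad = tape.gradient(loss, z)
print(grad.numpy())  # expected: (6+8j)
```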