Complex is bad. Simple is good. So it would be best to use simple formulas, or complex ones for training our brains and going insane (if we don't watch out).

clayt wrote: ↑Mon Jun 13, 2022 7:33 pm
This isn't exactly correct: backpropagation is a technique for computing the gradient of multi-layer neural nets. What we're doing here is actually just a single-layer neural net (also known as an adaptive linear element, or Adaline), which does not require backpropagation at all.
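To illustrate the point in the quote, here is a minimal sketch (the data, learning rate, and variable names are all made up for illustration, not taken from the thread). For a single linear unit, the gradient of the MSE cost can be written down in closed form and applied directly; no backward pass through hidden layers is needed.

```python
import numpy as np

# Synthetic data for illustration: 100 samples, 3 features,
# targets generated by a known linear rule plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([0.5, -1.0, 2.0])
y = X @ true_w + 0.1 * rng.normal(size=100)

w = np.zeros(3)   # weights of the single linear unit (Adaline)
lr = 0.05         # learning rate, chosen arbitrarily

for _ in range(200):
    pred = X @ w                 # linear output, no activation
    err = pred - y
    grad = X.T @ err / len(y)    # closed-form gradient of the MSE cost
    w -= lr * grad               # plain gradient-descent step

print(w)  # converges toward true_w
```

The key line is `grad = X.T @ err / len(y)`: with one layer, the chain rule collapses to a single matrix product, which is exactly why backpropagation is unnecessary here.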
Computing the gradient of the MSE cost function