How to Easily Patch Fedora Packages

While many Linux users are developers, only a small fraction of them feel comfortable meddling with or extending their distribution's packaging. When an upstream package has a pending feature, or when a package has a bug, users usually wait for the distribution maintainers to fix it, or resort to independent solutions such as Homebrew for rebuilding upstream projects.

Packaging software for a distribution entails dealing with the build system of an unfamiliar sub-project, and the learning curve can deter developers from tackling it. Luckily, in Fedora Linux we have good tools for dealing with each package's own quirks, which make it surprisingly easy to build modified versions.

Computing Symbolic Gradient Vectors with Plain Haskell

While writing my previous post, I was curious how easy it would be to implement TensorFlow's automatic differentiation for back propagation. TensorFlow's website calls it 'automatic differentiation', but in fact it probably performs 'symbolic differentiation', as mentioned in its white paper. The difference between the two is whether the differentiation is done during the original computation or beforehand. It makes sense to do the latter, because then you can maintain a separate computational graph of the back propagation to perform the updates.
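The post itself works this out in Haskell, but the core idea can be sketched in a few lines of any language: represent the computation as a small expression tree, and derive a new tree for the gradient before evaluating anything, so the derivative gets its own separate "graph". The following is a minimal illustrative Python sketch of that idea, not the post's code; all names in it are made up for the example.

```python
# Minimal sketch of symbolic differentiation: expressions are plain trees,
# and diff() builds a *new* expression tree for the derivative ahead of time,
# which is the "separate computational graph" idea mentioned above.
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class Const:
    value: float

@dataclass
class Add:
    left: object
    right: object

@dataclass
class Mul:
    left: object
    right: object

def diff(expr, wrt):
    """Return a new expression tree representing d(expr)/d(wrt)."""
    if isinstance(expr, Var):
        return Const(1.0 if expr.name == wrt else 0.0)
    if isinstance(expr, Const):
        return Const(0.0)
    if isinstance(expr, Add):
        return Add(diff(expr.left, wrt), diff(expr.right, wrt))
    if isinstance(expr, Mul):  # product rule
        return Add(Mul(diff(expr.left, wrt), expr.right),
                   Mul(expr.left, diff(expr.right, wrt)))
    raise TypeError(expr)

def evaluate(expr, env):
    """Evaluate an expression tree given variable bindings."""
    if isinstance(expr, Var):
        return env[expr.name]
    if isinstance(expr, Const):
        return expr.value
    if isinstance(expr, Add):
        return evaluate(expr.left, env) + evaluate(expr.right, env)
    if isinstance(expr, Mul):
        return evaluate(expr.left, env) * evaluate(expr.right, env)
    raise TypeError(expr)

# y = x*x + 3x; dy/dx is built once as its own tree, then evaluated at x = 2.
x = Var("x")
y = Add(Mul(x, x), Mul(Const(3.0), x))
dy_dx = diff(y, "x")
print(evaluate(dy_dx, {"x": 2.0}))  # prints 7.0, i.e. 2x + 3 at x = 2
```

The point of the sketch is only the separation of stages: the derivative expression is constructed once, up front, and can then be evaluated (or updated against) as many times as needed, independently of the original forward computation.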

Back Propagation with TensorFlow

(Updated for TensorFlow 1.0 on March 6th, 2017)

When I first read about neural networks in Michael Nielsen's Neural Networks and Deep Learning, I was excited to find a good source that explains the material along with actual code. However, there was a rather steep jump between the part that describes the basic math and the part that implements it, and it was especially apparent in the numpy-based code that implements backward propagation.
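To give a sense of the step between the math and the numpy code (this is a minimal, hypothetical sketch, not Nielsen's code or the code from the post), here is the forward and backward pass for a single sigmoid layer trained with a quadratic cost:

```python
# Minimal numpy sketch: forward and backward pass for one sigmoid layer
# with a quadratic (squared-error) cost, followed by a gradient-descent step.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))      # weights: 2 inputs -> 3 outputs
b = rng.standard_normal((3, 1))      # biases
x = rng.standard_normal((2, 1))      # one input column vector
y = np.array([[0.0], [1.0], [0.0]])  # target output

# Forward pass.
z = W @ x + b
a = sigmoid(z)

# Backward pass: delta = dC/dz for the cost C = 0.5 * ||a - y||^2.
delta = (a - y) * sigmoid_prime(z)
grad_W = delta @ x.T   # dC/dW
grad_b = delta         # dC/db

# One gradient-descent update.
eta = 0.1
W -= eta * grad_W
b -= eta * grad_b
```

Extending this by hand to many layers and mini-batches is exactly where the bookkeeping gets heavy, which is the gap the post tries to bridge with TensorFlow.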


© 2010-2020, Dan Aloni