Differentiable programming refers to programs in which at least one component is adjusted, or "rewritten", automatically by an optimization algorithm such as gradient descent, in the same way a neural network's weights are updated during training.
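As a minimal sketch of this idea (using the JAX library for automatic differentiation; the component, its quadratic form, and the target value are illustrative assumptions rather than anything prescribed above), an ordinary function of some parameters can be differentiated automatically and its parameters updated by gradient descent:

```python
import jax
import jax.numpy as jnp

def component(theta, x):
    # An ordinary, differentiable piece of a program: here a small
    # quadratic transformation parameterized by theta (illustrative).
    return theta[0] * x ** 2 + theta[1] * x

def objective(theta):
    # Scalar objective: squared distance of the component's output at
    # x = 2.0 from an (assumed) target value of 5.0.
    return (component(theta, 2.0) - 5.0) ** 2

grad_fn = jax.grad(objective)      # automatic differentiation w.r.t. theta

theta = jnp.array([1.0, 1.0])
for _ in range(100):
    theta = theta - 0.01 * grad_fn(theta)   # plain gradient-descent update

print(theta, objective(theta))     # theta has been "rewritten" to meet the target
```

The point of the sketch is that `component` is ordinary code; because it is differentiable end to end, gradients can flow through it and an optimizer can adjust its parameters automatically.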
Applications of differentiable programming include combining deep learning with physics engines in robotics, analyzing electronic structure with differentiable density functional theory, and differentiable ray tracing.
Differentiable programming allows companies to build new programs out of weighted parameters that are trained from examples using gradient-based optimization; such programs can then improve parts of themselves by following gradients.
Differentiable programming uses automatic differentiation and gradient-based optimization to minimize a loss function; neural networks are a prominent example of programs built this way. In contrast, the probabilistic programming approach uses various forms of Markov chain Monte Carlo (MCMC) and differentiable inference to approximate a probability density function.
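The differentiable-programming side of that contrast can be sketched as follows (again with JAX; the toy linear model, the synthetic examples, and the mean-squared-error loss are assumptions made for illustration): weighted parameters are fitted to examples by repeatedly stepping down the gradient of the loss.

```python
import jax
import jax.numpy as jnp

# Toy example set: targets generated from y = 3x + 1 (an assumed relationship).
xs = jnp.linspace(-1.0, 1.0, 20)
ys = 3.0 * xs + 1.0

def predict(params, x):
    w, b = params
    return w * x + b

def loss(params, xs, ys):
    # Mean-squared-error loss over all examples.
    return jnp.mean((predict(params, xs) - ys) ** 2)

grad_loss = jax.grad(loss)          # gradient of the loss w.r.t. the parameters

params = (0.0, 0.0)
for _ in range(500):
    g = grad_loss(params, xs, ys)
    # Gradient-based optimization: step each weighted parameter downhill.
    params = tuple(p - 0.1 * gi for p, gi in zip(params, g))

print(params)                       # approaches (3.0, 1.0)
```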
Differentiable programming consists of applying the techniques of deep learning to complex existing programs in order to reuse the knowledge embedded in them.