
Prolog is well suited to transformations, representing and processing rules, and searching. It's a good fit for any program that can be expressed as a set of transformations or implications. Compilers and other programs that transform data are a good fit, as are rules engines, and pretty much anything that behaves like them.

It turns out that transformations and rules engines are pretty generally useful, so Prolog makes a decent general-purpose programming language. It does take some getting used to, because its mode of operation is different from imperative and functional languages. You don't say, "do this, do that," and you don't present a functional expression for reduction to normal form. Instead, you write a set of facts and rules, then supply some initial conditions and let Prolog compute all the results that your rules imply. It can be a little brain-bending at first, because most people seem to have an easier time with "do this, do that" than with "this implies that, that implies the other thing."
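For a taste of that style, here's a minimal sketch (the parent/2 facts are invented for illustration): two facts, a recursive rule, and a query that Prolog answers by deduction.

  parent(tom, bob).                                  % fact: tom is a parent of bob
  parent(bob, ann).                                  % fact: bob is a parent of ann

  ancestor(X, Y) :- parent(X, Y).                    % base case: a parent is an ancestor
  ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).    % recursive case

  ?- ancestor(tom, ann).
  true.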

On the other hand, for problems that are naturally represented as transformations or rules engines, it can be sort of magical. One fun detail is that, since you specify the logical relations between terms rather than an order of operations, you can often write Prolog programs that can be run "forward" and "backward" -- that is, given a transformation between A and B, if you supply A it computes B, and if you supply B it computes A.
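The standard list predicate append/3 is the usual demonstration of this: a single relation, queried in several directions.

  ?- append([1,2], [3,4], Zs).       % forward: both inputs given, compute the result
  Zs = [1, 2, 3, 4].

  ?- append(Xs, [3,4], [1,2,3,4]).   % backward: result given, recover an input
  Xs = [1, 2].

  ?- append(Xs, Ys, [1,2,3,4]).      % or enumerate every way to split the list
  Xs = [], Ys = [1, 2, 3, 4] ;
  Xs = [1], Ys = [2, 3, 4] ;
  % ... etc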



Deep learning may fit into the Prolog paradigm in the sense of finding programs that compute the output from the input. I'm sure people are working on it; it's one aspect of differentiable programming. This is similar to functional approaches as well. The search here is for the function f that computes f(A) = B. Today we specify the state space as CNN or transformer models. Prolog knows how to do deductions from logical statements, so you'd need something similar for the mapping f, I guess.


>> Deep learning may fit into the Prolog paradigm in the sense of finding programs that compute the output from the input.

That's interesting, but my hunch is that you'd need two deep learning models to do what a single Prolog program could do: one model to map inputs to outputs and one to map outputs to inputs.

One way to describe deep neural nets is to say that they are function approximators. Now, functions have well-defined inputs and outputs, although I don't think that's actual mathematical terminology. Functions are basically mappings from the elements of one set (or several) _to_ the elements of another. But a function mapping is uni-directional. If you have a function ƒ: X → Y, it maps elements of X to Y, but it doesn't map elements of Y to X.

So for instance, if you have a function that maps the set of integers from 1 to 26 to the set of letters of the English alphabet, you don't simultaneously have a function that maps letters to integers; you need another function. This is true both in mathematical terms and in programming terms.

Suppose for instance that you have a pseudocode function with the signature:

  int_char(int: n) -> char: c
You can call int_char() as:

  int_char(2)
And get out "b", but you can't call it the other way:

  int_char("b")
And expect to get 2 in return.

In Prolog, on the other hand, there are no functions. Rather, every expression is a predicate, or in other words an n-ary relation. Now, the concept of a relation is a generalisation of the concept of a function, so in Prolog you could write int_char() above with the signature:

  int_char(N,C).
And call it with either or both arguments instantiated... or none:

  ?- int_char(2,C).
  C = b.

  ?- int_char(N,b).
  N = 2.

  ?- int_char(2,b).
  true.

  ?- int_char(N,C).
  N = 1, C = a ;
  N = 2, C = b ;
  N = 3, C = c ;
  % ... etc
Which you can't easily do in other languages. Well, you can do it in, say, C, if you write a Prolog interpreter in C :-)
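For what it's worth, here is one possible definition that behaves as in the queries above -- a minimal sketch relying on nth1/3 from SWI-Prolog's library(lists), which is itself relational and so works with either argument unbound:

  :- use_module(library(lists)).

  % int_char(?N, ?C): N is the 1-based position of the letter C in the alphabet.
  int_char(N, C) :-
      nth1(N, [a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u,v,w,x,y,z], C).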

Anyway, deep learning learns functions, not relations, so its models can't go back and forth between the sets they map. In theory, anyway; deep neural nets often don't really care about theory.



