This is just propaganda - repetitive recitation of slogans made of long words.)
It is, of course, difficult to argue with zealots, but I will try nevertheless.)
The cost of what is called an "advanced type system" is the inability to put elements of different types in the same list or tuple or whatever. It is not just a feature, it is a limitation. In some cases - when you are dealing only with numbers, say, positive integers - it is OK, but then what is the real advantage of such type checking?
On the other hand, the concept of a pointer, which all those "packers" are trying to throw away, is very natural to the mind. When we have a box, we can put anything in it, as long as it fits. Imagine what a disaster it would be if, when moving from one flat to another, you had to pack your stuff only into special kinds of boxes: this one is only for a certain kind of shoes, this one only for spoons, that one for forks. With a pointer we can "pick up" anything, and then decide what it is and where it goes.
Another mind-friendly concept is using symbols and synonyms to refer to things - symbolic pointers. It is how our minds work. Those who are able to think in more than one language know that we can refer to the same thing using different words, but the "inner representation" is one and the same.
These two simple ideas - using pointers (references) and having data ("representation") describe itself (type-tagging is another great idea - it is labeling) - give you a very natural way of programming. It is a mix of the so-called "data-directed" and "declarative" styles and, as long as you describe transformation rules instead of imperative step-by-step processes, the "functional" style.
Of course, the code will look a certain way - there will be lots of case analysis, like unpacking things from a big box: oh, this is a book, it goes on a shelf; these are computer speakers, they go on the table; etc. But that's OK, it is natural.
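This unpacking style can be sketched in Python, where every value already carries its own type tag at runtime; the `unpack` helper and the destination names below are mine, just an illustration of tag-based case analysis:

```python
# A heterogeneous "box" of items. Each element carries its own type tag,
# and one pass of case analysis decides where each item goes.
# (The function name and the destinations are illustrative, not from the thread.)

def unpack(box):
    places = {}
    for item in box:
        # Dispatch on what the item says it is, not on a declared type.
        if isinstance(item, str):
            places.setdefault("shelf", []).append(item)    # books
        elif isinstance(item, int):
            places.setdefault("drawer", []).append(item)   # counted things
        elif isinstance(item, tuple):
            places.setdefault("table", []).append(item)    # gadgets
        else:
            places.setdefault("misc", []).append(item)
    return places

print(unpack(["SICP", 42, ("speakers", 2), 3.14]))
```

The list itself is heterogeneous; the single dispatch point is where all the "where does this go" knowledge lives.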
The claim that "packing" is the best strategy is, of course, nonsense. Trying to mimic the natural processes of our minds (as lousily as we manage it) makes, in my opinion, some sense.
There are some good ideas behind each language, and some not so good. Symbolic computation, pattern matching, data description (what an s-expression, or YAML, is) are good ones. Static typing, and describing "properties" instead of "behavior" - not so.)
Also, it is good to remember that we're programming computers, not VMs. There is something called "machine representation", which is, well, just bits. That doesn't mean we should swing to the other extreme and program in assembly, but it is advisable to stay close to the hardware, especially when it is not that difficult.
Everything is built from pointers, whether you like it or not.) The idea of throwing them away is insane; the idea (a discipline) of not doing math on them is much better. The idea of avoiding over-writing is a great one - it is good even with paper and pencil, where everything becomes a mess very quickly - but avoiding all mutation is, of course, madness.
So finding the balance is the difficult task, and the balance is certainly not Haskell. Classic Lisps came close, but it requires some skill to appreciate the beauty.) So the most popular languages are the ugliest ones.
> The cost of what is called an "advanced type system" is the inability to put elements of different types in the same list or tuple or whatever. It is not just a feature, it is a limitation. In some cases - when you are dealing only with numbers, say, positive integers - it is OK, but then what is the real advantage of such type checking?
I can't say I managed to completely understand the argument you're making here, but doing mostly Java/Python for work, I don't remember the last time I had to write a heterogeneous list. At worst, you can always go for existential types.
A list that has a special marker at its end is a general concept: the empty list in Lisp, EOF in C, etc. With this you have streams and ports and all the nice things.)
In my opinion, the singly-linked list as the foundational data structure of Lisp was not selected by accident. It is also the most basic and natural representation of many everyday concepts - a chain, a list. You can ask for the next element, and find out that there are no more. Simple and natural. Because of its simplicity, the code is also simple.
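A minimal sketch of such a list in Python - cons cells as pairs, with a distinct end marker; all the names below are mine:

```python
# A cons-cell list: each node is a (head, tail) pair, and the empty
# list is a distinct end marker, like Lisp's NIL.
NIL = None  # the-empty-list marker

def cons(head, tail):
    return (head, tail)

def to_cons_list(items):
    # Build the chain back-to-front so the heads stay in order.
    lst = NIL
    for x in reversed(items):
        lst = cons(x, lst)
    return lst

def walk(lst):
    # Ask for the next element until the marker says there are no more.
    out = []
    while lst is not NIL:
        head, lst = lst
        out.append(head)
    return out

chain = to_cons_list([1, "two", 3.0])  # heterogeneous, thanks to type tags
print(walk(chain))
```

Because each Python value carries its own type tag, nothing stops the chain from holding elements of different types.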
Such lists should be heterogeneous, because when all the elements are of the same type, it is more natural to represent them as an array - an ordered sequence. As far as I know, Python's lists actually are dynamic arrays.
Sequences of elements of the same type (same storage size and encoding) with a marker at the end can also be viewed as homogeneous lists. A C string processed as a list of characters is the canonical example.
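C's strlen walks exactly such a sequence - same-size elements until a zero-byte marker. A Python sketch of the idea (the helper name is mine):

```python
# A C string viewed as a homogeneous "list" of characters: same-size
# elements laid out in order, with a zero byte as the end marker.
def c_strlen(buf):
    # Walk forward until the terminator, like C's strlen.
    i = 0
    while buf[i] != 0:
        i += 1
    return i

s = b"hello\x00garbage after the terminator"
print(c_strlen(s))  # → 5
```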
Now consider the UTF-8 encoding. It is a variable-length encoding. A UTF-8 string is not an array, and because you cannot tell the boundaries between runes without decoding, it is not a list either. Nevertheless, it can be considered and processed as a stream, until an EOF marker is reached. This is why it was invented at Bell Labs: to keep things as simple as possible.
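A rough Python sketch of such stream processing: each lead byte says how many continuation bytes follow, so runes can be read one at a time without knowing the layout in advance. The decoder below is my illustration and skips validation:

```python
import io

# Decode a UTF-8 byte stream rune by rune. You cannot jump to the
# n-th rune by index (variable-length encoding), but reading forward,
# each lead byte tells you how long the current rune is.
def runes(stream):
    while True:
        lead = stream.read(1)
        if not lead:            # end of stream: no more bytes
            return
        b = lead[0]
        if b < 0x80:
            n = 0               # ASCII, single byte
        elif b >> 5 == 0b110:
            n = 1               # two-byte rune
        elif b >> 4 == 0b1110:
            n = 2               # three-byte rune
        else:
            n = 3               # four-byte rune
        yield (lead + stream.read(n)).decode("utf-8")

print(list(runes(io.BytesIO("héllo".encode("utf-8")))))
```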
Now you see that the concept of a homogeneous list from math is not enough for CS, and sometimes it is much better to keep it fuzzy. What counts as a list is a matter of point of view.
> In my opinion, the singly-linked list as the foundational data structure of Lisp was not selected by accident. It is also the most basic and natural representation of many everyday concepts - a chain, a list. You can ask for the next element, and find out that there are no more. Simple and natural. Because of its simplicity, the code is also simple.
Yes, that's a recursive data structure well-suited to functional languages, same as Haskell lists.
> Such lists should be heterogeneous, because when all the elements are of the same type, it is more natural to represent them as an array - an ordered sequence.
From a memory point of view, maybe, but that's really an implementation detail. If you take, e.g., Perl arrays, which you can access by index, push, shift, unshift, and pop, the actual implementation is invisible to the programmer, just as it should be in this sort of language. Java will happily store cats and dogs in an array of Object, for instance.
> Now consider the UTF-8 encoding. It is a variable-length encoding. A UTF-8 string is not an array, and because you cannot tell the boundaries between runes without decoding, it is not a list either. Nevertheless, it can be considered and processed as a stream, until an EOF marker is reached. This is why it was invented at Bell Labs: to keep things as simple as possible.
You could very well store it as a list of bytes. This wouldn't be terribly efficient, but it's perfectly doable. Whether you process it as a stream or store it in whatever array/collection is orthogonal to whether your language of choice supports heterogeneous lists. You can do stream processing in Haskell too (with, e.g., pipes). You also have access to other data structures which are not lists, but which do expect to be homogeneous.
Yes, it is indeed a stream of bytes, with a zero byte as an EOF marker. That's why UTF-8 is good enough.
As for lists: as long as your next element is not always in the next chunk of memory, you need a pointer. A chain of pointers is a singly-linked list. This is the core of Lisp, and it is not an accident. Together with type-tagging, you can have your lists heterogeneous, as simple as that.)
This is part of the beauty and elegance of Lisp, in my opinion - a few selected ideas put together.