Identity of numbers and information
"What are numbers?"
"What is information?"
While I cannot answer these perennial philosophical questions, I had the idea that the answers to these questions are the same thing.
Information, at least the information we might be familiar with in computers, certainly seems number-like. Claude Shannon famously defined the fundamental unit of information to be the bit, the answer to a single "yes" or "no" question. A bit is the simplest possible number, one that can take only the values 0 or 1. "Bit" is a contraction of "binary digit", and when concatenated, bits can represent larger numbers. 4-digit decimal numbers can range from 0 to 9999 (0 to 10^4 - 1), while 4-digit binary numbers can range from 0 to 15 (0 to 2^4 - 1). Because an ordinary (base-10) digit carries more information than a bit, 4 ordinary digits can represent more numbers than 4 bits.
You might be aware that everything stored on computers is a series of bits. Every data file, and every program, are ultimately long series of bits. If a bit is a binary digit, then a series of bits is a binary number, a number with potentially millions, billions or more binary digits. But there is nothing special about binary digits. Every binary number has one and only one corresponding base-10 number, as every base-10 number has its unique corresponding binary number. In other words, they are two representations of the same thing: numbers.
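For instance, a quick sanity check in Python (a minimal sketch; the particular number is arbitrary):
[code]
# One number, two representations: binary and decimal name the same thing.
n = 0b101101                   # a binary literal...
print(n)                       # ...prints as the decimal number 45
print(bin(45))                 # and 45 converts back to '0b101101'
print(int("101101", 2) == 45)  # True: the correspondence is one-to-one
[/code]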
Every text document, every email, every mp3 and mp3 player, every web page and web browser, every video game and every operating system is ultimately an (enormous) number. For instance, the text of this post, before this bar |, is represented by a base-10 number with approximately 3850 digits. An mp3 file would have around 8 million base-10 digits. These huge numbers are sometimes thought to be the province of hobby mathematics, with no expression in the actual universe. In fact, we interact with them every day.
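You can check this kind of claim yourself. Here is a minimal Python sketch (the sample string is a stand-in rather than this post's actual text, so its digit count is far smaller than 3850):
[code]
# Read a text's bytes as one big integer, then count its base-10 digits.
text = "What are numbers? What is information?"
n = int.from_bytes(text.encode("utf-8"), byteorder="big")
print(n)            # the text, as a single enormous integer
print(len(str(n)))  # how many base-10 digits that integer has
[/code]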
Computers can be thought of as physical instantiations of mathematical functions that ceaselessly take the computer's current state (a gargantuan number) and transform it into the next state/number.
state(n+1) = computer(state(n))
In principle, nothing stops you from implementing a computer as an enormous look-up table, so that the internal logic would be something like
if (state == 1234567890...)
    state = 1423456789...
else if (state == 1234567891...)
    state = 1324567891...
...
and so on. In practice, of course only the simplest computers could ever be implemented this way.
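For concreteness, here is a toy lookup-table machine in Python, assuming an unrealistically tiny 2-bit state space (real machines compute the transition rather than storing it, precisely because such a table explodes combinatorially):
[code]
# A "computer" as a pure lookup table over its entire state space.
# With 2 bits of state, this particular table is just a counter.
transition = {
    0b00: 0b01,
    0b01: 0b10,
    0b10: 0b11,
    0b11: 0b00,
}

def computer(state):
    # state(n+1) = computer(state(n)), realized as a single table lookup
    return transition[state]

state = 0b00
for _ in range(5):
    print(bin(state))
    state = computer(state)
[/code]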
I think ultimately the difference between information and numbers is only pragmatic. If the number is large and information dense enough so that it is impractical to deal with it without breaking it up, we call it information. If it is small and manageable, it is a number. If information and numbers are the same, this might explain why math keeps popping up in nature. If the universe is both physical and informational, then it is both physical and numerical, and it is therefore to be expected that mathematical patterns show up in the universe.
There is more to say, and I think there are problems with this idea, but what do you guys think?
"What is information?"
While I cannot answer these perennial philosophical questions, I had the idea that the answers to these questions are the same thing.
Information, at least the information we might be familiar with in computers, certainly seems number-like. Claude Shannon famously defined the fundamental unit of information to be the bit, which is the answer to a "yes" or "no" question. A bit is the simplest possible number, that can only have the values 0 or 1. "Bit" is a contraction of "binary digit", and when concatenated together, bits can represent larger numbers. 4-digit numbers can range from 0 to 9999 (0 to 10^4 -1), and 4-digit binary numbers can range from 0 to 15 (0 to 2^4 - 1). Because an ordinary (base-10) digit has more information than a bit, 4 ordinary digits can represent more numbers than 4 bits.
You might be aware that everything stored on computers is a series of bits. Every data file, and every program, are ultimately long series of bits. If a bit is a binary digit, then a series of bits is a binary number, a number with potentially millions, billions or more binary digits. But there is nothing special about binary digits. Every binary number has one and only one corresponding base-10 number, as every base-10 number has its unique corresponding binary number. In other words, they are two representations of the same thing: numbers.
Every text document, every email, every mp3 and mp3 player, every web page and web browser, every video game and every operating system, are ultimately (enormous) numbers. For instance, the text of this post, before this bar |, is represented by a base 10 number with approximately 3850 digits. A mp3 file would have around 8 million base 10 digits. These huge numbers are sometimes thought to be the province of hobby mathematics, with no expression in the actual universe. In fact, we interact with them every day.
Computers can be thought of as physical instantiations of math functions, which ceaselessly takes the computer's current state (a gargantuan number) and transform it into the next state/number.
State(n+1) = Computer(state(n))
In principle, nothing stops you from implementing a computer as an enormous look-up table, so that the internal logic would be something like
if (state == 12345667890...)
state = 1423456789...)
else if (state == 1234567891...)
state = 1324567891...
...
and so on. In practice, of course only the simplest computers could ever be implemented this way.
I think ultimately the difference between information and numbers is only pragmatic. If the number is large and information dense enough so that it is impractical to deal with it without breaking it up, we call it information. If it is small and manageable, it is a number. If information and numbers are the same, this might explain why math keeps popping up in nature. If the universe is both physical and informational, then it is both physical and numerical, and it is therefore to be expected that mathematical patterns show up in the universe.
There is more to say, and I think there are problems with this idea, but what do you guys think?
Comments (87)
So you can say, "That is a (one) blade of grass." Two blades are the concept of 1 and 1 together. And of course you can say, "That is a field of grass" or "That is one piece of grass." And so on.
Information is the compact storage of a discrete experience that may be a combination of many aspects, properties, feelings etc. "My dog" evokes a lot of combined information into a neat and reasonable package to think and communicate about. So yes, I like your thinking on this!
Both piggyback on the logic of counterfactuality. And that in turn leads to the deep difficulties folk have in believing in a continuum that can also be broken into discrete parts. Logic seems to fail right at the point where you seek its own origin.
So a string of bits or the numberline exist in the happy world where we can just take this paradoxical division between the continuous and the discrete for granted. Continuums are constructible. Don't ask further questions. Get on with counting your numbers and bits.
So yes, a 1D line can be understood as an infinite collection of 0D points. And then you can start mechanically imposing any concept of an ordering of the points on the line that takes your fancy. The syntax can become as complex and hierarchical as you like. An analog reality can be encoded in a string of digital data points. A Turing machine with an infinite paper tape can in principle represent any more complicated state. Counterfactuality is the true atom of Being.
Yet while this happy conception may work pragmatically, its deeper foundations remain suspect. So there is that to consider.
Can you unpack that a bit? The meaning doesn't spring from the page, so to speak.
Quoting hypericin
A question that has long interested me, and one of the motivators for joining forums. I once had an epiphany along the lines that while phenomena are (1) composed of parts and (2) begin and end in time, these attributes don't apply to numbers, which are neither composed of parts nor begin or end in time (although I later realised that (1) only properly applies to prime numbers, but the point remains). At the time I had this epiphany, the insight arose, 'so this is why ancient philosophy held arithmetic in high esteem. It was certain, immutable and apodictic.' These are attributes of a higher cognitive functionality, namely rational insight. Of course, I was to discover that this is Platonism 101, and I'm still drawn to the Platonist view of the matter. The philosophical point about it is that through rational thought we have insight into a kind of transcendental realm. As an SEP article puts it:
[quote=SEP;https://plato.stanford.edu/entries/platonism-mathematics/#PhilSignMathPlat]Mathematical platonism has considerable philosophical significance. If the view is true, it will put great pressure on the physicalist idea that reality is exhausted by the physical. For platonism entails that reality extends far beyond the physical world and includes objects that aren't part of the causal and spatiotemporal order studied by the physical sciences. Mathematical platonism, if true, will also put great pressure on many naturalistic theories of knowledge. For there is little doubt that we possess mathematical knowledge. The truth of mathematical platonism would therefore establish that we have knowledge of abstract (and thus causally inefficacious) objects. This would be an important discovery, which many naturalistic theories of knowledge would struggle to accommodate.[/quote]
See also What is Math? Smithsonian Institute.
As for Shannon's information theory, I think it tends to be somewhat over-interpreted. Shannon was an electronic engineer trying to solve a particular problem: the reliable transmission of information. As one of the fundamental discoveries of cybernetics, his work underlies the data compression and transmission we rely on every time we use these devices. But there's a lot of hype around information as a kind of fundamental ontological ground, kind of like the digital geist of the computer age.
It is easy to assume things are just what they are. But that depends on them being in fact not what they are not. That should be familiar to you from Deacon's notion of absentials if not from the three laws of thought.
Counterfactuality secures the identity of things by being able to state what they are not. And that is what a bit represents. A and not not-A. A switch that could be off and thus can in fact be on.
Numbers are places marked on a line. The assumption is that their value is ranked from small to large. So already the symmetry is geometrically broken in a particular fashion. But that asymmetry is secured algebraically by identity operations. Adding and multiplying. Adding zero or multiplying by 1 leaves a number unchanged. Adding or multiplying by any other value then does change things.
So the counterfactuality is a little more obscure in the case of the numberline. The question is whether a value was transformed in a way that either did break its symmetry or didn't break its symmetry. The finger pointing at a position on the line either hopped somewhere else or remained in the same place.
Information and numbers then have to move closer to each other as we seek to employ atomistic bits (little on/off switches) as representations of algebraic structures. To run arithmetic on a machine, the optimal way is to break it all down into a binary information structure that can sit on a mechanical switching structure. Just plug it into a socket and watch it run, all its little logic gates clacking away.
So counterfactuality is the way we like to think as it extremitises things to the point of a digital/mechanical clarity. The simplicity of yes or no. All shades of grey excluded. Although you can then go back over the world and pick out as many shades of grey as you like in terms of specific mixtures of black and white. You can recover the greyness of any particular shade of grey to as many decimal places as you like. Or at least to whatever seems acceptable in terms of your computational architecture. The gradual shift from 8-bit precision to 64-bit was pricey.
So part of what I was pointing out is how information theory cashes out the counterfactuality of logic in actual logic gates and thus in terms of a pragmatic entropic payback.
Numbers seem to live far away in the abstract realm of Platonia. Shannon information was how they could be brought back down to live among us on Earth.
Algebra with real costs and thus the possibility of real profits.
I guess I've bought into the hype. For me, thinking about a piece of information, say a snippet of song, passing somehow unchanged through multiple wildly different physical media, such as sound waves, tape, CD, mp3, cable internet, wireless internet, streaming buffer, then back to sound waves as you finally hear it, led me to start conceiving of information and matter as being independent, and both as fundamental elements of the universe (maybe not unlike Aristotle's hylomorphism).
Quoting SophistiCat
I'm not sure. Suppose an archaeologist uncovers tablets on which is inscribed a lost language. What did the archaeologist discover? Seemingly, information that can no longer be decoded. Years later, the language is translated. Did the information spring into being? Or was it always there?
That, I agree with :100: and have often argued along these lines (see this thread).
Afaik, it's "the difference" between pattern-strings and mathematical structures, respectively, such that the latter is an instance of the former. They are formal abstractions which are physically possible to instantiate by degrees within tractable limits in physical things / facts, and usually according to various, specified ("pragmatic") uses. I think 'Platonizing' information and/or numbers (as 'concept realists', 'hylomorphists', and 'logical idealists' do) is, at best, fallaciously reifying.
No problem for visual and auditory information. We can include written language as a specialized subset of auditory information. Practical problems abound, however, with information related to smell, touch, and taste.
Digital scent is considered experimental and quite impractical. There may actually not even be that much demand for it outside specialized application niches:
Digitizing taste is experimental only:
Digitizing touch is also highly experimental research with currently no practical applications so to speak of:
Yet the holographic principle in fundamental physics says it means something that the same formalism works for information and entropy. At the Planck scale, the physical distinction between the discrete and the continuous dissolves into its own identity operation.
There is something deeper as is now being explored. Reality is bound by finitude. Which would be a big change in thinking.
[quote=Perl, Thinking Being]Forms are ideas, not in the sense of concepts or abstractions, but in that they are realities apprehended by thought rather than by sense. They are thus separate in that they are not additional members of the world of sensible things, but are known by a different mode of awareness.[/quote]
that different mode being rational insight rather than sensory perception.
Quoting apokrisis
Wasn't that because Von Neumann, who was an associate of Claude Shannon, suggested to him that he adopt the term 'entropy', noticing that it was isomorphic with Boltzmann's statistical mechanics interpretation of entropy? He also said that 'as nobody really knows what it means then you will always have an advantage in debates'. (Ain't that the truth ;-) )
I don't grok this. :chin:
Quoting Wayfarer
Semantic quibble: ideal, not "real".
Yep. That's where it started. With a conceptual similarity. But then an actual physical connection got made. Folk like Szilárd, Brillouin, Landauer and Bekenstein showed that the Boltzmann constant k that sets the fundamental scale for entropy production also sets a fundamental scale for information processing.
Computing creates heat. And so there is an irreducible limit to how much information a volume of space can contain without melting it. Or in fact gravitationally curling it up into a black hole.
In a number of such ways, information and entropy have become two faces of the same coin connected by k, which in turn reduces to c, G and h as the fundamental constants of nature. Reality has a finite grain of resolution. And information and entropy become two ways of talking about the same fact.
Quoting 180 Proof
I'm talking about holography and horizons. Volumes of spacetime can only contain finite amounts of information because information has a finite grain. Or in entropy terms, a finite number of degrees of freedom.
So at the Heat Death, it all just stops. The information represented by the energy density of the Big Bang has reached its eternalised de Sitter state of being cooled and spread out as far as it could ever go. The Universe is a bath of blackbody radiation. But with a temperature in quantum touching distance of absolute zero. Photons with a wavelength the size of the visible universe.
(There are a few issues of course. Like how to account for the dark energy that ensures this de Sitter state where the cosmic event horizon does freeze over at this finite maximum extent. So it is a sketch of the work in progress. Lineweaver is a good source.)
Exactly, how is it that the same marks on dry clay can carry more or less information in different contexts? And note that it's not just any marks that transmit information. Some random indentations and scratches on the same tablet would not do. How could that be if marks themselves were information?
Also, note that in your example you used clay tablets, not numbers (and in your OP you went back and forth between numbers and computers, which, of course, are not the same thing). This shows that there isn't a necessary connection between information and numbers. Numbers or bits can serve as an abstract representation of an encoded message.
I think they can't. They carry the same information, whether or not it happens to be decodable at the time.
Quoting SophistiCat
Why does that preclude the marks themselves being information? Marks, arranged in certain ways, are information. Arranged randomly, they are not.
Quoting SophistiCat
The point was that computers, thought of as information processing devices, are just as much number processing devices.
Quoting SophistiCat
Why not
Numbers or bits can serve as [s]an abstract representation of[/s] an encoded message
I see the connection you're drawing between entropy and information at the physical level, where both are linked by thermodynamic principles through the Boltzmann constant (after a bit of reading!) However, I wonder if equating the two risks losing sight of the fact that information derives its significance from its order. For instance, a random string of letters might technically have entropy, but it lacks the kind of structured information we get from an ordered string of words. Even so, you could transmit random strings of characters and measure the degree of variance (or entropy) at the receiving end, but regardless no information would have been either transmitted or lost. It's this order that makes information meaningful and, importantly, it's the order that gets degraded in transmission. So I question that equivalence - might it not be a fallacy of equivocation?
Well if you are interested in processing signals, that requires you to have a model of noise. This was Shannon's actual job given he worked with a phone company with crackling long distance wires. You need to know how often to repeat yourself when it is costing you x dollars a minute. Maybe send a telegram instead.
If you know the worst-case scenario (no information getting through to the other end, just random crackle) then you can begin to measure the opposite of that: how much redundancy you can squeeze out of a message and yet rely on it to arrive with all its meaning intact.
So sure. We can load up our bit strings with our precious cargo of "differences that make a difference". We can send them out across the stormy seas of noisy indifference. But if we know that some of the crockery is going to be inevitably broken or nicked, we might choose to pack a few extra cups and plates to be certain.
Information theory just starts with a theory of noise or disorder. Then you can start designing your meaning transmission networks accordingly.
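To make that concrete, here is a minimal Python sketch of the redundancy idea, using the crudest possible scheme: a 3x repetition code with majority voting. (The 10% flip probability is an arbitrary assumption, and real error-correcting codes are far more efficient.)
[code]
import random

def noisy_channel(bits, flip_prob=0.1):
    # flip each bit with probability flip_prob: Shannon's random crackle
    return [b ^ (random.random() < flip_prob) for b in bits]

def encode(bits):
    # pack a few extra cups and plates: send every bit three times
    return [b for b in bits for _ in range(3)]

def decode(bits):
    # take a majority vote over each group of three repeats
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(noisy_channel(encode(message)))
print(message)
print(received)  # usually identical: the redundancy absorbed the noise
[/code]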
Quoting Wayfarer
It is like when folk talk about big bangs or dark matter. People latch onto complex scientific ideas in simplistic fashion.
To actually have a flow of meaning (which is what you mean by "information") it has to be a flow of differences that make a difference. However, that in turn requires a physical channel that transmits difference in the first place. Like a vocal tract that can make noises. Or a stylus that can scratch marks on wax.
So you start with the mechanics for making a noisy and meaningless racket. Then you can start to add the constraints that suppress the noise and thus enhance the signal. You add the structure that is the grammar or syntax that protects your precious cargo of meaning.
A number line has its direction. You are either counting up or counting down. A bit string has some standard word size that fits the bit architecture of the hardware. As further levels of syntax get added, the computer can figure out whether you are talking about an operation or its data.
So no equivocation. Information theory is the arrival of mechanical precision. Mathematical-strength action.
Entropy likewise brings mathematical precision to our woolly everyday notions about randomness or chaos. It gives a physical ground to statistical mechanics. Boltzmann's formula speaks to the idea of noise in terms of joules per kelvin.
Besides, I have a suspicion that the designation of 'information' as being foundational to existence, goes back to Norbert Wiener saying 'Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.' I'm sure this is what leads to the prevalence of information-as-foundation in contemporary discourse.
The message "The cat is on the mat. The cat is on the mat." gives you no more information than the message "The cat is on the mat." even though the former contains more bits than the latter (I am discounting noise for simplicity). The message "Your name is X" gives you no information if your name really is X and you are not suffering from amnesia. So, information depends on the receiver as well.
Numbers can be used in mathematical modeling of communication, but numbers in themselves are no more information than they are novels or bridges or population clusters.
How does entropy apply naturally to order except as its negation or inverse? Just as is the relation between signal and noise in the information perspective.
Quoting Wayfarer
Did you read the full passage? Do you really now want to commit to the thesis that brains are just mechanical computers? Racks of logic switches? Thought reduces to symbol processing?
This seems like a conflation of information and communication, where communication is the transmission of information from A to B. Tea leaves and random marks are rich with information, it would take a lot of work to accurately represent their states. But they are not communications, which are intentional acts.
Quoting SophistiCat
Information is not the same as how informative the information might or might not be to a receiver. "The cat is on the mat. The cat is on the mat." may be no more informative than "The cat is on the mat", but the former still carries more information (it requires more bits to represent, but less than "The cat is on the mat. The dog is on the log.") "Your name is X" may not be informative, but it is still information.
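One rough way to see this is with a general-purpose compressor. (A sketch only: on strings this short the compressor's fixed overhead dominates, so read the numbers as a trend rather than a measurement.)
[code]
import zlib

once = b"The cat is on the mat."
twice = b"The cat is on the mat. The cat is on the mat."
mixed = b"The cat is on the mat. The dog is on the log."

for msg in (once, twice, mixed):
    # raw bits vs compressed bits: repetition squeezes away, novelty doesn't
    print(len(msg) * 8, "->", len(zlib.compress(msg)) * 8)
[/code]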
Quoting SophistiCat
Just where exactly? If information is distinguished from communicative acts, and from being informative, then numbers are pure information: they are answers to a series of yes and no questions, seen clearly in their binary representations.
If you start with the question, "what is information?" the way to go is to survey existing uses of the word. Another approach would be to do what Shannon and other researchers did, which is to start with a specific problem, something that matters, and then see whether a concept with a family resemblance to "information" fits. But starting with the answer, before you even understand the question, is backwards.
I am reading about this matter of what is information. I am reading Rudy Rucker's book "Mind Tools".
He writes- "Anyone at the receiving end of a communication is involved with "information processing." I may communicate with you by writing this book, but you need to organize the book's information in terms of mental categories that are significant for you. You must process the information."
Spoken and written communication are, if you stop to think about it, fully as remarkable as telepathy would be. How is it that you can know my thoughts at all, or I yours? You have a thought, you make some marks on a piece of paper, you mail the paper to me, I look at it, and by some mysterious communication algorithm, I construct in my own brain a pattern that has the same feel as your original thought. Information!"
He goes on to explain numbers, space (and patterns that create forms), logic, and finally infinity and information. Problem is I know so little about math this book might as well be written in Greek. I am unprepared to understand what I am reading. There is so much I need to know before I can understand.
What if we did not use words, but communicated with math? I know mathematicians can do that, but what if from the beginning we all did? I am sure my IQ would be much higher if I could do that. And I wonder how thinking in mathematical terms might change our emotional experience of life.
Interesting idea. Logicians might be able to do this, but math people use words and symbols. I have never heard of a math research paper written in math symbols only. Thinking in mathematical terms is common amongst my colleagues, but even there one talks to oneself with words.
But then, the set of all quantities is countable, as is the set of points on a number line. Are these two any more inherently number than bits? Quantities have arithmetic operations defined for them that don't make sense for bits, and bits have binary operations that don't make sense for quantities. Is one set of operations more "number" than the others? Or are bit arrangements, quantities, number lines, all just countable things, and so all equally numeric?
Not so, my friend, if we speak of the real number line. This has been chewed on on this forum until there is little left to be said.
True enough. Instead of points I should have said integer marks.
Sounds like Early Wittgenstein's picture theory of language.
Quoting Athena
How would that work, basically?
Why is your view restricting this to numbers? There is no reason you need to represent things with numbers.
Good gravy, I do not know! As I just said in another thread, there is so much I do not know and didn't even know I didn't know so much. I swear I am at a time in my life when every day I feel more and more ignorant of everything I do not know and absolutely panicked to figure out a way to reduce this ignorance.
I sure respect Socrates right now and I would bet my bank account he did not know how much he did not know until he got old. :lol: Remember when we were teenagers and thought we knew it all?
I can barely imagine how it is to think in terms of mathematical code. To see H2O and know that means water is special. To think "water" and immediately think where does it come from and where does it go and why is it important and is it clean and safe to drink, etc., etc. is absolutely amazing! No other animal on the planet has this capability. Yes, animals can communicate but they do not come close to the mental activity of which we are capable.
Perfect! I wish I could know what you know through experience with those who think mathematically. I am quite sure some things cannot be communicated with math, but I am not sure exactly what divides the world of math and human language. I suspect AI does not get high scores for comprehending being human. Being human is an experience and that is outside of logic. We are not predictable machines. We are barely inside the laws of nature, compared to the rest of the animal kingdom.
↪ Edward T. Hall
As she sent it with a computer, absolutely.
The whole point of my op is that information and numbers are the same thing.
Yeah but how would you answer the point that you don't need numbers to represent something?
This makes me think about the distinction, particularly in quantum mechanics, between the unmeasured and the measured.
Numbers are just scribbles (2 and two refer to the same thing; it's just easier to do math with the condensed version, using numbers instead of words) that refer to certain quantities. They are causally connected - the scribble and the quantity of objects within a category, whether it be cows or photons.
Information is the relationship between the scribble (the effect) and the quantity (the cause).
Quoting apokrisis
Maybe I'm misunderstanding but this seems counter-intuitive considering that we must categorize objects by their similarities, not their differences or what they are not. Objects that are similar fall into some category and it is only then that we can assert that there is a quantity of similar objects. If everything were unique and there were no categories of similar objects, then what use are quantities? If there is only one of everything, what use is math?
Why does 2+2=4? Some may say that this is a logically sound statement, but why? What makes some string of scribbles true? It seems to me that you have to have made some observation, and categorized your observations, prior to making this statement. Are they just scribbles on this page or are they about something that I can experience and make predictions from?
Quoting SophistiCat
Where are the numbers and how did they get there?
Object recognition thus parallels an entropic view of information. An equilibrium system like an ideal gas is defined by its macro properties - temperature and pressure - and not its micro properties. The actual position of a bunch of particles is an ensemble of differences that becomes merely the sameness of a statistical blur. And the global state of the ensemble is a sameness that can be now treated as a difference when comparing one thermal system to some other in terms of a pressure and temperature.
If I asked you to count the number of crows in a tree, the fact that some were rooks, some ravens, some magpies, would be differences you are being asked to ignore. They are all varieties of crow, but that is being treated as a difference that doesn't make a difference for the purpose of counting crows.
So reality is like this. There are always further distinctions to be had. Even two electrons might be identical in every way, except they are in different places. But equally, the differences can cease to matter from a higher level that sees instead the sameness of a statistical regularity. Sameness and difference are connected by the third thing of where in scale we choose to stand in measuring the properties of a system.
Are we interested in the distinctions between types of crow? Or if it is birds we are counting, is a crow any different from an ostrich?
Quoting Harry Hindu
So reality is divided into sameness and difference by its hierarchical scale. There really is something to talk about at the level of statistical mechanics. But then our talking about it is done in a way that claims to talk past the third thing of a viewpoint where either the sameness or the difference is being ignored. Information and numbers are our means to talk about reality as if from some completely objective nowhere.
It matters in language whether I think I am being asked to count birds or ravens. I have to place myself at a certain interpretive level that matches your understanding about what you were asking. But then the number line is its own abstract thing where there is no physical scale involved. Space, time and energy are all generalised as differences to be ignored. Three ravens is equivalent to three birds, three apples or three spaghetti monsters. The focus is now on the arithmetic or algebraic operations that can be performed on an abstract number system.
We have shifted ourselves into a Platonia so far as reality is concerned. And that new mathematical level of semiosis or reality modelling offers a huge entropic payback for us humans in terms of technology, engineering, computation, and other ways of mechanistically controlling the world.
It makes the whole of reality look the same in our eyes. A mechanical device. A system of particles regulated by differential equations. A sameness of physical laws with a difference in initial conditions.
So numbers and information are part of a new way of speaking about the world that is very useful in proportion to the degree that it is also unreal. It is a language of atomised reductionism that places itself outside even space, time and energy as those are the physical generalities it now aspires to take algorithmic control over.
A modelling relation with the world coming from the God's eye view. Just equations and variables. Absolute sameness coupled to absolute difference now.
Rational thought or the cognition, the apprehension of pattern, it is grounded in? Animals obviously recognize forms. Should we say they are rational?
Quoting Janus
But anyway, animals obviously have good object recognition. They recognise pragmatic forms. But are they apprehending form at a rational level? Or is that level of abstraction how we humans learn to view the world for our own new purpose of seeing reality in general as one giant rational machine?
Semiosis would say that animals are rational at the level of genetic and neural encoding. They see the world in terms of a regulated metabolism and a patterned environment.
Humans have linguistic semiosis which gets us to a level of seeing the world in terms of a pattern of interaction between intentional agents. The play of viewpoints captured by being able to speak of me and you, before and after, good and bad.
And the OP concerns mathematical semiosis. Pattern abstracted to the point of a mechanical generality. The ability to construct forms in algorithmic fashion. What seems to us the ultimate level of rationalised structure.
This seems a useful clarification. Information is encoded meaning. Genetic information encodes for constraints on chemical actions. Neural information encodes for constraints on environmental actions. Verbal information encodes for constraints on intentional actions. And numeric information encodes for constraints on mathematical actions.
So information is about the pragmatic encoding of meanings. It is how an organism regulates its world by having the kind of memory that acts as a store for data and algorithms, to put it rather computationally. An organism can construct states of constraint because a meaningful relation with the world has been atomised into a system of syntax acting on semantics. Habits or routines that can be run. Behaviours which can be switched on or off.
Numbers are then just the form that information takes at the level of a complete semiotic abstraction in terms of the self that is aiming to regulate its world by the business of constructing states of constraint. A numberline gives both the data points and the algorithmic logic that are needed to encode an absolutely general but also perfectly mechanistic modelling relation with the world.
So four levels of information or mechanistic regulation. With maths and logic at the end of this trail as the most evolved form of pragmatic rationality. The ultimate way that an organism could think about the world. If also then, the least actually "organic". :razz:
Quoting apokrisis
Yes, "numbers" are abstractions. But I think animals have a sense of number. The word "form" in information seems to reflect the relationship between information and form. Form and information and number all primordially rely on cognition and recognition of difference and sameness or similarity and pattern.
Or of perceptual grouping. Human working memory famously tops out at about 7 ± 2 items. The kind of grouping in the test that Trump aced when he could recall "Person, woman, man, camera, TV". And probably even manage that feat in reverse order.
Animals all have working memory too. The ability to juggle a small set of particular aspects of some larger cognitive task.
But that is not the same as counting. Just the reason why we struggle with holding number strings longer than seven in our working memories.
Quoting Janus
Indeed. I would call the abstracted notion of information our atoms of form. Form reduced to the counterfactuality of a binary switch. We can count how many distinctions it takes to arrive at something completely specific.
The game of Twenty Questions is a good example. Ideally, every question cuts the number of remaining choices in half. And that way we cut through a world of possibilities with an exponentialised efficiency.
The form I have in mind is... well, you will just have to start guessing. And each guess is an atomistic act of counterfactual switching. If you are any good at this game, you will switch off half the world of possibilities as you zero in on the possibilities still left switched on.
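The arithmetic behind the game, as a minimal illustration:
[code]
import math

# Each ideal yes/no question halves the possibilities, so n questions
# distinguish up to 2**n things...
print(2 ** 20)  # 1048576: twenty good questions cover about a million items

# ...and picking one item out of a million needs about twenty questions.
print(math.ceil(math.log2(10 ** 6)))  # 20
[/code]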
That makes sense to me. I have come across reports that suggest some animals can learn to do basic small number counting. They may be apocryphal.
You know yourself that three, four or even five things can be seen as different sized collections at a single glance. And remembered as such. But the difference between seven or eight apples starts to require a method of checking if you want to be sure of your mathematical correctness. Whereas for a hungry monkey, it becomes a difference not making a difference. It is just seen as a lot of apples.
:up:
Quoting apokrisis
Nice ordinary example!
I wouldn't be surprised if you were able to (in a manner unknown to me at the moment) topologically map out the possible logical spaces for information to represent truths in state space. Again, I am assuming that theorems and the truth of theorems can be topologically mapped out in a heuristic manner in logical space.
Sure. Information is everywhere that causes leave effects. What information is relevant, or attended to, depends on the goal in the mind.
Quoting apokrisis
I don't know if I agree with what you're saying here. What does it mean for something to be useful but not real? What does it mean for something to be useful if not having some element of being real? It seems to me that survival is the best incentive for getting things right. The environment selects traits that benefit the survival and reproductive fitness of organisms. Our highly evolved brain must have been selected for a reason and there must be a reason why humans have been so successful in spreading across the planet and out into space. Are those reasons unreal? Do your many words point to real states of reality? Am I to gain some advantage by reading your words? If not, then why read them?
It seems to me that a rational process takes time and mental space.
If I were to talk about marijuana legalization in this thread, would that be a real state of affairs of being off-topic? It seems to me that the way we perceive the world has a real effect on the world by means of our behaviors. Is Santa Claus real? As an idea Santa Claus is very real as you merely need to look at the effect the idea has had on the world.
In your example of the variety of birds we currently observe, we can point to evolution as the cause. Their differences evolved to fill different environmental niches. The variety of birds informs us of how they evolved and what their common ancestor would be like. The differences and similarities in birds indicate that they started with one common ancestor and evolved over time in different environments. We could potentially point to one common ancestor for all life with space and time being the medium in which the differences accumulate to the current state of affairs with the variety of life that we observe today.
Quoting Janus
How does one even learn a language without apprehending the scribbles and sounds in the present and reflecting on how those same scribbles and sounds were used before? I could argue that language use is just more complex learned behavior. Animals communicate with each other using sounds, smells and visual markings. Animals understand that there is more to the markings than just the form the marking takes. It informs them of some state of affairs, like this is another's territory, not mine and in essence has some form of self-model.
I often link this story in discussions like this:
https://vimeo.com/72072873
This man made it to adulthood without having learned a language. He eventually learned language by reflecting on what others were doing over time, coming to understand that those scribbles mean things or are about things.
I agree that numbers and information have something in common.
So, numbers physically exist as,
Brain; (numbers)
And, information physically exists as,
Brain; (information)
The general form is,
Brain; (a non-physical thing)
So as far as identity of numbers and information...they are associated with a physical location and time of a physical brain...always.
And they have the mental content consistent with what brains can do.
Numbers and information are not non-physical without support...but only exist as physically supported non-physicals.
Any Claude Shannon reference is going to cause confusion.
Is it physical, non-physical or physically supported non-physicals? I assume anything with Shannon information theory is physical only.
Entropy doesn't apply to non-physicals or physically supported non-physicals.
How does Shannon information even deal with non-physical things such as the past or future?
Our brains do it all the time, so something different is going on. More than a physical signal in our brains...
To be clear, yes of course information storage as genes or words has some entropic cost. To scratch a mark on a rock is an effort. Heat is produced. Making DNA bases or pushing out the air to say a word are all physical acts.
But the trick of a code is that it zeroes this physical cost: it makes the cost always the same, and as low as possible. I can say raven or I can say cosmos or god. The vocal act is physical. But the degree of meaning involved is not tied to that. I can speak nonsense or wisdom and from an entropic point of view it amounts to the same thing.
As they say, infinite variety from finite means. A virtual reality can be conjured up that physical reality can no longer get at with its constraints. But then of course, whether the encoded information is nonsense or wisdom starts to matter when it is used to regulate the physics of the world. It has to cover its small running cost by its effectiveness in keeping the organism alive and intact.
Quoting Harry Hindu
There are grades of semiosis. Indexes, icons and then symbols. So I was talking about symbols when I talk about codes. Marks that bear no physical resemblance to what they are meant to represent.
Animals communicate with signs that are genetically fixed. A peacock has a tail it can raise. But that one sign doesn't become a complex language for talking about anything a peacock wants.
A language is a system of symbolic gestures. Articulate and syntactically structured. A machinery for producing an unlimited variety of mark combinations. Quite different in its ability to generate endless novelty.
So "apprehending forms", in the sense of prelinguistic recognition would amount to prelinguistic conceptualization.
The entropic cost in creating the sound or scribble isn't the only part of the equation. Don't forget about the mind that is observing the mark or hearing the sound and the mental effort involved with decoding the message. It takes more mental power to get at the meaning of "philosophy" than "photograph" even though both words contain the same number of letters. The question then becomes does the discussion about philosophy provide any survival or reproductive benefit (wisdom), or are we just playing symbol games (speaking nonsense)? For humans at least it could be argued that entering a virtual reality world can relieve stress and provide unique social interactions with others sharing the same virtual reality that strengthen social bonds in the physical world.
The speaker or writer must have some sense of empathy for the listener and reader. They have to put things in a way that they know they will understand with the least amount of mental effort (efficiently) if they actually want to be understood without having to re-phrase or repeat themselves.
This is why, for me at least, I get irritated at people that waste my time with word salad, mental gymnastics and intellectual dishonesty, which ends with me not putting much weight into what they write or say in the future.
Quoting apokrisis
I would argue that when a peacock raises its tail it wants to mate. It also communicates to female peacocks the fitness of the male. There is complexity there in the causes that lead to some effect, like a male peacock showing off its tail. I could argue that the display of the peacock's tail says something about the Big Bang, as there would not be peacocks if there wasn't a Big Bang. Of course the immediate effects say more about their immediate causes than some effect billions of years later, but my point is that all effects carry information about their causes.
In reading your words I can get at what you intended to say, the idea you intend to convey, but can get at your level of understanding of English as well. The information is there whether we look or not. Where we look, or what information we attend to, at any given moment is dependent upon the goal in the mind.
It's really just a difference in degrees. More complex brains can use more complex representations and get at more complex causal relations. The question then becomes at what point does the complexity cease to be useful? Are we overcomplicating things with our language, especially in philosophical discussions?
Sure. But then our brain is an expensive organ to run. It uses glucose at the rate of working muscle. On the other hand, that is a constant metabolic cost. There is little change when we daydream or go to sleep.
And the goal of the brain is also to reduce all thoughts to learnt habits. If we figure things out, then our mind can just shortcut to our routine definitions of those words. So what you call mental power is the effort of attending to novelty. But once we have reduced something to a habit of thought, it can simply be unthinkingly emitted. It becomes a remembered formula that just needs to be triggered. The metabolic cost of rewiring the brain has been paid.
Quoting Harry Hindu
You could read that into a peacock tail. But two peacocks just have their one instinctual understanding.
You have actual language and that makes a huge difference. Peacocks only have their genes and neurology informing their behaviour. No virtual social level of communication.
Quoting Harry Hindu
Your own argument says it isn't, if humans have language and a virtual mentality that comes with that.
Can we learn more by using math than by using words? I have not communicated anything with math but computers do not use words to compute. And I am sure my failure to understand math keeps my IQ relatively low.
Here is an explanation of rational.
Animals can problem solve. Some do it better than others. Dogs are amazing. They are the only animal that recognizes that when we point at something, they should go see what we are pointing at. This is one of the reasons they are good hunting partners. However, not all dogs get it. Wolves do not pick up the cue to check out what we are pointing at. Dogs that come from a line of domesticated dogs can be brilliant in figuring out human behavior and how to play the human for all the human is worth. That ability can make the difference between being a street dog or attaching to a human who provides food and shelter.
That information comes from shows about dogs.
That is only awareness of quantity. It is nothing like recognizing numbers as a code and bits of information that can help us understand the universe. It is not like the ability to use origami to understand how nature works.
That comes from a video about a father and son, both mathematicians, discovering how much we have to learn of nature by studying origami. And the ability to work with that kind of information can be transferred in genes. That is, the ability to do math or play the piano can come in our genes. That is true for dogs and humans. :lol: I did not get the gene.
Yes, it is only a basis, not linguistically elaborated obviously.
I agree that many dogs are very smart. It's hard for us, an animal capable of abstracting and reflecting on our experiences, an ability which seems to be reliant on symbolic language, to understand animal intelligence on its own terms, and not to underestimate it. No doubt we have it there somewhere.
Of course I don't really know you and you should consider the following a matter of speculation on my part. If there is something that resonates with you it might be worthwhile to consider it more, if not I won't be offended if you tell me you can't relate to what I say. That said...
I don't think IQ works the way you think. We all have different constellations of cognitive strengths and weaknesses, with the consequence that learning some things may be harder or easier for us than for others. It seems plausible to me that math just doesn't come as easily for you as it does for some or even most. There is no failure on your part in that. Furthermore, it sounds to me like the results of what you have learned are beautiful, and I hope you can be less hard on yourself.
Thank you so much for your concern about my feelings. That is something lacking in forums and yet science is making us aware of how important our emotions are and that we are healthier and happier when we reach out to others. And your nice words make me want to try even harder. I have math games I can put in my computer and I can give some time to using them. If we kept a running thread about math, the social aspect might help me stay motivated.
The weird thing is I am fascinated by math. I have books and DVD's about math. I want to learn the language of math and I understand learning a language is one way to keep our mental powers as we age. And oh shit, I am in trouble. Your encouragement led to looking for books and I have to have at least 2 of them. These are some thrift books offerings...
In one of my sets of college lectures, the professor can talk about knots for at least an hour. This is using math to understand the unseen, such as to explore DNA with math.
I just ordered a book that explains math and animals. If I could get to understanding math at least as well as a dog I will have achieved something while I sit at home with COVID.
I hope we always have a math thread to sustain my interest in math.
You might not be aware that the infamous western gunman, John Wesley Hardin, when in prison worked his way through an algebra textbook. He also became a lawyer.
Quoting Athena
That would have put me to sleep. :cool:
Devlin knows a lot more about this than me, but in all my years I haven't witnessed any kind of improvement that hasn't come from simply picking up an elementary math text and making an attempt to understand it. Or taking an elementary class. With Wikipedia as a sort of backdrop it's easier to do this these days.
Sometimes people convince themselves they have little to no math ability. Then it's really hard to make progress.
The same question arises with critical thinking. I am discouraging there also. But I would love to be shown wrong. ChatGPT disagrees with me, but its suggestions assume someone who has certain personality qualities. Can these be cultivated?
1 + 27 + 4,534 = I eat apple.
There's no math in that. Yeah, I just did that in five minutes. But would we find a solution if we spent a thousand years trying? I doubt it. And I assume it's been tried by plenty of mathematicians over the centuries. I can't imagine a way of actually doing math that also means things we want to discuss.
But next time I'm in Castalia, I'll see if they've figured it out.
Is there a way to have tagged inside of the Athena quote?
Language evolved from a theory of other minds. Animals have learned to anticipate other animals' intentions by observing their behavior and learned to communicate their intentions by behaving in certain ways. Drawing scribbles and making sounds with your mouth are just more complex forms of communicating your intentions and reading into others' intentions.
Words refer to things that are not words. It would be better to show you what I'm talking about than to just tell you. If words only referred to things in our heads, how would we ever be able to communicate that to others? Words refer to things that we can see and feel in the world and are only necessary to communicate to others what they were not present for.
That's been one theory favoured by cognitivists. As a biosemiotician, I would instead stress the simpler story that language proper arose when Homo sapiens evolved the modern articulate vocal tract.
Quoting Harry Hindu
A capacity to generate syntactical speech is a difference in kind and not just degree. All apes are social and so have an ability to anticipate and coordinate actions in their social setting. But no ape can learn fluent grammar.
I think this is where the op goes astray. Information is what is represented by symbols, and "mathematical" is a type of information. Mathematical symbols have, corresponding with them, mathematical information. But not all symbols are mathematical symbols, nor is all information mathematical information.
"Identity" is what a particular (individual) thing is said to have. So when a symbol represents a particular thing, this is a special type of information in which identity is assumed. So the information represented with "that apple is mine", is not mathematical information.
The principal difference between these two types of information seems to be that the same mathematical information is freely applied in a wide variety of situations, in a universal way, and to a multitude of different things, while identity information is by its nature restricted in application, to particular things.
Indeed, not along the lines of the op. I just commented on a snippet of side conversation I thought was interesting. I'll stop now. :smile:
This seems too anthropomorphic to me. The difference you are talking about is the one between the rules of representation humans have selected for the scribbles they use for efficient communication vs. the rules natural selection has produced for efficient communication. One could argue that natural selection had a role in the former as well.
Then there's this:
https://phys.org/news/2024-08-uncovering-secret-communication-marmoset-monkeys.html
There's still a lot we do not know about animal communication. It appears to me that what you have shown is that the level of complexity in communication is based on the degree the brain has evolved to distinguish between certain symbols. It's like comparing how hominids started cooking food by throwing it on a fire and the diversity of recipes we have in the modern era. It's still cooking food.
An advanced alien species that communicates telepathically might consider our mode of communication not a language. There are many different ways to communicate, most of which we probably don't even know about.
:heart: I absolutely love that example. :rofl: That makes as much sense as spell-check programs that obviously don't have a clue about the intended meaning. Or don't know it is a quote and not something to correct. And AI can do better why?
I am creating a thread for communication. "From numbers and information to communication".
I'm not qualified to engage in this profound thread, but your "epiphany" suggested a relationship between Numbers and Information that is not covered by Shannon's engineering theory, yet may be implicit in Plato's broader philosophical worldview.
Shannon's digital Information is defined in terms of pragmatic, physical, immutable, apodictic distinctions. But Plato's ideal Numbers*1 were non-physical, non-sensible things in a realm beyond time and space (transcendent). Ironically, the latter may be more applicable to mundane human use of Information with analog values, personal meanings, and perhaps even fractal dimensions, that don't lend themselves to yes/no digitization.
Quantum Physics has analyzed reality down, not to atoms of value & meaning, but to oceans of value (the Quantum Field) that lie, not on a simplistic linear number line, but in a "transcendent" state-of-being where "real" particles of Matter are temporary, conditional, and statistically probable. Could Plato's ideal non-sensible mathematical realm correspond to that hypothetical abstract mathematical sphere-of-Influence that physicists call "the universal quantum field"*2?
In Plato's Cave allegory, material things in the sensible world are merely shadows of an illuminated-but-unreal domain. Likewise, our social meanings and linguistic information consist of imperfect analog values that are close enough to absolute True/False to be useful for communication. Not Identical, but relative.
Conservative Physicists probably don't think of the Quantum Field as "transcendent", so exploring that possibility is left to Liberal (new-agey ; mystical energy) Metaphysicians*3. Personally, I doubt that there are any practical real-world applications of transcendental preternatural information, such as access to "unlimited knowledge". But the theoretical philosophical implications of perfection may be of interest to those who like to reason beyond immanent Materialism and utilitarian Mechanism. :smile:
*1. Mathematical Platonism :
the doctrine that there exist abstract objects: objects that are wholly nonspatiotemporal, nonphysical, and nonmental . . . . based on the postulation of unchanging and eternal realities known as forms.
https://www.britannica.com/topic/mathematical-Platonism
*2. The Universal Field Theory is not a physics theory in a classical sense. It is rather a philosophical theory explaining Why and How physical phenomena appear.
https://theuniversalfieldtheory.com/
*3. What is Quantum Transcendence? :
I just googled QT and got this hit.
https://www.1to1coachingschool.com/QEC_What_is_Quantum_Transcendence_Coaching.htm
I don't agree that analog values are not information while digital values are. Of course analog values are just as representable on machines. The difference with physical values is that in a machine the precision is fixed and immutable: you cannot extract more or less. Precision is more fluid in natural values; you can expend more or less work to extract more or less precision. But even this precision is ultimately fixed, bound by physical limitations.
Personal meanings, meanwhile, are not in themselves information, but rather frameworks of interpretation. I think the conflation of information and interpretation is one of the main confusions of this topic.
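A rough Python sketch of that point about precision (just an illustration, using the standard decimal module): a hardware float exposes a fixed number of significant digits, while arbitrary-precision arithmetic lets you pay more computation for more digits, though still within the machine's physical limits.

from decimal import Decimal, getcontext

# A binary float carries roughly 15-17 significant decimal digits, no more.
print(1 / 3)   # 0.3333333333333333

# Arbitrary-precision arithmetic trades extra work for extra digits...
getcontext().prec = 50
print(Decimal(1) / Decimal(3))   # 0.33333333333333333333333333333333333333333333333333

# ...but memory and time still bound the precision any physical machine can reach.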
If information is thought of as form (actuality, quiddity) then the idea of information as a "foundation" of sorts is very old indeed. In Aristotle, form (act) has primacy over matter (potency).
But often it seems that attempts to use information in a hylomorphic sense are hamstrung by being unable to jettison the modern conception of matter as having form, and so you end up with reductionist versions of information-based ontologies where things are "made of bits," which seems to badly miss the point.
I suppose this subject is also haunted by the mistake of some scholastics, particularly later ones, of turning natures, species, genera, etc. into logical objects, when they are first and foremost the principles of actual, changing being (and the principles of the change therein). Hence, the idea that evolution is a problem for essences because it shows they can change; well, this presupposes thinking of them in what is probably an unhelpful manner. I think this is an area where Deely's treatment of Aristotle is particularly helpful, even if he tends to neglect the "form as intellection" side that folks like Perl bring out well.
Shannon took an ancient term referring generally & loosely to meaning in a mind*1 --- or, as you noted, "frameworks for interpretation" --- and adapted it for use in mindless computers*2. To that end, he ignored the inconsistent, variable, analog, concrete semantic forms of Information, and focused on the consistent, absolute, digital, abstract mathematical values (either/or ratios) that could be exactly defined as something or nothing (1 or 0).
Human meanings are subject to vague personal interpretation and mis-interpretation, while computer bits & bytes are impersonal & precise. Those numerical values can later be translated back into human (natural language*3) meanings, but at the risk of mis-interpretation. Anything that can cause an information processor (computer or brain) to create meaningful internal Forms (images ; configurations) is a source of Information. :smile:
*1. Information :
Knowledge and the ability to know. Technically, it's the ratio of order to disorder, of positive to negative, of knowledge to ignorance. It's measured in degrees of uncertainty. Those ratios are also called "differences". So Gregory Bateson defined Information as "the difference that makes a difference". The latter distinction refers to "value" or "meaning". Babbage called his prototype computer a "difference engine". Difference is the cause or agent of Change. In Physics it's called "Thermodynamics" or "Energy". In Sociology it's called "Conflict".
https://blog-glossary.enformationism.info/page11.html
*2. Information is an abstract concept that refers to something which has the power to inform. At the most fundamental level, it pertains to the interpretation of that which may be sensed, or their abstractions. . . . Information is not knowledge itself, but the meaning that may be derived from a representation through interpretation ____Wikipedia
*3. Natural Language :
a language that has developed naturally in use (as contrasted with an artificial language or computer code). ____ Oxford Languages
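Incidentally, the "measured in degrees of uncertainty" part has an exact form: Shannon's entropy, H = -sum of p*log2(p) over the possible answers, in bits. A minimal Python sketch (my own illustration, nothing from the glossary above):

import math

def entropy_bits(probs):
    # Average uncertainty of a probability distribution, in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))    # 1.0: a fair yes/no question, maximal uncertainty
print(entropy_bits([0.99, 0.01]))  # ~0.08: a nearly settled question carries little information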
Gnonsense. Shannon worked on analog computers before essentially inventing digital logic. His communication theory was very much about communicating uncorrupted digital data through the noisy analog world. So no, he didn't ignore the analog.
What is with your obsessive need to propagate misinformation?
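To put some substance behind that: channel coding is how digital data gets through analog noise intact. A toy Python sketch with a 3-fold repetition code --- the simplest error-correcting scheme, not Shannon's own construction, but it shows the principle:

import random

def encode(bits, n=3):
    # Repeat each bit n times so the receiver can out-vote channel noise.
    return [b for b in bits for _ in range(n)]

def corrupt(bits, flip_prob=0.1):
    # Crude model of a noisy analog channel: each bit flips independently.
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits, n=3):
    # Majority vote within each group of n recovers a bit whenever
    # fewer than half of its copies were flipped in transit.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
print(decode(corrupt(encode(message))))  # usually reproduces the message exactly

Shannon's noisy-channel coding theorem goes much further, showing that far more efficient codes can drive the error rate arbitrarily low.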
Please note that I wasn't talking about analog Computers (continuous vs digital values), but analog Information*1 (semantic meaning expressed by figurative analogies). Shannon found a way to reduce the Uncertainty of "noisy" Analog Computers, including human brains*2, by using Digital Information in which the Natural Language meaning is converted into synthetic Mathematical symbols. In that process, the real world meanings (analogies ; metaphors ; similes ; nuances) are ignored in favor of abstract numerical values, and must be reconstructed later, opening the possibility of misconstrual.
Ironically, cutting edge computers are now learning to communicate with human programmers in natural language instead of artificial codes*3. How do you think the programmers will deal with the inherent Uncertainties of human language? Your misinterpretation of my human language post is a prime example of self-misinformation. :smile:
*1. Analog Information :
information processing called analog-form information, or simply analog information. Until the development of the digital computer, cognitive information was stored and processed only in analog form, basically through the technologies of printing, photography, and telephony.
https://www.britannica.com/topic/analog-information
*2. Analog Brain :
The mammalian brain, comprised of neuronal networks, functions as an analog device and has given rise to artificial neural networks that are implemented as digital algorithms but function as analog models would.
https://www.frontiersin.org/journals/ecology-and-evolution/articles/10.3389/fevo.2022.796413/full
*3. Why Natural Language is the New Language of the Digital Era
The days of writing lines of code to achieve tasks are gradually giving way to the era of conversation. Natural language processing (NLP) and machine learning have reached a point where machines can not only understand what we say but also grasp the context and nuances of our conversations.
https://www.linkedin.com/pulse/why-natural-language-new-digital-era-anuya-kamat
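To illustrate the conversion in the plainest terms, a short Python sketch (my own example, not from either article): a natural-language string becomes a single integer and back again. The symbols round-trip exactly; the meaning still has to be supplied by an interpreter, which is where the misconstrual can creep in.

text = "meaning"

# Encode the string's bytes as one (large) number.
number = int.from_bytes(text.encode("utf-8"), "big")
print(number)

# Decode the number back into the original symbols, byte for byte.
recovered = number.to_bytes((number.bit_length() + 7) // 8, "big").decode("utf-8")
print(recovered)   # "meaning" --- but the sense of the word is not in the number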
You didn't answer my question. Do you want to hear my best guess at what the answer is?
The computer chip industry understands hylomorphism very well. Why? Because there are the chip designers and the chip fabricators, and nowadays they're usually different companies. This is called 'fabless manufacture' and is the standard model in current chip design. NVidia, for instance, deals entirely with design ('form'), while TSMC is one of the leading companies that fabricate the chips using fiendishly complex machines ('matter') - about which, see this mind-blowing documentary on ASML's EUV lithography machines.
Quoting Count Timothy von Icarus
It is because of reification, the 'thingifying' tendency deeply embedded in modern thought: the belief that only things are real.
Rather the opposite of reification. Instead of treating abstractions as real, it excludes parts of the real from the category "real".
Quoting Count Timothy von Icarus
Why hamstring? If matter has or coexists with form, matter has or coexists with information. What is the problem?
Ha, I am now seeing that the way I wrote that is extremely unclear. Of course matter has form in hylomorphism!
What I meant to say was that in modern, particularly early modern, thought, form is packaged intrinsically with matter into fundamental "building blocks": atomism, reductionism, smallism, "wholes are the sum of their parts." There is no "prime matter" as an abstraction, sheer potency, but rather matter as actual bits of stuff with definite form and definite/actual attributes (eliminating potency).
On such a view, the form of composite objects is just reducible to arrangements of various building blocks. The relationality and processual elements of the information theoretic view get lost. Things "are what they are made of," a view which loses sight of how information is defined by context.
If you maintain this sort of thinking you end up with "stuff made out of 1s and 0s," the sort of thing the original digital physics was raked over the coals for.
Information seems more like number: something that doesn't exist at all without interpretation, and that almost seems to reside in a Platonic realm of its own.