Identity of numbers and information

hypericin August 18, 2024 at 21:31 7150 views 87 comments
"What are numbers?"
"What is information?"

While I cannot answer these perennial philosophical questions, I had the idea that the answers to these questions are the same thing.

Information, at least the information we might be familiar with in computers, certainly seems number-like. Claude Shannon famously defined the fundamental unit of information to be the bit, the answer to a "yes" or "no" question. A bit is the simplest possible number, one that can only take the values 0 or 1. "Bit" is a contraction of "binary digit", and when concatenated together, bits can represent larger numbers. 4-digit decimal numbers can range from 0 to 9999 (0 to 10^4 - 1), and 4-digit binary numbers can range from 0 to 15 (0 to 2^4 - 1). Because an ordinary (base-10) digit carries more information than a bit, 4 ordinary digits can represent more numbers than 4 bits.
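
A quick sanity check of those ranges in Python (just an illustration, any language would do):

print(10 ** 4 - 1)     # 9999: the largest 4-digit decimal number
print(2 ** 4 - 1)      # 15: the largest 4-digit binary number
print(int("1111", 2))  # 15 again: four bits read as a base-2 numeral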

You might be aware that everything stored on computers is a series of bits. Every data file and every program is ultimately a long series of bits. If a bit is a binary digit, then a series of bits is a binary number, a number with potentially millions, billions or more binary digits. But there is nothing special about binary digits. Every binary number has one and only one corresponding base-10 number, just as every base-10 number has its unique corresponding binary number. In other words, they are two representations of the same thing: numbers.
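
For instance, a quick round trip between the two notations (2024 is an arbitrary example):

n = 2024
bits = bin(n)[2:]         # '11111101000': the same number written in base 2
assert int(bits, 2) == n  # and back again, with nothing lost or gained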

Every text document, every email, every mp3 and mp3 player, every web page and web browser, every video game and every operating system, is ultimately an (enormous) number. For instance, the text of this post, before this bar |, is represented by a base-10 number with approximately 3850 digits. An mp3 file would have around 8 million base-10 digits. These huge numbers are sometimes thought to be the province of hobby mathematics, with no expression in the actual universe. In fact, we interact with them every day.
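
As a sketch of that claim (the sample string is my own, not this post's actual text):

text = "What are numbers? What is information?"
data = text.encode("utf-8")      # the text as a series of bytes, i.e. bits
n = int.from_bytes(data, "big")  # ...which is just one huge integer
print(len(str(n)))               # about 92 decimal digits for this short string

# The conversion is lossless: the number alone recovers the text.
assert n.to_bytes((n.bit_length() + 7) // 8, "big").decode("utf-8") == text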

Computers can be thought of as physical instantiations of math functions: a computer ceaselessly takes its current state (a gargantuan number) and transforms it into the next state/number.

State(n+1) = Computer(State(n))

In principle, nothing stops you from implementing a computer as an enormous look-up table, so that the internal logic would be something like

if (state == 1234567890...)
    state = 1423456789...
else if (state == 1234567891...)
    state = 1324567891...
...

and so on. In practice, of course, only the simplest computers could ever be implemented this way.
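
To make the look-up-table idea concrete, here is a toy sketch (the 2-bit "computer" and its table are invented for illustration; all it does is count in a cycle):

# A toy "computer" whose entire logic is one look-up table.
# Each state is a number; the table maps it to the next state.
TRANSITIONS = {
    0b00: 0b01,
    0b01: 0b10,
    0b10: 0b11,
    0b11: 0b00,  # wrap around to the start
}

state = 0b00
for _ in range(6):
    state = TRANSITIONS[state]  # State(n+1) = Computer(State(n))
    print(format(state, "02b"))

With even a single kilobyte of memory the table would need 2^8192 entries, hence the impracticality.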

I think ultimately the difference between information and numbers is only pragmatic. If the number is large and information-dense enough that it is impractical to deal with it without breaking it up, we call it information. If it is small and manageable, it is a number. If information and numbers are the same, this might explain why math keeps popping up in nature. If the universe is both physical and informational, then it is both physical and numerical, and it is therefore to be expected that mathematical patterns show up in the universe.

There is more to say, and I think there are problems with this idea, but what do you guys think?

Comments (87)

Philosophim August 18, 2024 at 21:56 #926489
Sounds pretty good. I've always taken it that numbers are the logical expression of a living being's ability to discretely experience. Experience is the totality of your senses, thoughts, feelings, etc. A discrete experience is your ability to focus on an aspect of it. A sense. A thought. A feeling.

So you can say, that is a (one) blade of grass. Two blades are the concept of 1 and 1 together. And of course you can say, "That is a field of grass." "That is one piece of grass". And so on.

Information is the compact storage of a discrete experience that may be a combination of many aspects, properties, feelings etc. "My dog" evokes a lot of combined information into a neat and reasonable package to think and communicate about. So yes, I like your thinking on this!

apokrisis August 18, 2024 at 22:30 #926500
Quoting hypericin
I think ultimately the difference between information and numbers is only pragmatic.


Both piggyback on the logic of counterfactuality. And that in turn leads to the deep difficulties folk have in believing in a continuum that can also be broken into discrete parts. Logic seems to fail right at the point where you seek its own origin.

So a string of bits or the numberline exist in the happy world where we can just take this paradoxical division between the continuous and the discrete for granted. Continuums are constructible. Don't ask further questions. Get on with counting your numbers and bits.

So yes, a 1D line can be understood as an infinite collection of 0D points. And then you can start mechanically imposing any concept of an ordering of the points on the line that takes your fancy. The syntax can become as complex and hierarchical as you like. An analog reality can be encoded in a string of digital data points. A Turing machine with an infinite paper tape can in principle represent any more complicated state. Counterfactuality is the true atom of Being.

Yet while this happy conception may work pragmatically, its deeper foundations remain suspect. So there is that to consider.


Wayfarer August 18, 2024 at 22:51 #926505
Quoting apokrisis
Both piggyback on the logic of counterfactuality.


Can you unpack that a bit? The meaning doesn't spring from the page, so to speak.

Quoting hypericin
There is more to say, and I think there are problems with this idea, but what do you guys think?


A question that has long interested me, and one of the motivators for joining forums. I once had an epiphany along the lines that while phenomena (1) are composed of parts and (2) begin and end in time, these attributes don't apply to numbers, which neither are composed of parts nor begin and end in time (although I later realised that (1) only properly applies to prime numbers, but the point remains). At the time I had this epiphany, the insight arose, 'so this is why ancient philosophy held arithmetic in high esteem. It was certain, immutable and apodictic.' These are attributes of a higher cognitive functionality, namely rational insight. Of course, I was to discover that this is Platonism 101, and I'm still drawn to the Platonist view of the matter. The philosophical point about it is that through rational thought we have insight into a kind of transcendental realm. As an SEP article puts it:

[quote=SEP;https://plato.stanford.edu/entries/platonism-mathematics/#PhilSignMathPlat]Mathematical platonism has considerable philosophical significance. If the view is true, it will put great pressure on the physicalist idea that reality is exhausted by the physical. For platonism entails that reality extends far beyond the physical world and includes objects that aren’t part of the causal and spatiotemporal order studied by the physical sciences. Mathematical platonism, if true, will also put great pressure on many naturalistic theories of knowledge. For there is little doubt that we possess mathematical knowledge. The truth of mathematical platonism would therefore establish that we have knowledge of abstract (and thus causally inefficacious) objects. This would be an important discovery, which many naturalistic theories of knowledge would struggle to accommodate.[/quote]

See also What is Math? Smithsonian Institution.

As for Shannon's information theory, I think it tends to be somewhat over-interpreted. Shannon was an electronic engineer trying to solve a particular problem: the reliable transmission of information. It was, of course, one of the fundamental discoveries of cybernetics; we all rely on Shannon's work for data compression and transmission every time we use these devices. But there's a lot of hype around information as a kind of fundamental ontological ground, kind of like the digital geist of the computer age.
apokrisis August 18, 2024 at 23:56 #926509
Quoting Wayfarer
Can you unpack that a bit?


It is easy to assume things are just what they are. But that depends on them being in fact not what they are not. That should be familiar to you from Deacon's notion of absentials if not from the three laws of thought.

Counterfactuality secures the identity of things by being able to state what they are not. And that is what a bit represents. A and not not-A. A switch that could be off and thus can in fact be on.

Numbers are places marked on a line. The assumption is that their value is ranked from small to large. So already the symmetry is geometrically broken in a particular fashion. But that asymmetry is secured algebraically by identity operations. Adding and multiplying. Adding zero or multiplying by 1 leaves a number unchanged. Adding or multiplying by any other value then does change things.

So the counterfactuality is a little more obscure in the case of the numberline. The question is whether a value was transformed in a way that either did break its symmetry or didn't break its symmetry. The finger pointing at a position on the line either hopped somewhere else or remained in the same place.

Information and numbers then have to move closer to each other as we seek to employ atomistic bits – little on/off switches – as representations of algebraic structures. To run arithmetic on a machine, the optimal way is to break it all down into a binary information structure that can sit on a mechanical switching structure. Just plug into a socket and watch it run, all its little logic gates clacking away.

So counterfactuality is the way we like to think as it extremitises things to the point of a digital/mechanical clarity. The simplicity of yes or no. All shades of grey excluded. Although you can then go back over the world and pick out as many shades of grey as you like in terms of specific mixtures of black and white. You can recover the greyness of any particular shade of grey to as many decimal places as you like. Or at least to whatever seems acceptable in terms of your computational architecture. The gradual shift from 8-bit precision to 64-bit was pricey.

apokrisis August 19, 2024 at 00:25 #926513
Quoting Wayfarer
But there's a lot of hype around information as a kind of fundamental ontological ground, kind of like the digital geist of the computer age.


So part of what I was pointing out is how information theory cashes out the counterfactuality of logic in actual logic gates and thus in terms of a pragmatic entropic payback.

Numbers seem to live far away in the abstract realm of Platonia. Shannon information was how they could be brought back down to live among us on Earth.

Algebra with real costs and thus the possibility of real profits.
Wayfarer August 19, 2024 at 00:33 #926514
Reply to apokrisis :up: :pray:
SophistiCat August 19, 2024 at 01:00 #926518
Reply to hypericin "Information" is a vexed term, as it is used differently (and often vaguely) in different contexts. A crucial thing about Shannon's theory in particular, which is often lost when it is casually mentioned, as you do here, is that it is a theory of communication, in which bits are only one part of a system that also includes, at a minimum, the encoder, the channel and the decoder. Taken in isolation, numbers or bits cannot be identified with information in any meaningful way.
hypericin August 19, 2024 at 02:22 #926533
Quoting Wayfarer
As for Shannon's information theory, I think it tends to be somewhat over-interpreted. Shannon was an electronic engineer trying to solve a particular problem of reliable transmission of information. Of course one of the fundamental discoveries of cybernetics, we all rely on Shannon's work for data compression and tranmission every time we use these devices. But there's a lot of hype around information as a kind of fundamental ontological ground, kind of like the digital geist of the computer age.


I guess I've bought into the hype. For me, thinking about how a piece of information, say a snippet of song, passes somehow unchanged through multiple wildly different physical media (sound waves, tape, CD, mp3, cable internet, wireless internet, streaming buffer, then back to sound waves as you finally hear it) led me to start conceiving of information and matter as being independent, and both as fundamental elements of the universe (maybe not unlike Aristotle's hylomorphism).

Quoting SophistiCat
"Information" is a vexed term, as it is used differently (and often vaguely) in different contexts. A crucial thing about Shannon's theory in particular, which is often lost when it is casually mentioned, as you do here, is that it is a theory of communication, in which bits are only one part of a system that also includes, at a minimum, the encoder, the channel and the decoder. Taken in isolation, numbers or bits cannot be identified with information in any meaningful way.


I'm not sure. Suppose an archaeologist uncovers tablets on which is inscribed a lost language. What did the archaeologist discover? Seemingly, information that can no longer be decoded. Years later, the language is deciphered. Did the information spring into being? Or was it always there?

Wayfarer August 19, 2024 at 03:22 #926543
Quoting hypericin
then back to sound waves as you finally hear it, led me to start conceiving of information and matter as being independent, and both as fundamental elements of the universe (maybe not unlike Aristotle's hylomorphism).


That, I agree with :100: and have often argued along these lines (see this thread).
180 Proof August 19, 2024 at 04:04 #926546
Quoting hypericin
I think ultimately the difference between information and numbers is only pragmatic.

Afaik, it's "the difference" between pattern-strings and mathematical structures, respectively, such that the latter is an instance of the former. They are formal abstractions which are physically possible to instantiate by degrees – within tractable limits – in physical things / facts and usually according to various, specified ("pragmatic") uses. I think 'Platonizing' information and/or numbers (as 'concept realists', 'hylomorphists', and 'logical idealists' do) is, at best, fallaciously reifying.
Tarskian August 19, 2024 at 05:12 #926550
Quoting hypericin
There is more to say, and I think there are problems with this idea, but what do you guys think?


No problem for visual and auditory information. We can include written language as a specialized subset of auditory information. Practical problems abound, however, with information related to smell, touch, and taste.

Digital scent is considered experimental and quite impractical. There may actually not even be that much demand for it outside specialized application niches:


https://en.wikipedia.org/wiki/Digital_scent_technology

Digital scent technology (or olfactory technology) is the engineering discipline dealing with olfactory representation. It is a technology to sense, transmit and receive scent-enabled digital media (such as motion pictures, video games, virtual reality, extended reality, web pages, and music). The sensing part of this technology works by using olfactometers and electronic noses.

Current challenges. Current obstacles of mainstream adoption include the timing and distribution of scents, a fundamental understanding of human olfactory perception, the health dangers of synthetic scents, and other hurdles.


Digitizing taste is experimental only:

https://en.wikipedia.org/wiki/Gustatory_technology

Virtual taste refers to a taste experience generated by a digital taste simulator. Electrodes are used to simulate the taste and feel of real food in the mouth.[1] In 2012, Dr. Nimesha Ranasinghe and a team of researchers at the National University of Singapore developed the digital lollipop, an electronic device capable of transmitting four major taste sensations (salty, sour, sweet and bitter) to the tongue.


Digitizing touch is also highly experimental research with currently no practical applications to speak of:

https://contextualrobotics.ucsd.edu/seminars/digitizing-touch-sense-unveiling-perceptual-essence-tactile-textures

Imagine you could feel your pet's fur on a Zoom call, the fabric of the clothes you are considering purchasing online, or tissues in medical images. We are all familiar with the impact of digitization of audio and visual information in our daily lives - every time we take videos or pictures on our phones. Yet, there is no such equivalent for our sense of touch. This talk will encompass my scientific efforts in digitizing naturalistic tactile information for the last decade. I will explain the methodologies and interfaces we have been developing with my team and collaborators for capturing, encoding, and recreating the perceptually salient features of tactile textures for active bare-finger interactions. I will also discuss current challenges, future research paths, and potential applications in tactile digitization.

apokrisis August 19, 2024 at 05:17 #926551
Quoting 180 Proof
'Platonizing' information and/or numbers (as 'concept realists', 'hylomorphists, and 'logical idealists' do) is, at best, fallaciously reifying.


Yet the holographic principle in fundamental physics says it means something that the same formalism works for information and entropy. At the Planck scale, the physical distinction between the discrete and the continuous dissolves into its own identity operation.

There is something deeper here, as is now being explored: reality is bound by finitude. Which would be a big change in thinking.
Wayfarer August 19, 2024 at 05:47 #926552
I've tried to explain recently why I think it's fallacious to say that Platonism in mathematics is a reification, meaning literally 'making into a thing'. Numbers and 'forms' in the Platonist sense can be thought of as being real insofar as they can be grasped by reason, but it's not because they exist in the sense that objects exist.

[quote=Perl, Thinking Being]Forms are ideas, not in the sense of concepts or abstractions, but in that they are realities apprehended by thought rather than by sense. They are thus ‘separate’ in that they are not additional members of the world of sensible things, but are known by a different mode of awareness.[/quote]

that different mode being rational insight rather than sensory perception.

Quoting apokrisis
Yet the holographic principle in fundamental physics says it means something that the same formalism works for information and entropy


Wasn't that because Von Neumann, who was an associate of Claude Shannon, suggested to him that he adopt the term 'entropy', noticing that it was isomorphic with Boltzmann's statistical mechanics interpretation of entropy? He also said that 'as nobody really knows what it means then you will always have an advantage in debates'. (Ain't that the truth ;-) )
180 Proof August 19, 2024 at 07:16 #926558
Quoting apokrisis
Reality is bound by finitude.

I don't grok this. :chin:

Quoting Wayfarer
[s]real[/s] insofar as they can be grasped by reason

Semantic quibble: ideal, not "real".
apokrisis August 19, 2024 at 09:56 #926585
Quoting Wayfarer
Wasn't that because Von Neumann, who was an associate of Claude Shannon, suggested to him that he adopt the term 'entropy', noticing that it was isometric with Bolzmann's statistical mechanics interpretation of entropy?


Yep. That’s where it started. With a conceptual similarity. But then an actual physical connection got made. Folk like Szilárd, Brillouin, Landauer and Bekenstein showed that the Boltzmann constant k that sets the fundamental scale for entropy production also sets a fundamental scale for information processing.

Computing creates heat. And so there is an irreducible limit to how much information a volume of space can contain without melting it. Or in fact gravitationally curling it up into a black hole.

In a number of such ways, information and entropy have become two faces of the same coin connected by k, which in turn reduces to c, G and h as the fundamental constants of nature. Reality has a finite grain of resolution. And information and entropy become two ways of talking about the same fact.
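
To put a rough number on that connection, here is a back-of-envelope sketch (the 300 K room temperature is my assumption; the Boltzmann constant is the exact 2019 SI value):

import math

k_B = 1.380649e-23            # Boltzmann constant in J/K (exact in the 2019 SI)
T = 300.0                     # an assumed room temperature, in kelvin

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
print(k_B * T * math.log(2))  # about 2.9e-21 joules per erased bit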

Quoting 180 Proof
I don't grok this.


I’m talking about holography and horizons. Volumes of spacetime can only contain finite amounts of information because information has a finite grain. Or in entropy terms, a finite number of degrees of freedom.

So at the Heat Death, it all just stops. The information represented by the energy density of the Big Bang has reached its eternalised de Sitter state of being cooled and spread out as far as it could ever go. The Universe is a bath of blackbody radiation. But with a temperature in quantum touching distance of absolute zero. Photons with a wavelength the size of the visible universe.

(There are a few issues of course. Like how to account for the dark energy that ensures this de Sitter state where the cosmic event horizon does freeze over at this finite maximum extent. So it is a sketch of the work in progress. Lineweaver is a good source.)

SophistiCat August 19, 2024 at 16:18 #926635
Quoting hypericin
I'm not sure. Suppose an archaeologist uncovers tablets on which are inscribed a lost language. What did the archaeologist discover? Seemingly, information that can no longer be decoded. Years later, the language was translated. Did the information spring into being? Or was it always there?


Exactly, how is it that the same marks on dry clay can carry more or less information in different contexts? And note that it's not just any marks that transmit information. Some random indentations and scratches on the same tablet would not do. How could that be if marks themselves were information?

Also, note that in your example you used clay tablets, not numbers (and in your OP you went back and forth between numbers and computers, which, of course, are not the same thing). This shows that there isn't a necessary connection between information and numbers. Numbers or bits can serve as an abstract representation of an encoded message.
hypericin August 19, 2024 at 21:16 #926675
Quoting SophistiCat
Exactly, how is it that the same marks on dry clay can carry more or less information in different contexts?


I think they can't. They carry the same information, whether or not it happens to be decodable at the time.

Quoting SophistiCat
And note that it's not just any marks that transmit information. Some random indentations and scratches on the same tablet would not do. How could that be if marks themselves were information?


Why does that preclude the marks themselves being information? Marks, arranged in certain ways, are information. Arranged randomly, they are not.

Quoting SophistiCat
and in your OP you went back and forth between numbers and computers, which, of course, are not the same thing


The point was that computers, thought of as information processing devices, are just as much number processing devices.

Quoting SophistiCat
Numbers or bits can serve as an abstract representation of an encoded message.


Why not
Numbers or bits can serve as [s]an abstract representation of[/s] an encoded message

Wayfarer August 19, 2024 at 22:45 #926693
Quoting apokrisis
information and entropy become two ways of talking about the same fact.


I see the connection you're drawing between entropy and information at the physical level, where both are linked by thermodynamic principles through the Boltzmann constant (after a bit of reading!) However, I wonder if equating the two risks losing sight of the fact that information derives its significance from its order. For instance, a random string of letters might technically have entropy, but it lacks the kind of structured information we get from an ordered string of words. Even so, you could transmit random strings of characters and measure the degree of variance (or entropy) at the receiving end, but regardless no information would have been either transmitted or lost. It's this order that makes information meaningful and, importantly, it’s the order that gets degraded in transmission. So I question that equivalence - might it not be a fallacy of equivocation?

apokrisis August 19, 2024 at 23:32 #926705
Quoting Wayfarer
However, I wonder if equating the two risks losing sight of the fact that information derives its significance from its order.


Well if you are interested in processing signals, that requires you to have a model of noise. This was Shannon's actual job, given that he worked for a phone company with crackling long-distance wires. You need to know how often to repeat yourself when it is costing you x dollars a minute. Maybe send a telegram instead.

If you know the worst case scenario – no information getting through to the other end, just random crackle – then you can begin to measure the opposite of that. How much redundancy you can squeeze out of a message and yet rely on it to arrive with all its meaning intact.

So sure. We can load up our bit strings with our precious cargo of "differences that make a difference". We can send them out across the stormy seas of noisy indifference. But if we know that some of the crockery is going to be inevitably broken or nicked, we might choose to pack a few extra cups and plates to be certain.

Information theory just starts with a theory of noise or disorder. Then you can start designing your meaning transmission networks accordingly.
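
The extra-crockery intuition can be made concrete with the simplest code of all, a repetition code decoded by majority vote (a toy sketch; the 10% flip rate and three copies are arbitrary assumptions):

import random

random.seed(0)

def send(bit, p_flip=0.1):
    # A noisy channel: the bit arrives flipped with probability p_flip.
    return bit ^ (random.random() < p_flip)

def send_redundant(bit, copies=3):
    # Repetition code: send several copies, decode by majority vote.
    received = [send(bit) for _ in range(copies)]
    return int(sum(received) > copies // 2)

trials = 10_000
print(sum(send(1) != 1 for _ in range(trials)) / trials)            # about 0.10
print(sum(send_redundant(1) != 1 for _ in range(trials)) / trials)  # about 0.03

Tripling the redundancy cuts the error rate from roughly one in ten to roughly one in thirty-five (p^3 + 3p^2(1-p) ≈ 0.028). Buying reliability with redundancy is exactly Shannon's trade.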

Quoting Wayfarer
So I question that equivalence - might it not be a fallacy of equivocation?


It is like when folk talk about big bangs or dark matter. People latch onto complex scientific ideas in simplistic fashion.

To actually have a flow of meaning – which is what you are meaning by "information" – it has to be a flow of differences that make a difference. However that in turn requires a physical channel that transmits difference in the first place. Like a vocal tract that can make noises. Or a stylus that can scratch marks on wax.

So you start with the mechanics for making a noisy and meaningless racket. Then you can start to add the constraints that suppress the noise and thus enhance the signal. You add the structure that is the grammar or syntax that protects your precious cargo of meaning.

A number line has its direction. You are either counting up or counting down. A bit string has some standard word size that fits the bit architecture of the hardware. As further levels of syntax get added, the computer can figure out whether you are talking about an operation or its data.

So no equivocation. Information theory is the arrival of mechanical precision. Mathematical strength action.

Entropy likewise brings mathematical precision to our woolly everyday notions about randomness or chaos. It gives a physical ground to statistical mechanics. Boltzmann's formula speaks to the idea of noise in terms of joules per kelvin.

Wayfarer August 19, 2024 at 23:41 #926707
Reply to apokrisis I don't think you've seen the point of the objection. The word 'information' is often used in this context, but 'order' and 'meaning' are both much broader in meaning than 'information'. The reason that measures of entropification can be applied to information is just that information represents a form of order, and entropy naturally applies to order. No wonder that Von Neumann spotted an isomorphism between Shannon's methods and entropy.

Besides, I have a suspicion that the designation of 'information' as being foundational to existence, goes back to Norbert Wiener saying 'Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.' I'm sure this is what leads to the prevalence of information-as-foundation in contemporary discourse.
SophistiCat August 20, 2024 at 00:24 #926724
Reply to hypericin Information crucially depends on the sender and the receiver (and noise, if any) - this is what is being neglected here. Divining from patterns of tea leaves or decoding random marks on clay gives you no information, because no information was sent in the first place, despite there being a message. Similarly, numbers in themselves are not information, because they do not encode any message - they are just there.

The message "The cat is on the mat. The cat is on the mat." gives you no more information than the message "The cat is on the mat." even though the former contains more bits than the latter (I am discounting noise for simplicity). The message "Your name is X" gives you no information if your name really is X and you are not suffering from amnesia. So, information depends on the receiver as well.

Numbers can be used in mathematical modeling of communication, but numbers in themselves are no more information than they are novels or bridges or population clusters.
apokrisis August 20, 2024 at 00:29 #926725
Quoting Wayfarer
The reason that measures of entropification can be applied to information, is just because information represents a form of order, and entropy naturally applies to order.


How does entropy apply naturally to order except as its negation or inverse? Just as is the relation between signal and noise in the information perspective.

Quoting Wayfarer
I'm sure this is what leads to the prevalence of information-as-foundation in contemporary discourse.


Did you read the full passage? Do you really now want to commit to the thesis that brains are just mechanical computers? Racks of logic switches? Thought reduces to symbol processing?
hypericin August 20, 2024 at 01:42 #926763
Quoting SophistiCat
Information crucially depends on the sender and the receiver (and noise, if any) - this is what is being neglected here. Divining from patterns of tea leaves or decoding random marks on clay gives you no information, because no information was sent in the first place, despite there being a message. Similarly, numbers in themselves are not information, because they do not encode any message - they are just there.


This seems like a conflation of information and communication, where communication is the transmission of information from A to B. Tea leaves and random marks are rich with information; it would take a lot of work to accurately represent their states. But they are not communications, which are intentional acts.

Quoting SophistiCat
The message "The cat is on the mat. The cat is on the mat." gives you no more information than the message "The cat is on the mat." even though the former contains more bits than the latter (I am discounting noise for simplicity). The message "Your name is X" gives you no information if your name really is X and you are not suffering from amnesia. So, information depends on the receiver as well.


Information is not the same as how informative the information might or might not be to a receiver. "The cat is on the mat. The cat is on the mat." may be no more informative than "The cat is on the mat", but the former still carries more information (it requires more bits to represent, but less than "The cat is on the mat. The dog is on the log.") "Your name is X" may not be informative, but it is still information.
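
This can be made concrete with a general-purpose compressor (a rough sketch only, since zlib merely approximates the true information content, and the sample sentence is mine):

import os
import zlib

sentence = b"The cat is on the mat. "
repeated = sentence * 100            # highly redundant: 2300 bytes of one sentence
noise = os.urandom(len(repeated))    # the same length, but patternless random bytes

print(len(zlib.compress(repeated)))  # a few dozen bytes: the redundancy squeezes out
print(len(zlib.compress(noise)))     # ~2300 bytes or more: randomness won't compress

The repeated message does carry more bits than a single copy, but barely more once its redundancy is squeezed out, while the patternless string is maximally information-dense in Shannon's sense without being informative about anything.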

Quoting SophistiCat
Similarly, numbers in themselves are not information, because they do not encode any message - they are just there.


Just where exactly? If information is distinguished from communicative acts, and from being informative, then numbers are pure information: they are answers to a series of yes and no questions, seen clearly in their binary representations.

SophistiCat August 21, 2024 at 00:10 #926946
Reply to hypericin See, this is what I was talking about: a lot of confusion is created when the term "information" is thrown around with little care given to its meaning. In your OP you were specifically referring to Shannon's theory, and Shannon's theory is all about communication. Shannon did not set out to answer the question, "what is information?" He was interested in the problem of reliable and efficient communication, and his concept of information makes sense in that context. Other concepts of information have been put to different uses. Yours, on the other hand, seems to be a solution in search of a problem.

If you start with the question, "what is information?" the way to go is to survey existing uses of the word. Another approach would be to do what Shannon and other researchers did, which is to start with a specific problem, something that matters, and then see whether a concept with a family resemblance to "information" fits. But starting with the answer, before you even understand the question, is backwards.
Athena August 22, 2024 at 16:58 #927252
Quoting SophistiCat
If you start with the question, "what is information?" the way to go is to survey existing uses of the word. Another approach would be to do what Shannon and other researchers did, which is to start with a specific problem, something that matters, and then see whether a concept with a family resemblance to "information" fits. But starting with the answer, before you even understand the question, is backwards.


I am reading about this matter of what is information. I am reading Rudy Rucker's book "Mind Tools".
He writes: "Anyone at the receiving end of a communication is involved with 'information processing.' I may communicate with you by writing this book, but you need to organize the book's information in terms of mental categories that are significant for you. You must process the information."

"Spoken and written communication are, if you stop to think about it, fully as remarkable as telepathy would be. How is it that you can know my thoughts at all, or I yours? You have a thought, you make some marks on a piece of paper, you mail the paper to me, I look at it, and by some mysterious communication algorithm, I construct in my own brain a pattern that has the same feel as your original thought. Information!"

He goes on to explain numbers, space (and patterns that create forms), logic, and finally infinity and information. Problem is, I know so little about math this book might as well be written in Greek. I am unprepared to understand what I am reading. There is so much I need to know before I can understand.

What if we did not use words, but communicated with math? I know mathematicians can do that, but what if from the beginning we all did? I am sure my IQ would be much higher if I could do that. And I wonder how thinking in mathematical terms might change our emotional experience of life.

jgill August 22, 2024 at 19:08 #927280
Quoting Athena
What if we did not use words, but communicated with math? I know mathematicians can do that, but what if from the beginning we all did? I am sure my IQ would be much higher if I could do that. And I wonder how thinking in mathematical terms might change our emotional experience of life.


Interesting idea. Logicians might be able to do this, but math people use words and symbols. I have never heard of a math research paper written in math symbols only. Thinking in mathematical terms is common amongst my colleagues, but even there one talks to oneself with words.
hypericin August 23, 2024 at 03:01 #927336
I've been thinking more about this. At first I thought I was just mistaken in my OP. The set of all possible arrangements of bits is countable, so it is no wonder that we can uniquely assign a whole number to every arrangement. But the mere fact that bits are countable doesn't establish some kind of identity between bits and numbers.

But then, the set of all quantities is countable, as is the set of points on a number line. Are these two any more inherently number than bits? Quantities have arithmetic operations defined for them that don't make sense for bits, and bits have binary operations that don't make sense for quantities. Is one set of operations more "number" than the others? Or are bit arrangements, quantities, number lines, all just countable things, and so all equally numeric?
jgill August 23, 2024 at 06:29 #927355
Quoting hypericin
the set of all quantities is countable, as is the set of points on a number line.


Not so, my friend, if we speak of the real number line. This has been chewed on on this forum until there is little left to be said.
hypericin August 23, 2024 at 10:26 #927378
Quoting jgill
so, my friend, if we speak of the real number line.


True enough. Instead of points I should have said integer marks.
Lionino August 23, 2024 at 12:10 #927397
Quoting Athena
You have a thought, you make some marks on a piece of paper, you mail the paper to me, I look at it, and by some mysterious communication algorithm, I construct in my own brain a pattern that has the same feel as your original thought. Information!


Sounds like Early Wittgenstein's picture theory of language.

Quoting Athena
What if we did not use words, but communicated with math?


How would that work, basically?
Apustimelogist August 23, 2024 at 15:18 #927424
Reply to hypericin

Why is your view restricting this to numbers? There is no reason you need to represent things with numbers.
Athena August 25, 2024 at 17:23 #927904
Quoting Lionino
Sounds like Early Wittgenstein's picture theory of language.

What if we did not use words, but communicated with math?
— Athena

How would that work, basically?


Good gravy, I do not know! As I just said in another thread, there is so much I do not know and didn't even know I didn't know so much. I swear I am at a time in my life when every day I feel more and more ignorant of everything I do not know, and absolutely panicked to figure out a way to reduce this ignorance.

I sure respect Socrates right now and I would bet my bank account he did not know how much he did not know until he got old. :lol: Remember when we were teenagers and thought we knew it all?

I can barely imagine how it is to think in terms of mathematical code. To see H2O and know that means water is special. To think "water" and immediately think where does it come from and where does it go and why is it important and is it clean and safe to drink, etc., etc. is absolutely amazing! No other animal on the planet has this capability. Yes, animals can communicate, but they do not come close to the mental activity of which we are capable.
Athena August 25, 2024 at 17:25 #927906
Reply to Apustimelogist How many books have you read about why math is deemed a very important thinking tool? What justifies us taking you seriously as an authority on communication or math?
Athena August 25, 2024 at 17:40 #927908
Quoting jgill
Interesting idea. Logicians might be able to do this, but math people use words and symbols. I have never heard of a math research paper written in math symbols only. Thinking in mathematical terms is common amongst my colleagues, but even there one talks to oneself with words.


Perfect! I wish I could know what you know through experience with those who think mathematically. I am quite sure some things cannot be communicated with math, but I am not sure exactly what divides the world of math and human language. I suspect AI does not get high scores for comprehending being human. Being human is an experience, and that is outside of logic. We are not predictable machines. We are barely inside the laws of nature, compared to the rest of the animal kingdom.
Athena August 25, 2024 at 17:57 #927911
"The exciting thing about mathematics and science and music and literature is what they can tell us about the workings of the human mind. For these disciplines are literally models (extensions) of at least certain parts of the mind. Just as the knife cuts but does not chew, while the lens does only a portion of what the eye can do, extensions are reductionist in their capability. No matter how hard it tries, the human race can never fully replace what was left out of extensions in the first place. Also, it is just as important to know what is left out of a given extension system as it is to know what the system will do. Yet the extension-omissions side is frequently overlooked.”
– Edward T. Hall
Apustimelogist August 25, 2024 at 21:37 #927936
Reply to Athena Did you communicate this message with numbers?
hypericin August 25, 2024 at 22:00 #927943
Reply to Apustimelogist
As she sent it with a computer, absolutely.
Apustimelogist August 25, 2024 at 22:24 #927948
Reply to hypericin But the information isn't numbers, the symbols are not numbers.
hypericin August 26, 2024 at 01:29 #927992
Reply to Apustimelogist

The whole point of my OP is that information and numbers are the same thing.
Apustimelogist August 26, 2024 at 03:32 #928020
Reply to hypericin
Yeah but how would you answer the point that you don't need numbers to represent something?
Harry Hindu August 26, 2024 at 12:53 #928105
Quoting apokrisis
So a string of bits or the numberline exist in the happy world where we can just take this paradoxical division between the continuous and the discrete for granted. Continuums are constructible. Don't ask further questions. Get on with counting your numbers and bits.

This makes me think about the distinction, particularly in quantum mechanics, between the unmeasured and the measured.

Numbers are just scribbles (2 and two refer to the same thing; it's just easier to do math with the condensed version, using numbers instead of words) that refer to certain quantities. They are causally connected - the scribble and the quantity of objects within a category, whether it be cows or photons.

Information is the relationship between the scribble (the effect) and the quantity (the cause).

Quoting apokrisis
It is easy to assume things are just what they are. But that depends on them being in fact not what they are not.

Maybe I'm misunderstanding but this seems counter-intuitive considering that we must categorize objects by their similarities, not their differences or what they are not. Objects that are similar fall into some category and it is only then that we can assert that there is a quantity of similar objects. If everything was unique and there are no categories of similar objects then what use are quantities? If there is only one of everything what use is math?

Why does 2+2=4? Some may say that this is a logically sound statement, but why? What makes some string of scribbles true? It seems to me that you have to have made some observation, and categorized your observations, prior to making this statement. Are they just scribbles on this page or are they about something that I can experience and make predictions from?

Quoting SophistiCat
Similarly, numbers in themselves are not information, because they do not encode any message - they are just there.

Where are the numbers and how did they get there?

apokrisis August 26, 2024 at 21:20 #928162
Quoting Harry Hindu
Objects that are similar fall into some category and it is only then that we can assert that there is a quantity of similar objects.


Object recognition thus parallels an entropic view of information. An equilibrium system like an ideal gas is defined by its macro properties - temperature and pressure - and not its micro properties. The actual position of a bunch of particles is an ensemble of differences that becomes merely the sameness of a statistical blur. And the global state of the ensemble is a sameness that can be now treated as a difference when comparing one thermal system to some other in terms of a pressure and temperature.

If I asked you to count the number of crows in a tree, the fact that some were rooks, some ravens, some magpies, would be differences you are being asked to ignore. They are all varieties of crow, but that is being treated as a difference that doesn’t make a difference for the purpose of counting crows.

So reality is like this. There are always further distinctions to be had. Even two electrons might be identical in every way, except they are in different places. But equally, the differences can cease to matter from a higher level that sees instead the sameness of a statistical regularity. Sameness and difference are connected by the third thing of where in scale we choose to stand in measuring the properties of a system.

Are we interested in the distinctions between types of crow? Or, if it is birds we are counting, is a crow any different from an ostrich?

Quoting Harry Hindu
Why does 2+2=4? Some may say that this is logically sound statement, but why? What makes some string of scribbles true?


So reality is divided into sameness and difference by its hierarchical scale. There really is something to talk about at the level of statistical mechanics. But then our talking about it is done in a way that claims to talk past the third thing of a viewpoint where either the sameness or the difference is being ignored. Information and numbers are our means to talk about reality as if from some completely objective nowhere.

It matters in language whether I think I am being asked to count birds or ravens. I have to place myself at a certain interpretive level that matches your understanding about what you were asking. But then the number line is its own abstract thing where there is no physical scale involved. Space, time and energy are all generalised as differences to be ignored. Three ravens is equivalent to three birds, three apples or three spaghetti monsters. The focus is now on the arithmetic or algebraic operations that can be performed on an abstract number system.

We have shifted ourselves into a Platonia so far as reality is concerned. And that new mathematical level of semiosis or reality modelling offers a huge entropic payback for us humans in terms of technology, engineering, computation, and other ways of mechanistically controlling the world.

It makes the whole of reality look the same in our eyes. A mechanical device. A system of particles regulated by differential equations. A sameness of physical laws with a difference in initial conditions.

So numbers and information are part of a new way of speaking about the world that is very useful in proportion to the degree that it is also unreal. It is a language of atomised reductionism that places itself outside even space, time and energy as those are the physical generalities it now aspires to take algorithmic control over.

A modelling relation with the world coming from the God’s eye view. Just equations and variables. Absolute sameness coupled to absolute difference now.

Janus August 26, 2024 at 22:09 #928173
[quote=Perl, Thinking Being]Forms are ideas, not in the sense of concepts or abstractions, but in that they are realities apprehended by thought rather than by sense.[/quote]


Rational thought, or the cognition (the apprehension of pattern) it is grounded in? Animals obviously recognize forms. Should we say they are rational?
apokrisis August 26, 2024 at 22:31 #928180
Reply to Janus Were you addressing me? :chin:

Quoting Janus
Animals obviously recognize forms. Should we say they are rational?


But anyway, animals obviously have good object recognition. They recognise pragmatic forms. But are they apprehending form at a rational level? Or is that level of abstraction how we humans learn to view the world for our own new purpose of seeing reality in general as one giant rational machine?

Semiosis would say that animals are rational at the level of genetic and neural encoding. They see the world in terms of a regulated metabolism and a patterned environment.

Humans have linguistic semiosis which gets us to a level of seeing the world in terms of a pattern of interaction between intentional agents. The play of viewpoints captured by being able to speak of me and you, before and after, good and bad.

And the OP concerns mathematical semiosis. Pattern abstracted to the point of a mechanical generality. The ability to construct forms in algorithmic fashion. What seems to us the ultimate level of rationalised structure.

apokrisis August 26, 2024 at 23:20 #928189
Quoting apokrisis
Semiosis would say that animals are rational at the level of genetic and neural encoding.


This seems a useful clarification. Information is encoded meaning. Genetic information encodes for constraints on chemical actions. Neural information encodes for constraints on environmental actions. Verbal information encodes for constraints on intentional actions. And numeric information encodes for constraints on mathematical actions.

So information is about the pragmatic encoding of meanings. It is how an organism regulates its world by having the kind of memory that acts as a store for data and algorithms – to put it rather computationally. An organism can construct states of constraint because a meaningful relation with the world has been atomised into a system of syntax acting on semantics. Habits or routines that can be run. Behaviours which can be switched on or off.

Numbers are then just the form that information takes at the level of a complete semiotic abstraction in terms of the self that is aiming to regulate its world by the business of constructing states of constraint. A numberline gives both the data points and the algorithmic logic that are needed to encode an absolutely general – but also perfectly mechanistic – modelling relation with the world.

So four levels of information or mechanistic regulation. With maths and logic at the end of this trail as the most evolved form of pragmatic rationality. The ultimate way that an organism could think about the world. If also then, the least actually "organic". :razz:

Janus August 26, 2024 at 23:58 #928205
Reply to apokrisis I was addressing the quoted text from Perl. I haven't read his work but have received the impression that "apprehending form via the rational intellect" was the thought in play there. I guess it depends on whether you think "apprehending form" means recognizing it or reflecting on it. I would agree with you that the latter requires symbolic language and I don't think that is at all controversial.

Quoting apokrisis
Numbers are then just the form that information takes at the level of a complete semiotic abstraction in terms of the self that is aiming to regulate its world by the business of constructing states of constraint.


Yes, "numbers" are abstractions. But I think animals have a sense of number. The word "form" in information seems to reflect the relationship between information and form. Form and information and number all primordially rely on cognition and recognition of difference and sameness or similarity and pattern.
apokrisis August 27, 2024 at 00:09 #928208
Quoting Janus
But I think animals have a sense of number.


Or of perceptual grouping. Human working memory famously tops out at about “7 ± 2” items. The kind of grouping in the test that Trump aced when he could recall “Person, woman, man, camera, TV.“ And probably even manage that feat in reverse order.

Animals all have working memory too. The ability to juggle a small set of particular aspects of some larger cognitive task.

But that is not the same as counting. Just the reason why we struggle with holding number strings longer than seven in our working memories.

Quoting Janus
The word "form" in information seems to reflect the relationship between information and form.


Indeed. I would call the abstracted notion of information our “atoms of form”. Form reduced to the counterfactuality of a binary switch. We can count how many distinctions it takes to arrive at something completely specific.

The game of Twenty Questions is a good example. Ideally, every question cuts the number of remaining choices in half. And that way we cut through a world of possibilities with an exponentialised efficiency.

The form I have in mind is … well you will just have to start guessing. And each guess is an atomistic act of counterfactual switching. If you are any good at this game, you will switch off half the world of possibilities as you zero in on the possibilities still left switched on.
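
The arithmetic behind the game is just repeated halving (a trivial sketch):

import math

# Twenty ideal yes/no questions distinguish 2**20 possibilities.
print(2 ** 20)                           # 1048576, about a million
# Conversely, a million items need ceil(log2(1,000,000)) questions.
print(math.ceil(math.log2(1_000_000)))  # 20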

Janus August 27, 2024 at 00:26 #928214
Quoting apokrisis
But that is not the same as counting. Just the reason why we struggle with holding number strings longer than seven in our working memories.


That makes sense to me. I have come across reports that suggest some animals can learn to do basic small number counting. They may be apocryphal.
apokrisis August 27, 2024 at 00:34 #928217
Quoting Janus
I have come across reports that suggest some animals can learn to do basic small number counting.


You know yourself that three, four or even five things can be seen as different sized collections at a single glance. And remembered as such. But the difference between seven or eight apples starts to require a method of checking if you want to be sure of your mathematical correctness. Whereas for a hungry monkey, it becomes a difference not making a difference. It is just seen as a lot of apples.

Janus August 27, 2024 at 00:42 #928221

Reply to apokrisis :up:

Quoting apokrisis
The game of Twenty Questions is a good example. Ideally, every question cuts the number of remaining choices in half. And that way we cut through a world of possibilities with an exponentialised efficiency.


Nice ordinary example!
Shawn August 27, 2024 at 01:51 #928235
Quoting hypericin
I've been thinking more about this. At first I thought I was just mistaken in my op. The set of all possible arrangements of bits is countable, so it is no wonder that we can uniquely assign a whole number to every arrangement. Just because bits are countable, doesn't establish some kind of identity between bits and numbers.


I wouldn't be surprised if you were able to (in a manner unknown to me at the moment) topologically map out the possible logical spaces for information to represent truths in state space. Again, I am assuming that theorems and the truth of theorems can be topologically mapped out in a heuristic manner in logical space.
Harry Hindu August 27, 2024 at 12:29 #928318
Quoting apokrisis
So reality is like this. There are always further distinctions to be had. Even two electrons might be identical in every way, except they are in different places. But equally, the differences can cease to matter from a higher level that sees instead the sameness of a statistical regularity. Sameness and difference are connected by the third thing of where in scale we choose to stand in measuring the properties of a system.

Sure. Information is everywhere that causes leave effects. What information is relevant, or attended to, depends on the goal in the mind.

Quoting apokrisis
So numbers and information are part of a new way of speaking about the world that is very useful in proportion to the degree that it is also unreal. It is a language of atomised reductionism that places itself outside even space, time and energy as those are the physical generalities it now aspires to take algorithmic control over.

I don't know if I agree with what you're saying here. What does it mean for something to be useful but not real? What does it mean for something to be useful if not having some element of being real? It seems to me that survival is the best incentive for getting things right. The environment selects traits that benefit the survival and reproductive fitness of organisms. Our highly evolved brain must have been selected for a reason and there must be a reason why humans have been so successful in spreading across the planet and out into space. Are those reasons unreal? Do your many words point to real states of reality? Am I to gain some advantage by reading your words? If not, then why read them?

It seems to me that a rational process takes time and mental space.

If I were to talk about marijuana legalization in this thread, would that be a real state of affairs of being off-topic? It seems to me that the way we perceive the world has a real effect on the world by means of our behaviors. Is Santa Claus real? As an idea Santa Claus is very real as you merely need to look at the effect the idea has had on the world.

In your example of the variety of birds we currently observe, we can point to evolution as the cause. Their differences evolved to fill different environmental niches. The variety of birds informs us of how they evolved and what their common ancestor would be like. The differences and similarities in birds indicate that they started with one common ancestor and evolved over time in different environments. We could potentially point to one common ancestor for all life with space and time being the medium in which the differences accumulate to the current state of affairs with the variety of life that we observe today.

Harry Hindu August 27, 2024 at 14:48 #928329
Reply to apokrisis
Quoting Janus
I was addressing the quoted text from Perl. I haven't read his work but have received the impression that "apprehending form via the rational intellect" was the thought in play there. I guess it depends on whether you think "apprehending form" means recognizing it or reflecting on it. I would agree with you that the latter requires symbolic language and I don't think that is at all controversial.

How does one even learn a language without apprehending the scribbles and sounds in the present and reflecting on how those same scribbles and sounds were used before? I could argue that language use is just more complex learned behavior. Animals communicate with each other using sounds, smells and visual markings. Animals understand that there is more to the markings than just the form the marking takes. It informs them of some state of affairs, like this is another's territory, not mine and in essence has some form of self-model.

I often link this story in discussions like this:
https://vimeo.com/72072873

This man made it to adulthood without having learned a language. He eventually learned language by reflecting on what others were doing, over time coming to understand that those scribbles mean things, or are about things.

Mark Nyquist August 27, 2024 at 18:16 #928379
Reply to hypericin
I agree that numbers and information have something in common.

So, numbers physically exist as,
Brain; (numbers)
And, information physically exists as,
Brain; (information)

The general form is,
Brain; (a non-physical thing)

So as far as identity of numbers and information...they are associated with a physical location and time of a physical brain...always.
And they have the mental content consistent with what brains can do.

Numbers and information are not non-physicals without support...but only exist as physically supported non-physicals.

Any Claude Shannon reference is going to cause confusion.
Is it physical, non-physical, or a physically supported non-physical? I assume anything with Shannon information theory is physical only.
Entropy doesn't apply to non-physicals or physically supported non-physicals.

How does Shannon information even deal with non-physical things such as the past or future?
Our brains do it all the time, so something different is going on. More than a physical signal in our brains...
apokrisis August 27, 2024 at 20:04 #928407
Quoting Harry Hindu
What does it mean for something to be useful but not real?


To be clear, yes of course information storage as genes or words has some entropic cost. To scratch a mark on a rock is an effort. Heat is produced. Making DNA bases or pushing out the air to say a word are all physical acts.

But the trick of a code is that it zeroes this physical cost to make it always the same and as low-cost as possible. I can say raven or I can say cosmos or god. The vocal act is physical. But the degree of meaning involved is not tied to that. I can speak nonsense or wisdom and from an entropic point of view it amounts to the same thing.

As they say, infinite variety from finite means. A virtual reality can be conjured up that physical reality can no longer get at with its constraints. But then of course, whether the encoded information is nonsense or wisdom starts to matter when it is used to regulate the physics of the world. It has to cover its small running cost by its effectiveness in keeping the organism alive and intact.
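A toy illustration of the entropic point (the strings and helper function below are mine, invented purely for the sketch): two messages of the same length cost the same number of raw bits to store or transmit, and an anagram even has an identical first-order Shannon entropy, whatever it means:

from collections import Counter
from math import log2, isclose

wisdom   = "know thyself"
nonsense = "hwnko tyfsle"   # the same characters, scrambled

# Raw storage cost is identical: 12 ASCII bytes = 96 bits either way.
bits = lambda s: len(s.encode("utf-8")) * 8
assert bits(wisdom) == bits(nonsense) == 96

def entropy_per_char(s):
    # First-order Shannon entropy of the character distribution.
    n = len(s)
    return -sum(c/n * log2(c/n) for c in Counter(s).values())

# Anagrams share the same character statistics, hence the same entropy.
assert isclose(entropy_per_char(wisdom), entropy_per_char(nonsense))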

Quoting Harry Hindu
I could argue that language use is just more complex learned behavior. Animals communicate with each other using sounds, smells and visual markings.


There are grades of semiosis. Indexes, icons and then symbols. So I was talking about symbols when I talk about codes. Marks that bear no physical resemblance to what they are meant to represent.

Animals communicate with signs that are genetically fixed. A peacock has a tail it can raise. But that one sign doesn’t become a complex language for talking about anything a peacock wants.

A language is a system of symbolic gestures. Articulate and syntactically structured. A machinery for producing an unlimited variety of mark combinations. Quite different in its ability to generate endless novelty.
Janus August 27, 2024 at 20:07 #928409
Reply to Harry Hindu :up: It is quite a few years since I read A Man Without Words. It seems reasonable to think people and some animals can conceptualize prelinguistically in the form of imagery.

So "apprehending forms", in the sense of prelinguistic recognition would amount to prelinguistic conceptualization.
Harry Hindu August 28, 2024 at 11:48 #928594
Quoting apokrisis
To be clear, yes of course information storage as genes or words has some entropic cost. To scratch a mark on a rock is an effort. Heat is produced. Making DNA bases or pushing out the air to say a word are all physical acts.

But the trick of a code is that it zeroes this physical cost to make it always the same and as low-cost as possible. I can say raven or I can say cosmos or god. The vocal act is physical. But the degree of meaning involved is not tied to that. I can speak nonsense or wisdom and from an entropic point of view it amounts to the same thing.

As they say, infinite variety from finite means. A virtual reality can be conjured up that physical reality can no longer get at with its constraints. But then of course, whether the encoded information is nonsense or wisdom starts to matter when it is used to regulate the physics of the world. It has to cover its small running cost by its effectiveness in keeping the organism alive and intact.


The entropic cost in creating the sound or scribble isn't the only part of the equation. Don't forget about the mind that is observing the mark or hearing the sound and the mental effort involved with decoding the message. It takes more mental power to get at the meaning of "philosophy" than "photograph" even though both words contain the same number of letters. The question then becomes: does the discussion about philosophy provide any survival or reproductive benefit (wisdom), or are we just playing symbol games (speaking nonsense)? For humans at least it could be argued that entering a virtual reality world can relieve stress and provide unique social interactions with others sharing the same virtual reality that strengthen social bonds in the physical world.

The speaker or writer must have some sense of empathy for the listener and reader. They have to put things in a way that they know their audience will understand with the least amount of mental effort (efficiently), if they actually want to be understood without having to re-phrase or repeat themselves.

This is why, for me at least, I get irritated at people that waste my time with word salad, mental gymnastics and intellectual dishonesty, which ends with me not putting much weight into what they write or say in the future.

Quoting apokrisis
There are grades of semiosis. Indexes, icons and then symbols. So I was talking about symbols when I talk about codes. Marks that bear no physical resemblance to what they are meant to represent.

Animals communicate with signs that are genetically fixed. A peacock has a tail it can raise. But that one sign doesn’t become a complex language for talking about anything a peacock wants.

A language is a system of symbolic gestures. Articulate and syntactically structured. A machinery for producing an unlimited variety of mark combinations. Quite different in its ability to generate endless novelty.


I would argue that when a peacock raises its tail it wants to mate. It also communicates to female peacocks the fitness of the male. There is complexity there in the causes that lead to some effect, like a male peacock showing off its tail. I could argue that the display of the peacock's tail says something about the Big Bang, as there would not be peacocks if there hadn't been a Big Bang. Of course the immediate effects say more about their immediate causes than some effect billions of years later, but my point is that all effects carry information about their causes.

In reading your words I can get at what you intended to say, the idea you intend to convey, but can get at your level of understanding of English as well. The information is there whether we look or not. Where we look, or what information we attend to, at any given moment is dependent upon the goal in the mind.

It's really just a difference in degrees. More complex brains can use more complex representations and get at more complex causal relations. The question then becomes at what point does the complexity cease to be useful? Are we overcomplicating things with our language, especially in philosophical discussions?
apokrisis August 28, 2024 at 19:59 #928669
Quoting Harry Hindu
It takes more mental power to get at the meaning of "philosophy" than "photograph" even though both words contain the same number of letters.


Sure. But then our brain is an expensive organ to run. It uses glucose at the rate of working muscle. On the other hand, that is a constant metabolic cost. There is little change when we daydream or go to sleep.

And the goal of the brain is also to reduce all thoughts to learnt habits. If we figure things out, then our mind can just shortcut to our routine definitions of those words. So what you call mental power is the effort of attending to novelty. But once we have reduced some thing to a habit of thought, it can simply be unthinkingly emitted. It becomes a remembered formula that just needs to be triggered. The metabolic cost of rewiring the brain has been paid.

Quoting Harry Hindu
I could argue that the display of the peacock's tail says something about the Big Bang, as there would not be peacocks if there hadn't been a Big Bang.


You could read that into a peacock tail. But two peacocks just have their one instinctual understanding.

You have actual language and that makes a huge difference. Peacocks only have their genes and neurology informing their behaviour. No virtual social level of communication.

Quoting Harry Hindu
It's really just a difference in degrees. More complex brains can use more complex representations and get at more complex causal relations.


Your own argument says it isn't just a difference in degree if humans have language and the virtual mentality that comes with it.



Athena August 29, 2024 at 13:28 #928841
Quoting Apustimelogist
Did you communicate this message with numbers?


Can we learn more by using math than by using words? I have not communicated anything with math but computers do not use words to compute. And I am sure my failure to understand math keeps my IQ relatively low.
Athena August 29, 2024 at 14:01 #928851
Quoting Janus
Rational thought or the cognition, the apprehension of pattern, it is grounded in? Animals obviously recognize forms. Should we say they are rational?


Here is an explanation of rational
Rational behavior is used to describe a decision-making process that results in the optimal level of benefit, or alternatively, the maximum amount of utility. https://corporatefinanceinstitute.com/resources/career-map/sell-side/capital-markets/rational-behavior/#:~:text=What%20is%20Rational%20Behavior%3F,highest%20amount%20of%20personal%20satisfaction

Animals can problem solve. Some do it better than others. Dogs are amazing. They are the only animal that understands that when we point at something, it should go see what we are pointing at. This is one of the reasons they are good hunting partners. However, not all dogs get it. Wolves do not pick up the cue to check out what we are pointing at. Dogs that come from a line of domesticated dogs can be brilliant at figuring out human behavior and how to play the human for all the human is worth. That ability can make the difference between being a street dog and attaching to a human who provides food and shelter.

That information comes from shows about dogs.
Athena August 29, 2024 at 14:15 #928856
Quoting Janus
But I think animals have a sense of number


That is only awareness of quantity. It is nothing like recognizing numbers as a code and bits of information that can help us understand the universe. It is not like the ability to use origami to understand how nature works.

That comes from a video about a father and son, both mathematicians, discovering how much we have to learn about nature by studying origami. And the ability to work with that kind of information can be transmitted in genes. That is, the ability to do math or play the piano can come in our genes. That is true for dogs and humans. :lol: I did not get the gene.
Janus August 29, 2024 at 20:52 #928937
Quoting Athena
That is only awareness of quantity.


Yes, it is only a basis, not linguistically elaborated obviously.

Reply to Athena I agree that many dogs are very smart. It's hard for us, an animal capable of abstracting and reflecting on our experiences, an ability which seems to be reliant on symbolic language, to understand animal intelligence on its own terms, and not to underestimate it. No doubt we have it there somewhere.
wonderer1 August 29, 2024 at 22:13 #928955
Quoting Athena
Can we learn more by using math than by using words? I have not communicated anything with math but computers do not use words to compute. And I am sure my failure to understand math keeps my IQ relatively low.


Of course I don't really know you and you should consider the following a matter of speculation on my part. If there is something that resonates with you it might be worthwhile to consider it more, if not I won't be offended if you tell me you can't relate to what I say. That said...

I don't think IQ works the way you think. We all have different constellations of cognitive strengths and weaknesses, with the consequence that learning some things may be harder or easier for us than for others. It seems plausible to me that math just doesn't come as easy for you as it does for some or even most. There is no failure on your part in that. Furthermore, it sounds to me like the results of what you have learned are beautiful, and I hope you can be less hard on yourself.

Athena August 30, 2024 at 12:53 #929121
Quoting wonderer1
Of course I don't really know you and you should consider the following a matter of speculation on my part. If there is something that resonates with you it might be worthwhile to consider it more, if not I won't be offended if you tell me you can't relate to what I say. That said...

I don't think IQ works the way you think. We all have different constellations of cognitive strengths and weaknesses, with the consequence that learning some things may be harder or easier for us than for others. It seems plausible to me that math just doesn't come as easy for you as it does for some or even most. There is no failure on your part in that. Furthermore, it sounds to me like the results of what you have learned are beautiful, and I hope you can be less hard on yourself.


Thank you so much for your concern about my feelings. That is something lacking in forums and yet science is making us aware of how important our emotions are and that we are healthier and happier when we reach out to others. And your nice words make me want to try even harder. I have math games I can put in my computer and I can give some time to using them. If we kept a running thread about math, the social aspect might help me stay motivated.

The weird thing is I am fascinated by math. I have books and DVDs about math. I want to learn the language of math, and I understand learning a language is one way to keep our mental powers as we age. And oh shit, I am in trouble. Your encouragement led to looking for books and I have to have at least 2 of them. These are some thrift-book offerings...

The Language of Mathematics: Making the Invisible Visible
by Keith Devlin
"The great book of nature," said Galileo, "can be read only by those who know the language in which it was written. And this language is mathematics." In The Language of Mathematics, award-winning author Keith Devlin reveals the vital role mathematics plays in our eternal quest to understand who we are and the world we live in. More than just the study of numbers, mathematics provides us with the eyes to recognize and describe the hidden...


The Math Instinct: Why You're a Mathematical Genius (Along with Lobsters, Birds, Cats, and Dogs)
by Keith Devlin

There are two kinds of math: the hard kind and the easy kind. The easy kind, practiced by ants, shrimp, Welsh corgis -- and us -- is innate. What innate calculating skills do we humans have? Leaving aside built-in mathematics, such as the visual system, ordinary people do just fine when faced with mathematical tasks in the course of the day. Yet when they are confronted with the same tasks presented as "math," their accuracy often drops. But if we have innate mathematical ability, why do we have to teach math and why do most of us find it so hard to learn? Are there tricks or strategies that the ordinary person can do to improve mathematical ability? Can we improve our math skills by learning from dogs, cats, and other creatures that "do math"? The answer to each of these questions is a qualified yes. All these examples of animal math suggest that if we want to do better in the formal kind of math, we should see how it arises from natural mathematics. From NPR's "Math Guy" -- The Math Instinct will provide even the most number-phobic among us with confidence in our own mathematical abilities.


In one of my sets of college lectures, the professor can talk about knots for at least an hour. This is using math to understand the unseen, such as exploring DNA.
Athena August 30, 2024 at 13:35 #929126
Quoting Janus
I agree that many dogs are very smart. It's hard for us, an animal capable of abstracting and reflecting on our experiences, an ability which seems to be reliant on symbolic language, to understand animal intelligence on its own terms, and not to underestimate it. No doubt we have it there somewhere.


I just ordered a book that explains math and animals. If I can get to understanding math at least as well as a dog, I will have achieved something while I sit at home with COVID.

I hope we always have a math thread to sustain my interest in math.
jgill August 30, 2024 at 20:41 #929204
Quoting Athena
The weird thing is I am fascinated by math. I have books and DVDs about math. I want to learn the language of math, and I understand learning a language is one way to keep our mental powers as we age


You might not be aware that the infamous western gunman John Wesley Hardin, when in prison, worked his way through an algebra textbook. He also became a lawyer.

Quoting Athena
In one of my sets of college lectures, the professor can talk about knots for at least an hour.


That would have put me to sleep. :cool:

Janus August 30, 2024 at 22:01 #929222
Reply to Athena I have no doubt that with enough passion you will get there. I hope you have a speedy recovery from Covid.
jgill August 31, 2024 at 04:34 #929283
Are there tricks or strategies that the ordinary person can do to improve mathematical ability?


Devlin knows a lot more about this than me, but in all my years I haven't witnessed any kind of improvement that hasn't come from simply picking up an elementary math text and making an attempt to understand it. Or taking an elementary class. With Wikipedia as a sort of backdrop, it's easier to do this these days.

Sometimes people convince themselves they have little to no math ability. Then it's really hard to make progress.

The same question arises with critical thinking. I am discouraging there also. But I would love to be shown wrong. ChatGPT disagrees with me, but its suggestions assume someone who has certain personality qualities. Can these be cultivated?
Patterner August 31, 2024 at 05:32 #929288
Quoting Athena
What if we did not use words, but communicated with math?
— Athena

How would that work, basically?
— Lionino

Good gravy, I do not know!
I doubt it's possible. We communicate much more than mathematical ideas. If we tried using math to talk about any of those things, it would no longer be math. It would be numbers, equations, etc., representing things. Just another language. 1 stands for me. 27 stands for eat. 4,534 stands for apple.
1 + 27 + 4,534 = I eat apple.
There's no math in that. Yeah, I just did that in five minutes. But would we find a solution if we spent a thousand years trying? I doubt it. And I assume it's been tried by plenty of mathematicians over the centuries. I can't imagine a way of actually doing math that also means things we want to discuss.

But next time I'm in Castalia, I'll see if they've figured it out.
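Patterner's codebook is easy to mock up, and the mock-up shows exactly why the arithmetic there is only decorative (the code values are the ones invented above; everything else is a hypothetical sketch): the moment you actually apply "+", the message is destroyed.

# A toy version of the word-number codebook (the codes are arbitrary).
codebook = {1: "I", 27: "eat", 4534: "apple"}
message = [1, 27, 4534]
print(" ".join(codebook[n] for n in message))   # -> I eat apple

# Genuinely adding the codes collapses them into the single number
# 4562, from which the original words can no longer be recovered.
assert 1 + 27 + 4534 == 4562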


Is there a way to have tagged Reply to Lionino inside of the Athena quote?
Harry Hindu August 31, 2024 at 16:22 #929354
Quoting apokrisis
I could argue that the display of the peacock's tail says something about the Big Bang, as there would not be peacocks if there hadn't been a Big Bang.
— Harry Hindu

You could read that into a peacock tail. But two peacocks just have their one instinctual understanding.

You have actual language and that makes a huge difference. Peacocks only have their genes and neurology informing their behaviour. No virtual social level of communication.

It's really just a difference in degrees. More complex brains can use more complex representations and get at more complex causal relations.
— Harry Hindu

Your own argument says it isn't just a difference in degree if humans have language and the virtual mentality that comes with it.

Language evolved from a theory of other minds. Animals have learned to anticipate other animals' intentions by observing their behavior, and learned to communicate their intentions by behaving in certain ways. Drawing scribbles and making sounds with your mouth are just more complex forms of communicating your intentions and reading into others' intentions.

Words refer to things that are not words. It would be better to show you what I'm talking about than to just tell you. If words only referred to things in our heads, how would we ever be able to communicate that to others? Words refer to things that we can see and feel in the world and are only necessary to communicate to others what they were not present for.
apokrisis August 31, 2024 at 20:49 #929395
Quoting Harry Hindu
Language evolved from a theory of other minds.


That’s been one theory favoured by cognitivists. As a biosemiotician, I would instead stress the simpler story that language proper arose when Homo sapiens evolved the modern articulate vocal tract.

Quoting Harry Hindu
Drawing scribbles and making sounds with your mouth are just more complex forms of communicating your intentions and reading into others' intentions.


A capacity to generate syntactical speech is a difference in kind and not just degree. All apes are social and so have an ability to anticipate and coordinate actions in their social setting. But no ape can learn fluent grammar.


Metaphysician Undercover September 01, 2024 at 11:39 #929503
Quoting Patterner
I doubt it's possible. We communicate much more than mathematical ideas. If we tried using math to talk about any of those things, it would no longer be math. It would be numbers, equations, etc., representing things. Just another language. 1 stands for me. 27 stands for eat. 4,534 stands for apple.
1 + 27 + 4,534 = I eat apple.
There's no math in that. Yeah, I just did that in five minutes. But would we find a solution if we spent a thousand years trying? I doubt it. And I assume it's been tried by plenty of mathematicians over the centuries. I can't imagine a way of actually doing math that also means things we want to discuss.


I think this is where the op goes astray. Information is what is represented by symbols, and "mathematical" is a type of information. Mathematical symbols have, corresponding with them, mathematical information. But not all symbols are mathematical symbols, nor is all information mathematical information.

"Identity" is what a particular (individual) thing is said to have. So when a symbol represents a particular thing, this is a special type of information in which identity is assumed. So the information represented with "that apple is mine", is not mathematical information.

The principal difference between these two types of information seems to be that the same mathematical information is freely applied in a wide variety of situations, in a universal way, and to a multitude of different things, while identity information is by its nature restricted in application, to particular things.
Patterner September 01, 2024 at 14:12 #929513
Reply to Metaphysician Undercover
Indeed, not along the lines of the op. I just commented on a snippet of side conversation I thought was interesting. I'll stop now. :smile:
Harry Hindu September 01, 2024 at 14:55 #929520
Quoting apokrisis
That’s been one theory favoured by cognitivists. As a biosemiotician, I would instead stress the simpler story that language proper arose when Homo sapiens evolved the modern articulate vocal tract.

Drawing scribbles and making sounds with your mouth are just more complex forms of communicating your intentions and reading into others' intentions.
— Harry Hindu

A capacity to generate syntactical speech is a difference in kind and not just degree. All apes are social and so have an ability to anticipate and coordinate actions in their social setting. But no ape can learn fluent grammar.

This seems too anthropomorphic to me. The difference you are talking about is one between the rules of representation humans have selected in the scribbles they use for efficient communication vs. the rules natural selection has selected for efficient communication. One could argue that natural selection had a role in the former as well.

Then there's this:
https://phys.org/news/2024-08-uncovering-secret-communication-marmoset-monkeys.html

There's still a lot we do not know about animal communication. It appears to me that what you have shown is that the level of complexity in communication is based on the degree to which the brain has evolved to distinguish between certain symbols. It's like comparing how hominids started cooking food by throwing it on a fire with the diversity of recipes we have in the modern era. It's still cooking food.

An advanced alien species that communicates telepathically might consider our mode of communication not a language. There are many different ways to communicate, most of which we probably don't even know about.
Athena September 01, 2024 at 17:22 #929559
Quoting Patterner
1 + 27 + 4,534 = I eat apple.


:heart: I absolutely love that example. :rofl: That makes as much sense as spell-check programs that obviously don't have a clue about the intended meaning. Or don't know it is a quote and not something to correct. And AI can do better why?

Athena September 01, 2024 at 17:46 #929565
I am not sure if there should be a separate thread for communication, because we are getting far from the identity of numbers and information, and when this happens a thread loses its cohesiveness and cognizance.

I am creating a thread for communication. "From numbers and information to communication".
Gnomon September 05, 2024 at 18:07 #930171
Quoting Wayfarer
At the time I had this epiphany, the insight arose, 'so this is why ancient philosophy held arithmetic in high esteem. It was certain, immutable and apodictic.' These are attributes of a higher cognitive functionality, namely rational insight. Of course, I was to discover that this is Platonism 101, and I'm still drawn to the Platonist view of the matter. The philosophical point about it is that through rational thought we have insight into a kind of transcendental realm.

I'm not qualified to engage in this profound thread, but your "epiphany" suggested a relationship between Numbers and Information that is not covered by Shannon's engineering theory, yet may be implicit in Plato's broader philosophical worldview.

Shannon's digital Information is defined in terms of pragmatic, physical, immutable, apodictic distinctions. But Plato's ideal Numbers*1 were non-physical, non-sensible things in a realm beyond time and space (transcendent). Ironically, the latter may be more applicable to mundane human use of Information with analog values, personal meanings, and perhaps even fractal dimensions, that don't lend themselves to yes/no digitization.

Quantum Physics has analyzed reality down, not to atoms of value & meaning, but to oceans of value (the Quantum Field) that lie, not on a simplistic linear number line, but in a "transcendent" state-of-being where "real" particles of Matter are temporary, conditional, and statistically probable. Could Plato's ideal non-sensible mathematical realm correspond to that hypothetical abstract mathematical sphere-of-Influence that physicists call "the universal quantum field"*2?

In Plato's Cave allegory, material things in the sensible world are merely shadows of an illuminated-but-unreal domain. Likewise, our social meanings and linguistic information consist of imperfect analog values that are close enough to absolute True/False to be useful for communication. Not Identical, but relative.

Conservative Physicists probably don't think of the Quantum Field as "transcendent", so exploring that possibility is left to Liberal (new-agey ; mystical energy) Metaphysicians*3. Personally, I doubt that there are any practical real-world applications of transcendental preternatural information, such as access to "unlimited knowledge". But the theoretical philosophical implications of perfection may be of interest to those who like to reason beyond immanent Materialism and utilitarian Mechanism. :smile:


*1. Mathematical Platonism :
the doctrine that there exist abstract objects—objects that are wholly nonspatiotemporal, nonphysical, and nonmental . . . . based on the postulation of unchanging and eternal realities known as forms.
https://www.britannica.com/topic/mathematical-Platonism

*2. The Universal Field Theory is not a physics theory in a classical sense. It is rather a philosophical theory explaining Why and How physical phenomena appear.
https://theuniversalfieldtheory.com/

*3. What is Quantum Transcendence? :
I just googled QT and got this hit.
https://www.1to1coachingschool.com/QEC_What_is_Quantum_Transcendence_Coaching.htm


hypericin September 08, 2024 at 21:19 #930817
Quoting Gnomon
analog values, personal meanings, and perhaps even fractal dimensions, that don't lend themselves to yes/no digitization.


I don't think it's the case that analog values are not information while digital values are. Analog values are, of course, just as representable on machines. The difference with physical values is that in a machine precision is fixed and immutable: you cannot extract more or less. Whereas precision is more fluid in natural values: you can expend more or less work to extract more or less precision. But this precision too is ultimately fixed, bound by physical limitations.
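A small Python illustration of the fixed-precision point (the particular values are my own, chosen only for the sketch): a stored digital value carries exactly the precision it was stored with, and asking for more digits only exposes the encoding, not more of the original quantity.

from decimal import Decimal, getcontext

x = 0.1                        # stored once, at fixed float64 precision
print(f"{x:.30f}")             # 0.100000000000000005551115123126...
# The extra digits belong to the 64-bit encoding of 0.1, not to any
# underlying quantity; no further work extracts more than was stored.

getcontext().prec = 50         # you may choose a finer grid up front...
y = Decimal(1) / Decimal(3)
print(y)                       # 0.33333... (exactly 50 digits)
# ...but once stored, y likewise carries exactly 50 digits, no more.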

Personal meanings, meanwhile, are not in themselves information, but rather frameworks of interpretation. I think the conflation of information and interpretation is one of the main confusions of this topic.
Count Timothy von Icarus September 09, 2024 at 15:00 #930985
Reply to hypericin Reply to Wayfarer

Besides, I have a suspicion that the designation of 'information' as being foundational to existence goes back to Norbert Wiener saying 'Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.' I'm sure this is what leads to the prevalence of information-as-foundation in contemporary discourse.


If information is thought of as form (actuality, quiddity) then the idea of information as a "foundation" of sorts is very old indeed. In Aristotle, form (act) has primacy over matter (potency).

But often it seems that attempts to use information in a hylomorphic sense are hamstrung by being unable to jettison the modern conception of matter as having form, and so you end up with reductionist versions of information-based ontologies where things are "made of bits," which seems to badly miss the point.

I suppose this subject is also haunted by the mistake of some scholastics, particularly later ones, of turning natures, species, genera, etc. into logical objects, when they are first and foremost the principles of actual, changing being (and the principles of the change therein). Hence, the idea that evolution is a problem for essences because it shows they can change—well this presupposes thinking of them in what is probably an unhelpful manner. I think this is an area where Deely's treatment of Aristotle is particularly helpful, even if he tends to neglect the "form as intellection," side that folks like Perl bring out well.
Gnomon September 09, 2024 at 16:58 #931009
Quoting hypericin
Personal meanings, meanwhile, are not in themselves information, but rather frameworks of interpretation. I think the conflation of information and interpretation is one of the main confusions of this topic.

Shannon took an ancient term referring generally & loosely to meaning in a mind*1 --- or, as you noted, "frameworks for interpretation" --- and adapted it for use in mindless computers*2. To that end, he ignored the inconsistent, variable, analog, concrete semantic forms of Information, and focused on the consistent, absolute, digital, abstract mathematical forms (either/or ratios) that could be exactly defined as something or nothing (1 or 0).

Human meanings are subject to vague personal interpretation and mis-interpretation, while computer bits & bytes are impersonal & precise. Those numerical values can later be translated back into human (natural language*3) meanings, though at the risk of mis-interpretation. Anything that can cause an information processor (computer or brain) to create meaningful internal Forms (images ; configurations) is a source of Information. :smile:


*1. Information :
Knowledge and the ability to know. Technically, it's the ratio of order to disorder, of positive to negative, of knowledge to ignorance. It's measured in degrees of uncertainty. Those ratios are also called "differences". So Gregory Bateson defined Information as "the difference that makes a difference". The latter distinction refers to "value" or "meaning". Babbage called his prototype computer a "difference engine". Difference is the cause or agent of Change. In Physics it’s called "Thermodynamics" or "Energy". In Sociology it’s called "Conflict".
https://blog-glossary.enformationism.info/page11.html

*2. Information is an abstract concept that refers to something which has the power to inform. At the most fundamental level, it pertains to the interpretation of that which may be sensed, or their abstractions. . . . Information is not knowledge itself, but the meaning that may be derived from a representation through interpretation ____Wikipedia

*3. Natural Language :
a language that has developed naturally in use (as contrasted with an artificial language or computer code). ____ Oxford Languages

wonderer1 September 09, 2024 at 17:56 #931015
Quoting Gnomon
To that end, he [Shannon] ignored the inconsistent variable analog...


Gnonsense. Shannon worked on analog computers before essentially inventing digital logic. His communication theory was very much about communicating uncorrupted digital data through the noisy analog world. So no, he didn't ignore the analog.

What is with your obsessive need to propagate misinformation?
Gnomon September 09, 2024 at 21:24 #931047
Quoting wonderer1
To that end, he [Shannon] ignored the inconsistent variable analog . . . concrete semantic forms of Information. (bolded words were omitted in your misinterpretation)

Gnonsense. Shannon worked on analog computers before essentially inventing digital logic. His communication theory was very much about communicating uncorrupted digital data through the noisy analog world. So no, he didn't ignore the analog.
What is with your obsessive need to propagate misinformation?

Please note that I wasn't talking about analog Computers (continuous vs digital values), but analog Information*1 (semantic meaning expressed by figurative analogies). Shannon found a way to reduce the Uncertainty of "noisy" Analog Computers, including human brains*2, by using Digital Information in which the Natural Language meaning is converted into synthetic Mathematical symbols. In that process, the real world meanings (analogies ; metaphors ; similes ; nuances) are ignored in favor of abstract numerical values, and must be reconstructed later, opening the possibility of misconstrual.

Ironically, cutting edge computers are now learning to communicate with human programmers in natural language instead of artificial codes*3. How do you think the programmers will deal with the inherent Uncertainties of human language? Your misinterpretation of my human language post is a prime example of self-misinformation. :smile:

*1. Analog Information :
information processing called analog-form information, or simply analog information. Until the development of the digital computer, cognitive information was stored and processed only in analog form, basically through the technologies of printing, photography, and telephony.
https://www.britannica.com/topic/analog-information

*2. Analog Brain
The mammalian brain, comprised of neuronal networks, functions as an analog device and has given rise to artificial neural networks that are implemented as digital algorithms but function as analog models would.
https://www.frontiersin.org/journals/ecology-and-evolution/articles/10.3389/fevo.2022.796413/full

*3. Why Natural Language is the New Language of the Digital Era
The days of writing lines of code to achieve tasks are gradually giving way to the era of conversation. Natural language processing (NLP) and machine learning have reached a point where machines can not only understand what we say but also grasp the context and nuances of our conversations.
https://www.linkedin.com/pulse/why-natural-language-new-digital-era-anuya-kamat

wonderer1 September 09, 2024 at 21:42 #931049
Reply to Gnomon

You didn't answer my question. Do you want to hear my best guess at what the answer is?
Wayfarer September 12, 2024 at 22:16 #931601
Quoting Count Timothy von Icarus
If information is thought of as form (actuality, quiddity) then the idea of information as a "foundation" of sorts is very old indeed. In Aristotle, form (act) has primacy over matter (potency).


The computer chip industry understands hylomorphism very well. Why? Because there's the chip designers, and the chip fabricators, and nowadays they're usually different companies. This is called 'fabless manufacture' and is the standard model in current chip design. NVidia, for instance, deals entirely with design ('form'), while TSMC is one of the leading companies which fabricate the chips using fiendishly complex machines ('matter') - about which, see this mind-blowing documentary on ASML's EUV lithography machines.

Quoting Count Timothy von Icarus
often it seems that attempts to use information in a hylomorphic sense are hamstrung by being unable to jettison the modern conception of matter as having form


It is because of reification, the 'thingifying' tendency deeply embedded in modern thought, which believes that only things are real.
hypericin September 14, 2024 at 18:15 #931942
Quoting Wayfarer
It is because of reification, the 'thingifying' tendency deeply embedded in modern thought, which believes that only things are real.


Rather the opposite of reification. Instead of treating abstractions as real, it excludes parts of the real from the category "real".

Quoting Count Timothy von Icarus
often it seems that attempts to use information in a hylomorphic sense are hamstrung


Why hamstring? If matter has or coexists with form, matter has or coexists with information. What is the problem?
Count Timothy von Icarus September 14, 2024 at 21:03 #931967
Reply to hypericin

Why hamstring? If matter has or coexists with form, matter has or coexists with information. What is the problem?


Ha, I am now seeing that the way I wrote that is extremely unclear. Of course matter has form in hylomorphism!

What I meant to say was that in modern, particularly early modern, thought form is packaged intrinsically with matter into fundamental "building blocks"—atomism, reductionism, smallism—"wholes are the sum of their parts." There is no "prime matter," as an abstraction, sheer potency, but rather matter as actual bits of stuff with definite form and definite/actual attributes (eliminating potency).

On such a view, the form of composite objects is just reducible to arrangements of various building blocks. The relationality and processual elements of the information theoretic view get lost. Things "are what they are made of," a view which loses sight of how information is defined by context.

If you maintain this sort of thinking you end up with "stuff made out of 1s and 0s," the sort of thing the original digital physics was raked over the coals for.

hypericin September 15, 2024 at 21:25 #932177
Reply to Count Timothy von Icarus Form doesn't seem particularly equivalent to information. Just as the same information might "reside" in different material substrates, it can reside in different forms of the same substrate (e.g. black or white text, small or big). Though form does seem "closer" to information than matter (the shape of text is closer to the message than whatever material the glyphs are composed of).

Information seems more like number, something that doesn't exist at all without interpretation, that almost seems to reside in a platonic realm of its own.
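The closing analogy can be made concrete with a throwaway sketch (the choice of bases below is mine, purely illustrative): one and the same number can sit under many written forms, and only the interpretation rule, here the base, connects form to value.

# One number, many forms: the strings differ, the value does not.
n = 255
forms = [bin(n), oct(n), hex(n), str(n)]  # '0b11111111', '0o377', '0xff', '255'

# int(form, 0) infers the base from the prefix; every form names 255.
assert all(int(form, 0) == n for form in forms)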