Consciousness and the Paranormal — Part 2

But information might be spiritual stuff.

... and you say you don't like labels ... ;-)

I don't think we finished talking about the causation problem that pops back up if you don't reduce information to the arrangement of primordial stuff - it seems we can store some information using the physical properties of the monadic stuff - and it's conceivable that we could store phenomenal experience (information) by exploiting the phenomenal properties of same, right? So ... whence information if not in the arrangements? Then we don't have to deal with causation.
 
Informationphilosopher.com


"What is information that merits its use as the foundation of a new philosophical method of inquiry?

Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and energy for its communication. Information is immaterial.
It is the modern spirit, the ghost in the machine."

I still don't see the immateriality of information by this definition?

If it requires matter for its concrete embodiment ... then where does it "come from" before it's embodied and where does it "go" ... ?

And if it requires energy to be communicated ... then, sans energy, isn't it just a lump of stuff in a particular arrangement that allows for a dynamic response (an unfurling) when energy is applied over time ... such that something can be made of it by an information processing system such as a brainperson?

If so, then it seems information is just another physical process, if not, then what's missing?

Is the intelligence in the arrangement of matter, in the energy ... It's not in the receiving mind because that's just information too ... something's vanished here on analysis ... or is intelligence extraneous here ... I've not heard intelligence as epiphenomenal but it's much easier to see that than making consciousness epiphenomenal ...

"Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains."

Actually, it's as close as anyone can get ... the trick is not to stand so close.

"Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history."

I bet that last statement gets disproved ...

"Every atom is sacred ..."
(See: Monty Python)

"And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.

Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.

Information is an excellent basis for philosophy, and for science as well, capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), idealism (pure information), the mind-body problem, the problem of free will, and the "hard" problem of consciousness."

Dang! Does everyone but me call it the "hard" problem??

Not sure how they can assess it capable of answering all these questions ... but I'll keep reading ...
 
Ok, I think I get what you're saying. I do think the mind can process information. This would be metacognition, thinking about thinking.

The current scientific consensus, I believe, is that our actions are controlled unconsciously. However, I don't think this means we lack free will.

A paper I posted a while back offered an interesting theory as to why we have reflective consciousness: to predict the behaviors of others, and also to predict our own behavior.

So while our on-the-fly behavior is controlled unconsciously, I think our ability to metacognize allows us to physically change our brain and thus our future behavior.

Circling back to phenomenal experience, which I think preceded reflective consciousness in the evolution of mind, my contention has been that zombies are not possible; phenomenal experience will always result from information integrated in the manner performed by organisms.

As I've said in the past, these phenomenal experiences (integrated information) can exist but be non-reflective. That is, there may be no conscious "sense of self" attached to these phenomenal experiences. (It's a question I've been trying to answer throughout this discussion.)

However, once a mind achieves the capacity to self-reflect or metacognize, access to phenomenal experience occurs and it becomes phenomenal consciousness. There is now a "sense of experiencing self."

These now-conscious experiences are used by the organism to shape future behavior.

This is all just my very humble opinion.

As I read you here, there seems to be a considerable change in either your ideas or in the way in which you're expressing them. But I still don't understand what you mean by "phenomenal consciousness" (please define), nor why you resist the idea that mind develops in the recognition of its own standing at a distance from that which is 'other' (producing the increasingly aware self revealed in and by consciousness), which is already incipient in experience.
 
Steve, would you link the page at the 'Information Philosophy' site that you're quoting from? Is there a forum attached to that website? If not, there should be.
 
@Soupie

I wrote: ". . . the types of informational theories of consciousness we've most frequently encountered appear to be reductive in terms of consciousness -- that is, the 'information' originates and develops its integrations outside of consciousness, without the use of consciousness."

You replied: "Hm, I'm not sure what you mean by that."

It seems in your posts today that you have since understood what I was saying and accordingly modified your viewpoint somewhat in the post to which I replied a few minutes ago. Is that impression of mine accurate?
 
In other words, you seem to be recognizing now that consciousness cannot be thought of as separate from mind. Is that impression accurate?
 
@Constance

Information Philosopher - Information

And Bob is the information philosopher:

About Information Philosopher

... haven't seen a forum yet.

Ah, found the page about him at this link: About Information Philosopher

which includes this statement about the website's use of extracts from a variety of philosophers and scientists past and present:

"His goal for the I-Phi website is to provide web pages on all the major philosophers and scientists who have worked on the problems of freedom, value, and knowledge. Each page has excerpts from the thinker's work and a critical analysis. The three major sections of the website each will have a history of the problem, the relevant physics, biology, cosmology, etc, and pages on the core concepts in the problem."

It looks at first glance as if for Bob 'the problem' is the problem of free will, but that remains to be seen by anyone who sticks with him long enough. It appears that he presents several 'introductory' pages (the one you first referred to and quoted, and the one you quoted second, which I asked for the location of). Maybe more. It would help if he provided a site map. Maybe he has? I haven't come across it yet.
 
Here's the site map for Bob's website: Information Philosopher Site Map

No entry for God but there is this:

"The Problem of Value"


"Does "Goodness" exist? We find this a much more tractable problem than whether God exists. And identifying objective goodness or value will uncover the nature of some things often attributed to a God.

The Existentialists thought good did not exist. Most religions place its origin in a supernatural Being. Humanists felt that good was a human invention. "Man is the measure of all things." and "Nothing either good or bad, but thinking makes it so."

Modern bioethicists situate value in all life. Environmentalists have a slightly broader view, embracing all our planetary resources.

A variety of ancient religions looked to the sun as the source of all life and thus good. If not the sun itself, they anthropomorphized the "bright sky" as God. Dark and the night were stigmatized forever as evil and "fallen."

Philosophers have ever longed to discover a cosmic good. Their ideal source of the good was as remote as possible from the Earth in space and in time. Some wanted it outside space and time.

For Plato a timeless Good was found in Being itself. For his student Aristotle, Good was a property of the first principles that set the world in motion. For Kant it needed a transcendental God in a noumenal realm outside space, time, and the phenomena.
Can we discover a cosmic good? At least identify the source of anything resembling the Good?

Yes, we can.

Does it resemble the Good anthropomorphized as a God personally concerned about our individual good, a God intervening in the world to respond to prayer?

No, it does not.

It is more like the Divine Providence of the Stoics, Spinoza, and Einstein.

Our source of goodness has one outstanding characteristic of such a God. We can accurately say it is Providence, in the sense of that which has provided for our existence. We have discovered that which provides. It provides the light, it provides life, it provides intelligence.

Again celebrating the first modern philosopher, René Descartes, we name our model for value and Goodness the Ergo.

We call "ergodic" those few processes that resist the terrible and universal Second Law of Thermodynamics, which describes the increase of chaos and entropy (disorder).

Without violating the Second Law, ergodic processes reduce the entropy locally, producing pockets of negative entropy (order and information-rich structures). We will see that ergodic processes radiate away positive entropy, far more than the local reduction, thus satisfying the Second Law.

We call all this cosmic order the Ergo. It is the ultimate sine qua non. All else is chaos.
For those who want to anthropomorphize on the slender thread of discovering the natural Providence, they might call it the Ergod. No God can be God without being Ergodic."
 
I like the way he breaks out categories - a fresh (to me) approach ... so much on free will; it looks to be his particular interest.

Yes, I think it is his primary interest, and he appears to attempt to interpret quantum mechanics and quantum theory in a way that enables free will.
 
The SEP doesn't have an article on 'information philosophy' but does have two informative articles related to information theories. Here are a few extracts from the first (the whole of it is worth reading):

"The situation that seems to emerge is not unlike the concept of energy: there are various formal sub-theories about energy (kinetic, potential, electrical, chemical, nuclear) with well-defined transformations between them. Apart from that, the term ‘energy’ is used loosely in colloquial speech. There is no consensus about the exact nature of the field of philosophy of information. Some authors like Floridi (2002, 2003, 2011) present ‘Philosophy of Information’ as a completely new development with a capacity to revolutionize philosophy per se. Others (Adriaans and van Benthem 2008a; Lenski 2010) see it more as a technical discipline with deep roots in the history of philosophy and consequences for various disciplines like methodology, epistemology and ethics."

"The term ‘information’ in colloquial speech is currently predominantly used as an abstract mass-noun used to denote any amount of data, code or text that is stored, sent, received or manipulated in any medium. The detailed history of both the term ‘information’ and the various concepts that come with it is complex and for the larger part still has to be written (Seiffert 1968; Schnelle 1976; Capurro 1978, 2009; Capurro and Hjørland 2003). The exact meaning of the term ‘information’ varies in different philosophical traditions and its colloquial use varies geographically and over different pragmatic contexts. Although an analysis of the notion of information has been a theme in Western philosophy from its early inception, the explicit analysis of information as a philosophical concept is recent, and dates back to the second half of the 20th century. Historically the study of the concept of information can be understood as an effort to make the extensive properties of human knowledge measurable. In the 20th century various proposals for formalization of concepts of information were made:
  1. Fisher information: the amount of information that an observable random variable X carries about an unknown parameter θ upon which the probability of X depends (Fisher 1925).
  2. Shannon information: the entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X (Shannon 1948; Shannon & Weaver 1949).
  3. Kolmogorov complexity: the information in a binary string x is the length of the shortest program p that produces x on a reference universal Turing machine U (Solomonoff 1960, 1964a,b, 1997; Kolmogorov 1965; Chaitin 1969, 1987).
  4. Quantum Information: The qubit is a generalization of the classical bit and is described by a quantum state in a two-state quantum-mechanical system, which is formally equivalent to a two-dimensional vector space over the complex numbers (Von Neumann 1955; Redei & Stoeltzner 2001).
  5. Information as a state of an agent: the formal logical treatment of notions like knowledge and belief was initiated by Hintikka (1962, 1973). Dretske (1981) and van Benthem & van Rooij (2003) studied these notions in the context of information theory, cf. van Rooij (2004) on questions and answers, or Parikh & Ramanujam (2003) on general messaging. Also Dunn seems to have this notion in mind when he defines information as "what is left of knowledge when one takes away belief, justification and truth" (Dunn 2001 pg. 423, 2008).
  6. Semantic Information: Bar-Hillel and Carnap developed a theory of semantic information (1953). Floridi (2002, 2003, 2011) defines semantic information as well-formed, meaningful and truthful data. Formal entropy-based definitions of information (Fisher, Shannon, Quantum, Kolmogorov) do not imply well-formedness or truthfulness.
Information (Stanford Encyclopedia of Philosophy)
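To make item 2 in that list (Shannon information) concrete for myself, here's a minimal Python sketch of my own, not from the SEP entry, that estimates the entropy H of a discrete random variable from a sample of symbols:

```python
# Shannon entropy H(X) = -sum(p_i * log2(p_i)), estimated from the
# empirical distribution of symbols in a sample.
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits -- one symbol, no uncertainty
print(shannon_entropy("aabbccdd"))  # 2.0 bits -- four equally likely symbols
print(shannon_entropy("abcdefgh"))  # 3.0 bits -- eight equally likely symbols
```

The measure only reflects uncertainty about which symbol comes next; it says nothing about what the symbols mean, which is where the later proposals in the article come in.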


This second article, on information and computation, at the SEP is especially interesting:

Open Problems in the Study of Information and Computation
There is no consensus on a ‘standard’ set of open problems in philosophy of information. Some typical problems of a philosophical nature in the study of information and computation are (ordered roughly in terms of estimated hardness):
The unification of various theories of information:
In a mathematical sense information is associated with measuring extensive properties of classes of systems with finite but unlimited dimensions (systems of particles, texts, codes, networks, graphs, games etc.). This suggests that a uniform treatment of various theories of information is possible. In the Handbook of Philosophy of Information three different forms of information are distinguished (Adriaans and van Benthem 2008b):
  • Information-A, Knowledge, logic, what is conveyed in informative answers
  • Information-B, Probabilistic, information-theoretic, measured quantitatively
  • Information-C, Algorithmic, code compression, measured quantitatively
Because of recent developments the connections between Information-B (Shannon) and Information-C (Kolmogorov) are reasonably well understood (Cover and Thomas 2006). The historical material presented in this article suggests that reflection on Information-A (logic, knowledge) is historically much more interwoven than was generally known up till now. The research program of logical positivism can with hindsight be characterized as the attempt to marry a possible worlds interpretation of logic with probabilistic reasoning (Carnap 1945, 1950; Popper 1934). It has to my knowledge not been revitalized in the light of the huge developments in information theory in the past decades. Modern attempts to design a Bayesian epistemology (Bovens and Hartmann 2003) do not seem to be aware of the work done in the first half of the 20th century. However, an attempt to unify Information-A and Information-B seems a viable exercise. Also the connections between thermodynamics and information theory have become much closer in the past decade, amongst others, due to the work of Gell-Mann & Lloyd (2003) (see also: Bais and Farmer 2008). Verlinde (2010) even presented a reduction of gravity to information. All these developments suggest that information is an even deeper concept than was known up till now and that the ambition to formulate a unified theory of information is anything but a lost cause.

What is useful/meaningful information?
All well-known quantitative information measures (specifically Shannon Information and Kolmogorov complexity) assign the highest information content to data sets with the highest entropy. In this sense a television broadcast with only white noise would contain the most meaningful information. This is counterintuitive. In the past decennia there have been a number of proposals to define a formal unit of measurement of the amount of structural (or model-) information in a data set.
  • Aesthetic measure (Birkhoff 1950)
  • Sophistication (Koppel 1987; Antunes et al. 2006; Antunes & Fortnow 2003)
  • Logical Depth (Bennett 1988)
  • Effective complexity (Gell-Mann, Lloyd 2003)
  • Meaningful Information (Vitányi 2006)
  • Self-dissimilarity (Wolpert, Macready 2007)
  • Computational Depth (Antunes et al. 2006)
  • Facticity (Adriaans 2008)
Three intuitions dominate the research. A string is ‘interesting’ when …
  • a certain amount of computation is involved in its creation (Sophistication, Computational Depth);
  • there is a balance between the model-code and the data-code under two part code optimization (effective complexity, facticity);
  • it has internal phase transitions (self-dissimilarity).
Such models penalize both maximal entropy and low information content, but the exact relationship between these intuitions is unclear. Since these proposals have a lot in common it is not inconceivable that a unified theory of meaningful information will be developed in the coming years. Phenomena that might be related to a theory of structural information and that currently are ill-understood are: phase transitions in the hardness of satisfiability problems related to their complexity (Simon & Dubois 1989; Crawford & Auton 1993) and phase transitions in the expressiveness of Turing machines related to their complexity (Crutchfield & Young 1989, 1990; Langton 1990; Dufort & Lumsden 1994).
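To see why the white-noise point above is counterintuitive, here's a rough sketch of my own (not from the SEP article) using compressed size as a crude stand-in for these structural measures: random bytes barely compress even though they carry maximal entropy, while a highly structured string collapses to a tiny fraction of its length.

```python
# Compressed size as a rough proxy for Kolmogorov complexity:
# "white noise" resists compression, structured text does not.
import os
import zlib

noise = os.urandom(10_000)                                # incompressible, maximal entropy
structure = b"The jar was round upon the ground " * 300   # repetitive, highly compressible

for label, data in [("noise", noise), ("structure", structure)]:
    ratio = len(zlib.compress(data)) / len(data)
    print(f"{label}: compressed to {ratio:.0%} of original size")
# The noise stays near 100%, the structured text drops to a few percent,
# even though the entropy measures would score the noise as "more informative".
```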

What is an adequate logic of information?
What is a good logical system (or set of systems) that formalizes our intuitions of the relation between concepts like 'knowing', 'believing' and 'being informed of'? Proposals by: Dretske (1981), van Benthem (2006; van Benthem & van Rooij 2003), Floridi (2003, 2011) and others. In principle these concepts probably can be mapped onto our current landscape of known logics (structural, modal). Also the discrepancies between proposed systems presumably can be analyzed within the 'normal science' framework of existing logics.

Continuous versus discrete models of nature
This problem seems to have bothered the development of a unified theory of information and entropy for the last 150 years or so. The central issue is this: the most elegant models of physical systems are based on functions in continuous spaces. In such models almost all points in space carry an infinite amount of information. Yet, the cornerstone of thermodynamics is that a finite amount of space has finite entropy. What is the right way to reconcile these two views? This problem is related to questions studied in philosophy of mathematics (an intuitionistic versus a more platonic view). The issue is central in some of the more philosophical discussions on the nature of computation and information (Putnam 1988; Searle 1990). The problem is also related to the notion of phase transitions in the description of nature (e.g., thermodynamics versus statistical mechanics) and to the idea of levels of abstraction (Floridi 2002).

Computation versus thermodynamics:
There is a reasonable understanding of the relationship between Kolmogorov Complexity and Shannon information (Li & Vitányi 2008; Grünwald and Vitányi 2008; Cover & Thomas 2006), but the unification between the notion of entropy in thermodynamics and Shannon-Kolmogorov information is very incomplete apart from some very ad hoc insights (Harremoës and Topsøe 2008; Bais and Farmer 2008). What is a computational process from a thermodynamical point of view? What is a thermodynamical process from a computational point of view? Can a thermodynamic theory of computing serve as a theory of non-equilibrium dynamics? This problem seems to be hard because 150 years of research in thermodynamics still leaves us with a lot of conceptual unclarities at the heart of the theory of thermodynamics itself. It is also not clear how a theory of computation will help us here, although bringing in concepts of the theory of computation seems to be promising. Possible theoretical models could with high probability be corroborated with feasible experiments (e.g., Joule's adiabatic expansion; see Adriaans 2008).

Classical information versus quantum information
Classical information is measured in bits. Implementation of bits in nature involves macroscopic physical systems with at least two different stable states and a low energy reversible transition process (i.e., switches, relays, transistors). The most fundamental way to store information in nature on an atomic level involves qubits. The qubit is described by a state vector in a two-level quantum-mechanical system, which is formally equivalent to a two-dimensional vector space over the complex numbers (Von Neumann 1955; Nielsen & Chuang 2000). Quantum algorithms have, in some cases, a fundamentally lower complexity (e.g., Shor's algorithm for factorization of integers.) The exact relation between classical and quantum information is unclear. Part of it has to do with our lack of understanding of quantum mechanics and with the question whether nature is essentially deterministic or not.
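The qubit description in that paragraph can be made concrete in a few lines; this is my own illustrative numpy sketch (the chosen amplitudes are arbitrary), showing a state as a normalized vector in a two-dimensional complex space, with measurement probabilities given by squared amplitudes.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # classical bit value 0
ket1 = np.array([0, 1], dtype=complex)   # classical bit value 1

# An equal superposition: alpha|0> + beta|1> with |alpha|^2 + |beta|^2 = 1
psi = (ket0 + ket1) / np.sqrt(2)

probs = np.abs(psi) ** 2
print(probs)        # [0.5 0.5] -- equal chance of measuring 0 or 1
print(probs.sum())  # ~1.0      -- the amplitudes are normalized
```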

Information and the theory of everything:
In the past decennia information seems to have become a vital concept in physics. Seth Lloyd and others (Zuse 1969; Wheeler 1990; Schmidhuber 1997b; Wolfram 2002; Hutter 2010) have analyzed computational models of various physical systems. The notion of information seems to play a major role in the analysis of black holes (Lloyd & Ng 2004; Bekenstein 1994). Erik Verlinde (2010) has proposed a theory in which gravity is analyzed in terms of information. For the moment these models seem to be purely descriptive without any possibility of empirical verification.

The Church-Turing Hypothesis.
We know that a lot of formal systems are Turing equivalent (Turing machines, recursive functions, lambda calculus, combinatory logic, cellular automata, to name a few). The question is: does this equivalence define the notion of computation? Dershowitz and Gurevich (2008) claim to have vindicated the hypothesis, but this result is not generally accepted (see the discussion on "Computability – What would it mean to disprove the Church-Turing thesis", in the Other Internet Resources section). A lot of conceptual clarification seems necessary, but for now it is unclear how one ever could verify the thesis definitively. The discovery of a system in nature that could actually compute more than a Turing machine would imply an immediate falsification, but such a device has not been found up till now.

P versus NP?
Can every problem for which the solution can be checked efficiently also be solved efficiently by a computer (Garey & Johnson 1979)? See Cook 2000 (Other Internet Resources) for a good introduction. The problem, which appears to be very hard, has been a rich source of research in computer science and mathematics although relatively little has been published on its philosophical relevance. That a solution might have profound philosophical impact is illustrated by a quote from Scott Aaronson: If P = NP, then the world would be a profoundly different place than we usually assume it to be. There would be no special value in “creative leaps,” no fundamental gap between solving a problem and recognizing the solution once it's found. Everyone who could appreciate a symphony would be Mozart; everyone who could follow a step-by-step argument would be Gauss…. (See the post “Reasons to Believe” on Scott Aaronson's blog Shtetl-Optimized listed in Other Internet Resources. This is cited in the Wikipedia entry on the P versus NP problem (also in Other Internet Resources), as of September 10, 2012.)
Information > Open Problems in the Study of Information and Computation (Stanford Encyclopedia of Philosophy)
 
Wikipedia has an entry labeled 'Philosophy of Information'. It's rather sparse and names the same originator of 'information philosophy' named in the SEP, L. Floridi.

Extract:

According to L. Floridi, "one can think of several ways for applying computational methods towards philosophical matters:
  1. Conceptual experiments in silico: As an innovative extension of an ancient tradition of thought experiment, a trend has begun in philosophy to apply computational modeling schemes to questions in logic, epistemology, philosophy of science, philosophy of biology, philosophy of mind, and so on.
  2. Pancomputationalism: By this view, computational and informational concepts are considered to be so powerful that given the right Level of abstraction, anything in the world could be modeled and represented as a computational system, and any process could be simulated computationally. Then, however, pancomputationalists have the hard task of providing credible answers to the following two questions:
    1. how can one avoid blurring all differences among systems?
    2. what would it mean for the system under investigation not to be an informational system (or a computational system, if computation is the same as information processing)?
Philosophy of information - Wikipedia, the free encyclopedia
 
May 2, 2013
A Most Profound Math Problem
By Alexander Nazaryan

On August 6, 2010, a computer scientist named Vinay Deolalikar published a paper with a name as concise as it was audacious: “P ≠ NP.” If Deolalikar was right, he had cut one of mathematics’ most tightly tied Gordian knots. In 2000, the P = NP problem was designated by the Clay Mathematics Institute as one of seven Millennium Problems—“important classic questions that have resisted solution for many years”—only one of which has been solved since. (The Poincaré Conjecture was vanquished in 2003 by the reclusive Russian mathematician Grigory Perelman, who refused the attached million-dollar prize.)

A few of the Clay problems are long-standing head-scratchers. The Riemann hypothesis, for example, made its debut in 1859. By contrast, P versus NP is relatively young, having been introduced by the University of Toronto mathematical theorist Stephen Cook in 1971, in a paper titled “The complexity of theorem-proving procedures,” though it had been touched upon two decades earlier in a letter by Kurt Gödel, whom David Foster Wallace branded “modern math’s absolute Prince of Darkness.” The question inherent in those three letters is a devilish one: Does P (problems that we can easily solve) equal NP (problems that we can easily check)?

Take your e-mail password as an analogy. Its veracity is checked within a nanosecond of your hitting the return key. But for someone to solve your password would probably be a fruitless pursuit, involving a near-infinite number of letter-number permutations—a trial and error lasting centuries upon centuries. Deolalikar was saying, in essence, that there will always be some problems for which we can recognize an answer without being able to quickly find one—intractable problems that lie beyond the grasp of even our most powerful microprocessors, that consign us to a world that will never be quite as easy as some futurists would have us believe. There always will be problems unsolved, answers unknown.

If Deolalikar’s audacious proof were to hold, he could not only quit his day job as a researcher for Hewlett-Packard but rightly expect to enter the pantheon as one of the day’s great mathematicians. But such glory was not forthcoming. Computer scientists and mathematicians went at Deolalikar’s proof—which runs to dozens of pages of fixed-point logistics and k-SAT structures and other such goodies—with the ferocity of sharks in the presence of blood. The M.I.T. computational theorist Scott Aaronson (with whom I consulted on this essay’s factual assertions) wrote on his blog, “If Vinay Deolalikar is awarded the $1,000,000 Clay Millennium Prize for his proof of P ≠ NP, then I, Scott Aaronson, will personally supplement his prize by the amount of $200,000.” It wasn’t long before Deolalikar’s paper was thoroughly discredited, with Dr. Moshe Vardi, a computer-science professor at Rice University, telling the Times, “I think Deolalikar got his 15 minutes of fame.”

As Lance Fortnow describes in his new book, “The Golden Ticket: P, NP and the Search for the Impossible,” P versus NP is “one of the great open problems in all of mathematics” not only because it is extremely difficult to solve but because it has such obvious practical applications. It is the dream of total ease, of the confidence that there is an efficient way to calculate nearly everything, “from cures to deadly diseases to the nature of the universe,” even “an algorithmic process to recognize greatness.” So while a solution for the Birch and Swinnerton-Dyer conjecture, another of the Clay Millennium Prize problems, would be an impressive feat, it would have less practical application than definitive proof that anything we are able to quickly check (NP), we can also quickly solve (P).

Fortnow’s book—which, yes, takes its name from “Willy Wonka & the Chocolate Factory”—bills itself as a primer for the general reader, though you will likely regret not having paid slightly more attention during calculus class. Reading “The Golden Ticket” is sort of like watching a movie in a foreign language without captions. You will miss some things, but not everything. There is some math humor, which is at once amusing, cheesy, and endearing exactly in the way that you think a mathematician’s humor might be amusing, cheesy, and endearing.

What Fortnow calls “P” stands for polynomial time, meaning the size of the input raised to a fixed number like two or three. Conversely, exponential time is some number raised to the size of the input. Though polynomial time can be long (say, 50²), it is nothing compared to its exponential opposite (2⁵⁰). If the first is the Adirondacks, the second is the Himalayas. When solving things, we want to keep them in polynomial time if we still want to have time for lunch.
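A quick back-of-the-envelope check of that gap at an input size of n = 50 (the snippet is mine, not Fortnow's):

```python
# Polynomial versus exponential growth at n = 50.
n = 50
print(n ** 2)  # 2500 -- "the Adirondacks"
print(2 ** n)  # 1125899906842624 -- "the Himalayas"
```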

“NP” (nondeterministic polynomial time) is a set of problems we want to solve, of varying degrees of difficulty. Many everyday activities rely on NP problems: modern computer encryption, for example, which involves the prime factors of extremely large numbers. Some forty years ago, Richard Karp, the Berkeley theoretician, first identified twenty-one problems as being “NP-complete,” meaning that they are at least as hard as any other NP problem. The NP-complete problems are a sort of inner sanctum of computational difficulty; solve one and you’ve solved them all, not to mention all the lesser NP problems lurking in the rear. Karp’s foreboding bunch of problems have names like “directed Hamiltonian cycle” and “vertex cover.” Though they are extremely hard to solve, solutions are easy to check. A human may be able to solve a variation of one of these problems through what Soviet mathematicians called “perebor,” which Fortnow translates as “brute-force search.” The question of P versus NP is whether a much faster way exists.

So far, the answer is no. Take one of these NP-complete problems, called “k-clique,” which Fortnow explains as follows: “What is the largest clique on Facebook [such that] all of [them] are friends with each other?” Obviously, the more users there are on Facebook, the more difficult it is to find the biggest self-enclosed clique. And thus far, no algorithm to efficiently solve the clique problem has been discovered. Or, for that matter, to solve any of its NP-complete siblings, which is why most people do think that P ≠ NP.
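To see the check-versus-solve asymmetry in Fortnow's clique example, here's a rough Python sketch of my own (the toy graph and function names are invented for illustration): verifying a proposed clique only means testing each pair of members once, while finding the largest clique falls back on brute-force search over subsets, which blows up exponentially as the graph grows.

```python
from itertools import combinations

def is_clique(graph, nodes):
    # Checking a candidate clique is fast: test every pair once.
    return all(b in graph[a] for a, b in combinations(nodes, 2))

def largest_clique(graph):
    # Solving is brute force ("perebor"): try ever-smaller subsets of vertices.
    for size in range(len(graph), 0, -1):
        for subset in combinations(graph, size):
            if is_clique(graph, subset):
                return subset
    return ()

# Tiny "friend" graph: a, b, c are all mutual friends; d knows only a.
graph = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": {"a"},
}
print(is_clique(graph, ("a", "b", "c")))  # True -- quick to verify
print(largest_clique(graph))              # ('a', 'b', 'c') -- found by exhaustive search
```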

There are considerations here, too, beyond math. Aaronson, the M.I.T. scientist, wrote a blog post about why he thinks P ≠ NP, providing ten reasons for why this is so. The ninth of these he called “the philosophical argument.” It runs, in part, as follows: “If P = NP, then the world would be a profoundly different place than we usually assume it to be. There would be no special value in ‘creative leaps,’ no fundamental gap between solving a problem and recognizing the solution once it’s found. Everyone who could appreciate a symphony would be Mozart; everyone who could follow a step-by-step argument would be Gauss; everyone who could recognize a good investment strategy would be Warren Buffett.”

We already check novels for literary qualities; most critics could easily enough put together a list of categories that make a novel great. Imagine, now, if you could write an algorithm to efficiently create verifiably great fiction. It isn’t quite as outlandish as you think: back in 2008, the Russian writer Alexander Prokopovich “wrote” the novel “True Love” by taking seventeen classics that were recombined via computer in seventy-two hours into an entirely new work. As Prokopovich told the St. Petersburg Times, “Today publishing houses use different methods of the fastest possible book creation in this or that style meant for this or that readers’ audience. Our program can help with that work.” He then added a note of caution: “However, the program can never become an author, like Photoshop can never be Raphael.” But if P = NP, then it could only be a matter of time before someone figured out how to create verifiably “great” novels and paintings with mathematical efficiency.

Much of Fortnow’s book is spent depicting a world in which P is proven to equal NP, a world of easily computed bliss. He imagines, for example, an oncologist no longer having to struggle with the trial and error of chemotherapy because “we can now examine a person’s DNA as well as the mutated DNA of the cancer cells and develop proteins that will fold in just the right way to effectively starve the cancer cells without causing any problems for the normal cells.” He also whips up a political scandal in which a campaign manager “hired a computer programmer, who downloaded tens of thousands of well-received speeches throughout the decades. The programmer then used [an] algorithm to develop a new speech based on current events”—one that the unwitting public predictably loves.

To postulate that P ≠ NP, as Fortnow does, is to allow for a world of mystery, difficulty, and frustration—but also of discovery and inquiry, of pleasures pleasingly delayed. Fortnow concedes the possibility that “it will forever remain one of the true great mysteries of mathematics and science.” Yet Vinay Deolalikar is unlikely to be the last to attempt a proof, for all of mathematics rests on a fundamental hubris, a belief that we can order what Wallace Stevens calls “a slovenly wilderness.” It is a necessary confidence, yet we are not always rewarded for it.

Alexander Nazaryan is on the editorial board of the New York Daily News, where he edits the Page Views book blog.


Here's that Stevens poem:

Anecdote of the Jar
By Wallace Stevens

I placed a jar in Tennessee,
And round it was, upon a hill.
It made the slovenly wilderness
Surround that hill.

The wilderness rose up to it,
And sprawled around, no longer wild.
The jar was round upon the ground
And tall and of a port in air.

It took dominion everywhere.
The jar was gray and bare.
It did not give of bird or bush,
Like nothing else in Tennessee.

 
Steve, as so often happens, your incisive intellect has zeroed in on the critically important portions of Bob's Information Philosophy. Here is a sizeable extract from his introductory page on Information, which you quoted from more briefly and pointedly yesterday. I needed to read the whole of it again today to understand his core argument concerning information in a nondeterministic universe.

". . .
What is information that merits its use as the foundation of a new philosophical method of inquiry?

Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and energy for its communication. Information is immaterial.
It is the modern spirit, the ghost in the machine.

Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.

Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.

And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.

Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.

Information is an excellent basis for philosophy, and for science as well, capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), idealism (pure information), the mind-body problem, the problem of free will, and the "hard" problem of consciousness.

Actionable information has pragmatic value.

In our information philosophy, knowledge is the sum of all the information created and preserved by humanity. It is all the information in human minds and in artifacts of every kind - from books and internetworked computers to our dwellings and managed environment.

We shall see that all information in the universe is created by a single two-part process, the only one capable of generating and maintaining information in spite of the dread second law of thermodynamics, which describes the irresistible increase in disorder or entropy. We call this anti-entropic process ergodic. It should be appreciated as the creative source of everything we can possibly value, and of everything distinguishable from chaos and therefore interesting.

Enabled by the general relativistic expansion of the universe, the cosmic creative process has formed the macrocosmos of galaxies, stars, and planets. It has also generated the particular forms of microscopic matter - atoms, molecules, and the complex macromolecules that support biological organisms. It includes all quantum cooperative phenomena.

Quantum phenomena control the evolution of life and human knowledge. They help bring new information into the universe in a fundamentally unpredictable way. They drive biological speciation. They facilitate human creativity and free will.

Although information philosophy looks at the universe, life, and intelligence through the single lens of information, it is far from mechanical and reducible to a deterministic physics. The growth of information over time - our principle of increasing information - is the essential reason why time matters and individuals are distinguishable.


Information is the principal reason that biology is not reducible to chemistry and physics. Increasing information (a combination of perfect replication with occasional copying errors) explains all emergent phenomena, including many "laws of nature."

In information philosophy, the future is unpredictable for two basic reasons. First, quantum mechanics shows that some events are not predictable. The world is causal, but not pre-determined. Second, the early universe does not contain the information of later times, just as early primates do not contain the information structures for intelligence and verbal communication, and infants do not contain the knowledge and remembered experience they will have as adults.

In the naive world of Laplace's demon and strict determinism, all the information in the universe is constant at all times. But "determinism" itself is an emergent idea, realized only when large numbers of particles assemble into bodies that can average over the irreducible microscopic indeterminacy of their component atoms. . . ."

Information Philosopher - Information
 
The next big question that this information philosopher needs to consider is that of quantum entanglement, but as yet that term leads to a blank page at his site. Entanglement implies that "all the information in the universe," while not atemporally 'constant' as in Laplace, is continually maintained and added to, increasingly compounded. Bohm and Pribram and their successors think of this process as structurally constituting a 'hologram' that evolves in complexity, without losing any information along the way. If the universe at some point in time disappears in a 'heat death' or into a black hole, does that information persist? If so, in what form? Could it constitute the point of development of another universe? Or a number of them?
 