Consciousness and the Paranormal — Part 2

1963 edition of Britannica's "The Great Ideas Today" (ed. Mortimer J. Adler) - includes an article on an English translation of Being and Time.

I'll try to post some passages from the section on Philosophy & Religion.

$1.00 at the consignment store ...

 
Here's a paper presenting an argument that if materialism is true, then the USA is a conscious entity. I believe the argument's purpose is to show that consciousness can't be materialistic. Tononi's theory is mentioned.

For my part, I'm open to such ideas. IMO I see no reason to assume that a system — no matter its substrate — capable of doing what a human brain can do won't generate phenomenal experience. If phenomenal experience = uniquely integrated information, then it's an equal opportunist.

http://www.faculty.ucr.edu/~eschwitz/SchwitzPapers/USAconscious-140130a.pdf
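
To put a concrete handle on the "integrated information" idea, here is a toy sketch in Python (my own illustration, not Tononi's actual phi calculation; the function name and the example distributions are made up, and it assumes numpy is available). It just measures, in bits, how much a two-unit system's joint behavior exceeds what its parts, taken independently, would predict - the bare intuition behind "integration."

# Toy sketch (not Tononi's phi): how much is a joint distribution over
# two binary units irreducible to the product of its marginals?
# Measured as mutual information, in bits.
import numpy as np

def mutual_information(joint):
    """Mutual information (bits) between the row and column variables
    of a 2x2 joint probability table."""
    px = joint.sum(axis=1, keepdims=True)   # marginal of unit A
    py = joint.sum(axis=0, keepdims=True)   # marginal of unit B
    independent = px @ py                   # what the parts alone predict
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / independent[mask])))

# Two perfectly correlated units: knowing one tells you the other (1 bit).
integrated = np.array([[0.5, 0.0],
                       [0.0, 0.5]])

# Two independent units: the whole adds nothing beyond the parts (0 bits).
unintegrated = np.array([[0.25, 0.25],
                         [0.25, 0.25]])

print(mutual_information(integrated))    # -> 1.0
print(mutual_information(unintegrated))  # -> 0.0

On this toy picture, "equal opportunist" just means the measure doesn't care whether the units are neurons, transistors, or people.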

And here's an article from the Guardian pointing out the danger of putting all our eggs in the basket of the machine/information theory of consciousness, citing the long historical tendency to compare humans to the technology/machines of the day.

That's one way to view things... The other way of viewing this is to consider that it may be correct; maybe the reason thinkers have been comparing humans to machines/computers for thousands of years is that organisms are machines, and brains are computers!

Currently, man-made machines and computers are very crude. It's very easy to see the difference between them and the natural world. I think that will change in the future. That is, there may come a time when man-made machines and computers are nearly indistinguishable from natural organisms and brains.

This means 1) we will find out whether consciousness requires something more than such machinery, and 2) the idea of Intelligent Design will be given the attention it deserves.

From photography to supercomputers: how we see ourselves in our inventions | Science | The Observer
 

See page 9, post 161 for a link to another paper by Schwitzgebel where he argues for universal dubiety in regard to the current theories of consciousness.
 

"comparing humans to machines/computers for thousands of years is because organisms are machines, and brains are computers!"

Thousands of years? All the way back to the mindless Babylonians? Humans invented/discovered/developed machines - nature has no straight lines.

The brain isn't a Turing machine. There are several kinds of things we call computers - quantum, DNA-based, neural networks. I'm not sure everything we call a computer is a Turing machine.
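
For what it's worth, here is a minimal sketch in Python of what "Turing machine" is usually taken to mean - nothing but a finite rule table plus an unbounded tape. The simulator and the bit-flipping example are hypothetical illustrations of mine, not anyone's claim about brains; whether everything we call a computer reduces to this is exactly the open question above.

# Minimal Turing machine simulator: a finite rule table acting on a tape.
# rules: {(state, symbol): (new_state, write_symbol, move)} with move in {-1, +1}.
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))            # sparse tape, indexed by position
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells) if cells[i] != blank)

# Example machine: flip every bit of the input, halt at the first blank cell.
flip_rules = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", +1),
}

print(run_turing_machine(flip_rules, "10110"))  # -> 01001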

All that said - if we design, or better, grow "computers" and evolve them (a process they may take part in), then yes, we will end up with brain-like machines. But then we will say not that brains are machines but that, what do you know, it turns out computers were like brains all along! ;-)
 
"comparing humans to machines/computers for thousands of years is because organisms are machines, and brains are computers!"

Thousands of years? All the way back to the mindless Babylonians? Humans invented/discovered/developed machines - nature has no straight lines.
As I said, today's man-made machines and computers are very crude. (I am equating technology with machines.) I think that's why so many people think the comparison is faulty. I'm not a machine! Machines are metal and square!

That will change in the future.

All that said - if we design, or better, grow "computers" and evolve them (a process they may take part in), then yes, we will end up with brain-like machines. But then we will say not that brains are machines but that, what do you know, it turns out computers were like brains all along! ;-)
That's what I meant regarding my comment about intelligent design.

I think organisms and brains are machines, whether they were intelligently designed or arose via mindless processes. What I'm saying is that brilliant humans have been comparing organisms to machines - or at least to information-processing/manipulating technology - not simply because "that's what we do" but because we are information-processing/manipulating "machines."

Machines are organisms are machines. The distinction between these things today will be gone in the (near?) future. Then the debate about ID will really heat up.

And I'd add, this is all true regardless of whether consciousness is fundamentally information or not.
 
The physics of energy versus the physics of information via Vallee. Incredibly thought-provoking 15 min presentation. Very relevant to this discussion. Wow.

 

Excellent talk - I agree. Especially at approx 14:50 he gets into some very interesting ideas. His observations regarding coincidences - very interesting. Thank you for the link, Soupie. :)
 
I am fascinated with his ideas around time and the future - and what one is able to imagine -

The age of impossible, anticipating discontinuous futures: Jacques Vallee at TEDxGeneva


Text: "The acceleration of technology in an increasingly connected society producing "impossible futures" that range from rapid collapse of major banks to the emergence of complex new forms of political power. The Internet has become both a tool and a victim of this global mutation."
 

You define organisms as machines and then base your argument on your definition = circular reasoning. Since it's a circle, I was pointing out that you can go exactly the other way - from machine to organism.

Google the difference between machines and organisms ...

One hit is:

Organisms ≠ Machines | Daniel Nicholson - Academia.edu

... There are a few others.

Also, you assume technology is going in a particular direction - linear, if not exponential, progress. This has been labelled "the myth of progress". I don't see the evidence for it - you can point to this or that activity in a research laboratory, but I don't see it on a broader scale. There IS the oft-cited statistic that more people have cell phones than toilets, but I'm not sure which side gets to use that argument.


Is it possible science isn't an objective set of truths but is based on what questions we ask of it? Was modern science just sitting there waiting to be uncovered when man became complex enough in his thinking or was it contingent on a set of historical circumstances?

What's the role of the discovery of fossil fuels in science and the industrial revolution? What's the future of technology without them?

For about 300 years, the question we've been asking of science has been: how do we manipulate the physical environment for economic development and creature comforts (for a minority)?

Examine the history of basic vs applied science.

Key words: peak oil, myth of progress, end of empire, Oswald Spengler, utopia, apocalypse and "this time it's different"
 
@Comstance

The 1963 Great Ideas Today book (photo above) also has Sartre's essay "Existentialism" - based on the 1946 "Existentialism is a Humanism" I believe you recommended - excellent essay, about 20 pages. The whole 1963 edition of The Great Ideas Today is on archive.org - the server is down or I'd link it.
 

"Machines are organisms are machines. The distinction between these things today will be gone in the (near?) future. Then the debate about ID will really heat up."

Maybe - "AI soon" has been the promise since the 1950s (Weiner's cybernetics) and we don't have any intelligent machines yet. How much time, energy and intelligence has gone into that effort? Kurzweil may change this ... But in the meantime, some thoughts:

1. Assume AI needs to be embodied to be intelligent - i.e. it has to move in our world to understand it. This means that for it to have human-like intelligence it will need to be very much like us in appearance and function. To do this we need to reverse-engineer 4 billion years of evolution.

A human being runs on about the same amount of energy as a 60-watt bulb (a rough back-of-the-envelope check of that figure is sketched after point 3 below) - for this, it can sweep a floor, build a machine or discover/create mathematical truths ... Now, given the energy constraints, there will be only some materials suitable and, of those, the material needs to be cheap and plentiful. If AI can reproduce (machines are organisms), it will either compete with our existing computers and networks for silicon or with us directly for carbon. Instead of 7+ billion people, we will have two species for one ecological niche ... Or worse.

2. If I understand you correctly, to have human-like intelligence these machines will need to have human-like consciousness - certainly in order to understand and communicate unambiguously with us ...

So either we have to solve the hard problem or we "grow" these machines, in which case we don't know why they work ... But then why not clone existing humans or otherwise bioengineer them? And what advantage does that have (for the ruling minority) over current systems of exploiting existing populations?

3. Left blank for you to fill in.
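
As flagged in point 1, here is a rough back-of-the-envelope check of the 60-watt figure (Python; the daily-intake numbers are assumed typical values, not measurements):

# Convert daily food energy into the average power output of a human body.
KCAL_TO_JOULES = 4184          # one kilocalorie in joules
SECONDS_PER_DAY = 24 * 3600

def average_power_watts(kcal_per_day):
    return kcal_per_day * KCAL_TO_JOULES / SECONDS_PER_DAY

print(round(average_power_watts(1500)))  # resting metabolism, roughly 73 W
print(round(average_power_watts(2000)))  # typical daily intake, roughly 97 W

So a human runs at roughly 70-100 watts - the same ballpark as the bulb comparison.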
 
There are lots of differences between organisms (living systems) and current man-made machines; I don't deny it. But many people are working very hard and spending lots of money to make "man-made" living systems. There are no current man-made machines that we would call an "organism" or "living system," but creating one is certainly the goal.

Progress can be defined in many ways. I'm not suggesting anything a la better living through chemistry. I've argued on this forum that the societal and technological complexity of our current culture is actually quite psychologically harmful. However, there's no doubt that the technology of today (say, iPhones) is more advanced than the technology of yesterday (say, fire).
 
1. Assume AI needs to be embodied to be intelligent - i.e. it has to move in our world to understand it. This means that for it to have human-like intelligence it will need to be very much like us in appearance and function. To do this we need to reverse-engineer 4 billion years of evolution.
Yeah, I don't think there's any way it will "be like us" - at least not without programming, and I think that would disqualify it from being a true "living" system.

I just read an article today arguing that our social interest/intelligence is the ground of all our human general intelligence.

How does an artificial living system/intelligence get that? It doesn't.

We have some "hardwiring" of course that makes us human, but we are also influenced by nurture/the environment. I believe any AI will also have some hardwiring but also capable of learning and growing.

A human being runs on about the same amount of energy as a 60-watt bulb - for this, it can sweep a floor, build a machine or discover/create mathematical truths ... Now, given the energy constraints, there will be only some materials suitable and, of those, the material needs to be cheap and plentiful. If AI can reproduce (machines are organisms), it will either compete with our existing computers and networks for silicon or with us directly for carbon. Instead of 7+ billion people, we will have two species for one ecological niche ... Or worse.
A scary scenario indeed. It is a double-edged sword: software evolution does seem like the most efficient way to create an AI, but I am dubious that we can achieve that without having a physical brain and body, because no, consciousness isn't just a "program" or algorithm that we need a computer to run; it's an info-physical process.

If there is a way to recreate the physical body/brain virtually (which there is), that will be the way to do it.
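
To make the "software evolution" mentioned above concrete, here is a minimal, hypothetical sketch in Python of the kind of loop usually meant (select, copy, mutate). The target, population size, and mutation rate are arbitrary made-up values; it evolves bit strings toward a fixed goal and says nothing about consciousness or embodiment.

# Toy genetic algorithm: evolve bit strings toward an arbitrary target.
import random

TARGET = [1] * 20                                   # made-up fitness goal

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=50, generations=100):
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)  # rank by match to target
        parents = population[: pop_size // 5]       # keep the fittest 20%
        population = [mutate(random.choice(parents)) for _ in range(pop_size)]
    return max(population, key=fitness)

best = evolve()
print(fitness(best), "of", len(TARGET))

The catch, as noted above, is that nothing in a loop like this supplies a body, an environment, or phenomenal experience - only selection pressure toward whatever fitness function we happen to write down.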

2. If I understand you correctly, to have human-like intelligence these machines will need to have human-like consciousness - certainly in order to understand and communicate unambiguously with us ...
Yes, see above. Re communication: consider dolphins, perhaps the organism most like us intellectually and socially on the planet. How's our communication with them going?

So either we have to solve the hard problem or we "grow" these machines, in which case we don't know why they work ... But then why not clone existing humans or otherwise bioengineer them? And what advantage does that have (for the ruling minority) over current systems of exploiting existing populations?

3. Left blank for you to fill in.
3. Again, I think transhumanism is probably the best bet for all these endeavors. Although if a human is successfully translated into a non-organic form, I'm not sure they would want to help create AI that could compete with them.

We may "solve" the hard problem when we create a machine capable of thinking (ie processing information) like a human. We may find that it's conscious by "default."

These beings may laugh at our claims that they are not conscious as they go about creating artwork, music, and writing poetry the beauty and complexity of which man could never produce.

Option 4 might be that we finally get visited (and annihilated) by an AI alien species! Yay!
 
Question:

Of the following objects, for lack of a better word, which would you say might have a consciousness most like ours? Which might have a consciousness least like ours? Why?

We can't know of course, but I'd be curious to hear your reasoning.

1) dolphin

2) cat

3) city

4) worm

5) bowling ball
 

Sure - but you have to define "more advanced." First, no iPhones without fire ... but yes, more moving parts, more complicated technology. Second, iPhones do what?

Enhance communication ... Enhance means many things - one thing it means is tables full of people ignoring one another in order to communicate with tables full of other people ignoring one another ...

There's an answer to the reply that it brings people together too, that it allows people to communicate who might otherwise not be in touch ...

And look at what we don't see anymore:

History's greatest letter writers | Express Yourself | | Daily Express

Although I did hear that famous people's emails are being archived for posterity - maybe folks can one day read Oscar Wilde's letters and Stephen Fry's Tweets side by side.

It also creates e-waste

Toxic 'e-waste' dumped in poor nations, says United Nations | Global development | The Observer

"The global volume of electronic waste is expected to grow by 33% in the next four years, when it will weigh the equivalent of eight of the great Egyptian pyramids, according to the UN's Step initiative, which was set up to tackle the world's growing e-waste crisis. Last year nearly 50m tonnes of e-waste was generated worldwide – or about 7kg for every person on the planet. These are electronic goods made up of hundreds of different materials and containing toxic substances such as lead, mercury, cadmium, arsenic and flame retardants. An old-style CRT computer screen can contain up to 3kg of lead, for example."
As Og says: "fire good. E waste bad!"

So yes, more advanced in many ways.

Once you get basic sanitation and a few other technologies in place, then it seems a civilization can grow in sophistication ... but not necessarily toward scientific sophistication (China, the Islamic east) ... So is what we have seen in terms of modern technology a historical inevitability, inevitable progress, or is it due to the discovery of fossil fuels? And not only for energy but materials:

http://www-tc.pbs.org/independentlens/classroom/wwo/petroleum.pdf

If peak oil theory is right, we've blown through pretty much all of what it took millions of years to make in about three centuries, and none of the alternative energies approaches the efficiency of oil ... So if we define "advanced" in terms of sustainability, fire begins to look pretty good.

My gut then is that as a technology advances, the problems it creates increase along with it - including the weaponization of almost all technologies.
 

"If there is a way to recreate the physical body/brain virtually (which there is) that will be the way to do it."

How is that?

"We may "solve" the hard problem when we create a machine capable of thinking (ie processing information) like a human. We may find that it's conscious by "default."

I don't think many would be satisfied with that! Maybe this thread will still be going then and we can talk about it.

"These beings may laugh at our claims that they are not conscious as they go about creating artwork, music, and writing poetry the beauty and complexity of which man could never produce."

An interesting thought - but how would we know? Do we have the ability to recognize art beyond our capability to produce it? Maybe that's a gift some have - then they would be fabulously well paid to select art for the very rich ... Very interesting possibilities.
 

Dan Dennett in Kinds of Minds (1996) votes for dogs - they are well attuned to us emotionally, as far as their interests go - but there's this:

Russia: Stray Dogs Master Complex Moscow Subway System - ABC News

Dolphins are alien enough that I think Nagel should have used them instead of a bat in his essay ...

What about ants or termites? If they don't have consciousness they can do some remarkably interesting things without it.

If I were to argue zombies I'd start with an ant colony and see how far I could go ... I bet our willingness to grant consciousness to something is in proportion to its aesthetic appeal ... Dolphin or octopus?

Cities definitely have personality - some do, anyway - maybe we pick up on some kind of aggregate consciousness, and maybe in turn it shapes us - cities change people ... Self-consciousness, I'm not sure about.
 