03-19-2018, 06:05 PM | #111
Join Date: Feb 2005
Location: Berkeley, CA
Re: No AI/No Supercomputers: Complexity Limits?
No, we actually can't. The tools we use are not part of the formal system and are therefore not 'logical'.
03-19-2018, 06:10 PM | #112
Join Date: Feb 2012
Re: No AI/No Supercomputers: Complexity Limits?
Quote:
Still, among the things that we can do, there are some that are not reproducible by a formal system, as such a system would not be finite.
03-19-2018, 06:20 PM | #113
Join Date: Feb 2005
Location: Berkeley, CA
Re: No AI/No Supercomputers: Complexity Limits?
Quote:
I am not aware of any examples where that is provably true.
03-19-2018, 06:22 PM | #114
Join Date: Feb 2012
Re: No AI/No Supercomputers: Complexity Limits?
Quote:
It’s the easiest line of attack on Gödel’s theorem, and it doesn’t work.
Quote:
Theorems are built from a set of symbols and transformation rules, but a theorem’s meaning can’t be contained in that set.
03-19-2018, 06:26 PM | #115
Join Date: Feb 2005
Location: Berkeley, CA
Re: No AI/No Supercomputers: Complexity Limits?
You'll have to define your terms a bit better; I cannot think of a definition of 'meaning' that is both logical and impossible to express logically.
03-19-2018, 06:31 PM | #116
Join Date: Feb 2012
Re: No AI/No Supercomputers: Complexity Limits?
Quote:
Let me reword it: we don’t know whether we will be able to create one except by the good old ways. I thought I had shown that well enough, but you can get a much better explanation by going to the sources: Gödel, Turing, Lucas, and Penrose. My advice is to start with Gödel and Lucas, or Gödel and Penrose.
03-19-2018, 06:34 PM | #117
Join Date: Feb 2005
Location: Berkeley, CA
Re: No AI/No Supercomputers: Complexity Limits?
I have read all three. Yes, we can decide that certain problems cannot be solved from within a given formal system, but we do so from outside the system. Construct a formal system that is large enough to include us and it will include undecidable statements that we cannot demonstrate are undecidable.
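The engine behind that limit is Turing's diagonal argument: a decider that lives inside the system can always be fed a program built from itself. A runnable sketch (all names are illustrative, and the infinite loop is represented symbolically so the example actually terminates):

```python
# Sketch of the diagonal argument: given ANY candidate decider
# `claims_halts(p)` answering "does program p halt when run on itself?",
# we can build a program that the decider is wrong about.

def make_contrarian(claims_halts):
    """Return a 'program' that does the opposite of the prediction."""
    def contrarian():
        if claims_halts(contrarian):
            # Decider says we halt, so we would loop forever
            # (returned as a label to keep the sketch runnable).
            return "loops forever"
        return "halts"
    return contrarian

# Two opposite candidate deciders; each is wrong about the
# contrarian built from itself.
always_yes = lambda p: True
always_no = lambda p: False

print(make_contrarian(always_yes)())  # "loops forever": prediction was "halts"
print(make_contrarian(always_no)())   # "halts": prediction was "loops"
```

Whatever concrete `claims_halts` you substitute, the same construction produces an input it misjudges, which is the sense in which the decider cannot step outside a system large enough to contain it.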
03-19-2018, 07:26 PM | #118
Join Date: Jun 2006
Re: No AI/No Supercomputers: Complexity Limits?
What makes you sure of that? Can you produce any evidence that there do not exist inputs that cause human minds to lock up in infinite loops exactly like the ones you seem to think the AIs can get stuck in?
__________________
-- MA Lloyd
03-19-2018, 07:53 PM | #119
Untagged
Join Date: Oct 2004
Location: Forest Grove, Beaverton, Oregon
Re: No AI/No Supercomputers: Complexity Limits?
Wouldn't that just be indecision "brain-lock" followed by simply choosing randomly? I've had cats that got stuck like that until I made some slight noise. Then they suddenly decide.
Only really simple animals lack these kinds of fail-safes. Ants can get stuck in a death spiral, failing to decide even to the point of death. https://en.wikipedia.org/wiki/Ant_mill
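That fail-safe is easy to model: deliberate with a step budget, and when the budget runs out, break the tie at random. A toy sketch (an illustration of the idea, not a claim about real cognition):

```python
import random

def decide(options, preference, budget=100):
    """Eliminate worse options pairwise; on budget exhaustion, pick randomly."""
    candidates = list(options)
    steps = 0
    while len(candidates) > 1 and steps < budget:
        steps += 1
        a, b = candidates[0], candidates[1]
        p = preference(a, b)
        if p > 0:
            candidates.remove(b)
        elif p < 0:
            candidates.remove(a)
        # On a perfect tie (p == 0) neither is eliminated: a pure
        # maximizer spins here forever -- the ant-mill failure mode.
    return candidates[0] if len(candidates) == 1 else random.choice(candidates)

# Two equally attractive choices: the loop can't break the tie, the
# budget expires, and randomness (the "slight noise") decides.
choice = decide(["left bowl", "right bowl"], lambda a, b: 0)
print(choice in ("left bowl", "right bowl"))  # True
```

With any strict preference the loop resolves normally; the random fallback only fires on the brain-lock case.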
__________________
Beware, poor communication skills. No offense intended. If offended, it just means that I failed my writing skill check.
03-19-2018, 09:04 PM | #120
Join Date: Jun 2005
Location: Lawrence, KS
Re: No AI/No Supercomputers: Complexity Limits?
Quote:
So it seems to me that if a human being can decide "logically" that theorem X, which is undecidable from postulates A1, A2, A3, and so on of formal system P, can be made decidable by shifting to formal system P* with different postulates, then a Turing machine, or an actual computer, can equally well shift to a different formal system. If necessary, you can program it to try new postulates at random, or in some specified order, and keep going till it gets to one that gives you formal system P* in which it can be ascertained that X is now decidable. And if for some reason you can't do that, then since a Turing machine is exactly equivalent to an idealized human logician, anything a human being does to decide X is not "logical." Or if you are using "logical" in one case to mean only what can be done within a formal system, but in the other case to mean something more extensive, then that looks like the Fallacy of Equivocation.
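The "try new postulates until X becomes decidable" procedure can be sketched in a toy propositional system, where entailment is checked by brute-force truth tables (everything here is illustrative, not a real proof assistant):

```python
from itertools import product

ATOMS = ["A", "B"]

def entails(axioms, formula):
    """True iff `formula` holds in every assignment satisfying all axioms."""
    for values in product([True, False], repeat=len(ATOMS)):
        env = dict(zip(ATOMS, values))
        if all(ax(env) for ax in axioms) and not formula(env):
            return False
    return True

def decidable(axioms, formula):
    """A statement is decidable when the axioms settle it either way."""
    return entails(axioms, formula) or entails(axioms, lambda e: not formula(e))

X = lambda env: env["B"]                      # target statement X
P = [lambda env: env["A"]]                    # system P: X is independent of P

# Candidate new postulates, tried in order (could equally be at random):
candidates = [
    lambda env: env["A"] or env["B"],         # too weak: X still open
    lambda env: (not env["A"]) or env["B"],   # "A implies B": settles X
]

print(decidable(P, X))                        # False: X undecidable in P
for postulate in candidates:
    P_star = P + [postulate]
    if decidable(P_star, X):
        break
print(decidable(P_star, X))                   # True: X decidable in P*
```

Propositional entailment is decidable, so the search is guaranteed to terminate here; the interesting cases in the thread are the ones where checking a candidate P* is itself as hard as the original problem.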
__________________
Bill Stoddard
I don't think we're in Oz any more.