03-14-2018, 08:49 PM | #31 |
Join Date: Mar 2008
Re: No AI/No Supercomputers: Complexity Limits?
03-15-2018, 12:50 PM | #32 |
Untagged
Join Date: Oct 2004
Location: Forest Grove, Beaverton, Oregon
Re: No AI/No Supercomputers: Complexity Limits?
And his ALS was a slower-progressing form than the one commonly known back then. That's an important fact too. It's not the miracle that some imply.
__________________
Beware, poor communication skills. No offense intended. If offended, it just means that I failed my writing skill check. |
03-15-2018, 12:53 PM | #33 |
Join Date: Feb 2005
Location: Berkeley, CA
|
Re: No AI/No Supercomputers: Complexity Limits?
That said, I still wouldn't bet on making it to 76 with ALS if limited to 1960s medical tech.
03-15-2018, 01:27 PM | #34 |
Untagged
Join Date: Oct 2004
Location: Forest Grove, Beaverton, Oregon
|
Re: No AI/No Supercomputers: Complexity Limits?
I wouldn't bet on making it to 76 now.
But I would say modern medicine helped keep him from developing lethal secondary issues from decades-long paralysis.
__________________
Beware, poor communication skills. No offense intended. If offended, it just means that I failed my writing skill check. |
03-15-2018, 09:38 PM | #35 |
Join Date: Feb 2007
|
Re: No AI/No Supercomputers: Complexity Limits?
The Apollo Project is arguably comparable to the 'handgun in the Middle Ages' example. It worked, but it worked in a way that isn't really a practical bridge to further developments. It was useful, but it was an accomplishment literally ahead of its time.

Another example: Europe 'discovered' the Americas at least once before Columbus, probably twice, quite likely many times. In the previous instances, nothing much came of it because the circumstances were not right for anything to come of it. Everything came together in the 1500s.
Most subsequent marine tech has consisted of refinements, though fission power was the key ingredient that let submarines come into their own.
Lasers might be different. It's too soon to say.
'Humanity' doesn't exist. There are only humans, and they do things for individual and group reasons that undercut that sort of analysis. For example, Earth is unquestionably more liveable than Mars. Therefore why would anyone live on Mars? What? Your neighbors on Earth own all the good land and resources and won't share, and some of them would like to eviscerate you? Well, Mars might be second-best territory, but at least you're a long way away from the crazies next door...

It made no logical sense, in economic terms, for the Puritans to travel across 3,000 miles of ocean to settle Massachusetts Bay. There were endless reasons not to do it... except, of course, that the cultural and religious disputes of the time turned those reasons inside out.

And no, Antarctica is not a counter-example, because it's no longer far enough away in technological terms for the comparison to apply under current conditions. Future conditions might change that, of course.
__________________
HMS Overflow-For conversations off topic here. Last edited by Johnny1A.2; 03-15-2018 at 09:43 PM. |
03-15-2018, 09:39 PM | #36 |
Join Date: Feb 2007
|
Re: No AI/No Supercomputers: Complexity Limits?
And modern electronics enabled him to use his intellect, communicate, and work in ways that would have been quite impossible, or impractical, for a person in the same medical state not long before.
__________________
HMS Overflow-For conversations off topic here. |
03-16-2018, 12:49 AM | #37 |
Join Date: Feb 2016
Location: Melbourne, Australia (also known as zone Brisbane)
|
Re: No AI/No Supercomputers: Complexity Limits?
I like the idea of a TL10 society without AI for game-based reasons, even if it isn't realistic. I want the PC humans to be the space explorers of the future, even though it makes far more sense to send robots. The Dune universe, for example, would make for a very interesting GURPS setting.
__________________
The stick you just can't throw away. |
03-16-2018, 01:02 AM | #38 |
Join Date: Jul 2015
Location: England
|
Re: No AI/No Supercomputers: Complexity Limits?
I think for me, the key requirement for a game at high TL would be that you couldn't just ask a computer to fix the problem for you. You need to keep the agency of the characters, and that might mean limiting AI - there's no reason you can't play an AI, of course, but whatever sort of characters you're thinking of, they need to have something to do. If it's just "oh, I ask my AI to sort that out while my PC continues to live in a post-scarcity utopia", that's probably going to get dull quickly (though if you find that fun, go for it!).
|
03-16-2018, 01:37 AM | #39 |
Banned
Join Date: Mar 2018
|
Re: No AI/No Supercomputers: Complexity Limits?
The scale of distances is also not remotely comparable. The distances between planets are several orders of magnitude greater than the distances between continents. I agree with basically every word of this blog post from cybereconomics on why interstellar travel is impossible. Solar system colonization is a far less spectacular engineering feat, but it suffers from most of the same problems. Not only do I think it's technically infeasible, I also think it's basically pointless. Eventually there may be some people further out in the solar system - as massive supply chains are gradually built up - but it won't be remotely like 'going to America'. America is not an uninhabitable hellscape, which every single extraterrestrial body is. America could be colonized by Siberian primitives on foot. Modern humans would die instantly anywhere else in the solar system, or within a few hours or days even with the most advanced technology possible. And unlike the already sophisticated ship fleets and techniques that had been developed for crossing oceans (of which American colonization was merely an application), there exists no comparable technology to survive in and resupply across the much, much greater gaps in local space.

Interstellar space travel is, I believe, basically a religious fad. I also tend to believe that any society sufficiently advanced to overcome these difficulties would no longer be inhabited by human beings, and would probably not bother. Perhaps there will be robots spreading across the solar system like a steel cancer, but people? Nah. Really, if the Science! futurologists were accurate, they ought to conclude that biological mankind is going to be extinct - and not just because 'evil robots destroy us', but because hyper-advanced AI replaces humans in all functions and we become nothing but pets, incapable of competing with them, with all our agency and material resources stripped away. I think simple Darwinism would lead to that.
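The distance mismatch is easy to put numbers on. Here's a quick back-of-envelope sketch in Python (my own illustrative figures, not from the linked post; the 1,000-tonne ship and 10% of lightspeed are arbitrary assumptions):

```python
# Back-of-envelope comparison of crossing distances, plus the kinetic
# energy needed for a modest interstellar ship. Illustrative numbers only.

C = 3.0e8             # speed of light, m/s
ATLANTIC_M = 5.0e6    # an Atlantic crossing, ~5,000 km, in metres
MARS_M = 5.5e10       # Earth-Mars at closest approach, ~55 million km
ALPHA_CEN_M = 4.1e16  # Alpha Centauri, ~4.3 light-years, in metres

def ratio(a, b):
    """How many times farther a is than b."""
    return a / b

def kinetic_energy(mass_kg, v_frac_c):
    """Relativistic kinetic energy, (gamma - 1) * m * c^2."""
    gamma = 1.0 / (1.0 - v_frac_c ** 2) ** 0.5
    return (gamma - 1.0) * mass_kg * C ** 2

if __name__ == "__main__":
    print(f"Mars is ~{ratio(MARS_M, ATLANTIC_M):.0e}x an Atlantic crossing")
    print(f"Alpha Centauri is ~{ratio(ALPHA_CEN_M, MARS_M):.0e}x the Mars trip")
    # A 1,000-tonne ship (tiny by ocean-liner standards) at 10% of c:
    print(f"KE at 0.1c: ~{kinetic_energy(1.0e6, 0.1):.1e} J")
```

The kinetic energy alone comes out around 4.5 × 10²⁰ joules, which is on the order of a year of current worldwide energy consumption, for one small ship, one way, ignoring deceleration and everything else.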
It's also possible that many people would convert themselves into cyborgs indistinguishable from super-machines. If biological human beings persist into the higher 'TLs', it will be because a lot of things like nanomachines, super power sources, etc. are not possible, or are too expensive to bother with. I am by no means a convinced transhumanist, but if a lot of the science fiction technologies are possible, then I think they lead there invariably.
Someone more knowledgeable than I calculated that warping space (as the Star Trek ships do) would require two entire galaxies' worth of antimatter to use even once.
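I can't vouch for that particular calculation, but the raw scale is easy to sketch. Assuming a Milky Way-scale mass of roughly 10⁴² kg (my own order-of-magnitude assumption) and the standard 2mc² yield when antimatter annihilates against an equal mass of ordinary matter:

```python
# Rough scale of the energy in "two galaxies' worth" of antimatter,
# assuming annihilation against an equal mass of ordinary matter.
# GALAXY_MASS_KG is an order-of-magnitude assumption, not a sourced figure.

C = 3.0e8                # speed of light, m/s
GALAXY_MASS_KG = 1.0e42  # assumed Milky Way-scale mass

def annihilation_energy(antimatter_kg):
    """Energy released annihilating antimatter with equal matter: 2 * m * c^2."""
    return 2.0 * antimatter_kg * C ** 2

total = annihilation_energy(2.0 * GALAXY_MASS_KG)
# The Sun radiates roughly 1.2e34 J per year; compare:
print(f"~{total:.1e} J, about {total / 1.2e34:.0e} Sun-years of output")
```

That works out to around 3.6 × 10⁵⁹ joules - something like 10²⁵ years of the Sun's entire output - which is the kind of number that makes "religious fad" a fair description whatever the exact figure is.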
The more resources the characters have - and for humans technology is the trump resource, with influential friends coming in a close second - the harder it is to create plausible challenges for them. But I prefer to tough it out by spending more effort creating challenges than to 'cheat' by simply ruling out perfectly logical uses of resources and abilities. Last edited by VonKatzen; 03-16-2018 at 02:37 AM. |
03-16-2018, 01:45 AM | #40 |
Banned
Join Date: Mar 2018
|
Re: No AI/No Supercomputers: Complexity Limits?
That's really what Reed's essay and my point about economics were getting at. There are extremely advanced processes of material acquisition and processing that would have been virtually impossible in ancient times, and too worthless to ever be started, because the return on investment would be billions of dinars in the negative even if you pulled it off. Some ancient Greeks built miniature steam-powered toys, but nobody ever bothered to try and make a dreadnought - not because they were stupid, but because it simply couldn't be done.

To continue the above post (I ran out of characters): it is of course possible to ignore some of these likely physical limits for the purposes of a setting. Personally, though, I like it when the author at least follows through on the implications. For example, if you have power sources cheap and available enough that anyone can have a planet-hopping spaceship, then everyone can also have a nuclear bomb. If you have autonomous superintelligent AI, it becomes hard to justify it not being ubiquitous - some people may have their Butlerian Jihad, but they're going to be overwhelmed and replaced by people who have no such hesitancy about using super-AI. Trying to be a space-conservative is like trying to be an Amazon forest tribe - you're basically at the mercy of anyone who feels like giving you a hard time. Darwinism on the biological and social levels is extremely powerful, as humanity has learned quite well over the past few hundred years - no matter what merits your civilization may have, if it can't compete materially it just plain can't compete. And if the people with the super technology are very friendly and restrained, it becomes hard to imagine why there isn't a eutopia. That is actually something I'd like to see more of in speculative fiction - eutopia.
While fighting among primitive apeling hu-mans with laser-sticks has a primal appeal to us, it's also interesting to consider what a really well-adjusted, hyper-advanced civilization would actually be like, instead of treating it as WW2 in Spaaace! the way Star Wars does. This is one of the things I like about consistency/realism in role-playing games - it can be an interesting tool for exploring how people might actually behave if they had super-powers or teleporters, instead of confining oneself to TVTropes as most authors do. Personally, I find the idea of what a world with random godlike superhumans might actually be like far more interesting than mainstream comic books, even though it does present some challenges for people writing traditional stories (of course, there's no law that says all stories ought to follow such formats). On the sci-fi side, it's interesting to think about how people might deal with the challenges of space and the existence of super-technology, instead of using them as mere prop dressing for what might as well be a Greek play for all the difference they make.

And that's perhaps the biggest problem with Futurism as it's practiced - it's very linear, and it projects modern society and modern man into a setting where they may not really have a place. The future is going to be much weirder than anything Michio Kaku daydreams up. I have no claim to know what it would be like in detail, but I do like to explore some avenues typically neglected by him and the authors of science fiction novels.
Last edited by VonKatzen; 03-16-2018 at 03:04 AM. |