Thread: Utopia and the End of History
TheDeath


Responsible
Undefeatable Hero
with serious business
posted March 01, 2009 03:08 AM
Edited by TheDeath at 03:14, 01 Mar 2009.

Quote:
As for the machines - why would we create something that would destroy us of its own will? (And don't bring up the nuclear bomb; I've never heard of a bomb thinking, "Hmm... I'm going to destroy all of humanity.")
Why not?
If I had the funds and time and skills and PROCESSING POWER (well, skills are somewhat resolved), I would make and program such machines. And many others would as well. Some do it out of scientific curiosity. Others do it because, well, they like "the next step", like me. Others may do it because machines are much more efficient than humans, and they seek that. Others may do it because we are a dumb, arrogant species with too much influence that wastes too much (as I have explained, we do not deserve, in our current state, to be called "efficient"). Others simply because they want to see this "next step" in action. I congratulate such people, and they are usually the ones involved in *real* AI, not the static commercial crap you find advertised everywhere (trust me, I know).

Not everyone is selfishly ignorant, with a "preserve-how-I-am-at-all-costs" attitude. You're lucky that monkeys didn't think like that, or we would still be monkeys right now.

Mvass, what you suggest is STAGNATION, as I have explained so many times. It is choosing the 'human' species (as it is now, not "the next step") and keeping it as it is, at all costs. Complete stagnation, but even worse -- you want to expand this primitive influence. You don't want us to evolve into "xuman" (let's say, the next species) or into "cyborgs"; yet you wouldn't be here without our ancestors evolving, so you seem to like evolution, just not when it comes to the next generations -- how arrogant is that?

EDIT: Of course, I could even consider the cyborg "my son", so to speak. Or, going further, imagine if I could transfer my brain and memories into that cyborg body! That's evolution of a sort.

Quote:
Nope. They're a lot more primitive, especially in this respect.
In what respect? In their lives? Let's see how well you'd do in the environment and circumstances in which they survive, if you think they are more primitive than you. I don't think you'd even survive, since you lack their cooperation (did I mention an IQ below 50? let's see how well you do).

Quote:
But I am independent to obtain these things in any way I choose. Cells aren't.
Unless you consider it free will, you aren't. (By the way, I believe in free will.) But on the large scale, most small choices don't matter.
____________
The above post is subject to SIRIOUSness.
No jokes were harmed during the making of this signature.

mvassilev


Responsible
Undefeatable Hero
posted March 01, 2009 04:11 AM

Hey, I'm a transhumanist too. But there's a difference between uploading my brain into a cyborg body (good idea) and creating a robot that kills me (bad idea). I wouldn't mind having wheels come out of my feet, extendable fingers, and a tripled IQ. I'd mind being dead.

I fully support creating humans+. But that's a physical/technological change, not a philosophical one.

As for ants, they're adapted to their situation. But if humans were in a similar position, yet retained their intelligence, they wouldn't do too poorly either (except they'd have to find a way to fight predators, but eventually something like the nuclear bomb would be invented).

Quote:
Unless you consider it free will, you aren't.
Meh, I believe in free will on the macro level but not on the micro level. But, in the context of this discussion, we have free will.
____________
Eccentric Opinion

TheDeath


Responsible
Undefeatable Hero
with serious business
posted March 02, 2009 02:58 AM
Edited by TheDeath at 03:01, 02 Mar 2009.

Quote:
Hey, I'm a transhumanist too. But there's a difference between uploading my brain into a cyborg body (good idea) and creating a robot that kills me (bad idea). I wouldn't mind having wheels come out of my feet, extendable fingers, and a tripled IQ. I'd mind being dead.
I guess you're one of those who doesn't care about his son (you could consider the cyborg your son, especially if you designed it).

Quote:
I fully support creating humans+. But that's a physical/technological change, not a philosophical one.
What do you mean by '+'?
The reason I brought up cyborgs is that they are much more efficient mentally, without being "influenced" by pointless things like pleasure. But that's about it. In fact, what you suggest is no better at all -- we would still be the same, unless of course we expand our goals to include the "next level".

If we could deprive ourselves of pleasure, it would be the same thing to me (well, not exactly -- add immunity to all biological diseases and you get something similar).

If monkeys had somehow "magically" become cyborgs and wiped out all other animals on Earth before becoming humans (that is, while still being dumb), I would call it an utter evolutionary failure, because they would not only have stopped their own evolution, they would also have removed any possibility of ever going back, since they would lack a powerful reason to choose otherwise.

Likewise, we might develop something else entirely (and more advanced) than reason, and might even scrap reason itself. But what you suggest is to keep totally to what we have forever, and only pursue the "cyborg" thing if it protects that even more, which is worse than doing nothing, since it makes it even harder to evolve than otherwise. Those monkeys would thus never evolve into humans, as their cyborgs already handle everything.

Don't get me wrong, I'm all for conservation of MANY things (you know me), but those have their places. What you suggest isn't conservation of influence but conservation of the species against evolution, which I disagree with (I do support conservation of influence -- say, not enslaving aliens, leaving animals and many parts of nature alone, etc.).

Quote:
As for ants, they're adapted to their situation. But if humans were in a similar position, yet retained their intelligence, they wouldn't do too poorly either (except they'd have to find a way to fight predators, but eventually something like the nuclear bomb would be invented).
But they don't. You can compare ants with other creatures, like monkeys, and see which is better off. You can't compare ants to humans while taking into account intelligence, creativity, and the ability to think in abstractions.




By the way, if you're wondering about natural selection, let me tell you that it was only a "step", in the same sense that I say "next step" or whatever. We aren't subject to it anymore, or if we are, only very insignificantly. See how powerful these "steps" can be? Reason is a step too, as was natural selection.

Who knows what's next? But I'll tell you one thing: you ain't helping it by trying to preserve how you are even more efficiently than you already do.
____________
The above post is subject to SIRIOUSness.
No jokes were harmed during the making of this signature.

mvassilev


Responsible
Undefeatable Hero
posted March 02, 2009 04:07 AM

Quote:
I guess you're one of those who doesn't care about his son (you could consider the cyborg your son, especially if you designed it).
I wouldn't particularly care for a child that kills me.

Quote:
If we could deprive ourselves of pleasure, it would be the same thing to me (well, not exactly -- add immunity to all biological diseases and you get something similar).
I don't think you get it. Pleasure feels good. Biological diseases feel bad. It's good to get rid of bad stuff. It's bad to get rid of good stuff.
____________
Eccentric Opinion

JollyJoker


Honorable
Undefeatable Hero
posted March 02, 2009 06:48 AM

Someone is clearly suffering from the Frankenstein-syndrome. Not to mention from the Hijack-all-threads-with-the-same-bull-syndrome.

TheDeath


Responsible
Undefeatable Hero
with serious business
posted March 03, 2009 12:55 AM

Quote:
I wouldn't particularly care for a child that kills me.
Hold on just a sec, I didn't necessarily say anything about KILLING. You know they can be reasonable as well, if we just "back off" and assure their 'happiness' (whatever that means), like we assure our children's. After all, many parents even sacrifice for their kids; we could sacrifice for them, so to speak (not literally).

Obviously cyborgs won't be so unreasonable as to savagely attack us. But at that point it's important to realize that our time has passed and that we should give them some space -- they won't kill us, they're reasonable, but they will feel threatened if WE do not back off and instead try to assume leadership or use them as slaves. WE should back off, not them. We could maybe even cooperate (as much as we can), but the important thing is not to "use" them, or to consider yourself somehow superior (morally, perhaps?), or to think you deserve your influence (and whatever comes with it) more than they do. They will "follow" our lives after we die (we = biological humans as we are now), but we can consider them the 'next step' in this respect (while of course still conserving many things, even cultures and historical art -- which is one of the requirements of a noble species, or a "species with taste and elegance" (okay, that sounded weird, but you know what I mean), rather than just being a pointless existence).

To make it easier to understand, imagine that babies were born artificially and 'genetically engineered'; of course you would love them like your "normal, naturally selected" son, right? Of course it would be extremely wrong for them to dominate us as if they were superior (just as it is wrong for us to exploit lesser beings), rather than simply living with us. However, notice that it would only be a matter of time until your 'normal' species went extinct -- which doesn't mean that the society, culture, and traditions disappear, and those are actually the important thing (along with being a reasonable, peaceful species, not one that exploits lesser beings). I couldn't care less even if aliens "followed up".

That, of course, is where we must start from. And by "improvement" I by NO means necessarily mean 'enhancements', so to speak, but mostly MENTALITY improvements and increased awareness of the pointless things we should scrap. Enhancements are nice, but not required, and by no means do they make us 'better' on their own (they don't actually add much to that, but they don't decrease it either; they're just neutral). As an example of mentality: suppose those cyborgs were more aware than us of efficiency and of not wasting, but we decided NOT to become aware, when we clearly could. Even worse if we actually fought them because, uhm, we don't like 'em.

Quote:
I don't think you get it. Pleasure feels good. Biological diseases feel bad. It's good to get rid of bad stuff. It's bad to get rid of good stuff.
What? A cyborg feels no pleasure and is immune to diseases; that's what I meant and exactly what I said.


Quote:
Someone is clearly suffering from the Frankenstein-syndrome.
Someone is clearly suffering from the wrong syndrome. (see below)

Also, do me a favor: if you happen to see "frankensteins" (whatever that means) in your lifetime, do not follow the trend and become one yourself, even if everyone else is doing it.

Quote:
Not to mention from the Hijack-all-threads-with-the-same-bull-syndrome.
Not to mention from the Troll-all-threads-and-adding-super-ultra-mega-hyper-important-and-extremely-urgent-information-and-carefully-crafted-insight-by-calling-other-posts-BS-or-nonsense. Honestly I have no idea how threads can even go on without such sensitive and critical information.




On a more serious note, you were probably referring to this, right?
Which is, as you may or may not have noticed, something I am entirely against. Creating slaves or subhumans is one of the worst things that could happen to a Utopia; in fact, I think it's already happening with many animals (note: domestic animals aren't always slaves, it depends on the respective 'usage'; I won't go into that). I'm not sure where you got the idea that I wanted something like that.

Further, I do not necessarily advocate pointless enhancements -- such as better senses or stronger muscles -- as that is NOT my intention (we already have such conversions anyway). Not that such things would be unwelcome, but they're not requirements for a Utopia.

What is required is a dramatic change in human social abilities and mentalities, eliminating the waste (in both areas) that drags them down. Let's just say that (in the social case) having problems with someone, being "sorry", and then doing it all over again isn't going to count as efficiency, especially if you end up being buddies again. Don't tell me it's "way more complex than that" -- I'm quite aware, but it's only complex because of our current mentalities. Computers don't really have social problems, do they? I'm not saying we'll be computers (lol), just showing by example that it is possible, if you dare to step outside the world in which you were raised, so to speak (a Utopia is, after all, rather idealistic given the current situation, otherwise it wouldn't be a Utopia).

Just a question: do you happen to know what the "singularity" is? It's just the next step in evolution. Some say it's the end of humanity as we know it, but I think it's just an evolutionary "next step". Sure, it'll be the end of humanity as we know it -- but ever since monkeys evolved into humans, it has been the "end of monkeys as we knew them". Is that a bad thing?

Do not take my arguments as an attempt to subjugate nature or enslave everything else, as that is what I am MOST against. Being more efficient, which is what I'm advocating, actually helps us drain less energy (especially energy we would otherwise have drained on pointless things) and thus have less influence, while still evolving into a 'noble' species that respects other things and not just itself (because there is nothing noble in being a tyrant, which we would otherwise be).

And by "human improvement" I do not necessarily mean it the classical way, but just ENOUGH to eliminate the waste/pointless things that we should not NEED. (after all, no need for something, no energy wasted pursuing it). Immunity to diseases (as cyborgs) would also have the benefit of not wasting more energy and lives on curing diseases anymore, especially degenerative ones. (except if you want to do it on animals or nature, but some would say we shouldn't involve ourselves unless it's our fault that they got diseased).

So if anything, such an "improvement" is mostly one of MENTALITY: becoming aware of the energy and/or resources we waste on some of our pointless desires, while IMPROVING the others and FOCUSING on what we have specifically been given (as opposed to what we 'inherited'; I detailed this in my previous posts).

Dunno why I feel like I'm repeating myself, though; I should probably take a break from all this.
____________
The above post is subject to SIRIOUSness.
No jokes were harmed during the making of this signature.

TheDeath


Responsible
Undefeatable Hero
with serious business
posted March 03, 2009 01:01 AM

Also, @mvass, did you watch Steven Spielberg's "AI", or read the book? (I didn't read the book, I admit.)
____________
The above post is subject to SIRIOUSness.
No jokes were harmed during the making of this signature.

mvassilev


Responsible
Undefeatable Hero
posted March 03, 2009 05:18 AM

Your whole first point makes absolutely no sense.

Quote:
What? A cyborg feels no pleasure and is immune to diseases; that's what I meant and exactly what I said.
So, in one way, it's more advanced than us (no diseases), but in another, it's less advanced (no pleasure).

And no, I've never seen or read AI.
____________
Eccentric Opinion

JollyJoker


Honorable
Undefeatable Hero
posted March 03, 2009 07:35 AM

A couple of points.

What you do here, Death, is the following. You say: I have a clear idea of utopia, and since it isn't attainable with the way humans are, we must change human nature (call it whatever you want, but that's what it amounts to) or the human itself.
That's what I call the Frankenstein-syndrome: the mad scientist pursuing an abstract idea, who will stop at nothing to get there. Note that it's basically inhuman, since humans are reduced to SERVING a purpose, and not the other way round: the purpose is now utopia, and humans must be changed to serve that purpose, when it's actually meant to be the other way round. (This includes the whole AI and cyborg business.)
I hate to mention him, but Hitler is actually the same: an idea about humanity and a couple of its aspects, and the idea is followed through INHUMANLY -- making humanity better by deleting certain branches from the human gene pool. It sounds like a reasonable idea if you phrase it accordingly, but in practice, and phrased a little differently, it immediately becomes INHUMAN.

Same here.

As a byproduct of all this another thread has been sunk.
