Heroes of Might and Magic Community
Heroes Community > Other Side of the Monitor > Thread: I gave up on believing in God.
DagothGares


Responsible
Undefeatable Hero
No gods or kings
posted June 25, 2008 12:47 AM

And the atheist religion will start science, damn you!
Religion started out like this, but it has since grown into so much more.
____________
If you have any more questions, go to Dagoth Cares.

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 25, 2008 02:14 PM

Quote:
Unfortunately, some people still believe that lightning is caused by the anger of God.
That's beside the point. Whether or not it's caused by the anger of God has nothing to do with it -- actually, they could pretty much be right (not that I agree!). Sure, they have no reproducible proof of it, but lack of proof does not mean it's wrong. Again, 'wrong' is an attribute of knowledge -- practice is different. Knowledge does not need to be useful; it just is... knowledge.

The idea of it is that you can't truly say they are wrong because you don't know -- on the other hand, you can easily say that you have no reason to believe it, which is a totally different matter.

Besides, what I was trying to imply with that is that you usually don't use the Bible to explain things that are not written in it -- and electricity is not lightning; I was talking about electricity in the physics sense, not lightning bolts! You certainly can't use the Bible to explain computers, since they are 'made up' precisely from the scientific definitions we wanted them to have.

mvassilev


Responsible
Undefeatable Hero
posted June 25, 2008 05:41 PM

Quote:
That's beside the point. Whether or not it's caused by the anger of God has nothing to do with it -- actually they could pretty much be right (not that I agree!). Sure, they have no reproducible proof of it, but lack of proof does not mean it's wrong.
Are you suggesting that their explanation is as legitimate as science's? Because it's definitely not.


____________
Eccentric Opinion

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 25, 2008 05:47 PM

Quote:
Are you suggesting that their explanation is as legitimate as science's? Because it's definitely not.
I honestly don't know what 'legitimate' means (or how it's used in this context), but I am going to repeat myself. An explanation is not a scientific theory -- it doesn't 'predict' stuff; it may not even be useful. If I explain to you that we are in a dream, will that change anything? On a 'useful' scale, nope, but it was an explanation (one that, in this example, I chose to believe for whatever reasons). It might be false, but you don't know that (usually the one who uses the explanation should be the one who decides in the first place).

Though of course there is a difference between saying "I know you are wrong" and "I have no reason (my reason) to believe that what you say is true".

Of course we, as people, choose the explanation that fits better with our beliefs (in our case, we choose the reproducible, mechanistic one). If those people think that the anger of God causes it, and for them that's an explanation (I don't question them), then it's OK, at least because we can't prove them wrong and they might be right (even if by our subjective calculations that's a 0.000001% chance). It doesn't mean you have to use the same explanation.

Unless, of course, you prove them wrong, but that is hardly possible (the reverse is also true).

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted June 27, 2008 05:00 PM

Quote:
Math is a language, a language that we use to express logical formulations and mathematical representations.
What about Computer Science, then? Is it a science? If so, why -- where is the experiment? And if not, why not?
It seems as if I am playing with words, but I am not. In a sense this is deeply related to the whole god-science discussion, and what I am trying to say is that people are keen on misclassifying subjects and often reach wrong conclusions.

As for the earlier conversation, I must admit that I am being provocative. I do so because I want to know what others actually think about something. If I was offensive, I apologize.
____________
The empty set

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 27, 2008 05:22 PM

Depends on what you mean by computer science.
If you mean hardware, then it's physics.
If you mean software, I'd rather say it's a language -- there's a reason they are called 'programming languages'.

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted June 28, 2008 01:02 AM

I don't think so.
For hardware you simply need something that implements fast boolean logic / has two states, so electricity is ideal. But that's the ONLY reason computers use electricity, and essentially that's where the "physics" part of computer science ends (a small illustration of the two-state point follows after this post). In fact, we have reached a point where it is very difficult to overcome the barriers of quantum effects at the hardware level; yet computers keep getting faster -- mainly because they have more cores. That's not physics at all, and I think that much is understandable. But even if we didn't add more cores to the processor, other hardware designs have (almost?) nothing to do with physics.
As for the software part, that's only a minor part of computer science. Programming languages are merely used to express an algorithm (a recipe or a method of doing things); that's the reason they are called "languages". It's a way of communicating with a computer. You can also verify the minor significance of a programming language in computer science as a whole if you check the syllabus of any computer science department; it might be a course or two out of 40 or so. So, that's not the "other half".
Besides, a programming language does nothing by itself.
It is the notion of the algorithm that is central. Information theory plays a central role as well (see telecommunications). Is it coincidental that many European countries refer to the same area of study with the term "informatics" instead of "computer science"? Moreover, computer science *does not depend at all* on Physics in its foundations. That's another common myth, nourished by many graduates of physics departments (at least among the guys I know). I can elaborate on why that is if necessary, but I don't think it is.
I think the simple one-liners above do not reflect the truth about what computer science is -- even on a very coarse scale. And again, I note that computer science does not depend on *any* experiment in its foundations (not the applications!). So, is it a science? Maybe. I don't expect an answer.
____________
The empty set
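As a rough illustration of the 'two states plus fast boolean logic' point above, here is a minimal sketch in Python (the gate names and the one-bit adder are my own example, nothing from the thread): any two-state carrier plus a single universal gate already gives you all of boolean logic, and from there arithmetic.

def nand(a: bool, b: bool) -> bool:
    # the single universal gate; the physical carrier of True/False is irrelevant
    return not (a and b)

# every other gate composed from NAND alone
def inv(a):      return nand(a, a)
def and_(a, b):  return inv(nand(a, b))
def or_(a, b):   return nand(inv(a), inv(b))
def xor(a, b):   return and_(or_(a, b), nand(a, b))

# a one-bit full adder built only from the gates above
def full_adder(a, b, carry_in):
    s = xor(xor(a, b), carry_in)
    carry_out = or_(and_(a, b), and_(carry_in, xor(a, b)))
    return s, carry_out

print(full_adder(True, True, False))   # (False, True): sum 0, carry 1

Nothing in the sketch cares whether the two states are voltages, relays, or marbles, which is exactly the point being made.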

Binabik


Responsible
Legendary Hero
posted June 28, 2008 09:31 AM

Quote:
For hardware you simply need something that implements fast boolean logic / has two states
I'll be sure to mention that to the engineers on my next contract. BTW, you're not under the impression that binary/digital actually exists other than theoretically, are you?

Oh yea, to stay on topic: God is analog.

____________

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted June 28, 2008 10:25 AM

Quote:
BTW, you're not under the impression that binary/digital actually exists other than theoretically are you?
Depends on how restrictive you are when you say 'theoretically'. But the real answer to your question, for me, is: I don't care! How many sides does a coin have? And regarding the hardware part, it is not a joke; it's at the heart of the speed of 'practical' computers. Finally, you don't even need computers to 'do' computer science -- which sounds like an oxymoron, yet is true.
____________
The empty set

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 28, 2008 11:31 AM

As for hardware and its 'multiple cores', I don't really know if it's considered physics -- you are right -- but it is not science. At best it is engineering, because building a house is not science, and neither is building a computer (when you 'know' how it works but need to decide on an algorithm/whatever).

As for programming languages, well, they are the core of software. What comes after them is a binary output (0100111011), which can hardly be considered science. It can easily be considered engineering, though, or (seriously) some people on programming forums I visit even consider it an 'art' -- don't ask me why.

Binabik


Responsible
Legendary Hero
posted June 28, 2008 11:32 AM

Quote:
How many sides does a coin have
You're asking someone who makes their living in topology how many sides there are to a coin?

The answer is one. But you can only see half of it at a time. And you can see it from an infinite number of viewpoints in 3D space. Even though there is only one side, there is an infinite number of half sides.

The reason God is analog is that everything in nature is analog. Digital is strictly a human construct, its most important attribute being error correction. And God doesn't need error correction... or so the theologians say.

____________
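A toy illustration of the 'digital's most important attribute is error correction' remark, sketched in Python with the simplest code imaginable, 3x repetition (the example is a sketch of mine, not Binabik's):

def encode(bits):
    # send every bit three times
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # majority vote over each group of three received copies
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

message = [1, 0, 1, 1]
received = encode(message)
received[4] ^= 1                      # the analog world flips one copy in transit
assert decode(received) == message    # the digital layer still recovers the message

Real hardware uses far denser codes (Hamming, Reed-Solomon), but the principle is the same: redundancy lets the digital layer undo what the analog world scrambles.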

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 28, 2008 11:37 AM

Frankly speaking, digital is also 'analog'. For example, you can represent 1s and 0s with 6V and 0V or whatever other scheme. The thing is, the '6' and '0' are still analog -- the only difference is that you tolerate a lot of values around '6' and around '0' and treat them as 1 or 0 (that is, with a margin of error!). From a deep quantum viewpoint it's still analog at its core (and even some hard drives have a small error rate due to this fact!). What we call digital is only a macroscopic and theoretical view.
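A quick sketch of the thresholding idea above, in Python (the 6V / 0V levels come from the post; the noise figure and the function names are arbitrary assumptions):

import random

HIGH, LOW = 6.0, 0.0     # nominal '1' and '0' levels, as in the example above
NOISE = 1.0              # assumed analog jitter added by the physical world

def transmit(bit):
    # the wire only ever carries an analog voltage
    return (HIGH if bit else LOW) + random.gauss(0, NOISE)

def read(voltage):
    # the 'digital' view: anything above the midpoint counts as 1
    return 1 if voltage > (HIGH + LOW) / 2 else 0

bits = [0, 1] * 50_000
errors = sum(read(transmit(b)) != b for b in bits)
print(f"{errors} flipped bits out of {len(bits)}")   # small, but rarely exactly zero

Widen the gap between the two levels (or shrink the noise) and the flips become astronomically rare, which is why we can usually pretend the system is 'purely' digital.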

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted June 30, 2008 01:35 PM

Quote:
As for hardware and it's 'multiple cores', I don't really know if it's considered physics, you are right, but it is not science. At best it is engineering. Because building a house is not science, nor is building a computer (when you "know" how it works, but you need to decide an algorithm/whatever).
Ok. Hardware plus programming languages don't cover more than 10-20% of the spectrum of computer science, and I am being really generous with that percentage. The main reason hardware is seen as "engineering", I guess, is the fact that electricity is used. Had another medium been used, I don't know whether you would still consider it engineering.

Quote:
As for programming languages, well they are the core of software. What comes after them, is a binary output (0100111011) which can hardly be considered science. But it can easily be considered engineering or (seriously) some people on programming forums I visit are even considering it an 'art' -- don't ask me why
Wrong. Programming languages are NOT the core of software. And don't equate the output with what they are meant to be, because similarly I could say that Newton's second law actually computes a decimal number and not some force, and from your viewpoint that should also hardly be considered science. As for the "art" in programming, there is a reason the word "art" is used. Btw, the volumes that most likely influenced most of the guys talking about "art in programming" are Knuth's The Art of Computer Programming.

Quote:
You're asking someone who makes their living in topology how many sides there are to a coin?
Come on Binabik. You want me to force somebody to say "two" no matter what he does for a living? Ok. What is the cardinality of the set A = {empty_set, {empty_set}}? Now, this might seem theoretical, but you can then define functions that map things either to empty_set or to {empty_set}, thereby classifying them with binary logic. And now this is the application, and it is no longer theoretical (a tiny sketch of this follows after this post).

@TheDeath: My quantum viewpoint is very shallow because I am no physicist; I am a computer scientist. And btw, the crude generalization of equating digital with analog is another joke. Digital describes things that are discrete, while analog deals with things that are more continuous... And if you also want a "quantum extension" of that, consider the fact that energy comes in packets; i.e., energy is digital...
____________
The empty set
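A tiny sketch of the A = {empty_set, {empty_set}} construction, in Python (frozensets stand in for the sets, and the even/odd classifier is just an arbitrary example):

EMPTY = frozenset()            # the empty set
ONE = frozenset({EMPTY})       # the set whose only element is the empty set
A = {EMPTY, ONE}

assert len(A) == 2             # the cardinality of A is 2

def classify(n):
    # map integers into A: evens to empty_set, odds to {empty_set}
    return ONE if n % 2 else EMPTY

assert [classify(n) for n in range(4)] == [EMPTY, ONE, EMPTY, ONE]

The point being that 'binary' needs nothing physical: two distinguishable set-theoretic objects are enough to classify things.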

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 30, 2008 01:45 PM
Edited by TheDeath at 13:47, 30 Jun 2008.

Quote:
@TheDeath: My quantum viewpoint is very shallow because I am no physicist; I am a computer scientist. And btw, the brute generalization of equating digital with analog is another joke. Digital describes things that are discrete, while analog deals with things that are more continuous ... And if you also want a "quantum extension" of that, consider the fact that energy comes in packets; i.e. energy is digital ....
Analog is not necessarily continuous; it is also the fact that the energy is stored in 'real numbers' (that is, naturally infinite-precision things) rather than in 1s and 0s (which are themselves classified by those real numbers) -- see my example above with the volts representing bits (of course analog has 'errors', but that's nature).

What I meant with "digital" = "analog" in a way is more like this: the world, the natural laws of physics or whatever (on which computers are based, be they analog or digital, it doesn't matter) are analog. Therefore digital, from this viewpoint, is an imitation of analog, and thus analog. Now, our thinking of it is like this: digitally, we think in discrete terms and without any errors (natural errors), unless we are talking about imprecision, obviously. However, the errors still exist, but are small, because it is an imitation of an analog world. That is, any device (e.g. hard disks) has a probability of failing (optical media have a smaller error rate, though bad-quality optical drives are usually erroneous, not the storage media); this is because they are, after all, using analog laws of physics and are built on them. Digital is only a theoretical viewpoint, because we live in an analog world, so we try to 'approximate' that theory as best as we can by imitating it with analog things.

The reason I talk about programming languages when I talk about software is that I am very experienced in that area, so I am probably not very familiar with other definitions of 'software' (but I also know assembly language and binary opcodes).

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted June 30, 2008 02:22 PM

Quote:
Analog is not necessarily continuous, it is also the fact that the energy is stored in 'real numbers' (that means, natural infinite precision thingies) rather than 1s and 0s (classified by those real numbers too) -- see my example above with the volts representing bits (of course analog has 'errors' but that's in nature).
Well, energy can't take values across the whole spectrum of real numbers, because it comes in packets. So you are simply using the wrong scale.
Quote:
What I meant with "digital" = "analog" in a way is more like this: The world, the natural laws of physics or whatever (on which computers are based on, be they analog or digital, doesn't matter) are analog.
If you mean computer science when saying "computers" as well, then you are wrong, simply because computer science is NOT based on ANY law of Physics. So, digital in computer science exists, whether we have computers or not.
Quote:
Therefore, digital from this viewpoint, is an imitation of analog, and thus analog. Now, our thinking of it is like this: digitally, we think discrete and we think without any errors (natural errors) unless we are talking about imprecision obviously.
Not really. For example, can we write down the square root of two? You can say that you need infinitely many digits, and therefore it is impossible. However, we have a very compact representation of this number which is exact, with infinite precision! That's the isolating-interval representation: the square root of 2 is, e.g., the unique root of the polynomial x^2 - 2 in the interval [0, +infinity) (or [0, 2] if you prefer something smaller).
And now all the operations with the square root of 2 can be translated into operations on polynomials with integer coefficients. As for the engineering and practical part, it is sufficient to "spit out" a large enough approximation of the number (a small sketch of this follows after this post). Of course you can give all the digits, but that also assumes you have an infinite amount of time...

Quote:
However, the errors still exist, but are small, because it is an imitation of an analog world. That is, any device (e.g: hard disks) have a probability to fail (optical media have a smaller error, but the bad quality optical drives are usually erroneous, not the storage media); this is because they are, after all, using analog laws of physics and are based on those. Digital is only a theoretical viewpoint because we live in an analog world, so we try to 'approximate' that theory as best as we can by imitating with analog things.
As I said earlier, the reason we use electricity in computers is that it is the fastest implementation we know of, and in practice we (of course) care about response times. If you want a computer that never fails, you can have it; however, it's going to be really slow. So the choice is obvious: a computer as fast as a turtle with no failures, or a computer as fast as a cheetah with a very low probability of failure? The world decided on this question long ago...

Quote:
The reason I am talking about programming languages in software is because I am very experienced in that area, so probably I am not very familiar with other 'software' definitions (but I also know assembly language and binary opcodes ).
It's not a matter of definition; nor do I want to trick you in any way. 'Software' is a very general word which says very little about the thing being inspected. It's what you do that matters, not how (in which programming language) you implement it! Programming languages give you a means to implement what you really want to do. But that's not what solves the problem you want to solve. It's just that: a means of implementation.
____________
The empty set
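A compact sketch of the isolating-interval representation described above, in Python (exact rational arithmetic via fractions, with bisection playing the role of 'spitting out' an approximation; the function names are assumptions of this sketch):

from fractions import Fraction

def poly(x):
    # the defining polynomial of sqrt(2)
    return x * x - 2

def approximate(precision):
    # sqrt(2) is stored exactly as: the unique root of x^2 - 2 inside [0, 2];
    # digits are produced only on demand, by bisecting that isolating interval
    lo, hi = Fraction(0), Fraction(2)
    while hi - lo > precision:
        mid = (lo + hi) / 2
        if poly(mid) < 0:      # the root lies to the right of mid
            lo = mid
        else:
            hi = mid
    return lo

print(float(approximate(Fraction(1, 10**12))))   # 1.414213562373...

The pair (x^2 - 2, [0, 2]) is the exact, finite, digital representation; the decimal digits are only manufactured when somebody asks for them.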

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 30, 2008 02:28 PM

Quote:
Well, energy can't take the whole spectrum of real numbers, because it comes in packets. So, you simply use wrong scale.
I meant the energy 'amount' that we measure with equipment -- that 'amount' is stored as analog data, not digital

Quote:
If you mean computer science when saying "computers" as well, then you are wrong, simply because computer science is NOT based on ANY law of Physics. So, digital in computer science exists, whether we have computers or not.
Doesn't that mean it's theoretical?

Quote:
Not really. For example, can we write down the square root of two? You can say that you need infinitely many digits, and therefore it is impossible. However, we have a very compact representation of this number which is exact with infinite precision! That's isolating interval representation; meaning, square root of 2 is, e.g., the unique root of the polynomial x^2 - 2, in the interval [0, +infinity) (or [0,2] if you prefer something smaller).
And now, all the operations with square root of 2, can be translated in operations with polynomials that have integer coefficients. And as of the engineering and practical part, it is sufficient to "spit" a large enough approximation of the number. Of course you can give all the digits, but that also assumes that you have an infinite amount of time...
Analogously, the square root of two exists, whether our equipment can 'display' it or not. Digitally it is impossible since we don't have infinite memory available.

Quote:
As I said earlier, the reason that we use electricity in computers is because it is the fastest implementation that we know of, and in practice we (of course) care about the response times. If you want a computer that never fails, you can have it. However, it's going to be real slow. So, the choice is obvious: A computer which is as fast as a turtle with no failures, or a computer which is as fast as a cheetah with a very low probability on a failure? The world has decided on this question long ago ...
I don't think there's such a thing as a computer that never fails in a non-ideal world, any more than there is a 'perfect' vacuum. That's why I say it's theoretical.

Quote:
It's not a matter of definition; nor I want to trick you in a way. 'software' is a very general word which says very little about the "thing being inspected". It's what you do that matters; not how (programming language) you implement it! Programming languages give you a mean to implement what you really want to do. But that's not what solves the problem you want to solve. It's just that; a mean of implementation.
What is interesting is that the end result will be stored on some 'hardware' medium (hard disk, CD, RAM, whatever), so it's basically some form of hardware in the end.

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted June 30, 2008 02:48 PM
Edited by dimis at 14:49, 30 Jun 2008.

Quote:
I meant the energy 'amount' that we measure with equipment -- that 'amount' is stored as analog data, not digital
You are using the wrong scale.

Quote:
Quote:
If you mean computer science when saying "computers" as well, then you are wrong, simply because computer science is NOT based on ANY law of Physics. So, digital in computer science exists, whether we have computers or not.
Doesn't that mean it's theoretical
No.

Quote:
Quote:
Not really. For example, can we write down the square root of two? You can say that you need infinitely many digits, and therefore it is impossible. However, we have a very compact representation of this number which is exact with infinite precision! That's isolating interval representation; meaning, square root of 2 is, e.g., the unique root of the polynomial x^2 - 2, in the interval [0, +infinity) (or [0,2] if you prefer something smaller).
And now, all the operations with square root of 2, can be translated in operations with polynomials that have integer coefficients. And as of the engineering and practical part, it is sufficient to "spit" a large enough approximation of the number. Of course you can give all the digits, but that also assumes that you have an infinite amount of time...
Analogously, the square root of two exists, whether our equipment can 'display' it or not. Digitally it is impossible since we don't have infinite memory available.
Read the representation again. It takes only a few bits (i.e. digital) to describe something that in decimal needs infinitely many digits.

Quote:
I don't think there's such a thing as a computer that never fails in a non-ideal world, as much as there isn't any 'perfect' vacuum. That's why I say it's theoretical.
There are ways to guarantee the absence of faults; it's just that they are extremely expensive in real life.

Quote:
What is interesting is that the end result will be stored in a 'hardware' media (hard disk, CDs, RAM, whatever), so it's basically some form of hardware at the end.
Sorry, but this viewpoint is naive, childish, and misleading, to say the least. The interesting part of computer science is not the "end product" but the path you follow to produce the "end product".
____________
The empty set

TheDeath


Responsible
Undefeatable Hero
with serious business
posted June 30, 2008 02:55 PM
Edited by TheDeath at 14:56, 30 Jun 2008.

Quote:
Read again the representation. It only takes a few bits (i.e. digital) to describe something that in decimal needs infinite precision.
What about transcendental functions? (I'm not talking about an algorithm to generate them, but how to "store" a given value).

And even with such an algorithm, you still need infinite memory if you want infinite precision -- and also infinite time.

Quote:
Sorry, but this viewpoint is naive, childish, and misleading at least. The interesting part in computer science is not the "end product" but the way you follow to produce the "end product".
The way you follow to the end product is the programming language, or the generated binary opcodes (which are also called assembly language), so I am really confused now.

Paxdominum

Tavern Dweller
posted August 14, 2008 02:20 AM
Edited by Paxdominum at 02:20, 14 Aug 2008.

Follow the Bible, and ask a deacon or a priest if you have trouble understanding it. Go through the (usually) three-year Catholic catechumenate/candidate training (sorry for my English here) and you will learn to understand God better, and you will be a part of his Body, the body of Christ's Church. Go to church more than every Sunday -- that's how I got a life!

God bless!
- PaX
____________
Fight evil!

mvassilev


Responsible
Undefeatable Hero
posted August 14, 2008 02:27 AM

No thanks. There's no way I want to be part of a group that's been involved in keeping people in the dark for over a thousand years.
____________
Eccentric Opinion
