
  View the latest questions and answers at askaphilosopher.org

Ask a Philosopher: Questions and Answers 21 (1st series)

Here are some of the questions that you Asked a Philosopher from April 2003 to May 2003:

  1. Dreaming and reality
  2. Differences between Epicureans and Stoics
  3. Objectivism versus non-objectivism
  4. Can necrophilia be inherited?
  5. Objective idealism
  6. Differences between art and science
  7. Can I prevent my elderly relative from changing his will?
  8. Who has the right to tell me what to do?
  9. Differences between Camus and Heidegger
  10. How to prevent human beings from destroying themselves
  11. 'Evil' in religion and morals
  12. Does Monday cause Tuesday?
  13. Questions on causation, ontology of mind and Sartre
  14. Did Macbeth gain, or lose 'power'?
  15. Facts and truths
  16. Education as indoctrination
  17. What Socrates did not write
  18. Clarity is not enough
  19. Kant and the noumenal self
  20. Consequences of accepting that man evolved
  21. Do we control what we think or does what we think control us?
  22. Ethical relativism vs ethical objectivism
  23. Deciding ethically which knowledge should be pursued
  24. Which language should I learn if I am studying philosophy?
  25. On the possibility of having been someone else
  26. Reading Heidegger and Nietzsche
  27. 'To be, or not to be'
  28. 'Seeing' star systems that don't exist any more
  29. Happiness and work
  30. Philosophical considerations on technology
  31. The opposite of 'God'
  32. Collingwood on the method of scissors and paste
  33. I am against the enforcement of intellectual property rights
  34. Bataille on death, sex and religious experience
  35. John Searle's critique of the Turing Test
  36. What we mean by a 'genius'
  37. Can God exist without evil?
  38. Ockham's razor
  39. Why philosophers hop on the bandwagon
  40. Help with Nietzsche's Beyond Good and Evil
  41. Another tree falling in the forest
  42. Is there one thing that gives meaning to life?
  43. Descartes on mind and body, and Freud on the ego
  44. How the universe knows itself
  45. Advice for a philosophy student looking for a marriage partner
  46. Effect of gender on the way we think and feel
  47. Instantiation of self
  48. Relative and absolute views of truth
  49. Problems with translating Parmenides
  50. How agency determines ontology
  51. Why believe in God if human beings will one day be extinct like dinosaurs?
  52. Why it's wrong to eat people
  53. When a question is not a question
  54. Defining and justifying terrorism
  55. The problem of pollution
  56. Explaining the mind-body problem
  57. Philosophies of Lao Tsu, Chuang Tsu and Confucius
  58. I have a theory about the nature of time
  59. Why astrology and Christianity don't mix
  60. Was Descartes an idealist?
  61. Absence of evidence and evidence of absence
  62. Knowing that you know
  63. Philosophy of Hesiod
  64. Do thoughts or actions measure ethical worth?
  65. Levinas vs Heidegger, and Levinas on the 'Other'
  66. What we immediately 'perceive' in perception
  67. I'm writing a novel based on philosophy
  68. Consciousness as an emergent phenomenon
  69. Estimating the probability of being born a human being
  70. Reality is immense in comparison with myself
  71. Good web sites for 6th Grade philosophy
  72. Morals without religion
  73. What follows, if an image of God is always in our mind
  74. 'If the atheist exists, so must God'
  75. Hegel and Schopenhauer
  76. A philosopher who wrote about love
  77. What knowledge is and how to get it
  78. Evaluating IQ tests
  79. Justifying punishment in schools
  80. Mind as a hierarchical Petri net
  81. Is time travel conceptually possible?
  82. Imagining a universe in which I do not exist
  83. Tracing the evolutionary pattern of music
  84. Descartes' 'Cogito'
  85. How to consider scepticism without getting depressed
  86. Which is more necessary, religion or philosophy?
  87. What dreams show about consciousness
  88. Philosophical problems for polytheists
  89. War is not the answer
  90. Antony Flew's analogy of the gardener
  91. Russell on numbers and sets
  92. Comparing Thomas Aquinas with Martin Buber
  93. Are works of art in the mind?
  94. Induction and falsification
  95. Need for philosophy of education
  96. Why it's good to know things
  97. Arguments against the sceptic
  98. Doing philosophy
  99. Paradoxes about 'God' and 'nothing'
  100. Proving God's non-existence
  101. Existentialism and phenomenology
  102. Each to his own
  103. Logic that resolves the contradictions in dualism
  104. Everything written in philosophy is obvious
  105. Socrates and the hemlock
  106. How the universe can expand or contract
  107. Is 'reality' relative to technology?
  108. Santayana on pleasurable vs admirable beauty
  109. Psychological egoism
  110. Philosophy of the photograph
  111. How to take the apples away from '5 apples'
  112. God and Aristotle
  113. If this is the answer, what is the question?

Ed asked:

Is reality a dream?

If not, what differentiates the 'real world' from the 'dream world'?

If so, Life is wrapped in a dream. If that is true, then wouldn't death be wrapped in a dream? Is death just one big dream?

Hazel and SFB asked:

What is reality?

and Aaron asked:

What is reality? Could we simply be pawns in a child's computer game?

The key to the answer is the recognition that the concepts "reality" and "dream [world]" refer to two distinctly different modes of experience. By the very nature of these two concepts, they cannot refer to the same thing. Therefore, the simple answer is "No!". Reality cannot be a dream without seriously abusing the meaning of the two words. Poets, of course, are granted license to abuse the language for artistic purposes. But philosophers must take greater care.

We each experience "the slings and arrows of outrageous fortune" in two distinctly different modes. When experiencing life in one mode, we notice that things perceived are constant, persistent, consistent, and coherent. When experiencing life in the other mode, we notice that things perceived are dramatically less constant in form and character, often transient in existence, frequently mutually inconsistent both from thing to thing and across time, and far more frequently quite incoherent. One mode of experience draws the focus of our attention, is amenable to inquiry, and responsive to our reactions. The other mode of experience often drifts uncontrollably past our attention, is rarely subject to inquiry, and is often unresponsive to our reactions. On any scale of measure, the difference between the two modes of experience is dramatic and unmistakable whenever noticed. One of these modes of experience we call the "real world", the other we call the "dream world" (or hallucinations, or illusions).

Most of us spend most of our time experiencing life in the "real world" mode. Episodes spent in the "dream world", while they may seem quite real at the time, always end with a transition back to the "real world" mode of experience. Some people, for reasons ranging from drugs to organic brain damage, spend more of their time in the "dream world". Some people, again for diverse reasons, lose the ability to notice the distinctly different character of the two modes of experience, and are unable to distinguish their "real" experiences from their "dream" experiences.

The bottom line is that life is not a dream. The "real world", unlike the "dream world", possesses an unmistakably greater degree of constancy, consistency, and coherence. In the real world, elephants are huge, grey, and don't fly. That remains true across time, and is consistent with all other information we have about the real world mode of experience. In the dream world, pink elephants can buzz around your head, and turn into green mice stomping on the roof of your house. The fact that sometimes a dream appears so real you can't tell does not alter the fact that you always wake up.

Stuart Burns

We cannot wake up from reality; therefore it is not a dream. However, we can wake up to reality from a dream. I think I understand where you are coming from, and it is not an unusual question you have asked. Fundamentally, you are asking: What is reality? The blunt answer is, I don't know! Further, I have not yet come across anyone who does. However, there are several theories to choose from, posited by philosophy, science and religion. Most of these theories are backed by strong logical arguments, although some appear more feasible than others. There is little doubt that we all entertain the notion of a fundamental reality, and most of the world's human population take it for granted that there are things that are real and things that are unreal.

Probably most people who care to give the problem some consideration are willing to accept the materialist views of science, i.e. the universe is in reality solid; size, weight and shape are measurements of a solid reality. Philosophers can be divided broadly into 'materialists' and 'idealists.' Materialists basically hold views sympathetic to science. Reality for idealists is somehow linked to 'mind': we live in an inner 'subjective' world rather than an outer 'objective' world. However, there are several variations within both the notion of materialism and the notion of idealism; hence, we are presented with a choice of several 'world views.' Added to all this is the notion of 'dualism,' which accepts that the world is both mind stuff and matter stuff. In dualism there is, in some cases, a link with religious views, where mind and body are interpreted as 'soul' and body. We have a real soul in a real body.

Religion in general is perhaps the thinking body least concerned with seeking out reality. Religion, in all its variations, remains spiritual, and establishes reality through 'faith.' God is real and God created a real universe. There is an undeniable pragmatism about the overall religious view: 'what is, is; do not question reality; trust in God, or the powers that be; accept the reality we are aware of and get on with living the good life.'

To consider dreams briefly: probably most people will accept that dreams are real, though the content may be fictitious. The assertion, "I had a dream last night," is true to me and probably can be accepted as true by the person I am addressing, based on his/her own experience with dreams. We, therefore, both understand what is meant by dreaming, and are each well aware of the difference between dreaming, with its association with sleep, and being consciously awake. The implication in your question suggests a fallacy commonly expressed by those who wish to make a comparison between a solid, material world and the idealist world of mind concepts. It is not the case that if reality is not fundamentally material, then it is somehow a dream world in the mind. The idealist world is a concept of a real world, but differently constructed from the notions of scientists and materialists in general. If our world is an idealist world, then it is a real world, in which we are capable of recognising the difference between being awake and dreaming.

I am no expert on death, for, so far as I know, I have not yet experienced it. Neither have I met anyone, to my knowledge, who has returned from the dead. However, what little I do know about death indicates that it is a reality and far from being a dream. In fact it is the only outcome of life that we can safely predict. Having said all this, I do keep an open mind on the subject; my years of interest in psychic phenomena keep me alert to possibilities.

Not wishing to appear flippant, because your question is a very serious one, I would say that reality is what each individual chooses to believe; some explanations seem more acceptable than others, and until philosophy, science or religion produces the real answer, if they ever do, we will have to go along with the choices open to us. But there is no denying that one of the present theories just may be true; it is a matter of proof.

John Brandon

I think you're confusing something really basic here. Death is not a state of anything or anyone: it is the absence of existence, the non-being in or of any state. You'll have to think about this in the context of language, which treats 'Death' as though it were the opposite of something existent. But of course it's not. Language used in such a way is a means for us to bring to apprehension a state that exists (or which we presume to exist) and then to identify linguistically a non-state.

Perhaps the simplest way to explain this is as follows: if you and I were stranded in the middle of the Sahara on a hot day, and you say: 'I'm thirsty', I might reply 'there's no water here'. The important language element in this sentence of mine is not 'water' but 'here'. It implies that water is known to exist; it just happens not to be available where we are situated. So I'm not making an existential statement about water or non-water. Whereas, if in the same situation you are on your last gasp, about to expire, then I might be in the humanly very distressing situation of having to understand that, at present, you are, but in a few moments, you may no longer be. Then it is appropriate for me to report, 'this man was alive and now he's dead,' to identify a state of being which I knew you to inhabit at some temporal bracket in history. But to extend this kind of articulation to states which are not, never have been and therefore never will be 'dead', is strictly speaking just a game, the game of language (cf. Wittgenstein). It does not refer to anything 'real'; it just refers back to us, and that includes to a large extent not just our understanding but our wishes and beliefs.

I expect that from this answer you will readily deduce that your question about dreams is a non-issue for the same language-dependent reasons. Reality is distinctly of the body: it is therefore experienced by every organism in its struggle to live and survive and reproduce. The only organism to which this is a 'problem' is the mind-endowed creature called homo sapiens, whose state of being is identifiable, among many other qualities, by his ability to note a difference between mental and physical features of this reality. We then go ahead from this fairly innocuous problem and hang enormous weights of speculative thinking on it, of which a great deal is again just part of the game of language.

To put this into a neat capsule for you: we tend to lump the concepts 'mental', 'psychological', 'spiritual', 'soul' and so on into a single basket, as if somehow they were all the same, i.e. parts of a dimension divorced from 'reality', which is then opposed to it as the 'hard stuff'. But just as a rock differs in significant features from a microbe, so 'mental' and 'spiritual' are different categories. What we refer to as 'mental' are states of reality which apply to animals as well as us (animals dream!) and are simply the neurophysiological responses of our body to the impact of 'reality' on us. Dreams are generated by the body, by the neurosystem as part of its homeostatic routine; but the dreams to which you might otherwise assign such notions as (e.g.) 'hope' are a different kettle of fish. Again, in language we usually fail to distinguish, in the expression 'hope', between a realistic expectation and the doodling of the mind.

But whichever way you look at it, in the end 'reality' comes first. So 'reality', however experienced, precedes 'dreams', however defined. In dreams, waking or sleeping, you can do 'what you like', but God help you if you try to do the same 'in reality'!

Jürgen Lawrenz

Though it can't be proved, I believe that the world exists. Existing in the dream of an unknown being is no fun, so I refuse to believe that.

But reality IS a fantasy; that is, you can shape it to your own liking. However, it is practical to share a big part of that fantasy with others, otherwise you'll lead a lonely life (and often end up in a mental hospital).

So death is for me not a dream but another fantasy. In many cultures it is just an accepted part of life. In Christian culture it generally and officially was made something absolute and a subject of fear (but many Christian priests have a comforting, relative view on death).

Henk Tuten

This doesn't really constitute an answer, but your question sounds very much like Morpheus's in The Matrix. I don't remember the exact quote, but yours is quite close, unless I've lost all short-term memory. Which is all a good way of saying that there are two articles that everyone should read:

David Chalmers 'The Matrix as Metaphysics' at http://www.whatisthematrix.com


Nick Bostrom 'Are you Living in a Computer Simulation?' at http://www.simulation-argument.com.

Both are really good, though the latter is quite tough if you're not familiar with probability theory (though there is a really good introduction to the argument on the site, published in the Times Literary Supplement), and the former does get quite technical (though Chalmers' ever-impressive writing style makes things very clear).

And everyone should go and see that new Matrix film (Matrix Reloaded, in case you've been reading Kant in your room too much — get out!)

This might not answer the 'is life a dream?' question (which leads to interesting questions about how clever I must be: I came up with this?), but it might help with the analogous 'are we in the Matrix?' question.

Rich Woodward


Nikki asked:

I am about to take a test in philosophy and am wondering if anyone can help with this question. Both Epicureans and Stoics maintain that the way we live our lives should conform to nature or the "nature of things." In which ways do their moral visions agree/disagree? The Epicureans also maintain that the aim of human living is to seek pleasure and avoid pain. For what ultimate end should one pursue pleasure and avoid pain? Is the Epicurean view simply an updated version of the views defended by Polus and Callicles? (Plato, Gorgias) Why?

Although there are similarities between Callicles' and Epicurus' views, there is a significant difference. Callicles believed in the pursuit of pleasure, with pleasure defining happiness, which was the final end. Epicurus believed in pursuing pleasure, but also stated that sometimes it is preferable to choose pain if it will lead to greater pleasures. He also thought that some pleasures should not be chosen if they would lead to greater pain. In other words, Epicurus believed our choices should be moderated by having regard for their consequences. Callicles (at least as Plato painted him in his dialogue Gorgias) believed we should just pursue our desires. I presume you're studying the Letter to Menoeceus. I hope this helps in some way.

Lyn Renwood


Marc asked:

I recently got into a debate with this fellow on objectivism versus non-objectivism.

My position was that a Platonic reality exists for scientific, mathematical and moral concepts. That is I want to believe that there is a timeless, universal set of scientific, mathematical and moral principles that exist external to the human mind and are knowable by it.

My opponent disputed this, claiming that any such objective reality would be unknowable, and science is simply a calculational device used for making correct predictions. He also strongly disputed that there could exist a set of universal moral principles.

What does modern philosophy have to say about this? What would be a majority consensus view of things at present? And what texts should I read to get a good grounding in the basic arguments for and against?

Your question about consensus on these questions is extremely difficult to answer. Some time ago Philosophy Now magazine did a survey on what students of philosophy tended to believe regarding ethical objectivity and what philosophy teachers tended to believe. The result, as I recall, was that students tend to be non-objectivists and teachers tend to be objectivists. However, in general there are likely to be more philosophers who are sceptical about ethical objectivism than about mathematical or scientific objectivism.

However, we need to get a little clearer on how to couch the debate between objectivism and non-objectivism, as these terms can be a little slippery and have changed their meaning through the course of history. Hence the modern debate over these questions tends to be couched in terms of two positions, 'response-independence' and 'response-dependence'. The motivation for couching the debate this way comes from a general acceptance of a distinction between primary and secondary qualities (derived from John Locke). Primary qualities were those properties of objects that existed independently of our responses, e.g. shape; secondary qualities were those that were dependent on our responses, e.g. colour, warmth, taste.

This way of setting the debate up offers us sharp distinctions between judgements concerning colours such as 'The carpet is red,' and taste such as 'Beer tastes bitter' and judgements concerning shape such as 'The pebbles are round.' The response dependence of the judgement, 'The carpet is red' is explained by saying that the truth conditions of the judgement are not independent of our responses, that is they are partly constituted by our responses. The response independence of the judgement, 'The pebbles are round' is explained by saying that the truth conditions of the judgement are independent of any judgement that we could possibly make about the pebbles. That is to say they would be round even if we had never come across them, or they are mind independent.

Not everyone accepts the distinction between primary and secondary qualities, and you have to have one foot in the objectivist camp, at least for some judgements, in order to make the distinction. Some philosophers do hold a global response-dependence view of our judgements, but with the distinction drawn in terms of the truth conditions of the judgement we can see what they are arguing about. The great difficulty for those who hold the response-dependence view of judgements consists in saying exactly which responses are equivalent to the truth of the judgement, i.e. which judgements cannot be false. Most are elusive on this question, and it is a weakness in the theory.

Controversial areas in science concern theoretical posits or unobservable entities, but there is no need to see these as being response-dependent. Many scientific entities might not be directly observable without thereby being constructed out of our responses.

With the above in mind we can now turn to maths. A response-dependence view of maths looks initially attractive, because we may be suspicious of attributing mathematical sets to a response-independent reality. However, certain mathematical theorems, like Gödel's theorem, look like they are either true or false, and there is no possible judgement on our behalf that could make them so, because of our limited ability to determine the truth of the judgement. Many mathematical breakthroughs only make sense on a response-independence view of the subject matter. The possibility of our best judgements being false is the main foil the objectivist or response-independence theorist has against the non-objectivist or response-dependence theorist.

Turning to morals, the matter is a little trickier. There is a distinction to be made between objective moral or value facts and objective moral principles. Basically, some philosophers like Richard Hare hold that you can have universal moral principles without objective moral or value facts. The two are likely to be more successful if they go together, though. Moral judgements such as 'Inflicting wanton cruelty on animals is morally wrong' look like they have truth conditions that are independent of the subject who is making the judgement. That is, the truth of a moral judgement is not to be decided by the person making the judgement. This looks like a conceptual truth: it is what differentiates moral judgements from judgements of taste. However, the truth of the above judgement does not look like it is going to be true independent of all responses: it is not going to be true independent of the capacity of the animal to feel pain or to suffer. So there is a sense in which moral judgements are both response-independent, since they do not concern the speaker's responses, and response-dependent, since they concern the responses of the subject of the judgement. (The subject of the judgement in the above is 'animals', and is not to be confused with the subject making the judgement, i.e. the speaker.)

If you see a Platonic reality as an objective reality, and approach objective reality as a reality that exists independent of our judgements about it, then it seems you have a good case for making it with regard to maths and science, but with ethics we have to be careful about the scope of this distinction. All of the above would be regarded as objectivist positions.

I would recommend reading the arguments of some non-objectivist philosophers in order to see what their motivations are for adopting such a position. That is to say, most philosophers take response-dependence views of subjects because they see a problem with the response-independence view. In this way, once you remove the obstacles for your opponent, they should fall in line with a form of objectivism.

J.L. Mackie's Ethics: Inventing Right and Wrong is a nice little book by someone who challenges objectivity in morals and distinguishes between objective moral or value facts and objective moral principles in his opening chapters. Mackie sets out what it would look like for there to be objective values, i.e. everyone's happiness would count equally when making moral decisions, but he rejects this view because of a clash between Platonic or Kantian conceptions of morality entailing reasons for action and Humean conceptions of reasons for action. This is one of the main debating points in contemporary meta-ethics, so if you can find a good way around it then you will be able to defend your position from likely critics. Also try David O. Brink's Moral Realism and the Foundations of Ethics for support.

Julian Bennett


Gert asked:

Is necrophilia inheritable through the genes of the necrophile's family?

The short and sweet answer: no, no, NO.

The longer answer: at this point we have, really, no idea about what is inherited and what isn't with regard to mental attributes, except in rather vague statistical ways. But even the absolute strongest correlations found, between identical twins on really very general attributes such as intelligence, still do not reach 100%. As for detailed personality traits like heterosexuality and homosexuality, there is some very small correlation among siblings. I'm not even sure that anyone can say there is really a genetic component there. If there is, it's pretty small, as far as we know at this point.

Now, specifically, necrophilia is a learned sexual preference. As far as I know, you cannot inherit it any more than you can inherit a preference for Fords over Chevys or red hats over blue hats. Maybe there's something heritable there, but if there is, it would be basically science fiction at this point to assert it. There's no way that I can say, absolutely, that a preference for blue hats is certainly not genetic, and so there's no way that I can say, absolutely, that necrophilia is not genetic. There's not enough known about genetics. Maybe I bite my fingernails because of genetics. Haha. But I'd put all those on about the same level of likelihood.

Steven Ravett Brown


Bob asked:

What is Objective Idealism? Is it considered a tenable position today?

Idealism is a complex subject with several facets; Objective Idealism, better known as Absolute Idealism, is one of them. To come to some understanding of what is a fairly obscure concept, it is perhaps advisable to briefly consider the development of idealism from Berkeley to Hegel. Very often when we refer to development in philosophy, it must not be regarded in an evolutionary sense; it simply means that someone has added a new idea to what has gone before, or maybe has substituted their own idea for the previous one, but none of it can be said to fully supersede what has gone before. Take for example the graded progress to Absolute Idealism, from Berkeley's Subjective Idealism, through Kant, Schiller, Fichte, Schelling, Schopenhauer, and finally the total Absolute in Hegel. No development has completely eliminated what has gone before, and we find that there are supporters of each variation of idealism who will not modify their enthusiasm for the variation they adopt. Hence, what we find is a range of alternative approaches to a difficult question: What is reality? Or: What really exists?

I obviously cannot go through a detailed history of the development of idealism here, but I will try to construct a brief indication of the general trend towards Absolutism. You can learn more about each of the philosophers mentioned and their ideas, by reading about them in a good encyclopedia of Western Philosophy.

Idealism is a term originating in the concept of ideas in the mind. Idealism does not quarrel with the naive view that material things exist; rather, it disagrees with the analysis of a material thing that many philosophers have offered, according to which the material world is wholly independent of minds. Berkeley asked how an observer who was aware of nothing but his own ideas could know anything about an external world. The situation is made more absurd when we realise that the senses can deceive us, i.e. a sense can present us with alternative ideas about which we have to rationalise to obtain what we might call the correct choice. As there is no way of proving the presence of an external material world, why should we presume there is such a presence? It is more likely that the only world we can justifiably accept is an internal world of ideas. Things that exist are things that are perceived; when no human mind is perceiving an object, we have to presume that it continues to exist because God is perceiving it.

Unlike Berkeley, Kant did not reject the notion of the existence of things outside the mind. However, he believed that we could have no direct access to what is there; all we can be aware of are representations by way of the senses, mere shadows or phenomena of the things that exist, which he called things in themselves. To make sense of the phenomena we receive, the mind adds a priori knowledge, knowledge in a way gifted to us by nature, to form mind constructs. Thus, the popular notion that the mind conforms to objects in the world is reversed, and, according to Kant, objects conform to the mind. The world out there is called the noumenal world; the things in themselves which constitute the noumenal world are thinkable but not knowable. Kant called this doctrine "transcendental idealism."

Fichte, though influenced by Kant, could not accept the notion of things in themselves. He asked how we could actually postulate hypotheses about a noumenal world that we knew nothing about, and for which we had no proof whatsoever that it existed at all. He decided that the noumenal world had to go; there could be no grounds for asserting something quite unknown, and no meaning in doing so. After this rejection we are left with just minds and objects of experience. Fichte developed the idea further by referring to two parts of mind, the I and the non-I; the I observes what goes on in the non-I, thus eliminating an outside objective world. The I is considered subjective and the non-I objective. The I is what the Greeks might have called the soul. So we have now entered what Fichte called "Absolute Idealism."

The development of absolute idealism proceeded through Schelling, who introduced a spiritual concept, to Schopenhauer, an atheist who considered the absolute to be the will, which he took to be the ultimate reality. Absolute idealism comes to fruition in Hegel. The absolute for Hegel was the Universal Mind, an interpersonal consciousness. Berkeleian subjective idealism and Kantian transcendental idealism construe reality in terms of the content of individual minds; absolute idealism, on the other hand, tends to construe it in terms of an interpersonal consciousness. The distinction between one 'self' and another tends to lapse, leading to a form of monism, according to which there is only one thing, the mind, divided up into appearances. All reality is in the mind; there is nothing outside it.

Complicated stuff, but I trust you will grasp the general idea of what absolute idealism is about. Yes, it is considered a tenable position today by some philosophers. In fact, idealism in general is experiencing a revival. Oddly enough it is receiving a boost from science, particularly physics, which no longer sees the world as a great machine or technical construction: many physicists now see it as a great 'thought'! Matter keeps disappearing and reappearing before their very eyes. Personally, I can only make sense of the world by way of the Kantian idea of mental constructs, but, like the absolutists, I find it difficult to conceive a noumenal world. Like Bradley, I am out on a limb with the notion of the mind contemplating itself, the real absolute!

John Brandon


Katleen asked:

What are the differences between art and science?

There are innumerable differences as well as similarities, so any answer (especially if kept short, as here) is bound to commit an injustice to one or the other. In other words, this is an issue one could not possibly address exhaustively in less than a few hundred pages. If therefore I frame an answer for you in terms of what our general intuitions would tell us anyway, this is merely to be understood as a beginning, as a kind of 'demarcation of competences'.

Fundamentally, then, science is concerned with knowledge. As a branch of philosophy (if I may push this antiquated barrel!) it occupies the branch called 'epistemology'. It is concerned, to the degree possible to us, with exact knowledge, and by extension with prediction on the basis of this knowledge of future trends among the objects studied. Thus the twig of physics on this branch seeks to ascertain, as exactly as possible, theoretical models of the interactions among atoms, electrons and other particles in order to acquire a pattern of understanding that relates to (say) the creation, movement and ultimately dissolution of galaxies, nebulae, stars, planets and (oh yes) things like atom bombs and similar benefits to mankind. The science of statistics is concerned with modelling large-scale trends, for example the incidence of motor car accidents in cities (this is of help to insurance companies), yet while such knowledge may be deficient in detail, there is a surprising agreement of its findings over the long run: so accurate in fact, that it is in its own way an exact science capable of making accurate predictions.

Art, on the other hand, is concerned with intuitive knowledge, most of which is of the kind that 'exact' science has no ready methodology to investigate. On this account it falls under the branch of 'aesthetics', a term which bears the meaning of 'what our senses communicate'. For example, human relations: love is indubitably a form of knowledge on a deeply intimate level between two persons, but it is not the kind that can be encoded in a scientific theory. But through the medium of art, we can become acquainted with it as a spiritual phenomenon (e.g. through music). This is perhaps an extreme example, but simpler ones merely reinforce the same point on their own different levels. Painting, poetry, drama are to an overwhelming extent concerned with exploring the meaning of the human estate, whether by way of private contemplation or public celebration. Now one big difference between art and science is, of course, in the 'objects' that are the outcome of their activities. In science it is a theory, a method, a technology; in art it is a 'work' which embodies some fragment of our intuitive knowledge and gives access to that knowledge by recourse to our aesthetic sensibilities.

But there is another difference, equally important: scientific knowledge is about given and largely impartial conditions and features of the world; whereas aesthetic knowledge is largely about 'meanings', which are creations of the human mind and superimposed on 'reality'. This means that an artist, whose vision or personal experience motivates him/her to make an effort to communicate it, will do so in a form and using such materials as are capable of carrying such a 'message', which is therefore inevitably an artefact to which we are sensitively and/or sensually responsive on an intuitive level. So this kind of knowledge is not only inexact, but also 'created' (cf. 'poesis', the Greek for creating); it is not found objectively in the world, but is added to the world as knowledge specifically human, by humans and for humans. We can see this impulse stirring in children when they 'pretend' that a piece of string is Mum; and anthropological research would confirm that this search for knowledge of humans by humans and for humans is a predilection that long preceded any impulse to extend the 'exact' knowledge by means of which we survive in and control the shape of our habitat.

Here is something to ponder: not a definitive answer (which can't be given anyway), but some thoughts that may lead you on to your own path of discovery. I'll conclude with an observation that may or may not have occurred to you, when you asked your question: namely that pursued in earnest, you might have posed a problem for yourself that could easily occupy you for the rest of your life!

Jürgen Lawrenz


Charlie asked:

I have a relative whose case, I think, is a question of morals and ethics. This is my sister, whose uncle fell very sick last year. She took care of him, as they've always been very close since her childhood. Just recently, the doctor told us that the elderly man was not going to make it through next month. The problem is that he left a will which states that he will leave all his fortune to research into diabetes (the disease from which he suffered). He has a pretty rich son, if you ask me, who has a business in Chicago and who rarely came to visit his father. The elderly man's condition has deteriorated over the past few days and he simply isn't thinking clearly. Just recently, he wanted to modify his will to leave all his fortune to his son! My sister's always been a kind-hearted person, but when it comes to the lives of hundreds, maybe thousands, of others, she would even sacrifice herself. My uncle could die any day and the will hasn't been modified. Should we go ahead and let the poor man make the change? I don't know; I'm all confused, let alone her.

Should you "let" the man make the change? What about respect for his freedom of choice? Is it not wrong to interfere with someone else's property? However, if he isn't thinking clearly it is unlikely that he will be legally allowed to make a change in his will.

Rachel Browne


Shady asked:

Hey, I was wondering about a couple philosophical questions:

For one, who has the right to tell someone else what to do? I mean regarding laws and rules. Also, I would like to hear a philosophical argument of an ongoing controversy. Any kind, but I'm tired of hearing of free will/ determinism, and proving the existence of God. Thankyou.

As far as your first question goes... first, I'm not sure what the word "right" means, especially in this context. But let's take a couple of scenarios relating to "rules".

Children: children are incompetent to deal with the world. Period. If you've seen a young child, then you know this point is not even worth discussion. Ok, so then their parents have the duty to guide them, and this includes, when necessary, telling them what to do. Now you might be saying, "ok, fine, but I'm not a child any more, I'm 12 (or 15 or 16... or whatever) now". Who is competent to judge your competence? If a 5-year old says that, you say, very gently, "yes, you're a big [boy or girl] now"... and continue telling them what to do, right? So when do you (the parent) stop? When you judge you can, gradually. When is that? Um... obviously I have no answer to that. That's something that has to be worked out, usually painfully, unfortunately.

Disabled: what about intellectually disabled people? What about emotionally disabled? We tell them what to do, right? As little as possible, but still that must be done to some extent.

Incompetent: what about when you're in a situation where something must be done but you don't know how to do it, or don't know well enough for that situation? Then hopefully there is someone around who will tell you what to do. And you'd better do it, or someone will die... if, say, you work in a hospital and a doctor is telling you what you must do. Or you have to survive somewhere and don't know how.

But, you say, these are extreme situations. Yes. But I'm using them to set up a baseline, so to speak. From that baseline, commands, advice, hints, etc... shade off to the other extreme where you are telling a child, for example, what to do. If you want some black and white solution here, forget it. Each situation has to be judged on its own merit.

I'm not going to tackle "laws". I see them as cultural or societal extensions of the above; but I'm sure others will have other points of view.

The second question: well, just think of how tired I am of that. Here's one source of questions:

Tye, M. Ten Problems of Consciousness: A Representational Theory of the Phenomenal Mind. Edited by H. Putnam and N. Block. 2nd ed, Representation and Mind. Cambridge, MA: The MIT Press, 1996.

You might also look at "the new problem of induction", and Goodman's exposition of that. A nasty problem.

There are also open problems dealing with, for example, essentialism. Are there essences, in an epistemological sense? A metaphysical sense? Believe it or not, this is a very interesting and important question which relates not only to epistemology but also to cognitive structures.

Philosophy of language: see the Pinker/ Langacker or Chomsky/ Lakoff positions: sophisticated nativism vs. sophisticated cognitive-developmental positions.

Philosophy of science: Kitcher vs., say, Lacan or even Derrida.

I mean, once you get past the basics... the necessary 3 to 5 years learning what to learn about the issues and how to learn about them, it gets very interesting and complicated. But you are wanting to do the equivalent of reading journals in mathematics or physics without learning the language and background. Or reading articles in genetics with a bare knowledge of what DNA is. You can't do it, sorry... there's background you simply have to know to understand, much less participate in, these issues. But. The issues are out there; they're just not easy to grasp, any more than technical issues in physics, genetics, or mathematics are easy for the layman to grasp.

Steven Ravett Brown


Jackie asked:

In what ways are Camus and Heidegger different?

Heidegger was much older than Camus, and he died in his bed, whereas Camus died when his car crashed into a tree. It is supposed that he had epileptic tendencies and suffered an attack just as he was driving.

Also, Heidegger was a German, whereas Camus was a Frenchman. This means they wrote their books in different languages, and although Camus was deeply influenced by Heidegger, he was very selective in what he took from his older contemporary. For example, one of Camus' ideas, which he followed through in a book entitled The Myth of Sisyphus, is that the fundamental question of philosophy is suicide, i.e. whether we can justify, once we have realised the logical absurdity of living a human life, not making an end of it. I'm not aware of Heidegger's response, but I think he might have shaken his head in disbelief at such naivety.

Camus, of course, also wrote plays and novels. Heidegger never wrote anything other than philosophy books and papers. That's probably why Camus got a Nobel prize and Heidegger did not. In this context it might interest you that Bertrand Russell also got his Nobel for literature, not philosophy. Now you know at least what's important in the world of Messrs Nobel & Co.

I could go on for a while yet, but in the end there is no substitute for actually saying 'hello' to Camus and Heidegger. The book I just referred to is less than 200 pages and not very difficult (if you can read the arts column in your weekend newspaper, then you can read Camus). Heidegger is difficult, true enough: but you could make a start by reading his 50-page 'Letter on Humanism', and that would give you a very good perspective on what Heidegger is all about. Then you would be very well placed to figure out the difference in their philosophies, and it would save you wading through at least the same number of pages in the secondary literature, which of course can't answer that question any better than doing what I have just recommended.

Jürgen Lawrenz


Michael asked:

If as yet we have been unable to answer the question "Why are we here?" with any certainty, surely the first thing to do is to work out why we are NOT here? It would seem logical that it is not to fight over land, to create weapons of mass destruction, to allow fellow humans to live in dire poverty and obscene conditions when other people stay in palaces. Surely it is better, at least more sensible, to try to work together. So my question is this:

Why when we are destroying ourselves and our home do we continue instead of stopping and helping each other as one race not as various races?

Becoming one race can be realized in at least 2 ways.

1. Everybody getting the same system of thought (like the U.S. tries in Iraq).

2. Making a nice cocktail of views, and cherishing every different thought.

Realize that a small percentage of terrorism is caused simply by some people having a different view and not really being accepted. Should they instead stop fighting for their rights and disappear into the silent masses?

As long as the dominant view behaves in a 'religious' (dogmatic) way there'll be so-called terrorists. And because those 'deviant' people are often quite intelligent, they'll pose a threat. They are perfectly willing to stop fighting and start fair negotiations, but generally don't get a real chance.

Seen in evolutionary terms, the powerful win and the others disappear. But now and then on earth there is an ice age; otherwise dinosaurs would still be in power and would use us humans as a snack.

Henk Tuten


Edaw asked:

Are religious people the only ones that use the term 'evil'?

A lot of people who are not religious use the term "evil" since it has a descriptive use for the extremely malicious.

It is strange that a lot of moral philosophy is about what is good, while evil has been largely ignored. However, some philosophers take the problem of evil seriously, such as Raimond Gaita in Good and Evil: An Absolute Conception.

Rachel Browne


Emma asked:

Does Monday cause Tuesday?

Well, first of all, you can see that the names of the days could be anything or nothing, so they are irrelevant. Second, we humans divide our time by periods of light and dark and label those periods. We don't have to do that; we could just count the time from, say, full moon to full moon in tenths or hundredths of that time, if we wanted to.

Third, what causes the periods of light and dark we call "day" and "night"? The fact that the source of light for the earth is the sun, basically shining like a spotlight on the earth, while the earth rotates on its axis (and it also revolves around the sun, but that's irrelevant to this). Ok? What causes the sun to burn? Hydrogen fusion reactions, basically... the sun is an enormous H-bomb which doesn't explode outward because its own gravity holds it together. What causes the earth to rotate? Angular momentum resulting from its formation from a cloud of rotating particles; in other words, it got going some four and a half billion years ago and nothing has hit it hard enough to stop it; it just goes on momentum, like a gyroscope. Ok so far? Now then, you tell me... what causes the day to progress to night? Nothing, really, just the system sitting there, burning and turning. Does Monday cause Tuesday? That's like asking whether the shadow on a turning ball causes the light next to it, right?

Steven Ravett Brown

If you take a Humean view of causation (David Hume, Treatise of Human Nature), then the statement 'x causes y' is analysed along the following lines:

At all places and at all times, events of type X are followed by events of type Y, and x is an event of type X, and y is an event of type Y.

A whole philosophical industry has grown up trying to patch the holes in this deceptively simple analysis. An obvious objection is the one that you have raised. Tuesday always follows Monday. Does that mean, as Hume's analysis implies, that Monday causes Tuesday? (Of course, names are arbitrary, but one could just as easily have asked, 'Does today cause tomorrow?', or, more generally, 'Does the time period t to t+1 cause the time period t+1 to t+2?')

However, it can be argued that in a sense it is true that Monday causes Tuesday (or today causes tomorrow), because — at least, if you accept determinism — everything that happens tomorrow is caused by everything that happens today, if by 'everything' we mean the sum total of events occurring in a given time period. But it is a strange way of talking, and not at all what we normally mean when we say that one thing causes something else.

However, if you are a Humean about causation, and you believe in determinism, then, yes, Monday does cause Tuesday.

Geoffrey Klempner


Malcolm asked:

Can someone explain to me the necessary and sufficient cause distinction?

Malcolm also asked:

I don't know if this is a genuine question, but I am interested to know the ontological status of mind. It can't seem to me to be the same as physical matter (more similar to numbers perhaps). I know Descartes calls it a distinct substance, and I think Sartre argues it is not a substance.

Incidentally I am also interested to know what in Sartre is the ontology of art, and why does Roquetin in Nausea think it can cure him of the sin of existing?

Put very simply: a necessary and sufficient cause is one where the effect could not arise without that cause and where that cause itself is all that's needed to produce the effect. For example, a necessary and sufficient cause for you to exist is to have a male and female parent. At present some arguments are being bandied about, in relation to cloning, as to whether this is still true, but I would be content to wait upon proof to the contrary being delivered (and I'm not convinced that cloning proves anything at all).

From this at any rate you can deduce that there are innumerable causes without a one-to-one relation to effects, because additional causes are required: the wheels turning in a motor car are produced by the cause 'combustion of gasoline', but this is neither necessary, because wheels may be turned by other causes (you can push the car), nor sufficient, because the gas released in the combustion must first push a piston, which in turn must push a lever, which in turn etc. etc. So a necessary and sufficient cause could be described using other words (synonyms) such as 'compulsory and comprehensive' or 'essential and consummate' with roughly the same meaning. The virtue of using the standard phrasing has to do with adequacy of verbal expression: and we use this nomenclature from Leibnitz because it is the most precise way of enunciating the principle (Leibnitz had an inimitable gift for preciseness in such matters; indeed Ortega pointed out that nine of the ten principal concepts of philosophy were articulated by Leibnitz in the form in which we use them today).

On the ontological status of the mind I would have to say, it has none. Irrespective of what may have been Descartes' or Sartre's opinion on the matter, we are not in a position, either philosophically or scientifically, to claim such a status for the mind, for the simple reason that anyone with a converse opinion would have no difficulty punching holes into any argument we can propose for it. Now I put it this way for reasons of objectivity and in contradiction to my own beliefs: for I am myself satisfied that mind is ontologically distinct. But I cannot overlook on that account that an ontology of mind is a matter of persuasion, not proof. That 'mind' exists we can assert without fear of denial, but what it is is an indispensable part of any ontological argument, and on this we haven't got much of a clue. For Descartes as much as for Sartre, the ontology of mind is a metaphysical concept, and accordingly whatever satisfaction we may gain from the study of their writings (e.g. the cogito statement or the 'in-itself' and 'for-itself' principle of Sartre), a residue of ontological uncertainty remains because we do not possess a platform from which to judge adequately whether mind and brain are to be regarded as one or two (or even more) entities. This is where the old metaphysical claim that 'a predicate does not confer existence' retains its force. Until further notice.

You will forgive me, I hope, if I sidestep Sartre's ideas of the ontology of art and leave this to someone else to attend to. From my perspective, what Sartre contributes to this subject is muddleheaded stuff; but this judgement relies on a conception of art which I defend, that goes against the grain of what he and most writers on aesthetics conceive of as the relation between art and its objects which (in my humble opinion) is incapable of resolving the dilemma of what kind of an object a work of art is, ontologically speaking. (If you happen to have read Danto's Transfiguration of the Commonplace, you might be aware that this confusion is endemic and getting worse instead of better. Again, if you were to read Heidegger's 'Origin of the Work of Art' you might find that Sartre is off on a tangent that somehow I don't feel Heidegger would approve of. But, sorry, this is where I'm going to leave it.)

Jürgen Lawrenz

Interesting question, Malcolm. Have you looked at materialist theories of mind (such as Armstrong's A Materialist Theory of Mind)? Armstrong came up with the "mind-brain identity theory", in which he stated that the human mind and the human brain are identical. By this he did not mean that they share the same properties (like identical twins) but that they are in fact exactly the same thing, in much the same way as "John Howard" and "Australia's current Prime Minister" are identical.

Lyn Renwood


Christopher asked:

Define power. Is it merely having the ability to exercise control, or is the meaning of power deeper? If someone betrays himself and his/her ethics in order to achieve "power", have they really gained any power? Macbeth, for example. He betrayed his ethics in order to achieve "power" and become King. Yet, did he really gain this power that he sought, or did he lose it in the process of attempting to attain it?

Nietzsche already said that all life is about expressing the will to power. I would define it as the ability to express your own creativity without taking too much care of others. So in essence it must be seen as a purely evolutionary force, but often only its inherent 'negative' side is shown in the spotlight (robbing others of THEIR freedom).

Sorry, the last time I read Macbeth was in high school, and since then I have only seen it several times on television. So no impressive quotations. But I think Macbeth wanted to have his cake and eat it too. He wanted the power to rule, but he did not want blood on his hands. What he asked for was power. This he got, but only by getting his hands dirty. So he was tricked by the dual shape of power. This, I think, is what Shakespeare wanted to show.

Henk Tuten


Roy asked:

Is there a difference between a "Fact" and a "truth"? I realize that some people use the terms interchangeably, but I wondered if there was a logically necessary distinction. I reasoned that the difference between them is that "Facts" are always true. Truths are temporary. For example, "George W. Bush is President of the United States" is true only within the length of his term (let's say 4 years). To make the same statement 8 years from now the truth value will be false. But, "George W. Bush was elected president of the United States in 2001" will forever be true. Is my distinction between "Facts" and "Truth" reasonable or faulty?

I think you are wrong in what you're saying. Right now, "George W. Bush is President of the United States" is a fact. It is also true. In, say, 10 years, it will be neither. So that example is not correct.

One way to test a statement is to put it through logical variations:

Statement: If P then Q.

Converse: If Q then P, which is not logically equivalent to the first statement.

Inverse: If not P then not Q, which again is not logically equivalent to the first statement.

Contrapositive: If not Q then not P, which IS logically equivalent to the first statement.
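These equivalences can be checked mechanically with a truth table. Here is a minimal sketch in Python (my own illustration; the function `implies` is simply a label for the material conditional, not anything from the original discussion):

```python
# Enumerate all truth assignments for P and Q and compare the four
# variations of "If P then Q". Only the contrapositive matches in every row.

def implies(a, b):
    # Material conditional: "if a then b" is false only when a is true and b is false.
    return (not a) or b

statement, converse, inverse, contrapositive = [], [], [], []

for p in (False, True):
    for q in (False, True):
        statement.append(implies(p, q))
        converse.append(implies(q, p))
        inverse.append(implies(not p, not q))
        contrapositive.append(implies(not q, not p))

print(statement == contrapositive)  # True: logically equivalent
print(statement == converse)        # False: not equivalent
print(statement == inverse)         # False: not equivalent
```

Note, incidentally, that the converse and the inverse come out equivalent to each other, since each is the contrapositive of the other.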


So let's look at the statement: IF something is a fact, THEN it is true.

What do we get out of that? Well, if that is true, then the contrapositive is true: IF something is NOT true, THEN it is NOT a fact. Let's test that. "Unicorns exist" is not true. It's also not a fact, right? Or is it? We know that unicorns do not exist, at least in any normal sense of that term. Is "unicorns exist" a fact? No. So the contrapositive works.

Ok... the converse: IF something is true, THEN it is a fact. Well, we might make a distinction here between abstract mathematical propositions versus statements about the "real world" (and I'm not going to deal here with making that latter term clear). That is, if we say something like "the square of the hypotenuse of a right triangle is equal to the sum of the squares of the other two sides", which is a true statement, would we call that a "fact"? After all, there is no real triangle for which this is exactly true; it's only really true for ideal right triangles. And so it cannot be a fact. I could find, I'm sure, even more abstract statements in higher mathematics which were true but not facts... and indeed one can fairly easily create logical systems in which true statements, statements which were consistent with the logic, and provable within the system, would be false in the real world, i.e., true but not factual. Thus I could say, "there is a world where like electric charges attract and unlike repel". It would follow, then, that atoms, etc., could not consist of clouds of electrons around nuclei... since the electrons would attract each other and the whole atom would collapse. Given the assumption, those are true conclusions. But there are no facts there.

But if that's the distinction we can make, then we must say that a "true" statement refers to any correct statement, while a fact refers to any correct statement about the real world. That seems a reasonable distinction to me... One could then get into some very bizarre discussions about what is real and what isn't... which as I say I'm not going to touch here.

But that should give you a starting point, anyway, for thinking about this kind of issue. There is a literature on "conceivability" and on "contingency" and on "counterfactuals" which you might look at, although it's not easy reading.
Steven Ravett Brown

The word 'fact' derives from the Latin and has a very precise meaning (a meaning which in our modern languages tends to be somewhat obscured, simply from habits of usage). It means 'something that actually occurred.' Philosophically one may include objects existing in that definition, because it is legitimate to speak of objects as 'occurrences' in the sense that they are local concentrations of the 'event spectrum' of the universe.

In the very narrow and limited definition of truth that applies, say, in information technology, where a value may be deposited in a memory site (TRUE/FALSE), this 'factuality' becomes a purely operative mode. The system containing those values does not 'know' whether a value of 'true' is truly true. There is some similarity here to the old form of syllogisms, where you can put up a nonsense maxim and have the syllogism run through to its nonsense conclusion; for as long as the logic of the operation is satisfied, no hiccup occurs. Accordingly (in syllogisms) it is the duty of the philosopher to ensure that the maxim is (as they used to call it) a 'self-evident truth', such as, for example, 'Socrates is a man', and then go on from there. But of course, humans can be very simple-minded; and especially in the middle ages, many 'self-evident truths' were put up for syllogistic reasoning of which one might say that they were very far from being self-evident. Now in relation to information processing systems, similar rules hold: an intelligent agent is required to attend to the 'factuality' of the truth conditions being tested. Clearly if a value of True is deposited in a memory location, this value says nothing whatever about the truth or falsity of the condition which led to that value being deposited; for as in the case of syllogisms, the device is responsible only to the operative logic, not to its factuality or truth.
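The point that such a device answers only to its operative logic, never to factuality, can be sketched in a toy model (my own illustration, representing categories as Python sets; none of this comes from any real inference engine):

```python
# A syllogism checker that validates only the form, never the truth of
# its premises. A toy model: a premise "All A are B" is the subset test A <= B.

def valid_barbara(a, b, c):
    """If all A are B and all B are C, conclude all A are C (the 'Barbara' form)."""
    if a <= b and b <= c:   # premises hold in this model
        return a <= c       # conclusion, guaranteed by transitivity of subset
    return None             # premises not satisfied; nothing follows

men     = {"Socrates", "Plato"}
mortals = {"Socrates", "Plato", "Fido"}
things  = {"Socrates", "Plato", "Fido", "a rock"}

# Sound instance: true premises, true conclusion.
print(valid_barbara(men, mortals, things))  # True

# A nonsense maxim: in a made-up model where all men are fish and all
# fish fly, the same machinery runs through to "all men fly" without a hiccup.
fish   = {"Socrates", "Plato", "a trout"}
fliers = {"Socrates", "Plato", "a trout", "a sparrow"}
print(valid_barbara(men, fish, fliers))     # True: valid form, absurd content
```

The machinery accepts the 'all men are fish' maxim exactly as readily as 'all men are mortal'; ensuring that the maxims are true remains the duty of the intelligent agent outside the system.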

From this, at any rate, you will deduce the one important criterion which separates fact from truth. A fact is an occurrence that may occur without any human agent knowing about it; but if a human agent knows about it, then that agent is responsible for assigning a 'value' to it, e.g. by reporting it. If the report stands up to scrutiny (for example, if it concerns an earthquake that can be independently checked), then the fact and the truth coincide (as in your example of President Bush) and any claim to the contrary will then bear the stigma of 'untruth'. Other conditions may prevail to qualify that truth. The witness may have confused the date on which the event occurred, but this only means that an error corrupted some aspect of that factual truth without impairing its essential content. A lot of history writing is concerned with just such issues, and historians are constantly required to evaluate testimony which may be essentially true, but deficient in one or another facet of this truth (e.g. the reigns of Egyptian pharaohs, which often overlap because apparently the Egyptians did not always rigorously separate the life span and the actual reign of a king).

But many truth situations in human commerce relate to truth which is not tied to events and the testimony which confirms their factuality. It is fairly clear, without delving deeply into the philosophical merits of 'truth', that FACTUAL TRUTH is always conditional. In your example of President Bush, the factual truth about his term of office can only be established when it actually ends; any statement made before that event is not an 'untruth', nor even an 'error', but just a verbal utterance without meaning content. However, a fiction writer may, for purposes of their own, pretend that Bush lived to the age of 120 and remained President for 50 years. This is where the concept of 'truth' becomes difficult to handle. The writer may be writing before or after the President's death; in either case the improbability of this scenario is manifest; yet if the work we are discussing has claims to be regarded as a great work of art, it may show a 'truth content' which transcends the simple fact-truth relation that I've discussed so far. In other words, 'human truth' need not rely on factuality, but does in fact have much more stringent (ethically determined) values associated with it. The example I've just used recurs in innumerable instances throughout literature, art, opera etc. What merit of truth is contained in Shakespeare's Macbeth? Clearly the yardstick of factuality is inappropriate here. But you may hear it said quite often, about such figures as Macbeth, that the 'truth' about Macbeth, even though it may be 'false' and would be recorded as 'false' in a time machine, is 'true' in a more humanly relevant context. There is an old adage which occasionally pops up in contexts such as these: 'Even if the deeds attributed to this person were never performed, they should have been, because they reflect some intrinsic aspect of that person's character.'

This is the point at which the philosophical concept of 'truth' takes over. Just a few examples:

Truth is profoundly involved in the concept of justice.

Truth has a bearing on aesthetics, i.e. in the relation between art and a very dimly perceived ('inarticulate') truth content.

Truth and morality are inseparably entwined in religious and social interactions.

Truth is ingrained in something we call 'character'. What a person is, deep down.

Truth and factuality may collide in ethical situations: such as a doctor diagnosing terminal cancer and being of two minds whether or not to communicate this to the victim. Here the 'truth' is not (as one might suppose) the illness or its terminal conditions (they're the 'facts'), but the attitude of the doctor and/or those whom he/she consults about the merit of communicating the diagnosis.

There is no need to go on, because your question is limited to what I have discussed above, i.e. the difference between fact and truth. From this, you should take away the fairly important distinction between the two, and I hope that the outcome is a 'truth' in itself, namely that the concept of truth is considerably wider than the concept of fact; that indeed to some extent it includes the concept of factuality as one of its aspects. But, essentially, that 'truth' relates in the first instance to the human agent, without whom there would not be such a concept; and that accordingly it relates most deeply to human issues, where (unlike the fact-truth relation with its essentially linear logic) the concept shows up in its full complexity.

Jürgen Lawrenz

The sentence 'GWB is President of the US' is only true during his term. But the fact that GWB is the president of the US is only a fact during his term. Making both 'always true' or 'always a fact' will involve incorporating temporal notions: it's always true that 'GWB was elected in 2001', but similarly, it's always a fact that GWB was elected in 2001. More technically, Tarski's disquotation schema has it that:

DS: 'P' is true if and only if P

For example, 'snow is white' is true if and only if snow is white. Hence, there is a direct link between facts and truths. Whenever you have a truth you have a fact, and vice versa. If you still want to make the case that there is a difference, then I guess an intuitive difference might be that the truth predicate only applies to sentences, whereas facts are things 'in the world'. You could also say that facts are what make sentences true. Facts, in that sense, would be the truth-makers for the true sentences.
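As a purely illustrative aside (not part of the answer's argument): the disquotational link between truths and facts can be sketched in a few lines of Python, where a dictionary of 'facts' stands in for the world and a truth predicate applies to quoted sentences. The names `WORLD` and `is_true` are my own, hypothetical choices, and this toy obviously flattens away everything philosophically interesting about the schema.

```python
# A toy model of the disquotation schema: facts are entries "in the
# world"; the truth predicate applies to (quoted) sentences.
WORLD = {
    "snow is white": True,   # a fact that obtains
    "grass is red": False,   # a state of affairs that does not obtain
}

def is_true(sentence: str) -> bool:
    """'P' is true if and only if P obtains in the world."""
    return WORLD.get(sentence, False)

# The biconditional holds by construction: 'snow is white' is true
# if and only if snow is white (i.e. the corresponding fact obtains).
assert is_true("snow is white") == WORLD["snow is white"]
```

Here the dictionary plays the role of the truth-makers: a sentence is true exactly when the corresponding fact is in the world, which is just the direct fact-truth link the answer describes.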

Rich Woodward


Ian asked:

Some years ago I was reading the London Evening Standard on the tube train and stumbled upon an article on education by A.J. Ayer in which he said, "All education is indoctrination." It struck me as an absurd thing to say then and still does. But is there more to this assertion of Ayer's than meets my jaundiced eye? And what could it be?

Yes it is absurd and no it is not.

It is absurd, because education is inevitable, so it cannot all be meant as indoctrination. What is more, education is needed: it is the function in nature of all parents to teach their offspring to survive.

It is true, because without intending to, all teachers become little gods to their pupils. They can't help teaching their pupils things that are only tradition. That's why, after some time, pupils must go their own way. Only pure computers can be given, at 'birth', most of the knowledge they need to survive; and even they must afterwards learn from experience to improve.

Henk Tuten

Two meanings of "indoctrination" are given in my dictionary, as follows:

1: to instruct especially in fundamentals or rudiments: TEACH

2: to imbue with a usually partisan or sectarian opinion, point of view, or principle

In meaning 1, "indoctrination" is nearly synonymous with "education", except that it has a more confined scope than the latter since it concerns the elements of a particular subject, as in "children are indoctrinated into the fundamentals of arithmetic."

But, in meaning 2, of course, the term to educate is very different from to indoctrinate, since education is supposed to present students with information and ideas without any attempt to present them with any partisan or sectarian opinion.

Ayer, it seems to me, was playing on these two meanings so as to get across his own view about what was actually going on in the schools, as opposed to what he thought the schools should be doing. They were indoctrinating rather than educating, which was what they ought to have been doing. And so, in a way, Ayer was himself indoctrinating the readers of The Evening Standard rather than educating them. Of course, I never read the article you are referring to, so I can't know that what I say above is true.

Philosophers often use the device of saying something paradoxical in order to emphasize a particular viewpoint on a matter. In Plato's Republic, for instance, Thrasymachus tells us "Justice is in the interest of the stronger." Now, that is exactly what justice is not. But, by putting it that way, Thrasymachus gives us his view of how, in fact, the notion of justice actually operates in society.

Ayer, I think was doing very much the same thing, only, of course, concerning his own, perhaps jaundiced view of how people are educated in Britain. He might be understood as saying, "We are supposed to be educating people, but what we are doing is indoctrinating them."

Ken Stern

Just what do you mean by the term "indoctrination"? What did Ayer mean? I haven't read the article, but given that he is a philosopher, he must have defined that term somewhere in the paper. Did he mean what you mean by it?

That's point one. Point two is this: you're a child learning, say, mathematics. Now, mathematics, real mathematics, is not addition, subtraction, etc. The closest one comes to doing what mathematicians do, while one is in school, is when one learns to do proofs in geometry. Mathematicians do proofs, for the most part, in extremely difficult conceptual areas... and attempt to think up new things to prove. Now. What must a child learn, and how must they learn it, in order to even get to a point where doing mathematics is at all conceivable? They must learn arithmetic. Can they learn it by doing proofs, i.e., by proving, say, that 2+3 = 5? No, of course not. So they must first learn, by memory, what addition is. Then how to add. Then facts like 2+3 = 5. And on, and on. Then at some point perhaps they will find that they want to and have the ability to think of tentative mathematical truths, and prove them correct or not. So the first stage is memorization of the basics.

Do you see where I'm going with this? The child is, effectively, indoctrinated with the basics of mathematics, i.e., they must learn something, and accept it, without fully understanding it or being able to question it. How else would you proceed? This is true for pretty much all fields, even to a certain extent for philosophy... although that latter might possibly be the exception, if one were of a sufficiently Socratic bent. But even there, you will find tricks in the Dialogues which amount to the same thing. In physics, one learns about forces, vectors, electromagnetism, and so forth, without being able to question them. In medicine... etc., etc.

However, once one gets past the point of learning the basics, then one can start to question what one is learning; one has the tools to investigate why 2+3 = 5, and so forth. So I would agree with Ayer up to a point. Past that point, I would not agree. The difficulty, of course, is determining how far one can go before one starts playing with what one is learning. To read one must be "indoctrinated" with the alphabet, with grammar, with a basic vocabulary... and then one can make up one's own words... in the proper context. That context comes at different times for different people, depending on ability, interest, etc. And of course there are cross-influences between some fields that enable one to immediately question something in one if one has already learned another.

Steven Ravett Brown


Pam asked:

I was wondering what kind of books (and I mean the titles) Socrates wrote, and how did he teach his students?

Easy question, Pam. Socrates never wrote anything down. The only reason we know what he said to his students is because Plato (a philosopher in his own right but also a pupil of Socrates) wrote several works. Many were written in so-called "Socratic dialogue" form, meant to mimic the style Socrates used when he taught. Socrates tended to wander around the market places in Athens and went up to people asking them difficult questions, such as, "What is happiness?" He also had a school. Socratic dialogues involved questions and answers during which Socrates usually managed to have his adversary contradict themselves and prove his point. It's worth your reading sections from Plato's Gorgias to get an idea of how Socrates taught. (Besides, it's on the yr 12 reading list for next year!) You could also check out Bryan Magee's Story of Philosophy (a great book, costs about 30 dollars, but covers heaps of history and has amazing illustrations) and/or Jostein Gaarder's Sophie's World — he talks about Socrates at length.

I teach year 11 and year 12 philosophy and you can email me directly if you like. I am impressed that you are trying to do the work yourself rather than just posting your essay topic and hoping someone else will write it for you. Are you in Victoria? Would your school be interested in corresponding with another year 11 class?

Lyn Renwood

Socrates didn't write. Plato wrote books in which he depicted Socrates as a teacher who taught by discussing philosophical problems, questioning and trying to bring people to think for themselves. Look up Plato's early dialogues. In his later works he started to develop his own philosophy.

Rachel Browne


John asked:

Should a philosopher always take the most clear and direct approach when writing a piece of philosophy? Now I've given this some thought and it seems to me that those who insist on complete clarity at all times are oversimplifying the issue, especially when it comes to the question of interpreting the thoughts of other philosophers.

For example, how would one go about giving an accurate account of Heidegger's Being and Time if it's insisted that this account must be explicated in everyday terms? To explicate Heidegger in 'everyday terms' would do a great disservice to his thought, and to those trying to get an accurate understanding of his thought. If one is serious about explicating the ideas of a particular philosopher, then one must employ the technical terms used by that philosopher, for otherwise philosophers' arguments get leveled down to a vague semblance of their original form. Should one always try to define these technical terms? If one is writing a serious philosophical essay on a particular philosopher, shouldn't one assume that the potential readers of this essay will have some understanding of the philosopher the essay is about? Or should one simply assume that everyone is completely ignorant of the philosopher in question? Should one be giving a primer on philosophy each time one decides to undertake a philosophical project?

Furthermore, it is my contention that, if you're after a specific effect, the only way to get that effect is to eschew the notion of complete clarity. Let us take the topic of aesthetics as an example. Often I find that if I want to give a non-reductive account of the aesthetic experience, I actually have to perform an action in my writing that produces an aesthetic effect. I could ramble on and on about poetry in a style as transparent as Ayer's, but I would fail to capture the essence of the poetic. Now you may insist that my argument is fallacious if I can't give a completely transparent account of the essence of the poetic, but I think this is false because, if one can actually write poetically about poetry, then this serves as a more convincing argument than simple logical deduction. Logical deduction understands nothing of the language of poetry, but you can be sure that poetry understands logical deduction perfectly well. Now this is not to say that logic is unimportant. That would be an absurd statement for a philosopher to make. But logic can only reach a certain point in explicating the poetic, and once this point has been reached, we are left with a remainder: the quiddity of the poetic. To reach this zone it's necessary for one to ignore the limitations of strict logical argument, and proceed with a performance of the poetic.

Lastly, there are many great philosophers who are not 'clear and direct'. How 'clear and direct' is Kant's Critique of Pure Reason, or Hegel's Phenomenology of Spirit, or Heidegger's Being and Time, or Derrida's Of Grammatology, or Nietzsche's Thus Spoke Zarathustra? Should we ignore Sartre because of statements like, "I am what I'm not, and I'm not what I am"? Must all philosophy be judged under the doctrines of the Anglo-American tradition of complete clarity?

Your question is very pertinent and well argued, and you have my complete sympathy. Somewhere in his lecture cycle on 'Philosophical Terminology', Adorno takes Wittgenstein to task for his assertion that we shouldn't and cannot talk about matters that we know nothing about: it is precisely the office of philosophy to do that, Adorno retorts. Language is an extremely imprecise communicator of almost everything of value to human beings (that was one of Wittgenstein's points), but this only means that in writing down a possibly very complex argument or raising issues that are new to the philosophical vocabulary, a philosopher may find him/herself unable to express what needs to be said in 'clear' prose. The essence of this matter is that it is part of the human equation to understand very well, in non-lingual terms, many complex forms of communication (e.g. symbols) which are also difficult or impossible to decompose into plain statements; and when philosophers write highly convoluted arguments, invent exotic nomenclatures or implicitly redefine standard expressions to suit themselves, this is often an appeal to the intuitions of their readers to fill the comprehensibility gaps by marshalling their own imaginations. Under these terms, philosophy can become a creative exercise not only for philosophers, but for their readers as well.

This is not to say that clear writing is not a desideratum, ultimately. If you had the chance to ask Hegel, he would unquestionably agree. No-one could have been more sensitive to the deficiencies of his diction than the man himself; but he had important things to say that he simply found himself unable to frame in 'clear prose'; he was always wrestling with language, like Jacob with the Lord's angel, in the service of precision of utterance, and came out of the fray somewhat bruised and dishevelled. Indeed, of Kant it is well known that he expressed the sentiment that he was forced to leave elegance to his tailor, because he simply lacked the time to polish his text. Consequently it is not a valid counter-argument that men like Descartes or Nietzsche or Santayana wrote in prose to match the best of their respective literary languages. C'est le métier. What I mean by this is: any typical sample of 100 books on Descartes would be devoted to precisely the same task as any sample of 100 books on Hegel: elucidating the meaning of the authors. But didn't Descartes write 'clear and simple prose'? Well, yes. But so did Hegel, on his own terms. For although Descartes might be more readily served up in the Sunday Literary Supplement, what he really meant is no simpler to extract from his texts than anything Hegel had to say. In a word, until somebody takes up Leibniz's idea of a characteristica universalis and develops it into a richer as well as more precise communications vehicle than plain language, we're stuck with what we've got. So your point is well taken.

Jürgen Lawrenz

First question: Yes, a philosopher should aim for clarity, but should not accept absurd terms. 'Complete' clarity doesn't necessarily mean oversimplifying, though demands for brevity sometimes do. When clarity seems awfully difficult, that only means you don't master the material. My personal experience is that anything you understand really through and through can be explained in a few pages (or fewer). If you can't explain Heidegger in common words, then you don't really understand his point (replace 'Heidegger' with any name). I don't mean Heidegger's mathematical views, but his philosophical ideas.

I'll explain: in SF movies, the subject of a whole formal philosophy book is sometimes treated in a few sentences; not because of an extremely clever text, but because of the context in which the sentence is used (making use of the visual power of movies).

Second question: Yes, focus is always useful.

Third question: Be careful to accuse other philosophers of unclarity. Consider the time in which their article was written.

My experience is that, for instance, Kant seems unclear at present (he was clearly a product of his time), BUT, considering his circumstances, he is quite understandable. It is only that his ideas have in the meantime been much improved upon. Nietzsche's Zarathustra is, on close reading, very clear. I even wrote a summary in which every chapter takes only a few sentences (if you're really interested I can give you the internet address). Mind that explaining something you only half understand takes a lot of words.

If it is only Anglo-Saxon to demand 'complete clarity', then there surely are other ways to look at it. No harm meant, but that demand is slightly arrogant.

Remember, things are as clear as your own eyes see them.

Henk Tuten

Since an aesthetic experience is an experience then I agree that you cannot capture it without giving an example. Furthermore, that example is likely to be a philosophical poem and so logic, I agree, would not be very important. But a logical poem might be quite fun.

Translation of technical terms into everyday language is an attempt to understand. You can assume that potential readers know something about the philosopher you are writing about, but it depends on the level of the essay. If it is an undergraduate essay you have to show that you understand the philosopher. If it is not, it is still a good idea to explicate the ideas of the philosopher since this allows readers to know whether you have the same interpretation.

It is the practice of showing that you understand which leads to the Anglo-Saxon requirement for clarity.

Something is always lost in translation, even if it is just the tone of the original philosopher, but it would be very restrictive if this were to bother philosophers too much!

Rachel Browne


Karen asked:

I am new to philosophy, I am interested in Kant's philosophy, I have had a question to ask about Kant's theory.

Kant (as far as I understand) says there is the thing in itself, and the world of phenomena, the world we experience; but the two worlds are not separate entities, they are one world: we understand the thing in itself as the phenomenal world.

But who are we? Aren't we (in reality) in the world of the thing in itself, which is not temporal? How can there be misunderstanding and misinterpretation if there is no time? And if time is said to be an illusion, how can such a non-static illusion be produced from a static reality (the thing in itself)? Can it be said that it is produced from the world of the thing in itself? If it is said to be non-temporal then it can produce nothing. What relation is there between the phenomenal world and the world in itself? Misinterpretation? How can misinterpretation exist without time?

They are in a sense two worlds, the noumenal and the phenomenal... but in a very particular sense, not what you'd expect. According to Kant, and I'm going to compress quite a bit into a teeny explanation, we construct the world we see, the phenomenal, employing "schemas" and what might be termed "built-in" parameters ("forms", "pure intuition" and "a priori concepts"). Two of the most fundamental parameters that we are, in effect, hard-wired to use, in constructing the phenomenal world, are space and time. But when I say "constructing", I'm using that term in a particular way... really, a better term might be "understanding"... but that isn't really correct either; it's not like our conscious efforts at understanding. "Structuring" is really best... but that's such a vague term... I'll use "understanding", OK? With the proviso that what I'm talking about is a kind of unconscious meaning of that term. So... we "understand" the noumenal world through those parameters. But we do not know that the noumenal world does, or does not, have comparable or even identical dimensions of space and time. In fact, to speak of the noumenon as being "atemporal" is to employ our limited parameter of time to describe something which we cannot know or describe in any other way. So saying the noumenon is "beyond time" or "atemporal" or that time is an "illusion" is to use the limited palette, in effect, that we have to work with to describe something which really necessitates another type of description. But we do not, indeed cannot, know what that type is, because of our limitations.

"We" are of the noumenon... but that doesn't mean we understand the noumenon directly. Kant's argument is very long and complicated... and I simply don't know how to summarize it all for you here. Stating the conclusions doesn't summarize the argument. I guess you could put it this way: we make mistakes about reality, and we clearly have many types of incomplete knowledge of reality. So there must be a disconnect of some sort between ourselves and reality. OK? Now, given that, what can we say about what we do know and how we know it? Well, we then have to talk about the kinds of things we have to do with whatever connection we do have to reality. But we don't even know what kind of knowledge we have of reality, do we? So what Kant does is try to figure out, in the most general way, what we can know. Starting from the assumption that we are affected by reality (we "receive impressions"), he attempts to understand how we understand those impressions. There is a difference, very important to Kant, between receiving impressions and understanding that those impressions are objects, events, etc., etc. (which is what I mean above by "understand"). And if you think about it, you will see that even if we are direct receptors, in effect, of reality, the noumenon, that reception, those impressions, without some sort of understanding, i.e., classification, organization, structuring... would just be chaos.

So what Kant is saying is this: how do we get order from the chaos of impressions? And that ordering is done in terms of space, time... etc. Take a look at para 88 through about 100 of the Critique of Pure Reason for this introduction. So the mistake that people make is to think that Kant is saying, "Well, there's this world out there separated from us by a veil of some sort..." No. What he's saying is that without internal restructuring we would be overwhelmed by the chaotic impressions we do receive of reality. And then he goes on... and on, and on... describing how we create order from that chaos. And, interestingly enough, Kant is supported by modern cognitive science to an amazing degree.

Steven Ravett Brown


Edward asked:

Does anyone ever think that the world would have been better off if man had never taken, 'control of his affairs,' in the first place?

Evolution. Imagine what that idea would cause — that man's image is of but a limited time and of no absolute fixed value or duration — if it were generally known or believed. Do you think that man jeopardizes his future by feeling revulsion at the idea of his evolution, due perhaps to some childlike fear or immaturity? I can't imagine people being too eager to accept, say, eyes on the backs of their heads as an evolutionary advancement, or whatever it might be. So perhaps in taking as many preventative and reversive measures as possible the fearful creature might destroy himself? Why? Because everything evolves, everything improves.

Now if man prevents this natural improvement he will inevitably fall behind in the 'survival of the fittest' scheme of things. I do not mean to suppose that monkeys will rise up and overthrow their masters, as is the case in so much paltry science fiction, but think for a moment. What is the great problem in medicine today that is already worrying doctors, scientists and the rest? Is it not the evolution and adaptability of bacteria, viruses and infections? If I had to spell out a particular problem, I would say that evolution affects man; it improves him, and in doing so it improves every organ in him, including his brain, not just his natural immunity. And an improved brain means better intelligence: intelligence to think new ideas, perhaps even ideas that are unimaginable and extraordinary to us today. Ideas which, nonetheless, he needs in order to think up new ways to combat the ever-improving threats to his existence. So, do I have a case, or am I just weird?

Very interesting question, Edward, deserving of a well-considered answer. Let me recommend to you, however, in asking important philosophical questions (not just to Pathways, but in general), that you never assume sight unseen that you're the first to put them. For example, 'Does anyone ever think... etc.?' is an issue as old as the hills, and there must be thousands of books and articles on the subject. When you put questions in this rhetorical manner, I have to choose between believing that you really have never seen an article or spoken to anyone who shares your opinion, and believing that you are indeed just using a rhetorical device. I don't know which applies to you, and that makes it difficult for me to assess how to respond: you see the problem?

Anyway, I'll assume that it is rhetorical and that you're just looking for another answer because you've not seen or heard one that really satisfies you. This means that I can simply respond 'yes' and leave it at that.

And so now to the other parts. I'm sure your idea of 'revulsion' applies to some people, even whole societies. Maybe 'revulsion' is not the best word, but this is a minor consideration. We do constantly jeopardise our future; but this is not rooted in a fear of 'improvement'. The statement 'everything evolves, everything improves' is factually incorrect. For example, many species of bacteria have never evolved beyond their original state, and an argument may be put that creatures who reproduce by cloning are no longer evolving, and that their mode of reproduction is precisely geared to keeping the status quo going indefinitely. Further, many way stations along the path of evolution are not improvements for many creatures and/or branches on the tree of evolution. One may put the proposition that any species which is now extinct was not intrinsically an improvement on what went before. One may propose, even more radically, that homo sapiens occupies an evolutionary rung which has overshot the mark in terms of adaptability and is therefore very likely to 'write himself out' of the further evolution of species. The point of these deliberations is that evolution is not a sort of mechanism of progress, but rather an interplay between organisms and their habitat, in which the former adapt to the conditions which prevail in their niche and the latter changes on two fronts simultaneously, namely through the impact of organisms (which must inevitably change it) and through changes in its chemical composition from time to time; one outcome may be that devolution is on occasion a preferable alternative. In other words, to think of evolution as an upward curve is a mistake. Evolution is neutral: and in the scientific literature you will find it stressed repeatedly that zero change is the rule of the game in stable environmental conditions.

Once you understand evolution in this light namely: that adaptability, not improvement is the key criterion of evolution, then you will be in a better position to judge the crucial issue of mankind's impact and the dangers involved in it. What you call 'improvement' is, in fact, the disposition of some types of organisms towards more complex evolutionary patterns, i.e. the development of more sensitively attuned response systems. Take the evolution of nerves as a paradigmatic example: millions of species have nerves and therefore a greatly improved resource of adaptive response to changes in the habitat over creatures without nerves; then evolutionary stress may induce a further evolution to a nervous system with control and evaluative facilities in a smaller number of species; from there more species will go on to evolve brains. Speaking generally, this is to date the topmost rung on the evolutionary ladder: fish, birds and mammals possess brains of varying size and resource capability. Along comes, in a kind of sudden upward push possibly beyond the needs of the species, the brain of homo sapiens, which displays a crucial change in the capability of brains-in-general. Brains-in-general evolved for the superior handling of short-term evolutionary changes, even instantaneous changes, i.e. changes where the time stamp is too short to allow the quasi-mechanical interplay between organisms and habitat that is the norm; but the human brain goes beyond that in that we can think of the future, i.e. events which have not yet happened, and generate plans and ideas and visions of possible tracks into the future against which we may wish to equip ourselves. 
One obvious advantage of this is that the creature so endowed is able to build structures, both 'hard' (material, so as to provide an artificial habitat which is to some extent independent of the natural environment) and 'soft' (societal and cultural, designed to facilitate the coherence and cooperativeness of the species in its efforts to survive). One disadvantage is that the animal instincts which we inherited are still in force and have a tendency to produce 'misreadings' of these possible futures in the light of desires and the short-term fulfilment of supposed advantages, all of which change the habitat very quickly and thus create evolutionary conditions in which we, as well as many other species on which we depend for our survival, are endangered.

If we accept the reasonable conjecture that ultimately homo sapiens is the survivor of an arboreal simian (ape-like) branch which is now extinct, then by comparing the life habits of other arboreal mammals (e.g. monkeys) we can easily see what our problems may be. For example, we have no instinct for cleaning up after ourselves, because our instincts were formed in the trees; we have no instinct for curbing our natural aggression, because in an animal lacking 'tooth and claw' that aggression is in the main designed to frighten rather than to kill; and so one could go through a long list of bad outcomes of the evolution of certain simians into hominids. These outcomes are the result of instincts already formed and genetically transmitted which have not had the time to adjust properly to changed living conditions: let me point to our eyes, whose stereoscopic ability is a reminder that we once needed that sharpness of vision to cope with brachiation. Against all these defects and maladjustments, our brain is the only makeweight: but our brain is heavily influenced by the instinct legacy which we carry around with us; and this is not a problem we are likely to solve in the short term. It is a problem of which we have been aware ever since Darwin started the evolutionary ball rolling, but which as a whole we have never yet had the courage to face squarely. Instead, we've had two world wars, nuclear bombs and pollution near to suffocation level.

So, as all the old religious and philosophical stand-bys have it, the potential for good and bad has been placed into our own hands; we are the 'husbands' of the earth in the sense that as consciously aware creatures we bear an enormous responsibility extending far beyond our own needs, for every decision we make as a collective affects untold numbers of other creatures and the vegetative world as well. The danger we are facing most acutely is that our perceived and imagined needs will outrun the capacity of the planet to sustain them. Equally deleterious is our failure to recognise that many of those organisms which we perceive as pests, nuisances and dangers have the same 'right' to existence as we do (although strictly speaking no-one has a 'right' to live, only the privilege), and that from sheer ignorance we are likely to erode much of that hardly-perceived life on which our own depends.

To some extent, then, your concern is surely well-founded, but the presuppositions by which you judge it are still a little off the mark. You're not alone in this; but since your question revolved largely around questions of evolution, I have concentrated on this to hammer home the point that with all our 'knowledge' and acceptance of the idea, we have not yet, by a long shot, come to an acceptance of what is entailed in this knowledge. We have not yet, as you'll surely agree now, even come to an acceptance of such a simple fact as the incompatibility of our instincts with the need for co-operative living in the terrestrial mode which was probably forced on our distant ancestors by the cyclic recurrence of forest recession. One day, I guess, we'll be forced to; let's just hope that when it happens, it will not be too late!

Jürgen Lawrenz


Edaw asked:

Do we control what we think or does what we think control us? (Important to note that we can sometimes think we are in control when we are not.)

If what we think controls us, who controls what we think?

Example one: you play billiards, and you hit the cue ball, which hits another, and so forth. You control the balls; everything is linear.

Example two: the billiard table is set up so that it starts with a few balls on it, and a new ball is automatically placed in front of the cue stick every few seconds (up to a maximum number... maybe they're extracted from the pockets, if you're playing that kind of billiards, or maybe they're plucked randomly off the table), while the balls are still moving from the previous stroke. The angle the stick hits at, and its velocity, are controlled by a set of sensors which use the positions of the balls on the table and their velocities to generate angle and speed of stroke. Thus, that angle and speed a) are never the same, and b) they are controlled by the balls that the stick hits, and by the stick which hits the balls.

So in the second example, what is controlling what? Is the stick, hitting the balls, controlling them, or are the balls, generating the angle and force of the stick, controlling it?

You might think of our central nervous system as something like the second example.
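The circularity in the second example can be made concrete in a toy simulation of the feedback loop. Everything below (the sensor rule, the friction constant, the number of balls) is invented purely for illustration:

```python
# Toy model of the second billiard table: the stick's stroke is computed
# from the current state of the balls, and the stroke in turn changes
# that state. Neither side of the loop is straightforwardly "in control".
import math
import random

random.seed(0)

def sensor_stroke(balls):
    """Derive stroke angle and speed from ball positions and velocities."""
    mx = sum(b["x"] for b in balls) / len(balls)
    my = sum(b["y"] for b in balls) / len(balls)
    angle = math.atan2(my, mx)
    speed = 1.0 + sum(abs(b["vx"]) + abs(b["vy"]) for b in balls) / len(balls)
    return angle, speed

def strike(balls, angle, speed):
    """The stroke perturbs the ball nearest the cue position."""
    target = min(balls, key=lambda b: b["x"] ** 2 + b["y"] ** 2)
    target["vx"] += speed * math.cos(angle)
    target["vy"] += speed * math.sin(angle)

def step(balls):
    """Advance positions and apply a crude friction."""
    for b in balls:
        b["x"] += b["vx"]
        b["y"] += b["vy"]
        b["vx"] *= 0.9
        b["vy"] *= 0.9

balls = [{"x": random.uniform(-1, 1), "y": random.uniform(-1, 1),
          "vx": 0.0, "vy": 0.0} for _ in range(3)]

strokes = []
for _ in range(5):
    angle, speed = sensor_stroke(balls)  # balls determine the stroke...
    strike(balls, angle, speed)          # ...and the stroke moves the balls
    step(balls)
    strokes.append((round(angle, 3), round(speed, 3)))

print(strokes)
```

No two strokes are alike, because each stroke is a function of a state that earlier strokes helped produce; asking which element "controls" the other dissolves into the loop itself.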

Steven Ravett Brown


Jay asked:

What are the strengths of ethical relativism and objectivism?

Indeed both have their strengths, as is already remarked in classical Buddhism, when Tantric Buddhism says: 'Absolute Knowledge [objectivism] and Relative Knowledge [relativism] together constitute the supreme knowledge.'

The strength of objectivism is that it is conservative and thus prevents mistakes. Another of its strengths is authoritarian education. Although education in theory is always indoctrination, passing on knowledge is indispensable in the struggle for survival.

But in the end it is death to stay within the same truth. That means linear progress or, seen from space, standstill. Mind that in nature death too is indispensable.

A strength of relativism is that it compensates for absolutism (or objectivism). It acknowledges every system of thought and as such realizes that there can be limitless truths. It remains critical towards any truth, and therefore, seen in terms of evolution, it keeps things going.

It means life, i.e. acceleration into new truths.

See it as a kind of Yin and Yang. Inseparable, both necessary, together forming a unity.

Henk Tuten


Elin asked:

How can either you, or your society decide ethically which knowledge should or should not be pursued?

This is one of those old questions that I scan down to now and then... I'm not going to answer it. But I recommend that you or anyone interested in this question take a look at Philip Kitcher's latest book: Science, Truth, and Democracy. He addresses this question extensively.

Steven Ravett Brown


Law asked:

I am an undergraduate philosophy major with concentrations in cognitive science and medical ethics. I need to fulfill a language requirement in order to graduate but I am not sure which language (German, French, Latin or Greek) would be most useful considering my two concentrations. I may need to read writings on medicine, health, the body, what constitutes the mind or a living human being. Also ethics, to examine topics like suicide and some legal/ political theories (as it pertains to the state and its responsibilities to welfare issues, like the provision of health care). I am a sophomore and while I am certain by my senior year the correct response to this question would be obvious to me, I cannot wait till then to begin my language courses. Please advise.

Your best bet is German. Aside from the obvious philosophers, there is a great deal of work being done now in Germany in various areas of cognitive science and consciousness studies.

Steven Ravett Brown


Ian asked:

I recently heard someone say, "I might easily have been someone else after all, mightn't I?" The obvious question is, "Might he have been?" Any thoughts?

First of all, we have to get clear what,

(P) 'Individual x might have been individual y'

means — as the truth of (P) is going to depend on what the context of utterance is. I'm going to assume that (P) means,

(P1) 'It is possible that x has different properties than x actually has.'

Let 'x' refer to you. So what (P) means is that you might have had different properties. So, say, you might have had the properties of being a professional footballer, whereas actually you are, say, a professional basketball player.

In terms of possible worlds this turns out as:

(P2) There exists a world w & At w, Ian exists and Ian is a professional footballer.

Well, you might say "look, that's all well and good but surely I can only exist in one world — whoever that other "Ian" is, it certainly is not me!" What we need to do then is talk about counterparts of you. A counterpart of you is an individual to whom you are similar in some qualitative respect. Hence, (P) turns out as,

(P3) There exists a world w & at w, there exists an individual y & y is a professional footballer & y is a counterpart of x.

(Remember 'x' refers to you.)

So, in answer to your question, yes you might have been someone else, but what that means is that you have counterparts who are different from you. You might have been David Beckham but that is only true if David Beckham is one of your counterparts.
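The quantifier structure of (P3) can be mimicked in a toy model. The worlds, inhabitants and counterpart relation below are all made up for illustration; this is not Lewis's own formalism, just the shape of the truth-condition:

```python
# Toy evaluation of (P3): there is a world w, and an individual y at w,
# such that y satisfies the property and y is a counterpart of x.
# 'x' is the actual Ian; everything here is invented for illustration.

worlds = {
    "actual": [{"name": "Ian", "profession": "basketball player"}],
    "w1":     [{"name": "Ian*", "profession": "footballer"}],
    "w2":     [{"name": "Ian**", "profession": "accountant"}],
}

# Counterpart relation: which other-worldly individuals resemble actual Ian.
counterparts_of_ian = {"Ian*", "Ian**"}

def might_have_been(property_check):
    """True iff some counterpart of Ian, in some non-actual world,
    satisfies the given property."""
    return any(
        property_check(y) and y["name"] in counterparts_of_ian
        for w, inhabitants in worlds.items()
        if w != "actual"
        for y in inhabitants
    )

print(might_have_been(lambda y: y["profession"] == "footballer"))  # True
print(might_have_been(lambda y: y["profession"] == "astronaut"))   # False
```

The philosophical work, of course, is hidden in how `counterparts_of_ian` gets fixed: which qualitative respects make an other-worldly individual a counterpart is exactly what the surrounding discussion is about.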

I understand that all this is controversial, but any answer to this question is controversial — and I think this is the best answer overall. I don't have time to go into details but see David Lewis, On the Plurality of Worlds (1986) chapter 4, and John Divers, Possible Worlds (2002). Feel free to email me with any questions.

Rich Woodward

What exactly is it to imagine that you might have been some other person from the person you actually are? is it necessarily the same as imagining that you might have been different from the way you actually are?

Whenever we think about how different things might have been from the way they are, we are thinking about other 'possible worlds'. No need to worry about how 'real' these possible worlds are (Lewis takes the extreme view that other possible worlds are as real as the actual world). If you like, it's just a convenient way of talking, nothing hangs on this so far as your question is concerned.

I am now thinking about another possible world. In this other possible world, I have just intercepted a careless pass from an Arsenal player and I am racing with the ball towards the Arsenal goal. Other information about me: my name is David Beckham, I play for Manchester United, I am married to Victoria who used to be 'Posh Spice' from the Spice Girls. In my garage, I have a Bentley and a Ferrari. So far, so good.

But now there comes a tricky question. How did I get to be 'David Beckham', when my father's surname was Klempner? Two possible answers to this: 1. In the other possible world I changed my name (a better name for an English footballer). 2. In the other possible world my parents were not my actual parents but were in fact Mr and Mrs Beckham.

If I opt for 1. then I am not imagining that I might have been David Beckham, I am only imagining that I might have had the name David Beckham (as well as various other enviable attributes). In that world, one might suppose that there are two David Beckhams, myself and the son of Mr and Mrs Beckham (who hated sport and became an accountant).

So I opt for 2. How did my parents come to be Mr and Mrs Beckham instead of Mr and Mrs Klempner? What the question is asking is what connects me, the person writing these words at this moment, to the person called 'David Beckham' in this other possible world. What makes this individual (in Lewis's language) my counterpart?

This isn't a point necessarily about the first person. It makes sense to ask (although no-one ever would) under what circumstances, and in what sense might George W. Bush 'have been' Saddam Hussein, when the person asking the question is neither George W. Bush nor Saddam Hussein. I'll leave you to work out the details.

To get back to Mr and Mrs Beckham. Let's say that in this other possible world, when the embryo which later developed into me was just a few days old, it was secretly removed from my mother's womb and placed in the womb of Mrs Beckham. Would that be enough to make me David Beckham? All I am imagining now is my being substituted (!) for David Beckham. The embryo which would have grown into David Beckham was either destroyed, or maybe became his non-identical twin brother Derek...

Geoffrey Klempner


Malcolm asked:

I actually have two questions:

1. How can someone learn to understand a book like Heidegger's Being and Time, or any other difficult book?

2. I am very frustrated when trying to learn philosophy because all I have are questions and all I think of are contradictions in the text. Combine this with an awareness of an opinion of Nietzsche and you'll see why: "The so-called paradoxes of an author, to which the reader objects, are often not at all in the author's book but rather in the reader's head" (from Human, All Too Human).

Unfortunately there is no easy way to approach a difficult writer like Heidegger, who presupposes in his reader very wide reading in just about all eras of western philosophy, from the Presocratics onward. In such a predicament, the only useful advice is to read the text chapter by chapter with a guide that does the same thing; and fortunately there is such a book around: it's by Stephen Mulhall in the Routledge Guides (Heidegger and Being and Time).

Re the Nietzsche quote: I sympathise with your predicament! But of course you realise that this quote is a paradox in itself. Part of the problem with reading Nietzsche is that he is aphoristic and, like Heidegger, assumes that you know, when you come upon such a paradox, what its background and motive might be. Unfortunately Nietzsche is not as well served as Heidegger (yet) in the basic secondary literature, and it is all too easy to fall into the hands of an author with an axe to grind; that's one reason why I gave up and just read his texts. However, I can recommend one book to you which, even though it is a bit dated by now, still conveys the essentials of his thinking: the book just called Nietzsche by Walter Kaufmann. You might have to buy it secondhand these days or borrow from a library.

Jürgen Lawrenz

1) a) Take courses in the philosophy that the book is based on before you read the book.

b) Read the philosophical background before you read the book based on that background.

c) Take a course in the philosopher.

d) Get a version of the book with lots of annotations and explanations (Macquarrie's is good).

e) Read it over several times.

Heidegger took pride in being unintelligible (yes, this was literally true). So you're not going to understand him easily. Do all the above, in that order.

2) Yes, well, Nietzsche wasn't much of a philosopher, especially in terms of clarity, organization, lack of contradictions, logic, and if it comes to that, sanity, in my opinion. So I'd take his comments with a large grain of salt. He was a polemicist, and an insightful one. That's on the one hand. On the other, without knowing the background which someone brings to a philosophical position, you simply cannot fully appreciate what they're saying. It's like trying to jump into a physics or biology journal without knowing, well, quite a bit in those fields. So Nietzsche has a point, even if it's self-serving.

Steven Ravett Brown


Jayson asked:

What does, "To be, or not to be" mean?

I disagree with Rachel Browne's answer (Answers 18).

The phrase, "To be or not to be," from Shakespeare's Hamlet is the first line of Hamlet's soliloquy. Alone, and by itself, this phrase has taken on a life of its own, and many people infer that it has something to do with a sense of "being" or "non-being", possibly in an existentialist sense; yet that is not the case at all.

The phrase is followed by the question that Shakespeare is posing, concerning nobility. Shakespeare is asking if it is nobler to suffer abuse at the hands of others, or is it nobler to take a "bodkin," a dagger, and kill them to end the abuse. It is a timeless moral question.

Succinctly stated, Shakespeare's proposition is this: If someone offends me, can't I just take up a knife and kill them? Is this nobler than suffering abuse?

His answer to this is a question: "Who knows what dreams may come?" Shakespeare implies that there may be an afterlife wherein we suffer consequences for actions committed upon the earth.



Miracle asked:

I. The Alpha Centauri star system has been visible from earth for some time into the past. It is a bit more than four light years away from earth, which means that light takes about 4.1 years to travel from Alpha Centauri to earth. This is all unproblematically understood in the conceptual framework of naive realism.

II. A reference point is set: 1/1/2000; at which time the perception of Alpha Centauri from earth is noted.

III. A thought-hypothesis: the explosion and immediate disintegration of Alpha Centauri on 1/1/2002.

IV. On 1/1/2004 an observer on Earth continues to see Alpha Centauri. The scientific reason for this is that the photons from Alpha Centauri which are in the vicinity of the earth on 1/1/2004 had left Alpha Centauri four years earlier; thus they had not been affected by the explosion.

There is a philosophical problem, if not a scientific one. What is it the observer on earth perceives on 1/1/2004? The problem is the star doesn't exist on 1/1/2004, Earth time. If one is currently existing at a time, relative to one's history, can one veridically perceive an object that does not exist at that time, but did exist relative to one's history?

P.S. The defender of scientific/ naive realism would say that on 1/1/2004 the observer truly or veridically sees Alpha Centauri, but he/ she sees the star veridically at the moment of its existence identical with 1/1/2000, Earth time.

From the way you have framed your question, I take courage that you might gain some benefit from reading a couple of chapters on relativity theory. Minkowski's geometricisation of the space/ time theory actually gives you clear-cut diagrammatic representations of this type of event and a simple technique for solving it. So rather than give you a pre-digested 'answer', from which you would learn nothing, go to the library and pick up a copy of Einstein and Infeld's The Evolution of Physics. It's all in there, from the horse's mouth.

Jürgen Lawrenz

I'm having difficulty seeing the problem here. What do you ever see? Photons. What do you ever hear? Phonons, right? Which travel at the speed of sound, much slower than light, which people have known for thousands of years. Do you have a problem that you see, for example, lightning, and then hear the thunder a few seconds or minutes later, or see an explosion off in the distance and then hear it later, etc. etc.? Does anyone?

A defender of naive realism would be naive indeed to say that they heard the lightning, or anything else, "directly". So what's the difference? Light moves faster than sound, that's all.
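For what it's worth, the timing in the thought-experiment can be checked with a short calculation, taking the question's round figure of 4.1 light years at face value:

```python
# Date arithmetic for the thought-experiment: light seen "now" was
# emitted one light-travel time earlier. Figures follow the question
# (4.1 light years), not precise astronomy.
from datetime import date, timedelta

LIGHT_TRAVEL_YEARS = 4.1
travel = timedelta(days=round(365.25 * LIGHT_TRAVEL_YEARS))

explosion = date(2002, 1, 1)    # hypothesised disintegration
observation = date(2004, 1, 1)  # observer still "sees" the star

# When did the light arriving on 1/1/2004 actually leave the star?
emitted = observation - travel
print(emitted)  # late 1999, well before the explosion

# When does the star's disappearance first become visible from Earth?
news_arrives = explosion + travel
print(news_arrives)  # early 2006
```

So between 1/1/2002 and early 2006 the observer veridically sees light the star emitted while it still existed; there is a perfectly definite answer to "what is perceived", namely the star at an earlier stage of its history.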

Steven Ravett Brown


Mike asked:

What is the relationship between happiness and work?

There probably isn't a direct relation between work and happiness except in the most extraordinary cases. The relation is probably via fulfilment. You cannot be happy in any deep sense without fulfilment. Freud described work as a 'path' to happiness. He noted that work is a source of satisfaction only where it is freely chosen, and sadly this probably isn't the normal case. Even then he talks of 'professional activity' and not manual labour.

While in Civilisation and Its Discontents Freud said that man doesn't prize work very highly as a path to happiness and tends to work because of necessity, in The Future of an Illusion he says that civilisation rests on a 'compulsion' to work. Rather a contradiction. I don't think people feel, in the main, compelled to do a job. For Freud, the impulse to work is a sublimation of sexual instincts. That is, the impulse to work displaces erotic instincts and provides satisfaction through being involved in reality, or the human community. However, Freud claimed that persons differ and the man who is predominantly erotic will prefer to seek the path to happiness through relationships, whereas a narcissistic man will seek satisfaction in his mental processes. Furthermore, he urges people not to seek satisfaction from a single aspiration.

When we work from necessity, this is because we need money. But there are all sorts of other ways in which we can look at work. It might be bringing up the children or doing the gardening, and in this sense most people are compelled to work because we naturally seek fulfilment and strive for happiness.

Rachel Browne


Tracey asked:

First let me begin by thanking those who provide this service. It is very helpful.

I am a doctoral student in Instructional Technology. If I wanted to read more about how classic thinkers viewed the creation and use of tools and technology by human beings, where would I begin? Who should I read first?

Let me just explain why I ask this question. I am concerned with the development of theory in my field. We discuss a lot of application, but since my field is relatively new, solid theory has not yet been developed (in my opinion). I really don't have a background in philosophy, so I hope my question is not too strange. I had a thought that perhaps I can gain some ideas about the development of theory in IT from reading some classic thought in philosophy on tools, technology, and/or theory and knowledge creation. The reason I have this idea is because many other fields have beginnings in philosophy. The philosophical questions that the classic thinkers asked and discussed comprised the foundations of many of the sciences and social sciences we know today.

I hope someone is able to help me with this question and that if I am barking up the wrong tree, that someone would kindly let me know.

The real difficulty, I would say, is that thinking about technological issues could hardly precede the development of technology itself, and the use and theory of tools, implements and work methods is not, unfortunately, a topic to have exercised any philosopher on more than a very superficial level. Invariably, when a new science begins, the most important issues to exercise either philosophers or scholars influenced by them are epistemological, and methodologies usually lag far behind, at least insofar as they are elaborated and written up. I mention these difficulties mainly because I'm inclined to recommend some pertinent reading to you, but with the caveat that, I'm afraid, the risk of wasting your time is yours!

The best known of the classical philosophers who actually has a lot to say about these issues is Francis Bacon. I suggest you dip into his Advancement of Learning and The New Organon. Bacon is not shy about 'principles' and produces innumerable classifications and taxonomies related to the various branches of learning (in which what we call technology today forms a part). Now depending on your personal inclination, you could be bored stiff or fascinated about the prophetic genius that glimmers through the dim fog of a very primitive science. One way or another, you may find food for thought in this.

Now following on from this, Bacon was the inspiration behind the Encyclopedie of the French 'philosophes', headed by Diderot and d'Alembert, with Voltaire among the contributors. I regret I've only read Voltaire's contribution plus a small handful of others, so I can only suggest that you consult someone who is knowledgeable about the work. I suspect (but can't confirm) that something of value to you might be found in there.

Another work worth looking into might be Comte's Positive Philosophy. As philosophy it is dreadfully dated now, probably just because so much of its matter or principle has been updated in the proliferation of methodologies. With Comte we begin in any case to overlap with the rise of autonomous scientific principles; and I doubt there is much in writings on the latter that has not since been improved upon. But just in case your interest is awakened, you might also dip into the writings of Hermann von Helmholtz, which give a fascinating first-hand glimpse into the technical accomplishments of that era (cf. Science and Culture, Chicago UP).

Finally, for a purely scientific point of view, yet from the vantage point of philosophy, you can't go past the correspondence between Leibniz and Clarke (the latter a mouthpiece for Newton). This might be aiming somewhat too high, but it can't hurt you to read at least a couple of exchanges to get a taste for what's going on here.

I would love to think that this helps. But though I remain dubious, let me add (speaking purely of my own disposition) that of the many ways to kill time, few are as interesting as these for their own sake.

Jürgen Lawrenz

I think you're looking in the wrong direction. You can read Heidegger until your face turns blue, but he won't help you in your work; in fact, he was very much a Luddite. You might look at Aristotle and the idea of techne, but again, I don't think it will help. The area you want to look at, in my opinion, is cognitive psychology. There's been a tremendous amount of work there on the various modalities of perception, manipulation, theory formation, and so forth... so much that I don't even know where to begin with references... I could give you 50, easily, right off the bat. Go to your Psychology department and find some people in these areas, or who can start you reading in them. Computer people on the whole do not know this literature. The problem however is that it is truly enormous, and you're going to have to do quite a bit of reading to extract what you want to know. I guess you could start with the (very old) literature on the tachistoscope and keep going from there, and take a look at the Stroop effect, and also I'd recommend (which you might know) the MIT robotics lab website... There's Gopnik on 'theory theory'... and of course the huge literature on cognitive development... I mean, I think you're doing the right thing, but you may have to (and I'm quite serious) get another PhD, or do the equivalent reading, to really learn this as thoroughly as it should be learned.

Steven Ravett Brown

Certainly your question is not strange:

About philosophy and technology I found two sites that may be interesting:




Technology and philosophy could help each other in many ways.

Henk Tuten


ZhunTzu asked:

My question is about the Philosophy of Religion: the foundations of some philosophical arguments start with the premise of an all-knowing, all-powerful, and all-good God. Did anyone ever think of a different foundation, perhaps consisting of a negative God? An all non-knowing, all non-powerful, all non-good god. What if the powers and attributes of God are like Nelson Goodman's 'grue' and 'bleen'?

The idea of constructing a negative god is termed 'cacodaemony.' This notion of god has been used to explore the famous 'problem of evil,' which is to show how an all good, omnipotent and omniscient god could have created a world that contains moral and natural evil. By changing the idea of god to one in which he is all evil, we find that we come up against the 'problem of goodness.' How could an all evil, all powerful and all knowing god have created a world which contains moral and natural goodness? The result of this thought experiment is a challenge that exactly parallels the traditional problem of evil.

An article regarding this is by Steven Cahn, Analysis 37, 2, 1977. In the cacodaemony, this god is 'all evil' (not good), but the attributes of omnipotence and omniscience remain the same. This is because if we went so far as to attribute a god with not knowing anything, and/or no power, we have stopped speaking of anything like the Judeo-Christian god. My nasty uncle Charlie, for instance, is completely ignorant, utterly powerless, and pretty much wills evil all around, yet he poses no issues for us whatsoever, given that this combination of attributes renders him a completely non-effective being in every respect. A god with virtually no powers and knowledge, who wills evil, is no god at all. It is possible to contemplate a god that is all powerful, but is not good or all-knowing. Likewise, we can contemplate a god that is all knowing, but not all good and all-powerful, but under these permutations of attributes, the god we are considering does not resemble the traditional Judeo-Christian deity. Under polytheism, gods have significantly different attributes. Hera, for instance, from the Ancient Greek pantheon, is not all powerful: other gods are also powerful. She is not all good, and expresses jealousy and other flaws, nor is she all knowing as other gods have their plans and spheres of influence independent from hers. However, under polytheism the problems that beset philosophers and theologians, like the problem of evil, do not arise (or, at least, do not arise in the same way).

The issue of whether or not god's properties are entrenched and gerrymandered, such as Goodman's predicates 'grue' and 'bleen,' is a different sort of issue. The new riddle of induction shows that these sorts of predicates cannot be distinguished from ordinary predicates like 'blue' or 'green'; induction does not permit it. Both 'green' and 'grue' [= 'green up to, and including today, blue afterwards'] apply to all samples of emeralds until today (now). Goodman's point is that our conceptual scheme in general is entrenched, and there is no way to tell "well-behaved" predicates like 'green' from the ill-behaved ones like 'grue.' All of the predicates we use, no matter what they apply to, be it copper, emeralds or god, are so entrenched. If you accept Goodman's argument regarding grue/ bleen and induction, you must accept the entire argument and what it implies, and cannot separate out a piece of it and apply it only to god's predicates. Empiricists have worse problems (or better ones, depending on perspective), than figuring out god's attributes.
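Goodman's predicate, as glossed in the bracketed definition, can be written out as a small function. The cutoff date standing in for "today" is arbitrary and chosen purely for illustration:

```python
# Goodman's 'grue': green when examined up to and including the cutoff
# ("today"), blue when examined afterwards. The cutoff is an arbitrary
# stand-in; any date works for the riddle.
from datetime import date

CUTOFF = date(2003, 5, 1)  # "today" in the riddle

def is_grue(colour_observed, when):
    """An object counts as grue iff it is green when examined up to the
    cutoff, or blue when examined after it."""
    if when <= CUTOFF:
        return colour_observed == "green"
    return colour_observed == "blue"

# Every emerald examined so far is green -- and, on the same evidence,
# equally grue:
observations = [("green", date(2001, 3, 14)), ("green", date(2002, 7, 2))]
print(all(colour == "green" for colour, _ in observations))         # True
print(all(is_grue(colour, when) for colour, when in observations))  # True
```

Induction on the very same observations supports both "all emeralds are green" and "all emeralds are grue", even though the two hypotheses predict different colours for emeralds examined after the cutoff; that is the riddle in miniature.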

Maureen Eckert


Danielle asked:

I'm having trouble understanding R.G. Collingwood's scissors and paste method of philosophy and what it has to do with critical thinking and historical evidence in his book The Idea of History. I need to write a paper on it so your help would be greatly appreciated!

There are many ways of writing history: for example, Caesar's or Xenophon's accounts of their own campaigns, based on first-person experience; or by being a witness or participant in the events being described, as happened to be the case with (say) Sallust; or you can study the civil records and the stories of a region and piece together an account of the events reflected in them. All these methods possess value, some in virtue of being evidence for events in themselves, others in evaluating their evidence conscientiously and critically, so that the end result may qualify as an acceptable vision of the historical segment being treated. A good historian taking up such documents much later and using them as sources would be required to do much critical thinking and to evaluate them together with appropriate research done in related fields such as, for example, archaeology.

Now there is a problem with some areas of historical study where everything we are ever likely to know (barring some fortunate accident) is already known. This relates mostly to the human interface. What historical people said, their speeches, arguments, their promises and treachery etc., is laid down for large tracts of history in sometimes very few documents, which serve every historian of those historical segments as sources; often enough nothing else but those accounts exists. Archaeology cannot, of course, retrieve the real interactions among people, only the artefacts they left behind. In some few cases, legends and mementoes add colour to the historical records, but they are in general very unreliable and indeed take most of their value from the accounts which pre-exist their discovery.

So when Collingwood talks about the cut-and-paste jobs in history, he is referring to just such problem cases: to 'history' writing which is nothing more than a particular writer snipping bits and pieces out of (say) three existing sources and re-arranging them according to taste, prejudice or in some cases for the purpose of manufacturing a literary masterpiece. A famous specimen of this kind is A History of the World by Sir Walter Raleigh, a magnificent tapestry of glorious English prose, but completely worthless as a history. On a slightly higher rung you find Gibbon's Decline and Fall of the Roman Empire, 'higher' only because there is some scholarship in it, though by Collingwood's standard not enough; in any case its whole value rests on its merits as a literary performance. Until the rise of such ancillary sciences as archaeology, the overwhelming bulk of history writing was in fact scissors-and-glue-pot.

Collingwood makes the additional point (on p. 258) that even a lot of supposed 'source' material is of the same worthless variety (and without much redeeming literary merit), which we are obliged to evaluate merely because of its antiquity: great reams of regurgitated history were produced in the last 2-3 centuries of the Roman Empire as well as in the Middle Ages which are nothing other than cut-and-paste stuff. But beggars can't be choosers. We know that most of this is gibberish, but when some author claims he got story X from historian Y, and we don't know who Y was, then unfortunately we are reduced to taking on trust what strictly speaking deserves no trust at all.

On the same issue, one of Collingwood's important points relates to the concept of 'authority'. In cut-and-paste jobs, the appeal to authority is constantly misused. Author A has his prejudice; in appealing to author B he compounds the prejudices; and then along comes author C, who appeals to A and adds merely another layer of prejudice. All this is just cribbing, of course. None of these authors knew anything, or any more, than any of the others. The point is that one of the significant methodological advances of late 19th century historiography was the redefinition of that concept: either all sources are authoritative or none. Thus in one fell swoop the pitfalls of prejudice were plugged up. In its wake, every historian was forced to evaluate sources objectively. (This works wonderfully well in theory, but in reality it is, of course, nothing but a crutch. Objectivity is simply not attainable for a human being. But historians at least strive for it, in order to get at the real facts of history, which are often glazed over not only by legendry, but also by partisanship, common likes and dislikes, love and hatred, and one's private notion of morality.)

This is more or less the gist of it; and one reason why it is important to Collingwood is because the notion of evidence itself is very tenuous; and he cites a hypothetical case in law (p. 266 et seq.) to show why. Reading that section of his book (i.e. §3 of the 'Epilegomena') is not all that difficult, except that, being addressed to experts and philosophers, it is hard to work through for anyone not familiar with all the different philosophical and methodological trends being examined there. But just bear in mind what I wrote above and, additionally, that one very good reason for not trusting a scissors-and-paste history job is that there is literally no-one who bears responsibility for the truth, or even the mere factuality, of the events: these types of works inaugurate an infinite regress of authorities!

On the other hand, though it is a bugbear, scissors-and-paste history can't simply be ignored, because there is so much of it which, due to historical circumstances, is now our only source! Between the devil and the deep blue sea!

Jürgen Lawrenz


Michael asked:

I am an undergraduate philosophy student, and I very much desire to earn my PhD and become a professor. I am concerned that my moral convictions may get in the way, however. I don't believe that, for the most part, intellectual property enforcement is morally acceptable, and it would be a violation of my personal code of ethics to grant anyone an exclusive right to any papers I produce. My question is, are there universities and reputable journals that don't demand exclusivity? Even if the way is not easy, can I advance in academia without compromising my principles?

I am quite sympathetic to your views.

There is a huge movement for open access to journals, articles, and other material. Stevan Harnad is one of the prime people in this. Look here: http://www.ecs.soton.ac.uk/~harnad/; you will find all sorts of open access resources.

Now. That's one way of getting papers out, and I use it myself. But it's not a good way to get published. For that you need to be in reputable, peer-reviewed journals, and they are exclusive. So what I do is put a preliminary version out on the free site and the final version in the journal. That way, no one can complain... I hope. Technically, we have not submitted the same article to the journal and the site. In addition, ethically, the site is not a journal, not peer-reviewed, and indeed asks for the journals the article was published in. So you have not abused the peer-review process in putting your article in Harnad's site. But you do need to publish in "normal" zines to get recognition.

Steven Ravett Brown


Melissa asked:

These questions stem from Georges Bataille's Erotism: Death and Sensuality.

What, for Bataille, is the relation between sex, death and religious experience? How does he understand Christianity in light of his analyses? What do you think of his position? Does he get close to what is "religious experience"? What does Bataille mean by "continuous and discontinuous being"? What concepts and ideas in Bataille parallel Heidegger's notion of the enframing? Which parallel Heidegger's notion of exstatic Dasein (ex-istence)? Does Bataille think we can get beyond the Ordering?

You have certainly asked a composite series of complicated questions concerning Bataille's major themes and motifs, so let us start with your last one. It has become a very popular activity among Bataille scholars of late to find solid links and parallels between the Heideggerian and Bataillian projects; however, such hasty links compromise the character of Bataille's works. One must remember that Bataille was writing in a rather oblique tradition, away from existentialism and phenomenology. Although he gained much of his understanding of Hegel from his infrequent attendance at Kojeve's lectures, what we see in Bataille is a promotion of irrationalism more in league with a speculation beyond Hegel's system and with Bergsonism. Bataille's relationship with religion in general is a complicated affair, for in his more formative years he had planned on joining the seminary, though he later renounced his faith and eventually found himself conjoined with Breton and the surrealist movement. After he left Breton et al., Bataille made his own way to demonstrate his belief in a state of non-savoir, of our reason going to its limit and "expiring." I will respond in more detail at a later date.

Kane X. Faucher


Precious asked:

Describe, precisely, how the Turing Test works and why Searle thinks it fails.

Why Searle thinks it fails is very well described by the man himself in at least two of his books (e.g. Rediscovery of the Mind, Ch.9). Searle's English is exceptionally clear and he takes great pains to avoid ambiguities; whatever I might write, therefore, could not possibly make his views on the Turing Test any simpler or clearer, so let me just recommend that you read those very few pages (hardly 20) from the pen of the man himself.

There is actually a double benefit here, because in reading Searle's objections, you will find that he also describes, in equally clear prose, what the Turing Test is all about. So once again this has got to be the top recommendation.

Jürgen Lawrenz


Adam asked:

What do we mean when we call someone a genius?

As far as I know, the first use of that term was by Kant. He used it to refer to someone who, because of a profound connection with reality, was able to create new rules for constructing and understanding what Kant termed "aesthetic ideas". This connection was to the noumenon, which remained unknowable. Its importance was that it enabled some few people to have, occasionally, free will, inasmuch as through that temporary connection they were able to free themselves from the causal patterns or laws, to put it roughly, of the knowable world. Now this is a summary of the motivation of Kant's Critique of Judgment, and for a 25-words-or-less statement I think it's pretty good. As a statement of the C of J, it's lousy.

Anyway, that's where the term started, as far as I know. It was, I believe, taken up by Goethe to refer to outstanding artists. And it took off from there. Today it means nothing. It's just a word someone uses about someone else when they admire them. So, in answer to your question, I have no idea what "we" mean... because the term really has no clear meaning any more. Different people mean different things by it. Does that help?

Steven Ravett Brown

The word itself has a long history of changing meanings: and in the early years of its usage it was often a simple synonym for cleverness. However, it was then mostly used in the form "X has a genius for...", meaning that X has a talent.

But in the inception of the German branch of the romantic movement, the usage of the term underwent a subtle shift. The presiding "genius" of that movement, Herder, used it in such expressions as, for example, "the genius of the language", where he is not referring to a person, but to the language itself as a kind of river that flows through the population and impregnates the people speaking that language with its spirit. Herder was especially vocal in defending a then popular theory that the authentic poetry of a nation arises spontaneously and anonymously, long before individuals make it their business to "cultivate" language poetically. You can see here the connection between the original French meaning of genius (spirit), which Herder directly imported, and the connotation of authenticity. Both of these eventually converged in the extension of the notion to individuals.

In that definition, then, a genius was a person imbued with the authentic spirit of poetry (or 'poesie', which is the term they preferred to distinguish the authentic from the manufactured). The author of the Nibelungen Poem was such a genius, and it was rather a recommendation that he remained unknown. Similarly with the Edda poems of Norse mythology, the Beowulf etc. Incidentally, you will find in these adumbrations the kernel of the later doubt that Homer was just a collator of old legends! Now Herder is almost unknown in the English-speaking world, but if you've read Goethe's Werther, then you might remember the almost hysterical ravings over Ossian and Fingal, which Goethe (on Herder's say-so) classed as "authentic" Scottish folk poesie (they didn't know it was a put-up job by one Macpherson, a second-rate versifier!). Roughly at the same time he (Goethe) wrote a eulogy on Erwin von Steinbach, the architect of the Strassburg cathedral; and again the same notion of "genius" prevailed here, in that the master builder was an intuitively authentic embodiment of the gothic spirit.

The whole notion acquired the momentum of a cult in very short order (that time is still referred to as the Era of the Genius-Cult in literary history); even the old fogey from Königsberg [Kant, Ed.] felt obliged, in his aesthetic treatise, to offer a sort of "definition" of genius along roughly those lines; and later his avid pupil Schiller worked out a comprehensive aesthetic (Über naive und sentimentalische Dichtung) which made the distinction between the "authentic" and the "cultured" poet a cornerstone of poetic theory that remains to this day pretty much in force in German philosophy. One suspects that the overwhelming esteem accorded to Goethe owes not a little to Schiller's advocacy of him as such a (perhaps the only modern) specimen of an "authentic" genius. Thus the die was cast; henceforth the term "genius" became affixed to individuals of a particular creative potency.

It was not long afterwards that the imprecise and indeed indefinable notion of "authenticity" came tacitly under fire; and more and more creative types of the "sentimental" variety found themselves called "genius", even though by the standard set by Schiller this was an inadmissible licence. Then, as the result of a natural attrition of exaggeration, to which especially the late stages of romanticism were prone, the value of the term genius became debased by over-use; essentially it has become again what it used to be, a synonym for cleverness. Yet because enthusiasm for things romantic (novels, music, poetry etc.) has never quite died down, the term itself is carried over and retains, in these specialised contexts, some of its old force of meaning.

So ultimately the answer to your question is: if used today in an everyday context, it probably means nothing other than "X is clever" or also "X is pretty stupid, but for some queer reason he's got a knack for stringing up pretty words; guess he must be a poet." But the old custom, as I said, has not died out altogether, and so it is still occasionally used to mean "X is really an exceptionally inventive/ creative personality".

Jürgen Lawrenz


Andrew asked:

How can Good exist without Evil?

What are you implying? Is it that good cannot be seen to exist without the contrast of evil? Or that good only exists to destroy evil? Whichever is implied we are still left with a teasing dilemma. If God is the creator of the universe, why allow evil into the creation in the first place? Or could it be that God did not have all his own way, working as the creator of good alongside the creator of evil? Some religious people believe in the existence of a very strong Devil.

There is no doubt that the battle between good and evil seems to have been going on since the world began, but in a 'natural' world "red in tooth and claw" a caring God seems to take second place. However, as your question implies, it would appear that the evil we see round about us stimulates the concept that there must be some power for good to which an appeal can be made. There is also the notion at the root of religion that the world is the creation of a 'good' power, but, somehow, evil has managed to gain access. Some claim that a mistake was made initially by God when he allowed humans to have a certain amount of freewill.

Then there is the question of evil itself, with all its variable concepts. Many regard the perceived cruelty of nature itself to be evil: what some regard as a natural and necessary food chain, maintaining the balance of nature, is seen by others as an unnecessary form of cruelty which amounts to evil. Here is another dilemma: take the simple case of a domestic cat coming in from the garden with a dead bird in its mouth; the owner of the cat gives the animal a good beating. Which action can be interpreted as evil: the action of the cat in killing the bird, or the action of the owner in beating the cat, or both? Some will say that a natural action, though cruel, cannot be evil, but that an action punishing natural activity is both cruel and evil. So we are presented yet again with a complicated problem regarding God. If God is the creator of nature, surely He could have presented us with a kinder regime of nature. How could a loving God confront us with such cruelty?

Separate from nature are the choices regarded as evil, or which lead to evil, made by humans themselves: choices to murder, rob, deceive, inflict pain, betray, hate, etc. Pertinent to your question, it might appear that God is somehow responsible for the evil which your question suggests good cannot exist without. However, the existence of God depends on factors other than evil: the general claim is that the universe must have a creator, and most are content to believe that this must be God. Regarding God as the creator means that He exists whether or not evil is present. Perhaps the 'Grand Design' must include evil to make it work properly. Supporters of God would not argue with this, seeing that their apology rests on the premise that God knows best. As Kant implied, our minds are not constructed to go beyond a certain level of knowledge, i.e. there are things which will remain outside the powers of human understanding.

Perhaps we should be content with the words of the hymn, "God moves in a mysterious way, His wonders to perform", which provide some support for accepting evil as an essential part of creation. How many times have we looked back at something evil and destructive in our lives, only to find that if it had not happened, the subsequent good arising from it would have been denied us? There are so many things in our lives where good has had to be preceded by bad.

John Brandon


Ed asked:

What's Ockham's Razor? and when is it commonly used in testing an empirical hypothesis?

Before all else, you need to be clear in your mind that Ockham's Razor is not being 'used' for anything. Although there seems to be a widespread but fairly vague belief that it's some kind of methodology (or even a real razor!), it isn't. In fact, it goes back to (of course) William of Ockham, the scholastic philosopher, who on two separate and quite unrelated occasions wrote, in pithy epigrammatic form, words to the effect that 'if you are confronted with a choice of two equally valid alternatives, take the simpler'. The reason, essentially, is that humans are error-prone, and by taking the simpler alternative you also reduce the chance of compounding errors. You can also apply the precept to situations where there are no alternatives, but just a very complicated setup. Then you are advised to seek ways of simplifying it. This simplification, which you may picturesquely imagine as trimming off the curls in your beard so that it's lean and clean, is probably what most people would think of as 'Ockham's Razor'.

From this you will gather that Ockham's Razor is just a very good precept. It's got nothing to do with science at all, except that many scientists have adopted it as a general sort of advice that seems especially pertinent to their pursuits. But I would not be surprised if some scientific theorist has worked out a true methodology based on this principle. I'm not aware of any such explicit methodology, but even if there is one, it evidently has no more bearing on Ockham himself than, say, the old adage 'birds of a feather flock together' has on the science of ornithology.

Jürgen Lawrenz


Damon asked:

I've noticed that a lot of the focus in professional philosophy at any one time is very narrow, and of the "hop on the bandwagon" sort.

It seems to be a matter of what's in vogue at the time, and once it falls out of vogue, it's dismissed.

That makes little sense, since rarely can any theory be proven to be false. Even the questions we ask are contextually dependent. Am I right in what I sense?

Ok... let's take physics. Don't you think that lots of the focus there is on what's in vogue right now? I mean, just everyone is doing the same experiments or the same theory... right now it's string theory, right? And all that stuff about "dark matter"... everyone's writing about dark matter! I guess physicists are just like philosophers... whatever's the rage, that's where the grants are... whoops, no grants to speak of in philosophy. Well, anyway, what can I say? We philosophers are just like the physicists, I guess, a flighty lot, swayed by fashion, hopping on the old bandwagon. I mean, why not, since, as you say, no theory can be disproven?

Steven Ravett Brown


Phil asked:

I have been asked to study the effects of Nietzsche's Beyond Good and Evil on Ethics for my A-level coursework. Our teacher has given us some information about emotivism, prescriptivism, cultural relativism and subjectivism, but I am not sure I will be able to write enough about just these areas. I am unusually succinct! Are there any other general areas of 20th century ethical philosophy that may have been influenced in some way by Beyond Good and Evil? If you could give me their names I'll do the research myself!

Well, Phil, I teach Yr 12 (like A levels) philosophy, and I could write a book just about cultural relativism. But you could try looking at Iris Murdoch for ideas of good. Do you need counterexamples to Nietzsche? Try Martin Luther King Jr. Of course, I'm assuming you've looked at Sartre? Good luck with it!

Lyn Renwood


Bethany asked:

"If a tree falls in the forest, and no one is there to hear it, does it make a noise?" What is your theory on it?

The answer is a counter-question: 'What is noise?' If you reply, as you probably would, the sounds picked up by our ears, then the answer is indubitably 'yes'. For you do not literally have to be on site to witness the event for it to be a definite event of a definable class; it would do to have a tape recorder as a proxy witness. And if other creatures are present, such as wolves or birds, the answer would still be yes.

The answer would, however, still be 'yes' in another theoretical situation, which is probably what was at the back of your mind when you asked the question (which, by the way, is a very ancient standby in the philosophy of mind and perception). Over the centuries, thinkers have responded variously, depending on their conception of the nature of perception, some going to the extreme of claiming that only a human is capable of identifying or interpreting such a noise, so that there would be no noise if humans became extinct. These and similar claims do not, however, meet the requirements of the case. For in saying that the answer must still be 'yes', I am asserting something about 'priority', i.e. what comes first. Necessarily the noise must come first, for the aerial molecular energy involved in the fall of the tree is a fact of the environment which organisms learn about, adapt to and eventually respond to. In other words, the pre-existence of the noise is responsible for the evolution of ears. Plainly then, to maintain the opposite opinion requires the holder of such a belief to explain what function ears are intended to fulfil. I would suggest, moreover, that this 'fact' is misplaced in mind theories; the perception of falling-tree-noise has no ontological component worth mentioning, nor is it implicated in any significant way in the theory of the mind. Falling-tree-noise is just a plain sensation, verifiable by probably several hundred thousand species on earth which do not possess a mind.

Jürgen Lawrenz


AC asked:

What is the basic element in all things that can be known by reason or learned from experience that gives meaning in our life, leading us to right conduct?

Let me put it this way. Suppose that you were a creature totally alone, floating in space, the only living creature in the universe. Ok? Now: what gives meaning to your life? Where does your morality come from, what would it be, and why?

Well, how about this: suppose you were an animal living on a planet somewhere, and somehow you achieved consciousness. There you are, surrounded by other animals who have not achieved consciousness. Now, again, what gives meaning to your life? Why?

Or, to take a more familiar example, suppose you were an animal living with other conscious animals, on a planet somewhere. What gives meaning to your life? Why?

Would the meaning of your life be different in each of these cases? In what way? What would justify those differences? In other words, how would you know that whatever the meaning of your life was, whatever you created it to be, was what it should be? Well in the first two cases you would only have yourself to go on. In the last, you could either let another tell you what your meaning should be, or you could create one for yourself and ignore the others. Or you could create one then argue with others about whether yours or theirs were best. Or you could try to compromise between them all; or you could try to find what they all had in common. But what justifies any of those alternatives? Where do the standards come from for deciding how to choose between these alternatives?

You could say, "well, I'm just going to pick one". That's usually called "having faith". You could try to find the most logically consistent one. That's called "coherentism". You could try to reason out the best basis for picking an alternative. That's called "foundationalism". And what I'm discussing, in general, is called "metaethics". It's a very complex subject. You might look up some of the basic literature there, if you're interested.

Steven Ravett Brown


Maria asked:

What is the relation between Descartes' mind-body dualism and Freud's ego formation?

Well, Descartes argued that his mind was distinct from his body, but that there was a connection between them, and that the mind-body interaction was effected through the pineal gland. Freud studied neurology before developing psychoanalysis, and he was also a doctor; simply because he was living in later times, he would obviously reject the pineal gland idea. So Freud's position would be that the mind is closely related to the brain; whether he would claim it is the same I don't know, because such speculation is slightly metaphysical. But he did believe that mental activity is the result of neuronal activity and that the term ego describes a pattern of neural organisation. There is a position in modern neuroscience which states that the sorts of illnesses Freud treated, such as neuroses, are not localisable in the brain but are all-pervasive or non-identifiable phenomena, so there are claims that the unconscious cannot be identified with a particular brain state. This would imply a non-identity theory of unconscious mind and body, but nothing as extreme as Cartesian dualism.

As far as ego formation is concerned, Freud didn't have a theory of how it occurs, or of what neural mechanism would give rise to the displacement of consciousness.

The ego connects us to reality and has the function of repressing impulses and instincts, but it wouldn't be what Descartes takes the mind to be, i.e. a 'thinking thing'. A thinking thing is rational and cognitive, as is the ego, but for Freud the unconscious is always present, even though it is repressed and not available to consciousness. The non-rational, or the unconscious, doesn't just have an effect on the pathologically ill. We all dream, indulge in wishful thinking and make slips of the tongue. Also, artistic creativity depends upon the same mechanism as dreams. I wonder how Descartes' own story about himself as a thinking thing was driven?

The energy provided by instinct and impulses repressed into the unconscious is the driving force of life. For Freud we wouldn't have a mind without the unconscious.

Rachel Browne


Michael asked:

I was most interested in Jurgen Lawrenz's statement in reply to my previous question, "The most important thing is that the universe know itself." Or at least this is what I understood your answer to my question to be. What is your basis for this statement? What if we had a universe that did not "know itself". What kind of a universe would that be and what would be wrong with that?

I suspect that I'm not telling you anything you don't know yourself when I say that philosophy is going through a very difficult phase. It has been, actually, for nearly 200 years, because that branch called 'exact science' has made such a powerful impact on civilisation that we are (and I include many philosophers in this) altogether in danger of forgetting the really important things. It is wonderful to have science and its daughter technology delivering a lifestyle beyond the capacity of the greatest kings and potentates in history even to dream of: today, such comforts and achievements (like the Internet) are within virtually everybody's reach. But we have not made equivalent progress in the mental (psychological, spiritual) sphere, where in a sense we've remained on the level of an overgrown chimpanzee, as some writers are not shy of putting it.

That's a long preamble to your 'simple' question; but of course it isn't simple at all. Its purpose was to make the very important point that scientific research is a methodology, not a philosophy. The evaluation of that research should still be in the courts of philosophers, but I'll be the first to admit that philosophers have on the whole turned their back on it and left us in the lurch (by 'us' I mean us-as-a-society or civilisation). Your question is one of innumerable such questions that people nowadays address to scientists, in the belief that science, so powerful and almighty, must surely know the answer. The trouble is: they don't. It is in fact impossible, in principle, for research to answer such questions, as Wittgenstein, who was a research scientist prior to turning to philosophy, demonstrated nearly 100 years ago.

I wrote all this, even though it seems to be marginal to your supplementary question, because it is a terrible mistake to give up on science just because the temptation lies close to hand to ask the wrong questions of it. Science is like a sharp-eyed watchdog; many dubious ideas that were traded in philosophy for centuries have been exposed, by the clear thinking that science demands, as figments. But some belief systems seem to be ineradicable; today more than ever, ordinary people are addicted to astrology, parascience and whatnot. So it's important to think clearly about such recondite issues as your question entails.

Somewhere else in this Q & A segment I answered another question on this topic, which I recommend you seek out (just search for my name until it is indexed). I deal there with the possibilities of conscious life elsewhere in the universe, and my conclusion is that one cannot plausibly exclude it, because all matter in the universe is structured. It is (if you can accept this) an a priori condition of existence. No structure, no existence. (That we are able to assert that atoms actually exist is only possible because they vibrate, and in doing so they shed part of their structure, an activity which translates into detectable energy.)

Now you wonder why I say the universe 'is conscious of itself' and why this is so important. Let me give you a definition: Self-consciousness entails the ability to account for yourself to yourself. On a lower level, e.g. among fish or snakes, Consciousness entails the ability to discriminate between a self and a non-self. You'll need to distinguish clearly between these two. They are the fundamental modi of consciousness which organisms gain from the possession of nerves, for the simple reason that the organisation of nervous systems includes a capacity for evaluation: in simpler organisms, to detect and evaluate sensations and perceptions; in complex organisms like humans, the capacity for evaluating self-generated percepts (like words, symbols etc.). It follows that creatures without nerves cannot have consciousness, although they must still be able to discriminate between what is inside and what is outside of their body structure (technically this is referred to as metabolism and homeostasis); and that is indeed the basic condition of being alive.

Now I'm just coming to the point. For something to exist and for something to be are different criteria altogether. A bar of iron may be said (by a conscious creature!) to exist; but to be involves the consciousness I referred to above. In other words, to be entails the knowledge that I am.

Accordingly the matter structure of the universe, although it may in some abstract sense be acknowledged to exist, cannot be said to be. There is no agency with the power to render this existence conscious of itself. If I may formulate it in a paradox: the universe does not possess nerves, hence it cannot be conscious! So the need is for nerves to evolve. Now this, as you know, has occurred. It is pointless to deny it. Several hundred thousand species on earth managed that feat; and (as above) I consider it altogether plausible to assume that elsewhere in the universe, similar evolutionary paths are available on planets with suitable environmental conditions.

You'll appreciate from this that to call the universe 'dead' is merely a metaphor. Something cannot be dead unless it has been alive. And what is implied in what I said is that the universe appears, in virtue of its bias for structure, to also contain (via carbon atoms) a bias for the sort of structure that will eventually, in selected environments, evolve into conscious organic entities. On earth, we know that one species of such organisms evolved the type of self-consciousness which, by extension, allows us to postulate that the universe has 'cognisance of itself'. It does so because we are its agents for this self-knowledge.

Now short of writing a book (I may well do so at some future stage!), I must leave this difficult concept to stand by itself. But I will leave you with some hints from two philosophers who thought along the same lines. Erigena, who lived over 1000 years ago, published a vision of God which can on one level be read as conveying the notion that God, in order to become conscious of himself, needed to disperse his spirit throughout his creation: that God is conscious of his own being through us. He was excommunicated for this outrageous idea; but in our context, you need do nothing more than replace the term 'God' with 'universe'. Erigena himself would probably agree that the two are the same, or two sides of the same concept.

More recently, the German philosopher Schopenhauer theorised that what we call 'Will' and 'Energy' are really a fundamental force of the universe, the principle of activity itself. Thus the universe constitutes itself by investing this force in matter (energy) and in organisms (will). I suspect many physicists, if only they knew of this idea, would find much to agree with. Again, of course, one may interpret this force as a constituting agency, a means for the universe to acquire both consciousness and being. (Schopenhauer, a committed atheist, would however deny any connection to God.)

This is a lifelong search, Michael; but I hope that my answer will give you a kind of starting block. With such 'deep questions' it is always difficult to know where to begin; I would love to think this will obviate a lot of unnecessary ransacking of a literature chock-a-block with figments and fancies. Just don't confuse what I write with 'facts'. I simply reflect what, with the best conscience I can muster, is scientifically tenable and philosophically acceptable.

Jürgen Lawrenz


Ali asked:

I'm a philosophy graduate student from Algeria and I have many questions and ideas that worry me, so I need your help in order to get a good solution.

I'm very interested in marriage as a project, and I have heard many ideas about the criteria for choosing a good future wife. I heard a radio interview (Arabic service of the B.B.C.) with a professor specializing in "Family Sociology", who said that a difference of speciality between an intellectual couple is very important for spiritual progress and stability, because it avoids routine (and anxiety, if one of the partners feels intellectually handicapped!). I would personally prefer a wife from the medical profession (a physician), because I am interested in the integration between "Medicine and Philosophy" (the title of an important magazine from the UK).

So, would you mind directing me toward a good choice — because I believe in philosophical counselling — or would you orient me toward a specialist or articles or websites that give me a sufficient remedy for my suffering in my present state of indecision?

Well this is certainly one of the strangest questions on a philosophy forum... you want marriage advice? Ok.

Read this: The Seven Principles for Making Marriage Work, by John Gottman. This man can evaluate a couple and tell with 90% accuracy whether their marriage will work, and he has recently developed a mathematical model that seems to predict this. Amazing, right? But as far as I can tell it is backed up with consensual, double-blind, empirical data.
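For the curious: the mathematical model referred to, developed with the mathematician James Murray, treats a conversation as a pair of coupled difference equations, in which each partner's emotional state at each turn depends on their own inertia plus an "influence function" of the other's state. The sketch below is only a toy illustration of that general shape; all the parameter values and functional forms are invented for illustration and are not Gottman's published model.

```python
# Toy sketch of a coupled "influence" model of marital interaction,
# loosely in the spirit of the Gottman-Murray difference equations.
# All parameter values and functional forms here are invented for
# illustration; they are not Gottman's published model.

def influence(score, threshold=0.0, strength=0.3):
    """Partner influence: negative states pull harder than
    positive ones lift (negativity is weighted double here)."""
    return strength * score if score > threshold else 2 * strength * score

def simulate(w0, h0, steps=100, inertia=0.6, w_base=0.2, h_base=0.2):
    """Iterate the coupled update: each new state is a baseline,
    plus one's own emotional inertia, plus the partner's influence."""
    w, h = w0, h0
    for _ in range(steps):
        w, h = (w_base + inertia * w + influence(h),
                h_base + inertia * h + influence(w))
    return w, h

# A couple starting on a positive footing settles at the positive
# equilibrium 0.2 / (1 - 0.6 - 0.3) = 2.0 under these toy numbers.
w, h = simulate(1.0, 1.0)
print(round(w, 3), round(h, 3))
```

The qualitative point such models make is that small asymmetries in the influence functions (e.g. negativity weighing more than positivity) determine whether the couple's dynamics settle at a positive or a negative equilibrium.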

So, what is it he says? The essence is this: the better friends you are with your partner, yes, friends, as in, "wow I like to hang out with [him/her]", the better your marriage will be. That's it for the big secret. Now the question becomes, just what exactly is it to be a friend? And that involves, in essence, respecting and liking the person for themselves. I mean, it all sounds awfully trivial, doesn't it? But... all the stuff about "love", "attraction", "love at first sight", etc., etc.... not the way to go. Be a friend who enjoys their company, and you've got it.

Now, the hard part. How does a man learn to be friends with a woman... or vice versa? Most cultures do not teach this, and indeed actively discourage the kind of relating that would lead to learning it and to such friendships. And look at the problems that result. I don't know the answer to this, for any given person. It took me about 30 years before I figured it out (and yes, it was before I read this book) and a great deal of effort in learning how to see a woman as someone I could be friends with before getting sexually involved. But it got me a good marriage, finally. Good luck!

Steven Ravett Brown

Marriage is an important decision in most cultures, so asking for advice is no reason to send you to a 'specialist'.

Mind you, seeing marriage as a project is extremely rational. Seeing intelligence as the main criterion for choosing a marriage partner is quite rational too, and very limited. Instead I propose to use the word 'creativity', and to see intelligence as only part of it.

During evolution humans have become quite clever at choosing a good partner. The trait called attraction was developed over vast stretches of evolutionary time. I agree that this trait was developed for finding a good mating partner, so as to have successful children. But the trait also took account of the fit between the parents: a bad marriage mostly leads to traumatized children. Sociology, by contrast, has existed for only some 300 years, and is a rational type of sport.

So if you feel attracted to some woman (more than ONLY physically, though that counts too), then learn to trust that feeling. That means: forget about human-made criteria like wealth, appearance, intelligence, etc.

How do you think wild animals (no offense) find the perfect partner? (Science has found that they really do.) Just by intuitively trusting their sense of attraction. In this respect animals are more effective than humans. Maybe not every rational criterion is bad, but don't forget about the natural ones.

Henk Tuten


Celeste asked:

Does gender enter into and affect every aspect of a person, their decisions and their life?

I don't think rationality or emotions are particularly affected by gender. Decisions are based on reasons and desires, so gender has very little effect. There are feminists who claim that women are completely different from men, maternal and nurturing, and this would supposedly lead them to live their lives quite differently from men since they would have different needs and desires and reasons for decision-making. There might be a tiny element of truth in this, but I can't think of any women I know who are driven by a need to nurture in any way that contrasts with men.

Rachel Browne


Michael also asked:

I am not clear about what Jurgen Lawrenz calls 'instantiation of self'. How, in Lawrenz's theory, can one escape the problem of illusion?

It just occurred to me that I can give you a first rate example from 'real life' as a helpmate for understanding what I mean by Mind (Soul, Spirit) being 'instantiated' in a Self.

I leave undefined, as a matter which is neither scientifically nor philosophically ascertainable, whether this soul or mind or spirit therefore pre-exists or not. I'm cautiously inclined to answer this question in the negative, although it makes it more difficult. It would be easy enough, today, to accept Erigena's principle and suppose that the universe is infused with the spirit of God which seeks instantiation in humans. But my scientific research has persuaded me that the solution to this enigma is not as readily to hand. In the past, philosophers without science took the easy route of extrapolating from the human upon the divine dimension (and this includes eastern thinkers, Indian as well as Chinese and Arabic); but this is no longer feasible now that Kant has incontrovertibly shown not only that these dimensions are incompatible with each other, but also that our concept of infinity is deficient, for the same reason. One of the most important tasks facing philosophy is to work out an adequate concept of infinity. No philosopher known to me has even begun to tackle this.

In my philosophy the human creature is an animal (a mammal); take away the mind and you have a simian showing minor somatic variations from chimpanzees. Accordingly the difference cannot be physical. Rather, the point is that there is A BIAS in operation in the universe, which is easily seen in the fact that all matter in some way or another forms structures and that all the chemical elements have 'predilections' for assembling themselves. In a word, I repudiate the notion of 'chance'. Assembly may be undirected, but the bias sees to it that the 'chance' occurs. All of chemistry is devoted to the study of these biases, and human chemical engineers have discovered a number of artificial combinations that still work, but do not occur spontaneously: so here the human mind is introducing another BIAS. This is one reason why I say: The concept of unilateral illusion is itself an illusion. If we can 'interfere' with the spontaneous reality there is, then we must ipso facto have access to that reality.

Now the important criterion is this: that among that suite of naturally occurring chemical elements there is one, the carbon atom, whose BIAS is such as to give rise, under certain conditions (temperature, chemically suitable environment) yet altogether spontaneously and without coercion, to macro-molecules with the potential to transform into organisms. These entities (initially bacteria) possess, in turn, a BIAS to 'upgrade', to 'complexify'; and thus in the natural course of evolutionary passage, small communities which we call 'cells' arose as viable and independent living things.

At this point I'm going to jump over a few billion years of evolution. When the human being (or hominid) turns up some 2—5 million years ago, we find that this 'upgrading' has arrived at a truly mind-boggling complexity, not just relative to its body functions, but especially in relation to its nervous system. The important point here is this: that the human brain is made up entirely of a special variety of cells we call 'neurons'. Now it may be news to you that all these neurons are also organisms and accordingly individually alive. I have been amazed to discover that in this age of science and the universal distribution of knowledge, most people (not a few scientists among them) are so scientifically illiterate as to be unaware of this and instead believe that the brain is an 'instrument' or even a computer running software! In fact (let me stress this: IN FACT) the brain's 100 billion or so neurons are a society all of their own, who make their living by building and working at the structures by which we experience sensations and perceptions, who live and feed, get sick and tired and eventually die just as we do.

It is these neurons who 'created' or 'invented', as a separate process, the mind. The conditions under which this occurred are unique to humans, but they are in principle a potential or possibility of neuronal assemblies. So here is the point: that the Mind or Soul instantiated in a Self is a creative resource of the universe, coming into effect in biological matter of a suitable complexity of organisation.

In a strict sense, this potential or bias is already laid down in the very constitution of the carbon atom. If you like you may therefore (as a speculation) propose that 'God' (or by whatever name you wish to title ultimate BEING) 'seeded' the universe with carbon atoms in the 'foreknowledge' that in the natural, spontaneous course of its evolution, this universe would then give rise to creatures which could be endowed with the kind of self-consciousness that in turn enables the universe to attain to consciousness of its own being. And thus, to continue the speculation one step further, this would imply that, just as we are self-conscious as a result of the combined non-conscious, yet sentient work of microscopic organisms, so we humans abet, through our inhabitance of the imaginative dimension, the self-knowledge of the universe.

I need to emphasise here, that the last paragraph is evidently speculative; but the preceding are the facts that may conduce to this type of speculation. I could easily attach other scenarios and different speculations, as long as the facts, and especially biochemical and biological facts, are kept within sight.

From this you may deduce that I entertain rather stringent standards on what I consider to be admissible (metaphysical) speculations. It may serve as a guiding light to my repudiation of 'illusion' as a modus vivendi. To have validity, this concept needs a definition, and you will find on close examination that illusion on those terms cannot be defined without circularity. I suspect you may be inclined somewhat to eastern mysticism; I on the other hand see in it a necessary and indispensable stage in the growth of the human mind: a corrective to its (collectively speaking) overweening ambition which, as we know only too well, is apt to relapse from time to time into its infantile state. From one point of view, bearing in mind the addiction of millions of (exceedingly well-educated!) western people to flippant and frivolous beliefs, the eastern philosophers can be said to have grappled more seriously with the really fundamental issues; but that was a long time ago, and since then they have got stuck in this rut. Its value for us today, if I may put it this way, lies in having brought the fragility of the mind to its own surface of consciousness; but of necessity we must go on and find our way, the 'golden road' which lies somewhere between the extreme materialism and the extreme transcendentalism that are still so characteristic of East and West.

Jürgen Lawrenz


John asked:

I'm having trouble answering a big question. Maybe you can be of help. Here it is:

"Different cultures have different truths."

"A truth is that which can be accepted universally."

What are the implications for knowledge of agreeing with these opposing statements?

These two statements are examples of two opposing visions in philosophy, the relative one and the absolute one. Accepting one of these visions means that you choose a camp, either that of Nietzsche, Wittgenstein and Kuhn (relativists) or that of Popper and his followers.

For what it's worth: my very personal opinion is that Karl Popper (though I admire him) chose what has been, since the Enlightenment, the dominant camp, but was nevertheless mistaken.

The relative camp SEEMS to have the future.

Henk Tuten

It is not clear from your question whether you are interested in the implications for knowledge of agreeing with each of these statements individually or collectively. I'm going to try to answer in a way that addresses both possibilities.

It is widely (but not universally) accepted in Philosophy that "knowledge" constitutes a justified belief in a true proposition — where for our purpose here we can define a "proposition" as an assertion that says something that can be either true or false. So the implications of these two statements you have provided arise from their respective notions of what constitutes a "true proposition".

The two statements that trouble you present quite distinct and conflicting notions of "truth". But that is because they come from quite different conceptual realms. So it is not surprising that they appear to conflict when juxtaposed out of their natural habitats.

The first statement — "Different cultures have different truths." This is a classic statement from cultural anthropology. Within that context, the meaning of the statement derives from two observations: (a) what makes an identifiable "culture" are the common beliefs shared by the people of that culture; and (b) what separates one culture from another, are the differences between the common beliefs of the two cultures. For there to be two cultures, there must (almost by definition) be two different sets of common beliefs shared by two different groups of people.

For our purpose here, let's define "a belief" (like a proposition defined above) as an assertion that says something that can be either true or false. What marks a cultural belief, then, is the acceptance of some assertion as true by all (or at least the great majority of) the people of that culture. This general acceptance can be (and often is) quite independent of whether the assertion corresponds to the facts of the matter, or is consistent (coherent) with the other beliefs of the people of that culture. It can even be independent of whether in fact anyone at all actually believes the assertion to be true. All that really counts is whether the great majority behaves as if they believe the assertion to be true.

Within the context of cultural anthropology, the statement in question is not an attempt to establish a definition of "truth". Nor is it an attempt to claim that the notion of "truth" is culturally relative. It is instead a bit of poetic license used to express the fact that different cultures believe in different collections of fundamental assertions about their culture and their world. It is a description of what people believe to be true, rather than a statement about what is actually true or what is actually knowledge.

To take this statement out of its cultural anthropology context is to dip into a school of philosophical thought usually referred to as "Cultural Relativism" (for obvious reasons). Within this wider context, the statement would have to be interpreted as both a definition of "truth", and a claim that the notion of "truth" (and thus "knowledge") is culturally relative. Within Cultural Relativism, a belief is considered to be "true" if it is widely believed to be true within the relevant culture. Since beliefs differ between cultures, as documented by cultural anthropology, "truths" must necessarily differ between cultures.

(Cultural Relativism is more widely maintained as a system of Ethics than as a treatment of truth and knowledge. In Ethics, Cultural Relativism maintains that what is "good" and "right" is defined by the common beliefs of the culture as to what ought to be considered "good" and "right".)

The second statement — "A truth is that which can be accepted universally." Taken at face value, the statement is a straight definition of "truth". It establishes the criteria that determine whether or not some assertion is to be considered true. Whatever the assertion is, if it can be accepted universally, then it is to be considered true. Unlike the cultural anthropology context of the first statement, this definition of "truth" does not require actual acceptance by anyone. It requires only that such acceptance is possible, and makes no reference to how unlikely that possibility might be. Unlike the Correspondence Theory of "truth", it does not reference the actual facts of the matter. And unlike the Coherence Theory of "truth", it does not concern itself with the consistency of beliefs.

Consider an assertion such as "Unicorns exist" or "Fairies dance under the moonlight at the bottom of my garden". Certainly it is thinkable that these two assertions could be accepted universally — independently of whether unicorns or fairies exist or not; independently of whether a belief in the existence of unicorns or fairies is consistent with other beliefs held to be true; and independently of whether there actually is universal acceptance of these assertions or not. Therefore, each of these assertions would have to be regarded as "a truth". Clearly this is not a reasonable approach to a general meaning of "truth". And clearly, this notion of "truth" is inconsistent with notions expressed in either the cultural anthropology or Cultural Relativism contexts of the first statement. So we must assume that there is a hidden context behind this statement that has been lost in transmission.

If truth is determined by the cultural acceptance of the assertion as true, then you "know" any assertion that you believe to be true, and that you have cause to believe is generally accepted as true within your culture. Alternatively, if truth is determined by the possibility of universal acceptance of the assertion as true, then you "know" any assertion that you believe to be true, and that you have cause to believe could possibly be universally accepted as true.

Note that in both these cases, there is no reference to the actual facts of the matter, and no reference to the consistency between one assertion of knowledge and another. Thus, it would be perfectly feasible for you to "know" both that "Unicorns exist" and that "Unicorns do not exist". This is not how people normally think of knowledge when they consider whether they "know" something.

In the absence of any context for the second statement, there are a number of ways to reinterpret it so that it makes a little more sense. We could, for example, draw upon the cultural context of the first statement and reinterpret the meaning of "universally" in the second to mean "universally within a culture". This reinterpretation would at least make the two statements consistent.

Another reinterpretation would be to understand "that which can be accepted universally" to mean "that for which there is justification that all rational people would accept if they were aware of it". This would incorporate the notion of justification critical to the concept of "knowledge" we are employing here. It would also eliminate the unlikely but remotely feasible possibilities opened up by the use of "can". On the other hand, without some contextual reason for this reinterpretation, it is certainly stretching the use of English to find this meaning in the words provided.

I'll leave you with the question of whether or not either the Cultural Relativist or the universal acceptance notion of "knowledge" and "truth" is consistent with how you employ those notions. I know for me, neither is reasonable. Personally, I subscribe to the Correspondence Theory of Truth (wherein an assertion is true just in case it accurately describes the facts of the matter). I find, therefore, that both of these statements are philosophically incorrect, although they may certainly possess poetic meaning within some special contexts (such as cultural anthropology).

Stuart Burns


David asked:

I'm translating into English an excerpt from a paper on Parmenides, yet I'm a novice in the subject of Philosophy. Can anyone tell me if the following terms are well-used in Philosophy, and if not, are there more appropriate synonyms?

1. The term "formulation": Does this just mean "statement"? Which is better to use?

"Fragment 6 explains this new formulation (of the two paths of enquiry), but the interpretations differ considerably ..."

2. The term "negation": At first I thought "negation" and "antithesis" are synonyms. Is that not the case?

"Parmenides' thesis and its negation are represented by two paths of enquiry, one of which,..."

1. If I make a statement, and then make a second statement intending to put the point made in the first statement more clearly, or from a different angle, or in response to a different question, then one would say that I had 'reformulated' my statement. It is appropriate to use the term 'formulation' when one is dealing with two or more statements which would be described as 'reformulations' of one and the same thought.

2. You are right to be worried about the use of the term 'negation' in Parmenides.

The key premise of Parmenides' two paths of inquiry is that there are only two alternatives, for any assertion of the form 'X is', namely, 'X is and needs must be', or 'X is not and cannot be'.

Consider the case where 'is' means 'exists'. (Similar things can be said when we are talking about the 'is' of predication, e.g. 'Pegasus is a horse'.) According to the law of excluded middle, either Pegasus exists or Pegasus does not exist. However, Parmenides reads this as saying that either Pegasus necessarily exists or Pegasus necessarily does not exist. But that is not a consequence of the law of excluded middle, since the logical negation of 'Pegasus necessarily exists' is not 'Pegasus necessarily does not exist' but rather, 'Pegasus does not necessarily exist'. 'Antithesis' is closer to what Parmenides is trying to say than 'negation'.
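The scope distinction at issue can be put symbolically; here is a sketch in standard modal notation, where the box reads 'necessarily' and the diamond 'possibly':

```latex
% What the law of excluded middle licenses:
P \lor \neg P
% What Parmenides takes it to license:
\Box P \lor \Box \neg P
% But the negation of 'necessarily P' is 'possibly not-P',
% not 'necessarily not-P':
\neg \Box P \;\equiv\; \Diamond \neg P, \qquad \neg \Box P \;\not\equiv\; \Box \neg P
```

The second disjunction is strictly stronger than the first: it silently excludes the contingent cases, where P happens to be true (or false) but need not be.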

Geoffrey Klempner

For a fantastic source on Latin and Greek originals and translations, with dictionaries, go here:


Steven Ravett Brown


Michael also asked:

Suppose we are sitting together talking and I produce a living rabbit. Then I cut the rabbit in half, right down the middle. Now we look at both halves of the rabbit and I ask you, "Now where is the rabbit?" Further suppose you decide to answer me, "There is no rabbit, only 2 half rabbits." Next I produce another rabbit and this time I cut off exactly one fourth of the rabbit. I produce a third rabbit, cutting off exactly one eighth, and ask the same question and I get the same answer. Finally, with one rabbit I just trim the end of one toenail and I point at the rabbit and the piece of toenail that I have removed and again ask, "Where is the rabbit?"

This experiment makes it clear to me that what I am commonly calling a rabbit is a completely arbitrary definition of something that, in fact, never exists. The actual thing I am referring to when I say "rabbit" is just a mental image that does not actually correspond to anything of a real nature.

Suppose a coyote is eating a living rabbit. As we watch the rabbit eventually stops struggling and the coyote devours the rabbit, piece by piece until everything has been consumed. When did the rabbit go? At what point in this process did the rabbit cease to be? When the rabbit pieces are in the bowels of the coyote they are digested into smaller and smaller pieces until finally they are decomposed into their chemical constituents, absorbed and incorporated into the tissues of the coyote. It must be clear that any choice we make about when the rabbit is and when it ceases to be is completely arbitrary. Furthermore, what was once rabbit has become coyote. When it is one thing and when it becomes the other is again completely arbitrary. Any choice that we make has no relationship to the actual identity of the thing from the point of view of the Universe.

From the point of view of the Natural Order what we are calling a coyote and a rabbit are just porous bags of molecules, sacks of energy wrapped by the sheerest gossamer netting. And these bags or sacks may come close to each other and then move farther apart, at times commingling so intimately that they seem to be one. But it is always a matter of distance, sometimes very short, sometimes farther apart. It is always a continuum with no intrinsic borders, limits or boundaries. This demonstrates clearly that there are no individual entities, only relative concentrations of energy coming and going with extreme dynamism.

It becomes clear that from the point of view of the Universe there are no entities, only actions, without entities that do the acting. Second, any actions of ours that arise from an idea of self, where self is different from some other, are actions based upon an illusion.

The examples address the issue of Entities, whether organismic or not. This separates in my mind the issue of life versus death from the issue of whether the Self is an illusion because there are no Entities.

What you're trying to tell me is that the universe is a seamless continuum of matter in which what seem to us to be entities are merely local concentrations. This is an idea due to Heraclitus and Parmenides, who lived about 2500 years ago. Another version of the same idea was taught by Schopenhauer. Certain theories in modern particle physics permit that interpretation. So you see the notion is both very old and tenacious.

However, to come to grips with the concept of entities and whether or not the notion of a 'self' relies on it and therefore is an illusion, I will state the theory in your language:

The self is a local concentration ("focus") of a species "C" of the matter/energy continuum ("m/ec") generated by a prior focus of m/ec of species "B" which in turn is the outcome of a preceding focus of m/ec of species "M". To explain:

Species "M" or "matter" is characterised by spontaneous, repetitive, mechanical, predictable and entropic congregation. This species (excluding any possible vacuum) accounts for more than 99.9% of the volume of the universe. If an intelligence were provided with the initial atomic configuration of this continuum, he would in principle be able to calculate the entire history of the universe, atom by atom, through to its end, irrespective of any trends towards local concentrations.

Species "B" ("bio-organisms") is characterised by spontaneous, erratic, non-mechanical and anentropic congregation. The same intelligence, if provided with the configuration of any congregation whatever, would be unable to predict at any instant in time what will occur at any later time, except in some sub-species regarded as mass phenomena. The source of the erratic 'behaviour' is non-normative chemical assembly resulting in an integrated work cycle ("iwc"). Another name for iwc is 'metabolism', yet another 'homeostasis'. This species of the m/ec accounts for less than 0.1% of the volume of the universe.

Species "C" is characterised by non-perceivability, for although certain concentrations of the electro-magnetic spectrum are measurable, they attend but do not comprise Species "C". Moreover "C" displays a property of unknown and unperceivable composition which may be surmised to be responsible for, firstly, the erratic non-computable trends; secondly, the anentropic (non-dissipatory) coherence; and thirdly, the self-referential capability of "B". The coherence as iwc's confers on these focuses the status of "entity"; the self-referential attribute is commonly referred to as "[self-]consciousness".

It is, however, an outcome of the existence of "C", notwithstanding that it is undetectable by objective methods of assay, to confer on some members of "B" the aforementioned attribute of self-referential consciousness. Accordingly the universe ipso facto contains focuses of consciousness conferring on the universe itself the selfsame capacity within those local concentrations.

It is unknown whether or not these concentrations are dispersed across many localities of the universe; it is also unknown whether they are subject to evolutionary development. Known traces of these focuses account for an unknown percentage of the volume of the universe. It is a legitimate conjecture that these traces occupy 0% of the volume of the universe.

Two deductions ensue from this analysis. Firstly: that although the last-named property of some species in the universe accounts for 0% of m/ec, it comprises the only portion of local concentrations where the universe may be said to hold a form of awareness of its own being. Secondly, since these attributes are neither detectable nor manifest in any way whatever among any concentrations of "M", the universe would, if it lacked species "C" altogether, not be referentially cognisant of itself, and accordingly no avenue would be available to declare that it exists. Accordingly of a universe devoid of "C" it may indiscriminately be said "it exists" or "it does not exist". The statements would have identical meaning.

Reverting to common language: you can now deduce from the above (1) that "self" is not an entity but a property of some entities; and (2) that entities exist.

It may help to note that entities are distinguishable from objects; indeed the term 'object' is superfluous in this theory. Observe that your 'experiments' did not adequately differentiate these two types of congregation of m/ec.

As a purely speculative aside, let me say that there is no good reason for believing that science has cognisance of anything more than an infinitesimally small finite segment of the universe. The above theory, to which I am not unsympathetic, may then invite consideration of the possibility that the entropical drift (i.e. what stands behind big bang and big crunch theories) is just an effect suggested by observable phenomena, whereas non-observable trends, e.g. the evolution of consciousness, may be taking place at the same time, though unperceived. The universe may be in process not of burning itself to a cinder, but of converting itself to a "thought", and even this is an old idea, mooted in 1930 by James Jeans.

Finally, it is plain from the above that the notion of 'illusion' would entail a circular definition; and from this it follows that any tenable concept of 'illusion' can only be superadded as a special instance or incidental feature of the operation of 'C'.

Jürgen Lawrenz


Another Michael asked:

Will human beings be annihilated like dinosaurs one day? If the answer is yes, does it make any sense for human beings to still believe in GOD?

First let me say that for me anyone may believe anything.

Your first question is rhetorical. As you probably well know, nobody has the answer. But surely humanity vanishing is not impossible. Democritus already said: everything in the universe is a matter of chance.

Second question: Does any belief (i.e. a god-belief) depend on the answer to the first question? The answer is twofold: IF you believe life has another sense than living it and making the best of it, THEN the answer could be yes. ELSE IF you believe living is only aiming for some kind of progress, THEN the answer is not influenced by whether humanity is annihilated or not.

Because the dinosaurs were annihilated, humans could progress. So after humans there will probably be other beings that continue the progress.

Believing in GOOD and BAD leads to questions like yours. Getting erased was a BAD thing for dinosaurs, but GOOD for humans. So should one be happy that the dinosaurs disappeared? The dinosaur-GOD probably thought otherwise; was that a different GOD than the one for humans?

In a relativist view, GOOD and BAD depend on the system of knowledge. The absolutist (dogmatic, whether religious or not) thinks there is only one knowledge system (or truth).

Henk Tuten


John asked:

Is eating people wrong? Why?

This answer comes a bit late in the piece (it belongs to Answers 20), but I've had the benefit now of reading what previous respondents have had to say, and my tuppence worth of wisdom may still not be amiss in the context.

The question, in the end, has two dimensions to it:

1. People are organisms and thus distinguished from the 'dead matter universe' in certain ways.

2. People are self-conscious agents (in the old but by no means redundant terminology: people possess a soul) and thus are distinguished from all other organisms in certain ways.

Re Point 1. The fact that such questions can arise in the first instance is plainly based on the fact that every organism on this planet is food for some other organism. This is one feature by which the organic realm is distinguished from the inorganic.

Viewed strictly from the 'food chain' point of view, humans are food for lions and sharks (not to mention fleas, lice, bacteria and whatnot): well, why not for other humans? No logical argument is capable of resolving this issue in a morally responsible way. I put it this way to make the elementary point that notwithstanding Kant, the concept of a person (under this present Point 1) is not relevant in the context: humans are mammals. To an outer space visitor, looking for sources of organic food of the flesh variety, we would be as welcome as cows and sheep. Moreover there are human industries today which operate on precisely the same assumption: that the concept of a person is a useless addition to the concept of mammal. Research into artificial intelligence, cloning and a few others could not exist unless there was a covert belief (even if it remains unacknowledged) that ultimately the specific human-mammalian characteristics such as intelligence are portable, replaceable, reproducible and mechanisable. Whether or not this belief rests on a fallacy is not, in present context, an issue. The fact that such industries exist, consume billion-dollar funding grants and operate under the aforenamed intellectual conditions, speaks for itself.

This leaves us with Point 2 needing to come to the rescue. Paradoxically, a good way to start would be with the observation that none of the other organisms on earth have devised artificial means designed to replace themselves!

This is not as frivolous an observation as it may sound, but I'll stick to the essentials of the dimension which apply here. The concept of a person relies on a feature unique to humans in the organic realm, often called a 'soul', but 'mind' or 'spirit' will do equally well. Crucial to the concept of person is a recognition that the features identified by the term 'soul' point beyond an immediately comprehensible factual domain, even beyond the capacity of humans to truly understand what they mean by such a concept. This implies the possibility that the human animal is a participant, possibly the first participant, in an evolutionary potential of the universe that is not governed by criteria of objectivity such as we standardly apply to its study. In ages gone by humans have recognised this potential in various ways by accepting the existence of a creative God, who is at the same time the 'owner' of our soul and likely at some temporal juncture to 'reclaim' it and pipe it into his infinite habitat. That's not such a stupid idea; and I have often wondered why and how we modern, well-educated and scientifically alert denizens of the world want to 'reduce' this dynamic concept back to its poverty-stricken materialistic sticks-and-stones model. However, be that as it may, the concept of a person which derives from this ancient belief system does have the relevance that it is accompanied by a notion of an individually responsible soul inhabiting an indifferently (by chance) selected human body; but, and this is the crucial argument, since that body now functions as the vessel for the nurturing and development of that soul, it is a criminal act against God to kill that body. And eating entails, necessarily, killing.

It should not be difficult, even for an atheist, to accept the embarrassed locution 'emergent property' (what property?) as a scientific substitute for the concept of 'soul' or 'mind'. In any case, the concept of 'emergent property' in itself implies uniqueness; it accepts by default that the result is an individual. What is lacking, however, is a notion (as in 'soul') of the sanctity of that property, of why indeed it should be regarded as anything special at all.

So in our own dishevelled way, we cling to the ill-defined and untenable notion of a 'person'; we seek explanations in social, environmental and evolutionary conditions for a moral definition, but we make no effort (scientifically) to retain the indispensable feature I have called 'sanctity'. I'm not advocating a religious point of view here; in my book 'sanctity' is a human concept. But in saying human I am already making a distinction-of-uniqueness, I am already acknowledging that something is different, if only I knew what it is!

What kind of conclusion can we reach from this? Firstly, that 'mind' or 'soul' are characteristics of unknown constitution and unknown purpose. Secondly, that one effect of possessing these characteristics is that their owner puts questions abroad like, 'is it immoral to eat humans?' Thirdly, that in thinking about these problems, we limit and circumscribe our research effort by the application of inappropriate criteria (demanding, e.g., that a soul be a thing with determinable thingness). And finally, that we inveterately persist in repudiating the genuine value of non-scientific ideas devoted to such research, while all the time most of us still 'feel' that the road to a real answer must lie in some such direction, not in an exclusive reliance on reductive methodology.

Somewhere recently I came across a book on the 'bioaesthetic' principle, where the term 'making special' was put forward as an essential human characteristic in the transformation of banal objects and activities, frequently in the context of religious ceremonial. It is nothing other, as you'll recognise, than the 'sanctity' I mentioned above. It is an inalienable prerogative of humans to enact such transformations, and we will continue to fail in our endeavours to understand what a human being is for as long as we ignore the reality and indeed uniqueness of this faculty. For without it, when confronted with such an easy question as our questioner put to us, we have no leg to stand on.

Jürgen Lawrenz


Elizabeth asked:

I believe most of the questions we ask to ourselves are "irrelevant". When I say "Does anybody know why we are living?" what I am doing is just mixing up wrong ingredients to make a cake. And believe that the cake I made is really a cake!

Let me give an example to elaborate on this. Think of a machine that can make up questions by selecting words from a categorized list and putting them together. One question it might come up with by putting random words beside each other is: "why", "wood", "sing". Makes no sense, huh?

We use a pretty good intelligence in eliminating such questions; however, when we come to questions like "Does anybody know what we are living for?" we do "believe" that this is a relevant question and that there should be an answer to it. Hence we search for one.

Let me turn back to the "Why wood sing?" question. After coming up with a question like the above, our machine arrives at certain possible answers to it using some rules, trying candidates until an algorithm signals that a satisfactory answer has been found.

I call this the "fit". It is some statement that we come up with after a mind exercise, and that pushes us to a certain anxiety level which we associate with the occurrence we label as "finding a solution".

"Because wood burns" might be an answer that our machine finds. Though it does not make sense to us, what matters is whether the answer obeys the rules of the "fit" conditions. If we define "fit" as: if you can make a statement that would relate the attributes of wood and singing in "any way" then your statement is accepted, you can believe that it is a right answer, our machine thinks that it came up with a right answer.

Are there any articles or books that you might lead me to, revolving around these ideas?

Because you asked for references, I am going to start with this quote from Wittgenstein. From what you have said, I think you might see its relevance:

"The Earth has existed for millions of years" makes clearer sense than "The Earth has existed in the last five minutes". For I should ask anyone who asserted the latter: "What observations does this proposition refer to; and what observations would count against it?" — whereas I know what ideas and observations the former proposition goes with.

"A new-born child has no teeth." — "A goose has no teeth." — "A rose has no teeth." — This last at any rate — one would like to say — is obviously true! It is even surer than that a goose has none. — And yet it is none so clear. For where should a rose's teeth have been? The goose has none in its jaw. And neither, of course, has it any in its wings; but no one means that when he says it has no teeth. — Why, suppose one were to say: the cow chews its food and then dungs the rose with it, so the rose has teeth in the mouth of a beast. This would not be absurd, because one has no notion in advance where to look for teeth in a rose.

Ludwig Wittgenstein Philosophical Investigations pp.221-222

Here's a recipe for a cake: Take two squirts of liquid detergent, a cup of flour, a dollop of tomato ketchup and a large packet of salt. Put the mixture in a baking tin and leave out in the sun for four hours.

No? Why isn't that a cake? It is a 'cake' that a child might bake. A make-believe cake. Human beings might not find it edible but then again — you never know — ET might find the 'cake' delicious. Then again, in what sense could something be 'cake' for an alien being? We have to contrive a sense.

Not every sequence of words that sounds like a question is a question. Sometimes it's just obvious that the 'question' is not a real question, and sometimes it isn't so obvious. But what is a 'real question'? If someone utters a sequence of words that sounds like a question, and then someone else comes up with an answer that satisfies us, reduces our anxiety level or whatever, doesn't that prove that the question was a real question?

Now we're right on the edge of the precipice. Because if one accepts that, then it seems that philosophy is reduced to a trivial game.

Let's get back to the rose. In what sense is 'In the mouth of a cow' an answer to 'Where are a rose's teeth?' There's a 'fit' there, you can see the point. But it wouldn't even make a good riddle. The same is true of 'Why does wood sing?', 'Because of the whistling sound it makes when it burns.' What is characteristic of questions posed in genuine riddles is that while many possible answers might 'fit' one way or another, some particular answer impresses us as being the right answer. Riddles have a solution.

Here's a riddle from a Christmas cracker. 'Which flowers like to kiss?' 'Tulips.' Suppose someone asked you this and you thought, 'Orchids.' Why? 'Because of the "kissing" movement the orchid makes when the bee enters it.' The answer fits, but it won't do. On the other hand, there might be another 'right' answer to the question which flowers like to kiss (see if you can find one) so it isn't necessarily a matter of uniqueness. The distinction between the answer that merely 'fits' the riddle and the answer that 'solves' it might be difficult to define in the abstract, even though we intuitively grasp the difference. First, one has to ask what makes a riddle, which I suspect is almost as hard as defining a 'joke'.

In philosophy, 'solutions' don't come easily, but one recognizes the difference between answers that you can make a rational case for, and answers that merely 'fit' in some looser way. Similarly, it is a matter of philosophic judgement, not any set of fixed rules or precepts, let alone a universal theory of philosophical questions, that decides whether a question — like your example of 'Why are we living?' — is worth taking the trouble to answer.

Geoffrey Klempner

You're in good company: Nietzsche and Wittgenstein already considered most questions nonsense, triggered by a wrong use of language; even most of the questions that were bothering philosophers in their time.

You define the mechanism 'fit'. It inherently supposes that truth is recognized by the emotion associated with a 'correct' answer. But what does such an emotion tell me about your view of truth? Does this emotion depend on the system of thought used (relative), or not (absolute)?

Machines only execute commands so there must be rules leading to your emotion. Try to make these explicit.

Recognizing truth takes an opinion about what truth is. That is an essential question that still divides the philosophical community.

Henk Tuten

The only thing that I can think of that is relevant to what you're asking is the late Wittgenstein (and of course his students). Try:

Wittgenstein, L. The Blue and Brown Books. New York, NY: Harper & Row, 1965.

Wittgenstein, L. Philosophical Investigations. Edited by G. E. M. Anscombe. 3rd ed. New York, NY: Macmillan Publishing Co., 1968.

Wittgenstein, L. Remarks on the Philosophy of Psychology. Vol. 1. Translated by G. E. M. Anscombe. Edited by G. E. M. Anscombe and G. H. von Wright. Chicago, IL: The University of Chicago Press, 1988.

Wittgenstein, L. Remarks on the Philosophy of Psychology. Vol. 2. Translated by C. G. Luckhardt and M. A. E. Aue. Edited by G. H. von Wright and H. Nyman. Chicago, IL: The University of Chicago Press, 1988.

Steven Ravett Brown


Douglas asked:

I understand terrorism to be a tactic or technique involving the random murder of non-military combatants. What is your definition of terrorism and can terrorism be justified? Can terrorism be justified if the aims are good and if the end justifies the means?

You are right to view "terrorism" as the name of a means rather than an end. That is why it is possible for someone to be both a terrorist and a freedom fighter (given what most terrorists mean by "freedom" by which they usually mean "independence" a very different thing.)

So, of course, if the end does justify the means, then terrorism is, indeed, justified. That is a near tautology. But does the end of terrorism (even supposing it can be attained, and that is another issue) justify the deliberate random targeting of innocents? That is, indeed, the issue.

Ken Stern

Your definition of terrorism is a formal and rational one. Real 'terrorism' as I define it (i.e. not merely driven by purposes of power) attacks the established system of thought. In that dominant way of thinking it is BY DEFINITION wrong (apart from notions like deliberately killing non-military combatants).

My definition of terrorism is that it is an essential attack on the present basics of life in some part of the world. However I don't include terrorism undertaken for fundamentalist reasons. The driving force has to be to improve society or to keep it moving ahead, and NOT to go back to old rules for the sake of restoring the past.

Your definition could just as well be applied to viruses and automobiles. I agree that both have terrorist tendencies, but I don't think that is what you're aiming for!

Inherently such a definition risks being circular: because of the definition you only recognize particular kinds of terrorism, and because of that you fail to see the danger or usefulness of terrorism.

In my definition the core of terrorism is a paradigm shift. Sometimes that takes killing, BUT not all of it is always to be prevented (nevertheless most terrorism has power motives).

If a system of thought (purely as an example, say the combination of capitalism and rationalism) has become so rigid that it can only be removed by terrorism, then so be it. In evolutionary terms the necessary killing is just a minor issue. In fact the U.S. leaders implicitly used this kind of argument when starting the war against Saddam Hussein. Only they didn't see themselves as terrorists, because their system of thought is at present the dominant one.

Don't get me wrong, I absolutely don't approve of the mentioned war. BUT the point of view that killings are sometimes inevitable is refreshing; otherwise you end up in endless debate. In that respect ants are far superior to humans.

In principle even terrorism (as I see it, and surely not the killing part, even when deemed necessary) is refreshing. It forces one to think about his or her general rules of living. Remember that terrorists generally only start killing out of frustration, i.e. out of a lack of real communication.

Henk Tuten


Beth asked:

Why is there so much pollution? How can we stop air pollution? What causes air pollution? How did air pollution get started? When we knew there was so much pollution why didn't we stop it?

I'm only going to answer the first of these questions: the others pretty much answer themselves if you continue thinking along the lines of my answer. But the strong case is that we have only partially succeeded in coping with our human status. And this implies that, of all the billions of dollars we spend on luxury sciences, we seem never to get around to the issues that affect us right here on earth, in our very survival (compare the ecology problem).

Anyway the initial answer is surprisingly simple. We are mammals, closely related to apes and monkeys, with which we evolved in parallel from an ancestral stem that comprised originally one (or many) species of arboreal simians. However, for arboreal creatures, pollution problems don't exist: they drop their rubbish to the ground (including faeces), where on the principle of organic recycling, other creatures will cart it away or decompose it. Accordingly the genetic profile of this whole evolutionary branch has failed to develop a "clean up after yourself" instinct; you can see this in action in any zoo or in the wild among arboreal monkeys; more importantly, it is plainly there (or rather, still absent as an instinct) in human babies, every single one of whom has to be toilet trained. Compare this to cats, for which the pollution they create is a severe problem; accordingly they possess an instinct for cleaning up after themselves as part of their genetic profile. When therefore the first primates left the trees (probably forced by a changing habitat), their instincts were and remain inappropriate to terrestrial habitation. Specifically in relation to humans, this means we are required to apply our intellect to this problem, because (firstly) our numbers are too great to rely on the fairly slow recycling effort of small creatures, plants and bacteria, and (secondly) our many inventions represent just so many proliferations of rubbish, with which the environment cannot cope (and sooner or later, if we persevere in our thoughtless habits, we will literally choke to extinction on it!).

From this simple issue, as I said, enormous consequences arise. It strikes me as funny (not in the hilarious sense) that we are well educated these days to appreciate our intrinsic animal nature, and yet such obvious repercussions as those I've just drawn for you, are simply not made a part of it. Which of course they should!

Well, now that you know, I hope you feel encouraged to do whatever little you can, remembering that the quantity of rubbish produced by one person EVERY DAY is pretty high compared to other animals, and now multiply that by 4 billion. Conversely, every little bit of rubbish thoughtfully disposed of, and again multiplied by 4 billion, is a lot of (self-)help.

Jürgen Lawrenz


Jenna asked:

What exactly is the mind-body problem?

The mind-body problem is about how non-physical mind is connected to or arises from physical body.

The problem really arises because we make the assumption that the mind is non-physical in the first place, but it is difficult not to do this given the nature of consciousness. The 17th-century philosopher Descartes is often accused of starting this problem in his Meditations. Descartes asserted that while the physical body was extended, the subjective 'I', or the mental, was not. Through a thought experiment Descartes found that he could doubt that physical things exist, including that he had a body, so he could not be identified with his body. Unhappily, the argument relies on the assumption that the mental is different from the physical. Just because you can perform the psychological act of doubting you have a body, it doesn't follow that you are spatially distinct from it, unless you are already prepared to make a distinction between the mental as non-extended and the physical as extended. But the Meditations is quite fascinating for many reasons, and is to be recommended. As far as the mind-body problem goes, the difference between the mental and physical remains a problem because our conscious awareness of ourselves, or our mental life, doesn't seem to be spatial. We can divorce the mental from physical time and space in many ways.

But the problem arose much earlier than Descartes' philosophy, and it persists today.

The Greek philosopher Aristotle thought that there was a part of the intellect that was separate from the lower capacity to form images derived from perception of the world, and 'it is this alone that is immortal and eternal'. While Aristotle saw man as fundamentally a biological organism, this sentence can only be read as expressing a belief in a soul. If there is such a thing as a soul, then the problem of how it is connected to the body arises. Aristotle couldn't account for this. He analysed man into a hierarchy of capacities, each dependent on the one below, but this bit of eternal intellect he just leaves as 'separate'.

The problem of mental/physical interaction is still with us today, as it is widely claimed that the mental, which is non-physical, cannot causally interact with the physical. What sort of causal laws would allow this to be the case? It does seem that by making a decision, which is a mental event, we can bring about something physical in the world, but science cannot explain this because science deals with the physical. The answer seems to be that there is no soul, no thing that is mental, but that the mental is just an aspect of the physical world. However, there are philosophers who believe in mental realism and the possibility of psycho-physical laws holding between mind and body, so the mind-body problem hasn't been settled.

With scientific advance it is now understood that the mind is dependent on the brain but the problem remains about how the brain gives rise to consciousness. While it remains the case that philosophers with a bent towards neuroscience cannot explain this, and cannot find a theory that reduces consciousness or subjectivity to physical theory, it is still possible to hold that the mental could exist without the physical (and, for sure, it could logically in that we can imagine that it could) or that it has a separate reality.

Rachel Browne


Lea asked:

Where in the writings of Lao Tzu and Chuang Tzu is there a critique of Confucian metaphysics? I have been searching for days and have yet to find any such attack. I have found Taoist critiques of Confucian ethics and epistemology; however, I have not been successful in finding a Taoist attack on Confucian metaphysics (or, more accurately, on the Confucians' lack of metaphysics). Also, am I correct in thinking that Confucians do not really have a metaphysics per se but that the neo-Confucians do?

As I am completely baffled and utterly frustrated, I would sincerely appreciate any help anyone can give.

I'm not an expert on Chinese philosophy, but to the best of my knowledge you didn't find a critique of Confucianism in Lao Tzu or Chuang Tzu because they didn't write one, and were not known to have done so. Moreover, to call their thinking, or that of Confucius himself, metaphysics is quite inappropriate. This is importing western concepts into the Tao, which is not compatible with our notion of metaphysics. Such textbooks on Chinese philosophy as those of Chu Chai or Fung Yu-lan inform you on their first pages that metaphysics is an absentee from classical Chinese philosophy. So perhaps your search needs a re-examination of what you understand by metaphysics. Similarly with epistemology: with that, you have me baffled. I cannot remember reading a single epistemological text, critique or even epigram between Confucius and Han Fei. Now I concede that in post-classical philosophy (say, around the 11th century) there were some attempts to broaden the base of Chinese philosophy to include epistemology; and as to metaphysics, I suppose it is legitimate to call it that once Buddhism infiltrated traditional doctrines. But I'm less certain on this, and the best advice I can offer you is a perusal of Wing-tsit Chan's A Source Book in Chinese Philosophy. You don't have to read its over 1000 pages, but if you study the author's introductions to each section, you should have no trouble isolating whichever trend or school showed metaphysical and/or epistemological leanings. Precious few, I would say, always exceptional, and never part of the mainstream. Hope this helps in a small way!

Jürgen Lawrenz


Darren asked:

I am not a scientist, I am 27 years old and I have an idea in relation to time. It is not a subject which has consumed much of my "time", but for some reason I feel the need to confirm or deny my beginning of an idea and to know if it is (and I strongly presume it is not) a good basis for a theory.

Assuming time is unique to each individual, could not each person have their own timeline, i.e. from birth through life your actual timeline remains a constant? You move along your timeline at your speed. The speed you progress along it is set; it cannot be altered, you can move neither slower nor faster. This reinforces the theory that "now" does not exist, because each individual has their own timeline; also because of this, the universe would continue to evolve without sentient beings, and if one person's timeline is omitted other timelines would not be affected. This also runs with nature's natural cycles (for a crude example, the earth will evolve and the end of its natural cycle is collision with the sun; this is the earth's timeline).

Every living object has their own timeline, the only objects which do not have a timeline (or natural cycle) are those created by man.

Each person moves through space along their timeline at their speed (time). It is therefore not possible to move back or forward along your own timeline, but it is possible to move to other people's timelines at any point. Events and future scenarios are always there; they are the background/ the "space". Your timeline can alter direction to move through events which it would not normally cross, but overall events/ scenarios "float"/ pass across yours and everybody else's timelines and are therefore not totally predictable, though the number of events crossing your timeline at any point can be numerous.

Despite actions/ future actions it must be remembered that no two individual timelines ever cross. If lines are crossed then maybe this is time travel.

Time is also therefore not universal, and to use the experiment with the atomic clocks as an example: this cannot be related to time travel, because each person's timeline in the experiment has remained the same; you have not changed their timeline in any way, or the speed they travel along it.

Is this a new idea (does it make sense?!), or can you point me to other similar theories? For some reason it seems to me to work, and it explains a lot of paradoxes about time, or backs up other theories, e.g. the absolute conception of space.

I am a layman and would be interested in knowing more about the paradoxes and theories to see if this idea can be extrapolated to provide answers.

I'm answering this because you seem in earnest.

Um... you're wrong.

No this is not a new idea. No it does not make sense.

That's the summary. I'll highlight a few points. For one thing, think about this: time is the basis of movement, right? That is, an object or whatever traverses a distance in a time interval, and so it moves... that's what movement is, right? Now, a) how then do you "move through" time; b) how does "time move"??? That doesn't work, if you think about it.

A "timeline". Ok, now just what is that, besides a spatial metaphor for a particular comprehension of time? Yes, yes... maybe you've read about "timelines" in relativity, in sci-fi, or whatever. A timeline in relativity comes from a very particular analysis of time and motion, which does not fit with what you're saying. Anything else is just metaphor.

Ok... "every living object has their timeline"... you're confusing at least two concepts of time here. One has to do with events taking place "in time"; another has to do with the passage of time (which by the way doesn't really mean anything... another metaphor). "Events" means something... but I don't think you know what that is, because no one else does either. Or to put it another way, there are lots of theories out there about what an "event" is. The "passage of time" is yet another, yes, spatial metaphor for time that we use to comprehend certain aspects of it.

There's just too much confusion for me to go further... you keep taking metaphors as if they are reality. "Timelines", "scenarios", "backward", "forward"...

Look... before you do anything else on this subject, read these books:

Lakoff, G. Women, Fire, and Dangerous Things. 2nd ed. Chicago, IL: The University of Chicago Press, 1990.

Lakoff, G., and M. Johnson. Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought. 1st ed. New York, NY: Basic Books, 1999.

Steven Ravett Brown


Kim asked:

Why is astrology condemned by Christianity?

Astrology is regarded by the Christian Church as a form of fortune telling; it is one of a list of practices condemned by the church. Those involved in these practices are considered to be atheists or, more significantly, followers of the Devil. Hence, astrology is linked with trickery, conjuring, witchcraft, deceit, spiritualism, psychic phenomena, all types of fortune telling, mysticism, etc. The claim is that these are all practices condemned by Jesus, Paul and the Apostles.

We might ask: If these practices are condemned, why is it that Christianity accepts miracles? The answer is, simply, because Jesus and the Apostles are understood to have performed miracles like healing the sick, making accurate predictions of future events, changing water into wine, raising the dead, feeding the five thousand, etc. The difference between those who perform miracles and those who perform tricks is allegedly made clear in the New Testament where, in the Acts of the Apostles, the powers of Stephen the Apostle are compared to those of an outstanding conjurer and mystic, and found by the people to be vastly different; to put it crudely, the mystic was not in the same league as Stephen, who is seen to have powers far superior to his. This power is alleged by the church to come from the gift of the Holy Spirit within him. Those selected by God are blessed by the presence of the Holy Spirit, which invests them with special powers beyond those of ordinary people.

Anyone claiming supernatural powers who is not blessed by the presence of the Holy Spirit is considered by the church to be a charlatan or a servant of the Devil. The seeking out of witches in the 17th and 18th centuries is well documented; all the victims were accused and condemned on the basis of church dogma regarding special powers. If a special power was not coming from the gift of the Holy Spirit, there was only one other source: the Devil. Amongst the victims of the witch finders were many alleged fortune tellers, including those dabbling in astrology. It was feared that fortune tellers could not only foretell the future, but could also influence it. It was deemed likely that such influence would come by way of the Devil and would constitute a challenge to God's plans for his people.

Some creeds now take a more lenient approach to astrology; however, there are still some more dogmatic creeds that adhere rigidly to the old order.

John Brandon

Astrology ascribes power over people's lives to "heavenly bodies" — the planets, sun, moon and stars — and their movements through the sky.

Christianity rejects this view. It holds that only the one God has power over the universe and everything and everyone in it.

In this, Christianity, Judaism and Islam agree. Islam also explicitly rejects faith in astrology. In the Qur'an, the prophet Abraham is said to have declared: "I love not Gods that set" (meaning the heavenly bodies).

Yahya Abdal-Aziz


John asked:

Does Descartes believe that there is not an external world?

On the contrary: the biggest book he wrote was largely about the external world, about cosmology, physics and science altogether. You seem to be confusing something elementary here. He said that he could conceive of his mind independently of having a body; but he never extended this to say that, in this world, it was possible to have a mind without a body, or that the world is an illusion as such. What it boils down to is an incompatibility between mind stuff and matter stuff. But that is not tantamount to a denial of the latter.

Jürgen Lawrenz

No. In Meditation 6 Descartes argues that there is an external world, although it isn't at all the way it appears to us.

In Meditation 1 he presents the traditional skeptical arguments for thinking we do not know for certain whether there is an external world. He never argues that there is no external world, and, in fact, writes in his Preface that he believes there is an external world.

Ken Stern


Tweeglitch asked:

Set me straight please. Why isn't absence of evidence evidence of absence? If there is no evidence available in support of the existence or non-existence of some thing then what is the proper attitude to take to the question of its existence? Is it to say that it might exist? It can't be practical to act on this given all the 'mights', all the things that could be imagined which may or may not exist with nothing to support the truth of either. But to ignore the possibility (unknown as it is) by not acting on it doesn't seem right either; one's attitude would be the same as if it doesn't exist without actually saying so. So why not just say so? The model of the universe I use to navigate the world around me can only be built of, and updated by, evidence if I want it to be the truest reflection of what's actually out there. So if I detect no trace of some thing then that thing has no place in the model.

But surely you can think of situations in which, without evidence, it is clear that something is the case... a murder in which there is only one reasonable suspect, but where no evidence has been found. Usually this is an unstable situation, i.e., evidence will be found. But not always. To put it another way, how exactly do you prove a negative?

But let's address your question directly: is the absence of evidence for — what? Another parallel universe? A god? Extraterrestrials? — evidence of their absence? Well, how about aliens? We have no evidence for the existence of aliens (no, sorry, all you UFO enthusiasts out there, read the Skeptical Inquirer). So are there no aliens? Well, given the size of the universe, etc., that seems utterly, absurdly, unlikely. So we have to say that at this point we just haven't explored enough of the universe to say one way or another, and that all the evidence for the origin of life here on earth indicates that it could happen somewhere else, that there should be extraterrestrials. But that's just inference, not evidence.

What about gods? I'm not even sure I want to do this again... I've done it so much here. I'll just say that I think the evidence, such as it is, for Jupiter, Krishna, Thor, Allah, Osiris, Yahweh, Astarte, and so forth is all equally strong (equally weak, actually). So it all sort of cancels out, doesn't it, since each has implications contradictory to the others. Therefore, even if you assume evidence for these and the multitude of other gods which humans have believed in, you're left with a category of possibilities which does not seem a reasonable one. So we go on to other categories. But again, how do we prove that Ra, for example, did not create the universe? I know of no such proof, and thus again we're left with processes of inference.

So we have to admit that while logical positivism is too strong, since it would seem to eliminate things like extraterrestrials, equally, simple belief or even some degree of "evidence" is too weak, since it would leave us arbitrarily worshiping some god, or maybe a different one every day, or whatever.

What to do, then? I'll tell you what: do science. How do you do that? I'm sorry but I just cannot condense that into a paragraph. Read this: Kitcher, P. The Advancement of Science: Science without Legend, Objectivity without Illusions. New York, NY: Oxford University Press, 1993. It's not an easy book, but it will give you some answers, I think.

Steven Ravett Brown


Mahmud asked:

I have a very simple, but very tricky question. How do we know that we know?

Well, most times we don't know we know, we just know. But sometimes, when we claim to know something, say that Quito is the capital of Ecuador, we will give reasons for believing that we know that. We can say things like not only did I look it up in the World Atlas, but I also looked it up in the latest World Almanac, and to make sure I knew that Quito was the capital of Ecuador, I even went so far as to telephone the Ecuadorean Consulate. The woman I talked to said that yes, Quito is the capital of Ecuador.

So that is how I know that I know that Quito is the capital of Ecuador. I just accumulated more evidence than I normally would have thought I needed.

Ken Stern


Sarah asked:

How do I support a Greek Philosopher's Theory such as Hesiod?

You have me bamboozled. Hesiod was a poet who wrote a cosmogony that served as a kind of basic religious text for the Greeks. In other words, his story tells how the world and the gods came into existence; and he does it by stringing together many episodes, as is customary in epic literature. Now is this what you wanted to know? If not, you may have to post your question again and clarify whether you are asking about Greek philosophers using Hesiod for their theories, or what kind of philosophy can be extracted from Hesiod, or whether Hesiod's work is itself a philosophical theory. All of these are possible ways of reading your question.

Jürgen Lawrenz


Tina asked:

Is a person's ethical worth measured by their thoughts or actions?

Suppose you were standing next to someone who was beating up a child, thinking to yourself, "How horrible, I must stop this, it is totally immoral!". And you just stood there until the child died. What would your ethical worth be?

Suppose you were standing next to a person beating up a child you hated, and you were thinking, "Do I really care about this child? The one that just burned down my garage? The world is better without [him or her]!" Nonetheless, you restrain the person beating up the child and call the police.

Which is more ethical?

This is also a problem I have with such things as intelligence tests. Is a person who "should be" intelligent, who scores high on such tests (which is, I concede, an action of a sort), as intelligent as someone who scores just as high, or lower, who becomes a great mathematician, writes a great novel, etc.? I would say quite decisively "no". Yes, motivation, luck, etc., etc., play a part, as they do in everything. Nonetheless, after seeing multiple self-proclaimed "geniuses" languishing in coffeehouses, doing drugs, etc., and comparing them with others who, even if they spend time in the same company, manage to go back to their little room and write, or program, or whatever... I do not believe in the primacy of one's "thoughts", or in one's "essence", or in someone who is "really" good or smart or whatever, if only they'd just do something. No. Doing something is the essence of such attributes.

Steven Ravett Brown


Batey asked:

I would like to know, how different is Levinas from Heidegger in your opinion? Levinas talks about the Other and thinks that this is a new idea that Heidegger has missed, do you think this is fair to Heidegger? and could you please use some of Levinas examples of the Other, such as other people, God, "mysterious" women and ones own children to clarify my question?

Levinas is very different from Heidegger, and this seems to be because his writings are deeply informed by religion and ethics. His objection to Heidegger is that ontology cannot give rise to values.

It is not that Heidegger 'missed' something, but that his project of thematization and the reduction of being to ontology does not reflect what it is to dwell in a world where much is a mystery. The Other cannot be thematized because it is that which is beyond the realm of understanding, which is why Levinas uses a great deal of metaphorical language. While Heidegger distinguishes between beings and being, and thought that we could attempt to grasp the meaning of being, Levinas argues that there is also the 'otherwise than being', which is not ontological because it is the transcendent ethical relation, which is infinite and non-graspable.

While for Heidegger, we experience being as being in time and so we experience being towards death, for Levinas we don't experience the infinite, but have unlimited responsibility towards the other, and so we do not have anxiety simply for our own death, but for the death of others, and that this is so is beyond representation. The difference here is that Heidegger looks at what can be said about subjective experience and can be represented and understood while Levinas moves beyond the simply subjective which gives rise only to concern for oneself and places man in a dwelling which is also metaphysical.

I'm not sure how fair Levinas's criticism of Heidegger is. Heidegger was limiting himself to a phenomenological description of how our experience can be represented or understood. Levinas wants to talk about that which cannot be reduced to representation, and so cannot be said to be experience as such; but he offers a different account of our dwelling in the world, one which is more mystical. I think all that can be said is that you will favour one philosopher or the other depending on your own attitude to the non-representational, or on whether your commitment is to the mystical or the rational.

God is other because he is not the 'thematization of the thinkable'. It is because of His transcendence and non-presence that atheism is possible. It is in our responsibility for other persons that we hear the word or command of God. This responsibility is shown in our recognition of 'Thou Shalt not Kill' and that we are ready to die for others.

The other person is also other in that he is not reducible to an ontological object. Another person is both closer than an object can be (think of kinship and the caress) and yet also totally evades us, in that he is more than we can perceive.

Levinas says that 'the other is par excellence the feminine' or the maternal, which is a metaphor both for responsibility and for the mystery of otherness. 'Equivocation constitutes the epiphany of the feminine'. The idea of woman is something ambiguous that cannot be tied down because it is non-signifying, suggesting both tenderness and fragility as well as responsibility for the vulnerability of others. The other, as with the feminine, is both non-definitive and infinite.

The parental relationship also puts us into a relation with infinity, or the future, as a promise of eternal life which will be ours, and yet not ours.

Rachel Browne


Jay asked:

What is it we immediately perceive in perception? Is it the intersubjective, material world as naive realism posits?

Or what is the relation of what we immediately perceive to the purported external world?

It depends on how old you are. And chances are that what you perceive in perception is stale news once you know you've perceived it. But I suspect that what you mean by 'immediate perception' is actually sensation. To make sense of this issue, you need to keep these two items clearly apart.

Sensation is what your nerves bring in. You see a light; you feel a pain: this is sensation. Some sensations (well: most of them) require further interpretation: this is where perception comes in. Perception is evaluation of sensations.

Let me explain this with a simple example. When you hold an apple in your hand, the shape, the weight, the smell, the colour: all these are, for the time being, sensations. But by themselves these would not make the apple recognisable to you as a fully rounded object. Perception is therefore the composition of several sensations into a single perceptual unit.

Now when I say that the 'news' of this is stale by the time you are aware of it, what I mean is that this perception is not sourced exclusively from the outside. You could not recognise an apple unless you had an 'apple' memory. So when your perceptual faculty brings the apple stimuli together, it doesn't simply rely on them without question: you are put (so to speak) 'on hold' while the contents of your memory are examined to verify the stimuli. It's quite fantastic, actually: let's say 69% of the apple you 'see' matches exactly what's in your memory, while the other 31% is new; then what you see will actually be a composite of 'remembered' and 'real' stimuli. The reason it happens this way is that images are extremely labour and space intensive; your brain could probably not assemble a good representation from the 'factual' data alone in a time span useful to you (you know how long it takes to load 50 gigs into your computer!). So the compare-and-match process serves to speed up your recognition of the object.
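
If it helps, the compare-and-match idea can be caricatured in a few lines of code. Everything here is invented for illustration (the feature names, the tagging of features as 'remembered' or 'new'); it is a toy sketch of the process just described, not a model of the brain.

```python
# Toy sketch: build a percept by reusing remembered feature values where the
# incoming stimulus matches memory, and constructing fresh ones where it differs.
stored_apple = {"shape": "round", "colour": "red", "smell": "sweet", "weight": "light"}

def composite_percept(memory, stimulus):
    """Return a percept mixing 'remembered' (matched) and 'new' (novel) features."""
    percept = {}
    for feature, value in stimulus.items():
        if memory.get(feature) == value:
            percept[feature] = ("remembered", value)  # cheap: replayed from memory
        else:
            percept[feature] = ("new", value)         # expensive: freshly constructed
    return percept

incoming = {"shape": "round", "colour": "green", "smell": "sweet", "weight": "light"}
percept = composite_percept(stored_apple, incoming)
matched = sum(1 for tag, _ in percept.values() if tag == "remembered")
print(f"{matched}/{len(percept)} features replayed from memory")  # prints "3/4 features replayed from memory"
```

The more of the stimulus that matches memory, the less fresh construction is needed, which is the speed-up the paragraph describes.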

Finally, my remark about age: many types of stimuli are repetitive, and as you get older many of them might even be identical. In a very real sense, then, when you see something for the 50th time, your perceptual system is not going to bother going through the 'expensive' routine of deciphering and constructing the stimulus all over again; it just replays it from memory.

Now I should give you some references, but most books in this branch are so damned difficult that I'm reluctant. But to make a start somewhere, perhaps you should browse through Donald Hoffmann, Visual Intelligence (Norton). But be warned! This is a subject that, once you get a taste for it, won't ever leave you alone!

Jürgen Lawrenz


Eve asked:

I want to write a book based on philosophy. I don't want to be preaching my ideas, I want to explain a philosophical dilemma or question that is a subject of debate, and then put my opinion forward, but show how it relates to other opinions and what parts of those opinions I agree with and why.

Do you have any advice on how I should consider setting it out, do you think it would be best to just tackle a few questions that are related to each other? Do you think I should set it out as a story, or somewhere in between a novel and a textbook (I don't want it to be like a textbook or a revision guide!) Also are there any books I should read to research?

Another of those oldie questions down at the bottom of the list... I guess Eve has given up on this... but... for some reason this one intrigued me...

Ok. On the face of it, without another 10-20 years of education, what you want is simply ridiculous. But then I started thinking about Colin Wilson and people like him. So... take a look at some of the early novels of Colin Wilson: Adrift in Soho, The World of Violence, Ritual in the Dark... neato stuff, which he wrote in his 20s. Try something like that, if you're able.

Steven Ravett Brown

Advice is cheap, as the old saying goes; and in relation to your question, the difficulty is that you may not be altogether aware of how big a problem area you're tackling there. However, there are fundamentally two ways of attacking this issue:

Firstly, you can study works by writers who have done this sort of thing before; and obviously you would choose those whose writings are truly philosophical, not just argumentative. For example, Sartre's The Age of Reason and Camus' The Outsider are, on a loose definition, "philosophy set in motion" in a fictional environment. You might use those as role models: read them, and then go to the secondary literature to get yourself directed to the portions of their more formal philosophical writing where the same issues are dealt with.

Other novelistic examples are found in many writers who are not specifically philosophers; I might mention Aldous Huxley and Thomas Mann as conspicuous examples; but here as there you need to be well informed about their philosophical backgrounds to derive profit. And you really cannot go past the classical examples of Voltaire's Candide and Bacon's New Atlantis.

Then there are writers, very few, whose novels are truly philosophical in an authentic sense: written by men who were philosophers but never actually wrote a philosophical text, just novels. Dostoyevsky's major novels belong in this class; and I would go so far as to say that no-one can claim to be comprehensively educated in philosophy without having read at least The Devils and The Brothers Karamazov. Joseph Conrad's Nostromo and Stendhal's The Charterhouse of Parma might also be said to make the grade, and there are a few others, although by now you might find yourself with a major reading list to tackle. So perhaps you might alternatively consider Option 2, which assumes that you possess reasonable literary talent and especially a facility for writing convincing dialogue.

An obvious starting point would then be to select one or several connected philosophical topics, say "good and evil" or "the concept of justice" or some topic in fashion today that motivates you. Read what philosophers have written about them, pro and con, and then put up a few characters whom you'll have to portray as "embodiments" of opposing trends. The more complex these characters, the more convincing they are likely to be. I mean by this: ensure that A is not just evil, but has a streak of unexpected and eloquent compassion about him/ her (for example). The ideas you set in motion, being exemplified by persons, must not be "monolithic", but shade in and out contingent on circumstances, events, loves, hates, politics etc. The best example of this sort of thing is again Dostoyevsky, and you could do much worse (if you're serious) than to read his Devils five or six times and attempt to tabulate the characteristics of the main characters and how, why and under what circumstances they come out, what conflicts cause their characteristics to change, crack under pressure, or become modified in one or the other direction.

I must not fail to mention, finally, Plato. Several of his dialogues are the best models ever written. Especially pertinent in your context would be Protagoras, but Symposium and Republic are equally brilliant, though more complex and extended.

Well now: It remains for me, I suppose, to wish you the best of luck and happiness in your endeavour; and I do expect a mention among your "influences" when you pick up your first Nobel!

Jürgen Lawrenz


Herschel asked:

Do you think consciousness is an emergent phenomenon, and if it is, is it just an analysis? Or is consciousness a product of something greater and more fundamental? I was wondering how an answer might account for near death experiences and anaesthesia.

A Danish popular science writer, Tor Norretranders, wrote a book called The User Illusion, subtitled "Cutting Consciousness Down to Size" (1991, translation in Penguin 1998). I found it fairly persuasive, and the facts related fascinating. May I recommend you read it if you haven't already done so. His conclusion is that consciousness is an illusion! He attempts to prove that we make many routine and critical decisions, and act on them, BEFORE we are conscious of the need to. If he is right, one wonders why consciousness exists at all.

Tor Norretranders would say, I think, that consciousness is an emergent phenomenon, but it is vastly over-rated. It emerges from something greater and fundamental — the non-conscious computational processing done by the human brain, viewed as a massively parallel distributed processor. By contrast, he likens consciousness to a serial information stream controlled by a primitive computer with just a single CPU. The bandwidth — the information processing power, in bits per second — of the brain is enormous. But that of consciousness is tiny.

Tor Norretranders argues that an inordinate emphasis on consciousness has large, destructive effects in society and in our personal lives. Definitely worth a read. There's a lot in it to assimilate. I think I'd better read it again!

Yahya Abdal-Aziz


Roger asked:

My question has to do with the number of humans in the world versus the number of non-human sentient animals. It requires a rather lengthy setup, so bear with me.

Imagine that you could somehow catalog every fertilized egg cell (zygote) that would one day grow to be a sentient animal. That is, pretend that you had a computer database describing every zygote that ever existed, from that which would become a prehistoric mosquito to one that would become Albert Einstein. All would eventually mature into a fully conscious being. The database would contain all kinds of information about the zygote, but you would primarily be interested in the species of the animal that it would one day become. The reason that the database contains information about only the zygote and not the animal itself is to underscore the fact that we all start out as such, no matter how big we become.

Now imagine that you query your database for the total number of homo sapiens egg cells that have ever existed. Nobody knows what that number would be, but let us say that it is roughly 15 billion. Now query your database for the total number of all of the zygotes. Obviously, that number would be much much higher. The number of insects alone would be extremely high. There are 200 million insects alive for every human on earth (I got that from Hollywood Squares). Multiply that by the number of years that insects have existed, and you obviously have a very large number.

Divide the number of humans by the total number of animals, and the result is an extremely small number. It would seem that this number represents the probability of being a member of homo sapiens, given that you are any sort of sentient animal having been born sometime in the past. Let us conservatively estimate that number to be one in 10 trillion.

Now let us take a closer look at one in 10 trillion. An event having that probability would be EXTREMELY unlikely to occur. In fact, I would even describe it as being "freakishly improbable".

The average lottery player would be much more likely to win the jackpot twice during her or his lifetime. I do not know about you, but I have never won the lottery once let alone twice. It would seem that, by having been born human, you and I have won the jackpot in the lottery of life. Twice! I cannot accept this as true. "Freakishly improbable" events just do not happen to you or me!

Now, the following thought comes to mind:

Do the lower animals such as insects really possess a rudimentary consciousness?

Most researchers in the field say yes. Insects and other such animals do indeed possess a rudimentary consciousness. They are sentient. They are not like zombies, sensing and reacting to things without having an awareness of them. They have consciousness.

Thus, human consciousness is an extremely rare thing on our planet.

Here is my question:

Are we humans really the unlikely winners in the lottery of life, or is there some other explanation? Could the probability of being human actually be much greater than I have assumed?

The problem with your arithmetic exercise is that numbers as such have no bearing at all on the situation you are portraying. In calling 'freakish' the likelihood that a zygote becomes a human rather than a mosquito, you forget that the atoms in the universe that might or might not become part of a living thing have an even more freakish improbability to account for.

Forget numbers and probabilities and look at structures. Everything in the universe has structure, and thus the only pertinent type of argument along your lines of thought is to consider whether one or another type of entity is structurally probable (or possible), i.e. whether it makes sense for some type of structure to exist and how and where it does. In such a context a spiral nebula is easily explained by gravitational forces; and the entropy that eventually results is equally plausible. The context of your question is biochemical and biological, however, which entails a different slant or perspective.

Let me get one thing out of the way quickly. Which species possess consciousness and which do not, we can scarcely be certain about; but a default theory is that nerves are the absolute minimum. Now consider that nerves are themselves living things! Are they conscious of their own consciousness, or of ours, or of neither? You see (I hope) what a futile theory this is! Consider further, just for now, that a great deal of this theorising is concerned with our place in the sun, and that the pendulum tends to swing rather wildly between theorists who want to convince us that we're just overgrown apes and should, perhaps, dismantle civilisation and return to the trees, and those who believe that we have a destiny to manage this planet before somehow getting ourselves installed as managers of the solar system or even the whole galaxy. Well, I can tell you I'm not on the side of the former, even if I reserve the right to remain sceptical about the latter.

Now: in logic (or in the lottery scenario) there was no compelling reason for bacteria ever to grow into anything bigger or more complex, seeing that they were (and are) perfectly adapted to survive just about any calamity short of this planet physically blowing up. Yet it is a plain fact that 'upgrading' is also a kind of 'default programme' for organisms: that's what the theory of evolution is all about. A deep problem in that theory, however, is that its proponents are also lured down the path of numerological speculation and thus keep stringently to the mechanical doctrine of genetic accident, which is no answer at all, but a simple causal argument that leaves you looking for more causes right through to infinite regress.

I can't write a book here, although that's what would really be necessary to answer you in depth. All I can do is give you my conclusions in one paragraph, and then a couple of titles for you to pursue the matter on your own bat.

Carbon atoms are capable of forming polymeric chains of immense length and infinite flexibility; carbon is the only atom so endowed. This suggests that carbonaceous structures will be different in kind from all other molecules, as indeed they are. A specific type of this kind of molecule, called a macromolecule (i.e. giant molecule) or polymer, given certain temperature conditions such as prevail on earth, has the capacity of 'turning itself over' in such a way as to construct an integrated work cycle without external mechanical push-and-shove. This is difficult for us to come to grips with: we are still in thrall to the Cartesian division between mind and matter and therefore prone to seek explanations for all structures, including biological ones, by the route of reductionism. This doesn't work in the present scenario. However, the point to be made is that these latter structures, which we call 'bacteria', are obviously possible and, given the conditions named, altogether probable. Most biologists would tell you that the chance of some such form of life arising on any earth-like body in the universe is quite high. This leaves only the last item to be explained: why do they upgrade?

Now this is where numbers and probabilities get stuck. Remember my saying that it seems not to be compelled by sheer survival. This part of the conclusion is therefore entirely provisional. On earth, it occurred because the atmosphere changed about 2.5 billion years ago to a level of toxicity that was fatal to microbial survival (I'm talking about the air we breathe!); and it is from this point onward that 'upgrading' sets in. Organisms needed to find a way to detoxify the air; this meant devising respiratory structures, and therefore necessarily an increase in body size and complexity. This process has never stopped, but greater size and complexity impair survivability in other ways, as will be apparent without my spelling it out. But it explains why for every billion bacteria, there might be a million mozzies, 10 cats and 1 human. And all these structures are implicit from the moment that bacteria were compelled to follow this path of proliferating organic forms.

Ok, all this is rudimentary. My point is merely that under certain circumstances, as demonstrated by historical developments which we can easily trace back, this 'upgrading' was already latent in the first biological polymer when it came into existence. Whether such a path is logical may be another matter. Whether a human being would arise necessarily in the course of such evolutionary pathways I'm inclined to doubt: too many other contingent factors might intervene, and humans are not necessarily the 'goal' of these developments. There may be no goal at all, of course; or if you're a believer, you might say that humans are the outcome of a directed evolutionary path. One way or another, however, human-like creatures may not evolve on other planets, even though there is no logical argument why they should not. What happened here cannot be denied to be possible elsewhere, since as a possibility it has already occurred. Finally, the argument for 'upgrading', which includes, eventually, nerves and therefore consciousness, is pretty plausible too. There are (even if I deny it to mozzies and fleas) enough species on earth with nerves to admit that consciousness is an almost inevitable adjunct to young, upwardly mobile creatures.

Humans? Well, put the question the other way: why dogs and cats and horses? All these are, to some extent, contingent developments. Humans might have developed without dogs, cats and horses. We would not have evolved, however, without the mammals making the grade. So there is a certain hierarchy which cannot be ignored. When one talks about structures, as I did, it is incumbent on one to remain aware of the priority of some types of structure over others.

But: the 'movie' of life's evolution on this earth is not a path, it's a bush with thousands and millions of branching points at every juncture. It's likely to be exactly the same, but different in detail, on any other life-bearing planet. So I would expect some structure analogous to earthly life forms to arise wherever the possibility exists; I would expect (as indeed most of us do from sheer habit) that if there are mammals, they would also have bilaterally symmetrical body forms, because these things don't happen just on a whim; but I would expect that their details might be very, very different. For example, four different types of eye have evolved on earth, and the lobster's eye is so different from ours that you wouldn't believe it (it's made up of little pixel-like rectangles!). Moreover I am of the belief that, because the mind is a thing with structure, aliens endowed with a mind would also show a structure of thinking similar to ours, though obviously different again in detail (namely in the influence which the environment would have on their intuitions). However, that's enough speculation; I hope you get the point that it's not numbers, but structures, which are important. Let me add, as a joke, that the probability of life arising on earth (mathematically calculated from the chance of certain amino acids joining up by chance) is 0.0000000000001 or even less: the universe isn't old enough to see it happen. So: structure and complexity to the rescue!

Something you might like to read: Stuart Kauffman: At Home in the Universe; Graeme Cairns-Smith: Seven Clues to the Origin of Life; Ian Stewart and Jack Cohen: Figments of Reality. Now these books are fun; and since they were written by scientists working at the coal face of research, you can rest assured that they don't push fancy theories without some hard evidence at their back.

Best of luck!

Jürgen Lawrenz

Your question is based on an incorrect conception of probability. The probability of humans is either unanalyzable, or unity. If you want to attempt to analyze the probability of humans, you certainly cannot do it by counting zygotes. That assumes that all zygotes are in a big barrel, and someone is picking them out at random. I mean... really.... You might try to start with bits of RNA in a puddle and try to estimate the odds of coming up with human DNA... but how? We don't have the slightest idea of what conditions were on earth when life started evolving, so there's no way of estimating the odds of some process of which we have no knowledge. Not only that, but you just don't do statistics this way, I don't care what blather you've read in the papers. You do statistics in one of three ways. You take a sample of a population, and estimate from characteristics of that sample how the same characteristics would be distributed in the population. Or you run a process multiple times and generate odds by looking at the results of the runs. Or, if you've got lots of information, e.g., that a die has 6 sides and its shape and weight are evenly distributed, you might use that information to predict odds. Ok? So, which of those do you have for this little question? None.
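The second of those methods (generating odds by rerunning a process many times) can be sketched in a few lines of Python; the die here is purely illustrative, the point being that the method only works for processes you can actually rerun:

```python
import random

def estimate_odds(trial, n_runs=100_000):
    """Estimate the probability of an event by rerunning a process
    many times and counting how often the event occurs."""
    hits = sum(trial() for _ in range(n_runs))
    return hits / n_runs

random.seed(0)  # make the illustration repeatable

# A process we *can* run multiple times: rolling a fair six-sided die.
p_six = estimate_odds(lambda: random.randint(1, 6) == 6)
```

With enough runs the estimate settles near the known value of 1/6; the argument above is precisely that evolution offers no such rerunnable process.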

You're trying to look at all zygotes and estimate from that sample how many should be human. But that kind of sampling depends on assumptions like: the distribution of the characteristic is random in the population. If it isn't, then how do you collect your sample? It's like saying, we'll take a random sample of fish and animals and from that see how likely it is that animals have four legs. What that ratio would mean is that if you were an alien from space taking random samples of creatures on earth you'd get x percent with four legs. But that doesn't tell you how likely it is that animals would have evolved four legs; all it tells you about is alien sampling results.
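The sampling caveat can be made concrete with a toy simulation (the population, species and proportions here are invented purely for illustration):

```python
import random

random.seed(1)

# Invented population: 9,000 insects and 1,000 four-legged mammals.
population = ["insect"] * 9000 + ["mammal"] * 1000

# An unbiased random sample recovers roughly the true proportion
# of mammals (10%).
fair_sample = random.sample(population, 500)
fair_share = fair_sample.count("mammal") / len(fair_sample)

# A sampler that only ever reaches one corner of the habitat (here,
# a slice of the list containing only insects) reports 0% mammals:
# a fact about the sampling procedure, not about the population.
biased_sample = [random.choice(population[:2000]) for _ in range(500)]
biased_share = biased_sample.count("mammal") / len(biased_sample)
```

The biased sampler's result tells you only where the sampler looked, which is the answer's point about the alien's sampling results.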

And you're not going to run evolution multiple times, unless you have a few billion years to spare.

So since you have one sample, and you can only do the run once... then your odds are just the odds of your spilling your tea, after you've spilled it. Absolute certainty. You have spilled the tea. Humans have evolved. Alternatively, you can say the odds are unknowable.

Steven Ravett Brown


Michael asked:

Given that reality is immense in comparison to myself, it is obvious that I myself, am not the most important thing of all those things in existence. Therefore, what I consist of is not important — my thoughts and ideas, such as love, happiness, etc. — everything that is personal to me. Having excluded all of those personal things, what is it that is most important then, of all things in existence? That is, in the processes of our making our choices in day-to-day living, what is it most reasonable to see as taking precedence over everything else, having already established that it cannot be anything of a personal nature?

Well, nothing is really, objectively important. Whether or not something is of importance is a value judgement: surely, what is personal to you is important to you? How can you live as if your decisions, thoughts and ideas are simply of no value at all? You would simply have to give up, not bother, collapse on the floor. But after a bit you'd probably find value in getting up and dusting yourself down and getting on with things.

So, as far as immensity of reality is concerned, if you don't mind me saying, I agree that you are not important. But surely you must judge yourself to be important insofar as it is you who has to live this life. It is you who have to live with your decisions, and those close to you care about them too. And, given that they are close to you, they probably think you are important to them.

Why should anything have precedence? Certainly there is nothing in reality which determines what has precedence in value terms, although the natural environment seems quite important to all of us if one of our values is survival.

Rachel Browne

You and I, Michael, and all the immensities of the universe you refer to, are possibly nothing other than the hair on a flea's leg in some superdimension of which we cannot be cognisant. Okay: maybe we're not; maybe these immensities are 'real' and there is truly nothing other than this one universe we inhabit. Would it make any difference? Are there, in other words, scales of immensity meaningful to us, and is it of any help to our self-perception whether we are a speck in the universe or a hair on a flea's leg? Would it help if I told you that the hair on a flea's leg in our dimension is so enormous, that to a microbe it would seem like the Himalayas? I think the answer is in the negative on all counts.

But let me give you one example where they do count. When you were conceived, your existence began as one cell. Your grown body contains trillions of these; and although on the scale of one cell this is just another prohibitive immensity, here you are, with 'thoughts and ideas, such as love and happiness' etc. You wrote all this down and did not take note of what you wrote? Incredible! You wrote down, 'I have thoughts and ideas', and it did not occur to you what an immense privilege you enjoy! Fancy being endowed by the chance of being born with the gift of thought, the gift of love, the gift of ideas — if atoms could speak, do you not think that they would call it grossly unfair that you, a mere speck of mortal dust, should be so privileged, while they, the substance of the universe, are mute and deaf and dumb, and in fact do not even possess enough agency to move themselves, let alone think a thought!

All right, I exonerate you: after all, we humans do have an overall propensity to be jealous of beasts of prey and lunge at every opportunity to show how clever we are at killing and destroying, while taking the possession of a mind for granted. But this general lunacy does not invalidate my point. You are Michael, you have thoughts and ideas, and one of these thoughts concerns the immensity of the material structure of the universe and the 'puny' You which presumes to want love and happiness. And you wrote all this without realising or thinking about the fact that only a handful of humans on this planet, some 6 billion of them, possess minds, that is: self-conscious awareness; and this feature of the universe is so unique and absolutely precious in the face of an immensity of DEAD MATTER, that you felt intimidated rather than elated and grateful and filled with a sense of something so extraordinary that the immensities 'out there' shrink to a cipher. A sense that the universe is a mindless morgue of matter, which yet, in one remote little corner, began a process that for all you and I know may also have begun in many other places at roughly the same time: a process I call EVOLVING VALUE.

Value: mind stuff! Life!

No other reason whatever can be put up for considering the universe at all. That, potentially, it contains values. That potentially it contains minds. That meanwhile, it actually contains values and minds. And yours is one of them. Without your mind, and my mind, and everybody's thoughts, dreams and ideas, the universe would not know itself.

Jürgen Lawrenz

Your statement 'it is obvious' isn't at all that obvious. On the contrary, Nietzsche would have approved of considering oneself most important in the universe (and at the same time staying humble). That means believing in yourself, and defending your own points of view like a lion (without getting unreasonable). In fact that's an answer to your question of what is most important.

At the same time, such a question serves no practical use. Ask yourself what you want to do with the answer. I could say God, science, humanity, the United States, etc., but does that make one happy?

Henk Tuten


Curtis asked:

I am looking for some good websites to teach the very basics of philosophy for my 6th grade students.

Can you point me in the right direction?

Rather than list the zillions of sites I found, why don't you do what I did: go to www.google.com and put in "6th grade philosophy". I think you'll be surprised.

Steven Ravett Brown


Aleena asked:

Is it possible for a society to sustain itself and have a moral order without a religious foundation for such an order?

Has there ever been a human society without a religion as the foundation for a moral order?

How do the alternatives (secular sources) for morality relate to the religious sources?

Which source (religious or secular) is superior or better in any way and why?

Why are there so many difficulties in arriving at a definition of religion? What are the criteria or considerations in developing a definition? What is religion?

How can one give a position in a distinctly and explicitly philosophical fashion, at the same time being critical and comprehensive in developing and defending a position?

That's a 'nasty' question, and without doubt essential.

First let's make the distinction between religion and belief.

The Latin 'religare' means 'to bind'; that meaning speaks for itself.

My personal opinion is that 'religion' tends to 'absolutism', while human beings need (relative) beliefs.

Or said in another way: 'God hates religion', or 'beliefs want to be free'.

Every knowledge-system is based on beliefs. That doesn't mean that it has to become 'religious' (used in the sense of dogmatic), but the danger is always there. So to answer your first question: it is possible, but it proved to be difficult.

The answer to your second question I don't know; but I fear there has not yet been such a society.

The alternatives to religion (or dogmatism, determinism, absolutism, fundamentalism) are in my opinion found in relative views. Note that there is nothing wrong with authoritarian knowledge, but it should be compensated for (otherwise teachers become gods). Giving an example would be wrong, because it is not the view that is important, but the way of viewing. The alternative is not necessarily secular, because secular beliefs can become very dogmatic (as proved by Stalinism).

In essence it is the controversy between Popper and Kuhn, or the distinction between absolute and relative knowledge (Kuhn basing himself on Wittgenstein, and Wittgenstein (possibly without knowing it) on Nietzsche). I greatly respect both Karl Popper and Thomas Kuhn, but each stressed very much one side of the coin.

Any learning phase turns out to be mainly authoritarian (strict democracy in education failed), but it should be followed by using your own creativity. That's what Nietzsche stressed with his 'Superman': he warned against religion but cherished any free belief (even where he personally disagreed, he admired the fanatic (but reasonable) defence of one's own convictions).

It's like studying philosophy: thirsty for knowledge, you drink in the views of your professors (but hopefully mainly their methodology), and afterwards use what you have acquired to arrive at and defend your own views. So studying is not about copying the views of your teachers, but about learning the means that are purposeful for you.

That's exactly what makes studying philosophy confusing. In most disciplines it is clear by the end that you have learned various methods, but in philosophy there is always the danger of being drowned in views as well. This is the 'religious' danger of philosophy: often without realizing it, you become 'bound' by the views of some professor and learn to defend those.

Henk Tuten

I suppose you are aware that your question entails a lifetime's worth of study!? To deal with such a brief in a couple of paragraphs is simply not possible. The best I can do for you in this restricted compass, I feel, is to make a few elementary points, hoping that someone qualified in theology will add to it (and I take the risk of contradiction in my stride). So in relation to your first question, it is indubitably possible to have a moral social order without a religious foundation. Not only is it possible, but classical China was a living example of it. Confucianism, although sometimes styled a secular religion, was essentially a social structure based on humanitarian principles adopted from Confucius and his followers (Mencius, Hsun Tsu) and adapted to the living needs of society. Its religious component was restricted to the performance of certain rites, which were never in our (Christian) sense religious, but a simple straightforward act of contemplation and human piety. Confucianism was, however the actual practice may have changed from time to time, an essentially and intrinsically secular doctrine, and reigned as the dominant doctrine in China for nigh on 1500 years.

Your other questions need to be addressed in a different kind of context. Whether religious or secular structures are superior is, I think, a non-issue. If history may be asked to 'prove' anything at all, then it can show at best that religion is 'good' only for two types of societies: those which are in an anarchic shambles and need the cohesiveness of a single doctrine imposed from above, and those societies which permit the individual the choice of their religion. China, ancient Greece and modern Europe/America belong among the latter type.

The difficulty of arriving at a satisfactory definition of religion is, simply, that majority opinion is not a philosophically relevant criterion. What we style 'higher religions' is, in my view, simple prejudice: we represent a higher type of civilisation, ergo our religion must be higher. I stress that this is just my opinion; and this pertains still when I add that critical reflection on religion cannot ignore the fact that humans have throughout history (nomadic and prehistoric included) demonstrated a clear propensity for anthropomorphic thinking: we followers of Jehovah and Christ delude ourselves that we have a 'purer' concept; but against this it can be argued that (a) very little in the Christian philosophical literature is clearly non-anthropic, and the little there is has almost no influence on the shape of the religion in either of its two major denominations; and (b) even the 'pure' concept cannot claim intrinsic superiority to the simple shamanic conception of good and evil spirits residing in plants and animals and human body parts. Pure or simplistic, both concepts have, critically assessed, the same validity; and if critically you feel compelled to doubt one, then this automatically disqualifies the other (assuming no bias intervenes). To some extent, I would argue that a critical assessment of a religion is a non sequitur in any case: religion is principally a matter of belief, in the second instance a possibly metaphysical state of mind. But if one were to take the idea of a critique seriously, then one would have to first disown religion and seek a rationale for believing in a God. You would find that, I think, almost impossible.

Jürgen Lawrenz


Mike asked:

No matter what any person believes, the image of some sort of God will come into their mind. Even if they do not believe in that image, they will still hold an image of GOD in their minds, so that they can reject it. Therefore, GOD has to exist in the mind of the most ardent non-believer.

Since a belief is only a concept or a perception and a non-belief is the opposite of another person's perception, both concepts can have no true proof of meaning in the existence or non-existence of God.

A Belief needs some doubt of the truth, for if there is truth, there is no need of belief. Therefore: A belief in God can never possess sufficient validity or proof.

But, a thought of God exists in everybody's Mind. And a thought is beyond any belief or non-belief.

The image of GOD that the non-believer wants to dismiss still stays in his/her mind and therefore must exist in that person's life. For something not to exist, it cannot be experienced in a thought.

This is proof beyond any Doubt... That GOD exists in every person's mind.

No matter what any person believes, the image of some sort of unicorn will come to their mind. Even if they do not believe in unicorns, they will still hold an image of a unicorn in their minds, so that they can reject it. Therefore, unicorns have to exist in the mind of the most ardent non-believer.

But a thought of unicorns exists in everybody's mind. And a thought is beyond any belief or non-belief.

The image of a unicorn that the non-believer wants to dismiss, still stays in his/her mind and therefore must exist in that person's life. For something not to exist, it cannot be experienced in a thought.

This is proof beyond any Doubt... That UNICORNS exist in every person's mind!

— Perhaps you can see the problem, now?

Steven Ravett Brown

Mike stated:

"No matter what any person believes, the image of some sort of God will come into their mind."

From this assumption, Mike proceeded to argue:

"That God exists in everyone's mind."

You can see he hasn't really made any progress. In order to demonstrate something, he first assumes it!

Unfortunately, although we may agree that the assumption is LIKELY to be right, it has a form that makes it very hard to prove. To prove that every member of some class has a particular property, it is necessary to either:

(a) examine every member exhaustively, without exception; or

(b) demonstrate that that property is a necessary consequence of membership in the class.

Of course, it is easy to disprove the assertion, if we can find just one counter-example. I give you the mentally handicapped man who lives in our street. He is a "person" and doubtless has beliefs. But he has no words. I will not assert that he has no image of God. Rather, I will ask you how he might acquire one? And if he did, how would you demonstrate that?

Yahya Abdal-Aziz


Mike asked:

One of the rules of physics is: opposites attract each other. Therefore, if an atheist exists, so must a God?

This argument might convince the converted, but who else?

Premise: "One of the rules of physics is: opposites attract each other." This is false. It would be more accurate to say: "One of the rules of physics is: if two particles exist and have charges of opposite signs, they will attract each other."

Hidden induction: "What holds true for physical particles will hold true for anything, including persons such as atheists and gods."

If we were to allow this (but why would we?), we could deduce that: "If a God and an atheist exist and are opposites, they will attract each other."

Hidden premise: "God and an atheist are opposites." It's much more common to believe that "God and Devil are opposites." and that "An atheist and a theist are opposites." But let's allow for the moment that the premise is true. Then we know that: "If a God and an atheist exist, because they are opposites, they will attract each other."

Hidden premise: "A God and an atheist exist." Easy to demonstrate the second part! OK, suppose God exists. Mike has now proven that God and the atheist attract each other.

What form does this attraction take, I wonder? In what way does the atheist move towards God? From my own observation, I've noticed that the few acknowledged atheists of my acquaintance have all been, personally, singularly indifferent to the concept of God. They find it irrelevant.

In the form in which Mike asked his question, it seems that he supposes that "IF one particle exists, then another, oppositely charged particle, must also exist." Surely not! The principle of physics he alludes to in his explicit Premise is not an existence proof. Based on that Premise alone, there is no reason to suppose that the universe consists of anything but, say, all positive particles without a single negative particle anywhere.

Yahya Abdal-Aziz

Your argument appears to run like this:

1. Opposites attract
2. Atheists deny God exists
3. Therefore atheists are opposite to God
4. Atheists therefore attract God
5. To attract God, God must exist.

One major problem here is that the argument can be used to show that anything exists. Consider changing 2 to:

2'. John denies fairies exist
3'. John is therefore opposite to fairies.
4'. Fairies therefore exist.

If your argument is valid then we are committed to a huge ontology whereby denial of a thing's existence entails that it exists. I think therefore that something is wrong with your argument! Looking at 3, I think it is unfair to define someone who denies a thing's existence as the opposite of that thing. Is someone who denies the existence of a round square the opposite of that thing? What would it even mean to say that you are the opposite of a round square? It is hard to spell out, isn't it?

Not knowing the exact formulation of the physical law that opposites attract, I suspect it deals with specific physical bodies, and is not applicable to our mental states. Even if you say it does, then all you are entitled to say is that one mental state, i.e. atheism, is attracted to another mental state, theism; not that the mental state atheism is attracted to God.

Mike Lee


Sandy asked:

Compare and contrast Hegel and Schopenhauer.

Hegel was a successful philosopher, Schopenhauer a failure. But then, Hegel needed the money, whereas Schopenhauer did not. These are two areas where they differ. They also had different views on women. Hegel was a happily married man and enjoyed the admiration of female members of the reading public, whereas Schopenhauer despised women and therefore tended to have very unfortunate relations with members of the opposite sex. For example, he hated to see them chattering on sidewalks and staircases and once took the extreme measure of silencing his cleaning woman by throwing her down the stairs. She broke a leg and sued him and got a life's pension awarded to her, payable by Schopenhauer, of course.

I should also mention that their prose style differs somewhat, for although they both wrote in German, Hegel's language is like a tropical forest, dense and dark but punctuated with flashes of luxuriant colour (a bit like a black python winding up a black tree while a shaft of sunlight plays on the plumage of a parakeet). Schopenhauer's diction, on the contrary, is stately and sonorous, flowing along majestically like a river of gold into the autumn sun.

Their social relations were not the best. Hegel took hardly any notice of Schopenhauer; and this made the latter so cranky, that he started using rather nasty epithets like 'insane' and 'fraudulent' about Hegel's philosophy. But even then no-one took any notice.

As to their philosophy, there is much to compare. They both studied Kant and they both studied with Fichte in their youth. But whereas Hegel criticised his former teacher in the genteel fashion appropriate to academic etiquette, Schopenhauer bestowed the epithets I just mentioned on Fichte as well. It was perhaps lucky for him that Fichte was dead, because the latter commanded his audiences like an army sergeant and might have made mincemeat of the aristocratic Schopenhauer. Anyway, they both pilfered from the dead Fichte's writings whatever they could use: Hegel the dialectical method and much else besides, Schopenhauer the concepts of Will and Representation, which appear in the title of his principal book.

Other than these few things, I can't think of any other obvious comparative and contrasting features between Hegel and Schopenhauer. If you need more, you might just have to read a little bit. Now this is a difficult issue, I'd be the first to admit. Both Hegel and Schopenhauer wrote huge blockbusters, and this leaves anyone in a pickle who hasn't got the time or the inclination. You might, for example, read what authors Singer and Janaway wrote about these philosophers in the Oxford Very Short Introductions, but this way you only get a vague drift, like the salt spray from a wave wafting inland, which is not the same as ducking into it. You get an opinion, not a philosophy. But that's a choice you might need to make.

Jürgen Lawrenz


Terry asked:

Which philosopher wrote about different kinds of love?

What is knowledge and how does one obtain it?

There are four classes of knowledge:

1. Tacit. This covers the large segment of non-specifiable knowledge which is transmitted mostly by example. The teaching of skills like violin playing, surgical diagnosis or artistic photography belongs in this category, where instruction relies on demonstration rather than precept and where, as a pupil, you acquire the knowledge you need by trial and error and a close involvement with your materials. Hence it is the kind of knowledge in which judgement is all-important and where no one person is in possession of the complete range of knowledge that pertains to any single knowledge area.

2. Skeletal and/or Unfocused. This kind of knowledge arises when you absorb focused information, but owing to its complexity or sheer volume, you remember it as a generalised, unfocused structure of knowledge. It is the kind to which the proverb, 'Knowledge is not the having of it, but the knowing of where to get it', applies. So you hold fast to outlines, blocks and general patterns as well as, evidently, the means of acquiring the details to flesh out this skeleton.

3. Articulated. This type of knowledge is detailed and focused. This is where you command not only the structure, but the content as well. A memorised poem or train timetable will serve as examples. Obviously this cannot, on the whole, do without Item 2 also being available, because human minds can only hold so much detail. Of tremendous importance to this type of knowledge is, in addition, the means of hanging on to it. Depending on which area of knowledge is covered, this might entail laying it down in books, using abbreviations and aides-memoire etc.

4. Symbolic. This covers areas which we employ quite generally, though we are rarely explicitly aware of it. We use signs, allegories, indexicals, metaphors constantly; and these are an indubitable form of knowledge. It comes out, for example, in such a common adage as 'birds of a feather flock together', which is not an ornithological statement, but a general statement reflecting a certain knowledge of human behaviour.

These are matched by four knowledge acquisition systems, respectively:

1. Phyletic, concerning inherited memories, genetically transmitted characteristics, archetypes, subconscious conditioning and so on. A lot of knowing 'how to' is passed on through such means, as any mother of a baby will know from the day it is born.

2. Cognitive, which relates to understanding and comprehension.

3. Aesthetic, in the two-fold meaning of sensation and perception.

4. Prehensive, which is concerned with physical objects; but since most of these do not come with a handle by which to grasp them, it also serves as a term to embrace the classification of knowledge under rubrics like weight, mass, volume, density, distance and so on.

Jürgen Lawrenz

A clever question, because almost everybody takes the word 'knowledge' for granted. Knowledge is in my view that part of a fantasy that you share with others. So it can be seen as a shared fairytale, or a shared game.

This is not to discredit knowledge; it is just my relative view. Knowledge to me is not 'absolute'. In fact it is not really important; it's the process of acquiring it that it is all about.

Studying philosophy is not about gathering knowledge, but about acquiring the means to grasp it. Professors are not meant only to lecture you on their views, but above all to give you the means to construct your own views, and to teach you ways to evaluate these and those of others.

Knowledge is not to be obtained like, for instance, money. The process of acquiring it makes you the owner of the knowledge. In fact the two cannot be separated (they form a mathematical unity). To possess knowledge you need to have acquired it, and after personally acquiring it you automatically own the knowledge.

Compare it to the noun 'work', which includes effort: without effort there is no work. You can let others do your work, but you can't just take their knowledge.

You can have workers killed after their effort, but killing wizards would be like killing the hen that lays the golden eggs. Without the hen, no golden eggs; and without wizards, no knowledge. That is why Merlin the wizard was always safe, while knights were expendable.

Wizards never explain to you how they perform their tricks; likewise, you have to gather your own knowledge.

Henk Tuten


Adam asked:

Are IQ tests a reliable indicator of intelligence?

It is now fairly generally accepted that IQ tests are designed with specific preconceptions about intelligence in operation, which don't hold water at all. They test certain types of intelligence, but offer no guarantee of a person's ability to exercise or express that intelligence usefully or even suitably. So unfortunately, the answer is still "NO".

Moreover, since you've addressed your question to a philosophical forum, where prejudices such as those buried in IQ tests are (or should be) viewed with a jaundiced eye, I would suspect that no philosopher of any persuasion would even consider the possibility of intelligence being measurable by some moronic apparatus.

The only IQ test that counts is the test of close acquaintance among equals. If your intelligence is adjudged among your peers as higher than average, then this is a superior guide to the facts of the case than any so-called objective measurement. For although humans are biased, they always know the truth; an apparatus may not be biased (while yet invariably it is), but it has no means of testing or establishing truth.

Jürgen Lawrenz


Jane asked:

What is the justification for the use of discipline/ punishment in schools?

Discipline is necessary in any organisation. The problem is how to enforce that discipline. The problem becomes even more acute when dealing with children; but there is no avoiding the fact that discipline is a vital part of a young person's education. A young person requires guidance until s/he reaches an age where they can make judgements and weigh up situations for themselves. Education is not just about academic subjects; it also includes what my generation called learning how to become a good citizen. Part of this education was learning good manners, learning to respect others, and a general idea of what was considered good and what was considered bad or evil. A major factor, which people often overlook, was self-respect; it is difficult to respect others if we have no respect for ourselves. Underpinning all this was the firm imposition of discipline.

Of course, discipline of children was always considered to be the main responsibility of parents; the home was where good manners were expected to be taught, and where parents were expected to set a good example. However, schools did take it upon themselves not only to back up parents but to take a lead. A good hiding for a misbehaving pupil at school was acceptable to all, and was expected, from the most famous public school in the land down to the most basic Council School. It was not unusual for a young person punished at school to arrive home, discover that their parents knew of it, and be subjected to either another good hiding or being sent straight to bed without their evening meal.

For those of us old enough to make the comparison there is, without doubt, a massive paradigm shift in received knowledge regarding morality and ethics. The shift from Victorian times, through Edwardian times, through the twenties and on into the sixties was hardly noticeable. Then the changes in the structure of society began to accelerate at such a pace that the culture shock had a devastating effect on relationships between parents and, particularly, their teen-age offspring. Still living in the past paradigm and reluctant to accept the changes taking place, the earlier generation were left gasping like goldfish in a bowl deprived of nearly all its life-giving water. The gap between the previous and the developing generation opened rapidly, and the relationship between young people and adults changed as a different concept of discipline replaced the old one. One of the major changes was the different attitude towards discipline in schools: corporal punishment was eventually banned, and a rather strange shift of emphasis from discipline of the young person to protection of the young person took place. Physical contact between teacher and pupil took on a sinister meaning: teachers began to feel confused and helpless; their world was turned upside down almost overnight. I distinctly remember a friend of mine almost losing his job because he placed his hand on the shoulder of a girl who refused to step into line in the playground; blatantly refusing to obey the teacher, she was quite prepared to hold up the entire school. She told my friend to take his hand off her shoulder because that was a form of assault and he would be reported. He appeared before the Head and the girl's parents the following day, was completely humiliated and warned about his future conduct.
Bizarre events began to be reported in the media, like the one where teachers were besieged in a high school by angry pupils who had taken offence at the attempted imposition of new instructions. Damage was done to staff cars, tyres were let down, and so on, before the police arrived to disperse the pupils.

The relaxing of discipline in schools initially brought about a lowering of academic standards: a high percentage of pupils leaving primary school were poor at reading and writing. Whether coincidence or not, there was a continuing rise in truancy, vandalism, attacks on the elderly by young people, drug addiction among the young, thieving and general criminality in all areas. Along with this came a marked increase in general bad behaviour, a decline in good manners, and a lack of consideration and respect for others.

We must not overlook the fact that other changes had taken place in society which coincided with school discipline problems, the main one being changes in family life: the increasing divorce rate, a lack of interest in the sanctity of marriage, children having two homes, with father and partner in one and mother and partner in the other, and so on. In addition came a rapid decline in church-going, and the almost total disappearance of Sunday Schools; both have had an effect on the discipline of young people. Ironically, these changes, which to some are catastrophic events, throw the responsibility for discipline back on the schools.

It is, of course, not the case that all young people fall into the categories indicated; I believe we have all met or know some wonderful young people who are a credit to their school and to society. Unfortunately, the daily exposure of the criminal behaviour of young people shows a continued increase. Even as I write this I am informed by the media that 17,000 juveniles stole vehicles last year. Add the thousands of young drug addicts, muggers and thieves over the same period, and we get the feeling that we are not just on the verge of anarchy, but that it has already arrived. It seems that not only do we need real discipline back in schools, but we also require a tightening of the lenient laws governing this country. When the do-gooders who brought all this about realise that there is a difference between assaults on children and regulated punishment for their own good, and for the good of society, we may see a change back to normality. But don't hold your breath, and don't forget that the bias has swung so far towards the welfare of young people that parents can be taken to court by their own offspring who have decided that discipline is not for them.

John Brandon


Eric asked:

I am obsessed with this mind issue. Your last answer was highly satisfactory, and I have begun to read the references you sent me. Thanks a lot. In this one I want to bore you with some novice theories about mind, if you allow me.

I believe that mind works pretty much like a hierarchical Petri net. At each level there is one statement, which can be modelled with places and transitions.

Let "Is Eric Good?" be a question put to me. Let this question, in its context, be defined with two places and one transition when I am asked it.

In lower levels, I have multiple small nets in various levels and details connected to the concept "Eric", as well as "Goodness".

"Eric hit me", "Eric's sister is Jane", "Eric went to college" etc. "Elephants are good", "Bad is not Good", "Getting beaten is bad", etc.

These chunks are further linked to others like "Jane is a girl", "Jane is good".

I believe that man keeps his experiences in chunks like the ones above: very tiny nets with one or two places and transitions. However, when asked a question, or while working through an exercise, one goes into these small chunks, decomposes each of the transitions and places, forms new nets at run time, and aggregates them to come to a conclusion, which is also a tiny net.

Turning back to my example, I conclude that "Eric is Bad". When asked the question, I simply run through the nets in the lower level, associate "Eric hit me" with "something bad" and conclude that "Eric should also be bad".

Now, what I do is not just finding a "path" in the mathematical sense. If it were, I could easily come up with "Jane is Good" as an answer to the question. The process involves path finding, but is much more complex.

Anyway, I was thinking that this was a good beginning for an effort at modelling how mind works, and wanted to share it with you. I need to read more about this type of research.

Could you recommend a couple of books/articles: 1) that look at the issue through hierarchical nets and graph theory, preferably Petri nets; 2) that contain my theory about run-time graph formation; 3) if people have tried to explain this through graph theory and not succeeded, please let me know about that as well. I do not want to waste a life on this :)

I assume you know about this site, then: http://www.daimi.au.dk/PetriNets/.

But here's one general point. There are many, many ways to approach and to model the mind. Various types of networks are one, of which Petri nets and graph theory are two categories; it may be that they are formally identical. I will give you some other references below; you can find all the Petri material you want at the above site. The danger, if you have not yet done much reading, is to take one of these as the way, and spend enormous amounts of time and energy on it. This is the danger of too narrow an education in any field. If your goal is to take the Petri net and elaborate it, and see what can be done with it, then fine, go with that particular approach. If your goal is to model or to think about the mind, you will be taking one out of dozens of approaches and effectively saying that it is the correct one.

But it isn't, and here's why (and I'm sure many Petri people and others will disagree with this): the brain is an analog system, not a digital system. A computer is a digital system. Can a digital system "model" an analog one? Yes, certainly. What does "model" mean? Now, there's the question. I will not go into that here; suffice it to say that there is ongoing debate on that point. Second, can a digital system duplicate, functionally (obviously it could only be functionally), an analog system? This is another question, not the same as the last. If you want to create a mind in a computer (a digital computer), you've got major problems, I think... indeed, I don't think it's possible. But you can model one, up to a point. There's an important distinction here that many have missed.

So, what approaches should one take to a) model and b) duplicate mind? But you see that these are two different questions. Your approach above will not duplicate mind. But it might model it to a certain extent.
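To see concretely how far such modelling can go, here is a toy sketch of the "run through the nets" process you describe: not a real Petri net, just a plain directed association graph, with every fact and name invented for the example. A breadth-first walk means the shortest association chain ("Eric hit me" leading to bad) reaches a verdict before the longer chain through Jane, which is one crude way of capturing why your process is more than bare path-existence.

```python
from collections import deque

# Tiny "chunks" of experience, stored as directed edges between concepts.
# All of these facts are invented purely for illustration.
edges = {
    "Eric": ["hit me", "sister Jane", "went to college"],
    "hit me": ["Bad"],        # "Getting beaten is bad"
    "sister Jane": ["Jane"],
    "Jane": ["Good"],         # "Jane is good"
    "Elephants": ["Good"],    # "Elephants are good"
}

def evaluate(concept):
    """Breadth-first walk outward from a concept; the first evaluative
    node ("Good" or "Bad") reached decides the verdict."""
    seen = {concept}
    queue = deque([concept])
    while queue:
        node = queue.popleft()
        if node in ("Good", "Bad"):
            return node
        for neighbour in edges.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return None  # no evaluative association found

print(evaluate("Eric"))  # prints "Bad": "hit me" is a shorter chain than the one via Jane
```

This is only a caricature, of course: it models association strength by nothing more than chain length, where a serious account would need weights, context, and the run-time recomposition of chunks you describe.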

For history (and good background) in modeling and networks:

Ashby, W. R. Design for a Brain: The Origin of Adaptive Behaviour. London: Chapman and Hall Ltd., 1960.

Rosenblatt, F. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Washington, D.C.: Spartan Books, 1962.

Minsky, M., and S. Papert. Perceptrons: An Introduction to Computational Geometry. Cambridge, MA: The MIT Press, 1969.

McCulloch, W. S. Embodiments of Mind. Cambridge, MA: The MIT Press, 1970.

McClelland, J.L. Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Psychological and Biological Models. Edited by J.A. Feldman, P.J. Hayes and D.E. Rumelhart. Vol. 2, Computational Models of Cognition and Perception. Cambridge, MA: The MIT Press, 1986.

Rumelhart, D.E. Parallel Distributed Processing: Explorations in the Microstructure of Cognition: Foundations. Edited by J.A. Feldman, P.J. Hayes and D.E. Rumelhart. Vol. 1, Computational Models of Cognition and Perception. Cambridge, MA: The MIT Press, 1986.


Dreyfus, H. L. What Computers Can't Do. Cambridge, MA: The MIT Press, 1972.

Gurwitsch, A. The Field of Consciousness. Edited by A. van Kaam, Duquesne Studies: Psychological Series. Pittsburgh, PA: Duquesne University Press, 1964.

Husserl, E. The Idea of Phenomenology. Translated by W. P. Alston and G. Nakhnikian. Fourth ed. The Hague, Netherlands: Marinus Nijhoff, 1970.

Merleau-Ponty, M. Phenomenology of Perception. Edited by Ted Honderich. 1st ed, International Library of Philosophy and Scientific Method. New York, NY: Routledge & Kegan Paul, 1970.

Relatively early cognitive & modeling refs:

Allport, A. "Visual Attention." In Foundations of Cognitive Science, edited by M.I. Posner, 631-682. Cambridge, MA: The MIT Press, 1989.

Deese, J. The Structure of Associations in Language and Thought. Baltimore, MD: The Johns Hopkins Press, 1965.

Fodor, J. A., and Z. W. Pylyshyn. "Connectionism and Cognitive Architecture: A Critical Analysis." Cognition 28 (1988): 3-72.

Gardner, H. The Mind's New Science. New York, NY: BasicBooks, 1985.

Gregory, R. "Perceptions as Hypotheses." Philosophical Transactions of the Royal Society of London Series B, Biological Sciences 290 (1980): 181-197.

Grossberg, S. "How Does a Brain Build a Cognitive Code?" Psychological Review 87 (1980): 1-51.

Johnson, M. The Body in the Mind. Chicago, IL: University of Chicago Press, 1987.

Koffka, K. Principles of Gestalt Psychology. 2nd ed. New York, NY: Harcourt, Brace & World, Inc., 1963.

Mead, C. Analog VLSI and Neural Systems. New York, NY: Addison Wesley Longman, Inc., 1988.

Neisser, U. Cognitive Psychology The Century Psychology Series. Englewood Cliffs, NJ: Prentice-Hall, Inc., 1967.

Pollack, J.B. "Recursive Auto-Associative Memory: Devising Compositional Distributed Representations." Proceedings of the Tenth Annual Conference of the Cognitive Science Society, Montreal. Cognitive Science Society 1988.

Posner, M.I., and S.J. Boies. "Components of Attention." Psychological Review 78, no. 5 (1972): 391-408.

Rosch, E., C.B. Mervis, W.D. Gray, D.M. Johnson, and P. Boyes-Braem. "Basic Objects in Natural Categories." Cognitive Psychology 8, no. 3 (1976): 382-439.

Shallice, T. "Information-Processing Models of Consciousness: Possibilities and Problems." In Consciousness in Contemporary Science, edited by A.J. Marcel and E. Bisiach. New York, NY: Clarendon Press, 1988.

Shepard, R. "Attention and the Metric Structure of the Stimulus Space." Journal of Mathematical Psychology 1 (1964): 54-87.

Shiffrin, R. M., and W. Schneider. "Automatic and Controlled Processing Revisited." Psychological Review 91, no. 2 (1984): 269-276.

Treisman, A.M., and G. Gelade. "A Feature-Integration Theory of Attention." Cognitive Psychology 12 (1980): 97-136.

More modern refs:

Baars, Bernard J. In the Theater of Consciousness: The Workspace of the Mind. 1st ed. New York, NY: Oxford University Press, 1997.

Chang, F. "Symbolically Speaking: A Connectionist Model of Sentence Production." Cognitive Science 26 (2002): 609-651.

Craik, F.I.M. "Levels of Processing: Past, Present . . . And Future?" Memory 10, no. 5/6 (2002): 305-318.

Demetriou, A., G. Spanoudis, C. Christou, and M. Platsidou. "Modeling the Stroop Phenomenon: Processes, Processing Flow, and Development." Cognitive Development 16 (2002): 987-1005.

Dipert, R.R. "The Mathematical Structure of the World: The World as Graph." The Journal of Philosophy 94, no. 7 (1997): 329-358.

Fauconnier, G., and M. Turner. "Conceptual Integration Networks." Cognitive Science 22, no. 2 (1998): 133-187.

Fauconnier, G., and M. Turner. The Way We Think: Conceptual Blending and the Mind's Hidden Complexities. New York, NY: Basic Books, 2002.

Gernsbacher, M. A. Language Comprehension as Structure Building. Hillsdale, NJ: Lawrence Erlbaum Associates, 1990.

Gopnik, A., and A.N. Meltzoff. Words, Thoughts, and Theories. Edited by L. Gleitman, S. Carey, E. Newport and E. Spekle, Learning, Development, and Conceptual Change. Cambridge, MA: The MIT Press, 1998.

Grossberg, S., E. Mingolla, and W.D. Ross. "Visual Brain and Visual Perception: How Does the Cortex Do Perceptual Grouping?" Trends in Neurosciences 20, no. 3 (1997): 106-111.

Halford, G.S., W.H. Wilson, and S. Phillips. "Processing Capacity Defined by Relational Complexity: Implications for Comparative, Developmental, and Cognitive Psychology." Behavioral and Brain Sciences 21 (1998): 803-865.

Harnad, S. "The Symbol Grounding Problem." Physica D 42 (1990): 335-346.

Kahana, M.J. "Associative Symmetry and Memory Theory." Memory & Cognition 30, no. 6 (2002): 823-840.

Libet, B. "The Timing of Mental Events: Libet's Experimental Findings and Their Implications." Consciousness and Cognition 11 (2002): 291-299.

Maddox, W.T., F.G. Ashby, and E.M. Waldron. "Multiple Attention Systems in Perceptual Categorization." Memory & Cognition 30, no. 3 (2002): 325-339.

Reisberg, D. Cognition: Exploring the Science of the Mind. 1st ed. New York, NY: W. W. Norton & Company, Inc., 1997.

Rieke, F., D. Warland, R. de Ruyter van Steveninck, and W. Bialek. Spikes: Exploring the Neural Code. Edited by T. J. Sejnowski and T. A. Poggio. 2nd ed, Computational Neuroscience. Cambridge, MA: The MIT Press, 1997.

Rizzolatti, G., L. Fadiga, V. Gallese, and L. Fogassi. "Premotor Cortex and the Recognition of Motor Actions." Cognitive Brain Research 3 (1996): 131-141.

Sloman, S. A., B.C. Love, and W.K. Ahn. "Feature Centrality and Conceptual Coherence." Cognitive Science 22, no. 2 (1998): 189-228.

Sun, R. Duality of the Mind: A Bottom-up Approach toward Cognition. Mahwah, NJ: Lawrence Erlbaum Associates, Inc., 2002.

Wegner, D. M., and J. A. Bargh. "Control and Automaticity in Social Life." edited by D. Gilbert, S. T. Fiske and G. Landzey. Boston, MA: McGraw-Hill, 1996.

Yaniv, I., D.E. Meyer, and N.S. Davidson. "Dynamic Memory Processes in Retrieving Answers to Questions: Recall Failures, Judgments of Knowing, and Acquisition of Information." Journal of Experimental Psychology: Learning, Memory, & Cognition 21, no. 6 (1996): 1509-1521.

The above is a bit, a slice, a small example, a mere taste, of what is out there. Dip into it before you go much further in your own thinking.

Steven Ravett Brown


Nicole asked:

Time travel is an extremely interesting subject, but is it really conceptually possible?

My straightforward answer is no, it is not possible, no matter how you bend it. But if I left it there, someone else would say that it is conceivable under such-and-such circumstances. So I'm going to have to invite you along on a little journey of problems, just two or three of them, but all bristling with way-out complexities. I'll try and make them as easy as possible, because it's worth thinking about these matters, and also because our lives are so much under the influence of science and science fiction today that the average person can hardly make out what to believe. And by golly, time travel is part of the fare! You must have noticed how much it is taken for granted, as if there were no argument about it!

Well now, since we have to start somewhere, let's take a peek at the 'space of all possible things/ events/ ideas'. Somewhere in this space you'll find time travel and no doubt millions of other ideas, thoughts, objects, events and possibilities that have been dreamt about. They are all in this 'space' as potentials, waiting to be realised. Yet the first thing to note about the 'space of all possible things etc.' is this: there is no such space; for even the 'space' itself, the concept of this 'space', is part of the 'space of all possible things etc.'! Hence it is not a real space, not a finite, three-dimensional volume, where things happen. So you understand that I'm talking about a conceptual space, an infinite realm with infinite possibilities that (so to speak) travels along with our finite realm of real things and real possibilities. It is the realm of the 'Maybe'.

The importance of this concept of infinity is not well appreciated, certainly not by time travellers. They tend rather indiscriminately to toss finite and infinite states around as if they were lego blocks. They talk about 'worm holes' and 'black holes' and 'big bang', and of 'string theory' and 'quantum flutters', which are all entangled with infinity. But consider that infinity means, by definition, that you can't count what's in it. So when you ask, how many atoms in the universe, you are immediately defining the universe as finite.

Having got this far, what about time? Well, it's really the same problem all over again. Is the universe in 'time' or not? Is time 'in' the universe or independent? Astronomers want to convince us that time was created with the big bang, but there is a big chink in that logic. For if the spread of time is finite, then of course the universe must be finite. And vice versa. But if the universe is finite, then we've only pushed the problem of infinity out of the way, because we are then supposing another universe which must contain ours; and that universe is probably contained in yet another: Russian doll universes all the way down. In philosophy this is called 'infinite regress'.

We're obviously getting ourselves into a huge mess. Let's narrow down our focus and note down a sort of definition: 'God invented time to prevent everything from happening at once.' This gives us a vital first clue to what's wrong with time travel. On this definition, time is a concept of simultaneity. It means that if two separate objects/ events occur such that third parties observing them agree in their happening at the same instant, these parties then have a means of plotting the events on a graph, marking their lines of approach and departure and assigning values (seconds, hours, days) to all changes in position. This graph is a 'frame of reference', which can now function as a tool for establishing the simultaneity of all events that fall within its scope. Evidently to make this work, a point at rest has to be presupposed, called the 'residual observer', around which the other events revolve.

Now another difficulty comes up. When you have three, four, a thousand, a billion frames of reference, practically all of them unknown to us because of the sheer size of the visible universe, the notion of simultaneity suddenly runs amok; our little graph just can't cope any more, and you'll find that a second residual observer becomes necessary, then a third, a fourth... and in an infinite universe...? You guessed it: an infinite number of residual observers. Where does that leave our simple concept of time? Doesn't it mean there are as many 'times' as residual observers? True again.

So this doesn't get us anywhere. We're attacking the whole problem back to front. To find out 'what time really is', we need to put ourselves in the seat of time itself. We need to ride along with time on a beam of light. So let's now confront this issue with a 'practical' example.

Let's say you've been despatched from Earth to Alpha Centauri. In earth terms that trip is going to take four years at the speed of light; that's not time travel, but it will serve for an opener. When you last looked back, you might have seen your parents standing there, waving goodbye. A couple of days later, you look again and still they're there. Patient people! But when you look again a year later and find they haven't moved, you are suddenly jolted into the realisation that, of course, their image is travelling at the same speed with you. Time is standing still for you in relation to that scene.

Now difficult as it may be, try and draw a sound conclusion from this. These are not your parents, but merely their image. What then, if you could suddenly double back and return? The point is: nothing changes; and when you arrive, to your parents you will only have hovered in the stratosphere for a while and then come back down.

Now clearly this is nonsense. You've been en route for a year! Consequently there is an irresolvable contradiction: you cannot, as a physical body, be in two places at the same moment, but this is what the story entails.

It gets worse when you really start time travelling. Imagine yourself accelerating beyond the speed of light. As you gaze out the porthole, you'll start seeing things you shouldn't: ice ages, continental drift, the earth aflame like a drop of molten iron, etc. etc. On our diagram of Earth, Alpha Centauri and yourself, your numbers are running into the negative: you've reversed the time relation between you and planet Home.

Now there is another side to this story. To observers on earth you would first dissolve and then disappear. Conventionally we take this to mean that the speed of light can only be attained by electromagnetic radiation (EMR); accordingly your acceleration has the effect of converting you and your craft into EMR. But this in turn means that, in relation to Earth, you have ceased to exist. You cannot therefore simply double back and hurtle back to Earth. She won't be there when you arrive. On your diagram, where Earth and Alpha Centauri comprise a frame of reference in close simultaneity, you have removed the residual observer, yourself.

But ah! you cry, even if I can't return to Earth, yet this is time travel, isn't it? Can't I now connect with another frame of reference?

Well, I promised you this was going to be complex, mind-boggling and irritating. For while you may conceivably exceed the speed of light in relation to your own system, you cannot exceed it in relation to light itself. Here the equation is EMR = Time. The grain of EMR in the universe is also the grain of time, and the best or simplest way to make sense of this is to reverse the notion of speed. To attain the speed of light means, in this context, for you to become decoupled from any frame of reference whatever, because you have become connected to the stream of time/ light itself. But this 'stream' being the grain of time itself means you are standing still again, only this time in relation to the whole universe. Then the objects of the physical universe, galaxies and nebulas and novas, will be fizzing around you in a bewildering torrent of criss-cross patterns across the entire 'sensurround' horizon. Indeed some or many of these objects may actually 'collide' with you, at the speed of light (!).

One last question: could you not 'decouple' from this unwished-for state and return to a definite existence? Unfortunately the answer, once again, must be 'no'. I keep saying 'you', as though there was a 'you' in this EMR stream. But of course, there's not: you have become a beam of light, pure EMR, which contains not the thinnest thread of information. Once upon a time, in your real life, 'you' were (among many other things) a packet of information; this is now gone, terminally erased. And this is of course the real crux of the matter.

Simultaneity is the coincidence of objects (information) in a frame of reference: and all these frames of reference are finite entities which might all, in principle, be co-ordinated in a network of finite observers. But 'behind' this structure is the structureless grain; picture it like a single dew drop somewhere in the midst of the Sahara desert. And in this structureless space all events occur simultaneously, just as the sand in the desert 'occurs' all at once; but for us, who have a finite perspective on them, these events occur in sequence and under conditions to which the concept of simultaneity can be fitted.

I hope all this makes sense to you! If you wanted to put it into a nutshell, you could say that time travel cannot happen because time is not real: it's not a road or a space or a field where you can identify Point A and Point B in relation to one permanent, unchanging residual observer. It is (as I said) the idea of some things occurring measurably simultaneously. So the crucial component (you might have picked this up when you recognised your parents as only an image) is this: that light waves bearing images are not physical reality. On this discrepancy the whole fancy breaks apart. Time travel, so understood, is mistaking a 'report' for the event itself; and of course a report can long outlast the event which has meanwhile ceased to be.

And this brings us back to the 'space of all possible things', where we started. Here simultaneity is meaningless, because in an infinite space nothing is simultaneous with anything else, there is no frame of reference and no residual observer; and indeed, there is nothing whatever in this 'space', not an atom, not a breath. Just dreams of finitude, of finite possibilities. Dreams of being, for nothing in this 'space', nor the space itself, has being.

Jürgen Lawrenz

It depends what you mean by 'conceptually possible'. I would say that time travel is logically possible because there seems to be no contradiction in the concept (which is obviously very different from saying it's physically possible in our world).

The interesting question, as far as I can tell, is what is known as the grandfather problem. Suppose that time travel is possible. Now, suppose you go back to the time when your grandfather is in his youth and you kill him: this would mean that in the future there will be no you. But then how could you have come back from the future and killed him?

Here I agree with David Lewis. He reckons that time travel is possible but that you can't change anything in the past. This is because he thinks of time as a big line on which each point is equally real. Consider time T, from which you travel back to point T*. Lewis wants to say that point T* is equally real when you travel back to it as it was the first time around; the only difference is your perception of T*. The answer Lewis gives to the grandfather problem is that you can't kill your grandfather, or change anything for that matter, for the reason that you were there already. This sounds weird, but if you think about it, it makes sense.

Rich Woodward

Travelling through time is something we all appear to do every day: this morning I was in the past, but now I'm in the present, which was then the future! I assume, however, that what you are talking about is an individual travelling to a time outside the ordinary scope. There's an interesting article on the subject in Le Poidevin and McBeath's book The Philosophy of Time, though I can't remember who wrote it. Here are two key issues.

First if we were to travel back in time it would appear possible that we could change the past, possibly causing a causal loop whereby our actions in the past affect the way we are in the future. Second there is the ontological status of the past and the future.

To deal with the first problem, consider the Back to the Future scenario, where the character potentially stops his mother meeting his father and therefore prevents his own existence. If this were to happen, however, then in the future there would be no him to go back and prevent his own existence. The argument therefore entails that if he can prevent his own existence then he can't prevent his existence. The other apparent way to avoid this problem is to suggest that you can't affect the past when you go back, but this is somewhat strange. The way around the problem is to say that the time traveller can affect the past but cannot change it: the past is already a determined system in which the time traveller may cause an event, but any event that he causes will have already happened. He is therefore free to affect the past, but he cannot change anything that happened in it.

The second issue is whether there is anywhere to travel to. There are two main positions on time, which broadly are the tensed view and the tenseless view. Without going into the positions too deeply, the tenseless view of time is that there is nothing ontologically privileged about the 'present' that we perceive; all times are equally real. This position is somewhat analogous to the conception most of us hold of space, where there is nothing special about 'here': it is just the place we happen to occupy. If you are a tenseless theorist (a B-theorist) then there clearly is a 'place' to go to when you time travel.

The second position is the tensed theory (A-theory) of time, whereby there is something privileged about the present, namely that it is the only time that is present. Time flows from the future into the present, and from the present into the past. One of the main motivations for this position is that it allows us to hold that the future is open, and so allows for a non-deterministic picture of the world. The A-theorist has more work to do than the B-theorist at this point, as for the A-theorist three main positions are viable:

a. Only the present exists.
b. The past and the present exist.
c. The past, present and future exist.

Now, depending on which of a—c you accept, your potential to travel to those places is affected: clearly, if you hold a then time travel is a priori impossible, and if b then you can't go to the future.

There are other issues but I feel these are the main two. As I say if you have an interest in time I strongly recommend Le Poidevin and McBeath's anthology [The Philosophy of Time. Oxford University Press 1993].

Mike Lee

David Gerrold in his classic 70's sci-fi novel The Man Who Folded Himself (new edition published by BenBella Books, 2003 forthcoming) describes a version of 'time travel' where the time traveller hops to alternate time streams. For example, you could hop to a time stream where it is September 10, 2001 and foil the terrorist attack on the Twin Towers. That might make you feel good for a while. Until you realize that all you have succeeded in doing is to prevent the attack in an alternative universe. In the actual universe, what happened happened, and can't be made to unhappen.

For more on Gerrold's time travel universe see my Afterword to The Man Who Folded Himself.

Geoffrey Klempner


Mark asked:

I've come up against an idea that won't budge. Perhaps you will see my error, or perhaps you can direct me to relevant literature. Here goes:

I've come to think it is impossible to imagine a universe in which I do not exist. Because in order to perceive that imaginary place, I must have some sense data of it. And in order for there to be sense data, there must be some existing thing that senses.

It's as simple as the logical contradiction involved in imagining yourself being in a room in which you do not exist. You "look around", but you are not there. But what is doing the looking? Is it possible to imagine that nothing there exists and yet looking happens? I don't think so. I think we just ignore the fact that we must "be" in this universe-that-doesn't-contain-us, and sidestep the contradiction. I think we must postulate our existence in that universe in order to perceive it, and in doing so we violate the premise of our non-existence.

Even though it is subject to the whim of imagination, I cannot so bend the rules of logic to imagine that I both am and am not in the same place. I cannot imagine what that would be like.

I used to think (just this morning!) that it was a simple thing to imagine a universe in which I didn't exist. Now I think that to the extent that I can imagine it, I violate the premise of my non-existence.

First, you are confusing two senses of "imagine". One is "perceive" and one is "think of". Obviously you can think of a universe in which you don't exist... you're doing it above. Second, you can think of being in a room, the furniture in it, etc., etc., with as much detail as you want... but you still don't have to be visualizing that room. So the dilemma you're having is that you can't conceive of visualizing something without a viewpoint from which to visualize it, and that implies an observer. Ok fine. Perhaps it does. However, first, that observer doesn't have to be you as you actually are. For example, you could visualize a room from 1/4 inch above the floor, or from the viewpoint of an ant, right? Now, how could that be you? So just whose existence are you "postulating" there? No particular existence; it's just that in order to visualize, you need a reference point. You're postulating a point of view, not any particular observer. Of course, you could object that your observer is utilizing visible light instead of x-rays or sonar, and so that implies that observer is restricted by what you can conceive of... ok, fine... and...? So in order to visualize something you must do it in ways that are limited to what you can conceive of. Well, I'll agree to that. But I still don't see that you're then restricting it to you, as such.

Further, what is the point of this question? Clearly you can conceive of a universe in which you don't exist; as I say, you're doing it just fine above. So what do you want? You want to be able to visualize a room in another universe, one in which you don't exist... without having a viewpoint from which to visualize it? But then what would "visualize" mean? Surely the act of visualizing itself implies a viewpoint? I suppose you might want to visualize from all possible viewpoints at once, and you're disappointed that you cannot. Well that certainly is a human limitation, and as such it is a human viewpoint that we must utilize to visualize a room, or whatever. So in that very general sense you're correct. I guess you're going to have to find a computer to do your non-human visualizing for you.

Steven Ravett Brown

Your problem is exactly Kant's problem. So here you have a focus at once for your endeavours. Now I or another respondent could write at length on what you want to know, without soon getting to the bottom of it. But assuming that it is an issue which really troubles you, I would suggest that you sidestep all second-hand accounts and go straight for the Critique of Pure Reason. It is by no means as hard as it is often made out to be; in fact, on my view it is a model case of sober philosophical writing and certainly no more difficult to read than Bertrand Russell (who would not be pleased to hear me say this!). Nonetheless you may get stuck, because there are inevitable historical associations to absorb, and to help you overcome these hurdles, let me recommend Sebastian Gardner's book on the Critique in the Routledge Guidebooks. Alternatively, Bryan Magee has written a fine book on his own youthful travails, very much of the same kind as yours, and on how his later discovery of Kant changed his outlook on life and philosophy. This might be even better for you, given the similarities. The book is called Confessions of a Philosopher. Wishing you the best and that your argosy proves a happy and challenging adventure!

I now have a request to make of you. Your question is quite indiscriminate in its usage of the term 'sense data'. You're not to blame for it; it is a common fault. All the more reason to fix it! What you call 'sense data' are, in fact, sensa, which are the impressions received and processed by your nervous system, brain, perception, cognition etc. Sense data, on the contrary, are the unformed stream of impressions which do not make it to any of these processing units. In other words: we are bombarded every second by millions of 'sense data', but what we then actually see, hear, touch etc. are 'sensa'; the 'data' are the rejects. I wish this elementary distinction were more readily observed. In my reading of the literature I have observed massive amounts of confusion arising precisely from its non-observance. So you can help! Make the point whenever you're asked to write or speak on the subject.

Jürgen Lawrenz

I get your point.

But on one point I don't believe you. I think you can very well imagine what people are saying about you when you're not there. That's a favourite pastime. Now seriously: imagining a universe without yourself can be done in two ways.

1. Focusing on things that don't concern your personal presence (like gossip)

2. Focusing on things that require your own input

The second option requires your presence. So it is impossible to imagine it without yourself.

It's like trying to imagine yourself sitting an exam without being there. To me that seems a 'contradictio in terminis', a contradiction in terms.

Henk Tuten

I think I see what you're getting at — kind of like Sartre's claim that you can't imagine your own funeral, because you're already there? I guess most of the problem revolves around how we imagine these situations. Visually imagining is hard because you can't escape your own first-person perspective — like in Sartre's example, there is a sense in which you're there. But suppose we just represent the situation linguistically: we tell a story about a world in which all that exist are gorillas that listen to classical music (which they've come up with) and eat purple asparagus. Now, I certainly don't exist in that world — I've stipulated that — but I have succeeded in representing that state of affairs by the simple method of Kripkean specification (after Saul Kripke), whereby we simply stipulate what's going on in that world and in doing so represent it. As long as that world is not inconsistent (i.e. there are no logical contradictions), then that world is logically possible. I suppose that if we try to visually imagine that world, we always represent it from our own viewpoint, but as I've suggested, representation doesn't stop with our visual modalities.

Rich Woodward

The short answer is, I think, that sense data (if there are such things) don't have to be any particular person's sense data. It is the Idealist's mistake that all sense data must be his own sense data. Bishop Berkeley was (as might be expected) particularly bedevilled by this error.

Ken Stern


Aaron asked:

Music is perhaps one of the most influential things on people and society. I find it difficult to understand how it works (i.e. the source and evolution of it into what it is today). My friend and I have been arguing about this question: what would define music's evolutionary pattern? We feel an answer to this question could define music today and its future.

For example, I said music is sort of a ray that increases its width as it evolves.

Coincidentally I've just written a paper on a related subject, so I'm kind of 'hot' with it. But your question really requires a book-length response, so you will forgive me if I just mention a few crucial facets and leave you to research what else may need to be discovered.

1. Like most advanced brain functions, the auditory cortex is connected to several major 'processing sites'. Consider that we have to be able to recognise the direction from which a sound comes, to distinguish if the sound might indicate danger, to decipher grunts, cries, squeals as well as words, and of course to recognise some sounds as music. Now we should distinguish the last two items from the rest, because they only make sense in a context of a mind-like intelligence.

2. This is obvious with words, for although some animals can be conditioned to 'understand' words, the words remain for them simply differentiated sounds, i.e. signals. It is vain to suppose an animal could make anything of the phrase 'truth is beauty', because this sort of thing, the extraction of meaning from sounds-as-words, is a mind's prerogative.

3. Likewise with music: some animals respond to the incidence of harmonious sound frequencies, but again it requires a mind to discriminate an intended communication, in short, to discern structure in these strands.

4. Cognition (i.e. the transformation of signals into semantic packets), however, takes place elsewhere than the auditory cortex: in the left hemisphere for words, in the right hemisphere for music; the reason being (it is supposed) that words require analysis, music synthesis; and this respectively happens to be the division of competence between the hemispheres.

5. Now here comes the really difficult part. Some time in the far distant history of hominids, the genetic structure for all this was laid down. Language was simple then, probably just a few dozen mostly monosyllabic words, while music might at first have been nothing more than the sing-song type of aural gesturing which we still do now (when we ask a question, we raise our pitch; when we protest, we descend a fifth; glee goes up a third, etc. etc.). Occasionally drumming may have been added. Simple beginnings, but from the start with appropriate 'cognitive linkages' which, as human communications, would have been powerfully imprinted, so that for all time to come the human brain would be enabled to discriminate between molecular vibrations modulated by tongue and lips (and later larynx) and those vibrations emanating still from vocal cords, but with little or no modulation. (Let me note in passing how little has changed. An orchestra today still comprises in the main instruments deputising for vocal cords and open cavities, namely strings, reeds and brass, while the percussion also retains its authentic function. Probably therein you'll find one reason why electronic music strikes us as 'unnatural'.)

6. From the foregoing you should have no difficulty in keeping the sensory and cognitive facets of music each in their own place. Some sounds are inherently 'beautiful' because they caress the nerves in the same way as a gentle stroke on the arm or a soft kiss; and in recent centuries the discovery of chromatic harmony and the refinements in instrumental production have added a new dimension to this indubitable pleasure. There is no real mystery here, as my comparison with a caress indicated; the mechanical detail is relatively well-known and not very interesting philosophically (unless you happen to be intrigued by physiology, as I confess I am).

7. The more important aspect of music is therefore (as you suggested) its tremendous influence on mood, and through that agency, on our mental and even spiritual well-being (or ill-being!). Now a lot of music, classical as well as popular, exerts mostly a visceral impact on our nerves, so this function is rarely more sophisticated than other sensory and sensual transactions, and there is a problem here. Because the mind is affected by its structural perception of these sounds as music, it reacts, and if the music is cheap, aggressive, violent, vicious as a lot of it happens to be, stress results. So in our world of incessantly piped and manipulated music, a great deal of social harm is done by the indiscriminate pouring out of this stuff over the public media. Strangely enough, this goes hand in hand with the peculiar fact that to many people, music is a surrogate religion, a surrogate narcotic and so forth — indications of heavy dependence and craving, which suggests a universal perception of some deep secret woven somehow into the fabric of music that demands endless repetition as a means of getting closer to it. — Now you mentioned 'evolutionary pattern': although it is not the path to a complete answer, it will serve to illuminate significant aspects; and so I will latch onto this and give you one reading of an evolutionary trail that has a pretty high degree of plausibility.

8. Have you ever been caught alone in an abandoned building or a forest on a pitch black night? Have you noticed how suddenly your sense of hearing becomes super-acute, how it enables you to navigate by locating objects and obstacles by the slightest sound, from the echo of your breathing to the cracking of a dried leaf, things you would never notice in the ordinary course of living? Well, among the hominids I mentioned earlier, this would have been a common, indispensable faculty. And of course, you would bring all your fears, your fright and apprehension, your determination and courage, to bear on the situation, and you would soon learn to distinguish the swoop of an owl's wings from the sniff of a wolf. You might like to elaborate such a scene, or many of them, in your imagination in order to appreciate how rapidly and kaleidoscopically your mood would change in the course of just a few minutes as you fight your way to freedom and safety. Now many, indeed innumerably many, of these subtle distinctions among sounds would have become (through cognitive linking and then genetic transmission) embedded in our species profile as a permanent resource of aural analysis, enabling us to recognise instantaneously the structural features of these molecular vibrations, as well as their significant mood associations: and now the crucial element in this theory is that these aural images, being a permanent repertoire, can be stimulated 'by proxy', by evocation and imitation, as similarly you can be inspired to feelings of terror, pity, love, excitement by just watching a movie. The avenue to this type of evocation of aural percepts is, of course, music.

9. So the 'deep secret' I spoke of is the hidden store of millennia of evolutionary travails and experiences of ancient hominids in their ascent to full humanness. Over the course of hominid evolution, these experiences would have amassed a considerable staple of functional sonic stimuli (I call them 'experience percepts'), and because each of them reflects something utterly basic and fundamental to what it means to be a human being, the mood associations they evoke and stimulate when we play or listen to music are often of the type that strikes a very deep chord in us. But you can also see from this, I think, that ignorant manipulation is apt to have disastrous consequences. We have become very sophisticated since then; and societal living today has alienated us so much from the world of nature that we hardly recognise the difference any more between what is 'natural' and what is artificial. We have lost touch with the impact musical sounds have on our psyche, and are therefore unable to distinguish good from bad, good from evil, unless we spend years on it in a private endeavour to get back to these roots. This is a very recent phenomenon. For instance, if you read the poems of Tyrtaeus, you may be startled to find that he castigates the Spartan youths for tuning their instruments in (say) the Lydian instead of the Aeolian mode, recognising that one of these is extremely detrimental to their martial spirit. This is absolutely indiscernible to us today; it is a sensitivity long gone. But that power is still there, because it is the power of the mind. We today just don't make enough of an effort any more to keep that flame truly alive.

If you wish to pursue some of these thoughts on your own, I can recommend a good book to start on: Music, the Brain, and Ecstasy by Robert Jourdain. The author is a musician with scientific training. Not much philosophy is to be found in his pages; but another sorry chapter in our general delinquency in respect of music is that very few philosophers have written knowledgeably enough on music to qualify as real philosophy. A notable exception is Susanne Langer, whose books Philosophy in a New Key and Feeling and Form contain important chapters on music. Finally there is a book by Merlin Donald, Origins of the Modern Mind, which is not concerned with music at all, but enables you to study some of the evolutionary factors relevant to the mind in considerable depth. But to read this with profit, especially if music is your priority, you need to do a lot of independent thinking while the author talks to you, so this is perhaps a book to keep on the reserve list for when you have reached a relatively advanced stage in your studies.

Jürgen Lawrenz


Stephanie asked:

Explain the cogito?

The Cogito is Descartes' famous argument to prove his own existence. Its most famous formulation is "I think, therefore I exist" (although that formulation appears in only one place in Descartes' works).

The idea behind it is that it is self-refuting to doubt (or deny) that one exists because no one can doubt or deny that one exists unless he already exists. (Of course, as Descartes himself recognized, that shows only that you must exist while you deny or doubt that you do, not that you must exist when you do not.)

Ken Stern


Rich asked:

My problem is this. How can I seriously consider the major skeptical arguments without being depressed about the possibility of being a brain in a vat, never having any real knowledge of the world outside my own mind, and not being certain that I can ascribe consciousness to other people?

Well, if they depress you that much, maybe you should think about something else? It's not as if there aren't other worthwhile things to think about, after all. It seems to me that your problem may not be the topic but your general viewpoint, and perhaps you should seek professional help with that. If you are really seriously having problems ascribing consciousness to others, you do need professional help; please seek it.

Steven Ravett Brown

The problem you describe is a difficult one. Realize that your mind is quite capable of imagining that other people are similar to you. So if you possess consciousness, why not they?

True, you're like a brain in a vat, but instead of seeing that as a weak point, see it as a strong point. You're able to imagine anything you want. And if you're not a masochist, you're not going to imagine things that make you feel bad, are you? (In fact that is precisely what fatalism is about.)

Henk Tuten


Tanja asked:

I was wondering what religion and philosophy have in common? and also what makes them different from each other. You see this is my 1st year studying religion and theology, and I'm very confused!

I was also wondering, between religion and philosophy, what is your opinion about which one is more necessary in the new century?

Religion is often wrongly associated with extrinsic factors like its institutional setup or forms such as worship and holy texts, but religion is basically about ideas. Out of its ideas flow its institutions, its behaviour and its history. Religion and philosophy are both about ideas. But they are about ideas in different ways. Broadly, religion is about ideas qua God; philosophy is about ideas qua thinking. Of course philosophy may take up thinking in more limited ways which do not recognise the universality and authority of reason, but subject reason to ideology and the like; the same happens in religion, where one may become 'pharisaic' about it. But religion still has to think about God, and thinking in philosophy quickly comes to recognise the universalising power of reason. So while both philosophy and religion basically have to do with ideas, they have to do with ideas in different ways; only these different ways soon lead back toward each other again. Philosophy and religion can't get away from each other. Modern philosophy (since the 18th century in particular) is avowedly secular and therefore tries to think in a way which steers clear of religion (of ultimate notions such as love and truth). However, modern reasoning in ethics (of what is ordered to the good) steers even modern philosophy back toward questions of morality (of what is right) and thereby into the central province of religion.

Philosophy without religion is trivial and vain. Religion without philosophy is ignorant and often malignant. In the new century religion needs to rediscover its sister Philosophy, and Philosophy needs to soften her heart to the ideas precious to religion and join forces with it.

Matthew Del Nevo

I believe that you are contributing to your own confusion by trying to put a barrier between religion and philosophy. Religion is a philosophy, and there is such a subject as 'Philosophy of Religion'. There is also a related topic called 'Moral Philosophy'. Philosophy asks questions like: Can we prove there is a God? Can it be shown that fundamental religious beliefs are true? Can it be shown that fundamental religious beliefs are possible? Are fundamental religious beliefs justifiable? Was the universe designed? Is it reasonable to hold fundamental religious beliefs? Are there beliefs which do not require justification? Is it a mistake to ask for justification of fundamental religious beliefs? Is religious belief possible?

As you will be aware, Theology is the study of God, religion and revelation. The difference between philosophy of religion, as briefly indicated above, and the topics of theology is that the latter are part of a philosophy which accepts by faith the existence of a god, and backs this up by a doctrine of beliefs which calls upon witnesses, prophets, representatives of God on earth, etc. Religion unwittingly involves another facet of philosophy called 'Dualism', which recognises a material body linked to a separate mind or soul; in most religions the soul is believed to survive the death of the material body. This often requires another belief, which it could be argued has a metaphysical basis, and that is the notion of a location for the soul after the death of the physical body. In the Christian religion this is called Heaven.

We could say that religious believers recognise the philosophical questions the answers to which, in a way, can either threaten or support their faith and beliefs. However, a conviction of the truth and authenticity of their position is sufficient to ward off any threat, and is sufficient to provide its own supportive arguments. Seen as a philosophy in its own right, a major universal religion like Christianity is a powerful and intricate conceptual structure, based on an alleged source of divine revelation, the Bible. There is no argument within the Christian religion regarding the authenticity of the texts; differences only arise with regard to their interpretation.

Your second question about the necessity of philosophy and of religion in the new century depends on what you mean by necessity. To ask whether one is more necessary than the other is, to my mind, a bit like asking whether jam or marmalade is more necessary at breakfast time. It is a simple matter of choice. In my personal opinion both have always been needed. I am not sure why you should single out the new century to favour one or the other, unless, looking at it pragmatically, you subscribe to the general notion that religion is, and has been for some considerable time, on the decline. This seems true with regard to the Christian religion; the general view is that churches are emptying rapidly. However, this is offset somewhat by the increased interest in New Age religions, but that is a subject for a separate debate.

There is a general feeling that the world is becoming more secular, seen in a swing towards material interests, and a corresponding swing away from spiritual consciousness. There is less dependence on the church for guidance: births, marriages and deaths are seen to involve the church less and less. Religion is no longer a foundation for the law of the land; it no longer constitutes a deterrent for law-breakers, nor does it provide a basis for accusation and punishment. The steady collapse of, at least, the Christian religion has to some extent undermined moral and ethical persuasion.

There is much to say for and against religion, but in view of the secular shift and what seems an unhealthy increase in material ambition, I for one would certainly welcome some sort of religious revival. This, of course, is where philosophy can be very valuable in keeping a focus on religious and moral concerns; ironically, moral debate does not require a religion on which to base its tenets.

John Brandon


Melissa asked:

My question is: How can I relate or argue that dreams are experiences that contain aspects of consciousness? I want to tie in some philosophical references and form some sort of argument. Any suggestions on how I could go about doing this?

Well, you can start with Descartes, who considered the possibility that he might be dreaming. A suggestion here is that dreams are indistinguishable from being in an experiential perceptual state, and while this isn't pertinent to Descartes' project, there is the assumption that dreams are conscious: a subject is aware of some content present to the mind and also has self-awareness.

I am not sure that anyone has said that dreams are not conscious. Rather, it is the case that they point an analyst to truths about the unconscious. So you can read Freud on this, since he analyses the content of dreams, and so he too assumes that dreams are conscious, or the patient would be unable to remember them. Of course, we cannot always remember dreams, but that doesn't mean we didn't experience them at the time. Dreams are always experienced, but not always remembered or known.

Although it is thought that dreams point to truths about the unconscious, they also issue from unconscious desires or wishes, according to Freud. There is a causal relation running from the unconscious to consciousness, which differs from perceptual states, where the initiating cause is something external in the world.

It so happens that I am reading Adam Phillips (On Kissing, Tickling and Being Bored) and came across a discussion of the work of Masud Khan, again an analyst rather than a philosopher, but one who has written a lot about dreaming. He sees the dream not just as an experience, but as revealing something about the 'impenetrable privacy of the self'. It would be interesting to look into this: the nature of the subject in a dream (is his ego present?) and the nature of representation in dreaming. (In terms of the phenomenology of dreams, I would say that dreaming is quite different from perception, and Descartes ignored the phenomenological aspects of their closeness and strangeness. He didn't need to consider this, of course, because he was concerned to find evidence for the truth of sensations.) But anyway, this is where you might consider which aspects of consciousness are present in a dream. For this, Khan will be more helpful than Freud. And then to return to philosophy, you can look at John Wisdom's Philosophy and Psychoanalysis.

Rachel Browne

The kind of dreaming that you yourself guide in a state of semi-consciousness is in some circles called lucid dreaming. If some dreams are about real life, then see it as improvising on a theme. It helps if you can guide such a free stream of thought. Techniques like yoga are often used for this purpose.

Maybe on http://www.rider.edu/~suler/dreams.html, and on http://www.knuten.liu.se/~bjoch509/works/aristotle/dreams.html you'll find clues. These were the only references I found that seem to give substantial information.

Henk Tuten


Spike asked:

Here's a bunch of questions I've been thinking on recently all to do with polytheism. I'm particularly interested as to whether there could be more than one omnipotent God. This seems of particular importance to the Christian trinitarian who seems, at least controversially, to be committed to the notion of three supreme beings.

Are there any other 'God' characteristics you can think of which are problematic to be owned by more than one being, or even better characteristics that would be greater only if there is another being which possesses them? Love has been suggested but I'm not convinced by that.

All these issues depend on your concept of 'omnipotence'. It is, I think, a delusion to believe that we have, or ever had, a thoroughly satisfactory, unambiguous notion of it. And thus, as you must know, quite a bevy of omnipotent gods have crossed the horizons of mankind. The Greeks thought of Zeus as omnipotent, yet he had trouble reading the future (cf. the Prometheus myth) and was easily hoodwinked (cf. the Iliad, he and Hera on Mount Ida), so at best this must be reckoned a qualified 'omnipotence'. Jehovah (and the Christian God) are also understood to be omnipotent, but he (or they) have dreadful communication problems with his/their flocks, who simply couldn't understand his messages or thought they were mutually contradictory, on top of which he/they broke or forgot promises rather too casually, so here is another qualified 'omnipotence'. All of which ought to suggest to you that it's the thought we associate with 'omnipotence' that is the really crucial issue. I'm inclined to think that it's an anthropomorphism to begin with and therefore irrelevant to spirit beings. I mean, to be totally blunt about it: a being absolutely omnipotent and omniscient could not be separated from the universe; it would have to be the universe itself. And this suggests that only pantheism and Spinozist philosophy have a proper conception. But you may question whether even this 'proper' notion is meaningful. I can't see much sense in calling the universe and God by two separate names if they are the same, even if, as in Spinoza, there is God and everything in the universe is adjectival to him.

About trinitarianism I can't pretend to be knowledgeable, but even the little I know does not argue (as you put it) for three supreme beings. The very thought is a non-issue, for it is one being in three emanations. I think there is some confusion here (maybe shared even by some trinitarians themselves?) between the concept of three persons and the three-personed God. To God, one or three or three million makes no difference nor conflict; he is not bound by our sense of individuality; and thus however many 'persons' God is comprised of, he is still just one.

I don't know if this is what you wanted to hear; but religion, in my opinion, is a seed bed of rubbery concepts, very few of them amenable to stringent philosophical examination, and the concept of omnipotence is one of these. I think one of the reasons why scholasticism has been in ill repute for such a long time (on the whole undeservedly so) is that its practitioners were obliged to invent ever more ingenious schemes for making such intrinsically unrealistic concepts credible; and you will see at once that, when the influence of the Church waned, scholasticism was one of the casualties and came to be irremediably estranged from philosophy. So ultimately, if your curiosity is not dimmed, you need to stay with the religions and swallow (if you can) whatever arguments are put forward by one doctrine or another.

Jürgen Lawrenz


Deb asked:

"War is not the answer." Do you agree or disagree? Why?

I presume that this question is associated with the present conflict in Iraq. However, I sense that it is aiming at broader treatment of the problem.

I suppose we could ask at the outset: "War is not the answer to what?" It is difficult to say whether one agrees or disagrees until an answer to this question is provided. Hence, there are situations in which most people would agree that war is the only answer, and situations in which most would agree that war was not the answer.

In a world where the human race is divided up into groups with vastly different world views, vastly different moral concepts, and vastly different regard for the sanctity of human life, there is going to be conflict; particularly when moral concepts and world views are reflected in widely diverse religions and creeds.

Let us ask a basic question. Why do states have armies? The answer that most would possibly offer is, to be prepared to defend the state against aggressors. The ironical point of this answer is that if every state firmly believed this no army would ever be used aggressively against another state! However, we know that this is not the case. Some armies are raised for the specific purpose of aggression. History is loaded with evidence of this as far back as records go.

To come back to the point, if a state is threatened and subsequently attacked by an aggressor, what should it do? If it allows the aggressor to walk into its country and occupy it, then this will avoid war. An example of this is Nazi Germany's annexing of Austria in 1938. Alternatively, if the oppressed state resists the occupation, then that is war. The cost of Austria's action, or non-action, was to lose its independence and to be suppressed beneath the Nazi yoke until the end of the Second World War. Poland was one of the states which put up resistance to the Nazis; it was thoroughly beaten in a short time, suffering tens of thousands of both military and civilian casualties, and then, like Austria, was suppressed beneath the Nazi yoke. Because the Poles had shown the temerity to resist and to kill Germans, they were treated worse than the Austrians. Now we have to choose. Was it better for Austria to lose its dignity but save thousands of lives, or for Poland to lose thousands of lives but maintain its dignity? Both, of course, lost their freedom. Putting all this into the full context of the Second World War and considering the final outcome, I believe that most, particularly at the time, would say that war was the answer: the vicious aggression of Germany, and later Japan, had to be resisted, and somehow crushed.

In the current Iraq war the situation is rather more complex, and therefore there is no clear-cut answer. Those who are for the war can state a reason; those against can state a reason; the choice is very much a personal one. Those who believe it is right to free a people from their tyrant will say that the war is worth the suffering. Those who believe that the war is an excuse for hidden motives will be strongly against it. Then there are other factors: who gave the USA the right to police the world? Seeing that neither the USA nor the UK can show real evidence of being threatened by Iraq, what right have they to attack this nation? What right have individual states to overrule the UN? Then, of course, there is the underlying suspicion concerning oil. Considering all this, my personal view is that in this case war is not the answer. I also believe that there is no evidence linking Iraq to the abomination of the eleventh of September. Diplomacy in all cases of disagreement should be carried as far as it can possibly go; in this case it was not.

None of the above detracts from the fact that, like most others, I believe Saddam is a monster and that the world would be much better off without him.

John Brandon


Espie asked:

Can you give me an idea on presenting and explaining also evaluating Flew's analogy of the gardener? What is this analogy supposed to tell us about belief in God?

The general idea is that just as different people can look at that garden and draw incompatible conclusions about whether it has a gardener or not, so two different people can look at the world and draw incompatible conclusions about whether there is a God who designed the world. (By the way, although Antony Flew discussed and anthologized this parable, it was the philosopher John Wisdom who first wrote it.)

Ken Stern


Luke asked:

I have been reading Bertrand Russell's Introduction to Mathematical Philosophy, and I am stuck on his discussion of Frege's definition of the concept of number.

As a visual example, Russell talks about putting things into bins according to the relation of similarity. For example, I note that there is a one-to-one correspondence between my socks and my feet. So, I should put my pair of socks and my pair of feet in the same bin. In this bin, we can also put my hands, my gloves, my friend Eddie's hands, my friend Jenny's eyes, each married couple, and in fact any collection that comes in a pair. This bin will be, as Russell says, a collection with an infinite number of members, and each of these members is a collection with 2 members. We label (define) this bin as the number 2.

So here is where I start getting confused...Russell defines a number as "the set of all classes that are similar to the given class". (Here class essentially means 'collection'.) For an example, the number 2 is defined as the class of couples. I think my confusion is over what Russell means here by "the given class". He phrases it another way "The number of a class is the class of all those classes which are similar to it." What is meant by "it"? Which class is "it" referring to?

I am trying to sort this definition out in terms of the bins analogy. We assembled bins filled with collections that are similar to each other, and labeled them 'two' or 'three' or whatever the case may have been. But Russell then defines the number of the bin as the collection of the collections that are similar to the bin, not to each other. My confusion is that the bin has an infinite number of members, so its members are not similar to it, but to each other. (For example, the number of the bin of couples is the set of all couples, and there is an infinite number of couples. They are similar to each other, not to the whole bin.) It seems to me that this definition of number leads to every number being infinite.

I think that the key to my understanding of this is the point at which we define the number 2 to be the bin containing all couples. It seems that the class of all couples is not the same as the number 2 (for example, that bin has infinitely many members). Russell says essentially that of course defining 2 as the class of couples feels strange at first, but this strange feeling goes away. The bin containing all couples is a certainty, whereas the number 2 is a "metaphysical entity about which we can never feel sure that it exists". Therefore it becomes natural to deal instead with the class of couples. I think my problem might be that the strange feeling has not gone away yet, and I could use some further discussion to help see why it should.

Well, notice that on p. 18 of Russell, B. Introduction to Mathematical Philosophy (London: George Allen & Unwin 1930) he states, "the number of a class is the class of all those classes that are similar to it". So first, you must be very very careful of your terminology here. It's not the set. Second, class does not mean "collection", and that is your basic problem. Russell very specifically states that this is incorrect; see p. 12, for example, where he says that he will speak of a "class" instead of a "collection".

The bin containing all couples has this similarity between couples: they all have two members. That is their one similar characteristic: that of having two elements. That one characteristic holds over an infinite number of specific instances, and that is the point of Russell's conception of the class.

So one might say, employing Russell's intensive definition (p. 12), that the "defining property" of the infinite-sized class (p. 13: "a class and a defining characteristic of it are practically interchangeable") of things with two members: couples, is twoness, which is the class-idea, the "number": two. The "bin" is precisely that defining property, no more and no less. Therefore, the class: couples: twoness: two; is precisely identical with that bin, and with the number two.

I mean, you're getting lost in the details. It's just a way of saying, "What do all sets of two things have in common? Hey, there are two of them! So we'll define the number two by just saying: what they have in common is that number." That's really it. Really. The only confusing thing is that Russell is taking that as a definition of number, which sort of turns things around from the normal way of thinking of it, which is: the number two describes that there are two things. He's just saying, no, it doesn't do that, what's really happening is that we get the number from our intuition, if you want to think of it that way, that what all those things have in common is that there are two of them. So realizing that we have that intuition of number after seeing all the couples is the "strange feeling", because we usually think that the number is first, as a description. You see?
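The "bins" picture can be put into a short Python sketch. This is a toy illustration of my own, not Russell's notation: two finite collections are "similar" when a one-to-one correspondence exists between them (which, for finite collections, just means equal size), and the bin of all mutually similar collections plays the role of the number.

```python
# Toy sketch of Russell's "bins": group collections by the
# existence of a one-to-one correspondence between them.

def similar(a, b):
    """Finite collections are similar iff a bijection exists
    between them, i.e. they have the same number of members."""
    return len(a) == len(b)

collections = [
    ("left sock", "right sock"),
    ("left hand", "right hand"),
    ("left eye", "right eye"),
    ("x", "y", "z"),
]

# Build the bins: each bin collects every collection similar to
# the others in it.  The bin of all couples plays the role of the
# number 2; the bin of all triples, the number 3.
bins = {}
for c in collections:
    bins.setdefault(len(c), []).append(c)

two = bins[2]  # "the number 2", restricted to this toy universe
print(len(two))                              # 3 couples in our universe
print(all(similar(two[0], c) for c in two))  # True: all mutually similar
```

The point of the definition is that nothing over and above the bin is needed: "two" just is the class of all couples, and the couples are similar to each other, not to the bin.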

Steven Ravett Brown


Fike asked:

What are the similarities and differences between Thomas Aquinas' and Martin Buber's thought towards religion? What are the strengths and weaknesses of their approaches to understand our relationship with a higher power? and what conclusion can you draw about the difference between describing something and interpreting something?

I really appreciate your time, thoughtful ideas and opinions.

Aside from obvious differences, like Thomas being a Dominican friar from southern Italy writing in the thirteenth century and Buber being a German Jew affiliated with Eastern European Hasidism writing in the twentieth century, there are substantive points of comparison.

Both are existential, but Buber's emphasis is humanist, he defines man existentially, while Thomas defines God existentially. Buber is unclear about God, whether He is 'alongside' existence or 'over and against' it, but it is clear that Buber's God is interpersonal. Both writers relegate psychology to its proper place within metaphysics (rather than instead of it, as in Freud and Jung). Thomas has more to say about God and a theory of language (analogy) that is existential. Buber's existential theory of language is limited to basic utterances, and his ideas largely revolve around these. Thomas showed the proper way to talk about God by contrast with improper ways (metaphoricity, univocity, equivocity). Both men saw philosophy and theology in terms of each other and did not compartmentalize them. Thomas saved theology/ philosophy (he revolutionized it) from essentialism, Buber saved theology/ philosophy from psychologism and ideological monism (such as Marxism, materialism, Communism, historicism, phenomenalism and so on).

Matthew Del Nevo


Larry asked:

Are Works of Art in the mind?

I would like to add something to Jurgen's answer (Answers 20).

No. Works of art such as a painting or sculpture are objects in the world. There is a vast amount of literature on what makes an object a work of art, and one idea is its fitness to be an object of aesthetic appreciation, although this lets in natural objects. Another is that the institution of art determines which things are works of art. Jürgen mentions music, which is a particularly difficult case. Is the sonata the score, or the performance? If it is a performance, that is still external to the mind, and might be thought of as an event. But then each performance is different, so its relationship to the score might be an identifying feature.

There is a possibility that Martians might not be able to appreciate objects for their aesthetic qualities, but the only implication here is that they lack a mental and emotional capacity to respond in a certain way. Aesthetic qualities may be called tertiary. Whereas it is within a normal person's capacity to detect a secondary quality such as blue, aesthetic appreciation seems to be more a matter of taste. But even if there is this subjective element, there is still something in an object to which we respond, and our tastes can be educated so that we can be brought to appreciate an aspect of something. To say that appreciation of a work of art depends upon capacities and taste is not the same as saying that a work of art is in the head. It is simply to admit that aesthetic response is more sophisticated and less natural than having a sensation of blue.

A piece of music arouses emotions in us because of the nature of the music. It is not some strange accident that when a piece is played we just happen to feel sad. Art is deeply related to the way we live in the real world. Literature, which invites us into the internal lives of others, can change our attitude to people. Pictorial art can show us new ways to look at the world.

When we turn inwards, in our sadness on hearing a piece of music, this is an internal occurrence. What we learn from literature brings an internal change in us as subjects. When we see the world differently, there has been an internal adjustment. But in each case this is brought about by something external to us. If art were in the mind it would be a dream, and this is not what we mean by art. Works of art connect us to the world; dreams do not.

Rachel Browne


Pierre asked:

Could you explain to me induction versus falsification?

I'm not totally sure what you're actually asking, though the wording suggests to me that you have in mind the collecting of evidence with a view to elaborating hypotheses or theories from this information (i.e. what Bacon meant when he dealt with this issue). It is not a sound philosophical principle (cf. Hume), but it works sufficiently well in a practical scientific setting and in many cases with such high probability of assurance that it would be churlish (as well as impractical) to deny the validity of the principle as a default methodology for science. Problems arise when scientists presume on the strength of any hypothesis or theory that may have been derived inductively to work out a proof positive in their favour. It is then that falsification enters the picture. But to appreciate what's at issue, you can't restrict yourself to science, where induction is in any case more preached than practised. You need to look at the philosophical backdrop, from where the concept of falsification arose in the first place.

Now the historical setting for this is Vienna in the 1920s and a group of thinkers collectively known as the 'Vienna Circle', whose philosophy goes by the title 'logical positivism'. Most of its members are forgotten names today; at most you might still come across Rudolf Carnap now and then, who was the leading light of the group. Put in the plainest terms, their doctrine was that knowledge is what can be established by watertight proof: so any theory purporting to be true needed to accumulate and then evaluate the evidence and, of course, both accumulation and evaluation relied on scientific tests for validation. A programme, one might say, as old as the hills; yet because these men brought their doctrines to bear on philosophy, and indeed attacked philosophy for admitting such slipshod disciplines as metaphysics into its canons, the Vienna Circle achieved renown and respect, and thinkers of the calibre of Wittgenstein and Russell did not disdain association with them, at least for some time.

The euphoria, however, didn't last long — except in the public domain where, if today you see an advertisement for makeup or washing powder in which science is supposed to have proved the benefit of using a particular substance, that is a legacy of logical positivism. The problem is, of course, that science cannot possibly prove such a thing; in fact, the whole philosophical issue involved here is that science cannot prove anything whatever 'incontrovertibly and positively'. The man who spotted the chink in this reasoning was the young Karl Popper, then loosely associated with the Circle, though never a member.

Popper himself claims only that his reading of Hume made him realise that the tenets of logical positivism had been shown to be unsound by Hume 200 years before the Viennese brought them up, in full ignorance of this historical fact. The point he nailed down (on Humean precedent) was that you can observe what you will as many times as you wish; nevertheless phenomena confer no guarantees. It is an in-principle impossibility. From this realisation sprang his idea that a theory based on hard, factual and experimentally tested evidence may yet be accepted as true (for the time being) for as long as it has not been disproved, or, as he termed it, falsified. In short, he realised that phenomena do after all guarantee one thing: they return a positive 'false' to mistaken theories! According to Popper, this is the best we can have: if a theory stands up to repeated blistering attempts at demolishing it, then we may confidently pronounce it 'true', though always and only provisionally. Contrary therefore to the beliefs of the Vienna Circle, a theory is not established by positive evidence and experiment, but by resisting every effort at falsification.

To pursue this as an interest, obviously Popper's writings would be the best source: always get it from the horse's mouth if you can. The best, though a difficult text, is Objective Knowledge. But it's worth the effort.

Jürgen Lawrenz


Graham asked:

What is the true definition of the philosophy of education? And why do we need it?

That's a tricky question, but I'll have a try.

Philosophy is divided into all kinds of specializations. Because the field philosophy attends to is very wide, this is sensible. But I can understand your question as well: different specializations often hold opinions about the same question, and in that way they cause a lot of confusion.

You see the same in other sciences, but generally not to the same extent. This suggests, I think, that either the definitions of the specializations in philosophy are too broad, or there is less discipline. No offense, but I suspect the latter. The nature of philosophers is to argue, and this trait often wins out over the knowledge that in a given case they should keep quiet. From the outside, philosophy then looks like a chicken pen.

On the Internet I was often surprised that different specializations answered the same questions differently, with the meager excuse of taking a slightly different view; in most cases the result is simply confusing. Outsiders are generally not interested in minor differences of view. They want to know the basics, not to wade through a lot of confusion.

So, to answer your question: specialization in philosophy is needed, BUT it should not lead to public debates that hold no interest for outsiders. Philosophy should do a better job of marketing its own product. Once those conditions are satisfied, a lot of definitions of the philosophy of education will do.

Henk Tuten


Stu asked:

Apart from being able to feed yourself and build shelter what's the advantage of knowing anything?

You seem to be assuming that the only things we desire are very basic needs such as food and shelter. However, most humans have far more complex needs and desires than this. Love, companionship, wealth, power, security and so on are all powerful motivators. Knowledge itself is seen as an intrinsically good thing by many, including myself.

True beliefs about the way the world is tend to be of greater value to us than false beliefs in interacting with the world, and therefore help us to satisfy those other desires we might have.

Mike Lee

"Food" and "shelter" are nice vague words, aren't they? Now, just what do they mean? What kind of shelter do you want? You want clothes? What are "clothes"? Animal skins? Ok, how do you kill an animal and skin it? What do you do to the skin to make it into "clothes"? Let's see... you kill an animal with... a spear, right? Ok, how do you make a spear? You "cut" a tree... with what? You put a point on the tree branch... with what? Your teeth? No... a knife? How do you make a "knife"? Well, maybe you just hit the animal with a club, how about that? Ok... what animal? Where do you find it? Well, let's say you've found it and hit it... and you just rip the skin off... then what? You just drape the skin, all bloody and dripping, over yourself? And how long do you think it would take to rot? Whoops, I guess you have to treat it somehow... now, how do you do that?

Well we haven't even gotten past clothes yet, pretty crude ones, and we're sort of stuck from our lack of "knowledge", aren't we... I guess we have to learn how to make knives, to tan skins, that sort of thing, right? Now, once we've learned to make a knife to kill an animal... guess what, we can use it for other things! Like killing people... like cutting wood, if we make it big enough... and gosh, once we cut some wood, we can make a boat, a house... but to make a boat, we have to do that thing: "learn", you know... like, how to make a rudder, a mast, maybe even sails... and after all it would be nice to know how to navigate just a little, wouldn't it? Maybe make a net to catch some fish? But making a net means learning again... about knots, about making rope... it just never ends, does it. Once you learn how to make rope, then you can tie all sorts of things, can't you... I mean, a little fish, what's the harm in that? It makes some variety with all the meat we've been hitting with our clubs, right? Or are we using knives yet? Oh, by the way, how do we teach all this stuff to our kids... oh oh... we have to invent "writing"... oh dear, now it really starts, doesn't it.

I guess we also want "food" like vegetables and stuff, right? But that's just more to learn... plowing requires a plow... now what's that? How do you make one? How do you use one? You want to dig a hole... but that needs something like a shovel, and we don't even know how to mine metals yet, much less smelt them... so I guess we need wooden shovels... now how do you make one of those without a metal knife? Well you could chip stone into one, I guess... or use a stone knife to make one... how do you make a stone knife, anyway?

So "feeding yourself" and "building shelter" require enormous amounts of accumulated knowledge, if you want anything resembling what you're used to. You want to go into the wilderness and live off the land? Hey, sure, just don't forget where your axe came from... an iron mine, a smelter, a mold, etc... all requiring extremely sophisticated technology, supported by all sorts of infrastructure, technological and economic. Those nice warm clothes, woven on a loom from harvested cotton... the loom built from wood and metal, an accumulation of thousands of years of technology, the cotton grown with plows... even the sack you stuff the cotton balls into is woven, isn't it. Your leather boots... tell me, how do you make boots? Bootlaces? Boot soles? What if your boots are synthetic? Oboy. And all that knowledge can be used for... food, shelter, clothes, transportation. And maybe even a bit of fun now and then... is that so bad?

But maybe what you want is to live like the Native American Indian... noble and free, right? Well, noble, anyway... their lives were constrained by unbreakable customs... not what we'd call free. Well, you could break them... and die. Or get an infection... and die; sick... and die; injured... and die. Or maybe, if you're very very lucky, do ok until you kill off all the buffalo, the way the Native Americans' ancestors killed off the mammoths. Yes, they did. And starved, many of them. Well, there's always the nearest war, instead of TV... good entertainment, slaughtering your neighbors... very highly thought of in those days. Unless you were the ones getting slaughtered, anyway.

Now where were we... oh yes... the advantage of knowing things beyond "food" and "shelter"... you mean, like medicine?

Steven Ravett Brown


Cory asked:

It seems to me that by saying that knowledge is unattainable, the skeptic seems to be professing that he has knowledge of what is unattainable. Is this not being hypocritical?

Even though universal skeptics claim that true knowledge is unattainable, could they still claim that some beliefs are more worthy of being embraced than others? Do some universal skeptics still have any strong beliefs?

Cory also asked:

Hume said that just because the sun has risen every day in the past, it is not certain that the sun will rise again tomorrow. It was brought up in my philosophy class that we think that the sun will PROBABLY rise tomorrow. Is it possible for one to have certain knowledge that something will possibly/ probably happen? Could this disprove skeptics' arguments?

Why not? And they do claim that. I have never met a universal skeptic but, as I have just said, why shouldn't they? Believing and knowing are different.

About your second question. I don't think it is possible to have certain knowledge about anything, but that doesn't mean we cannot have knowledge, but without certainty.

We do, without doubt, think (believe) that the sun will rise tomorrow. And we also think (believe) it will probably rise tomorrow. As I said before, I don't believe it is possible to have certain knowledge about anything, but I think we can know that the sun will probably rise tomorrow. The fact that it has risen in the past, and that there are no good reasons for thinking it won't rise tomorrow, justifies us not only in thinking that it will rise tomorrow, but in knowing that it is probable that it will rise tomorrow. Moreover, it would also justify the claim that we know it will rise tomorrow.

The fact that it might not rise tomorrow is not a good reason for thinking we do not know the sun will rise tomorrow, since the principle that if something might not happen, we cannot know that it will happen, is false. So, you can tell your classmates not to fret. We not only know it is probable that the sun will rise tomorrow; we know that the sun will rise tomorrow, although not for certain.

Ken Stern


Will asked:

Although, having taken a few courses, I have some background in analytic philosophy, I am still very unclear as to what it is to "do philosophy". How can we distinguish it from mere critical thinking? Suppose there is a problem. How can we approach it philosophically? When I did philosophy with, say, a social problem, I found that I seemed to not be thinking philosophically. It seemed to be something anyone with no acquaintance with philosophy can also do.

Your last remark, 'it seems that anyone can do it', characterizes your question. Looking at myself, I partly agree: with common sense I got far in philosophy. Nevertheless I found that, after studying subjects in detail, I changed or refined my opinions. So one can approach any problem with common sense (sometimes you get the impression that even this is missing), BUT to draw conclusions you must know the whole debate, so you have to go into detail. Not too fast, though, because common sense mostly gives you overview, while details tend to get you stuck in a local discussion. You need BOTH overview and detail. What you are perhaps observing is that philosophers often go too far into detail, while for you the overview is missing. That might be true, but it only indicates that they do a far better job of selling details than of providing overview.

There is no need to be impressed by philosophical titles: as in all studies, part of the work is just hard graft that could be done by any intelligent person using the right methods. Maybe philosophy as a discipline should do more to show that it does indeed provide methods that can be helpful.

Henk Tuten


Edaw asked:

I have some questions about religion:

Does God know what it feels like to not know everything? If not, then he doesn't know everything. But if he does know what it feels like to not know everything, then he doesn't know everything.

Is there more than one Jesus? If E.T. exists and God exists, did God give his alien son to the aliens too?

How is reincarnation possible? Aren't we reincarnating every second since our cells are dying and being replaced?

Edaw also asked:

Is there such a thing as nothing apart from something?

When you say everything does that include nothing?

If nothing can't be destroyed, does that mean that nothing is conserved?

Since the universe is expanding do you think that everything is becoming nothing? Since the density of the universe is decreasing and the volume of the container of the universe is infinite, therefore a constant divided by a number approaches zero as the number approaches infinity.

I'm sure theologians would be happy to tackle your questions head on; and there was an era, known as scholasticism, when religion and philosophy were interlocked to such an extent that insoluble issues like these occupied the bulk of philosophical thinking. However, once one steps outside of theology, one is bound to mention not only the intrinsic insolubility of the questions, but how little they apply to God at all. For example, 'feelings' in whatever sense are distinctly creature attributes and I can think of no good argument why God should 'know' anything about them, why he should reserve a dimension of divine knowledge to the emotional foibles of creatures. Maybe you should look at your list again in the light of anthropomorphisms and reflect on the difference between a physical creature and a spiritual being. God is not a bigger than 'human' human, the very error into which many questioners fall who have not thought about this.

And so your other questions, about Jesus and ET, really need some thinking as well, because there are no ready-made answers that can just be pulled off a library shelf. As to 'nothing', the 'expanding universe', 'infinity' and so on, these are all very big issues, much debated and unresolved. Let me therefore recommend that you start a little research of your own, whether on the internet or in libraries, which is infinitely more satisfying than getting my opinion on them, which might only send you on to another opinion and another, ad infinitum. Enjoy!

Jürgen Lawrenz


Francis asked:

Can you prove that God does not exist?

Just to add to Brandon's answer (Answers 20).

First, people do try to prove that God does not exist. An obvious argument that springs to mind is the argument from evil. Mackie famously formulated it in the deductive form such that:

God is omnipotent
God is omniscient
God is all loving
Evil exists.

From these premises (together with the implicit premise that a wholly good being eliminates evil as far as it knows of it and is able) you can conclude that either God knows there is evil, can do something about it, and yet does not want to; or that he knows, wants to do something about it, but can't; or that he would want to change it and could change it, but doesn't know about it. Thus God is not (omnipotent and omniscient and wholly good). Now that argument purports to show that God can't exist as defined by traditional theists. This sort of approach attempts to show that the concept of God is somehow internally incoherent.

Another route people use to argue against God's existence is to explain all that needs explaining without appeal to a God: if God is explanatorily dispensable, then why bother postulating him? Of course this doesn't prove he doesn't exist; rather it suggests we have no epistemological warrant for our belief in God.

At this point Alvin Plantinga is worth mentioning, as he argued that atheism is not the basic position to be adopted. His project, referred to as Reformed epistemology, is to show that belief in God can be a basic belief. Now forgive me if it gets a bit weak here, but I'll attempt an overview of the Reformed epistemology position.

Plantinga is a reliabilist about knowledge: he argues that beliefs are warranted so long as they are produced by a reliable means. The senses, when working properly, are such a means, and as such we are warranted in holding those beliefs caused by the senses. Such beliefs are called basic beliefs. Now religious beliefs are caused by certain sorts of religious experience, and if these are a reliable belief-forming mechanism we need no further justification. He goes on to argue that they are such a mechanism, and that therefore we may believe in God as a default position.

Apologies to Plantinga for any failures in exposition. Now Plantinga's position works against the second epistemological problem, about God being explanatorily dispensable; however, should the concept of God turn out to be incoherent, then no belief in Him can be justified.

To summarise: whilst the burden of proof is often seen to lie with the theist to defend his belief, Plantinga shifts the burden onto the atheist to show why the theist ought not to believe his religious experience.

Mike Lee


Diego asked:

What is the difference between Existentialism and Phenomenology?

I was hoping someone else would do this... "the" difference? Ok, how about this: phenomenology has to do with a movement from the internal to the external; it's about the primacy of the internal, of introspection as a way of solving metaphysical problems. Existentialism has to do with a movement from the external to the internal; it's about the primacy of the external, and how our methods and attitudes of dealing with the outside world constrain and shape our identities. There you go; I hope you understand that, because I'm not sure I do.

Steven Ravett Brown


MisterEd asked:

I heard a phrase, I think it was in Latin, that translated to, "In matters of opinion, debate is pointless. Nothing accounts for taste". Is this a known [Latin] saying? How would it be written in Latin, or whatever language it originated?

The answer to your question is 'de gustibus non est disputandum'. 'Gustibus' relates to 'gustatory', which concerns the taste on your tongue; hence the saying that taste is not a matter for dispute.

Jürgen Lawrenz

The Latin you are looking for is:

"De gustibus non est disputandum."

"There is no [point in] disputing about tastes."

Ken Stern


Kenya asked:

I recently heard about a new type of logic, the name of which I cannot recall, that addresses/ resolves the contradictions inherent in dualism. Can you point me in the right direction since nothing in the Logic category site rings a bell?

I don't know if this has anything to do with dualism, but you might be thinking of a kind of paraconsistent logic called dialetheism, which is associated with Graham Priest and his claim that there can be true contradictions. There is an article on it in the Stanford Encyclopedia of Philosophy.

Rich Woodward


Ryan asked:

Is it just me or is everything ever written is philosophy completely obvious? If you have the ability reason you come up with the same answers as everyone else in history. Every time I read something new, the only thing I seem to learn is that someone else thought that way before me. Descartes may be right on the money with his wax but who really cares, tell me something I don't know. I am looking for someone terse, who can either enlighten me or if not possible least confirm my own findings.

It is one thing to read Thus Spoke Zarathustra which is at least quotable, but something like Beyond Good and Evil cannot be segregated this way, which means it has to be analyzed (boring!) and by the time your done all you can really say is Nietzsche is an idiot who talks to much about what isn't as opposed to what is. At least Machiavelli takes a stand, although his stand may be wrong. It seems like Socrates, Confucius, and Sun-Tzu are the only people original and nobody has yet added or taken away from their findings. Perhaps I should stick to reading Copleston and study philosophy as history, instead of a means of mental expansion. Anything you can give me to renew my love of thought would be greatly appreciated. But my question still remains: Is there or has there ever been anything left to discover in this field or has it all been subconsciously innate to the logical mind?

"Is it just me or is everything ever written is philosophy completely obvious? If you have the ability reason you come up with the same answers as everyone else in history."

That is clearly false; go to any library and you will find different viewpoints and opinions.

"Every time I read something new, the only thing I seem to learn is that someone else thought that way before me."

If it's "new", then it can't be the same, can it. So you've already contradicted yourself.

"Descartes may be right on the money with his wax but who really cares, tell me something I don't know. I am looking for someone terse, who can either enlighten me or if not possible least confirm my own findings."

Philosophy is not TV soundbites. I'll tell you what; find a "terse" writeup of Russell & Whitehead's "Principia". If what you want is to be spoonfed ideas then MTV is a great place to look. Not philosophy.

"It is one thing to read Thus Spoke Zarathustra which is at least quotable, but something like Beyond Good and Evil cannot be segregated this way, which means it has to be analyzed (boring!)..."

Oh dear, analyzing is "boring". Well what can I say... we philosophers are a boring lot, aren't we. Sitting around all day, "analyzing", "thinking"... you know, those boring things.

"...and by the time your done all you can really say is Nietzsche is an idiot who talks to much about what isn't as opposed to what is."

Yes, he was such an idiot... I guess all the well-read, boring people who write so much about him are also idiots, and the people who read them are too... well, I guess everyone, with perhaps one exception, is an idiot.

"At least Machiavelli takes a stand, although his stand may be wrong."

Oh dear... but to find out if it is wrong, you'll have to... analyze, won't you.

"It seems like Socrates, Confucius, and Sun-Tzu are the only people original and nobody has yet added or taken away from their findings."

Yes, you're absolutely correct. All the thousands of books and articles written since them are utter garbage, worthless trash, total idiocy. Please, pay them no attention.

"Perhaps I should stick to reading Copleston and study philosophy as history, instead of a means of mental expansion. Anything you can give me to renew my love of thought would be greatly appreciated."

Um... I hate to be the one breaking this to you, but "thought" involves "analysis". Yes, I know... boring, boring...

"But my question still remains: Is there or has there ever been anything left to discover in this field or has it all been subconsciously innate to the logical mind?"

Oh nothing at all. It's all in the subconscious, just like Socrates said.

Steven Ravett Brown

Man! All in all I disagree with nearly everything you say — but then again I am a philosopher, and if philosophers are good at anything it's defending philosophy against the slander of others.

You write:

"It seems like Socrates, Confucius, and Sun-Tzu are the only people original and nobody has yet added or taken away from their findings."

Well, have you been reading recently? Here's a list of philosophers that you fail to mention, some old, most new, and the crazy things they say...

Heraclitus: You never step in the same river once (yes, once)!!!
Thales: everything is water (???)
Graham Priest: there are true contradictions!
David Lewis: there is an infinity of concrete worlds!
Hilary Putnam: meaning ain't in the head!
Tyler Burge: belief ain't in the head!
Hartry Field: numbers don't exist!
Wittgenstein: the world is the totality of facts, not things!
Later Wittgenstein: 2 + 2 doesn't necessarily = 4!
Peter Singer: killing babies is ok!

Not to mention Quine, Carnap, Russell, Ayer, Kripke, McDowell, McTaggart, Kant, Hume, Blackburn and most philosophers ever. In fact nearly every philosopher ever has disagreed with nearly everyone else on nearly every topic — that's why it's so much fun!

So get a book, read it and enjoy the crazy world of philosophers.

Lastly, "Is there or has there ever been anything left to discover in this field or has it all been subconsciously innate to the logical mind?". Well, I certainly don't think (a) that I have a subconscious, or (b) that that subconscious has ever entertained the idea that everything is water...

Rich Woodward


Arielle asked:

What are the pros and cons of the justified case of Socrates which led him to death by drinking poisonous hemlock juice?

I don't understand what you mean by justified, but I'll try to answer. Socrates was teaching the sons of rich Athenian families to doubt the principles behind the then-existing leadership and democracy. And he did a good job, because he was very convincing. As such he was a pain in the ass of those in power, who started a campaign of rumour that resulted in a 'democratic' trial (in my opinion far from a fair one, and based on a flaw in the democracy then in use that still exists). See him as part of the opposition. By killing him he was definitely silenced. Many centuries later such painful mistakes were halted in the U.S.A. and in Europe by 'freedom of speech'.

So Plato can be seen as one of the first sociologists (combining philosophy with actual politics). In the 20th century, thanks to Critical Theory, sociology really developed. In my opinion it is an unnecessary pity that sociology and philosophy overlap almost completely.

I think that sociologists should not philosophize, and just practice existing philosophy in actual politics. And philosophers should keep away from short term politics. But it's slightly overdone to give such philosophers a death penalty.

Have a look at my essay on democracy at: http://huizen.daxis.nl/~henkt/democracy-essay.html

Henk Tuten


Andy asked:

The question of the universe expanding and contracting has puzzled me. What is the universe contracting and expanding in? My concept of the universe has always been that it is the whole and there is nothing beyond the universe. If that is so, what is outside the universe that it expands and contracts in?

I've asked myself the same question often enough and never found an answer. I can't give you one either, for it puzzles me as much as you why scientists come up with such obviously incongruent notions, which leave the word 'universe' out on a limb as a meaningless concept. Alternatively, of course, you could look upon it as an embarrassment of our understanding: we want to know if the universe is all there 'is', but we can't know, and so we look at atoms and electrons and quarks and leptons and imagine that in their rhythm a mirror image of the rhythm of the universe is displayed. Sorry: this is no answer. But there is no answer, and therefore the whole question is null and void. Perhaps that's one good reason why we still need philosophy!?

Jürgen Lawrenz


Barbara asked:

Is reality something experimented by the circumstances, that is [relative to] the presence of technology or the lack of it, or on the other hand, an absolute reality keeping the same basis established in ancient times?

Sometimes I just browse the old unanswered questions... Here's something for you to think about, Barbara. I'll take an example from the philosopher Heidegger. What is a hammer? It's something to hit nails with, right? What is a nail? Think about it. What if you'd never seen or heard of a hammer or a nail, and someone showed you a hammer. What would you see? Would you see the "reality" of the hammer? Or, in order to see the hammer as it "really" is, would you need to know its function, its purpose? But that purpose is technological, isn't it.

This seems easy... the reality of a hammer must be what you see when you don't know what it is, right? Ok, then what is a tree? You've never seen one before, and someone shows you a tree. What do you see?

Or take the other classic example, the blind men and the elephant. What is an elephant? Is it what you feel? What you feel and see? But how about its internal organs? After all, to really know the reality of an elephant, you have to know that it eats plants, that it has lungs, etc., right? But then, what about its blood? Don't you know its "reality" better if you know that its blood is made of little cells floating in a liquid, and so forth? But this takes us back to the question of what technology lets us know about reality... and what reality "really" is, and how we know it. Would the "reality" of an elephant be different if we could only see it with x-rays, instead of "visible" light? No, you say? But to a being that only knew about seeing with x-rays, an elephant's real essence would be very different than what we understand its essence to be, wouldn't it? Or would it?

Steven Ravett Brown


Kirk asked:

What is the validity and/or utility of the distinction George Santayana makes between Pleasurable Beauty and Admirable Beauty in his theory of Aesthetics?

Very briefly, so as not to pre-empt your pleasure of reading this beautiful and easy to understand book: pleasurable beauty evidently relates to things 'artistic' (but not necessarily art) in the sense of adornment, embellishment, refinement, amusement, that are so to speak 'noncommittal', i.e. they exist, are made and enjoyed for the sake of enjoyment, pleasure and the enhancement of life generally. They are not to be despised on that account, though they are all too often misused. Admirable beauty, on the other hand, is a metaphysical concept and relates as much to truth as to value. A thing of beauty in the former sense need not be a thing of value; but in the latter sense it is, and you'll discover, if you haven't already done so, that Santayana's theory of beauty is in fact a philosophy of value. Now I know that Santayana tends to look at metaphysics with a squint in his eye, but truth and value are themselves metaphysical concepts; in other words, they are creations of the human mind that were not discovered in the environment, nor (as a present-day fad would seek to convince us) are they mere reflections of evolutionary biological trends, of feeding, reproduction and so on. Beauty, truth and value belong among the staples of human proclivities that are frequently pursued for their own sake, not for gain, prestige, wealth etc. This is what he means by admirable beauty. It is strong and implies deeper and more meaningful connections — sub specie aeternitatis. The rest, I think, you should discover yourself!

Jürgen Lawrenz


Yenes asked:

"Psychological egoism is an extremely nasty view of human nature". Can the theory be made appealing? (by Thomas Hobbes).

Well you could try reading Ayn Rand. Her novels: The Fountainhead and Atlas Shrugged take a good stab at making egoism appealing. Then there's Dawkins and some of the interesting results in ethology, where altruism is related to the degree of kinship. That's an interesting rationale for the development of altruism based on the principle of genetic conservation through the protection of one's kin.

Steven Ravett Brown


Alex asked:

I have three related questions:

1. To what extent is a photograph a document?

2. What can one know epistemologically from a photograph?

3. Does saying a photograph can tell the truth presuppose a metaphysics of presence?

When you take a photograph, so-called 'halides' (silver halide crystals suspended in the film's emulsion) are exposed to the energy of light and chemically altered, so that they turn black when the film is developed (I'll stick to B&W for simplicity). Thus a thin spread of darkened halides indicates weak light, and so on through to pitch black. Since our eyes perform basically similar functions (more noticeable at night), a photograph constitutes the record of an event at the instant of the camera aperture opening. It may therefore serve as a proxy witness for parties who did not attend.

Epistemologically, however, its value is, to say the least, dubious. This is because (a) it is a flat image, (b) there is always the possibility of tampering, (c) colour and hue values are distorted (unlike eyes, cameras cannot compensate for the quality of incident light) and (d) a correct interpretation of any (suggested) movement is rarely clearly determinable.

From these few points you should have no trouble answering Point 3 in your list. A photograph cannot tell the truth even if used as a true record, because truth hinges on factors which are not documentary (this is opposing the notion of 'fact' to the notion of 'truth': the two may coincide, but not necessarily, and the criterion used for arbitration is human judgement). Nor is a photo a suitable medium for metaphysical discussions of presence — among several other good reasons, reflected light cannot certify the presence of the object whose reflected light is captured in it. That this light may stem from an object may be assumed; but again, it may be direct light, or the object may not be discernible: so your question reduces to the proposition that a photo is the record of the capture of some radiant energy, nothing more. All deductions from this point onwards are conjectures of varying degrees of reliability, from zero to near certainty. But no presence, real or metaphysical, need be associated with, or can be vouched for by, a photograph.

Jürgen Lawrenz


Amy asked:

I have been doing some reading on Arithmetical Concepts. I have to write a paper, and I don't fully understand how I can approach what I want to write about. I am trying to understand numbers. Now this is my theory, (mind you I am no philosopher). I think that numbers are sort of like characteristics of an object. Say you have five apples. Now you can't abstract the five from the apples, because say if you take the apples away, then you have nothing. You don't even have the "five" anymore. So how do I explain that?

Say you have five red apples; can you subtract the red from the apples? If you just take the apples away, you have nothing, not even red anymore. How you explain that is by saying that the red is a property of the apples. So what is the number 5 a property of, in the example? Well, what do you have? You have a "group", a "set", a "class", right? So how about this: the number 5 is a property of the group of apples, a kind of class characteristic that's only there when that particular kind of group is there... namely a group of 5 things. So then the number 5 is a property of any group of five things.

And also, it's a property of all groups of 5 things, right? In fact, it's the only common property of the set of all groups of 5 things, isn't it, because everything else can be different about the individual things... apples, ideas, fish, volcanoes, old unicorns... since we're including all possible sets of 5 things. So every set of 5 things is a class of 5 things. And all possible sets of 5 things, all together, is a big superclass... with just that one property in common... so the superclass is named... 5.

So we can say, "a number is the class [5, in this case] of all classes [groups of 5 things] which are similar to a given class [of 5 things]"... if we want to put it most generally... and this is precisely how Bertrand Russell defined (a little confusingly — he liked to be clever with words) number.
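Russell's definition can even be sketched in code. The following toy model is my own illustration, not anything from Russell: it restricts his definition to a small, made-up 'universe' of finite sets, and exploits the fact that for finite sets a one-to-one correspondence exists exactly when the sets have the same size.

```python
# A toy model of the Frege-Russell definition of (finite) number:
# a number is the class of all classes similar to a given class,
# where "similar" means a one-to-one correspondence exists.

def similar(a, b):
    """For finite sets, a bijection exists iff the sizes match."""
    return len(set(a)) == len(set(b))

def number_of(given, universe):
    """The 'number' of `given`: the class of all classes in the
    (toy) universe that are similar to it."""
    return [s for s in universe if similar(s, given)]

universe = [
    {"a1", "a2", "a3", "a4", "a5"},                    # five apples
    {"idea", "fish", "volcano", "unicorn", "wax"},     # five anythings
    {"x", "y"},                                        # a two-member set
    set(),                                             # the empty set
]

# The number 5, in this universe, is the class of both 5-member sets:
five = number_of({"a1", "a2", "a3", "a4", "a5"}, universe)
print(len(five))  # 2: the apples and the mixed set fall together
```

Note how taking the apples away leaves the number untouched, as the answer below observes: the class of all 5-member sets does not depend on any one of its members.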

Steven Ravett Brown

The problem is that if you equate numbers with sets of objects you get into trouble. Let's consider your example: you have five apples and the number 5 is the characteristic of the set of the apples. Now, whether there is a problem depends on whether you think that all there is to the number 5 is the set of all sets that have 5 members. You write:

"Now you can't abstract the five from the apples, because say if you take the apples away, then you have nothing. You don't even have the "five" anymore."

Well, under one interpretation, the problem disappears because five is defined as the set of all sets that 5 applies to — not just the set of the apples in front of you. Taking some of the apples away does change the property of the particular set of apples — the set is no longer correctly described as a 5 member set, or a set with more than 3 members, if you take 2 or more apples away — but that does nothing to the number five because you've got all these other sets to point to (am I following you?).

A bigger problem with saying that the number 5 is just the set of all 5-member sets is this: what about the number zero, or negative numbers? Since there is no aggregate of objects that has a size of less than one, all those numbers are going to be very difficult to explain. Furthermore, it means that numbers are no longer objects but properties, i.e. like colour. This goes against how we normally think about numbers, philosophically at least.

Contemporary philosophers divide into two groups: Platonists and nominalists about numbers. Platonists believe that the number 5 is an abstract object, i.e. an object that is not spatio-temporally located and has no causal powers. Nominalists, on the other hand, believe in the reduction of numbers to something more fundamental, typically logic (see Hartry Field, "Science without Numbers"). Now, there is an element of what you say that is ok, in that you talk about numbers as sets, but the sets that you seem to want to talk about are sets of actually existing objects, which is where you seem to get into trouble with your example.

Rich Woodward


Nathalie asked:

Is there a place for God in Aristotle's theory of cause and purpose?

To Thomas Aquinas, Aristotle was "The Philosopher", and since Aquinas based his whole system on Aristotle's writings, I would say that if you want an answer to this question in complete and agonizing detail, go read the Summa Theologica.

Steven Ravett Brown


Somebody asked:

A question about time.

I'm going to try and tackle your question in a roundabout way, because it's a pretty deep issue and I can't pretend to answer it definitively, only to throw out some ideas that may abet your understanding.

Let me give you three situations to compare:

a. The earth revolves around the sun once a year.

b. A needle is standing upright on its point.

c. On a certain day in 1606, Ben Jonson visited his friend Bill Shakespeare at the latter's lodgings for a drink and a chat. While they talked, Shakespeare would from time to time scribble a dozen or so lines of verse on a sheet of paper. Jonson later wrote that it was the culminating scene of Macbeth.

Before I turn to an explanation of what these items purport, let me first attend to the notion of "the womb of time", which is for all intents and purposes the core of your multilevel question, reduced to a neat metaphor. Now think about this for a moment: a woman is pregnant; she bears a growing embryo in her womb; and in the normal course of events this embryo would eventually see the light of day and claim full existence in 'real time'. Terribly suggestive imagery! It insinuates into our minds that a thing, to become an existent, temporally bounded entity, begins as an incomplete, rudimentary, seedlike fragment of thingness; that it starts at a definite moment, call it seeding or what you will, which puts a pattern of development in train with an issue to some extent pre-known and predictable.

So time, in this metaphor, is equated with a womb; but even though it is only a metaphor, the image does carry a significant freight of fallacy. It suggests that the universe 'seeds' time with its future contents once and for all, so that all objects and events are, in a sense, merely the specific occasions of their own realisation and that they are determined ahead of actually occurring. In the Bible this notion is expressed by another notorious metaphor: "It is written". Here the insinuation is that a "Book of Eternity" exists and the passage of time represents the pages being turned.

Both these metaphors have virtually universal status; they are accepted, believed and repeated ad infinitum as veritable truths, in other words as unexamined presuppositions of our thinking about time; and as such they infiltrate science, religion and philosophy as well. Yet I call the notion a fallacy, and I do this on the strength of a scrutiny of its broader meaning in various contexts, where I find that no account is taken of the elementary opposition between monodirectional, periodical and hierarchical principles in the organisation and propagation of events in the universe.

We might think of the "Book" as a program: it may assist with approaching the issues by comparison with a well-known technology. The big bang would then figure as the moment when the program is loaded and decompression inaugurated. Of course we must assume the program to be a self-starter, so that it begins its work without external triggers; but we must also assume that a kind of "residual electric potential" (gravity) comes in the same package with the expanding spatiotemporal shell, so that the elements released in the decompression will begin at once to interact with it and among each other.

At this stage it is worthwhile reminding ourselves that some very new ideas are actually quite old. All the way back in 1814, the physicist Pierre-Simon Laplace wrote of an ultimate intelligence capable of enumerating all the atoms in the universe and how, possessing a valid theory of gravitational attraction, this intelligence would therefore be in a position to calculate the future trajectory of each atom until its ultimate, terminal decay. Here is your idea again, couched in scientific terms. For such an intelligence, however (this is me speaking now), a concept of time would be meaningless, for the paths of this googolplex of atoms would be just a single immense but immobile and immutable graph. And this, at length, brings us back to my initial points.

Would this Ultimate Intelligence (UI) have any trouble with seeing Condition (a) through from beginning to end? None whatever, Laplace would say, and I have to agree. And to this day, physicists are inclined to keep agreeing; I spoke to one of their number only a few weeks ago, and he repeated this hypothesis to me and was very surprised to be told just how long ago it was first mooted!

But we are just coming to the crucial juncture: to a feature of this universe and the behaviour of its objects of which Laplace knew nothing. Laplace's UI could not cope with Point (b). Now this might raise eyebrows, but listen carefully. A needle standing on its point will obviously fall in line with one of its 360 degrees of angle. But which? This again is an enigma with a long pedigree, for which a solution was worked out just before 1900 by Henri Poincaré (hence its diagrammatic representation is called a 'Poincaré section'). The solution was that the problem is insoluble! Given a 'fair' needle, i.e. one without any bias, its support on a mere point creates an unstable equilibrium in which the 'wobble' of a single atom may influence the direction of its fall. But which atom? Well, even on a needle point there may be 100 million to choose from; but then you also need to find a reason why that particular atom wobbled. I think you'll now get the gist of the problem!
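The instability at work here can be illustrated numerically. The following is my own toy sketch, not anything from Poincaré: a uniform rod pivoting on its tip obeys theta'' = (3g/2L) sin(theta), and an initial tilt on the order of a trillionth of a radian, something like the 'wobble' of a single atom, decides which way the needle falls.

```python
import math

def fall_direction(tilt, g=9.81, length=0.03, dt=1e-5):
    """Euler-integrate a uniform needle pivoting on its point from an
    almost-vertical start.  theta is the angle from the vertical; for
    a rigid rod, theta'' = (3g / 2L) * sin(theta).  With tilt == 0
    exactly, the equilibrium is never left and it never falls."""
    theta, omega = tilt, 0.0
    while abs(theta) < math.pi / 2:   # integrate until it lies flat
        omega += (3.0 * g / (2.0 * length)) * math.sin(theta) * dt
        theta += omega * dt
    return math.copysign(1.0, theta)  # +1: falls one way, -1: the other

print(fall_direction(+1e-12))   # an atom-scale nudge to one side...
print(fall_direction(-1e-12))   # ...or the other decides everything
```

The exponential growth of the tilt means that any uncertainty in the initial state, however tiny, is eventually amplified into a macroscopic difference; that is the sense in which the fall "does not compute".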

Let me apply a neat contemporary slogan to the situation: "It does not compute." But of course this translates exactly, word for word, into "It is not written."

Just for the heck of it, imagine that when it finally topples, the needle's bang on the table frightens the life out of a microbe, which goes running for its dear life . . . and suddenly you begin to realise that a lot of trajectories on the UI's graph are going to trail off in indeterminable wobbles of their own...

Actually this principle is so important, and yet so little appreciated, that it is apt to quote another example. Let's fire a bullet at a shop window, point blank and absolutely straightline. We'll assume, furthermore, that the convex end has been machined to absolute perfection and that the glass pane has a perfectly regular lattice structure. Now given these conditions, the bullet would be repelled! Why? Because (as an old philosophical principle states), in a perfect arrangement of elements, there must be a sufficient reason for any single atom to yield first; but lacking such a reason, none does (this has been experimentally verified). The lesson here is that any action whatever relies on imperfections to facilitate the occurrence of actual events. But what are imperfections, other than more incalculable contingencies, more unwritten leaves in the Book?

And so, finally, to Point (c). We see here an agency at work, creating something new in rather unexpected circumstances. What this agency (Shakespeare) produced was not, however, a new arrangement of old atoms, but a web of ideas spun out of material which cannot be said to have any real existence at all; certainly no trace of it would be detectable to Laplace's UI. For the written text, which might at first seem to contradict me, is not after all the idea, but only its incidental token, which could easily have been replaced by Ben memorising the text. What a paradox! Humans think and put down their thinking on paper, but the moment another human picks it up, it's not the paper, but the thinking they reconstruct. Now where, on UI's chart, do you suppose thought atoms might be represented?

All right: time for conclusions.

Point (a) covers what's known as 'determinism'. It applies, as we saw, to those features of the universe that are enumerable, calculable and mechanically predictable. The gross trends of such structures are relatively easy to foresee, because they are governed in the main by periodicity.

Point (b) brings the fine detail forward, which evidently has a latent influence on the trend of gross structures, and their intrinsic instability removes them from exact predictability. They are, however, predictable as mass points over some lengths of time, because in the main they are governed by hierarchical organisation.

Events of type (c) are strictly monodirectional and unrepeatable. It is also a characteristic of such events that they need not have any event-like or object-like consequences. Point (c), therefore, is the only one of the three that has any genuine bearing on the problem. For one can state as a general principle that the occurrence of just one event of Type (c) completely disqualifies the generality of deterministic principles and reduces their validity to the status of 'special instances'. Moreover, that same single occurrence puts paid to the notion of a "womb of time", for if the flow of time is thereby shown to be monodirectional, then obviously there is no further point in pursuing the image of a future in which Macbeth is already waiting for us. I suppose one easy way of comprehending this is that we can calculate even today millions of different solar and stellar positions and work out (barring accidental intrusions of dark matter) what kind of window we have on the universe in 50 million years. But not even UI himself, lacking knowledge of thought atoms, could have predicted Macbeth until the day that it was actually written.

Jürgen Lawrenz