A Greater Bad Should Not Make a Lesser One More Acceptable and Making an Effort Towards Some Good Is Better Than None

Person A sees undue violence against a pet and is revolted by it. He becomes agitated and starts voicing his concerns loudly. Person B looks at him and says, “Why are you getting all worked up about this? There are human beings dying every day, and you are getting all angry about some mistreated animals.”

Person A passes a disaster-relief stand collecting for the victims of a natural disaster on a remote island; she decides to contribute a bit of money for these victims. Person B looks at her and says, “Why are you spending your money on some people in some remote location whom you do not know, when people here in your own country also need help?”

Person A does not like that his country practices unrestrained torture against some incarcerated individuals, regardless of what they have done or were intending to do. Person B is shocked and shouts, “Are you a fool? Do you even know what these guys would do to you if you were on the other end? Are you aware of how much evil and suffering they were intending to bring? Do not talk to me about any rights; they would not give you any if you were their prisoner.”

I guess the bigger irony is that, in most such cases, Person B’s indignation has more to do with his/her almost never standing up for anything of worth, and feeling guilty about it at some level. And by belittling the good efforts of Person A, he/she hopes to feel less bad about his/her own inaction. By shaming Person A, Person B avoids shaming him/herself.

Person B throws the rest of an apple out of his car’s side window. Person A gives him a questioning look. Person B retorts, “Oh please, this is nothing. Besides, we have greater problems in this country. Look at how corrupt our politicians are!”

Person A sees an empty bottle left on the side of the street. Without thinking much, she picks it up and throws it in the public garbage bin nearby. Person B says, “Forget about it. Do you know how many kids throw bottles and cans on the sides of the streets around here every day?”

Person A does not like leaving power running at home when it is not really needed. Person B mocks her by remarking, “You eat meat; do you know how much pollution that creates? And you are worried about a few lamps running unnecessarily when you are not at home?”

With the wide availability of information, it has become easier to rationalise not doing a certain good and to portray the good actions of others as ineffective.

Person B, who is not in the habit of eating foie gras, finds it urgent to put in place a law that forbids the sale of foie gras in the name of animal rights; but, as he eats pork, he objects to laws that would forbid some of the most inhumane ways of farming pigs (pigs being among the most sentient mammals, with a capacity for self-awareness and emotional awareness) because this might affect the price of the meat he likes most.

It is easier for us to claim to be doing good when it is something that does not require much effort and action. As a good friend of mine once remarked, “This is how it is with human nature. People often choose an easy cause in order not to think about other, more difficult ones.”

We all exhibit some such behaviour and judgement in one way or another, and that includes me.

But a great amount of evil does not justify ignoring, or dismissing as inconsequential, a lesser evil somewhere else. And it is not because one cares about addressing some particular injustice that he/she is oblivious to other injustices. We each live a limited amount of time and have our own character and individuality – each of us chooses his/her moral battles. What matters more is to have some such ethical sensibility, one that goes beyond the mere minimum of avoiding trouble with the law. What is also important is to encourage others to have some of that sensibility, as only collectively can we reach a better tomorrow, and none of us is capable of remedying all the ills out there by him/herself.

As we are on the verge of a new year, please pick a good cause, even if it seems inconsequential to others, and make a genuine effort for it. And hope that your neighbour will similarly choose another good cause and strive for it. I will.

Happy holidays,

JHTF

Our Biology and Our Civilisation Disconnected

We live in a civilised and technological world very different from that of thousands of years ago. Civility, knowledge, and technology have been developed collectively through the efforts and hardships, and the needs and wants, of many across times and ages. Great things have been achieved to alleviate some of the difficulties of the human condition and to make us all, on average, more knowledgeable, more capable, but also more conscious. And while not everything has been for the better, and while threats of regression exist and should be recognised, the trend, even if not a smooth and steady one, has been towards greater civilisation and civility.

One of the key achievements of civilisation and technology is probably the remarkable general increase in the life expectancy of humans around the world over the past few centuries, for a host of reasons, medical and otherwise. And while this achievement is of tremendous value to us – as it would be to any living being – it does not come without new challenges. These challenges can be seen in the disconnect we increasingly face today between our biological condition and the civilisation we have created. We live much longer with civilisation and technology, and we need to live much longer to do something meaningful; all the while, some of the key characteristics of our biology have not changed. Let us take a few examples:

  • By our late twenties, our cognitive processing speed is already well into decline. We become wiser with age, and probably better decision-makers overall, but we do not have the same cognitive power as when we are young. We may also become less creative and imaginative in some areas, although the reasons behind this may have more to do with longer periods of cultural conditioning than with ageing per se – the two cannot be completely isolated from each other in any case.
  • A woman’s fertility declines substantially year on year in her thirties, and even faster in her forties, until she reaches menopause. Most men are technically fertile for most of their adult life, but their capacity for sexual activity also declines, with some studies claiming the decline begins as early as the early twenties.
  • And of course, physical power in humans is at its best in the late teens and early twenties, and it is on an incessant decline after that, all else being equal. As with cognition, some sports players manage to change their game as they grow older in order to last longer, but both intensity and endurance go down starting in the late twenties. The same goes for our motor skills and the sharpness of our senses.

We live today, on average, well beyond our physical, cognitive, sexual, and sensory peaks. Life expectancy in the developed world is approaching eighty, and it has already surpassed that mark in some countries. And the one-hundred-year mark for life expectancy is a distinct possibility in the coming two centuries. This means that we will live, on average, more than fifty years, and more than two-thirds of our ‘useful’ life, beyond our biological peak(s).
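For the curious reader, here is a minimal back-of-the-envelope sketch (in Python) of the arithmetic behind those figures; the peak age (around 28, the late-twenties peak discussed above) and the start of ‘useful’ adult life (around 18) are illustrative assumptions rather than figures given in the text.

    # Back-of-the-envelope check of the 'fifty years / two-thirds' claim above.
    # PEAK_AGE and ADULT_START are illustrative assumptions, not established figures.
    PEAK_AGE = 28      # assumed late-twenties biological peak
    ADULT_START = 18   # assumed start of 'useful' adult life

    for life_expectancy in (80, 100):
        years_past_peak = life_expectancy - PEAK_AGE
        useful_years = life_expectancy - ADULT_START
        share_past_peak = years_past_peak / useful_years
        print(f"Life expectancy {life_expectancy}: {years_past_peak} years past peak, "
              f"{share_past_peak:.0%} of 'useful' life")

    # Prints roughly 52 years (84%) at a life expectancy of 80,
    # and 72 years (88%) at 100: both beyond fifty years and two-thirds.

Under these assumptions, the claim in the paragraph above holds with some margin to spare.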

A world where we needed to reproduce fast and a lot, and conquer, dominate, and leave a legacy as quickly as possible before we die is no more – we have a greater leniency of time. And we need this leniency given where our civilisation and technology stand today. We need more time than centuries ago to absorb all that civilisation has developed, to learn, and to understand. And so, by the time we are done learning enough, understanding, and becoming sensible enough, we are already quite beyond our biological peak.

This disconnect creates many practical challenges for us; we increasingly struggle to marry our biological condition with the civilisation and technologies we are creating. We try to remedy this disconnect by specialising (i.e. not learning everything but advancing in one particular path as quickly as possible in order to produce something new in it, while counting on others in society to do the rest); by dropping out of school early to focus on a particular sport or modelling career, and returning to study only after that (if at all); or by looking for new medical ways of ‘going around’ our biological condition, such as freezing one’s eggs and finding a surrogate if a woman is too old by the time she decides to have a child. There is also another (lazier) way that is more dangerous to adopt: to bypass wholesale many aspects of civilisation, not bother understanding the achievements made so far, and become mere tools of civilisation and technology rather than conscious drivers of them.

It is likely that the disconnect between our biological condition and civilisation will increase even further with the continuous improvement in life expectancy and the continuous increase in the informational and knowledge richness of the environment on which civilisation depends. We have to do something about it, no doubt. And maybe our cue comes from evolution. As we evolved to become human beings, we, and most primates for that matter, dropped biological features that may have been individually advantageous for the benefit of other features, relying more on the community we started to live in to compensate. For example, our capacity to see wider angles was reduced for the benefit of much better three-dimensional vision, while we counted on others in the community to spot any danger coming from the angles we lost the capacity to see. Today, it seems reasonable that we may need to do more of such ‘outsourcing’ and sharing as we live well beyond our biological peak(s). With civilisation and technology, we increasingly rely not only on the community but also on machines and outsourced intelligence. This may raise genuine fears of dependency and loss of control in us, but there does not seem to be a reasonable way around it unless we start learning and understanding faster – we need only to strongly mitigate any possible risks.

As another corollary, it is quite likely that greatness will, going forward, become even more disconnected from biological peak(s). The greats of tomorrow may be very different from the greats of the past, and collective greatness may become entirely dominant over individual greatness.

JHTF

Privacy and the Digital Age

“Historically, privacy was almost implicit, because it was hard to find and gather information. But in the digital world, whether it’s digital cameras or satellites or just what you click on, we need to have more explicit rules […]” Bill Gates

We live in an age of easier information; information, whether good or bad, worthy or unworthy, travels faster, is more often recorded, and is more easily traceable. Pandora’s box of social information is open, and it will be very difficult to close it now, no matter what data-protection assurances and technologies we are given. For every technology securing information, there is likely to be another one to decrypt it or go around it. And the trend towards more open information is only accelerating. Open information has great advantages; more informed constituents, new services, and greater access to knowledge are among its benefits. On the other hand, greater ease of deliberate misinformation, the weakening of secrecy, and the loss of privacy are among its problematic aspects. And yet it is the latter, the loss of individual privacy, that worries me most; over the long run, it worries me more than the loss of any government or corporate secrets.

Every year, we are given new ways of exposing our private lives more easily, not only to those with whom we want to share our lives, but also to everybody else – all ‘privacy policies’, ‘privacy settings’, and the like notwithstanding. Most of us, mature and young alike, marvel at the greatness and ease of information-sharing technologies and use them without necessarily paying attention to the possible long-term consequences. It is as if we are given new toys; we rush to play with them and only realise the consequences of our actions with time. This does impact, and will continue to impact, our societies in many ways, some of which we can already foresee. Let us take an example: leaders forty or fifty years from now, at least in the countries where social information-sharing technologies are rife today, will have to deal with the challenge of having a greater degree of their personal and private lives ‘out there’ for others to make use of as they wish. Said another way, if we require, forty or fifty years from now, leaders with no social vulnerability that is common knowledge, what we will likely end up with is either individuals who are suspiciously too clean from a digital-record point of view or individuals too reclusive from a young age to have left much of a personal and social record online. We should then ask ourselves: is this the type of leader we want? Or do we prefer that our future leaders have a normal human aspect like all of us?

Privacy is not about trying to hide things that are illegal or immoral; privacy is first and foremost about giving every human being a healthy degree of liberty to grow, and to simply be, in an environment that is protective to a certain degree. For the more introverted among us, privacy is equally crucial for psychological regeneration. We are all evolving and learning beings; we are beings who are influenced by their environment; and we all make mistakes, sometimes even need to make mistakes, without which we do not learn. In the same way that a patient requires privacy with her doctor, a religious person with his clergy, a citizen with her lawyer, and a sportsman with his coach, all of us, and especially the young among us, gravely need some level of protection of information in our private lives in order to grow and evolve in a healthy manner. People critical of this line of thinking may say that no one shares information today unless he or she wants to. This is not entirely accurate; moreover, referring back to the point above, we are sometimes unfortunately not even aware enough of the consequences of what we do, or the draw towards using new technologies is too strong to resist at first. Walking the line between protecting one’s privacy and not being socially cut off from the rest of the world has become a much more difficult exercise of late.

The discussion of the strict boundaries of privacy is becoming more of a social necessity. Unfortunately, the impressive advance of information technologies only makes this debate more pressing, and the degree of awareness and the level of education of the users of such technologies all the more important, for their own sake but also for the sake of others around them. There is no easy solution to the challenge of privacy versus the growth of information technologies; we have to maintain a delicate moral balance between the two. But we have to be aware of the challenge first in order to do something about it. So let us start with that…

JHTF

The Common Status of Reading and Writing

Outside strict professional (i.e. largely for the sake of money-making) and academic (i.e., for many but not all, social climbing towards a better-earning professional position, and hence, ultimately, money-making again) purposes, most people read to be entertained and most writers write to entertain. Some people write to communicate necessary ideas, even if such ideas are not generally of the ‘entertaining kind’, but they are more of a rare breed. Most commonly acclaimed books and most ‘Best Sellers’, as they are called in the industry, tend to be of the entertaining kind. Sensationalism, mystery, fantasy, fiction, romance, sexuality, ridicule, gossip, and conspiracy are some of the most popular types of reading and writing; they are entertaining, they trigger emotions in us, and they sell more easily. There are exceptions of course, but they are few and far between. Seldom do books of outstanding quality and value make it through to common fame. Such books tend to stand out and surpass entertainment books only over longer periods of time and, in many cases, through some form of academic push that ‘forces’ the many students going through the ranks of academia to read them (or at least to purchase them, without actually reading them in their entirety). The academic push is far from utterly benign; serious books in academia are unfortunately often selected with some cultural and social bias (e.g., the French will predominantly select French authors and ideas, the Americans predominantly American authors and ideas, and so on and so forth; and institutions belonging to particular economic, political, or religious schools of thought will select only self-serving books, in subtle or openly propagandistic ways).

When television, radio, and the Internet did not exist, many people might have been drawn towards reading books and novels, as it was one of the few ways of being entertained. But in our day, with the constant emergence of new entertainment possibilities (thanks again to the money-making potential of the entertainment industry), being entertained through reading may have become, for many people, a reward not worth the effort. And so the general interest in reading outside academic and professional spheres has been on the decline, at least in relative terms. Most of what is being written today in the form of books, essays, and papers is not being read enough; most does not penetrate or influence societies enough. Indeed, there are easier and quicker ways of being entertained, such as watching a movie that tells the gist of a book in ninety minutes or so – an emotional roller coaster in condensed form. Moreover, thanks to new forms of writing over the Internet, long and focused reading has been replaced by short, quick, and unfocused blips, which further erodes an already weakened and lazy attention span in most of us.

It seems that reading serious thoughts and doing serious thinking are becoming more of a rarity, particularly outside academia and normal working hours. Note that we need not read and write to do serious thinking, but elaborate thinking is very difficult without, ultimately, some form of reading and writing. We can justify to ourselves staying away from serious reading and writing by claiming that Technology today solves most of our daily preoccupations, and hence that going back to making an effort to think is no longer necessary, unless we stand to benefit monetarily from it in a direct manner in a professional or academic setting. Indeed, most efforts in society are directed towards technological and material mastery. And that would be fine, if people did not regularly complain about unanswered questions in their lives, about disillusionments, about void and uncertainty, and about general dissatisfaction with the way things are. In other words, we want to be lazy, but we also desire that all our remaining problems be solved on their own, or by others, without any personal effort of ours.

I will continue to stand on the unpopular side against this degenerative evolution of our reading and writing habits, on the side of the long, the serious, and the developed, even if the usefulness of such is commonly questioned. One can indeed choose to forego serious thinking, to suppress any identity, and to walk blindly down a road set by others and by circumstances. But sooner or later, circumstances will turn unfavourable; this almost always leads to wake-up calls in most of us and to the sudden need to search for the more serious things in life. A depersonalised way of living can be lived, but I doubt that more than a very few can ultimately make peace with it – we can pretend that volition, self-esteem, self-respect, and the expression of individuality are things we can do without, but I have yet to see someone really go without them for all of his/her life.

JHTF

Words, Languages, and Disagreements

It seems that humans are not the only ones who use the same words for different meanings and intentions; chimpanzees also use the same gestures for several different purposes and senses, as recent research highlights.
http://www.independent.co.uk/news/science/chimpanzee-gestures-deciphered-in-world-first-after-scientists-decode-foot-stomps-and-hand-flings-9583455.html
Being more basic, such intentional animal gestures (forming a proper language) may not be open to the same kind of ambiguity as in the human linguistic world. Nonetheless, we should not rush to exclude all room for possible ambiguity.
JHTF

Words, Languages, and Disagreements

“We do not, in general, use language according to strict rules – we commonly don’t think of rules of usage while talking, and we usually cannot produce any rules if we are asked to quote them.”

“But what we are destroying are only houses of cards, and we are clearing up the ground of language on which they stood.”

Both quotes are from Ludwig Wittgenstein.

I guess one cannot talk about Language and its uses without being reminded of Wittgenstein and without giving credit back to him.

Whenever I am about to start a serious conversation with somebody on a particular subject, such as the existence of God, whether there is such a thing as Fate, which country is more democratic than another, or whether I am more right-wing or left-wing, I often start the conversation by asking my counterparty what she or he means by God, by Fate, by more democratic, or by right- and left-wing. I do so not because I am a fan of rhetoric, nor as a way of tricking my counterparty in the discussion; rather, I do it simply to avoid useless and protracted discussions that lead nowhere because each of us is holding a different definition of the same word as a starting point, while not admitting the possibility or existence of another definition that might be used by a different person. People often jump into such discussions, argue for hours, and then ‘agree to disagree’ in the best-case scenarios; while in reality, they have been talking most of the time about slightly or largely different things using the same words, and hence their discussion has been futile all along. It is therefore important to know a bit about the genesis and the various uses of Language – this great enabler of our cognition.

Languages, as we commonly attribute them to human beings, evolved organically and in an unorganised manner, much like human beings themselves. They started by taking rudimentary forms and then evolved with our general cultural, intellectual, and technological evolution. No one person or group sat down and defined any natural (or ‘nomological’) language as we know it today. Languages contain definitions of words and verbs and rules of grammar; but there can be many definitions of one word or one verb, which themselves rely on other definitions of words and verbs, never mind the circularity of definitions, and there are almost always exceptions to the rules of grammar. As such, there always seems to be some degree of vagueness in the meaning of words, verbs, and sentences when we probe into them. Unlike logical languages, natural languages are as much living and evolving as our cultures and are an integral aspect of them. Words can point to objects, to phenomena we observe around us, to emotions, to concrete ideas, or to very abstract ideas and musings. Words can have their source in observation, in feeling, in thinking, in intuition, or in pragmatic needs. We can employ words for very definite, basic uses and objects. And we can employ words to try to relate to confused and undefined things, not knowing ourselves what exactly we are looking to express by them, or simply to fill a temporary hole or weakness in our current understanding of the world. Words can be borrowed from other languages and cultures, used according to their original use in the language from which they were taken, or used in a different manner, sometimes quite foreign to the origins of the word. And all of this evolves as we evolve, across time and geographies, in such a way that the same word can mean very different things in one place and time compared with another. Even in the same place and time, there can be confusion, inexactness, and differences of meaning, not only in common social life but also in academic circles, which are supposedly more rigorous. God, Fate, Democracy, Right-wing, and Left-wing, the examples given above, are all of this kind.

Many instances of disagreement and misconception stem from the fact that we do not think through the words we use as much as we should, or that we assume that what we associate with a certain word is exactly what others associate with that same word. Worse, we sometimes divide into parties based on a certain position vis-à-vis a word; one would think at first that it is a division vis-à-vis what the word represents, but, in reality, when we ask for more details about the representation of the word, we quickly find so many differences in that representation that the only matter of substance left is a blind division vis-à-vis the word itself. Take Capitalism or Free Will; these are notable examples of words that have divided us into two camps for long periods of time, while the meanings of these words have kept evolving all the same. It seems that sometimes we like to oppose each other more than we care to understand what it is we are opposed over, and words are another tool in this game.

JHTF

Criticising the Hard Normative Stand in Philosophy

A historical idea has long existed that there is, or needs to be, a thought discipline that has a monopoly over saying how various things of the world, of Existence, and of Reality, including Existence and Reality themselves, ought to be, and which types of questions, problems, and desires ought to be treated by which particular thought or practical discipline. Naturally, many philosophers gave Philosophy this crown jewel of thought; philosophy was, and is still, seen by them as possessing the monopoly of the ‘ought-to-be’. This is what is called in the discipline the ‘normative aspect’ of philosophy. Kant, for example, is famous for his contribution to Normative Ethics, Aristotle for seeking normativity in both Logic and Metaphysics, and Descartes for seeking normativity in Epistemology. In all cases, despite great, and in many cases indispensable, contributions, all these renowned thinkers fell short of their initial ambitions. However, the urge towards normative approaches in Philosophy is not only a historical, long-gone practice; it continues to exist in more modern times with, for example, Habermas and his idealised thresholds (for legitimacy, for example) in political theory and discursive agreement, or Popper and his several failed attempts at formalising his philosophy satisfactorily.

The dream of making Philosophy the driving normative discipline of Reality, Existence, and Knowledge has failed on multiple intellectual and practical accounts – we shall briefly talk about a few below.

On the intellectual front, David Hume, the eminent Scottish philosopher whose ideas continue to find validity centuries later, was the first to formulate the difficulty with normative approaches in a clear manner, in his famous ‘Is-Ought Problem’. Hume’s basic idea on the subject is simple: we observe the world around us as it is, while when we seek to create norms (which is what normative approaches seek to do), we are actually looking to talk about the world as it ought to be – whether from a moral, scientific, religious, political, or other perspective. For Hume, all of our ideas are based on our observations of the world as it is, and hence there is no clear basis for the jump we make from observing the world to saying how it ought to be; pretty simple, and perplexing indeed.

Now for a practical account: historically, it is the other ‘lesser’ disciplines, as some might be inclined to call them, rather than Philosophy itself, that have contributed more to what lies at the core of the scope of normative approaches. For example, Physics has contributed more to understanding Cosmogony than Philosophy has, Formal Logic more to Epistemology, and the Cognitive Sciences more to Phenomenology. Yet many philosophers still believe that Philosophy as a discipline is superior to the Natural Sciences or Mathematics, or that it has the right to define what these other disciplines should treat and in which manner.

And thirdly, for an ontological account: since everything is interrelated and of the same fundamental nature, as I hold, and since all things are ultimately circular, all events are ultimately at par, and hence the idea of any discipline holding a monopoly over norms is a flawed one. There is no clear premise for why a series of events whose role is to analyse other events, as is the case for any discipline of thought, should have some sort of fundamental superiority or, for that matter, be more influential. And it is also likely that, because of the circularity not only of the world of thought but also of the material world, all efforts towards developing detailed norms will remain ontologically inadequate (although existentially necessary for humans in some areas, such as Ethics).

I am not trying to attack Philosophy in general in this short article; rather, I am directing criticism towards a particular form of arrogance in a few philosophical circles, namely the claim to a monopoly over the norms of the world of Thought, of Reality, and of Existence. Philosophy does remain the most adequate generalist discipline for asking the right questions and for pointing specialists towards some of the areas on which it may be most worth focusing. Philosophers can still attempt, and should still attempt, to hold together all the various thought disciplines in a whole that makes sense, no matter how much more difficult this has become of late with the advances of many specialised disciplines. The philosopher is a man or a woman who holds all the strings of human activity together; is torn between them; makes continuous efforts not to succumb one way or another; endures constant accusations of being wrong in what she/he does or says; and yet continuously perseveres. But centralising Thought does not mean monopolising it, and in a spherical geometry there is no peak, there is no summit, but only interlinked parts.

On a final note for the specialists: the only normative ‘power’ out there, if there is one, is in Mathematics, in particular in Set Theory, which is reducible to second-order logic. And even there, we still face the difficulty of ontological commitment, which we need not detail here.

JHTF