Wednesday, December 24, 2014

Some thoughts on language and culture

Late last year I raised the issue of accents in relation to language-learning. I was thinking about the way my French teachers had exaggerated the importance of getting the accent right (over fluency and effective communication). There was an element of snobisme in this. For example, there was a joke about one of my high school teachers (of Social Studies, not French, though he later taught in the Italian Department of my university). The cosmopolitan Mr. Lionel Lobstein was known to speak ten languages – all with the same accent. (Very funny it seemed at the time.*)

I liked the man, actually. He was charmingly awkward and wore strange, brightly-coloured woven ties (of which he clearly had a very large collection). He used to tell us about his Greek holidays, which he seemed to spend sipping drinks and talking in shady, paved courtyards. The Greeks, he said, had their priorities right and valued conversation above practical household tasks and duties like mowing the lawn (or paying the bills?). There was a hint of sexist double standards in his attitudes, even a trace of misogyny, but one had the sense that he had been disappointed in love.

Getting back to the theme of language, however, we don't expect the French or other non-native English speakers to eliminate their native accents (and in fact tend to be rather disappointed if they do), so why should we try to eliminate ours?

But, of course, the goal of a 'perfect' accent was always, in classroom contexts at least, aspirational only. The actual goal was not so much to eliminate as merely to tone down or minimize the learner's inevitable (and unconscious) tendency to apply elements of the sound system of his or her native language to the language being learned.

In fact there is a lot to be said for general prescriptive standards with respect to accents and language generally (as well as for other aspects of social life) so long as they are sufficiently elastic to allow scope for a certain degree of individual variation and sensitive to wider currents of social and cultural change. Changing standards reflect a changing world.

Standards can be associated with perceived prestige and can change quite rapidly. Certainly, perceptions of the status and desirability of various British accents have changed dramatically in recent decades and the same probably applies to other languages.

But, while perceptions from within particular linguistic communities can change quickly, global perceptions shift more slowly and tend also to be associated with geopolitical and economic factors. A form of British English persisted as an international standard long after the power of the British Empire had faded. Cultural prestige, you could say, is a lagging indicator of a nation's geopolitical fortunes.

Given America's recent global dominance, it is hardly surprising that American English is currently riding high, the vast majority of learners aspiring to master American English and the accent known as General American – even if the United States is now seen in many quarters as a fading (and increasingly unloved) centre of power. And because so many non-native English speakers have in recent times learned English in school from an early age, typically using American-produced materials, their English is becoming more and more difficult to distinguish from that of Americans born and bred.

English may seem a bit like today's equivalent of Latin in medieval Europe, a universal language, but a better comparison might be with Koine Greek in the Mediterranean world circa 2000 years ago. Medieval Latin was primarily an ecclesiastical language and a language of scholarship and the law defining a pan-European cultural and scholarly elite, whereas English, while it has become the international language of science and scholarship, is perhaps even more significant (as Greek was in the Roman world) as a language of commerce and popular culture.**

Though not having to learn a second language to get on in the world can be interpreted as an advantage accruing to native English speakers, there may also be a downside for them, especially for speakers of the standard forms. Leaving aside questions of the various intrinsic and extrinsic values which are sometimes associated with bilingualism – and of course there is nothing stopping English speakers from learning another language – there is another issue which is worth noting. Namely, that native speakers of English generally, and American speakers of General American in particular, may be seen to have suffered a strange kind of cultural loss in that they no longer have 'ownership' of their own language.

They can never retreat into that familiar and intimate linguistic realm defined by common ancestry and shared culture and memories which a native language has traditionally provided.*** For them language and accent have, to a large extent, ceased to operate as a badge and guarantor of cultural identity.

Moreover, native speakers of the standard forms of English have effectively lost control of their language as it becomes the common property of – and will increasingly be shaped to meet the needs of – the many hundreds of millions of people from very different cultural backgrounds who have adopted it.



* Something similar, I later learned, was said of John von Neumann. But when one is a supreme mathematical genius the small matter of an entrenched Hungarian accent is beside the point (or even an asset perhaps).

** The enthusiasm for all things Greek in Roman times – it was fashionable to have a Greek slave to tutor your children, I understand – is another example of cultural prestige long outlasting the power and wealth of the originating nation.

*** A linguistic matrix of this kind has been a key feature of most human cultures – the bedrock, in fact – and an important driver of creativity. For example, vernaculars formed the basis of much modern European culture, and early literary works (in, for example, the Romance languages or English or German, or, later, the Slavic languages) were often seen as social and political statements, implicitly affirming the value not only of the particular language but also of its associated culture.

Friday, October 10, 2014

Patrick Modiano



Patrick Modiano, who has been awarded the 2014 Nobel Prize in Literature, is one of the very few living writers who means anything to me on a personal level. I read a few of his novels after coming across Voyage de noces by chance about fifteen years ago and being impressed by its style and atmosphere and sense of place (but I remember thinking that it would not translate well into English).

A Reuters report quoted a comment Modiano made in a television interview three years ago: "After each novel, I have the impression that I have cleared it all away. But I know I'll come back over and over again to tiny details, little things that are part of what I am... In the end, we are all determined by the place and the time in which we were born."

Funnily enough, I have recently been trying to make a list of topics that particularly interest me, and one of them is not unrelated to Modiano's recurring preoccupations.

One item on the list runs as follows: The contingent (and unrepeatable) features of any individual's upbringing – which include as a central element a unique and ever-changing cultural matrix – raise awkward questions about values. We like to think of our core values as being, if not objective or universal, then at least as having some permanent or abiding relevance. But do they?

I was thinking here of both aesthetic and moral values, by the way. Though certain very basic moral – and even aesthetic – ideas could be seen to have universal applicability, particular patterns of moral and aesthetic commitment (involving priorities and preferences) seem far more contingent on time and place and culture.

(My previous post also touches on some of these themes.)

Sunday, October 5, 2014

Sentimental education

Perhaps it has got something to do with having a father who was considerably older than my mother – and who himself was regularly mistaken for someone of an even earlier generation than the one he in fact belonged to – but I have always felt more culturally connected to times previous to my own. I am drawn, for example, to the intellectual culture of the early-to-mid 20th century, and to the latter part of that period for popular culture.

As a young undergraduate, I always used to prefer the late-in-the-day tutorials scheduled for the benefit of part-time, 'mature aged' students. They came on their way home from work in the city, the men in suits, the women smartly dressed and smelling of perfume. They knew stuff I didn't know and had strong opinions about things I had never really thought or even heard about.

There was a woman in her late twenties perhaps whom I used to talk to a lot when I was in my second year. She seemed slightly old-fashioned, out of her time somehow. And it turned out that she had quite – unusual – ideas.

For she had something of an obsession with someone I had only vaguely heard of, someone who was obviously a hero for her and who represented an apparently lost but (in her eyes) glorious cause – the fascist leader, Oswald Mosley. But politics (or political history) was not something I had strong opinions about at the time, and I just took her views as one aspect of a slightly odd and intriguing personality.

Not only the student population but also the academic staff (in stark contrast to today's equivalents) reflected a variety of political and social views, from left to right to totally apolitical.

I took a course on W.B. Yeats which influenced me quite deeply. It was taught by a Hungarian who had written a dissertation at Cambridge on 18th-century English gardens and who was very much in sympathy with Yeats's fin de siècle aestheticism as well as his general political tendencies and social views.

As a young student, I was – like the typical student character in a 19th-century novel – almost drowning in Romanticism. Wordsworth, Coleridge, Shelley, Emily Brontë, Gérard de Nerval, Baudelaire.

Two French novels I read around that time, Adolphe by Benjamin Constant and L'Education sentimentale by Gustave Flaubert, were each centred on a relationship between a younger man and an older woman: dark unpleasant books both of them, but strangely alluring. They and other Romantic texts coloured all my interactions and relationships (or non-relationships!) at the time.

Everything – especially everything female – was seen by me through a kind of literary lens which in retrospect I could have really done without. All that Romantic and pre-Raphaelite baggage made me quite as blind to immediate reality as (in a rather different way) the Mosleyite woman was.

In subsequent years I have come to reject just about everything associated with the Romantic movement. Except one thing, its one true – and overwhelmingly important – insight into the nature of reality: that, morally speaking, the natural world is value-free – there are no values in nature.*

The 18th-century philosophes saw themselves as science-driven and enlightened thinkers, but their deism perpetuated classical notions of a divinely guided universe. Ironically, it took the radical (and often self-consciously emotional) upheaval of the Romantic period to clear the way for a truly scientific and secular view of the world.



* Of course, I don't mean to deny that living beings have values and human beings have moral values, and that we constitute part of the natural world. But since the Romantic period it has been much harder to maintain the view that human values are somehow reflected in – or derive from – non-human realities, whether natural or supernatural. (This point – or one very like it – was made by Isaiah Berlin.)

Tuesday, September 9, 2014

Drama at Scientia Salon

[This is a revised account. (Sept. 11)]

A second essay of mine was published recently at Scientia Salon and it precipitated a heated discussion – or at least some dramatics and rather shrill claims and assertions on the part of one high-profile commenter* who, after an extensive and not very friendly interchange with another commenter (a British astrophysicist), announced that he would no longer be commenting at Scientia Salon.

The comment thread was closed after five days and over 300 comments.

I don't really want to write a commentary on this curious business, but I will say that I was not particularly impressed by the way 'Aravis Tarkheena' conducted himself. But others can read the essay and his reaction to it and to me and to fellow commenters and make their own assessments.


* He was writing under the pseudonym of Aravis Tarkheena, but his real identity is generally known: he teaches philosophy at a large American university.

Saturday, August 30, 2014

Proposed changes

I have decided to link both of my blogs to Google Plus. The only significant change as far as current readers are concerned relates to commenting: apparently you now have to have (or open) a Google Plus account in order to comment. Sorry about this, but Blogger minus Google Plus is an increasingly unattractive and inflexible platform.

I have also been thinking about other blogging platforms and options but will be sticking with Blogger for the present at least.

More generally, I have been thinking about topics for possible future posts and I'm putting together a short list of questions and ideas. Since most seem more appropriate for the other blog, I intend to post the list there.

I will, however, be continuing to put up new material here as well. Still have a conservative tendency I guess...

Sunday, August 3, 2014

Watching, waiting and thinking small

I haven't been posting much lately, partly because I've been preoccupied with other matters and partly because I'm a bit more uncertain these days, not about my basic values so much as how those values might relate to current political realities and options.

As I have explained in the past, my kind of conservatism is pragmatic and responsive to changing circumstances. Any intelligent strategy must be responsive in this way.

And circumstances are changing. In terms of global institutions, power relations and general culture, the post-World War II order is fading or failing and it is unclear what kind of order – or disorder – is going to take its place. One thing which is clear is that the more or less continuous economic progress which has underpinned stability in Western democracies seems to be coming to an end.

Continuing financial and economic troubles portend social and ultimately political crises in Europe and perhaps in the US and other developed countries.

As geo-political and economic realities change, political ideas must change. The basic principles and themes may remain the same but the way individuals interpret them and align themselves doesn't.

Sometimes old ideas gain new relevance, or standard assumptions are exposed as inadequate in the light of current events.

I'm not sure if the political centre is shifting in Western countries or if we are simply losing that space which has allowed the centre-left and the centre-right to cooperate and compete and dominate the political landscape since World War II.

Nor is it clear whether the Chinese model of state capitalism or other possible alternatives to liberal democracy will continue to look viable. Corporatism seems to be making a comeback, as do various forms of economic nationalism. Patriotic protectionism is one of the key policies of many far-right Western European parties (the Front National in France, for example).

I am by nature an observer rather than a player, a political quietist rather than an activist. The distinction between watcher and participant is not a clear one, however. There is no neutral place from which one can watch history unfold; what happens inevitably has ramifying consequences, sometimes very significant consequences. And, if we are talking about epochal changes, everyone will be affected in one way or another.

Could what we are currently witnessing be described as epochal change? I think so. There is certainly a lot going on at the moment.

And, though these changes are driven more by economic realities than anything else, ideas play a part too: the crude, emotion-driven ideas that motivate ordinary people to support this or that leader, to protest or not to protest; as well as the more sophisticated ideas promoted by ideologues and intellectuals.

As I have argued elsewhere, these latter kinds of ideas – the more elaborated and intellectualized ones – are often merely post hoc rationalizations or justifications, attempts to make courses of action decided on for other reasons appear morally or intellectually respectable. But ideological structures also play an active role in recruitment and in defining and sustaining political groupings.

Ideological structures, however sophisticated they may appear, are always inadequate as models of social and political reality. They are merely useful (or dangerous) abstractions, attempts to impose some kind of value-based order on an immensely complex social and political landscape.

When we move from the personal to the political, from the particular to the general, there is always some distortion and loss of meaning. The concepts become thinner and more abstract and run the risk of losing touch with psychological and social realities altogether. At least in the social sciences quantifiable measurements are made which guarantee some kind of link to the real world (tenuous though that link all too often is).

I try to keep my orientation empirical and my main focus on the particular rather than the general, on psychology rather than on political or social theory, on cultures and customs rather than on universalizing ideologies, on particular languages rather than on language.*

In line with this way of thinking, the basic values that I cleave to manifest themselves at the level of individual experience, at smaller rather than at larger scales.

This is reality. This is where we truly live.



* Even the notion of a language is at several removes from reality. There are, of course, dialects and regional and social variations. And, as Noam Chomsky has emphasized, in the end there are only idiolects which change over time: the linguistic structures or sets of structures which each of us has internalized are in the final analysis quite individual and unique.

Friday, May 30, 2014

At Scientia Salon

Massimo Pigliucci recently published an essay of mine ('Does philosophy have a future?') at his new site, Scientia Salon. I knew beforehand something which Massimo recently confirmed in a personal communication and reiterated in a long comment on my post: that he "couldn't disagree more" with my take on philosophy.

Just to give a bit of background, Massimo recently wrote a piece taking Neil deGrasse Tyson to task for making dismissive remarks about philosophy, so I was a bit surprised when he went ahead and published my essay on the heels of that controversy. And controversy it was, because Massimo's piece got huge exposure when it was picked up by the Huffington Post (see the Twitter and Facebook numbers for that post).

Anyway here is a link to Scientia Salon: scroll down for my essay and Massimo's challenge to Tyson.

My piece was designed as a discussion-starter and it seems to have achieved its objective. The social media numbers were relatively good and it prompted a lively and long comment thread.

Thursday, April 10, 2014

Gloomy about Europe

Borrowing costs may have come down across Europe – even in Greece – but the underlying economic situation in many euro zone countries is still grim. And it may well be that the political consequences are only just starting to emerge.

In a recent piece in the Financial Times Gideon Rachman suggests that the euro crisis hasn't gone away: it has simply moved from the periphery to the core.

Italy, for example, has lost 25% of its industrial capacity since 2008, and the real level of unemployment is about 15%. The country's ratio of debt to GDP is more than 130%. France too has double-digit unemployment and the national debt is rising towards 100% of GDP.

Tension between European Central Bank president Mario Draghi and the German economic establishment including finance minister Wolfgang Schäuble seems to be increasing. And Rachman fears that Europe is very vulnerable to an external shock – such as higher energy prices as a result of any standoff with Russia over Ukraine.

Europe's fragile economy is in danger of being tipped into a deep recession. "And," writes Rachman, "a return to deep recession would favour the radical fringes in Europe."

Anyone who has witnessed Marine Le Pen's unequivocal and outspoken but brilliantly controlled television interview performances will be only too aware of the dangers.

But though she has made the Front National far more respectable than it was under her father and has achieved impressive electoral successes (such as in recent mayoral elections) with the prospect of more to come, she has not moved her party into the political – and certainly not into the economic – mainstream. She appeals directly (and effectively) to the French people and scorns the institutional status quo. And her protectionist economic policies are quite at odds with the economic thinking of both the centre left and the centre right.

The situation in other European countries, of course, is different but groups supporting policies similar to those of the FN are now a common feature of the political landscape.

In Asia, by contrast, despite rising nationalism and real threats to the continuing growth of trade and prosperity, bilateral free trade deals and other new arrangements to facilitate international trade and financial transactions are somehow continuing to happen. (Deals between Australia and Japan and Australia and South Korea have been finalized in recent days, for example, and a suite of new arrangements between Australia and China is expected to be concluded this year.)

Despite problems in some regions – like poor air or water quality, or food contamination – much of East Asia (and South and South-East Asia and Australasia) continues to benefit from strong levels of economic activity, with most countries still firmly committed to further lowering barriers to increased trade and investment.

Meanwhile Europe, driven by long-term historical, economic and demographic trends as well as more contingent cultural factors, is moving slowly (but apparently inexorably) to the periphery of global politics and trade. It doesn't help that, under an ineffective President and an increasingly dysfunctional political system, the United States seems to be drifting into a long, drawn-out and perhaps inevitable decline.

Such prognostications, I know, are not worth much. But as more people come to see things in this way – and they will if nothing happens to suddenly render this narrative implausible – these expectations will affect behaviour, feeding and consolidating deep, underlying economic and social trends. Even now, many Asian cities (like Singapore, for example) are booming and attracting some of Europe's and America's best talent.

I don't know that it makes sense to talk about Europe as a single entity. It has always been a patchwork of nations and regions with very different cultures and levels of prosperity. And – despite attempts over recent decades to create a more unified and homogeneous union – it remains something of a patchwork, more interdependent, certainly, but also more divided in new and complex ways.

Some regions, no doubt, will prosper; others will languish, relatively speaking. Given the overall economic situation, however, the best one can hope for, I think, is that the countries of Europe will maintain social harmony and hold at least to the general levels of prosperity which currently exist.

This would not be so bad. But such an outcome is far from guaranteed.

Thursday, April 3, 2014

A path not taken – twice

Reminiscing is a dubious business, sometimes indicative of a failing life, a fading brain (or both).

In fact, I have sometimes wondered whether Marcel Proust suffered from premature aging of his brain, since the past typically looms largest for dementia sufferers, old memories rising again and eclipsing more recent, shallower ones. If he was in the early stages of dementia, he certainly made good use of his affliction.

For myself, I look to the past by and large only to try to make sense of the present, strongly believing that people (and organizations and societies) can only be properly understood when seen in the light of their development and history.

So for the individual, say, dwelling on past experiences or decisions need not be an entirely futile exercise and may even provide a better understanding of oneself and what it is one is really looking for (if indeed one is looking for anything at all).

In this regard, mistakes and bad decisions are particularly worth scrutinizing. Though what is lost is lost, critical scrutiny of past errors makes it less likely that similar patterns of behaviour will be repeated. (This is the essence of human intelligence, as I see it. Forget about cleverness.)

Though I'm skeptical about history as a discipline and the stories that historians tell, a sense of history gleaned from reading contemporary sources is undoubtedly valuable in understanding why things are as they are. Likewise, having a sense of an organization's history is a necessary prerequisite to understanding its culture. Learning from one's personal (and family) history is also possible, so long as one is able to remain sufficiently detached.


I've been thinking about the medical profession and doctors lately because I have had some recent dealings with them (concerning some minor but nagging symptoms which were bothering me*).

If I have regrets about paths not taken, not having taken a medical degree is not one of them. [A careful reader will be justifiably suspicious of the triple negative here. Does it indicate subconscious rationalization, a mind playing tricks on itself, I wonder?]

I tell myself that practising medicine would only have exacerbated my hypochondriacal propensities, because if one is constantly dealing with the health problems of others it is virtually impossible not to see potential parallels with the operations of one's own body.

And – have you noticed? – doctors seem all too often to die before their time. Statistics I have seen support this observation, and I think there is little doubt that the stress of dealing constantly with disease and death and being responsible day in, day out for making crucial decisions and giving advice to patients is largely the cause. (Also, easy access to benignantly lethal drugs has contributed to a relatively high suicide rate amongst doctors, I believe.)

Interestingly, two (at least) of my favorite writers were medically trained – William Somerset Maugham, whose early literary success allowed him to forego a medical career (and live to a grand age); and Anton Chekhov, who did practice (and died young). Another notable literary doctor was Céline (whom I haven't got around to reading).

My father had had thoughts of going to medical school. His mother was very keen on the idea (as mothers all too often are**).

At that time you needed a foreign language to get in, but his attempt at mastering French over a summer break with the aid of a linguistically-inclined college friend ended in failure.

Though he maintained a strong interest in science and medicine and (especially) genetics throughout his life, most of his reading – and he was a voracious reader*** – was non-scientific: history and (mid-20th-century) fiction.

He also maintained an exaggerated respect for the French language which he was very keen for me to keep up in high school.

He meant well but he was ineffective in steering his children in sensible directions, partly because he was increasingly out of touch with them and partly because he was out of touch with the times in which he lived. He remained only vaguely cognizant of the radical social and cultural changes that had occurred in the course of the four decades which separated his own high school years from those of his eldest child.


* The symptoms had nothing to do with my heart, but my general practitioner heard a murmur (which I have had from childhood and which has never caused problems) and he wanted it checked out. So I was booked in to have an echocardiogram. I was expecting something easy and quick like an ECG, and was surprised not only at how long it took but also at the physicality of it: all that poking and prodding and breathing out and holding one's breath and so on. To make matters worse, a couple of times during this process a terrible sloshing and gurgling noise – quite chaotic-sounding, actually – became briefly and alarmingly audible. I referred to this as I was getting dressed and the doctor was tapping away on the computer, trying to finish off whatever she had to finish off regarding my test. She said that that noise still bothered her and she was only now, after a number of years, starting to get used to it. I took some comfort in her remarks, as nervous airline passengers sometimes take comfort in the reactions – or non-reactions – of flight attendants to sudden turbulence or strange bumps or noises. Clearly my chaotic gurglings were not dramatically different from anyone else's...

** Jewish mothers especially? I rarely remember jokes; I tell them badly so what's the point? This one stuck however... From the shore, a Jewish mother sees her adult son in serious trouble in the water. "Help! Help!" she cries. "My son (the doctor) is drowning!"

*** Before there were any children, my parents went to a beach cottage together for a holiday. After they arrived, my father immediately got out a pile of books and settled into a comfortable chair. (Needless to say, this didn't do my mother's confidence any good. She was very young and naïve and starting about this time to wonder what she had got herself into.)

Friday, March 14, 2014

Michael Walzer on anti-Judaism

There is something about Michael Walzer's sympathetic presentation of David Nirenberg's ideas on the history of what he (following Nirenberg) calls anti-Judaism which seems – at least to me – to strike a false note. I am focussing here entirely on Walzer's essay and make no judgment about Nirenberg's book.

Part of the problem is that the term 'anti-Judaism' – which one naturally takes to refer specifically to the religion – is being stretched to encompass broader cultural and other factors.

For example, in his essay – the main point of which seems to be to promote the view that the Jews and Judaism of the Christian and post-Christian imagination bear no relationship to actual Jews and Judaism – Walzer writes:

"No doubt, Jews sometimes act out the roles that anti-Judaism assigns them – but so do the members of all the other national and religious groups, and in much greater numbers. The theory does not depend on the behavior of real Jews." [Emphasis mine.]

But, leaving that specific issue aside, consider this passage in which Walzer sums up (and endorses) David Nirenberg's core thesis about 'anti-Judaism':

"… [A]nti-Judaism claims to be explanatory. What is being explained is the social world; the explanatory tools are certain supposed features of Judaism; and the enemies are mostly not Jews but 'Judaizing' non-Jews who take on these features and are denounced for doing so. I will deal with only a few of Judaism’s negative characteristics: its hyperintellectualism; its predilection for tyranny; its equal and opposite predilection for subversive radicalism; and its this-worldly materialism, invoked […] by both Burke and Marx. None of this is actually descriptive; there certainly are examples of hyper-intellectual, tyrannical, subversive, and materialist Jews (and of dumb, powerless, conformist, and idealistic Jews), but Nirenberg insists, rightly, that real Jews have remarkably little to do with anti-Judaism." [My emphasis again.]

"None of this," Walzer wrote, "is actually descriptive." Just to be clear, he means that none of those listed characteristics is actually descriptive of Judaism.

The listed characteristics are descriptive of something, however, insofar as they are exemplified in the social world. This is important because, clearly, ideas can only be seen as explanatory (even if they are in fact only pseudo-explanatory) if the things they purport to explain are there to be explained (or mis-explained).

The question is, then, whether the characteristics in question have any correlation with Judaism or Judaic culture.

Walzer claims here that they do not, plain and simple.

But a strong case can be made that the Hebrew scriptures are characterized by a certain earthiness or 'embodiedness' or this-worldliness which is absent from many other religions. This is not materialism, certainly, but it does reflect a certain orientation which is very different from Platonistic idealism, for example.

There is also evidence of moral and political radicalism in various prophetic and apocalyptic texts.

Even Walzer admits in the course of his essay that the revolutionary Puritans at the time of the English civil war were actual Judaizers (and not just 'Judaizers') in that they focussed more on the Old than the New Testament. He also admits that many of the Bolsheviks were in fact Jewish – "though of the sort that Isaac Deutscher called 'non-Jewish Jews'."

As to hyperintellectualism, there is ample evidence for this in the Talmudic tradition, is there not?

Why not just admit these facts? I don't see the problem.

The issue is complicated and compounded – rather than clarified and resolved (which presumably was the intention) – by the introduction of the inevitably vague distinction between "real Jews" and "'Judaizing' non-Jews".

The basic thesis seems to run as follows...

Non-Jewish (and Jewish, like Marx) thinkers criticized certain forms of thought and action which they characterized as Judaic. But these forms of thought and action were in fact exemplified not so much by real Jews but rather by non-Jewish 'Judaizers'.

Real Jews have remarkably little to do with anti-Judaism precisely because the 'Judaism' of anti-Judaism has remarkably little to do with real Judaism (or indeed with anything Judaic). So the Judaizing non-Jews (or non-Jewish Judaizers) are not really Judaizing (or Judaizers) at all. (Except the Puritans, apparently.)

Got it?

Finally, let me address directly my concern about Walzer's use of the concept 'real Jews' and the substantive distinction which that concept entails between Jews and non-Jews. Making or assuming such a distinction leads inevitably to the sorts of odd dichotomies I was making fun of above and more generally to definitional dilemmas and arbitrary judgements which a more secular and pragmatic approach could easily avoid.

How would Isaac Deutscher's "non-Jewish Jews" fit in here, for example? Presumably they would be seen by Walzer as real Jews who had made some unfortunate choices!

And indeed, according to Philip Weiss who attended a talk Walzer gave to a Jewish audience in 2007, Walzer does seem to take this view (or something like it). Citing Exodus, Walzer allowed that there have always been irreligious Jews. He also accepted that there are Buddhist Jews. But Jews cease to be Jews by active conversion to, for example, Islam or Christianity. (Just in case you were wondering.)

Wednesday, March 12, 2014

Another thought on Wittgenstein's masked theatre remark

I want to follow up on some comments I made the other day concerning Wittgenstein's remark about Jews being attracted to masked theatre by looking at a passage from Michael Walzer's recent essay on anti-Judaism.

Walzer writes:

"The critique of Jewish cleverness is fairly continuous over time, but it appears with special force among German idealist philosophers of the eighteenth and nineteenth centuries, who repeat many of the supersessionist arguments of the early Christians. [That is, the idea that a new arrangement based on new and deeper values supersedes the old.] Kant understood the heteronomy he sought to overcome – action according to moral law externally imposed rather than freely accepted by the agent – in Jewish terms, but he was himself considered too Jewish by the philosophers who came next, most importantly by Hegel. Kantianism, Hegel claimed, was simply a new version of 'the Jewish principle of opposing thought to reality, reason to sense; this principle involves the rending of life and a lifeless connection between God and the world.' According to Hegel, Abraham had made a fateful choice: his rejection of the world in favor of a sublime God had alienated the Jews forever from the beauty of nature and made them the prisoners of law, incapable of love... [And] Schopenhauer, in the next generation, thought that the academic Hegelians of his time were 'Jews' and followers of 'the Jewish God'..."

Perhaps these notions of a certain kind of thought or intellectuality or cleverness leading to false perceptions and alienation – to being cut off from some supposedly deeper nature – may help to explain Wittgenstein's curious remark about masked theatre.

Originally I took Wittgenstein to be saying that masked theatre was an inferior form of theatre and its putative attraction for Jewish audiences was indicative merely of their lack of depth or artistic understanding.

But you could look at it another way. Perhaps Wittgenstein was thinking that masked theatre was ideally suited to expressing the alienation or cut-offness which Jews (or perhaps 'Jews') have supposedly inflicted on themselves by their intellectualism, etc. This type of theatre would therefore be particularly meaningful to them (i.e. to intellectualizing Jews or 'Jews'*) – and possibly cathartic.

Belle Waring suggested (in response to my comment on her blog post in which I first raised this issue) that Wittgenstein can't have been referring to classical Greek theatre in his apparently disparaging remark. But this would no longer be the case if the remark was not (as I am now suggesting) intended to be disparaging at all.



* I may have something to say about this distinction and some other matters arising out of Walzer's article in a day or two – and then perhaps will give these Jewish themes a rest!

Sunday, March 9, 2014

Ludwig Wittgenstein and the 'Jewish mentality'

I made reference recently – in a comment on a blog post by Belle Waring on clowns and masks and related matters – to Wittgenstein's curious remark, made in a notebook around 1930, about 'masked theatre': namely, that only Jews will be attracted to it.

That discussion didn't really lead anywhere so I thought I might set out here a few suggestions and thoughts not only on Wittgenstein's comment but, more generally, on his attitude to Jews and Judaism.

I'm not sure what kind of masked theatre Wittgenstein was thinking of, but he appears to have been associating it somehow with the unpoetic, abstracting and intellectualizing tendencies which he saw as characterizing the 'Jewish mentality'.

Most of what Wittgenstein says about Jews is conventional 19th-century nonsense related to the idea that Jews are not truly creative, but the reference to masked theatre is decidedly odd.

This is drawing a very long bow, but could he, in his eccentric way, be referring to something like what we would now see as autistic tendencies? Those on the autism spectrum have trouble reading subtle social signals (including facial expressions) and – I don't know about masked theatre – they do tend to gravitate more to comic books and cartoons than the rest of us. They also often have a narrow focus in their thinking and are sometimes highly gifted in mathematics and related disciplines.

And, of course, many of the greatest mathematical and scientific thinkers of the last century-and-a-half (autistic-tending or not) were Jewish...

But I am not making specific claims here so much as just tossing around some ideas to try to make sense of Wittgenstein's remark.*

He was aware, of course, of his Jewish forebears and at times referred to himself as Jewish. After previously playing down his Jewish background, he told his friend Fania Pascal in 1938 that three of his grandparents were Jewish. She subsequently discovered that all three of those Jewish grandparents were – or became – Christians.

Wittgenstein's paternal grandparents were both born to Jewish parents but were baptized as Lutherans and married in a Lutheran church.

His maternal grandfather was raised as a Catholic by his mother who had converted to Catholicism from Judaism. His maternal grandmother was a Catholic without a Jewish background.

"Some Jew," Fania Pascal remarked.

Wittgenstein was baptized a Catholic, instructed in the faith and even considered, at about the age of thirty, taking holy orders. He was given a Catholic burial. But he lived his religious life very privately, a devout if unorthodox Christian.

I recall reading that he was not only devastated but disbelieving when he found out that the invading Nazis had classified his family as Jewish.

Wittgenstein's religious commitments have been downplayed or ignored, for the most part, by his philosophical followers but his anti-Semitic-sounding remarks (as well as his fundamental philosophical commitments) only really make sense when seen in the broader context of his Christian beliefs.

As this essay by Michael Walzer (a review of a book by David Nirenberg actually) makes clear, Western anti-Semitism – or, more precisely, anti-Judaism – is only comprehensible historically when seen in the light of Christian doctrines and is, or at least has been, an essentially explanatory idea intrinsic to Christian thinking.**

But there were always conflicting traditions of thought within Christianity, and the frictions were often traceable to tensions between those who emphasized classical elements and those whose focus was more on Biblical and Jewish sources.

Consider, for example, the controversies surrounding the notion of the 'Hebrew republic' in the 16th and 17th centuries when some prominent Christian scholars took a decidedly pro- or philo-Judaic approach. These individuals – mostly Protestants but also Jesuits – sought enlightenment about pressing political questions from the Hebrew Bible and its Jewish interpreters.

By Wittgenstein's time the concerns were quite different, of course, as various forms of idealism dominated the philosophical world and modernism battled with neo-scholasticism.

Wittgenstein's main problem with Catholicism, apparently, related to its emphasis on natural theology and, by extension, metaphysics, both of which disciplines he rejected unequivocally.

His views seem to have more in common with certain mystical and fideist traditions, and he found religious inspiration in the writings of Augustine, Kierkegaard, Tolstoy and Dostoevsky as well as certain books of the New Testament.

John Hayes discusses Wittgenstein's intense moral preoccupations in terms of his Jewish background, but I see these preoccupations as being quite consistent with those forms of Christianity which draw more on Hebraic sources than Greek or classical ones.

Hayes writes: "From the point of view of Wittgenstein's religious sensibility, [a] feeling of Jewishness seems to have manifested itself in a strong belief in the Last Judgement as a young man and, as an older one, in what he called the 'hundred per cent Hebraic' sense that what we do makes a difference in the end. Such a perspective compels taking our actions seriously; there is only one chance at life and an accounting at the end of it. Wittgenstein seems to have had an abiding sense of guilt which he constantly counter-balanced by a renewed resolution to live life decently."

Significantly Wittgenstein felt far more comfortable with Matthew's gospel – the most Jewish of the gospels – than, say, with the Gospel according to John. And he originally rejected but came eventually to see value in the letters of Paul – who had, it seems, been deeply involved with a mystical form of Judaism (as evidenced by his reference to the 'third heaven', for example).


* Another thought relates to concealment and deception, as Jews have often been characterized as deceptive. Wittgenstein was preoccupied at various stages with his own deceptions and confessed them to friends as a form of self-abasement. (They related, amongst other things, to his allowing misperceptions of the extent of his Jewish background to go uncorrected; and to violent behavior – which he had originally denied – towards children during his brief career as a primary school teacher in rural Austria.) But, though masks do involve concealment and deception in a sense, so does theatre in general. And not only is this decidedly not a morally suspect form of deception, but any concealment or 'deception' is on the part of the players rather than the audience.

** Though anti-Semitic ideas are not necessarily associated directly with Christianity, there are usually at least indirect links. For example, though many 20th-century anti-Semites were influenced more by philosophical idealism than by Christianity, idealism can be seen as a development of the Platonistic elements embedded in the Christian tradition. And, surprisingly, some of the most virulent forms of Islamic anti-Semitism trace their origins to Christian sources.

Monday, March 3, 2014

Improving the odds

I recently sided with Lucy Kellaway on the importance of traditional social conventions and rituals in relation to eating and drinking, and mentioned in passing that I had reverted to drinking ordinary tea made in a pot rather than tea-bag tea.

That piece may have given a slightly misleading picture of my views because, though I certainly do feel – and no doubt will continue to feel – an emotional attachment to the rituals of eating and drinking, I must confess that my basic approach to food is more utilitarian than aesthetic.

I certainly share Kellaway's view that food itself, no matter how good, is "curiously forgettable" and that most of the real pleasure comes from the anticipation and the ritual.

Furthermore I see the current cultural obsession with food and the taste of food as being symptomatic of real cultural decline.

But my personal approach to food is in practice more closely linked to hypochondriacal tendencies than to conservative ones. When I was in my twenties a doctor asked me if I had read Three Men in a Boat. He was thinking of the (once) well-known passage describing how one of the characters leafs through a medical dictionary and finds he has symptoms that match virtually every disease in the book.

My usual way of dealing with perceived symptoms is to tweak my diet and hope they go away. I'll resist the temptation to give details of my current symptoms beyond saying that they are not painful or debilitating, just vaguely suggestive (as my symptoms invariably are) of very dark possibilities.

Of course, dark possibilities crystallize into dark realities for all of us in time; the trick (and it is a trick) is to keep that inevitable reckoning out of sight (and so out of mind) for as long as possible.

Part of this is psychological game-playing but the most important part relates to real choices impinging on general health.

And food choices, along with decisions relating to sleep and exercise, etc., give us – within the limits set by our respective genomes and other factors beyond our control – the ability to maximize (or minimize) our chances of longevity and well-being. We can alter the odds considerably, in fact. And this, to my mind at least, is hugely significant.

Maintaining traditional habits and patterns of eating and drinking is all very well. But the simple truth is that any such habits and customs – however appealing – will get short shrift from me if they stand in the way of my latest dietary prescriptions.*


* Just to give a sense of where I am at the moment... No alcohol, no sugar, no white flour, minimal caffeine (and no caffeine late in the day). Staples include: anchovies; lettuce mix; various fruits and vegetables; yogurt; heavy grainy bread; cheese; tahini; peanut butter; yeast extract; brewer's yeast; pea soup (with ham) and other types of soup. That's the bare bones of the 'what'; I may elaborate on the 'why' – on my rationale – another time. (The concept of glycemic load is important; and recently I decided to increase my intake of purines...)

Tuesday, February 18, 2014

The trouble with history

I have just come across an essay (published last November in the New York Review) by Mark Lilla on Hannah Arendt and how Margarethe von Trotta's recent film about her gets Arendt all wrong – by ignoring the fact that, as subsequent research has revealed, Arendt got Eichmann all wrong.

Von Trotta's films tend to be based around strong and visionary female characters. And Hannah Arendt is presented as just such a powerful visionary.

One can see more clearly why the author of Eichmann in Jerusalem appeared to fit the bill when one considers certain aspects of the filmmaker's cultural and ideological background.

Lilla writes:

"When left-wing radicalism was at its violent peak in the 1970s, the following false syllogism became common wisdom: Nazi crimes were made possible by blind obedience to orders and social convention; therefore anyone who still obeys rules and follows convention is complicit with Nazism, while anyone who rebels against them strikes a retrospective blow against Hitler. For the left in that period, the Holocaust was not fundamentally about the Jews and hatred of Jews (in fact anti-Semitism was common on the radical left). It was, narcissistically, about the Germans' relation to themselves and their unwillingness, in the extreme case, to think for themselves."

Apart from the reference to narcissism, this seems true to me.

But I would also make the more general point (also made, if slightly more equivocally, by Lilla) that writers and filmmakers almost inevitably frame their works on controversial historical and political subjects in terms of simplistic ideologies and flawed logic.

If it didn't conjure up images of book-burning or the Index librorum prohibitorum, I would be tempted to indulge a small fantasy of mine and flesh out the notion of a world in which there would be no popular history books or films, biographical or otherwise – just easy access to a wide range of primary sources, and the minimal framework of scholarship required to authenticate, maintain and present this material to a wider public.

What would happen, of course, if one banned popular histories is that – as has happened so often in the past – enterprising writers would produce allegorical fictions which would make the same sorts of political and ideological points that popular histories would have made more directly (but not necessarily more effectively).

But, leaving aside questions about the desirability or effectiveness of censorship, there is no denying that reading letters and diaries and other documents from past eras (including literary works) is a powerful means of counteracting the myths that historians deliberately or unwittingly promote (even as they try, in many cases, to debunk other myths).

Thursday, February 13, 2014

The secret of their success

In her latest Financial Times column, Lucy Kellaway reports a casual but interesting observation on the apparently rather unspectacular careers of typical Etonians, and then – in gloriously unscientific and anecdotal terms – tries to explain this purported phenomenon.

A few weeks ago her husband attended an Eton College reunion for leavers of 1974.

"About 150 men crowded into the 15th-century chapel to belt out a quick 'Praise my Soul the King of Heaven' before settling down to eat, drink and reminisce about schoolboy pranks while quietly trying to work out who had done best in the 40 years since then."

Kellaway's husband made two observations: one, how good they all looked; two, how relatively undistinguished their careers had turned out to be ("apart from one senior politician and one former newspaper editor"). They were all well-off but generally unremarkable, professionally speaking, which seemed surprising given their start in life.

But what about the inconvenient fact that the current UK Prime Minister, the Mayor of London and the Archbishop of Canterbury are all Etonians?

"I have never met David Cameron," writes Kellaway. "But I know Archbishop Justin Welby and Mayor Boris Johnson well enough to guess that neither is a stranger to insecurity. Both, too, have the capacity to work like dogs."

Kellaway references here The Triple Package by Amy Chua and Jed Rubenfeld which purports to show why certain groups (specifically Jews, Mormons and Chinese) do so much better than other groups in the US. The secret of their success, apparently, is a combination of a superiority complex, insecurity and impulse control.

The trouble with most Etonians is that they lack a sense of insecurity, it seems.

For good measure, Kellaway dismisses the effectiveness of passion, optimism, networking, resilience and life-long learning, and remarks on the "surprising success" of bereaved dyslexics.

Comfortingly, however, she notes that successful people are rarely (as she puts it) good eggs.

"Superior people are alienating; insecure people are exhausting. People who are both are doubly unbearable, especially when you take into account all the dissembling they usually do to mask both traits."

This is getting complicated, but let me put in my own two pennies' worth, my own speculative hypothesis: that English men and women who attended élite schools (like Boris Johnson and, presumably, Lucy Kellaway) are more likely to set a higher priority on masking their sense of superiority than on masking their sense of insecurity (if they have one).

Regarding successful Americans, I have no strong intuitions and will resist the temptation to make any generalizations.

Thursday, January 30, 2014

Rules for eating

Financial Times columnist Lucy Kellaway recently wrote a piece on how she has stopped snacking at her desk at work, and is now an enthusiastic convert to restaurant reviewer A.A. Gill's rules of eating which run as follows:


1. Don't eat anything which can't be eaten with a knife and fork.

2. Eat at a table with a set place, preferably facing someone else who is also eating.

3. Never eat because you are hungry: always eat because it is lunchtime.

4. Never eat standing up.

5. Never eat anything with, or off, plastic or cardboard.

6. Never eat with a screen in the same room.

7. All meals must have at least two courses, except breakfast.


I don't know about all the specifics of these rules, but some such rules are definitely desirable. Like the rule once drummed into schoolchildren about not eating while walking in the street.

Of course, the focus shouldn't be on negative rules so much as on the social and psychological dimensions of eating (and not eating). On the ritual of food and drink.

As Kellaway points out, "food itself is curiously forgettable; what is not forgettable is the anticipation and the ritual."

Making tea is a paradigm case, and early last year I went back to making and drinking proper tea made in a pot. For me, tea bags were just a temporary (albeit decades-long!) aberration.

As is generally the case with issues relating to manners and customs, all this is less trivial than it seems and is ultimately more about deep values than arbitrary preferences.

For rituals of eating and drinking not only contribute to defining our social and cultural environment but also, when they are functioning well, help to give us a sense of security, satisfaction and self-mastery.

Eating when you're hungry is just a little bit limp and boring, don't you think? Just a little bit ho-hum.

Monday, January 13, 2014

Death of an art form

I have given up on movies. There was a time when I trusted sections of the cinematic establishment sufficiently to willingly suspend my disbelief and surrender an hour or two of my time and attention to their products several times a week. There were writers and directors whose cultural background and preoccupations I shared to a great extent and who had (as I saw it) something interesting to say. They were people I respected.

It's a trust thing. Art forms are always about trust, even if they are also about making money.

Whilst government subsidies may attempt to keep local film industries alive, they inevitably encourage ideological conformity and artistic self-indulgence. And the main targeted audiences for mainstream films are now younger and globalized. Lowest common denominator. You know the deal.

A recent Telegraph article mentions that one of the people working as a quantitative analyst for American film producers – a former academic statistician – happens to be a distant relative of Albert Einstein. Symbolic somehow, don't you think, emblematic of cultural decline?

Nick Meaney (who is not a cousin of the physicist) runs a company in South London which assesses the earning prospects of film scripts based on algorithmic analysis of human-input scores relating to hundreds of categories (strength of location, proposed actors, etc.).
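For the curious, the sort of 'algorithmic analysis of human-input scores' described here can be imagined, at its very simplest, as a weighted sum over the scored categories. I should stress that everything in the sketch below – the category names, the weights, the figures – is invented purely for illustration; the real system presumably involves far more categories and something much more sophisticated than simple arithmetic.

```python
def predict_earnings(scores, weights, baseline=0.0):
    """Combine human-input category scores into a single earnings estimate.

    scores:  analyst-assigned score per category (e.g. on a 0-10 scale)
    weights: estimated dollar value (in millions) of each score point
    """
    missing = set(weights) - set(scores)
    if missing:
        raise ValueError(f"unscored categories: {sorted(missing)}")
    return baseline + sum(weights[c] * scores[c] for c in weights)


# Hypothetical example only: three categories instead of hundreds.
weights = {"location_strength": 1.5, "cast_appeal": 2.0, "genre_demand": 3.0}
scores = {"location_strength": 7, "cast_appeal": 4, "genre_demand": 8}

estimate = predict_earnings(scores, weights, baseline=10.0)
print(round(estimate, 1))  # a single dollar-figure estimate, in millions
```

The interesting claim in the article, of course, is not about the arithmetic but about what the weights turn out to be – in particular, how little weight the 'big names' apparently carry.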

Their results are, apparently, much in demand by film producers. And they suggest, by the way, that nine times out of ten the big names have no effect on the box-office figures (assuming the replacements are competent).

So not only are movies not movies in the way they were, the stars are no longer stars in the old sense. Does the explanation lie with the actors, their image-makers, the changed nature of the product or the audiences?

Stars didn't come much bigger than Humphrey Bogart and Ingrid Bergman. But Meaney criticizes the movie classic Casablanca for being "too gloomy, downbeat and too long". He points out that it was only the sixth-best performing film of 1943.

It has performed rather well since, however.

I like Bogart's line as he (Rick) and Bergman (Ilsa) recall their last meeting in Paris at the time of the German invasion: "... You were wearing blue. The Germans were wearing gray."

Quantify that.