Much as I am enjoying writing this blog free of the constraints of 1914-15 Time, I think long-term followers may understand when I say that I still think of my 1914-15 ‘blography’ of George as Calderonia proper.
Those followers will remember that several posts between July 2014 and July 2015 touched on what I called chronotopia (e.g. 12 September 2014, 8 December 2014, 30 March 2015, 18 June 2015). This word was intended to define the difficulty I had, as a biographer, of two-timing with time, viz. of writing the day-to-day 1914-15 blog strictly in ‘real time’ whilst continuing to write George Calderon: Edwardian Genius in extended narrated time. I don’t know if it was Bakhtin who invented the word ‘chronotope’, but it was certainly needed, to identify the different time/space forms that different genres of writing employ (‘temporalities’ has also been used). When I was trying to write in two chronotopes simultaneously, I felt that circuits in my brain kept shorting. Above all, the ‘real time’ chronotope of the blog kept infecting my writing of the book, sapping its narrative propulsion, until faute de mieux I gave up writing the book for five months, concentrating on completing research for the last two chapters, which I started writing only when Calderonia proper was over.
When I said that I felt it was my brain that couldn’t cope with writing in these two chronotopes simultaneously, I was aware that I was speaking metaphorically, since I have, of course, no idea of how my own brain physically works. I was also very ready to believe that younger writers’ brains would be more able to two-time than a sixty-eight-year-old’s. But in the course of working on something quite different, I have just read a short article by Dr Detlef B. Linke, Professor of Clinical Neurophysiology and Neurosurgical Rehabilitation at Bonn University, which — in so far as I can understand it — seems to suggest there is empirical evidence for my metaphorical hypothesis.
Dr Linke begins his article by stating that ‘there is no common pacemaker by which intervals of time could be defined in the brain. […] The data demonstrate that the brain is not a clock in the physiological sense’. The brain has developed ‘different time-scales’ for itself, partly through ‘imaginative capabilities’ according to German Idealist philosophers (which I daresay Bakhtin would agree with), but:
Too much rhythmic synchronization is deleterious for informational content, and generalized rhythmic synchronization of the brain is a well-known pathological condition. The question of informational coding therefore has to be seen as a complex system of time-scales in which a complete presence of all the information would destroy the complex hierarchical and heterarchical interactions.
Polychronotopia, then, can cause a complete brainstorm… To complicate matters further, ‘for the right hemisphere, the flow of time is experienced as slower than for the left hemisphere’. Could this mean that two-timing with the narrative-biography-chronotope (slower) and the real-time-chronotope (faster) is a question of repeatedly switching from one hemisphere to the other, which some brains might be better at than others? Linke even seems to suggest that ‘a part of biography […] belongs to the way from the right to the left hemisphere’. He concludes, encouragingly:
Reflection takes time and therefore makes a difference to the passing of time. […] I think this difference need not be painful but it can, instead, be taken as the possibility of the brain being a host to others. There are good reasons to keep up differences and time-scales when they could be the origins for better contact with alterity. But the question of unity therefore remains, especially the question of the ability of being a host while being preserved in unity.
This raises a few questions, and sounds to me as though it may have been translated from German by someone whose native language is not the target one, but it is still interesting, I find, coming from someone who is described as a ‘brain scientist’ and clearly works with measurable data, unlike myself, who can merely speculate.
The full reference for this article is: Detlef Linke, ‘The Lord of Time: Brain Theory and Eschatology’, in The End of the World and the Ends of God, ed. by John Polkinghorne and Michael Welker (Harrisburg, Pennsylvania: Trinity Press International, 2000), pp. 42-46.
Last year in Calderonia, John Dewey, the author of Mirror of the Soul: A Life of the Poet Fyodor Tyutchev (2010), and I discussed the tristesse that a biographer often experiences when the subject of their biography has died in the text, so to speak. Since s/he has always known the subject is dead, and the death in the text being written rarely coincides with any commemorative date, we speculated that the tristesse/mourning is also connected with something happening in the brain as that very personal ‘time-track’ comes to an end.
Since then Dewey has published a much-needed collection, in his own elegant English translation, of stories by Yevgeniy Zamyatin, the bracing ‘first dissident’ Soviet writer: see http://www.brimstonepress.co.uk/books/detail/YZ-TheSignAndOtherStories.htm. In my first post next week I shall look at the possibility that there is a link between Zamyatin and George, whilst endeavouring to avoid both apophenia and pareidolia.
For the archive of posts since 31 July 2015, please click here.
The Somme: Ends and Beginnings
When did the Edwardian Age begin and end?
Obviously, in the literal sense it spanned Edward VII’s reign, 1901-10. Cultural historians, however, have long extended it beyond those dates, because the nexus of attitudes and values that we call ‘Edwardianism’ began to form before 1901 and died years after 1910. Thus Samuel Hynes, author of one of the most influential books about the period, The Edwardian Turn of Mind (1968), placed the beginning at ‘roughly the turn of the century’; some would even date it to the ‘naughty nineties’; Roy Hattersley (The Edwardians, 2004) dated it from Queen Victoria’s death; I perceive it setting in after the Queen’s Diamond Jubilee of 1897. Hynes wrote that ‘the end of the Edwardian age is as certain as it was sudden — 4 August 1914’ and Hattersley agrees.
But 4 August 1914 is merely symbolic. We all know that Britain went into World War 1 with its Edwardian attitudes intact; that was part of the problem. Last year I followed the Gallipoli campaign on Calderonia from day to day and we could see that it was compounded of ponderousness and mental rigidity, wasteful false heroism, woefully arrogant misplaced self-confidence, dedicated amateurism, and lack of realism, to name but a few attitudes. The failings of the Edwardian officer-class were all too obvious to the ANZAC troops. A tougher new breed of soldier like General Sir Charles Monro, who replaced Ian Hamilton and recommended evacuation, could also see them. I felt then that the Gallipoli disaster not only epitomised the worst of Edwardianism, it marked a turning point in it: the beginning of its end. My own working time-frame for the Edwardian Age became 1897-1915 (one has to spell these things out in one’s Introduction).
However, I don’t now believe that Gallipoli fundamentally changed attitudes. Its failures led to the fall of the Liberal government in May 1915, and of Fisher and Churchill, and criticism of the campaign from the Australian journalist Keith Murdoch shook some people’s confidence, but the actual evacuation of January 1916 was seen as a triumph, the nation was still focussed on ‘gallantry’, and defeat was not accepted as defeat. It was, I believe, the Battle of the Somme, launched at 7.30 a.m. one hundred years ago today, that triggered the break-up of the Edwardian mindset.
The Battle of the Somme shared several features with Gallipoli, for example meticulous planning and preparation combined with complete inflexibility, a lack of intelligence analysis combined with a lack of risk analysis, and a belief in ‘heroism’ that made men totally expendable. Meticulous planning and preparation — complete military professionalism — were a good thing, of course. But if there was no plan B, no way of changing from plan A if it went awry, no use of intelligence and risk assessment, then as at Gallipoli meticulous orders could simply make self-destruction more efficient.
George and Kittie Calderon’s twenty-five-year-old friend Dick Sutton, who had been wounded twice by June 1916 and was now ADC to General Sir Henry Rawlinson, commander of the Fourth Army at the Somme, wrote in his diary on 30 June 1916:
As we know, that is what happened. Rawlinson ignored intelligence about the depth and strength of the Germans’ dugouts, believed that the long preliminary artillery barrage had destroyed the German front line, and insisted on his own rigid plan. By the afternoon of today the scale of the disaster was clear, but neither Rawlinson nor Haig could change their plan… By the end of today, the British Army had sustained more casualties than on any other day in its history: 57,470, of whom 19,240 were dead. When the Battle ended in November, the combined casualty figure for both sides was over a million.
Yet the effects of the Battle of the Somme were quite different from those of Gallipoli. The army did embark on a learning curve. It had invented the ‘creeping barrage’, as Sutton’s diary indicates, but both it and the artillery plainly needed improving. Tanks were first used at the Somme in September, but the army had to learn how to combine them with the creeping barrage and infantry assault. This combination would eventually be a war-winner. The Somme also demonstrated the almost complete irrelevance, if not counter-productivity, of the cavalry. I have the clear impression that defeat on the Somme at last shook military and popular attitudes to the core. As Peter Hart has written in his Gallipoli (2011), the ‘casual arrogance’ that lay behind the Dardanelles disaster was ‘finally exorcised by the Germans on the gently rolling ridges and valleys of the Somme in 1916’.
Edwardianism died at the Somme. This was a very positive development.