Pages

Monday, December 31, 2012

On training to be a vector

Copenhagen, from my flickr

I saw a picture of a Seoul skyline today with, in the foreground, a bamboo-inspired roofed temple and, in the background, curtain-walled skyscrapers. The use of bamboo rods to make a roof determines the shape of the roof, and this constraint turns the work into an exercise in honing beauty out of a difficult material. So the roof was painted, curved like a wave, the tube-ends ornamented and waterproofed. Maybe it was not even bamboo anymore.

I bring this up because I am working on a local North American project for private clients and was musing about alternative materials like bamboo. I am a vector, like most architects, for cross-cultural forms and materials. Vectors can carry diseases, as mosquitoes do malaria; they can fertilize, like bees; or they can annoy, like architects with out-there proposals.

It would be a risk for my client to consider non-local, non-traditional (in North America) roofing materials, however carefully detailed. They pay me mostly to avoid having to deal with the building officials, who can be a hurdle and who like things tied up in nice local ribbons.

Did my training prepare me for this vector role? Did it give me the selling skills, the confidence and the ability to pierce the cultural protection force field?

No.

It did give me some technical and design exposure, but it tried to destroy any confidence I may have had in my ideas, as many professional schools do. Destroy the character and rebuild it. Pavlov showed that that does not work for all character types. He had identified four types in his research on brainwashing for the Russian military:
"the strong and impetuous type, the strong equilibrated and quiet type, the strong equilibrated and lively type, and the weak type",
i.e.:
  1. Those who took it, who fought back, who grew stronger, but who were not prone to being followers and might hold secret grudges (Jason Bourne)
  2. Those who held their core, and could later be commanded and trusted (Arnold the Governator...)
  3. Those who talked back, who talked it through, but who broke in a good way, who could be trusted (Bruce Willis)
  4. Those who broke and could not become strong again (Darth Vader)

I am not sure where I fit in. Maybe this is a spectrum. I know that I hold no love for the empire of my professors, that I would cross the street if I saw them coming first, and that I have not bought into the ideology of pragmatic architecture that they tried to instill. (I will post on the kind of abuse perpetrated on students another time.)

I think I am an "advisory vector". I usually test the waters with a client, propose alternatives, and am not too upset if they go with the middle of the road. It is, after all, their money, their baby. This avoids disasters like the Toronto spiky museum additions and the 1,776-foot-high skyscrapers that mean nothing, but it also tends to promote the status quo. Public clients are another story, some might argue, but I disagree. Public money should not be abused for the benefit of the star architect's ego. It is a difficult call, since good star architecture can revive a place and bring in tourism money, but it can also be just annoying, self-serving and an abuse of the committee process, not to mention of public funds.

Can someone link this to the ethics of the profession? Is it more ethical to be a cross-pollinating vector? Is it more moral to be a bee than to be a carpenter ant? Architecture, when it happens, gives meaning to construction, but it has to happen, and for it to happen there has to be a will, and a will must exercise power, leadership. This is not a given in many contracts. What is art? :-)



Wednesday, November 7, 2012

Architecture is predicting the future



When an architect is asked to design something, part of the challenge is to anticipate the effect that this usually large built and immovable object will have on the many people that will see, use and possibly inhabit it.

Conceptual art approaches can help, for example making a building evoke an idea, or using an idea to start a design, but they are never enough. That my building is 1,776 feet high may help get sponsorship, but the number itself will hardly be perceived.

Fame or notoriety help. Celebrity architects draw buzz and crowds. One can invest genius in a built object and the aura of the famous designer creates a culture around the space.

The inescapable challenge of architecture however is to be present in time and space in a significant way.

Here is an example of how things can go wrong:

In my little town, the well-meaning heritage preservation committee got involved in the review of a design for a pump house that would sit along the waterfront. The building needed the volume of a large two-storey house.

The committee looked around "for context" and found a nearby older shed with two stone gable ends and a very symmetrical sloped roof, a kind of 19th-century shed. So they strongly suggested that the utility's architect (or, more likely, engineer) take the cue.

The intention was to reflect the heritage of the industrial waterfront area.

The result was a large, white, stucco-covered imitation of an old shed, like an oversized paper model of it, sitting there contrasting against the dark water and sky.

I don't think that was the intention. I also think that any architect would have easily anticipated this and warned the client of the danger. Unfortunately, the heritage preservation committee is wedded to form and mimesis (a pompous architect term meaning copycatting). It is an easy notion to sell, a concept that seems to make sense: reflect shapes, and, if you are lucky, materials.

An aside:
I agree with many of my colleagues who say that if you are going to do some restoration work or add to an older building, choose one: re-use material or re-use form and proportion, but don't do both, at the risk of getting some half-assed mimesis.

Ok, so back to the story. This thing now sits on the waterfront, annoying, out of place, and a reminder of industrial sheds we lost, but bigger and worse.

So instead of learning a lesson, this heritage committee moves on, job well done, and is now addressing the problem of our train station. Again, mimesis is being suggested, but this time there is no proximate context, so they looked in the pattern books for old train stations with huge eaves and came up with some hybrid Frank Lloyd Wright-ish stone bunker, because they wanted the stone, of course.

What is galling is that the railway people had proposed a very nice, airy design, with proper functional overhangs to protect people on the platform, but with nice glass, curves and a good light feel to it; considering the barren context of the area, a jewel that might have lightened the oppression of the district.

I don't think it will be realized, because no-one thinks that architects can predict the future, especially when the heritage preservation committee is locked in the past.


Thursday, November 1, 2012

A Tale of Two Realities


I have just begun a new job. No hiatus between this new one and the old. One day I am in hell, the next in paradise.

The one I left was as a software manager in a large organization, and the current one is as a software lead in a very small organization.

Other objective differences: the previous job was bureaucratic, there were many processes and constraints and a network of communication and decision paths, and the systems my team built had to reflect the complex interrelated and often historically constrained requirements.

The current one is technical, working within a small team and with very precise mostly self-defined scientific requirements, adjusted to meet specific customer requests. The software has to run precision equipment to capture and calculate results within extremely tight tolerances, both in time and numerical accuracy. The only measure of quality is repeatable precision. The culture is one of technology and science.

In the new job, I report to people who have the same engineering training as I do, who have more experience in the R&D business than I do, and of whom I can ask questions without risk of offending or destabilizing them emotionally. Peers and superiors in all senses.

In my previous job, I reported to people who asked me to explain what I did "in simple terms" and who liked to hear themselves talk so that they could pretend to make sense of their lumbering thoughts in public forums, often referred to as "meetings" but which felt like beatings. Dissent could be career limiting.

The analogy I would like to offer, to dispel any conclusion that I may be a pretentious ass, is as follows:

Suppose you are a medical practitioner, a physician, and you see patients in a clinical context. You have to diagnose and prescribe treatment as part of your day to day routine.

Also, suppose that you are "managed" by someone who has not undergone the same level of training as you, say a registered nurse, who knows the lingo and understands the context, but does not feel confident enough to make life-or-death decisions affecting patient health, yet feels competent enough, or was somehow appointed, to tell you, the physician, how to run your practice.

For example: "please use language that is easier to understand in your charts". Or "make sure your handwriting is legible for the pharmacist", or "you have not seen your quota of patients today; why did you spend 6 minutes more than average with Mrs. Smith?".

Could you practice under such circumstances? How long would you last?

I lasted 5 years in the analogous IT context. My manager was a technologist, previously a school teacher, who joined the IT boom of the nineties and hung on. She did a bit of COBOL programming, hated it, and went into management because she could talk better than she could listen or understand.

She is proud to claim that she does not understand the need for system architecture as distinct from program design; her eyes glaze over at talk of latency, language choice, servers, data flow analysis, code profiling and refactoring; and she has no patience for options analysis or proofs-of-concept. She will not deign to read code. She has trouble with email client software and spreadsheets.

She wants simple, clear explanations and plans, and wants to set deadlines before beginning design, because iterative work can only lead to grief. She always used waterfall approaches in COBOL, and they worked fine for her.

Her weakness and fear prevent her from adding value, from making decisions. She was a pass-through for her manager's decisions to me and my peers: no value added, but lots of aggravation and delay. She never understood or wanted the concept of situational management, which is probably the most effective way of managing people; so effective that there have been lawsuits over who has the right to claim authorship and teach it. She is a scared rabbit, hanging on until she can retire and cash in.

Others in that environment have said that we, the technical folk who did the work, should not "speak Neanderthal" in the presence of senior executives, by which it was meant that we should use non-IT terms of fewer than three syllables. The executives in question were the CIO and her direct reports.

But all this is now in my past, one-day-old stuff. I am now in another world, where the conversations around me centre on measurement tolerances, ADC resolutions, real-time interrupt-driven code, clock speeds and sampling rates, hierarchical state machine approaches, language selection and compiler efficiencies. I have scope probes and microcontrollers on the benches beside me, no fuzzy-walled grey cubicle partitions in sight, and a project to deliver with people I can talk to who want me to talk dirty.
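As an aside for the curious, the hierarchical state machine idea mentioned above can be sketched in a few lines: substates inherit the transitions of their parent state, so an unhandled event bubbles upward. This is my own minimal illustration, not code from the actual instrument; all state and event names are invented.

```python
# Minimal hierarchical state machine sketch: events unhandled by a
# substate bubble up to its parent state. All names are illustrative.

class State:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.handlers = {}          # event name -> target State

    def on(self, event, target):
        self.handlers[event] = target
        return self

class Machine:
    def __init__(self, initial):
        self.state = initial

    def dispatch(self, event):
        # Walk up the hierarchy until some ancestor handles the event.
        s = self.state
        while s is not None:
            if event in s.handlers:
                self.state = s.handlers[event]
                return True
            s = s.parent            # bubble up, as in an HSM
        return False                # event ignored

# Example: an "armed" superstate whose transitions are shared
# by both of its substates.
armed = State("armed")
idle = State("idle", parent=armed)
sampling = State("sampling", parent=armed)
fault = State("fault")

idle.on("start", sampling)
sampling.on("stop", idle)
armed.on("error", fault)            # inherited by idle and sampling

m = Machine(idle)
m.dispatch("start")                 # idle -> sampling
m.dispatch("error")                 # handled by parent "armed" -> fault
print(m.state.name)
```

The appeal in embedded work is that common fault or reset handling lives once in the superstate instead of being copied into every substate.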

I am in heaven.

Friday, October 26, 2012

Gender relations

Franz Erhard Walther, 1983 - Paris, Centre Pompidou - by A. Barake

Still digesting the readings from Thanksgiving...

I read Simone de Beauvoir's "The Second Sex" recently, and as I was driving in this morning it hit me how such a study can be a good reflection of the state of gender relations at a point in time, yet how it is just that, a touch point, since these relations are so varied, so much in flux, constantly evolving. I asked myself whether her study was of any use in day-to-day life.

The sample size being dealt with is huge: 3 billion, give or take a few million, on each side of the equation, and this results in a large variety of sexual behaviours, variability and variety. The middle of the bell curve is very broad, even if we limit it to Western culture, as she does.

Yes, we are a laboratory for sex, as we should be. Until we get totally artificial about it, survival and renewal require this experimentation. Since our brains are so adaptable, interest in sex requires huge variety I guess. There is no end to the variations, and the minor things are as exciting as the major ones, however you define the categories. Marketing and advertising depend on this instinct as their currency of the new.

Despite the difficulty of categorizing such a complex topic, de Beauvoir's exposé covers and uncovers much ground. Her method is to work at the delta she perceives between gender status, behaviour and emotion in every context of her time, and historically where she can.

One observation which struck me was the notion that a large number of women she observed define themselves through a male spouse or mate, as a means to gain territory, because of the cultural power disparity that existed. Desire and power, the old story, examined with great perspicacity, dissected and countered with new approaches. The central thesis of the book.

I extrapolated this notion to the study of relations between gay men and hetero women, to try to answer the question of why there can be friendly attraction without sexual attraction. Also, to test whether there is a converse with lesbians and hetero men. There isn't, in a general sense. De Beauvoir has a chapter on lesbianism, and she concludes that the relation between heterosexual and same-sex desire is much closer and more symmetrical in women than in men.

Having also read what Camille Paglia says about homosexuality, I am not sure that one can conclude anything like that. The spectrum is very broad as I wrote earlier, and maybe de Beauvoir's snapshot is just a reflection of what she could see, or what was visible when she wrote the book at mid-century.

Gender relations and sex remain very difficult things to generalize about, and I think it goes back to the idea that we are attracted to the new, to variety, and that behaviours that test and taste that variety are usually good for evolution, for survival, so long as they are not culturally or biologically damaging.

As the culture and the population widen, the behaviours that can be experimented with can grow, there are wider safety margins, and there is also motivation and opportunity. The taboos and secrecy around sex that act as cultural anchors to maintain tribal cohesion loosen as the tribe expands to include the planet's population. The need for protectionism disappears, reflecting the monetary and commercial globalization.

This may be why there is a resurgence of fundamentalist thought, a backlash, a clinging to what can be perceived as a moral high-ground, based on limited communications, limited population size, limited cultural migration. Pockets of resistance.

Oh yes, the book was banned by the Vatican. Go figure.

Thursday, October 18, 2012

This is brought to you by...

The Turing test.
This blog is a simulation. It is written by an AI.

Half-baked vs open to possibilities? A manifesto.

studio series A. Barake 2008

I like the unfinished.

Whether it be studies, sketches, snippets of songs, or first drafts.

They seem alive, they allow for the future.

The future that I can choose to defer, giving me the hope of potential, of life going on.

Half-baked means that dinner is coming.

I also like the fully realized. It is post-coital, closure, comfort, time for closing the eyes and savoring.

So what is left not to like in this spectrum?

The mediocre, the completed in haste, the delivered for money not love, the faked, the cliché, the pandering to mass appeal, the blue-smoke sentimentality of bad films by Spielberg and his ilk.

Playing the devil's advocate, I can see that the charges of laziness and lack of discipline can be laid. Other lesser judgements could be "wishy-washy", indecisive, uncommitted.

Yes, all valid categories for this approach, but the Venn diagram is flawed. These are not supersets of the incomplete. The incomplete is the superset of the attributes I list, and of many others. The incomplete action is the existential act; the rest is just noise and judgement. Being incomplete may not lead to riches, mass appeal or classic status, but it can reflect life and happiness. Certainly for me.


Thursday, October 11, 2012

Themes versus categories

From my flickr


Trying to make sense of experiences and events can lead to categorization.

I try to resist that, given that I think and have written here that categorizations are usually trade-offs between understanding and convenience.

A grouping is as good as the criteria used for grouping, and there's the rub.

Rational categorizations, the ones used by encyclopedias, are part of our cultural thinking. What I think is missing from the cultural tool arsenal is emotional grouping. (I really should say it is missing from the serious argumentative arsenal, but is very present in rhetoric, which can have a bad name.)

I think the reason for that is individuality. Emotions are based on experience, and this differentiates the perceived and experiential validity of the grouping depending on the audience's family, clan, nation, etc. This may be why AI is so hard.

But...

now we have the means to not generalize.

We can simulate individuals, mirror them really, and build networks of relationships, as Facebook and other network applications do, and map the emotional gradients, correlate them to the other, more usual and accepted categorizations, and deal with the rhetoric.

But I did not begin this post to write about that.

I want to write about the notion of theme, which is a categorization of emotion and fact, intertwined, used in the arts.

Let me give an example: the outsider theme

I was reading Edward Said's autobiography over (Canadian) Thanksgiving, where he talks about his feelings of being between cultures, and I related it to other immigrant and emigrant writings. Rushdie provides liner notes to Said's book, and I believe that Hitchens was a supporter and friend. I mention this because both are contrarians and sometimes wilful outsiders.

How to describe, or even define the theme of "the outsider"? How does it differ from a category?

Outsiders may feel kinship based on the feeling of belonging, which is rooted in emotion, even in biology.
Attempting to completely categorize the theme of the outsider with words will fail, because one has to feel like an outsider to completely know and understand it. Sometimes the feeling is due to internal stuff: memories, wounds, predispositions, inheritance, rather than external immediate factors. It is not reliably inducible, and I would also say that it is not measurable, yet.

For example, a novel's or a painting's theme is a shot in the dark, a collection of experiences and perceptions manipulated to model a simulacrum or resonance of the theme in the audience's mind.

The success of the work depends on the accuracy of the resonance between "player" and "listener", and sometimes on the group think that comes with it, the meme concept, aided by marketing, and that is how culture moves along and changes. Evolution in action.

Happy (survivalist) themes succeed more easily; thus escapist magic, futuristic fantasy, sex stuff. Stuff that is more emotionally ambivalent, like Graham Greene's and J.M. Coetzee's works, provides themes that can take root, sometimes grow into the main thread, the canon, produce seed, and thus join other classics in the taxonomy or network of thematic relationships we call culture.

Themes versus categories. We can now manage thematic encyclopedias. The Web is the perfect tool for this.

Friday, October 5, 2012

Another rant about software



Software does not really rot, despite what developers experience every day.

What decays is its environment.

Software embodies perfectly adapted machines that cease to function as everything around them evolves. Software has no friction to wear it out. The claimed obsolescence occurs because it gets old relative to the world around it. At least that is the hype: clients are missing out on upgrades, security fixes, whatever. The Internet helps, of course, because it is a true wilderness, with predation.

The smart boys in the IT marketing departments of big companies learned this probably by accident and have been helping the phenomenon along ever since.

This is the upgrade cycle.

The lucrative business model is based on what I think is a partly artificial problem, or at least one that has been blown out of proportion.

Here is a fantasy based on a moral, ethical and unrealistic world:

A program is written to meet a requirement, is tested, goes through a few iterations with the user community and becomes stable and useful and productive and everyone is happy for a long time.
The underlying hardware also gets faster and better but remains consistent with the older platforms so that the software can continue to run, or be upgraded in an incremental way, but stays in support essentially as long as nothing fundamentally better comes along.

This is not impossible. Microsoft and IBM both provided this kind of upgrade path for their OSes and systems for a long time, but then realized that there was much more money to be made by setting deadlines on support, so as to convince client IT managers that their stuff might break and cost too much to fix if they did not pay for protection upgrades.

FUD is easy when knowledge and complexity do not keep up with each other, and a company certainly can create and control complexity, which has the added bonus of thwarting compatibility, integration and competition. IT companies that have survived have defied the inexorable race to zero cost of most consumer technology, especially something as perfectly light, reproducible and useful as software, by bucking all the good practices of design while giving them marketing lip service. HP and other engineering companies like DEC may have misunderstood this twisted logic. Sun certainly did.

Open source is a defense against and a mitigation of this pathology, and it has the potential to reduce the crazy costs associated with IT change by distributing the cycle of maintenance across IT shops, using common knowledge. It is an extension of the Unix ideals of clarity, modularity and community. Heresy, of course. Communism, some have called it.

So we continue to have big companies and governments paying huge sums for essentially very little, a sort of insurance. Let's call it that then, software insurance. That is heresy too, especially if you read the disclaimers that are standard with all commercial software. No guarantees.

Any suggestions on how to shine some light here?




Friday, September 28, 2012

What's next?

Copyright 2018 A. Barake
E.M. Forster in his "Aspects of the Novel" lectures discusses the difference between plot and story. Plot includes meaning. Can we know meaning, and do we want it in a story? The novelist's art can make meaning crystallize in the reader's mind by drawing on shared experiences, on context, on cultural tropes. Can this not happen through simple storytelling as well?

A poetry teacher I remember from long ago said that the writer must try to control intent as closely as possible. He was dead against the scatter-words-to-the-winds approach of free association that some of us were playing with. In fact, he chastised us quite severely about it, and the reprimand has stuck to this day.

I was thinking of plot versus storytelling because of the wild popularity of the Harry Potter stories and the recent hype around Rowling's imminent release of a "serious" novel for adults. Harry Potter works are driven by the "what's next" urge. I remember reading stories to my younger brother before bed and he could not wait for the next instalment, to the point of learning to read so that he could get to the denouement of the latest twist before the next evening.

So will Rowling's addition of plot to storytelling make her serious novel more successful? I don't think so.
Plot is a luxury, at least in the commercial sense.

The success of the Fifty Shades of Grey trilogy adds evidence to my claim. I think the sexual encounters are the equivalent of the "what's next" of a standard story. Same principle as in detective fiction. There are comments on readers' sites about skipping over the sexual descriptions just to get to the setup of the next variation. So it is about anticipation, more so than about meaning or description.

Tipping the balance of writing towards such commercially successful mechanisms makes the writing more cinematic, flatter dimensionally. I am not making a value judgement. I think that it is not sustainable. Art is about Eros, about tension and sublimation, and repeated denouements lead to wanting bigger and bigger bangs.

So conservation is a goal of plot artistry: learning to fish rather than being served the trout all cooked. Playing with the mind. It sounds pompous, but to write in a sustained way is to influence and seduce and to keep the love alive. Boring is the worst insult you could throw at a writer, and that risk exists if stories begin to repeat, as they ultimately have to. Even the Thousand and One Nights had to end in a boring marriage.




Wednesday, February 29, 2012

Cognitive Philosophy

There seems to be a movement afoot - long overdue - to embody the foundations of intelligence and philosophy. It is partially old news: the heroic AI of the 50's, 60's and 70's has been discounted, despite our benefiting from the results of that research in today's mainstream technology: neural nets, machine speech recognition, handwriting recognition, natural language translation, cars that drive themselves, and all kinds of other low-level intelligent (unconscious?) activities simulated by programs.
It is the Minsky flavour of AI (now rebranded as cognitive science), the one grounded in logic, that seems to have fallen out of favour, since the fruits of that work did not rise to the expectations of the "I" in AI: an AI based only on logic, a formal game with symbols that pretended to be words. How can that be smart without the observer having to make all kinds of allowances and assumptions?

This stuff was subject to sometimes acrimonious debate for a while, until technology and big data made many of the arguments moot.

The good news is that there is now a strong movement towards the recognition of the role of the perceptual environment and its embodiment in animals like us as part of the equation.

This realization may be due to the Web, indirectly, since it should now be obvious that the seeming "intelligence" of a Google search would not exist without the "environment" of all the Web sites, which are of course interactions with people's brains. (In fact, there was a very simple AI program that used "Google distance" as a way to measure metaphorical strength between concepts. For example, "Castro" and "cigar" scored pretty high on that scale.)
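For readers who want the flavour of that measure, the Normalized Google Distance compares how often two terms appear separately with how often they appear together in an index of N pages. The sketch below is a minimal illustration of the formula; the hit counts are made up for the example, not real search results.

```python
import math

def ngd(fx, fy, fxy, n):
    """Normalized Google Distance: smaller values mean the two terms
    co-occur more often than their individual frequencies would suggest.
    fx, fy: page counts for each term; fxy: joint count; n: index size."""
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(n) - min(lx, ly))

# Illustrative (invented) hit counts, not real query results:
close = ngd(fx=2.0e6, fy=1.5e6, fxy=5.0e5, n=1.0e10)   # e.g. "Castro"/"cigar"
far   = ngd(fx=2.0e6, fy=1.5e6, fxy=2.0e3, n=1.0e10)   # an unrelated pair
print(close < far)   # the related pair yields a smaller distance
```

The interesting part is that the "environment" doing the work is the Web itself: the formula is trivial, and all the knowledge lives in the counts.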

So now there is a movement, a possibly serendipitous program of research coming together to ground AI in the real-world of experience, and some of the people that I think are leading this effort align themselves along a somewhat elegant but odd symmetry of ideas.

Let me explain.

There is David Gelernter, the computer scientist and columnist who wrote "The Muse in the Machine" and "Mirror Worlds". These books made me want to go back and look at AI again. In fact, I ran back. His arguments are so cogent, so seemingly simple and common-sensical, that I almost missed their radical novelty. He states that data, and the links between data, are the key to intelligence. These links are often emotional gradients, in the sense that logic is not always what binds ideas; it is similar emotional affect!

The brain associates memories based on the closeness of the emotions they evoke, the sensory-input parallels present as the memory is created: the whole model, not just the idea... how you felt, how cold or hot or tired or dizzy you were. This links data to environment so elegantly that it is a mystery to me why no-one picked up on it earlier. He breaks the syntax/semantics dichotomy very deeply. Also, Gelernter holds very conservative political views, right of centre. I am noting this now so as to make my strange-symmetry argument a bit further down. Bear with me.

There is George Lakoff, the cognitive linguist. His work with Johnson on embodied philosophy and with Nunez on the origins of math, takes the empirical evidence around the links between our brain wiring and the perception of the world and uses it as a foundation for philosophy.

Basic brain mechanisms are considered to be foundational and ideas grow as hierarchies of metaphor (i.e. links).

Metaphor becomes a key mechanism for cognition. Similarly to Gelernter, Lakoff et al. propound that metaphors are based on evolutionary and environmental constraints, not some Platonic logical ideal. No such thing as the separation of mind from the world of things. These guys are existentialist scientists. Finally! The rampant positivism and Platonism of the English-speaking world, the Whitehead/Russell school, is giving ground. Merleau-Ponty and Husserl are now more visible in the world of scientific cognition.

Lakoff is an ex-student of Chomsky. He seems to disagree with the flavour of Chomsky's thesis of hard-wired syntax in the brain. He believes syntax and semantics to be more of a spectrum of complexity in cognitive biology than a differentiator between humans and other animals. Also, he is a non-neo-con, a strong and vocal opponent of right-wing politics as it is played in the US today, and although I hesitate to slot him with Chomsky on the political front, they are on the same side of a hypothetical centre line.

So we have right- and left-wing thinkers on cognition who fundamentally disagree in the political realm, the realm of people, relations, nations, influence and power. That is a healthy and good thing. It lends support to the notion that there is a grounding in "objective" reality for this thinking on cognition; it is much more than opinion. (A slim argument, I know, but hey, this is my blog, I can say what I want.)

Aside from the political thing, what I want to continue to talk about is the cognition thing. There is another thinker/scientist who seems to be breaking new ground along those same lines.

He is a neuroscientist, one who gets his hands dirty opening up craniums, working with brain surgeons: William H. Calvin. I read two books of his: "Conversations with Neil's Brain" and "How Brains Think: Evolving Intelligence Then and Now". His ideas centre on the notion that cognition is an initially subconscious Darwinian process among neural bundles that remember perceptions, integrate them and ultimately model the environment perceived by the body, and that the successful "species" of memories/emotions (let's call them cognitions) are the ones we call conscious, the ones that rise to the top of the noise of cognition to differentiate themselves as our inner voices.
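To make the "Darwinian process" idea concrete, here is a toy of my own devising, not Calvin's actual model: candidate patterns compete, the ones that best match a "perceived environment" replicate with variation, and one eventually rises out of the initial noise. The target string and all parameters are arbitrary illustrations.

```python
import random

random.seed(1)

# Toy Darwinian selection among candidate "cognitions" (my own
# illustration of the general idea, not Calvin's model).
TARGET = "hello"                      # stands in for the environment
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate):
    # how many positions match the perceived environment
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate):
    # copy with one random variation
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

# start from pure noise
population = ["".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
              for _ in range(50)]

for generation in range(500):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    # copying competition: the better half replicates with variation
    survivors = population[:25]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(25)]

print(population[0])
```

The point of the toy is only that selection over noisy copies, with no central planner, is enough for a coherent winner to emerge, which is the shape of Calvin's claim about conscious cognitions.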

His research fundamentally agrees with the models put forth by Gelernter and Lakoff. Like Lakoff, he believes that most cognition in the brain is fomented through unconscious processes. Like Gelernter he finds that memories are linked through non-logical processes. He shares with Lakoff a deep knowledge of Chomsky's ideas on innate language ability and like Lakoff disputes the simplicity of these models, without denying the human brain's ability to parse language. He also talks of metaphor, not surprisingly, since all this is supported by empirical evidence at the chemical and behavioural level.

So I am very happy that this stuff is happening. I tended to get hot under the collar when I read "classic" texts by Minsky or Pinker on intelligence. They seemed simplistic and smug. I was always perplexed by the lack of attention given to Grey Walter's ideas, which preceded and anticipated Gelernter's. I wanted "The Muse in the Machine" to be discussed more, but to my chagrin, Penrose's strange books on quantum-brain hypotheses got more press.

Let's get back to empirical research, let's continue the program of work that will ultimately bring philosophy and psychology back into science. The foundational thinkers listed above should also include Turing. Even Penrose mentions his "other work". He did mathematical research on molecular biology (plants mostly) and how stem cells can become complex structures. His mathematically based hypotheses on these matters were validated experimentally just this year in the UK. Ultimately this theoretical work may help explain some of the brain mechanisms underlying the work Calvin and others are doing.

There are conceptual layers to all this analysis, from the bottom up: the molecular, the cellular, the systems (bundles of neurons evolving), and the cognitive and behavioural, leading to metaphor, logic and science, including physics, on to philosophy, morality and politics. The gaps need to be filled so that we have a continuum of understanding. A New Science in the Vico sense: something we made and understood that we made, but that is grounded in experience.

(I just found an earlier post that was wishing this stuff would happen, and so here we are.)

Wednesday, February 15, 2012

Coetzee's Elizabeth Costello device

I am working my way through Coetzee's novels in no particular order, and I have just finished Slow Man. Elizabeth Costello's role becomes clear there. Coetzee is no doubt all too aware of his self-centredness; in fact he keeps alluding to it through his secondary characters, who keep asking if they really can be considered secondary, if anyone can. So, to deal with that dilemma, he creates this other author's voice, to take him outside J.M. In a sense it works. He manages to see things differently, and I am sure the choice of gender helps him. Empathy is strong in his writing, but it is a clinical empathy, one rooted in observation, like a photographer who sees misery, captures it and then moves on. It is not care; it is halfway to care. So the Elizabeth Costello device pushes the plot towards action, towards mistakes; she forces him to do and to confront.
Again, why the fuss? Why not just write from the perspective of other voices than his own? Probably because he is so honest and is not wired that way.
The method succeeds in making his work interesting, but as John Crace notes in his pastiche of Coetzee in the Guardian, he keeps writing the same book. Why not?

In fact I have argued the other way for songwriters. Is the medium a factor? Can songs be egocentric, since they are an accepted form of wooing? Are novels different? No answers here.

Saturday, February 11, 2012

Comments on Coetzee and the books Elizabeth Costello, Diary of a Bad Year and Summertime

J.M. Coetzee’s work is more often about himself than not, in subtle reflections that achieve the necessary universality. Yet I feel a great unease about it. The undertone of self-criticism and mockery is subverted by the very fact that he is writing about himself. In effect saying that that is what is important. I know that it is deeper than that, that he uses himself consciously as a device, one that is central to his approach to fiction, but I also know that sadness and depression come from loneliness.

However, Coetzee has succeeded in reaching others, he has two Bookers, a Nobel and is loved by readers and writers, including me. He says things that need to be said, with courage and balance.

But, and this is what gnaws at me, there is the self-centredness, the one he is aware of, that he digs into all the time, the fatal flaw.

Maybe I can talk about the novels I mention in the title, to give examples. Elizabeth Costello reads like a crotchety set of essays by someone who does not care how they are perceived, because they are gone, above criticism. Costello is dead at the end, in a sort of purgatory, and this gives closure to all the ranting. It is good ranting though, holds your attention, and despite the characterization, shows balance. This is the tour de force. The hard foundations of Coetzee's writing are a) humanism, probably more accurate to say animalism, avoiding pain, b) guilt about colonialism, and anger about the guilt, since he is not directly guilty, he is constantly trying to extricate himself from it, and c) self and others, connection, sex as a mystery.

Elizabeth Costello contains a casual sexual encounter in a hotel, between business travellers, that is masterful in showing how we inhabit our bodies when close to one another, how strange, beautiful and limiting awareness of sexual contact is. Sex and loneliness in Coetzee's work are bound tightly. He does not really understand the other, despite all the voices.

This brings me to Summertime, where he reflects on his middle years through women who have known him. This is masterfully written, but the uneasiness I feel about it relates to the above: he cannot get out of himself. He realizes it, since his characters allude to his "autism". Realizing it, trying to exorcise it, and ultimately failing (in my view) is what makes the work so strong, but also so flawed. He can go on telling us he knows this, and there are two possible conclusions: i) he is truly failing - case closed, or ii) he is using this mechanism to drive his creativity, it is a shtick. I refuse to believe the second.

In Diary of a Bad Year, he tries again to bring Coetzee out of his shell. Two and a half points of view: himself, as always; a young woman who becomes his typist; and the views of her boyfriend, a foil to the humanist/animalist/anarchist Coetzee character. Archetype does not mean one-dimensional here. He is fully fleshed, but a bit distant, a bit vague, a bit too consistent. Coetzee, it seems, is getting revenge on that type. I know these types well, I dislike them too, but again, I almost want not to believe that Coetzee is exercising a form of petty literary revenge. Not petty, I guess; the themes are broad enough, but in all three books there is a sort of accounts-settling smell, things that his "autism" may have prevented him from doing in real time may be coming out.

So in conclusion, I must admire the man and admit that I enjoy the work, but some of it seems to evade complete control, which to me is a criterion of classic art. This is the mystery of Coetzee’s work.