
The title of this post describes how I was feeling by the end of a week-long experiential learning course that I participated in last week at Henley; one of the most uncomfortable, confusing, frustrating and informative weeks I’ve encountered for quite some time.

Let me explain.

The course is called “The Advanced Personal Leadership programme” and has been a feature on the Executive Education menu at Henley for several decades. It has a reputation. If you ask the Business School about it, you get a limited amount of information. It’s five and a half days. It’s small groups. It’s about leadership and group dynamics. It’s experiential. It’s quite expensive. And that’s about it.

If you ask people who have already done it what it’s about, you either get the 1,000-yard stare of the fatigued Marine firing the rocket grenade half-way through the film Apocalypse Now, or you get a very pregnant pause while they think to themselves how to describe it. Then they usually say, ‘it’ll be interesting’. Which doesn’t help.

So, the scenario is basically that a group of managers (this detail will be important) arrive and are introduced to a daily, set timetable of sub-group and plenary sessions, some of which last until the evening, where the topic of discussion is – let’s be honest – somewhat vague. You are presented with the premise that the group (or groups) is to be treated as an organisation, one that sets itself tasks and monitors its own progress. The consultants sit in on the smaller group meetings, but do not say much. If I tell you that this is about all the input you are going to get as a participant, then you will see how for the first couple of days I was pretty much in my “tutor” element. And my fellow group members were in their “manager” element. They were busily trying to second-guess what the pragmatic problem was to be solved, and I was busy trying to work out what the philosophical lesson was to be learnt. By Wednesday it was clear that everyone was at breaking point. I could see that they saw that I had no idea how to set a goal and move to structure, while it was clear to me that… well, actually not much was clear to me at this point. I should add that although there was a task of sorts, the content and conduct of the sessions was more or less up to us. But to say more than that strays either into the territory of sharing what was said (which we agreed not to) or sharing why it was set up this way (which, reader, is up to you to find out by doing it yourself one day).

By Day four my feeling was that this was like Waiting For Godot, which was once famously reviewed as “a theoretical impossibility—a play in which nothing happens, that yet keeps audiences glued to their seats. What’s more, since the second act is a subtly different reprise of the first, he has written a play in which nothing happens, twice”. Of course, a lot was happening – precisely because all the other noise, facades, easy answers, models and trite utterances were inappropriate to the task.

For myself, I have (so far) the following as my conclusions:

1. All’s well that ends well. We pulled something together in the end, though I’m not sure whether what we pulled together wasn’t just the getting to the end of the process.

2. I have a new-found admiration for people who are successful as managers and who have the self-awareness to develop themselves as leaders (whatever that is).

3. On no account must you ever hire me as a manager. Ever. That’s not where I can add much value. I appreciate that you also need structure and you need answers, but I’m just happier in the not-knowing and the asking. When you need that stuff, I’m your man.

4. Leadership, followership, getting things done with and through other people is political. There is no getting around it.

I don’t know how it is for my fellow participants, who all had their own ‘ah-ha’ moments (I mean of the insightful, not the Alan Partridge kind) during the week. I was in a new field of thinking when I delivered two PD workshops in the days after the APL, one in which I had more compassion for the world as seen through the eyes of most of the Henley MBAs. Mind you, not enough compassion completely to stop messing with their heads…

The photo in this post was one taken as part of our “event” at the end of the week. We all set up photos to illustrate a personal learning from the week. In mine, I am placing one set of chess pieces neatly in a row, while keeping one foot firmly among the jumbled pieces of the other side.

I really enjoyed this funny TED talk by New Yorker magazine cartoonist and staffer Bob Mankoff. The point he is making, however, that nothing is funny in and of itself, is a precise one. And it is true for all acts that are communicative or informational, including, of course, management.

We are often convinced that a decision is, in and of itself, good or bad, right or wrong, clever or stupid, but these labels only apply to the relationship the decision has with, and in, its context. It is at this fundamental, epistemological point that we must begin Personal Development, too.

Childish wisdom?

Heathrow 2013

Don’t know why, but these paradoxical thoughts, adapted from Stephen Mitchell’s translation of the Tao Te Ching, really tickle my fancy…

Dark Light
Weak Power
Tarnished Purity
Changeable Steadfastness
Obscure Clarity
Unsophisticated Art
Indifferent Love
Childish Wisdom

It’s not just the presentation of opposites. The deliberate placing together of certain ideas sets them up as contraries, but rather than cancelling each other out, each sheds light on the nature of the other.

William Blake: Illustrations to Milton’s “Paradise Lost”

“Harmless”.

This was the original entry for planet Earth in Douglas Adams’ the Hitchhiker’s Guide to the Galaxy, which was later expanded by the book’s sub-editors in a subsequent edition to… “Mostly harmless”.

It’s great to revise a definition, and a nice way to begin a meandering blog entry.

Every now and again I like to try to rekindle my thoughts regarding the aim of education. I have rather got into the habit of saying only that ‘the aim of education is emancipation’. I’m not sure this is enough. After all, emancipation implies someone else (or someone else’s ideas) from which one has been given freedom. Though I know in many parts of the world that is a real issue, this wasn’t quite what I meant. I had in mind an internally generated aim, not a “release by” but a “release in”, achieved without external reference to anyone (or any thing) else.

So far, the best I’ve managed to come up with is: ‘the aim of education is freedom from comparison’.

This expresses more of what I want for the Henley MBAs: that they should make informed choices not restrained by alignment to the notions defined by past experience or by prediction of future events alone (or, perhaps, at all). For personal development, the aim is freedom from validation, and from uncritical judgement of the opinions of others. It is an act of becoming completely at ease and at one with the world as it actually is. In its unspoken assumption of control over the world, our current pedagogy is very poor at this. For me, “freedom from comparison” is significant because it demands that you know under what system of restraints (i.e. being governed by what you cannot do) your awareness is being limited. Awareness, actually, is the word I’m looking for.

In fact, I think “awareness” could stand as the real aim of education. Awareness subsumes comparison.

How do you get to awareness? (Easy when you know how, huh?) I think awareness is, in some way, being in tune with all forms of living system that demonstrate mental process in their function (Bateson, 1979), but explaining it is not easy with our current mental maps. The greatest barrier to awareness in education is whether or not we are aware of what a context is. Without context, education has no meaning, but meaning is not a thing, it is a pattern (i.e. it has no physical properties or dimensions, so is not to be quantified, objectified or reified in the manner that modern science has envisaged). Meaning carries weight (metaphorically) when it contains coded forms of information of what we can exclude (not what we must include) as alternative possibilities in each case. A red stop-light “tells us” nothing in and of itself. Its meaning is a very complex systemic property of interconnected levels of information (knowledge and structure of the legal system, social conventions on behaviours that align with the legal system, regulated processes of driver instruction and licensing, moral imperatives on behaviours that do not endanger others, etc.). The more such information it carries, the higher the probability of it not occurring just by chance.

All the possible restraints exist for us in nested levels of categories that each contain redundancies (i.e. information of the whole from a part), which mean we can navigate this complex social world without needing to exhaust ourselves with mental processing of every alternative. Systems of restraints are what keep dynamic systems stable over time. Including ‘you’ (as a circuit). Your breathing, for example, works in a comparable way: your ability (for short periods only) to make this process a conscious one is merely an illustration of this whole nesting principle.

Managers carry with them maps of how their organisations work, and these maps contain many taken-for-granteds. We don’t understand this ‘gut feeling’ very well, but it is redundancy that allows educated guesswork on the part of the manager. Redundancy gives that person a better than random chance of ‘filling in the gaps’. The freedom inherent in management education is observed in how leaders conduct themselves and their work, and I think uncovering how these systems of restraints are universal could free their thinking and learning potential. To do this, education must seek news of difference (i.e. where are the limits?). The internal territory contains only homogeneity or redundancy of information, and there is nothing new to be learned there. The individual is involved in the task of locating the boundaries where mistakes may be made in order to learn.
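As a purely illustrative aside (this little sketch, its tiny corpus and its function names are my invention, not Bateson’s), here is redundancy as ‘information of the whole from a part’ in miniature: in English text the letter that follows ‘q’ is almost always ‘u’, so anyone carrying that pattern can fill a gap far better than by blind guessing.

```python
# Toy sketch: redundancy lets a reader "fill in the gaps" better than chance.
# The corpus and function names here are invented for illustration only.
from collections import Counter

corpus = "the quick brown fox quietly queried the quantity of quince jam"

def followers(text, letter):
    """Count which characters follow `letter` in the text."""
    return Counter(b for a, b in zip(text, text[1:]) if a == letter)

counts = followers(corpus, "q")
total = sum(counts.values())
best, n = counts.most_common(1)[0]

print("observed after 'q':", dict(counts))
print(f"best guess '{best}' succeeds {n / total:.0%} of the time, "
      "versus roughly 4% for a blind guess among 26 letters")
```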

Reference

Bateson, G. (1979) Mind and Nature: A Necessary Unity. New York: E. P. Dutton

Ken Bull


It is with great sadness that we learnt this week of the death of the wonderful Ken Bull, known to many on the Henley Flexible MBA from his comprehensive and supportive marker feedback in the PD assignments. Ken was diagnosed with an aggressive cancer last year, and died peacefully on February 6th with family around him in Brighton.

Ken was an incredible person – full of optimism, warmth and humour. He had a long association with Henley as a tutor and worked both as an internal and an external member of staff. He was latterly a personal tutor on the Executive MBA as well. His funeral will be on Friday February 20th.

Inference /ˈɪnf(ə)r(ə)ns/ n.  a conclusion reached on the basis of evidence and reasoning

Blake’s engraving of Chaucer’s Canterbury Tales – metaphors everywhere!

A lot of inference goes on in management education, but I wonder how much of it is rigorous or even does any good. Here are a few thoughts on this topic.

1. Inductive reasoning

Imagine that you are given a map of an inland territory, part of a larger land mass (maybe a continent), and asked to continue drawing the map. If you only use what is already there, you would have to extend by extrapolation to continue the drawing outward. Each time you drew something, sure, you could go and check, and any new data could become part of the further drawing. This is inductive – your test of accuracy is in the form of further observation (trial and error). If you were incapable of learning from this, you would be restricted to the same simple protocols of feedback (that is, you could make corrections to your map but not to your method of map making). Does this mode of inference equal learning how to learn? Not really. Arguably, only in respect of the meta-level skill of getting better at a process of trial and error (i.e. if placed in a different situation that required blind trial and error, your years of map-making like this might have resulted in an increase in the speed of your trial-and-error method). This is often what happens to managers as they acquire skills during their careers.

2. Deductive reasoning

Given the same starting point of having to draw a map from a fragment, you notice that on the partial map there is a river. If you know that water always flows downhill (knowing this doesn’t restrict you to inductive reasoning for map making, though you might not use this knowledge), you could perhaps predict and then draw the likely course of the river into your new map on the basis of other extrapolations. In other words, you draw through a mental process of “if…, then…”.

The accuracy of your map (your prediction, or “then”) is still subject to verification by observation, but is now based on application of a covering rule – it contains a test of a hypothesis. Similarly, you may apply other rules, such as that rivers flow into other rivers and eventually into the sea, and this thought also becomes part of an imaginary map to be tested against experiment. The application of a premise established earlier in time (a priori) and independently of experience is deductive inference. Of course, there is a possibility (sometimes an aim) that a hypothesis is not matched by data. Assuming you can trust the data (i.e. your senses), you now have a route to amendment of either the hypothesis, or of the covering rule that generated the hypothesis. Is this learning to learn? The same argument could be made as for inductive inference: that you get better at applying deductive hypotheses in other contexts, and this is a sort of learning how to learn. Inductive and deductive practices often go hand in hand in practice, and the boundary between them is somewhat arbitrary. In either case, learning is correction of error in terms of a specific response from within a given set of alternatives (context learning), and not correction of error in terms of a change in the set of choices itself.

The problem for deductive reasoning is that the ‘then’ is only as sound as the premises informing the ‘if’ itself. The best ‘ifs’ are those that express fundamental principles, but this is far from easy in managerial situations. The device, or reasoning, that can make the leap from data to theory is called abductive (sometimes retroductive).

3. Abductive reasoning

Ok, you are equipped with a keen eye for observation and a decent education, and you are given the map-making task. How do you proceed? Yes, you could dive in (as many managers think they ought to) and set about solving the problem you’ve been set (whether by trial and error, or by prediction based on rules you have been taught), but neither of these will lead you to higher-level learning. Induction is clear and simple and will suffice for a bit. Deduction is clever and structured, and will work as long as the premise holds up. But neither one produces anything novel. Neither is creative. And neither one leads to deeper understanding of the world as it actually is. For that, you also need abduction.

Abduction is deliberately taking the explanation for one set of phenomena and asserting this also as explanation for the data you have. It’s a more complex and artful form of reasoning as it contains a leap, and crucially demands an understanding of a deeper pattern (i.e. a knowledge of what connects otherwise disparate forms).

Pattern.

This may all sound a bit woolly, but for many great scientists, abduction is how they explore new ground. It is guessing, but educated, informed guessing is a good thing. For example, Albert Einstein and Richard Feynman each used such informed guesswork to bridge gaps from data to theory.

I like to characterise abduction as the use of the logic of metaphor, and few things drive the imagination, or kick-start the generative process of creation, more than the bringing together of two unalike things in order to see how they are alike.
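For what it’s worth, here is a minimal sketch laying the three modes side by side. It is my own toy, not part of the map example as originally told, and the data and function names (observed, known_river and so on) are invented: induction extends the observed trend, deduction applies an a priori rule, and abduction borrows the pattern of a different river altogether.

```python
# Toy contrast of the three modes of inference on the map-drawing example.
# All data and names are hypothetical illustrations.

# A river observed so far on the map fragment: (distance, altitude) pairs.
observed = [(0, 100.0), (1, 96.0), (2, 91.0), (3, 87.0)]

def inductive(points):
    # Induction: extrapolate the trend already drawn; the only test is
    # further observation (trial and error corrects the map, not the method).
    (x0, y0), (x1, y1) = points[-2], points[-1]
    return (x1 + 1, y1 + (y1 - y0))

def deductive(points):
    # Deduction: a covering rule ("water always flows downhill") generates a
    # hypothesis, namely that the next altitude must be lower, to be tested later.
    x, y = points[-1]
    return (x + 1, y - 1.0)  # any value below y is consistent with the rule

def abductive(points, other_river):
    # Abduction: borrow the pattern of a different, better-known river and
    # assert it as the explanation here; a leap across contexts.
    x, y = points[-1]
    characteristic_drop = other_river[-1][1] - other_river[-2][1]
    return (x + 1, y + characteristic_drop)

known_river = [(0, 50.0), (1, 44.0), (2, 38.0)]  # falls 6 m per step

print("inductive :", inductive(observed))               # (4, 83.0)
print("deductive :", deductive(observed))               # (4, 86.0)
print("abductive :", abductive(observed, known_river))  # (4, 81.0)
```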

 

The nature of reality has long puzzled human beings. How do you find a way to talk about (or even think about) the world the way it actually is without getting lost in the obvious limitations placed on your understanding by (for example) language?

As part of this ongoing project I’ve recently been pondering the dialectic. It seems an important word to have a feeling for if you are involved in the learning business or, as is perhaps more accurate, the awareness business. It’s an old word, which is to say that it captures an old idea taken from the ancient Greeks, one that is usually defined as something along the lines of ‘truth arrived at by discussion of two points of view’. Debate, in other words. As a discursive method of learning, it all has a rather Socratic, Platonic or Aristotelian ring to it.

Passed down from the Greeks through Medieval theological hands, there is another, more recent use of the term dialectic that has certainly had a turbulent effect on our modern world, namely the dialectic developed in German philosophy of the 18th and 19th centuries. The best known of these ideas is probably the “thesis-antithesis-synthesis” of dialectic logic as applied to the problem of working out what there is to know. There are several things assumed in the German use of the concept. From the Realism of Kant (whereby the true nature of reality (the Ding an sich) is open to enquiry indirectly through logic and reason), through the counter-poised Idealism of Fichte and Hegel (wherein the true nature of reality is no more or less than phenomenological experience), and on via the various twists and turns of continental philosophy ever since, there are three underlying precepts:

1. That our experience of the world is transient

2. That all is composed of oppositions

3. That change is spiral, and that over time it develops in a direction.

Marx and Engels famously constructed a material dialectic to propose a movement in history and society as a fundamental inevitability. Charles Darwin, and other evolutionary theorists of the time, also allowed for a world of contradictory dualities and for there to be progress in nature. Thus, the survival of the fittest in speciation becomes synonymous with the perpetuation of culture in human society. For German dialectics, this is the bridging idea between the social and natural sciences.

This is not the whole story, however.

Where German dialectics proposes contradictions, or dualisms, English dialectics proposes contraries. Unlike a contradiction, which is a negation and is destructive, contraries can and in fact must co-exist (rather like the poles of a magnet) in order for us to be able to draw distinctions and differences. And, as Harries-Jones (1986) points out, “recognition of contraries does not cleave a unity.” The ‘thesis-antithesis’ viewpoint, by implication, does, and in so doing creates something new and better.

But reality is, by definition, a unity, so it’s important to have a coherent philosophical position in tune with that. Not to be in tune with nature is not to understand how living systems operate, and not to understand how living systems operate (but to possess the wherewithal to unbalance the balance between you and nature) is a surefire recipe for disaster, sooner or later.

What’s more, the idea that differences can co-exist rather than battle it out for survival feels a lot more positive and ecological. It is the task of the observer to find a better method of explaining how this is so. For example, Gregory Bateson proposed a method of ‘double description’ which was in line with the English dialectic, where alternatives are not oppositions but simply different, mutual features of an inseparability. Science (rigour, description) and Art (imagination, metaphor) are thus understood not as negations of each other but as contraries (sources of difference that enable double description). They are intrinsically connected, co-existing features of the same thing.

This has the distinct advantage of avoiding dualism, but more importantly creates a bridge between how we think about the world and how the world actually is. This is a new field of thinking, and one that may inform a more enlightened discussion of the purpose of management.

Reference

Harries-Jones, P. (1986) ‘Mapping’, Continuing the Conversation: A Newsletter on the Ideas of Gregory Bateson, No. 5, pp. 5–7

Values for money?


Personal Development workshops on the MBA run throughout the year, and across several locations, but they also tend to cluster; the same title seems to run several times in close succession. We have seasons of Starter Workshops, and one has just finished.  Now it’s all about the second in a sequence of four workshops – Development Plans. I’ve been thinking about values in the past couple of weeks, as this is one of the subjects featured in this workshop.

Trying to move my own thinking on, I’ve been reflecting on how to say something new about this. When I joined Henley, discussion of values was, I think (as at most Business Schools), restricted and narrow. The view – albeit the dominant one in management education – defined values as enduring beliefs rooted in reason and represented in lists of nouns. This is, in fact, now the discourse used by corporations as well as individuals, and is what most of us think of when we think about what our own values are.

When I took over, I wanted to expand on this, so I first tried to provide an alternative interpretation of values by proposing that you could equally regard them as pre-linguistic and not arrived at through reason (just “there”, which really puts the cat among the pigeons when you realise that on an MBA doing anything without resorting to words is difficult). By asserting that our values are somehow pre-existing and collectively generated ideas rather than just concepts, it becomes possible to see the limits of a strictly linguistic basis. More recently I’ve been trying to take that one step further.

What are the base assumptions behind our working definition of values?

When asked, most post-experience MBA students will volunteer phrases such as “personal beliefs”, “guides to ethical behaviour”, “collective goals”, “codes of conduct to guide decisions and choices”, and “statements of fundamental purpose”, to describe what they mean. Values are seen as expressions of drive, as motivation and as a sort of enabler of choice (or limiter to choice) – rather like a set of rules. But people are often confused as to whether values are ‘things’ inside individuals, or ‘things’ owned in groups and societies. It feels like both, a bit. Discussion sometimes stretches to whether values change (either for individuals, or in societies) over time.

But I wonder whether we need to step back and look at the assumptions behind these impressions and beliefs about values. For example, have we always interpreted values in the same way, or is this a recent phenomenon? Can our definition of history help explain why we tend to invoke values in the way we do (that is, as purposeful, definable, rational and concrete)?

Here are a few assumptions that I think our culture makes:

1. Human society is essentially moving in a direction of ever-increasing sophistication and refinement. Change is directional, and there are desirable ends to which we, as a species, are moving. This view seems to underpin not just the theistic religions of the west but the trajectory of science as well. Business, by extension, is purposeful in its very nature (i.e. we are by our nature drawn to grow and evolve toward something).

2. Human societies are organised in such a way that purpose is growth. Expansion is progress. More.

3. Such growth, development or purposeful activity is a consequence of examination of (a comparison with) the past. The past is considered real, reconstructed as history. Events constitute more than a chronicle; they are concrete in time. Further, they are as concrete in the future as they are in the past. Events exhibit trends.

4. Therefore, our societal goals and the values that explain them are purposeful, time-bound and linear. Our goals are growth oriented and are powered by the scientific understanding of resource usage.

5. Values are seen as being made of the same ‘stuff’ as other forms of knowledge.

What would be an alternative view, one that could help us surface these assumptions and thereby clarify our thinking? It could, presumably, include the following elements (which emerge under the tutelage of my favourite philosopher, Alan Watts):

1. Human society is seen as what it is in the present, not the future or the past. There is no temporal progression toward a more complete or sophisticated future.

2. Events are recorded, but as chronicles, not stories. Other than a sort of cyclical and poetic quality, the narrative has no particular pattern of meaning.

3. Societies are aimed at maintaining a balance with or in nature, not a conquest of it, as the key to sustainable community.

4. Societal goals that are valued are those that celebrate this relational world and our relationship with it.

These are just some thoughts. They do need further development, but I wanted to get them down.

I’ve been neglecting my blog all summer. The weight of the growing space between my posts has only made moving to break the inertia more difficult. My muse for this post is Nick Cave.

I adore Nick Cave. I admit it.

With its etymological overtones of worship, supplication and confession, ‘adoration’ is not too strong a word. Not only does it describe Cave’s approach to his art and the well-spring of inspiration and imagery in much of the content of that art, but it sums up the feelings that his art (especially in live performance) engenders in others. I’ve seen the Bad Seeds play four times (all in Hungary) over the years and they are – far and away – the best live band I’ve come across. Thanks to the generosity and planning of my younger daughter during a festival where the band were playing, I have been the proud owner of a dedication written in a copy of Cave’s book The Death of Bunny Munro (see photo) since the occasion of my 50th birthday. I’d frame it, but one must keep books where they can be touched, flipped through and occasionally read.

Then this week I went to see 20,000 Days on Earth, a new film about Nick Cave and his music/identity/past/present, beautifully made by (and for) Cave fans. The film contains documentary elements, but is more a creative act than a chronicle (although the acts of archiving, chronicling and, in particular, remembering are central themes). It follows Cave and several band members while they write, rehearse and record the album Push the Sky Away, but really this is a look at how Cave’s mind works and how different types of collaboration help feed his creativity. There are some beautiful lines in his voice-overs and a few devices such as the conversation with a therapist and another with some archivists. Some of the most interesting moments come about in conversations he has with characters from his past as he drives around Brighton in his Jaguar. These include Ray Winstone (on performance and acting), Blixa Bargeld (on the Berlin days) and Kylie (on that duet). And then there are snippets of songs at rehearsal and in concert, ending with just about the whole of Jubilee Street, a crescendo performed at the Sydney Opera House.

Cave was born on September 22nd 1957 (happy birthday to him!) and has been involved in music, art and literature (but mostly music) for many decades. I was aware of, but not attracted to, the Birthday Party in my early 20s in London, so my adoration begins during the late 1980s with the Bad Seeds and has remained steadfastly devout ever since. But I think you have to see Nick Cave perform live, and then you have to decide whether that is what moves you.

Mapping


I’m taking a few days, belatedly, to catch up on outstanding work. One of the things I’ve been putting off is the updating and housekeeping necessary for the course materials we use in our MBA Starter workshops.

This three-day event is divided between the PD stuff – which I tend to improvise on as I go – and seven Study Skills sessions. These sessions have been authored, co-authored, delivered and developed by many experienced hands over the last year or two, so we need to learn from the repeated delivery. As luck would have it, I’m the Module Convenor, so it’s my job to tidy things up while still giving colleagues the freedom to deliver the aims of the sessions in their own voice.

I’ve noticed that when you revisit some details they can reveal connections that were somehow overlooked the first time round. One such example is the relationship between “concept” (or construct), “framework”, “model” and “theory”. These form an important part of the language of study and assessment at master’s level, so we have always had a session to introduce them.

It occurs to me that there is a sequence in the four:

  • Concept – Individual items that represent abstract ideas, or mental objects. Our ability to conceptualise is almost limitless. Concepts are sometimes seen as the building blocks of theory. Concepts are driven by our epistemology (way of knowing).

  • Framework – The arrangement of concepts in a taxonomy or typology (i.e. a classification of parts) where the order does not affect the nature of the taxonomy (PESTLE is a good example). Frameworks have a fairly loose relationship with theory but can be very effective in narrowing down the mass of data and possibilities to manageable chunks. Frameworks are driven by the same epistemology as concepts (after all, a framework is also a concept), but are always at one level of abstraction away from the concepts they contain.

  • Model – The arrangement of concepts where the order or position does make a difference. Models can show cause-and-effect relationships, as well as before-and-after ones. The aim of a model is to achieve accurate description of those relationships. Models may be generated by theory, or may be a step on the road (a guess, in other words) to the establishment of theory. MBAs are attracted to models in order to apply other people’s thinking to a given problem in hand (short cuts). Models are driven by expediency.

  • Theory – The aim of theory is to explain. Theory tries to map data onto an underlying tautology in such a way that the steps between them could not be in doubt. Most work in science is the search, by various means of inference, for more complete theory. A better theory is one that explains more than its predecessor. MBAs are usually not attracted to theory until it’s too late! Theory is driven by curiosity.

 

 
