Saturday, November 04, 2006

Warning: Excited Rhetoric based on Unqualified Assumptions

This long and soupy waffle is going to be mostly about the relationship between what they call ‘art’ and the Principle of Charity. With all the broadsheets in the country busy wiping our minds blank with their recommendations and tips for surviving the Biennale I have to increasingly reassure myself that this is standard practice in case I give in to the temptation to hit my head very hard against the wall. It’s not that the reviews are at all bad – far from it – but if the connoisseur-wannabe were to hold the word of the Straits Times as some kind of holy gold he won’t bother to make any alternative interpretations of the artwork when he does see it. As it is I am assuming that the articles are there so that the layman who is ordinarily unable to see anything particularly aesthetic in what is called ‘art’ any more will find something to appreciate, which is infinitely better than not understanding any part of it at all.

Even if the person I have in my mind doesn’t take the Straits Times as the sacred word as I suggest, the perception of the art he will see might well be warped in favour of the article he read. Although something similar happens with movie reviews, a movie is a thoroughly immersive experience – all five senses and then some – in a way an arty piece may not be. Some of the latter can be abstract at best. People can like DoA the Movie knowing that it’s really crap (really!) but they can’t do the same for art; what the critics say invariably affects common perception of ‘artwork’ because it has more or less achieved some coveted status as ‘highbrow’, and largely requires interpreters who pare it down for the masses to appreciate. And the writers at the newspapers are doing just that, i.e. doing their job. But! Think of it! A few hundred people passing an installation either not looking at it because they’re not interested (that’s fine), and a few hundred thousand passing an installation looking at it and THINKING EXACTLY THE SAME THING because they’re all remembering the SAME DAMN ARTICLE!

I wouldn’t mind the possibility of this quite so much if the dominant broadsheet weren’t quite so… dominant. Although it does provide some kind of universal, most general guide to the scene it cannot say everything in two pages, but people tend to assume it can because, of course, it is a newspaper and should tell the Citizen everything he needs to know… especially if he really is trying to understand… but it’s probably like running repeatedly into your mounting block.

I might as well confess it here and now. I work and write and speak and live in a space where the Principle of Charity is applied in the most liberal doses, and if I am misunderstood it will largely be by wilful choice, and if I misunderstand it will be similarly so in many cases. If I misunderstand involuntarily and it is picked up on, a point of view that I have missed will in all likelihood be explained with due care, with minimal impatience or ire. However much I wish I could be so I cannot see from the sole perspective but for the bedrock principles I refuse to really discuss here. You may not kill me when I say that I’m not going to judge the hijackers of the planes on September the Eleventh until I know every bloody thing that made them do it. (Although being a biased individual I CAN say that Hitler, at least, was a real bastard.) Put me in a conversation with someone beholden to the principle that everything we agree upon is certain to be right and I shall be afraid to say anything at all.

So it is with a homing instinct for the weird and wonderful that the weird and wonderful is rationalised, described, rationalised again, digested, and messed around with marvellous alchemical thought processes that aren’t marvellously alchemical until it becomes boring: ‘Entrepreneurial’, ‘creative’, hear my ears prick in Singing Singapore.

Most of what I’ve written so far is excited rhetoric based on unqualified assumptions, but use the Principle of Charity –

How can something be weird and wonderful if you think you understand it?

And would this be how the Principle of Charity is essential to art? I think so. I don’t care if you don’t. I’m not putting my argument down in standard form, and if you’ve read the earlier bits you should know precisely why. Is this a sole perspective? Should I pick a sole perspective after seeing more? Is this a principle by itself? Sometimes it’s more fun.

Just don’t call it the ‘Principle of Charity’ in this context. It’s not ‘charity’ any more. It’s paths, branches, mystery, nodes, air and aether. It’s night and nova. On things of this sort it doesn’t have to be coherent because that’s what it is; I don’t understand what I just said because I can’t. Nobody will understand everything, but try anyway.

Monday, September 25, 2006

Deus et Aemaeth

Dear Ms Lai, if you are reading this now, please beware the language issues and political-religious midgewater. We were supposed to clean up our blogs before they went public, but this needed to be honest.



I’ve always, always, HATED puppets.

Even now as I declare it I half harbour a superstitious fear that they will hear, and they will revenge. But the fear stays between my heart and my stomach, somewhere in my gall bladder, simmering in yellow juice until it is ejected away with the urea and the saccharine.

When I was small I must confess that I wanted a doll very badly, because I had none: the best my parents had scraped up for me was – if I remember – Lego. We played with Lego day and night and built houses with the bricks and hopped the little yellow figurines around inside, their little Lego worlds accounting for squares worth forty-by-forty smaller squares and the number of bridges we could build between them. We disassembled little Lego people to see how the joints worked and switched the heads around so that a head with a little synthetic ponytail would end up on a muscle-bound torso printed with synthetic tattoos. We played little games with the little Lego people I can never remember now, but I don’t think I ever want to watch a small child play at Lego. Lego is cruel.

My first doll was a second-hand Barbie. In the days when your peers routinely expected a Cabbage Patch monstrosity or half-sized automobiles (for Ken to pick Barbie up in) for your birthday, and took them to kindergarten to show them off, this was nothing next to a cardinal sin on the part of my parents, who really didn’t approve of dolls (for reasons concerning ‘creative toys’, but heck about that!). I think they were hoping to break me out of my Barbie fetish. In any case I didn’t care – I was fascinated. I broke a comb in her hopelessly dirty tangled mane and felt at the hard little bumps on her chest, wondering why my mum was soft. Then I made her lord it over all the other toys. Then my younger brother joined in, and the games evolved from tea parties to criminal interrogations. If children are cruel to Lego they should never be allowed anywhere near a doll. Susan Hill of the didactic morals on parental guidance evidently hasn’t remembered much of her own balanced childhood.

The trauma started before I twisted the little Barbie’s head off – by accident, I swear! – but didn’t deter me from asking for another doll, a brand-new one this time, because I knew something was wrong with the one I had – which I’d attributed to its second-hand-ness. Of course that didn’t solve what was really wrong, which stayed at large and elusive until it came back years after I’d completely forgotten about it. My mother wouldn’t buy me the one I wanted, in a blue-purple taffeta ball dress, for some lame reason which I dismissed out of hand but complied with because one doll was better than pissing her off and getting no doll. (I thought.) So I got my new doll. It came in wholesome pink and white and had legs so varnished that they shone. This new alpha female had a catfight with the old and loved-off model the moment she came home: the old one lost on account of her head lying somewhere in the depths of the toy-box, under an avalanche’s worth of discarded Lego. Then in a year I entered Primary Four and suddenly I had no time any more for toys. The crushing homework, the first computer game that ever saw the interior of my family’s apartment and new classroom politics put toys out of my mind for ever. I never even noticed when my mother got rid of them – but she must have, because they’re not anywhere any more. In fact if I came across my old doll’s head by itself on the floor now I’d scream. And never stop.

I first saw the Sims when I was twelve. To show off the game functions my friend made a whole family of Sims, built them a house, filled it with a fireplace and flammable stuff, put them in and removed the door. So all the idiots died. (None of them even thought of opening a window to jump out.) What is relevant about this scenario is that we were manipulating them and their brief lives, yes, in what must seem a more godlike way, made real through the screen and the then-excellent graphics. But it still felt less real than twisting the head off a doll. I mean it. After all, I had two dolls, and. And.

It took me a decade to realise that neither doll had a name. They were just 'The Barbie'. Not a real name, just a brand, something poor Mattel's daughter will have to live down.

Why the sudden soliloquy? – I’ve just watched Ghost in the Shell: Innocence. It is not a good anime to watch just a week before your first final exam, but here I go again, tempting fate. (Admittedly, the very first final exam is KI of all things, but that's irrelevant considering how all the other subjects are piled up behind it.) It didn’t put new questions into my mind on the subject of dolls – not hardly – it merely reawakened old ones… I wish I had a guardian angel to warn me whenever I skid past the sign on the rink that says ‘thin ice’. My mind is like an ice floe: it has safe spots, weak spots, spots with holes in them for fishing and spots on which the smallest and most unsuspecting mosquito could sit and probably shatter.

Why am I talking about dolls? – because dolls are the most anthropomorphic – personifications (ha!) – of people I have at hand. Because we train children to grow up with dolls to inflict with our desires, our bestiality and most faithful fear: we teach them to be children at dolls, because dolls can’t fight back. Puppets can’t fight back. A puppet sitting among its tangled strings, limp and hooked in unnatural ways, is a terrifying sight. A puppet being openly manipulated across the stage in a show is slightly more bearable only mostly because the terror is hidden behind the story. A puppet being openly manipulated by a hand to do horrible things to itself is fodder for a recurring nightmare. Shit, now that I think of it I have to hope I don’t actually dream about it, because if I do at this critical period I am going to start screaming in the middle of my math promos. And never stop.

Why are all religions so fanatically sacrificial? In multi-cultural Singapore I see masochism everywhere I walk. People staple themselves all over with huge long pins and carry pretty portable pavilions on them which must weigh like hell; people walk on fire; people slaughter goats ceremoniously; people are nailed onto crosses. In fact in the Month of the Hungry Ghost it is said that people used to sacrifice a pair of young children to the fire to keep the deceased company, so their ancestors wouldn’t get lonely. Now they are replaced with effigies with cheap papier clothes and cheap gold leaf and rouged cheeks that rub your fingers red, and black eyes – black eyes that put the fear of death in you, if they are all you are going to receive after you die – and Mamoru Oshii must have seen a ceremony like that as well, because he put it in his movie. He knows. He has the benefit of age and experience over me and so he is overwhelming; at the same time, given the proof of his existence, I can’t help but believe that there are other people around the world who feel the same.

And this is why this is on my blog. You can slap me with a jail term for bureaucratically-defined general disillusionment. I am telling my truth.

Fake-drawn children’s faces ashing, curling in fantastic curlicues from the edges in, revealing their paper vulnerability before they are altogether burnt: in the roaring wavering of flame their eaten bodies seem to writhe, and this is why they are consigned to fire of all things – through the fire we see the way to heck. This is the way in which they travel; the flakes in the sky are merely the flesh.

This is what I imagine in my head if I ever voice out my objections.


Me: *$&#@!

Mother: Shhh! Be quiet! (cue cursory threats of impending pain.) Be quiet! (giving up.) Shut up!

Me: But it’s -ing horrible! Stop it!

Mother: It’s traditional!

Me: I want to go home!

Mother: It’s not over yet! Show some respect to your grandmother!

Me: BUT THEY’RE THROWING KIDS INTO THE FIRE!

Mother: They’re not real kids! They’re ten bucks from the store down the road! They were made in a factory!



You have no idea how horrified I would be by the time I get to this part. Unless you do.


--
A quote from the movie: “We weep for a bird's cry, but not for a fish's blood. Blessed are those with voice. If the dolls also had voices, no doubt they would have screamed: ‘I didn't want to become human.’”
--



And a poem from me.



--
BARBIE

When I was midway little I possessed the one doll.
When you have only the one there is no need for a name.
She had too many shoes but one dress: the one in which she came,
Her long hair blonde and ready-brushed to a frenzied cotton-boll,
With silky spider legs rose-tanned, with breasts that came to points,
Cool cerise lipstick chipping off a bleached, clenched smile,
Embalmed in plastic brilliance that made it hard to revile
The fact that she was a little… faulty at the joints.

I’d lavish hours on her alone regardless of the time,
In toy-gatherings of parties heaped with plastic foods sublime:
Shove gorgeous silver dancing shoes upon her stilted feet
And dream up her Prince Charming just to make the waltz complete –
And when I twisted her head off (by accident!) and couldn’t fix her after all,
She was packed in separate boxes, and hid behind the wall.

--

Monday, September 18, 2006

KI in a year

I would like to shout this out, very loudly
I have evolved

There.

I used to be a semi-empiricist, with a generous dose of naïve realism thrown in. Now I rather think it is an impossible combination. It is difficult not to be a naïve realist: on some level all of us are naïve realists, so it’s not really fair to count that in. Either all days are holy or none are.

I believe I subscribe to existentialism.

It is a very sneaky way to announce that I do not really need to justify my belief, because to be existentialist is to deviate into any form of belief in which one can flourish – and therefore not have to justify the belief. Some things are best left to that smoggy netherwhere where things can either stay vaguely undescribed or be told in a series of bestsellers, where the money makes it worth the trouble.

Therefore it goes without saying that I like Søren Kierkegaard better than I like John Locke or even that idiot George Berkeley. And you can see that I have grown aggressively opinionated too. Also I throw names and philosophik around like so many authoritative gems: you see I have learnt to cheat.

This cheating has become a growing necessity in the space of my mind because the more I read, the more I become aware of how little I plumb the depths of each famous philosopher. Most of these dead white men were bloody interesting and I have had so little time to even glance at them, so I merely skim the surface of each for my meagre purposes – assignments and all that. The worst of it is that my limited memory makes it difficult for me to remember even the little I have read. To know the basic concepts and use the thinker’s name in reference to them is, I am told, very welcome in examinations; yet on several levels my conscience refuses to dislodge it from the impression of being an act of dishonesty. Is this dishonesty unavoidable? This might put into perspective some of my essays throughout the year, in which I semi-consciously make it policy never to drop names if possible (although I have to confess that in some cases the temptation to mention Descartes swam strong).

And yet I have to thank KI for opening up my mind to these foreign names, foreign thoughts and foreign concepts. Descartes and Darwin were as meaningless and as distant as the stars before I was made to be introduced to what they were about: before that, all I did was glance briefly at them from afar and forget them again instantly. I was never interested in the development of astronomy or plate tectonics or washing machines until I ran into the great huge block of ignorance revealed by the roving torchlight of KI. Aristophanes, John Wallace, the Book of Kells, synaesthesia and dadaism were discovered along the way: I was merely dragged frantically in the dust by a great mad desire to know more – irresistible as the tides, barely confounded by the restrictions of time and circumstance. How glorious. How mundane. How frustrating. And I remember so little of what I read.

Which is probably why most of what I do retain comes from discussion. I am fortunate enough to belong to a circle of close friends who all take KI: we gather regularly (usually on weekends) to mug, jaw, eat and argue. And we talk about things: the variability and reliability of personality typology, the fundamentals of poetry appreciation, quantum physics (usually this involves two particular people explaining things to the rest of us, who stare gormlessly and try to understand), popular culture. We recommend books to each other, compare notes, thrash out essay outlines, and get nothing done. One or two prominent characters might get torn to shreds in a perfunctory comment. Small acts of violence might take place as querulous sensibilities disagree. But it is always fun, and it is always enlightening. Although the best discussions more commonly take place between two or three people only – this usually happens at least once a week, compared to the big huge group discussions, which are really rather rare – I have become increasingly accustomed to finding knowledge and inquiry in everything, and to exist to look at the world through jaded eyes wearing rose-tinted glasses is the best thing about being.

Essay on literature

‘Discuss critically the view that literary heritage provides the most effective way of sharing knowledge in a society.’



The definition of ‘literature’ itself is contentious, with the line strangely blurred around the edges. Most agree with the Princeton online dictionary when it defines ‘literature’ to be ‘creative writing of recognised artistic value’: mainly works of fiction, or non-fiction pertaining to matters of interest to the public. Most of the examples found from a quick internet search listed plays, poems and ‘anything of the written word’: literature today, indeed, is often alluded to only in such narrow terms. However, if we take the etymological root of this word – according to Wikipedia, literally meaning ‘acquaintance with letters’, deriving from the Latin ‘littera’ (“an individual written character”) – as the basis of this discussion we may well presume ‘literature’ to mean the written repository of knowledge available to all fields of inquiry, from medical to economic to the most deus ex machina fantasy novel in living history. As if that were not enough, Wikipedia goes on to elaborate on the inclusion of oral tradition as a form of literature despite it not being, strictly, in print. After all, the first recorded stories came from oral tradition before a version of them was written down. We may still come to consider ‘literary heritage’ everything of words passed down to us from our ancestors, be it in writing or otherwise.

Literary heritage by the most popular definition, however, is usually reserved for the beloved ‘Great Books’, classics that endured over time and evolved into canon in the literary world (official or otherwise). These range from works of fiction such as ‘Pride and Prejudice’ (which remarks on society in its time) to Descartes’ Meditations (on philosophy), to the teachings or records or theses of Plato and Pliny or Ptolemy. Each has gained importance and dropped from favour in its time, or still reigns in importance today, relative to its relevance to the world of today. In this case, everything written on human nature is bound to remain, because human nature will always remain essential and integral as long as human societies exist: Jane Austen and Chaucer are still read and appreciated in this context. Even so, meanings might be veiled in symbolism or subtlety to avoid complications, persecution or merely for realism’s sake, and remain there to be interpreted as the reader wishes. Knowledge is shared in that the writer offers the situation for the reader’s consideration: however, knowledge is not shared efficiently in that many alternative interpretations of one phrase might be produced as the reader attempts to tease out meaning through the framework or language. And yet some might argue that this is the virtue of works of fiction and of language, and that multiple meanings of one thing may be derived – all of them true – thereby proving the efficiency of language in conveying a set of ideas simultaneously.

And yet relying on the audience to decipher meanings presents problems of efficiency: in many cases, this ‘sharing of knowledge’ through conveying meanings depends much on the audience itself. For many, for example, ‘Twelfth Night’ is categorised as a comedy, and many of Shakespeare’s time regarded it as such. A reader of today might question its status as a comedy and, if he were a literature student, question a few parts that do not appear quite so funny by today’s standards. A latter-day scholar of Shakespeare may speak authoritatively of the dark side of the glitzy, deus ex machina world of Twelfth Night and Shakespeare’s hidden messages left throughout the text on class stratification, women and other pertinent issues. The audience sees things depending on what they are looking for, and this is what impedes the sharing of information – the audience sees exactly what it wants to see, and their level of satisfaction with the meaning inherent in the text depends on how deep they are inclined to dive.

Having said that, I go on now to question the integrity of literature as an effective communication of the artist’s intended ideas. Just how much can we consider something to be ‘literature’? Ideally, as mentioned in a website for brochure design (in which a glossary is included), literature should be something ‘in which compositional excellence and advancement in the art of writing are higher priorities than are considerations of profit or commercial appeal.’ And yet, as proved by many of the writers out in the wild whose main purpose in writing is to earn a living, and to the very core of accepted contemporary literary canon, the Bard himself – who made his plays palatable to the public in order to keep eating – literature panders to the public. If … even as we can clearly see that Shakespeare managed to juggle between his artistic integrity and the need for a crowd-pleaser so well that we still marvel at the insights of his works today, we can only assume its use to us in the context of the twenty-first century. Contemporary knowledge tends to have to be cautious of the social winds in which it works: literary heritage only works over time as each generation of equal bigots wither in their turn, and the next sees with differently-prejudiced eyes. Therefore the conveyance of knowledge from writer to reader or watcher or listener is not always unimpaired, and perhaps vital information could be missed.

More questionable is the definition of literary heritage: if it is to include all that has been written in the past and passed down, surely it might include texts of the less-than-usual stuff, such as perhaps fifteenth-century porn: can such books be considered, in the common run, literary heritage? If not for the conveyance of ideas (some, as can definitely be expected, for human nature, as said earlier, is everywhere), they do still otherwise provide knowledge in the form of their existence through the (social or moral) context in which they existed, but sometimes very little further. Their value, therefore, lies less in the ‘knowledge’ that is written, and more in serving as a cross-reference to the culture of their time. And again, in this case, this knowledge cannot be acquired directly, but through interpretation – therefore reinforcing the dubiousness of this text as a truly efficient way of sharing knowledge.

Interpretation is particularly important as each occurs through communication styles and contexts that may have been obvious in the past but have now become obsolete. Often, centuries-old texts – with language being as subject to evolution as social norms are – are incomprehensible to the modern-day reader, and present even skilled scholars with some difficulty as to the translation of meaning and mood, on top of all the cultural references of the text’s time which, despite probably being assumed as common knowledge then, are in all probability completely mystifying to the modern sensibility. Related to this is the inevitable tendency for what the ancients regarded as common sense (e.g. the world is flat) to be seen through our eyes as romantic (or unenlightened) fiction. Such barriers tend to hinder the efficiency of communications between ‘literary heritage’ and a society which contrives to discover knowledge from it, partly because the man on the street in a first-world country is brought up to take these things for granted, and the man on the street in a third-world country will most possibly (being likely to be uneducated) take what is said in the text for granted. And, with both of these men wrapped up in the daily business of life, many would not consider the ‘knowledge’ of literary heritage imperative knowledge: little interest is there, especially when the texts themselves remain mostly indecipherable and their context, quite literally, ‘out of this world’. Again the population relies on the few middlemen in between to take care of the sifting and judging of ‘useful’ information from the ‘erroneous’: efficiency in this medium as a mode of sharing knowledge is reduced to pearls of wisdom doled out by a few.

However, literary heritage is useful as an asset for communal identities, an efficient tool in which a group of otherwise unrelated people may find an anchor and a reason to stay together: histories and literary works serve as a point of pride among cultures and as a kind of advertisement of their greatness and shared uniqueness, which creates a sense of belonging among individuals and perhaps increases their commitment to its preservation. Given the ongoing prevalence of this in the face of globalisation, one must concede that the sharing of information from these literary texts (for every Indian child to know the Ramayana, or every Chinese able to recite the Tang poems) has been efficient, to say the least, in the success literary heritage enjoys as a champion of culture.

With globalisation occurring throughout the world, literary heritage has also become a way of bridging communities – each with its own sets of beliefs and words that are shared in a sort of exchange, with translations (such as, for example, the works of Honore de Balzac into English) becoming common and widely available to the masses. Societies may plunder each other’s caches of literary heritage, bringing the term into a general use for all literary heritage independent of its belonging to any one culture – which, in a global village, allows the extremely efficient sharing of knowledge across societies, within a society.

In fact, if we are to see ‘literary heritage’ in the context it has held for the whole of this essay – literary heritage both in oral tradition and written text – it is inevitable that it should be the most efficient way of sharing ideas and knowledge of the past: this was what words were invented for, after all: to communicate ideas, and to repeat ideas that have been communicated so that they might be referred to and remembered. Moreover, literary knowledge provides links between past and present; it allows those who access enough of it to see precedents, trends, and from there predict patterns of behaviour, making it more than relevant to today’s world.

However, this ‘literary heritage’ cannot maintain the same position in the context of the modern world, when the information involves current events: such knowledge, although a part of history (because history is made constantly), cannot be considered ‘heritage’ in the same cultural-societal context, and such information – war news, the price of lobsters in the market, inspirational stories – tends to be of vital importance to its readers: too important to be disregarded in the light of sharing knowledge. With this void to fill, the less traditional mediums take over: the newspaper, the television, and the internet, all of them ready and able to tackle current problems in current contexts. Although their reliability may also be disputed, their means of ‘sharing’ are efficient, and readers have no choice but to take the knowledge they offer in the context in which it is read, at least until they have the benefit of hindsight in the wake of later events.

These two forms of communication – literary heritage and current media – deal with different varieties of information and should be considered distinct and separate: therefore it still holds that while literary heritage does not provide the most effective way of sharing knowledge in society overall, it remains so for sharing knowledge that applies to the past, across cultures, and evolution of the society itself.




http://wordnet.princeton.edu/perl/webwn?s=literature
http://www.brochure-design.com/brochure-design-publishing-terms.html
http://en.wikipedia.org/wiki/Literature

Essay on mathematics

‘There is nothing more objective than the laws of mathematics.’ Discuss, making reference to the mathematical and at least one other mode of inquiry to illustrate your arguments.


It must be established first and foremost that if the laws of mathematics cannot be in themselves any more than a set of absolutes then it should have no bearing on their objectivity when taken in this context, and that the objectivity of mathematical inquiry as a mode of acquiring knowledge depends more on the application of the mathematical laws than on the laws themselves. Mathematical laws exist cohesively in a body of beliefs through which a framework is built for the intellect: no law accepted as canon within this sphere can contradict another or, within any set of circumstances in which they apply, result in an anomalous answer. Dealing with objectivity in concepts such as those of mathematical laws at least seems to be simple because mathematics exists in a language without aberration or equivocation: every symbol is interpreted in one way only, every possible gap plugged before a new equation is allowed to take its place among the pantheon.

There are several paradoxes, however, that seem to undermine this. Below is an example:

Let a = b

a² = ab
a² + a² = a² + ab
2a² = a² + ab
2a² – 2ab = a² + ab – 2ab
2a² – 2ab = a² – ab
2(a² – ab) = 1(a² – ab)

Divide both sides by (a² – ab)

THEREFORE 2 = 1



It is difficult but not impossible to identify the flaw in the ‘argument’ presented in this series of equations, but it can be clearly seen that the conclusion is a blatant contradiction of its premises and of the most basic assumption of mathematical law: that every digit is a distinct and separate concept that cannot be equal to something other than itself. This law bends itself for algebraic representations such as ‘a’ and ‘b’ because they denote unknowns, but digits are absolutes and can – possibly? – be considered laws in mathematics themselves. Without this law in particular there can be no mathematics. Therefore, it is the ‘argument’ that is not valid. If this progression of beliefs is invalid, can it even be considered objective? This proves that objectivity in mathematical knowledge exists not in the laws themselves – since they act as the immutable framework in which mathematics runs, and the removal of any one law will (probably) lead to the collapse of the whole construct – but in the series and method in which they are applied.
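For the record, the flaw is the division step: since a = b, the shared factor (a² – ab) equals a² – a² = 0, and dividing both sides by zero is what lets 2 ‘equal’ 1. A minimal sketch in Python (my own illustration, not part of the original essay) makes the hidden zero explicit:

```python
# The 2 = 1 "proof" hides a division by zero. With a = b,
# the factor (a^2 - ab) that both sides share is exactly zero,
# so the final "divide both sides" step is illegal.
a = b = 3  # any value works, as long as a equals b

factor = a**2 - a*b          # a^2 - ab
print(factor)                # 0

lhs = 2 * factor             # 2(a^2 - ab)
rhs = 1 * factor             # 1(a^2 - ab)
print(lhs == rhs)            # True - but only because 0 == 0

try:
    print(lhs / factor)      # the step that "proves" 2 = 1
except ZeroDivisionError:
    print("cannot divide by (a^2 - ab): it is zero")
```

The equalities hold right up to the division; the contradiction is manufactured by the one operation mathematics forbids.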

Otherwise, mathematical laws can be considered objective because they conform to physical evidence in the real world. Anyone who cuts up a triangle of any orientation and aligns the three original angles of the whole together will find that they join up to form a straight line. Pythagoras’ theorem can be demonstrated using sticks of different lengths in various compositions. And yet using this as a benchmark for objectivity is no better than saying that mathematical laws are as objective as empirical evidence, especially when empirical evidence – which becomes subjective as a result of faulty reception, error and the influence of prior prejudices – is considered less objective than mathematical evidence. Even through description does mathematical law become subjective, as this second-hand understanding has been coloured by the medium through which it has passed and, even worse, has to be empirically experienced before it can be processed. In this it can be seen that if mathematical laws are to be considered objective at all, they may have to be considered so in the concept of concepts only – outside our immediate comprehension. All that can be agreed upon is that the principle of mathematics as a set of relations is objective.

This does not mean, however, that mathematical laws can be considered the most objective set of relations, particularly because they apply so little to our immediate experiences and cannot be linked to other abstract notions such as sensation, life or death. What makes mathematics and its laws so objective and reliable are its specific notations and limited functions. In such cases mathematical laws – or even mathematics as a whole – can provide technicalities, statistics and calibrations which are, on their own, objective, but cannot explain the nature of a thing such as experience or memory. And, because looking at the parts does not make a whole, these ‘objective’ calculations become subjective in understanding the thing itself and do not suffice as a legitimate mode of inquiry outside their immediate sphere and without benefit of extrapolation. Therefore it cannot be said that mathematical laws are objective in all or any contexts.

And yet at this point the superiority, in terms of objectivity, that the laws of mathematics have over, say, empirical evidence has been stated. What prevents mathematics from being the most objective system of working is its very rigidity. Compare this with the workings of contextualism, a mode of inquiry through which any event, thing or concept is judged relative to the things that surround or concern it. While some insist that contextualism is subjective in the extreme because it uses so many factors as clauses and benchmarks, it can also be considered objective because it allows many definitions to be attributed to one thing specific to the set of conditions it is in, all of them (at least potentially) accurate – which makes it valid, and as objective as possible to the situation. This flexibility gives the mechanics of contextualism an edge that the laws of mathematics do not possess.

The laws of mathematics cannot be considered more objective than everything else either because, as mentioned before, their ‘objectiveness’ is inherent in their system (of notation and specificity, etc.). In this case, similar constructs such as binary or musical notation are at least as – or even more – objective than mathematics and the laws it concerns. Binary in particular, being so simple (with only two symbols and infinite combinations), can be considered more ‘objective’ than even mathematical laws. However, neither binary nor the rules of musical notation guarantee inquiry, only recognition: specific symbols map to specific references or responses. This means that ‘objectivity’ is called into question here as well, because these constructs only serve to indicate, and not to argue – per se. This can be applied to mathematical laws (in themselves, not mathematics as a whole!) and recapitulates the point that objectivity has little or no bearing on the purpose and function of mathematical laws beyond the boundaries in which their variables stretch.

And even if all of these were somehow disproved, there is the problem of the concept called ‘mathematical induction’ to consider. Mathematical induction reaches a general conclusion by proving a base case and then assuming that the statement holds for one case in order to show that it holds for the next, with the specific intent of forcing a conclusion for all cases. The result is something built on that assumed step which, to all purposes, has now become subjective (because of the nature with which it was achieved). This contradicts the assumption of all mathematical laws being objective.

On digging yet deeper, building on assumptions seems to be what most (if not all) mathematical laws do in their workings, because sooner or later everything boils down to a few fundamental concepts – as illustrated by Peano’s axioms, a set of rules defining the scope of the natural numbers which, for all anyone knows, are as likely to be assumptions as truth. Without this foundation, the whole structure of mathematics – all the laws – would self-destruct. The link between the concept of mathematical laws and objectivity becomes more tenuous still when one realises that such ‘laws’ are merely observations of trends that have not yet been disproved. Similar things have happened in spheres of knowledge such as physics and philosophy: the once accepted truth of Newton’s laws was displaced, at the subatomic level, by quantum mechanics and Heisenberg’s uncertainty principle. Objectivity does not apply so much when reliability is being questioned.
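To give a sense of how spare that foundation is, here is a toy model of my own (a sketch, not a faithful rendering of the axioms) in which a natural number is either zero or the successor of another, and addition is defined by recursion alone:

```python
# Toy Peano arithmetic: a natural number is either ZERO
# or the successor S(n) of another natural number.
ZERO = ()

def S(n):
    """Successor of n."""
    return (n,)

def add(m, n):
    # Peano's recursive definition of addition:
    #   m + 0    = m
    #   m + S(n) = S(m + n)
    if n == ZERO:
        return m
    return S(add(m, n[0]))

ONE = S(ZERO)
TWO = S(ONE)
print(add(ONE, ONE) == TWO)   # True: 1 + 1 = 2, from the definitions alone
```

Everything here rests on taking ‘zero’ and ‘successor’ as given, which is precisely the point: the whole edifice is built on a handful of unprovable starting posits.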

Everything that has been discussed so far, however – mathematical laws, empirical evidence and contextualism together – has pinpointed the propensity for every ideology to be subjective to a vantage point of a sort: in the case of contextualism, for example, it is the context (!), and for all of mathematics, it is the basic concept of the existence of distinct and separate quantities (numbers). Therefore one cannot say that mathematics is any less subjective than any other mode of inquiry, although it can be argued that it is more objective than some: thus it cannot be said, either, that there is nothing more objective than the laws of mathematics.



(* my friends tell me that this example hides a division by zero.
Ril: 'You can't divide by a^2-ab on both sides, because if a = b, a^2-ab = a^2-a^2 = 0'

But at first sight it is frighteningly convincing and confusing. This example was shamelessly stolen from Dr Alfi’s first logic test.)

Essay on experience

‘No man’s knowledge here can go beyond his experience.’ Discuss the implications of this statement.

To acknowledge this statement is first and foremost to debunk the existence of a priori knowledge. A priori knowledge, by its very definition, is a set of knowledge claimed to be independent of experience. Examples such as maths and definitions spring to mind: we know that a cube has six square sides, that 1+1=2, and that white is not black. To cut a long story short, a priori knowledge usually describes what we know independently of its existence in reality (if it even exists there in the first place!), usually claiming a place as some kind of ‘higher form’ of knowledge: it works in the domain of units, definitions, concepts, contrasts and the imagination – which many would not define (!) as knowledge, especially in terms of definitions (eg. a cube by definition has six square sides, and that is why it is a cube).

The assertion of mathematical knowledge as an a priori system is contentious at best. In order to illustrate arithmetic to the uninitiated – such as when you attempt to explain the rules of addition to very small children – you tend to need to rely on physical stimulus, or a ‘live demonstration’. A child will not understand if you scratch out a page of ridiculously complicated proofs for 1+1=2 (and such a proof exists: it can apparently be found at http://www.mrtc.mdh.se/~icc/1+1=2.htm), and neither will the ordinary person on the street. And yet the ordinary person on the street probably knows the difference between one and two. If you asked him to prove 1+1=2, the odds are that he would take two things (say, apples) of identical value (to all intents and purposes) to demonstrate. This most basic and very physically-oriented mode of explanation is what is usually used to educate small children, who live entirely on the most basic, sensual level. And all this little experiment really shows is that even in maths, that abstract school of philosophy that supposedly exists without reference to reality, knowledge can – at the most fundamental level – only be transferred through empirical means. This instantly undermines any claim that maths is strictly disconnected from the world as we know it, because it can only first be perceived through reliance on reality. Therefore: we cannot deny reality on our own terms even if we are eventually to conquer ‘a priori’ knowledge – meaning that no man’s knowledge here can go beyond his experience.

Moreover: if you asked an ordinary person on the street to prove 1+1=2, he would also very likely think you some kind of simpleton – because the explanation (an apple in each hand) is so elementary, he might say if he were probed further. And he would be right. But herein lies the fault of a priori knowledge: the increasing assumption of its truth. Although these ‘truths’ eventually become the foundation of all our later knowledge, they evolve into what is commonly known as ‘common knowledge’ or, more accurately, canon: accepted, universal beliefs which have, for many, yet to be proved. Everyone knows that 1+1=2, but not everyone has read the Principia Mathematica (by Bertrand Russell and Alfred North Whitehead), in which 1+1=2 is irrevocably proved, and few among those who have read it actually understand it. Most of the few who have even heard of such a proof are content enough to ‘know’ (i.e. to be told, by some authority – isn’t this experience as well? We’ll cover that later) that such a proof exists, feeling no urgent need to go and really find out for themselves. Can 1+1=2, for them and everyone else apart from the people who understand the proof, be considered true ‘knowledge’? Without actually experiencing understanding, it should not be possible to say that one truly ‘knows’. ‘A priori’ – in this case – is clearly exposed as a sham. Similarly, one knows that black is not white only because one experiences black and white through vision: a person blind from birth could not tell the difference between black and white in the way the sighted can. If he knows there is a difference between black and white, it will be because someone has described it to him, and even that is a sensory experience as well.

In fact, nobody can deny that we get our primary information through sensory experience. The eyes, ears, nose, tongue and skin perceive for our brains and obligingly feed them information through a complicated series of nodes, nutrition, synapses, molecular physics and calcium ions. Given the delicate balances needed for this much underrated background mechanism to function at optimum level, errors are common, and information is either misfed to the brain – or the brain itself malfunctions. The brain can also be misguided by complications so that it fails to process and discover hidden fallacies. Therefore we cannot know whether the ‘knowledge’ we gather is ‘truth’. However, this is the best we can get – although weapons such as Occam’s Razor and proven logical structures aim to narrow the margin of error. Yet the stimuli themselves are served to us on a level we cannot ignore, and we invariably, involuntarily, take these stimuli and the various almost-instantaneous conclusions drawn from them (in accordance with patterns of past experience) for truth – real knowledge – until second glance at least: this is why it can be said, on these terms, that no man’s knowledge can truly go beyond his experience.

Secondary information we often gain, as it were, second-hand. Hearing accounts, reading, looking at diagrams, translating codes or reading braille to absorb the (comparatively) first-hand accounts of people describing such things through any such medium constitutes what I define here as secondary information. And yet we are made by nature to access this secondary information through our senses – in effect experiencing the speaker’s experience first-hand. In learning through the experience of others, you actually experience this learning.

And what of scientific knowledge? Many of the things science investigates concern things too big, too small, too deep in our bodies or too far away, relative to our everyday level of consciousness, for us to observe. Our insights into such matters thus come mostly from ‘primary’ information obtained through instruments. Two instruments with the same basic mechanics, roughly the same usage and radically different targets are the telescope and the microscope. While telescopes magnify what is very far away so that it is at least visible, microscopes enlarge what is very near to thousands of times its real size or more – also so that it is at least visible. Both instruments are subject to error, and the slightest inaccuracy may result in anomalous data. With the possibility for error doubled through sensory inaccuracy, human error as well as mechanical failure, can the act of taking scientific data through these instruments be classified as an experience, or an experience of an experience (by the instrument)? One way of resolving this is to point out that it is the experience of the eye peering down the tube of the microscope or telescope, and what the brain makes of this experience, that truly makes the knowledge; but even so it cannot be certain that true scientific knowledge exists, because of the double veil of Chinese whispers. Experience alone cannot therefore validate our knowledge of the world in this sense.

Now comes the difficult part. Does the world even really exist, or have our senses – universally acknowledged to be so unreliable – been deceiving us for the whole time we have been ‘alive’? Solipsists argue that there is no way to know anything but the existence of personal thought because, for all you know, you could be a brain floating in a vat somewhere, being fed the illusion of reading this essay. Nothing this brain (you!) gains is real knowledge, just random impulses dictated by the almighty controlling device or the mad scientist of your choice. In fact, Descartes’ most famous quote, ‘I think, therefore I am’, makes the same argument in promising no guarantee of knowledge of anything but your personal, experienced thought. His argument ran somewhat like this, in rather truncated form:

‘If I am thinking about whether I exist, the fact that I am thinking about it denies the possibility that my thinking consciousness may not exist – because I am thinking about it.’

Therefore, he concludes, true knowledge only exists in this little cogito ergo sum, which itself comes from the experience of thinking.

The new conundrum is the difficulty of putting ‘thought’ in the same category as ‘experience’. Rationalists, in particular, insist that the two are distinct and separate entities. However, one cannot deny that one experiences not only thought but brainwaves, fragments of a brilliant idea, cathartic turmoil, epiphanies. Many of these are not brought about by the conscious mind but ‘come’ to us – sometimes against our ‘natural’ inclinations, usually because of their implications of impending change in circumstance or personality. Dreams are another ambiguous proof of illogical, unwanted interference in what would otherwise be an impeccably rational construct, as some would have it, and are usually described by the layman in the same way as any event that happens to him in ‘real life’. So you could say that they are experiences in their own right. You could just as easily say that they are the culmination of the impulses gathered by the subconscious, which – ultimately – are still experiences, direct experiences, indirectly recorded and stored until whatever stimulus brings them back out into the light of day in whatever form.

Experiences, it seems, are all we have to go by in the end: the only way to stop gaining experiences and from there, some measure of knowledge, is to become completely unconscious. The implications of the statement that no man is able to transcend this boundary here only echo, amazingly, the layman’s banal point of view in the real world as he experiences it – a point of view ironically probably also gained mainly from experience.


References:
Leibniz
Descartes
http://www.importanceofphilosophy.com/Irrational_APriori.html
http://www.iep.utm.edu/a/apriori.htm
Wikipedia

Essay on aesthetics

‘Aesthetics has no real value to societies.’ Examine, with reference to the Aesthetics, how the concept of value applies to knowledge.


The concept of value is inherent in the workings of thought in that it forms the very basic catalyst for decision-making and conclusion-reaching by the conscious mind – the two processes by which the conscious mind is defined: and it is by these two processes that information is accepted and knowledge is made in the individual’s conception of his world.

The concept of value, standing alone, is intimately entwined in the formation of knowledge in that some system of prioritisation must take place in the processing of information. We are bombarded second by second with overwhelming sensory stimuli and the information they carry, and this alone forces our limited minds to focus on some and ignore the rest, according to our needs. What we focus on is what we deem important – in other words, ‘valuable’ – and it is in these areas that knowledge is most gained. Without being able to rank input by whatever standard, be it logical structure or pure whim, no decisive opinion or action can take place: the mind would be incoherent and completely randomised (for even in the case of pure whim some unconscious preference must be working). And in such a mind, knowledge is impossible. ‘Value’, then, is the criterion by which we gather or retain knowledge.

The role of the concept of value in knowledge is particularly apparent where the ‘value’ of a concept becomes fixed in accepted beliefs (‘Pluto is a planet.’ ‘The woman I call my mother gave birth to me.’ ‘Cauliflowers aren’t pink.’) for everyday purposes. Such ‘common knowledge’ the man on the street usually takes for granted and does not question, its value unimpeachable but for the strongest, most indisputable evidence – which he may still find hard to accept, because of the difficulty of dissociating value from the mistaken belief. Value acts as the most basic step in forming a conclusion from a series of inputs; recurring input resulting in the repeated affirmation of the conclusion increases its value in the individual consciousness, until it has evolved from hypothesis to ‘fact’. This ‘fact’, this knowledge, forms the basis of the biases and prejudices that individuals invariably carry all through their lives.

When one considers these two trains of thought integrating the concept of value with the concept of knowledge as it is achieved, one cannot disregard the value of Aesthetics – to the individual. ‘Aesthetics’, as the article of the same name on Wikipedia puts it, ‘is a branch of value theory which studies sensory or sensori-emotional values, sometimes called judgments of sentiment or taste. What makes something beautiful, sublime, disgusting, fun, cute, silly, entertaining, pretentious, discordant, harmonious, boring, humorous, or tragic.’ Where there is judgement there is evaluation. By its very definition, aesthetics has identified itself as a branch of the concept of value.

On another level, aesthetics is important to the individual mainly because it functions as the very definition of self. When someone is asked to describe himself in an everyday situation, he begins with what he believes are facts – most of which involve the bedrock beliefs described earlier, their value accrued throughout his lifetime by repeated affirmation: ‘My mother was one Tamarind Tree.’ ‘I was born in Frankenstein hospital.’ Then he begins to elaborate upon his preferences – ‘I like the smell of margarine soap, I think it ranks way above rose detergent’ – which, when one thinks deeper on it, really all depend on the value the individual places on one object above another. If personal idiosyncrasy relies solely on preferences, which are simply the result of the decision process and therefore of value judgements, then identity is nothing more than an expression of the aesthetic statements by which the individual lives. Knowledge of the self, therefore, derives almost entirely from the interaction between the individual and the aesthetic.

And so it has been established that the Aesthetic is important to the individual – but is it of real value to society or, as the statement puts it, societies? The simplest argument would be that as the individual is to society what a brick is to a wall, what concerns all the members of a society should rightly concern the society as well. Therefore aesthetics has a very real value to societies. But even without considering this factor, Aesthetics remains of paramount importance to a working society because it is aesthetics – preferences – that define ethics and norms, which act as a foundation for the cohesion it is society’s most fundamental function to maintain. If society is a wall and its component individuals are the bricks, then these ethics and norms – which are set, conserved and revised by the aesthetic – form the cement that glues the bricks into the wall. Knowledge of society and knowledge in a society are made, distributed and executed according to its aesthetic principles.

If one contests that ‘aesthetics’ may be of no further value to society, the purpose of ‘society’ first has to be defined – and it has not been, yet. If the role of society is merely survival – as for a large group of beings foraging for food or dissuading predators – aesthetics may play little part in it. But if the purpose of a ‘society’, once the basic needs for food, shelter and safety are met, is not merely to survive but to survive the clash between societies – with the continuous annexation and absorption of one by the other at any place or time – then aesthetics becomes what enables a culture to outlast another. In the current state of affairs, in the rush of increasing globalisation and the creation of a Western-based ‘world’ society unable to hold more than a few of the individual characteristics that distinguish the thousands of cultures scattered across the earth, the aesthetics of a culture become both its sacred ground and its selling point. The ‘Japanese aesthetic’, for example, has become known around the world for its distinctive symbolism and use of lacquer; the terracotta army of Qin Shihuang’s mausoleum became for a while the face of China’s tourist industry. The delineation of a characteristic Aesthetic is what keeps native culture from extinction and acts as a root for societies, which retain individual identity as a collective.

In addition, some believe that society exists for the individual – instead of the other way round. The benefits of a society, in coordinated action and a set of common ethics, are enjoyed by the individual, who is thereby given an incentive to continue his support of the society. Part of the ‘perks’ of belonging to a society, as it were, is the sense of collective identity – which is what cultural aesthetics sustain. And it is this collective identity that holds the community together – not just in terms of artistic sensibility, but also in moral preferences, norms and standards of acceptable behaviour. It is because of aesthetics that society, as both an institution and a repository of communal knowledge, is able to function. So it is, again, the value of society to the individual, as well as the value of aesthetics to the individual, that results in the value of aesthetics to a society.

As a result, one might be able to discern that the concept of value in reference to aesthetics applies to the most fundamental knowledge in an individual, and through him, society: the assertion that aesthetics has no real value to societies is proven to be simply not true.

Art Presentation, long ago

This presentation, due to time and mind constraints, has been restricted strictly to Western visual art in the form of sculpture and 2D media, NOT including advertisements or photography.




Pre-1000
2D Main medium: egg tempera, inks
Forms: illuminated manuscripts (eg. Lindisfarne Gospels, Book of Kells); frescoes; mosaics; statues, paintings
Themes: mostly religious or portrait works
Characteristics: two-dimensional, stylised

Gothic (1200 – 1450)
2D Main medium: egg tempera
Forms: paintings, sculpture, illuminated manuscripts
Themes: mostly religious or portrait works
Characteristics: Stylised, but with increased naturalism

Renaissance (1400 – 1600)
2D Main medium: OIL PAINTS -- Throughout the Renaissance period, artists first began to experiment with oil-based paints, mixing powdered pigments with linseed oil. The slow-drying nature of the medium allowed the painter to edit his work for several months. Perspective and attention to light became important to artists, as well as architectural accuracy in backgrounds.
Forms: frescoes, paintings, portraits
Themes: biblical characters (placed huge importance on the Madonna); people of Greek or Roman mythology; the human body (particularly the nude!) – the idealised human form, purity in expression
Characteristics: see ‘medium’; 3D, individuals that portray for the first time personality and behaviour
Notes: increased blend of art and science

Classicism (1600 – 1800)
Themes: imitation of Greek and Roman art, which ended in the 5th century with the fall of the Roman empire
Characteristics: adherence to traditional aesthetic formalities favoured over expression and individuality; conservative emphasis on balance, order, unity, symmetry and dignity
1826: INVENTION OF PHOTOGRAPHY


Romanticism (1800 – 1850)
Themes: emphasis on emotional, spontaneous and imaginative approaches; focus on emotion and freedom by way of subjectivity and individualism
Characteristics: country idyll, solitude, peace: exploring everything exotic, mysterious, remote, occult; emphasis on emotional and spiritual themes
Notes: Caused by the sudden social changes that occurred during the French Revolution and the Napoleonic era, Romanticism was formed as a revolt against Neoclassicism and its emphasis on order, harmony, balance, idealization, and rationality.
Evolved into a fully intellectual movement that rejected the traditional values of social structure and religion; encouraged individualism, emotions, and nature.
Techniques to develop associations in the mind of the viewer were purposefully explored --> basis for later surrealist, expressionist movements

Realism (1850 - 1880)
Themes: accurate, unembellished, and detailed depiction of nature or contemporary life. The movement prefers an observation of physical appearance rather than imagination or idealization.
Characteristics: painting common, ordinary, sometimes ugly images rather than the stiff, conventional pictures favored by upper-class society.
Notes: It was an opposition to the traditional approach to Neoclassicism and the drama of Romanticism: Realists strived to paint scenes as they actually appeared. Often the artists depicted ugly and common subjects that normally alluded to a social, political, or moral message.

Impressionism (1865 – 1885)
Themes: Emphasis on loose imagery rather than finely delineated pictures. Subject matter was most often landscape or scenes from daily life.
Characteristics: The artists of the movement worked mostly outdoors and strived to capture the variations of light at differing times throughout the day. Color palettes were colorful and blacks or grays rarely used. Emphasized sunlight, shadows, and direct and reflected light. In order to produce vibrant colors, they applied short brush strokes of contrasting colors to the canvas, rather than mixing hues on a palette.
Notes: a movement founded in Paris as an opposition to the rigid traditions favored by institutions such as the Academie des Beaux-Arts.

Symbolism (1880 – 1895)
Themes: fantasy and imagination in the depiction of objects. The artists of the movement often used metaphors and symbols to suggest a subject and favored mystical and occult themes. Influenced by Romanticism and the Pre-Raphaelite Brotherhood, the movement strived to depict the symbols of ideas.
Characteristics: Some works would often contain grotesque and fantastical imagery such as severed heads, monsters, and spirits. In addition, their works sometimes contained references to the Bible and ancient myths. Other Symbolists took a more traditional approach, using lines and colors to produce desired emotional effects.
Notes: Symbolists were opposed to the visual realism of the Impressionists and serious nature of the Industrial Age. Their aim was to portray mysterious and ambiguous interpretations of emotions and ideas by using unobvious symbols.
Important to the development of surrealism.

Expressionism (1905 – 1945)
Themes: the artist is free to move beyond the limitations of objective subject matter and to concentrate on the feeling and impact derived from the artist’s inspiration.
Characteristics: evocative, high-keyed psychological aesthetic
Notes: Expressionists sought to reveal the inner, spiritual and emotional foundations of human existence, rather than the external, surface appearances depicted by the Impressionists.
The Expressionist movement took inspiration from Symbolism, Fauvism, and Cubism in its departure from accurate subject matter.
The successors of the original group of expressionists, who fell apart due to artistic differences and WWI, called themselves the Dresdner Sezession. (The most famous Secessionist of all, however, was Gustav Klimt of the earlier Vienna Secession.)
Opened the door to abstraction because of its ideals of experimentation, spiritual representation and originality.

Abstract (1910 --)
Abandoning the late 19th century European idea that art should imitate nature, Abstract art does not strive to create accurate representations of any forms or objects. Artists employing the style take an object and either simplify or exaggerate it by altering its colour, shape and form. Although Abstract art as a movement developed in the twentieth century, abstract patterns have roots in ancient history, showing up in early decorations for textiles and pottery.

Dadaism (1916 – 1924)
Dada began as an anti-art movement, in the sense that it rejected the way art was appreciated and defined in contemporary art scenes. Founded in Zurich, Switzerland, the movement was a response to World War I. It had no unifying aesthetic characteristics; what brought the Dadaists together was a shared nihilistic attitude towards the traditional expectations of artists and writers. The word Dada literally means both "hobby horse" and "father", but was chosen at random, more for its naive sound.
Another element integral to the Dada movement was the refusal to underlie work with any intellectual analysis. Dada was also a reaction to the bourgeois Victorian values of the late 19th and early 20th centuries. The work was absurd and playful, but at times intuitive and even cryptic. Methods of production were unconventional, employing chance techniques and found objects. The Dadaists’ rejection of these values was an attempt to make a statement on the social values and cultural trends of a contemporary world facing a devastating period of war.


Surrealism (1924 – 1955)
Themes: the expression of imagination as revealed in dreams.
Characteristics: … surrealness! Dream landscapes, spontaneous, where images may appear in the real world or not
Notes: Also similar to the 19th century Symbolist movement, Surrealism was based on the psychoanalytic theories of Sigmund Freud and Carl Jung, emphasizing imagination and subconscious imagery. Work usually contained realist imagery arranged in a nonsensical style in order to create a dreamlike state. Surrealist painting varied widely in both content and technique. Surrealism incorporated and celebrated the art of children and primitive art, prizing the 'innocent eye': the untrained artist was freer to depict their actual imaginative ideas.





MAIN POINTS

How art is involved in the knowledge: art moves knowledge, encourages social change, reflective of the values of the era
Religious --> portrait --> people --> landscape --> impressions --> mental landscape etc.
2D --> 3D --> stylised: stiff vs. contorted --> empiricist --> fantastical --> impressionist --> symbolic
how movements build on each other
how science changes art

Saturday, September 16, 2006

History

I was having a discussion with a friend.

Discussions with friends – this one in particular – tend to turn out to be about peculiar things. The day before that, we were talking randomly about the relationship between literature and ‘kids these days’, as well as the affectation of modernist texts to write in a romantic style. But that’s another story.

This day we were sitting in a void deck. I was eating small egg white biscuits and she was drinking apple juice out of those 1 litre jugs from the supermarket. We both attempted to study like serious mugging students, then gave up and started talking instead.

She had a history essay to do with the question of whether it was really true that ‘the Soviet Union was doomed if it reformed and doomed if it didn’t.’ I had my history notes spread out in front of me anyway, so we started talking about how she might go about writing the essay.

Everything I’ve said so far is irrelevant to KI except for justifying the strange avenues by which my strange ideas and ideals are constructed. Most of them are by talking nonsense to people, this one in particular. No sooner had she remarked that history was nothing so much like an absurdist play than I remembered something completely random I had said earlier about aliens and breakfast.

Me: Remember just now, when I said that if aliens came down to earth – here – today, which would they be more confused by – porridge or you tiao?

Her: A hive mind of aliens would absolutely gape at the entirety of human history.
Her: That’s why I said history was like an absurdist play. All the characters know exactly what’s going on and everything makes perfect sense but the audience finds it… absurd. I mean, East Europe splits into two and fights over itself, and then America makes a bomb that could destroy the whole world and drops it into Japan of all places! It is entirely ridiculous!

Me: So – the small things!

Her: The small things are ridiculous as well.

(insert ramble here I can’t remember, sorry)

Me: Benefit of hindsight.

And then we went on about how Khrushchev might have accomplished what Gorbachev could not, because Gorbachev was a blind blundering incompetent when it came to economic policy. Then we rifled through what we knew of Soviet Union history and eventually blamed Stalin for everything. Stalin is good to blame: he was a blood-handed despot and he died at the most inconvenient time. If he had held on for decades like Fidel Castro, Russia would have suffered a lot more, though quite a few fiascos might have been prevented.

Me: But we can’t say for sure what would have happened, because it didn’t.

Her: But we can think of what might have.

(I show her a section in Terry Pratchett’s Darwin’s Watch on the ‘Trousers of Time’, the moment where a choice is made and destiny splits into two different paths that branch infinitely with every subsequent choice, and we talk briefly and insignificantly about counterfactual realities.)

Then, suddenly:

Her: Speaking of absurd plays.
Her: I cannot imagine anyone dedicating themselves to driving a bus for the rest of his life.
Her: The worst examples must be the ones that drive loop services! Round and round! All day!

Then she speaks what has been swimming in my mind.
Her: It’s even in an infinity sign! (drawing the closed loop in the air.) Round and round!



And that’s history in a nutshell. Yes even the bit about bus drivers.






[Thanks to the friend who remains unnamed for this post, unless she doesn’t mind being named, in which case I will edit this post]

Appearance and reality, ideal and ideology – ad nauseam please!

I was watching my mother trim the fat off a pink, stripped chicken when I suddenly thought: ‘so this is what we look like on the inside.’

It was a very disturbing thought. When you are a carnivorous little hypocrite like me it is best not to think too much about the stuff that tastes so good in your mouth. For all we know we could be eating the offspring of poor families deep-frozen from Victorian times.

That, if I may mention, is a disturbing thought too. Most disturbing, however, is the niggling notion clinging fiercely to the back of my brain that I wouldn’t mind trying the meat-as-it-is either – i.e. raw.

I bring up no defence against myself except that it looked like a special kind of jelly. The canon of Pavlovian conditioning tells me that this might account for the strange impulses I get from staring at that slimy pink lump – but no. The jelly I like is radically different in both appearance and smell from a slab of limp substance on a chopping board. Moreover I have never tasted raw meat before, and have no intention of acting on the inclination to try. I have been fortunate enough to have eaten only vegetables, fruit and processed preservatives uncooked; the only exception would be three very thin slices of salmon, which bear only a superficial resemblance to the almost-defrosted chicken in the kitchen sink. Bye bye Pavlov.

The primal subconscious so dearly beloved by Freud and his bevy of little protégés is no excuse either. For one, if it were subconscious, why am I conscious of it? I’m only supposed to be conscious of other people’s subconsciousness(es). For another: the attraction has ABSOLUTELY NOTHING to do with sex. All right, I know I grossly overgeneralise, but I’m not sure I am the only one in the world with these strange little urges to go picking at the dismembered corpses of dead fowl. In any case, if these ‘strange little urges’ were the unhappy remnants of the tastes and preferences of a triumphantly savage evolutionary throwback, I’m hardly in any position to help it – but what I can do is not follow up on the notion to pick up the oozing stuff and run away with it. Which is precisely what I did. Which brings us way back to the clash of civilisations – the clash between impulse and ideology.

A classic and ever-relevant example of this would be of how ‘Meat’ always sounds so much more appealing compared to ‘Muscles’ or ‘What might once have been the legs of an eating, breathing, defaecating being’. Once in my impressionable childhood I asked about what part of the animal we were eating at the moment and was reluctantly told ‘muscles’ – and when I tried to probe further I was firmly told to stop asking so many irritating questions. The scientific truth and its implications seem to be something everyone is quite happy to remain oblivious about. I slash my way through the foliage to reach it at my own peril.

This route of conversation is particularly pertinent because, being Chinese all the way down to my little toenails, I grew up eating Chinese – rice, cailan, hair fungus, crocodile soup. The Chinese are famous for being able to make a good dish out of anything and everything that may or may not move. This means a lot of difficult questions at dinnertime – and if you think the questions are bad, the answers are worse: it doesn’t help that my mother is a nurse who always knows exactly what we are eating. Liver, brain, lungs, kidneys, trotters, stomach – colon – sea cucumbers – all the things that normally go into sausages in the hardly less decadent worlds of traditional European cuisine come out unmauled and recognisable through the heavy sauces at weddings all over the country. Even better, these dishes tend to be exotic and very poetically named: one is expected to be so grateful for the opportunity of actually getting to lay tongue and chopstick to these delicacies that any childish enquiries into the original functions of the unmentionables in the gravy were liable to get a smack in reply. A smack once too often tends to breed a certain complacency about the food put in front of you, until you grow up a little bit only to find that, all your life, you have been seriously enjoying congealed pig’s blood as some sort of tofu substitute. The feeling is quite indescribably strange.

Education in food, as one might say, comes first and foremost from the clumsy but amazingly accurate English translations of dishes on the neon menus at the humble hawker centres. Verily does one find truth in the strangest places.

It is not just words and ideology that make eatables… edible. It is packaging as well. A fortune goes into the food preservative and colouring industry just so that our apple juice does not turn out in its transparent jugs to be some kind of grey colour (like baby apple puree), and that jellies squirt properly out of their appealing plastic cuplets in liquid neon shades that children love, and that ice cream retains its jewel-like creaminess and orange boiled sweets melt a wholesome orange. The moment aesthetics was applied to food was the moment we were asking for trouble with the truth. Even the little labels listing the ingredients on one side of the box have little impact on what we think we are eating. The tangible properties of a food we didn’t kill, gut and fillet ourselves are the first we go by and usually the only ones we consider when making a choice for tomorrow’s lunch: no one is going to care about the percentage content of sodium bicarbonate unless it is widely known (to superstitious precision) to be prodigiously unhealthy. Food labels are there only for worried mothers and health freaks; ordinary people, much less snack-scarfing adolescents, are unlikely to bother. How much do they know of what they are eating? Are your potato chips really made of potato?

To say it is all about packaging is to call a tiger a small pair of earmuffs. And it is supermarkets that are the biggest and baddest players in this civilised game of deception. In small-scale grocery stores there is bad fluorescent lighting and rather ugly iron shelves; in wet markets everything is hot, raw, noisy and bloody, where the fish on the ice are still alive and the butcher cheerfully carves the meat to your liking. In supermarkets everything is different. Food is packed in neat and coordinated rows according to selection and category. Food is wrapped in plastic and put under the most flattering UV light. Food is stuck with yellow promotion markers, advertised, arranged in appealing formations to entice the hand that pays the cash. Allow me to revert to the example of raw meat: let us not talk about chicken this time. Let us talk about pork or steak. (No, not pig and cow, pork and steak! You see?) In the wet market these products are hung in all their raw, whole and gory glory, without benefit of special lighting or air-conditioning to keep the distinctively stinging scent of blood from your nose. In a supermarket the pork or steak is lovingly bundled in transparent plastic held taut against the neatly portioned cuts and the styrofoam, with little stickers telling anyone who would read them about how happy the aforementioned pig or cow was before it died. I spent most of my childhood in the supermarket poking with fascination at the taut transparent plastic, feeling it yield beneath my fingers, and then poking the clean, strange, sterilised stuff beneath it, and then reading the label proclaiming in mechanised type that I was holding in my hands a great lovely specimen of HIND LOIN.

A supermarket already is a vast conspiracy by itself. Worse than a supermarket is your mind.

Construction of knowledge in food is something not to think about too much or too deeply if one intends to continue to eat, because you can’t eat without killing something. One day if someone suddenly makes the discovery that green things really do have thoughts and feelings, carrot juice would become murder – for real.

If God made this world in peace and harmony, it was a bloody farce. If people made ‘civilisation’ a civilised word as they do all the time I think I shall in the future not attempt to desist from simply laughing in their faces.

Saturday, August 19, 2006

Ranger's Creed 2nd edition

Recognizing that I volunteered as a KI student, fully knowing the hazards of my chosen profession (KI is a profession, isn't it?), I will always endeavour to uphold the prestige, honor, and high esprit de corps of my KI CLASS, mostly by reading big thick books with repapered covers in front of the GP masses.

Acknowledging the fact that a KI student is a more elite soldier who arrives at the cutting edge of battle by land, sea, or air (meaning from all directions, by all means for the unenlightened), I accept the fact that as a KI student my country (country!) expects me to move farther, faster and fight harder than any other soldier. All this means is that we are expected to become politically/scientifically useful in the near future and eventually come to dominate the economy.

Never shall I fail my comrades, teachers or otherwise. I will always keep myself mentally alert, physically strong (pffft!) and morally straight and I will shoulder more than my share of the task whatever it may be. Moral straightness can be weighed on different yardsticks, especially when it concerns the relativist point of view, but we shall not discuss that here because it will ruin the pretty oath. One-hundred-percent and then some. (I like this phrase.)

Gallantly I will show the world that I am a specially selected and well-trained soldier, mostly by not eating the monkeys on display in the zoo. My courtesy to superior officers, neatness of dress and care of equipment shall set the example for others to follow, and the arsenal in my pencilbox the envy of all.

Energetically will I meet the enemies of my country -- the legions of ignorance! I shall defeat them on the field of battle for I am BETTER TRAINED and will fight with all my might. (And then some.) Surrender is not a KI-student word. I will never leave a fallen comrade to fall into the hands of the enemy and under no circumstances will I ever embarrass my country, perhaps by eating the monkeys on display in the zoo.

Readily will I display the intestinal fortitude (I love this phrase.) required to fight on to attain the KI nirvana and complete the mission though I be the lone survivor.

Tuesday, August 15, 2006

Oops.

I have been neglecting my posting responsibilities. Essays will be uploaded soon -- after Mr Cheong returns them -- and since one goes into the maw of the teachers' lair every week, there'll probably be something new once in a while.

In the meantime, here's a teaser:



The Hawthorne experiments and the Uncertainty Principle demonstrate that we can never hope to know the world as it really is only in the context of pure scientific inquiry. For example, for the Uncertainty Principle, this 'world as it really is' exists only if 'know' refers to objective, calculable truth down to the smallest unit of existence. While in the passage it has been admitted that you can only know either the velocity (edit: momentum) or the position of a very small particle but not both, one can predict trends by discovering the velocity and the propensity of these particles in terms of positioning separately, but under conditions as similar as possible, and from there create a view of the world which will probably be close to what 'it really is'. So there is hope that we can 'know the world as it really is', especially if -- with new technology being invented by the year -- a way is eventually discovered to circumvent the Uncertainty Principle (edit: in hindsight unlikely, since the Principle is a property of quantum systems themselves, not of our instruments) and allow both the position and the velocity of a very small object to be found at one time.

This case is similar for the Hawthorne experiments, in which the results are affected because the workers, with their personal goals and agendas, were aware of being scrutinised and so would attempt to warp the results in their favour by, perhaps, giving proof of their hard work. This trend is easily circumvented as well simply by not letting the workers know about their being observed: for example, a spy could be planted in their midst, or secret cameras installed. In short, in observing the psychological or social world, making observations that are not contaminated by the observed's activities and choices is achievable merely by ensuring that the subject observed is not aware of being observed in the first place.

At this point it can be agreed that the Hawthorne experiments and the Uncertainty Principle demonstrate only the problems that one runs into when looking to 'know' the world as it 'really is', and that such obstacles are potentially solvable through the invention and use of future or higher levels of technology -- the suggestion of using secret cameras in place of manual supervision to observe the workers in the Hawthorne experiments, to avoid disturbing the workers' normal sensibilities, being one such. In short, there is still hope. However, while these conundrums posed by the Hawthorne experiments and the Uncertainty Principle do appear solvable, there remains the problem of defining a unit. Both the Hawthorne experiments and the Uncertainty Principle represent or concern microcosms of what the layman would define as the 'real world', what the essay terms the 'macroscopic world of matter' -- the Newtonian mechanics to our atomic science. To show the contrast, even the passage admits readily that the macroscopic world is easily, if I may take liberties with the word, 'solved' -- both the position and the momentum of a body are easily calculable, for one. The problems arise when one attempts to go deeper into each smaller unit (especially relative to the human's average size or perception), where even the most minute of errors are infinitely magnified. Given that human nature by its very definition is erroneous, the 'world as it really is' becomes yet more precarious the deeper we decide to plunge. 
Also, given that we would continue to discover infinitely small divisions of particles (the atom was conceptualised before Anno Domini as the smallest possible unit of matter, but then as yet more powerful microscopes were invented, electrons were discovered, and then quarks, and so on -- as, logically, something has to be made of something else), completely precise knowledge of the world from this perspective becomes progressively more difficult to find.

It is similar for the Hawthorne experiments (if one considers the purpose not to be the original intent of finding the ideal factory conditions, but predicting human behaviour). This study concerns the difficulty of studying behavioural patterns objectively, but let us assume that they would want to go deeper; the scientists may decide that they want to know the brain processes of a human functioning in otherwise normal circumstances, for example. To do that they might have to attach patches to the subject's body. But the subject's knowing these patches are there may cause heightened anxiety, or may actually directly affect the workings of the neurons in the brain, which further corrupts the data collected -- compared to the simple, distant observation of increased productivity under human monitoring in the Hawthorne experiments themselves. And so on, ad infinitum (edit: ad nauseam?). As the scientists probe deeper, perhaps even beyond cranial processes to interfering with electrical impulses and synapse transmissions in their search for what makes people tick, the possibilities for error become yet more and more enormous. The problem with finding ultimate knowledge about the world is that, with infinitely divisible units, it is impossible -- partly because they are so small, and partly because the room for error is so great.

But here is a counter-example. Suppose that even time has infinitely divisible units. And yet, the smaller the unit, the less room there is for error: instead of a series of events which happens for an extremely quick duration and is gone in a flash, a very small sliver of a unit of time -- magnified to present a significant duration for the purposes of study, of course -- will display a world that is almost immobile, perfect, well-nigh unaffectable (for even if something moves very fast, its displacement eventually approaches something very near to -- but not quite -- nil with each progressively smaller unit of time). This is on the surface in complete opposition to the earlier example of the divisibility of units in matter and the psychological sea. However, all three examples are similar in that they present an extremely warped view of the world that is not representative of what, as far as people are concerned especially in their daily lives, the world "really is". The goal of achieving knowledge of the world is thus shown not to be logically attainable by going into increasingly smaller and less relevant microcosms of what is usually experienced, most importantly at the first level of perception (the senses -- after all, one cannot see an atom), and next at the plateau of the question of reality (as scientific proof is well-nigh worthless in providing surety of even the existence of the world, especially against solipsism).

In conclusion, the Hawthorne experiments and the Uncertainty Principle demonstrate that we can never hope to know the world as it really is at the scientific level, and in that they do not answer the question of the existence of reality at all. However, the claim is also invalid because it does not appeal to our every (sensory or psychological) perception of knowledge of the world and, following the 'infinitely divisible unit' argument, presents an inaccurate picture of the 'world as it is'. The 'infinitely divisible unit' argument is particularly controversial, however, because even as it advocates the invalidity of the Hawthorne experiments and the Uncertainty Principle by proving them inaccurate, it also concedes that one can never pin down the subtleties of the world 'as it really is' in the scientific, micromolecular context. Therefore the Hawthorne experiments and the Uncertainty Principle demonstrate that we can never hope to know the world as it really is only to the extent that the 'world as it really is' has first been defined as that of the material, macroscopic world in which our perceptions exist; reality itself; or the scientific concept, where the answer to everything (edit: anything) can be found in units and particles.
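For the record, the trade-off the essay keeps circling has a standard quantitative form, which the paper itself never actually writes down:

```latex
% Heisenberg's uncertainty relation: the product of the uncertainties
% in position (\Delta x) and momentum (\Delta p) of a particle is
% bounded below by half the reduced Planck constant.
\[
  \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
\]
```

Squeeze the position uncertainty towards zero and the momentum uncertainty must balloon, and vice versa -- which is why no better camera or cleverer microscope gets around it.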



This was my mid-year paper. You can tell from the overly flowery language that I was getting quite high on adrenaline by the end. ('Psychological sea!' Whatever was I thinking)

If you read this, please counter-argue or otherwise comment, and I shall be a happy girl.

Saturday, May 13, 2006

This Has Nothing To Do WIth KI Homework.

I was surfing through the Cold War history on Wikipedia when I reached the page about the Berlin Blockade.

It was a very short page.

What was strange was that what struck me most about the page was not how many tonnes of food were delivered or thingies, but about Gail Halvorsen and his candy bombing.

'Halvorsen says he had the idea after giving a few sticks of chewing gum to some children watching the planes from outside the Tempelhof base. Wanting to give more, he promised to drop more candy from his plane the next day. Because the planes would arrive every 90 seconds, the children naturally couldn't distinguish his from the others. However, Halvorsen promised to wiggle the wings to identify himself, which led to his nickname "Uncle Wiggly Wings".'

I kept thinking, 'what a nice man'.

He was born in 1922 and he's still alive. Hmmm.



Today is Sunday.

I did my Math tutorial!!!! WOOT.

This is going to be a busy week. NAPFA is going to roll me into a pancake and fry me in foie gras. PreU Seminar will merely be the mint garnishing.