

October 29, 2015

Common Sense


In Doubt we Trust. Newsletter #23.

January 4, 2015

Fundamentalism is one of those modern predicaments that often come clothed in ancient garb. Religious fundamentalists like to tout their faithfulness to a pure version of their tradition. In practice, fundamentalism is more about exclusion than purity; co-religionists are often targeted for their impure faith – perhaps they sing and dance or celebrate a festival that they shouldn’t. As for those who are outside the circle, they are fair game. There’s no room for doubt or accommodation; certainty is the hallmark of the fundamentalist. When seen this way, there’s no shortage of scientific fundamentalists either. People like Richard Dawkins are as vehement in their atheism as any Taliban preacher.

It’s easy to see that certainty is incompatible with humility; without humility, there’s no going forward. Let me be clear: I am not talking about humility as an emotion – some of the most fundamentalist people I know are humble in their external attitude and fanatical in their faith. Humility is an orientation that recognizes one’s humanity and the incompleteness of one’s knowledge. That’s the attitude of the seeker, who is full of doubt, even if she comes to that doubt with great faith. If certainty is the standard of the fundamentalist, doubt is the engine of the seeker.

I like doubt because certainty is boring. Humility is not just a negative attribute, i.e., the lack of arrogance or omniscience; it is also a positive energy that propels one forward to ask new questions. Let’s put it another way: there are two ways of being, the answer way and the question way. The answer way wants certainty, though it will settle for closure when it can’t get certainty. Consider science, both as it is taught and as it advances: it does so by stacking one answer on top of another. Papers get published because they settle a doubt or verify a hypothesis. There’s no journal of questions. Engineers are more modest. There are no final answers, but products have to ship and customers have to be served, so there’s a temporary freeze on development. That’s what I mean by the term closure: you close off all options until further notice.

The question way has much less prestige. There are no patents for questions. There are no named professorships at Harvard for questions. In fact, it is often dangerous, as children learn quickly after asking awkward questions at home or school. On the other hand, a good question is like an arrow pointed at the uncovered belly of the dragon (I just saw the last episode of the Hobbit); it can bring the whole edifice down and usher in a revolution in thought. To the questioner, an answer is just a question’s way of asking another question. A hypothesis might well be verified, but verification is important only to the extent to which it is the key to another door.

The answer way makes a concession to the fundamentalist. It says, “I am ready to believe, but only when I see it.” Like the fundamentalist, the answerer wants certainty; he is just willing to test his faith a little more. Trust but verify. The question way makes no such concession. There’s always grass to be gathered and a fire to be lit. 

Mindfulness: The Monk in the Machine. Newsletter #21.

December 21, 2014

This week’s newsletter continues last week’s discussion of tradition. 

Some years ago, when I was a graduate student, I mentioned to a maverick cognitive scientist that I was beginning to look at Indian philosophy as a way of breaking through some of the conceptual puzzles in cognitive science. I had Bimal Matilal’s book on perception in my hand, which I handed over to him. He handed it back to me after a minute and said: “but this is too analytic; isn’t Indian philosophy more about sitting by the riverside and watching the river go by?”

Eastern men in robes have had a long run of making history in the west. It probably started with Vivekananda, Suzuki and Dharmapala in the late nineteenth century, succeeded by Gandhi and Tagore and a few decades later, the various gurus from Chogyam Trungpa to Osho. The combination of eastern mysticism and western science has proven itself a surefire bestseller.

Unfortunately, mysticism always lives in counter-culture, not in the mainstream. In fact, bringing meditation to the mainstream has required an explicit disavowal of anything mystical, or for that matter, anything to do with the Indian sources that it came from. Consider the immense success of mindfulness. Just take a look at the graph below, a Google n-gram of the use of the term “mindfulness” between 1950 and 2008. Do you see a trend?

If graphs aren’t your thing, you might be better persuaded by the recent popularity of mindfulness on network TV or the increasing number of celebrities and rich people attributing their success and sanity to mindfulness. Here’s a quick check of its effectiveness. Take any daily life activity – let’s call it X – and prepend ‘mindful’ to it, making it mindful X. In other words:

  1. Mindful eating
  2. Mindful work
  3. Mindful learning
  4. Mindful %$*

Doesn’t it sound so much better when it’s mindful? If you eat all the time you’re a pig but if you eat mindfully, you are a babe. Like yoga before it, mindfulness has traversed the hype cycle from niche to buzzword to suburban staple.  I don’t have a problem with that; may you be happy in your endeavors. If the meditation cushion is a stairmaster for the mind, more power to cushions. I start having problems when mindfulness becomes a theory of change. For example:

Before: Workers don’t have rights so they fight to unionize. 

After: Workers don’t have rights so they enroll in mindfulness classes. 

The first is an effective way of changing the world. The second, not so much. You might be thinking, what about that Buddha, didn’t he change the world through meditation? Well, the Buddha did change the world for the better. He did meditate. But did he change the world through meditation? What is meditation anyway? 

We think of meditation as one thing. Like science. But there are thousands of meditative practices, just as there are thousands of scientific techniques. Some of these practices are broadly of the kind we would term mindfulness. Others are quite different – prayer, analytic reading of texts, tantric visualizations and so on.

We wouldn’t take a scientist seriously if all she knew was matrix multiplication. Why is meditation any different? It’s a little bit like teaching people multiplication tables and assuming that they will be able to model the motion of planets. It doesn’t work that way. 

So my real problem with mindfulness is that it is immensely reductive and in being so, it lends itself easily to appropriation by powers that are anything but mindful. We don’t need any more drugs that blind us to the disasters unfolding everywhere. Especially not those that give the illusion of making the world a better place. Let’s meditate by all means, but let’s also inquire into the human condition, think critically and engage with others. In other words, do all the things that the Buddha did when he wasn’t meditating. 

As for those of us who are interested in the value of Indian texts and sources, the success of mindfulness is a cautionary tale: don’t put all your eggs in one basket. Think of science as Google, a universal index of what’s valuable. Just as Google can make your website very popular and then destroy your business model when its algorithms change or it makes its own version of your product, an over-reliance on science to validate your tradition can lead to trouble.

The Use and Abuse of Tradition: Newsletter #20

December 13, 2014

I am not much of a traditionalist. As far as I am concerned, the future is more important than the past. At the same time, human beings are shaped by history and geography; our past both constrains us and sets us free. As a result, I find myself caught between traditionalists and modernizers.

A few days ago, I posted a note on Sanskrit learning on Facebook and it attracted much more attention than I expected it to receive; clearly, Sanskrit has immense emotional resonance both to its votaries and to its detractors. Let’s set aside the political impulses behind Sanskritization in India (or Biblical learning in the US or Hebrew in Israel) and look at what thoughtful proponents might say on each side.

To the traditionalist, Sanskrit is the source of much wisdom, wisdom that’s been systematically denigrated and marginalized. The traditionalist would deploy scholarly resources toward the translation of Sanskrit texts and toward building a new community of scholars engaging with Sanskrit texts in the original as well as in translation. In other words, what European scholars did with Greek texts many centuries ago and continue to do to this day. In this view, Adi Shankara deserves as much attention as Aristotle. I agree with this view.

To the modernizer, Sanskrit is an elite language, forbidden to most inhabitants of the subcontinent by virtue of caste and gender. It is the language of a deeply unequal system. To the extent it has interesting ideas, the ideas are so far removed from modern concerns that there isn’t much in the way of practical wisdom to be gained from studying these texts. I agree with this view as well.

It does seem like a contradiction, doesn’t it? Let me explain why it isn’t.

Creativity and tradition

To the extent that tradition is to be preserved rather than built upon, it’s dead. In other words, when someone says that all of us should learn the Vedas – setting aside the fact that such a practice would explicitly contravene the tradition itself – I hear someone clutching at straws.

For the sake of argument, let’s assume all modern scientific knowledge is contained in the Vedas. So what? How does that help us do science better? In fact, consider that Newton’s Principia contains a very large portion of modern scientific knowledge; yet no one is asking all school children to read the Principia. Instead, we teach them classical mechanics and the calculus. A living tradition has ways of translating texts into theories.

In fact, the core of the fundamentalist’s condition is a tragedy. They are tacitly aware that their tradition (independent of the religion or ethnicity involved) has lost its bearings, that it can no longer offer a credible response to the human condition, and yet, as creatures of history, they know we can’t get out of the well into which we have fallen without using the tradition itself as a ladder.

Where there’s tragedy, there’s also hope. A deep tradition has the resources to spur creative responses even as it abandons some of the cherished assumptions of the past. That’s what scientists do when they set aside theories of matter; that’s what Carnatic and Hindustani musicians have done to keep their musical traditions alive. In other words, let’s treat our traditions as artists do, as creative resources. We build our castles on top of foundations dug by others. Even radical change needs a launching pad. A Picasso needs a Rembrandt; a Gandhi needs a Ramakrishna.

So my counsel to the Sanskrit traditionalists is this: inhabit the premises of this ancient house and see which beams need to be strengthened, which walls need to be torn down and which rooms need to be repainted. Be merciless in that vision. My counsel to the modernizers is this: do not think yourself outside this history. 

The Right Abstraction: Newsletter #19

December 6, 2014

 

I have been fascinated with abstraction for as long as I can remember. The disciplines I am drawn to instinctively – mathematics, physics, philosophy, programming, literature, design, cognition, religion; to name a few – are all disciplines that truck in abstractions. 

Good abstractions make life easier for all of us and greatly enhance human culture. Writing is a good example: we abstract away particulars such as handwriting and font size or physical location and label them under one heading: “this is so and so’s article.” There’s a sense in which all copies of Shakespeare’s Hamlet are the same or close enough to being so. 

As you can see, abstraction is tied to identity – if I were to think of one thing that defines the process of abstraction, it’s the ability to categorize two different things as the same. Some abstractions are natural, such as personal identity: after all, without abstraction, how would you experience yourself as the same person over a twenty-year period while your hair is falling out and your teeth are decaying? Other abstractions are human-created, such as desktop UIs that help you copy and paste files.

Desktop user interfaces tell us that abstraction isn’t opposed to concreteness; in fact, some of the best abstractions make entities tangible to us, just as the mouse and the keyboard make files on a computer available to us. Tangible abstractions are all around us; words on a screen being the best example. Tangibility is one of two principles I consider paramount while designing good abstractions; the other is representation. The principles can be summarized in two short slogans:

  1. No abstraction without representation. 
  2. Make things tangible. 

When we represent something in language, art or law, we make it explicit; we give it rights and responsibilities; in a nutshell, we take it seriously. That’s why representative democracy, for all its faults, is better than the people’s republics. Sovereignty – another abstraction – has no meaning if it isn’t translated into institutions such as parliaments that represent that abstraction. Abstraction without representation is toothless. For example, I cringe whenever someone talks about balancing the needs of development and the environment. Economic growth is very well represented. Corporations are in the business of turning abstract theories of growth into real profits. The environment isn’t represented at all. While there are laws, implementation is poor and there is no one who speaks for nature. Until we create institutions that represent the non-human world, we can’t talk about balancing environmental needs with economic needs.

Representations are even better when they come with a tangible interface, an API in the computer programming sense of that term. The lexicon is an interesting abstraction, but dictionaries make the lexicon tangible. You might be surprised to know that standardized spellings are very recent; as late as the eighteenth century, people would spell a word in different ways in one article or book. The lexicon represents words. Dictionaries make the lexicon tangible. Abstractions such as computer mice and keyboards make computing tangible. Voting makes democracy tangible.

Tangibility works as a great UI for abstraction when the interface behaves the way you expect it to, even as it invites you into a space that’s different from anything you have experienced before. For example, Leibniz’s dy/dx notation makes calculus tangible, because it helps you manipulate infinitesimals the way you manipulate regular numbers even as it takes us far away from the world of bare multiplication and division.
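To illustrate (the worked example is mine, not from the original newsletter): the chain rule in Leibniz notation reads like ordinary cancellation of fractions, even though the infinitesimals aren’t literally numbers you can divide.

```latex
\frac{dy}{dx} = \frac{dy}{du}\cdot\frac{du}{dx}
% Example: if y = u^2 and u = 3x, then
% \frac{dy}{dx} = 2u \cdot 3 = 6u = 18x,
% exactly as if the du's had "cancelled".
```

That the notation behaves as expected under manipulation, while quietly delivering a genuinely new idea, is precisely what makes it a good interface.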

These two principles aren’t a historical curiosity; as software leaves the world of the screen and enters the world of gestural interfaces and physical objects, we will need an entirely new framework for understanding abstractions, including new representations and new interfaces. The dominant abstractions of the last two thousand years are all 2D abstractions, i.e., abstractions made tangible on paper and screen. As software and hardware intermingle – itself a new abstraction – we are faced with the task of building 3D abstractions. The world will assimilate software as software eats the world. 

Newsletter 18: The Society of Knowledge

November 29, 2014

We live in a knowledge society but we don’t have a universal class of knowledge professionals. Every profession deemed universal is represented throughout society. Doctors ply their wares in rural clinics, small town hospitals and the Harvard Medical School. Lawyers occupy the White House every four years. Engineers and architects work for the department of transport, the local real estate contractor and Google. There’s a teacher in every village.

However, we don’t find knowledge professionals anywhere besides universities, where they’re typically called professors. Even there, professors aren’t certified as knowledge professionals but as bearers of some specialized body of knowledge. There’s nothing that makes a professor into a professor; there are only professors of history and chemistry. That’s strange, for lawyers can’t be lawyers without passing the bar, engineers need to be certified and teachers need a degree in education. We mark our respect for a profession by declaring a badge that certifies entry into that profession. That certificate also universalizes the profession, so that it can take root in every nook and corner of modern society. Every startup has a CEO, a CTO and a COO. They don’t have CKOs. The ivory tower has prestige, but intellectually, it’s as much a ghetto as it’s a beacon.

You might say that a PhD is the certificate for professors. It’s partly true, but most PhDs aren’t professors and will never be. Most PhDs leave the profession of professing, or worse, languish as adjunct faculty. If the certification is a signal of respectable livelihood, then a PhD is a very poor guarantee. Imagine the heartburn that would ensue if 70% of those with a law or medical degree had a position that paid close to minimum wage and no hope of getting a better job.

In any case, a PhD is a certification of specialized knowledge, not of knowledge as such. A knowledge bearer should be closer to a philosopher, a practical philosopher, than a possessor of arcane information. Socrates thought his role was to be the midwife of wisdom. I believe that role is far more important today than it was in Athens in 399 BCE. We are deluged by information on the one hand and plagued by uncertainty about the future on the other. The information deluge and uncertainty aren’t unrelated; the world is changing quickly, which leads to more information – both signal and noise – and more uncertainty.

In times of knowledge scarcity, knowledge professions are gatekeepers to access – which is why we have priesthoods and ivory towers. We have moved far from those times. Knowledge is no longer about access but about value: what trends are important and what are fads? What’s worth learning and why? In the future, every individual, every company and every society will rise or fall on the basis of its understanding of value. We need a new category of professionals who will act as weather vanes for the new winds that are blowing; people who understand data making and meaning making. They shouldn’t be content with being midwives of wisdom. Instead they should boldly go where no one has gone before and take us with them.

Newsletter 17: Communicating Knowledge

November 23, 2014

I have been thinking about knowledge and collaboration for a long time, for it greatly affects my own life as a scholar and researcher. The open source movement didn’t invent collaboration; academics were collaborating freely – both as in beer and as in freedom – before software engineers. After all, professional engineers work on products that are bought and sold, while academics (in principle, if not in practice) share their wisdom in return for society’s generosity in funding their exploration.

In practice, software engineers collaborate a lot more and a lot more freely than academics do. Wherever you look, the situation is better in industry with all the cut-throat competition than in academia, with its public charter. Some of it is because academia is actually a lot more cut-throat than industry – there are fewer jobs and there’s less money. Further, unlike an industry professional who can sell expensive widgets for a living, an academic only has their data and their content to flog to the world. The sociology and the economics of academia is well understood now and I will remain silent on this issue from now on; you can always read the Chronicle of Higher Education to see the daily lamentation.

Let me talk about a structural issue instead. You might have heard of the famous slogan: “the medium is the message.” In other words, the means through which you communicate influences the content of your communication. TV news is not the same as newspaper news. For the same reason, academic collaboration isn’t the same as software collaboration. Software collaboration – mostly done via version control systems – is real time, ongoing and continuous. The time cycle is in the order of hours, if not minutes. The technologies that support collaboration are more or less instantaneous: you run git push origin master and your collaborator has your contribution in front of them.

Academic writing is a lot slower. Its collaboration technology is built around citations, responses and feedback that have a time cycle of months or more. That worked well in the seventeenth century; now, not so well. It’s true that you can write your scientific paper on a Google doc and see your collaborators’ responses in real time. But that’s missing the point – collaborating on an office document has none of the language and ritual of paper writing. Every element of a scientific article, from the abstract to the introduction, the citations, the data, the discussion, the conclusion and the references, is designed (unconsciously, as a result of a slow evolution over centuries) to address a single problem: how can I communicate my work to a community that lives far away from me and doesn’t have access to my mind or my lab? It’s that mental organization that has enabled a scholarly edifice, built on top of each other’s work. Unfortunately, that design now has a half-life of months.

We now expect instant feedback from our communication systems – wherever you look from phone and Skype to SMS, Whatsapp and Facebook messenger, people are used to ongoing, real-time conversation across the world. When I first came to the US in the early nineties, I was still writing letters by hand to my friends and family. Most of them didn’t have a phone or a computer. I would write a letter, post it and then wait for a month or so before I received a reply. In a couple of years, we had all switched to email. It’s true that the handwritten letter had an emotional impact that an email can never have, but for most purposes we don’t need that handwritten note. Certainly not in an academic setting. Scholarly collaboration needs to reflect this new cognitive landscape. A revolution in knowledge needs a revolution in communication.

Weekly Newsletter #16: Text is Technology

November 16, 2014

If you have been following my newsletters, you know that I am obsessed with text in its various forms:

  1. Writing
  2. Code
  3. Mathematics
  4. Stories

and so on.

As a – more or less – universally literate society, we have pushed text into the background. We read text, but we don’t examine the mechanisms behind text. It’s useful to view text through an engineer’s eyes, since text is technology. It is, in fact, the technology that makes idealism possible.

Philosophers have talked about the clash between idealism and realism for millennia. Simply put, realists privilege hardware over software, while idealists privilege software over hardware. That distinction plays out in every human endeavor. In science, idealists privilege ultimate laws and principles (think string theory) while realists privilege manipulation and prediction (think biology). In foreign policy, realists talk about “the national interest” while idealists talk about democracy and freedom. In IT, idealists write software and realists build hardware.

I am a software kind of guy, though I find hardware immensely fascinating; every major human advance is an interplay between the two. Gutenberg makes both Cervantes and Galileo possible. Computing brings texts and materials together in an unprecedented manner that we’re only beginning to unravel.

Why do I say that?

Consider the archetypal piece of hardware: the machine. Turing showed that machines are nothing but text – the essence of a machine can be captured by writing and erasing 0’s and 1’s on paper. With computing we are now able to control and move objects by writing about them. How amazing is that?
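Turing’s observation can be made concrete in a few lines. The sketch below is my illustration, not Turing’s original construction: the “machine” is nothing but text – a table mapping (state, symbol) pairs to actions – and a tiny loop that reads and writes characters on a tape.

```python
# A Turing machine is just text: a transition table mapping
# (state, symbol) -> (symbol to write, head move, next state).
def run(table, tape, state="start", head=0):
    tape = list(tape)
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"  # "_" is blank
        new_symbol, move, state = table[(state, symbol)]
        if head == len(tape):
            tape.append(new_symbol)
        else:
            tape[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(tape).strip("_")

# A toy machine that flips every bit, halting at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(flip, "1011"))  # prints 0100
```

The entire machine fits in a dictionary literal – erase the table, write a different one, and you have a different machine, all without touching any hardware.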

We don’t really understand how that happens though; I think we await new innovations in abstraction before we will understand how text can move stuff. If that sounds awfully like the mind-body problem, you would be right. In other words, AI, cognitive science and the philosophy of mind are tied to the evolution of writing. Put another way, the future of media is the future of the mind.

This Week’s Links

Two articles on media and text.

  1. Why text still rules. Posted earlier, but especially relevant to this discussion.
  2. The future of new media.

Weekly Newsletter #15: Coding Philosophy

November 9, 2014

As I have said on other occasions, code is language and programming is a natural evolution of writing. In my opinion – I am a biased observer – philosophy and literature are the high points of textual culture. Mathematics comes a close third, but I tend to subsume mathematics under philosophy, since it’s a humanistic pursuit masquerading as science. Let’s set literature aside for the moment and ask how writing influences philosophy and mathematics.

Philosophical prose is most effective when it invents new forms of writing. Technical concepts are essential, but so are symbolic representations, of which logic is the most important. Logic has been the medium of philosophizing. Starting with Aristotle but greatly enhanced in the last hundred odd years, we expect rigorous philosophical argument to be cast in logical form.

Logic was my first love before I turned toward other mathematical pursuits, but I didn’t find it expressive enough for my needs. On the other hand, while mathematics is expressed formally, it doesn’t concern itself with the nature of form itself. So we are stuck with two alternatives:

  1. Logic, with its emphasis on syntactic form and lack of expressive richness
  2. Mathematics, with its extraordinary diversity of expression, but lack of curiosity about the nature of form 

Is there a third alternative? I believe so; in fact, I am beginning to think that programming is that alternative. Programming is rich in expression – it is grounded in text, but makes room for many other sensory inputs; it also interfaces with most of the devices we make these days. Programming also concerns itself with the nature of form – one look at the sharp debates about the merits of various programming languages (C vs C++ vs Lisp vs Haskell vs Javascript) and programming paradigms (OO vs Functional vs Imperative) shows the depth of concern over the form of programs, not just their content. 

In this picture, programming is distinct from the theory of computation. In fact, the two are no more related than literature is related to the theory of pens and pencils. It’s time to abstract programming from its origins in computation. Once you do so, it becomes clear that code is an excellent vehicle for philosophy. Let’s take a look at one prototypical use of programming to improve current philosophical practice.

For the most part, philosophical arguments are written in prose with a leavening of logic – if you’re working in the analytic tradition; I have no idea how continental philosophers leaven their prose. The architecture of these arguments is difficult to decipher. Try reading Kant’s Critique of Pure Reason and you will see what I am talking about. Philosophers do use tools such as thought experiments to convey their intuitions, but the arguments remain as dense as ever. 

Now imagine rewriting philosophical arguments as computer programs, with thought experiments and other “active” elements being written in code and the main flow of the argument in the comments (as in commenting on the code). Further, we can modularize philosophical arguments into separate code chunks and invoke relevant chunks as function calls or as separate modules. A typical argument module would look like this: 

theory_of_justice.phil

include veil_of_ignorance
begin argument
....
end argument

The cognitive advantages alone are worth the effort – it’s so much easier to create shared mental models when the design of an argument is well organized and available at a glance. When expressed in code, philosophy becomes an active discipline; a discipline one can experiment with and demonstrate to the public. These are some of the deepest intuitions that human beings have ever had about the nature of reality. All of us care about them but most of us can’t access them except in a watered-down or new-agey form. That’s a real pity. Socrates conducted his philosophizing in public, out in the street. Programming can help return that spirit of street metaphysics. Code can do for philosophy what calculators did for arithmetic.

I am deliberately staying away from the idea of philohacking, but the phrase is on the tip of my tongue. Philohacking makes it that much harder to bullshit one’s way through philosophizing. At the same time, philosophy will get a second wind; it will no longer be consigned to its current role as a commentator on the sciences. Instead, this oldest of subjects will reconnect with its roots in the aesthetic and creative impulse.

The advantages for programming (and programmers) are equally deep. For one, the philosophical lens will help formalize what all programmers know already: the comments are as important as the code. Philosophical training can help programmers become as rigorous and creative about the comments as they are about the code they write. Think of the philosopher as the yin to the programmer’s yang: ultimately, both are building symbolic structures grounded in language or language-like formalisms. The merger of the two will create an entirely new discipline.

This Week’s Links

No links this week – wasn’t reading enough and nothing that inspired me in the limited reading that I did. 

Weekly Newsletter #14: Universal Knowledge

November 2, 2014

Many of us have a dream: make all the knowledge in the world accessible to everyone. Part of that problem is that we have struggled immensely to access knowledge – texts, mentors and a peer network. The internet has made accessing information much easier, but if anything it has made accessing knowledge harder, for it has added an additional layer of complexity to the seeker’s pursuit.

Information overload is arguably worse than scarcity, for it makes it much harder to know what’s worth pursuing. Add that to the inherent complexity of knowledge, and you have a very hard system to crack. Knowledge is not information. Information, for better or worse, is objective, mechanized and easy to access in the age of Google. It’s also no longer a source of livelihood. Knowledge, on the other hand, is inherently value-laden, socially mediated and greatly influenced by power relations.

As a characteristically human activity, knowledge work is a plausible path to livelihood, career and identity. Making knowledge available to all is intrinsically tied to making knowledge work available to all. That’s a much harder problem than making information available to everyone because of the complexity of human knowledge.

The Minaret

Human knowledge has become enormously complex over the last five hundred years. Three hundred years ago, it was possible for a scientist to know all the science that existed. Two hundred years ago, it was possible for a good mathematician to know all the math that existed. A hundred years ago, a good algebraist would have known all the algebra that existed. Now, it’s impossible for a good algebraic geometer to know all the algebraic geometry that exists.

The Mughal emperor Shahjahan was imprisoned by his own son, Aurangzeb, and spent his last days in one corner of the fort in Agra. Fortunately, Aurangzeb was kind enough to house Shahjahan in a room whose window overlooked the Taj Mahal. From his minaret, the deposed emperor could gaze upon the tomb of his beloved late wife, in whose honor he had spent half the empire's fortune building that great paean to romance.

Our own predicament as a knowledge civilization seems similar to the deposed emperor's. Of all the things we are proud of in the modern world – from spaceships to iPods, from human rights to the UN – we are perhaps proudest of knowledge. Technological fads come and go, scientific theories fall by the wayside, but our collective capacity to build layers of knowledge on top of each other is the foundation for all other innovation.

We believe that our collective creation of knowledge is one of the great edifices in human history. An unfortunate consequence of our impressive tower of knowledge is that we are all stuck in our respective minarets. That's the price of success, right? Isn't it OK for us to be little cogs in the giant wisdom wheel?

Let me offer a counter view: our knowledge system is obese and unhealthy. The very immensity of our knowledge system is making it unstable. Major structural errors are invisible to us because we only have the view from our own minarets. Now, there's nothing wrong with an incomplete knowledge system; we are finite creatures, and absolute truth is permanently hidden from us. However, our current knowledge system claims to be absolute in its aims; not only that, it behaves as a totalizing system in practice, with complete control over all the channels of knowledge, all means of legitimacy and advancement, and all access to livelihood. In other words, it is an idolatrous system. I want to start a constructive program to create an alternate system. A system that's radically simpler, even simplistic. It should literally be child's play.

Building Blocks

What might an alternate set of building blocks look like? Some thoughts below:

  1. Transparency: A building block should be cognitively transparent. If you are in a learning phase, the block can have a tiny bit of difficulty that gets ironed out with practice. Of course, what's cognitively transparent for me may not be cognitively transparent for you; therefore, we need smart ways of figuring out where a particular individual stands.

  2. Play: It should be fun, at least in the early years, to play with these knowledge blocks. Deep play, which teaches you new skills while situating you outside the realm of social climbing, is most likely to make old knowledge easy to digest and to produce new knowledge.

  3. Craft: We should assume that the production of knowledge is a profession, like any other craft profession. It should be rewarding and lead to a sustainable livelihood, but at the same time it shouldn't be dominated by stars and elite conceptions. If knowledge is rooted in practice, there's no reason to push "research" as its main goal. We obviously want to push the boundaries of knowledge, but why should that be done in a spirit of competition, distrust and ego? Instead, we are much better off creating systems for pushing those boundaries collaboratively. That's the right thing to do in both senses of the term: it's a surer path to the truth, and it's the ethical thing to do, since it uses fewer resources and creates public goods as a designed outcome.

  4. Livelihood: We now have an incredibly hierarchical knowledge system. At the top are the elite knowledge workers, the philosopher kings who are paid to think deep thoughts. Then there's a much larger community of professionals who have permanent positions but aren't seen as creators. Then there's an even larger community of adjuncts who have neither security nor any status as creators.

This hierarchical knowledge system is wasteful, for investigators see themselves as competing with each other for scarce resources. If, instead, we think of knowledge workers as craftsmen, they should be supported for replicating their skills in other people. A knowledge society needs a large population of knowledge workers whether or not they create new knowledge, just as we need lots of doctors whether or not they can treat new diseases.

This week’s links

Only marginally related to the topic of the newsletter, but great reading anyway:

  1. Alan Moore’s million word novel in the making. 
  2. Rescuing "The Philosopher" from his caricature. FYI – such was Aristotle's fame in the Middle Ages that he was simply called "the Philosopher," as if no one else could attain that stature.