dendritic arborization • I like that phrase

disordered thought processes

hidden in the seeming chaos is beautiful, elegant order—at least, I hope that's true.

teaching a computer to read your mind

posted on

The crux of the eternal static-versus-dynamic-typing debate is how much you are willing to let the computer (or, more accurately, the language implementers) decide what you mean. Those who favor static typing tend to favor explicit direction over implicit, intuitive understanding, and strictly-defined categories and hierarchies rather than free-for-all tag webs and interconnections. The static typist immediately recognizes that the computer (specifically, the compiler or the interpreter) is a non-intelligent entity that must be told exactly what to do, or else you’re liable to saw your own foot off. The dynamic typist, while not delusional about just how intelligent the computer is, is willing to have a little more faith in the language implementers, believing that they will do the Right Thing™ with the input that is fed to them.

It is apt that most dynamically typed languages are interpreted, and it is not just metaphoric that the most important part of an interpreter is the parser. I believe the goal of dynamically-typed languages is to take input from a programmer and output machine language that does pretty much what the programmer means. In other words, interpreters of dynamically-typed languages need to be able to understand human idioms.

This is intrinsically more likely to result in unexpected behavior, but we aren’t talking about obvious operations like addition or concatenation. We’re talking about more abstract things, like how certain tables in a database should be related to one another, or what to do with that extra parameter when the object receiving a message isn’t expecting it.

The static-typist is most comfortable with telling the computer exactly what to do in these circumstances. This can be a time-consuming and error-prone activity. The dynamic-typist is probably more willing to let the language implementers make these decisions, with the knowledge that what you thought you put in may not be what the interpreter actually thinks you put in, resulting in perhaps wildly erroneous output (but not a crash!).

Some authors point out that it is a fallacy to frame this as strong typing versus weak typing. Any language that lets you implicitly cast between different types is de facto a weakly-typed language, and the most popular statically-typed languages allow you to do just that.
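Ruby makes the distinction concrete: it is dynamically typed but strongly typed, and it refuses to cast implicitly. A minimal sketch:

```ruby
# Ruby is dynamically typed yet strongly typed: mixing types without
# an explicit conversion raises a TypeError instead of silently casting.
begin
  "1" + 2                      # no implicit Integer -> String cast
rescue TypeError => e
  puts "refused: #{e.message}"
end

puts "1" + 2.to_s              # explicit conversion works => "12"
```

Compare that with C or Java, which will quietly promote an `int` to a `double` (or, in Java, fold an `int` into a concatenated `String`) without being asked.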

So the real axis is static vs dynamic typing.

But I think another level on which to think about these things is the difference between brittle handling and flexible handling.

In this day and age, when we have clock cycles to spare and CPU cores running idle, I think we should really expect more of our computers. I really think that computer programming should be more tolerant of bad input. I’m not saying that a compiler should just ignore syntax errors, or let a grossly mis-cast variable go ahead and ruin your stack, but language implementers really should start using all this extra CPU time we have. Capabilities like self-reflection are the beginnings of this.
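As one small illustration of the flexible handling I have in mind, Ruby’s `respond_to?` lets code probe its input at runtime and degrade gracefully instead of crashing (`describe` here is a hypothetical helper of my own, not a library method):

```ruby
# Use runtime reflection to handle objects that don't support the
# operation we hoped for, rather than raising a NoMethodError.
def describe(obj)
  if obj.respond_to?(:upcase)
    obj.upcase      # strings (and string-like objects) get upcased
  else
    obj.inspect     # everything else falls back gracefully
  end
end

puts describe("hello")   # => HELLO
puts describe(42)        # => 42
```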


posted on

Ang hindi marunong lumingon sa pinanggalingan, hindi makararating sa paroroonan
{If you don’t look back to see where you came from, you won’t make it to where you’re trying to get to} —popularly ascribed to Jose Rizal, physician, poet, polymath, and icon of the Philippine Revolution

In six months, my plan for the future will officially run out.

Now, if you happen to know me, you’ll know that I’ve never been one for planning for the future. I like to think of myself as being spontaneous, and being able to go with the flow. The alternate way to spin this, though, is that at very random times, I can be impetuous and impulsive, with no regard, or at least very little regard, for the consequences at hand. And I tend to favor the path of least resistance, which we all know eventually leads to the bottom of the sea.

But there are good, well validated, reasons for my devil-may-care attitude. If you think about it, we control very little about our lives. The outcome and course of your life are at least 50% determined by the personal characteristics and socioeconomic status of your parents. Things like what kind of high school you go to are completely controlled by (1) accidents of geography and (2) how much money your parents are willing to part with in order to stack the deck in your favor when you apply to college. Other things, like who you make friends with, and who (or if) you marry, are also largely accidents of propinquity. If you stop and think about it, the things you have the most control over are things you probably take for granted, and do over and over again, day in and day out, and sadly, rarely savor and enjoy.

Hence, the thing that my oldest friend taught me when we were still in high school: the simple pleasures in life are what count the most.

But this is where the path of least resistance thing clicks into place: at least 67%-75% of my extended family are in health care. So it was probably the most natural thing for me to follow in their footsteps.

There were many junctures in my life where this particular goal came into extreme doubt, but the decisions are too many, and there were too many twists and turns for me to even imagine what my life would be like now if I had chosen otherwise.

In the end, this one goal I set for myself, and at times took for granted, guided the course of the last ten years of my life.

Truth be told, I’m going to be three years behind schedule, but by my standards that’s a pretty decent margin of error. I’ve most definitely taken some interesting detours along the way. If everything had gone according to plan, though, I would’ve been done with medical school and residency by the time I was 28.

On the other hand, sometimes Luck gives you small, but valuable and worthwhile, gifts. If I had started residency more than a year earlier than I did, I would’ve done my intern year without the protection of the mandatory 80-hour work week or the 30-hour (24+6) day. (As miserable as it was already, I can’t even imagine how awfully painful this would’ve been.) If I had finished medical school a year later than I did, I would’ve had to pay $1,000 for the privilege of taking the clinical skills examination as part of the licensing process, which basically involves spending half a day interacting with fake patients. From what I understand, the only real thing it does is weed out the complete sociopaths, but of course no one is ever going to do a rigorous analysis of whether or not this test even succeeds at this modest task.

It’s very strange. I have been in the revenge business so long, now that it’s over, I don’t know what to do with the rest of my life.
— Iñigo Montoya from “The Princess Bride”

But I am certainly at an Iñigo Montoya moment. I’ve been so busy with becoming a physician, that I have no idea what I want to do with the rest of my life. I have this premonition that it’s not going to be an overly long span of time, in any case, since I don’t do very much to maintain my health, and I’m finding that it’s actually starting to fail in some ways, but whether we’re talking about the next few years, or the next few decades, I still find this great big yawning chasm of the unknown before me.

I suppose I also learned a lesson from the movie “City Slickers”, specifically from the late Jack Palance: one thing.

All it takes to have a good reason to live is to have one single goal.

And that would be a Christmas wish come true.

full circle

posted on

Now it all makes sense.

Apparently, the trend for all computer languages is to approximate Lisp. And, at least today, Ruby is as close as you can get without having to use recursively nested parentheses.
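A tiny illustration of the claim: idiomatic Ruby reads much like Lisp with the parentheses rearranged. (The Lisp equivalents in the comments are rough, not exact.)

```ruby
# Higher-order functions in Ruby, next to their rough Lisp equivalents.
square = ->(x) { x * x }                # (lambda (x) (* x x))
puts [1, 2, 3].map(&square).inject(:+)  # (reduce #'+ (mapcar square '(1 2 3)))
```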

The interesting thing about Lisp is that it was designed to be a notation rather than an actual programming language. Lisp was intended to be a systematic way in which you can describe a Turing Machine. In other words, it was meant for communication, specifically, from one human to another human, and was not just meant to tell a CPU how to spin its registers.

While people talk about the elegance of lists of lists (and it is elegant in its way), what I find most intriguing about Lisp is its human language aspect.

One can argue that perhaps car and cdr are not the most transparent things in the world, but I think it’s not for nothing that Lisp is associated with AI in the minds of those of us who have only a bare inkling of it. In truth, the most I have ever been exposed to Lisp is random dabblings in Scheme, and my time spent screwing around with Logo.

With Logo, the language aspect always intrigued me. Like I mentioned, I had long been interested in writing a text adventure game—interactive fiction. And parsing natural language is one of the things that lists of lists are good at handling. But it ended up way beyond what my 8-year-old brain could handle, so I moved on to procedural languages, which were easier to grok.
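For a sense of why lists of lists suit parsing, here is a toy s-expression reader in Ruby (my own sketch, not anything from Logo or Scheme) that turns a parenthesized string into nested arrays:

```ruby
# A toy recursive-descent reader: s-expression string -> nested arrays,
# the "lists of lists" structure that Lisp parsing is built on.
def read_sexp(tokens)
  token = tokens.shift
  if token == "("
    list = []
    list << read_sexp(tokens) until tokens.first == ")"
    tokens.shift  # consume the closing ")"
    list
  else
    token         # an atom stays a plain string
  end
end

tokens = "(add 1 (mul 2 3))".scan(/[()]|[^()\s]+/)
p read_sexp(tokens)   # => ["add", "1", ["mul", "2", "3"]]
```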

And interestingly, the whole thing with the Lines of Code issue is not necessarily LoC per se, nor is it even necessarily complexity. The issue is what I would call semantic density, and it’s related to the sociolinguistic concept of context-dependence.

In the end there is a trade-off: the potential ambiguity of implicitness for the tedium of explicitness. —JA Robson’s comment on “Size is the Enemy” on Coding Horror

This is the difference between languages like Ruby and Perl versus Assembly/C/Java, and it is related to the difference between Mandarin Chinese and English.

The thing to remember is that, while most people just think of a programming language as a way to tell the CPU what to do, in reality, it is just as important as a way to communicate your intent and your formulation of concepts to whoever else might have to maintain your code.

Particularly in this post-modern world, where everything worth doing has already been done, and all the clever algorithms have already been formulated, trying to re-invent the wheel is an exercise in futility.

This is probably why Lisp has such appeal. Ultimately, it is a human language for talking about computer algorithms. This is where it gets a lot of its power: it leverages the existing language of mathematics.



posted on

I dig this quote:

All models are wrong. Some are useful.
—unnamed source quoted in “The Mythical 5%” by Bruce Eckel

This encapsulates how a scientist must think (although probably very few are actually able to keep up this amount of skepticism).


recapitulation of the ontogeny of computer languages

posted on

Steve Yegge’s rant about huge code bases and how Java exacerbates the problem is definitely circulating the internets. Jeff Atwood at Coding Horror chimes in and agrees wholeheartedly.

I recall running into static typing issues making my code unnecessarily verbose back when I was using Turbo Pascal. (Mostly because you had to declare how long strings were, and because of the way you had to declare records/structures, you might even have to create types of certain string lengths, like String80 for an 80-character string, and String78 for a 78-character string.) But the issue really isn’t typing. True, I do believe static typing tends to add nearly meaningless, semantically sparse lines to your code, but if you think about it, most of the popular interpreted (i.e., scripting) languages are actually strongly typed as well. In Microsoft Basic 2.0 (the interpreter that the Commodore 64 used), you declared type by appending a special character, so you always knew that A$ was a string, A% was an integer, and A was a float, and that was all you really had, unless you counted arrays, which were made up of these base types. In Perl, $var is a scalar (which can be a string or a number, to be sure, so maybe it’s not that strong in terms of typing), @var is an array, and %var is a hash. And while Ruby doesn’t really use these markers, you have to explicitly re-cast Integers to Strings (using to_s) or to Floats (using to_f).
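The contrast is easy to see in Ruby, where the type travels with the object rather than living in a sigil on the variable name:

```ruby
# In BASIC and Perl the sigil marks the variable's type; in Ruby the
# type belongs to the object, and conversions are explicit method calls.
x = "3.5"
puts x.class        # String
puts x.to_f.class   # Float
puts x.to_f + 1     # 4.5
puts x.to_i         # 3  (truncating conversion)
```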

So typing doesn’t have anything to do with it.

I think the problem is that most people don’t construct sane objects.

Sometimes the problem lies in the base classes of the language. There may be too many redundant classes—similar but not-quite-the-same—and you end up having to figure out which methods can take which objects, and sometimes you end up writing all sorts of kludgery to use the objects you want and process them with the methods you want—because polymorphism in C++ isn’t all that it’s cracked up to be.

Sometimes the language isn’t quite completely object-oriented, and a lot of the idiom is still procedural in style. A lot of these languages had OOP grafted onto them, when their lineages are clearly from procedural languages. (Perl is certainly evidence of this, as are ObjC and C++.)

On the other hand, it seems like most of the coders who disagree with Yegge and Atwood are more into procedural code, and some are even overtly hostile to OOP. I think part of the problem lies in the fact that a lot of people ended up learning OOP through C++, which, from what I remember, isn’t much fun. It was easier to write kludgy C than it was to deal with C++’s class system.

The other thing is that it seems really difficult to tailor your classes to the functionality you want. While most OO languages allow polymorphism, it isn’t quite exactly what you need. I feel like the right solution to the problem of creating subclasses is to either go the ObjC/Smalltalk way and utilize message passing, or go the Ruby way (which also allows message passing) and utilize mix-ins.
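A minimal sketch of the mix-in approach in Ruby (the module and class names here are invented for illustration):

```ruby
# A mix-in shares behavior across unrelated classes without subclassing.
module Greetable
  def greet
    "Hello, #{name}!"   # assumes the including class provides #name
  end
end

class Patient
  include Greetable     # pulls in #greet as an instance method

  attr_reader :name

  def initialize(name)
    @name = name
  end
end

puts Patient.new("Ada").greet   # => Hello, Ada!
```

The point is that `Greetable` imposes no place in a class hierarchy: any class that supplies `#name` can include it, which sidesteps the awkward single-inheritance contortions described above.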

The way to decrease the number of lines a developer needs to write is to come up with intelligent (not just intelligently-designed) base classes and interfaces. So when you throw a not-quite-right object at it, it won’t just crash out, and it won’t perform completely unexpected and often destructive operations. The idea of self-reflection is a step in this direction. Rails uses this to its advantage (as do ObjC and Smalltalk, from what I understand).
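Ruby’s `method_missing` is one concrete form of this: an object can catch a message it was never explicitly taught, inspect it, and do something sensible. The dynamic-finder sketch below imitates the Rails idea, but it is illustrative code of my own, not actual Rails internals:

```ruby
# Catch unknown messages like find_by_name and treat them as queries,
# instead of crashing with NoMethodError.
class Finder
  RECORDS = [{ name: "Ada" }, { name: "Grace" }]

  def method_missing(sym, *args)
    if sym.to_s.start_with?("find_by_")
      key = sym.to_s.sub("find_by_", "").to_sym
      RECORDS.find { |r| r[key] == args.first }
    else
      super   # truly unknown messages still fail loudly
    end
  end

  def respond_to_missing?(sym, include_private = false)
    sym.to_s.start_with?("find_by_") || super
  end
end

p Finder.new.find_by_name("Grace")
```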

Ultimately, you have to choose: do you want to control the means, or are you more interested in the ends? What assembly and C allow you to do is control explicitly, or almost explicitly, what the machine does. I believe this necessarily has to come at the expense of the end results. In contrast, semantically dense OO systems like Smalltalk and Ruby are good at delivering the end results you intended, but you have very little control over which algorithms are used to actually deliver them. This is the reason C is often touted as the epitome of speed: every command translates easily into the appropriate machine code. Interpreted languages inherently have a lot of overhead and indirection, but they make it easier to get from point A to point B.


posted on

I read Barack Obama’s speech and felt like I had to post it (originally on

Ten months ago, I stood on the steps of the Old State Capitol in Springfield, Ill., and began an unlikely journey to change America.

I did not run for the presidency to fulfill some long-held ambition or because I believed it was somehow owed to me. I chose to run in this election — at this moment — because of what Dr. King called “the fierce urgency of now.” Because we are at a defining moment in our history. Our nation is at war. Our planet is in peril. Our health care system is broken, our economy is out of balance, our education system fails too many of our children, and our retirement system is in tatters.

At this defining moment, we cannot wait any longer for universal health care. We cannot wait to fix our schools. We cannot wait for good jobs, and living wages, and pensions we can count on. We cannot wait to halt global warming, and we cannot wait to end this war in Iraq.

I chose to run because I believed that the size of these challenges had outgrown the capacity of our broken and divided politics to solve them; because I believed that Americans of every political stripe were hungry for a new kind of politics, a politics that focused not just on how to win but why we should, a politics that focused on those values and ideals that we held in common as Americans; a politics that favored common sense over ideology, straight talk over spin.

Most of all, I believed in the power of the American people to be the real agents of change in this country — because we are not as divided as our politics suggests; because we are a decent, generous people willing to work hard and sacrifice for future generations; and I was certain that if we could just mobilize our voices to challenge the special interests that dominate Washington and challenge ourselves to reach for something better, there was no problem we couldn’t solve — no destiny we couldn’t fulfill.

Ten months later, Iowa, you have vindicated that faith. You’ve come out in the blistering heat and the bitter cold not just to cheer, but to challenge — to ask the tough questions; to lift the hood and kick the tires; to serve as one place in America where someone who hasn’t spent their life in the Washington spotlight can get a fair hearing.

You’ve earned the role you play in our democracy because no one takes it more seriously. And I believe that’s true this year more than ever because, like me, you feel that same sense of urgency.

All across this state, you’ve shared with me your stories. And all too often they’ve been stories of struggle and hardship.

I’ve heard from seniors who were betrayed by CEOs who dumped their pensions while pocketing bonuses, and from those who still can’t afford their prescriptions because Congress refused to negotiate with the drug companies for the cheapest available price.

I’ve met Maytag workers who labored all their lives only to see their jobs shipped overseas; who now compete with their teenagers for $7-an-hour jobs at Wal-Mart.

I’ve spoken with teachers who are working at doughnut shops after school just to make ends meet, who are still digging into their own pockets to pay for school supplies.

Just two weeks ago, I heard a young woman in Cedar Rapids who told me she only gets three hours of sleep because she works the night shift after a full day of college and still can’t afford health care for a sister with cerebral palsy. She spoke not with self-pity but with determination, and wonders why the government isn’t doing more to help her afford the education that will allow her to live out her dreams.

I’ve spoken to veterans who talk with pride about what they’ve accomplished in Afghanistan and Iraq, but who nevertheless think of those they’ve left behind and question the wisdom of our mission in Iraq; the mothers weeping in my arms over the memories of their sons; the disabled or homeless vets who wonder why their service has been forgotten.

And I’ve spoken to Americans in every corner of the state, patriots all, who wonder why we have allowed our standing in the world to decline so badly, so quickly. They know this has not made us safer. They know that we must never negotiate out of fear, but that we must never fear to negotiate with our enemies as well as our friends. They are ashamed of Abu Ghraib and Guantanamo and warrantless wiretaps and ambiguity on torture. They love their country and want its cherished values and ideals restored.

It is precisely because you’ve experienced these frustrations, and seen the cost of inaction in your own lives, that you understand why we can’t afford to settle for the same old politics. You know that we can’t afford to allow the insurance lobbyists to kill health care reform one more time, and the oil lobbyists to keep us addicted to fossil fuels because no one stood up and took their power away when they had the chance.

You know that we can’t afford four more years of the same divisive food fight in Washington that’s about scoring political points instead of solving problems; that’s about tearing your opponents down instead of lifting this country up.

We can’t afford the same politics of fear that tells Democrats that the only way to look tough on national security is to talk, act and vote like George Bush Republicans; that invokes 9/11 as a way to scare up votes instead of a challenge that should unite all Americans to defeat our real enemies.

We can’t afford to be so worried about losing the next election that we lose the battles we owe to the next generation.

The real gamble in this election is playing the same Washington game with the same Washington players and expecting a different result. And that’s a risk we can’t take. Not this year. Not when the stakes are this high.

In this election, it is time to turn the page. In seven days, it is time to stand for change.

This has been our message since the beginning of this campaign. It was our message when we were down, and our message when we were up. And it must be catching on, because in these last few weeks, everyone is talking about change.

But you can’t at once argue that you’re the master of a broken system in Washington and offer yourself as the person to change it. You can’t fall in line behind the conventional thinking on issues as profound as war and offer yourself as the leader who is best prepared to chart a new and better course for America.

The truth is, you can have the right kind of experience and the wrong kind of experience. Mine is rooted in the real lives of real people and it will bring real results if we have the courage to change. I believe deeply in those words. But they are not mine. They were Bill Clinton’s in 1992, when Washington insiders questioned his readiness to lead.

My experience is rooted in the lives of the men and women on the South Side of Chicago who I fought for as an organizer when the local steel plant closed. It’s rooted in the lives of the people I stood up for as a civil rights lawyer when they were denied opportunity on the job or justice at the voting booth because of what they looked like or where they came from. It’s rooted in an understanding of how the world sees America that I gained from living, traveling and having family beyond our shores — an understanding that led me to oppose this war in Iraq from the start. It’s experience rooted in the real lives of real people, and it’s the kind of experience Washington needs right now.

There are others in this race who say that this kind of change sounds good, but that I’m not angry or confrontational enough to get it done.

Well, let me tell you something, Iowa. I don’t need any lectures on how to bring about change, because I haven’t just talked about it on the campaign trail. I’ve fought for change all my life.

I walked away from a job on Wall Street to bring job training to the jobless and after-school programs to kids on the streets of Chicago.

I turned down the big-money law firms to win justice for the powerless as a civil rights lawyer.

I took on the lobbyists in Illinois and brought Democrats and Republicans together to expand health care to 150,000 people and pass the first major campaign finance reform in 25 years; and I did the same thing in Washington when we passed the toughest lobbying reform since Watergate. I’m the only candidate in this race who hasn’t just talked about taking power away from lobbyists, I’ve actually done it. So if you want to know what kind of choices we’ll make as president, you should take a look at the choices we made when we had the chance to bring about change that wasn’t easy or convenient.

That’s the kind of change that’s more than just rhetoric — that’s change you can believe in.

It’s change that won’t just come from more anger at Washington or turning up the heat on Republicans. There’s no shortage of anger and bluster and bitter partisanship out there. We don’t need more heat. We need more light. I’ve learned in my life that you can stand firm in your principles while still reaching out to those who might not always agree with you. And although the Republican operatives in Washington might not be interested in hearing what we have to say, I think Republican and independent voters outside of Washington are. That’s the once-in-a-generation opportunity we have in this election.

For the first time in a long time, we have the chance to build a new majority of not just Democrats, but independents and Republicans who’ve lost faith in their Washington leaders but want to believe again — who desperately want something new.

We can change the electoral math that’s been all about division and make it about addition — about building a coalition for change and progress that stretches through blue states and red states. That’s how I won some of the reddest, most Republican counties in Illinois. That’s why the polls show that I do best against the Republicans running for president — because we’re attracting more support from independents and Republicans than any other candidate. That’s how we’ll win in November and that’s how we’ll change this country over the next four years.

In the end, the argument we are having between the candidates in the last seven days is not just about the meaning of change. It’s about the meaning of hope. Some of my opponents appear scornful of the word; they think it speaks of naiveté, passivity and wishful thinking.

But that’s not what hope is. Hope is not blind optimism. It’s not ignoring the enormity of the task before us or the roadblocks that stand in our path. Yes, the lobbyists will fight us. Yes, the Republican attack dogs will go after us in the general election. Yes, the problems of poverty and climate change and failing schools will resist easy repair. I know — I’ve been on the streets; I’ve been in the courts. I’ve watched legislation die because the powerful held sway and good intentions weren’t fortified by political will, and I’ve watched a nation get misled into war because no one had the judgment or the courage to ask the hard questions before we sent our troops to fight.

But I also know this. I know that hope has been the guiding force behind the most improbable changes this country has ever made. In the face of tyranny, it’s what led a band of colonists to rise up against an Empire. In the face of slavery, it’s what fueled the resistance of the slave and the abolitionist, and what allowed a president to chart a treacherous course to ensure that the nation would not continue half slave and half free. In the face of war and Depression, it’s what led the greatest of generations to free a continent and heal a nation. In the face of oppression, it’s what led young men and women to sit at lunch counters and brave fire hoses and march through the streets of Selma and Montgomery for freedom’s cause. That’s the power of hope — to imagine, and then work for, what had seemed impossible before.

That’s the change we seek. And that’s the change you can stand for in seven days.

We’ve already beaten odds that the cynics said couldn’t be beaten. When we started 10 months ago, they said we couldn’t run a different kind of campaign.

They said we couldn’t compete without taking money from Washington lobbyists. But you proved them wrong when we raised more small donations from more Americans than any other campaign in history.

They said we couldn’t be successful if we didn’t have the full support of the establishment in Washington. But you proved them wrong when we built a grass-roots movement that could forever change the face of American politics.

They said we wouldn’t have a chance in this campaign unless we resorted to the same old negative attacks. But we resisted, even when we were written off, and ran a positive campaign that pointed out real differences and rejected the politics of slash and burn.

And now, in seven days, you have a chance once again to prove the cynics wrong. In seven days, what was improbable has the chance to beat what Washington said was inevitable. And that’s why in these last weeks, Washington is fighting back with everything it has — with attack ads and insults; with distractions and dishonesty; with millions of dollars from outside groups and undisclosed donors to try and block our path.

We’ve seen this script many times before. But I know that this time can be different.

Because I know that when the American people believe in something, it happens.

If you believe, then we can tell the lobbyists that their days of setting the agenda in Washington are over.

If you believe, then we can stop making promises to America’s workers and start delivering — jobs that pay, health care that’s affordable, pensions you can count on, and a tax cut for working Americans instead of the companies who send their jobs overseas.

If you believe, we can offer a world-class education to every child, and pay our teachers more, and make college dreams a reality for every American.

If you believe, we can save this planet and end our dependence on foreign oil.

If you believe, we can end this war, close Guantanamo, restore our standing, renew our diplomacy and once again respect the Constitution of the United States of America.

That’s the future within our reach. That’s what hope is — that thing inside us that insists, despite all evidence to the contrary, that something better is waiting for us around the corner. But only if we’re willing to work for it and fight for it. To shed our fears and our doubts and our cynicism. To glory in the task before us of remaking this country block by block, precinct by precinct, county by county, state by state.

There is a moment in the life of every generation when, if we are to make our mark on history, this spirit must break through.

This is the moment.

This is our time.

And if you will stand with me in seven days — if you will stand for change so that our children have the same chance that somebody gave us; if you’ll stand to keep the American dream alive for those who still hunger for opportunity and thirst for justice; if you’re ready to stop settling for what the cynics tell you you must accept, and finally reach for what you know is possible, then we will win this caucus, we will win this election, we will change the course of history, and the real journey — to heal a nation and repair the world — will have truly begun.

Thank you.


oh no, not again

posted on

I won’t disagree with the notion that Western Imperialism has caused much evil (the plight of the Pakistani people in the wake of Benazir Bhutto’s assassination is yet another piece of evidence in that regard), but this monologue from Shakespeare still sends chills down my spine:

Once more unto the breach, dear friends, once more,
Or close the wall up with our English dead!
In peace there’s nothing so becomes a man
As modest stillness and humility,
But when the blast of war blows in our ears,
Then imitate the action of the tiger:
Stiffen the sinews, summon up the blood,
Disguise fair nature with hard-favored rage;
Then lend the eye a terrible aspect:
Let it pry through the portage of the head
Like the brass cannon; let the brow o’erwhelm it
As fearfully as doth a gallèd rock
O’erhang and jutty his confounded base,
Swilled with the wild and wasteful ocean.
Now set the teeth and stretch the nostril wide,
Hold hard the breath and bend up every spirit
To his full height! On, on, you noble English,
Whose blood is fet from fathers of war-proof,
Fathers that like so many Alexanders
Have in these parts from morn till even fought
And sheathed their swords for lack of argument.
Dishonor not your mothers; now attest
That those whom you called fathers did beget you!
Be copy now to men of grosser blood
And teach them how to war! And you, good yeomen,
Whose limbs were made in England, show us here
The mettle of your pasture. Let us swear
That you are worth your breeding; which I doubt not,
For there is none of you so mean and base
That hath not noble lustre in your eyes.
I see you stand like greyhounds in the slips,
Straining upon the start. The game’s afoot!
Follow your spirit; and upon this charge
Cry ‘God for Harry! England and Saint George!’

In the spirit of Winston Churchill: Western civilization may have fucked up the world, but it’s still the only culture in all of history to come up with the notion of democracy, where everyone is supposed to have a voice. There may be no place on earth that fully reaches this ideal, but the fact that the ideal even exists is already a miracle.

benazir bhutto

posted on

I feel extremely saddened thinking about Benazir Bhutto’s assassination, which casts a shadow over the end of the year. News of her death rocketed across the blogosphere at near light speed.

On one hand, it was the first time in my life that I heard about a significant event solely through the Internet. It illustrates the bizarre connectedness the Internet allows, as I scan articles from people I don’t know personally but whom I’ve followed for almost two years now, all in my feed reader.

On the other hand, it also demonstrates how far humanity has to go before reaching any modicum of civilization. In a world where murder—indeed, mass murder—is a politically-acceptable expedient, I can’t help but wonder where the hell people ever got the idea that we were any better than animals.

I am suddenly reminded of A Canticle for Leibowitz, a prophetic speculative-fiction novel by Walter M. Miller, Jr., which poignantly portrays humanity’s penchant for cyclical self-destruction. (I am also suddenly reminded of the fact that Pakistan is a nuclear power.)

I am also reminded of the parallel with Benigno Aquino, Jr.’s assassination which occurred more than twenty-four years ago. I was way too young at the time to understand the import of that event, but its repercussions continue to echo to the present day.

A part of me hopes that Bhutto’s death can set off a revolution akin to the People Power Revolution of 1986. But another part of me realizes that revolutions tend to be halted abruptly, long before they attain their idealistic goals. The current political situation in the Philippines is testament to that, as is the neocon attempt in the U.S. to turn back the clock to the pre-FDR era (and some would say, to the pre-Abraham Lincoln era.)

I am horribly aware of the fact that he who has the most guns makes the rules, that people generally prefer expedience to actual justice, and that no good deed ever goes unpunished. But a new year is dawning, and as I’ve been wont to say: Dum spiro, spero.

simplelog to mephisto

posted on

For some reason, I’ve never successfully utilized the converter infrastructure found in vendor/plugins/mephisto_converters/lib/converters, so I’ve generally had to cobble together my own kludge.


require 'converters/base'
require 'converters/simplelog'

@users = User.find(:all)
default_user = @users.first
site = Site.find(1)

Simplelog::Post.find(:all).each do |sl|
  puts "Importing post #{sl.id}: #{sl.title}"
  mp = Article.new
  mp.title = sl.title
  mp.excerpt = ''
  mp.body = sl.body
  mp.created_at = sl.created_at
  mp.published_at = sl.created_at
  mp.updated_at = sl.modified_at
  mp.filter = sl.text_filter + '_filter'
  mp.permalink = sl.permalink
  mp.user = default_user
  mp.updater = default_user
  mp.site = site
  mp.author_ip = ''
  mp.tag = sl.tag.collect(&:name) * ', '
  mp.approved = true
  mp.section_ids = [1]
  mp.save!
end

Simplelog::Comment.find(:all).each do |sl|
  mp = Comment.new
  mp.body = sl.body
  mp.filter = "markdown_filter"
  mp.created_at = sl.created_at
  mp.updated_at = sl.modified_at
  mp.published_at = sl.created_at
  mp.author = sl.name
  mp.author_url = sl.url
  mp.author_email = sl.email
  mp.author_ip = sl.ip
  mp.article_id = Article.find_by_title(Simplelog::Post.find_by_id(sl.post_id).title).id
  mp.approved = true
  mp.save!
end

This relies on the following classes, one file per model:

simplelog/author.rb

module Simplelog
  class Author < ActiveRecord::Base
    establish_connection configurations['simplelog']
    has_many :posts, :dependent => :destroy, :class_name => 'Simplelog::Post'
  end
end

simplelog/comment.rb

module Simplelog
  class Comment < ActiveRecord::Base
    establish_connection configurations['simplelog']
    belongs_to :post, :class_name => 'Simplelog::Post'
  end
end

simplelog/page.rb

module Simplelog
  class Page < ActiveRecord::Base
    establish_connection configurations['simplelog']
  end
end

simplelog/post.rb

module Simplelog
  class Post < ActiveRecord::Base
    establish_connection configurations['simplelog']
    has_and_belongs_to_many :tag,
      :class_name => 'Simplelog::Tag',
      :join_table => 'tags_posts'
    belongs_to :author, :class_name => 'Simplelog::Author'
    has_many :comments, :conditions => ['is_approved = ?', true], :dependent => :destroy, :class_name => 'Simplelog::Comment'
  end
end

simplelog/tag.rb

module Simplelog
  class Tag < ActiveRecord::Base
    establish_connection configurations['simplelog']
    has_and_belongs_to_many :post,
      :class_name => 'Simplelog::Post',
      :join_table => 'tags_posts'
  end
end
The Post definition and the Tag definition have a little voodoo for dealing with the transition from Rails 1.2 to Rails 2.0. In 1.2, acts_as_taggable names the join table backwards compared to 2.0, where it uses the normal semantics.

If this weren’t just a hack, I’d probably package it a little more nicely, but as it is, I’d appreciate any tips on how to actually get the converter infrastructure to work.

dreamhost, htaccess, and routes.rb

posted on

I have never been able to get my .htaccess file to properly redirect requests from different blog engines. For example, Simplelog tacks on either /archives/ or /past/ to its URLs, and Typo tacks on /articles/ to its posts. That’s one of the things I like about Mephisto: it doesn’t add what I feel are superfluous tokens to the URLs. (Although I am still trying to figure out how to get rid of /archives/ from the monthly posts.)

I assume that Dreamhost uses Apache exclusively to serve all content, including Rails apps (which is why you have to set things up to use FastCGI for any modicum of efficiency), but it seems to be ignoring my .htaccess file.

So I’m using Mephisto’s redirect syntax instead. I’m not sure where it really belongs, but just putting it in routes.rb seems to work.

The following snippet maps Simplelog’s tagged archives (/past/tag/) to Mephisto’s tagged archives (/tag/):


ActionController::Routing::Routes.draw do |map|
  Mephisto::Routing.redirect 'past/?' => '$1'
  Mephisto::Routing.connect_with map
end

(The redirect is the line I inserted.)


truth, truthiness, and authentic fiction

posted on

In the Western model of education, there is an operational distinction between physics and metaphysics. The former gets you grants from the Department of Defense, and opens doors to working at NASA or JPL. You get to work with nuclear reactors and supercolliders and fusion bombs and Bose-Einstein condensates. The latter is stereotyped as the demesne of hippies trapped in the 1960s and undergrads who have no idea what they want to do with their lives. Generally, the discipline is called philosophy and not metaphysics, but a rose is a rose. You know you’re pretty marginal when even the social science and humanities people look at you with that “What the hell do you do?” look in their eyes.

What is strange is that this was not always so. When the Roman Catholic Church held sway over the Western world, physics and metaphysics were the same thing. If you think about it, it makes a hell of a lot of sense. Even in this present day, physicists expend a huge amount of effort trying to figure out (1) where everything comes from and (2) where everything goes. In other words, a Theory of Everything™. The current incarnation of the most popular theory out there is called M-theory, where the M could easily stand for “meta.” The more popular nomenclature is String Theory, and it’s really just contemporary metaphysics dressed up with the trappings of mathematics, since none of it is in a testable state at this time.

But I’m not here to argue semantics, nor really discuss the curious divide between hard science and philosophy.

What started me on this tack is going to midnight mass on Christmas.

If you’ve been following this blog for any amount of time, you may recall me mentioning I’ve been in a terrible crisis of faith since 2001. I was born and raised Roman Catholic, was baptised, participated in the Eucharist, and was Confirmed. I attended a parochial elementary school and junior high, and went to an all-boys high school run by Jesuits. Even all through college and most of med school, I still went to mass every Sunday.

And then a bunch of lunatic-fringe Muslims hijacked a few planes and crashed them into the WTC and the Pentagon.

This is not where everything went to shit quite yet.

We all know that religious fundamentalists are scary people who need to be quarantined and maybe even euthanized. Right? Right? I mean, it’s not surprising that a bunch of whack jobs would do such a thing, right?

That’s where I part company with most of the Western world, I guess.

The sad thing is that what we really have to choose between are Islamic psychos and Christian fascists. Religious fundamentalists are going to destroy the world, and there is nothing we can do to stop them.

Despite my training in Western science, and despite the disappointments I’ve suffered from my faith, I still haven’t abandoned the idea that there might be a God after all. I seriously doubt that he/she is like the God described in the Old Testament, but I think that the possibility of the existence of an omnipresent, omnipotent, omniscient hyperintelligence is greater than zero, meaning that, given enough time in the universe, one or several are bound to occur.

I mean, I really doubt that the God that these sick fucks worship actually exists, but I still haven’t abandoned the idea that there might be some kind of Presence™ out there that is relatively benign, that may or may not take an interest in our little pale blue dot orbiting an unremarkable yellow sun in the backwaters of an unremarkable spiral galaxy sitting in the midst of an unremarkable galaxy cluster.

The fact that it’s a possibility that isn’t ruled out by the laws of physics nor the laws of thermodynamics means that atheism can’t be right, either. I think the only honest way to go without being overly dogmatic and ramming your beliefs down other people’s throats is to be agnostic. Wishy-washy maybe, but what if the Flying Spaghetti Monster really exists? What then?

Seriously, though, judging from the atheists who are vocal on the Internet, atheism just seems like yet another religion from which you can exclude and condemn others. Not really my taste, and if you can’t disprove the existence of an omnipresent, omnipotent, omniscient hyperintelligence, then how do you know what the truth is? You can’t. Simple as that.

Mind you, this is by far not an apologia for the raving lunatics who claim to have the keys to Salvation™. I think anyone who is dogmatic about anything but can’t prove their point with reproducible experiments should just shut the hell up and let people who have real talent get on with the business of discovering the inner workings of the universe. Anyone who thinks that they, and only they, know the truth is either selling something, or smoking something.

The reason I believe that omnipresent, omniscient, omnipotent hyperintelligences may exist is the simple fact that we know matter can self-organize, and that self-organized matter can become intelligent (perhaps I’m using the word too loosely, but you get the picture.) And it so happens that some self-organized intelligent entities (read: human beings) are interested in trying to create artificial intelligences that have some or all of these capabilities. If AI is truly possible (and it is still an “if,” since we have yet to produce a program that can really pass the Turing Test), then it should follow that hyper-AI is also possible. While the only intelligent form of matter we know happens to consist mostly of carbon and uses a highly distributed network of networks of nanoprocessors (in other words, neurons organized into nuclei, organized into functional partitions in the brain) for computing tasks, operating in a mostly aqueous environment, we are, after all, actively trying to replicate or at least emulate this functionality in silicon. And if it can be done in silicon, why not in uranium or lanthanum? Why not in a 10,000 kelvin gas cloud, with hydrogen nuclei encoding state and performing quantum calculations? Or in a network of quasars, in which gamma-ray bursts are analogous to the release of neurotransmitters in our brains?

So I think that it is still possible that something like a God may be out there, although I am rather certain that we have no idea what he/she/it is truly like.

In literature, such a creature is already well described. Charlie Stross describes the Eschaton, a highly distributed hyperintelligence inhabiting the galaxy, in his books Singularity Sky and Iron Sunrise. And while the Eschaton may have limits, it seems pretty close to being omnipresent, omnipotent, and omniscient. Other distributed intelligences haunt the science fiction scene: the Oracle and the Architect from The Matrix, Wintermute and Neuromancer from Neuromancer, and Skynet from the Terminator series. And while I have never read anything that Vernor Vinge wrote, I get the sense that he believes our descendants are destined to evolve into similar creatures, for whom the vast vacuum of space is not a limit.

Arthur C. Clarke’s laws always come back to me. The one I always remember is that “Any sufficiently advanced technology is indistinguishable from magic.” If this is the case, how do we know that miracles can’t be explained scientifically? I think it’s because, in a lot of ways, the average human mind is lazy. Instead of wanting to find the truth, the human mind just wants a pretty story it can use as an answer whenever its actions are challenged. Just think of the times that real people have actually, sincerely claimed that God made them do it. (And think of the mayhem and suffering most of these people have wreaked on the world at large. God’s name is definitely sullied by the never-ending line of cheaters, liars, bullies, and outright assholes who claim to have a direct hotline to the deity him/herself.)

But I stand by my belief that fundamentalists should be killed, incarcerated, brainwashed, or lobotomized. If we got rid of these fucks, we could probably end like 95% of the problems of humanity.

axial tilt

posted on

The words come bubbling up all of a sudden:

There are no happy endings, because nothing ever ends.

The light at the end of the tunnel
becomes the first gasp of air
lungs burning, heart straining
breaking through the water’s surface
at the end comes the start
and it is never finished
and the long count begins again

(If I could just win through without breaking
not stumble and fall as I cross the threshold
is it too much to ask not to have to crawl across the finish line?)
blood-stained, streaked with tears, covered in sweat
I will get there
ever unlooked for, unheralded, unwanted
my destiny is inexorable
and this cup will not pass

there is, and always will be
the sunlight
and the blue sky
and maybe my heart will sing just a little
and forget the decades of sorrow for a moment
there is starlight, moonlight
to guide my path
always seeking, always searching

But you must remind me, little one. When I… when I lose myself—when I lose her—you must remind me that I am still searching, still waiting… that I have never forgotten her, never turned from all she taught me. I sit in this place… I sit… but in my mind, in my poor mind, I am always away with her….

misunderstanding modern medicine

posted on

I have finally found a synonym for the embryonic philosophy that I’ve been calling “The Art of Not Wanting.” Akin to Hindu and Buddhist ideals (where desire brings about suffering), voluntary simplicity is a lifestyle that eschews the excesses of the modern and post-modern era. It has significant bearing on the contemporary environmentalist movement, as well as on its intersection with Neomarxism.

But the quote that struck me came from an Amazon reviewer of the book describing this kind of lifestyle, who listed “using a public hospital” among the things that would not be considered “viable stylishness.”

First of all, yeah, maybe this kind of lifestyle is impossible for the late-20-something/early-30-something white hipster living in the sophisticated metropolis to contemplate, but even people of color who grew up in upper-middle-class families are familiar with the concept of being thrifty, simply from hearing the stories of their forebears. While I personally don’t know the feeling of abject poverty, my immigrant parents certainly do, and the way they live their lives reflects this, notwithstanding their (now) dual six-figure incomes.

Secondly, probably 90% of the world lives relatively cheaply with at least some style; this kind of comment is, sadly, symptomatic of colonialist elitism.

Thirdly, unless you actually work in health care, you have no idea what you’re talking about with regards to municipal hospitals. While no doubt many of them are stinking cesspools temporarily housing ne’er-do-wells with no chance of survival, pretty much all of them are teaching hospitals, often attached to some rather well-known universities. While they may not always be able to afford the technological bells and whistles, they generally practice cutting-edge evidence-based medicine, something that many private-practice docs working at community hospitals scoff at despite the rigorous proof that it improves outcomes and cuts costs. In medicine, newness and shininess do not necessarily equal better health care, and it’s really disturbing that very few people outside of health care actually understand that.

switching back from simplelog to mephisto

posted on

In case you didn’t notice, I also switched my blog engine again. Now that Rails 2.0 is out, I thought I’d give Mephisto (from svn) another spin, and it seems to be working relatively well, much better than when I last tried it, although I still get the occasional 500 error.

While I really dug Simplelog’s clean interface and really nice looking, clean themes, it’s got a really tiny community, mostly of non-Rails hackers, and it doesn’t seem like there has been much ongoing development since 2.0.2 was released more than 10 months ago. Mephisto just seems more active right now.

Eventually, I’ll post the script I used to migrate my entries from Simplelog back to Mephisto.

documented higher risk of mortality

posted on

It’s official. I have hypertension, which is more simply known as high blood pressure.

I finally went to see my doctor yesterday, and found that my blood pressure was a rather alarming 150/93. Now, for the past year or so, I’ve been intermittently checking my blood pressure myself, and it certainly hasn’t been normal, running somewhere between 130 and 140 systolic. Which probably explained the throbbing in my head. Knowing that it was highly unlikely that I would adopt a healthy lifestyle anytime soon, I opted for treatment.

Call it the placebo effect, but after I started taking meds, I started feeling a lot better later in the day. I slept better than I’ve slept all week, and I actually woke up half an hour before I had to, feeling refreshed.

The other thing I discovered was that my alanine transaminase is twice the upper limit of normal. I’m pretty sure it’s not chronic viral hepatitis, nor is it some bizarre genetic disorder. More likely than not, it’s simply chronic burger toxicity (which is facetious medical slang for being a fat-ass.) I am still horrified by the fact that Morgan Spurlock managed to jack his transaminases up to the 500s—about 10x the upper limit of normal—by eating only McDonald’s for a month in “Super Size Me.” Even still, it has done little to fix my crappy eating habits.

There is a part of me that feels as fatalistic as ever. Right now, I don’t have any particular reason to want to prolong my life significantly. I figure I should at least make it to fifty, give or take a few years, before suffering a heart attack or stroke. Twenty years still seems like a pretty long time.

And I guess I still haven’t been cured of my depression, really. There is a part of me that still feels that there’s no way things are going to get better, and that it’s all downhill from here anyway. And if the next twenty years are actually going to be worse than the last thirty years, I’m not sure I want to keep going.

Still, there’s something to be said for feeling healthier (even if it might be entirely illusory.) The fact that my headache went away after taking my anti-hypertensive meds felt pretty good, as did the fact that I felt pretty well rested.

I’m not looking for guarantees. Realistically, I know that the future is a great big unknown. Both good and bad things are destined to happen, and while I can envision all the bad things—all the things that are inevitable, like sickness and death, failure and defeat—I have no idea how to predict the good things. When I look back at my life, to be truthful, serendipity has been remarkably good to me, and I doubt I could’ve made it this far without such good fortune.

I realize that the Laws of Thermodynamics are misleading. While I’ve tended to see these laws as inevitable, in reality they’re just statistical truths. While on average all things tend to fall apart, and disorder eventually wins, the universe is rife with examples where this isn’t true, particularly in the short term. So maybe we can’t escape the fact that the universe will become a cold, lifeless vacuum in fifteen billion years or so, but that probably doesn’t have much bearing on what tomorrow has to bring.

scattered thoughts on code complexity and natural language

posted on

Steve Yegge’s rants about programming are always really interesting. I’m all about the big picture, and I like how he can properly abstract his arguments so that it makes sense to a non-specialist. Very few technically competent people (whatever the field) are actually able to do this, and if they could, it would certainly make cross-discipline interaction a lot easier.

His latest diatribe is essentially about the unmaintainability of massive amounts of code that has low semantic expressivity.

Several commenters seem to miss the point completely and start arguing about Lines of Code™ and how all projects are doomed to massiveness, and clearly, these people have never worked with languages such as Perl, Ruby, or Lisp.

But as other commenters have pointed out, what Yegge is actually talking about is the concept-to-code ratio. To put it another way, how many keywords does it take to spell out a particular concept?

This is intimately related to a notion in sociolinguistics commonly referred to as context, a term which has a specifically narrow connotation in this field. From the rudimentary linguistics class I took as an undergrad, the most vivid example I remember is the contrast between Mandarin Chinese and English. Mandarin is generally classified as a high-context language (or, perhaps, a high-context-dependence one) whereas English is generally classified as a low-context language (low-context-dependence). What this means is that if I were to say something in Mandarin, in theory, I could say it in far fewer words than its English counterpart, mostly because you would have all the necessary cultural context to understand what it is I’m trying to say.

But this isn’t the only axis by which semantic expressivity can be judged. While Mandarin can be semantically compact, it is probably on the more difficult end of the spectrum of languages to learn and highly dependent on familiarity with Chinese culture. In contrast, English—which is more semantically expansive—makes for a wonderful lingua franca.

How does this apply to programming languages?

Java is a relatively low-context-dependent language. You have to spell out a lot of things. You have to specify type, class, etc., etc. Sure, it’s far fewer things than you would have to spell out in C, or God help you, x86 assembly, but when you compare it to languages like Perl and Ruby, it’s a lot of stuff. This surely contributes to the code bloat issue.

Java’s low-context-dependence is probably what makes it such a widely used language, though.

Ruby, however, seems to be a high-context-dependent language. I say this mostly because of my exposure to Rails. Code targeted to Rails is compact and semantically dense, but you may have no idea how things are being implemented. The details have been abstracted away and hidden, and it is well known that you can write a fully functional Rails app without really knowing much Ruby at all.
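A toy sketch of what that high-context style buys you: the `has_flag` class macro below is invented for illustration (it is not a real Rails macro), but it works the way `has_many` does in spirit, in that one declarative line quietly generates several methods whose implementation stays hidden.

```ruby
# A made-up declarative macro in the spirit of Rails' has_many: one
# high-context line at the call site expands into several methods,
# whose implementation stays hidden from the class that uses it.
module Declarative
  def has_flag(name)
    define_method("#{name}?")       { instance_variable_get("@#{name}") ? true : false }
    define_method("#{name}!")       { instance_variable_set("@#{name}", true) }
    define_method("clear_#{name}!") { instance_variable_set("@#{name}", false) }
  end
end

class BlogPost
  extend Declarative
  has_flag :published   # one line, three generated methods
end

post = BlogPost.new
post.published!
puts post.published?   # => true
```

The caller never sees `define_method` or the instance variables; like Rails, the macro trades transparency for density.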

But the difference between Ruby and Mandarin Chinese is that I feel that Ruby is far more transparent and easy to understand for the non-native.

A little personal background: I got my first computer—a Commodore 64—when I was 8 years old. I learned how to program in BASIC, 6502 Assembly, and Logo (which, if you strip away the turtle graphics, is apparently highly reminiscent of Lisp.) I got my first x86 machine when I was 13, and learned how to program in Pascal. (When I was in high school, the Advanced Placement Computer Science course was based on Pascal. I never actually took the course but managed to get a 4 out of 5 on the AP test.) In college I dabbled a little in C and C++, and in grad school, I learned Perl.

In all that time, I never really wrote a complete app, unless you count the extraordinarily rudimentary patient database system I wrote in Turbo Pascal 5.0 for my dad, or the hacked-together Visual Basic program I set up to help me keep track of medical billing when I used to work for this solo family practitioner. The most complicated thing I ever wrote in Perl is a CGI script that transliterates Roman characters to Alibata.

As you can see, I’m not a professional coder.

But I am an enthusiastic hobbyist linguist. And I’m more than a little intrigued by artificial intelligence. One of the things that I used to try to implement (in BASIC, of all languages!) was a programming language based on natural language. At the time, speech-to-text seemed to be an impossible, highly futuristic idea, but I figured that if they ever figured out how to turn speech into text, they’d still need an engine to parse it into actual commands. (These were the ideas I would get whenever I heard Jean-Luc Picard ask the ship’s computer to do something.)

In retrospect, this was probably a little too far over the head of the average 10-year-old, and at the time, I definitely couldn’t figure out how to get from here to there.

What I discovered, instead, were text adventure games.

The prototype for this genre is Colossal Cave Adventure, or more commonly, just Adventure. In modern parlance, they are sometimes known as interactive fiction. The variant/descendant that I was first exposed to was Zork. This was my first encounter with Infocom and their pioneering text parsing engine.

For the longest time, I had sought to implement my own text parsing engine, but to no avail. Interestingly, of all the languages that I had access to at the time, it seemed most intuitive to implement a text parsing engine in Logo.
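For what it’s worth, the heart of such a two-word, verb-noun parser is nearly trivial in a modern high-context language like Ruby. This sketch (with an invented three-verb vocabulary) strips the noise words and dispatches on the verb, which is roughly what the simplest of those old engines did:

```ruby
# A minimal two-word parser in the style of early text adventures:
# drop the noise words, look up the verb, and hand it the object.
NOISE = %w[the a an at to]

VERBS = {
  'take' => ->(obj) { "You pick up the #{obj}." },
  'open' => ->(obj) { "The #{obj} creaks open." },
  'look' => ->(obj) { "You see nothing special about the #{obj}." }
}

def parse(input)
  words = input.downcase.split - NOISE
  verb, object = words.first, words[1..-1].join(' ')
  handler = VERBS[verb]
  return "I don't know the word \"#{verb}\"." unless handler
  handler.call(object)
end

puts parse('take the brass lantern')  # => You pick up the brass lantern.
```

Infocom’s real parsers went far beyond this (adjectives, prepositions, ambiguity resolution), but the lookup-and-dispatch core is the same shape.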

But back to the modern-era of computer programming.

One can look at the divide between C/C++/Java/C# and Perl/Python/Ruby/Javascript as merely a generational gap. The old-school programmers tend to use the former. The newbies use the latter.

But I think there is another distinction to be made: the former group of languages tend to be compiled down into machine code or byte code fairly straightforwardly. The higher-level concepts, procedures, functions, objects, etc. are all representations of actual low-level entities. What the former group of languages allows one to do is control the machine with almost as much precision as one could with writing in straight-up machine language. The higher level language exists mostly so that a coder can actually read what’s going on, and so that they don’t have to look at the minutiae involved with saving registers and stacks and what-not every time they call a procedure.

The syntax and the implementation are tightly coupled—simple high-level concepts get translated to simple machine code and complex high-level concepts get translated to complex machine code—and sometimes this means that making the code easy to read and easy to maintain is diametrically opposed to keeping performance bearable and memory usage sane.

The latter group of languages all started out as scripting languages, and it used to be that the main purpose of a scripting language was to automate various higher-level tasks. I would hazard a guess that Larry Wall didn’t really intend for people to build huge, monstrous piles of code to completely power multimillion-dollar enterprises almost entirely in Perl, but what’s done is done, and here we are.

Part of the rationale for something like Perl is to make the invocation of certain tasks easier, in the sense that you can use somewhat more natural language to make a computer do something. Hence Perl’s mantra of TMTOWTDI—there’s more than one way to do it. The syntax of a command may have no bearing whatsoever on how the actual task is implemented. A single line of Perl can get expanded into a humongous pile of complicated machine code, but the average coder need not know anything about the underlying complexity. They can just get stuff done. In the era of 640K of RAM and 80 MB hard drives, this was not all that tenable: coders needed to count every single byte and every single processor cycle.
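Ruby inherits this trait wholesale. The single chained line below (the log file is fabricated for the example) quietly invokes buffered file IO, a regular-expression engine, and hash-table bookkeeping, none of which the coder has to think about:

```ruby
# Fabricate a small log file so the one-liner has something to chew on.
File.write('sample.log', "ERROR disk\nINFO ok\nERROR disk\nERROR net\n")

# One dense line: read the log, keep the ERROR lines, count them by subsystem.
counts = File.readlines('sample.log').grep(/^ERROR/).map { |l| l.split.last }.tally

puts counts.inspect  # => {"disk"=>2, "net"=>1}
```

Each method in the chain hides a pile of machinery that a C programmer of the 640K era would have had to write, and budget for, by hand.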

But now we have machines that have gigabytes of RAM and terabytes of hard drive space, so this type of obsessive-compulsive fastidiousness is not as necessary. We can let the computer do even more of the grunt-work.

So the most intriguing part of modern languages like Ruby is the idea of self-reflection. This really gets the AI otaku in me excited. One way to look at it less sensationally, though, is to think of self-reflection as a way to make invoking certain tasks more like asking for things in natural language.
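A minimal sketch of that self-reflection in core Ruby (the `Recorder` class is invented for this example): `method_missing` and `respond_to?` are exactly the hooks that this kind of “mind reading” is built on.

```ruby
# Self-reflection in Ruby: an object can be asked what it responds to,
# and method_missing lets it answer requests that were never defined.
class Recorder
  def initialize
    @calls = []
  end

  def method_missing(name, *args)
    @calls << name           # remember what was asked of us
    "handled #{name} dynamically"
  end

  def respond_to_missing?(name, include_private = false)
    true                     # claim to understand anything
  end

  attr_reader :calls
end

r = Recorder.new
puts r.frobnicate              # => handled frobnicate dynamically
puts r.respond_to?(:anything)  # => true
```

Instead of every method being spelled out in advance, the object interprets requests as they arrive, which is much closer to how a listener handles natural language.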

Rails is perhaps the prototype in this regard. While MVC paradigms have existed for a while, the one thing that makes Rails stand out is the idea of convention over configuration. Configuration is the old-school way to do things: control every part of the system and tell the computer exactly how to do it, just shy of coding in pure machine language. Convention takes advantage of sociolinguistic context.
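A deliberately tiny sketch of the idea (`ToyRecord` is made up, not ActiveRecord’s actual implementation): convention derives the table name from the class name, and explicit configuration only appears for the exceptions.

```ruby
# Convention over configuration, in miniature: instead of making every
# model declare its table name, derive it from the class name, and only
# fall back to explicit configuration when the convention won't do.
class ToyRecord
  def self.table_name
    @table_name || "#{name.downcase}s"   # convention: Article -> "articles"
  end

  def self.set_table_name(explicit)      # configuration, for the exceptions
    @table_name = explicit
  end
end

class Article < ToyRecord; end

class Person < ToyRecord
  set_table_name 'people'                # irregular plural needs configuring
end

puts Article.table_name  # => articles
puts Person.table_name   # => people
```

The shared cultural context (the naming convention) does the work that a configuration file would otherwise spell out, line by line.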

And with the pairing of convention over configuration (sociolinguistic context) with self-reflection, you’ve just come that much closer to implementing AI.

six degrees from robin hood to j.r.r. tolkien

posted on

Wikipedia has basically become the path of least resistance these days, and if I want to find information on anything, it tends to become my first stop. Which is sometimes unfortunate, because sometimes the primary sources aren’t exactly transparent. There are very few well-documented Wikipedia articles, and the ones that are well-documented have way too many references, leaving me with no idea how to stratify the authoritativeness of each reference. I can understand the reluctance to perform this stratification: it’s a lot of work, and the tendency is to leave the burden—perhaps quite rightly—on the reader, but failing to do this makes Wikipedia far less useful than it could be.

But in any case, I stumbled upon this blog post from Corielle about the “Robin Hood” series produced by the BBC. Naturally, this led me to look up Robin Hood, who I did not realize may have actually been based on the lives of real outlaws: Hereward the Wake, an English Dane who rebelled against William the Conqueror and his Norman French rule; Fulk FitzWarin, who had an enmity with Prince John ever since they were children, and who ended up getting his lands stripped from him; and Eustace the Monk, a pirate who was employed by both England and France against each other (and who actually reminds me more of the Dread Pirate Roberts, although apparently that is another character entirely, from an entirely different century.)

Of note, Sherwood Forest and Nottinghamshire were originally part of the Anglo-Saxon kingdom of Mercia, in which J.R.R. Tolkien had a great interest. In particular, Rohan (also known as the Mark) was based on the kingdom of Mercia. (Indeed, “the Mark” is thought to be how the Mercians may have referred to their kingdom.)

And of no account, I am suddenly reminded of a scene from “The Last Unicorn” where Schmendrick the Magician gets captured by a bunch of outlaws. In a bid to impress them, he accidentally summons Robin Hood, Maid Marian, and the Merry Men.

What is this? This is not happening! Robin Hood is a myth! We are the reality! Magic is magic, but the truth is us! Right?
—Captain Cully from “The Last Unicorn”

And then I think of Aragorn, Legolas, and Gimli meeting the Riders of the Mark.

These are indeed strange days. Dreams and legends spring to life out of the grass.
—Éomer Éomundson from The Lord of the Rings

Not we but those who come after will make the legends of our time.
—Aragorn son of Arathorn from The Lord of the Rings

where is everything?

posted on

Coding—even in Ruby—is not exactly plug-and-play, but it's a whole hell of a lot easier than it used to be, I guess.



posted on

when degrees of freedom
just one
a single loss
enough to imprison


words come falteringly

the music of
laughter and quiet conversation
fades out like
the last five seconds of a movie

like a glowing hot ember
slowly fading to grey, cold ash

like a dreamer waking to drear dawn
happiness becomes indifference
joy becomes numbness

in time, the cold sets in
the light is just a reflection of a reflection
blurred and broken
and still

I am waiting for the light
(and the dawn shall set me free)
I am praying for the sun’s rebirth
the darkness before dawn fast approaches
despair quells my song

in time, the cold is all I know
the echoing silence where my thoughts
and the futile susurrations
of a man who has gone too far
didn’t mean for it to get this way
don’t know my way back to the start

(In the shadows of cruel night
I know the sun will come again
but each minute stretches to an hour
the more I count, the longer it goes.)

(In the shadows of endless night
tears come unbidden
unshed from years long wasted
dust and ashes
of regret—
and still I ask:
How can one regret what never came to pass?)

What do I take
What do I leave
without surrendering a piece of my soul?


the longest road

posted on

Just when you thought it couldn’t get lonelier. Just when you thought it couldn’t possibly get any more difficult than it already is. There will be no resting on any laurels. The road ahead climbs up steeply, into the forbidding vault of the heavens.

On one of these drives from L.A. to S.D., I gazed at the northern sky and watched the planes queue up to land at LAX. It looked like rush hour at 30,000 feet. I could almost imagine a freeway stretching from the tarmac up to the stratosphere, creating an enormous arcing bridge to the rest of the country. The vision was actually a little breath-taking.

The road that I find myself faced with makes me think of this impossible thoroughfare, some sort of concrete, ultramodern version of the rainbow leading to the pot of gold.

There are no quick fixes. Either I face the obstacles in my life, or I spend all my time avoiding them. They’re not going to go away on their own.

Like every journey, it’s going to have to be one step at a time. I do worry that as I reach those heights, I’ll start running out of oxygen. This probably shouldn’t be a journey that I can take lightly.

It’s onward and upward or bust. Ain’t no turning back now.

still chasing starlight/the relationship of music and spacetime

posted on

I think it might’ve been Sirius, the dog star, in the southern sky that lit my way tonight, like a beacon, brighter than the ambient glow of the urban sprawl before me, but I only have a faint grasp of celestialography, so I could be wrong.

Ten days until the sun halts its retreat and finally stands its ground. Twenty days until the year’s end, leaving me wondering about the future, and whether it’s even worth wondering at all.

The problem with driving down to San Diego with only my iPod as my companion is that I can get lost in the random music that it plays, dragging me through my memories, many of them dark and bitter. The following is not necessarily exact, but it serves as a rough guide.

  1. Vienna Teng “Harbor”
    hauntingly echoing my deepest desire, although perhaps something that will never come to pass in this lifetime.
  2. Semisonic “Singing in My Sleep”
    on the connector ramp from the Glendale Fwy southbound to the Golden State Fwy southbound, bringing back faint memories of nine years ago after leaving the Bay Area in defeat, and resigning myself to at least a year in limbo in L.A.
  3. Hooverphonic “Cinderella”
    past the junction of the Golden State Fwy with the Pasadena Fwy, on the way to the East L.A. Interchange. The rhythm of the song at first makes me think of “Bette Davis Eyes” by Kim Carnes. Maybe this could be inspiration for a mashup.
  4. Amiina “Hilli”
    speeding through Irvine, past the El Toro Y, making me think of something that might have been composed by Nobuo Uematsu as the theme of some imaginary town in some as-yet-undrafted installment of Final Fantasy
  5. Aaliyah “Journey to the Past”
    as I wound my way through Laguna Niguel, remembering faint memories of ten Decembers past, and my heart didn’t so much break as it did just dry out. And still I dream of home.
  6. Hooverphonic “Battersea”
    through San Clemente. The lyrics are faint, leaving haunting traces in my mind.
  7. Nelly Furtado “All Good Things (Come to an End)”
    through Camp Pendleton. This song has captured my mind ever since I heard it for the first time this summer, and the answer is quite simple, and quite bitter.
  8. Frou Frou “Hear Me Out”
    probably either Oceanside or Carlsbad by this time.
  9. Feist “Secret Heart”
    probably Encinitas or Solana Beach. Reminding me of how so many words have died stillborn in my heart, freeze dried by despair, evaporated by helplessness.
  10. Sunny Day Real Estate “Song About an Angel”
    going past the merge, heading south on the 805
  11. S Club 7 “Never Had A Dream Come True”
    southbound on the 805 past La Jolla, through Clairemont Mesa, to the connector ramp to the southbound 163. This song always kills me, dragging me through the last ten years, and sticking a dagger right in my half-rotting, half-desiccated heart.
  12. Anggun “On the Breath of an Angel”
    exiting the 163 to Friars Road, remembering that even with the mess I could’ve turned everything into, she still saves me with her friendship.

It was pretty much ten years ago when I realized that my life would definitely not have a “happily ever after” ending. It’s not that I would necessarily live a tragic life, though. I mean, everyone has their regrets and failures that haunt them for the rest of their lives, right? At least that’s what I tell myself whenever I start feeling sorry for myself.

The more that time passes, the more it becomes apparent that the way things went down was inevitable. The moment came, I was tested, and I was found sorely wanting. I wasn’t meant to be the one, and that’s the way the cookie crumbles.

And yet, somehow, everything that has happened since seems to be an echo, a reverberation from that time long gone, and even this far out, I can’t seem to completely break free of my self-destructive patterns. It’s as if from that moment on, I was doomed. I was damned.

For a while, I’ve held out hope that things would change for me, that I would grow, that I would eventually have my chance for happiness someday. Even though I’ve wanted to give up, I’ve kept going, still keeping this ember of hope burning, still somehow hoping for some miracle.

I thought, “Oh God, my chance has come at last!”
but then a strange fear gripped me and I just couldn’t ask
—The Smiths “There is a Light That Never Goes Out”

I wonder how many years must go by before I must accept that my hope has run out. How many years must go by before I can just throw in the towel and call it quits. Some things were never meant to happen.

Some are like water, some are like the heat
Some are a melody and some are the beat
—Alphaville “Forever Young”

I think, sometimes, of the curse of The Flying Dutchman, doomed to wander the seas until the end of time, never able to reach the shore. Or of Coleridge’s doomed Ancient Mariner, or perhaps the Wandering Jew. Bill Murray in “Groundhog Day.”

But I’m still hedging my bets. I also think of Schmendrick the Magician, cursed to never age until he learns the secrets of magic, and reaches his full potential. Maybe, still, maybe, I’ll meet a unicorn, and maybe even someone like Molly Grue, and while the story won’t necessarily end happily ever after, maybe I can at least find my way home again, and at least have some sort of peaceful end.

versioning metaphor

posted on

Still reading stuff about multicellular computing.

It’s not necessarily applicable to software above the level of a single device, but Burbeck gets it wrong with regard to versioning issues in multicellular organisms.

For one thing, there is the key versioning difference between somatic cells and gametes. Somatic cells, the type of cells that become neurons, myocytes, hepatocytes, blood cells, etc., are diploid and always have two copies of a particular code sequence of DNA: two copies of every gene, which are going to be two versions, one from each parent. Gametes are haploid, containing only one copy of every gene. The creation of gametes involves choosing which version of a gene to include. This has been thought to be a completely random process, although as we learn more, it may not be completely random.
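The diploid-to-haploid step above can be sketched as a toy model (the gene names and version labels here are invented for illustration, and the real biology of recombination is far richer than a coin flip per gene):

```ruby
# Toy model of gamete versioning: a diploid cell carries two versions of
# each gene (one maternal, one paternal); producing a gamete selects one
# version of each, modeled here as a purely random choice.
diploid_genome = {
  "gene_a" => ["maternal_a", "paternal_a"],
  "gene_b" => ["maternal_b", "paternal_b"],
}

gamete = diploid_genome.transform_values { |versions| versions.sample }

p gamete  # one randomly chosen version per gene
```

As the post notes, the real selection may not be completely random, which in this sketch would amount to replacing `sample` with a biased choice.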

More dramatic is the versioning that occurs in cells of the immune system: each cell gets random changes to its DNA—bits/base pairs are spliced out and discarded permanently, allowing each cell to recognize different parts of different foreign organisms—different epitopes. Versioning becomes important because the body has to deactivate versions that react to itself. Versioning also occurs when a B lymphocyte (a specific type of cell in the immune system) class-switches: it dumps entire segments of DNA in order to form different classes of immunoglobulin—different classes of antibody, corresponding to different phases of infection. (IgM and IgD are produced acutely, IgG is produced generally after six weeks or so.)

Another type of versioning occurs with the specialization of stem cells. A pluripotent cell becomes a multipotent cell, which then becomes a stem cell. The stem cell divides so that one daughter cell remains a stem cell, while the other daughter cell goes on to differentiate.

The reason why you can’t use differentiated cells to form other types of cells like you can with stem cells is that there are differences in the resultant DNA. While it doesn’t necessarily mean that actual bits/base pairs are discarded, the histone scaffolding only allows certain genes to be expressed, and we don’t (yet) have a good handle on how to manipulate the scaffolding (although as recent research has shown, we can reverse differentiation to a certain extent.) The problem with non-selective manipulation of the scaffolding is that this is precisely the mechanism that causes cancer: cells that mutate and become malignant tend to de-differentiate.

Yet another example of versioning is what occurs on the X chromosome in women: in each and every somatic cell, one of the two X chromosomes gets randomly deactivated so that there is a proper balance in gene expression. (This process is called lyonization.) Because the two versions are essentially guaranteed to be different (one is from mother and the other from father), it can sometimes result in mosaicism of traits. More interesting is the fact that, in some instances, the cells are able to uniformly deactivate the version that has a deleterious mutant gene.

Then there are the sub-organismal transactions of data. Viruses and transposons continually infect cells and their DNA sits in the nucleus, often quiescent until some stress stimulus forces it to manifest itself, which then provokes an immune response and eventually leads to apoptosis. While the viruses we are most familiar with in daily life (rhinoviruses, adenoviruses, rotavirus, etc.) infect cells that have limited life cycles, making such infections self-limited, there are other viruses that essentially take up permanent residence in our DNA: varicella, herpes simplex, cytomegalovirus. Also hepatitis B and C. And HIV. These predominantly cause diseases of reactivation. (For example, the primary infection in HIV really just causes flu-like symptoms which eventually subside. The disease doesn’t really progress until much later.) While one can argue that these are foreign, invading DNA/code, it is clear that the packaging, transporting, and integration of pieces of DNA has long been a part of life, and may have a direct hand in evolution.

Again, the similarities between cell biology and computer software became more apparent with the popularization of Open Source. The prime example of DNA versioning is with the Linux kernel. The same source code—the same DNA—is used to generate widely different kinds of kernels. Some are run on desktop computers, others on servers, but even more drastic are the different versions that run mobile phones, PDAs, routers, DVRs, and various other embedded devices.

We are beginning to enter the age where code can become self-sustaining. Mac OS X’s and Windows’ software update programs are just simple examples of what is likely to become standard fare. Google and Amazon already give suggestions about what other content we might be interested in. We can make DVRs selectively record shows we might be interested in. iTunes already has the rudiments of being able to select songs that you like in your collection, simply by keeping track of the play count. For very specific things, the computer can anticipate your needs.
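The play-count idea is simple enough to sketch in a few lines (the data here is made up, and this is not the actual iTunes algorithm, just the obvious naive version of it): the most-played tracks are assumed to be the ones the listener wants to hear again.

```ruby
# Naive play-count-based selection: sort tracks by play count,
# descending, and take the top few as "anticipated" favorites.
play_counts = {
  "Harbor"              => 42,
  "Singing in My Sleep" => 17,
  "Cinderella"          => 30,
}

favorites = play_counts.sort_by { |_track, count| -count }
                       .first(2)
                       .map { |track, _count| track }

p favorites  # => ["Harbor", "Cinderella"]
```

Anything smarter (skip counts, recency, collaborative filtering à la Amazon) is a refinement of this same ranking step.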

Imagine if you extrapolated this principle to software—RNA—itself. Obviously there would be enormous security risks. But what if your system anticipated and automatically downloaded software you might be interested in? What if your Linux distro could anticipate which of the cutting edge patches you might be more interested in and it could offer to immediately download and compile it in? What if embedded devices, when connected to the Internet, could detect when a patch relevant to that particular device was released, and it could essentially update itself?

central dogma

posted on

Of course, I suppose I really should’ve searched Google before trying to coin a phrase. Other people have already used the analogy of the mechanisms of life to the mechanisms of computer programming and information technology.

While the notion of objects (in a programming sense) being self-contained entities consisting of both executable code and inert data is accurately descriptive of cellular mechanisms, the idea of software above the level of a single device being analogous to multicellular organisms hasn’t quite been addressed.

For this, we need to discuss the dominant paradigm in cellular biology, ostentatiously called the central dogma: DNA→RNA→protein. This fits well with the usual flow of code: source→raw object code/byte code→machine language. This also matches the trickle-down concept of the current World Wide Web: you download stuff from the Web onto your computer, and you then transfer digital music or videos to your device.

But in biology, the discovery of retroviruses proved that the dogma wasn’t quite that strict. In this case, RNA→DNA. In fact, it is becoming more accepted that life may have started out with RNA rather than DNA. And proteins aren’t left out of the flow of information: certainly ribosomes and histones affect the expression of DNA, not to mention the flow of nuclear receptors, as well as the transcription and replication mechanisms that copy DNA→RNA and DNA→DNA.

While it is unknown whether or not life started as RNA, code definitely started on single-cell computers. I’m not sure where to put the old mainframe servers into the paradigm, but most modern servers are essentially single-cell computers or networks of single-cell computers. In the current incarnation, interpreted languages such as Perl, PHP, Python, and Ruby, not to mention JavaScript, are the “duct tape that holds the Web together.” Java and C# also fit into this schema, although there is an extra level of abstraction in the form of a virtual machine. These languages all correspond to RNA: partly structural, partly functional; executable code that isn’t raw machine language. Meanwhile, compiled languages like C/ObjC/C++ correspond to DNA: pure source code that needs to be transcribed to object code, which then needs to be translated to machine language.
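Ruby itself can act out the transcription/translation pipeline, using MRI's `RubyVM::InstructionSequence` API (so this sketch is MRI-specific): source text plays the role of DNA, the compiled bytecode plays RNA, and execution yields the working result, the protein.

```ruby
# DNA -> RNA -> protein, in miniature:
# source text is compiled to VM bytecode, which is then executed.
source = "def square(x); x * x; end; square(7)"

iseq = RubyVM::InstructionSequence.compile(source)  # DNA -> RNA
puts iseq.eval                                      # RNA -> protein, => 49
```

The retrovirus direction (RNA→DNA) has its analogue too: `iseq.disasm` lets you read the bytecode back out as text, and decompilers go further still.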

But the idea of multicellular computing applying to software above the level of a single device is less concrete than this. DNA is the content that sits on the servers. RNA is the software that manages the content: browsers, media players, sync software, iTunes, RSS aggregators, but it also applies to the OS. Protein is when the content is actually used/activated/consumed: when an MP3 file is listened to, when an AVI is watched, when a Flash or Java application is deployed on a mobile device.

The Open Source paradigm makes it readily apparent that DNA, that is, content—source code—can be turned into RNA. You no longer have to buy RNA directly from the software developer. You can build it yourself.

While security practices are meant to prevent the inadvertent running of untrusted code, it’s going to happen anyway. In fact, it’s meant to happen. You can cut-and-paste scripts from the Web and deploy them willy-nilly. Scripts will mutate, reproduce, metastasize. The evolution of the Web is dependent on the flow of information to and fro.

always struggling with inertia

posted on

Am I growing set in my ways? Or is it just that I really hate this time of year, and the night feels like a smothering weight crushing me into the ground?

I’m actually counting the days until the winter solstice, when the daylight finally stops dying. It has never really been the cold that gets me (although don’t get me wrong, I’m no fan of the cold), it has always been the darkness and the gloom.

Still, the falling rain is comforting in a paradoxical way. Maybe it’s the familiarity. My memories of the patter of raindrops stretch back into my childhood, some few, scattered, random moments that remind me of home, of belonging. They remind me of times when I actually felt loved, and protected.

The smell of the earth after a few days of rain always makes me at ease. It’s like a fresh start, a beginning. The beginning of a new story, without regret, without sorrow.

So I wait patiently for the sun to come back. I know that even the longest winter eventually gives way to spring, and then to the glory of summer, but I guess the little kid in me is still afraid that the sun will never come back, leaving me to the mercy of the monsters lurking in the dark.

multicellular computing

posted on

A phrase that seems to be cropping up more and more to describe Web 2.0 and the evolution towards Web 3.0 is “software above the level of a single device.”

Not exactly a mellifluous phrase that rolls off the tongue, but the concept is key. It’s also locked into the concept of ubiquitous computing, although they aren’t the same thing.

The way I understand it, software above the level of a single device has the following characteristics:

  1. Platform agnostic
  2. Integrates computing power of different scales and different purposes

The classic example widely used is iTunes. It seamlessly integrates (1) the user’s computer (which runs either Mac OS X or Windows—not quite platform agnostic, but still useful as an example), (2) the user’s iPod, and (3) Apple’s iTunes Music Store servers.

But I also think that Web 2.0 encompasses things like RSS aggregators, webmail clients, and SMTP-to-SMS gateways. You could probably throw IM into the mix, too. Online RSS aggregators like Google Reader are definitely platform-agnostic and integrate computers of different scale and purpose, tying together blogs, online incarnations of dead tree newspapers, and community forums like Slashdot and digg, and you may very well be able to get your mobile phone to browse Google Reader directly (although I haven’t tried it myself.) Neither webmail clients nor SMTP-to-SMS gateways really have the sheen and polish that is typical of Web 2.0, mostly because they have existed since long before AJAX became such a widely-deployed programming technique. But their existence allows you to easily keep a real-time online presence, assuming you always carry your phone with you. With most mobile phone software suites including some form of online instant messaging, the evolution continues, and now that Gmail integrates AOL Instant Messaging, the cycle of recursion is complete.

To put it more concretely, what I mean by integrating computing power of different scales and different purposes is that there is an asymmetry between the computers (used in a relatively loose sense) involved in the integration. For example, your mobile phone or digital music player may very well sport an ARM XScale CPU, running at something like 624 MHz, with at least a few hundred megs of RAM, plus-or-minus a hard drive. Meanwhile, your desktop or notebook has some sort of Intel or AMD processor running somewhere in the 2 GHz or more range, with possibly several gigs of RAM and anywhere between 100 GB to a terabyte or more of hard drive space. And the average server farm usually uses several boxes with several Intel, AMD, or even PPC CPUs, each with some enormous amount of RAM, and some exponential amount of hard drive space.

And generally, the purposes of each level of computing power differ as well. Your mobile phone is generally most useful for, you know, calling people, and maybe sending text messages as well. Your digital music player pretty much plays digital music. Your notebook or desktop, while still a much more general purpose machine, is, for the average person, often relegated to workaday tasks like composing documents. Meanwhile, these server farms are usually dedicated to a single purpose: hosting content. In the case of iTMS, this means digital music and video. In the case of Google, this means the index of the entire web. In the case of Wikipedia, it means a compendium of everything and anything.

I think this may be where the client-server concept missed the mark. In the 1990’s, people really envisioned thin-clients in a really limited role. They were just more sophisticated dumb terminals connected to some type of server that didn’t necessarily have to be a mainframe. The so-called Network PC never really took off because all we were doing was shuffling around the hardware. In the end, you would use a thin-client in pretty much the same way you would use a full-fledged PC.

Handheld computing also sort of missed the mark as well. Palms and Pocket PCs really just try to cram desktop computing applications into a smaller space with a weirder UI—they’re essentially a substitute for situations when you don’t have access to a desktop, or can’t really carry a notebook.

What needed to happen was specialization.

The all-in-one capabilities of a PC are starting to become less critical. Computer appliances are finally really starting to take off. Mobile phones are essentially ubiquitous. Digital music players are as prevalent as portable cassette players and portable CD players were in their respective eras. Digital video recorders are practically standard with cable packages nowadays. Gaming has been widely dominated by consoles and their handheld counterparts. The applications on dedicated (but smaller scale) computers (again, loose usage) are typically far superior in terms of convenience when compared to their notebook/desktop counterparts.

On the other end of the spectrum, any content that hopes to be significant must sit within cold iron that is tethered to the Internet. While you can certainly serve content from your desktop PC, this is neither reliable nor generally supported or even allowed by many ISPs. Any serious web app has to be available 24 hours/365 days, and a DSL connection ain’t gonna cut it.

So what is the role of the PC? If you follow O’Reilly’s thought process, it is the conduit. The configurer. The deployer. While you can certainly configure your mobile phone from within its sometimes inadequate UI, and even browse the web on it, things are much easier and cheaper when you can just push content from your desktop to your phone. The reigning paradigm is, and will probably continue to be, browsing the web from a large screen. This is likely to continue even when desktops and notebooks as we know them are largely obsolete. Ubiquitous computing as envisioned by the intersection of popular culture and science fiction in films such as “The Matrix” and “Minority Report”, while abolishing the notion of a huge nondescript ugly beige box with a noisy fan, still favors visual representation of information on a large viewport or at least on multiple mid-sized viewports. Even with the advent of the iPhone, I am skeptical that the general public will favor manipulation of data on small viewports over large tactile viewports, which will initially be displayed on large plasma screens, then perhaps on digital paper, then, eventually, when the ubicomp revolution is complete, in thin air using holographic technology.

When ubicomp becomes mainstream, I envision an era where the PC as an actual device may not even exist. When hardware is cheap and truly ubiquitous, in the sense of a thorough blanketing of all the meatspace commonly traversed by people, the function of PCs can be distributed. If input devices become essentially virtual (like the tactile GUI featured in “Minority Report” or even existing holographic keyboards) and the display is also purely holographic, with ubiquitous wireless communications, the actual computer hardware need not be in the same room, or even in the same building, and if the wireless network is truly ubiquitous, you can easily roam from place to place.

But seeing the awkward phrase “software above the level of a single device” often has got my gears turning. There’s got to be a more elegant way to encapsulate this concept.

The move from the all-in-one PC to a world more dependent on smaller, cheaper, dedicated smart peripherals (meaning peripherals that have their own CPUs, RAM, and even hard drive space) as well as on larger server clusters and mainframes reminds me a lot of the evolution of single-celled organisms to multi-celled organisms. You can think of all-in-one PCs as single cells, and thin-client networks as colonies of single cells. Mainframes are huge single-celled organisms. Server farms and clusters are syncytia. The real evolutionary leap is when single cells began to specialize and perform unique but limited functions.

I am tempted to refer to the use of software above the level of a single device as multicellular computing, but this is unlikely to resonate with people who don’t have backgrounds in biology. Still, there’s got to be a better name for it.




posted on


part 1

in this city
there are trains
sleek and silent
thrumming at 60 hertz
the sound of stoppered lightning
of magnetic fields
and electron winds
meandering above the arroyo
and the freeway below
swollen with chrome and rubber

or utilitarian, work-a-day
rumbling beside
the Way of the King
El Camino Real
still smelling that stench of sulfur
amidst the burning coal
the tracks and ties laid down in
those by-gone days
that ancien regime
the empire that was, will be
myth layered upon geology, geography
slapdash palimpsest, decoupage
of history, scams, and dreams

the surf splashing up against the shore
my eyes gaze into that infinite distance
where the sky and sea bind one another

and home
the glowing, glimmering tower
in sunset, and in twilight’s actinic glow
the fiery fragrance of the Santa Anas
rolling in through the canyons and passes

for want of a better name
and still I feel exiled
furtive, hidden

part 2

why I cannot let this impossibility go
this unfulfillable ache that rends and tears
that gnaws and rasps at my insides
leaving me gasping
I cannot draw breath to scream

your eyes glowing with the future unfolding
(where I am some vagrant interloper,
some scoundrel hitch-hiking down the road)
and even in this dream where I have no place
I am warm
I am welcome

an eager spectator to the tale of your happiness
even though I watch through the window
face pressed up against the glass
feeling nothing of the heat of that hearth


I carry this in my mind
like a flask filled with cordial
savor it one drop at a time

part 3

memory fades
bled of color
muted and silent
and all I know is the cold
seeping into my soul
enveloping me

it wasn’t so much the cold, though
but the futile cycles
of thawing and freezing
of endless sunlight
and endless darkness
of summers that never seemed to come
and winters that would not relent

in this time-slice of a future that could not be
somehow saved from the trash heap
of unforgiving history
though lovely be thine skin
and shapely still thou dost appear
mine heart you never had
a future long-past where I did not belong

(and even in these quiet moments,
I remember a scarce blessing or two or three
and my faint-hearted wandering
amidst the maze that mirrored my soul.)

part 4

so close to home
that it grew alien
and I never learned to separate
the past from the present
the present from the future
locked in this never-ending time warp
this enraptured Groundhog’s Day
whirling me around in existential dread

in a thousand different ways
my innocence was forever shattered
all illusion of purity and virtue lost
leaving the ugly entrails of vile humanity
writhing in the open wind
for all the world to see

this paradise that is poison
this land of opportunity like a fly trap
a honey pot

I’ve never met so many low-lifes and thieves
so many people sick of dreaming
calling their pedestrian vision “reality”
devoid of any hope or sanity
their brains scrambled beyond salvage
beyond reason
dreaming to get dismembered, disintegrated
for a boy who would be king
turning this place into a savage country
pitiless, arid, worthless
that even General Santa Anna would willingly cede

part 5

lost wanderer
soul billowing in the wind
uselessly, like a flag on the mast
at full gale force

my soul orbits that singularity
where you tunnel through space-time
well beyond my reach
I am still chasing starlight
flying farther and faster away from me

my maps all lead me awry
my compass needle spins wild
even the stars are all wrong
I doubt that in this lifetime
I will reach you
but I have nowhere else to go


amazon and itms

posted on

Robert Scoble seems to be spinning this as an attack on Apple, but as an iPod owner, what this means is that I now have two places where I can legitimately buy songs in digital format (and even more if more artists get with it and go the way of Radiohead.) Looks like a win-win to me.


facing the unknown

posted on

will it be just like falling asleep
without waking
an eternal night
without sun’s dawning
no stars, no moon
just the silence
and the void?

is there a last goodbye,
a threshold you must cross
a point of no return?
when does living start becoming dying?
when does the soul stop stirring
grow stagnant, still, now rotting?

even in this late, unhopeful hour
I grow afraid
of the cold, of the dark
of the unforgiving emptiness

this vacuum causing traction upon my soul
my insides slowly implode
like a dying star losing its fight against gravity

I do not hope, but despair wracks my body
steals every breath
makes each waiting moment painful agony
I dare not wish, but volition still stirs deep within me
wriggles and writhes like some small animal
scorched and beaten, mangled and cornered
each moment meted out by its frantic heartbeat
thrumming in time to each ragged breath
as hard as I’ve tried to kill it
it fights with preternatural power
with unquellable primitive instinct

I believe
that the transformation from living
to dead
is no mean task
not a simple thing that is done as a trifle
the road of death itself
is long and hard
and there is a price to pay
for that dark road among the stars

you cast away all hope
is the first thing to pay
and even dreams of love
evaporate to the wind
and still you weigh too much
and the path cannot hold you

the poisons are incremental
you must concentrate long and hard
drink deep of the draught of Dionysus
or instill the liquor of Morpheus into thy veins
the powders of the Alpha and mayhap the Omega
the leaves of the Quechua of the Andean Plateaus and still and still
Death is a hard mistress to win

even with miracles scintillating through my brain
this emptiness will not stop
this emptiness keeps expanding
blotting out the sun and the stars
consuming everything and anything
outpacing the small gains I make everyday
and still the darkness will not catch me
it drives me still, taunting me with death
but not giving it to me
and I hang limp and flaccid
weary and worn

the end of the universe is a dull, drear thing
not with a bang
but the slow dispersion of all Creation
colder and colder
beyond freezing
but never still
disorder ever increases
but the darkness keeps expanding

is this my fate, to hang still in this rarefied nothingness?
suspended by my shorts, to stare at the receding stars?
am I, just now, doomed, and yet will live a trillion years
to witness the heat death of the Universe?
what new thing might I see under the sun
when even the sun is blown to oblivion
a smoldering coal ember
fading, fading, but never still?

The question is not new
perhaps even when I was newborn, the question had formed on my lips
though I had not the words to ask it:
why am I?
and the silence is bitter to me
is colder than ice, than steel, than the empty vacuum of space
and I can guess at the answer
but even if I know that there is no answer
I still am not free

life is a hard burden
growing heavier, fatter, more ponderous in my mind but I dare not cast it off

the suffering that is known
—I cannot bear it— but I neither dare

Until one hour, one minute, one second
either I will know the answer
or gain the courage to set myself free