dendritic arborization • I like that phrase

disordered thought processes

hidden in the seeming chaos is beautiful, elegant order—at least, I hope that's true.


posted on February 28th, 2008

Not good.

genetics: only the abnormal is interesting

posted on February 25th, 2008

One of the things that drew me to the field of genetics as an undergraduate was that geneticists seemed to value the abnormal over the normal. Genes are named for the “abnormal” phenotype rather than the wild-type phenotype. So you get names like hedgehog, a gene which, when mutated, causes a fruit fly to develop “lawns” of prickly denticles, and fringe, which gives the fruit fly, well, fringes. And geneticists aren’t afraid to appropriate facets of sociology and popular culture: one of the better-known mammalian analogues of hedgehog is sonic hedgehog, named after the video game character.

Sonic the Hedgehog

And a vertebrate analogue of fringe is radical fringe. There’s even an analogue called lunatic fringe.

So it was no surprise that, when scientists recently discovered the gene responsible for calibrating a flower’s internal clock, they named it FIONA1, after the heroine of “Shrek,” who is human by day and ogress by night.

Princess Fiona during daylight hours
Princess Fiona at night

quiz: what punctuation mark are you?

posted on February 24th, 2008

You Are a Question Mark
You seek knowledge and insight in every form possible. You love learning.
And while you know a lot, you don’t act like a know it all. You’re open to learning you’re wrong.

You ask a lot of questions, collect a lot of data, and always dig deep to find out more.
You’re naturally curious and inquisitive. You jump to ask a question when the opportunity arises.

Your friends see you as interesting, insightful, and thought provoking.
(But they’re not always up for the intense inquisitions that you love!)

You excel in: Higher education

You get along best with: The Comma
What Punctuation Mark Are You?

serious geekery

posted on February 24th, 2008

Well, this is a fun little puzzle from Google Blogoscoped: expressing common English idioms as code.

I’ve translated them to Ruby-esque code fragments (with some Rails-isms sprinkled in.)

// idiom 1
cop[0].goodInPercent = 100;
cop[1].goodInPercent = 0;

return GoodCop + BadCop

// idiom 2
isCrowd = personCounter >= 3;

assert (3 == isCrowd)

// idiom 3
injury += insult;

return (insult + injury)

// idiom 4
1: board.draw();
goto 1;

def Board.back
  Board.type = 'drawing'
end

// idiom 5
if (bird[1].feather == bird[2].feather) {

flock.collect {|bird| feathers.each {|feather| bird.find_by_feather(feather)}}

// idiom 6
a = getThickness('blood');
b = getThickness('water');
assert(a > b);

Blood.SpecificGravity = 1.0506 # reference:
Water.SpecificGravity = 1.0000
assert (Blood.SpecificGravity > Water.SpecificGravity)

// idiom 7

class Spade
  def call(value); end
end
Spade.new.call('Spade')

// idiom 8
function die(max) {
    for (i = 1; i <= max; i++) {


// idiom 9
prey = 'worm';
time = getCurrentTime();
if (time >= 4 && time <= 8) {

bird = => 'early')
worm =
if bird.shift == 'early' {

// idiom 10
while ( ) {

Nero(:instrument => 'fiddle') while Rome.is_burning?

// idiom 11
function getValue(garbage) {
    return garbage;

print Value.Out

// idiom 12
take(salt * .01);

Salt.take(:mass => 0.06479891, :unit => 'gram')

// idiom 13
var here = false;
var there = false;

assert (!(here or there) == true)

// idiom 14
if (i == 2) {

def self.takes(value)
  if value == 2 then self.tango();

// idiom 15
days = 365;
for (day = 1; day <= days; day++) {
    if ( random(0,100) <= 50 ) apple++;
if (apple <= days) doctor();

year.collect {|day| if day.applecount > 0 then doctor.keepaway()}

// idiom 16
if ( !dogs.sleep() ) {

if dog.is_asleep? then dog.lay()

// idiom 17
function tunnel() {
    var dark;
    for (i = 0; i < 10; i++) {
        dark = true;
    dark = !dark;
    return dark;

class Tunnel
  def self.istherelight?
    return (self.position == END)
  end
end

// idiom 18
if ( ape.inLineOfSight(it.x, it.y) );

if monkey.doessee?(:actor) then

// idiom 19
return || way.high;

return way(:my) || way(:high)

// idiom 20
hay[ random(0, hay.length - 1) ] = 'needle';

hay =

// idiom 21
a = 0;
b = 1;

// idiom 22
function getGain(pain) {
    return pain >= 1;

if (pain.level == 0) then gain.level = 0

// idiom 23
if (cooks >= 3) {
    broth = null;

if cook.count >= 3 then broth.is_spoiled? = true

// idiom 24
if (a != 'cake');

class Cake
  def Cake.have()
    Cake.ishad? = Cake.iseaten? ? false : true

    Cake.iseaten? = Cake.ishad? ? false : true
  end
end

// idiom 25
doesStand = you == me;

if We.is_united? then We.stand()
if We.is_divided? then We.fall()

// idiom 26
var location = getLocation();
if (location == 'rome') {
    do( location.getCitizen() );

if person.locale == 'Rome' then person.behavior(:Roman)

random walking through pubmed

posted on February 24th, 2008

I’m a sucker for cross-disciplinary, cross-age demographic topics, and Grand Rounds today was given by Martina Brueckner, M.D., a pediatric cardiologist and scientist from Yale.

She talked about how cilia are responsible for generating the left-right asymmetry in mammals. The most obvious of these asymmetries are organ position: the liver is on the right, the spleen and the stomach are on the left. But major organs themselves are asymmetric: the left side of the heart is much more muscular than the right side, for example, and the left side of the brain is usually bigger than the right in most people.

It has long been known that primary ciliary dyskinesia can be associated with situs inversus totalis (as in Kartagener’s syndrome), but also with other heterotaxies, such as situs ambiguus, which is more frequently associated with congenital heart disease.

Cilial dysfunction is also associated with polycystic kidney disease. While the cilia themselves may be normal, what is abnormal is one of two genes: polycystin-1 or polycystin-2. Interestingly, these encode mechanosensitive cation channels which regulate the flow of calcium. Even more interesting, these types of channels can be inhibited by gadolinium.

My addled brain made this immediate jump: cilia are everywhere in the body, gadolinium is thought to cause nephrogenic systemic fibrosis; maybe it’s these mechanosensitive/stretch-activated cation channels that mediate this serious adverse effect?

Now, granted, it’s certain to be much more complicated than that. The toxicity of free gadolinium ions has not been really looked at specifically, although some effects are known from exposure to the class of rare earth metals. One of the proposed mechanisms of NSF is that, since in renal failure, the gadolinium-chelate complex hangs around for quite a while, there’s enough time for the free gadolinium ion to dissociate.

Another paper talks about an association between the development of NSF in the setting of a pro-inflammatory condition. This would actually make sense, since it is known that cytokines promote fibrosis and scar formation.

The study of NSF is still in its infancy. In the meantime, it’ll be difficult to get any sort of imaging done in anyone with seriously boxed kidneys.

trying to avoid misogyny

posted on February 21st, 2008

Now I’ve been an Obama supporter since he ran for Illinois State Senate. I definitely want him to win the Democratic nomination so that he can kick McCain’s ass and win the presidency.

Truth be told, I *am* totally turned off by HRC’s strategy of promising a return to the 1990’s. While the Clinton Era (which I like to refer to as the Pax Clintonia) was definitely a better time than the W Era (which I like to refer to as the Fall of the Republic), you know what, it could’ve been better.

This is not to say that I think Obama is going to somehow get Democrats and Republicans to hold hands and sing Kumbayah for the next four years. In all reality, even if the Democrats dominate two of three branches of the government, I don’t care if they don’t do anything at all. All I want to stop is the active demolition of the Constitution by W and Big Dick. I don’t particularly care if Congress ends up in gridlock for the rest of my lifetime. As long as they’re not actively trying to subvert the intent of the Founding Fathers and not trying to set up a theocracy backed by mercenaries who are above the law, I’ll be happy.

While my vote tends to go to the progressive candidate, my ideal form of government is still utopian anarchy. Not that that’s ever going to happen, but a man can dream, can’t he?

So if HRC manages to take Obama down somehow, I guess I will probably grudgingly vote for her. (Assuming that I don’t come down with another case of acute gastroenteritis.)

Alison of bluishorange complains about how people refer to Barack as “Obama” while they refer to HRC as “Hillary,” but realizes that HRC herself is promoting her first name in her campaign. Now, I can see how this can be construed as disrespectful. In all honesty, that’s why I call her HRC. (You know, like FDR, JFK, RFK, MLK, LBJ. I mean, that’s not bad company, is it?) You can’t just call her “Clinton” because that’ll be too confusing. (Seriously, if she wins the presidency, how do we refer to them? Are we going to be doomed to constantly refer to Bill as “former president and first gentleman Clinton”?)

And referring to Barack as BHO is just not going to work for aesthetic reasons.

Last night, I dreamt that the L.A. Lakers were in the playoffs, and they had fought pitched battles against the strongest teams in the Western Conference, eking out a bare win in the seventh game of the Conference Finals in overtime, only to lose the NBA Championship to some pathetic Eastern Conference team, and I woke up completely drained and agitated.

And I realized that my dream wasn’t about basketball, but about the Presidential Race. I really want Obama to win this thing, but I’m afraid of how I’ll feel if he doesn’t.

urban warfare

posted on February 21st, 2008

Apparently the Glendale Freeway is closed right before the exit to my parents’ house because some guys started firing on cops with automatic weapons, and the LAPD shot back, killing one of them.


A few folks are commenting about this in real time.

not finding

posted on February 21st, 2008

were I not to find
that which I most desperately seek
what would this life be worth?
not nil, I pray
even in this half-existence
can I not steal a few drops of
reflected sunlight
from ghosts and phantoms
of things that could never be?
like a half-starved cur
begging at his master’s feet.

through bone weariness
I have toiled
even through the lonely night
I kept vigil
past the dizzying confusion
of a million thoughts
like thread
like spun cloth
woven, tied, knotted
haphazardly sewn together
then rent apart

and in this darkest hour
when my soul quivers
my heart quakes
knowing that all is lost
(but can you really lose
what you never had?)
I am hopeful.

the road never ending
(though my journey will someday
cease, unfailing)
like a tattered ribbon unfurling
before me
meandering, twisting
lost in the mists and shadows
over the ashen hillsides
where the lichen and the moss
scrabble greedily for rain
seeping into the desert sand
onto the barren plains
of scrub and tumbleweeds
past the bank of drifting fog
disappearing amid the storm clouds
ere it comes to the horizon

and all the distances
I have trodden
still the road goes ever on
and darkness has always loomed behind me
while I go and chase the sun

and still the days unending
like bricks or tiles
lined orderly upon the calendar page
as if fleet-footed time
could truly be captured
within this black-and-white
prison of lines and numbers

10,000 sunsets
(and how many rainy days)
and I cannot conceive of the end
still hoping that I might continue travelling
so long as I do not find that which I seek

what a devil’s deal that would be to live on and on, beyond all bearing
as long as I never discovered
that for which my heart so piteously yearns and longs

worst president ever

posted on February 20th, 2008

Jeff Albertson AKA The Comic Book Guy
George W Bush AKA The Worst President of the United States

W’s approval rating is down to 19%.

cerebral malaria

posted on February 17th, 2008

Erythropoietin protects children from the cerebrovascular ischemic effects of cerebral malaria.

Well, this certainly makes sense. One of the major effects of malaria is, after all, severe hemolytic anemia, and anything that will increase oxygen delivery to the brain is going to help.

But the pro-thrombotic effects of parasites lodged in the brain’s blood vessels are another thing entirely, resembling nothing so much as disseminated intravascular coagulation (DIC).

And I don’t know how practical trying to use Epo in developing countries would be, considering that you have to give it subcutaneously or intravenously, and that it’s pretty damn expensive.

google reader

posted on February 17th, 2008

I find it really ironic. Despite the fact that I’m drawn to technology, I find myself resisting dominant trends. When everyone had CD players, I was still hanging on to cassette tapes. When the world was dominated by x86 clones, I was still banging away at my 8-bit Commodore 64. When Windows 3.0 came out, I stuck with MS-DOS.

When GUIs became the dominant interface, I stubbornly hung on to the command-line, and stayed away from Mac OS for the longest time precisely because there was no command-line.

When the cell phone revolution took hold, I held out until 2001, when getting stranded in a storm without a pay phone in sight finally convinced me that being able to call for help was a good idea. And when mp3 players started making their way to the market, I didn’t buy in until 2002, just after I had finally given in and bought my first notebook, which incidentally also marked the time I finally stopped using floppy drives and started using USB flash drives.

When PDA mania was at its peak, I gave in temporarily, but I find my Palm now sitting in a drawer collecting dust. I keep my random notes in Mead notebooks (although I have recently upgraded to Moleskine notebooks.) At work, I still use index cards and little scraps of paper and dislike having to use the computer, except for using Google to search for scholarly articles.

And when Web 2.0 came to the fore, I still hung on to my command-line apps, using first Pine, then Mutt, then grudgingly moving on to the GUI desktop with Evolution and then to read my e-mail.

But then Google finally snagged me. Working faster than, with all of the niceties of a desktop client and even a command-line client, with powerful spam-killing abilities to boot, I’m completely hooked on Gmail.

But that’s not all.

I didn’t jump onto the RSS feed reader bandwagon until 2006. Using Safari was not all that intuitive, and I definitely didn’t want to use (particularly not with the number of feeds I follow), so I looked around for a dedicated desktop RSS reader. I really dug (and still do dig) Vienna and I’ve been using it for a year and a half now.

But Google Reader is useful when I’m not using my iBook or my Mac Mini, and particularly useful at work, since they can’t really block Google (although they can disable Javascript, which is a real pain-in-the-ass.)

So for a while, I found myself exporting and importing OPML files. (And, quite nicely, both Vienna and Google Reader will sync feeds automatically. You don’t have to worry about duplicates.)
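A minimal sketch of what that merging amounts to, assuming nothing beyond Ruby’s bundled REXML parser (the feed titles and URLs below are made up for illustration):

```ruby
require 'rexml/document'
require 'set'

# Collect the unique feed URLs (xmlUrl attributes) from an OPML export.
def feed_urls(opml)
  doc  = REXML::Document.new(opml)
  urls = Set.new
  REXML::XPath.each(doc, '//outline[@xmlUrl]') do |outline|
    urls << outline.attributes['xmlUrl']
  end
  urls
end

# Two hypothetical exports, e.g. one from Vienna and one from Google Reader.
vienna = <<OPML
<opml version="1.1"><body>
  <outline title="Balloon Juice" xmlUrl="http://example.com/bj.xml"/>
  <outline title="digg" xmlUrl="http://example.com/digg.xml"/>
</body></opml>
OPML

reader = <<OPML
<opml version="1.1"><body>
  <outline title="digg" xmlUrl="http://example.com/digg.xml"/>
</body></opml>
OPML

# Set union drops the duplicate digg subscription automatically.
merged = feed_urls(vienna) | feed_urls(reader)
puts merged.size  # => 2
```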

Then the Facebook revolution took hold, and Google Reader added the “Share This” function.

This is kind of neat, because it funnels any of the links you share from Google Reader onto your Facebook profile and onto the friend feed. But I could do something similar with Vienna and using the Facebook app, and the advantage with going through is that you can insert little comments about the links.

So now I have three tiers of bookmarking:

  1. Links that I really want to hold onto for future reference end up on In fact, has pretty much replaced my browser’s bookmark function. About the only use I have for bookmarks in Safari is to house those Javascript scriptlets that allow me to share links to and Facebook.
  2. Links that I find amusing or informative while I’m on my iBook or Mac Mini that might be entertaining or enlightening to my friends but which I probably won’t refer to in the future get shared directly using the “Share on Facebook” Javascript bookmark button scriptlet. Mostly these are trendy memes, particularly videos and songs.
  3. Links that I find amusing or informative while I’m at work or on somebody else’s machine that might be entertaining or enlightening to my friends get shared through Google Reader.

All of these find their way to my Facebook profile and the omnipresent friends feed. And if Google Reader added a way to add comments, I might never use Facebook’s bookmarklet.

Another neat feature that Google recently added is RSS feed recommendations. I haven’t really gone crazy with this feature because I’m already following over 600 feeds, and I’m trying to prevent RSS feed reading from completely taking over my life.

But the most compelling feature that will probably end up making Google Reader my primary RSS reader is the Trends feature. Google breaks down your reading habits, tabulates things, and spits out a nice chart.

I discovered some fairly disturbing things.

So far, it appears that the feed I read the most items from is digg, despite the fact that I constantly deride the intelligence of people who use that site, although I only in fact read 4% of the articles posted. For the most part, though, I’ve been following political blogs the most, it seems. The blog that I’ve been following the closest (reading 14% of the articles published) is the political blog Balloon Juice, written by a former Republican whose intellect prevented him from following W’s disastrous neocon (with emphasis on “con” as in “convict”) administration off the precipice. Coming in second at 8% is Crooks and Liars, described as “a politically left blog focusing on political events.”

So now Google is my primary mail app, and my primary RSS reader. How soon until Google makes my OS completely irrelevant?

You know, if Google wanted to rule the world, all they need is a paramilitary organization (I hear Blackwater is looking for some positive PR), and they’re pretty much there.

desktop linux

posted on February 16th, 2008

All sorts of theories have been proposed about why Linux has not taken a significant chunk of the desktop OS market. While there may be merit in some of the psychosocial economics presented, these theories ignore bigger, more concrete reasons why Linux hasn’t seriously eroded Windows’ market share, and why even Mac OS X is outstripping it on the desktop front.

The pre-installation argument

Everyone knows that people are generally lazy. If your machine comes with XP, you’re probably going to just use XP. Even when they only have the base OS installed on an otherwise empty hard drive, very few people are actually motivated enough to wipe out their hard drive and install a different OS. (And the longer someone gives in to inertia, the more crap will accumulate on the hard drive. And the more cruft that accumulates on someone’s hard drive, the less likely they are going to want to reformat and reinstall.) This is what makes the rebellion against using Vista so remarkable. It’s the first time people are demanding that they have the right to delete the base OS and install something else.

But notice something about the Vista debacle: people are demanding that the computer manufacturers themselves should be responsible for pre-installing XP. They don’t want to deal with having to do the reformat and reinstall themselves.

Now I know that Dell offers machines pre-installed with Ubuntu, but it’s not the default. You have to ask for it. Even this small obstacle is probably enough to deter people from adopting Linux. The fact that you have to ask for XP to be pre-installed instead of Vista is probably the reason that Vista’s numbers are even as high as they are.

Pre-installation is the reason why IE ended up destroying Netscape. Despite the fact that IE 1 was worthless and IE 2 was still rather primitive compared to their Netscape counterparts, the fact that you didn’t have to download and install anything essentially assured IE’s dominance.

Hardware first, software second

It’s true that most Linux distros have all the capabilities that most casual computer users need from a computer: web browsing, basic office productivity software (word processing, spreadsheet, graphics manipulation), and the ability to listen to mp3s and watch movies. The most popular Linux distros sport desktop interfaces that are decidedly Windows-like, and most computer-literate people have no real problem with navigating the Linux desktop.

But the average person doesn’t really choose an OS. They decide what kind of computer they want to buy.

In the same way, people don’t really choose automatic transmission or four-wheel drive. They decide what kind of car they want to buy.

And if almost every single computer manufacturer out there pre-installs Vista on their machines, then more likely than not, unless you’ve got very particular needs, you’re going to end up with a computer with Vista.

Notice that Mac users don’t choose to run Mac OS X. What they decide is that they want to buy a Mac. And not just any Mac. They have to choose between very different form factors: MacBook, MacBook Pro, MacBook Air, Mac Mini, iMac, Mac Pro, XServe. While in terms of architecture, these machines are interchangeable, their specific hardware is very different, in the same way that while two cars may have the same engine and chassis, the sports coupe is decidedly different from the station wagon.

The choice of hardware is what is going to cost you big bucks. And you just have to live with the OS it comes with.

So until a particular computer manufacturer has a particular stake in selling Linux machines, and until they start exerting marketing resources to that goal, desktop dominance just ain’t going to happen.

Disconnected from the goal

I think that the reason why Mac users are so fanatical about their OS, but more importantly, about their machine, is that they know that Apple is interested in designing machines specifically for the home desktop. So Apple’s strategy has always been to design the hardware first, and the interface comes as a natural result. Since their goal is to sell hardware, they need to make decent software to make buying and owning a Mac a worthwhile experience. So they’ve researched optimal UI design, and they’ve come up with the Human Interface Guidelines, and most popular applications on the Mac are extremely intuitive to use.

In contrast, the average manufacturer of x86 machines doesn’t really specifically target the home desktop. The same Dell that sits in your house is probably the exact same Dell that is sitting in the office, except maybe the one in the office doesn’t have speakers and has a bigger hard drive. And in reality, most manufacturers of x86 machines care way more about the business market than they do about the consumer market.

As far as Mac OS X goes, Apple isn’t so much interested in creating an optimal OS as they are in creating an optimal interaction and experience with the hardware they want to sell you.

In contrast, Microsoft just wants to sell software. In the 8-bit days of yore, they didn’t even sell code directly to the consumer. They contracted with computer manufacturers to design their OSes, or more accurately, their interfaces. Back in those days, the interface was generally a BASIC interpreter. And Microsoft wrote BASIC implementations for all the major players in those days: Apple, Atari, Commodore, Tandy, and IBM.

The first actual OS (or more accurately, kernel and command-line interface, I suppose) that Microsoft built was MS-DOS, initially licensed to IBM (who customized it and released it as PC-DOS), but then became available for other OEMs to license.

So all of these companies took Microsoft’s source code and customized it for their particular machines. Whether or not a machine sold wasn’t Microsoft’s responsibility. And platform compatibility was something no one really cared about.

When various computer companies managed to reverse engineer IBM’s BIOS, the age of clones had arrived, prices dropped, and everyone pretty much standardized on IBM-compatible architecture, which meant an x86 CPU with BIOS and MS-DOS running on top of it. When Compaq released the first 32-bit machine before IBM did, this set the stage for the first commercially successful version of Windows: 3.0.

The rest, as they say, is history. The power to design computer systems was wrested from computer manufacturers, instead becoming dictated by the ability of a computer to run Windows. Everything degraded to the lowest-common denominator (although there were in fact many computer manufacturers that dared stray from the beaten path, all of whom died lonely deaths, even including IBM’s personal computer division). And Microsoft really doesn’t care how particular computer manufacturers do. Even if Dell and HP went bankrupt, other companies would certainly take their place, and Microsoft still would continue to make tons of money.

This is, unfortunately, the landscape in which consumer-grade Linux finds itself: a market filled with indistinguishable x86 machines that are first and foremost Windows-capable.

In reality, Linux is generally successful in the realm of server-side machines, running on a wide variety of architectures. And the server market is very divergent from the desktop market. Heavily used servers often even eschew GUIs in order to maximize CPU cycles available to high-availability apps. UNIX and UNIX-like systems basically dominate this market. And ultimately, Linux is a UNIX-like system best suited to be a server, or to function in a cluster to run computation-intensive but often non-graphical applications. This is the market to which Linux-based companies like RedHat, Novell, and Oracle cater.

Too much imitation, not enough innovation

With regards to the desktop, however, Linux is essentially playing catch up. To the unfamiliar eye, both KDE and GNOME superficially look decidedly Windows-like. And most casual users tend to treat these environments as Windows knock-offs.

While they preserve features that were standard in X-based GUIs for decades but which Windows never implemented, such as virtual desktops and multiple viewports, most users who are familiar with Windows don’t really know what to do with them and tend to ignore them.

In fact, hard core Windows users even tend to be baffled by Mac OS X’s intuitive interface.

True, Windows is not the only interface that X on Linux can emulate. But one of the other major desktop environments, GNUstep, is in fact a NeXTstep clone (NeXTstep being the direct ancestor of Mac OS X).

Until Linux desktop environments develop truly useful features that Windows lacks, or alternative interfaces that transcend the dominant desktop metaphor, most people will continue to think of Linux as a Windows knock-off, and seen in this light, Linux will never look favorable in comparison.

The road ahead

While I’m not saying that Windows-like functionality should be ripped out of Linux distros, there needs to be a focus on alternative interfaces. Look at the direction that Apple is taking. The iPhone’s and iPod Touch’s interface is likely to be adapted to notebooks and desktop machines, bringing us closer to the fantasy interfaces found in “Minority Report” and “The Matrix”. And the Wii’s accelerometer is just waiting to be used in non-gaming applications.

Until Linux distinguishes itself as a unique desktop experience and not just a reimplementation of Windows functionality, it’s unlikely to make any in-roads on the desktop front.

Some people just don’t get it.

Unless Steve Jobs leaves Apple, and Apple ends up re-enacting the 1990’s, there’s no way Mac OS X is ever going to be available for random configurations of hardware built on top of the x86 platform. There’s absolutely no business or financial advantage to having to support such a mish-mash of devices, and it is completely inimical to Apple’s main source of revenue: hardware.

Apple is a hardware company. Software is just a side effect.

The focus often then shifts to the idea that, since Apple makes hardware, and not software, why not just release Mac OS X under an open source license? The argument is that this would allow Apple to obtain that oft-misunderstood holy grail: market share.

The reasoning appears to be that Microsoft is such a successful company precisely because they’ve got a lock-down on market share when it comes to OSes.

But that’s all fine and dandy. After all, Microsoft is a software company, and not really a hardware company, despite the fact that they make mice and gaming consoles. But in their case, the business model is exactly the opposite of Apple’s. Why deal with actually making machines when you can sell the software and let the computer manufacturer worry about the hardware details?

You’ve got to remember that making hardware is expensive and if it doesn’t do well, you’re going to lose a ton of money. I think of Ford’s Edsel, or Sony’s Betamax (a situation eerily recapitulated almost exactly by the HD-DVD debacle.)

I don’t suppose anyone remembers the Commodore Plus/4, but here was a successful computer company (the Commodore 64 is still the best-selling computer of all time) that blew it.

Apple itself has had a lot of expensive failures on their hands as well: the Apple III, the Lisa, the Newton, and the Cube. (Interestingly, Apple seems to have salvaged the concepts of the last two products and reworked them into commercial successes. You can think of the iPhone as the successor to the Newton, and the Mac Mini seems to be another incarnation of the Cube.)

Why deal with the possibility of investing so much money in a hardware project only to have it fail?

So Microsoft’s profits soared, and they took over the world in terms of market share. Sure, Microsoft makes tons of money, despite being taken down by the U.S. Department of Justice and by the E.U., and despite its poor PR image. But it must be emphasized: market share does not equal profitability.

Case in point: both the Sony PS3 and the XBox 360 are loss leaders, marketed with the hope that the developer’s licenses and royalties from released games will eventually net their respective companies some profits. Despite the fact that each console commands a significant percentage of market share, they don’t make money directly. If Sony and Microsoft weren’t multinational juggernauts with vast resources rivalling some industrialized nations, there’s no way they could stay in business with this kind of strategy.

This is in stark contrast to Apple’s strategy of making a very nice margin for every piece of hardware they sell. Despite having only 8% of the market share for personal computers, they made a profit of $1.58 billion this last financial quarter (total revenue was $9.6 billion.) When you’re making that much dough, do you really care if you only have 8% of the market?

I mean, seriously, it’s not like BMW is hurting despite their single-digit percentage in market share. (As an exercise in basic arithmetic, let’s calculate what that market share is, based on the number of BMWs sold in the U.S. as a proportion of all the cars sold in the U.S. According to the U.S. Bureau of Transportation Statistics, the number of cars sold in a year has been between 7.5 and 8.1 million from 2002 to 2005. We’ll assume that this number hasn’t significantly changed for 2007. According to an article from the Associated Press, BMW sold 335,840 cars in 2007. This means that BMW’s market share in the U.S. is between 4.1% and 4.5%. Nevertheless, BMW’s total revenues added up to $82.7 billion.)
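As a sanity check on that back-of-the-envelope arithmetic, using only the sales figures quoted above:

```ruby
# BMW's rough U.S. market share, from the figures cited in the post:
# 335,840 cars sold in 2007, out of 7.5-8.1 million total U.S. car sales.
bmw_sold   = 335_840
total_low  = 7_500_000
total_high = 8_100_000

share_low  = 100.0 * bmw_sold / total_high  # smallest possible share
share_high = 100.0 * bmw_sold / total_low   # largest possible share

printf("BMW U.S. market share: %.1f%%-%.1f%%\n", share_low, share_high)
# => BMW U.S. market share: 4.1%-4.5%
```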

Add this to the fact that Apple’s real competitors are Dell and HP (with 29.1% and 25.7% market share, respectively, according to an article about FY 2007 Q3 personal computer sales.) Dell had $15.7 billion in total revenue last quarter, but only made a profit of $766 million. HP’s 2007 Q4 performance was more respectable, with revenues of $28.3 billion and net income of $2.1 billion. Even still, you can see that market share is nowhere near proportional to profits made. Dell has more than 3.5 times Apple’s market share, but only about 1.6 times its revenue, and Dell’s profit was only half of Apple’s. HP has more than 3 times the market share and made nearly 3 times the revenue that Apple did, but HP’s profit was only 1.3 times bigger than Apple’s. In other words, chasing market share does not translate into making more money. Last I heard, Apple was a company interested in making money, and not so much in winning popularity contests.
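If you want to double-check those ratios, a few lines of Python reproduce them from the figures quoted above (again, the numbers are the ones cited in the articles, not independently verified):

```python
# Quarterly figures quoted in the post: (market share %, revenue, profit),
# revenue and profit in billions of U.S. dollars.
companies = {
    "Apple": (8.0, 9.6, 1.58),
    "Dell":  (29.1, 15.7, 0.766),
    "HP":    (25.7, 28.3, 2.1),
}

apple_share, apple_rev, apple_profit = companies["Apple"]
for name in ("Dell", "HP"):
    share, rev, profit = companies[name]
    print(f"{name}: {share / apple_share:.1f}x Apple's share, "
          f"{rev / apple_rev:.1f}x revenue, {profit / apple_profit:.1f}x profit")
# prints:
#   Dell: 3.6x Apple's share, 1.6x revenue, 0.5x profit
#   HP: 3.2x Apple's share, 2.9x revenue, 1.3x profit
```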

The counter-argument is that Sun Microsystems is also a hardware company, and should therefore have interests similar to Apple’s, and yet they’ve released their OS as open source. Ignoring the fact that their target demographics have very little overlap (although I suppose Mac OS X Server running on an XServe does compete directly with Sun’s hardware), this argument doesn’t do a good job of separating the different connotations of free/open source software.

Sure, Solaris 10 is free, but you have to pay for support, and as far as I can tell, you don’t get the source code. The default GUI (CDE) is not open source. The alternative GUI (JDS) is apparently built on top of GNOME, but it doesn’t look like you’d be able to get the source code for the Sun-specific bits, and most of the useful applications that you need a Solaris system for probably aren’t open source either. (Not to mention that it seems like most Solaris apps only run on SPARC and x86-64 CPUs, not so much on x86-32 CPUs.)

So what about OpenSolaris? Apparently OpenSolaris only refers to the kernel core and some basic utilities to go with it. Sure, the source code is open, but you’d still need to install a GUI and some apps. You need Solaris Express Community Edition or Developer Edition, and it doesn’t look like you get all the source code for those, either. Of course, there’s Nexenta, but then why not just install a Linux distro?

Or OpenDarwin + GNU-Darwin, for that matter? (OK, so OpenDarwin is dead, but you can still get the source code for Darwin from Apple, although I don’t think you can actually get it to compile.)

In other words, Sun’s model is almost exactly like Apple’s model: the base systems are open source (OpenSolaris vs. xnu + the BSD subsystem) but the GUI is proprietary (CDE vs. Aqua.) The only difference is that you can legally download the full version of Solaris for free.

Ultimately, what you’re paying for is support. If you need help with OpenSolaris, you can do what old-time Linux users had to do, and RTFM, scour the source, or desperately Google for answers. The alternative is to pay Sun to help you out (just like you would pay Red Hat.)

If you look at it that way, the way Apple does it is not bad. You buy any one of their products, and you’ve basically got free face-to-face tech support (in the form of the Genius Bar) until your piece of Apple equipment erupts in a fiery ball of flame. Add to that the fact that they’ll repair or replace your hardware free of charge for a year under warranty (which was responsible for my upgrade from an iBook G3 to an iBook G4), and the price you pay for an Apple system doesn’t seem like such a dear premium, if you ask me.

But I predict that Apple will release the source code of Aqua the day that Microsoft releases the source code of Vista and the day that Sun releases the source code of the proprietary bits of Solaris. So don’t hold your breath.

The Orange Lights “Click Your Heels” (stolen from D’s myspace page)

fire and rain

posted on February 14th, 2008

should my soul catch fire again
(the embers smolder, glow bright in the darkest hour)
not a wildfire streaking through the fields
up the mountainsides
leaving smoking disaster and ruinous ash

firelight in the darkness
on some distant shore
I am alone
bright burning warmth, like life
like breaking dawn on a winter morning
burning past the gray mist and blanketing fog

I watched the day fade away
the sunset burst into glorious luminescence
sending shafts of golden, amber light
sailing through the deep bank of rain clouds
(and all my weariness burned away)

a rainbow shimmers into existence
arcing against the twilight sky
cascading down into the deep recesses of the canyon below
great flocks of puffy white clouds waft eastward
and ice cold rain fall upon my face
rivulets then gouts of flowing water streaming down the road

still the light glimmers bright
this vision sears itself into my eyes
the soft fall of light rain upon the windowpane
the sound embedded in my heart
and in all the riot of sunbeams refracting and radiating
of storm clouds racing and raindrops falling
there is a quiet stillness
in this gentle, constant motion
and my soul lies tranquil
if only for a moment

I don’t know. Sometimes I wonder if I’ve just lost my capacity for friendship. For love. For caring.

Not that I don’t care, but I find that these days, I care about people in the abstract, as some reified, hypothetical construct. To steal a turn of phrase from the misanthropic Moe Szyslak from “The Simpsons”, I care about people in the sense that I don’t wish anyone any specific harm, and generally would like people to reach their goals (provided that said goals are generally benign, or more specifically, that they aren’t harmful to *me*.)

But I don’t remember the last time that this sentiment was backed with passion. With fire. With feeling.

I don’t remember the last time I was in love.

That is, if I’ve ever truly been in love.

Wow, there’s something inside me that is truly broken, like it’s been kicked to shit, ravaged into pieces, trampled, and burned, with the ashes scattered to the wind. I know it’s truly all fucked up and probably irreversibly destroyed because it doesn’t even hurt a bit.

time marches on

posted on February 11th, 2008

This week’s I-5 playlist, featuring cheesy love songs and songs to commit suicide to:

  1. Lionel Richie “The Only One” (flashback to sometime in the 1980s, but also evocative of August 2001)

  2. Eden’s Crush “Love This Way” (flashback to August 2001. Yes! Nicole Scherzinger!)

  3. Elliott Smith “I Didn’t Understand” (flashback to September 2007)

  4. Ben Folds Five “Evaporated” (flashback to January 2001)

  5. Hiroko Kasahara “Moichido Love You” (flashback to January 1996)

  6. Peabo Bryson and Regina Belle “A Whole New World” (flashback to December 1992, this version has Lea Salonga singing the woman’s part instead)

  7. Toad the Wet Sprocket “Little Heaven” (flashback to August 1992)

  8. Tiffany “All This Time” (flashback to June 1988, but more illustrative of June 1998)

Elliott Smith’s “I Didn’t Understand”, Ben Folds Five’s “Evaporated”, and Tiffany’s “All This Time” are not exactly the most uplifting set of songs, but they didn’t send me down the usual downward spiral of depression. What they did evoke was this sense of completely misunderstanding the past 15 years of my life. In some ways, I feel like destiny passed me by, and I’m living some sort of shadow life. In other ways, I feel like I completely misunderstood my destiny, and I’ve been wishing for things that were never meant to be, and now here I am, reluctantly awake from my fantastic dreams.

Dum spiro, spero, right?

nerdy tasks for today

posted on February 10th, 2008

For the last three years now, I’ve been trying to build GNOME on Mac OS X using Fink, but I always end up bailing out long before I get through the myriad of dependencies. Now I realize the easier thing would be just to bail on Fink and go with MacPorts, but I had already taken the trouble to learn Fink’s packaging format. Moreover, while MacPorts has a *BSD heritage (as does Mac OS X itself), Fink has more of a Linux heritage, which I’m far more familiar with.

Fink’s version of GNOME is now six stable releases behind. (Fink has GNOME 2.8. GNOME 2.10, 2.12, 2.14, 2.16, 2.18, and 2.20 have all been released.) So I started mucking around with the package files and got about a quarter of the way through the build order before I realized that the maintainers of the Fink Project had already gotten started with updating to GNOME 2.20.

So it’s back to the beginning, to make sure I’m using FreeType 2.3.5 in all the packages.

Mac OS X 10.5 “Leopard” has been out for several months now. I’ve upgraded my iBook G4 933 MHz and everything seems relatively solid, so now I’m ready to upgrade my Mac Mini G4 1.2 GHz. This, naturally, entails backing up my hard drive, which is taking forever.

40 year symmetry

posted on February 9th, 2008

My sister reminds me of the 1968 Democratic National Convention which ended up erupting into riots. 1968 was a crazy year. Both MLK and RFK had been assassinated. The Vietnam War was still raging and sending body bags back at an obscene rate, and the American public was in an uproar. LBJ had announced that he would not seek re-election. The front-runners of the Democratic presidential nomination were Hubert Humphrey (who would end up losing ignobly to Nixon), the more status quo candidate, and Eugene McCarthy, whose platform rested heavily on an anti-war stance, with the goal of rapid withdrawal from Vietnam. The undemocratic manner in which Humphrey won the nomination without having participated in a single primary ended up being a liability in the general election, and resulted in permanent changes in the nomination process.

The 2008 Democratic National Convention runs the risk of being a similar debacle. No one has been assassinated (yet), but the War in Iraq is still raging and continuing to send body bags back to the homeland. Al Gore is not going to run for re-election. While HRC (whom I see as the status quo candidate) and Obama (who has a legitimate anti-war stance) are a lot closer in delegate counts than Humphrey and McCarthy were, there looms the specter of superdelegates deciding the nomination. And if they stupidly swing the pendulum away from the popular vote, you can be certain that mayhem will ensue. The other albatrosses are Michigan and Florida, which, according to party rules, will have no delegates, as punishment for moving their respective primaries up. While it sucks that Michigan and Florida Democrats have been denied their ability to vote, it would also suck immensely to reverse the rules all of a sudden now. HRC has already brought up the nasty notion of contesting this ruling, and if this becomes an issue, it will also likely cause chaos.

On the other hand, it looks more like it will be the Republican Party that is bound to fracture. The unholy alliance of corporate interests, libertarians, and fundamentalist Christians that got Reagan elected back in the day is unravelling with each primary and caucus. The party faithful have chosen McCain—the man renowned for being a “maverick”—as their candidate. Romney—the status quo “mainstream” candidate backed by such hacks as Rush Limbaugh and Hugh Hewitt—has tanked badly and is out. Only the spoilers are left: Huckabee—the darling of unhinged fundamentalist Christians—and Ron Paul—the dark horse supported by delusional libertarians (although there are rumors that his run is coming to an end.) Even if McCain manages to fix the election in his favor, the Republican Party as we know it is finished.

The scary thing is the idea that the nomination of HRC will galvanize Republicans of all stripes to vote against her. Given what happened in 2004 (in which Democrats foolishly picked John Kerry as the “anybody-but-Bush” candidate), it remains to be seen whether antipathy against a particular candidate will be enough. I am somewhat skeptical that an “anybody-but-Hillary” sentiment will actually help McCain win the election. (I am less skeptical that Diebold may help McCain win the election.)

In any case, Barack has been my man from the start, ever since he was in the Illinois State Senate. I left Illinois in 2004, right when he was starting his bid for the U.S. Senate, and I never dreamt that he would make it this far.

In policy content, the differences between HRC and Obama are small details. (Not that the details don’t matter, but they have far more similarity with each other than they do with McCain.) But the one thing that swings my vote completely is the fact that HRC voted for the war, and Obama was against it from the start.

To paraphrase Obama himself, there’s something to be said about being ready on day one, but there’s also something to be said about being right on day one.

It’s going to be an interesting year. Here’s to hoping that no blood will be spilt.

the failure of vista

posted on February 8th, 2008

It seems to be conventional wisdom that Windows Vista sucks, and that most Windows users are not going to be comfortable switching to a Linux distro. (Mostly because they can’t play their games, but my advice for them is to invest in a PS3 or a Wii, *or* buy a Mac and install Parallels so that you have Windows around whenever you need to get your game on.)

But what is most intriguing about this Slashdot article that trashes Vista is that the author goes into detail about the idea of downloading movies online and somehow zapping them to your TV. Which happens to be what the Apple TV does. Maybe Steve J. was onto something after all.

You could also get by with buying the connection kit that hooks your video-capable iPod or your iPhone to a TV, but I suspect the resolution wouldn’t be as good.

violently ill

posted on February 7th, 2008

It’s been a while since I’ve been this sick. I’ve gotten sick over the past few years, usually with some kind of upper respiratory illness, and I even missed at least one day of work, but it was usually just something I could power through, and eventually shake off. A speed bump, if you will.

Probably the last time I was this badly ill was four years ago, with a non-resolving cough, night sweats, chills, weakness, and fatigue. It wasn’t really the severity that got me, but the prolonged nature of it. I kept going to work, too. It lasted for at least two weeks, culminating in an episode of orthostatic hypotension (exacerbated by vasovagal syncope) but eventually cured by a buttload of antibiotics.

Before that, I got sick with some respiratory virus during residency interview season, but that lasted for only one day, and the only reason why I felt like I was going to die was because it was December in Chicago and the cold just magnified my misery.

Before that, I think it was my senior year in college, when I couldn’t get out of bed to go to class because I felt so weak, and eventually I again needed antibiotics to get better.

As a kid, I remember being sick a few times, probably viral illnesses, but because my parents are both in health care, they would pump me full of antibiotics anyway. I even remember when I got chickenpox as a little kid, and how I started feeling dizzy in the mall, and eventually had to be carried.

Before that, apparently I was a pretty sickly kid, eventually needing a tonsillectomy, but I don’t remember much of that.

But this bout of gastroenteritis beat me down pretty bad. It didn’t help that I had to drive down from the Bay to L.A., which meant that I had to concentrate for at least 5 hours. Except the Grapevine was closed. This led to a misadventure in the Tehachapi Mountains. I had intended to go around the Grapevine by heading east from Bakersfield to Mojave, and from there, heading south to L.A. through the Antelope Valley, thinking that I could avoid the snow. Instead, I ended up driving through a snowstorm, which, when combined with wanting to hurl and trying to avoid pooping myself, was a little harrowing. I briefly entertained the idea of stopping and checking into a motel and just going to sleep. I then briefly entertained the idea of driving myself to the nearest emergency room just so I could get some IV fluids. By the time I emerged into the San Fernando Valley, I was pretty spent, and when I got to my parents’ house, I pretty much crashed out.

The worst part was that I was shivering uncontrollably, despite wearing a jacket, and burying myself under two blankets and a comforter, with the heater on full-blast. Eventually, I figured out that I was febrile, and after taking some Tylenol, I stopped shivering, and my fever finally broke, leaving me drenched in sweat.

I was finally able to hold something down the evening afterwards, and now I think I’m starting to feel better, but I still feel pretty damn weak.

Nothing makes you realize just how good life is like incapacitating illness (however temporary and brief.)

branches, lines

posted on February 6th, 2008

this trigger
sending millions of
particles of light
laser beams
gamma rays

burning trails of phosphorescence
each scintillation a separation
each glimmer sparking an entire universe

kindling her smile
igniting his laughter
her joy, his excitement
disconnecting past from present

dreams are like shadows evaporating with dawn
like cobwebs in the wind
like teardrops in sunlight

let there be
said the Lord

annihilating my soul
incinerating the rotting core of my being

let there be
shimmering waves
oscillating, quivering
spectra like harmony
tracing out movements, world lines
illuminating light cones
from stellar ash to nebular dust
blowing in the galactic wind

looking back
just a straight line
razor edged
streaking along
like an express train
a supersonic jet
you can’t stop what’s coming

acute gastroenteritis

posted on February 4th, 2008

Man, driving 6 hours while you have the runs sucks. And the Grapevine is closed.

the underdog has his day

posted on February 3rd, 2008

The narrative of the New York Giants appeals to me. They started the season 0-2. They were the wildcard team. They were expected to lose by 12 points to the seemingly unstoppable, undefeated Patriots. Talk about David slaying Goliath.

eastern sky before dawn in the desert

posted on February 1st, 2008

Venus and Jupiter shining in the dark Colombian sky Venus and Jupiter shining over trees in San Diego Venus and Jupiter above a rural road in Ohio Venus and Jupiter above an industrial complex in Texas Venus and Jupiter shining between the leaves of a tree in Lake Elsinore, California Venus and Jupiter above the Turkish Riviera

(Thumbnails derived from images on SpaceWeather as per fair use provisions of the U.S. copyright law.)

I woke up this morning just before dawn, and on the way to McCarran International, I was treated to the sight of Venus and Jupiter almost touching. Is it an omen? A harbinger of luck?

In any case, it was a pretty sight. Even in Vegas, the sky can surprise you.


posted on February 1st, 2008

The roar of traffic, the murmur and thrum of the crowd
and the mournful winter wind, scouring the desert sand
and the inside of my soul is silent and still
like a raging river flash-frozen in mid-torrent
and eons have passed, the axis of the earth precesses, and still there is no thaw.

This night, like all nights, the shadows fended off by the
actinic light arcing from the fluorescent bulbs
this pale, wan facsimile of sunlight
without warmth, quenched, smothered by the darkness outside

In the throng, amidst the hither and thither of the still-living
I am alone
as if I were a ghost
unseen, unfelt, unacknowledged
I’m cowering in the interstices
pretending I don’t exist
and wishing that I didn’t have to pretend
but this futile un-life
the hour turns
the daylight burns
flickers, fades
and the earth spins around again
the stars in their eternal dance

In the tumult of a million souls,
I am alone
unnamed, unclaimed
and damned
swimming through this murky sea
of brimming despair
even this cold numbness
cannot take the pain away
come to life and throbbing away
in the caverns of my rotten heart
desiccated and crumbling

The memories will forever haunt me
of what could never be,
of decisions and indecision
of words not spoken,
of chances not taken,
and what, what, what would it really matter,
knowing what I know
and why do I find myself taking this same downward spiral
never finding the bliss of oblivion
the silence of annihilation
tracing my futile footsteps
dancing a hopeless dance with my own shadow.

clawing to the surface

posted on February 1st, 2008

Wow. It’s been a while since I’ve felt this way. As I gazed mesmerized by the spinning barrels of the slot machine, I felt suddenly suffocated by an awful feeling of despair and loneliness. It was almost as bad as being short of breath. The feeling eventually passed, but I just feel spent, and my muscles are all tight with anxiousness.

Where did that come from?

Better not to think about it.