Archive for January, 2006

Watch your language

I understand that natural languages are evolving constantly and that new
words, new expressions and new spellings trickling into our everyday speech is
what keeps a language alive.  It also explains why artificial languages
such as Esperanto have utterly failed to gain momentum.

I am ready to endure a certain amount of abuse when I listen to the TV or to
people.  Fair enough.

But something that really irritates me is when these blatant misuses of
English creep into written text.

Here is a short list of the
improprieties I have noticed these past years:

  • Don’t start an article or a blog post with "so".  Ever.
  • "Anyways" is barely a word, and it’s dubbed
    archaic.  Can I
    interest you in "anyway", "either way" or "at any rate"?
  • As for "irregardless", even the
    dictionary says "don’t use it".  So don’t.
  • "Criteria" is plural.  The word you are looking for is "criterion".
  • When something doesn’t matter to you, you "couldn’t care less". 
    Note the negation.  Just think about it for a minute and it will make
    sense.
  • In "weird" and "receive", the e comes first.  I know, it’s weird. 
    Just learn it by heart and don’t think twice.
  • If you’re not sure if you should use "its" or "it’s", replace it with "it
    is".  If the sentence still makes
    sense, then "it’s" is correct.  Otherwise, use "its".
  • In "gauge", the ‘a’ comes first.
  • A pandemic is global, by definition.  If you really want, you can say
    "global epidemic" but "pandemic" by itself is enough.

One thing you don’t know about Google Talk

There have been many comparisons of the various messaging clients available
today:  Google Talk, Yahoo Messenger, MSN Messenger, AOL Instant Messenger
and hybrid ones as well, such as Trillian.  These studies usually do a
good job at comparing the features offered by these clients, but for some
strange reason, I have yet to find one that really nails the one thing that
makes the difference for me with Google Talk:  presence ubiquity.

Throughout the day, I typically move between three or four different PCs
(work, laptop, home, etc…) and this particular feature means that not only do
I not need to log on constantly, but I can also start a conversation at
home and finish it at work without having to do anything.  The logic behind
the message broadcasts is sometimes a bit mysterious, but it works most of the
time and is a joy for computer vagabonds like myself.

Since no article ever mentioned this feature, I am guessing
my situation is pretty unusual, but since it’s a feature that exists only in
Google Talk, I am stunned that it hasn’t been emulated by any other client. 

Having said that, I do agree that Google Talk is very
austere and I have to confess an unending fondness for Yahoo’s cute smileys, but
presence ubiquity is the main reason why my Friends List on Google Talk
keeps getting bigger while my Yahoo Buddy List has been stagnant for over a year
now.  People who need to reach me now know that Google Talk is the first
client to try.  If they don’t know where I am, Google Talk certainly does.

Of course, the fact that Yahoo keeps nagging me
without giving me a chance to opt out, and that it will change my
home page if I’m not careful during the installation, doesn’t help. 
It could be worse, though:  you could install AOL Instant Messenger and
spend the next ten minutes hunting down and killing all the "Free AOL" shortcuts
that suddenly pop up in unexpected places on your desktop.  I
hear their latest experimental version, codenamed Triton, goes even further and
changes your home page, installs Plaxo, a toolbar and a modified full-screen version of IE that
lands on their page.  Oh and you can’t refuse the installation of this
software:  you will need to uninstall it by hand later.

That’s clearly not "not evil".

I feel some comfort thinking that Google Talk uses open standards and doesn’t
come with any additional baggage, but I wouldn’t mind if it added some of the
bells and whistles found in its competitors.  How about them smileys, for a start?


Lost in “Lost”

I like Lost a lot.  Its innovative plot coupled with strong
acting and ingenious twists is like a breath of fresh air that is almost making
flashbacks look cool again.  I TiVo all the episodes religiously and I have
hardly been disappointed by any so far, which is quite an achievement for
a show in the middle of its second season (a feat that has only been topped by
24, which just started its fifth season).

However, I’m beginning to notice a disturbing trend in the storyline: 
the Lost writers seem to be very good at coming up with startling scenes and
puzzling twists, but they do very little in terms of explaining them. 
Instead of tying up loose ends, the episodes keep adding new implausible twists
that make you wonder if the authors really know where they are going or if they
just shoot for cheap one-liners and hope that viewers will forget them later.

Well, I’m not forgetting.  Here is a quick list of the unresolved facts
that I can think of so far (spoiler warning:  these facts cover both
seasons one and two, so you might want to stop reading now if you are not
caught up with the show as of January 15th):

  • Mutant animals.
  • The mysterious "others".
  • Winning lottery numbers that pop up in unexpected places.
  • A semi-crazed French woman.
  • A virus that makes people hear voices.
  • A quarantine shelter buried in the ground.
  • A man who’s been sequestered in said shelter for years and ran away.
  • Jack happens to know that man.
  • A mechanical arm that pulls people into the ground.
  • Underground machinery that requires numbers to be punched in on a
    regular basis.
  • A heroin cargo plane that flew out of Nigeria and ended up crashing on
    the island.
  • A swirling cloud of black smoke.

And I’m sure I am forgetting quite a few.

I hear that enough of the plot has already been written to occupy five
seasons, which is great news, but I’m really hoping that we’ll start seeing the
explanation of some of these mysteries soon, or my patience will start wearing
thin very fast.


TestNG at JavaOne

I am happy to report that my TestNG presentation has been accepted at JavaOne 2006:

Congratulations! Your submission entitled "Beyond JUnit: Introducing
TestNG, the Next Generation in Testing" was so compelling that the JavaOne
Program Committee made an early decision to accept your proposal to speak at
the upcoming 2006 JavaOne(sm) Conference in San Francisco, California, May
16-19, 2006. As an accepted speaker, you will receive a full complimentary
pass to the Conference.

Sadly, the dates conflict with a previous engagement, so I will not be able
to make it to JavaOne this year for the first time in almost ten years. 
The good news is that I worked with the JavaOne committee and we have found a
substitute to make the presentation.  I will disclose his name once
everything has been finalized, but you won’t be disappointed.

On a similar note, here is the schedule of the TestNG Tour, "Changing the way
we write software, one test method at a time":

  • Oakland Java Users Group (February).
  • TheServerSide Symposium (March).
  • SD West (March).
  • TheServerSide Symposium Europe (June, Barcelona).

I will also make a second presentation in Barcelona that is not related to
TestNG.  I have a tentative title and some content, but I’ll hold on to the
details until everything’s been finalized.


Mass hypnosis

Ah… Mac World.  And I don’t mean the conference, I mean "Mac World". 
You know, this weird parallel universe
where all those who have fallen prey to the Steve Jobs Cult dwell and sneer at
the rest of the population with a look of utter pity in their eyes.  I’m
sure you know a few of these people if you’re a developer
or you work at a software company.  Maybe you’re even one of them.

It never fails:  whenever an Apple conference approaches, the tech
world gets all abuzz and Macboys can barely avoid exploding on the spot in a
thousand tiny droplets in anticipation of whatever Steve will throw in their
direction as long as it’s introduced as "One More Thing".

Witness my good buddy Michael, someone who would certainly not be described
as irrational, much less exuberant, by his peers.  Michael decided to
sign a check for $3,200 payable to Steve just to pre-order a MacBook. 
I bet that thousands of people are doing the very same thing as I write these
words, so Michael is hardly alone.  But that doesn’t make this behavior any
more explainable.

What I find really ironic is that after Jobs spent so much time explaining
that the PowerPC was a speed demon, and hearing tireless Mac users claim that
their machine was "plenty fast enough", these new announcements allow the Mac
community to breathe a common sigh of relief and finally say "okay, okay, we
admit it.  Our PowerPC’s are impossibly slow, we’ve been lying all along,
but it’s all going to change now!  So please stop disrupting our fantasy
with your facts".

Don’t get me wrong:  I can understand being excited about a product that
will bring new features or radical innovations, but paying $3,200 up front just
to get a faster machine?  And of all companies, to *Apple*, which
has a long track record of:

  • Exaggerating and omitting some vital facts in their announcements (any
    word on the battery life of these new Core Duo-based beasts?).
  • Doing a terrible job at provisioning their stock supply (any bet
    on the percentage of orders that will actually be delivered in February?).

Maybe it’s just me, but with the power of the Internet at my fingertips almost
twenty-four hours a day, I find it absolutely impossible (and unforgivable) to
make the slightest purchase before doing a bit of quick research and seeing what
other people think of the product I’m about to buy. 

What makes things worse with
the MacBook is that we’re talking about a completely new product that is
guaranteed to have flaws in its initial versions.  Yes, yes, I know, Steve
says it runs flawlessly, but just stop staring at the pendulum he is swinging in
front of your eyes for a second and trust me on this one (and stop drooling too): 
You’re better off waiting.  A couple of weeks will do.

You’ve been
stuck with a machine slow as molasses for almost two decades, surely you can
hold off for a bit, can’t you?

So, dear reader, what’s your excuse for preordering one of these?

Update:  Erik pointed me to this piece,
which seems to indicate that indeed, the benchmarks used by Apple
to compare Core Duo processors to PowerPC ones are flawed.  A technique not
unlike the one they used a few years ago to prove the opposite, though, so you
can’t accuse them of being inconsistent in their misinformation.


Use Print to read annoying web pages

One thing that really irks me is articles that are split into multiple parts
and that force you to click Next every time you are ready to read more. 

Whenever I encounter such an article, I immediately search for a Print
button, which, most of the time, takes you to a page that not only contains the
entirety of the article, but also removes all the ads and other clutter from it.

The language of interviews

When I interview someone, I usually let them use the language of their choice
among C, C++, C# and Java.  There are several reasons for that:

  • I want them to be comfortable.  It’s already hard enough to be in
    an interview, let alone being forced to use a syntax or an API they are not
    familiar with.  Of course, I don’t pay too much attention to syntactic
    details or making sure they use the right method name as long as the logic
    of what they write is sound.
  • It’s not that I have something against Ruby or other fourth generation
    languages, but I found that these languages work at a level of abstraction
    that is a little too high to give me a meaningful insight into the candidate
    in a forty-five minute interview.  There are plenty of very
    interesting problems that are hard to solve even in Ruby or Python (and
    actually, it’s quite likely this is the kind of problem that the
    candidate will be struggling with if she gets hired), but formulating the
    question and writing the solution would take several hours.

The real challenge is therefore to find a problem that is very easy to
express and whose solution in one of the languages mentioned above will give me
enough information on this candidate to formulate a verdict.

Interestingly, the choice that the candidate makes already reveals a few
things about their abilities.  I found that typically, C/C++ people tend to be
very comfortable with low-level algorithmic questions ("pointers and recursion",
to quote Joel) but fare very poorly as soon as we "move up the stack"
(object-oriented design, design patterns, enterprise frameworks, etc…). 
Conversely, Java/C# people are more comfortable with these concepts but get
easily stumped on "tight memory" types of questions.

Of course, great candidates excel at both, which brings me to my next point.

Good developers are born good.  Their brain is wired a certain way and
they can chew on any CS concept thrown in their direction and spit it out with a
bow tie.  Most of these developers then go to school and move from the "gem
in the rough" state to that of a "pure diamond".  School accelerates and
expands their knowledge.  Of course, there is hardly anything they learned
in school that they couldn’t have learned by themselves, but the formal process
of learning, reading books and listening to teachers saves them years and years
of work.  It also expands their minds to concepts they would probably never
have encountered in their professional career.

With that in mind, I find Joel’s obsession with pointers and recursion quite
puzzling.

There are two important facts to keep in mind about pointers and recursion:

  1. They are important concepts and any serious developer should probably be
    comfortable with them.
  2. You will hardly ever use any of these concepts for today’s typical
    programming jobs.

How’s that for a paradox?  How do you interview for this?

Well, it’s actually very easy to do a quick check on pointers and recursion,
even in Java, but it’s equally important to spend most of your interviewing time
on other areas that are more relevant to the job the person will be asked to do. 
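
To give an idea of what such a quick check might look like, here is a minimal sketch in Java (the class names and test values are mine, not from any actual interview):  reversing a singly linked list recursively forces the candidate to juggle references, Java’s stand-in for pointers, and to get the base case right.

```java
// A "pointers and recursion" check that works even in Java:
// reverse a singly linked list recursively.
class Node {
    int value;
    Node next;
    Node(int value, Node next) { this.value = value; this.next = next; }
}

class ReverseList {
    // Returns the head of the reversed list.
    static Node reverse(Node head) {
        if (head == null || head.next == null) {
            return head;  // base case: empty or single-element list
        }
        Node newHead = reverse(head.next);  // reverse the tail first...
        head.next.next = head;              // ...then hook this node onto its end
        head.next = null;                   // this node is now the new tail
        return newHead;
    }

    public static void main(String[] args) {
        Node list = new Node(1, new Node(2, new Node(3, null)));
        for (Node n = reverse(list); n != null; n = n.next) {
            System.out.print(n.value + " ");
        }
        // prints: 3 2 1
    }
}
```

Five minutes on something like this tells you whether the candidate is comfortable with recursion and reference manipulation, and then you can spend the remaining forty on the areas that actually matter for the job.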

One of my friends pointed out that what we are seeing today is a more
distinct separation between "system programmers" (kernel, device drivers, etc… 
which require C/C++ and pointer juggling) and "application programmers" (for
which pretty much any programming language will do, including Visual Basic). 
What’s really puzzling is that Joel’s company produces bug-tracking software,
and it’s hard to imagine why you would need an army of superstar programmers
for that.  A few selected senior tech leads and designers?  Sure.  But
an entire team of them…  doubtful.

As for Joel’s reference to Paul Graham’s vastly over-hyped essay
"Beating the Averages", I am
still trying to decide which of the following two quotes is the most ridiculous:

  • His start-up had an edge over its competitors because of the
    implementation language they chose.
  • Because of this choice, they were able to implement features that their
    competitors couldn’t.

Actually, I’ll call that a tie:  both claims are equally preposterous.

Paul Graham has been a dinosaur for a long time and his disturbing elitist
stance ("if you don’t know Lisp, you’re an idiot") oozes from every paragraph of
every single programming essay he has ever authored.  So far, Joel had
managed to remain reasonably objective and interesting in his posts, but his
extremely narrow background (Microsoft technologies in C/C++ and bug-tracking
software) is beginning to take a toll on his objectivity and I find that most of
his writings are more and more missing the big picture.  I hope he’ll turn
around soon and open up to modern programming topics, because frankly, I am
having as much fun using Ruby on Rails or Eclipse and EJB3 today as I did
writing Copper list based demos on my Amiga fifteen years ago or coding
floppy disk drivers in 6502 on my beloved Apple ][ twenty years ago (gasp).