I recently attended a presentation of C++0x by Bjarne Stroustrup.  It
wasn’t pretty.

C++0x is the next version of the C++ specification and it’s scheduled to be
final in 2009.  Yes, you read that right:  2009. 

"Well, okay, I guess
that with such a distant deadline, the changes to C++ are going to be completely
revolutionary, right?"

Right?

Not really.

If you are a hardcore C++ developer, I’m sure you will welcome these
modifications, but anyone with a merely passing (or forced) interest in C++ will
be less than impressed and will probably shrug the whole effort off and go back
to using gcc while avoiding templates because they are still buggy in that
implementation.

So, what’s going on with C++?  Here is a quick rundown (obviously not
exhaustive).

  • Make the language easier for beginners, for example by providing easy
    ways to convert ints to strings. 
     
  • Get rid of the maddeningly mandatory space required between ">" signs for
    templates.  It’s about time users, and not compiler authors, got the
    last word on that one.
     
  • Scoped macros, one of my favorite features:  you
    can now clearly delineate where your macros apply and when they become
    undefined with…  well, macros (#scope / #endscope). The intent is
    noble, but rescoping existing macros is probably not going to happen, so
    this change will only benefit new macros.
     
  • "Admit the GC as a valid implementation technique".  I am quoting
    this directly from Stroustrup’s slides because the phrasing is too precious. 
    It looks like there is something to learn from "next generation languages"
    (Java, C#) after all…
     
  • Type inference, woohoo!  In C++0x, variables can be declared with
    auto.  Now we’re talking 21st century. 
     
  • Better runtime type information, dubbed XTI ("Extended Type
    Information").  Reflection for C++…  Well, as much as possible
    anyway.
     
  • Template typedefs.  A long overdue good idea that will make
    templates (and related errors) a tad more readable.

As you can see, nothing that is really going to help C++ programmers raise
the level of abstraction above where they are currently stuck:  no closures, no
LINQ, no new keywords, and still this unavoidable exposure to all these
low-level details that fewer and fewer people care for these days.  Why do I
even need to know about the existence of iterators when all I want is to work on
every element of a collection?

I wrote my last line of C++ about nine years ago, so my exposure to
Java, C# and Ruby these past years has undoubtedly made me very biased, but I have
to say I found all the snippets of code that Bjarne showed absolutely dreadful
to read.  Coding conventions have never been the forte of C++ programmers,
but with the advent of templates, partial specialization, traits and other
advanced programming techniques, C++ code has become a terrible alphabet soup:
lower case, upper case, all caps and camel case mixed together; *, & and []
everywhere; const appearing sometimes on the left and sometimes on the right;
underscores sometimes followed by a capital, other times by a lowercase
character or an all-caps word; etc…

Now, to be fair, you need to remember that from day one, C++ has been
subjected to extremely strict constraints that have had a very strong impact on
its design.  I certainly give credit to Bjarne and the C++ committee (which
I used to be a part of) for never forgetting their users and keeping in mind
that breaking the slightest feature in the language will result in millions of
lines of code breaking across the planet.

Having said that, it’s becoming increasingly clear to me that C++ is turning
into a niche language, and if you want to start a new project, you’d better have
some very good reasons to pick C++.  Chances are very good
that Java, C# or Visual Basic will fit the bill in 99% of the cases.

Goodbye, C++.  I’m looking forward to reusing my C++ brain cells for
concepts that will help me tackle the software challenges of the next twenty
years as opposed to not forgetting to insert a space in my code or remembering
to free the memory that I’m not even sure I allocated in the first place.