I just came across a fascinating presentation called The Next Mainstream
Programming Language (see links at the bottom).
It was made by Tim Sweeney, the founder of Epic, a video game company that
created Unreal.
The title of the presentation is actually misleading, because the topics
Sweeney addresses are not exactly about the next programming language, but
more a reflection on the programming requirements of the video game industry,
with hard numbers and real code. What makes this presentation much more
special than the typical talk explaining that “performance is really important
and you need to squeeze as much power as possible out of your code” is that
Sweeney has obviously thought a lot about the bigger picture behind these
problems, and as a consequence, he offers some startlingly insightful
suggestions to address them, mostly coming from…
Haskell.
That’s right, Haskell.
Now, Haskell is not exactly known for its speed, but the concepts it lets you
express turn out to be a very good match for the challenges of video game
programming.
He breaks down game development into three distinct kinds of code, in
increasing order of FPU usage:
- Gameplay simulation: can run at a lower animation speed and typically involves C++ and a scripting language (0.5 GFLOPS).
- Numeric computation: low-level, high-performance code, almost exclusively C++ with a strong functional slant (5 GFLOPS).
- Shading: generates pixels and vertex attributes, runs on the GPU, is "embarrassingly parallel", and is done in a homegrown scripting language called HLSL (500 GFLOPS).
They use no assembly language. Ever.
Here are some of the other points found in this presentation:
- No languages today capture integers correctly. What he means is
that languages do have integers, of course, but most of the time users
really want a very specific subset of integers (an index iterating over a
loop, a bound on an array, a value satisfying certain conditions), and if
these constraints were enforced by the language, a lot of bugs could be
avoided (see the sketch after this list).
- There is a wonderful correspondence between features that aid
reliability and features that enable concurrency.
- Lenient evaluation is the right default.
- Purely Functional is the right default.
- The Java/C#
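To make the integer bullet concrete, here is a minimal Haskell sketch of the idea (my own illustration, not code from Sweeney's slides): the only way to build an Index is through a bounds-checking constructor, so the array access itself can no longer go out of range.

import Data.Array (Array, listArray, (!))

-- An Index can only be obtained through mkIndex, which checks
-- the range once; after that, the array access cannot fail.
newtype Index = Index Int

mkIndex :: Int -> Int -> Int -> Maybe Index
mkIndex lo hi i
  | lo <= i && i <= hi = Just (Index i)
  | otherwise          = Nothing

at :: Array Int a -> Index -> a
at arr (Index i) = arr ! i

main :: IO ()
main = do
  let xs = listArray (0, 2) ["zero", "one", "two"]
  case mkIndex 0 2 1 of
    Just ix -> putStrLn (xs `at` ix) -- prints "one"
    Nothing -> putStrLn "index rejected at construction time"

The slides go further and want the compiler to discharge such checks statically, but even this library-level version moves the possible failure to a single, auditable place.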
#1 by Lachlan on March 25, 2006 - 6:01 am
Very interesting stuff, makes a lot of sense.
The idea that the compiler catches null pointers, array indexes out of bounds, etc., is in stark contrast to the rise in popularity of more dynamic languages such as Ruby and Python. In such languages the compiler does far less checking, and almost all statements are potentially unsafe at runtime.
Java-heads may be interested in “Nice”. It runs on the JVM and incorporates some of the ideas from this presentation. Indeed, my favourite feature of Nice is that it eliminates NullPointerException :-). Unfortunately, the lack of docs and tools made it pretty hard for me to use it in my day job. http://nice.sourceforge.net/.
#2 by Laurent Simon on March 25, 2006 - 9:07 am
I agree with Lachlan. Nice is a safe and powerful language. I always ask myself why Java did not include similar features.
#3 by Taras Tielkes on March 25, 2006 - 10:52 am
There are a couple of IDE extensions for static nullability checking in the Java space:
http://www.jetbrains.com/idea/documentation/howto.html
There’s similar work being done for Eclipse.
#4 by Dalibor Topic on March 25, 2006 - 12:49 pm
Over here (since I am in Saarbrücken…
#6 by nono on March 25, 2006 - 3:06 pm
Hi Cedric and all,
I’ve still got letters from Tim Sweeney dating back to the early days of Epic MegaGames (when Jazz Jackrabbit was in development, way before the first line of code of the first Unreal had been written, and even before everybody had Internet access).
🙂
One of my “all-time Tim favorites” is a paper on how game state replication was done in Unreal, written in 1999 by Tim Sweeney. It is still good today and was just plain visionary back in the day.
You didn’t emphasize something very important he’s saying now: we’re moving to more complex systems and… type inference doesn’t scale.
Funny how everybody who responded so far talked about Nice, JML, JetBrains’s annotations.jar (@NotNull/@Nullable), etc. There are obviously people who want “more control”, and I’m one of them (I do use IDEA 5.x/JDK 1.5, and @NotNull is a real time-saver): the stricter the limits you put on your abstract data types, the easier you make it for the compiler to “prove” that the various components will act as their contracts say they will, the sooner you catch bugs, etc.
You didn’t mention it directly, but he also talks about the benefits of referential transparency… (one obvious advantage being to allow the use of memoization [yup, without an ‘r’], another being to help the compiler “prove” the correctness of some parts of the program).
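To show what I mean about memoization, here is a toy Haskell sketch (my own example, not from the slides), where the caching is only sound because the function is referentially transparent:

-- A naive fib recomputes subproblems exponentially; routing the
-- recursion through a lazily built list caches every result.
-- The cache is only sound because fib is referentially
-- transparent: same argument, same result, no side effects.
memoFib :: Int -> Integer
memoFib = (fibs !!)
  where
    fibs = map fib [0 ..]
    fib 0 = 0
    fib 1 = 1
    fib n = memoFib (n - 1) + memoFib (n - 2)

main :: IO ()
main = print (memoFib 80) -- instant; the unmemoized version would crawl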
I side with everybody who recommends functional programming (and Joshua Bloch hints that it is possible to program in Java in a more functional way than what is usually seen; Effective Java, item 13).
And I side with everybody who’s horrified by the horrible abuse of exceptions (glorified gotos) in Java and C#.
It is really good to see his hard data backing his statements on all the defects Java has, and… terminating the presentation with a full page devoted to one of Haskell’s main flaws: type inference doesn’t scale.
I hope that the “next mainstream programming language” will resemble the best of several worlds (I’ve got my own idea as to what it should look like… and I’m more than convinced it is *very* similar to Tim Sweeney’s view).
Really a good read, thanks for that nice blog entry,
(I’ll go check those Spec# slides now…)
#7 by Jason Carreira on March 26, 2006 - 12:23 am
Check out Scala:
http://scala.epfl.ch
It’s got a lot of things inspired by Haskell, with a nicer-looking syntax and stronger typing than Java, although it does provide type inference… I’m not sure how type inference works in Haskell, but in Scala it’s done by the compiler, so I’m not sure how it is supposed to break at runtime…
It looks like an interesting combination of OO and functional programming so far… and it runs on the JVM or the .NET CLR.
#8 by Dalibor Topic on March 26, 2006 - 3:27 am
The most exciting thing for me about Spec# was that the underlying theorem prover, the intermediate language, and all the pretty nice machinery to prove the properties worked transparently, on the fly, without the IDE user noticing (or having to know) anything about the underlying ATP, its syntax, its logic, proving programs correct, etc.
It was a bit like DHH’s RoR demo: call a method from the standard class libraries, finish typing, and a red squiggly line would appear almost instantly, with a pop-up box telling you that you may end up violating a specific contract (for example, your array index going out of bounds for a substring method); type some more, another squiggly line… until your specifications and your code matched. Proving program properties beats unit testing, in my opinion, in particular when the fun part (the actual proofs) is done automatically and quickly in the background.
cheers,
dalibor topic
#9 by Stephan Schmidt on March 26, 2006 - 12:10 pm
Wanted to read the Spec# slides but I wasn’t able to. Some people still think PPT is a universal exchange format 🙁
#10 by Laurent on March 27, 2006 - 6:36 am
Definitely nice reading.
However, the example on circularity (page 36) is really not convincing. He should have given a real-world example.
An argument I don’t get is that he goes from exceptions everywhere to exceptions nowhere (except where explicitly asked for). There are some cases where this can’t be avoided: what about data read from a file a user can change (.ini comes to mind)?
I also don’t agree with slide 61, The Coming Crisis in Computing:
– I don’t think we will get 20 cores each with 4-way hardware threads by ’09; look at AMD’s and Intel’s roadmaps… Anyway, multithreading will expand, though probably not as symmetrically.
– Hardware is already reaching 1 TFLOPS (see the PS3’s Cell).
– GPUs already have general computing capabilities, though it’s not that easy to get at them (see http://www.gpgpu.org/ or recent announcements by NVIDIA and Havok).
A side note is that the PS3 hardware is probably a hint at what the future looks like, much more than the Xbox 360. I really think the direction is heterogeneous computing.
BTW, HLSL is not a scripting language; it is the DX9 shading language (more or less the same as NVIDIA’s Cg).
#12 by -L on April 7, 2006 - 1:06 am
I find it strange that many of the language features he calls for have been available to those programming in Ada since the early ’80s, yet there is no mention of Ada. Ada has high-level concurrency support (rendezvous, protected objects, suspension objects), strong typing, real arrays, support for object-oriented programming, etc.
For example the Transform method on page 31 of the PDF would be straightforward to express in Ada such that all the requirements were satisfied.
What Ada is missing, though, is lazy evaluation (although the compiler will warn about uses of uninitialized values) and C-like syntax (Ada is a far descendant of the Algol family, so to most people it looks like Pascal). The verbose syntax, however, is justified, since the user-defined types define contracts, and is therefore not really a burden.
#13 by lol on April 7, 2006 - 3:59 am
Isn’t this “homegrown” HLSL language made by a small garage company called Microsoft?
#15 by Bobby Fellon on May 22, 2006 - 12:47 am
I have a question for the people posting on this thread, if someone can kindly answer: what does it take to get a job in the game programming industry, or in programming in general? I would just like to understand what qualifications you got, and what you had to learn to know the stuff you know.
#16 by Rene Dudfield on May 30, 2006 - 5:33 pm
They use C++ with SIMD intrinsics. This is basically inline assembly.
Cheers.
#17 by Jim McCoy on May 30, 2006 - 10:27 pm
If anyone is looking for a language that offers most of what is asked for here and doesn’t suffer from the original sin that will end up strangling some of the suggested JVM/CLR variants, take a look at Erlang.
_Strong_ concurrency support, almost purely functional (you can create side-effects by using what is known as the process dictionary, but this is discouraged), very nice fault-tolerance and reliability features, hot-swappable code, and lazy evaluation. It even offers a shell-like interface so you can play around with bits on the fly.
As far as which language will end up being “hot” in five years due to the requirements stated in the original article, I am betting on either Erlang, OCaml, or Haskell. While I personally favor Erlang I think that OCaml is probably the leading candidate in this group at the moment…
#18 by Wouter Lievens on June 7, 2006 - 7:13 am
To me this sounds pretty similar to what the guys at Intentional Software are saying.
http://www.intentsoft.com/
I’ve also read a few Martin Fowler articles that express similar ideas. And before that, it hit me as well, independently: we don’t need more run-time dynamicity, we need compile-time dynamicity. The ability to extend your language without losing early checking, and contracts not just on method entry and exit points but on any language element (from expressions to classes to packages), is what we need.
#19 by justasecond on June 21, 2006 - 11:54 am
“No languages today capture integers correctly”. Err… Delphi/Pascal has integer subranges and can enforce range checking.
var i: 1..3;
is a valid declaration.
#20 by Stefan Scholl on June 22, 2006 - 11:15 am
Haskell is fast enough: http://shootout.alioth.debian.org/gp4/benchmark.php?test=all&lang=all
#21 by simon on June 23, 2006 - 1:35 am
Qi [http://www.lambdassociates.org/] would be another interesting alternative, with its Turing-equivalent type notation.
For example, interval arithmetic in Qi:
http://www.lambdassociates.org/studies/study01.htm
#23 by Frater Plotter on July 2, 2006 - 12:21 pm
I’m sure as hell glad Tim Sweeney’s ideas of language design have changed since ZZT-OOP. 🙂 🙂
#24 by Praca on October 8, 2006 - 12:27 am
Yea! Fascinating presentation, thanks for the link!
#25 by Sam Bronson on October 23, 2006 - 2:53 pm
Heh, yeah, ZZT-OOP sucked! But it was a lot better than hardcoding the behaviour of one-of-a-kind things in the engine.
#26 by BUses on December 4, 2006 - 6:16 pm
This presentation is professional and to the point.
#27 by Futureboy on December 15, 2006 - 2:16 am
“All dereference and array accesses must be statically verifiable, rather than causing sequenced exceptions”
Heh. Call me when you solve that and the Halting Problem, which are equivalent.
#28 by Former Game Programmer on January 21, 2007 - 11:13 pm
Sweeney doesn’t like the Algol/Pascal/Modula/Ada family, which is why there was no mention of Ada, which has nearly all the features Sweeney was talking about.
#31 by Sridhar on March 26, 2007 - 2:47 am
Yeah, Futureboy, taking an arbitrary program in a non-safe language and analyzing whether its dereferences/array accesses are safe is equivalent to the Halting Problem. But that’s not what Tim is proposing. He’s proposing the use of a safe language, in which, in order to make a dereference/array access, the programmer has to supply enough information to the compiler to prove the safety of that access. (With an elegantly designed system, in many real-world cases the compiler will have enough information to deduce the safety on its own, without extra information from the programmer beyond the same sort of code he would typically write today. In the cases where that is not enough, the programmer will have to modify his code and/or provide some extra information for the safety proof, perhaps, in the easiest but least satisfying way, by wrapping the access in a run-time safety check.)
So Tim’s goals aren’t incompatible with the undecidability of the Halting Problem; he doesn’t want to take existing, potentially unsafe code and algorithmically analyze it for safety. He wants to ask programmers (who presumably have an idea in their head of why their code is safe) to code in a manner which makes this safety clear.
Or, to just make it clear that Tim’s goals are entirely possible: algorithmically determining whether a C program has no null pointer dereferences is of course impossible, for Halting Problem reasons. But nonetheless, many serious general-purpose languages exist in which null pointer dereferences simply cannot happen in code which successfully compiles (e.g., in Standard ML).
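To illustrate with Haskell (the same story holds in Standard ML with option types; this is my own sketch, not Tim's code): a lookup that can fail returns a Maybe, and the compiler forces you to handle the Nothing case before you can touch the value, so there is simply no null to dereference:

import qualified Data.Map as Map

-- Map.lookup returns Maybe Int, not Int: the "not found" case
-- is impossible to ignore, because the Int can only be reached
-- by pattern-matching on the Maybe.
describe :: Map.Map String Int -> String -> String
describe scores name =
  case Map.lookup name scores of
    Just n  -> name ++ " scored " ++ show n
    Nothing -> name ++ " has no score"

main :: IO ()
main = do
  let scores = Map.fromList [("alice", 3), ("bob", 5)]
  putStrLn (describe scores "alice") -- alice scored 3
  putStrLn (describe scores "carol") -- carol has no score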
#32 by Leba on May 29, 2007 - 11:45 am
“By 2009, game developers will face
#33 by sprezarki on October 15, 2007 - 1:43 am
20+ cores… damn! I’m a game addict and I can’t wait!
#34 by shawn on October 22, 2007 - 1:21 pm
The “type inference doesn’t scale” examples are misleading, I feel. Actually, the problem is Haskell’s implementation. He talks about wanting contract enforcement without exceptions; typing is one of the means to that end. That Haskell doesn’t autoconvert a string to an int in a “duck-typing” way doesn’t mean that type inference doesn’t scale. The type classes in Haskell would be a better example: that stuff is really useful, but when you need it downstream in the process and it wasn’t done upstream, that’s when the ouch sets in.
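Here is a toy sketch of what I mean by type-classing (my own example, nothing from the slides): the class declares the contract once, and the compiler checks at every use site that an instance exists.

-- The class declares the contract once; each instance is the
-- compiler-checked proof that a type honors it. A call to
-- serialize only compiles if a matching instance exists.
class Serializable a where
  serialize :: a -> String

instance Serializable Int where
  serialize = show

instance Serializable Bool where
  serialize True  = "1"
  serialize False = "0"

main :: IO ()
main = putStrLn (serialize (42 :: Int) ++ " " ++ serialize True)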
#35 by Ryanair on October 30, 2007 - 6:36 am
20+ cores – great result
#36 by Zutestrane on November 11, 2007 - 3:28 pm
OK, it’s almost 2008; the newest CPUs have 4 cores, and it could be 8 by 2009. I don’t think it will be 20+. BTW, the Unreal 3 engine is overhyped; it doesn’t look as good as everybody says. The new Unreal Tournament screens aren’t that impressive from my point of view. What do you guys say about that?
#38 by narus on December 19, 2007 - 10:04 am
How does Haskell aid in multiprocessing?
Any comparison with Erlang, which extends to multiprocessing systems with no additional programming required?
#40 by Blog Directory on January 20, 2008 - 7:04 pm
I hope that in 2010 my computer can have many cores. Currently I am still using an old Pentium 4 computer and it can’t multitask. Hopefully by then we will be able to multitask!
#41 by James burt on February 27, 2008 - 11:21 pm
I have just bought a quad-core 6600 at 2.4 GHz; it really is fast compared to my old PC. I can’t imagine having 10 or 20 cores!!
#42 by mike on March 17, 2008 - 8:26 am
you said:
By 2009, game developers will face CPU
#44 by Webdesigner on May 8, 2008 - 7:43 am
I guess the next mainstream programming language is going to be Ruby! It definitely rules, check it out!
#46 by ankh47 on September 4, 2008 - 12:13 pm
His example in the “type inference doesn’t scale” slide seems plain misguided to me.
Of course a string is not an integer; you have to “process” the string to make it an int, even in C…
And if he is talking about a scripting language built on top of the statically typed core language (i.e. Haskell), it’s the scripting language’s task to know the right types and automatically cast the values so the programmer doesn’t have to.
Would somebody enlighten me as to WTF he was trying to say, or how I should translate it so it makes some sense?
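For example, this is roughly how the “processing” looks in Haskell (my own sketch; readMaybe lives in Text.Read): the failure case is right there in the type instead of blowing up at runtime.

import Text.Read (readMaybe)

-- The string has to be parsed explicitly; a bad input yields
-- Nothing instead of a runtime type error.
parseAge :: String -> Maybe Int
parseAge = readMaybe

main :: IO ()
main = do
  print (parseAge "42")    -- Just 42
  print (parseAge "42xyz") -- Nothing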
#47 by voyance on October 3, 2008 - 1:33 pm
exactly
#48 by Ali on October 20, 2008 - 12:44 am
Incredible effort and work go into creating a video game. I never appreciated this until I saw the PowerPoint presentation.
#49 by Voyance on October 31, 2008 - 4:41 am
I agree with you.