Got forwarded an interesting link from Dr. Gregory Wilson at the University of Toronto. He's an editor at Dr. Dobbs as well, and has an article up called Extensible Programming for the 21st Century that touches on many of the topics I've been concerned with lately.
He makes some interesting assertions, starting with:
This article argues that next-generation programming systems will accomplish this by combining three specific technologies: compilers, linkers, debuggers, and other tools will be plugin frameworks, rather than monolithic applications; programmers will be able to extend the syntax of programming languages; and programs will be stored as XML documents, so that programmers can represent and process data and meta-data uniformly.
OK, #1 is no problem: we arguably have this with the CLR's model. #2 is a little different. I assume he means actual keyword syntax, as opposed to the way we "changed" programming languages in C with preprocessors and inline functions. Of course, Lisp and Scheme use macros to the point where one can't tell where the language starts or ends. Additionally, custom language constructs like the "using" statement in C#, which "expands" into a try/finally use of the IDisposable pattern, extend the language within a specific context, so I'll buy #2 also. I used to think #3 was more of a stretch, but then you've got XAML sneaking up on us as well (not to mention our own foray into XML and CodeGen).
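As an aside, the same "expansion" idea behind C#'s "using" statement shows up in other languages too. Python's `with` statement is an analogous construct: the interpreter effectively desugars it into a try/finally around a resource's cleanup method. A rough sketch of the equivalence (the `Resource` class here is a made-up stand-in for an IDisposable-style object):

```python
class Resource:
    """Toy stand-in for an IDisposable-style resource."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

    # Context-manager protocol, so `with` can manage cleanup.
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.close()
        return False  # don't swallow exceptions


# Sugared form: cleanup is guaranteed when the block exits.
with Resource() as r:
    pass  # ... use r ...

# What it roughly expands to:
r2 = Resource()
try:
    pass  # ... use r2 ...
finally:
    r2.close()
```

In both languages the compiler (or interpreter) is doing a fixed, built-in expansion; the Lisp/Scheme macro systems mentioned above generalize this so that programmers can define such expansions themselves.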
I suppose the question I am left with after reading his article is: hasn't all this already happened?
Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. I am a failed stand-up comic, a cornrower, and a book author.
Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.