Vala is an interesting language - it looks like C#, but is a compiled, reference-counted, GObject-based language (GObject being GNOME's/GTK's object model library) that simplifies/streamlines writing app code compared to writing it in C.
Which brings up an interesting historical tangent - back in the day, Windows applications were written against the COM object model, either in pure C or in C++ with MFC. The parallels are obvious - Microsoft needed a language that was easier to use and better suited to the needs of application programmers.
The obvious thing would've been to build something like Vala - a COM-based, reference-counted, compiled language - but they decided to build .NET/C#, a garbage-collected, JITted language with an entirely alien library and execution model. And while it became something of a success in the world of general software dev, it never filled this niche: none of the core MS products ever integrated it, and most of the internal teams treated it with animosity.
I wonder why Microsoft decided to go down this route.
It was called Ext-VOS - the next COM Runtime - and the main systems language was going to be J++.
See the history of F# at HOPL for some references on it.
However, the lawsuit happened, and the research language Cool became C# and took J++'s place, with J# being created for migrating code from J++.
J++ already had P/Invoke (named J/Direct), Windows Forms (named Windows Foundation Classes), COM interop, and events - the extensions that caused the lawsuit.
The background of Ext-VOS was visible in many of .NET's environment configuration flags, which had a COM_ prefix; they have since been replaced after .NET went open source.
Ironically, after all this, Microsoft is back in the Java game, with its own OpenJDK distribution and key contributions to the JIT.
Regarding the integration, you are missing the Microsoft politics, where WinDev is pretty much against anything that isn't C or C++. It has been a surprise that Rust has been accepted by them.
They are responsible for killing all the safer attempts that could have undermined Windows' spotlight (Singularity, Midori, Longhorn).
Thanks for the article, I'll be sure to read it when I have the time - unfortunately, there seems to be nothing in there about J++ and Ext-VOS, which would be the most interesting for this discussion.
From what I gather, J++ would still have been a bytecode-based, GC'd language, which would have had the same shortcomings as .NET when interacting with native code.
I believe you, and I admit I'm quite ignorant about UWP, but from what I remember, Windows fundamentally works with kernel objects and ref-counted handles: if you, say, open a file or a socket, create a mutex, or create a Win32 control, you create a kernel object, and you get a ref-counted handle to it in userspace.
How is this process different in UWP, and why is it heavier?
I'd guess that since Win32 apps allocate one handle/kernel object per UI control and UWP/WPF does not, your typical UWP app would use fewer handles than your typical WinForms one.
COM requires that you call the AddRef()/Release() methods at every single point where the set of owners might change, including for function parameters and return values.
This is so error-prone that naturally there are smart pointers that keep track of it for you.
Like Apple did with Objective-C and Cocoa's retain/release, VB 5 and 6, Visual C++, and Delphi also have language extensions that keep track of this for you.
In Microsoft's C++ world, these language extensions are seldom used, because of the internal riot that killed C++/CX and replaced it with C++/WinRT.
So you still have the choice of using the smart pointers from MFC, ATL, WRL, C++/WinRT, or WIL, or rolling your own.
All of them come with the caveat that the Objective-C/Swift optimization of removing needless pairs of calls does not take place.
So anyone who already knows COM usage relatively well tends to take shortcuts in how we are supposed to call AddRef()/Release(), as a means to decrease the call count.
In .NET land, the runtime takes care of being more clever, via CCW/RCW infrastructure, caching instances and such.
Additionally, there is the issue that a COM component can run in-proc, out-of-proc, or in-proc but hosted by specific Windows services, thus adding OS IPC on top of each method call.
A better way is to get one of those COM programming bibles: "Inside COM", "COM+ Unleashed", ".NET and COM: The Complete Interoperability Guide", "Windows Runtime via C#".
Speaking of which, today you can use https://gircore.github.io for rich GObject (and GTK4) interop. It is relatively new, hence not widely known, but is already used by e.g. https://www.pinta-project.com. It's a proper and actively maintained successor to GtkSharp.
Embedding Mono today might be a mistake. You really do want to embed CoreCLR instead if you can, even though it's a bit more complex.
The reason for this is that the up-to-date Mono (the one that keeps up with runtime features and library support) lives here: https://github.com/dotnet/runtime/tree/main/src/mono. After .NET became what it is today, many Mono components simply became part of that codebase, most notably the Mono linker, which became ILLink, a critical component of assembly trimming and AOT compilation to native binaries.
However, Mono is significantly slower than CoreCLR, frequently lacks the optimizations that performance-oriented code paths expect, and only supports 128-bit SIMD; nowadays it serves the purpose of supporting exotic targets like WASM, monoaot for iOS (which will eventually be superseded), ARMv6, or just new platforms in the process of bring-up.
In any case, if you still plan to use Mono, it is best to use the one from dotnet/runtime.
Alternatively, you can build dynamically linked libraries with NativeAOT (it literally gives you plain .so files) and use that for extensibility instead. Note that they do not support unloading and live throughout the duration of the process.
Yep, there was also J#, an intermediary language so you could run old J++ code on .NET, IIRC, which obviously became defunct sooner rather than later once C# became the de facto standard.
I have yet to meet a C# dev who actually touched J++ or J# to be honest.
My recollection of the reasoning back then was that running on a VM would make the transition to 64 bits smooth, which at the time was worrying everyone a lot.
I low-key wish they would open-source the compiler so the open-source community could make an attempt at rebuilding the IDE. I have heard they cannot open-source all of it; I wish they would open what they can.
It was an installer; it installed all the DLLs, and if you bundled them with your application, you were golden to run it as-is. Though you are technically correct that you need them, I think my statement still stands: you just need to bundle the required DLLs and OCX files and you're golden.
That's not very comforting for the dozens of times (that I personally experienced) when the developer didn't make sure it was "golden" and instead left a headache for the end user.
As long as you used the bundled installer, everything worked. If you didn’t, you’d need to bundle extra components but, IIRC, component selection was automatic - any DLLs, VBXs, and OCXs used would be detected and bundled.
I think the installer itself was licensed from a third party.
C# was a response to Java. Microsoft RARELY innovates. They look around at what is trendy and cool, copy it (usually in terrible fashion), and shove it down the throats of all the customers who can’t leave them.
Aarrggghhh. Why do we insist that projects/systems/languages must be continually changing and evolving?
Shouldn't there be room for a system that just does the thing(s) it does well? Why do we need to be continually tweaking, adding increasingly obscure "features" and new bugs?
Unlike a human language, a computer language isn't "dead" when it stops changing. It is dead when nobody is using it. These are very different criteria.
> Unlike a human language, a computer language isn't "dead" when it stops changing. It is dead when nobody is using it. These are very different criteria.
A human language is considered dead if it no longer has any first-language speakers, but does have second-language speakers or is used fluently in written form, such as Latin. [0]
I broadly agree with your point, though. Many languages would do well to slow their rate of change. There are very few slow-changing languages, like C, Forth, and Scheme. This 4-year-old comment of mine on this topic is still applicable. [1]
Agreed - languages that keep on changing result in a lot of churn in the ecosystem, as older libraries quickly feel dated when they do things "the old way".
New features often interact poorly with some of the existing features, as those features weren't designed with the new feature in mind.
Be careful what you wish for. One such niche is COBOL development. COBOL hasn't changed much, and it's anything but dead in the sense of not being used, but there are more than a few developers who wish it were.
>> The existing ones are all deficient in rather serious ways
But most of the new ones look and behave just like the old ones - but with slightly more awkward syntax to allow for some special "feature" dear to the authors ;)
I routinely check out the various new languages mentioned on HN and Lobsters. While plenty of languages are new to me, I've yet to see any feature that is new to me :(
I appreciate the work that goes into any language, but the stream of "new" languages feels like a stream of tweaks and rearrangement. It's disappointing. Most new languages come across as "like language X but with feature Y".
Reminds me of all those "Airbnb for X" proposals from a few years ago:)
Vala was super neat as it came out of the ElementaryOS line of work. Their holistic focus towards usability and approachability of a Linux distro was inspiring enough that I even supported them with my meager student developer income. Since then, I've moved to Mac and haven't dabbled much in desktop Linux machines. As it grew older and some things got broken up (like serenity and ladybird), the novelty-energy wore off and I looked away for years.
As a language, I hoped Vala would pave the way for a beautiful high performance application tool. Instead, we got Electron.
Vala is quite a bit older than ElementaryOS, though maybe that's been its primary use for a long time now? I vaguely remember Vala as inspired by, and reacting to, an even earlier trend of GTK applications in C#/Mono, such as F-Spot and Banshee.
That's the same place I ended. A desktop environment does not need its own language, and it was kinda hubris to think people would do that.
What's also interesting is that in Gnome 3, which came just a few years after, they chose JS because so many people already knew it and could easily develop for it.
I'd say the latter was the better idea, as there are far more shell extensions than Vala applications.
"A desktop environment does not need its own language ..." I'm unsure that I agree. Part of Vala's raison d'être was interoperability with C, and it integrated well with the parts of the desktop environment that made it a desktop environment (i.e., the widget toolkit and the accompanying services). If I wrote GNOME (Gnome these days?) applications, I'd probably be fine writing the UI in Vala and the rest of my logic in something that, at a minimum, spoke the C ABI.
Likewise, I used to write Windows software in Pascal with parts in C and C++, because I oddly found both Win32 and COM, and interfacing with third-party libraries, easier to deal with that way.
These are kinda ergonomic decisions, though, and a whole new language really isn't necessary for that. You could do that by making a fantastic library for another language/languages.
I'd point to PyQt as an example. Its function and documentation are flawless. I was able to build an entire web browser using it in a day (no, not a Chrome competitor, just a simple one using Qt and WebKit components). The 'official' libraries are so clumsy in comparison.
My argument is exactly about the ergonomics. If the goal is to target Gnome in particular and take advantage of C interoperability (ideally, without the overhead of a foreign function interface), Vala seems like a solidly ergonomic solution to that problem.
To me, what was surprising about Vala is that it has automated memory management and C#-like syntax, but does not really offer memory safety, even if you don't interface with obviously unsafe C code.
Right, the latest version seems to be available through MSYS... I remember there was a huge version gap for *nix vs Windows earlier. I guess I remember the native Windows distribution, which is not available nowadays at all.