Sad news – Niklaus Wirth passed away on January 1st. RIP.
https://twitter.com/Bertrand_Meyer/status/1742613897675178347
A lot of early software for #apple's #Lisa and #Macintosh computers was written in Pascal, including most of the Lisa operating system and applications.
Niklaus Wirth not only invented the language, but also contributed to Clascal, Apple's object-oriented Pascal variant. His help was enlisted by Larry Tesler, who worked not only on the Lisa OS, but also on Smalltalk at Xerox PARC and later on the Apple Newton.
The Lisa OS source code is available from the CHM:
https://computerhistory.org/blog/the-lisa-apples-most-influential-failure/
Wirth managed to do what few people in academic CS can or want to do today: he implemented his own hardware, OS, compilers, GUI and applications, and even designed his own hardware description language, over several iterations (#Modula and the #Lilith, #Oberon and the #Ceres, and later his RISC5 FPGA systems), and he used these systems in teaching and research at #ETHZ. Today, academic CS often prioritizes teaching of phenomena and coping with commercial HW/SW instead of building from first principles.
@me_ that's really sad to hear :/
@me_ Oh wow, didn't realize Martin Odersky was his student. May he rest in peace.
@me_ I have similar views but also things are very different these days, there are many good reasons why a single group can't produce credible versions of all of these things
@regehr Yeah, I'm wondering if I can (and am willing to) stay up to date with all current developments in computer architecture, operating systems, etc. until I retire. I'm moderately successful at ignoring Windows and x86 already. Currently it's all about AI here (surprise, surprise) and I have the impression that almost none of the students are still interested in system-level topics. Fourteen years to go until retirement, that's a long time...
@me_ @regehr I'm not sure I believe that, actually. I suspect there's quite a lot of interest, but little opportunity to pursue it academically (in large part because _so much_ emphasis is on AI, ML, etc, etc, etc...those of you inside of academia understand the ebbs and flows of popular topics and the influences they place on research far better than I do).
@peter_haas @me_ @regehr oh, well, if you want my opinion....
Plan 9 had some really cool ideas, but they were products of their time, the place they were developed, the group that developed them, and the issues that were important to that group in that time and place.
The idea of per-process, mutable name spaces and small file trees containing end points for IPC as a means for building a system was interesting and led to some really elegant demonstrations: I've always liked how one could import one's namespace from a terminal onto a CPU server, thus being able to (privately!) access all of one's local devices remotely. No need for X11 or a network sound protocol; just open the relevant device file and you're done. In many ways, Plan 9 was a better Unix than Unix, which is unsurprising given who developed it.
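Roughly, the idea in code (a portable C sketch only; the path is illustrative – on Plan 9 a terminal's devices imported via cpu(1) typically appear under /mnt/term on the CPU server):

```c
/* Sketch only: with the terminal's namespace imported on the CPU server,
 * "remote audio" is just an ordinary open()+write() on a file.
 * Written as portable C here; path and buffer size are illustrative. */
#include <fcntl.h>
#include <unistd.h>

int
main(void)
{
	static char silence[8192];                       /* zeroed PCM samples */
	int fd = open("/mnt/term/dev/audio", O_WRONLY);  /* the terminal's sound device */
	if (fd < 0)
		return 1;
	write(fd, silence, sizeof silence);              /* play a short moment of silence */
	close(fd);
	return 0;
}
```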
But the question must be asked, is building a better Unix all that relevant? Are highly synchronous system interfaces and files as _the_ metaphor for system interaction still relevant in 2023? More importantly, should they be? I would argue that they are less so than in the late 1980s (and I think some good judges would agree with me on that).
Then there's the implementation.... Plan 9 worked well enough on the sorts of machines available at Bell Labs in the 80s and 90s, but let's be honest: these are interesting ideas ensconced in research-quality code. In terms of reliability, it seemed to exist in a limbo somewhere midway between 7th Edition Unix and 4.3BSD for, well, ever. And then there's the C dialect: I think this was a clear win in 1989. In 2023? Not so much. It underspecifies important things (atomicity of primitive operations, for instance) and assumes too much (sizes of primitive types). Some things were pure misfeatures (members of unnamed structs embedded in a struct become members of the outer struct, and punning on those has led to some weird contention issues with locks and reference counts). These days, aesthetics of some constructs or typedefs aside, I honestly think that C23 does everything that Plan 9 C did, but better.
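To make that misfeature concrete, here's a rough illustration in standard C, with the promotion that Plan 9 C does automatically spelled out by hand; Chan/Ref/Lock are placeholder names modelled on Plan 9 conventions, not actual kernel code:

```c
/* Sketch of the embedding/punning hazard.  In Plan 9 C the two embedded
 * structs would be written as bare "Ref;" and "Lock;" members, their fields
 * would be promoted into Chan, and &c could be passed directly where a
 * Ref* or Lock* is expected.  Spelled out explicitly here in portable C. */
#include <stdio.h>

typedef struct Lock { int held; } Lock;
typedef struct Ref  { Lock lk; int count; } Ref;   /* the refcount has its own lock */

typedef struct Chan {
	Ref  ref;    /* Plan 9 C: "Ref;"  -- &Chan puns as &Ref (first member) */
	Lock lock;   /* Plan 9 C: "Lock;" -- a second, distinct lock           */
	int  fd;
} Chan;

int
main(void)
{
	Chan c = {0};
	/* Code that "locks the channel" and code that "locks the refcount"
	 * end up taking different locks on the same object -- the kind of
	 * subtle contention/consistency trouble alluded to above. */
	printf("&c        = %p\n", (void *)&c);
	printf("&c.ref.lk = %p (what refcounting code locks)\n", (void *)&c.ref.lk);
	printf("&c.lock   = %p (what channel code locks)\n", (void *)&c.lock);
	return 0;
}
```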
So yeah. Plan 9: a highly influential research system, and I actually do run it in a few places these days, but it never evolved into the system I hoped it would become. There was more hope for commercialization of Inferno (a reimplementation of the Plan 9 ideas), but of course that never happened either. More's the pity.
Interestingly, there's a tie-in to Plan 9 (and Inferno!) from Wirth's work: the `acme` text editor was based on ideas from Oberon.
@cross @regehr This is probably a bit of a biased view, admittedly. There was almost no teaching on computer architecture and OS topics here in Bamberg before I arrived in 2022 (only an introductory course which had to cram CA, OS, and a bit of networking and security into a single semester), so it might take some time for (new) students to be motivated. On the other hand, we hired ten (!) new professors with AI-related research topics in the last two years here (stupid local politic(ian)s...)
@me_ @cross @regehr It was my experience at JHU that if you offer them solid systems courses, the students will come, despite competition from other areas like ML. Not all of them, obviously, but plenty to keep faculty busy teaching architecture, systems programming, compilers, and operating systems. The good stuff, in other words.
@dotstdy Shortly after nuclear fusion works in practice...
@regehr @cross We received funding specifically for a larger number of (assistant/associate/full) professors in the context of the so-called "high-tech agenda AI" (sorry, in German only – https://www.stmwi.bayern.de/wirtschaft/forschung-technologie/hightech-agenda-bayern/). So it was sink or swim – either accept the funding earmarked for a number of AI-related positions or don't get any funding at all...
It seems to me (an outsider) that "computer science" as a discipline has been diluted – that CS, properly speaking, is about understanding, creating, implementing, and designing systems, and has little or nothing to do with "teaching programming".
@publius Some (quite a significant number of academics, in fact) think CS was never about programming. Usually this is combined with the infamous quote "Computer science is no more about computers than astronomy is about telescopes, biology is about microscopes or chemistry is about beakers". However, I believe it's of utmost importance that CS students get their hands dirty and learn to build and analyze systems, even ones close to the hardware or directly in hardware.
Having done assembly-language programming, I definitely think it's something that anyone in CS should do, just for the insight it cultivates into what's actually happening under all those abstractions. Fraunhofer, the lens-grinder, made two of the most significant discoveries in the history of astronomy because he built a better telescope. But that doesn't mean that astronomy is about telescopes.
Universities are selling degrees in lens-grinding and calling it astronomy.
@me_ I read the article. I see the blurb:
"The Lisa’s sophisticated operating system, which allowed multiple programs to run at the same time (“multitasking”) was too powerful even for its 68000 processor, and thus ran sluggishly."
I am surprised...the Amiga 1000 has the same CPU, yet it wasn't sluggish at all, and it definitely was multitasking. I think something else was dragging the Lisa down.
EDIT: typo fixed (thing -> think)
@thebluewizard I think there are several reasons. The Lisa runs at 5 instead of ~8 MHz due to constraints on the bandwidth of the DRAM shared between the CPU and the (discrete, state-machine-PROM-assisted) video circuitry. Most of LisaOS is written in Pascal, but I think AmigaOS (at least the early TRIPOS-based version) was written in BCPL – please correct me if I'm wrong, I was on the Atari ST side in the '80s – so I would expect a similar overhead due to the use of an HLL compiler...
@thebluewizard OTOH, Apple wrote most of the initial MacOS in 68k asm, and the early Mac didn't feel slow unless it was constrained by its small memory or constant floppy swapping. However, there was no multitasking in early MacOS and the Mac ran at almost 8 MHz. The Lisa's video (720*364*1, 60 Hz) required a memory bandwidth of ~2 MB/s, the Mac's (512*342*1, 60 Hz) only ~1.3 MB/s, which probably helped (though some Mac developers ascribe the upgrade to 8 MHz to some ingenious design by Burrell Smith).
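For reference, the back-of-the-envelope arithmetic behind those two figures (1 bit per pixel, 60 full frames per second, blanking/refresh overhead ignored):

```c
/* Quick sanity check of the video bandwidth figures quoted above. */
#include <stdio.h>

static double
mbytes_per_sec(int w, int h, int bpp, int hz)
{
	return (double)w * h * bpp / 8.0 * hz / 1e6;   /* bytes per second, in MB/s */
}

int
main(void)
{
	printf("Lisa 720x364x1 @ 60 Hz: %.2f MB/s\n", mbytes_per_sec(720, 364, 1, 60));
	printf("Mac  512x342x1 @ 60 Hz: %.2f MB/s\n", mbytes_per_sec(512, 342, 1, 60));
	return 0;
}
```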
@me_ I didn't know the Lisa ran at 5 MHz. That can play some role in slowing it down. Also, compilers' code generation was not so good back then...again, another factor. And yeah, the video rendering could be another factor.
AmigaOS's architecture is made up of libraries (not the kind one links programs against with a linker; it is completely different...more like a crude version of COM, if you will). In version 1.x, all except one were written in C; the one exception was the DOS library, ported from TRIPOS and written in BCPL, so any DOS call must pass BCPL pointers. A BCPL pointer is a normal pointer divided by 4, which makes for rather unusual coding. This holds true for AmigaOS 2.x and 3.x, though I think the DOS library code was rewritten in C. I am not familiar with AmigaOS 4, as things got changed in it.
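For the curious, a rough sketch of what the BCPL-pointer convention looks like from C. The real AmigaOS headers (dos/dos.h) provide a BADDR() macro (and, in later NDKs, MKBADDR()) for the conversion; minimal stand-ins are defined here so the sketch is self-contained:

```c
/* Sketch: a BPTR is a machine address divided by 4 (BCPL counts 32-bit
 * words, not bytes), so every dos.library call needs a conversion.
 * On real 68k AmigaOS a BPTR is a 32-bit LONG; uintptr_t is used here
 * only so the example runs anywhere. */
#include <stdio.h>
#include <inttypes.h>

typedef uintptr_t BPTR;                               /* BCPL pointer: address / 4 */

#define BADDR(b)   ((void *)((uintptr_t)(b) << 2))    /* BPTR -> C pointer */
#define MKBADDR(p) ((BPTR)((uintptr_t)(p) >> 2))      /* C pointer -> BPTR */

int
main(void)
{
	static long buf[4];                           /* longword-aligned, as BCPL assumes */
	BPTR b = MKBADDR(buf);
	printf("C pointer %p <-> BPTR %#" PRIxPTR "\n", (void *)buf, b);
	printf("round trip ok: %d\n", BADDR(b) == (void *)buf);
	return 0;
}
```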
Incidentally, the AmigaOS developers were forced to port TRIPOS because they couldn't figure out how to write multitasking disk I/O. Yep, TRIPOS saved the day for the developers.
I know all this because I used to write some AmigaOS apps, plus I read some history of how AmigaOS came to be! And I met a few Commodore engineers myself, and I even have my Amiga 3000 signed by them!!!
@me_@sueden.social I have no idea who he was, but R.I.P.
@graphite One of the pioneers of computer science and a teacher and researcher who influenced not only European, but worldwide computer science for decades. Check out http://www.projectoberon.net if you have time – there's so much to learn from his code and books.
@me_ RIP Legend
@me_ pascal was my first one
@me_
After using PL360, Algol-W, a dialect of Pascal (Protel), and assembler (S/360, PDP-11), I learned C; and C was a terrible step backwards from all of those. The software world would have become so much better if more people had paid attention to and had understood Wirth's work.
RIP
@me_ I didn't even realize that's what started the thread. Yeah, used to be a big Wirth-ian myself, did a lot of programming in AmigaOberon in the 1990s. Mostly to avoid C which of course I use every day now at work. It was super-cool that Wirth still hacked Oberon stuff, including hardware, after his retirement.
@me_ huh. Good thing I solved one #AdventOfCode 2023 day using Pascal, then.
(GNU Pascal, which is part of GCC on MirBSD and therefore preinstalled, and which supports the Turbo Pascal 6-style OOP syntax I’ve learnt. Never FreePascal.)
@me_:
"Today, academic CS often prioritizes teaching of phenomena and coping with commercial HW/SW instead of building from first principles ."
"Currently it's all about AI here (surprise, surprise) and I have the impression that almost none of the students are still interested in system-level topics. "
treffende Beobachtungen, die ich auch teile
ich denke stärkeres Aufzeigen der Bedeutung von IT-Grundlagentechnologien würde allen helfen - in Kombination mit #hypebusting
@branils @me_ While I think that there is still _some_ substance in AI (as compared to blockchain, NFTs and that other BS), students should realise that unless they land a job at Amazon, Google etc., the actual number of software projects and business opportunities yielding real money is pretty limited at this point. It is highly debatable whether this will change in the future.
@cloud_manul @branils @me_ nah, someone wrote a clone of a (slightly older) ChatGPT in a 498-line SQL query.
No substance whatsoever.