Pascal was the first structured high-level language I learned when I was a kid.
After I learned C and started using it, I noticed I was hitting roughly five times more memory-related issues in C programs than in the equivalent Pascal code I had been writing before.
During that era, Pascal had a remarkable advantage few other languages could match: it used a single-pass compiler that generated machine code as it parsed the source. No intermediate representations or syntax trees - just a direct translation from source to machine code, all thanks to the well-thought-out syntax designed by Niklaus Wirth. That made Pascal compilers incredibly fast.
In turn, this tightened up the typical development cycle of the day: (edit -> compile -> run) x N times. Given the CPU speeds of the era, it made a night-and-day difference. For a comparable piece of software under development, a Turbo Pascal round trip took about 5 seconds, while Turbo C gave you 40 seconds at best.
Pascal was the right tool at the right time. Both Apple and Microsoft initially used Pascal to develop their operating systems.
As CPUs grew faster and faster, that particular Pascal advantage began to fade, and other languages started eating away at its market share. Somewhere between 1986 and 1992, software houses switched to C in droves.
Unfortunately, it took cyberattacks making a dent in company budgets and in critical national infrastructure for governments and companies to finally realise there was something to "programming in a straitjacket" (the usual criticism C folks aimed at Pascal back then).
> For a comparable piece of software under development, a Turbo Pascal round trip took about 5 seconds, while Turbo C gave you 40 seconds at best.
I mentioned this here recently, but we use Delphi, the Turbo Pascal successor, at work. A full release build of our main project, about 2 million lines of code, compiles and links in about 40 seconds on my laptop with an Intel i7-1260P. A mere compile is, of course, typically much faster.
I haven't benchmarked it recently myself, but back in the 2000s the code generation was quite decent. It was good enough that I decided to stop handwriting assembly, as writing compiler-friendly code was significantly faster and much more readable.
> After I learned C and started using it, I noticed I was hitting roughly five times more memory-related issues in C programs than in the equivalent Pascal code I had been writing before.
I wasted so much productive time back then learning and writing C for its own sake, instead of just doing the work in Pascal.
Later on I had the same problem: I already knew Rails but wanted to do Python/Django just because.
I first learned programming in Wang BASIC on a Wang 2200 computer (8KB RAM!) in 1978. A year later a "Byte Shop" computer store opened up in town. I didn't have any money, but I would go by and look at the different computers on display and browse their books. While flipping through a book on Pascal, I remember being confused: is this the actual programming language, or is this pseudocode?
Another advantage of Pascal was that programs written in it crashed much less, which also allowed for much safer development on the machines of that time, which had no memory-write protection.
And that safety during development translated into a less crashy product, too.
Pascal is very much like a managed language, but without a GC or borrow checker. It's not formally memory-safe, but its syntax discourages the developer from playing with fire unless it's really needed.
Additionally, all the flaws stemming from its design as a teaching language, and the rise of incompatible dialects, were already fixed by 1978 with Modula-2, which Niklaus Wirth then created, drawing on the lessons of Mesa at Xerox PARC.
Later we also got the managed-language genealogy via the Modula-2+ branch, as well as Niklaus Wirth's own Oberon variants and the dialects they inspired.
Nowadays GCC has Ada, Modula-2 and Algol 68 as official frontends, and we have Free Pascal and Delphi.
Then there are all the modern languages that drew some inspiration from this history.
Thus we as an industry aren't lacking alternatives.
Writing compilers for old CPUs has some real magic in it. It helps you see how processors really work and brings back the old days when hardware was simple and easy to understand. I miss that time. I once wrote a small C compiler in TypeScript for the Intel 8086 and 8087 ([1]), and I have huge respect for the people who coded for those chips. It's super hard but also very rewarding.

[1] https://github.com/Mati365/ts-c-compiler
The 8086 was a cakewalk compared to some of the weirder old chips. The 6502 is a notoriously bad compiler target, but things like the Signetics 2650 or RCA 1802 posed a completely different set of challenges.
I think writing lexers and parsers is just fun; code generation I have not done, which is next-level IMO. I guess the level after that is doing the lexing, parsing and code generation on the chip itself. Then the need for multi-pass compilation would become apparent quickly, I presume!
> It helps you see how processors really work and brings back the old days when hardware was simple and easy to understand. I miss that time.
Just FWIW, you can still find Z80s listed for sale all over the usual e-merchants and people absolutely still design around them. It wasn't discontinued until last June, and there's an updated eZ80 design still made and sold by Zilog.
The Z-80 was one of the best compiler targets of that age, but the 8086 was even better. Everyone was amazed at the very fast Turbo Pascal compiler for the Z-80 that got ported to the 8086. I had an 80286 computer, and Turbo Pascal was my favorite programming language: the compiler was fast, execution was fast, and the language was extended enough that you could do most systems and applications programming in Pascal. You could easily link in assembly-language procedures, such as replacements for the stdlib zero and copy routines that exploited new instructions and wider data paths to double their speed.
This is really cool. I only ever got a Pascal interpreter for a subset of the language as a type-in from a book, when I was already in PC land, and naturally I only kept the book as a collection item.
This made my heart melt:
https://github.com/pleumann/pasta80/issues/7#issuecomment-28...
For those that don't know it, Anders Hejlsberg is the guy behind Turbo Pascal, Delphi, J++, C# and TypeScript https://en.wikipedia.org/wiki/Anders_Hejlsberg
That's like knowing that the gods are smiling down on your project.
Good heavens. Comparable to getting one of those $0x1.00 cheques from Donald Knuth...
This is the reason why there are still people doing new development in Delphi.
Modula-2 is the road not taken for me. It's such a pity that language didn't get the chance that it deserved.
Thanks for spoiling my upcoming weekend. :)
This is great to see. I have played around with Turbo Rascal Syntax Error and will give this a shot. SpeccyNext coming when round 3 ships.
Great name. Names can be so generic, but this one hits the nail on the head.