The fundamentals of CS courses in my undergrad program were taught in Racket (with HTDP). At the time this seemed very impractical, because I already knew Java and just wanted to write useful apps.
But as I've progressed in my career, I've come to appreciate how it helped me develop a strong foundation for reasoning about programs and their underlying logic (in any language) that continues to serve me to this day.
If I was approaching this outside of a structured curriculum, it would be hard for me to justify this long-term intangible benefit relative to being able to move quickly to write something interesting and useful. And I think writing interesting and useful programs is the best way to motivate oneself to keep programming. So I can't strongly advocate for this approach. But I do think there are some worthwhile benefits.
nkassis 683 days ago [-]
I think it could be argued that the language used in universities is unlikely to be the one used professionally. I remember when the derogatory term "Java school" was commonly used for schools that shoehorned Java into all courses just to try to match the language of the day in enterprises. It really should be about what is the best pedagogical tool to teach the concepts the student is expected to understand, irrespective of future professional tools used. If Racket exposes those concepts better without making it onerous on students to learn, that's great.
When MIT switched from Scheme to Python for their undergrad classes there was a big debate here about it. I think the argument was that Python won't make learning the concepts any harder but is less of an issue for students to learn and has some long-term value professionally, so win-win. I don't fully agree but I can see the logic.
timidger 683 days ago [-]
It's not (just) that python is "easier" to learn than python (which I dispute - lisp is as easy to learn as a first language as any other. Depending on the language it may be a difficult second language though). The world had also changed radically since it was introduced into the curriculum:
"Costanza asked Sussman why MIT had switched away from Scheme for their introductory programming course, 6.001. This was a gem. He said that the reason that happened was because engineering in 1980 was not what it was in the mid-90s or in 2000. In 1980, good programmers spent a lot of time thinking, and then produced spare code that they thought should work. Code ran close to the metal, even Scheme — it was understandable all the way down. Like a resistor, where you could read the bands and know the power rating and the tolerance and the resistance and V=IR and that’s all there was to know. 6.001 had been conceived to teach engineers how to take small parts that they understood entirely and use simple techniques to compose them into larger things that do what you want.
But programming now isn’t so much like that, said Sussman. Nowadays you muck around with incomprehensible or nonexistent man pages for software you don’t know who wrote. You have to do basic science on your libraries to see how they work, trying out different inputs and seeing how the code reacts. This is a fundamentally different job, and it needed a different course.
So the good thing about the new 6.001 was that it was robot-centered — you had to program a little robot to move around. And robots are not like resistors, behaving according to ideal functions. Wheels slip, the environment changes, etc — you have to build in robustness to the system, in a different way than the one SICP discusses.
And why Python, then? Well, said Sussman, it probably just had a library already implemented for the robotics interface, that was all."
> It's not (just) that python is "easier" to learn than python (which I dispute - lisp is as easy to learn as a first language as any other. Depending on the language it may be a difficult second language though). The world had also changed radically since it was introduced into the curriculum:
A lot of texts will use pseudo-code and it is (from my experience) easier for beginner programmers to see the relation between pseudo-code and Python than for many other languages.
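To illustrate the point (a generic sketch, not from any particular textbook): pseudo-code for something like a linear search maps almost line-for-line onto Python, which is part of why beginners find the translation easy.

```python
# Pseudo-code from a typical algorithms text might read:
#   function linear_search(items, target):
#       for each item in items:
#           if item equals target, return its position
#       return "not found"
#
# The Python version is nearly a transliteration:
def linear_search(items, target):
    for index, item in enumerate(items):
        if item == target:
            return index
    return None  # "not found"

print(linear_search([3, 1, 4, 1, 5], 4))  # prints 2
```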
lproven 683 days ago [-]
> python is "easier" to learn than python
(?)
aidenn0 683 days ago [-]
That was quoted from GP; presumably they meant "easier to learn than scheme"
lproven 682 days ago [-]
Aha, ISWYM. Thanks.
melagonster 683 days ago [-]
yes, this actually is the most important advantage of Python: Python can read like pseudo-code. but newer features of Python have reduced this advantage.
agumonkey 683 days ago [-]
I'm still surprised Sussman describes the 2000s as a "new" job, whereas to me it's just mediocre soil from an unstable industry. Mucking around is not engineering... it's alchemy.
dreamcompiler 683 days ago [-]
The logical culmination is modern ML, where no engineering is involved. "If the thing fails on some case and kills somebody, train it more." This is voodoo, not engineering.
tjr 682 days ago [-]
Given Sussman's aversion to mysterious black-box AI, that's an interesting observation. The curriculum change brings everyone one step closer to not really understanding how things work.
Of course, it's often pointed out, "Well, if you write in Scheme (or C or Java or whatever) then you're not writing in assembly language, much less machine language, so you already don't understand everything." There's certainly truth there, but, to me, going from expertise writing code in a high-level programming language to gluing together libraries that you kinda-sorta understand feels like a bigger leap in what you do or do not understand than going from assembly language to Python.
tigen 683 days ago [-]
I suppose it depends on what issues we should consider to be introductory. Maybe the older theoretical approach was more fundamental then than it is now. Like, maybe that was effectively more practical given the old hardware performance constraints and likely having more control of the entire program stack.
It's interesting that Sussman kind of lumps together "uncertain software libraries" into the same category as machine control robustness (e.g. hysteresis). I never thought of it that way but I guess in practice it's all just "stuff", those libraries are just another piece of your program's environment like any other.
agumonkey 683 days ago [-]
Maybe MIT approaches post-2000 engineering with a solid foundation of analysis that creates both a beautiful creation process and reliable, beautiful software artefacts. But what I observe is a never-ending stream of partial doc reading, partially out of date, with random attempts until it looks like it won't fail if left running for a few minutes.
The ability to deal with and reflect on unknowns is of great value in engineering, but so far I've never seen that in an office.
pklausler 683 days ago [-]
I see this as the distinction between computer programming and software engineering.
lproven 682 days ago [-]
> It's not (just) that python is "easier" to learn than python
(?)
douglaskayama 682 days ago [-]
He meant "python is easier to learn than scheme/racket".
andtheyrerobin 682 days ago [-]
> It really should be about what is the best pedagogical tool to teach the concepts the student is expected to understand irrespective of future professional tools used. If racket exposes those concepts better without making it onerous on students to learn that's great.
One of the challenges with an approach which doesn't concern itself with "industrial grade" or "production ready" languages is getting buy-in from students. Even if there were a perfect language for teaching, if students don't see the applicability of that language they aren't going to learn enough of the concepts to move to such a language later.
I think it's very easy for us (and other technically competent folks) to see value in learning how computers work for the sake of that knowledge; however, students, as an overgeneralization, don't. The fact that relevance and motivation are some of the hardest hurdles to overcome in early computing classes is a perfect example of this. Using languages with a professional pedigree is important because it increases student buy-in to what they're being taught.
An analogy I like is that you wouldn't give someone new to woodworking a toy hammer and hand saw because they need to learn fundamentals like striking and cutting before they can start using "real" tools, you would provide them with capable, but beginner friendly, tools that allow them to build those skills as they learn.
eyelidlessness 683 days ago [-]
I still haven’t tried Racket, but I did dive into Clojure. It was on the job, not a structured curriculum by any stretch. But I can say the same:
> …it helped me develop a strong foundation for reasoning about programs and their underlying logic (in any language) that continues to serve me to this day.
I went on to use Clojure and ClojureScript for several years. Now I primarily work in JavaScript and (preferably) TypeScript, but I can’t say a day has gone by where what I learned working in Clojure hasn’t been applicable and valuable.
Just one more anecdata point, but I would advocate learning a lisp—maybe even any lisp. In fact, it’s usually one of my first recommendations when juniors/mentees ask me for advice in any kind of broad strokes.
xp84 683 days ago [-]
When I read things like the second paragraph above, I start to wonder if maybe I am actually a terrible programmer because I only ever learned what you might generously call “applied computer science” aka only the languages commonly used in typical workplaces (Java, PHP, Ruby, JS) and none of the CS classics. Does anyone else without the so-called ‘academic’ languages foundation ever feel this way when you see people on HN always writing about Lisp and its peers?
ign0ramus 683 days ago [-]
You're probably not. Programming in Lisp, back in my college days, felt enlightening. I remember flying through quite complex projects with a clarity that I've rarely experienced in my professional career. I do yearn for those days when I'm deep within a nonsensical "enterprise" Java codebase or a messy frontend codebase, but I don't think Lisp made me an objectively "better" programmer. There is enough imposter syndrome in this industry as it is. If you ever want to, pick up SICP or "How to Design Programs"; you'll probably find them interesting, but don't expect to find programming nirvana.
gglitch 683 days ago [-]
I think I had the inverse experience. I started with Lisp because I was customizing and extending Emacs. As I self-taught programming from that point forward, most other languages have sooner or later frustrated me with inexplicable inconsistencies with the syntax and data model. Lisp, Tcl, and Prolog all clicked into place quickly and easily, and for everything else, I'm constantly referring to a guide to the syntax. Not that big a deal, but feels internally like at best sort of a head-scratcher.
None of this has anything to do with academic computer science. I think I'm just unusually bad at memorizing syntax.
scj 683 days ago [-]
I had the same prof for several courses. His exams were pretty straightforward, but always contained a "tough question".
In order to get that tough question, you could carefully read the book, read his assignment solutions, and pay close attention to his caveat warnings in lectures... One of the three would typically contain the tough question.
Or, you could just read the book, do the assignments, and accept ~95% on the final exam as a maximum grade.
A ~95% cap, for a lot less work, is a good deal. Taking the deal is probably the right call for most people.
That being said, the thing I wish someone explained well is how amazing lambdas can be. I understood the benefits of list/array/stream processing on first glance, but it took a few years before I really understood the practical application of lambdas.
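As a sketch of what that practical application can look like (a generic example, not from the comment above), the point is usually that lambdas let you pass behavior as data:

```python
# The same sort function is reused with different orderings simply
# by passing a different lambda -- no subclassing or flags needed.
people = [("alice", 34), ("bob", 27), ("carol", 41)]

by_age = sorted(people, key=lambda p: p[1])
by_name = sorted(people, key=lambda p: p[0])

# Stream-style processing: filter then transform in one pipeline.
adults_upper = [name.upper() for name, age in people if age >= 30]

print(by_age[0])      # the youngest person
print(adults_upper)
```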
xedrac 683 days ago [-]
Well you had a nice professor. Mine would have a few "tough questions" that accounted for 50% of your grade. It forced you to know everything inside and out if you hoped to pass.
tigen 683 days ago [-]
Chapter 1 of the SICP book says that the main reason it uses Lisp is its convenience in manipulating procedures as data.
There is a "JavaScript Edition" of SICP. The preface says ECMAScript 2015 enabled a close match by adding lambda expressions, tail recursion, and block-scoped variables and constants.
It doesn't mention much downside except that JavaScript's "rigid syntactic structure complicated the implementation" of some programming language design book material.
zsh_89 683 days ago [-]
No, I'd say you won't get too much by learning LISP.
I was attracted to LISP-related topics by similar arguments;
looking back now, I found those arguments at least not true for me.
You WILL get very interesting ideas and concepts, and it's fun. Basically, that's it.
+ You won't solve problems faster than your colleagues with solid competitive programming backgrounds;
+ you won't be able to optimize the code and cut 30% of your company's server cost;
+ you won't feel comfortable reading real-world complex project code;
+ you won't get some domain-specific knowledge to solve problems you previously couldn't solve.
I also wish I would have studied classical CS in college. However, I was sick most of the time, and there were half a dozen weed-out classes, so the computer science courses I did take were non-major courses to supplement a GIS degree.
GIS is certainly interesting in its own ways, and it taught me some visualization and design skills, but I wish it would have been possible for me to learn more of what's going on under the hood of the massive libraries and high-level languages I use every day as a professional "software engineer".
2143 683 days ago [-]
Me too.
So I decided to learn SICP on my own. I already have the book, so I'll watch the MIT lectures of the original authors available online, and try to make progress.
I'm not in any hurry to complete it anytime soon.
tgbugs 683 days ago [-]
I've been using Racket for about 8 years now, and it is my go-to language for anything that needs a gui. It took writing 3 or 4 not-so-little guis to feel comfortable working with Racket's object system, but now it basically goes at the speed of thought.
Lately I've been using Racket to prototype the interaction design for a game, and I had written some placeholder text. I had put fake interpolation in for {player-name} and things like that, and then I realized that #lang at-exp would let me just ... do that, as described in the original article.
I had to laugh, because I maintain a bunch of Python code, and if Python has batteries included, Racket comes with a retrofittable fusion battery pack.
One other little anecdote, is that I've been porting an old plt-scheme codebase to Racket CS. In the second phase I have started replacing an ancient C++ rendering layer with the racket gui canvas functionality. During the process I moved hundreds of runtime type checks in the C/C++ layer to the Racket contract system. It was as if the Racket team had somehow secretly obtained access to the codebase I was working on and had designed the perfect abstractions to solve the problem.
jacknews 682 days ago [-]
Is it any good for actual games though?
I have one kid using scratch now, and one older graduated to godot, roblox/lua, some javascript/web.
I would love to get them into lisp (already done a bit of Janet along with me), but it really needs to be something they can write their game ideas in, and share/show-off to friends.
tgbugs 682 days ago [-]
Racket has had significant engineering and design time spent directly on the use case you describe. See this old thread from John Carmack about his son learning with Racket https://news.ycombinator.com/item?id=10111479. Things have continued to improve since then as well.
From my own experience, it is absolutely possible to get good interaction with low latency in Racket, but of course, as with any language, you have to have enough experience to know how to avoid performance pitfalls.
For some of the gui work I've done I needed to mix threads and semaphores to get things to be really responsive when there could be a long running process triggered by a UI interaction, but in Racket it has been vastly easier and safer than in other languages.
bo-tato 675 days ago [-]
In the annual Lisp game jam it seems a lot of the submissions, and the most polished ones, are using Fennel, which is a lisp that compiles to Lua, from the same creator as Janet. Then for Lua (and Fennel) there are high-quality game engines: LÖVE (2d games), TIC-80 (retro 2d), LÖVR (3d).
With regard to Racket more generally, I'm probably not the best person to ask since I had a very high friction start where I just banged my head against the wall until things made sense.
maroonblazer 683 days ago [-]
I've started working through "How To Design Programs" and it calls for using DrRacket. It didn't occur to me to see if there was support for Racket in VSCode. Per this article, there is, via extensions.
For anyone who has worked through HTDP: is there any benefit to sticking with DrRacket vs using VSCode? The friction involved in moving around in DrRacket really dampens my enjoyment of the material in HTDP.
soegaard 683 days ago [-]
I recommend using DrRacket for the first two weeks.
And use it when you are working with picts.
Note that you can change the default settings to be more Emacs-like.
i-am-gizm0 683 days ago [-]
My intro CS class was in Racket. Probably 99% of people used DrRacket and were fine with it (to be fair, that was the "you haven't really programmed before" class). I had to use it for some of the graphics assignments and it's... fine?
If you're going to keep using it, make your life easier and use a proper monospace font (like Consolas). The default (I think Cambria?) doesn't align parentheses and brackets with each other and it makes it hard to read (at least for me).
iNerdier 683 days ago [-]
I’m slightly confused by the aim of this article. It goes out of its way to state in the first few paragraphs how approachable it is to newcomers, then dives into talking about how good it is at making other programming languages.
How big a desire does the average neophyte have to make themselves another language?
andsoitis 683 days ago [-]
There’s an elegance to a language that works great for beginners but also scales to very sophisticated use cases.
p4bl0 683 days ago [-]
Indeed, Racket is a platform for active academic research in both computer science education and programming language theory.
agumonkey 683 days ago [-]
I don't think both aspects are to be analyzed at once. Or unless maybe through the lens of minimization.
In most mainstream languages you have a large syntax, lots of idiosyncrasies, and limited ability to hack with the innards of your tool. Scheme is small, regular, and freeing.
Now personally, after my first year of college, I asked why on earth we couldn't access the methods in an object (Java 4 at the time) to generate a UI to dispatch / interact with it. The teacher rapidly walked me out of the room while mumbling "but that would be metaprogramming!". I left confused about his annoyed tone.
Not until year 4 did we have the chance to see reflection/intercession, lisps, macros... I wish someone had shown that to me when I was 12.
ps: it might not be obvious, and maybe I'm wrong, but I see ad-hoc DSLs everywhere I work. ORMs, build tools: they're all pseudo-languages... and people keep reinventing them. Scheme/Lisp offers it on a silver platter for you.
err4nt 683 days ago [-]
There are a lot of problems where people struggle to simply put the problem into meaningful terms, so they can think about it.
If you deal with the same problem space a lot, having a Domain-Specific Language (DSL) can let you execute that vocabulary and work at the level of the terms that fit the problem space best.
This is a lot smaller in scope than designing a whole programming language, think of it like a unique vocabulary for solving specific problems and automating specific kinds of work you do!
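A tiny hypothetical sketch in Python (all names here are invented for illustration): even without new syntax, a small vocabulary of functions can act as an embedded DSL for a problem space.

```python
# A minimal embedded DSL for describing recurring chores.
# The "vocabulary" is just two functions, but code written with
# it reads at the level of the problem, not the implementation.
def every(n, unit, task):
    """One rule in the scheduling vocabulary."""
    return {"interval": n, "unit": unit, "task": task}

def describe(rule):
    """Render a rule back into plain English."""
    return f"{rule['task']} every {rule['interval']} {rule['unit']}"

schedule = [
    every(1, "day", "rotate logs"),
    every(2, "weeks", "vacuum database"),
]

for rule in schedule:
    print(describe(rule))
```

A full DSL (with its own syntax, as Racket enables) goes further, but the payoff is the same: the code speaks the domain's terms.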
Solvency 683 days ago [-]
Seems pretty common to claim <technology X is both easy to learn yet extremely powerful>.
ssivark 683 days ago [-]
It’s talking about “little languages” (DSLs), somewhat in the spirit of what one thinks of as libraries/APIs (on a continuum, all the way to something with its own syntax… like regular expressions).
Is this any different than a beginner musician who learns to play their favorite songs but also write their own?
mst 683 days ago [-]
A neophyte wants to make a thing that -does- something, but part of the idea of Racket's Language Oriented Programming thing is to increase the subset of somethings for which throwing together a quick DSL makes for easier/nicer code.
I've not watched a relative neophyte try and actually -do- that so I can't comment on how far they've got with it, but making "creating DSLs" approachable seems like a really interesting thing to be aiming for.
SoftwareMaven 683 days ago [-]
The first series of courses in my CS program introduced us to building a Pascal interpreter using Scheme. It all depends on the pedagogical needs, I suppose.
psychphysic 683 days ago [-]
Newcomers with 5 years' experience in systems engineering. Just like your typical entry-level job listing.
markstos 683 days ago [-]
Looks great, but didn’t answer the question for me of why to learn this over TypeScript or Rust.
The answer seems to be in the title: this is of particular interest for those already interested in Lisp or Scheme.
zelphirkalt 683 days ago [-]
Sibling comment already asked the important question to ask yourself, but my answer would be: Language design decisions and the typical benefits of using Scheme dialects or former Scheme dialects, as Racket does not consider itself a Scheme any longer, iirc.
For example being able to make good use of recursion, being able to write programs more elegantly.
Racket, in contrast to JS or TS, comes with lots of batteries included, therefore not requiring so many third-party dependencies. Time and time again I see the Racket docs and think: "Ah, Racket has got something for that/got you covered." Also it is rich in programming language concepts. You can write your program in any of the languages that come with Racket, or you can write it with function contracts, or you can use Typed Racket.
At its core, it is a much cleaner and better designed language than TS will ever be due to TS' JS legacy.
markstos 683 days ago [-]
I coded in Perl back when it dominated web development and came to value practical over beautiful in language choices.
CPAN made Perl the best choice for many tasks and the same can be said of NPM for choosing TypeScript now, regardless of the elegance of the language.
But I’m also a fan of “language tourism”. Even if one language is more practical, visiting other language cultures can make us better developers overall.
I studied Haskell for a bit which gave new ideas for how I use Perl. Culturally, Haskell was using a lot of functional programming concepts and recursion. Those were possible in Perl but not popular.
I’m sure there is value in studying Lisp languages for those who end up working mostly in TypeScript or Rust.
mcpackieh 683 days ago [-]
> Racket does not consider itself a Scheme any longer, iirc.
Racket (the language) itself is no longer a Scheme, strictly speaking, but still feels generally "schemey" in the broad scheme of lisps. However, Racket (the software package) does ship with a Scheme or two, namely R5RS and R6RS.
disconcision 683 days ago [-]
Even the Racket teachpack libraries designed for education are very capable; I was able to make this structured editor using only teachpack content, without external deps: https://github.com/disconcision/fructure
aobdev 683 days ago [-]
I would argue that learning Racket as part of a foundation in CS would make it easier to learn TypeScript and Rust.
soulbadguy 683 days ago [-]
What are you looking for in learning a new language ?
markstos 683 days ago [-]
TypeScript as a job skill for web work.
Rust because it’s the other language I most often see coders learn to build their own apps. It has a good reputation for developer experience, security and speed.
I’ve heard enthusiasm for lisp-based languages for a couple decades and don’t doubt the merits of Racket.
eyelidlessness 683 days ago [-]
Disclaimer: my lisp experience is largely with Clojure, and not at all with writing Racket, just earnestly reading about it.
If you’re picking just one language to learn, and limiting your learning to only that one language for a significant period of time, you’ll probably want to learn Rust. You won’t get all of the benefits of also learning a lisp, but you’ll get a lot of the ones which generalize well.
If you’re open to learning 1+N and otherwise lean towards TS, I would say you’re better off learning Racket (or any lisp really) in tandem than learning TS alone.
Familiarity with the core concepts and idioms of FP is broadly beneficial in any language, and lisps tend to be a good balance of those with low incidental mental overhead, and reasonable escape hatches to do imperative stuff where it makes sense. Racket is probably a particularly good candidate because its optional typing is another overlapping story between the two.
I haven’t written in any lisp for close to a decade now, but I still find that prior experience beneficial every day since. I sometimes miss the simplicity and flexibility of the parentheses. But I can take what I learned—how to reason about state, data flow, data-driven abstractions—anywhere. And those abstractions are particularly useful in a structural type system like TS.
soulbadguy 683 days ago [-]
> You won’t get all of the benefits of also learning a lisp, but you’ll get a lot of the ones which generalize well.
Which ones? Rust and lisp in general are quite different beasts...
eyelidlessness 683 days ago [-]
Rust also is functional by default with clearly distinguished imperative mechanisms. In my (admittedly limited) experience, Rust hews more towards the type theory of MLs and generally encourages programming with expressions rather than statements. The syntax is definitely less lispy but not so much that it’s hard to metaprogram.
soulbadguy 683 days ago [-]
Yeah, in which case I think you are fine skipping Racket, or lisp for that matter.
uneekname 683 days ago [-]
I love Racket; I rewrote my website in it, and a few other projects. I'd call myself an advanced beginner, I guess.
> Racket excels as a programming language for young learners and beginners.
I honestly think Racket could do more to be beginner-friendly. The documentation is excellent, but difficult to understand as a newcomer. There are some great little tutorials that are easy to work through, but the ramp from there to writing your own Racket programs is steep imo.
I don't know of any other language with so many batteries included. Racket deserves to see community growth, and hopefully with that will come more resources for folks to get started.
no_wizard 683 days ago [-]
My limited use of Racket is purely due to how it has its own IDE and nothing else.
It's fine, and it's pretty good for Racket code (and its variants, like Typed Racket), but when I'm working on a real project I have other things I need to write too, like CSS, HTML, TypeScript, bash scripts, etc.
It needs more ecosystem to be more adoptable IMO.
I love it for learning though, but turning Racket into a production-level language w/ proper ecosystem has a way to go.
jimhefferon 683 days ago [-]
Lots of folks use it outside of DrRacket.
dizhn 683 days ago [-]
I am getting a weird rectangular box on this website that is not allowing me to read the article in peace. My assumption was they did want me to read it, but it doesn't seem to be the case. It says "Discover more from Deus In Machina" on this box. I would maybe want to discover more except for 2 things: #1, it won't let me read the current thing; #2, I don't know that other things will let me read either.
I think they should fix this quite obvious bug before writing more articles.
ArchieMaclean 683 days ago [-]
Beneath the large obvious green 'Subscribe' button there is some low-contrast grey text ('Continue reading') that, when clicked, hides the box. This is on Substack's side, not the author of the article.
dfan 683 days ago [-]
This is an annoying Substack thing. Click on "Continue reading" (if you wish to).
doodpants 683 days ago [-]
I use the "Disable JavaScript" extension in Firefox, and created a rule which disables JavaScript for the entire substack.com domain so as to avoid this nonsense. Unfortunately, some sites such as this one use their own domain name despite being based on Substack, so I still have to disable it manually in these cases.
mrkeen 683 days ago [-]
I get this kind of thing on plenty of articles linked from HN. Sometimes it's a paywall. Other times it's just an inexplicable UX race to the bottom. The first stages of "enshittification", I guess?
efficax 683 days ago [-]
just click continue reading…
eyelidlessness 683 days ago [-]
Substack has a really annoying implementation of this that can be super non-obvious in some cases, because it’s based on a fixed(-ish?) scroll position but blocks before the modal is effectively visible. On my first visit to the article I was convinced that several links on the page were just underlined text because I happened to scroll to just the right place where I had a modal overlay I couldn’t see. I only discovered they were actually links after giving up and scrolling to read more, then second guessing my original conclusion.
theanonymousone 683 days ago [-]
Isn't Clojure the literal "Lisp for the modern day(a.k.a JVM)"?
I hope I don't get kicked out of HN altogether for this comment :)
mejutoco 683 days ago [-]
IMHO Common Lisp is the modern one. Using the JVM is nice for certain uses, and clojure is a fine language, but going bare metal from higher abstractions is some power of Lisp that we should not give up. Same with macros and code generation.
Luckily all of them can coexist, so we do not need to choose.
mark_l_watson 683 days ago [-]
I agree, Common Lisp is crufty but really stands the test of time. I have been using Common Lisp since about 1982 and old code still runs fine.
My relative use of Lisp languages: Common Lisp 60%, Racket 20%, Haskell 5%, Clojure 5%, and various other Schemes 10%. Unfortunately since most of my work in the last 8 years has been deep learning, LLMs, LLM chains, etc., I spend a little over half my time now using Python. So it goes...
i_am_a_peasant 683 days ago [-]
That Vonnegut reference at the end ;D
deckard1 683 days ago [-]
"old code runs fine" is a poor benchmark for how well a language holds up over time. All x86 code still runs fine. Perl 5 is from, I think, 1994. It's still maintained and in development. Most C code from the '80s would still compile and run just fine (at least POSIX-based, Windows however...). You can trivially find Pascal, Fortran, COBOL, and Forth code that all "still runs fine."
I don't think it's a particularly unique or interesting quality, that some old code still runs. After all, I can go to archive.org right now and run all of that ancient DOS, Amiga, whatever code in a 100% exact (or close to) emulator in my browser.
I am curious how performant compiled Common Lisp is compared to GraalVM compiled Clojure native images. You certainly don't give up macros and code generation when using Clojure, though you do give up a specific class of explicit reader macros. Some reader extensibility can be had (and is used in Clojure.core) via reader functions (data_readers.clj)
pfdietz 683 days ago [-]
You don't give up macros and code generation when using compiled Common Lisp, either.
criddell 683 days ago [-]
How is code generation handled on modern operating systems and CPUs? Isn't there normally a strict separation between code and data to prevent exploits?
perihelions 683 days ago [-]
I just checked: SBCL on Linux/amd64 puts compiled functions in a r+w+x page. You could write Lisp that writes Lisp that writes self-modifying machine code, and it would (I think) run by default.
(compiled-function-p #'f) ; T
(disassemble 'f)
; disassembly for F
; Size: 58 bytes. Origin: #x5350CD14 ; F
; 14: 498B4510 MOV RAX, [R13+16] ; thread.binding-stack-pointer
; 18: 488945F8 MOV [RBP-8], RAX
; 1C: 4883EC10 SUB RSP, 16
; [...]
$ pmap $PID --range 5350CD14
442331: /usr/bin/sbcl
0000000053498000 122272K rwx-- [ anon ]
total 122272K
dorfsmay 683 days ago [-]
Was compiled Lisp code (not code running in a REPL) ever able to self-modify?
Self modifying code in the sense of changing one instruction to another at runtime (as was commonly done in assembler in the 1960s) is not generally possible with modern Common Lisps mostly because modern operating systems don't allow it. And that's a good thing because such code would be hopelessly insecure and impossible to reason about.
But if you mean "compile a new function called X and replace the old X at runtime", that's easy in Common Lisp. It's not commonly done unless you're explicitly writing some kind of compiler.
What is commonly done is to create a lexical closure at compile time and change its bound values at runtime. IOW changing the private data of a function at runtime is more generally useful than changing its instructions.
What's most common is to write lisp programs that emit lisp source code and compile it at compile time (but usually not run time). Such programs are called macros.
pfdietz 683 days ago [-]
One can define and compile functions in a running Common Lisp. There's a standard function, compile, that does this.
In SBCL, any evaluation of an expression is done by first compiling it. Compiled functions that are no longer accessible (including not having any frames on the stack) are garbage collected.
The really interesting question is not whether users can mutate existing compiled code, but whether it's useful for the Common Lisp implementation to do so. This is because Common Lisp is a dynamic language, where generic functions and classes can be redefined on the fly. If you want to implement such things efficiently, it would be useful to be able to change existing compiled functions to reflect that (for example) this slot of this object is at this offset rather than that offset.
A scheme has been proposed that puts such code off to the side in little chunks accessed with unconditional branches. When a redefinition occurs, the branch is redirected to a newly created chunk; the old one is GCed when no longer referenced from the stack. You have to pay for the unconditional branches, but those are typically quite fast.
andsoitis 683 days ago [-]
> How is code generation handled on modern operating systems and CPUs? Isn't there normally a strict separation between code and data to prevent exploits?
You compile code, which is text (data), all the time, don’t you?
Zambyte 683 days ago [-]
The difference is that the output of the compiler is usually not loaded into the same process that did the compiling. That is not the case for applications that are developed On Lisp™.
andsoitis 683 days ago [-]
True. Well, since it works on the latest CPUs and OS's, presumably it isn't verboten.
p_l 683 days ago [-]
At worst you have to ensure that you separate what needs to be writeable from executable, and then flip mappings RW->RX when generating new code, just like other JIT compilers do.
criddell 683 days ago [-]
Oh okay. I was thinking of self-modifying, not just code generation. That was something that was done occasionally by DOS and Windows programmers before the introduction of DEP in Windows XP.
trws 683 days ago [-]
Yeah, self modifying has become harder. As far as I know the most robust solution to that now is like what Julia does, where it can compile new generations of any module to update call targets, for example. At worst, when seccomp disallows flipping a writable page to executable at runtime, you can always compile into a file then dlopen the file. In other words, it can take many more contortions now than it used to, but it’s still possible.
waffletower 683 days ago [-]
No, 'eval' is available in many dynamic languages and needs to be utilized with care
archgoon 683 days ago [-]
It is not that strict. Many years ago, machine code was loaded directly into memory, and whatever the CPU's program counter pointed at, that's what it would execute.
These days, a page of memory can be set to
Read
Write
Execute
The exploit mitigation you refer to is having the program typically set pages of memory to never have both write and execute set at the same time.
However, this is ultimately controlled by the program. On Linux, the program can invoke the system call 'mprotect' to change the permissions on a page (though a program can also voluntarily use seccomp to forgo invoking it ever again).
And this is basically what browsers do. They compile the code into memory that has been set to 'write' (but not execute) and proceed to then set it to execute (but not write).
The effectiveness of this mitigation is undermined by the existence of ROP techniques, which is why Intel started introducing Control-flow Enforcement Technology (CET), which is intended to ensure you can only branch to certain locations in memory.
systems 683 days ago [-]
The problem with Clojure is that you have to learn Java (or at least some Java).
Same issue I see with F#: you still need at least some basic knowledge of C#.
Unfortunately, running on the JVM really means running on Java,
and the same for .NET: it means running on C#.
I don't mind learning two languages; most developers are expected to know more than one. But context switching between two languages within a single function is not fun.
pjmlp 683 days ago [-]
The curse of guest languages.
However, F# suffers from a bigger issue that Clojure doesn't: it belongs to the platform owner, which behaves as if it had been an error to make it part of VS 2010.
jeremyjh 683 days ago [-]
Isn't that because F# failed to win any significant market share? I think F#'s problem is that C# is too good, and this was true even in 2010; it's not like Scala vs. Java in 2010.
josephferano 683 days ago [-]
I worked with F# for about 3 years and the sentiment of most F# developers after using it in anger, including myself, is "I don't want to go back to C#". In my opinion, it didn't win significant market share because Microsoft didn't want it to. This goes even further back than 2010. I don't know if it's because F# was open source and C# was not, as Microsoft's FOSS stance hadn't really come around yet till what, 2016?
systems 683 days ago [-]
F# is way nicer than C# (and Clojure is still way nicer than Java).
And I don't think F# failed; I just think it needs to find a way to hide C# and OOP better.
But the language is doing fairly well: tons of educational resources, tons of videos on YouTube, several nice projects on GitHub, a solid VS Code mode, and it is part of .NET. Still, a lot can be improved, and Microsoft is far from abandoning it. C# is one of MS's flagship products; F# is not a flagship product.
And I also think that Don Syme is a lot more active in working on F# and promoting it than Rich Hickey currently is for Clojure.
soulbadguy 683 days ago [-]
Big F# fan here, but I don't really think "doing fairly well" is a fair assessment of the F# ecosystem right now.
> microsoft is far from abandoning
Might as well be. When you look at the resources actually being deployed for F#, it's clear that MSFT either doesn't care or doesn't really have a plan for F#. Most of the work is done by the community. The number of actual paid, full-time MSFT devs on F# is very limited.
The tooling is extremely limited, and there are no official F# libraries for pretty much any MSFT or Azure services (you have to rely on the C# libs).
On LeetCode, where they have C# (so they already have the infra to run .NET stuff) and languages like Elixir and Racket (so they do support more niche languages), they still don't have F#.
The salvation for F# would come from finding a killer app, something akin to PyTorch/NumPy or Rails.
trenchgun 683 days ago [-]
Killer library. Not an app.
guhidalg 683 days ago [-]
Maybe, and it's a good case study to compare F# vs. C# with Java vs. Scala to understand why Scala seems to have higher penetration than F#.
In my opinion, and I mean this in the broadest way possible, the average C# developer is probably less motivated to learn a new language than the average Java developer. That's because C# developers are prescribed by Microsoft what to do, and Microsoft could deliver a productive-enough experience with C# + .NET Framework + Visual Studio for a long long time.
Java's fractured IDE story and runtime (OpenJDK vs. HotSpot), I think, led to a higher concentration of programming-language people looking at the JVM as a viable target to build a better experience on top of, so we got Groovy, Scala, Kotlin (anyone remember Xtend?), IDEA, Eclipse, etc.
The bazaar produced and embraced a functional programming language, whereas the cathedral treats F# as a thunderdome for new C# features.
Capricorn2481 683 days ago [-]
Scala pretty much hangs by a thread and is only breathing because of Spark.
bmitc 683 days ago [-]
Reading C# code makes me shudder. It is so hard to read with all the brackets, spacing, indentation levels, and overall verboseness. I honestly don't know how people do it.
F# can do OOP just as well if not better than C#, and F# is anywhere from 2-10 times more concise than C#.
That being said, Microsoft's mistake was viewing F# as a competitor to, and research lab for, C#, and the community's mistake was viewing and selling it as a replacement for C#.
What Microsoft and the community should have done is treated F# as a competitor and replacement for Python from the start.
pjmlp 683 days ago [-]
That is what the F# folks lately tried to pivot the language toward, only to be sidestepped yet again by Microsoft, when it hired Guido and managed to be the company that changed his mind regarding CPython optimizations.
Also see the development effort Microsoft spends on Python across all their IDEs and what F# gets.
deckard1 683 days ago [-]
> The curse of guest languages.
Every language lives under the iron fist of libc and the C ABI.
pjmlp 683 days ago [-]
Not every OS is a UNIX clone.
bmitc 683 days ago [-]
You really don't need to know C# to use F#. You need to understand .NET, but that is not the same as needing to know C#. F# sits on top of .NET and the CLR in a much more clean way than Clojure sits on top of the JVM. F# also has excellent error messages and is also much easier to install.
I know F# quite well at this point but only barely know C#. I know C# maybe only mildly better than C++ or Java in terms of being able to read it and guess what it's doing. I certainly can't write in it without looking up a bunch of stuff.
Capricorn2481 683 days ago [-]
Aren't most libraries you want in C# though? Isn't there a bit of impedance mismatch between how functional F# is and how OOP C# is? I have heard from a few F# fans how frustrated they are that they have to keep interacting with C#
bmitc 683 days ago [-]
OOP is very natural in F#, so it's not that. Interacting with .NET is not an issue. Interacting with a third-party library can sometimes be an issue, if it's doing a bunch of funky C# stuff. For my own use cases, I haven't needed to pull in a bunch of C#-developed, third-party libraries. When I did, for example Silk.NET's window bindings, it was easier to just write my own. I could imagine that's different for some more industrial use cases, though.
nifty_beaks 683 days ago [-]
I recently did a small project in F# just to try it out. This was my exact experience. Syntactically it seemed very nice, and in some ways it was cool to have all the libraries I'm used to in C#; however, due to the latter, it ends up being possible to suddenly have `null` or other weird non-functional thingies pop up.
c_crank 683 days ago [-]
Clojure is opinionated against using all the potential features of Lisp.
waffletower 683 days ago [-]
That is fair, though not a negative for many. Clojure prefers immutable data structures and functional constructs. Rich Hickey even wishes he had added fewer OOP constructs to the language (such as structs) and leaned more heavily on Clojure-rooted functional concepts like 'transducers', which came later in the life of the language.
masijo 683 days ago [-]
What would be those potential features of Lisp that Clojure is missing?
c_crank 683 days ago [-]
Multi-paradigm support, low-level / C interop capacity, really any sort of native features.
waffletower 683 days ago [-]
Clojure is a multi-hosted language. Clean native interop would be nice to have I agree though.
fulafel 683 days ago [-]
It steers you pretty heavily toward functional programming.
wildermuthn 683 days ago [-]
Unfortunately, IMHO Clojure’s maintainers hold an iron grip on the language and actively limit the growth of an ecosystem around it. See “Open Source is not about you”.
It’s too bad, because the language itself is really terrific.
fulafel 683 days ago [-]
I think being conservative about evolving the language and ecosystem growth are mostly different issues; the open source post was about the core language.
Comparing to other languages, I think the stability of the language has served the user community really well and been an enabler of the ecosystem. I guess the continuous language additions and resulting complexity and library/framework churn in some other languages can also be seen as vitality and growth, but for many of us it's the wrong kind of growth.
(And yes there are also disadvantages to the centralised development model of the core language)
The iron grip is on a decidedly double-edged sword. I applaud the continuity and maintainability of Clojure and its ecosystem. The consistency of the language has come from this stern stewardship. Rich makes very clear that the source code for the language is available to be forked. Very interesting developments have come from this, most notably babashka.
pjmlp 683 days ago [-]
They're within their rights; Rich Hickey almost went bankrupt while developing Clojure.
The usual "expect all for free, give nothing back" attitude.
threatofrain 683 days ago [-]
Rich Hickey basically wrote this in response to one of the notable and respected contributors of his ecosystem. If Rich wanted open source to be nothing more than a license and delivery mechanism then he shouldn’t have accepted volunteers. But a language without an ecosystem of volunteers is a dead language.
erichocean 683 days ago [-]
Clojure "the language" is certainly ossified, but Clojure "the ecosystem" is doing very, very well.
nocman 683 days ago [-]
I'm genuinely curious. What about the language do you feel is "ossified"? I am interpreting that as being a negative description.
Stability is extremely valuable. A lack of change to the core language over extended periods of time can be a very good thing, especially if certain changes would break existing code. Rich has made it clear that he is indeed targeting this kind of stability.
Again, I would be very interested to hear what specific changes you think need to be made to 'Clojure "the language"'.
erichocean 683 days ago [-]
No changes, I'm a happy Clojure user. Long-term stability is one of its selling points.
nocman 671 days ago [-]
Been busy and didn't see the response...
So... you meant 'ossified' in a good way?? If so, I guess I would have used a different term that didn't have such a negative connotation.
pjmlp 683 days ago [-]
Many languages do quite well as commercial products.
phtrivier 683 days ago [-]
"It can generate an executable" ; but how slow will this executable be ? Compared to something compiled with SBCL or a non-LISP ?
p4bl0 683 days ago [-]
Racket executables are not to be compared with those of a compiled language that aims for speed. The primary goal is not speed here and the documentation explicitly says so. You should see them as interpreted code that is bundled with its interpreter for easier distribution and execution, without anything else to install for end users.
You can write a Racket application, even with a GUI and whatever libs you need, and then cross-build it as a native executable (that is, an executable that embeds the interpreter, the necessary libs, and your application's code) for your platform and others, for easy distribution.
But the author states that Racket is "the finest example of a modern day lisp bar none". In the modern day, energy efficiency and climate change are major issues; the German government, for example, has set up the Blue Angel Ecolabel program: <https://www.blauer-engel.de/en>. Using "modern day" to mean VSCode and Discord is a worthless use of the phrase.
Racket ranks far ahead of languages like TypeScript, Python, and Erlang in terms of energy usage, but trails languages like C, Rust, Ada, Java and Common Lisp.
I think, however, there are probably other factors in your computing solution that have a bigger energy impact, like what processors you run on, the architecture of your program, etc.
BackBlast 683 days ago [-]
How can you trust a list that places JavaScript an order of magnitude lower than TypeScript? Seems fundamentally broken.
badsectoracula 683 days ago [-]
I didn't read the original paper[0], but it is based on the Debian Benchmark Game[1] as it was in 2017, and the game relies on user-submitted code for each language separately. So most likely, when the researchers checked the results, the TypeScript tests weren't as optimized as the JavaScript tests; which makes sense, as the latter is way more popular than the former, so there are fewer people who bother with it (and in fact the benchmarks nowadays do not include TypeScript at all).
You can confirm it via archive.org too: the TypeScript[2] page shows both less implementations and the implementations that are shown are often slower than those in the JavaScript[3] page.
As for whether that makes sense: IMO using the benchmark game for judging how good a language is at optimizations is flawed in the first place, as not only is there a bias towards the more popular languages, but also a lot of the top entries use approaches that in practice you wouldn't find in real projects.
And yet your program was accepted and included, and then with a later TypeScript update it stopped working.
spectralnorm.typescript-6.ts(115,29): error TS2794: Expected 1 arguments, but got 0. Did you forget to include 'void' in your type argument to 'Promise'?
Something like that is probably what stopped the authors of "Energy Efficiency across Programming Languages".
Yeah, TypeScript doesn't guarantee perfect backwards compatibility. You normally have the compiler as part of your dependencies and pin it. I guess they just didn't know.
I don't know what you mean with your last sentence. That's the same paper...
BackBlast 683 days ago [-]
I appreciate the thoughtful reply. Every time I see this list trotted out I can't get past the obvious red flag. Nice to have some more background on it.
igouy 683 days ago [-]
--alwaysStrict
So when the JavaScript doesn't type check, the authors measured a different program that does type check.
Even so, that only messes up the results because the mean is used rather than the median; the data tables published with that 2017 paper show a 15x difference between the measured times for a single outlier in the selected JS and TS fannkuch-redux programs.
That single outlier is enough to distort the TS and JS "mean" time difference.
Check the pages I linked to, or even better the one that compares the two[0] around 2017. While there is a single case with wildly different results, there is more than one with results that have a large difference in JavaScript's favor.
Please read the original paper and base your comments on the repo that the authors provided.
Like you, most readers haven't seen the paper.
Like you, most readers have only seen "Table 4. Normalized global results for Energy, Time, and Memory" taken out-of-context (often without any way to find the original source).
Most readers notice the too-large differences C/C++ and JS/TS and start speculating about how those differences were caused (because that's fun).
I went back to the time measurements the authors provided and calculated the mean, geometric mean, and median. Simply using a more appropriate summary statistic would have presented average values which readers would have found acceptable:
JS 7.25 times slower than C
TS 7.8 times slower than C
even though they were based on different programs and included an outlier. (Similar story with C/C++.)
igouy 682 days ago [-]
> How can you trust a list that places JavaScript an order of magnitude lower than TypeScript?
Simple: start with the same data, make the same calculations, see the same results.
When we make assumptions about how measurements were made and analyzed, our assumptions may be wrong.
andsoitis 683 days ago [-]
> How can you trust a list that places JavaScript an order of magnitude lower than TypeScript? Seems fundamentally broken.
The list places JavaScript at a score of 4.45 and TypeScript worse at 21.5. For reference, C, the most energy efficient, is 1.0.
BackBlast 683 days ago [-]
> The list places JavaScript at a score of 4.45 and TypeScript worse at 21.5. For reference, C, the most energy efficient, is 1.0.
The point is that TypeScript is JavaScript with typing syntax added on top. There is a transpile step into JS. That's how TypeScript works. The runtimes are exactly the same. Unless we're also measuring the build step? Which seems silly.
The difference should be near zero. And it's not. They clearly do not understand exactly what they are measuring.
andsoitis 683 days ago [-]
One explanation is that the typical JS output by the TS compiler is more verbose (so more code to load, parse, and run). On the other hand, one can imagine that the TS compiler’s catching classes of errors at compile time means fewer runtime exceptions.
I think the takeaway, however, is the theme of a spectrum of energy efficiency, with compiled languages being more efficient than those that are not.
But I still maintain that the overall efficiency of a system is more a function of other factors.
BackBlast 683 days ago [-]
My take away is that this list is very unreliable.
soegaard 683 days ago [-]
I don't know the usual speed difference of TypeScript programs vs JavaScript programs. But the argument that TypeScript generates JavaScript, so it must have similar speed, doesn't hold in general. If the compiler in question is a whole-program compiler, it can make optimizations that a normal person couldn't do.
As an anecdote, in [1] a raytracing program was implemented in both Scheme and C. The Stalin compiler (a whole-program Scheme compiler) produced an executable 45% faster than the one produced by g++. The Stalin compiler produced the executable by compiling Scheme to C, and then used a C compiler to produce the final executable.
The price of a whole-program compiler? Well, the compile times are huge.
I don't know the numbers but I can probably agree that "modern day" can be replaced by "batteries included" and it would be a better description, from the (valid) point of view that you take.
However, Racket is also a modern language in that it has many new and fancy programming language research-grade features since it is also a programming language theory research platform. From that point of view, "modern day" is a valid description :).
mcpackieh 683 days ago [-]
Generally, don't take puffery literally. And whenever the author of this random blog seems to contradict the documentation, you can safely assume the Racket documentation is more authoritative.
lispm 683 days ago [-]
> Racket executables are not to be compared with those of a compiled language that aims for speed.
"The CS implementation is the default implementation as of Racket version 8.0. This variant is called “CS” because it uses Chez Scheme as its core compiler and runtime system.
The CS implementation typically provides the best performance for Racket programs. Compiled Racket CS code in a ".zo" file normally contains machine code that is specific to an operating system and architecture."
ryanschaefer 683 days ago [-]
A while ago I wrote an AWS Lambda runtime to use Racket. I saw terrible startup performance, at around 500ms for a simple echo program.
dchest 683 days ago [-]
Yeah, reading all those parentheses on startup is not fast.
As an anecdote, I used to host a computationally expensive web app written in Racket (even before the Chez version) serving a large number of requests at peak times on a single $10 DigitalOcean droplet. It was pretty fast and didn't crash. No need for awkward modern web-scale thingies.
velcrovan 683 days ago [-]
Is the parentheses comment a joke?
dchest 683 days ago [-]
Yes :) I don't think there are many of them in a simple echo program, and I guess the runtime is compiled when it's shipped.
Also, how big will the exe be? Last time I looked, the whole interpreter is packed together with the source.
Also, I'm not sure if you can obfuscate the code, if you do not want to publish it.
macco 683 days ago [-]
Probably, it will be fast enough. The new compiler is based on mzscheme.
ObscureScience 683 days ago [-]
Isn't Chez Scheme the default runtime by now?
Jtsummers 683 days ago [-]
Yes. Since 8.0.
bmitc 683 days ago [-]
I really wanted to dive into Racket a few years ago and still really like Racket. But this is when they switched focus to the Rhombus project, which killed a lot of the interest and personal momentum I had going for Racket. It has also become clear that Racket's maintenance core is too academic. I don't think Racket itself is too academic, but by the very nature of Racket's primary community being professors and graduate students, there is a very small core team, with a lot of the interest and use outside of that core being short-lived as students go through PhD programs. There are a few non-academic core members, but they are few.
fn-mote 683 days ago [-]
IMO this part is FUD:
> when they switched focus to the Rhombus project, which killed a lot of interest and personal momentum I had going for Racket
The language has been "complete" for a long time. If you're actively trying to get something done, I think it's unlikely that some core language work is going to stop you.
It isn't clear what domain the parent wanted to work in; I won't say it's impossible they had problems but ... details are everything at this point.
The academic part of Racket gave rise to `syntax-parse`, a truly fabulous improvement over the standard Lisp/Scheme/whatever-you-used-to-work-with way to write macros.
bmitc 683 days ago [-]
Why is it FUD? It's my own feeling. In this specific case I had just spent a week of my and my work's time and money learning Racket at an official Racket school only to find out at the end that they're creating a new "surface" syntax. The split of do I learn and participate with Rhombus or Racket is a concern. The idea that the core maintainers, or really the core maintainer (singular), is busy with Rhombus is a concern. The explanations given at the time for the Rhombus project were primarily dubious and unclear. The uncertainty and doubt were not created by me. It was enough to make me move on. Other languages like F# and Elixir suit my needs.
> The academic part of Racket gave rise to `syntax-parse`, a truly fabulous improvement over the standard Lisp/Scheme/whatever you used to work with way to write macros.
That's not the academic part I am referring to. My comment is about who the maintainers and users are and how their profession or studies affect how Racket is interacted with. (This is not a complaint or judgement. It's an observation.)
> The language has been "complete" for a long time.
There are a fair number of bugs in Racket's libraries, including core ones. They aren't discovered because academic users are not touching bits that more industrial-oriented users would find. In particular, I am thinking of the GUI toolkit, and in my experience, only one person is capable of fixing those.
patrec 683 days ago [-]
> Racket is probably the finest example of a modern day lisp bar none
It's not. There are several nice and unique things about Racket, but it's a pretty poor example of a lisp, to the extent that if you want to understand what's good about lisp you'd probably be better off learning emacs lisp than racket.
clircle 683 days ago [-]
go on...
patrec 682 days ago [-]
Lisps have traditionally been optimized for very smart people solving cutting-edge problems (both in academia and industry) who benefited from PL innovation in an interactive, fully malleable and explorable environment. People like Chaitin, Baker, Gosper, LeCun, Minsky, Norvig, Sussman, etc. This community has essentially vanished, and a large part of the vacuum has been filled by people who seem to derive their feelings of self-worth from association with past greatness rather than personal accomplishments. But some of the spirit lives on in Julia (which has a much better claim to being the foremost modern lisp), and some cool stuff and people have been "grandfathered in" and continue to use lisp for fairly sophisticated stuff (a backend example would be Google Flights/ITA and an end-user app example would be Opusmodus).
Racket is, in my probably slightly biased opinion, a vehicle for doing firmly within-the-box CS research and teaching. It comes with a lot of batteries included, and there are some cool and innovative things about it (it has sophisticated support for syntactic abstractions with acceptable error reporting, contracts, non-textual data, etc.).
It's run by people with very respectable academic output, but to the best of my knowledge none of them are particularly appreciative of interactive and malleable computing (even Python is much better here), or ever wrote any code that's really pushing the state of the art. I'd also say that the Racket IDE (despite also having some nice features) is ugly, clunky, sluggish, and not something that I can see appealing to very good programmers.
I'm also not aware of any impressive industry project done in Racket (although, in fairness I should add that Carmack has said nice things about racket).
kazinator 682 days ago [-]
Languages in the Lisp family are very nice and advantageous to work with if you're the type of programmer who types out hundreds of lines of code, saves them to a file, and they either work the first time, or do so with minimal changes.
The people you mention in your first paragraph likely all belong to this category, incidentally.
Interactivity is not required for solving cutting-edge problems, and certainly not by people of that caliber.
Lisp programming, as such, does not require very smart people; languages in the Lisp family provide a user-friendly experience suitable for the average person who has some aptitude and curiosity for programming.
You need very smart people to solve cutting-edge problems in fragile low-level languages, in which a mistake will destabilize the machine. Those people have to be experts in the problem domain, and in coding (or a team which includes such).
Interactivity actually makes things easier for the bumblers who massage code into working and believe that trying functions at the REPL is a viable test plan.
kazinator 682 days ago [-]
Erratum: ... advantageous to work with even if ...
singularity2001 683 days ago [-]
They call #hash[(a . b)] modern? Maps should be a first class citizen with first class syntax and not ... that.
zsh_89 683 days ago [-]
As a pretty experienced programmer,
who make a living by writing code for almost 10 year(with a CS master degree), I would like to say something to new learners of CS who need making money by CS skill in the future:
+ almost everything related with LISP is fun
+ but they have VERY limited use in industry, for very good reason;
+ you should try to spend at least the same amount of time (as on LISP topics) to learn something like leetcode (or even higher-level competitive programming, which improves your skill at converting your ideas to code fast and your basic mathematical thinking; useful in interviews), or job-oriented CS books. You'll be a better problem-solver and more ready for the job market.
+ If you're very sure you will be financially OK, just learn anything you like and ignore this message.
I think there are many people reading this thread who develop using a variety of Lisp as their primary language and are more than adequately compensated financially, and yes, in 2023.
zsh_89 683 days ago [-]
Yes I completely agree.
Yet I insist that it is responsible to clearly deliver the message that the number of applications/jobs is NOT of the same magnitude.
Personally I've spent many hours on LISP-related topics; I enjoyed the time; I recommend it as a low-priority interesting topic.
Things like developing a working distributed KV store (MIT 6.824), or a solid 1000+ hours of competitive programming: those skills and knowledge are not only beautiful and interesting, but also make a person a better practical problem solver.
dunefox 683 days ago [-]
> + but they have VERY limited use in industry, for very good reason;
And what would these very good reasons be? inb4 "only for single programmers", brackets are difficult for normal programmers, macros can't be understood by mortals, and other non-issues that have been discussed to death.
glonq 683 days ago [-]
I will argue against your first point ;)
...but otherwise pretty much agree.
I learned LISP in college, way back when expert systems were the future of AI.
The worst thing that I could probably do professionally is to write something in LISP that needs to be adopted and maintained by somebody else in the future.
jes 683 days ago [-]
I'd like to play around a bit with Racket. I have been using Slime with Common Lisp for years. From what I'm reading, it sounds like Geiser[1] might be a Slime-alike package for Emacs.
Thanks for the heads up! Looks great, will take it for a spin.
nsm 683 days ago [-]
IMO a couple of cool aspects of Racket that aren't called out in beginner guides, but I think are really useful in real programs:
1. The use of custodians on a per-(green-)thread basis, which means "oh yes, if the thread crashes or is shut down, racket guarantees that all open files/sockets/resources will be shut down". For example you can use this to ensure that a timed out client doesn't cause unnecessary resource usage. https://docs.racket-lang.org/more/index.html#%28part._.Termi...
2. Events and composable concurrency from Concurrent ML - Unlike Go's channels, concurrent ML events are composable up the stack, and have some nice things built in like "NACK events" (allowing cancellation down the stack). Unfortunately there aren't great primers about using this that I know of apart from the Kill Safe Synchronization Abstractions paper. https://users.cs.utah.edu/plt/kill-safe/
Some of these were added to make it easier for the IDE to manage errant programs that could be written by beginners, but it is also a very "applicable to production" set of tools.
Racket internalizes Extra-linguistic mechanisms https://felleisen.org/matthias/manifesto/sec_intern.html
I agree that startup time is not a winning aspect of Racket. My naive understanding is that this is because Racket is not a direct bytecode interpreter like CPython, but actually has to run a compile step to native code, and doing that for programs plus their required libraries necessarily takes at least a couple of hundred milliseconds before anything can start running, while CPython can pretty much start executing from the get-go.
manicennui 683 days ago [-]
The comments section of this submission is the perfect example of how far this site has fallen. A bunch of blub programmers worried about whether Racket is the most popular or can be used for their crappy day job.
"Costanza asked Sussman why MIT had switched away from Scheme for their introductory programming course, 6.001. This was a gem. He said that the reason that happened was because engineering in 1980 was not what it was in the mid-90s or in 2000. In 1980, good programmers spent a lot of time thinking, and then produced spare code that they thought should work. Code ran close to the metal, even Scheme — it was understandable all the way down. Like a resistor, where you could read the bands and know the power rating and the tolerance and the resistance and V=IR and that’s all there was to know. 6.001 had been conceived to teach engineers how to take small parts that they understood entirely and use simple techniques to compose them into larger things that do what you want. But programming now isn’t so much like that, said Sussman. Nowadays you muck around with incomprehensible or nonexistent man pages for software you don’t know who wrote. You have to do basic science on your libraries to see how they work, trying out different inputs and seeing how the code reacts. This is a fundamentally different job, and it needed a different course. So the good thing about the new 6.001 was that it was robot-centered — you had to program a little robot to move around. And robots are not like resistors, behaving according to ideal functions. Wheels slip, the environment changes, etc — you have to build in robustness to the system, in a different way than the one SICP discusses. And why Python, then? Well, said Sussman, it probably just had a library already implemented for the robotics interface, that was all."
https://www.wisdomandwonder.com/link/2110/why-mit-switched-f...
A lot of texts will use pseudo-code and it is (from my experience) easier for beginner programmers to see the relation between pseudo-code and Python than for many other languages.
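As a small, hypothetical illustration of that closeness (the function and names here are mine, not from any particular textbook): pseudocode like "for each item in the list, if it equals the target, return its position; otherwise report not found" maps almost word-for-word onto Python:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    # Pseudocode: for each item in the list...
    for index, item in enumerate(items):
        # ...if the item equals the target, return its position.
        if item == target:
            return index
    # ...otherwise report "not found".
    return -1
```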
Of course, it's often pointed out, "Well, if you write in Scheme (or C or Java or whatever) then you're not writing in assembly language, much less machine language, so you already don't understand everything." There's certainly truth there, but, to me, going from expertise writing code in a high-level programming language to gluing together libraries that you kinda-sorta understand feels like a bigger leap in what you do or do not understand than going from assembly language to Python.
It's interesting that Sussman kind of lumps together "uncertain software libraries" into the same category as machine control robustness (e.g. hysteresis). I never thought of it that way but I guess in practice it's all just "stuff", those libraries are just another piece of your program's environment like any other.
The ability to deal with / reflect on unknowns is of great value in engineering, but so far I've never seen that in the office.
One of the challenges with an approach which doesn't concern itself with "industrial grade" or "production ready" languages is getting buy-in from students. Even if there were a perfect language for teaching, if students don't see the applicability of that language they aren't going to learn enough of the concepts to move to such a language later.
I think it's very easy for us (and other technically competent folks) to see value in learning how computers work for the sake of that knowledge; however, students, as an over-generalization, don't. The fact that relevance and motivation are some of the hardest hurdles to overcome in early computing classes is a perfect example of this. Using languages with a professional pedigree is important because it increases student buy-in to what they're being taught.
An analogy I like is that you wouldn't give someone new to woodworking a toy hammer and hand saw because they need to learn fundamentals like striking and cutting before they can start using "real" tools, you would provide them with capable, but beginner friendly, tools that allow them to build those skills as they learn.
> …it helped me develop a strong foundation for reasoning about programs and their underlying logic (in any language) that continues to serve me to this day.
I went on to use Clojure and ClojureScript for several years. Now I primarily work in JavaScript and (preferably) TypeScript, but I can’t say a day has gone by where what I learned working in Clojure hasn’t been applicable and valuable.
Just one more anecdata point, but I would advocate learning a lisp—maybe even any lisp. In fact, it’s usually one of my first recommendations when juniors/mentees ask me for advice in any kind of broad strokes.
None of this has anything to do with academic computer science. I think I'm just unusually bad at memorizing syntax.
In order to get that tough question, you could carefully read the book, read his assignment solutions, and pay close attention to his caveat warnings in lectures... One of the three would typically contain the tough question.
Or, you could just read the book, do the assignments, and accept ~95% on the final exam as a maximum grade.
A ~95% cap, for a lot less work, is a good deal. Taking the deal is probably the right call for most people.
That being said, the thing I wish someone explained well is how amazing lambdas can be. I understood the benefits of list/array/stream processing on first glance, but it took a few years before I really understood the practical application of lambdas.
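To make that concrete, here is a small sketch (in Python rather than a Lisp, with example data of my own invention) of the kind of thing that tends to make lambdas click: passing behavior as data to a single higher-order function.

```python
def top_n(records, n, key):
    """Return the n records ranked highest by the given key function."""
    # The comparison strategy is a parameter, not hard-coded logic.
    return sorted(records, key=key, reverse=True)[:n]

servers = [
    {"name": "a", "cpu": 0.91, "mem": 0.40},
    {"name": "b", "cpu": 0.35, "mem": 0.88},
    {"name": "c", "cpu": 0.77, "mem": 0.52},
]

# The same function answers different questions; only the lambda changes.
busiest_cpu = top_n(servers, 1, key=lambda s: s["cpu"])
busiest_mem = top_n(servers, 1, key=lambda s: s["mem"])
```

Without first-class functions, each of those queries would need its own bespoke sorting routine.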
There is a "JavaScript Edition" of SICP. The preface says ECMAScript 2015 enabled a close match by adding lambda expressions, tail recursion, and block-scoped variables and constants.
It doesn't mention much downside except that JavaScript's "rigid syntactic structure complicated the implementation" of some programming language design book material.
You WILL get very interesting ideas and concepts, it's fun, basically, that's it.
+ You won't solve problems faster than your colleagues with solid competitive programming backgrounds;
+ you won't be able to optimize the code and cut 30% of your company's server cost,
+ you won't feel comfortable reading real-world complex project code,
+ you won't get some domain-specific knowledge to solve problems you couldn't previously solve.
+ .....
It won't make you a better problem solver; at least its impact is way smaller than picking up go/rust/cpp/java and carefully trying to implement challenges like: https://github.com/codecrafters-io/build-your-own-x
GIS is certainly interesting in its own ways, and it taught me some visualization and design skills, but I wish it would have been possible for me to learn more of what's going on under the hood of the massive libraries and high-level languages I use every day as a professional "software engineer".
So I decided to learn SICP on my own. I already have the book, so I'll watch the MIT lectures of the original authors available online, and try to make progress.
I'm not in any hurry to complete it anytime soon.
Lately I've been using Racket to prototype the interaction design for a game, and I had written some placeholder text. I had put fake interpolation in for {player-name} and things like that, and then I realized that #lang at-exp would let me just ... do that, as described in the original article.
I had to laugh, because I maintain a bunch of Python code, and if Python has batteries included, Racket comes with a retrofittable fusion battery pack.
One other little anecdote, is that I've been porting an old plt-scheme codebase to Racket CS. In the second phase I have started replacing an ancient C++ rendering layer with the racket gui canvas functionality. During the process I moved hundreds of runtime type checks in the C/C++ layer to the Racket contract system. It was as if the Racket team had somehow secretly obtained access to the codebase I was working on and had designed the perfect abstractions to solve the problem.
I have one kid using scratch now, and one older graduated to godot, roblox/lua, some javascript/web.
I would love to get them into lisp (already done a bit of Janet along with me), but it really needs to be something they can write their game ideas in, and share/show-off to friends.
From my own experience, it is absolutely possible to get good interaction with low latency in Racket, but of course as with any language you have to have enough experience to know how to avoid performance pitfalls.
For some of the gui work I've done I needed to mix threads and semaphores to get things to be really responsive when there could be a long running process triggered by a UI interaction, but in Racket it has been vastly easier and safer than in other languages.
Here is a library for making retro games: https://r-cade.io/
Ask on Discourse or Discord, for more game options.
With regard to prototyping GUI's I'd suggest taking a look at https://github.com/mfelleisen/7GUI. https://github.com/Bogdanp/racket-gui-easy could also be a good place to start.
With regard to Racket more generally, I'm probably not the best person to ask since I had a very high friction start where I just banged my head against the wall until things made sense.
For anyone who has worked through HTDP: is there any benefit to sticking with DrRacket vs using VSCode? The friction involved in moving around in DrRacket really dampens my enjoyment of the material in HTDP.
Note that you can change the default settings to be more Emacs-like.
How big is the average neophyte's desire to make themselves another language?
In most mainstream languages you have a large syntax, lots of idiosyncrasies, and limited ability to hack with the innards of your tool. Scheme is small, regular and freeing.
Now personally, after my first year of college, I asked why on earth we can't access the methods in an object (Java 4 at the time) to generate a UI to dispatch / interact with it. The teacher rapidly walked me out of the room while mumbling "but that would be metaprogramming!". I left confused about his annoyed tone.
Not until year 4 did we have the chance to see reflection/intercession, lisps, macros.. I wish someone had shown that to me when I was 12.
ps: it might not be obvious, and maybe I'm wrong, but I see ad-hoc DSLs everywhere I work. ORMs, build tools: they're all pseudolanguages, and people keep reinventing them. Scheme/lisp offers it on a silver platter for you.
If you deal with the same problem space a lot, having a Domain-Specific Language (DSL) can let you execute that vocabulary and work at the level of the terms that fit the problem space best.
This is a lot smaller in scope than designing a whole programming language, think of it like a unique vocabulary for solving specific problems and automating specific kinds of work you do!
https://chreke.com/little-languages.html
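As a toy sketch of the idea (entirely hypothetical, not taken from the linked article): even without macros, you can approximate a little language by interpreting a small vocabulary of domain terms.

```python
# A tiny "little language" for describing text-cleanup pipelines.
# Each word in a program names one operation from the problem domain.
OPS = {
    "strip": str.strip,
    "lower": str.lower,
    "dedupe_spaces": lambda s: " ".join(s.split()),
}

def run_pipeline(program, text):
    """Interpret a whitespace-separated program like 'strip lower'."""
    for word in program.split():
        text = OPS[word](text)
    return text

cleaned = run_pipeline("strip dedupe_spaces lower", "  Hello   WORLD  ")
```

In a Lisp, the same vocabulary could be given real syntax via macros instead of being driven through an interpreter function, which is the "silver platter" being described.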
I've not watched a relative neophyte try and actually -do- that so I can't comment on how far they've got with it, but making "creating DSLs" approachable seems like a really interesting thing to be aiming for.
The answer seems to be in the title: this of particular interest for those already interested in Lisp or Scheme.
For example being able to make good use of recursion, being able to write programs more elegantly.
Racket, in contrast to JS or TS, comes with lots of batteries included, therefore not requiring so many third party dependencies. Time and time again I see the Racket docs and think: "Ah, Racket has got something for that/got you covered." Also it is rich in programming language concepts. You can write your program in any of the languages that come with Racket, or you can write it with function contracts, or you can use Typed Racket.
At its core, it is a much cleaner and better designed language than TS will ever be due to TS' JS legacy.
CPAN made Perl the best choice for many tasks and the same can be said of NPM for choosing TypeScript now, regardless of the elegance of the language.
But I’m also a fan of “language tourism”. Even if one language is more practical, visiting other language cultures can make us better developers overall.
I studied Haskell for a bit which gave new ideas for how I use Perl. Culturally, Haskell was using a lot of functional programming concepts and recursion. Those were possible in Perl but not popular.
I’m sure there is value in studying Lisp languages for those who end working mostly in TypeScript or Rust
Racket (the language) itself is no longer a Scheme, strictly speaking, but still feels generally "schemey" in the broad scheme of lisps. However, Racket (the software package) does ship with a Scheme or two, namely R5RS and R6RS.
Rust because it’s the other language I see most often see coders learn to build their own apps. It has a good reputation for developer experience, security and speed.
I’ve heard enthusiasm for lisp-based languages for a couple decades and don’t doubt the merits of Racket.
If you’re picking just one language to learn, and limiting your learning to only that one language for a significant period of time, you’ll probably want to learn Rust. You won’t get all of the benefits of also learning a lisp, but you’ll get a lot of the ones which generalize well.
If you’re open to learning 1+N and otherwise lean towards TS, I would say you’re better off learning Racket (or any lisp really) in tandem than learning TS alone.
Familiarity with the core concepts and idioms of FP are broadly beneficial in any language, and lisps tend to be a good balance of those with low incidental mental overhead, and reasonable escape hatches to do imperative stuff where it makes sense. Racket is probably a particularly good candidate because its optional typing is another overlapping story between the two.
I haven’t written in any lisp for close to a decade now, but I still find that prior experience beneficial every day since. I sometimes miss the simplicity and flexibility of the parentheses. But I can take what I learned—how to reason about state, data flow, data-driven abstractions—anywhere. And those abstractions are particularly useful in a structural type system like TS.
Which one? Rust and lisps in general are quite different beasts...
> Racket excels as a programming language for young learners and beginners.
I honestly think Racket could do more to be beginner-friendly. The documentation is excellent, but difficult to understand as a newcomer. There are some great little tutorials that are easy to work through, but the ramp from there to writing your own Racket programs is steep imo.
I don't know of any other language with so many batteries included. Racket deserves to see community growth, and hopefully with that will come more resources for folks to get started
It's fine, and it's pretty good for Racket code (and its variants, like Typed Racket), but when I'm working on a real project I have other things I need to write too, like CSS, HTML, TypeScript, bash scripts, etc.
It needs more ecosystem to be more adoptable, IMO.
I love it for learning though, but turning Racket into a production-level language with a proper ecosystem has a way to go.
I think they should fix this quite obvious bug before writing more articles.
I hope I don't get kicked out of HN altogether for this comment :)
Luckily all of them can coexist, so we do not need to choose.
My relative use of Lisp languages: Common Lisp 60%, Racket 20%, Haskell 5%, Clojure 5%, and various other Schemes 10%. Unfortunately since most of my work in the last 8 years has been deep learning, LLMs, LLM chains, etc., I spend a little over half my time now using Python. So it goes...
I don't think it's a particularly unique or interesting quality, that some old code still runs. After all, I can go to archive.org right now and run all of that ancient DOS, Amiga, whatever code in a 100% exact (or close to) emulator in my browser.
It looks like that wasn't the case even in 1996:
https://groups.google.com/g/comp.lang.lisp/c/O1dDXlQsVkw
But if you mean "compile a new function called X and replace the old X at runtime", that's easy in Common Lisp. It's not commonly done unless you're explicitly writing some kind of compiler.
What is commonly done is to create a lexical closure at compile time and change its bound values at runtime. IOW changing the private data of a function at runtime is more generally useful than changing its instructions.
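The same pattern exists outside Lisp. Here is a hedged Python sketch (names are mine) of "changing the private data of a function at runtime" rather than its instructions:

```python
def make_counter(step):
    # 'step' and 'count' are the closure's private, mutable data.
    count = 0

    def counter():
        nonlocal count
        count += step
        return count

    def set_step(new_step):
        # Change the function's behavior by rebinding its data,
        # not by recompiling its instructions.
        nonlocal step
        step = new_step

    return counter, set_step

tick, set_step = make_counter(1)
a = tick()      # 1
b = tick()      # 2
set_step(10)    # mutate the closed-over state at runtime
c = tick()      # 12
```

The compiled body of `counter` never changes; only the environment it closes over does, which is the distinction being drawn above.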
What's most common is to write lisp programs that emit lisp source code and compile it at compile time (but usually not run time). Such programs are called macros.
In SBCL, any evaluation of an expression is done by first compiling it. Compiled functions that are no longer accessible (including not having any frames on the stack) are garbage collected.
The really interesting question is not whether users can mutate existing compiled code, but whether it's useful for the Common Lisp implementation to do so. This is because Common Lisp is a dynamic language, where generic functions and classes can be redefined on the fly. If you want to implement such things efficiently, it would be useful to be able to change existing compiled functions to reflect that (for example) this slot of this object is at this offset rather than that offset.
A scheme has been proposed to do that, which puts such code off to the side in little chunks accessed with unconditional branches. When a redefinition occurs the branch is redirected to a newly created chunk; the old one is GCed when no longer referenced from the stack. You have to pay for the unconditional branches, but those are typically quite fast.
You compile code, which is text (data), all the time, don’t you?
These days, a page of memory can be set to Read, Write, or Execute.
The exploit mitigation you refer to is having the program typically set pages of memory to never have both write and execute set at the same time.
However, this is ultimately controlled by the program. On Linux, the program can invoke the syscall 'mprotect' to change the permissions on a page (though a program can also voluntarily use seccomp to forego ever invoking it again).
And this is basically what browsers do. They compile the code into memory that has been set to 'write' (but not execute) and proceed to then set it to execute (but not write).
The effectiveness of this mitigation is reduced by the existence of ROP techniques, which is why Intel started introducing Control-flow Enforcement Technology (CET), which is intended to ensure you can only branch to certain locations in memory.
Unfortunately, running on the JVM actually really means running on Java; same for .NET, where it means running on C#.
I don't mind learning two languages; it's expected of most developers to know more than one. But context switching between two languages in a single function is not fun.
However, F# suffers from a bigger issue that Clojure doesn't: belonging to the platform owner, which behaves as if it was a mistake to make it part of VS 2010.
And I don't think F# failed; I just think it needs to find a way to hide C# and OOP better.
But the language is doing fairly well: tons of educational resources, tons of videos on YouTube, several nice projects on GitHub, a solid VS Code mode, and it is part of .NET. Still, a lot can be improved, and Microsoft is far from abandoning it. C# is one of MS's flagship products; F# is not a flagship product.
And I also think that Don Syme is a lot more active working for F# and promoting it, than Rich Hickey is currently working for Clojure
> microsoft is far from abandoning
Might as well. When you look at the resources actually being deployed for F#, it's clear that MSFT either doesn't care or doesn't really have a plan for F#. Most of the work is done by the community. The number of actual paid/full-time MSFT devs on F# is very limited. The tooling is extremely limited, and there are no official F# libraries for pretty much any MSFT and Azure services (you have to rely on the C# libs). On leetcode, where they have C# (so they already have the infra to run .NET stuff) and languages like Elixir and Racket (so they do have more niche languages), they still don't have F#.
The salvation for F# would come from finding a killer app, something akin to pytorch/numpy or rails.
In my opinion, and I mean this in the broadest way possible, the average C# developer is probably less motivated to learn a new language than the average Java developer. That's because C# developers are prescribed by Microsoft what to do, and Microsoft could deliver a productive-enough experience with C# + .NET Framework + Visual Studio for a long long time.
Java's fractured IDE story and runtime (OpenJDK vs. HotSpot) I think lead to a higher concentration of programming language people looking at the JVM as a viable target to build a better experience on top of, so we got Groovy, Scala, Kotlin, (anyone remember Xtend?), IDEA, Eclipse, etc...
The bazaar produced and embraced a functional programming language whereas the cathedral treats F# as thunderdome for new C# features.
F# can do OOP just as well if not better than C#, and F# is anywhere from 2-10 times more concise than C#.
That being said, the mistake of Microsoft was viewing F# as a competitor to and research lab for C# and the community for viewing and selling it as a replacement for C#.
What Microsoft and the community should have done is treated F# as a competitor and replacement for Python from the start.
Also see the development effort Microsoft spends on Python across all their IDEs and what F# gets.
Every language lives under the iron fist of libc and the C ABI.
I know F# quite well at this point but only barely know C#. I know C# maybe only mildly better than C++ or Java in terms of being able to read it and guess what it's doing. I certainly can't write in it without looking up a bunch of stuff.
It’s too bad, because the language itself is really terrific.
Compared to other languages, I think the stability of the language has served the user community really well and been an enabler of the ecosystem. I guess the continuous language additions and resulting complexity and library/framework churn in some other languages can also be seen as vitality and growth, but for many of us it's the wrong kind of growth.
(And yes there are also disadvantages to the centralised development model of the core language)
edit: there's a good summary of the discourse surrounding this post in the reddit comments at the time: https://www.reddit.com/r/Clojure/comments/a0pjq9/rich_hickey...
The usual "expect all for free, give nothing back" attitude.
Stability is extremely valuable. A lack of change to the core language over extended periods of time can be a very good thing, especially if certain changes would break existing code. Rich has made it clear that he is indeed targeting this kind of stability.
Again, I would be very interested to hear what specific changes you think need to be made to 'Clojure "the language"'.
So... you meant 'ossified' in a good way?? If so, I guess I would have used a different term that didn't have such a negative connotation.
You can write a Racket application, even with a GUI and whatever lib you need, and then cross-build it as a native executable (that is, an executable that embeds the interpreter, the necessary libs, and your application's code) for your own and other platforms for easy distribution.
See https://docs.racket-lang.org/raco-cross/index.html and https://docs.racket-lang.org/raco/exe.html for more information.
But the author states that Racket is "the finest example of a modern day lisp bar none". In the modern day, energy efficiency and climate change are major issues; the German government, for example, has set up the Blue Angel Ecolabel program: <https://www.blauer-engel.de/en>. Using "modern day" to mean VSCode and Discord is a worthless use of the phrase.
Racket ranks far ahead of languages like TypeScript, Python, and Erlang in terms of energy usage, but trails languages like C, Rust, Ada, Java and Common Lisp.
I think, however, there are probably other factors in your computing solution that have a bigger energy impact, like what processors you run on, the architecture of your program, etc.
You can confirm it via archive.org too: the TypeScript[2] page both shows fewer implementations, and the implementations that are shown are often slower than those on the JavaScript[3] page.
As for whether that makes sense: IMO using the benchmark game for judging how good a language is at optimizations is flawed in the first place, as not only is there a bias towards the more popular languages, but a lot of the top entries also use approaches that in practice you wouldn't find in real projects.
[0] https://greenlab.di.uminho.pt/wp-content/uploads/2017/10/sle...
[1] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
[2] http://web.archive.org/web/20170425064751/https://benchmarks...
[3] http://web.archive.org/web/20170425114504/https://benchmarks...
Didn't submit it though because they make it difficult.
https://news.ycombinator.com/item?id=24826453
(the whole thread is worth a read)
https://sites.google.com/view/energy-efficiency-languages
I don't know what you mean with your last sentence. That's the same paper...
Even so, that only messes up the results because the mean is used rather than the median. The data tables published with that 2017 paper show a 15x difference between the measured times of the selected JS and TS fannkuch-redux programs, a single outlier.
That single outlier is enough to distort the TS and JS "mean" time difference.
https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
https://news.ycombinator.com/item?id=36524073
[0] http://web.archive.org/web/20170715120019/http://benchmarksg...
Like you, most readers haven't seen the paper.
Like you, most readers have only seen "Table 4. Normalized global results for Energy, Time, and Memory" taken out-of-context (often without any way to find the original source).
Most readers notice the too-large differences C/C++ and JS/TS and start speculating about how those differences were caused (because that's fun).
I went back to the time measurements the authors provided and calculated the Mean, Geometric Mean and Median. Simply using a more appropriate summary statistic would have presented average values which readers would have found acceptable:
…even though they were based on different programs and included an outlier. (Similar story with C/C++.) Simple: start with the same data, make the same calculations, see the same results.
When we make assumptions about how measurements were made and analyzed, our assumptions may be wrong.
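The statistical point is easy to demonstrate. A hypothetical sketch (these numbers are invented for illustration, not taken from the paper): with one 15x outlier among otherwise-similar time ratios, the arithmetic mean is dragged far from the typical value while the geometric mean and median barely move.

```python
import statistics

# Hypothetical per-benchmark time ratios, with one 15x outlier.
ratios = [1.1, 0.9, 1.0, 1.2, 0.8, 15.0]

arith = statistics.mean(ratios)            # ~3.33, dominated by the outlier
geo = statistics.geometric_mean(ratios)    # ~1.56, much less sensitive
med = statistics.median(ratios)            # 1.05, ignores the outlier entirely
```

This is why summarizing benchmark ratios with the arithmetic mean can suggest a large language-level difference that most of the individual measurements do not support.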
The list places JavaScript at a score of 4.45 and TypeScript worse at 21.5. For reference, C, the most energy efficient, is 1.0.
The point is that TypeScript is JavaScript with typing syntax added on top. There is a transpile step into JS. That's how TypeScript works. The runtimes are exactly the same. Unless we're also measuring the build step? Which seems silly.
The difference should be near zero. And it's not. They clearly do not understand exactly what they are measuring.
I think the takeaway, however is the theme of a spectrum of energy efficiency, with compiled languages being more efficient than those that are not.
But I still maintain that the overall efficiency of a system is more a function of other factors.
But the argument that TypeScript generates JavaScript, so it must have similar speed, doesn't hold in general.
If the compiler in question is a whole-program compiler, it can make optimizations that a normal person couldn't do.
As an anecdote, in [1] a raytracing program was implemented in both Scheme and C. The Stalin compiler (a whole-program Scheme compiler) produced an executable 45% faster than the one produced by g++. Stalin produced the executable by compiling Scheme to C, then using a C compiler to produce the final binary.
The price of a whole-program compiler? Well, the compile times are huge.
[1] Scroll to Siskind's comment. https://groups.google.com/g/comp.lang.scheme/c/NJxRsdMNKz4
For those curious about the actual programs and results: https://web.archive.org/web/20071011073406/http://www.ffcons...
Programs that didn't type check were excluded.
https://github.com/greensoftwarelab/Energy-Languages/issues/...
Using the Geometric Mean or the Median with the time measurements from that table would have highlighted the middle value, like this:
https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
However, Racket is also a modern language in that it has many new and fancy research-grade programming language features, since it is also a programming language theory research platform. From that point of view, "modern day" is a valid description :).
https://docs.racket-lang.org/reference/implementations.html
"The CS implementation is the default implementation as of Racket version 8.0. This variant is called “CS” because it uses Chez Scheme as its core compiler and runtime system.
The CS implementation typically provides the best performance for Racket programs. Compiled Racket CS code in a ".zo" file normally contains machine code that is specific to an operating system and architecture."
As an anecdote, I used to host a computationally expensive web app written in Racket (even before the Chez version) serving a large number of requests at peak times on a single $10 DigitalOcean droplet. It was pretty fast and didn't crash. No need for awkward modern web-scale thingies.
λ most of the way down.
> when they switched focus to the Rhombus project, which killed a lot of interest and personal momentum I had going for Racket
The language has been "complete" for a long time. If you're actively trying to get something done, I think it's unlikely that some core language work is going to stop you.
It isn't clear what domain the parent wanted to work in; I won't say it's impossible they had problems but ... details are everything at this point.
The academic part of Racket gave rise to `syntax-parse`, a truly fabulous improvement over the standard Lisp/Scheme/whatever you used to work with way to write macros.
> The academic part of Racket gave rise to `syntax-parse`, a truly fabulous improvement over the standard Lisp/Scheme/whatever you used to work with way to write macros.
That's not the academic part I am referring to. My comment is about who the maintainers and users are and how their profession or studies affect how Racket is interacted with. (This is not a complaint or judgement. It's an observation.)
> The language has been "complete" for a long time.
There are a fair number of bugs in Racket's libraries, including core ones. They aren't discovered because academic users are not touching bits that more industrial-oriented users would find. In particular, I am thinking of the GUI toolkit, and in my experience, only one person is capable of fixing those.
It's not. There are several nice and unique things about Racket, but it's a pretty poor example of a lisp, to the extent that if you want to understand what's good about lisp you'd probably be better off learning emacs lisp than racket.
Racket is, in my probably slightly biased opinion, a vehicle for doing firmly within-the-box CS research and teaching. It comes with a lot of batteries included and there are some cool and innovative things about it (it has sophisticated support for syntactic abstractions with acceptable error reporting, contracts, non-textual data, etc.).
It's run by people with very respectable academic output, but to the best of my knowledge none of them are particularly appreciative of interactive and malleable computing (even Python is much better here), or ever wrote any code that's really pushing the state of the art. I'd also say that the Racket IDE (despite also having some nice features) is ugly, clunky, sluggish and not something that I can see appealing to very good programmers.
I'm also not aware of any impressive industry project done in Racket (although, in fairness I should add that Carmack has said nice things about racket).
The people you mention in your first paragraph likely all belong to this category, incidentally.
Interactivity is not required for solving cutting-edge problems, and certainly not by people of that caliber.
Lisp programming, as such, does not require very smart people; languages in the Lisp family provide a user-friendly experience suitable for the average person who has some aptitude and curiosity for programming.
You need very smart people to solve cutting-edge problems in fragile low-level languages, in which a mistake will destabilize the machine. Those people have to be experts in the problem domain, and in coding (or a team which includes such).
Interactivity actually makes things easier for the bumblers who massage code into working and believe that trying functions at the REPL is a viable test plan.
+ almost everything related with LISP is fun
+ but they have VERY limited use in industry, for very good reason;
+ you should try to spend at least the same amount of time (as on the LISP topic) learning something like leetcode (or an even higher level of competitive programming, which improves your skill at converting ideas to code quickly and your basic mathematical thinking; useful in interviews) and job-oriented CS books. You'll be a better problem-solver and more ready for the job market.
+ If you're very sure you will be financially OK, just learn anything you like and ignore this message.
(https://github.com/azzamsa/awesome-lisp-companies/
https://lisp-lang.org/success/
http://www.lispworks.com/success-stories/index.html
such as
https://www.cs.utexas.edu/users/moore/acl2/ (theorem prover used by big corp©)
https://allegrograph.com/press_room/barefoot-networks-uses-f... (Intel programmable chip)
quantum compilers https://news.ycombinator.com/item?id=32741928
etc, etc, etc)
Personally I've spent many hours on LISP related topic; I enjoyed the time; I recommend it as a low priority interesting topic.
Things like developing a working distributed KV store (MIT 6.824), or putting a solid 1000+ hours into competitive programming: those skills and that knowledge are not only beautiful and interesting, but also make a person a better practical problem solver.
And what would these very good reasons be? inb4 "only for single programmers", brackets are difficult for normal programmers, macros can't be understood by mortals, and other non-issues that have been discussed to death.
I learned LISP in college, way back when expert systems were the future of AI.
The worst thing that I could probably do professionally is to write something in LISP that needs to be adopted and maintained by somebody else in the future.
Thoughts or advice?
[1] https://www.nongnu.org/geiser/
[1]: https://www.racket-mode.com/
1. The use of custodians on a per-(green-)thread basis that brings "oh yes, if the thread crashes or is shutdown, racket guarantees that all open files/sockets/resources will be shut down". For example you can use this to ensure that a timed out client doesn't cause unnecessary resource usage. https://docs.racket-lang.org/more/index.html#%28part._.Termi...
2. Events and composable concurrency from Concurrent ML - Unlike Go's channels, concurrent ML events are composable up the stack, and have some nice things built in like "NACK events" (allowing cancellation down the stack). Unfortunately there aren't great primers about using this that I know of apart from the Kill Safe Synchronization Abstractions paper. https://users.cs.utah.edu/plt/kill-safe/
3. Sandboxing - https://docs.racket-lang.org/reference/Sandboxed_Evaluation....
Some of these were added to make it easier for the IDE to manage errant programs written by beginners, but they also form a very "applicable to production" set of tools. Racket internalizes extra-linguistic mechanisms: https://felleisen.org/matthias/manifesto/sec_intern.html
Also the fact that various DSLs can inter-op with each other directly, so that you can use something like a binary parser description, as if it was just another Racket library. For example this description of the format https://github.com/Bogdanp/racket-binfmt/blob/master/binfmt-..., is directly included in another file as a regular library https://github.com/Bogdanp/racket-binfmt/blob/master/binfmt-...
I agree that startup time is not a winning aspect of Racket. My naive understanding is that this is because Racket is not a direct bytecode interpreter like CPython but actually has to run a compile step to native code, and doing that for a program plus its required libraries necessarily takes at least a couple of hundred milliseconds before anything can start running, while CPython can pretty much start executing from the get-go.