| _______ __ _______
| | |.---.-..----.| |--..-----..----. | | |.-----..--.--.--..-----.
| || _ || __|| < | -__|| _| | || -__|| | | ||__ --|
|___|___||___._||____||__|__||_____||__| |__|____||_____||________||_____|
on Gopher (unofficial)                                                      |
| Visit Hacker News on the Web |
|
COMMENT PAGE FOR: |
| SICP: The only computer science book worth reading twice? (2010) |
|
ingen0s wrote 2 hours 25 min ago:
Perfect book, and if you don't have a copy - get it!
jll29 wrote 3 hours 48 min ago:
SICP is the best first book to read when studying computer science.
After many years of hobbyist programming (and consuming 'structured
programming' books as well as languages from Pascal to Common LISP) we
used Abelson & Sussman at my undergraduate comp. sci. course, and it
was eye-opening.
It demonstrates the simplicity, beauty and interactivity of Scheme
while teaching you that computer science is the layering of different
kinds of abstractions (from procedural abstraction and data
abstraction, over defining your own (domain specific) language and
implementing a compiler for it to defining new hardware in software).
All of it seems so effortless, the way only true masters can make things
look.
Make sure you buy the second edition, not the first or more recent
ones, however (which use Python instead of Scheme - ugh).
anonzzzies wrote 5 hours 59 min ago:
Next to SICP, I like the entire "The Little *" series as reading-twice
(or more) material. And Types and Programming Languages. For applicable
(in what I do, anyway) CS. But not only reading; implementing as
well - I need to repeat these things, otherwise I forget parts.
I myself - but probably because I knew and respected the guy - reread the
works of Dijkstra every so often; books + papers. Not really applicable
anymore, but good for the brain, and he was a good writer (imho).
rurban wrote 6 hours 1 min ago:
Many classics are worth reading twice. Knuth, Tanenbaum, Stephens,
PAIP, ...
soup10 wrote 8 hours 55 min ago:
I picked up SICP expecting to read something really interesting or
profound, given the way it's been hyped up over the years, but it's
more of a how-to manual for working with Scheme/LISP, and frankly that
didn't interest me. Unfortunately, most people have come to accept that
LISP isn't a particularly effective way of programming. Even if some
people get really excited by the idea of mutable and interchangeable
data and code, it's just not as powerful as they make it out to be, and
the obfuscation of program flow and execution, plus the lack of
separation/delineation between data and code, proves to be a hindrance
more often than it is helpful. This doesn't discount LISP's historical
contribution to computer science and how it's influenced modern
language design over the years; in my opinion LISP/Scheme is just
more of a historical curiosity than a modern-day guide to effective
programming. (And certainly one that has no place as the introductory
class at MIT.) Anyway, I've said something negative about SICP, so
prepare for this to be downvoted to the bottom :)
nomilk wrote 9 hours 19 min ago:
I considered reading SICP recently but this changed my mind:
> It's old and feels old. Originally in Scheme, they recently
re-released the book in JavaScript, which is more approachable to today's
audiences and there are still good things in there about encapsulation
and building dsls. ymmv. Though the language and programming design
concepts hold up, we're playing at higher levels of abstraction on more
powerful machines and consequently the examples sometimes seem too tiny
and simple.
I had studied economics in a similar way, but learning slightly
old/outdated ideas demotivated me - I was much more interested in
learning what works and what's considered the best way to do things,
not what had been considered a good idea at some point in the past.
I don't want to be a downer on SICP (especially since I haven't even
read it), but I hope this info might help others (or elicit a strong
refutation).
cess11 wrote 7 hours 56 min ago:
Sure, SICP is not a good book for people wanting to do rote learning,
imitation, 'best practice' while ignoring the history.
It's for people that would like to learn rather advanced programming
techniques and foundational ideas in computer science.
crystal_revenge wrote 9 hours 6 min ago:
Scheme, as basically an implementation of the untyped lambda calculus,
will eternally be a good framework for thinking about the problems of
computation.
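That framing is concrete enough to execute. Here is a loose sketch of Church numerals - numbers represented purely as functions, straight from the untyped lambda calculus - written in Python for accessibility; the names `zero`, `succ`, and `add` follow convention and are not taken from SICP itself.

```python
# Church numerals: numbers as pure functions, as the untyped lambda
# calculus (and hence Scheme) allows. Names follow convention.
zero = lambda f: lambda x: x                     # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))  # apply f once more
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Count how many times the numeral applies its function.
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # 4
```

Everything here - numbers, arithmetic, even the conversion back to an int - is just function application, which is the sense in which Scheme sits directly on top of the lambda calculus.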
On the more practical side, Racket (the most modern Scheme) has
basically any practical functionality you would want, while amazingly
remaining a platform for an incredible amount of experimentation in
computation and programming language theory.
But SICP is a book for people interested in the study of
computation and what programming languages can be. If you're worried
about getting a job in software it won't be all that useful, but it
will remain a classic for anyone interested in creating the future of
software.
virtuallynathan wrote 9 hours 42 min ago:
Always fun to see one of my professors from the quite tiny, but awesome
computer science department at St Andrews on HN!
freethejazz wrote 11 hours 39 min ago:
I haven't seen it in the comments yet, but you can watch Abelson and
Sussman teaching the material from this book from recorded lectures in
1986.
I still find their description of how to create and group abstractions
in various layers to be useful personally and as a mentor. (In the
videos, lesson 3A, 1:07:55)
|
| [1]: https://m.youtube.com/playlist?list=PLE18841CABEA24090 |
|
selimthegrim wrote 11 hours 32 min ago:
The Kabbalah joke gets me every time.
Iwan-Zotow wrote 11 hours 47 min ago:
TAOCP
Jiahang wrote 12 hours 50 min ago:
and CSAPP i think
Avid_F wrote 12 hours 53 min ago:
That and the art of computer programming
upghost wrote 13 hours 24 min ago:
Hot take: SICP and SD4F "considered harmful (without counterpoint)"*.
Why? The modus operandi of problem solving in these books is object
oriented programming masquerading as functional programming, and it is
presented as a _neutral_ beginner book. It is _not neutral_. This is
a very opinionated approach to programming.
To be fair, I do not believe the authors intended for this style of
programming to be taken as gospel, but it is often presented _without
counterpoint_.
The most powerful technique introduced -- implementing complex behavior
via extensible polymorphic generics -- is virtually unmaintainable
without a compiler-supported static type checker. You would know that
if you ever tried to implement the code yourself in a dynamic language
of your choice.
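For the record, the style being criticized is data-directed dispatch: operations looked up in a table keyed by (operation, type tag). A minimal illustrative sketch - the table, tags, and handlers are mine, not the book's code - shows why it worries me in a dynamic language: a missing or misspelled handler only surfaces at runtime.

```python
# SICP-style data-directed dispatch, minimally sketched in Python.
DISPATCH = {}

def install(op, type_tag, fn):
    DISPATCH[(op, type_tag)] = fn

def apply_generic(op, obj):
    type_tag, value = obj
    fn = DISPATCH.get((op, type_tag))
    if fn is None:
        # Nothing but runtime discipline catches this -- no static
        # checker knows which (op, tag) pairs were ever installed.
        raise TypeError(f"no handler for {op} on {type_tag}")
    return fn(value)

install("area", "circle", lambda r: 3.14159 * r * r)
install("area", "square", lambda s: s * s)

print(apply_generic("area", ("square", 4)))  # 16
```

Calling `apply_generic("area", ("triangle", 3))` fails only when that line executes, which is the maintainability gap described above.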
The ramifications of these choices can be felt far and wide and are
largely unquestioned.
Ironically, they make code hard to understand, hard to extend, and hard
to maintain. I need to reiterate, I do not believe the intention of the
authors was to suggest these ideas should be used beyond a pedagogical
setting, but they often are.
As a specific critique of SD4F, which states as a goal making code more
resilient by emulating biology, I would point to Leslie Lamport's talk
on logic vs biology[1].
I would add that I think SICP would be fine if it were taught in tandem
with Paradigms of Artificial Intelligence Programming by Peter
Norvig[2]. PAIP offers a completely different approach to solving
problems, also using lisp. This approach is much closer to constructing
a language to model a problem and then solving the problem symbolically
using the language created. Areas that use OO techniques, such as the
chapter on CLOS, are clearly marked as such.
In other words, I say "SICP considered harmful" because thrusting it
upon an eager newcomer as a trusted neutral guide to beginner coding
(without offering any counterpoint) could set them back by a decade,
filling their head with "functional object oriented programming"
concepts that don't translate well to industry or CS.
[*]: I say this as someone who has thoroughly studied both books,
implemented the code, taken Dave Beazley courses to have the
information spoon fed to me (dabeaz is awesome btw, take all his stuff)
and used the techniques in production code bases.
[1]
|
| [1]: https://lamport.azurewebsites.net/pubs/future-of-computing.pdf |
| [2]: https://github.com/norvig/paip-lisp |
|
rednafi wrote 14 hours 9 min ago:
This is great, but it's not what I get paid for. I've yet to work
at a place where I thought, "If only I had read SICP, things would be
easier."
I work with distributed systems, writing business logic and dealing
with infrastructure concerns. For me, learning about databases, quirks
of distributed systems, and patterns for building fault-tolerant
services is more important than reading the nth book on structuring
programs, deciding which algorithm to use, or figuring out whether my
algorithm has O(1) or O(n) complexity.
This doesn't mean CS fundamentals aren't important (they are), but
I work in a different space. I'd get more value out of reading
Designing Data-Intensive Applications than SICP. If I were in the
business of building frameworks or databases, I'd probably be the
target audience.
globular-toast wrote 15 hours 7 min ago:
My favourite part of SICP and something that has stuck with me for
years is the idea of "wishful programming". That is where you build
something top-down by simply wishing you had the lower-level routines.
Then, of course, you actually go and build those lower-level routines
until you reach the bottom. I find this way of thinking works really
well with test-driven development. Write a test against functionality
you wish you had, then go and fulfill that wish. Most developers seem
to build stuff bottom-up and then end up with something that isn't
really what anyone wished for.
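A tiny illustration of the idea - the report domain and names here are invented - is to write the top-level routine against helpers you wish existed, then grant each wish:

```python
# "Wishful programming": write the top level as if the helpers already
# existed, then fulfil each wish working downward.

def monthly_report(orders):
    # We simply wish these two routines existed...
    valid = discard_cancelled(orders)
    return total_revenue(valid)

# ...and then grant the wishes:
def discard_cancelled(orders):
    return [o for o in orders if not o.get("cancelled", False)]

def total_revenue(orders):
    return sum(o["amount"] for o in orders)

orders = [{"amount": 10}, {"amount": 5, "cancelled": True}, {"amount": 7}]
print(monthly_report(orders))  # 17
```

The TDD pairing falls out naturally: a test against `monthly_report` is a test against the wish, written before the helpers exist.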
wruza wrote 6 hours 1 min ago:
They do that because their wish is performance and naturalness.
You may accidentally wish for something whose true nature you don't
yet know, and this will create a fragile mess at the bottom. It
usually does, because the algorithmic nature of things is rarely
intuitive. Starting from the bottom is like starting from the quarks
you actually have, rather than from "I want magic to exist". Well, it
does not. You reach the bottom and there are quarks instead of
magicules, and you've lost all the context clues along the way that
could help convert between the two physics.
Both approaches have their uses, because sometimes you have to be bold
with your wishes to solve a deep problem. But personally I prefer
magic to be packed into the layer just below the topmost one. I.e.
build from the bottom up, and then, just before the business logic,
create a convenience magic layer that translates to/from business
speak. It becomes adjustable and doesn't induce a tangled mess all the
way down.
globular-toast wrote 3 hours 9 min ago:
So I think one of the things best avoided in life in general is
extremity, in all its various guises. No single technique should be
followed like scripture, but rather incorporated into one's toolkit
and used where appropriate. Building top-down will get you where
you want, but risks fragile underpinnings due to a lack of
cross-cutting architectural guidance. But, on the other hand,
bottom-up might get you the best foundations at each layer but
ultimately deliver nothing of value to the users. In practice it's
necessary to take a balanced approach and, of course, mistakes will
be made and experience will become the guide. Like you I definitely
employ the "meet in the middle" approach in practice, but what SICP
taught me is how to think about starting at the top, that is, to
build upon wishes.
WillAdams wrote 8 hours 24 min ago:
Interestingly, Dr. Donald Knuth used pretty much that approach when
writing TeX --- he started by writing out the sort of
formatting/tagging that seemed appropriate, then theorized about
the sort of programming that would be appropriate for markup (hence
macros), then worked on the implementation.
I've been trying a similar thing for my own effort to create a
library for modeling G-code in OpenSCAD --- hopefully with the recent
re-write in "pure" OpenPythonSCAD it will become something usable.
Qem wrote 13 hours 6 min ago:
The Smalltalk world has great support for this, through coding in the
debugger. You should try Pharo.
jnordwick wrote 15 hours 47 min ago:
I took CS 61A at Berkeley as my very first computer science class. I
couldn't program - I had never tried to - so Scheme was my first
language.
My TA told me that everybody should take the class twice: when you
first come in, and when you're graduating.
When you first take it, especially if you know other languages like C
at the time, you don't get the full depth of the problems. You're
given a great introduction and you think you understand everything,
but you don't realize the depth of complexity: message passing, the
metacircular evaluator, continuations as the basis of all flow
control, etc.
You think they are neat tricks, and that you understand the curriculum
because you can do the homework, but you don't understand how those
neat tricks are really the basis of everything else you'll do.
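For anyone who hasn't seen it, the message-passing trick is roughly SICP's make-account: an "object" is just a closure that routes messages to procedures. A loose Python transliteration (not the book's exact code):

```python
# Message passing a la SICP's make-account: the object is a procedure
# that dispatches on the message it receives.

def make_account(balance):
    def withdraw(amount):
        nonlocal balance
        if amount > balance:
            return "insufficient funds"
        balance -= amount
        return balance

    def deposit(amount):
        nonlocal balance
        balance += amount
        return balance

    def dispatch(message):
        # Local state lives in the closure; behavior is chosen by message.
        if message == "withdraw":
            return withdraw
        if message == "deposit":
            return deposit
        raise ValueError(f"unknown request: {message}")

    return dispatch

acc = make_account(100)
print(acc("withdraw")(30))  # 70
print(acc("deposit")(10))   # 80
```

It looks like a toy until you notice it is the whole of object-oriented programming built from nothing but closures.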
When you're graduating, you've had time to go through all your
classes, and you realize just how foundational those principles are;
you get so much more out of the book.
Well, I didn't take the class a second time; instead I helped grade
and TA'd it for a couple of semesters.
I work as a quant developer and in trading now and even though my field
has nothing to do with that I still think it's the basis of me as a
developer.
golly_ned wrote 15 hours 34 min ago:
My experience too. For much of the rest of the CS curriculum, I felt
like we had already, to some extent, covered the main ideas in 61A
with SICP.
anon115 wrote 15 hours 49 min ago:
it's whatever
debo_ wrote 16 hours 35 min ago:
SICP helped me understand early on that there were many models of
programming, even though I'd learned a limited number in my
undergraduate. It was one of the books that helped me feel equipped to
read the docs of any language, library or framework and have some
notion of how to orient myself.
WillAdams wrote 8 hours 31 min ago:
One of the best programming classes I had in college was a
comparative languages course where multiple languages were covered,
each in two week or so blocks.
MikeTaylor wrote 16 hours 45 min ago:
Just dropping in to say that The Elements of Programming Style is worth
reading three times - and I have read it many more times than that,
and benefitted from it. Here's my review (from 2010) if you're
interested:
|
| [1]: https://reprog.wordpress.com/2010/03/06/programming-books-part... |
|
jacoblambda wrote 14 hours 22 min ago:
Oh and here I thought you were talking about Elements of Programming
by Stepanov and McJones which tbh I'd give the same
recommendation/review.
|
| [1]: https://elementsofprogramming.com/ |
|
taeric wrote 16 hours 53 min ago:
Curious to hear folks' opinions on the newer Software Design for
Flexibility: How to Avoid Programming Yourself into a Corner ( [1] )?
|
| [1]: https://www.amazon.com/gp/aw/d/0262045494 |
|
golly_ned wrote 15 hours 36 min ago:
It's a much, much denser successor to SICP. I hadn't succeeded in
self-studying with it despite strong Lisp/Scheme chops and a strong
affinity for SICP.
taeric wrote 5 hours 38 min ago:
I have a copy. Found it fun, but not quite as mind shifting. I
think I need to try it again, but I am curious how others feel.
__turbobrew__ wrote 17 hours 25 min ago:
It's interesting: SICP and many other "classic" texts talk
about designing programs, but these days I think the much more
important skill is designing systems.
I don't know if distributed systems is considered part of "Computer
Science", but it is a much more common problem that I see needing to
be solved.
I try to write systems in the simplest way possible and then use
observability tools to figure out where the design is deficient, and
then maybe I will pull out a data structure or some other "computer
sciency" thing to solve that problem. It turns out that big O
notation and runtime complexity don't matter the majority of the
time, and you can solve most problems with arrays and fast CPUs. And
even when you have runtime problems, you should profile the program to
find the hot spots.
What computer science doesn't teach you is how memory caching works
in CPUs. Your fancy graph algorithm may have good runtime complexity,
but it completely hoses the CPU cache, and you may have been able to
go faster with an array with good cache usage.
The much more common problems I have is how to deal with fault
tolerance, correctness in distributed locks and queues, and system
scalability.
Maybe I am just biased because I have a computer/electrical engineering
background.
pjmlp wrote 40 min ago:
> I don't know if distributed systems is considered part of
"Computer Science"
It surely was part of my Informatics Engineering degree, with
Tanenbaum's book being one of the required reads.
nh2 wrote 1 hour 21 min ago:
> big O notation and runtime complexity doesn't matter the majority
of the time and you can solve most problems with arrays
I have the exact opposite experience.
Software comes out best if you always ensure to use an approach with
sensible runtime complexity, and only make trade-offs towards
cache-friendly-worse-O implementations where you benchmarked
thoroughly.
Most cases where I encounter mega slow programs are because somebody
put in something quadratic instead of using a simple, standard
O(n log n) solution.
Check out [1] for many examples.
|
| [1]: https://www.tumblr.com/accidentallyquadratic |
|
olpquest22 wrote 3 hours 23 min ago:
Since you emphasize designing systems over just programs, do you have
any go-to resources or references ?
cowsandmilk wrote 3 hours 34 min ago:
> What computer science doesnât teach you is how memory caching
works in CPUs.
Yes it can, and there are tons of papers about data structures to use
in various scenarios to handle not just L1, L2, L3, but also NUMA.
Sure, this isn't in SICP, but claiming that CS as a field completely
ignores how memory works is incorrect.
0xDEAFBEAD wrote 5 hours 40 min ago:
>What computer science doesn't teach you is how memory caching
works in CPUs. Your fancy graph algorithm may have good runtime
complexity but it completely hoses the CPU cache and you may have
been able to go faster with an array with good cache usage.
Traditionally, the field of databases is largely about solving
algorithm problems in the scenario where you have much more data than
can fit in memory. Data exists on disk as "pages", you have a fixed
number of "page slots" in RAM. Moving pages from disk to RAM or RAM
to disk is slow, so you want to do as little of that as you can.
This makes trivial problems interesting -- e.g. there's no notion of
a 'join' in classic computer science because it's too trivial to
bother naming.
We're used to thinking of the study of algorithms as a sort of pure
essence, but one could argue that algorithmic efficiency is only
meaningful in a particular data and hardware context. That's part of
what keeps our jobs interesting, I guess -- otherwise algorithm
expertise wouldn't be as useful, since you could just apply
libraries/cookbook solutions everywhere.
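For concreteness, here is the "too trivial to name" in-memory version of a join - a hash join over two made-up relations. The database version is interesting precisely because it must also schedule page reads around this same logic.

```python
# An in-memory hash join. The table layouts are invented for illustration.
users = [(1, "ada"), (2, "bob")]
orders = [(1, "book"), (1, "pen"), (2, "mug")]

# Build a hash table on the smaller relation...
by_user = {}
for uid, name in users:
    by_user[uid] = name

# ...then probe it with each row of the larger one.
joined = [(by_user[uid], item) for uid, item in orders if uid in by_user]
print(joined)  # [('ada', 'book'), ('ada', 'pen'), ('bob', 'mug')]
```

In memory this is a few lines; with data on disk, choosing which relation to build on and which pages to keep resident is where the field of databases begins.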
seanmcdirmid wrote 9 hours 32 min ago:
> but these days I think the much more important skill is designing
systems.
It is hard to design systems if you don't have the perspective of
implementing them. Yes, you move up the value chain to designing
things, but no, you don't get to skip gaining experience lower
down the value chain.
> What computer science doesn't teach you is how memory caching
works in CPUs.
That was literally my first quarter in my CS undergrad 30 years ago,
the old Hennessy and Patterson book, which I believe is still used
today. Are things so different now?
> The much more common problems I have is how to deal with fault
tolerance, correctness in distributed locks and queues, and system
scalability.
All of that was covered in my CS undergrad, I wasn't even in a fancy
computer engineering/EE background.
__turbobrew__ wrote 5 hours 35 min ago:
I think CS 30 years ago was closer to computer engineering today.
At my uni 10 years ago the CS program didn't touch anything
related to hardware; hell, the CS program didn't even need to take
multivariable calculus. In my computer engineering program we
covered solid state physics, electromagnetism, digital electronics
design, digital signals processing, CPU architecture, compiler
design, OS design, algorithms, software engineering, distributed
systems design.
The computer engineering program took you from solid state physics
and transistor design to PAXOS.
The CS program was much more focused on logic proofs and more
formalism and they never touched anything hardware adjacent.
I realize this is different between programs, but from what I read
and hear many CS programs these days start at Java and never go
down abstraction levels.
I do agree with you that learning the fundamentals is important,
but I would argue that a SICP-type course is not fundamental;
physics is fundamental. And once you learn how we use physics to
build CPUs you learn that fancy algorithms and complex solutions
are not necessary most of the time given how fast computers are
today. If you can get your CPU pipelined properly with high cache
hits, branch prediction hits, prefetch hits, and SIMD you can
easily brute force many problems.
And for those 10% of problems which cannot be brute forced, 90% of
those problems can be solved with profiling and memoization, and
for the 10% of those problems you cannot solve with memoization you
can solve 90% of them with b-trees.
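The memoization step mentioned above is usually just a cache in front of a pure function; in Python it is one decorator. The Fibonacci example below is the stock illustration, not anything from the parent's systems.

```python
# Memoization: cache a pure function's results so repeated work
# becomes a table lookup.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(80))  # instant; the uncached version would take years
```

Without the cache the call tree is exponential; with it, each `fib(k)` is computed once, so the whole run is linear in `n`.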
richiebful1 wrote 51 min ago:
We (at a public research university in the US) designed a
rudimentary CPU, wrote mips assembly, and understood computer
architecture for our CS degree. I graduated 6 years ago
Edit: we also did formal methods and proofs as part of core
curriculum
jltsiren wrote 53 min ago:
Computers today are slower than they have ever been. And
tomorrow's computers are going to be even slower.
In many applications, the amount of data grows at least as
quickly as computer performance. If the time complexity of an
algorithm is superlinear, today's computer needs more time to run
it with today's data than yesterday's computer did with
yesterday's data. Algorithms that used to be practical get more
and more expensive to run, until they eventually become
impractical.
The more data you have, the more you have to think about
algorithms. Brute-forcing can be expensive in terms of compute
costs and wall-clock time, while low-level optimizations can take
a lot of developer time.
seanmcdirmid wrote 5 hours 20 min ago:
A top-tier CS program is going to make you learn computer
architecture alongside automata and proofs. MIT went the extra
mile with SICP; it was honestly a hole I didn't have access
to in my top-tier program, but I only realized this because I
studied PL in grad school. You should go through it if you
haven't; I think it would have made my ugrad experience better,
and I definitely benefited from a great well-rounded curriculum
already (UW CSE is still no slouch, but it isn't MIT!).
If you are into physics and mechanics, then you have to check out
SICM (SICP's less famous cousin) as well. Again, MIT went
the extra mile with that as well.
inopinatus wrote 12 hours 46 min ago:
Scouring SICP cannot imbue the student with mechanical sympathy any
more than poring over analysis of Coltrane makes me a saxophonist.
Nevertheless. It must be done. Theory and practice.
sameoldtune wrote 10 hours 23 min ago:
Nicely said. The way I think about it: if we can't write legible
and adaptable functions, then we have no chance at making viable
systems. All the same engineering skills are at play, just on a
different scale.
osigurdson wrote 14 hours 11 min ago:
I think the classic CLR text is great but, yeah, caches throw quite
a monkey wrench into naive big-O analysis.
Still, I think the time investment to learn algos and data structures
isn't too much of a burden.
swatcoder wrote 14 hours 23 min ago:
There are still innumerable people writing standalone programs,
single-purpose embedded systems, independent components and
libraries, etc
The industry has expanded to include a lot of large-scale distributed
cloud projects (often where we might have expected mainframes and
cobol before), with many of today's largest employers doing most of
their work there, but none of that other stuff really went away. It's
still being done every day.
You need a book for what you're doing, and not every book is going to
be that. Apparently, SICP is not it. I possess and have read many
books, and only some small number of them are applicable to the
projects I'm working on at any time.
They don't compete with each other, they complement each other.
esfandia wrote 14 hours 34 min ago:
The right book for the right problem. SICP isn't meant to teach you
how to tackle fault-tolerance in a complex distributed system. Here
is a textbook that talks about distributed systems (van Steen and
Tanenbaum):
|
| [1]: https://www.amazon.ca/Distributed-Systems-Maarten-van-Steen/... |
|
__turbobrew__ wrote 5 hours 8 min ago:
Yes, I have the distributed system book from van Steen :)
nioj wrote 10 hours 44 min ago:
You can also get a free PDF version of that textbook here [1] (you
only need to provide an email)
|
| [1]: https://www.distributed-systems.net/index.php/books/ds4/ |
|
lisper wrote 14 hours 50 min ago:
> these days I think the much more important skill is designing
systems
That's true, but that doesn't mean that there is no value in having
an understanding of how established technology works under the hood.
> What computer science doesn't teach you is how memory caching
works in CPUs.
That is also a very good point. There is a lot of daylight between
the lambda calculus and real systems.
ozim wrote 15 hours 27 min ago:
Well, CS and software dev in the trenches have moved a bit.
There are still jobs where people write frameworks, database engines
or version control tools. Those jobs require heavy CS - algorithms and
data structures day to day. But there are fewer of those jobs
nowadays, as no one is implementing a DB engine for their app; they
just use Postgres.
The other jobs, the vast majority, deal with implementing business
logic. Using a database with an understanding of how it works in
detail is of course going to produce better outcomes. Yet one can
still produce a great amount of working software without knowing how
indexes are stored on disk.
Also, a lot of CS graduates fell into a trap where they think their
job is to write a framework - when in reality they should just use
frameworks and implement business logic, while using their CS
background to fully understand the frameworks that already exist.
paulddraper wrote 14 hours 10 min ago:
That's true, though new technologies -- web browsers, mobile
devices -- have necessitated some framework writing.
gwervc wrote 13 hours 52 min ago:
So yes, some people working at Microsoft, Apple and Google wrote
those frameworks. But that's like a drop in the bucket.
paulddraper wrote 13 hours 50 min ago:
+Meta ;)
Quekid5 wrote 14 hours 39 min ago:
> Yet one still can produce great amount of working software
without knowing how indexes are stored on disk.
I agree... up to a point. Most software will likely be
replaced/obsolete before it even reaches a scale where indexes even
matter (at all) given how fast the underlying hardware is at this
point.
... but I don't think this is particularly relevant wrt. the "to CS
or not CS" question. If a CS grad has been paying any attention
they usually have a decent idea of what kinds of problems are
intractable vs. problems that are tractable (but maybe expensive to
compute) vs. easy. Also just general exposure to different ways to
approach solving a problem (logic programming, pure functional,
etc.) can be very valuable. There's just much that one couldn't
come up with on their own if one weren't exposed to the ideas from
the vast expanse of ideas that are known in CS. (And even a
master's doesn't come close to scratching the surface of it all.)
aleph_minus_one wrote 15 hours 1 min ago:
> while using CS background to fully understand frameworks already
existing.
Most frameworks today are so complicated that you typically cannot
understand them fully, and even understanding them somewhat
partially is more than a full-time job.
ozim wrote 1 hour 7 min ago:
You don't have to understand them fully to make use of them. One
has to understand design patterns and underlying reasoning,
understand context of the framework like for example if it is JS
framework that it runs in browser to understand which parts are
"because we are running in the browser" vs "that is just why
framework implemented it" and if it is Typescript then how it
blends into that mix.
Then for any details or unexpected behavior knowing where to look
in documentation.
porknubbins wrote 10 hours 40 min ago:
I wish someone told me this back when I was trying to get a
programming job as a self taught programmer. I would do things
like try to build a simple React clone thinking it would help me
overcome imposter syndrome to fully understand things from the
base up, but it was pretty futile because no one really has time
to wrap their head around something that big unless they are paid
full time to do it.
apwell23 wrote 3 hours 35 min ago:
I did build a React clone in a long weekend, but I built the
first 90% that takes 10% of the time and not the last 10% that
takes 90% of the time.
eastbound wrote 4 hours 24 min ago:
I have a saying that a program's complexity is always exactly
equal to the human-intelligible complexity + 1.
If not, the developer would add one more feature. It is due to
the entirely human-made aspect of this discipline.
hbbio wrote 12 hours 50 min ago:
Last week, after reading Methodology is bullshit here, this was
my first thought!
|
| [1]: https://x.com/henri__OK/status/1854813243916882365 |
|
pipes wrote 16 hours 27 min ago:
True. However I find that most junior and even experienced
programmers struggle with tactical level coding. I'm really suffering
with this right now because the small component I'm tasked with
making a small change to is annoyingly stateful and deals with 2
abstractions at once (it processes files and uses the file system
and database to store its state). I'm shocked how badly it has been
thought out. I've spent days trying to avoid doing what has gone
before: bits bolted on that make it even more difficult to
understand. It really seems that pull request culture has just led to
any old crap being approved because no one has the bandwidth to
think deeply about the actual code. Bring back in-person code
reviews!
soegaard wrote 16 hours 43 min ago:
Have you seen
"Software Design for Flexibility: How to Avoid Programming Yourself
into a Corner"
by Chris Hanson and Gerald Jay Sussman
It's from 2021.
pipes wrote 16 hours 38 min ago:
I hadn't, that looks excellent.
crystal_revenge wrote 9 hours 11 min ago:
Unfortunately, while I really want to love Software Design for
Flexibility, it's clear that Hanson and Sussman haven't really
solved (or even come close to solving) the problem they have
identified in the book.
The introduction to that book is brilliant at identifying just
how much room software has to grow (you can find similar talks
from various Strange Loop sessions Sussman has done), and is
really quite inspirational for anyone seriously thinking about
the future of computing.
But the rest of the book fails to provide a coherent answer to
the questions that are brought up in the intro. It shows off some
neat functional programming tricks, but repeatedly fails to
deliver on solving the (admittedly ambitious) challenges it
provides for itself.
I'm still glad I have a copy, and have re-read the first half
multiple times now, but sadly it's not the book it wants to be.
To be fair though, that is because we haven't come close to
understanding computation enough to solve those problems.
It's a very ambitious book that falls short of its own
ambitions.
ralphc wrote 13 hours 29 min ago:
IMO it's not excellent. It's not like SICP; it's obtuse for no
reason, and I find it a hard slog. Flexibility is good, but it seems
to try to make every bit of your program flexible and pluggable,
and you just need to do something eventually.
My opinion, I'd welcome others on the book; there was a small
splash when it came out but not much discussion since.
maroonblazer wrote 12 hours 55 min ago:
Is there another book you'd recommend - more recent than SICP -
for how to avoid programming yourself into a corner?
ralphc wrote 11 hours 38 min ago:
I don't know about books, but I think the best approach is
functional programming in a dynamic language. That could be
because I'm currently an Elixir fanboy, but I think Lisps,
especially Scheme or Clojure, or a functional-restricted
approach in JavaScript could do it as well. I agree with
parent comment that it's better to keep things as simple as
possible and make the changes when necessary vs. building in
all the flexibility in the beginning.
gregmac wrote 12 hours 56 min ago:
I haven't read the book, but my experience is that the way to
make things flexible is to make them as simple as possible.
When I've used (or built) something that was built in the style
you're talking about, it's almost always wrong, and the
extra complexity now makes it harder to do right.
It's not surprising: unknown future requirements are unknown.
Overbuilding is trying to predict the future.
It's like someone building a shed and pouring a foundation that
could support a skyscraper. Except it turns out what we needed
was a house with a different footprint. Or maybe the
skyscraper is twice the height and has a stop for the
newly-built subway underneath. Now we have to break apart the
foundation before we can even begin work on the new stuff; it would
have been less work if the original had just used a foundation for
a shed.
tkiolp4 wrote 16 hours 46 min ago:
I find books like SICP interesting but not very useful. I love
reading them because I like this stuff, but I don't get to apply
their teachings in real-world software. It's a problem because
naturally I want to spend my time reading these kinds of books, but
if I did that I would be jobless. I need to divide my time between
reading pearls like SICP and boring Kafka/Postgres/Golang/K8s/AWS
documentation.
gonzobonzo wrote 8 hours 39 min ago:
One of the problems I've seen is that when new learners and
self-taught individuals ask for advice, a lot of software engineers
give recommendations based on what they wish their job was or how
they would like to imagine themselves.
sriram_malhar wrote 13 hours 17 min ago:
The first reason I really loved SICP is that it is based on
Scheme, a language with powerful primitives. I came from a
self-taught world of PL/I, Algol, C, and later C++, Java, etc. None
of them had closures, hygienic macros, anonymous functions,
functional programming, call/cc, or, of course, "amb", the
non-deterministic choice operator. At an even more basic level,
SICP taught me that a lot of non-trivial code can be written with
just sequences and maps, with good enough efficiency!
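The sequences-and-maps point is worth making concrete. A hedged
sketch of my own (not from the thread, and in Python rather than
Scheme): a tiny inverted index and search built from nothing but
lists, sets, and dicts; the names `build_index` and `search` are
illustrative, not from any library.

```python
# Illustrative only: non-trivial functionality out of plain
# sequences and maps, in the style SICP encourages.

def build_index(docs):
    """Map each word to the sorted list of document ids containing it."""
    index = {}
    for doc_id, text in enumerate(docs):
        for word in set(text.lower().split()):
            index.setdefault(word, []).append(doc_id)
    return index

def search(index, *words):
    """Ids of documents containing every query word."""
    hits = [set(index.get(w.lower(), [])) for w in words]
    return sorted(set.intersection(*hits)) if hits else []

docs = ["SICP teaches abstraction",
        "Scheme makes abstraction cheap",
        "SICP uses Scheme"]
index = build_index(docs)
print(search(index, "SICP", "Scheme"))  # [2]
```

Good enough efficiency for a surprising range of problems, and no
data structure fancier than a dict of lists.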
Because SICP's starting point was so high, they could describe many
concepts easily from the ground up: object-oriented programming,
backtracking, constraint programming, and non-determinism.
This taught me a number of techniques to apply in real-life,
because I could readily identify the missing building blocks in the
language or system I was given to work with. For example, I was
able to build a lightweight threads system in Java quite readily
because I knew that the missing piece was a continuations feature
in Java.
See
|
| [1]: https://github.com/kilim/kilim |
|
lovecg wrote 15 hours 58 min ago:
I don't find them useful in the sense of directly applying
practical techniques in my day job, but I consider them somewhat
necessary background reading to get into the right state of mind.
You can very quickly tell when someone never acquired any academic
knowledge in this area (or never played with functional languages
or similar pastimes) - you can't explain to those people why
modifying global variables all over the place in a large program is
a bad idea and other things like that. They just nod along
skeptically and then somehow keep stumbling into the same kind of
mess over and over.
Buttons840 wrote 9 hours 14 min ago:
You kind of defeat your own argument. You say it's important to
learn "academic knowledge", but then acknowledge the organization
will not value your knowledge.
I do agree with you though.
lovecg wrote 7 hours 59 min ago:
Well in my experience good organizations do recognize that the
better design means lower costs in the long run, and people who
don't get that tend to not get promoted. Communicating this
effectively up and down the chain is a whole different art in
itself though.
KerrAvon wrote 17 hours 18 min ago:
If you knew how to design programs you could run it all on a single
box and wouldn't have to design "systems."
I'm being slightly facetious, but only slightly. If you really
think everything is solvable with arrays, you are not going to scale
well, and of course you're going to need to throw a lot more
hardware at the problem.
__turbobrew__ wrote 5 hours 11 min ago:
My argument is that 90% of problems can be solved with arrays, 5%
of problems can be solved with memoization, 3% of problems can be
solved with b-trees, and 2% of problems with other data structures.
It is good to know that solutions to the 2% exists, but what we
should be focusing on is writing the simplest code possible which
solves the problem and then only optimize afterwards using a
profiler.
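The memoization case above can be shown in a few lines. A hedged
sketch of my own (not from the thread): write the simplest recursive
code first, then add a cache only when profiling says so -- here via
the standard library's `functools.lru_cache`.

```python
# Illustrative only: simplest-possible recursion, made fast by a
# one-line cache rather than a cleverer algorithm.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Naive recurrence; the cache turns exponential work into linear."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(90))  # instant with the cache; hopeless without it
```

The uncached version is the code you should write first; the
decorator is the optimization you add afterwards.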
God forbid you have to work on some codebase written by someone who
believes they are the second coming of Haskell, with crazy recursion
and backtracking, monads, red-black trees, and a DSL on top of the
whole thing.
You are right that many problems can be solved with a single box,
but my argument is that you do not need fancy algorithms to solve
problems on a single box. We should strive to use single boxes
whenever possible to reduce complexity.
Computation is designed by humans to serve humans; we should make
it as easy as possible for humans to understand. I'm probably
going to start a flamewar here, but this is why simple solutions
like UNIX and golang have prevailed in the past. Simple code is
easy to understand, and therefore it is easy to modify and reason
about. Some people think simple means that you decompose programs
into the smallest possible functional parts, but simple to me is a
500-line main function.
llm_trw wrote 17 hours 21 min ago:
You're in luck. Chapter 5 of the book is about building a virtual
machine to run Lisp, simulated at the register level: [1] Writing a
network between n such machines is left as an exercise to the reader.
|
| [1]: https://mitp-content-server.mit.edu/books/content/sectbyfn/b... |
|
scop wrote 17 hours 32 min ago:
I'm slowly making my way through it a second time and thoroughly
enjoying it. The first time through, it seemed quite abstract,
albeit only because of my complete lack of real-world programming
experience. The second time through, it is a revelation, as I now
have a strong base of experience through which to understand it
(experience which it also informs!).
I am using Elixir's Livebook to take notes and complete the
exercises. It is very helpful to have a live notebook tool while
reading it!
ralphc wrote 13 hours 34 min ago:
You're doing the exercises in Elixir and not Scheme then?
spit2wind wrote 17 hours 46 min ago:
Programming Pearls is another book that rereads well. It's also short,
too, which makes rereading it possible.
docandrew wrote 17 hours 47 min ago:
I'm working through it now; for someone with a computer
engineering, EE, or math background, I think this is a great
resource to get started with CS fundamentals.
myleshenderson wrote 17 hours 49 min ago:
I've been programming for 25 years and have owned the book for about 10
years. I just recently started to work through it and started with Dr.
Racket.
There are things to love about Dr. Racket: hovering over a variable and
visually seeing its connections to other places in the code is really
cool. But ultimately I was a bit frustrated that it wasn't VS Code.
So I stood up a configuration that let me use VS Code (Cursor,
actually) to work through the exercises. The LLM integration in
Cursor is cool, as you can give it your code and whatever narrative
you wrote and ask for feedback.
I am a tiny way through the exercises, but I have turned my code,
the responses that I write, and the feedback that I get from the
LLM into a static site.
It's been a fun way to spend a little time. For sure, I'm not getting
the full benefit of working through SICP just with my own thoughts
(without the aid of an LLM), but it's neat to see how you can integrate
an LLM into the exercise.
Upvoter33 wrote 17 hours 56 min ago:
There are some great books, and every book means something different to
each person who reads it.
K&R influenced a generation of programmers.
Hennessy and Patterson influenced a generation of architects.
etc. etc.
It's not just SICP.
But the greater point: a book can be meaningful, and we can always use
more good ones.
whobre wrote 18 hours 3 min ago:
I don't quite get the cult status of SICP. I read it and it's a
fine beginner programming book, but nothing more.
liontwist wrote 15 hours 54 min ago:
I don't understand this comment. If you master the material you
know more than 90% of engineers in the field.
aleph_minus_one wrote 15 hours 16 min ago:
> If you master the material you know more than 90% of engineers in
the field.
Telling someone that he/she is smarter than 90% of the people is
not a praise. :-)
bdangubic wrote 15 hours 11 min ago:
amen… just look at 90% of people at the DMV :-)
jgon wrote 17 hours 54 min ago:
Just so we're clear, this is a "beginner programming book" that has
you create a scheme interpreter, then a register machine simulator,
then a compiler out of your interpreter that will then have its
compiled code run on the register machine simulator, by the final
chapter.
This is probably the part where you'd step up and post a link to your
repo with solutions to the exercises to back up your talk, but
generally I only see this sort of casual dismissal from people who
haven't actually worked through the book.
veqq wrote 13 hours 8 min ago:
Concrete Abstractions, Schematics of Computation, and others from
the era (also using Scheme) covered similar ground (and went far
further!). SICP is denser and sticks to theory, forgoing databases,
operating systems, and actually implementing Scheme in assembly.
dbtc wrote 16 hours 41 min ago:
I commend your righteous indignation. Made me smile. Flame on!
becquerel wrote 14 hours 27 min ago:
One aspires to be a hater of such high caliber.
owl_vision wrote 18 hours 13 min ago:
i'd also recommend "Concrete Abstractions: An Introduction to Computer
Science using Scheme" by Max Hailperin, Barbara Keiser, Karl Knight.
|
| [1]: http://www.gustavus.edu/+max/concrete-abstractions.html |
|
maxhailperin wrote 14 hours 39 min ago:
Spelling correction on the second author's last name: Kaiser
owl_vision wrote 14 hours 20 min ago:
Thank you for the correction. pardon my typo.
shrubble wrote 17 hours 59 min ago:
I concur, I am learning from it nowâ¦
alabhyajindal wrote 18 hours 19 min ago:
I really wanted to like SICP but Lisp throws me off. I love Haskell and
Standard ML however! Did others have a similar experience? Might be
interesting to read a book similar in spirit to SICP but using a
different language as a vehicle (No, I don't want to do SICP in
JavaScript).
bez00m wrote 5 hours 17 min ago:
"Functional Programming in Scala" aka "Red Book of Scala" is a the
one that IMO teaches to think the same way as SICP, while using a
typed language. The books stand next to each other on my bookshelf,
definitely worth reading.
whimsicalism wrote 12 hours 39 min ago:
yes, i like ML (well.. ocaml) and bounced off SICP for the same
reason. It was actually SICM that made me come back and stick with
it, the ideas were just too interesting (whereas for SICP it was a
lot of ideas I was already familiar with)
kccqzy wrote 13 hours 43 min ago:
I don't understand why Lisp throws you off. I only read SICP after I
became proficient in Haskell and it is just fine.
dokyun wrote 14 hours 14 min ago:
SICP isn't a book about Lisp, however it uses some of Lisp's unique
properties to demonstrate important concepts that other languages
can't easily replicate. A book that's meant to be similar to SICP
that doesn't use Scheme or Lisp would not be anything like SICP, or
at least not teach the same things. Haskell and ML are in my
experience much harder to understand than Scheme, so I'm wondering
what your difficulty is?
ok123456 wrote 14 hours 9 min ago:
There's a SICP edition done in Javascript.
dokyun wrote 13 hours 39 min ago:
Have you looked at it? It's an abomination. The point of SICP
isn't Scheme or the syntax of Scheme, but what it represents.
Whoever made the Javascript rewrite didn't understand that. You
can't write a metacircular interpreter in Javascript, because
Javascript is not homoiconic.
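For readers who haven't met it: SICP's metacircular evaluator is
Scheme written in Scheme, an eval/apply loop over code that is
itself Scheme data. As a rough, avowedly non-metacircular sketch of
that eval/apply shape (my own illustration, in Python, using nested
lists as a stand-in for S-expressions):

```python
# Toy interpreter for a tiny Lisp-like form, to show the eval/apply
# core SICP builds. The real thing is metacircular because Scheme
# code is Scheme data; this Python version is merely an interpreter.

def evaluate(exp, env):
    if isinstance(exp, str):                 # variable reference
        return env[exp]
    if not isinstance(exp, list):            # self-evaluating literal
        return exp
    op, *args = exp
    if op == 'if':                           # special form
        test, conseq, alt = args
        return evaluate(conseq if evaluate(test, env) else alt, env)
    if op == 'lambda':                       # special form: closure
        params, body = args
        return lambda *vals: evaluate(
            body, {**env, **dict(zip(params, vals))})
    proc = evaluate(op, env)                 # application: eval, then apply
    return proc(*[evaluate(a, env) for a in args])

env = {'+': lambda a, b: a + b, '<': lambda a, b: a < b}
double = evaluate(['lambda', ['x'], ['+', 'x', 'x']], env)
print(evaluate(['if', ['<', 1, 2], 42, 0], env))  # 42
print(double(3))  # 6
```

In Scheme the analogous twenty lines evaluate Scheme itself, which
is the point being argued about above.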
wruza wrote 8 hours 59 min ago:
> You can't write a metacircular interpreter in Javascript,
because Javascript is not homoiconic.
Is that a downside? I never wrote or used a metacircular
interpreter in my life and still don't know why I had to read
about it. Is it an interesting implementation technique for
Lisp? Yes. Does anyone really need that?
You can rip off that part and everything that follows, and that
will be enough for a regular programmer. No one itt needs to
know how to design a metacircular interpreter on register
machines.
shawn_w wrote 11 hours 51 min ago:
I'm pretty sure someone wrote a very basic, very literal scheme
to JavaScript transpiler and just ran the book's code through
it. The results look nothing like what any normal person would
write.
2c2c2c wrote 14 hours 20 min ago:
i thought berkeley was using a modified version of the book using
python a few years back
linguae wrote 16 hours 55 min ago:
You might be interested in a 1987 article titled "A Critique of
Abelson and Sussman or Why Calculating is Better than Scheming" ( [1]
), where the author advocates the use of KRC or Miranda as
alternatives to Scheme. I don't know much about KRC, but Miranda is
a statically-typed functional programming language that influenced
Haskell.
|
| [1]: https://dl.acm.org/doi/10.1145/24697.24706 |
|
hluska wrote 17 hours 9 min ago:
I can identify with that - Lisp throws me off (because I'm not
smart enough). But I ended up forcing myself to work through it and
learned a tremendous amount because I'm not smart enough to work
with a Lisp. It felt like I spent so much time just reading through
the code that I ended up learning more than I would in a language
I'm comfortable with.
There is a Python version of SICP. I have never worked through it or
even given it more than a cursory scan so this is not an endorsement
more just a link to prove it exists:
|
| [1]: https://wizardforcel.gitbooks.io/sicp-in-python/content/0.ht... |
|
rustybolt wrote 18 hours 2 min ago:
I really wanted to like SICP, and I probably would have if I had
read it 15 years ago. I started reading it last month and found it
to be too broad. It covers too many interesting mathematical
principles and then jumps to the next one right when it starts to
get interesting. In other words, it's too shallow.
It probably doesn't help that I've seen many courses/documents that
are (in hindsight) derivatives from SICP, so I have the nagging
thought "not this again" when a topic is introduced in SICP.
cess11 wrote 16 hours 32 min ago:
It's written for engineers; they already know the math, but they
don't know how to design and implement the virtual machines,
objects, compilers, and whatnot that it shows how to build.
horeszko wrote 18 hours 13 min ago:
I think there is a Python version if that floats your boat
abeppu wrote 18 hours 30 min ago:
> In fact, I'd go further and say that it's the only computer
science book of that age that I'd happily and usefully read again
without it being just for historical interest: the content has barely
aged at all. That's not all that unusual for mathematics books, but
it's almost unheard of in computer science, where the ideas move so
quickly and where much of what's written about is ephemeral rather
than foundational.
I recall that when MIT stopped teaching with SICP, one of the main
claims was that programming now is often not about thinking
abstractions through from first principles, and creating some isolated
gem of composing definitions. Instead, we interact with and rely on a
rich ecosystem of libraries and tools which often have individual
quirks and discordant assumptions, and engineering then takes on a
flavor of discovering and exploring the properties and limitations of
those technologies.
I think now, (some) people also are at the point of not even directly
learning about the limitations and capability of each tool in their
toolbox, but leaning heavily on generative tools to suggest low-level
tactics. I think this will lead to an even messier future, where
library code which works on (possibly generated) unit tests will bear
some fragile assumption which was never even realized in the head of
the engineer that prompted for it, and will not only fail but will be
incorporated in training data and generated in the future.
chambers wrote 17 hours 28 min ago:
I've witnessed how abandoning first principles undermines the
evolution of a system. If our mental model of a system is not
formalized into first principles (i.e. a high-level specification),
then successive generations of engineers will have to re-learn those
principles through trial-and-error. They'll introduce mutations and
dependencies between the mutations-- and when they leave, the next
generation of maintainers will repeat the process. Generations of
mutations eventually create a brittle, calcified creature of a
system which people fear to touch with a ten-foot pole.
I imagine people who were taught SICP would be more respectful, if
not inclined, towards a formal articulation of a system's principles.
This philosophy is described in depth in the original 1985 article
[1] and in more accessible language in [2] . You can also observe
engineers opposing/misunderstanding the need for specification in [3].
|
| [1]: https://gwern.net/doc/cs/algorithm/1985-naur.pdf |
| [2]: https://www.baldurbjarnason.com/2022/theory-building/ |
| [3]: https://news.ycombinator.com/item?id=42114874 |
|
chongli wrote 18 hours 18 min ago:
> I recall that when MIT stopped teaching with SICP, one of the main
claims was that programming now is often not about thinking
abstractions through from first principles, and creating some
isolated gem of composing definitions.
Which is a category mistake that they actually address in the
lectures. SICP is not a programming course, it's a computer science
course. Computer science is not about computers, let alone
programming, just as geometry is not about surveying instruments and
astronomy is not about telescopes.
When they stopped teaching SICP, in response to the pressure to
teach more modern tools, they abandoned their scientific
principles to satisfy commercial concerns. They stopped teaching
computer science and became a vocational school for the tech
industry.
abeppu wrote 4 hours 3 min ago:
I'm fine with the claim that CS is not about computers as astronomy
is not about telescopes, but there are pure CS courses that don't
cover programming and do cover automata, turing machines,
computability, complexity etc and don't cover programming. SICP is
about programs in a running language rather than abstract idealized
computations, and is centered around reading and writing programs
as examples. I think its success stems from its grounding in
exhibiting such examples that ordinarily would not become
accessible to students so quickly.
medo-bear wrote 16 hours 8 min ago:
> They stopped teaching computer science and became a vocational
school for the tech industry.
Sheldon always said that MIT is a trade school
imglorp wrote 16 hours 41 min ago:
The quote was about "programming by poking", which I take as highly
relevant to actual distributed software. It meant that (1) systems
are now built more by integrating many components, (2) for many
reasons, the components are not understood by the integrator, and
(3) integrators must resort to experimentation to validate how
things actually work.
Unless you have a TLA+ model of all your components and how they
interact, I would argue you don't understand your distributed
system either, for all inputs.
|
| [1]: https://web.archive.org/web/20160505011527/http://www.post... |
|
PittleyDunkin wrote 17 hours 4 min ago:
> SICP is not a programming course, it's a computer science
course.
I don't see what you mean by this at all. Furthermore this doesn't
strike me as a useful distinction when a) it doesn't cover most
topics labeled by consensus as "computer science" and b) it very
clearly does teach a great deal about programming.
Why not say it teaches computer science and programming skills? Why
do these have to be exclusive? There's obviously a great deal of
overlap in general.
chongli wrote 9 hours 37 min ago:
> I don't see what you mean by this at all
The goal of the course is not to teach programming skills, it's
to teach computer science. The difference is explained quite
thoroughly in the lectures. One might even say that answering the
question "what is computer science?" is one of the core goals of
the course and a major part of the philosophy of the professors
who created the course.
The argument being made by the comparisons to geometry and
astronomy is that in any discipline there is a difference between
means and ends: what you are attempting to achieve is distinct
from the tools you're using to achieve it. Furthermore, it's a
mistake to believe that the discipline is all about the tools.
No, the tools are the means, not the end.
PittleyDunkin wrote 9 hours 13 min ago:
> The goal of the course is not to teach programming skills,
it's to teach computer science.
Who cares what the goal is? It teaches programming skills too.
The intent is irrelevant and for the most part so too is the
distinction (outside the american education system, anyway).
> Furthermore, it's a mistake to believe that the discipline is
all about the tools.
Who outside the american education system gives a damn about "the
discipline", if that refers to anything meaningful outside the
american education system in the first place? It's arbitrary
and has no purpose or benefit aside from organizing the
education system. This is a course that miraculously, against
all odds, manages to teach useful skills in addition to jargon
patterns of thought. Why not celebrate this?
Anyway, programming is a useful pedagogical tool for teaching
CS. CS is a useful pedagogical tool for teaching programming.
To brag about not teaching one is just hobbling your own
insight into the value you provide students.
I myself have a CS degree from a prestigious institution and
largely enjoyed my education. But this attitude you allude to is
just jerking off for the sake of jerking off. Particularly in
the case of SICP.
FredPret wrote 18 hours 22 min ago:
I think just like traditional engineers have to learn physics,
computer people should learn these fundamentals for exactly the
reason you outline.
Then, when you hit the job market, you learn the ecosystem of what
other engineers have built and you work in that context.
In this way, you can eventually reach extreme productivity. Just look
at humanity's GDP over the last 200 years.
munchler wrote 18 hours 34 min ago:
> The computer revolution is a revolution in the way we think and in
the way we express what we think. The essence of this change is the
emergence of what might best be called procedural epistemology — the
study of the structure of knowledge from an imperative point of view,
as opposed to the more declarative point of view taken by classical
mathematical subjects
Ironic, given the increasing use of functional programming in domains
where old-fashioned imperative/OO programming used to reign alone.
namaria wrote 17 hours 41 min ago:
I think in the context of the book 'procedural epistemology'
encompasses all programming, not just what you'd call procedural
programming.
munchler wrote 17 hours 25 min ago:
Hmm, I don't think so. Functional programming is definitely based
on the "declarative point of view taken by classical mathematical
subjects".
magical_spell wrote 21 min ago:
My understanding agrees with namaria's. I'm inclined to think
that, in the passage you provide, `imperative' means `pertaining
to processes' (where processes are those things described by
procedures; or, perhaps better put, the meanings of procedures).
namaria wrote 15 hours 41 min ago:
I disagree since the book is using a functional programming
language to advance the idea that CS is about procedural
epistemology as opposed to the declarative stance of maths.
The idea that a 'procedural programming paradigm' exists in
contrast with a 'functional programming paradigm' is blogspeak
imho.
jasonpeacock wrote 18 hours 43 min ago:
Original version: [1] Javascript version:
|
| [1]: https://mitp-content-server.mit.edu/books/content/sectbyfn/boo... |
| [2]: https://sourceacademy.org/sicpjs/index |
|
ElD0C wrote 18 hours 38 min ago:
And the Python version:
|
| [1]: http://www.composingprograms.com/ |
|
xdavidliu wrote 18 hours 34 min ago:
this is not the Python version of SICP. It's a different book
inspired by SICP. There's no "picture language" in chapter 2, and
there's no "metacircular evaluator" and "register machine" in
chapter 5.
ted_dunning wrote 13 hours 19 min ago:
It's hard to understand the point without these.
agumonkey wrote 18 hours 47 min ago:
My second reading made me dig into the footnotes and references,
and there's a big world of beauty out there too. IIRC there's a
paper where Sussman and some team made a custom-designed
programmable processor to compute properties (trajectories) of
celestial bodies. Mind bending as usual.
neilv wrote 18 hours 47 min ago:
The article has a broken link for the free copy: [1] [2] I hadn't seen
a blessed PDF version until today. Circa 2001, only the HTML version
was freely available, and someone converted it to TeXinfo: [3] If
anyone wants to work through SICP today, you can run the code in MIT
Scheme, or in DrRacket:
|
| [1]: https://mitp-content-server.mit.edu/books/content/sectbyfn/boo... |
| [2]: https://web.mit.edu/6.001/6.037/sicp.pdf |
| [3]: https://www.neilvandyke.org/sicp-texi/ |
| [4]: https://www.neilvandyke.org/racket/sicp/ |
|
sakras wrote 11 hours 9 min ago:
Just as a data point, I'd recommend going through it in Racket, which
I believe has an explicit SICP mode. I went through it in GNU Guile
and it was a pain because there were some minor syntactic differences
between Guile and MIT Scheme.
kkylin wrote 17 hours 51 min ago:
For anyone wishing to try: the maintainers of MIT Scheme no longer
provide a .dmg but you can download and build the x86_64 version of
MIT Scheme. The current release (v12.1) works on a Mac running
Sequoia with Intel CPU or on Apple silicon via Rosetta. But the
native code compiler (not necessary for SICP AFAIK) is a little
broken. (Anecdotally it worked on macOS prior to Monterey, so maybe
an Apple-supplied dependency changed. Haven't tracked down the
issue.)
All of that is to say: if you do not need MIT Scheme and don't want
to fuss with compiling it, then Racket might be a better way to go.
xdavidliu wrote 14 hours 7 min ago:
most package managers have it, including apt and brew, so most of
the time no need to build your own
kkylin wrote 11 hours 12 min ago:
Good point! though my comment about the native code compiler
being broken still applies to the brew-installed version
jgon wrote 17 hours 56 min ago:
The texinfo version was, I believe, the source for the really nice
HTML5 version, if you want to read it in a browser but with nicer
formatting than the MIT original:
|
| [1]: https://sarabander.github.io/sicp/ |
|
owl_vision wrote 18 hours 11 min ago:
Dr Racket has SICP and HTDP as a teaching pack.
xdavidliu wrote 18 hours 35 min ago:
one thing to note is that the second chapter's "picture language" is
not supported in MIT Scheme in 2024. There used to be a package but
it's like 2 decades out of maintenance. In Dr. Racket however, there
is a package specifically for working through those problems.
jph wrote 18 hours 51 min ago:
SICP is available for free: [1] If you want to get it elsewhere, the
full info is:
Structure and interpretation of computer programs by Hal Abelson and
Jerry Sussman (MIT Press. 1984. ISBN 0-262-01077-1).
|
| [1]: https://web.mit.edu/6.001/6.037/sicp.pdf |
|
|
| <- back to front page |