Why Wolfram Mathematica did not use Lisp (2002) (ymeme.com)
166 points by gearhart on June 29, 2015 | 140 comments


I think the author is misremembering Wolfram's claims. I recall attending the first open demo of SMP, where Wolfram (and I think co-author Chris Cole) were demoing it in a large Caltech lecture hall. Here's what I recall.

They had tried to use Macsyma for their physics work, and found experimentally that it was too slow on what they considered to be "medium" problems, and could not handle "large" problems at all. They were trying to do things that were pushing the limits, and pushing them hard, of what could be done on the hardware of the day [1].

They showed some benchmarks comparing SMP to Macsyma using the kind of problems that had arisen in their physics research. For "small" problems, SMP was a bit faster. Maybe 2 or 3 times or so faster.

For "medium" problems, it was 100 times faster.

And it could do "large" problems.

I don't think I ever heard Wolfram say anything about Lisp being 100 times slower than C in general. I only heard claims like that when he was talking specifically about building systems like SMP, and was comparing basing such systems on top of a general purpose high level language (Lisp) vs. basing them on something built and optimized specifically to support a system like SMP.

[1] The SMP work was done on CITHEP's (Caltech High Energy Physics) VAX 11/780. I was an undergrad at Caltech at the time, working part time at CITHEP as a system programmer and admin, so knew many people involved in SMP development and got to watch it as it progressed, and then was put into production by working physicists.


What exactly were these 'problems' ?


In high energy physics people compute scattering amplitudes; the goal is to estimate how many events of a particular kind a detector in a collider experiment will see at various energies. The predominant tool for doing that is to calculate contributions from so-called Feynman diagrams. Each diagram is a graphical representation of some fairly complicated integral, whose integrand is a rational function of the particle momenta. The numerator has to be simplified by manipulating spinor identities, and in the case of, for example, QCD one has to work out color factors. Just a few years before, people were computing thousands of those diagrams by hand for quantum electrodynamics, often while cross-checking each other's results. For non-abelian Yang-Mills theory ('t Hooft and Veltman had published their landmark paper in 1972) this is more or less infeasible beyond first order.

The number of diagrams that have to be evaluated quickly explodes beyond the first order, and in the case of QCD it would already require herculean efforts at two loops to do by hand. In fact Schoonschip, one of the first computer algebra systems, was developed for just this reason.

Beyond that, areas like Supersymmetry and certain parts of General Relativity would be very painful to work in if everything had to be done by hand.


My memory is pretty good, but not good enough to recall that level of detail from a demonstration I attended ~35 years ago!

Wolfram's doctorate was in high energy physics, so they were probably things in those areas.


Mathematica is an interesting case study for a general-purpose software package that happens to be more feature-full and functional -- not just "more convenient" or "better UX" -- than any open-source counterparts.

I tend to use some proprietary scientific software, but a lot of it is because academia has already invested in codebases for, say, Stata, GAMS, Matlab and so on. But Matlab is two steps removed from raw Fortran; and what sets it apart from the many identical-syntax clones are a few narrowly-oriented toolsets aimed at some kinds of engineers.

Mathematica is the only one I buy for my home computer. It's very, very good.


> Mathematica is an interesting case study for a general-purpose software package that happens to be more feature-full and functional -- not just "more convenient" or "better UX" -- than any open-source counterparts.

Isn't the world littered with such examples? Office is more feature-full than OpenOffice, Photoshop is more feature-full than GIMP, Illustrator is more feature-full than Inkscape, InDesign is more feature-full than Scribus, Avid|Final Cut|Premiere is more feature-full than any open-source NLE.


Yeah, I don't know why he thinks it is rare. It's rarer that open-source software is more featureful and functional than competing closed source software. In fact I'm struggling to think of any that are clear-cut.

The opposite is very common, as your list demonstrates. Another big category is CAD and CAM. There are no decent open source programs in that space at all!


> It's rarer that open-source software is more featureful and functional than competing closed source software. In fact I'm struggling to think of any that are clear-cut.

I think the canonical example has been web browsers, for instance.


Or Linux which as an OS far surpasses e.g. windows. It runs on any hardware you can imagine from satellites to hand-held devices and the majority of the super-computers in the world.


> Or Linux which as an OS far surpasses e.g. windows.

Linux is a lot cheaper than Windows. More importantly, it's a lot LOT cheaper than most proprietary Unices. And it's freely modifiable. If that makes it "better" for you, have at, I am not going to quibble.

If you are talking about Linux as a desktop, rather than server, operating system... there are reasons that Windows users outnumber Linux users on the desktop on the order of about 90 to 1. It's not all because Microsoft is evil. If you use Linux as your primary desktop operating system, I'm happy for you. If you think it's better for you, great. May you go forth and prosper. If you think Linux far surpasses Windows as a desktop operating system in the general case, then you are incredibly delusional and should seek some help.


Well, Linux is used every day as operating system by millions of end users – on mobile devices.

And Linux and the BSDs are getting a lot more love on consoles, too – look at the SteamBox.

With 16% of all current generation games being able to run on linux, often (like in the case of the new Batman game) even running better than on Windows (or expected to do so), it is getting more and more towards equality.


> With 16% of all current generation games being able to run on linux, often (like in the case of the new Batman game) even better than on Windows

You are literally making stuff up. The OSX/Linux port of Arkham Knight doesn't even have a release date yet.


And the studio porting it has NEVER made a single game that was bad – while the windows port, made by a different studio... well, you’ve seen it.

And about the 16%: Look at the Steam games catalogue.


> I am not going to quibble

> I'm happy for you

> You are incredibly delusional and should seek some help

Yeah you seem unbiased...


What is the example web browser?


Firefox and Chromium vs Internet Explorer.


Today, no open source browser is clearly more featureful and functional than the closed source alternatives, so it seems like a transient aberration rather than a canonical example.


I mean, the people who maintain far and away the most used closed-source browser, IE[1], are deprecating it and have tried to distance themselves from IE as much as possible in Microsoft Edge. And this is entirely because IE lacks features that Chrome and Firefox have, largely but far from entirely related to standards support for HTML5. I have been using Microsoft Edge since it was available in the Technical Preview, and it is not on par with Chrome or Firefox yet as a daily driver. I hope it gets there someday, but it's not there now.

[1] Safari is kind of a border case -- all the chrome is closed source, but the rendering engine and I think the Javascript engine are open source. I am lumping it in as an open source project here, I can understand why someone might argue the point though.


Ok, I think it's reasonable to accept this. However one could equally well describe these open source browsers, as browsers funded by Google and Apple. Perhaps that's why they are outliers.


Doesn't BRL-CAD [1] count?

[1] http://www.brlcad.org/


I tried BRL-CAD while trying to finish my transition away from Windows to Linux. Although BRL-CAD has features that arguably make it more 'powerful' than Solidworks (the package I had learned previously), it was incredibly less intuitive to use.

To me, it seemed like a software package. Of course, it /is/ a software package, but making models in CAD doesn't feel like writing software to me. Making models in CAD reminds me of playing with LEGOs, a very visual activity. Typing in coordinates just seems wrong.

Now, I failed to effectively learn the software. It is totally possible that if I were a faster learner, or had more perseverance, I would be born again as a BRL-whisperer. I was just hung up on how wrong and difficult it felt compared to SolidWorks.


OpenOffice is "good enough"; where Word isn't an option, it does the job. GIMP isn't even trying; it's not in the same market as Photoshop or casual image editing, which is well served by web apps these days. (One wonders what it is for; padding developer CVs, possibly) Finally, nonlinear video editing hardly qualifies for "general purpose" -- and is there anything worth mentioning among open source offerings?

Mathematica, on the other hand, is squarely in the terrain usually covered by open source.


> OpenOffice is "good enough"...

This is a matter of debate, depending on what circles you're traveling in. If you're working in a "mixed environment" where some people are using OpenOffice and some people are using Microsoft Word, you're likely to run into problems that range from minor annoyance to show-stoppers very quickly. Some of this can be blamed on Microsoft's weird file formats, but this is true not only for .doc files but for the better-documented (if arcane) .docx and, in my experience, even with RTF. I'm in contact with a fair number of small-to-medium press fiction editors, and while a fiction manuscript is one of the simplest use cases you can imagine in terms of formatting, almost every editor I've talked with has complained about OpenOffice (and to a lesser degree LibreOffice) screwing up comments and revision tracking. I don't really want to use Microsoft Word, but so far I haven't found an open source equivalent for my needs that is good enough. (The closed source Nisus Writer Pro and even Apple Pages seem to do better, ironically. But that's not to say they don't have their own problems.)

I tend to think of open source's most well-established terrain as languages and server-side components where UI is not really a major issue.


Microsoft doesn't have the greatest support for OpenOffice/LibreOffice odt either, although they try (https://support.office.com/en-us/article/Differences-between...).

The fact that 90% of the world is using Microsoft's format is certainly relevant for everyday use, but it's orthogonal to the question of whether an open source project can achieve feature parity with a closed source product.

File format compatibility is always hard and I'm not convinced MS does a better job reading odt than LibreOffice does with OOXML. But LibreOffice does have a reasonably comparable feature set, even if the UI is not as attractive.


> LibreOffice does have a reasonably comparable feature set

Pile up all the features Office has but LibreOffice doesn't on one side, pile up all the features that LibreOffice has but Office doesn't on the other. Which pile is larger?

It's kind of a pointless argument over whether or not LibreOffice has ENOUGH features; enough features for who? It's a global question with local answers. It's pretty easy to count who has MORE features, though, and Microsoft Office wins there.


GP was suggesting Word's superiority comes from its ability to read its own file format, which certainly makes it superior in practice, but has little bearing on OSS vs. closed source.

But, FWIW, most of the new "features" Microsoft adds to Word seem to be UI improvements (e.g. https://support.office.com/en-us/article/What-s-new-in-Word-...). And if you want to claim that Office is more usable than LibreOffice, and that this is generally true of closed source vs. open source projects, you won't get any arguments from me.


Did I misunderstand or are you arguing GIMP is for "padding developer CVs"? That's a... fringe opinion, to say the least.


Well, GIMP's UI at least is totally unusable. I ended up using Krita instead, which actually has a UI that makes sense. Additionally, Krita supports a lot more than GIMP – from proper animations and keyframes to all kinds of color spaces. (Yes, 32-bit floats for CMYK color spaces are possible in Krita)


I wonder how so many people manage to use a "totally unusable" program. Krita is nice and all, but it's quite a different product from GIMP, designed specifically for drawing, whereas GIMP is sort of "general stuff", like Photoshop is. And Krita is free and open source as well, so it doesn't support the original claim.

And GIMP doesn't really lack many features Photoshop has, and even has some that Photoshop doesn't. The UI could be much better (especially the controls for transformations), but it's not that horrible either. The reason GIMP sucks in comparison with Photoshop is that the few features it lacks are absolutely game-changing. Like effect masks. If you use effect masks (and pretty much every proficient Photoshop user does), software without them is not an option for you. And I believe it isn't that easy to introduce them to GIMP either.


Yes. But that’s exactly the thing – Krita, while focusing on being a drawing application, has exactly those features that GIMP, a general purpose image editing tool, is missing.

Like colorspaces, effect masks, etc.

I found Krita to be a far better photoshop replacement than GIMP, actually. (With two differences: moving selections is a bit more complicated, the option for that is hidden by default, and it does not have a simple contrast-brightness slider filter)


Yes, GIMP's UI is pretty awkward. No disagreement here.

Still, for a lot of people, GIMP is a very valuable open source alternative to Photoshop; not as full-featured, of course, but then again, nothing is. There are lots of tutorials for achieving interesting effects using GIMP. I've never heard it described as something people learn to pad their resumes.

Krita is pretty cool, too. If I'm not mistaken, it's not a full-featured Photoshop replacement either.


Presumably they meant the resumes of those working on GIMP, not users of it.


Gimp's UI is almost as awkward as its name.


If nonlinear video editing isn't general purpose, then I don't see how Mathematica is.


I'd add Lightroom (± Silver Efex Pro) vs Darktable. It's generally hard for the (generally unpaid) open-source community to outperform something as big as Adobe.


How is Mathematica generally more "feature-full"? What are you comparing it to?



"I wonder which will become self-aware first -- Wolfram Alpha, or Stephen Wolfram."


Attributed to Casey Muratori by http://chrishecker.com/Kurt_G%C3%B6del_is_Laughing_His_Ass_O...

“Does anyone want to bet as to which will gain self-awareness first: Wolfram Alpha or Wolfram, Stephen?”


The Donald Trump of the computer industry.


I once went to a talk by Wolfram to an audience of academics which he finished by saying that if he managed to find the theory of everything (which he seemed to think was a possibility), he probably wouldn't bother telling people, since theoretical physicists wouldn't understand it.


But isn't this the ultimate point of "science studies" (broadly put) since the days of Kuhn, Lakatos, Latour and so on?

I mean, Wolfram's bizarre confirmation bias about discrete automata is (apparently) apparent to the computer-y people because we've seen discrete automata and have goofed with all sorts of complexity-generating simple formalisms (L-systems, genetic algorithms, ...) that are impressive for giggles (and some industrial applications) but don't amount to a kind of Kurzweilian Transcendence.

... but we're not all mathy enough to pick apart what string/brane theorists or even orthodox quantum gravitationists are doing and say with (possibly misguided) confidence that there's no hope there for a theory of everything. The people who do are continental philosophers who tend to get laughed out of the room, often because they have silly overarching theories (Zizek has a couple of points about quantum physics, but then, he thinks psychoanalysis explains human history) that must be relentlessly mocked.


> The people who do are continental philosophers who tend to get laughed out of the room

And for good reason. For example, philosophers (Zizek included) raved about Badiou's Number and Numbers. It is a nice history of the approaches to formalizing the concept of number, but there is no philosophy there. As soon as Badiou or Derrida or any other continental philosophers try to do anything else with mathematics it just becomes a grossly inappropriate and confusing analogy.


>he thinks psychoanalysis explains human history

The kind of theoretical psychoanalysis Zizek talks about is not really classic psychoanalysis at all. Even Marxism at the lowest level is a theory of psychology.


I think Stephen may be confusing communication with understanding. True genius inspiration can be hard or even impossible to communicate as languages are an imperfect representation of what happens in our minds, especially the "genius mind."

Not that it excuses his sentiment at all. Seemingly too arrogant to take the time to contemplate himself in an honest light.


I'm not sure I get it. Is that a dig at how computationalism is met with skepticism among theoretical physicists?


More likely a dig at Wolfram's gargantuan ego.


I wonder if he would have the balls to say that if Feynman was still around?


For what it's worth, Feynman was on Wolfram's dissertation committee:

http://thesis.library.caltech.edu/2597/


Wolfram has always seemed to me to have the biggest case of "Not Invented Here" syndrome I've ever heard of. I mean, not only did he have to make his new language, new search engine, etc. He even tried to make a new kind of science...

If that's not extreme NIH, I don't know what is.


Related to this: I was at UC Berkeley in the early 80's (and did the port of Vaxima (the VAX version of Macsyma, running on Franz Lisp) to the Motorola 68000). There was definitely a rivalry between UCB and Caltech in the symbolic algebra area, and I think the "100x slower" was just an excuse to do it differently. I don't think he would have wanted to appear to be following anyone at UCB.


Well, when "not invented here" is accompanied by "not invented anywhere else, either", we usually just call it "invented".


Agreed. I remember first reading an overview of Lisp [1], and my reaction was, "Ah, okay, just like they do in Mathematica ... ah, right, just like how Mathematica does that. Huh -- he must have done it in Lisp then."

After that, I didn't even suspect it didn't use Lisp until today.

[1] I think in Hofstadter's Metamagical Themas


>If that's not extreme NIH, I don't know what is.

Worked well for him though...


Absolutely; I'm not knocking his achievements. It's a rare intelligence who can work through NIH and succeed so well.


I wonder if Wolfram's 100-times-slower reference point was based on a very particular comparison. Suppose you benchmarked a numerical algorithm that dealt with a lot of matrix-vector operations. And suppose you compared, on the one hand, a C or Fortran implementation that used arrays and native machine finite-precision numbers, against a Lisp implementation that used lists and the sort of "exact" arithmetic described in the usenet post.

This would not, of course, be a good comparison between the languages, but it might well give you a 100x speed difference. Taking Pitman's story at face value, it sounds like Wolfram wasn't very sophisticated about these things, and wasn't open to listening to explications. So he might have extrapolated a single dumb benchmark into a universal truth.

At the same time, it sounds like Pitman tried to convince Wolfram to represent floats as the ratio of two bignums, "with lots of other hidden bits to assure that any decimalization had enough bits to be precise." I can understand Wolfram not particularly wanting to taste any of that pie. I have worked with a simulation platform that uses paired BigNums just for time values (other values in the simulation use regular double-precision variables), and it was a big drain on simulation speed.
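The cost of such a scheme can be sketched in a few lines. This is a hypothetical illustration (not SMP's or Macsyma's actual code) of why carrying exact rationals through an iteration is so much more expensive than machine floats:

```python
from fractions import Fraction

# Hypothetical illustration: iterate x <- x/3 + 1 exactly.  The reduced
# denominator grows as 3**n, so each step multiplies ever-larger bignums,
# while the float version does constant-time work with bounded rounding error.
x = Fraction(1)
for _ in range(100):
    x = x / 3 + 1
assert x.denominator == 3**100   # exact state keeps growing

y = 1.0
for _ in range(100):
    y = y / 3 + 1
assert abs(y - 1.5) < 1e-9       # fixed point of the update is 3/2
```

A representation that pairs bignums per value, like the simulation platform described above, pays this growth on every arithmetic operation, which is plausibly why it dragged simulation speed.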


I could also see a matrix representation as lists of lists, and when you do anything you're always consing, and replacing elements of lists, and walking lists to get to elements. Compare that to a C-style 2D array of float or double, and Lisp looks horrible - plausibly 100x as bad, and worse as the size of the matrix increases.

Of course, the problem with that is the Lisp data representation used, not Lisp itself.
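The layout difference can be made concrete with a minimal Python sketch, where `array` stands in for a C-style contiguous block and nested lists for the cons-heavy representation (the values and sizes here are arbitrary):

```python
from array import array

n = 4
# List-of-lists: each row is a separate heap object, so reaching element
# (i, j) chases a pointer per level, and updates allocate fresh cells.
m_lists = [[float(i * n + j) for j in range(n)] for i in range(n)]

# Flat row-major storage, the layout a C `double m[n][n]` gives you:
# one contiguous block, where (i, j) is just index arithmetic.
m_flat = array('d', (float(k) for k in range(n * n)))

assert m_lists[2][3] == m_flat[2 * n + 3] == 11.0
```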


Except Lisps that were intended for numerical work, like MACLISP on PDP-10s at the time, did have arrays. And as noted, MACLISP was at that time faster than DEC's FORTRAN (an issue DEC fixed not too much later).

Lispers aren't stupid, which should be distinguished from how easy it is to make a simple Lisp. Making a performant one takes effort on the scale of making any similar language implementation good and fast.


I never said that Lisp didn't have arrays. I said that representing a matrix as a list of lists would not perform well.

What I meant to be saying is that Wolfram plausibly may have looked at a bad data representation for matrices on Lisp, and concluded that Lisp was inherently 100x as slow.


Either that, or a rough estimation of the problems he expected to face moving forward -- this is a decades-old company, an amazing outlier in this industry -- made him think this would always be a relevant bottleneck.

We're able to use higher-level languages because they've developed the trick of identifying inner loops and hard coding them in C (this is how we get Torch and train big-ass neural networks in Lua).


I think it's also important to note that Macsyma (or at least Maxima) has a lot of cruft in the code. I actually like Weyl better, even if it hasn't seen active development in ~2 decades.

P.S.: Maxima actually uses a list-of-lists to represent a matrix.


The author mentioned that the Lisp in question and Fortran could operate at the same speed. My impression is that the big use of Fortran has always been numerical algorithms with a bunch of matrix-vector operations — is that not the case? Because if it was like against like it doesn't make any sense, but if Lisp were as fast at matrix operations I figure we still wouldn't be using LAPACK for that stuff.


Matrix-vector operations are part of BLAS, and most performant BLAS packages (e.g. MKL and OpenBLAS) are written in assembly or machine-generated using a special-purpose codegen. You can't write the gemv/gemm kernel in any general-purpose language and expect to beat modern BLAS. But you can call BLAS/LAPACK from whatever language you want. This is how MATLAB succeeds even though it has a dog-slow interpreter.
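For contrast, here is the dgemv contract written naively in the host language — a sketch of the inner loop that MKL/OpenBLAS replace with blocked, vectorized assembly, and the loop a slow interpreter avoids by calling out to BLAS once per operation:

```python
def gemv(alpha, A, x, beta, y):
    """y <- alpha*A@x + beta*y (the BLAS dgemv contract), written naively.
    Real BLAS kernels block this loop for cache reuse and vectorize it."""
    n_rows, n_cols = len(A), len(A[0])
    return [alpha * sum(A[i][j] * x[j] for j in range(n_cols)) + beta * y[i]
            for i in range(n_rows)]

A = [[1.0, 2.0], [3.0, 4.0]]
assert gemv(1.0, A, [1.0, 1.0], 0.0, [0.0, 0.0]) == [3.0, 7.0]
```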


Well, it's not entirely surprising to hear an anecdote about a young (19 year old), brilliant programmer arrogantly dismissing legitimate possibilities.


He's still arrogantly dismissing them.


Fair enough.


Random Wolfram story: yesterday I was at lunch at my old college with a respected computer scientist, the one who originally interviewed me. He told me that he first showed Wolfram a computer.

This guy was a friend of Wolfram's mother who asked for help because 8 year old Stephen was "bored with dinosaurs" and wanted something else to think about.


How did the respected computer scientist feel about having unleashed Stephen Wolfram upon the computing world? :)


Well at least he diverted him away from dinosaurs, otherwise Jurassic Park might have been a documentary!


Wolfram himself has a far more detailed write-up on the development of SMP and Mathematica.

http://blog.stephenwolfram.com/2013/06/there-was-a-time-befo...

It might be biased or self-serving but it's more interesting than an offhand USENET recollection about a brilliant and arrogant young man.


People's experience with Stephen Wolfram is that he is very biased and self-serving, hence the popularity of an anecdote from an external source.


Seems like a good example of technical people placing too much importance on a technical decision.

It seems clear Lisp would have been a perfectly fine choice, as would a number of other languages. It also seems unlikely the world (or even Mathematica) would be dramatically different for it.

I wish more "X is better than Y because of Z" discussions would admit aesthetics were an important factor. Instead we end up with convoluted justifications that just annoy everyone (Lisp is slow, C is fast, C++ is complicated)


Wolfram's decision at worst did not impede the development of a tremendously successful software product. At best it enabled it. Mathematica is 27 years old. It predates the i486, the first mass-market x86 CPU with an integrated FPU. In the early 1980's, betting on Lisp was betting on Lisp machines and workstation-class hardware. The first version of Mathematica ran on M68000 Macintosh machines. It's hard to imagine overcoming the pain that something slower than C would have inflicted on users.


> The first version of Mathematica ran on M68000 Macintosh machines. It's hard to imagine overcoming the pain that something slower than C would have inflicted on users.

According to this (https://en.wikipedia.org/wiki/Macsyma#Commercialization) Wikipedia article, Macsyma (a Lisp-based CAS that Mathematica was designed to compete with) was running on 68000 Sun-1s in the mid-80s and a Windows port came out about a year after Mathematica came out for the Macintosh (1989 and 1988, respectively). The pain must have been real because Mathematica ended up with all of Macsyma's market: "Macsyma's market share in symbolic math software had fallen from 70% in 1987 to 1% in 1992"


For this domain which is largely compute bound when it counts, for the PC systems it was targeting back in those days, C was almost certainly the right choice if you had enough resources for the extra programming work required.

Commercial Macsyma is ... a special, and rather sad, case. As explained to me by Danny Hillis in 1982-3, when he wished my company, LMI, could provide Lisp Machines for Thinking Machines, Inc. to help develop the Connection Machine 1, one motivation was Symbolics' pathological business practices, and the nastiest example then was Macsyma. Back then the MIT Technology Licensing Office was still horrible; it was arranged that Arthur D. Little, I think, recommend how Macsyma be licensed, and it ended up being licensed exclusively to Symbolics, which was not a common approach.

As far as we could tell, Symbolics bought it primarily to keep it out of the hands of LMI and anyone wanting to run it on conventional hardware, and went so far as to try to get people who had Vaxsyma copies to send them back and stop using it, which was not well received, as you might imagine. That helped Fateman, using the DoE, which had sponsored much of the work, to force MIT to release a snapshot as open source, which eventually became Maxima.

In house, Symbolics treated it with benign neglect, and I don't think it was massively improved. This situation became ironic as their hardware business declined and the Macsyma unit became an important cash cow, but as these things go, for the usual internal political reasons, it never got development resources commensurate with its actual status and potential.

MIT also didn't reward the people who'd originally written it at MIT, something Joel Moses, who I happened to be directly reporting to in the 1987-8 time frame when he was the EECS department head, was obviously not happy with, along with I'm sure many others. So in short, Symbolics did nearly everything they could to mess up the Macsyma community and product, and the noted decline, once there were good alternatives, was inevitable.

RMS is not 100% wrong in his loathing of Symbolics....


> In house, Symbolics treated it with benign neglect, and I don't think it was massively improved.

There is a lot of butthurt from various people, who seem to know better how to run a company in hindsight. Symbolics sold Macsyma on various platforms: Lispm, Windows, DEC VAX, Sun Unix.

> it never got development resources commensurate with its actual status and potential.

Symbolics was still selling Macsyma at a time when competitors like LMI or TI were no longer in the Lisp business.


> There is a lot of butthurt from various people, who seem to know better how to run a company in hindsight. Symbolics sold Macsyma on various platforms: Lispm, Windows, DEC VAX, Sun Unix.

The exclusive licencing to Symbolics resulted in a delay in porting it to non-Lispm platforms, due in part to the cited internal opposition. And not too much later Symbolics effectively exited the market; they too have a lot to say about how not to run a company.

> Symbolics was still selling Macsyma at a time when competitors like LMI or TI were no longer in the Lisp business.

Yet for some explicable reason people stopped buying it, and its market share crashed from 70% to 1% in 5 years.


Between Macsyma, S-Graphics, and Statice Symbolics could have been a really great software business. S-Graphics was sold off to Nichimen and developed and marketed as Mirai until the early 2000s, and the Statice guys all left to make ObjectStore. I read on Usenet somewhere that there was an unsuccessful attempt to acquire Macsyma as a separate business.


> Between Macsyma, S-Graphics, and Statice Symbolics could have been a really great software business.

How so? Don't you think they tried and explored that? In reality, by the early 90s nobody was interested in Lisp anymore. Macsyma was still sold to the market, but didn't have much success. Nichimen's N-World had a small customer base, ran on SGIs (which were still expensive) and then under Windows NT. Statice on non-Lispms never left beta status.

> I read on Usenet somewhere that there was an unsuccessful attempt to acquire Macsyma as a separate business.

That's false. See Macsyma, Inc. The company was founded in 1992 and in 1999 acquired by 'Symbolics Technology'.


Fun fact: Symbolics continues to sell 'Macsyma'.


To me it sounds more like NIH syndrome. Wolfram was young, smart, cocky, and didn't really know much about computation when he started - but was convinced he could do a better job if he started from scratch.


> Wolfram was young, smart, cocky

He remains two of those things to this day. I have no doubt Wolfram is legitimately a genius (despite his arrogance and tendency to take credit for other people's inventions) and I adore Mathematica as it is, but I can't help but wonder how much better it would be if Wolfram's personality was less... abrasive.


I've never met him but have read some of his stuff over the years and talked to a few people who have (met him).

He strikes me as clearly very smart, just not as smart as he thinks he is (not uncommon, but rarer at his level of talent, I think). An aside: for what it's worth, I find the "genius" label problematic in general, not just for him. I think it's a concept that probably has useful application, but to < 1% of the people to whom it's applied.

His abilities as a technical writer are middling, unfortunately, and at some point being able to communicate your ideas is almost as important as the ideas.

I too wonder how much further Mathematica could have got if he hadn't pushed away a number of clearly talented people.


I agree genius is overused and try to avoid it. My point was that I get the impression he might really deserve it, but his personality leaves you with a distinctly bad aftertaste. I've only met maybe 2 or 3 people I really thought deserved the label, though I guess it's hard for me to judge given that I'm certainly not one. I do, however, mean a genius as measured purely by ability... in my mind there is a distinction between people who have the abilities of a genius and people with the achievements of a genius. Hard work by normal people most often produces the latter, whereas the former is much harder to identify.

And I agree wholeheartedly... I once had students ask me why I put so much emphasis on lab reports in an [upper level engineering course]. My sincere belief is that at least 50% of working in a technical field is your ability to communicate. The most genius answer has little value if you can't properly articulate it.

ANKOS is the exact example I have in mind when I think about this. Really I find the premise that cellular automata are somehow fundamental to computation and the universe interesting (though maybe I don't buy into it to the same degree as Wolfram), but his presentation of this thesis is so dreadfully tedious and conceited as to squash any desire I might have to investigate it.


I heard that when he was writing and editing ANKOS, he rejected a picture of a panther that was to be used to illustrate reaction-diffusion textures in nature, because he didn't like the expression on its face. ;)


Perhaps the best property of ANKOS is that it is a clear demonstration of what sort of trouble you can get into when a book doesn't have an (effective) editor.

I'm certain there is a small, interesting, well written book hiding in all that verbiage somewhere, but it's well hidden.


It appears that the discussion of arbitrary precision vs. machine precision is obsolete, as it does not seem to apply to Mathematica; maybe it used to be true for SMP, though I'm not sure.


I don't know why I've gotten downvoted for this, but if you reread the post you will see a bunch of references to things that just haven't been true of Mathematica since v1.0.


This might be obvious to many, but the language of Mathematica has better support for ("abstract") currying than Lisp, something like

A[B, C][X, Y][Z]

does not straightforwardly translate into an S-Expression but is potentially convenient for symbolic algebra. I believe that I read this emphasis on currying in one of Wolfram's own accounts on how Mathematica came to be. It is probably also important that the internal data structure used by Mathematica is not a list, I think the support for "level specs" points at how internally the data is most likely represented, although I'm not sure.


It translates very directly.

(((A B C) X Y) Z)

We could clean up this syntax and write a macro. Let's call that M. Then one could write

(M A (B C) (X Y) (Z))

The definition of M is quite simple.

Secondly, Lisp only uses lists for the representation of most of its source code. Actual computational data structures of course need not be lists. Mathematical expressions need not be represented as lists in Lisp.
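To make the correspondence concrete, here's a small Python sketch (a hypothetical model, not either system's internals) that builds an application chain as nested (head, args) tuples; both notations denote the same tree:

```python
# Hypothetical model: represent "head applied to args" as a (head, args) tuple.

def apply_head(head, *args):
    """Build one application node: head applied to args."""
    return (head, args)

# Mathematica-style A[B, C][X, Y][Z], built left to right:
m_expr = apply_head(apply_head(apply_head("A", "B", "C"), "X", "Y"), "Z")

# Lisp-style (((A B C) X Y) Z) spells out the same nesting directly:
s_expr = ((("A", ("B", "C")), ("X", "Y")), ("Z",))

assert m_expr == s_expr
```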


I'm not sure they are quite equivalent; the semantics of pattern matching tend to be different. In Lisps, lists are always a kind of binary tree, so (list 1 2 3) is really (1 . (2 . (3 . nil))). This has a consequence for unification: (list 1 2 3) will match (x y) as 1 and (list 2 3). In Mathematica, on the other hand, lists tend to be flatter, if you will, and there is this concept of Sequence which I sometimes find problematic.


I don't think that's true: (x y) is (x . (y . nil)) and shouldn't match (1 . (2 . (3 . nil))), because nil would have to match (3 . nil).
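A small Python sketch of the cons-cell model (an illustrative toy matcher, not any particular Lisp's unifier) bears this out: the two-element pattern (x y) fails against (1 2 3), because nil would have to match (3 . nil):

```python
# Toy cons-cell model: Lisp lists as nested pairs, None for nil.
# In the matcher below, strings act as pattern variables.

def cons_list(*items):
    """Build (a . (b . (... . nil))) from items."""
    out = None
    for item in reversed(items):
        out = (item, out)
    return out

def match(pattern, value, env=None):
    """Return a binding dict if pattern matches value, else None."""
    env = dict(env or {})
    if isinstance(pattern, str):          # variable: bind or check
        if pattern in env:
            return env if env[pattern] == value else None
        env[pattern] = value
        return env
    if isinstance(pattern, tuple):        # cons cell: match car and cdr
        if not isinstance(value, tuple):
            return None
        env = match(pattern[0], value[0], env)
        return match(pattern[1], value[1], env) if env is not None else None
    return env if pattern == value else None  # literal, including nil

two = cons_list("x", "y")                 # the pattern (x y)
three = cons_list(1, 2, 3)                # the value (1 2 3)
print(match(two, three))                  # None: nil cannot match (3 . nil)
print(match(two, cons_list(1, 2)))        # {'x': 1, 'y': 2}
```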


In Clojure,

  (-> A [B C] [X Y] [Z])


"During the dinner discussion leading up to this definition the foreword to one of the Mathematica books was mentioned, where Stephen Wolfram (in third person) wrote "Stephen Wolfram is the creator of Mathematica and is widely regarded as the most important innovator in scientific and technical computing today." In honour of this self-assessment I suggest we call the unit of ego the Wolfram." - "Monumental egos" blog post by Anders Sandberg

http://www.aleph.se/andart/archives/2009/04/monumental_egos....


What's the conversion factor between Wolfram's and nanodijkstras[1]?

This would be a great question to answer using Wolfram Alpha, since it does have nanodijkstras[2] and Wolfram[3]. Alas, the conversion is beyond it[4].

[1]: https://www.youtube.com/watch?v=Xoyw8LHGtzk

[2]: http://www.wolframalpha.com/input/?i=nanodijkstra

[3]: http://www.wolframalpha.com/input/?i=wolfram

[4]: http://www.wolframalpha.com/input/?i=wolfram+to+nanodijkstra


Another possible reason would be: Macsyma was already in LISP.


Market differentiation is a powerful reason.


Isn't Mathematica very Lispy? It looks like R-expressions.


The syntax looks a lot like M-expressions, but the evaluator is about as far from a normal Lisp as you can get.

Mathematica uses a fixed-point evaluator; that is, expressions continue evaluating until they reach some stable state. Every common Lisp I've seen will only evaluate once unless you explicitly code for it.

While this evaluation behavior is very useful for mathematics, I find that it makes Mathematica a poor general-purpose programming language. It's pretty easy to wind up with confusing evaluation behavior, and controlling evaluation in Mathematica is a very complicated topic. There are myriad language constructs used to do so: Unevaluated, Hold, HoldForm, HoldAllComplete, other Hold*; see http://stackoverflow.com/questions/1616592/mathematica-uneva... and http://library.wolfram.com/infocenter/Conferences/377/ for more information.
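For illustration, here is a deliberately tiny Python sketch of the evaluate-until-stable idea (not Mathematica's actual algorithm); with chained definitions, a single-pass evaluator stops after one rewrite, while a fixed-point evaluator keeps going until nothing changes:

```python
# Toy fixed-point evaluator: apply rewrite rules until the result is stable.

def fixed_point_eval(expr, rules, max_steps=100):
    for _ in range(max_steps):
        new = rules.get(expr, expr)
        if new == expr:            # stable: no rule rewrote the expression
            return expr
        expr = new
    raise RuntimeError("no fixed point reached within max_steps")

# Chained definitions, as in a := b; b := c; c := 42
rules = {"a": "b", "b": "c", "c": 42}
print(fixed_point_eval("a", rules))   # 42, via a -> b -> c -> 42
# A single-pass, Lisp-like evaluator would have stopped at "b".
```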


Well, M-expressions, to be more precise. Actually, if you ask some people, they'll tell you that Mathematica is the only Lisp that implemented them.


And with good cause, I think. Mathematica syntax (both the verbose form and with the various syntactical sugar) is immensely more readable to the average person than Lisp. Of course, this could have been done on top of Lisp anyways (and wouldn't that be nice?)


Surprise: Macsyma, which is written in Lisp, has that syntax. Macsyma is the earlier computer algebra system, which Kent Pitman mentioned in that post. He also mentions that Wolfram knew Macsyma.

Example:

    applysymmetry(exp,opdum,symtype):=block(
      [getdum:get(opdum,symtype),piece,inflag:true,partswitch:true],
      if getdum=false then return(exp),
      subst(lambda([[arglist]],
                   apply('aplsym1,append(getdum,[arglist,opdum]))),
            opdum,exp))$

If you know Lisp, you will recognize typical constructs like IF, RETURN, SUBST, LAMBDA, APPLY, APPEND, ...


And this is an entirely normal usage of lisp: define a domain specific language (DSL) on top of the lisp, to manipulate the natural objects of the domain in a natural way.

I've always thought Mathematica would have been a better system built on top of a "real" Lisp, and would have got there faster. I've never heard any of Wolfram's statements about the parallels that have made me question that, but I could be missing something.


This still seems more limited than the Mathematica syntax, which includes postfix operators, for example. I also appreciate that parentheses are used only for grouping, not for function calling and such. I'm not saying this couldn't have been done with Macsyma or on top of Lisp more generally (indeed, that would be nice), or that Mathematica wasn't influenced by Macsyma; I'm saying that purely for syntax, I think Mathematica ended up with a better, more extensive, and easier-to-use system than pretty much any other language.


Macsyma also has postfix operators.


> Well M-expressions to be more precise.

M-expressions is what I meant, I think I misremembered the name.


Or M-LISP. Or Logo for that matter (loosely).


Yeah, I don't really believe the claim that it's the only one; it's probably fair to say it's the most widely used, though.


Dylan is a lisp with readable syntax.


I think it's obvious that Wolfram won't use anything he can't take full credit for...


Maybe KMP should have recycled Jerry Pournelle's account for a certain bright 19-year-old, instead... ;)

http://www.stormtiger.org/bob/humor/pournell/story.html

"Personally, I'd just turn off his account. It's not like it's the first time, and he not only flaunts his use of our machines but stabs us in the back with grumblings about why he doesn't like this or that program of ours when he gets a chance. (Am thinking particularly of an article he wrote which condemned Lisp for reasons amounting to little more than his ignorance, but which cited Teach-Lisp in a not-friendly light... The man has learned nothing from his presence on MC and sets a bad example of what people might potentially accomplish there. I'd rather recycle his account for some bright 12-yr-old...)" -KMP


He'd already used Macsyma on MIT-MC, had written his own SMP, left Caltech due to IP issues WRT SMP, and was in 1985 working on cellular automata at the IAS while no doubt arranging for SMP Mark II, AKA Mathematica, which he started the next year....

Interestingly, Pournelle was the kind of guy who could exercise Macsyma, or at least his first computer program solved a system of 60 or so linear equations on an IBM 650, an "affordable" machine back in the '50s that used a magnetic drum for its main memory.


Yeah, with tourists like Jerry Pournelle, who needs Stephen Wolfram?

"Mr. Pournelle bids me tell you that if you intended to annoy him, you have succeeded, and that his next column in BYTE will have a lot to say about the ARPANET...."

"One thing that is known about ARPA: you can be heaved off it for supporting the policies of the Department of Defense. Of course that was intended to anger me. If you have an ARPA account, please tell CSTACY that he was successful; now let us see if my Pentagon friends can upset him. Or perhaps some reporter friends. Or both., Or even the House Armed Services Committee."


Chris was really, really stupid to purge JERRYP for badthink; if Jerry had been intent on revenge vs. reform, he could have done a lot of damage to certain parties, although not of course the Internet itself, which that year was starting to be substantially extended by the NSFNet.


Threatening to sic the House Armed Services Committee on the MIT AI Lab is "badact".

And by the way, it was POURNE, not JERRYP:

    MIT Maximum Confusion PDP-10

    MC ITS.1488. PWORD.2632.
    TTY 57
    16. Lusers, Fair Share = 86%

    *:login pourne
    That account has been temporarily turned off.
    Reason:
    Think of it as evolution in action.
    Any questions may be directed to USER-ACCOUNTS
    *
And you have to admit, he did ask to be dumped off the net, so he got what he asked for (although he failed to deliver what he promised in return: seppuku):

"thank you. if left to you I suppose I cewrtainly will find my accounts terminated. Your nice private message appreciated. seppuku follows.. maybe you ought to have me dumped off the net and be done with it? or must you work through someone else? J. E. Pournelle"


But I have to strongly disagree: POURNE was really, really stupid for thinking AND acting bad, and Chris/Gumby/Klotz made exactly the right call. The ARPANET was not Jerry Pournelle's government sponsored entitlement program.


If he had used Lisp, how would he have ported it to so many machines? I programmed in Common Lisp quite a bit in grad school and I liked it, but it was always on a specific platform. I would use Lisp even now, but I'll often use C/C++ just because I know I can get it to work most everywhere.


SBCL runs pretty much everywhere: http://sbcl.org/platform-table.html


It didn't when Mathematica was being developed. CMUCL on i386 wasn't released until 1996.

Kyoto Common Lisp was available in the late 80s and was fairly portable since it compiles to C.


Wolfram has said fellow student Rob Pike (who went on to co-create Go, etc.) convinced him to use C, though I've never seen the reasons enumerated. So while he certainly comes off as arrogant and certainly reinvented a lot of stuff, it isn't like he didn't listen to anyone...


say what you will about Wolfram, but he took a risk, got things done, and shipped something people wanted


What language features distinguish Mathematica from lisp? Are we just talking about the m-expression syntax, or are there more fundamental differences?


Here's what Wolfram wrote about Mathematica vs lisp in the first Mathematica book: http://reference.wolfram.com/legacy/v1/contents/4.2.7.pdf

I don't know enough about lisp implementations to know if these differences still hold, but they still hold for Clojure and ClojureScript, for example. Though some people have done some experimentation like https://github.com/kovasb/combinator

In the email under discussion though Wolfram was beginning work on SMP, which preceded Mathematica by about 6 years and influenced the design of Mathematica both in what to do and what not to do: http://blog.stephenwolfram.com/2013/06/there-was-a-time-befo...


Thing is, you wouldn't develop a system like Mathematica without writing those sorts of things on top of your Lisp implementation. That's what MACSYMA, the first of these programs, did; he used it prior to starting on SMP, and it even had a similar syntax, per this comment https://news.ycombinator.com/item?id=9798643


Mathematica's evaluator is based much more on term rewriting and defaulting free variables as symbol objects. Lisp requires everything to be much more explicit (which has many pros and a few cons).
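To illustrate the "free variables default to symbol objects" point, here's a toy Python sketch (an assumed model, not either system's implementation): a Mathematica-style lookup lets an undefined name stand for itself, where a Lisp-style lookup signals an error:

```python
# Assumed model of the two lookup behaviors.

def mma_style_lookup(name, env):
    """Unknown names stay symbolic: the name evaluates to itself."""
    return env.get(name, name)

def lisp_style_lookup(name, env):
    """Unknown names are an error, as with an unbound Lisp variable."""
    if name not in env:
        raise NameError(f"unbound variable: {name}")
    return env[name]

env = {"x": 7}
print(mma_style_lookup("x", env))   # 7
print(mma_style_lookup("y", env))   # 'y' stays a symbol
# lisp_style_lookup("y", env) would raise NameError
```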


Unless you write a DSL that evaluates in the manner you want.


This quote struck me "I'm pretty sure his response was that what tools others chose to use or not use was not his concern"

On the one hand I feel a conflict of wanting to use every new, shiny tool. But a dose of this attitude would help me get a lot more work done.


The only reason I still subscribe to c.l.l is for awesome stories like this


Wolfram is extremely cocky for a bald man.


I'll bite.

What does hair (or lack thereof) have to do with anything?


For some reason, that's actually a pretty funny troll. He should've added on to it though. "If he's so smart, he would've found the cure for baldness already".


Grass doesn't grow on a busy street...


Parts of him are actually quite hairy. But fortunately, just like his theory of everything, he doesn't bother telling people, since doctors wouldn't understand it. [1]

[1] https://news.ycombinator.com/item?id=9798776


TL; DR: Stephen Wolfram has high self-regard


I always just assumed it was mostly because of the desire to have a proprietary system he could generate revenue out of, instead of wading into GPL-esque territory that tends to go along with LISP.

It's also one of the main reasons I have been using Octave and Python/Anaconda.


It was first released a year before the GPL.

One thing I'm sure was very significant was an IP fight with his institution, Caltech, which prompted him resigning in 1983, the same year RMS gave up on Lisp Machines and started the GNU project.


Most proper Lisps are proprietary. What GPL scheme?

None of the GPL Lisps compare to what is offered by the likes of Allegro Common Lisp and similar environments.


> None of the GPL Lisps compare to what is offered by the likes of Allegro Common Lisp and similar environments.

Care to elaborate?


Allegro Common Lisp and Franz Lisp provide IDE tooling reminiscent of the Lisp Machines: visual debuggers, a graphical REPL, hypertext documentation triggered from the REPL, libraries to integrate with the OS, and so on.

Additionally their optimising compilers are quite good.


Didn't realize; the only Lisp I regularly interact with is Emacs Lisp...



