What is the best programming language?

post by lsparrish · 2012-05-26T00:58:05.309Z · LW · GW · Legacy · 98 comments

Learning to program in a given language requires a non-trivial amount of time. Still, learning to program seems to be generally agreed upon as a good use of LessWrongers' time.

Each language may be more useful than others for particular purposes. However, as with e.g. the choice of which charity to donate to, we shouldn't pretend that the trade-offs of focusing on one versus another don't exist.

Suppose I know nothing about programming, and I want to make a choice about what language to pick up beyond merely what sounds cool at the time. In short, I would want to spend my five minutes on the problem before jumping to a solution.

As an example of the dilemma, if I spend my time learning Scheme or Lisp, I will gain a particular kind of skill. It won't be a very directly marketable one, but it could (in theory) make me a better programmer. "Code as lists" is a powerful perspective -- and Eric S. Raymond recommends learning Lisp for this reason.

Forth (or any similar concatenative language) presents a different yet similarly powerful perspective, one which encourages extreme factoring and the use of small, well-considered word definitions for frequently reused concepts.

Python encourages object oriented thinking and explicit declaration. Ruby is object oriented and complexity-hiding to the point of being almost magical.

C teaches functions and working at varying abstraction levels. JavaScript is more about the high-level abstractions.

If a newbie programmer focuses on any one of these, they will come out of it a different kind of programmer. If a competent programmer avoids one of them, they will avoid different kinds of costs as well as different kinds of benefits.

Is it better to focus on one path, avoiding contamination from others?

Is it better to explore several simultaneously, to make sure you don't miss the best parts?

Which one results in converting time to dollars the most quickly?

Which one most reliably converts you to a higher value programmer over a longer period of time?

What other caveats are there?

98 comments


comment by fubarobfusco · 2012-05-26T04:33:15.583Z · LW(p) · GW(p)

As far as I can tell, there is no such thing as a good programmer who knows only one programming language.

There just isn't. The nature of the field is that raw computation is quite alien to ordinary human thought; and that the best way to get anything even remotely resembling a reasonable grasp on it is to come at it from multiple angles.

You should know about functional abstraction, which you're not going to get very much of out of C or Perl, but you might get out of Lisp or Haskell.

You should know something about how the machine operates, which you're not going to get out of Java, Python, or Haskell, but you might get out of C, Forth, or assembly. And reading about the actual architecture you're working on. (It's easier to do this on a dinky architecture. I recommend 8-bit AVR microcontrollers.)

You should be eager to reuse other people's code to solve your problems, which you'll learn from an open-source, big-library language like Python, Perl, or Ruby, but probably not from a small, standardized language like C or Scheme.

You should learn about data structures, which you'll only get deeply if you have to implement them yourself, which is really only something you do in a CS class — but you should also learn about not reimplementing data structures that someone has already done for you better than you can do yourself. You can pick that up in any language with good data-structures libraries, like C++ with STL, Python, Haskell, etc.

You should learn about types as an expressive system and not merely a burden, which you'll get from a strongly-typed, type-inference language like Haskell, but not from a heavyweight manifest-typing language like Java or C++.

You should know how the OS works, which in most systems today you can really only get by learning C. You should know about system calls, the standard library, sockets, and so on. You should know why we don't use gets(3) any more.

You should know about Lisp macros because there's really nothing else comparable in any other language. Template Haskell doesn't really count, and C macros especially don't count.

You should learn about parsing the traditional way; and then learn about monadic parser combinators in Haskell, because they are fucking awesome.

You should know what a buffer overflow is, what a format-string exploit is, what an SQL injection is.

You should know about relational databases. I could go on about relational databases because they are deeply misunderstood by most programmers and actually contain a wealth of hidden awesomeness; but I won't, because it would go on forever, and much smarter people like Chris Date have written it already. Suffice it to say that a relational database is not just a place to stuff variables that you want to keep around for a while; it is a place to stuff facts that you want to draw inferences from. There — that makes it sound suitably AI an' shit.
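
To make that last point concrete, here is a minimal sketch using Python's built-in sqlite3 module; the schema and rows are invented for the example, not taken from the comment:

import sqlite3

# Store facts, then let the database infer something that was never stored directly (via a join).
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE person (name TEXT, city TEXT);
    CREATE TABLE city (city TEXT, country TEXT);
    INSERT INTO person VALUES ('Ada', 'London');
    INSERT INTO person VALUES ('Alan', 'Cambridge');
    INSERT INTO city VALUES ('London', 'UK');
    INSERT INTO city VALUES ('Cambridge', 'UK');
""")
for name, country in db.execute(
        "SELECT p.name, c.country FROM person p JOIN city c ON p.city = c.city"):
    print(name, "lives in", country)   # the person-to-country pairing is inferred, not stored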

You should learn how to work with a nontrivial runtime that does things like garbage-collection, which you won't get out of C or C++ but you might get out of Lisp, Java, Python, Ruby, or Go ... if you pay attention.

You should learn about concurrency, which you can maybe get by learning about threads and the like in a language like Python, C++, or Java, but you'll probably learn better in a language like Erlang, Go, or Clojure that treats concurrency as a first-class concern.
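
As a minimal Python-flavored sketch of the "threads and the like" side of that (the numbers and the sleep are just stand-ins for real blocking work):

from concurrent.futures import ThreadPoolExecutor
import time

def slow_square(n):
    time.sleep(0.1)          # stand-in for I/O or other blocking work
    return n * n

# A thread pool runs the calls concurrently; results come back in input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    print(list(pool.map(slow_square, range(8))))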

Anyway, as should be obvious, computation is big and there is a lot to learn. But you can pretty much jump in anywhere. There's no right or wrong starting point, so long as you keep in mind that it's a starting point. Don't let anyone convince you that their personal pet language or system is all you will ever need. They're just trying to get you to join their phyg, and phygf exist at your expense.

Computing is a field that you can keep doing for a long time and still have lots to learn. This is, in fact, part of what makes it awesome.

Replies from: MBlume
comment by MBlume · 2012-05-27T05:20:39.810Z · LW(p) · GW(p)

I keep meaning to try drinking some of the relational kool-aid -- any links you'd recommend?

Replies from: dbaupp
comment by dbaupp · 2012-05-27T07:35:18.493Z · LW(p) · GW(p)

If you are looking to learn the conventional mechanics of using relational DBs, The Learn X The Hard Way series is well-regarded, and there is an (in progress) SQL edition (I don't have specific experience with this book).

comment by [deleted] · 2012-05-26T09:21:07.265Z · LW(p) · GW(p)

I eagerly await the most rational toothpaste thread.

I think it has been mentioned before, but it bears repeating, please don't use "most rational" in titles. Just ask for the best programming language and describe your needs.

Replies from: Jayson_Virissimo, Vaniver, lsparrish, CuSithBell
comment by Jayson_Virissimo · 2012-05-26T11:14:05.569Z · LW(p) · GW(p)

Upvoted for being the most rational comment in this thread.

Replies from: lsparrish
comment by lsparrish · 2012-05-26T18:06:50.243Z · LW(p) · GW(p)

Snark isn't the same as rationality.

Let's break it down, shall we? The comment contains the following three things:

  1. A joke insinuating "rational programming" is the same as "rational toothpaste". No claim is made explicitly, so no rebuttal can be made. This is pure dark arts if you ask me.
  2. Negative instruction: Don't do this. No attempt to explain why.
  3. Positive instruction: Do that instead. Again, no attempt to explain why.

And you think this is more rational than the detailed, respectful, intelligent comments made by people who actually thought about the questions for five minutes and shared their expertise?

I'm appalled.

Maybe it's obvious to you that having "most rational X" in the title is stupid. And to be honest in hindsight it seems a bit silly to me as well, now that I have explicitly thought about the reasons for it. But it wasn't obvious when I wrote it, and it surely isn't self-evident to everybody.

I'm not against setting up norms and rules, and yes they are gonna change on people, and yes people need to be humiliated from time to time for breaking them flagrantly, but it's simply unfair to make humiliating jokes in retaliation for breaking them when there is no actual reason for the person to have known about the norm's existence.

Note that Konkvistador didn't even have a link to a top level post to substantiate the claim that this norm exists, just "I think this has been mentioned before" -- am I supposed to read every comment on LessWrong to see what norms I'm supposed to be following?

Give me a break!

Replies from: None, Emile, Jayson_Virissimo, wedrifid
comment by [deleted] · 2012-05-27T07:10:49.270Z · LW(p) · GW(p)

Negative instruction: Don't do this. No attempt to explain why.

Rational is a frequently used but unfortunately more and more meaningless applause light on LessWrong.

comment by Emile · 2012-05-26T19:54:07.682Z · LW(p) · GW(p)

Hmm, I guess people that follow most of the discussions here might be overestimating how much "community lore" others are aware of (or they may mistake "opinions shared by a few posters" for "community lore").

For prior discussions of this see here:

We are one step closer to the glorious time when a thread titled "Most rational choice of footwear" will appear.

or here:

This might be the most blatant misuse of "rational" in a post title I've ever seen.

Now that's a challenge ...

  • Rational snorkeling
  • Rational hash brownie preparation¹
  • Rational weasel appreciation
  • Rational captchaloguing
  • Rational cache flushing
  • Rational cash flashing
  • Rational rasher rationing

Most people seem to have liked the suggestion to use "optimal" instead.

comment by Jayson_Virissimo · 2012-05-26T20:22:22.504Z · LW(p) · GW(p)

Snark isn't the same as rationality.

Forgive my ignorance; so what is the most rational form of humor?

Note: Both of these comments are jokes and were intended to be funny, not snarky punishments for norm-breaking.

Replies from: TheOtherDave, lsparrish
comment by TheOtherDave · 2012-05-26T21:08:59.421Z · LW(p) · GW(p)

It turns out to be fart jokes. I have an elegant proof of this, but it is too long to fit in a comment.

Replies from: Curiouskid
comment by Curiouskid · 2012-05-30T00:34:11.527Z · LW(p) · GW(p)

Louis C.K. was deconstructing why farts are funny on The Daily Show the other day:

  1. They come out of your ass.
  2. They make a trumpet noise.
  3. etc.
comment by lsparrish · 2012-05-27T16:59:02.324Z · LW(p) · GW(p)

Argh, I can't believe that went completely over my head in both cases. Now that you've added the italics around "most rational" I can see it.

comment by wedrifid · 2012-05-27T08:44:45.634Z · LW(p) · GW(p)

I share your surprise that the grandparent was so positively received. It was briefly at -1, since I was the first to encounter the comment and I thought it was inane, obnoxious and wrong, without being sufficiently wrong that it could get any points for clever irony or humor.

Mind you I upvoted Konk's actual comment.

comment by lsparrish · 2012-05-26T14:05:49.186Z · LW(p) · GW(p)

I think it has been mentioned before, but it bears repeating, please don't use "most rational" in titles.

I haven't seen this advice before. A link would be appreciated.

Edit: The post has been retitled to "what is the best programming language". My main reason for doing so is to avoid confusion as well as dilution of the meaning of the word "rational" -- which should probably be reserved for specific contexts (e.g. avoiding cognitive biases) rather than used as a catch-all for "most optimal" and so forth.

Just ask for the best programming language and describe your needs.

My needs? Well I am already moderately skilled at a dozen or so languages, including Python, SQL, and Forth. My first scripting language was Perl and my first GUI language was REALBasic, which was essentially Visual Basic for the Mac.

Why did I go into Forth? Well, I wanted some down and dirty understanding of what the heck is actually going on in a computer. And I couldn't stick with C long enough to get that for some reason. Now I've done things like creating my own string manipulation functions (by concatenating other functions and primitives). I'm not sure I could have got that from Python.

On the other hand, now when I look at C code slinging pointers and char arrays around it makes perfect sense, and I can also visualize linked lists and other data structures. As a newbie though I remember it was all extremely confusing.

Replies from: Kingoftheinternet
comment by Kingoftheinternet · 2012-05-26T15:46:03.161Z · LW(p) · GW(p)

Behold, rational wart removal

Replies from: lsparrish
comment by lsparrish · 2012-05-26T15:58:36.287Z · LW(p) · GW(p)

Um, that seems off topic. I do see some vaguely on topic comments in the replies... maybe you meant to link to one of them?

Replies from: Kingoftheinternet
comment by Kingoftheinternet · 2012-05-26T16:29:27.701Z · LW(p) · GW(p)

I linked to the post itself because more than one of the comments were about using "rational" in the titles of posts, and I also thought the content of the post was relevant to understanding that discussion.

Replies from: lsparrish
comment by lsparrish · 2012-05-26T17:42:46.610Z · LW(p) · GW(p)

Labeling it off topic was an overreaction on my part. It was clear to me that you were talking about the comments.

Nonetheless, it seems kind of silly (in an insulting and childish way) for someone to portray the topic "most rational programming language" as essentially equal with "rational wart removal", which is the most parsimonious interpretation of your comment, and which I must therefore rebut since you did not bother to clarify.

There are multiple levels on which programming languages can be rational -- they can teach rationality skills, they can help you make money for rational causes, and so forth. Wart removal is far more specific and a much more clear-cut case of dilution of the term.

I have substituted "best" in the title in the interests of preventing dilution, but this still seems to me to be a step above and beyond what I am required by linguistic politeness and the demands of clarity to take -- programming and rationality really are related in ways beyond the superficial "rational = best" kind of way.

Replies from: Kingoftheinternet
comment by Kingoftheinternet · 2012-05-26T19:45:55.903Z · LW(p) · GW(p)

That's the kindest interpretation you could think of? I'm a bit bothered that I have to specify I wasn't trying to be a dick in this specific situation. No, I wasn't trying to be mean to you. It looked like you wanted to see situations similar to yours, so I showed you the first one that came to mind (which of course was the most extreme one), and I assumed you wouldn't think I was implying they were equal.

comment by CuSithBell · 2012-05-27T01:46:35.940Z · LW(p) · GW(p)

"Rational" is so frequently used as a contentless word that, if I were to have a comment keyword blacklist, it'd be number two on there, right after "status", perhaps followed by "downvote me". Unless you're talking meta (as in the parent comment), I strongly recommend trying to figure out what you actually mean, and use that word. "Rationality" ain't the goal.

Replies from: wedrifid
comment by wedrifid · 2012-05-28T16:24:29.416Z · LW(p) · GW(p)

if I were to have a comment keyword blacklist, it'd be number two on there, right after "status"

I strongly recommend trying to figure out what you actually mean, and use that word.

Usually when I say status I actually mean status. It's a valid, coherent and useful abstraction. Reducing all uses of the term to component parts would require writing paragraphs or essays all over the place and in general be a stupid thing to do.

Replies from: CuSithBell
comment by CuSithBell · 2012-05-28T17:34:43.974Z · LW(p) · GW(p)

As evidenced by the presence of "downvote me" on the list, my problem with each term is not necessarily the same.

Briefly, I expect people talking about status to be worrying about trivialities, providing facile explanations, or stopping at an overabstraction while thinking they understand a complex issue.

I expect people requesting downvotes to be A) nutty, possibly conspiracy theorists, and B) over-concerned with the karma system.

Replies from: wedrifid
comment by wedrifid · 2012-05-28T18:35:01.728Z · LW(p) · GW(p)

Briefly, I expect people talking about status to be [...] or stopping at an overabstraction while thinking they understand a complex issue.

It is certainly a word that I have seen used as a curiosity stopper.

comment by John_Maxwell (John_Maxwell_IV) · 2012-05-26T02:34:43.081Z · LW(p) · GW(p)

I'm pretty sure the short answer is: Become really good at Python. Learn additional languages if you want to solve a problem Python isn't good for, you want to learn programming concepts Python doesn't facilitate, you want to work on a project that isn't in Python, etc.

Rationale:

  • I've seen lots of discussions online about what people think the best introductory programming language is. Python seems to be the clear favorite. (1, 2.)
  • UC Berkeley and MIT both use Python for their introductory CS classes. (Yes, both universities switched away from Scheme.) I don't know much about any other universities.
  • Recently on Hacker News there were polls on what programming languages people like and dislike. Hacker News is the unofficial homepage of programmers everywhere, and thousands participated in these polls. According to two different ways of analyzing the data (1, 2), Python was the most favored language. Note that the poll was for general purpose programming, not best introductory language.
  • In my experience, it's quite valuable to be highly fluent in your chosen language. For me, there seems to be a substantial productivity difference between a language I am fluent in and a language I am mildly familiar with.
  • You can hack on LW.

The answer might change if you're partway invested in learning some other programming language. I don't know if it's worthwhile to restart in midstream or not.

Replies from: dbaupp
comment by dbaupp · 2012-05-26T08:02:49.114Z · LW(p) · GW(p)

Also, one shouldn't ignore the practical advantages of a batteries-included language (like Python) for enabling you to do things.

For example, if one wants to do some transformation of a spreadsheet which might be tricky or annoying to do in Excel/LibreOffice, one can just import csv and you've got a complete CSV file reader. Similarly, it's an import and 2 function calls to get a file off the internet.
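
For instance, a minimal sketch of the csv case, assuming a local file named data.csv with a header row (the filename is only for illustration):

import csv

# Each row comes back as a dict keyed by the header names.
with open("data.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(row)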

Replies from: sketerpot
comment by sketerpot · 2012-05-28T20:54:01.561Z · LW(p) · GW(p)

Similarly, it's an import and 2 function calls to get a file off the internet.

I only count one function call:

import requests
f = requests.get(my_url).text

The built-in urllib and urllib2 modules can do this too, but they're a disaster that should be buried in the ground. The general consensus is that requests is the Right Thing. You have to install it, but that should be easy with the pip package manager:

$ pip install requests

By the way, I agree with the recommendation of Python. It's a very easy language to get started with, and it's practical for all sorts of things. YouTube, for example, is mostly written in Python.

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2012-05-29T00:04:24.875Z · LW(p) · GW(p)

Thanks for requests!

comment by MBlume · 2012-05-27T05:17:13.113Z · LW(p) · GW(p)

Everyone says you should start with Python. Everyone's right. It's a beautiful language in which you can write good, concise, elegant code. Lots and lots of shops use it.

If you want to learn a second language, to give yourself some sense of the diversity available, I'd recommend Haskell. I think Haskell complements Python nicely because a) it's nicely designed and b) it's almost nothing at all like Python. It's fantastically strict where Python is fantastically permissive. It's functional where Python's object-oriented, etc.

I honestly don't know what the best Python tutorial is -- I learned from a handful. The best Haskell tutorial in the world is Learn You a Haskell for Great Good

Replies from: dbaupp, lsparrish
comment by dbaupp · 2012-05-27T07:41:28.298Z · LW(p) · GW(p)

The "other" Haskell tutorial is also worth a mention: Real World Haskell. (That said, I prefer LYAH.)

comment by lsparrish · 2012-05-27T18:35:50.320Z · LW(p) · GW(p)

I've been reading this book and enjoying it. At first I couldn't get into the groove because I got bored/distracted while reading the intro, but I was able to get started right away with the slick interactive web interface at Try Haskell, after which coming back to LYAH had more appeal.

Some thoughts:

  • Haskell shares plenty of syntax with Python, Ruby, and JavaScript that you don't see in Perl or C -- for example the way lists and tuples are represented, and the filter and map functions (a quick Python sketch of this appears after this list).
  • Getting into something right away with instant feedback reduces Delay, so the Procrastination Equation comes out more favorable. So to those struggling with akrasia, the web app is worth a try. Similar apps exist for Ruby and Python.
  • Intros in programming books are something you should probably skip (or skim) so you can get into the examples, which you should start trying out asap. It is important to get positive feedback if you want to generate sustainable interest.
  • Once you've achieved a certain amount of interaction with the language, to the point that it is clicking with you, the front matter and technical explanations will become much more interesting. (You may wonder why you thought they were boring.)
  • In this particular case, I had a harder time starting LYAH during the evening than I did during the morning. There could be some state-of-mind considerations there -- e.g. less patience for the jokes and digressions due to end-of-day fatigue, less "room in my brain" for the new concepts. Tentative hypothesis: It's optimal to do something that is more a matter of rote typing and simple response (like Try Haskell) later in the day, whereas something involving intellectual learning (like reading Learn You a Haskell) works better during the morning.
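
Here is a quick, hand-written Python sketch of that shared surface syntax from the first bullet; the Haskell near-equivalents in the comments are assumptions from memory, not taken from LYAH:

xs = [1, 2, 3, 4, 5]                              # Haskell: xs = [1, 2, 3, 4, 5]
pair = (1, "one")                                 # Haskell: pair = (1, "one")
evens = list(filter(lambda x: x % 2 == 0, xs))    # Haskell: filter even xs
doubled = list(map(lambda x: x * 2, xs))          # Haskell: map (*2) xs
print(evens, doubled)                             # [2, 4] [2, 4, 6, 8, 10]
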
comment by David_Allen · 2012-05-26T04:22:07.450Z · LW(p) · GW(p)

Is it better to focus on one path, avoiding contamination from others?

Learning multiple programming languages will broaden your perspective and will make you a better and more flexible programmer over time.

Is it better to explore several simultaneously, to make sure you don't miss the best parts?

If you are new and learning on your own, you should focus on one language at a time. Pick a project to work on and then pick the language you are going to use. I like to code a Mandelbrot set image generator in each language I learn.
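
For concreteness, here is a minimal ASCII stand-in for that Mandelbrot exercise in Python; a real image generator would write pixels to a file rather than characters to the terminal:

# '*' marks points that do not escape within 50 iterations of z -> z*z + c.
for row in range(24):
    y = 1.2 - 2.4 * row / 23          # imaginary part, top to bottom
    line = ""
    for col in range(78):
        x = -2.0 + 3.0 * col / 77     # real part, left to right
        z, c = 0j, complex(x, y)
        for _ in range(50):
            z = z * z + c
            if abs(z) > 2:
                break
        line += " " if abs(z) > 2 else "*"
    print(line)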

Which one results in converting time to dollars the most quickly?

If you make your dollars only from the finished product, then pick the language with the highest productivity for your target platform and problem domain. This will probably be a garbage-collected language with a clean syntax, a good integrated development environment, and a large set of available libraries.

Right now this will probably be Python, Java or C#.

If you make your dollars by producing lines of code for a company, then you will want to learn a language that is heavily used. There is generally a large demand for C++, C#, Java, Python, and PHP programmers. Companies in certain domains will focus on other languages like Lisp, Smalltalk and Ada.

Which one most reliably converts you to a higher value programmer over a longer period of time?

No single language will do this in the long run, but you might take temporary advantage of the current rise of Python, or the large install base of Java and C++.

For a broad basic education I suggest:

  • Learn a functional language. Haskell is my first choice; Lisp is my second choice.
  • Learn an object oriented language. Smalltalk has the best OO representation I have come across.
  • Learn a high level imperative language. Based on growth, Python appears to currently be the best choice; Java would be my second choice.
  • Learn an assembly language. Your platform of choice.

If you want to do web-related development:

  • HTML, CSS, Javascript.
  • SQL and relational DB.
  • XML, XSD, and XSLT.
  • C#.NET, Java, Python or PHP.

If you want to do engineering related development:

  • C and C++.
  • Perl
  • SQL
  • Mathematica or Matlab
  • for some domains, LabVIEW
Replies from: dbaupp
comment by dbaupp · 2012-05-26T08:12:15.718Z · LW(p) · GW(p)

Learn an assembly language. Your platform of choice.

One way to do this is by writing small C programs and looking at the assembler a compiler generates, e.g. by calling gcc with -S. (You can also use this to get some understanding of the optimisations a compiler performs by comparing the assembler generated without optimisations to the assembler generated with full optimisations.)

As you do this, you should also start replacing bits of the C code with inline assembler that you have written yourself, since writing code is better than just reading code.

(Also, the DCPU-16 from the yet-to-be-released game 0x10c might be a reasonable way to learn the basics of assembly languages: there are even numerous online emulators, e.g. 0x10co.de)

comment by CasioTheSane · 2012-05-26T06:56:00.111Z · LW(p) · GW(p)

Languages are for completing tasks, and each has varying strengths and weaknesses for different tasks. What specifically do you want to be able to do?

If you are a scientist or engineer who needs to quickly and accurately answer questions from quantitative data or perform statistical inference, R is the way to go. It also has a great interactive command line with powerful data visualization tools and plotting functions. The experience of "playing with" and manipulating data, quickly asking questions and considering the data in different ways directly from the R command line, is amazing.

If you want to do web development, I would recommend Python.

If you want a general low-level language to better understand how computers work, or to develop very high-performance code, I recommend C, which is practically a portable assembly language.

Don't waste your time with proprietary languages (Visual Basic, MATLAB, etc.) where your code and skills are useless without purchasing expensive licenses, unless you're employed by a company that requires them.

In general, once you learn a few programming languages, learning new ones becomes easier and easier. For example, as a person proficient in half a dozen other languages I was able to quickly complete big projects in Python without taking any time to explicitly learn it, just by looking at example code and referencing digital copies of the O'Reilly Python books whenever necessary.

That's a learning strategy I highly recommend: don't waste time working through tedious examples just to learn a programming language; instead, choose a programming project and immediately start learning what you need to know to finish it, one step at a time.

Replies from: None
comment by [deleted] · 2012-05-28T16:46:39.678Z · LW(p) · GW(p)

R

I've been wanting to learn R. Do you have any recommendations for tutorials?

Replies from: CasioTheSane, shokwave, gwern
comment by CasioTheSane · 2012-06-02T06:26:34.149Z · LW(p) · GW(p)

I recommend these: Girke Lab R manuals

Full disclosure: I am biased because I co-authored several of those... but I really do think they're quite good. They're oriented primarily towards people who want to do biology/bioinformatics with R.

Bayesian content is in the works...

Replies from: None
comment by [deleted] · 2012-06-02T15:12:11.036Z · LW(p) · GW(p)

That's sweet, thanks!

R is severely lacking in free tutorials. (As is Bayesian stats.)

comment by shokwave · 2012-05-29T05:36:44.324Z · LW(p) · GW(p)

This might be an approach.

comment by Kingoftheinternet · 2012-05-26T01:51:26.170Z · LW(p) · GW(p)

Just today, there was a post at Coding Horror, which was itself a follow-up to another excellent post, about whether or not learning a programming language is a good use of your time. I think you should read those before you get too invested in the idea of teaching yourself how to program.

Replies from: dbaupp
comment by dbaupp · 2012-05-26T07:45:22.756Z · LW(p) · GW(p)

That second post caused a bit of a furor, which was mostly disagreement. E.g. the Hacker News thread, and several (hundred?) other blog posts.

And the first post is also causing disagreement (again, the relevant HN thread).

comment by Bugmaster · 2012-05-26T07:26:08.334Z · LW(p) · GW(p)

Learn assembly, C, and Scheme, in that order.

Start by learning assembly for some small, manageable microcontroller, like PIC16 (if they still sell those) or a low-end Atmel, preferably one that you could wire up yourself. Prove to yourself that you can make LEDs light up by writing bits to ports. This will give you a good understanding of what computers actually do behind the scenes.

Follow that up with C, for your desktop machine. Implement some common data structures like linked lists and binary trees (you could do it in assembly as well, but it'd be too annoying). Get a good feeling for pointers, memory allocation, and recursion. Learn what clean syntax feels like.

Finally, learn Scheme, which is a language of pure abstraction completely divorced from any hardware, operating system, or syntax (ok, this isn't 100% true, but it's close). Understand how you can manipulate very simple building blocks to build powerful expressions, and how code can be used to generate other code and pass it around. Implement an object-oriented "meta-language" in Scheme.

At this point, you'll be well equipped to learn any other language you want in a matter of days. You will probably never use assembly, C, or Scheme ever again, but which would you rather have -- a single tool, or a factory inside your mind that lets you create tools at will ?

Oh, and stay as far away from C++ as you possibly can, it's pure poison.

Replies from: wedrifid, vi21maobk9vp
comment by wedrifid · 2012-05-26T07:32:31.352Z · LW(p) · GW(p)

I recommend outright reversal of the above process. If you absolutely must learn assembly language, do it once you can already program. The same applies to other excessively low level languages with a lot of emphasis on memory management rather than semantic expression.

Replies from: lsparrish
comment by lsparrish · 2012-05-26T14:46:25.097Z · LW(p) · GW(p)

I actually started with Basic, then went to Perl, then Python (which I didn't grok all that much at the time), and finally Forth, which is probably lower level than C in some respects but was somehow easier for me to stick with. I tried picking up C and couldn't get past Hello World. With Forth (specifically the RetroForth project -- which was a bit less smooth at the time) I built my own linked lists, dictionaries, string splitters, and stuff like that, using concatenation that maps more or less directly to machine code. Now when I look back at these other languages I see real stuff going on instead of magic. Maybe this is the equivalent of practicing fencing with a weighted sword.

comment by vi21maobk9vp · 2012-05-26T10:01:18.685Z · LW(p) · GW(p)

This still misses a few areas. You would not be completely ready to learn OCaml (or Haskell) with a sophisticated type inference system. You would probably not be ready to learn Erlang (unless you used something like Termite, if your Scheme turned out to be Gambit). If you picked an R5RS Scheme, you would probably miss some parts of the power of macros. You would miss APL/J array processing. You would probably (unless you pay attention to using some Scheme object system) miss Smalltalk object orientation and also multiple-dispatch-capable object oriented programming (which is not Smalltalk, but, say, the Common Lisp Object System).

As for calling C syntax clean... Well...

Anyway, for long-term learning you need to ask what concepts to learn - you will most probably have a choice of languages (but no one specific language will include them all).

Replies from: Bugmaster
comment by Bugmaster · 2012-05-31T01:26:56.159Z · LW(p) · GW(p)

You would not be completely ready to learn OCaml (or Haskell) with a sophisticated type inference system. You would probably not be ready to learn Erlang (unless you used something like Termite, if your Scheme turned out to be Gambit).

All true, but you should be able to pick those up easily enough if you have internalized the other concepts. For example, type inference is much easier to understand when you realize that, underneath it all, there's no such thing as a "type" anyway, just pointers pointing to memory blocks of various sizes; and that, on the other hand, you could construct whatever notion of a "type" that you want, by using functional programming.

I have to admit, though, that I was never a big fan of macros, in any language.

As for calling C syntax clean... Well...

I meant, as compared to raw assembly.

Anyway, for long-term learning you need to ask what concepts to learn

That was kind of my point: instead of learning a specific set of concepts, learn just enough of the right ones, so that adding new concepts becomes easy.

Replies from: vi21maobk9vp
comment by vi21maobk9vp · 2012-05-31T04:54:56.450Z · LW(p) · GW(p)

type inference is much easier to understand when you realize that, underneath it all, there's no such thing as a "type" anyway, just pointers pointing to memory blocks of various sizes

Well, it looks like either you have some minimal experience with abstract algebra or you will need to learn some of it while working with complex type systems.

Learning a new powerful abstraction to the level of being able to exploit it for complex tasks is a matter of a few days of full-time learning/thinking/tinkering per se, so learning new languages will still not be trivial. And you have to spend a few weeks collecting minimal best practices.

I was never a big fan of macros, in any language

Given that only macro-assembler and Lisp-like languages even had complex enough macros to matter until recently...

Well, from my experience I can say that implementing a Lisp-macros-like system for Pascal did help me simply to cope with some project. Some say that macros are just for fixing deficiencies of the language; while it is partially true, macros are useful because you will never have a language precisely fit for your task at hand in a large project.

As for calling C syntax clean... Well... I meant, as compared to raw assembly.

But at the same distance from hardware, there is also Pascal. I remember being able to read Pascal without knowing much of it, and it is still not verbose enough to be annoying to some people. While learning C, it is a nice idea to write down every way you come up with to write hard-to-read code buried in the syntax. After a while, it will make a useful checklist when cleaning up sloppy code.

That was kind of my point: instead of learning a specific set of concepts, learn just enough of the right ones, so that adding new concepts becomes easy.

The main problem is that unless you have learned some concept you don't really know whether you need to try to apply it. You give a nice set of starter concepts, of course, but I found it useful to show that there are very numerous concepts not mentioned - and it is a good idea to be aware of them.

A piece of advice not to take seriously is to look at http://www.falconpl.org/index.ftd?page_id=facts . Afterwards, one could find where all the mentioned concepts are implemented sanely and learn those languages...

Replies from: Bugmaster
comment by Bugmaster · 2012-06-04T22:59:34.636Z · LW(p) · GW(p)

Well, it looks like either you have some minimal experience with abstract algebra or you will need to learn some of it while working with complex type systems.

Or both !

so learning new languages will still not be trivial.

It depends on what you mean by "trivial". Learning a programming language to the point where you can effectively employ it to solve a complex real-world problem will never be easy, for the same reason that learning a human language to the point where you can converse in it will never be easy. There are always a bunch of library APIs (or dictionary words), edge cases, and best practices (or colloquialisms) to memorize. But understanding the basics of the language does not have to be hard.

macros are useful because you will never have a language precisely fit for your task at hand in a large project.

Technically, this is a language deficiency in and of itself. I rarely find myself wishing I had more macros in Python, or even C#. I do wish there were macros in Java, but that's because they still haven't implemented closures correctly.

But at the same distance from hardware, there is also Pascal.

I dislike Pascal because I find its pointer syntax to be needlessly obscure. That said, I haven't used Pascal for like 15 years, so my knowledge could be hopelessly outdated.

You give a nice set of starter concepts, of course, but I found it useful to show that there are very numerous concepts not mentioned - and it is a good idea to be aware of them.

Sure, but you've got to draw the line somewhere; after all, there are as many concepts as there are language designers ! Many of them can be rolled into more general concepts (f.ex., all kinds of different type systems can be rolled into the "type system" category). Others aren't even concepts at all, but simply useful tools, such as regular expressions or file I/O. You can't learn everything at once, so you might as well start with the big ideas, and go from there.

Replies from: vi21maobk9vp
comment by vi21maobk9vp · 2012-06-05T19:49:46.877Z · LW(p) · GW(p)

for the same reason that learning a human language to the point where you can converse in it will never be easy. There are always a bunch of library APIs (or dictionary words), edge cases, and best practices (or colloquialisms) to memorize. But understanding the basics of the language does not have to be hard

All this is simple to look up - programming is not fluent speech, it is writing. The problem is that similar words have radically different combinations of meanings. And also, sometimes there are totally new concepts in the language. You see it better after you try learning a language where concepts do match your expectations.

Technically, this is a language deficiency in and of itself. I rarely find myself wishing I had more macros in Python, or even C#.

Well, I have written a significant amount of code in Python and I did have to use workarounds that would be cleaner as macros... If you consider your language a good fit to your task at any time, you are likely just not asking for the best. It can be mitigated if your requirements are understandable.

I dislike Pascal because I find its pointer syntax to be needlessly obscure

It is still the same. But C syntax is plainly malicious even in assignments, so why care about pointers. Somehow, Google managed to create a clean C-derived syntax in Go by streamlining a lot of rules.

You can't learn everything at once, so you might as well start with the big ideas, and go from there

But it is also clear that you should always know that you are not learning some magical set of all basic concepts, just the concepts that are simplest to learn in the beginning.

Replies from: Bugmaster
comment by Bugmaster · 2012-06-05T20:56:44.318Z · LW(p) · GW(p)

All this is simple to look up - programming is not fluent speech, it is writing.

Have you ever tried learning a foreign language ? Maybe it was easy for you -- I know people who seem to have a natural aptitude for it -- but for me, it was basically a long painful slog through dictionary-land. Yes, from a strictly algorithmic standpoint, you could look up every word you intend to read or write; but this works very poorly for most humans.

If you consider your language a good fit to your task at any time, you are likely just not asking for the best.

I think your demands might be a bit too strict. I am perfectly ok with using a language that is a good, though not 100% perfect, fit for my task. Sometimes, I would even settle for an inferior language, if doing so grants me access to more powerful libraries that free me from extra work. Sure, I could "ask for the best", but I have other goals to accomplish.

But C syntax is plainly malicious even in assignments...

How so ? Perhaps you were thinking of C++, which is indeed malicious ?

But it is also clear that you should always know that you are not learning some magical set of all basic concepts, just the concepts that are simplest to learn in the beginning.

I agree with you that there's no magical silver bullet set of concepts, but I also believe that some concepts are vastly more important than others, regardless of how easy they are to learn. For example, the basic concept you internalize when learning assembly is that (roughly speaking) the computer isn't a magical genie with arbitrary rules -- instead, it's a bag of circuits that moves electrons around. This idea seems trivial when written down, but internalizing it is key to becoming a successful programmer. It also leads naturally to understanding pointers, on which the vast majority of other languages -- yes, even Scheme -- are built. I doubt that you can properly understand things like type inference without first understanding bits and pointers.

Replies from: vi21maobk9vp
comment by vi21maobk9vp · 2012-06-06T06:07:37.455Z · LW(p) · GW(p)

Have you ever tried learning a foreign language

English, French (I usually forget the latter and recover it when I have any proximate use for it). My native language is Russian. It is a big relief when learning French that most words have the same translations in many contexts. This multi-translation problem is way more annoying than simply looking up words.

Sometimes, I would even settle for an inferior language, if doing so grants me access to more powerful libraries that free me from extra work

This actually confirms my point. You will have to choose an inferior language from time to time, and its lack of tools for adapting the language to your task is either local incompetence of the language authors, or a lack of resources for the language's development, or the language community's arrogance.

But C syntax is plainly malicious even in assignments... How so ? Perhaps you were thinking of C++, which is indeed malicious ?

"i+= i++ + ++i;" can be reliably compiled but not predicted. There are many actual everyday examples like "if(a=b);".

Of course, it is not even close to C++, which takes malicious semantics a few levels up.

basic concept you internalize when learning assembly is that (roughly speaking) the computer isn't a magical genie with arbitrary rules

leads naturally to understanding pointers, on which the vast majority of other languages

Any command-line programming environment will make you internalize that the computer has some rules and that it does what you order - literally.

x86 assembly is quite arbitrary anyway. Maybe LLVM assembly (which is closer to a "pointer machine" than to a "random access machine") would be nicer. After all, high-level languages use specially wrapped pointers even in implementation.

I doubt that you can properly understand things like type inference without first understanding bits and pointers.

You cannot properly understand some performance implications, maybe. But the actual input-output correspondence can be grokked anyway. Of course, only higher-order functions have a strict proof that they can be understood without proper understanding of imperative semantics.

Replies from: Bugmaster
comment by Bugmaster · 2012-06-07T02:36:17.471Z · LW(p) · GW(p)

This multi-translation problem is way more annoying than simply looking up words.

It's possible that you are much better at automatically memorizing words than I am.

You will have to choose an inferior language from time to time, and its lack of tools for adapting the language to your task is either local incompetence or lack of resources or arrogance.

Wait... what ? Are you saying that, when I have some practical task to finish, the best solution is to pick the most elegant language, disregarding all other options -- and that not doing so makes me arrogant ? I am pretty sure this isn't right. For example, my current project involves some Bluetooth communication and data visualization on Windows machines. There are libraries for Java and C# that fulfill all my Bluetooth and graphical needs; the Python library is close, but not as good. Are you saying that, instead of C#, I should just pick Scheme or Haskell or something, and implement my own Bluetooth stack and drawing APIs ? I am pretty sure that's not what you meant...

"i+= i++ + ++i;" can be reliably compiled but not predicted.

Ok that's a good point; I forgot about those pre-/post-increments, because I avoid them myself. They're pretty terrible.

On the other hand, the regular assignment operator does make sense; the rules that let you say "if(a=b)" also let you say "a=b=c". The result of an assignment operator is the RHS. I don't see this as a bad thing, though it might've been better to use "eq" or some other token instead of the comparison operator "==".

Any command-line programming environment will make you internalize that computer has some rules and that it does what you order - literally.

True, and that's a good lesson too, but programming in assembly lets you get close (though not too uncomfortably so) to the actual hardware. This allows you to internalize the idea that at least some of these rules are not arbitrary. Instead, they stem from the fact that, ultimately, your computer is an electron-pushing device which is operating under real-world constraints. This is important, because arbitrary rules are something you have to memorize, whereas physical constraints are something you can understand.

You are right about x86 assembly, though, which is why I mentioned "a small microcontroller" in my original post. Their assemblies tend to make more sense.

But the actual input-output correspondence can be grokked anyway.

You are right, though this depends on which problem you're solving. If you approach the programming language completely in abstract, then yes, you can understand things like input-output correspondence from the strictly algebraic point of view. What you won't understand, though (at least, not as readily), is why all these language features were created in the first place, and which problems they are designed to solve. But if you never intend to write practical programs that perform applied tasks, maybe that's ok.

Replies from: vi21maobk9vp
comment by vi21maobk9vp · 2012-06-07T11:29:56.619Z · LW(p) · GW(p)

It's possible that you are much better at automatically memorizing words than I am.

Or simply annoyed by different things.

Are you saying that, when I have some practical task to finish, the best solution is to pick the most elegant language, disregarding all other options -- and that not doing so makes me arrogant?

Sorry for the unclear phrase. I mean that a language's lack of tools is the language's arrogance.

rules that let you say "if(a=b)" also let you say "a=b=c"

"a=b=c;" vs "a=c; b=c;" is not much; the former syntax simplifies injection of vulnerabilities (intentionally or incidentally).

Instead, they stem from the fact that, ultimately, your computer is an electron-pushing device which is operating under real-world constraints

You are right about x86 assembly, though, which is why I mentioned "a small microcontroller" in my original post. Their assemblies tend to make more sense

I have written in C for these microcontrollers - physical constraints visibly leak into the language, so if you are learning C anyway, you could delay learning assembly.

why all these language features were created in the first place, and which problems they are designed to solve. But if you never intend to write practical programs that perform applied tasks, maybe that's ok.

If you learn just Scheme and OCaml you still can understand what type system and type inference gives you.

You can appreciate a steam engine without knowing nuclear physics, after all.

Replies from: Bugmaster
comment by Bugmaster · 2012-06-18T21:39:52.232Z · LW(p) · GW(p)

I mean that a language's lack of tools is the language's arrogance.

I'm still not sure what you mean by that. Are you suggesting that all languages should make all possible tools available ? For example, should every language, including C, Javascript, Java, C#, Ruby, Python, Dart, etc., provide a full suite of Bluetooth communication libraries ? I agree that it would be really neat if this were the case, but IMO it's highly impractical. Languages are (so far) written by humans, and humans have a limited amount of time to spend on them.

"a=b=c;" vs "a=c; b=c;" is not much; the former syntax simplifies injection of vulnerabilities (intentionally or incidentally).

What do you mean by "injection of vulnerabilities" ? Also, "a=b=c;" should be more correctly rendered as "b=c; a = b;". This makes it possible to use shorthand such as "if ( (answer = confirmRequest()) == CANCEL) ... ".

so if you are learning C anyway, you could delay learning assembly.

Sure, you could delay it, but it's best to learn it properly the first time. There are certain essential things that are easy to do with assembly that are harder to do with C: for example, balancing your branches so that every iteration of the main loop takes the same number of cycles.

If you learn just Scheme and OCaml you still can understand what type system and type inference gives you.

If you were a person who only knew Scheme, how would you explain "what type inference gives you", and why it's useful ?

Replies from: vi21maobk9vp
comment by vi21maobk9vp · 2012-06-19T13:28:38.520Z · LW(p) · GW(p)

I mean that a language's lack of tools is the language's arrogance. I'm still not sure what you mean by that. Are you suggesting that all languages should make all possible tools available ?

It was a clarification of a specific phrase in my previous comment. The original phrase answers both your questions: I specifically said that it can be lack of resources or competence, not only arrogance. And this is specifically about tools that allow you to tailor the language to your specific task, so that there are no problems with the language that you are prohibited from solving. Somebody can always write a Bluetooth library.

certain essential things that are easy to do with assembly that are harder to do with C: for example, balancing your branches so that every iteration of the main loop takes the same number of cycles

This is not essential for many applications, even with what is now called microcontrollers. Learning optimization on that level is something you can do while having a good grasp of other concepts already.

If you were a person who only knew Scheme, how would you explain "what type inference gives you", and why it's useful ?

Type inference allows you to write with strict typechecks and catch some kinds of errors without cluttering the code with type specifications for every variable.

Replies from: Bugmaster
comment by Bugmaster · 2012-06-19T21:10:28.632Z · LW(p) · GW(p)

And this is specifically about tools that allow you to tailor the language to your specific task, so that there are no problems with language that you are prohibited from solving. Somebody can always write a bluetooth library.

That makes sense, and I do wish that more languages supported more capabilities, but I think it's unrealistic to expect all languages to support all, or even most, or even some large fraction of real-world tasks that are out there. There are vastly more tasks than there are languages: graphics (raster, vector, and 3d, on various systems), sound, desktop user interfaces, Bluetooth, TCP/IP networking, bio-sequence alignment, finance, distributed computation, HTML parsing and rendering, SQL access... and that's just the stuff I've had to handle this month !

Learning optimization on that level is something you can do while having a good grasp of other concepts already.

I think the opposite is true: performing this kind of optimization (even on a "toy" program) is exactly the kind of task that can help you internalize those concepts.

Type inference allows you to write with strict typechecks and catch some kinds of errors without cluttering the code with type specifications for every variable.

I agree with you there, but I'll play Devil's Advocate, in my attempt to adopt the perspective of someone who only knows Scheme. So, can you give me an example of some Scheme code where the strict typechecks you mentioned are truly helpful ? To me (or, rather, my Schemer's Advocate persona) this sounds inelegant. In Scheme, most entities are pairs, or data structures built of pairs, anyway. Sure, there are a few primitives, but why should I worry about 5 being different from 5.0 or "5" ? That sounds like a job for the interpreter.

Replies from: vi21maobk9vp
comment by vi21maobk9vp · 2012-06-20T06:56:10.986Z · LW(p) · GW(p)

That makes sense, and I do wish that more languages supported more capabilities, but I think it's unrealistic to expect all languages to support all, or even most, or even some large fraction of real-world tasks that are out there

You didn't understand my point correctly. The language per se should not directly support, say, Bluetooth - because Bluetooth will change in an incompatible way. A language could live without a Bluetooth library - why not, there is always FFI for dire cases. But the question is about allowing you to define a nice API if a need arises. More or less any metaprogramming tool that is not constrained in what it can create would do - those who want to use it will wrap it in a layer that is nice to use, and you can then just incorporate their work.

Common Lisp didn't have any object system in the first edition of the standard; CLOS was prototyped using macros, documented, and then this documentation was basically included in the standard. Of course, macro use could be somewhat more clumsy or more explicit for any reason (to make it easier to control overuse, for example) - this is not a problem. The problem is when you have zero ways to do something - for example, to define a non-trivial iteration pattern.

So, can you give me an example of some Scheme code where the strict typechecks you mentioned are truly helpful ? To me (or, rather, my Schemer's Advocate persona) this sounds inelegant. In Scheme, most entities are pairs, or data structures built of pairs, anyway. Sure, there are a few primitives, but why should I worry about 5 being different from 5.0 or "5" ? That sounds like a job for the interpreter.

Sorry? I was talking about things that help to catch errors. In any small snippet the errors are simple enough to find for this to be unillustrative. It only helps you when you have some kind of wrong assignment in 1K+LOC.

comment by cata · 2012-05-26T03:33:56.058Z · LW(p) · GW(p)

Time to dollars: Python. Ubiquitous, powerful enough, useful for everything, has a friendly learning curve but also a wide variety of concepts in the language.

Highest-value programmer: Probably C, but it's sort of a moot point because I don't think there's a way to become a programmer of above-average value without becoming pretty competent in two or three languages along the way.

Caveats: I'm pretty sure that it varies a great deal based on your inclination and logical ability going in.

comment by IlyaShpitser · 2012-05-27T00:22:07.749Z · LW(p) · GW(p)

If a programming language has nothing new to teach you, it is not worth learning. For this reason, it is probably a good idea to learn multiple ones that are "conceptually orthogonal" to each other. Examples:

lisp (code-as-data, syntax vs semantics, metaprogramming)

prolog (declarative programming)

a simple RISC assembly language (machine details, stack vs heap)

haskell (functional programming, type inference, lazy evaluation, monadic theory, type classes)

ruby (message passing, object oriented programming, regular expressions)

APL (concise syntax, vector programming, non-ascii programs)

Replies from: wedrifid
comment by wedrifid · 2012-05-27T01:01:08.130Z · LW(p) · GW(p)

If a programming language has nothing new to teach you, it is not worth learning.

Assuming you are a person not influenced by external incentives.

Replies from: IlyaShpitser
comment by IlyaShpitser · 2012-05-27T01:05:02.637Z · LW(p) · GW(p)

Learning ideas has better ROI than learning tools. It's easy to pick up tools as needed for work, but recognizing ideas/patterns is both a more transportable kind of knowledge and harder to acquire. Also, key ideas behind computation do not have a "half-life," whereas tool/trade-school type knowledge does.

Replies from: edgeArchitect
comment by edgeArchitect · 2012-06-03T00:24:27.376Z · LW(p) · GW(p)

Exactly: it's all about the concepts underlying the tool, and about recognizing situations where a certain tool has a better ROI than some other one at solving the problem at hand.

But sometimes it can be hard to judge fairly whether you really know something or just think you do. So it is definitely useful to know a few other techniques/tools for doing the same thing, to guard against fooling yourself.

comment by OrdinaryOwl · 2012-05-26T01:27:05.797Z · LW(p) · GW(p)

A question regarding your title: are you looking for the programming language that best teaches rationalist thinking (if there is one in particular)? Or are you asking for a more general analysis of what the various languages are best at?

Regardless, as a novice programmer (I'm taking my first Java class right now), I would be interested in hearing what LW's opinions are. I chose Java because I wanted to develop Android apps, and because of the large number of jobs calling for Java programmers in my area.

comment by listic · 2012-05-27T13:08:27.454Z · LW(p) · GW(p)

I would like to ask the commenters: what do you think about learning JavaScript as a "first" programming language? I would like to learn modern programming technologies and best practices, but also something quickly usable in the real world and applicable to web programming.

I have been meaning to learn JavaScript for a while (but haven't got around to it) because:

  • I heard it's kinda Scheme on the inside, and generally has some really good parts
  • To do web programming, I need to learn JavaScript for the client side anyway; with Node.JS I can utilize (and practice) the same language for server-side programming.
  • Node.JS seems to be a great framework for web programming, built around an asynchronous/evented paradigm that should be good for doing... whatever stuff they are doing on the web? (See the sketch after this list for what "evented" looks like in practice.)
  • Looks like Node.JS is slowly climbing toward mainstream acceptance. I mean, I think I could really get a job with that outside of Silicon Valley and Japan!
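
Node is the topic here, but if "asynchronous/evented" sounds vague, the same idea can be sketched in Python's asyncio (hypothetical task names and delays); the point is that the waits overlap instead of running one after another:

```python
import asyncio

async def handle(name, delay):
    # Stand-in for a network call; the event loop runs other tasks while we wait.
    await asyncio.sleep(delay)
    return f"{name} done after {delay}s"

async def main():
    # Three simulated requests in flight at once, on a single thread.
    results = await asyncio.gather(
        handle("a", 0.3), handle("b", 0.1), handle("c", 0.2)
    )
    print(results)

asyncio.run(main())
```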

But I have heard so much advice to learn Python lately that I am thinking: am I missing something and being difficult?

It looks like lsparrish has been around and tried learning different languages before; so have I: I was paid to program in C and Forth. But I am a real beginner, actually.

Replies from: David_Allen, lsparrish, DSimon, thomblake, John_Maxwell_IV
comment by David_Allen · 2012-05-27T17:20:16.482Z · LW(p) · GW(p)

JavaScript is fine as a first language. I consider it to be a better first language than the TRS-80 BASIC I started on.

comment by lsparrish · 2012-05-27T18:10:13.860Z · LW(p) · GW(p)

JS has a powerful advantage in terms of usefulness in that it already comes with all the browsers, so you're going to have to learn it for the client side anyway if you are doing web apps. My suggestion to newbies looking for a quick 10-minute intro is CoffeeScript.

I'm still leaning towards Python and Haskell as things I should be learning for various reasons. (Python seems useful and career-friendly, and I already know enough to be dangerous. Haskell seems to teach a different kind of math/thinking, which is attractive long term even if I never use the language.)

However, JavaScript is pretty friendly, especially with CoffeeScript and NodeJS. It might actually be a better language for the web-entrepreneur track, since the hottest apps will be optimized for the client side.

One thing I've noticed about the NodeJS community is that they seem really good about removing trivial inconveniences. For example, with Meteor I was able to get an example set up in about 30 seconds.

comment by DSimon · 2012-05-31T19:52:32.150Z · LW(p) · GW(p)

JavaScript shares a problem with C++: it is hard to find non-crap documentation and tutorials that won't lead new coders subtly (or not so subtly) into bad habits that are hard to break later. With C++ or JavaScript, the first few Google results for any newbie question are likely to be pretty bad.

If you have access to a really good JavaScript programmer who uses modern techniques and libraries (use of jQuery, Prototype, CoffeeScript and/or Node.js are all good signs), and can get them to help you, or at least review the help you're getting from others, then that's cool. If not, then stay away from JS until you're a good programmer and you have a direct practical need for it.

comment by thomblake · 2012-05-29T17:40:49.681Z · LW(p) · GW(p)

Personally, I would recommend learning Python first and then learning JS. Udacity has great free courses in Python. Python has fewer caveats than JS. And there is very little in Python style that will steer you wrong when learning JS.
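
One concrete example of the kind of caveat presumably meant here (my guess, not necessarily thomblake's): JavaScript silently coerces mixed types, so "5" + 5 evaluates to the string "55", whereas Python refuses outright:

```python
# Python raises instead of silently coercing mixed types.
try:
    result = "5" + 5
except TypeError as exc:
    print("refused:", exc)  # Python 3 raises TypeError here
```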

comment by John_Maxwell (John_Maxwell_IV) · 2012-05-29T00:08:14.519Z · LW(p) · GW(p)

Python has very nice tracebacks that help a ton with debugging. JavaScript doesn't come close. But yes, JavaScript is not a terrible choice for a first language.

comment by thelittledoctor · 2012-05-28T12:13:24.639Z · LW(p) · GW(p)

I'm fond of Perl as a first language, for a couple of reasons. Foremost among them is that Perl is fun and easy, so it serves as a gentle introduction to programming (and modules) that's easy to stick with long enough to catch the bug, and it's versatile in that it can be used for webapps or for automating system tasks or just for playing around. But I wouldn't recommend making it anybody's only language, because it IS a scripting language and consequently encourages a sort of sloppy wham-bam-thank-you-ma'am approach to coding. Start with it, learn the basics, then move on to Python, and after achieving competence there learning new languages pretty much just feels like fun and games. Perl remains my favorite language for anything to do with SQL, and also for hammering out quick scripts to automate boring tasks.

Lisp is probably not necessary, but IS fun to learn. I don't know whether it makes you a better programmer. I'm definitely better now than I was before I learned it, but I don't know how to differentiate between "I gained experience" and "Lisp fixed my brain".

My first languages were C++ and Java, incidentally, and I would say that I became a decent programmer in spite of that rather than because of it. C++ was too much all at once, at least for twelve-year-old-me, and Java by contrast is so gentle and coddling that it became a kind of tarpit from which I almost did not escape.

I think more than anything what reliably converts you to a higher value programmer (provided you already have good math skills) is going through the larval stage as many times as possible.

Replies from: DSimon, vi21maobk9vp
comment by DSimon · 2012-05-31T19:48:12.716Z · LW(p) · GW(p)

I remember Perl with fondness, but unfortunately it seems to be a dying language. The foretold Perl 6 (literally foretold; there were "Exegeses" and "Apocalypses" and everything) has been at a standstill for many years, and the once-amazing CPAN has now been utterly demolished by the likes of GitHub and RubyGems. There's a lot to be said for languages with active communities regularly supplying new and updated useful libraries.

If you miss Perl, try Ruby; it actually was meant at the beginning to be a fairly Perl-like language, and it has many (IMO somewhat underused) features that assist with quick get-crap-done scripts, like the ARGF I/O handle that automatically and conveniently reads through STDIN and/or files specified on the command line.
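
For anyone following the Python recommendations elsewhere in this thread, the closest standard-library analogue to ARGF is probably the fileinput module -- a minimal sketch:

```python
import fileinput

# Reads lines from each file named on the command line, or from standard
# input when no filenames are given -- roughly what ARGF does in Ruby.
for line in fileinput.input():
    print(line.rstrip())
```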

comment by vi21maobk9vp · 2012-05-31T04:37:32.534Z · LW(p) · GW(p)

The problem with Perl as a first language (which maybe makes Python a better choice) is that it encourages sloppiness a bit too much. You can certainly resist; but in Python, Pascal, or Scheme you can take an arbitrary example program off the net and have a decent chance of reading and understanding it quickly. Reading code not written in your presence is an important skill, and developing it with Perl will take more time than with most other proposed first languages.

comment by deepthoughtlife · 2012-05-27T07:17:50.002Z · LW(p) · GW(p)

I have no interest in evaluating languages based on how quickly they lead to money; only how they affect your ability to program. Additionally, I am not particularly experienced. I've only been programming for three or four years. Take my words with a grain of salt.

I remember one semester in college where I was using three separate programming languages at the same time: VB, Java, and Assembly (16-bit x86). Sometimes it led to a small amount of confusion, but it was still a good experience. I would suggest beginning a second language soon after achieving minimal competence with your first language, but continuing to learn the first as well. Sticking too long to a single language encourages you to get stuck in a rut and to think that its way is the only way to do things. Go down as many paths as you are able to; getting into a language rut is not a good idea.

Java is not the most amazing language in the programming world, but it is an extremely useful one to learn. It has a C-style syntax without some of the more difficult aspects; in other words, you learn how to program in the traditional way, but more easily. It is relatively verbose and strict on some matters, but that teaches a methodical way of programming that is necessary for making good programs and that less strict languages do not teach. You can do anything in Java, and most of it is of reasonable difficulty. While you will need C++ to round out your C style, knowing Java is a very good stepping stone for it. For learning how to program, it is an advantage that Java is not interpreted (ignore that it compiles to bytecode), since that means you must have a coherent thought before writing it down. When just learning, winging it is not the best strategy. Complexity should be approached by the newbie programmer as quickly as they are able to handle it.

Prolog might be the most amazing programming language, but I wouldn't recommend it to someone just learning how to program, and it isn't particularly transferable. It does give a person a way to think about a problem directly in a logical manner. Algorithms can often be expressed in a purer form than in other languages, since Prolog uses a search procedure to actually get the answer rather than a series of commands. I would have liked to learn some algorithms in Prolog.

As an example: factorial(0, 1). factorial(1, 1). factorial(N, F) :- N > 1, N1 is N - 1, factorial(N1, F1), F is N * F1. This looks almost exactly like the algorithm as it would be described. In fact, it could be used to intuitively explain the algorithm to someone who didn't know what factorial was. Notes: Prolog operates on logical relations. The first clause declares that the factorial of 0 is 1; likewise, the factorial of 1 is 1; the third clause relates the factorial of N to the factorial of N-1.

Disclaimer: As a (hopefully) just-graduated CS student, I have written two different term papers and given a 36-minute presentation related to Prolog, but was only exposed to it these last few months, and have not yet had the chance to do a significant amount of programming in it. (19 units ftw.) I wish I had taken the opportunity to use it when I first heard of it a couple of years ago; it was that or Lisp, and I did Lisp.

On that note, Lisp. Definitely a good language, but it can be very difficult to learn, especially because the overabundance of parentheses obscures its fundamentally different approach to problems. I freely admit that my first attempt to learn Lisp went down in flames. Make sure you understand Lisp's fundamentals, because they are not the same as those of a more typical programming language. It takes real dedication to properly learn Lisp. Still, Lisp is one of those languages I wish I had spent more time on. It is a different and excellent way to think about problems, and it is a very good language for expanding your horizons after you already know how to program. While it is possible to write in a C style in Lisp (Common Lisp at least), make sure to avoid that temptation if you really want to understand it. It is especially good for recursion, and you are right about the whole code-as-lists thing being an interesting perspective. I didn't really learn much about Lisp macros. Okay, fine. I wish I had done Prolog and Lisp.

If you want to be a good programmer, then you must learn assembly of some form. It would be utterly insane to try to learn programming with assembly (of course, some did, and I wouldn't be here on my computer if they hadn't). Understanding how the machine actually (logically) works is a huge step toward being a good programmer. A user can afford to know only what is done; a programmer needs to learn how it is done, and why. High-level languages will not make you a good programmer, even if they are what you end up using for your entire career. The two semesters I spent learning how the computer operates on a logical level were of extreme importance to me as a programmer, and assembly is an important part of that.

comment by sixes_and_sevens · 2012-05-26T01:54:19.658Z · LW(p) · GW(p)

There's a dilemma here which is present in teaching a lot of skills: do you want your hypothetical students to be building useful things quickly, or do you want them to be internalising concepts that will last them a lifetime?

If it's the former, just give people a solvable problem and let them pick their own tools. If it's the latter, start them off with some verbose compiled unforgiving strongly-typed beast like C or Java. They're best learnt in a training environment rather than on the fly, so if you have a training environment, it makes sense to learn them there. It's easier to go from, say, Java to Python than it is to go in the other direction.

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2012-05-26T02:57:19.246Z · LW(p) · GW(p)

At times I've thought the same myself, but I've read accounts from at least two different CS professors who said that student outcomes improved dramatically when switching from statically typed languages to Python as a first language. I suspect Python works well because it makes it easy to introduce just a few concepts at a time.

I started with Python and I didn't have any trouble learning statically typed languages.

Replies from: sixes_and_sevens
comment by sixes_and_sevens · 2012-05-26T15:28:49.690Z · LW(p) · GW(p)

Static typing isn't really the issue. C and Java are constrained and fussy and demand a level of forethought and attention to detail that is absent from something like Perl or Python. This is mostly because they're compiled, although static typing also plays a role. The point is they're unforgiving, and teach you a level of discipline and rigour that you otherwise may not need to learn.

Personally, I hate working in C or Java, and avoid them precisely because prototyping all my functions annoys me and I don't care whether my objects are public or private. I'm still glad I learned them when I did, in a training environment where I was obliged to do so. If I hadn't, and I'd started on Python, I would have had absolutely no motive to learn them unless someone made me.

Replies from: lsparrish, John_Maxwell_IV
comment by lsparrish · 2012-05-26T17:21:13.403Z · LW(p) · GW(p)

I see what you are saying. As a newbie I found it hard to stick to C or Java for long enough to get past Hello World. If you're relying on internal motivations, you aren't so likely to stick it out long enough to get the fundamentals from these. The problem is you need rewards within a certain time limit for the brain to become addicted. This does happen, but only after a certain amount of coding has been done.

On the other hand, something like Python (or BASIC, for that matter) is easy, but your inner lazy person is going to keep on thinking certain things are magic because they are automated and work without you thinking about them.

With Forth I like to think there's a bit of the best of both worlds. IME, you can get addicted to Forth without too much effort, but it is very hard to get anything serious done in it until you've been doing it for several years. Essentially you end up building your own language from first principles.

For example, pretty much every language has a stack for passing values, but most hide this from the user. Likewise, every language represents memory addresses as numbers, but this also tends to be hidden from the user. In Forth, if you want to hide complexity you pretty much have to do the hiding yourself -- concatenate functions and primitives to generate complexity, and factor them into smaller functions to hide it.
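
This isn't Forth, but a toy sketch in Python of the style being described: every "word" operates on an explicit stack, and larger words are factored out of smaller ones (all names made up for illustration):

```python
def push(stack, value):
    stack.append(value)

def add(stack):
    b, a = stack.pop(), stack.pop()
    stack.append(a + b)

def mul(stack):
    b, a = stack.pop(), stack.pop()
    stack.append(a * b)

def dup(stack):
    stack.append(stack[-1])

def square(stack):
    # Factored out of smaller words instead of written from scratch.
    dup(stack)
    mul(stack)

stack = []
push(stack, 3)
square(stack)        # stack: [9]
push(stack, 4)
square(stack)        # stack: [9, 16]
add(stack)           # stack: [25]
print(stack.pop())   # 25
```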

Factoring is necessary for every language of course, but most of them don't punish you as hard for not factoring, and most ship with tons of complexity already pre-factored and ready for you to magically wave your hands at. I'm not saying that's bad, just that it is (or seems to me) a trade-off people may not be aware they are making.

comment by John_Maxwell (John_Maxwell_IV) · 2012-05-26T19:37:23.368Z · LW(p) · GW(p)

Fair enough. I guess I could say the same thing: I'm glad I was made to learn C in school. C is the ideal language for a data structures class, in my opinion, since you can implement all the data structures yourself and understand what's really going on under the hood.
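
In C you would also be managing the memory by hand, which is a big part of the lesson; Python hides that, but the "implement it yourself" exercise still looks roughly like this minimal singly linked list (illustrative names only):

```python
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next_node = next_node

def push_front(head, value):
    """Prepend a value and return the new head of the list."""
    return Node(value, head)

def to_list(head):
    """Walk the links and collect the values, front to back."""
    values = []
    while head is not None:
        values.append(head.value)
        head = head.next_node
    return values

head = None
for v in (3, 2, 1):
    head = push_front(head, v)
print(to_list(head))  # [1, 2, 3]
```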

comment by RobertLumley · 2012-05-26T01:25:04.568Z · LW(p) · GW(p)

My personal recommendation is Visual Basic, assuming you use Excel for anything beyond recording what you ate for breakfast. VB extends Excel's functionality tenfold, if you know a few basic things. It has the added bonus of being a very easy language to learn; the syntax is pretty much English. That being said, no company is ever going to use VB as a real programming language, but it sounds like employment is not your goal.

Edit: Also, it's important to note that I don't think any language is going to teach rationality any better than any other. It's not like programming changes very much; for most purposes, it's just different syntax.

Replies from: Dr_Manhattan, Andreas_Giger
comment by Dr_Manhattan · 2012-05-26T04:13:40.373Z · LW(p) · GW(p)

Surprisingly, a lot of Wall Street uses VB for automating models. It's a dirty little secret but I've known people highly paid to do this.

comment by Andreas_Giger · 2012-05-26T03:01:49.726Z · LW(p) · GW(p)

Just because you can code in C++ the way you coded in C doesn't mean it's a good idea to do so. Programming changes faster than most programmers do. The "just different syntax" statement is true for scripting languages, but not for programming languages.

You will see this practically the instant you start optimizing your code for performance scaling or modularity or whatever.

comment by JoshuaZ · 2012-05-26T02:04:16.400Z · LW(p) · GW(p)

Relevant xkcd. It is important to realize that a good programming language doesn't help if the thoughts are confused. This often matters much more than the language choice.

Replies from: Andreas_Giger
comment by Andreas_Giger · 2012-05-26T02:45:11.733Z · LW(p) · GW(p)

OOP in general can help clarify your thoughts, especially when dealing with complex systems.

Readability of code is another huge factor, which depends on your ability to write good code (which is easier in some languages than in others) as well as the language syntax itself.

A good language may not help if your thoughts are already confused, but it might help you to not confuse your thoughts.

comment by Fromage · 2012-05-27T08:38:33.898Z · LW(p) · GW(p)

What's the most rational way to remove back hair?

I have plenty of annoying back hair that I need to get rid of. Through monthly calculations, I've found that they grow at an average 1.23 cm/month, and have a tendency to curl into black wisps.

Further Bayesian calculations have shown that the empirical probability of getting noticeable hair in my lower and upper torso is 0.8 and 0.63 respectively during the summer season.

I would like to inquire about the most rational length to which I should trim my back hair (I have a Vernier Caliper and extremely sharp scissors), and the recommended duration between successive trimmings.

I'm sure that the rationalist community here can help me make the most rational choice in back hair removal. As a Bayesian myself, I greatly prefer the use of Occam's Razor, and would appreciate simpler answers.

comment by edgeArchitect · 2012-05-26T02:09:25.253Z · LW(p) · GW(p)

Guys, programming languages ARE exercise in logic and rationality, so actually if you really want to do "lesswrong" then you MUST be familiar with at least 3 and how each language manages logic.

Lisp, Prolog, Haskell any of these. Java is a good one too. Kinda Aristotelian in a way.

Replies from: Desrtopa, atorm
comment by Desrtopa · 2012-05-26T02:30:49.388Z · LW(p) · GW(p)

Similarly, Go, Magic: The Gathering, PredictionBook, and PUA are exercises in rationality, so naturally participating in all of those is also a necessity.

Replies from: Kingoftheinternet
comment by Kingoftheinternet · 2012-05-26T03:34:52.885Z · LW(p) · GW(p)

Competitive weightlifting! Finite element analysis! Arguing with creationists on Reddit for four hours a day!

Replies from: edgeArchitect
comment by edgeArchitect · 2012-05-27T01:09:12.755Z · LW(p) · GW(p)

Sure they are, but for one thing the number of variables in those can be overwhelming, and some of what you've listed may be based on flawed logic. So I would not suggest starting out with those.

I think programming is great because it can potentially teach you how to break things down into different levels of abstraction and manipulate concepts by applying only a few basic rules.

I should've contributed a little bit more to the OP, but I was mostly replying to those who were saying that there is no difference between languages and one is just as good as the other. That, I think, is wrong, because you can't compare a real functional language with an "object oriented" one.

comment by atorm · 2012-05-26T04:10:53.652Z · LW(p) · GW(p)

Does anyone else automatically mistrust sentences with single words in all caps?

Replies from: Jayson_Virissimo, edgeArchitect
comment by Jayson_Virissimo · 2012-05-26T10:06:28.146Z · LW(p) · GW(p)

Does anyone else automatically mistrust sentences with single words in all caps?

Personally, I've never had much faith in sentences of any type. Shady characters, they are.

comment by edgeArchitect · 2012-05-27T06:53:01.729Z · LW(p) · GW(p)

I don't mind scepticism; if you see anything wrong with the content other than the letters, do let me know please.

Replies from: atorm
comment by atorm · 2012-05-29T20:44:56.258Z · LW(p) · GW(p)

Welllllll, since you asked:

"do 'lesswrong'" "LessWrong" is not a verb. One could be less wrong, or one could exemplify the qualities encouraged by LessWrong, but one doesn't do lesswrong/LessWrong.

"Lisp, Prolog, Haskell any of these" is a sentence fragment. I'll accept a sentence fragment under poetic license, but add a comma after "Haskell" to match the spoken pause.

I think the community's objection to your post stems from the now frowned-upon sentiment that there are behaviors that are absolutely required to fit in in this community or, especially, to be a rationalist. I personally thought it was especially silly that you insist on "at least 3" programming languages. Also, programming languages aren't exercises in logic/rationality. Programming may be an exercise in logic and rationality, but programming languages are languages. They aren't exercises in anything.

Replies from: edgeArchitect
comment by edgeArchitect · 2012-06-01T09:41:00.475Z · LW(p) · GW(p)

I am not a native English speaker, but I think I'm versed well enough to know that you know what I meant to say. I also do think that it is a little bit unfair for you to mention my grammar and punctuation, since you yourself use words like "caps" and "Wellllll". Not only is it hypocritical, but it also comes across as a strawman argument of sorts.

I'm sorry for pointing this out. But, I mean, you are partially right: I do think that sometimes, when working with complex concepts, the language needs to be pretty clear for a common understanding to develop between the parties involved.

With that said, I do tend to play with my language to see if I can modify its impact, while still keeping that which my words represent intact.

While my choices may not seem rational, they are not wrong. I may be trying to make the point that, whether we are talking about programming languages or human languages, all of them are exercises in logic and symbolic construction; all you have to do is learn how to make sense of them. Or ask for clarification.

So, I think you just haven't learned to appreciate any sort of language yet. But I do applaud your knowledge of grammar.

Now, onto this: "I think the community's objection to your post stems from the now frowned-upon sentiment that there are behaviors that are absolutely required to fit in in this community or, especially, to be a rationalist."

This is the most irrational thing I've heard in a while. The most rational behaviour for a rationalist to have is to be rational, no matter which community it is, or how many karma points someone has, and no matter how many grammatical errors you find.

And, please, never say languages are nothing. It is an insult to the achievement of our species.

"Lisp, Prolog, Haskell any of these". Thanks, I completely forgot about that rule.

Sorry if I wasn't the reply you were hoping to see.

Replies from: atorm
comment by atorm · 2012-06-03T18:22:43.825Z · LW(p) · GW(p)

Knowing you are not a native English speaker makes me even more inclined to forgive your errors (notice that I did not criticize your grammar until you asked), since you are correct in observing that they did not seriously interfere with comprehension. But poorly put-together sentences do detract from a message, irrational as that may be.

"The most rational behaviour a rationalist can have is to be rational." What does this even mean? And how does it relate to my objection that knowing three programming languages is not a requirement of rationality, or of contributing valuably to this community? Given that human rationality is a subject still being explored, making a definitive claim that something is a key requirement of rationality is incredibly arrogant in the face of ignorance.

Finally, I didn't say that languages are nothing. I said they aren't exercises. And I'm pretty sure they aren't going to feel insulted.

Replies from: edgeArchitect
comment by edgeArchitect · 2012-06-04T00:23:10.097Z · LW(p) · GW(p)

"But poorly put-together sentences do detract from a message, irrational as that may be." Why is it irrational?

"The most rational behaviour a rationalist can have is to be rational." Meditate on it some more.

"...my objection that knowing three programming languages is not a requirement of rationality..." I said at least three, because there are at least three ways to do the same thing. Some are more efficient at one thing and others are more efficient at other things.

"Given that human rationality is a subject still being explored, making a definitive claim that something is a key requirement of rationality is incredibly arrogant in the face of ignorance." It is incredibly arrogant, but it is also a very useful tool in deductive reasoning.

I feel like I'm being trolled by a spell-checking bot. You haven't contributed anything to this topic other than your scepticism, which is the most basic tool anyone in their right mind is able to use.

You said that "They [languages] aren't exercises in anything". Which I think is not true.

Thanks for the English lessons, bye.