What Programming Language For Linux Development?
k33l0r writes "Recently I've been thinking about developing (or learning to develop) for Linux. I'm an IT university student but my degree program focuses almost exclusively on Microsoft tools (Visual Studio, C#, ASP.NET, etc.) which is why I would like to expand my repertoire on my own. Personally I'm quite comfortable in a Linux environment, but have never programmed for it. Over the years I've developed a healthy fear of everything Java and I'm not too sure of what I think of Python's use of indentation to delimit blocks. The question that remains is: what language and tools should I be using?"
I like Python (Score:5, Interesting)
Why not stick with C#? (Score:5, Interesting)
http://www.mono-project.com/ [mono-project.com]
C/C++ (Score:5, Interesting)
Take a look at Qt and Gtk. They're the two big GUI toolkits. I personally like Qt more; it's better documented and much easier to get running on Windows (and Macs). As for Python, there's nothing wrong with its indentation. The problems of the language are much deeper. No language is going to be perfect; it's a tool.
As for IDEs, if you're coming from an MS background, take a look at the latest NetBeans. It's a little slow (fine on new hardware, though) and a bit better than Eclipse for C/C++ support.
What do you want to program? (Score:4, Interesting)
That is the first question you should ask yourself, actually. ;)
One thing you might learn, from a tinkering-with-Linux point of view, is shell scripts. Surprised no one has mentioned them yet. They aren't really "programming" in the sense of creating apps, but they are fun and a cool part of Linux.
Re:How much do you want to learn? (Score:3, Interesting)
Re:C/C++ (Score:5, Interesting)
C/C++ are the languages you'd want to go for. They can do *everything*, have great support, are fast etc.
Let's be honest here. C and C++ are very fast indeed if you use them well (very little can touch them; most other languages are actually implemented in terms of them) but they're also very easy to use really badly. They're genuine professional power tools: they'll do what you ask them to really quickly, even if that is just to spin on the spot chopping peoples' legs off. Care required!
If you use a higher-level language (I prefer Tcl, but you might prefer Python, Perl, Ruby, Lua, Rexx, awk, bash, etc. - the list is huge) then you probably won't go as fast. But unless you're very good at C/C++ you'll go acceptably fast at a much earlier calendar date. It's just easier for most people to be productive in higher-level languages. Well, unless you're doing something where you have to be incredibly close to the metal like a device driver, but even then it's best to keep the amount of low-level code small and to try to get to use high-level things as soon as you can.
One technique that is used quite a bit, especially by really experienced developers, is to split the program up into components that are then glued together. You can then write the components in a low-level language if necessary, but use the far superior gluing capabilities of a high-level language effectively. I know many people are very productive doing this.
How about (Score:4, Interesting)
You really should have a good grasp of the concepts of programming languages, so that you can work with the bulk of projects that come your way.
A little scripting, a little functional, a little procedural, a little OO, all combine to make Jack a versatile hacker.
Re:C/C++ (Score:1, Interesting)
Are C/C++ really fast? Have you got figures to back it up? I'm not just talking about using C directly, but using C with some heavyweight library like Gtk which does its own very inefficient implementation of objects (glib), uses reference counting, and adds tons of asserts (which in a true HLL could be eliminated by the compiler). As Wikipedia would say ... [citation needed].
Rich.
Re:C/C++ (Score:3, Interesting)
Re:What do you want to program? (Score:5, Interesting)
* for example using kdialog
Re:How much do you want to learn? (Score:3, Interesting)
Yeah! I mean, it's only used in the most widely used desktop environment. And it's only leaps and bounds better than the usual drek.
Tabs are EVIL (Score:2, Interesting)
Re:Java (Score:3, Interesting)
On top of that, it doesn't take much juggling to get your app to run on multiple platforms.
One would think, right? But it's actually NOT that much easier to get your app running on multiple platforms. I shall describe to you three projects that I wrote in C, Java, and ObjC respectively.
The first app was written in C; it was a webserver. It was actually really easy to port, because I used basic C libraries. About the only problem was that on Solaris I had to link against some libraries that I didn't on the other *nixes. The other *nixes? Linux, OpenBSD, and Mac OS X. It all worked like a charm, no cross-platform issues. I have no doubt it would have run just as well on SFU (Services for UNIX) on Windows, or, with the right compiler like DJGPP, just fine elsewhere as well.
I wrote a raytracer in Java for a graphics course. I already had the vector class, which took care of most of the code, so most of it was super easy to write. Just make a vector, cast the ray; it's just a bunch of math that I had already written into Vector and Matrix libraries. There was just one problem: how do you draw a pixel in Java? Oh yeah, that's right, there is no portable way to do it. Some implementations allow you to do a drawRect() of size 0,0, but others interpret that as "draw nothing"... thus, in order to actually draw my image correctly across Linux, OS X, Solaris, and OpenBSD, I had to draw four times as many pixels as necessary, by doing a drawRect large enough that it would actually draw something on each one of the OSes. And these are all POSIX systems? WTF?
Last, I got sick of the performance of my Java raytracer, as well as the inconsistent, undefined behavior of drawRect() with a height/width of 0,0. So I ported my raytracer to ObjC. I started with the vector library, and things went fast and easy. It also worked perfectly on OS X, Linux, OpenBSD, and Solaris. It used SDL for the graphics (which actually has defined behavior that allows one to draw one and only one pixel), and the standard POSIX thread library to make it multithreaded so it would draw even faster.
Point of all of this: the LIBRARY SPECIFICATIONS are the most important thing to have defined across the board. Any unspecified behavior in a library means you can write code that won't work on a different implementation of that same library. Also, without a Java VM the Java raytracer wouldn't run on a system at all, which means Java is no better than requiring the ObjC runtime and the SDL library.
Re:This is all true however... (Score:3, Interesting)
WRT "man" - actual conversation from work last week with regard to "man 2 send" - the c function for sending via a socket:
Me: "Look, instead of asking me all the time, just look it up."
Co-worker: "What website?"
Me: "Use man. It's already in the man pages on your machine."
Co-worker: "How do I do that?"
Me: "Try man man. Read the result."
Two weeks ago, it was "ps" and "kill" (though not "kill -9" :-). Next week, I introduce "apropos", and grep to filter out yucky results. ... kids nowadays ... they think "mc" is a "low-level" tool. I shudder at the thought of teaching them how to make a commit to svn from a terminal .... it's like they can't read ... I have to "white-board" every explanation, or threaten to lart them. Or make them drink the coffee.
Like naming conventions. "I know what UPPERCASE and lowercase are, but what's TitleCase? What's camelCase?" At least they grokked type_name_t for types, UPPER_CASE_IS_FOR_MACROS_AND_CONSTANTS, lower_case_for_functions(), TitleCaseForClasses, camelCaseForMethods() - AFTER I drew a picture of a camel and pointed out the hump (my camel book stays at home). Then I gave them some more shit over misusing leading underscores (short version - don't).
Then we got into the whole history of "do you precede a class name with a 'C' for 'Class' or a 'T' for 'Type'," and how Microsoft used "CClass" to avoid naming conflicts with their competitor, who was using "TClass", and how it doesn't really matter - use your initials for all I care - just be consistent (it also lets us know who to blame, if they're too cowardly to include a \blame comment :-)
Re:I like Python (Score:1, Interesting)
No, it's not OCD. I'm picky about it because it has real (admittedly tiny) effects.
When inserting code after a blank line, I end up having to hit tab 2-4 times, sometimes more, before I insert that new line (or after, whatever). Some editors flat out don't have an option to do it for you, and it's not always worth it, or even possible, to fire up (or even install) your preferred one. It's just a pain in the ass. Extra keystrokes add up.
Plus, I dunno. Sometimes I scroll up and down a section and watch the cursor. It's kind of a low-level routine in my brain that I don't have to expend conscious thought on: if that cursor stays in the same column, it's all one piece. It helps me visualize the code better and hold more information in my head at once. With those blank lines, it makes me think for a tenth of a second or two, and I lose the "picture" in my head.
There are a ton of other reasons I like my blank lines to obey indenting that I can't think of right now, but it's basically a "beaten to death with feathers" situation where there are a million little nagging things about it that each end up taking 0.2 seconds extra. And I work fine with others: if their blank lines aren't right, I fix 'em when I'm doing a bunch of work in one section and don't bother otherwise.
Besides, aren't you being a little OCD in getting antsy in the pantsy over my preferences?
Re:This is all true however... (Score:5, Interesting)
...C++...
The problem with C++ is that it's a crazy, crazy language. At first, it was just a superset of C, but now there's all kinds of stuff...it's mutated into a totally different kind of thing. C has this elegant simplicity going for it. There's nothing the matter with C++...except that C is (pretty much) perfect.
Also, your "feel dirty" comment: is that because of the ease with which a poor programmer can create unstructured code? If so, wouldn't that be the fault of the programmer and not the language specifically? (I.e. Assembly for the 8088, ..286, ..386 and IBM mainframes made me feel dirty sometimes with the way you were forced to branch, but it was fast... and no, I am far from an expert Assembly programmer.)
The problem with PHP (and I code mostly in it for a living) is that it wasn't 'designed' at all. Originally it was just a pre-processor, and it's grown into a full blown language from there. This is all well and good, except that there's no sort of continuity to it at all. Naming conventions? (isset vs every other 'is' function starting with 'is_', etc) Who needs them? OO? Sure...ish. PHP is great for getting things done, but I certainly feel dirty after coding in it.
FYI, personally I do not have a preference and simply choose what is convenient for me to use that will get the job done, period. I honestly do not know the nuances between them...and I am sure that there are some.
Always a good way to be.
Re:C or C++ (Score:5, Interesting)
That's only if you need features from the latest versions of MS.NET, mostly in the cases of porting existing applications. Mono is a strong platform in its own right and perfectly suitable for developing Linux applications.
And you do NOT need to use an old version of C#. The compiler is C# 3.0 compliant and they plan on adding C# 4.0 support shortly after it is released.
Thanks for playing.
Re:Java (Score:3, Interesting)
I don't define myself as "hesitant" to use managed code; I define myself as "where's my pointer?" You're right, I'm generally skeptical at best of new trends. My job isn't to be trendy or try new stuff - it's to solve problems with something standard, efficient, and sustainable. C has proven itself capable of nearly every practical task thrown at it for the last 30 years. It is easily the most portable language ever created. Why would I want to switch?
Yes, there are those who say I shouldn't be tied to a programming language, because languages come and go. That's absolutely true, but I see no reason to adopt new tools if they offer no substantial advantages over the old toolsets for the problems I work on.
I'm sorry, but I can manage memory and pointers myself. I do not need to incur the penalty and nuisance of using Java. C very closely resembles how the underlying machine works (at least for von Neumann architectures), and I think that's a very good thing for programmers.
Re:Java (Score:4, Interesting)
"Java is slow" is a stupid old myth. Does it not occur to you that JIT compilers compile to native code?
Ahah? You should tell that to people who develop applications delivering only 1000 pages per second on an 8-core machine, where equivalent plain-old C easily delivers more than 10000 pages per second on a single core of the same machine. Surely the GC is at fault, everything related to object management is at fault, the memory footprint voiding all cache efficiency is at fault; in summary, the language is at fault.
And yes, that's what I see in enterprises.
I think the real problem with Java developers is that they have been told that what they did was fast, and they believe it. But let's face it: when processing an HTTP request burns ONE JOULE, there is definitely a problem. No wonder datacenters are filling up that fast...
Every time a Java developer has tried to prove me wrong, he has shown me he was able to reach performance levels I was able to reach 10 years ago on an obsolete machine. "Look: 500 pages per second on this small 4-core Xeon!" Well, I do 2000 on my 2.5 W, battery-powered Geode computer, and that is small.
So please stop spreading bullshit about efficiency of such things, there are people who believe you and now we find their crap sucking all the power of datacenters.
Re:This is all true however... (Score:5, Interesting)
Nope!
http://www.lrde.epita.fr/~didier/research/verna.06.ecoop.pdf [epita.fr]
http://portal.acm.org/citation.cfm?doid=1143997.1144168 [acm.org]
http://www.eecs.berkeley.edu/~fateman/papers/lispfloat.ps [berkeley.edu]
Good idea. :) I know /you/ were only joking but Lisp has been held back by a ton of widely believed (and massively ironic) mythology and it is very sad. The only thing really wrong with it is the lack of stuff written in/for it because of its grossly undeserved reputation.
Re:C/C++ (Score:3, Interesting)
http://www.cryptonomicon.com/beginning.html [cryptonomicon.com]
Re:Yep, RAII is where it's at (Score:3, Interesting)
Re:Why not start with assembly language? (Score:3, Interesting)
assembly is for pussies.
real programmers fire up a hex editor and enter the hex codes into the file directly.
No, I'm not kidding. I know of two guys who work on embedded hardware who would program a 6802 processor completely in hex on a keypad, from memory. They did it so much they never wasted time testing things by writing code down and then converting it to hex. They just went "let's try this!" and started typing away... it was amazing.
Re:Children these days... (Score:3, Interesting)
A buddy of mine in university had to toggle the boot loader back into an old HP machine's core (yes, real core) after he blew it away through a programming error (no memory protection of course.)
It took him well over an hour. After that, everybody was much more careful about single-stepping through potentially hazardous areas of their code first. Good times... uncontrolled access to the lab building, so we had pizza and beer in there.
Re:Children these days... (Score:3, Interesting)
I once worked on a machine with 40K of core - REAL magnetic cores! - where you had to toggle the machine code - which was in ASCII! - into the machine from a front panel. This was an RCA 301 from back in the '60s that an idiot friend of mine had installed in the late '70s to run his father's department store. The box was THIRTY THOUSAND POUNDS of hardware. The flooring had to be reinforced to hold it. The wiring was a rat's nest. It used nine tape drives and actually had a COBOL compiler that required hours of multipass compilation to compile a program with a couple thousand lines in it.
There was a hard drive that never worked - it held 10MB, the disk was about three feet wide, and it took a 30HP motor to spin it.
Data input was via punch cards, until we got a card-to-tape input device.
My advice: don't try any of this at home.
Re:This is all true however... (Score:3, Interesting)
Those papers do not show, and their authors do not state, that such a qualification is justified...
http://www.lispworks.com/success-stories/raytheon-siglab.html [lispworks.com]
http://www.franz.com/success/customer_apps/eda/amd.lhtml [franz.com]
http://www.franz.com/success/customer_apps/data_mining/itastory.php3 [franz.com] (also: http://www.itasoftware.com/careers/l_e_t_lisp.html?catid=8 [itasoftware.com] )
etc.
Re:This is all true however... (Score:3, Interesting)
Why would the Lisp compiler do well on 'small' examples, like the C compiler does, but underperform it on 'larger' examples? Perhaps there are reasons to expect such behaviour but rather than answer a comp. sci. illiterate like me, if you really believe the authors of those papers have been making such serious errors I think you should write your own paper. And if you haven't got the time or inclination to do that, perhaps you could at least explain your criticisms to these fellows: http://books.google.com/books?id=8Cf16JkKz30C&pg=PA21&lpg=PA21 [google.com] They appear to have used similar reasoning - toy examples and extrapolation - and it looks like they're embarking on a major project to make a better performing R-like system with Common Lisp. But if Lisp really doesn't scale up well, that could turn out to be a futile ambition and a horrible waste of time of course.
Our QPX search engine is engineered for speed, speeds that must not be lower than using C and where huge amounts of data must not be bigger than packing them in C structs. Still, QPX is very complicated, and driven by individuals who write large bodies of code. Lisp allows us to define a wide variety of abstractions to manage the complexity, and at the same time we get the speed we want. Once QPX is compiled, one cannot easily tell the machine code from the machine code compiled from C.
(from the other ITA link)
Of course you may have guessed right about ITA's case anyway (and undoubtedly there do exist situations in which Lisp is inadequate for performance reasons - even C is sometimes) but whatever the reasons ITA have for using C and Java as well, there are no similar excerpts in #1 or #2 (or in other examples) and it seems to me they are hardly 'toy' examples.
Sure - you're just knocking the papers I linked to - and it seems to me that is just as well considering there are at least some anecdotal 'real world' counter-examples to your Lisp performance worry stemming from your (possibly justified) criticism that extrapolation from those papers' findings is, at least logically, invalid. I have always found Lisp easily adequate performance-wise for my own use - but that's just anecdotal too of course.
Re:This is all true however... (Score:2, Interesting)
... your (possibly justified) criticism that extrapolation from those papers' findings is, at least logically, invalid.
If you have ever worked with static analysis, my criticism is vacuously true.