C Passed Java to Take #1 Spot on TIOBE's Index (techrepublic.com)

In its ongoing attempt to gauge the popularity of programming languages, "C is at the top of the list of TIOBE's Index for February 2021 with Java in second place," reports TechRepublic: Those two languages swapped positions on the list as compared to 2020, but the rest of the list is almost exactly the same as a year ago. Python is in the No. 3 spot followed by C++, C#, Visual Basic, JavaScript, PHP, and SQL.

Assembly Language rounds out the top 10 list, up from spot 12 in 2020. R moved up two spots over the last year, from 13 to 11. Groovy jumped to the 12th spot, up from 26 a year ago. Classic Visual Basic is on the rise as well, moving up four spots to 18.

For what it's worth, in the last year Go has dropped to #13 on the list — overtaken by assembly language, R, and Groovy.

And Swift dropped from #10 to #15, also being overtaken in the last year by Ruby.
  • Happy (Score:4, Funny)

    by Joce640k ( 829181 ) on Sunday February 14, 2021 @07:37AM (#61062210) Homepage

    It's good to see that 1.65% of all code is still being written in Assembly Language. :-)

    • (and that Python is only 10x as popular as ASM)

    • Comment removed based on user account deletion
      • The entire microcontroller world obviously prefers asm.
        And last time I checked, simple microcontrollers vastly outnumber full CPUs. Hell, most computers have many microcontrollers inside them too.
        And kernels and compilers obviously all contain asm.

        • Re: Happy (Score:5, Interesting)

          by cormandy ( 513901 ) on Sunday February 14, 2021 @09:06AM (#61062368)
          I program MCUs using C, a language I first learned over 30 years ago at the start of my career. (Note that this is hobby stuff for me now and I do it in my spare time, not professionally.) No doubt both C and ASM feature prominently on this list due to their use in this space. I've been using the Microchip XC compiler for some years now, and prior to that the Hi-Tech C compiler (before it was acquired by Microchip). Although off topic, I'd like to voice my fury at the licensing for this particular product moving from perpetual to subscription based (for the advanced compiler features). Same with my accounting software (imagine not being able to access your business accounts if unable to pay a subscription) and, while I'm at it, Autodesk software... Anyone in agreement with me on this issue? How I bloody hate subscription-based software licensing, and to see it in use for a compiler product leaves me furious... I believe in supporting commercial software but the move to subscription has become a plague...
        • Was the last time you checked 30 years ago, maybe?

        • Re: (Score:3, Informative)

          by blugalf ( 7063499 )

          "The entire microcontroller world is obviously preferably asm."

          No, it doesn't. The bulk of microcontroller code is written in higher-level languages, mostly C, even for smaller devices.

          Obviously, and by necessity, there is more assembly here than elsewhere. It's indispensable for timing-critical code or interrupt handling, but that is usually only a tiny fraction of all code.
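
          (For illustration, a minimal sketch of that mix, assuming an ATmega328-class AVR built with avr-gcc and avr-libc; the vector and register names below are specific to that part and are only an example. Nearly everything stays in C, with a single inline-asm instruction where cycle-exact behaviour matters.)

            #include <avr/io.h>
            #include <avr/interrupt.h>

            volatile uint8_t tick;

            ISR(TIMER0_OVF_vect)             // timer-0 overflow interrupt, handled in C
            {
                tick++;
                __asm__ __volatile__("nop"); // one-cycle pad where exact timing is needed
            }

            int main(void)
            {
                TIMSK0 |= (1 << TOIE0);      // enable the timer-0 overflow interrupt
                sei();                       // global interrupt enable
                for (;;) { }                 // everything else stays in plain C
            }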

        • Actually most machines are, strangely enough, full of machine code.

      • by gweihir ( 88907 )

        Quite a bit of small IoT and sensor stuff has components in assembly or is fully assembly if really small. Remember that all the old MCUs like 8051 family, z80, etc. are all still in use.

      • by Pimpy ( 143938 )

        I'd believe the 1.65% figure, assembly is still the main way to go for OS bringup, mapping new instruction set extensions into the toolchain, etc. Yes, most people will ultimately use higher-level languages, but those higher-level languages still need to be mapped to the new CPU somehow in order to make effective use of any new features.

      • The list is generated by comparing the reported number of hits returned for search phrases like +"Python programming" vs +"C programming" vs +"ASM programming". This is done using a variety of search engines, but is not limited to a date range. I would not give much credence to the results, especially as a guide to current "popularity", which is what it is supposed to be measuring.

    • I do, on ATmega: I wrote a 512-byte bootloader to allow flashing of the device.

  • by chrism238 ( 657741 ) on Sunday February 14, 2021 @07:39AM (#61062216)
    A rather pointless statement, otherwise.
    • Comment removed based on user account deletion
      • I'd say microcontrollers are *usually* programmed in assembly.

        Also, everyone who studied this usually had to code assembly for some microcontroller at least once.

        • I've never programmed an ARM (32bit) or an MSP430 (16bit) in assembly language. Always in C though.

          But AVR, PICmicro and 805x don't have what I would consider great C compilers (especially for complex applications), and I would think this is where all the assembly language programming is taking place.

      • by grub ( 11606 )
        r/asm has a good amount of posts from people working in classic 6502 assembly. Many of them are people who are building (or have built) Ben Eater's excellent 6502 Computer on a Breadboard [youtube.com]
        • by leptons ( 891340 )
          /r/asm is an echo chamber just like every other /r/ on that website. No, 6502 is not moving the needle on assembly language. The popularity is most certainly due to other modern microcontrollers. There were 25 billion microcontrollers shipped in 2019. The use case for cheap resource constrained microcontrollers is vast, and so is the need for assembly language for these devices.
          • by grub ( 11606 )
            Oh yeah, I wasn't saying that the 6502 or a subreddit was responsible for the rise in asm, just pointing out that it's still popular. The architecture is still used in embedded systems, with hundreds of millions moved annually.
    • All of them. Obviously.

      I wonder how you even communicate with regular people, e.g. outside ...

  • I had Java for 1 year at Uni and have used it a grand total of 1 day in my professional career.

    • {shrug} I'm the opposite. Studied C in Univ and been using Java professionally for over 20 years. Right tool for right job. I suppose if we started our same application today it'd be written on .NET but it's been working out well for us. No issues migrating versions or platforms (from NT to Linux and back to Windows again, and a couple of database platform migrations to boot with fairly few issues).

      Obviously we had to move away from Oracle's implementation but OpenJDK proved to be a good drop-in replaceme
    • Yes, and I only coded in Pascal, PHP and then Haskell ever since in my professional career, so clearly PHP is one of the greatest and C is shite. ;)

      *shudders in PHP*

  • My favorite languages are C, C++, Java, PHP, Perl, JS and SQL; however, I have yet to need to learn Python or C#. From my experience here in the EU, Java has seen a surge in popularity, but jobs that require C are not something I encounter as often.
    • Forgot to mention Delphi, a language that I still encounter surprisingly often.
      • In Germany too.

        Seems like a lot of people got recommended Turbo Pascal in the early days, because let's face it, it was the hot shit for computer kids back in the day,
        and then also became fascinated with Delphi's RAD, which was amazing when it came out.

        It's just that some kept being stuck there.

        But hey, some people who do not code low-level stuff also seem to be stuck on C anyway. And many people seem to be stuck with C++, despite the Frankensteinian abomination that it is. At least Delphi was actually en

        • What makes you think Delphi is "primitive" and "limited"? When was the last time you actually used it?

          Delphi now supports Windows, Mac, Linux, iOS, and Android out of the box. The language is very expressive.

          Its limitations are due to the crowd of developers and 3rd-party component vendors who abandoned it when Borland became a sinking ship. And, after it was sold to Embarcadero, many didn't come back.

          But, it's an amazing platform. So, yeah... I

        • I like C++. I came to my current $DAYJOB from an embedded C++ place, and changed to Java in the hope of avoiding the kind of discussions about language constructs that dominated the previous job. But after some time I had to deal with both performance and latency, so I had to use C++ again. I estimated it would cost about 3 times as much to code in as Java. But after we got started on a good framework and automatic testing, the parts written in C++ are actually much better off than the old Java code.

          That said: Ja

          • Java compiles faster in large part because it doesn't generate machine code. You pay for that many times over in runtime JIT latency. Not the right tradeoff IMHO.

  • by BAReFO0t ( 6240524 ) on Sunday February 14, 2021 @08:24AM (#61062284)

    I know they mean well,

    but unless they themselves take the #1 spot on the index of indexes with a verified reliability and usefulness, their judgement is almost, but not quite, entirely useless. :)

    In other words: Their methods seem about as reliable as a Slashdot poll, so why are you telling us this, and why are you acting as if it were news?

    • by gweihir ( 88907 ) on Sunday February 14, 2021 @11:02AM (#61062620)

      Actually, Tiobe is not halfway bad. Sure, it partially reflects which languages are in demand rather than what people are actually working in, but as a relevancy index it serves well.

      The only problem with Tiobe is that some people refuse to accept what it clearly states because that does not fit into their mental model of how things have to be.

      • Actually, Tiobe is not halfway bad. Sure, it partially reflects which languages are in demand rather than what people are actually working in, but as a relevancy index it serves well.
        The only problem with Tiobe is that some people refuse to accept what it clearly states because that does not fit into their mental model of how things have to be.

        Here's a problem: it's based on search engine results, and there is nothing clear to accept about that other than that search engines return a lot of results for "c programming", for various reasons. Among the search engines used are a few you'd probably guess, and a few you wouldn't: eBay and Amazon. So the back catalog of 90s books on C/C++, PHP, Java, Perl etc. all contributes.

        It's not based on search engine query statistics or on job postings (OK, indeed.com is searched); it doesn't speak to maintained code vs. written code, or activity in t

        • FWIW, over time, mathematical methods can be used to compare it with other historical data to see if there is any correlation. I'm assuming no one has done that with TIOBE?

  • that this really means "C programming" search queries passed "Java programming" search queries to take #1 spot in the index of "x programming" search query popularity.

    just how this superficial metric correlates to important ones, such as language popularity, job markets, potential value of studying a language, etc. is anyone's guess.

    • so probably somebody wrote a bot to search for "C programming", just to change TIOBE index...and of course the bot was written in java language!
      • I just have a hard time imagining a programmer who knows a programming language ever using this specific query. In fact, I find it hard to believe many decent programmers or future programmers would even use a query worded this way; they look like something an uninitiated person or a student would use. So to me this index is much more indicative of ongoing interest in a programming language, which correlates mostly with how many junior programmers in this language we're going to have in a couple of years, rathe

    • I don't want to bash C programmers here, but I can't believe they are #1 in 2021. Most companies using C have evolved to C++, and most new projects are not started in C. Yes, if you build super-"scalable"/highly efficient code it's the best choice, but looking at job offers, I think "C programming language" might include other non-pure-C languages such as C++, C# or Objective-C.
      • by gweihir ( 88907 )

        They may well be #1 in demand though, and that demand may be hard to fill.

        Incidentally, my boss recently told me that for retirement he plans to refresh his Fortran skills, because there are some exceptionally well-paying jobs in that area that are very hard to fill.

      • According to https://www.tiobe.com/tiobe-in... [tiobe.com] they are using hit counts for +"language programming" for their index, where language is the name of an actual programming language. So the question is, what kind of person would type "c programming" as their search query? Certainly not those who are already programming in C. I can speculate that a significant part of these queries comes from CompSci freshfolks that almost universally study C as one of their first programming languages, and quite possibly THE f

    • by Rockoon ( 1252108 ) on Sunday February 14, 2021 @09:36AM (#61062444)
      Almost certainly a good chunk of the searches are for looking up the edge cases and complicated bits of the standard C preprocessor, and not C itself.

      The reason is that dozens of other languages also use the C preprocessor, and in some specific areas the preprocessor dominates every language within the domain (Cg, GLSL, HLSL, FX, ... every one a shader language, and ALL use the C preprocessor), and on some operating systems the exact same implementation of it (cpp on Linux) is used everywhere.

      What gets me is this drive towards less featureful preprocessors in the last batch of general-purpose languages, as if the programmer trusted to write the code cannot be trusted to write a macro that generates code. If anything, the standard C preprocessor could use a few more features and helpers for clearly useful simplifications (all the common "clever" tricks, like getting the parameter count of a macro invocation, should be first-class features requiring no tricks).
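
      (For the curious, the argument-counting trick referred to above looks roughly like this: a minimal sketch capped at 8 arguments. Real-world versions go to 64 and still miscount the zero-argument case, which is exactly the kind of wart a first-class feature would avoid.)

          #include <stdio.h>

          // Count the arguments of a macro invocation: the arguments push a descending
          // number sequence to the right, and whichever number lands in slot N is the count.
          #define PP_NARG(...)  PP_NARG_(__VA_ARGS__, 8, 7, 6, 5, 4, 3, 2, 1, 0)
          #define PP_NARG_(...) PP_ARG_N(__VA_ARGS__)
          #define PP_ARG_N(_1, _2, _3, _4, _5, _6, _7, _8, N, ...) N

          int main(void)
          {
              printf("%d\n", PP_NARG(a, b, c));        // prints 3
              printf("%d\n", PP_NARG(x, y, z, w, v));  // prints 5
              return 0;
          }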
  • by grub ( 11606 ) <slashdot@grub.net> on Sunday February 14, 2021 @09:09AM (#61062376) Homepage Journal
    When I woke up this morning the universe felt somehow in order, all things where they should me.
    • When I woke up this morning the universe felt somehow in order, all things where they should me.

      You must be part of the "Me" generation! Right on!

  • It says something about humanity that our top programming language is a language that was written as a quick hack on top of PDP-11 assembly and does not have support for dynamic arrays (you have to fake it with malloc) or even proper support for static arrays. When you pass arrays into a function in C, they become a pointer to memory, just as if they were malloc'ed. Yes, that's right, C's fundamental design flaws are completely hidden by its superficial design flaws.

    But look how fast it is!
    • Re:Sigh (Score:4, Interesting)

      by The Evil Atheist ( 2484676 ) on Sunday February 14, 2021 @09:40AM (#61062448)
      Other indexes don't have C at the top.

      C is the "top" for where it's used - close to the machine. The machine has no concept of arrays or even stacks, so it's not even a question of speed, but about a portable mapping of code to hardware. Speed is a side-effect of modelling the hardware closely. You simply don't want, and can't have, dynamic stuff at the machine level unless you provide the abstraction yourself - from providing your own library, to implementing a language on top of it

      Having said that, C99 has VLAs so it does have dynamic arrays without malloc, but it introduces the problem of running out of stack unpredictably. It turns out languages aren't magic, and compilers and interpreters eventually have to get down to the machine level.
      • This doesn't explain why C has no problem properly supporting static arrays defined *inside* the function, other than the fact Ken and Dennis were super-lazy and they already had the code for passing pointers by value, so, why not degenerate the array into a pointer and be done? What could possibly go wrong?
        • The static array inside the function is an illusion maintained by the compiler. The size is known at compile time, so the compiler can check it if the array is used in the same scope it's defined in. If you pass it to a generic function like memcpy, which has to work with arrays of all sizes, then you can't declare memcpy with every conceivable array size. So it all boils down to a pointer and a length.

          So it does explain it. The size of a statically sized array inside a function is an illusion maintained by the compiler.
        • This doesn't explain why C has no problem properly supporting static arrays defined *inside* the function

          What do you mean "properly" supporting? I know that sizeof only works in the scope that a static array is defined in, because the compiler has that information even though the array does not know its own size. When that same array is passed to a function, the compiler does not have that information because the array does not know its own size.

          When you pass arrays into a function in C, they become a pointer to memory, just as if they were malloc'ed

          I hate to be the one to break it to you, but arrays are a pointer to memory even before they are passed to a function. I don't know why you think that arrays magical
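
          (A small self-contained illustration of the decay being argued about here; the printed sizes assume a typical 64-bit target where pointers are 8 bytes, and most compilers even warn that sizeof on an array parameter returns the size of a pointer.)

            #include <stdio.h>

            static void takes_array(double a[16])            // adjusted by the compiler to double *a
            {
                printf("inside callee:  %zu\n", sizeof a);   // size of a pointer, e.g. 8
            }

            int main(void)
            {
                double arr[16];
                printf("at definition:  %zu\n", sizeof arr); // 16 * sizeof(double) = 128
                takes_array(arr);
                return 0;
            }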

            • The array should know its own size. Really, all it would have taken was an extra pointer pointing to the last element and some extra code in the compiler. But Ken and Dennis were lazy, even by "70s programmer hacking on a PDP-11" standards.
            • And no, I won't accept any "performance" bollocks. There is no real performance cost. It was all laziness, really. And yet everyone thinks the original C language was a masterpiece that shouldn't be improved upon, or should I say fixed.
            • by Uecker ( 1842596 )

              Arrays know their size in C! You can also get run-time bounds checking if your compiler supports it. (e.g. -fsanitize=undefined)

              Yes, an array passed to a function decays into a pointer to its first element and the size is lost but you can also pass a pointer to the array itself and then the size is preserved.

              void f(int n, double (*x)[n])
              {
                  (*x)[n] = 1; // deliberately out of bounds: you get a run-time error if your compiler supports it and it is activated.
              }

          • by Uecker ( 1842596 )

            If you have an array as a function parameter then it is adjusted to a pointer. While this does not affect the array itself, the parameter's type is changed and the length is lost.

            But you can prevent this by passing a pointer to the array:

            void f(int n, double (*x)[n])
            {
                // (*x) is an array of n doubles; sizeof *x == n * sizeof(double)
            }

            And arrays with dynamic bounds can be stored on the stack and also on the heap. I am not sure why people think VLAs can only exist on the stack.
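
            (A minimal sketch of the heap case, assuming a compiler with C99 VLA support: the pointed-to type carries the length n, while the storage itself comes from malloc.)

              #include <stdlib.h>

              int main(void)
              {
                  size_t n = 1000;
                  double (*x)[n] = malloc(sizeof *x); // sizeof *x is n * sizeof(double), evaluated at run time
                  if (!x)
                      return 1;
                  (*x)[0] = 1.0;                      // the array type still knows its bound
                  free(x);
                  return 0;
              }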

      • Wait, what? (Score:5, Informative)

        by fyngyrz ( 762201 ) on Sunday February 14, 2021 @11:27AM (#61062686) Homepage Journal

        The machine has no concept of arrays or even stacks

        Erm... microcontrollers have had stack pointers and register push/pull stack manipulation instructions since... well, just about the beginning. Index registers too. When the 6809 came out (1978), its stack pointers had all manner of cool, SP-specific instructions well beyond pushing and pulling registers. Not to mention tons of useful indexing modes that were specifically designed to, among other things, be quite array-riffic. Which work not only on the index registers, but on the stack pointers, too. So it is outright trivial to, for instance, implement an array or table or pointer or other datum on the stack, and pass it around likewise — the essence of procedure-local variables. And of course there are many indexing modes so you can do this in heaps, in memory, to memory-mapped I/O, etc.

        Bottom line, you don't need to go above ASM at all to work directly with arrays and stacks. Well, unless your uProcessor is less capable than a 1978-era uProcessor. And no one wants that. :)

        • That's still not a stack. Your compiler issues the instructions for stack management, or you have to manually write assembly to behave like a stack, but there's nothing stopping you from writing whatever the hell you like in assembly.

          Those registers and instructions are for convenience, but there is no hardware stack that actually keeps track of whether they're correct. That is all an abstraction/illusion maintained by the compiler, or by people opting into it when writing assembly. Therefore, the hardwa
          • Those registers and instructions are for convenience, but there is no hardware stack that actually keeps track of whether they're correct.

            Come on, that's absurd. Programmers have a job, you know. You can gloriously screw up a stack in c, c++, even Python. To say the language (ASM or whatever) has no concept of a stack is like saying it has no concept of anything, like program counters or branching. Compared to most things, uControllers are very well equipped with stack manipulation operations, designed to b

            • You can gloriously screw up a stack in c, c++, even Python.

              Yes, because those languages are defined with stack semantics in mind. You can screw up a stack in those languages because you break all the hidden assumptions that the compiler or interpreter makes.

              Assembly has no such limit. The instructions are for convenience, but they are not mandatory. The stack is mandatory given the way the higher level languages are designed. To be more precise, the abstract machine of C and C++ doesn't actually specify stacks exactly, which is why they've been able to add threa

    • It says something about humanity that our top programming language is ... C

      That we're awesome? Or maybe... that we think we're awesome, and it's the other idiots that are writing all those buffer overflow bugs?

      • C programmers are like drunk drivers (only for the case of C programmers, it's a permanent condition): Full of unwarranted confidence that their experience would never allow them to insert any buffer overflow or buffer over-reads bugs in their code, leading to the presumption that a brain-impaired language like C with no proper bounds checking is a non-issue. You see, it's always the other C programmers that cause carwrecks like Heartbleed.
    • On the other hand, the only language that I've been able to produce faster code in or better control memory usage with is assembly. When you're coding for devices, it matters.
    • Re:Sigh (Score:5, Informative)

      by monkeyxpress ( 4016725 ) on Sunday February 14, 2021 @02:52PM (#61063330)

      ...and does not have support for dynamic arrays (you have to fake it with malloc)...

      I'm sorry to have to break this to you, but your high level language is also faking dynamic arrays with malloc.

      Dynamic arrays do not exist as a hardware construct. The closest you can get is paging at the process level, but at the variable level it is not worth the complexity to attempt to support this in hardware. Hence all dynamic allocation is a tradeoff between performance and memory efficiency. There are endless ways to support this tradeoff, and whoever wrote the allocator for your high level language has had to decide what works best in general, which may be terrible for your particular application.

      If you want to just use a general purpose allocation system, then wrap that evil malloc in a library and away you go. It's really not very hard. But some of us need finer control of how things are working and that is available as well.

      Now don't get me wrong, high level languages are very good for many things, but an ignorance of what shiny features like automatic memory allocation are doing in the background is essentially the cause of problems like websites that chew 100's of MB to show a bit of text and images.
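
      (For example, a bare-bones sketch of such a wrapper; the vec type and vec_push function are made-up names for illustration, not any particular library.)

        #include <stdio.h>
        #include <stdlib.h>

        typedef struct {
            double *data;
            size_t  len, cap;
        } vec;

        // Append one element, growing the buffer geometrically; returns 0 on success.
        static int vec_push(vec *v, double value)
        {
            if (v->len == v->cap) {
                size_t new_cap = v->cap ? v->cap * 2 : 8;
                double *p = realloc(v->data, new_cap * sizeof *p);
                if (!p)
                    return -1;
                v->data = p;
                v->cap  = new_cap;
            }
            v->data[v->len++] = value;
            return 0;
        }

        int main(void)
        {
            vec v = {0};
            for (int i = 0; i < 100; i++)
                if (vec_push(&v, i * 0.5) != 0)
                    return 1;
            printf("%zu elements, capacity %zu\n", v.len, v.cap);
            free(v.data);
            return 0;
        }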

      • You're both wrong. C variable length arrays have automatic duration, that is they are allocated on the stack.

        • by Uecker ( 1842596 )

          C variable length arrays can be allocated on the stack or on the heap like all other data types.

  • I was wondering. For you device driver coders out there, what language are you using now?

    I was guessing maybe C or C++, but in reality what do you code in?
  • Is this in the same league as COBOL now?
    • Not really, the issue is support (and revenues).

      COBOL has great support via IBM.

      Classic VB has been sunsetted, not supported by MS, and server admins bring passion to their hatred of systems built on it.

      Anything you write today is technical debt tomorrow (all technologies are on a trend toward the debt heap). But some can be big money makers from a support perspective (MS missed the bus on this with VB).

  • we moved to the cloud for automation efficiency and other business aspects like costs. we are moving back to the edge for distributed compute power. why should i send my stuff to a cloud when i have the super computing power in my pocket? why should i lump all my data together with the rest of my million devices when i already had a distributed database in the form of my million devices? industry 4.0 integration? do it on site. why should i jump out of my robotic arms native programming language to ena
  • Personally, I'd like to see this list bifurcated into languages that are compiled versus ones that are interpreted.

    • Very few languages are exclusively in either camp. I've even used interpreted C and C derivatives many times in my career.
  • ...not in any malevolent, Machiavellian sense, but because they appear to be focused on safety critical systems, which in turn are more likely to involve asics / control systems etc and be coded in lower level languages. So I don't think this survey really means anything to the common or garden dev.
  • Java is an abomination, it was bad back when Sun owned it for technical reasons and now with Oracle owning it and suing everyone it is toxic as fuck.

  • Good compilers (Score:5, Interesting)

    by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Sunday February 14, 2021 @01:45PM (#61063140) Homepage Journal

    C is at the top of the list

    I credit improvements in compilers and other tools for this, primarily. Clang is amazing at flagging little problems at compile time — and newer releases of GNU C/C++ aren't bad either. I, for one, get a mild endorphin rush from fixing C (and C++) code to compile with -Wall -W

    Whatever the compiler misses, valgrind will catch at runtime. Having spent such effort at development, you get a lean and fast program to release — whereas the languages relieving you from the grunt work produce programs that remain both slow and resource-consuming forever...
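
    (A deliberately sloppy example of the workflow described above, assuming GCC or Clang plus valgrind: -Wall flags the unused variable at compile time, and valgrind --leak-check=full reports the unfreed buffer at runtime. The file name leak.c is just for illustration.)

      // leak.c: cc -Wall -W -g leak.c && valgrind --leak-check=full ./a.out
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      int main(void)
      {
          int unused;                 // -Wall: warning about the unused variable
          char *buf = malloc(32);
          if (!buf)
              return 1;
          strcpy(buf, "hello");
          printf("%s\n", buf);
          return 0;                   // buf is never freed: valgrind reports the lost block
      }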

    • C is at the top of the list

      I credit improvements in compilers and other tools for this, primarily. Clang is amazing at flagging little problems at compile time — and newer releases of GNU C/C++ aren't bad either.

      Amazing, certainly, but also a little sad. Many of the optimisations that can be done (or warnings that can be generated) are not being done/generated because of the huge body of code out there that will get broken/flagged, respectively.

      Unfortunately, you cannot blame the compiler writers, nor can you blame the programmers themselves - the ANSI committee is filled with morons who have no intention of making the language spec less ambiguous. After their last standard they were dumbfounded to learn that the c

      • by Uecker ( 1842596 )

        C is at the top of the list

        I credit improvements in compilers and other tools for this, primarily. Clang is amazing at flagging little problems at compile time — and newer releases of GNU C/C++ aren't bad either.

        Amazing, certainly, but also a little sad. Many of the optimisations that can be done (or warnings that can be generated) are not being done/generated because of the huge body of code out there that will get broken/flagged, respectively.

        Maybe not by default, but there are pretty good tools you can use. The problem with C is that it does not hold your hand: The standard library is minimalistic and compilers let you shoot yourself in the foot. But once you know the tools and have a set of good data structures available with safe APIs, C can be used safely in my opinion.

        Unfortunately, you cannot blame the compiler writers, nor can you blame the programmers themselves - the ANSI committee is filled with morons who have no intention of making the language spec less ambiguous.

        Well, the C standard is maintained by a working group of a joint ISO/IEC committee and there are quite a lot of compiler writers participating. Also there are quite a few peopl

        • After their last standard they were dumbfounded to learn that the changes they made to the C99 standard turned memcpy into undefined-behaviour.

          I am not sure what you are talking about. memcpy is defined as part of the standard library, so it certainly does not cause undefined behaviour when used as intended.

          From this link over here [yodaiken.com]:

          That last sentence in 6.5 Para7 is a hack added when the Committee realized it had made it impossible to write memcpy in C

          Their solution was to add a footnote with an exception for memcpy, meaning that you can't implement memcpy() in C (your implementation has no exception).

          It gets worse though:

          Yes, according to the C Standard, malloc cannot be written in C. In fact, the example of malloc in K&R 2cd edition is not written in C according to the standard under current interpretations. Instead there is special treatment for malloc as a mystery library function so it can return pointers to data of “no declared type” even though malloc is defined to return pointers to void. What if you want to write an allocator, not called malloc/free? I don’t know, perhaps you are supposed to use Rust or Swift.

          So, once again, the standard crippled the implementors by making sure they can't use standards-compliant C to write the malloc/memcpy implementation, then turned tons of existing legal code into illegal code.

          It actually gets a lot worse when you start reading up on the current interpretations of the standard

          • by Uecker ( 1842596 )

            I think you are a bit confused about the nature of undefined behavior (as are some of your sources - random blogs on the internet are sometimes not the ideal source). The C standard (and this was always the case) specifies certain things and does not specify certain other things (leaves it undefined). This is the reason why C is very portable while still being extremely light weight (does not need a costly abstraction layer). Some compiler writers promoted the idea that "undefined behavior" is illegal for a

            • What a wall of text. Use paragraphs dammit. If you can't take the time to write a message properly, why expect others to take the time to try to decipher it?

              I think you are a bit confused about the nature of undefined behavior (as are some of your sources - random blogs on the internet

              I hardly think that linking to a blog by a committee member is some "random blog on the internet" :-/ You didn't even find a link that agrees with anything you say, be it random or not!

              are sometimes not the ideal source). The C standard (and this was always the case) specifies certain things and does not specify certain other things (leaves it undefined). This is the reason why C is very portable while still being extremely light weight (does not need a costly abstraction layer). Some compiler writers promoted the idea that "undefined behavior" is illegal for a programmer to use and can be exploited for compilers for optimization, e.g. even a little bit of UB behavior allows a compiler to completely break your program. Now, strictly speaking the later is true from the point of the standard (as it is undefined behavior compilers can do what the want) and also partially intended.

              No, it is not intended. This is why the standard has "implementation-defined" in addition to "undefined". What you are describing here is implementation-defined behaviou

              • by Uecker ( 1842596 )

                UB is not forbidden. It has a definition in 3.4.3 (C17). Implementation defined also has a definition in 3.4.1

                But you are excused: obviously the standard text did not have enough paragraphs for you to be able to read it.

                And btw, I am a committee member (and a GCC contributor).

                • You know, I half want to copy this entire thread and post it somewhere just to display the problems with the committee that I (and others) have complained about. It's ironic that I complained that the committee members are morons because they aren't addressing the concerns with C and are instead making changes that make the language less safe and less defined than it is now, and you go ahead and demonstrate the problem I was complaining about!

                  UB is not forbidden. It has a definition in 3.4.3 (C17).

                  Not for correct programs you dimwit.

                  3.4.3 undefined behavior

                  behavior, upon use of a nonportable or erroneous program construct or of erroneous data, for which this International Standard imposes no requirements

                  Note 1 to entry: Possible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message).

                  So, yeah, UB is not forbidde

  • ... and not yet moved to Rust whilst Web-Hipsters try every new PL that comes out every odd month. It's Svelte now, wasn't it? Or has that passed already? What's new this quarter? Anyone?

  • The words "most popular" could mean a variety of things, of course. The measure coming from various search engines could also means so many different things. Are they looking at "searches" with programming language names in them? I'd think that's more a measure of developers probably having difficulty in the languages.

    I really wish these kinds of indexes and polls would--first and foremost--explain what they are trying to measure and how they are measuring it.

    To me, the languages' value is gauged by

    (1) p

  • by FeelGood314 ( 2516288 ) on Sunday February 14, 2021 @02:37PM (#61063282)
    You can write unmaintainable code in any language, but Java, JavaScript, and now C++ programmers have all embraced write once, read never. Scripting languages were never meant to be reread.
  • ...but aren't all the other languages pretty much written in C, or C++? I mean, sure, it's more complex than that. But I thought the Java, Javascript, Ruby, Lua, C#, Python etc etc runtimes are written in either C or C++.

    I know some others can compile themselves, maybe Go can, and D and stuff. Maybe Swift? But I think most are just written as a front end to clang, which is written in C++.

    C and C++ will outlive all of them.

  • Languages are popular for all sorts of reasons, including bad ones. Pay your bills, code the language you like or need and chill.
  • Why is SQL on that list, even though it's a domain-specific language for database queries (it even says so in the name) and not a general-purpose programming language?
