Why C Is Not My Favourite Programming Language

By James A C Joyce in Technology
Mon Feb 09, 2004 at 03:24:52 AM EST
Tags: Software

Brian Kernighan, the documenter of the C programming language, wrote a rant entitled Why Pascal is Not My Favourite Programming Language. I can picture him thinking to himself smugly as he strikes facetiously at Pascal, describing a few of its small flaws over and over again.

Unfortunately, time has not been kind to Kernighan's tract. Pascal has matured and grown in leaps and bounds, becoming a premier commercial language. Meanwhile, C has continued to stagnate over the last 35 years with few fundamental improvements made. It's time to redress the balance; here's why C is now owned by Pascal.


No string type

C has no string type. Huh? Most sane programming languages have a string type which allows one to just say "this is a string" and let the compiler take care of the rest. Not so with C. It's so stubborn and dumb that it only has three types of variable; everything is either a number, a bigger number, a pointer or a combination of those three. Thus, we don't have proper strings but "arrays of unsigned integers". "char" is basically only a really small number. And now we have to start using unsigned ints to represent multibyte characters.

What. A. Crock. An ugly hack.
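
Just to see what passes for a string, here's a minimal sketch (the variable name is mine; nothing about it is blessed by any standard):

#include <stdio.h>

int main(void)
{
    /* No string type: just an array of small numbers with a '\0'
       sentinel that every library function blindly trusts. */
    char greeting[6] = {'H', 'e', 'l', 'l', 'o', '\0'};
    printf("%s\n", greeting);
    return 0;
}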

Functions for insignificant operations

Copying one string into another requires including <string.h> in your source code, and there are two functions for copying a string. One could even conceivably copy strings using other functions (if one wanted to, though I can't imagine why). Why does any normal language need two functions just for copying a string? Why can't we just use the assignment operator ('=') as with the other types? Oh, I forgot. There's no such thing as a string in C; just a big continuous stick of memory. Great! Better still, there's no syntax for:

  • string concatenation
  • string comparison
  • substrings

Ditto for converting numbers to strings, or vice versa. You have to use something like atol(), or strtod(), or a variant on printf(). Three families of functions for variable type conversion. Hello? Flexible casting? Hello?
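
As a sketch, here are all three families side by side; nothing exotic, just standard <stdlib.h> and <stdio.h>:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    long   l = atol("42");            /* family one: ato*()      */
    double d = strtod("3.14", NULL);  /* family two: strto*()    */
    char buf[32];
    sprintf(buf, "%ld", l);           /* family three: *printf() */
    printf("%s %f\n", buf, d);
    return 0;
}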

And don't even get me started on the lack of exponentiation operators.

No string type: the redux

Because there's no real string type, we have two options: arrays or pointers. Array sizes can only be constants. This means we run the risk of buffer overflow since we have to try (in vain) to guess in advance how many characters we need. Pathetic. The only alternative is to use malloc(), which is just filled with pitfalls. The whole concept of pointers is an accident waiting to happen. You can't free the same pointer twice. You have to always check the return value of malloc() and you mustn't cast it. There's no built-in way of telling if a spot of memory is in use, or if a pointer's been freed, and so on and so forth. Having to resort to low-level memory operations just to be able to store a line of text is asking for...

The encouragement of buffer overflows

Buffer overflows abound in virtually any substantial piece of C code. This is caused by programmers accidentally putting too much data in one space or leaving a pointer pointing somewhere because a returning function ballsed up somewhere along the line. C includes no way of telling when the end of an array or allocated block of memory is overrun. The only way of telling is to run, test, and wait for a segfault. Or a spectacular crash. Or a slow, steady leakage of memory from a program, agonisingly 'bleeding' it to death.
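
A minimal sketch of the classic blunder: eight bytes reserved, far more written, and the compiler says nothing.

#include <string.h>

int main(void)
{
    char buf[8];
    /* Writes far past the end of buf: undefined behaviour,
       and not a peep at compile time. */
    strcpy(buf, "rather more than eight bytes of text");
    return 0;
}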

Functions which encourage buffer overflows

  • gets()
  • strcat()
  • strcpy()
  • sprintf()
  • vsprintf()
  • bcopy()
  • scanf()
  • fscanf()
  • sscanf()
  • getwd()
  • getopt()
  • realpath()
  • getpass()

The list goes on and on and on. Need I say more? Well, yes I do.

You see, even if you're not writing to memory, you can still read memory you're not supposed to. C can't be bothered to keep track of the ends of strings; the end of a string is indicated by a null '\0' character. All fine, right? Well, some functions in your C library, such as strlen(), perhaps, will just run off the end of a 'string' if it doesn't have a null in it. What if you're using a binary string? Careless programming this may be, but we all make mistakes, and so the language authors have to take some responsibility for being so intolerant.
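
For instance, a sketch of the trap (the array name is mine):

#include <string.h>

int main(void)
{
    char raw[4] = {'a', 'b', 'c', 'd'};  /* binary data: no '\0' anywhere */
    size_t n = strlen(raw);  /* happily reads past the end: undefined behaviour */
    return (int)n;
}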

No built-in Boolean type

If you don't believe me, just watch:

$ cat > test.c
int main(void)
{
    bool b;
    return 0;
}

$ gcc -ansi -pedantic -Wall -W test.c
test.c: In function 'main':
test.c:3: 'bool' undeclared (first use in this function)

Not until the 1999 ISO C standard were we finally able to use 'bool' as a data type. But guess what? It's implemented as a macro and one actually has to include a header file to be able to use it!
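
For the record, the C99 workaround looks something like this; 'bool', 'true' and 'false' all arrive as macros from that header:

#include <stdbool.h>  /* C99: bool is a macro expanding to _Bool */

int main(void)
{
    bool b = true;
    return b ? 0 : 1;
}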

High-level or low-level?

On the one hand, we have the fact that there is no string type, and direct memory management, implying a low-level language. On the other hand, we have a mass of library functions, a preprocessor and a plethora of other things which imply a high-level language. C tries to be both, and as a result spreads itself too thinly.

The great thing about this is that when C is lacking a genuinely useful feature, such as reasonably strong data typing, the excuse "C's a low-level language" can always be used, functioning as a perfect 'reason' for C to remain unhelpfully and fatally sparse.

The original intention for C was for it to be a portable assembly language for writing UNIX. Unfortunately, from its very inception C has had extra things packed into it which make it fail as an assembly language. Its kludgy strings are a good example. If it were at least portable these failings might be forgivable, but C is not portable.

Integer overflow without warning

Self explanatory. One minute you have a fifteen digit number, then try to double or triple it and boom! its value is suddenly -234891234890892 or something similar. Stupid, stupid, stupid. How hard would it have been to give a warning or overflow error or even reset the variable to zero?

This is widely known as bad practice. Most competent developers acknowledge that silently ignoring an error is a bad attitude to have; this is especially true for such a commonly used language as C.
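
A sketch of the silence in action; on a typical machine with 32-bit ints this wraps around without a peep (signed overflow is formally undefined behaviour, so strictly speaking anything at all may happen):

#include <limits.h>
#include <stdio.h>

int main(void)
{
    int big = INT_MAX;  /* e.g. 2147483647 */
    big = big + 1;      /* no warning, no error: typically -2147483648 */
    printf("%d\n", big);
    return 0;
}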

Portability?!

Please. There are at least four official specifications of C I could name off the top of my head, and no compiler has properly implemented all of them. They conflict, and they grow and grow. The problem isn't subsiding; it's getting worse every day. New compilers and libraries keep appearing, and with them new proprietary extensions. GNU C isn't the same as ANSI C isn't the same as K&R C isn't the same as Microsoft C isn't the same as POSIX C. C isn't portable; all kinds of machine architectures are totally different, and C can't properly adapt because it's so muttonheaded. It's trapped in The Unix Paradigm.

If it weren't for the C preprocessor, it would be virtually impossible to get C to run on multiple families of processor hardware, or even on slightly differing operating systems. A programming language should not require a preprocessor just so that it can compile on FreeBSD, Linux and Windows alike.

C is unable to adapt to new conditions for the sake of "backward compatibility", throwing away the opportunity to get rid of stupid, utterly useless and downright dangerous functions for a nonexistent goal. And yet C is growing new tentacles and unnecessary features because of idiots who think adding seven new functions to their C library will make life easier. It does not.

Even the C89 and C99 standards conflict with each other in ridiculous ways. Can you use the long long type or can't you? Is a certain constant defined by a preprocessor macro hidden deep, deep inside my C library? Is using a function in this particular way going to be undefined, or acceptable? What do you mean, getch() isn't a proper function but getchar() is?

The implications of this false 'portability'

Because C pretends to be portable, even professional C programmers can be caught out by hardware and an unforgiving programming language. Almost anything - comparisons, character assignments, arithmetic, string output - can blow up spectacularly for no apparent reason because of endianness, or because your particular processor treats all chars as unsigned, or some other silly, subtle, deadly trap like that.

Archaic, unexplained conventions

In addition to the aforementioned problems, C also has various idiosyncrasies (invariably unreported) which not even some teachers of C are aware of: "Don't use fflush(stdin), gets() is evil, main() must return an integer, main() can only take one of three sets of arguments, you mustn't cast the return value of malloc(), fileno() isn't an ANSI-compliant function..." All these unnecessary and unmentioned quirks mean buggy code. Death by a thousand cuts. Ironic, when you consider that Kernighan thinks of Pascal in the same way, when C has just as many little gotchas that bleed you to death gradually and painfully.

Blaming The Programmer

Because C is pretty difficult to learn, and even harder to use without breaking something in a subtle yet horrific way, it's assumed that anything which goes wrong is the programmer's fault. If your program segfaults, it's your fault. If it crashes, mysteriously returning 184 with no error message, it's your fault. When one single condition you'd happened to forget about while coding screws up, it's your fault.

Obviously the programmer has to shoulder most of the responsibility for a broken program. But as we've already seen, C positively tries to make the programmer fail. This increases the failure rate and yet for some reason we don't blame the language when yet another buffer overflow is discovered. C programmers try to cover up C's inconsistencies and inadequacies by creating a culture of 'tua culpa'; if something's wrong, it's your fault, not that of the compiler, linker, assembler, specification, documentation, or hardware.

Compilers have to take some of the blame. Two reasons. The first is that most compilers have proprietary extensions built into them. Let me remind you that half of the point of using C is that it should be portable and compile anywhere. Adding extensions violates the original spirit of C and removes one of its advantages (albeit an already diminished advantage).

The other (and perhaps more pressing) reason is the lack of anything beyond minimal error checking which C compilers do. For every ten types of errors your compiler catches, another fifty will slip through. Beyond variable type and syntax checking the compiler does not look for anything else. All it can do is give warnings on unusual behaviour, though these warnings are often spurious. On the other hand, a single error can cause a ridiculous cascade, or make the compiler fall over and die because of a misplaced semicolon, or, more accurately and incriminatingly, a badly constructed parser and grammar. And yet, despite this, it's your fault.

To quote The Unix Haters' Handbook:

"If you make even a small omission, like a single semicolon, a C compiler tends to get so confused and annoyed that it bursts into tears and complains that it just can't compile the rest of the file since one missing semicolon has thrown it off so much."

So C compilers may well give literally hundreds of errors stating that half of your code is wrong if you miss out a single semicolon. Can it get worse? Of course it can! This is C!

You see, a compiler will often not deluge you with error information when compiling. Sometimes it will give you no warning whatsoever even if you write totally foolish code like this:

#include <stdio.h>

int main()
{
    char *p;
    puts(p);
    return 0;
}

When we compile this with our 'trusty' compiler gcc, we get no errors or warnings at all. Even when using the '-W' and '-Wall' flags to make it watch out for dangerous code it says nothing.

In fact, no warning is ever given unless you try to optimise the program with an '-O' flag. But what if you never optimise your program? Well, you now have a dangerous program. And unless you check the code again, you may well never notice that error.

What this section (and entire document) is really about is the sheer unfriendliness of C and how it is as if it takes great pains to be as difficult to use as possible. It is flexible in the wrong way; it can do many, many different things, but this makes it impossible to do any single thing with it.

Trapped in the 1970s

C is over thirty years old, and it shows. It lacks features that modern languages have such as exception handling, many useful data types, function overloading, optional function arguments and garbage collection. This is hardly surprising considering that it was constructed from an assembler language with just one data type on a computer from 1970.

C was designed for the computer and programmer of the 1970s, sacrificing stability and programmer time for the sake of memory. Despite the fact that the most recent standard is just half a decade old, C has not been updated to take advantage of increased memory and processor power to implement such things as automatic memory management. What for? The illusion of backward compatibility and portability.

Yet more missing data types

Hash tables. Why was this so difficult to implement? C is intended for the programming of things like kernels and system utilities, which frequently use hash tables. And yet it didn't occur to C's creators that maybe including hash tables as a type of array might be a good idea when writing UNIX? Perl has them. PHP has them. With C you have to fake hash tables, and even then it doesn't really work at all.
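
To show what "faking" one involves, here's a minimal sketch of the boilerplate (fixed bucket count, no resizing, no deletion; all the names are mine):

#define NBUCKETS 64

struct entry {
    const char   *key;
    int           value;
    struct entry *next;   /* chain for colliding keys */
};

static struct entry *table[NBUCKETS];

/* A simple multiplicative string hash; pick your favourite. */
static unsigned hash(const char *s)
{
    unsigned h = 0;
    while (*s)
        h = h * 31 + (unsigned char)*s++;
    return h % NBUCKETS;
}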

Multidimensional arrays. Before you tell me that you can do stuff like int multiarray[50][50][50] I think that I should point out that that's an array of arrays of arrays. Different thing. Especially when you consider that you can also use it as a bunch of pointers. C programmers call this "flexibility". Others call it "redundancy", or, more accurately, "mess".

Complex numbers. They may be in C99, but how many compilers support that? It's not exactly difficult to get your head round the concept of complex numbers, so why weren't they included in the first place? Were complex numbers not discovered back in 1989?
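
Where a compiler does support it, the C99 version looks like this (you may need -lm at link time):

#include <complex.h>
#include <stdio.h>

int main(void)
{
    double complex z = 1.0 + 2.0 * I;  /* C99 <complex.h> */
    printf("|z| = %f\n", cabs(z));     /* magnitude: about 2.236068 */
    return 0;
}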

Binary strings. It wouldn't have been that hard just to make a compulsory struct with a mere two members: a char * for the string of bytes and a size_t for the length of the string. Binary strings have always been around on Unix, so why wasn't C more accommodating?
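
Something like this sketch is all it would have taken; the names are mine, since no such struct was ever standardised:

#include <stddef.h>

struct bstring {
    char  *data;   /* raw bytes; may legitimately contain '\0' */
    size_t len;    /* explicit length instead of a terminator */
};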

Library size

The actual core of C is admirably small, even if some of the syntax isn't the most efficient or readable (case in point: the '?:' conditional operator). The one thing that is bloated is the C library. The number of functions in a full C library which complies with all significant standards runs into four figures. There's a great deal of redundancy, and code which really shouldn't be there.

This has knock-on effects, such as the large number of configuration constants which are defined by the preprocessor (which shouldn't be necessary), the size of libraries (the GNU C library almost fills a floppy disk, and its documentation fills three), and inconsistently named groups of functions, in addition to duplication.

For example, a function for converting a string to a long integer is atol(). One can also use strtol() for exactly the same thing. Boom - instant redundancy. Worse still, both functions are included in the C99, POSIX and SUSv3 standards!

Can it get worse? Of course it can! This is C!

As a result it's only logical that there's an equivalent pair of atod() and strtod() functions for converting a string to a double. As you've probably guessed, this isn't true. They are called atof() and strtod(). This is very foolish. There are yet more examples scattered through the standard C library like a dog's smelly surprises in a park.
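
Side by side, as a sketch:

#include <stdlib.h>

int main(void)
{
    long   a = atol("42");              /* 'l' for long: fair enough */
    long   b = strtol("42", NULL, 10);  /* the same job again */
    double c = atof("3.14");            /* 'f' for... double? */
    double d = strtod("3.14", NULL);    /* at least this one is honest */
    return (int)(a + b + c + d);
}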

The Single Unix Specification version three specifies 1,123 functions which must be available to the C programmer of the compliant system. We already know about the redundancies and unnecessary functions, but across how many header files are these 1,123 functions spread out? 62. That's right: on average, a C library header defines approximately eighteen functions. Even if you only need to use maybe one function from each of, say, five headers (a common occurrence), you may well wind up including 90, 100 or even 150 function definitions you will never need. Bloat, bloat, bloat. Python has the right idea; its import statement lets you pull in exactly the functions (and global variables!) you need from each module, if you prefer. But C? Oh, no.

Specifying structure members

Why does this need two operators? Why do I have to pick between '.' and '->' for a ridiculous, arbitrary reason? Oh, I forgot; it's just yet another of C's gotchas.
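
The arbitrary distinction in full; the struct is mine, purely for illustration:

struct point { int x, y; };

int demo(void)
{
    struct point  p  = {1, 2};
    struct point *pp = &p;

    int a = p.x;      /* direct member access: '.'  */
    int b = pp->y;    /* through a pointer: '->'    */
    int c = (*pp).y;  /* '->' is nothing but sugar for this */
    return a + b + c;
}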

Limited syntax

A couple of examples should illustrate what I mean quite nicely. If you've ever programmed in PHP for a substantial period of time, you're probably aware of the 'break' keyword. You can use it to break out from nested loops of arbitrary depth by using it with an integer, such as "break 3"; this would break out of three levels of loops.

There is no way of doing this in C. If you want to break out from a series of nested for or while loops then you have to use a goto. This is what is known as a crude hack.
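
The crude hack in question, sketched; found() is a hypothetical predicate of mine:

extern int found(int row, int col);  /* hypothetical */

int search(int rows, int cols)
{
    int i, j;

    for (i = 0; i < rows; i++)
        for (j = 0; j < cols; j++)
            if (found(i, j))
                goto done;  /* the only clean way out of both loops */
done:
    return i;
}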

In addition to this, there is no way to compare any non-numerical data type using a switch statement. C does not allow you to use switch and case statements for strings. One must use several variables to iterate through an array of case strings and compare them to the given string with strcmp(). This reduces performance and is just yet another hack.

In fact, this is an example of gratuitous library functions running wild once again. Even comparing one string to another requires use of the strcmp() function.
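
What switch-on-strings has to look like in C; the handlers here are hypothetical stand-ins:

#include <string.h>

extern void do_start(void), do_stop(void), usage(void);  /* hypothetical */

void dispatch(const char *cmd)
{
    /* No switch on strings: an if/strcmp() ladder instead. */
    if (strcmp(cmd, "start") == 0)
        do_start();
    else if (strcmp(cmd, "stop") == 0)
        do_stop();
    else
        usage();
}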

Flushing standard I/O

A simple microcosm of the "you can do this, but not that" philosophy of C; one has to do two different things to flush standard input and standard output.

To flush the standard output stream, one can use fflush() (defined by <stdio.h>). One doesn't usually need to do this after every bit of text is printed, but it's nice to know it's there, right?

Unfortunately, one cannot use fflush() to flush the contents of standard input. The C standards explicitly leave its behaviour on input streams undefined, yet this is so illogical that even textbook authors sometimes mistakenly use fflush(stdin) in examples, and some compilers won't bother to warn you about it. One shouldn't even have to flush standard input; you ask for a character with getchar(), and the program should just read in the first character given and disregard the rest. But I digress...

There is no 'real' way to flush standard input up to, say, the end of a line. Instead one has to use a kludge like so:

/* Needs <stdio.h>, <errno.h> and <string.h>. */
int c;
do {
    errno = 0;
    c = getchar();

    if (errno) {
        fprintf(stderr,
                "Error flushing standard input buffer: %s\n",
                strerror(errno));
    }
} while ((c != '\n') && (!feof(stdin)));

That's right; you need to use a variable, a looping construct, two library functions and several lines of exception handling code to flush the standard input buffer.

Inconsistent error handling

A seasoned C programmer will be able to tell what I'm talking about just by reading the title of this section. There are many incompatible ways in which a C library function indicates that an error has occurred:

  • Returning zero.
  • Returning nonzero.
  • Returning a NULL pointer.
  • Setting errno.
  • Requiring a call to another function.
  • Outputting a diagnostic message to the user.

Some functions may actually use up to three of these methods. None of them are compatible with each other, and error handling does not occur automatically; every time a C programmer uses a library function, they must check manually for an error. This bloats code which would otherwise be perfectly readable, filling it with if-blocks for error handling and variables to keep track of errors. In a large software project, one must write error-handling code like this hundreds of times. If you forget, something can go horribly wrong. For example, if you don't check the return value of malloc() you may accidentally try to use a null pointer. Oops...
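
A sketch of three of those conventions colliding within a dozen lines (the function name is mine):

#include <stdio.h>
#include <stdlib.h>

int read_block(const char *path)
{
    FILE *f;
    char *buf;
    size_t n;

    f = fopen(path, "r");        /* failure: NULL return; errno set on most systems */
    if (f == NULL) {
        perror(path);
        return -1;
    }

    buf = malloc(1024);          /* failure: NULL return */
    if (buf == NULL) {
        fclose(f);
        return -1;
    }

    n = fread(buf, 1, 1024, f);  /* failure: short count plus ferror() */
    if (n < 1024 && ferror(f))
        fprintf(stderr, "read error\n");

    free(buf);
    fclose(f);
    return 0;
}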

Commutative array subscripting

"Hey, Thompson, how can I make C's syntax even more obfuscated and difficult to understand?"

"How about you allow 5[var] to mean the same as var[5]?"

"Wow; unnecessary and confusing syntactic idiocy! Thanks!"

"You're welcome, Dennis."

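It's true, and it compiles without a murmur:

int main(void)
{
    int var[10];

    var[5] = 1;  /* a[i] is defined as *(a + i), and addition commutes... */
    5[var] = 2;  /* ...so this names exactly the same element */
    return var[5];  /* returns 2 */
}
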
Variadic anonymous macros

In case you don't understand what variadic anonymous macros are, they're macros (i.e. pseudofunctions defined by the preprocessor) which can take a variable number of arguments. Sounds like a simple thing to implement. I mean, it's all done by the preprocessor, right? And besides, you can define proper functions with variable numbers of arguments even in the original K&R C, right?

In that case, why can't I do:

#define error(...) fprintf(stderr, __VA_ARGS__)

without getting a warning from GCC?

warning: anonymous variadic macros were introduced in C99

That's right, folks. Not until late 1999, 30 years after development of the C programming language began, were we allowed to do such a simple thing with the preprocessor.

The C standards don't make sense

Only one simple quote from the ANSI C standard - nay, a single footnote - is needed to demonstrate the immense idiocy of the whole thing. Ladies, gentlemen, and everyone else, I present to you...footnote 82:

All whitespace is equivalent except in certain situations.

I'd make a cutting remark about this, but it'd be too easy.

Too much preprocessor power

Rather foolishly, half of the actual C language is reimplemented in the preprocessor. (This should be a concern from the start; redundancy usually indicates an underlying problem.) We can #define fake variables, fake conditions with #ifdef and #ifndef, and look, there's even #if, #endif and the rest of the crew! How useful!

Erm, sorry, no.

Preprocessors are a good idea for a language like C. As has already been said, C is not portable. Preprocessors are vital to bridging the gap between different computer architectures and libraries, and to allowing a program to compile on multiple machines without having to rely on external programs. The #define statement, in this case, can be used perfectly validly to set 'flags' that can be used by a program to determine all sorts of things: which C standard is being used, which library, who wrote it, and so on and so forth.

Now, the situation isn't as bad as for C++. In C++, the preprocessor is so packed with unnecessary rubbish that one can actually use it to calculate an arbitrary series of Fibonacci numbers at compile-time. However, C comes dangerously close; it allows the programmer to define fake global variables with wacky values which would not otherwise be proper code, and then compare values of these variables. Why? It's not needed; the C language of the Plan 9 operating system doesn't let you play around with preprocessor definitions like this. It's all just bloat.

"But what about when we want to use a constant throughout a program? We don't want to have to go through the program changing the value each time we want to change the constant!" some may complain. Well, there's these things called global variables. And there's this keyword, const. It makes a constant variable. Do you see where I'm going with this?

You can do search and replace without the preprocessor, too. In fact, they were able to do it back in the seventies on the very first versions of Unix. They called it sed. Need something more like cpp? Use m4 and stop complaining. It's the Unix way!
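
Side by side, a sketch of the two approaches (both lines are mine):

#define BUFLEN 1024              /* preprocessor: blind textual substitution */
static const int buflen = 1024;  /* the language's own typed, scoped constant */

(One caveat: in C, unlike C++, a const int still can't size a fixed-length array, so the preprocessor limps on for that one case.)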

Why C Is Not My Favourite Programming Language | 556 comments (448 topical, 108 editorial, 3 hidden)
+1 FP: addresses a real problem (2.00 / 10) (#6)
by DJ Google on Sat Feb 07, 2004 at 03:28:27 PM EST

Even Microsoft knows better than to use C. Shit C sucks so badly they had to make their own language (C#) which corrects most of the mistakes mentioned in this article.

--
Join me on irc.slashnet.org #Kuro5hin.org - the official Kuro5hin IRC channel.

+1FP, James A C Joyce (1.28 / 14) (#7)
by Michael Jackson on Sat Feb 07, 2004 at 03:35:21 PM EST

Child porn is much more enjoyable than C.

#kuro5hin.org -- irc.slashnet.org -- On the fucking spoke.
drdink -- gimpy pedo-fag felching drwiii off in the weeds

At least (2.69 / 13) (#8)
by flo on Sat Feb 07, 2004 at 03:40:05 PM EST

C lets you shoot yourself in the foot.
---------
"Look upon my works, ye mighty, and despair!"
-1, !clue (2.57 / 19) (#10)
by Bad Harmony on Sat Feb 07, 2004 at 03:56:12 PM EST

It might help if you understood what you are criticizing.

The signedness of chars is implementation dependent. On some systems they are signed.

C lets the programmer choose the string format. The language does not predefine it. Null terminated strings are a convention of libraries and system calls on certain operating systems.

Overflow detection is implementation dependent.

C was not designed to be a user-friendly language that holds your hand, wipes your bottom, and kisses your scrapes. It is a minimal set of abstractions over the hardware. It is not a general purpose high-level language for developing applications! It is a systems programming language, for use by experienced programmers. If you don't like that, go away and use some other language that is more appropriate for your task.

54º40' or Fight!

There're some things I love about C. (2.88 / 18) (#11)
by fae on Sat Feb 07, 2004 at 03:58:08 PM EST

It is very easy to learn the ins and outs of the core of C. It's all simple and consistent and elegant. I also love what you can do with C pointers. They are so powerful. Function pointers just make the deal so much sweeter.

I agree with much of your submission. Too difficult to do Unicode, easy to accidentally overflow, rarely used functions (strncpy, anyone?), etc.

Yet, C is assembly language made (mostly) portable. Some of the difficulties of C are just a necessary outcome of its low-levelness. Other difficulties came when they decided to add bits and pieces of high-level. (C++ took this bloating even further and that is why I hate it.)

By the way, you should have a conclusion section.

-- fae: but an atom in the great mass of humanity

shamelessly stolen from /. (1.85 / 7) (#16)
by pertubation theory on Sat Feb 07, 2004 at 04:21:16 PM EST

RespeCt the C++k! And tame the C#nt!

----
Dice are small polka-dotted cubes of ivory constructed like a lawyer to lie upon any side, commonly the wrong one.
- Ambrose Bierce
Just for the hell of it, a rebuttal (2.83 / 24) (#18)
by curien on Sat Feb 07, 2004 at 04:28:37 PM EST

I'm not going to go point-by-point, but I'll refute what comes to mind easily (ie, things that don't take too much thought to refute). I'm also skipping the ones I feel are completely retarded.

No string type
In C, a string is not a type, it's a data format. I suppose you can see this as a weakness, but the large number of routines available for manipulation of this data format have put the C-style string into the realm of abstract data type, IMO. I'd like to hear any argument otherwise. You might as well rant about how C doesn't have a matrix type.

Functions for insignificant operations
You're kidding, right? What are you, an APL fanatic? Or maybe you liked having to use keywords for everything (as in Cobol and Pascal)?

Flexible casting
You are showing your ignorance. Casting is not conversion, and "conversion" is what you really meant.

Array sizes can only be constants
Bzzt... wrong. Next!

The encouragement of buffer overflows
This is indeed an issue one must face when using C. I'm mostly a C++ programmer, and I only use raw memory (pointers, etc) when writing interfaces to low-level functionality. Without manipulating raw memory, programming wouldn't be possible, but it must be localized and (heh) buffered from the client programmer.

Integer overflow without warning
It's because C's a low-level language. :-} Seriously, if you want a language that holds your hand, use Ada. If you want one that only does exactly what you tell it, use assembly. If you want a portable assembly, use C.

There are at least four official specifications of C I could name from the top of my head
Nope, only three, and one of them is just a library extension. Two of those three have been implemented completely in all four major compilers that I've used over the years.

If it weren't for the C preprocessor, then it would be virtually impossible to get C to run on multiple families of processor hardware
Only if it wants to take advantage of OS-specific functionality. Many Unix command-line utilities, for example, can be implemented in macro-free, standard C.

Even the C89 and C99 standards conflict with each other in ridiculous ways. Can you use the long long type or can't you?
Huh? First, it's not a conflict. Second, are you saying that new versions shouldn't provide new features?

almost anything like comparisons, character assignments, arithmetic, or string output can blow up spectacularly for no apparent reason because of endianness or because your particular processor treats all chars as unsigned or silly, subtle, deadly traps like that
Present one real-world example where a character assignment fails for platform-specific reasons.

various idiosyncrasies... all these unnecessary and unmentioned quirks
Umm... how can they be unmentioned if people mention them? While you're at it, let's wonder why Pascal requires a period at the end of the main block's "END".

When one single condition you'd just happened to have forgotten about whilst coding screws up, it's your fault.
Well, yeah. Obviously the programmer has to shoulder most of the responsibility for a broken program. See, you agree!

Unfortunately, one cannot use fflush() to flush the contents of standard input.
You can't use sync as a filename completion utility, either.

About your code sample: unnecessarily complex.

  int c;
  while((c = getchar()) != '\n' && c != EOF) ;

Besides, you betray a basic fallacy in your complaint. There is no 'real' way to flush standard input up to, say, the end of a line. That's because the "end of the line" part is awfully arbitrary. Why should the library decide for you how much of the buffer to flush? Flushing the entire thing (which is what fflush does with output streams) is simply infeasible with input streams, as it would leave the stream at EOF. Not very useful, is it?

Inconsistent error handling
Oh, the horrors of backwards compatibility. This is indeed a failing.

Variadic anonymous macros
You could do it before, just not as easily. Also, do I have this right... you're complaining that C does provide a feature you're asking for?

The C standards don't make sense
First, footnotes are non-normative. Second, any document can be snipped in such a way as to appear to not make sense. (The C standard has real defects, as any paper of that size will, but that footnote is not one of them.)

In C++, the preprocessor is so packed with unnecessary rubbish that one can actually use it to calculate an arbitrary series of Fibonacci numbers at compile-time.
This is unrelated to the real argument, but you're thinking of templates, not the preprocessor. The C++ preprocessor is the same as the C90 preprocessor. This comment is interesting to note, however, in that it is indicative of your general lack of subject knowledge.

--
All God's critters got a place in the choir
Some sing low, some sing higher

The real problem. (2.37 / 8) (#31)
by tkatchev on Sat Feb 07, 2004 at 06:12:30 PM EST

The real problem is the fact that C is an ideal abstract, portable assembler for the PDP.

Unfortunately, though, modern processors are quite a bit different from those in the era of PDP. We need something that understands and takes advantage of the vastly improved processor architectures of today.


   -- Signed, Lev Andropoff, cosmonaut.

Don't let the door hit you on the way out... (2.42 / 14) (#33)
by BenJackson on Sat Feb 07, 2004 at 06:13:51 PM EST

I for one applaud your dislike of C.  That leaves more jobs available for those of us who can be trusted to use sharp tools.

By the way, how come your rant didn't include the lack of a native complex number type, or the lack of proper tail recursion?  Not to mention no lazy evaluation, no garbage collection and it makes you use symbols for the syntax instead of plain English.  It can't even figure out your program structure from looking at the whitespace.  And it's taking up 1/26th of all of the namespace available for one-letter programming language names!

+1FP Original! (2.42 / 7) (#37)
by Run4YourLives on Sat Feb 07, 2004 at 06:34:44 PM EST

Finally, someone has found something wrong with this language and speaks up.

Please go away.

It's slightly Japanese, but without all of that fanatical devotion to the workplace. - CheeseburgerBrown

wtf is pascal? [nt] (2.57 / 7) (#39)
by quartz on Sat Feb 07, 2004 at 06:51:09 PM EST



--
Fuck 'em if they can't take a joke, and fuck 'em even if they can.
Another fine rant. (2.84 / 25) (#41)
by it certainly is on Sat Feb 07, 2004 at 07:13:44 PM EST

Very good, sir. Here are some of my disagreements, at random.
  • no string type: C is not a high level language. Repeat after me: C is not a high level language. No modern processor actually has "string" support (x86 has some useless pascal-style string instructions, limited to 255 byte strings), and C always works in terms of the processor. As strings these days need to include full unicode support, I think C made the right choice: use an external library to do strings. The standard library has better things to do than bloat. The string library as it stands is small and useful for the occasional use of strings. For heavy use of string manipulation, use Perl. C is a low level language. It is not a string manipulating language.
  • no bool type: what is it with you and strong typing?!?!?!?! Gah, I can't stand that. Processors have a "zero" flag. The zero flag gets set on comparison of equal integers. In C, something is false if it's 0 (Branch If Zero), or else it's true (Branch If Not Zero). Actually forcing people to change nonzeros to explicitly the "true" constant is the kind of paper-pushing, dotting-the-'i's-and-crossing-the-'t's shit that low level languages like C do not put up with.
  • anything else to do with explicit language support for datatypes and structures: fuck off. C is a low level language. Use something else for frilly bits.
  • any language features that are not basic features of the von Neumann architecture: fuck off. C is a low level language. Use something else for frilly bits.
  • library cruft: we should just re-write old programs, then? You'll note that atoi() and atol() are just simple macro front-ends to the generic strtol().
  • operators: "*", "[]" and "->" are dereferences. "." is a structure offset. They are two different things, and if you don't know the difference then you are probably writing inefficient code with far more dereferences than necessary.
  • string cases: are you mad? Using ifs and strcmp()s is identical to how a higher level language would implement switch() on actual strings. You'll probably complain that case labels can't be variables, too. That is deliberate, as the switch() construct compiles directly to a jump table, or a list of well-ordered "subtract #constant / branch if zero" commands. That's what it's for.
  • endianism and alignment: if you didn't know this and take it into consideration, it's your own damned fault. The problem is never the actual endianness or alignment rules, it's those idiots who read and write data to disk or network where it might be used by different computers or software. Never do this. EA wrote the Interchange File Format just for you. All data should be serialised as a well defined bytestream, not as a raw memory dump. That is what debuggers are for.


kur0shin.org -- it certainly is

Godwin's law [...] is impossible to violate except with an infinitely long thread that doesn't mention nazis.

C has a boolean type (1.57 / 7) (#42)
by Rupert Pupkin on Sat Feb 07, 2004 at 07:15:34 PM EST

ranting about things you don't know makes you look stupid, stupid.

Why a chainsaw is not my favorite tool (2.78 / 19) (#50)
by bugmaster on Sat Feb 07, 2004 at 08:06:12 PM EST

It is nearly impossible to screw in bolts with a chainsaw: you just end up with broken bolts and/or a broken chainsaw. It doesn't even have a Phillips head! How stupid is that?

That's because the chainsaw is a specific tool made to do a specific job. So is a screwdriver. So are C, Java, PHP, and whatnot. You wouldn't use a microscope to hammer nails (OK, maybe you would, I don't know), and you wouldn't use Java for embedded or real-time programming, and you wouldn't use C to create GUIs. What's the problem?
>|<*:=

C is not Portable? (2.66 / 9) (#58)
by lukme on Sat Feb 07, 2004 at 08:44:18 PM EST

ANSI C is the standard.

K&R C is the older standard.

POSIX C defines some things in addition to ANSI C.

GNU C, Microsoft C, Think C, Power C, Borland C, ... are all implementations of ANSI C.

I bet the following:

1) you have never read the ANSI C standard.
2) you don't realize that ANSI C compilers can compile about 95% of K&R C (there are a few minor differences you need to be aware of).
3) You have never worked on a portable C project.
4) You have never needed to have your code run fast.


-----------------------------------
It's awfully hard to fly with eagles when you're a turkey.
C Is Not Your Favourite Prog Language because (2.69 / 13) (#61)
by Idioteque on Sat Feb 07, 2004 at 09:40:13 PM EST

you're using it for the wrong applications and you don't understand how to use it correctly. C is very powerful and much closer to the hardware than you seem to want it to be; obviously you need another language. Please don't knock this great language just because you don't understand its uses.


I have seen too much; I haven't seen enough - Radiohead
+1FP, Finally someone brave enough to mention it! (1.66 / 6) (#66)
by Azmeen on Sat Feb 07, 2004 at 09:59:13 PM EST




HTNet | Blings.info
once again... (2.33 / 6) (#71)
by teichos on Sat Feb 07, 2004 at 11:12:07 PM EST

You belittle the technology and its standards when it doesn't save you from your own stupidity. Perhaps, you're just not doing it right?

flames and modbombs are the most pathetic forms of flattery
C is stupid. (1.80 / 5) (#72)
by SIGNOR SPAGHETTI on Sun Feb 08, 2004 at 12:13:33 AM EST

You can't write a useful program in C without first implementing a LISP interpreter.

--
Stop dreaming and finish your spaghetti.

C programming is for artists (2.83 / 12) (#75)
by mstefan on Sun Feb 08, 2004 at 01:31:15 AM EST

You can make good art, or bad art. But at least you have complete control over the canvas, the paints and the brushes. And if you want to paint outside the lines, you can do that too. Whatever the machine can do, C (and a little inline assembly where required) will let you do.

The difference between Pascal and C is the difference between using a paint-by-numbers kit with crayons and a canvas with oil paints. You end up with a picture in both cases, and it's certainly easier to make a mess with oils. But in the hands of a master, there's no doubt which is the superior medium.



Yeah ... um ... you don't get it. (2.90 / 11) (#77)
by Mr.Surly on Sun Feb 08, 2004 at 01:43:53 AM EST

C really is just one step up from machine-language.  Go do some machine-language by writing it on paper, then inputting it using a hex keypad, then maybe you'll have a little perspective of where C came from.  C is old, and it is weird.  Deal with it, or don't use it.

This article really smacks of "Oh, poop! C is really hard, so I'll write an article complaining about it instead."  As such, it's probably a long, subtle troll.

I'm glad you don't like C (2.00 / 7) (#79)
by Sapien on Sun Feb 08, 2004 at 02:23:19 AM EST

You really suck at it.

buffer overflows is the biggie here... (none / 3) (#94)
by reklaw on Sun Feb 08, 2004 at 05:09:38 AM EST

... and it's the reason why using C for anything (especially anything large) is a very bad idea. Think of how much time and effort could have been saved over the years if C handled buffer overflows gracefully instead of crashing hard and/or letting people throw code into memory when they happen...
-
C wastes my time (2.60 / 5) (#130)
by meaningless pseudonym on Sun Feb 08, 2004 at 09:34:51 AM EST

Bravo, sir!

If I'm writing in C, there's all manner of things I have to remember but which are effectively useless in day-to-day coding. The practical outcome is to make it easier for me to write a hard-to-find bug.

We have various posters here screaming 'But it's a low-level language!'. Well, that's not a virtue of itself and in any case, is it appropriate to use a low-level language for so many of the things we do with it? It may well be a fantastic low-level language but if I'm writing a Minesweeper clone then that's completely irrelevant so I'm left with a language that's merely easier to bug and harder to debug. I'd wager that the average coder spends their time closer to writing Minesweeper than the Linux kernel.

Don't get me started on that syntax. Huge numbers of little symbols are _not_ easier to read than keywords. There's a reason we don't use APL any more.

Find me the times when I really need that memory-level control or speed and I'll gladly find a C-loving masochist and get them to write the code. Well over 99% of the time, that's not relevant, and it's nothing more than vanity to suggest that it is. Most of what we write is emphatically not process-bound, performance-critical code. The rest of the time, give me any one of a number of better-designed, more programmer-friendly languages that have all the control I need with none of the gotchas, and watch me turn out higher quality code in less time.

C sucks. And blows. At the same time. (2.22 / 9) (#132)
by localroger on Sun Feb 08, 2004 at 09:50:38 AM EST

I learned to program by reverse engineering a BASIC interpreter with a tool similar to DEBUG and writing a new one for myself, so I'm not afraid of low-level programming. You have labels and variable names? What luxury! To this day I remember quite a few 8080 opcodes.

When I first encountered C around 1985 I was stunned at how ugly it was. Trying to be low-level and high-level at the same time, it manages to be neither. Every "enhancement" makes it more bloated and more complicated without making your life easier. Every once in a while I've decided to bite the bullet and learn this piece of crap language just so I'll be current, and after a few chapters I rinse my eyes out with lye to make sure I never repeat the error.

I've had a pretty long career and I've done quite a few interesting things with truly low level and truly high level languages -- low level in assembly, and high level in various BASIC dialects or proprietary control languages. Nothing in between.

When you need performance, there is no substitute for assembly. C isn't portable assembly. If you have been taught that, you were lied to. It takes about five minutes examining the object code shat out by a C compiler to understand that, if you actually know how to program in assembly yourself. And with Intel and AMD going on six generations of object-code compatible CPU's the lure of "portable assembly" is dimmer than ever.

I've managed to go almost 20 years as a professional programmer without ever writing a line of C code, and if I can go another 20 it will represent one of the great successes of my career. With any luck I'll even live to see this horrible language die the death it so richly deserves.

What will people of the future think of us? Will they say, as Roger Williams said of some of the Massachusetts Indians, that we were wolves with the min

tradition (2.50 / 8) (#148)
by svampa on Sun Feb 08, 2004 at 12:05:31 PM EST

C is a very old language, and the reason why it's everywhere is similar to the reason why COBOL is still in a lot of bank software: a legacy of old times.

C is a middle-level language, and it should be used for what it was made for: systems and driver programming. And even in those cases, people should think about using another kind of language; it has too many problems.

  • Pointer arithmetic is suicide, and should be used only in a few special cases.
  • The need for "break" in a "switch" statement (probably inherited from assembler) is dangerous. No compiler dares not to warn about a missing "break"; that shows how absurd this "feature" of the language is.
  • If you write a single "=" instead of "==", you are in trouble. It's a dangerous syntax.
  • No boolean type.
  • ...
  • Languages like Java try to solve a lot of the known problems of C. But what I regret about new languages is that, to ease learning for C programmers, they imitate C syntax: "}", uppercase/lowercase, etc. What a pity. And C++ has extended a problematic language, inheriting all the problems that C had.

    High-level languages and strongly typed languages are not academic games; they are the result of research. They make the compiler aware of your logic, so a lot of bugs are caught at compile time, not at runtime.

    I'm sure that part of the problem with today's unstable software is C. I think that the day software developers (and so companies) dump C will be a bright day for software.



    Once again, wrong applications (2.88 / 9) (#167)
    by Idioteque on Sun Feb 08, 2004 at 02:08:19 PM EST

    Throughout my programming career I've worked in basically two areas of programming: writing device drivers, and writing video and audio CODECs that run on both PCs and embedded systems. I can't begin to think of another language more suited for these two areas.

    Do I use C for my web photo album or my CD ripping and encoding scripts? No. Would I want to write a graphical email client in C? No. Would I want to write a video decoder in any language besides C? No. When you're trying to squeeze out as much processing as possible, the easy integration of assembly language comes in quite handy too. Don't give me this nonsense about compilers being so good these days that you don't need to hand-code any assembly. Yes, compilers are very good, but only up to a point. When you're trying to implement SIMD instructions for further optimizations, hand-coding in assembly is sometimes the only option.

    Furthermore, a good C programmer is one who understands how C translates into assembly, which tells you how close to assembly C really is.


    I have seen too much; I haven't seen enough - Radiohead
    C obviously sucks (1.40 / 5) (#177)
    by psychologist on Sun Feb 08, 2004 at 04:08:07 PM EST

    After reading this article, I have come to the conclusion that when I learn to program, I shall not use C.

    What is your opinion on C+ or D?


    Why This Article Is Not My Favorite Article (2.46 / 15) (#179)
    by kitten on Sun Feb 08, 2004 at 04:20:28 PM EST

    It is tedious, long-winded crap, largely incorrect owing to gross generalizations and misapplications, desperately trying to be funny and failing, and is written by a crapflooding nitwit.
    mirrorshades radio - darkwave, synthpop, industrial, futurepop.
    Why can't we program in English? (1.80 / 5) (#181)
    by United Fools on Sun Feb 08, 2004 at 04:52:38 PM EST

    Would that solve all the problems?
    We are united, we are fools, and we are America!
    "here's why C is now owned by Pascal. " (2.25 / 4) (#182)
    by horny smurf on Sun Feb 08, 2004 at 04:55:36 PM EST

    You somehow forgot to mention why Pascal is superior. Perhaps because almost all of the "problems" (with the exception of the preprocessor) you listed also apply (in spades) to Pascal?



    Why crack is your favorite drug (2.80 / 5) (#183)
    by strlen on Sun Feb 08, 2004 at 05:01:12 PM EST

    First, let's start with the first one, and the one closest (heh) to my heart. As others have stated, C is not a high-level language. Why shouldn't C have a built-in string class? Ironically, you provided a reason yourself with the idea of multibyte strings: duplicating the functionality of strcpy() etc. for 16-bit strings would be rather trivial. In addition, C's handling of strings as NULL-terminated arrays of characters allows us to use efficient CPU-level manipulation techniques on some platforms, or high degrees of optimization. So, if you want a string-handling language, I highly suggest trying Perl, or sed/awk/grep.

    As for the complaint about what's implemented as library functions rather than as part of the language... again: C is not a high-level language. C is translated to assembly. It is entirely possible to write a boot loader, or an OS kernel, in C, at a level where you have no libc.

    As for buffer overflows, the issue is, first, that this isn't an issue with a decent OS and a decent architecture; appending "set noexec_user_stack = 1" to /etc/system should be sufficient. Secondly, all sorts of patches already exist to remedy the situation: ProPolice, safe strlcpy() functions, and fucking strncpy() functions.

    As for pre-processor power, I suggest looking into plan9's C-compiler suite, which takes significant steps to curb that.

    And let me restate again: C is not a high-level language. What you're looking for is called "C++". Have you ever looked at assembly code generated from C++ and assembly code generated from C? That's where the difference lies. You don't write CGI applications in C, and you don't write an OS in Perl. C++ provides greater flexibility, but those who code in an object-oriented language without any knowledge of C are generally the ones getting paid $15 an hour to write horrible Java code which will crash under almost any conditions (see Friendster's JSP backend for a prime illustration of that).

    --
    [T]he strongest man in the world is he who stands most alone. - Henrik Ibsen.

    Why waste words on the evident? (2.00 / 4) (#185)
    by jope on Sun Feb 08, 2004 at 05:29:50 PM EST

    C is an anachronism, a better macro assembler. It is a terrible, ugly language that includes none of the developments of computer language design that occurred in the last 30 years. C++ and C# are nearly as bad. The only reason why people are using these languages is because they have to, or they do not know decent modern languages, or both.

    Here's a thought ... (none / 2) (#198)
    by blackpaw on Sun Feb 08, 2004 at 07:06:50 PM EST

    If you don't like it, *DON'T USE IT !*

    Sheeze

    Why C is not my favorite language (1.25 / 4) (#208)
    by Big Sexxy Joe on Sun Feb 08, 2004 at 11:34:46 PM EST

    Because Java is. Not that C suffers any terrible faults in general, but Java is the most fantastically designed language I have learned. Anything you can do in C you can do better in Java.

    I'm like Jesus, only better.
    Democracy Now! - your daily, uncensored, corporate-free grassroots news hour
    Don't mess (2.83 / 6) (#226)
    by bloat on Mon Feb 09, 2004 at 05:10:09 AM EST

    If you find yourself writing a comment that says:

    /* Don't touch this code unless you know what you're doing */

    Then do everyone on your team a favour - delete the comment and rewrite the code.

    Cheers, AndrewC.
    --
    There are no PanAsian supermarkets down in Hell, so you can't buy Golden Boy peanuts there.
    Use Perl; (none / 2) (#228)
    by zentara on Mon Feb 09, 2004 at 05:34:53 AM EST

    Many, many ex-C programmers have found a "happy compromise" with Perl. It takes care of most of the problems you discuss. Perl is like a "user-friendly" front-end to C. It makes it so easy to whip out a program, and if you feel the "need for speed", you can include Inline C or even Inline Assembly.

    Before I posted this, I wanted to see if anyone else mentioned Perl, so I searched the page for it. The matches I got were "pro perl y". Sort of a "zen moment". :-)

    Unix Haters' Handbook reference (none / 2) (#231)
    by Will Sargent on Mon Feb 09, 2004 at 05:43:21 AM EST

    http://librenix.com/?inode=3046
    ----
    I'm pickle. I'm stealing your pregnant.
    Efficiency is (NOT) king (2.83 / 6) (#234)
    by gidds on Mon Feb 09, 2004 at 06:09:16 AM EST

    Gosh! I'm surprised this story generated so much interest - and such polarised responses...

    As a professional software developer who's used C for yonks, I agree with most of what you say. C is a pain to use in many ways; structuring large systems is a pain, making them robust is a pain, memory management is a pain, the standard library is a pain... Yes, you can learn to work around most of these, but such a low-level approach is appropriate far less often than people seem to think.

    Where I disagree is in the reasons to which you attribute these problems. It's not just ancient fashion, ignorance, or perverse pleasure. C is as it is because of a philosophy that there should be no hidden code: all the code the compiler emits should map directly to a statement in your program. Most of the extra features you'd like to see - memory management, exceptions, more types, string handling - would involve extra work 'behind the scenes' to manage. C is designed for 'no overheads'.

    Why? Efficiency. Efficiency is the driving force behind C. (Runtime efficiency, that is. Programmer efficiency doesn't come into it...) And this is both its greatest blessing and its greatest handicap. It means that compilers are relatively easy to write, that the language is relatively small and concise, and C code is small and fast. But it also means that programmers have to spend time and mental effort working around the lack of higher-level facilities, or doing without and risking ugliness and flakiness.

    Worse than this, though: it encourages bad habits. Error handling is difficult and ugly, so people don't bother. Bounds checking is hard work, so people just assume that there's enough memory. C teaches that efficiency is the most important thing, so people use all sorts of ugly hacks and fragile constructs in pursuit of ultimate efficiency, when something slightly slower but perfectly usable would allow something far more robust, extensible, and reusable.

    Andy/

    my pet peeve with C (2.25 / 4) (#236)
    by the sixth replicant on Mon Feb 09, 2004 at 06:51:54 AM EST

    is that it forced a lot of languages to look like it so they could be taken "seriously". hence {}, if () etc etc. nothing too bad here, but the biggest fuckup is that for some godforsaken reason a dumb-arse computer programmer thought he was smarter than 3000 years of mathematical history and decided to start counting things from 0. yes, zero. was it the first language to do this? i don't think so (i guess there are more up-their-arse programmers out there that just love showing how "original" they are), but it was one of the most popular languages that started this trend. in fact, you can say this about computer languages:
    if you need to start counting from 0 - then it's a real computer language; if not - it's mickey mouse.

    i used C to do most of my combinatorial computing, since it was the fastest language and i needed very large arrays to handle the finite field arithmetic (for which, by the way, having arrays start from 0 was very convenient!!!) - it got me results in a few days that would otherwise have taken a few weeks of computation time (but my programs were "simple" recursion, no maths library stuff). if we now had to use C in our work environment (web company) i think most people would have gone crazy just reading each other's code, let alone trying to debug it.

    so i've shown that C has its place - speed issues - but can we shoot the guy (and it has to be a guy) who thought counting things starting from zero somehow made sense (yeah i'm sure it's a compiler thing etc and maybe assembler-related - but jesus - 21st century, people)

    ciao

    Just a couple of points on this crackrock story (2.75 / 4) (#237)
    by axel on Mon Feb 09, 2004 at 07:07:54 AM EST

- The browser most people are using to read K5 is written in C.
- The desktops on which most people work are written in C (gnome, any gtk-based stuff) or C++ (kde, ms-windows, etc).
- All your OSes are belong to us too.
- If you can't live without a string type, there are _lots_ of libraries that implement wrapper string types that you can use (see for example glib). But anyway, live with it: or you can always go back to BASIC.
- If you can't use pointers properly, it's not the language's fault: rule #1 for C compilers is 'the programmer knows what he's doing'. There's no point in a smartass compiler that tries to guess stuff or fix people's code. The first thing you have to learn to code C is pointer discipline.
- Of course C is nasty and not suited to coding text-filtering tools and scripts: that's what perl, awk, sed and so on are for. Oh wait, they're written in C as well. We should rewrite them in Java! That way they'll be more efficient!!!
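A rough sketch of what the glib route looks like (GString; check the glib documentation for the real API details):

#include <glib.h>
#include <stdio.h>

int main(void)
{
    GString *s = g_string_new("Hello");
    g_string_append(s, ", world");   /* the buffer grows as needed */
    printf("%s (%lu chars)\n", s->str, (unsigned long)s->len);
    g_string_free(s, TRUE);          /* TRUE frees the character data too */
    return 0;
}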

    Casting return value of malloc() (1.20 / 5) (#243)
    by fishpi on Mon Feb 09, 2004 at 08:50:02 AM EST

You claim that you "mustn't" cast the return value of malloc(), which is false: casting it is perfectly acceptable within the standard, and some people (notably P. J. Plauger) advocate doing so.

What is true is that you don't have to cast it. The cast has no effect other than to hide possibly useful warnings, so in the vast majority of cases the right thing to do is to leave it out.
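A small sketch of the case both sides are arguing over:

#include <stdlib.h>

int main(void)
{
    /* no cast needed: malloc() returns void *, which converts implicitly;
       a cast would also silence the warning you'd get if <stdlib.h> were
       forgotten and malloc() were implicitly declared as returning int */
    int *p = malloc(100 * sizeof *p);
    if (p == NULL)
        return 1;       /* always check the result */
    free(p);
    return 0;
}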

    A language is a tool (none / 3) (#244)
    by gbd on Mon Feb 09, 2004 at 08:59:26 AM EST

    And like all tools, the more powerful it is, the more dangerous it can be if not used properly. C is sort of like an industrial strength nail gun; if wielded improperly, it can cause untold carnage. Used correctly, however, it can accomplish virtually anything with unparalleled speed and efficiency. Languages like Java, on the other hand, are more like the padded "Whack-A-Mole" mallets you might find at Chuck E. Cheese. If you screw up, the worst thing that will happen is that you'll give some kid a bloody nose and get a lecture from a man in a big mouse suit. But nobody (seriously) suggests writing an operating system or a CPU-intensive image processing algorithm in Java, because reasonable people realize that it's not the right tool for the job.

    --
    Gunter glieben glauchen globen.
    Quote in context (none / 0) (#248)
    by Protagonist on Mon Feb 09, 2004 at 09:58:30 AM EST

    Just for the sake of accuracy, here's the full text of the footnote on whitespace:

    Thus, preprocessing directives are commonly called "lines". These "lines" have no other syntactic significance, as all white space is equivalent except in certain situations during preprocessing (see the # character string literal creation operator in cpp.stringize, for example).

    ----
    Hahah! Your ferris-wheel attack is as pathetic and ineffective as your system of government!

    C might be ok (none / 3) (#250)
    by Cro Magnon on Mon Feb 09, 2004 at 10:24:11 AM EST

    if you're writing an OS kernel or device driver. But why the fcsk would anyone write a high-level program in it? Pascal, Ada, Python, and Perl are better suited for most programming.
    Information wants to be beer.
    The author doesn't understand his own argument (2.92 / 14) (#263)
    by irwoodhouse on Mon Feb 09, 2004 at 11:36:56 AM EST

    I learned to program in 6502 assembly and BBC BASIC.

    Then I was taught to program in standard (similar to ISO-) pascal.

    Then I taught myself C in order to maintain a fairly arcane piece of software. I've about 10 years experience with C.

BWK's article (which I've actually read - I wonder how many other posters have?) is a report following his experience of trying to rewrite a suite of Software Tools from C into Pascal. It details the problems, and makes observations about the constraints (constraint != limitation: a constraint is something disallowed, a limitation an inability) Pascal imposes in order to enforce Good Programming Style (in the opinion of Wirth).

    Joyce's article on K5 appears by comparison to be just a plain rant.

As many posters have noted, Pascal and C are entirely different tools. Most importantly, C was written to enable Ken Thompson and Dennis Ritchie to get Unix working (practical problem, real-world solution), whereas Pascal was written to teach Structured Programming (at the time thought to be The Solution to shaky software, since superseded by Object Orientation, which is itself superseded by Formal Methods).

    As people wanted to use the language for things it wasn't designed for, features were added.

To pick an example: when I learned Pascal, it didn't have strings (they were introduced by Turbo Pascal). We had packed arrays of char, which were very inflexible.

It occurs to me that Joyce doesn't really grasp the reasons behind the differences between programming languages, particularly where C is involved, because of its (probably) unique position.

    To analyse every one of his comments requires an article in itself (I'm tempted), so I'll select a couple:

    "char" is basically only a really small number

    Not until the 1999 ISO C standard were we finally able to use 'bool' as a data type

    CPUs generally don't understand the concept of characters, booleans, strings, structs. They understand bytes and words, and must be taught everything else. Even Pascal has the "Type" keyword for custom data types.

    Inconsistent error handling

A follow-on from the above (constraints of machine types). If an in-band error value is to be returned, it must be outside the valid range. For file descriptor functions, the error value is therefore negative. For functions returning pointers, -1 might well be interpreted as an unsigned value and be a valid machine address; NULL is typically mapped to zero, and zero is typically guaranteed to be outside the valid address space of a process.

    errno is used where there are multiple possible causes of errors to avoid further polluting the return space with in-band error values.
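A minimal sketch of the convention, using POSIX open() (illustrative only):

#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    int fd = open("/no/such/file", O_RDONLY);
    if (fd < 0) {
        /* the in-band error value is negative;
           the cause is out-of-band, in errno */
        fprintf(stderr, "open failed: %s\n", strerror(errno));
        return 1;
    }
    return 0;
}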

Joyce also lays some criticism without exploring the implications. He cites exception handling. C as a language has practically no run-time environment beyond that provided by the operating system. Pascal has a fairly sophisticated one (amongst other things, it checks bounds on arrays even where the index is a variable). This run-time is implemented by the compiler, which provides the exception handling.

    The implications are two-fold. Firstly, said exception handling is an overhead not visible to the programmer and not under his control. Whilst useful for novices, it can be infuriating when it interferes with a programmer who knows exactly what he is doing. Compilers are undoubtedly clever, but (in the case of C) they should not attempt to second-guess the programmer.

    Secondly, the entire run-time environment including error handling has to be written somewhere, so at some level of programming it doesn't exist.

C's design is for systems tasks (and Joyce seems to have forgotten that this includes compilers), where there are no cosy environments to assist you. Writing systems-level tasks in assembly is not easy and is definitely NOT portable, which is why only the minimum is done in assembly.

To my knowledge, other than assembly and C, only one other language has been used to implement an operating system and compiler, and that is Oberon (also by Wirth, and a complete disaster: read the book).

    The above points should indicate exactly why.

    Header files are evil (none / 2) (#272)
    by X-Nc on Mon Feb 09, 2004 at 12:13:23 PM EST

    It's been a while since I was a professional programmer but I still remember coding in COBOL, C/C++, perl, php, shell/sed/awk/etc. and some CICS. At this point in time I have decided that languages which require header files are evil. If you have to include something just to do fundamental or basic tasks then you're screwed. Why do I feel this way? Personal annoyance, I guess. You have to have every bit of every header file memorized in order to really do anything. This is the trouble with C/C++ and, to a lesser extent, perl. php has "include" in it but most of the things you need to do are all base parts of the language. php includes are more like subroutines.

    If I were still a "real" programmer I think I'd break things out this way...

    1. For web apps: php
2. For system/admin/one-offs: shell & co.
    3. For back-end monster sized apps: COBOL
    4. For large apps: ruby
That's not to say I would be opposed to using other languages as needed. They are, after all, tools, and each one has a place in the grand scheme of coding.

    The only thing that really bothers me is when someone says that such-and-such language is the "silver bullet" suitable for every possible coding situation. Knew a guy once who claimed that BASIC was the only language anyone ever needed to learn 'cause it could do everything anyone could ever want. I asked him about building an OS with it and he said that was a task that was not needed. "No one would ever want to build an OS." I guess that Linux, the BSD's, MenuetOS and the like are just figments of our collective imagination.

    --
    Aaahhhh!!!! My K5 subscription expired. Now I can't spell anymore.

    Two Factual Errors (none / 3) (#278)
    by gte910h on Mon Feb 09, 2004 at 12:32:10 PM EST

    On Library Size:

You don't have to include a whole header if you only want a couple of functions. Just prototype them. This is a common practice when working with multiple architectures, some of which may not implement some functions.
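A sketch of the practice (the vendor routine here is hypothetical, and the prototype must exactly match the library's real definition - that's the risk):

/* declare only the function you call, instead of pulling in the vendor header */
extern int board_led_set(int led, int on);   /* hypothetical library routine */

void blink_once(void)
{
    board_led_set(0, 1);
    board_led_set(0, 0);
}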

And the entire library is only built into the executable if you use your linker improperly. Almost all modern linkers can do function-level linking, pulling in only the functions that may be called during execution of the application. (Yes, you have to link statically to get this, but if you were looking to decrease the execution footprint, you're probably already doing that.)

    On Multiple Functions that do the same thing:

    This is called "history" and it had to do with the multiple places C was used, and now we'd like code from all three development families to use the same compiler. If you look at most late 80's-early 90's systems, you will only find one of those two or three functions you describe.

             --Michael

     

    C is good for certain tasks. (none / 3) (#280)
    by gte910h on Mon Feb 09, 2004 at 12:51:01 PM EST

C is good for performance-based tasks. I have written things in Lisp or Java, only to have to redo chunks of them in C when a feature was too slow on some platforms. C+Python or C+Lisp are GREAT programming environments these days (although you have to have a lot of confidence from your boss to be allowed the second).

I agree that exceptions are useful when writing libraries and the like, but unfortunately no maintainable* language comes close to C/C++ in speed, and C++ libraries can't even be linked into C++ applications built with a different compiler.

    *as in the "not perl" family of languages.

    On Types and Breaks (2.75 / 4) (#286)
    by hardburn on Mon Feb 09, 2004 at 01:27:51 PM EST

    Hello? Flexible casting? Hello?

    With a good type system, you don't need casting. Needless to say, C's type system sucks. Take a look at Strong Typing and Perl, where Dominus gives an overview of why C's type system sucks, and why ML languages have a really great type system and thus need no casts at all. This presentation convinced me that I really need to learn OCaml.

    You can use it to break out from nested loops of arbitrary depth by using it with an integer, such as "break 3"; this would break out of three levels of loops.

    I wouldn't hold up PHP on a pedestal here. Doing break by the depth of the loop structure makes the code much more fragile in the event that you need to restructure your loop. A much better way is with the label syntax allowed in Java or Perl, so you can break by name:

OUTER: foreach my $i (0 .. 3) {
    INNER: foreach my $j (0 .. 3) {
        last OUTER if $j == $i;   # Perl spells "break" as "last"; == compares, = assigns
    }
}

IMHO, you should only code in C when your only other alternative is assembly. Thus, C should be considered purely a low-level language, and any attempt to make it otherwise should be ditched. If your problem can be solved at a higher level, don't waste your time with C.


    ----
    while($story = K5::Story->new()) { $story->vote(-1) if($story->section() == $POLITICS); }


    Just a couple points. (none / 3) (#287)
    by jmv on Mon Feb 09, 2004 at 01:30:55 PM EST

You seem to bash C in favor of Pascal in a couple of places where Pascal is no better. Note that my Pascal experience dates back to Turbo Pascal, and while there are probably many more extensions now, I doubt the "standardized" language has changed much.

Strings: what difference does it make whether you have two functions or just one (nobody forces you to use both)? Also, Pascal strings have a fixed length too, which is why I don't like them (the C++ string class is much better).

Buffer overflows: the fixed-length Pascal string is about as dangerous with respect to buffer overflows.

Integer overflow without warning: if Pascal checks for it, it's probably a compiler option (which could be done in C too), and it would be *really* slow.

Portability: come on, you can write non-portable code in just about any language. With C, if you stick to C89, your chances are quite good. With Pascal, you depend on the extensions supported by your compiler. Last time I used Pascal, casting pointers was an extension, which means you couldn't even do dynamic array allocation without using extensions (i.e. non-standard code). One last thing: more or less all platforms have a C compiler; that's not quite true for Pascal. On many platforms you can only compile Pascal by using p2c :)

    Trapped in the 1970s: I thought the design of Pascal dated just as much...

Library size: ever tried diet libc? You can use it to create static executables that are less than 1k in size. The size of libc is not caused by the language, but by the choices (speed vs. size) that are normally made. There's no reason for the Pascal library to be smaller or larger than the C library because of the language.

    OK, I could go on forever, but I better get some work done today.

    Some good points-- but, (none / 3) (#291)
    by valar on Mon Feb 09, 2004 at 01:41:41 PM EST

The reason for no string class: a) your processor doesn't natively support a string type (OK, so maybe pseudo-strings of up to 255 characters, but that's the best I've heard of); b) there are no classes in C. In C, aggregate types are only loosely coupled with the methods that act on them. In my opinion this is one weakness of C that can no longer be justified as a performance issue.

As far as strings being stored as arrays of characters goes, that is how every programming language does it - though some hide it better than others.

Buffer overflows? Yes. This falls under the "shit happens" school of programming. Buffer overflows are 'easy' in C and hard in other languages, but still very possible (I once demonstrated a buffer overflow in Ada to prove to an Ada developer that it could be done).

Low level or high level? As a computer engineer, I'm inclined to say everything other than machine code or assembly is high level. That said, C is obviously lower-level than C# or Java (my point is helped by the fact that several implementations of the .NET runtime and Java virtual machines are written in C). As far as the library being huge - well, it isn't. It is significantly smaller than the standard library of most modern languages (compare with .NET or Java, or even Perl, Python, or Ruby).

I had never seen that array subscripting commutes. In fact, I had to compile a test program before I even believed you. You taught me a new trick to confuse and astound at parties. :)

Integer overflows: a) There are several good algorithms that depend on numerical overflow to know when to stop. b) The reason you get weird numbers, rather than zero or some kind of error condition, is that signed numbers in C (and on most computers) are stored in two's complement. c) A runtime overflow-handling system would have to add three or four instructions to every arithmetic operation performed. That is a tremendous overhead. d) Because integers can overflow in C, you can treat an array of integers as a larger integer type, if you define your addition operation correctly (unfortunately, this requires an add() function, because you can't overload operators in 'pure C').
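Point (d), sketched (the little-endian word order and the add() helper are illustrative choices, not from the comment):

#include <stdio.h>

#define WORDS 4

/* add b into a, both WORDS-long little-endian unsigned integers;
   unsigned arithmetic wraps on overflow, and a wrapped sum is smaller
   than its operands, which is how the carry is detected */
void add(unsigned a[WORDS], const unsigned b[WORDS])
{
    unsigned carry = 0;
    for (int i = 0; i < WORDS; i++) {
        unsigned s  = a[i] + b[i];
        unsigned c1 = s < a[i];     /* carry out of the first addition */
        unsigned s2 = s + carry;
        unsigned c2 = s2 < s;       /* carry out of adding the old carry */
        a[i]  = s2;
        carry = c1 | c2;
    }
}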

You can, and in some schools of programming are highly encouraged to, cast the return value of malloc. You can't fflush(stdin), because flushing is for output. And I've never heard that gets() is evil.

Most compilers get confused if you use bad syntax. Some are better than others, but this is a compiler design issue, not a language one.

    If you pass a bad address to puts(), what the hell is it supposed to do? Guess what string you want to output? In more modern languages, the only difference is that the error happens at a different time.

    Hash tables: what hash function should they use to handle all cases well?

I'll note that in C# (and IIRC Java) multidimensional arrays work the same way. If you need a jagged array, you compose it using pointers (or references, whatever). If not, you just declare a straightforward multidimensional array.

Float is not the same as double. I repeat, FLOAT IS NOT THE SAME AS DOUBLE. (The name atof() is short for 'ASCII to floating point'; note that it actually returns a double.)

Break X: yeah, that would be nice, wouldn't it? In C you have two options: use goto and a label (yes, that's right: goto), or set a variable and check it in each of the outer loops. The goto way is probably more readable than either break x or the other C solution, in most situations.
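The goto option, sketched:

#include <stdio.h>

int main(void)
{
    int i, j;
    for (i = 0; i < 4; i++) {
        for (j = 0; j < 4; j++) {
            if (i * j == 6)
                goto done;      /* one jump to a named point: a "break 2" */
        }
    }
done:
    printf("left both loops at i=%d, j=%d\n", i, j);
    return 0;
}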

    Error handling: diagnostic messages are intended to catch programming errors. Numerical return errors are intended for errors that 'aren't your fault.'

    ..And pascal is standard and portable..? (none / 3) (#293)
    by beavan on Mon Feb 09, 2004 at 01:59:42 PM EST

Not as such. C is probably the most common, portable and standard language out there. Can anyone name a single platform with more than 2 users that DOES NOT have a decent C compiler? C is just what I need when I'm developing for a low-memory, next-to-zero-CPU cellular phone; even the object files are small and lean. Try comparing the size of a Pascal or C++ generated object file and you'll see what I'm talking about... Why? Because C doesn't make the compiler do anything you didn't INTEND it to do. Can't cope? Maybe it's time for a career switch. Having said that, on my big fat Sparc I use C++ (except for important low-level stuff), a very well structured and lovely language. C++ still requires a brain. In fact, in my opinion it's much harder and more complex to use, even though it supports strings (the STL is actually part of C++) in the way you think strings should be supported... BTW, if one of my programmers used strcat rather than strncat (for instance), I'd have him write Pascal for a whole week!
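The strcat/strncat point, as a sketch (buffer size made up; note that strncat's count is the number of characters appended, not the size of the destination buffer):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char buf[16] = "Hello, ";
    /* strncat appends at most n characters of src, then a '\0',
       so n must leave room for the terminator in buf */
    strncat(buf, "world, how are you?", sizeof buf - strlen(buf) - 1);
    puts(buf);   /* prints "Hello, world, h" - truncated, not overflowed */
    return 0;
}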

    I love burekas in the morning
    A more thorough article by the same title... (2.50 / 4) (#303)
    by jason on Mon Feb 09, 2004 at 03:41:52 PM EST

    Prof. Fateman of Berkeley has an article with more research behind it, but with more-or-less the same title:

    Software Fault Prevention by Language Choice: Why C is Not my Favorite Language

    Fateman's article is worth some thought, especially if you're designing a C library interface.



    Yay!!! (none / 3) (#331)
    by Deus Horribilus on Mon Feb 09, 2004 at 05:24:35 PM EST

    This is what I have been trying to explain to my workmates for the past month. I am currently programming scientific software in C++ (okay, it is a little better than C), but before this the standard language was FORTRAN.

    Why the change, you ask? To quote a fellow researcher:

    "Everybody else is doing it, so we have to."

    It's the whole jumping off a cliff argument all over again. If your current solution works, why bother with the trouble of changing your entire programming language?

I have to add the point that C and all its offshoots are NOT scientific programming languages. Surely the lack of an exponentiation operator and of complex numbers demonstrates this; it's another argument against C's use. Moreover, C's propensity for errors (both seen and unseen) makes it unsuitable for programs that depend on accurate algorithms. And I haven't even mentioned its unreadability in code form (oh, wait, I just did).
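For instance, where Fortran has an operator, C has a library call - a trivial sketch:

#include <math.h>
#include <stdio.h>

int main(void)
{
    double x = 3.0;
    /* Fortran writes x ** 2.5; C makes a call into libm */
    printf("%f\n", pow(x, 2.5));
    return 0;
}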

    Great article, now if only my colleagues would read it...

    _________________________________________
    "Beliefs are never concrete, they change direction like autumn leaves in a windstorm..."

    YHBT (2.25 / 4) (#336)
    by ph317 on Mon Feb 09, 2004 at 06:13:55 PM EST

    Unfortunately, time has not been kind to Kernighan's tract. Pascal has matured and grown in leaps and bounds, becoming a premier commercial language. Meanwhile, C has continued to stagnate over the last 35 years with few fundamental improvements made. It's time to redress the balance; here's why C is now owned by Pascal.

It's a brilliant piece of writing, and I support its section vote. But c'mon guys, YHBT, so stop arguing the finer points of it.

What's your problem? (none / 3) (#354)
    by the on Mon Feb 09, 2004 at 07:55:16 PM EST

    "How about you allow 5[var] to mean the same as var[5]?"

    And what exactly is the cause of your difficulty with the commutativity of .[.]? Too confusing for you? Goes against your religious beliefs?
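For anyone who hasn't compiled the test program: a[i] is defined as *(a + i), and addition commutes, so i[a] means exactly the same thing. A minimal sketch:

#include <stdio.h>

int main(void)
{
    int var[10] = {0};
    5[var] = 42;               /* *(5 + var): identical to var[5] = 42 */
    printf("%d\n", var[5]);    /* prints 42 */
    return 0;
}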

    --
    The Definite Article

    C is owned by Pascal? (1.75 / 4) (#364)
    by JayGarner on Mon Feb 09, 2004 at 10:07:03 PM EST

    WTF is this doing here? I feel like I went to the Journey fan board by mistake, and there's an article on how much Foreigner sucks.

    WTF?!

    C Isn't Supposed to Be High Level (2.75 / 4) (#369)
    by NeantHumain on Mon Feb 09, 2004 at 10:40:03 PM EST

    If you want a string type, operator overloading, exception handling, etc., use C++, Java, C#, or any of the other new-fangled programming languages out there. C was never meant to do these high-level things because it was originally meant to write an operating system in.

    C's simple treatment of all things as either variables or pointers is its beauty. Its beauty is also its ugliness. It's in the eye of the beholder.

I use C++ or Java most of the time because I can use the abstraction that C doesn't provide. If you want to deal with things at the low level, use C or maybe assembly.


    I hate my sig.


    Great Story (1.60 / 5) (#380)
    by Gysh on Tue Feb 10, 2004 at 03:09:36 AM EST

    Wow.

    I'd like to apologize for all the in your other article, because while I still don't agree with it, this one rocks. I don't mind using C or C++, and there were a few areas where I disagreed with you, but most of your points were (in my opinion) more than valid, and I really enjoyed reading the article.

    C(++) definitely isn't a fun language to learn compared to others, but I tend to use whatever works. It bugs me, however, when people take up the attitude that you're not a skilled programmer if you don't do everything in C(++) regardless of whether or not it's a good idea. "l337 skillz forevaR!" and such.

Of course, now I feel like a groveling idiot... "Forgive me, oh wise one!"... but that's what I get for going off on a rant despite my better judgment. Heh.

    Egg Troll does technology better (1.75 / 4) (#388)
    by bigchris on Tue Feb 10, 2004 at 04:40:06 AM EST

    Linky

    ---
    I Hate Jesus: -1: Bible thumper
    kpaul: YAAT. YHL. HAND. btw, YAHWEH wins ;) [mt]
    Bogus claims (2.77 / 9) (#393)
    by ttsalo on Tue Feb 10, 2004 at 06:27:08 AM EST

    The article is a troll, but anyway...

    Buffer overflows abound in virtually any substantial piece of C code. This is caused by programmers accidentally putting too much data in one space or leaving a pointer pointing somewhere because a returning function ballsed up somewhere along the line. C includes no way of telling when the end of an array or allocated block of memory is overrun. The only way of telling is to run, test, and wait for a segfault.

Bullshit. I write security-critical software in C for a living, and I don't think we've ever had a buffer overflow vulnerability in our code. Why? We don't, ever, store anything in an array without checking that it fits. sprintf and its ilk are absolutely banned everywhere in our code. (snprintf is good.) You really think that the only solution is running the code and seeing whether it segfaults? Are you a retard?

    If you want to break out from a series of nested for or while loops then you have to use a goto. This is what is known as a crude hack.

Nonsense. Why the hell would a goto fail; be more of a crude hack than break 3;? Breaking out of nested control structures with a goto to a clearly specified location is much, much cleaner than some break n;.

    Multidimensional arrays. Before you tell me that you can do stuff like int multiarray[50][50][50] I think that I should point out that that's an array of arrays of arrays. Different thing.

    Same thing. Tell me, what does your multidimensional array do that you can't do with multiarray[50][50][50]?
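A sketch of the point: the declared array is one contiguous block, so the "array of arrays" view and the "multidimensional" view are the same memory:

#include <stdio.h>

int main(void)
{
    int a[2][3] = {{1, 2, 3}, {4, 5, 6}};
    int *flat = &a[0][0];
    /* row i, column j lives at flat offset i*3 + j */
    printf("%d %d\n", a[1][2], flat[1 * 3 + 2]);   /* prints "6 6" */
    return 0;
}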



    Offtopic, but... (2.75 / 4) (#408)
    by bugmaster on Tue Feb 10, 2004 at 08:28:50 AM EST

How come no modern compiled language (that I know of) supports binary constants? You would think that C would support them, being a low-level language and all (and hence, incidentally, out of scope of the author's article), but it doesn't. 0xA5? Sure. 165? Sure. 0245? Sure. 0b10100101 (or something similar)? No.

Why not? People have to use binary constants all the time, because that's how you make flags and masks. In fact, I can think of only a single case I've encountered where a hex constant would be better than a binary one.
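The usual workaround, sketched (flag names made up): build the masks from shifts so the bit positions stay readable:

#include <stdio.h>

/* flags built from shifts instead of binary literals */
#define FLAG_READ   (1u << 0)   /* would be 0b00000001 */
#define FLAG_WRITE  (1u << 1)   /* would be 0b00000010 */
#define FLAG_EXEC   (1u << 2)   /* would be 0b00000100 */

int main(void)
{
    unsigned mode = FLAG_READ | FLAG_EXEC;
    printf("0x%02X\n", mode);   /* prints 0x05 */
    return 0;
}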
    >|<*:=

    This is like complaining that... (2.77 / 9) (#418)
    by skyknight on Tue Feb 10, 2004 at 09:06:07 AM EST

    a soldering iron is a lousy tool for creating web applications. You personally, at your dotcom company, aren't going to sit down with such a tool, but somewhere along the line a soldering iron was in fact involved with putting the pieces in place to render the webapp on someone's monitor. It certainly isn't the right tool for building your piece of the pipeline, but it was the right tool for building some of the hardware components of the systems that people are employing to both develop and use the webapps that you write.

Quite simply, high level languages and tools do not materialize out of thin air. Perl, a beautiful high level language that supports all of the wizardry that you want, is written in C. It's not crafted from hand written assembly. All of mankind's technology is hierarchical, with more specialized and powerful technology being crafted from simpler, more "stupid" tools. Do you think that industrial circuit board etching tools with precisions finer than the resolution of the human eye are built by a guy with a hammer and chisel?

    You lament that C doesn't hold your hand all along the way, but it's a trade off. When choosing a language, you have to ask yourself many questions. What is the user base? How long will this software be around? Is man time or machine time more important? If you're writing a little bit of glue code, you're going to be the only one using it, and it's going to be thrown away at the end of the day, you'd be insane to write it in C instead of Perl. If you're writing a device driver, it's going to be distributed to millions of people and hang around forever, and squeezing every bit of performance out of it is the ultimate imperative, you'd be a fool not to use C.

    There Ain't No Such Thing As A Free Lunch. In a language that checks for out of bounds array accesses, you'll certainly stamp out bugs faster, but it means that every time you access an array there is the overhead of the check. Shifting the burden to the machine makes perfect sense in a lot of situations, but not all of them.
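A sketch of the overhead in question (names made up): the test a bounds-checking language inserts on every access looks roughly like this, done for you and paid for every time:

#include <stdio.h>
#include <stdlib.h>

/* what an always-checked array read costs: one compare-and-branch per access */
int checked_get(const int *a, size_t len, size_t i)
{
    if (i >= len) {
        fprintf(stderr, "index out of range\n");
        abort();
    }
    return a[i];
}

int main(void)
{
    int a[4] = {1, 2, 3, 4};
    printf("%d\n", checked_get(a, 4, 2));   /* prints 3 */
    return 0;
}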

    Real software engineers are not language bigots. Real software engineers carry a diverse collection of tools on their virtual belts, knowing both how and, more importantly, when to use each one of them.



    It's not much fun at the top. I envy the common people, their hearty meals and Bruce Springsteen and voting. --SIGNOR SPAGHETTI
    A choice comment..... (2.00 / 4) (#424)
    by wabbitwatcher on Tue Feb 10, 2004 at 10:35:40 AM EST

    When one single condition you'd just happened to have forgotten about whilst coding screws up, it's your fault.

Who else is there to blame but the programmer?

    Such languages do exist. (none / 3) (#435)
    by wrg on Tue Feb 10, 2004 at 12:35:20 PM EST

    You can define binary constants in Ada, and you can even embed underscores to aid in visually grouping them.  So, for instance, you could write the same number in hexadecimal, octal, decimal, and binary thus:

    16#AF#
    8#257#
    175  -- or 10#175#
    2#1010_1111#  -- or 2#10_101_111#

    You can also define binary constants ("bit-strings") in PL/I, for example:

    '10101111'B

    C has sprouted C++, Objective-C, Java and C# (2.75 / 4) (#451)
    by akuzi on Tue Feb 10, 2004 at 02:59:49 PM EST

    > Pascal has matured and grown in leaps and bounds,
    > becoming a premier commercial language. Meanwhile,
    > C has continued to stagnate over the last 35 years
    > with few fundamental improvements made

This post strikes me as a troll, but I'll give the author the benefit of the doubt.

    C itself has remained constant, but it has sprouted a whole tree of derived languages that completely dominate modern applications programming.

Whether you consider these premier commercial versions of Pascal you're talking about to be the same language as the original 'Pascal' is really a matter of nomenclature, since they are at least as different from the original Pascal as C++ is from C. Borland Delphi is really an object-oriented language, whereas the original Pascal had no OO features whatsoever.

    Article is Misleading but Partly True (2.66 / 6) (#474)
    by OldCoder on Tue Feb 10, 2004 at 08:04:02 PM EST

C lacks a formal string type and hash tables because, when the language was designed, it wasn't clear how to create a string or hash that was good enough for all applications, so they created a language in which you could write whatever kind of string or hash table you wanted.

This was a true breakthrough compared to the languages that came before, which overspecified these things, so that you could be stuck with a string or hash that was not appropriate.

The source for the string functions you hate was available, and the idea was that people would write their own string types as needed. That's why they were part of the standard library and not the language, where you could not get away from them.

    The reason that so many projects used the standard string type rather than implementing or buying their own, is basically the cowardice and stupidity of management. Most or many of the software managers in the first several decades of C development were not great programmers themselves, or programmers at all, and were very frightened by the idea of deviating from the standard. I know, I fought that battle.

The reasons for avoiding buffer overflow, and the techniques for avoiding it, were well known and widely publicized by the early 1980s. I remember learning them then. But the only way to enforce them is formal code review, which was too expensive for most budgets. Another management failure.

    To emphasize this, consider the widespread problem of buffer overflow in Microsoft products written in C. Many or most of the people building the software knew about the potential for buffer overflow, but management could not get organized enough to create and enforce coding standards.

Of course, the fact that managing programmers was similar to herding cats only made management more difficult. The current commoditization of programming jobs to low-level clerk-like status, and the regimentation that is the norm in Indian workplaces, might help here...

    Deeper Reasons
Building a safe string type requires building a memory allocation system underneath, such as garbage collection or the malloc/free discipline. Once one of those is in your program, you cannot get rid of it. But some programs are simply not compatible with a one-size-fits-all memory allocation scheme, and so the C language needed to provide programmers with the flexibility that comes with not bolting one in.

    C was built for the programming problems of the 1970's and 80's, when machines were made of wood and men were made of iron, as they say. The luxury of a multi-gigahertz processor was just a pipe dream...

    --
    By reading this signature, you have agreed.
    Copyright © 2003 OldCoder

    C is old. But its offspring aren't. (2.62 / 8) (#489)
    by Xtapolapocetl on Wed Feb 11, 2004 at 04:14:55 AM EST

First off, a little background. I'm a longtime C guy (23 years old, been programming in C on a regular basis since I was 9, and damn near daily since I was 15). I love C. But I've gotten to the point where its limitations just get in my way.

    You're right about C being an old language, and it shows. But you're comparing a new version of Pascal (Delphi, which is a far cry from old-school Pascal, being that it's an OO language while Pascal originally was nothing of the sort) to an old version of C - try comparing it to C++ or Objective C, which are basically modern versions of C. I don't consider C99 to be a new version of C, either - I consider it to be an update to an old version of C. While it fixes a lot of annoyances with C and is in my opinion a good thing, it still doesn't add the types of things that make software development less of a battle.

    First I'll talk about Objective C, because it's become my favorite language over the past year. Combined with the NextStep API (I use its best and most current implementation, Cocoa, under MacOS X), it's amazing how rapidly you can develop a GUI application. The standard library is unbelievable, including a through-and-through standardized memory management scheme (NSObject's retain and release reference-counting mechanism). If you follow the rules, you will never have a memory leak, because the rules of who allocates and who releases objects are very well-defined and sane. If I'm writing a utility or something for my own use, I'll do it in Objective C (unless it's the type of thing for which Perl makes more sense, like a quick script). On its own, Objective C is nothing terribly special (although it does let you do some neat things that are difficult or impossible in other languages), but when combined with the NextStep API (which is, for all intents and purposes, Objective C's standard library), it's truly a beautiful thing, and if it makes sense to use it for something I'm working on, I absolutely would not consider using anything else.

    But that only really does you good if you're developing under and for MacOS X (it's good to be a Mac user these days! Come join us!). My day job, though, is that of game developer, and let's face it - the Mac market is not a big enough target. So we use C++.

    C++ is a funny beast. Taken on its surface, it looks to be little more than C with some object-oriented extensions. And while that's what it is, the extensions in question make it a very different programming environment to work with, and a much more enjoyable one. A well-designed class hierarchy does wonders for code correctness and maintainability and allows interesting design patterns that would be somewhat of a hassle to do in C. Templates are another story altogether - template metaprogramming (especially in a field such as mine where performance is king) is incredibly useful - it allows truly generic code with zero runtime overhead. You do have to be careful of code bloat, though, but again a good design will help avoid that sort of problem.

    When it comes to memory management, C++ is still basically where C was - the programmer is responsible for allocating and freeing memory from the heap. But I wouldn't have it any other way! Again, things are different in less performance-sensitive situations, but I need to have full control over everything the program is doing - the last thing I want my code to do is dump into a garbage collection routine that'll waste 200ms for no good reason. 200 milliseconds? Shit - the performance goal for my engine is 60 frames per second, which leaves a scant 16.666 milliseconds to do everything required to render a frame. A 200ms trip down Garbage Collection Lane would waste enough time to render an entire 12 frames. It, and any other language or library feature with similarly unpredictable performance characteristics, are entirely unwelcome in my world. Thank God C(++) gives me control over my own program.

    But shit, man - you're a smart guy. You can come up with a workable solution for organizing resource management - I did it for this engine early in development, and 2 years later, resource leaks are very rare and always caught and fixed almost immediately. Nobody said programming would be brainless.

Now about your complaints with the standard library - again, it was written for a different computing world. Of COURSE it feels dated and doesn't support some things it should. But once again, look at the current incarnations of the C family - I've ranted about Cocoa enough, so suffice it to say your complaints don't apply to it. C++ too, though - the STL is a wonderful library, and a huge timesaver. If you know how to use it properly - not just the basics of its containers, but also its algorithms - you can accomplish a ton of work in a very short time by allowing your code to be "written" by the template processor instead of doing it by hand. And as an added bonus, you're building on code you can rely on - it's been debugged by someone else already! I've never been affected by a bug in STLPort, which is the STL implementation we use for our engine (due to both its very good performance and its portability - nothing is more of a pain in the ass than writing code against different implementations of a core library on different platforms. Too much hassle. So we use STLPort on Win32/Linux/OS X).

    As far as C goes, you're right - it's old, and it shows. But even though C was my first and longest love in the programming world (not the first language I learned - that would be Pascal. But I fell in love with C), I wouldn't consider starting a new project in it, no matter what the project (with the possible exception of embedded programming for a platform where a decent C++ compiler doesn't exist). C++ just offers the programmer too many advantages over plain C, and as I said, if you know what the hell you're doing, you can get equivalent performance out of C and C++. And while you can't really write code in C++ that's faster than anything you could write in C, templates make it easier to write efficient code in certain situations that would require much more (mostly repeated) code in C.

Although I truly love C++, it is certainly not appropriate in all situations either. If performance isn't an issue (or security is) I would generally use a safer, interpreted language (possibly Perl, but lately I've been digging Python. Those aren't the only options, though - many other languages would work fine too). Especially for any code which interfaces with a network (or arbitrary file data, or any other case where a malicious user can feed your program garbage, especially when the program is running with elevated permissions), using a language like C which has no security features is really asking for trouble. Despite my talk about memory management above, I'm not perfect when it comes to details, and neither is any other programmer, and those details will be the death of you in a security-conscious environment. Considering that networking programs are generally I/O-bound as opposed to CPU-bound, a slower but safer language makes sense.

And while I'm on the subject, the same applies to command-line utilities, system daemons, and things of that nature. Frankly I'm shocked that the OpenBSD project hasn't started an initiative to rewrite good chunks of their userland in a non-C language. I'm sure they are well aware that it would result in a safer, more secure system than what they're doing now (combing through their code with a fine-toothed comb looking for security problems). Obviously buffer overflows aren't the only security problem there is, but Perl (and I'd imagine Python and others as well, although I'm less familiar with those) has a ton of libraries, both in its standard library and in public code archives like CPAN, which are designed to help avoid other security problems.

    Ok, I've rambled enough (it's 4 in the morning, and I gotta get back to work - this physics engine isn't gonna debug itself!). I guess my point is that yes, C is old. But many of your issues are addressed quite well by C's modern derivatives (although not all - that wouldn't be possible without sacrificing control and efficiency, which are exactly the things that necessitate using a C-based language in the first place). The problem is that they are used when they shouldn't be. My job couldn't be done without a C-based language, though, and neither could many others. When raw performance counts, there's no reasonable alternative. Nothing else gets out of your way and lets you at the raw horsepower of your machine quite the same way (well, except assembly, but aside from being non-portable, it's generally unnecessary these days unless you're doing very low-level code interfacing directly with the hardware or you're trying to do optimizations which are beyond the scope of the compiler, such as using SIMD instructions). With power always comes responsibility, and C and friends are no exception. If you know how to use them, and you have a reason to do so, they can be a lifesaver. But if you fuck up, you have no one to blame but yourself. The computer just did what you told it to do, nothing more, nothing less.

    ~Xtapolapocetl

    --
    zen and the art of procrastination

    Smalltalk-80 (none / 3) (#516)
    by Phillip Asheo on Wed Feb 11, 2004 at 04:27:12 PM EST

    Kicks every other language's ass including Java and C++.

    If only Steve Jobs had stolen that from Xerox too...

    --
    "Never say what you can grunt. Never grunt what you can wink. Never wink what you can nod, never nod what you can shrug, and don't shrug when it ain't necessary"
    -Earl Long

    Java descending from C? (none / 2) (#529)
    by marcovje on Thu Feb 12, 2004 at 08:15:30 AM EST

Java only borrows some basic C syntax (like {} and the post/pre-increment operators). Actually, Delphi is much closer to Java than e.g. C++ is. Also check the credits for the Java VM: one contributor is a certain... Niklaus Wirth :)

    Huh? (none / 3) (#530)
    by Arevos on Thu Feb 12, 2004 at 09:38:37 AM EST

    Good programming is all about choosing your tools. You're complaining that your screwdriver is a really bad tool at hammering in nails. Your whole rant is, well, utterly pointless.

C isn't high level. It's a step above assembly. If you want to manipulate strings and hash tables or whatever, then either get a good library or don't use C!

    It really is that simple.

    Why are you ranting about this? (none / 1) (#544)
    by Verdeboy on Fri Feb 13, 2004 at 01:10:35 AM EST

    This rant is pointless, C++ is much better than C.

    About your rants about the preprocessor: a programmer worth his salt DOESN'T USE THAT STUFF, or at least I don't.

About your string rants: C++ has a header, <string>, which defines an object-oriented string type, and std::string already overloads the appropriate operators (operator+ and operator+= concatenate strings). If I want to do complicated string operations I use Perl scripts embedded in my C++ code - which is very easy to do, since that excellent language is itself written in C.

The reason there is no exponentiation operator is that they ran out of operators - operator^() is already taken for bitwise XOR.

Finally, check your UNIX makefiles and note that most of them call a C compiler - why on earth they don't use C++ I will NEVER know.
But all in all, this kind of thing is a personal preference issue; you should have figured out mine by now.
    --Verde

    DETECTING WINDOWS USE DOWNlOAD SLACKWARE LINUX
    Hilarious (none / 1) (#550)
    by loqi on Fri Feb 13, 2004 at 06:48:58 PM EST

    This is even better than that "Why C++ is the coolest language" article. Keep taking him seriously, folks, it's got me in stitches.

    that's strange... (none / 0) (#563)
    by busfahrer on Mon Jul 12, 2004 at 02:58:17 PM EST

    ...he listed all the reasons for which I like C.
    --
    GCS d s:+ a19 C++ UL P+>P++ L+>L++ E- W++ N+ o? K? w+>w++ O! M- V? PS+ PE-- Y+ PGP t 5? X+ R(R+) tv b- DI D++ G e h! y
    THE BEST PROGRAMMING LANGUAGE (none / 0) (#564)
    by THE TRUTH on Sun Jun 12, 2005 at 06:18:40 PM EST

    EVERYTHING YOU CAN DO IN C (C++,C#,OBJECT C ...) YOU CAN DO IN PASCAL(FREE PASCAL, OBJECT PASCAL AND DELPHI) EXCEPT THE ERRORS THAT THE C COMPILER ALLOWS. :))

    THE SIZE, THE PORTABILITY AND THE SPEED OF THE CODE IS SIMILAR IN ALL COMPARABLE LANGUAGES.

I HAVE WORKED IN ALL OF THE PROGRAMMING LANGUAGES ABOVE BUT NONE COMPARES TO DELPHI.

    DELPHI RULZ!!!!!!!!!!

    IT IS THE FUTURE. TRY IT! IT'S THE BEST!:-)

    C IS OUTDATED, PASCAL HAS EVOLVED.

    Whoa, what the shit? (none / 0) (#565)
    by Patrick Chalmers on Sat Oct 15, 2005 at 12:41:27 PM EST

    This story still hasn't been archived? Laughing online!

    BTW, hi bleep.
    Holy crap, working comment search!
