
Is IoT a Reason To Learn C? (cio.com) 374

itwbennett writes: Whether or not beginning programmers should learn C is a question that has been roundly debated on Slashdot and elsewhere. The general consensus seems to be that learning it will make you a better programmer -- and it looks good on your resume. But now there might be another reason to learn C: the rapid growth of the internet of things (IoT) could cause a spike in demand for C skills, according to Gartner analyst Mark Driver. "For traditional workloads there is no need to be counting the bytes like there used to be. But when it comes to IoT applications there is that need once again..."

Comments Filter:
  • Until (Score:2, Insightful)

    Someone creates an IoT language!

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Ugh... Please don't give them any ideas. C is plenty fine for the task.

    • There's MicroPython and you can also build stuff with Simulink.

      • Re:Until (Score:4, Insightful)

        by Z00L00K ( 682162 ) on Wednesday February 15, 2017 @02:13AM (#53871473) Homepage Journal

        If you really want something unmaintainable you should go for Simulink.

        On the other hand - I have never seen a good reason NOT to learn C. It's one of the basic building block languages that's widely used on almost every platform, so you won't waste your time if you learn C.

  • by haruchai ( 17472 ) on Tuesday February 14, 2017 @09:05PM (#53870001)

    and I'm not sure learning C will help much with that.

    • The language that pretty much invented the buffer overflow? Yeeeah, I think you might be right.

      • by bug1 ( 96678 ) on Tuesday February 14, 2017 @09:31PM (#53870149)

        C was invented as portable assembly, IIRC. If you can't sort out a buffer overflow then don't call yourself a programmer.

        Real programmers see their job as making the computer's job easy, not the other way around.

        • Computers (and machines in general) were created to make the life of humans easier. Imho, a real programmer also remembers that fact.

          • by bug1 ( 96678 )

            Computers (and machines in general) were created to make the life of humans easier. Imho, a real programmer also remembers that fact.

            There is no technical reason a programmer can't create code that is good for humans (usable) and for machines (efficient).

            The problem is when a programmer decides, or is forced, to compromise on quality. That is the lesser programmer.

        • by Opportunist ( 166417 ) on Tuesday February 14, 2017 @09:56PM (#53870309)

          Real programmers? As compared to the untrue Scotsman?

          Look around you and see what doubles as "programmer" today. Ask them how a stack overflow happens and why that is a problem. If I get a buck from you every time you get a blank stare and you get ten from me every time you get a sensible answer, your house belongs to me before the week is out.

        • by haruchai ( 17472 )

          "If you cant sort out a buffer overflow then dont call yourself a programmer"
          Several decades of severe network exploits show that a huge amount of software, commercial, free, open source, obscure, commonplace, whatever was written by people who can't call themselves programmers and used by those who don't have the 1st clue how to use it securely.

          When I first got serious about computing, 2 decades gone, the common wisdom was that we had not choice but to keep on using the old (mostly very insecure) software

          • by Darinbob ( 1142669 ) on Tuesday February 14, 2017 @11:51PM (#53870973)

            Most buffer overflows weren't necessarily because the original code was sloppy, but because the code was copied so readily. Someone has a simple routine without all the necessary checks because it's not being used for anything very important: software that doesn't need to be secure, a one-off utility (maybe it converts PostScript to PCL). Then someone copies that routine into another program, makes that a set-uid program, and poof, you've got a security hole. The first programmer says "it was not intended to be re-used", the second programmer says "re-inventing the wheel is foolish!", and they blame each other.

        • by ShooterNeo ( 555040 ) on Tuesday February 14, 2017 @11:09PM (#53870737)

          I do embedded C programming. With this said, I don't think that improvements to the tools are impossible - sure, I have to prevent buffer overflows myself at the present time - but it doesn't have to be this way. The key thing about embedded programming is that hardware designers are lazy. They want to do the least amount of work possible. So instead of making their hardware easy to program, they like to make it in a way that is easiest to them. So every data sheet contains all kinds of special exceptions to the rules that you the programmer have to take into account. And instead of supporting some fancy, easy to program in language, they do the minimum amount of work to make a C compiler work. (it's really minimal - you only need to map a few base instructions to opcodes on the hardware and you can bootstrap the C compiler).

          One major issue is that while every microcontroller or DSP generally has roughly the same stuff - various ports that do the same thing, the MAC instruction, usually a von Neumann architecture, usually interrupts and DMA - you basically have to scrape the datasheet for weeks to do something you've done before on a different microcontroller.

          • Well, most systems use third-party C compilers, gcc or Keil or whatever. C is low-level enough that the CPU doesn't have to know it's C anyway; it could be Fortran, Pascal, Ada, or whatever, as long as it has a basic stack model. They're not supporting any particular language. Von Neumann isn't even required; there are many embedded systems that are Harvard architecture (separate regions for code and data), which C supports just fine.

            • Sure. And if somebody has bothered to do the porting work and write a third party tool that uses a different compiler - and your project has the budget for such a tool - it's better. But it still isn't easy. You still end up reinventing many things.

        • Worth mentioning that a half-way decent string and buffer library will solve basically all buffer overflow issues. If you're using strcpy(), gets(), or even strncpy() (without knowing the pitfalls), then you're in trouble.
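
          To make the strncpy() pitfall concrete, here is a minimal sketch of a bounded copy that always NUL-terminates; the helper name copy_name is made up for illustration:

            #include <string.h>

            /* strncpy() pads but does not NUL-terminate when src fills the buffer.
               A bounded copy that always terminates avoids that pitfall.
               (Illustrative sketch; copy_name is a hypothetical helper.) */
            void copy_name(char *dst, size_t dstlen, const char *src)
            {
                if (dstlen == 0)
                    return;                    /* nothing we can safely do */
                strncpy(dst, src, dstlen - 1); /* copy at most dstlen-1 bytes */
                dst[dstlen - 1] = '\0';        /* guarantee termination */
            }
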
      • by guruevi ( 827432 )

        Buffer overflows can be very useful constructs when you handle them very carefully and properly. They can be used as a GOTO in code for very limited microprocessors.

        • by vux984 ( 928602 )

          If you need a stronger microprocessor for a task, use one. The notion that using buffer overflows, stack smashing, or jumping into data segments with 'dynamic' code is something you should ever do is simply ridiculous.

          Yeah... to get a 60Hz 1970s microprocessor to do something useful in 512 bytes of RAM you had to be creative... and it was expensive to go upmarket for a better CPU. But in the 21st century if you are using a buffer overflow to write code on purpose, the time you spend building and documenting

        • No. If that's your concern, simply PUSH the target address and RETurn to it. At least if your processor doesn't simply let you MOVe raw data into the IP.

          There is exactly no reason to ever smash the stack to redirect the program flow UNLESS, of course, you do not have control over the program code, i.e. when you have to do pretty much what those who use that kind of exploit do. But then we're leaving the area of development and entering the realm of ... let's say repurposing.
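
          For what it's worth, in plain C the boring way to redirect control flow is just an indirect call through a function pointer; a minimal sketch, with made-up handler names:

            #include <stdio.h>

            /* Redirecting control flow the boring, portable way: an indirect
               call through a function pointer. Handler names are hypothetical. */
            static void handle_idle(void)  { puts("idle");  }
            static void handle_alarm(void) { puts("alarm"); }

            int main(void)
            {
                void (*handler)(void) = handle_idle; /* pick a target at runtime */
                handler();                           /* indirect call, no stack tricks */
                handler = handle_alarm;
                handler();
                return 0;
            }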

    • by raymorris ( 2726007 ) on Tuesday February 14, 2017 @09:35PM (#53870171) Journal

      Career network security programmer here.

      Absolutely. If you're programming a device that connects to the internet, you should understand a bit about security and have a security mindset. If your device won't get regular updates, this is even more true.

      Where does C fit in? It's unnecessary, if you just want to learn basic security best practices.

      If you want to really understand how exploits work, and some advanced protections, you need to understand how your program and your data are arranged in memory, what actually happens in the hardware when your asynchronous timer goes off, etc. For that, C is the language to learn. Java programmers know how to use Java. C programmers know how Java works internally. The bad guys writing exploits are (typically) C programmers; they can defeat your PHP or Python program because they know how PHP and Python work internally.

      You've always used languages with automatic garbage collection, so you mostly don't have to worry about freeing memory after you use it? Great. You don't know how and when memory is freed, and what happens when a hacker exploits a "use after free" to execute code that he's put into the variable you think no longer exists.

      To be clear, I'm not saying that people need to *use* C to write secure software. I'm saying that if you *learn* C, you'll learn a lot that applies to advanced security knowledge in any language. Higher level languages are most commonly written in C; if you know how things are done in C you'll understand what your high-level language is doing behind the scenes. You'll understand your Ruby software much better if you understand how the same program works in C.
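
      As a deliberately contrived sketch of the use-after-free case mentioned above (the struct and field names are invented):

        #include <stdlib.h>

        /* Contrived use-after-free sketch; struct and field names are made up.
           After free(), the memory may be handed to the next allocation, so an
           attacker who controls that allocation can plant a function pointer. */
        struct session {
            void (*on_close)(void);
            char  name[32];
        };

        void end_session(struct session *s)
        {
            free(s);
            /* BUG: dangling pointer dereference; undefined behavior and,
               in practice, a classic code-execution primitive. */
            s->on_close();
        }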

  • by lucasnate1 ( 4682951 ) on Tuesday February 14, 2017 @09:11PM (#53870025) Homepage

    I heard this said before about phones, but eventually technology developed enough for mobile devices to have strong enough processors. People are already too used to programming at a higher level, and I see no reason why the same environments we have on our phones can't run on our fridges or boilers or ovens, so I don't think people will use C.

    • Someone at some level is going to be using C, even on smartphones and IoT devices. Yes, writing apps doubtless will be done in higher-level languages, but there's going to be someone working in C, at the very least writing drivers for a minimalistic Linux environment.

    • by JanneM ( 7445 ) on Tuesday February 14, 2017 @10:43PM (#53870575) Homepage

      The high-level VMs and the drivers for the specific hardware aren't developed by magical Low-Level Elves in Happy Rainbow Fairy Land. Every IoT device is going to have its own special hardware stuff, and somebody needs to write the low-level code to interface with it. That is done in a combination of C and assembler.

      Also, at volume the price difference between an MCU that can run a VM and one that cannot will be on the order of tens of cents (in either currency). If you plan to make on the order of a million devices, then 20 cents per unit will more than pay for a programmer who knows how to use the small MCU over a Java hack who does not.

    • What do you think the higher-level languages are written in, or the operating system on those phones? The high-level stuff is for the frigging app writers, not for the engineers building the device. The apps aren't written in C and the radio layer isn't written in Java.

    • I see no reason why the same environments we have on our phones can't run on our fridges or boilers or ovens, so I don't think people will use C.

      Because CPUs aren't really getting faster anymore, and no one wants to pay an extra $60 for a microwave just so the company can get away with hiring a cheaper programmer.
      I hope I'm wrong and you're right; I would love a ~20GHz processor in my microwave if it were cheap enough.

  • by Opportunist ( 166417 ) on Tuesday February 14, 2017 @09:16PM (#53870047)

    IoT is a reason to learn a few things about IT security. Whether you plan to develop in the field or go into consulting, IoT means total job security in the IT-Security field for the foreseeable future.

    Quite frankly, if you thought Microsoft was keeping security people busy, just wait 'til IoT makes it big in the office space. You're looking at security holes and blunders you can't even imagine today! And every single one of them is a sweet, sweet 4-to-5-digit consulting gig!

    • I think market pressures will fix the security problems pretty quickly as IoT becomes ubiquitous.

      Recall Apple's and Microsoft's ongoing fight for security.

      • Look at the mess internet-capable TVs are. Then ponder your statement again.

        Most "smart" TVs are horribly insecure. Did that cause an outcry? Nope. And as long as those TVs will just participate in a DDoS while still allowing Netflix to be shown, nobody will give a shit.

        Security is not a marketable feature. Nobody gives a shit about security unless it affects him directly. And since IoT devices are mostly used for DDoS blackmail right now, this doesn't even register with the average user.

        • DDoS blackmail isn't a thing [cloudflare.com], though some do fall for the scam:

          Given that the attackers can't tell who has paid the extortion fee and who has not, it is perhaps not surprising to learn that they appear to treat all victims the same: attacking none of them. To date, we've not seen a single attack launched against a threatened organization. This is in spite of nearly all of the threatened organizations we're aware of not paying the extortion fee. We've compared notes with fellow DDoS mitigation vendors and none of them have seen any attacks launched since March against organizations that have received Armada Collective threats.

    • Good one. IT security already seems to be a fairly in-demand skill; combined with the worries over IoT, you'll be set for the foreseeable future.

      But what is IoT? Home automation? Smart appliances? Industrial sensor networks? Intercommunicating cars? What? For now I'll think of IoT as "any networked stuff, other than servers, workstations or network equipment".
      • All that and more. In the consumer environment you're looking at home automation and various gadgets, while companies will certainly want to automate facility management, physical security and power-saving features like lamps that turn on and off depending on where people are, far more accurately than happens today with simple motion detectors.

        There's plenty of room for IoT devices, and all of them are horribly insecure. I'm looking at a very bright future.

    • And every single one of them is a sweet, sweet 4-to-5-digit consulting gig!

      That sounds low... I assume that's not per year. How long would each one of those gigs last?

  • Android Things uses Java, and I'm sure other devices will use different languages (Python, or something else that comes along). C is a nice language to understand if you want to move libraries for Arduino to other platforms, but understanding any similar language will make that trivial.
    • by Z80a ( 971949 )

      On Android you can, and generally do, use the NDK, which allows you to create natively compiled C libraries to get a really, really nice boost of speed.
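
      For anyone who hasn't seen it, a rough sketch of what the C side of an NDK native method looks like; the Java class com.example.Sensor and its readRaw() method are made up for illustration:

        #include <jni.h>

        /* Hypothetical native method for a made-up Java class com.example.Sensor
           declaring: public static native int readRaw(); */
        JNIEXPORT jint JNICALL
        Java_com_example_Sensor_readRaw(JNIEnv *env, jclass clazz)
        {
            (void)env;
            (void)clazz;
            return 42;   /* stand-in for a computation worth doing natively */
        }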

  • If it's imported as a product, what code is going to be done? Branding over some device-ready GUI?
    That's more an art skill for a web app?
    Making your own IoT device in the USA? What's trendy in 2017 for low-level design work?
    A US design team with the needed skills will work on that.
    If the kit is sold, most of the work has been done.
    If you're making your own kit, you have the skills or you're paying a very smart person with the skills.
  • ... that is the question.

    • To C or not to C ... that is the question.

      And the answer would be true, regardless of the value of C. ;-P

  • ... use BASIC:

    10 get foxnews.com
    20 refresh
    30 goto 10

  • Learn Swift (Score:4, Interesting)

    by SuperKendall ( 25149 ) on Tuesday February 14, 2017 @09:25PM (#53870097)

    Swift is a language well suited to byte counting, if that is a need. At this point, because of the tremendous pressure to increase security on IoT devices, I really think Swift could have massive uptake.

  • In an ideal world, developers of this newly emerging industry would try to avoid the mistakes of the past. They would gravitate towards one of the "safer" low-level languages such as Rust or Ada instead of C.

    Of course, from the news headlines it seems that IoT developers are already intent on recreating every bad security practice that's been described since the Morris Worm. So I'm not holding my breath.

  • Rust does anything C can, in any space, but more safely.
    Light IoT will always need languages that are very power-conscious, but I see that bringing a rise of CUDA IoT, or even FPGA skills.

    Heavy IoT will always be flavor-of-the-month.

  • Just C on your resume without C++ will pigeonhole you as a code monkey. Learn C++ and you will also know C, which is essentially just C++ with subtractions. These days, it would be rare to encounter an embedded tool chain that does not support both C and C++, usually with the same compiler.

    • by ebonum ( 830686 )

      I'm a COO, and I have C and C++ on my resume.

      It is not C on your resume that limits you; the limit is only in your mind.

    • C++ is not C and C is not C++.

    • After working with C++11, and seeing where C++17 is going, I decided I don't want to do C++ anymore. It's still on my resume (just like C#), but I'm not taking any jobs in it. The language is usable, but C++ codebases tend to suck.
      • Sounds like you found it too confusing. But really, C++ is much better with lambdas (you can get rid of functors, for one thing), type deduction works really well these days, generalized constexprs are really handy, etc., etc. I'm not sure you understood what you were reading; you certainly did not provide any specifics. Better think about taking C++ off your resume, because if you run into a competent practitioner you risk being unmasked as someone who knows about the language but does not know the language.

        • But really, C++ is much better with lambdas (you can get rid of functors, for one thing), type deduction works really well these days, generalized constexprs are really handy, etc., etc.

          I can link to half a dozen nice looking codebases in C. Can you link to even one in C++? The only ones I can think of that are any good limit themselves to a small subset of C++.

          • You've pretty much outed yourself as someone who has C++ on their resume in spite of having only a limited reading knowledge at best. Why don't you start by reading some code for the Kate editor? [kde.org] Seems pretty approachable to me. Qt-centric, but that's not a bad thing.

  • by ebonum ( 830686 )

    Have we fallen so far that we need a reason such as "IoT" to learn C?

    Sad day.

  • Yeah, I don't think any modern IoT device has any programmer "counting the bytes". I used to count bytes, back when I had 4KB of memory, or 8MB of memory, or 20MB of disk space. I think you'll be hard-pressed to find any IoT device with less than a gig of virtual memory. Considering zero or near-zero graphics output, I think you'll be just fine with any language ever invented.

    My vote goes to Turing, which I haven't seen in two decades, but for which I have a school-age nostalgia -- I made a street-fight

  • Most mods will probably flag this as trolling, but I believe JavaScript is a great language for IoT. There are a few advantages to using JavaScript: it's actually very easy to get networking to work well and reliably, and a programmer will be able to write front-end, server-side/serverless back-end and IoT back-end code all in one language. The code will be portable across all these bases (not always needed, but some functions will be universal). There is now a proliferation of embedded devices that support JavaScri
  • It seems likely that most IoT devices will rely on a central hub, which leaves the devices relatively dumb. The hub (or cloud) side will likely be a less constrained environment, so it will use a higher-level language.

    The other factor is that the thing side has a manufacturing component, so it will probably be commoditized by Chinese manufacturers and relatively few jobs will exist elsewhere.

  • Simply because... (Score:4, Insightful)

    by God of Lemmings ( 455435 ) on Tuesday February 14, 2017 @11:16PM (#53870771)
    knowing how to program in C and how C works under the hood makes you a better programmer. Even if you don't program in C. That is reason enough.
  • Like it or not, C is still the de facto standard for embedded programming. If you're working anywhere near bare metal, you're going to be using it. That may change, but for now, if you dev for the Internet of Things and you don't know it, you're basically a script kiddie. Source: I'm an electrical engineer.

  • Reducing compute times/memory usage by 0.1% in big data centers reduces the cost of electricity and cooling by millions.

    Reducing compute times/memory usage in battery powered devices increases battery lifetime.

    I view C++ as a better choice than C, though: who can code a linked list from scratch more efficiently than the one the library already gives you? You get more abstraction power.

  • Because the current crop of IoT programmers clearly don't know a thing about it, or just don't care, which is why IoT devices form the largest DDoS attack army in history.
  • C is the programming lingua franca. Anybody calling himself a programmer and not knowing C is not a programmer.

    Before mainstream software programming turned into web programming, C was taught in every university computer science programme.

    Not knowing C limits your competence to a small pond called web programming.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday February 15, 2017 @03:48AM (#53871725)

    ... Embedded is a reason to learn C, though. And embedded and IoT do have some intersection/overlap. But IoT itself is mostly a fad involving the slapping together of unsafe prefab micro-Linuxes with unsafe, overkill webservers/port 80 stuff, and adding that to toasters and stuff that really doesn't need it and won't be used more than ~3 times, unless by some bored teenager who wants to screw up your home's heating or AC by surfing Shodan for some long-forgotten default access to said IoT trinkets.

    Bottom line:
    You shouldn't do anything because of IoT unless it's avoiding it like the plague (unless you're a hacker, that is). OTOH, if you want to learn embedded, C with assembler for the basics is the way to go.

    Good luck.

  • by AaronW ( 33736 ) on Wednesday February 15, 2017 @05:59AM (#53872037) Homepage

    While most of my work is for chips that are vastly more powerful than what is found in IoT devices, I work on bootloaders and bare-metal programming. In some cases memory is at a premium. With only a couple of exceptions, all of the work I have done has been in C. Most small micros are programmed almost exclusively in C with a sprinkling of assembly.

    C is very good for working closely with the hardware and in memory-constrained environments. C code does exactly what it says; there is no hidden overhead. The runtime needed to run C code is pretty minimal: often all it really needs to get going is a few registers initialized and a stack.
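
    To give an idea of how little runtime that is, here is a rough bare-metal startup sketch. It assumes a Cortex-M-style core that loads the initial stack pointer from the vector table, plus linker-script symbols named _sidata, _sdata, _edata, _sbss and _ebss; both are assumptions for illustration, not a description of any particular part:

      /* Rough bare-metal startup sketch; symbol names assume a typical
         GNU linker script and a core that loads SP from the vector table. */
      extern unsigned long _sidata, _sdata, _edata, _sbss, _ebss;
      extern int main(void);

      void Reset_Handler(void)
      {
          unsigned long *src = &_sidata;
          unsigned long *dst = &_sdata;

          while (dst < &_edata)        /* copy initialized data from flash */
              *dst++ = *src++;
          for (dst = &_sbss; dst < &_ebss; dst++)
              *dst = 0;                /* zero the BSS */

          main();
          for (;;)                     /* main() should never return here */
              ;
      }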

    It works beautifully with memory-mapped registers and data structures, which are extremely common in this environment. There's even a keyword designed for this, volatile, that is not present in most other languages (or does not do the same thing).

    I can use a bitfield to define hardware registers and just map a pointer to the address of that register and use it. Mixing in assembly code is easy if it's needed, though generally I find that assembly code isn't needed very often.
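
    A rough sketch of that pattern; the peripheral, its base address and the bit layout are all invented for illustration, and bitfield layout is compiler/ABI dependent:

      #include <stdint.h>

      /* Invented UART-style peripheral for illustration only; real parts differ. */
      typedef union {
          struct {
              uint32_t tx_empty : 1;   /* transmit FIFO empty  */
              uint32_t rx_ready : 1;   /* receive data waiting */
              uint32_t reserved : 30;
          } bits;
          uint32_t word;
      } uart_status_t;

      typedef struct {
          volatile uint32_t      data;    /* 0x0: TX/RX data register */
          volatile uart_status_t status;  /* 0x4: status flags        */
      } uart_regs_t;

      #define UART0 ((uart_regs_t *)0x40001000u)   /* made-up base address */

      static void uart_putc(char c)
      {
          /* volatile forces a fresh read every iteration; without it the
             compiler could hoist the load out of the loop and spin forever */
          while (!UART0->status.bits.tx_empty)
              ;
          UART0->data = (uint32_t)(unsigned char)c;
      }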

    It's also easy to generate a binary blob using C, a linker script and a tool like objcopy.

    My experience has mostly been with 64-bit MIPS and some ARMv8, but it applies to most embedded processors. The output of the C compiler when optimizing for size (with the right options set) is pretty close to hand-optimized assembly, and often even better, because the compiler does things that, done by hand, would make the code hard to read or maintain. The runtime overhead of C is minimal.

    C's flexibility with pointers is also a huge plus. I can easily convert between a pointer and an unsigned long (or unsigned long long) and back as needed, or typecast to a different data type, and pointer arithmetic is trivial. There is no hidden bounds checking or pointer checking to worry about. Many people say that's a bad thing, but when you're working close to the metal it can really turn into a major pain in the you know what. Master pointers and know how and when to typecast them. I've seen too many times where people screw up pointer arithmetic. I once spent several weeks tracking down a bug that showed up as corruption in one of my large data structures. It turned out that some totally unrelated code in a totally different module was written by someone who didn't understand pointer arithmetic and was spewing data all over the place other than where it should be. He also didn't realize that when you get data from a TCP stream you don't always get the amount of data you ask for; it can be less.
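
    On the TCP point specifically, here is a minimal sketch of the receive loop you need because recv() can legally return fewer bytes than requested (POSIX sockets assumed; recv_all is a made-up helper name):

      #include <stddef.h>
      #include <sys/types.h>
      #include <sys/socket.h>

      /* Read exactly len bytes from a stream socket. recv() may return fewer
         bytes than requested, so keep looping. Returns 0 on success, -1 on
         error or if the peer closes early. (Sketch; recv_all is hypothetical.) */
      static int recv_all(int fd, void *buf, size_t len)
      {
          char *p = buf;

          while (len > 0) {
              ssize_t n = recv(fd, p, len, 0);
              if (n <= 0)              /* error (<0) or orderly shutdown (0) */
                  return -1;
              p   += n;
              len -= (size_t)n;
          }
          return 0;
      }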

    I have been writing low-level device drivers and bootloaders for over 20 years and while programming has changed significantly for more powerful systems in userspace, for low level programming it has changed very little. My advice is to learn C and learn it well. Know what some of those keywords mean like static and volatile. Anyone who interviews in my group damned well better know what volatile means and how to use it.

    C isn't a very complicated language but it takes time to properly master. It also doesn't hold your hand like many modern languages, which is why you often hear it is easy to have things like buffer overflows or stack overflows and it's also easy to shoot yourself in the foot. It doesn't have many of the modern conveniences, but those conveniences often come at a cost in terms of memory and performance.

    The best book I've seen on the language was written by the authors of the language. It's not very long but it is concise and well written.

    The C Programming Language [amazon.com] by Brian W. Kernighan and Dennis M. Ritchie.

    I have also worked on C++ device drivers. While the overhead of C++ itself is generally minimal, that depends on using only a subset of C++, and you have to know what C++ is doing behind the sce

  • by Lisandro ( 799651 ) on Wednesday February 15, 2017 @06:23AM (#53872099)

    The nice thing about C is that it is as close to portable assembler as a language can get. It forces you to understand how computers and OSes work in order to become proficient, and in time your code will become better because of it, even when using other languages.

    Of course, the ugly thing about C is that it is as close to portable assembler as a language can get.
