Is IoT a Reason To Learn C? (cio.com)
itwbennett writes: Whether or not beginning programmers should learn C is a question that has been roundly debated on Slashdot and elsewhere. The general consensus seems to be that learning it will make you a better programmer -- and it looks good on your resume. But now there might be another reason to learn C: the rapid growth of the internet of things (IoT) could cause a spike in demand for C skills, according to Gartner analyst Mark Driver. "For traditional workloads there is no need to be counting the bytes like there used to be. But when it comes to IoT applications there is that need once again..."
Until (Score:2, Insightful)
Someone create an IoT language!
Re: (Score:2, Insightful)
Ugh... Please don't give them any ideas. C is plenty fine for the task.
Re: (Score:3)
Re: (Score:2)
If I can't use Rust or C# for low-level coding, then I demand my money back!!!!!!!!! For fuck's sake, I want to compile Ruby to run executables on my fridge!
Re: (Score:3)
You can use Rust; that's kind of what it's designed for. It's got more or less the same view of the machine as C, but with a much more sophisticated compiler front end.
Re: (Score:2)
There's MicroPython and you can also build stuff with Simulink.
Re:Until (Score:4, Insightful)
If you really want something unmaintainable you should go for Simulink.
On the other hand - I have never seen a good reason NOT to learn C. It's one of the basic building block languages that's widely used on almost every platform, so you won't waste your time if you learn C.
Re:Until (Score:5, Insightful)
Right, but it's shitty for the programmer to manage it manually, isn't it?
Not as shitty as having to guess when Java will close the file you just wrote (e.g. so you can copy it to a USB stick).
Resource management is much more than RAM, it's files, network connections, etc. Garbage collectors handle RAM OK but they're really really crap for everything else. In reality languages like Java need just as much manual resource management as C.
The only language which really doesn't need manual work is C++. C++ has stack unwinding. C++ frees resources immediately when objects go out of scope, not when some garbage collector decides to wake up (which might be "never" - your file might _never_ close unless you quit the program).
Re: (Score:2)
A C64 game written in C++17. [youtube.com]
Re: (Score:3)
A C64 game written in C++17. [youtube.com]
This kinda displays why C++ is unsuitable for small memory stuff. The C64 had remarkably large games in its time, and yet by using C++ all that is possible is a simple pong clone. I wrote for the C64. This "demo" is very unimpressive.
Re:Until (Score:4, Insightful)
Complete bollocks.
I use C++ on a day to day basis on Atmel chips. I've written ray tracers in C++ on chips with 4K Flash and 128 bytes of RAM using ordinary "float" data types.
The entire "Arduino" ecosystem runs on C++, driving 3D printers, robots, etc. on tiny microcontrollers. How do you explain that Pesky Fact?
Java, Python and VB.net don't have a snowball's chance in hell of running on one of those chips.
(Most likely you've never even used C++, you're just repeating crap you once read on the Internet)
Re:Until (Score:5, Interesting)
The trick to using C++ in embedded is to throw away stuff like the STL that depends on memory allocation, and the ivory-tower stunt crap like iostreams. Even templates are generally bad because they cause code size to explode by compiling a new version of each method for every class that uses them.
Basically, do what mbed does and use it as "C with classes". If you declare global objects for each I/O device, the class declaration then becomes an API for the device, and the object hides all the internal state. The operational details of it can be rewritten for a similar kind of device, without changing the code that uses it. If you do things right, you might even save some bytes. C++'s inlining of method calls when there is no ambiguity also contributes to the efficiency. It is also much easier to use virtual methods than to deal with the twisty function pointer syntax in C, plus you only get one set of virtual function pointers generated per class in the vtable, and can't use the wrong function pointers by accident.
A damn good reason to learn security best practice (Score:5, Insightful)
and I'm not sure learning C will help much with that.
Re: (Score:3)
The language that pretty much invented the buffer overflow? Yeeeah, I think you might be right.
Re:A damn good reason to learn security best pract (Score:5, Interesting)
C was invented as portable assembly, IIRC. If you can't sort out a buffer overflow then don't call yourself a programmer.
Real programmers see their job as making the computers job easy, not the other way around.
Re: (Score:2)
Computers (and machines in general) were created to make the life of humans easier. Imho, a real programmer also remembers that fact.
Re: (Score:2)
Computers (and machines in general) were created to make the life of humans easier. Imho, a real programmer also remembers that fact.
There is no technical reason a programmer can't create code that is good for humans (usable) and for machines (efficient).
The problem is when a programmer decides, or is forced, to compromise on quality. That is the lesser programmer.
Re:A damn good reason to learn security best pract (Score:5, Insightful)
Real programmers? As compared to the untrue Scotsman?
Look around you and see what doubles as "programmer" today. Ask them how a stack overflow happens and why that is a problem. If I get a buck from you every time you get a blank stare and you get ten from me every time you get a sensible answer, your house belongs to me before the week is out.
Re: (Score:3)
You've got it backwards. Yes, I do know x86 and the pertinent C implementations rather well. But no, since C abstracts those concepts away, there is a) no need to and b) it actually makes things less portable if you have implicit assumptions about what the machine has and does WHEN C ALLOWS YOU TO NOT CARE ABOUT IT.
So please tell me, given the code
int a, *b = malloc(sizeof *b);
What exactly is the advantage of thinking about `a` and `b` as "on the stack" and `*b` "on the heap"? I truly don't get it.
What is
Re: (Score:2)
"If you cant sort out a buffer overflow then dont call yourself a programmer"
Several decades of severe network exploits show that a huge amount of software, commercial, free, open source, obscure, commonplace, whatever was written by people who can't call themselves programmers and used by those who don't have the 1st clue how to use it securely.
When I first got serious about computing, 2 decades gone, the common wisdom was that we had no choice but to keep on using the old (mostly very insecure) software
Re:A damn good reason to learn security best pract (Score:5, Insightful)
Most buffer overflows weren't necessarily because of being sloppy in the original code, but because the code was copied so readily. Someone has a simple routine without all the necessary checks because it's not being used for anything very important, software that doesn't need to be secure, it's a one-off utility (maybe it converts postscript to PCL). Then someone copies that routine into another program, makes that a set-uid program, and poof you've got a security hole. First programmer says "it was not intended to be re-used", second programmer says "re-inventing the wheel is foolish!", and they blame each other.
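To make the failure mode concrete, here is a minimal, hypothetical C sketch (function names invented for illustration) of that pattern: a quick routine written without checks, which is fine until it gets copied somewhere privileged.

#include <stdio.h>
#include <string.h>

/* Hypothetical "one-off utility" routine: fine on trusted input,
   a classic overflow the moment it is reused on untrusted data. */
static void greet_unsafe(const char *name)
{
    char buf[16];
    strcpy(buf, name);                      /* no length check: writes past buf if name is long */
    printf("hello %s\n", buf);
}

/* The reusable version costs one line of care. */
static void greet_safe(const char *name)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);  /* truncates instead of overflowing */
    printf("hello %s\n", buf);
}

int main(void)
{
    greet_safe("world");
    greet_unsafe("world");                  /* harmless here, dangerous with attacker input */
    return 0;
}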
Re:A damn good reason to learn security best pract (Score:5, Interesting)
I do embedded C programming. With this said, I don't think that improvements to the tools are impossible - sure, I have to prevent buffer overflows myself at the present time - but it doesn't have to be this way. The key thing about embedded programming is that hardware designers are lazy. They want to do the least amount of work possible. So instead of making their hardware easy to program, they like to make it in a way that is easiest to them. So every data sheet contains all kinds of special exceptions to the rules that you the programmer have to take into account. And instead of supporting some fancy, easy to program in language, they do the minimum amount of work to make a C compiler work. (it's really minimal - you only need to map a few base instructions to opcodes on the hardware and you can bootstrap the C compiler).
One major issue is that while every microcontroller or DSP generally has roughly the same stuff - various ports that do the same thing, the MAC instruction, usually a von Neumann architecture, usually interrupts and DMA - you basically have to scrape the datasheet for weeks to do something you've done before on a different microcontroller.
Re: (Score:3)
Well, most systems use third-party C compilers, GCC or Keil or whatever. C is low level enough that the CPU doesn't have to know it's C anyway; it could be Fortran, Pascal, Ada, or whatever, as long as it has a basic stack model. They're not supporting any language. Von Neumann isn't even required; there are many embedded systems that are Harvard architecture (separate regions for code and data), which C supports just fine.
Re: (Score:2)
Sure. And if somebody has bothered to do the porting work and write a third party tool that uses a different compiler - and your project has the budget for such a tool - it's better. But it still isn't easy. You still end up reinventing many things.
Re: (Score:2)
Re: (Score:2)
Buffer overflows can be very useful constructs when you handle them very carefully and properly. They can be used as a GOTO in code for very limited microprocessors.
Re: (Score:2)
If you need a stronger microprocessor for a task use one. The notion that using buffer overflow or stack smashing or jumping into data segments with 'dynamic' code is something you should ever do is simply ridiculous.
Yeah... to get a 60Hz 1970s microprocessor to do something useful in 512 bytes of RAM you had to be creative... and it was expensive to go upmarket for a better CPU. But in the 21st century if you are using a buffer overflow to write code on purpose, the time you spend building and documenting
Re: (Score:2)
No. If that's your concern, simply PUSH the target address and RETurn to it. At least if your processor doesn't simply let you MOVe raw data into the IP.
There is exactly no reason to ever smash the stack to redirect the program flow UNLESS of course you do not have control over the program code, i.e. when you have to do pretty much what those who use that kind of exploit do. But then we're leaving the area of development and enter the realm of ... let's say repurposing.
Learn C for advanced security, not for basics (Score:5, Interesting)
Career network security programmer here.
Absolutely: if you're programming a device that connects to the internet, you should understand a bit about security and have a security mindset. If your device won't get regular updates, this is even more true.
Where does C fit in? It's unnecessary, if you just want to learn basic security best practices.
If you want to really understand how exploits work, and some advanced protections, you need to understand how your program and your data are arranged in memory, what actually happens in the hardware when your asynchronous timer goes off, etc. For that, C is the language to learn. Java programmers know how to use Java. C programmers know how Java works, internally. The bad guys writing exploits are (typically) C programmers; they can defeat your PHP or Python program because they know how PHP and Python work internally.
You've always used languages with automatic garbage collection, so you mostly don't have to worry about freeing memory after you use it? Great. You don't know how and when memory is freed, and what happens when a hacker exploits a "use after free" to execute code that he's put into the variable you think no longer exists.
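For readers coming from garbage-collected languages, here is a minimal, hypothetical C sketch of the use-after-free pattern being described (the struct and names are invented for illustration; the dangerous call is left commented out because executing it is undefined behavior):

#include <stdlib.h>
#include <string.h>

struct session {
    void (*on_close)(void);    /* a function pointer an attacker would love to control */
    char name[32];
};

int main(void)
{
    struct session *s = malloc(sizeof *s);
    /* ... use s ... */
    free(s);                   /* programmer thinks s "no longer exists" */

    /* The allocator may hand the same memory back to the very next allocation... */
    char *attacker_controlled = malloc(sizeof(struct session));
    memset(attacker_controlled, 0x41, sizeof(struct session));

    /* ...so a "harmless" late use of s would now read attacker-controlled bytes
       as a function pointer. This is where real exploits gain control of execution. */
    /* s->on_close(); */

    free(attacker_controlled);
    return 0;
}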
To be clear, I'm not saying that people need to *use* C to write secure software. I'm saying that if you *learn* C, you'll learn a lot that applies to advanced security knowledge in any language. Higher level languages are most commonly written in C; if you know how things are done in C you'll understand what your high-level language is doing behind the scenes. You'll understand your Ruby software much better if you understand how the same program works in C.
Re: Learn C for advanced security, not for basics (Score:5, Interesting)
I agree. I write bootloaders for multi-core 64-bit processors. I've mostly been working on MIPS. C is very easy to get running. In one of my bootloaders which boots off of a SD card or eMMC chip all of the assembly code fits inside the 512-byte boot sector, and that includes the partition table and some UART functions because I had extra space. Most code is written in C. The C compiler does close to hand-tuned assembly code in terms of code density and the compiler often makes decisions that only a well seasoned assembly programmer would think of. C code provides a lot of flexibility but it also keeps the overhead to a minimum. You know exactly what the code will do. There is no hidden garbage collection or anything else. A pointer is just that, an actual address. You can easily convert them to an unsigned long or other numerical representation. You can also easily do things like have bitfields that map directly to hardware registers.
On the processors I work with we have scripts that parse the hardware design and generate bitfields and addresses for all of the hardware registers (and there are many thousands of them on the chips I work with). C is the language to use when you want to get down and dirty with the hardware. On top of that, with gcc and many other compilers it is easy to incorporate inline assembly where needed since there are instructions and things that the compiler will not know how to properly deal with. It's also easy to call C code from assembly or assembly code from C.
Working down at the hardware level you have to know what the language is doing and C does this very well and has all the constructs to do this. For example, the volatile keyword is perfect for mapping a pointer to a memory-mapped hardware register since that tells the compiler that the value can change at any time and not to cache it.
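A minimal bare-metal sketch of that volatile usage; the UART register addresses and bit layout here are invented for illustration, so this is a fragment rather than code for any real chip:

#include <stdint.h>

/* Hypothetical memory-mapped UART registers; addresses are invented. */
#define UART_BASE   0x10000000u
#define UART_DATA   (*(volatile uint32_t *)(UART_BASE + 0x00))
#define UART_STATUS (*(volatile uint32_t *)(UART_BASE + 0x04))
#define TX_READY    (1u << 0)

static void uart_putc(char c)
{
    /* volatile forces a real load on every iteration; without it the
       compiler could hoist the read and spin forever on a cached value. */
    while ((UART_STATUS & TX_READY) == 0)
        ;
    UART_DATA = (uint32_t)c;   /* volatile also forces a real store */
}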
That's not to say that other languages can't also be used. At one point I worked on a large (100K lines of code) kernel-level device driver that was written almost entirely in C++. There was a huge amount of work that was needed to make it work. The drawback was that only one particular version of the Watcom compiler could be used. It had to be 10.1b (I don't remember the exact version but it was 10 something b). Revision C could not be used. There was all sorts of magic that had to be implemented in order to support some of the memory magic that went on due to C++. This was also before exceptions. While the driver worked great and was easy to maintain once all the magic had been implemented, it also required a lot more skill to work with compared to C since one had to know all the caveats involved.
From my initial reading of Rust, it would have major problems due to the way it deals with pointers. If the language is holding your hand to be "memory safe" then there's a good chance it won't work well when dealing with low-level stuff.
Another advantage of C is that there is minimal overhead. C isn't going to do crap behind the scenes. If you assign an array it's just a block of indexed memory with minimal overhead.
Things like pointer arithmetic and typecasting are extremely useful. I can't count the number of times I need to map between a pointer and a unsigned long (or unsigned long long) and it's easy to switch between them.
Features like the volatile keyword in C are extremely useful, and most higher-level languages don't support that (C++ does). Java and C# have the keyword but it doesn't mean the same thing. In C one can also implement things like read and write barriers which are needed for drivers that talk to the hardware.
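A small, self-contained illustration of the pointer/integer flexibility being described; uintptr_t is the portable spelling, though on the platforms discussed here unsigned long is commonly used the same way:

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t word = 0x12345678;

    /* Round-trip a pointer through an integer type, e.g. to stash it in a
       descriptor field or mask off alignment bits. */
    uintptr_t addr = (uintptr_t)&word;
    addr &= ~(uintptr_t)0x3;                  /* already aligned here; shown as the kind of manipulation C allows */
    uint32_t *p = (uint32_t *)addr;

    /* Reinterpret the same bytes as individual octets via a char pointer. */
    unsigned char *bytes = (unsigned char *)p;
    printf("first byte: 0x%02x\n", bytes[0]); /* 0x78 or 0x12, depending on endianness */
    return 0;
}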
They said the same about mobile (Score:3, Interesting)
I heard this said before about phones, but eventually technology developed enough to allow mobile devices to have a strong enough processor. People are already too used to programming at a higher level, and I see no reason why the same environments we have in our phones can't run on our fridges or boilers or ovens, therefore I do not think that people will use C.
Re: (Score:2)
Someone at some level is going to be using C, even on smartphones and IoT devices. Yes, writing apps doubtless will be done in higher-level languages, but there's going to be someone working in C, at the very least writing drivers for a minimalistic Linux environment.
Re: They said the same about mobile (Score:3, Interesting)
With low cost IoT designs, memory is sometimes measured in bytes. I have coded processors with only 16 bytes of data memory.
Re:They said the same about mobile (Score:4, Informative)
The high-level VMs and the drivers that drive the specific hardware aren't developed by magical Low-Level Elves in Happy Rainbow Fairy Land. Every IoT device is going to have its own special hardware stuff, and somebody needs to write the low-level code to interface with it. That is done in a combination of C and assembler.
Also, at volume the price difference between an MCU that can run a VM and one that cannot will be on the order of tens of cents (either currency). If you plan to make on the order of a million devices, then 20 cents per unit (roughly $200,000 across the run) will more than pay for a programmer that knows to use the small MCU over a Java hack that does not.
Re: (Score:3)
Actually most modern languages fall apart since you need a certain feature set at the low level that high-level languages try to protect you from. Many languages have significant overhead required just to use the language and thus don't work well in low-memory situations.
I have written a number of bootloaders that have to fit in 8K of RAM. There is absolutely no way I could write them in anything other than C and a minimal amount of assembly. C code can be very efficient with modern compilers optimized for
Re: (Score:2)
What do you think the higher level languages are written in, or the operating system on those phones? The high level stuff is for the frigging apps writers, not for the engineers building the device. The apps aren't written in C and the radio layer isn't written in Java.
Re: (Score:2)
I see no reason why the same environments we have in our phones can't run on our fridges or boilers or ovens, therefore I do not think that people will use C.
Because CPUs aren't really getting faster anymore, and no one wants to pay an extra $60 for a microwave just so the company can get away with hiring a cheaper programmer.
I hope I'm wrong and you're right; I would love a ~20GHz processor in my microwave because it was cheap enough.
Nope (Score:3)
IoT is a reason to learn a few things about IT security. Whether you plan to develop in the field or go into consulting, IoT means total job security in the IT-Security field for the foreseeable future.
Quite frankly, if you thought Microsoft is keeping security busy, just wait 'til IoT makes it big into the office space. You're looking at security holes and blunders you can't even imagine today! And every single one of them is a sweet, sweet, 4-to-5-digit consulting gig!
Re: (Score:2)
I think market pressures will fix the security problems, pretty quickly as IoT becomes ubiquitous.
Recall Apple's and Microsoft's ongoing fight for security.
Re: (Score:3)
Look at the mess internet capable TVs are. Then ponder your statement again.
Most "smart" TVs are horribly insecure. Did that cause an outcry? Nope. And as long as those TVs will just participate in a DDoS while still allowing Netflix to be shown, nobody will give a shit.
Security is not a marketable feature. Nobody gives a shit about security unless it affects him directly. And since IoT devices are mostly used for DDoS blackmail right now, this doesn't even register with the average user.
Re: (Score:2)
DDoS blackmail isn't a thing [cloudflare.com], though some do fall for the scam:
Given that the attackers can't tell who has paid the extortion fee and who has not, it is perhaps not surprising to learn that they appear to treat all victims the same: attacking none of them. To date, we've not seen a single attack launched against a threatened organization. This is in spite of nearly all of the threatened organizations we're aware of not paying the extortion fee. We've compared notes with fellow DDoS mitigation vendors and none of them have seen any attacks launched since March against organizations that have received Armada Collective threats.
Re: (Score:2)
But what is IoT? Home automation? Smart appliances? Industrial sensor networks? Intercommunicating cars? What? For now I'll think of IoT as "any networked stuff, other than servers, workstations or network equipment".
Re: (Score:2)
All that and more. In the consumer environment you're looking at home automation and various gadgets, while companies will certainly want to automate facility management, physical security and power-saving features like lamps that turn on and off depending on where people are, far more accurately than happens today with simple motion detectors.
There's plenty of room for IoT devices, and all of them are horribly insecure. I'm looking at a very bright future.
Re: (Score:2)
And every single one of them is a sweet, sweet, 4-to-5-digit consulting gig!
That sounds low......I assume that's not per year. How long would each one of those gigs last?
Android Things (Score:2)
Re: (Score:2)
On Android you can, and generally do, use the NDK, which allows you to create native compiled C libraries to get a really, really nice boost in speed.
Who is making your IoT device? (Score:2)
That's more an art skill for a web app?
Making your own IoT device in the USA? What's trendy in 2017 for low-level design work?
A US design team with the needed skills will work on that.
If the kit is sold, most of the work has been done.
If you're making your own kit, you have the skills or are paying a very smart person with the skills.
To C or not to C ... (Score:2)
... that is the question.
Re: (Score:2)
To C or not to C ... that is the question.
And the answer would be true, regardless of the value of C. ;-P
No ... (Score:2)
... use BASIC:
10 get foxnews.com
20 refresh
30 goto 10
Re: (Score:2)
BASIC can't do that on its own, but I bet if you add some magical POKEs it could.
Re: (Score:2)
We're old and stuff.
Learn Swift (Score:4, Interesting)
Swift is a language well-suited to byte counting, if that is a need - at this point, because of the tremendous pressure to increase security on IoT devices, I really think Swift could have massive uptake.
That too exists, LLVM bitcode (Score:2)
What's being discussed here are platforms that need features SWIFT simply doesn't have, like inline assembler to manipulate hardware specific features
That's not so; the assembly language of Swift [apple.com] is bitcode [infoq.com] - which you can embed in Swift code for customized performance.
manual allocation schemes for shared memory (EG reserved blocks that are processed by hardware interrupts)
Although the link is not exactly that case you can use UnsafeMutablePointers [memkite.com] for that purpose. Swift is not a garbage collected langua
Swift is open source too you know (Score:2)
Swift itself is also open source [github.com].
It also can be used as a full replacement for C in a way Rust cannot; see my response to someone saying Swift can't be used for device programming for examples. Rust is a nice language but it's just not as architected as Swift is for use across the space of devices like C is already.
Sigh (Score:2)
In an ideal world, developers of this newly emerging industry would try to avoid the mistakes of the past. They would gravitate towards one of the "safer" low-level languages such as Rust or Ada instead of C.
Of course, from the news headlines it seems that IoT developers are already intent on recreating every bad security practice that's been described since the Morris Worm. So I'm not holding my breath.
Better options (Score:2)
Rust does anything in any space C can, but more safely.
Light IoT will always need languages that are very power cautious, but I see that bringing a rise of CUDA IoT, or even FPGA skills.
Heavy IoT will always be flavor-of-the-month.
Reason to learn C++ (Score:2)
Just C on your resume without C++ will pigeonhole you as a code monkey. Learn C++ and you will also know C, which is essentially just C++ with subtractions. These days, it would be rare to encounter an embedded tool chain that does not support both C and C++, usually with the same compiler.
Re: (Score:2)
I'm a COO, and I have C and C++ on my resume.
It is not C on your resume that limits you, the limit is only in your mind.
Re: (Score:2)
C++ is not C and C is not C++.
Re: (Score:2)
Spoken as someone who has mastered neither.
Re: (Score:2)
Re: (Score:2)
Sounds like you found it too confusing. But really, C++ is much better with lambdas, you can get rid of functors for one thing, type deduction is working really well these days, generalized constexprs are really handy, etc. etc. I'm not sure you understood what you were reading; you certainly did not provide any specifics. Better think about taking C++ off your resume, because if you run into a competent practitioner you risk being unmasked as someone who knows about the language but does not know the language.
Re: (Score:2)
But really, C++ is much better with lambdas, you can get rid of functors for one thing, type deduction is working really well these days, generalized constexprs are really handy, etc. etc.
I can link to half a dozen nice looking codebases in C. Can you link to even one in C++? The only ones I can think of that are any good limit themselves to a small subset of C++.
Re: (Score:2)
You've pretty much outed yourself as someone who has C++ on their resume in spite of having only a limited reading knowledge at best. Why don't you start by reading some code for the Kate editor? [kde.org] Seems pretty approachable to me. Qt-centric, but that's not a bad thing.
Re: (Score:2)
C++-style development is obviously different, but if you are incapable of translating any high-level thing you do in C++ back into plain C then your weak fundamentals make you a second-string developer. Most embedded development is not kernel programming, and the shift away from C to C++ in that space is real. You can choose to be ahead of the curve or behind it.
Re: (Score:2)
You C++ people should realize it's your days that are counted blah blah blah...
Haha, you are very funny. I'm a C person.
Re: (Score:2)
No, I can absolutely guarantee that if you learn C++ without specifically learning C, then you do NOT know C.
Wrong, it only shows that your knowledge of C++ is incomplete, and that you will be a clear and present danger to any serious programming project. If your understanding of C++ is so limited, you would be well advised to stick to the ilk of Java or HTML.
Re: (Score:3)
He is absolutely correct. There are some aspects where the two languages diverge. The way you program in C tends to be very different compared to the way C++ code is written. I have worked with both extensively for low-level projects (i.e. bare metal device drivers).
Wow (Score:2)
Have we fallen so far that we need a reason such as "IoT" to learn C?
Sad day.
Counting the bytes? (Score:2)
Yeah, I don't think any modern IoT device has any programmer "counting the bytes". I used to count bytes, back when I had 4KB of memory, or 8MB of memory, or 20MB of disk space. I think you'll be hard-pressed to find any IoT device with less than a gig of virtual memory. Considering zero or near-zero graphics output, I think you'll be just fine with any language ever invented.
My vote goes to Turing, which I haven't seen in two decades, but for which I have a school-age nostalgia -- I made a street-fight
JavaScript ... and maybe Python (Score:2, Insightful)
Re:JavaScript ... and maybe Python (Score:4, Informative)
This is a discussion about platforms that would buckle under the bulk of a micro-OS and a JS interpreter/VM stack. And that's not even handling the issue that most of these devices use embedded hardware platforms that you need to access with specific assembler calls - how would you do that in JS or Python!?
There are a few JavaScript interpreters that use very minimal resources and have access to all the necessary hardware (wifi, BLE, SPI, UART, i2c, etc), these are Duktape http://duktape.org/ [duktape.org], Espruino https://github.com/espruino/Es... [github.com], JerryScript https://github.com/jerryscript... [github.com], and more. These are all designed for IoT devices. For performance this is an interesting read: https://www.espruino.com/Perfo... [espruino.com]
Re:JavaScript ... and maybe Python (Score:5, Insightful)
Re: (Score:2)
Or even javascript and C: http://jsish.sourceforge.net/ [sourceforge.net]
Duktape is along a similar line: http://duktape.org/ [duktape.org]
Everything is a reason to learn C (and C++) (Score:2)
Because https://www.youtube.com/watch?... [youtube.com]
No.... (Score:2)
It seems likely that most IoT devices will rely on a central hub which leaves the devices as relatively dumb. The hub (or the cloud) side will likely be a less constrained environment so will use a higher level language.
The other factor is that the thing side has a manufacturing component so will probably be commoditized by Chinese manufacturers and relatively few jobs will exist outside.
Simply because... (Score:4, Insightful)
easy choice (Score:2)
Like it or not, C is still the de facto standard for embedded programming. If you're working anywhere near bare metal, you're going to be using it. That may change, but for now, if you develop for the Internet of Things and you don't know it, you're basically a script kiddie. Source: I'm an electrical engineer.
Not just small devices. (Score:2)
Reducing compute times/memory usage by 0.1% in big data centers reduces the cost of electricity and cooling by millions.
Reducing compute times/memory usage in battery powered devices increases battery lifetime.
I view C++ as a better choice than C, though; who can code a linked list from scratch more efficiently than what they already provide? You get more abstraction power.
No, but IoT is a good reason to learn network secu (Score:2)
You have to learn C invariably of what you do (Score:2)
C is the programming lingua franca. Anybody calling himself a programmer and not knowing C is not a programmer.
Before mainstream software programming turned into web programming, C was taught in every university computer science programme.
Not knowing C limits your competence to a small pond called web programming.
No. IoT is a fad. ... (Score:5, Insightful)
... Embedded is a reason to learn C though. And embedded and IoT do have some intersection/overlap. But IoT itself is mostly a fad involving the slapping together of unsafe, preconfectioned micro-Linuxes with unsafe, overkill webservers/port-80 stuff and adding that to toasters and stuff that really doesn't need it and won't be used more than ~3 times, except by some bored teenager who wants to screw up your home's heating or AC by surfing Shodan for some long-forgotten default access to said IoT trinkets.
Bottom line:
You shouldn't do anything because of IoT, unless it's avoiding it like the plague (unless you're a hacker, that is). OTOH, if you want to learn embedded, C with assembler for the basics is the way to go.
Good luck.
Definitely yes (Score:3)
While most of my work is for chips that are vastly more powerful than what is found in IoT devices, I work on bootloaders and bare-metal programming. In some cases memory is at a premium. With only a couple of exceptions, all of the work I have done has been with C. Most small micros are programmed almost exclusively in C with a sprinkling of assembly.
C is very good for working closely with the hardware and in memory-constrained environments. C code does exactly what it says. There is no hidden overhead. The runtime needed to run C code is pretty minimal. All it often needs is a few registers initialized and a stack, and it's ready to go.
It works beautifully with memory mapped registers and data structures which are extremely common in this environment. There's even a keyword designed for this, volatile, that is not present in most other languages (or it does not do the same thing).
I can use a bitfield to define hardware registers and just map a pointer to the address of that register and use it. Mixing in assembly code is easy if it's needed, though generally I find that assembly code isn't needed very often.
It's also easy to generate a binary blob using C, a linker script and a tool like objcopy.
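A minimal sketch of the volatile-bitfield register mapping described above. The register layout and address are invented for illustration; real layouts come from the datasheet (often generated by scripts, as noted elsewhere in this thread), and bitfield ordering is implementation-defined, which is why this style assumes you control the toolchain:

#include <stdint.h>

/* Hypothetical GPIO control register layout (field order is made up). */
struct gpio_ctl {
    uint32_t enable    : 1;
    uint32_t direction : 1;   /* 0 = input, 1 = output */
    uint32_t pull_up   : 1;
    uint32_t reserved  : 13;
    uint32_t drive_ma  : 4;   /* drive strength */
    uint32_t unused    : 12;
};

/* Map the struct straight onto the register's (invented) physical address. */
#define GPIO0_CTL (*(volatile struct gpio_ctl *)0x10000000u)

static void gpio0_make_output(void)
{
    GPIO0_CTL.direction = 1;  /* compiles to a read-modify-write of just that field */
    GPIO0_CTL.enable    = 1;
}

The binary-blob step mentioned above is typically just objcopy -O binary firmware.elf firmware.bin after linking with a script that places the code at the right address.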
My experience has mostly been with 64-bit MIPS and some ARMv8, but it applies to most embedded processors. The output of the C compiler when optimizing for size (with the right options set) is pretty close to hand-optimized assembly, and often even better, because the compiler does things that would otherwise make the code hard to read or maintain. The runtime overhead of C is minimal.
C's flexibility with pointers is also a huge plus. I can easily convert between a pointer and an unsigned long (or unsigned long long) and back as needed, or typecast to a different data type, and pointer arithmetic is trivial. There is no hidden bounds checking or pointer checking to worry about. Many people say that's a bad thing, but when you're working close to the metal it can really turn into a major pain in the you know what. Master pointers and know how and when to typecast them. I've seen too many times where people screw up pointer arithmetic. I once spent several weeks tracking down a bug that showed up in one of my large data structures where I saw corruption. It turned out that some totally unrelated code in a totally different module was written by somebody who didn't understand pointer arithmetic and was spewing data all over the place other than where it should be. He also didn't realize that when you get data from a TCP stream you don't always get the amount of data you ask for; it could be less.
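On the TCP point at the end: recv() on a stream socket is allowed to return fewer bytes than you asked for, so careful C code reads in a loop. A minimal sketch, with error handling trimmed to the essentials:

#include <errno.h>
#include <stddef.h>
#include <sys/types.h>
#include <sys/socket.h>

/* Read exactly len bytes from a stream socket.
   Returns 0 on success, -1 on error or early EOF. */
static int recv_all(int fd, void *buf, size_t len)
{
    unsigned char *p = buf;
    while (len > 0) {
        ssize_t n = recv(fd, p, len, 0);
        if (n == 0)
            return -1;          /* peer closed before we got everything */
        if (n < 0) {
            if (errno == EINTR)
                continue;       /* interrupted by a signal: retry */
            return -1;
        }
        p += n;                 /* advance past the bytes we actually got */
        len -= (size_t)n;
    }
    return 0;
}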
I have been writing low-level device drivers and bootloaders for over 20 years and while programming has changed significantly for more powerful systems in userspace, for low level programming it has changed very little. My advice is to learn C and learn it well. Know what some of those keywords mean like static and volatile. Anyone who interviews in my group damned well better know what volatile means and how to use it.
C isn't a very complicated language but it takes time to properly master. It also doesn't hold your hand like many modern languages, which is why you often hear it is easy to have things like buffer overflows or stack overflows and it's also easy to shoot yourself in the foot. It doesn't have many of the modern conveniences, but those conveniences often come at a cost in terms of memory and performance.
The best book I've seen on the language was written by the authors of the language. It's not very long but it is concise and well written.
The C Programming Language [amazon.com] by Brian W. Kernighan and Dennis M. Ritchie.
I have also worked on C++ device drivers. While the overhead of C++ itself is generally minimal, it requires that you use only a subset of C++ and you have to know what C++ is doing behind the sce
You should learn C to become a better developer (Score:3)
The nice thing about C is that it is as close to portable assembler as a language can get. It forces you to understand how computers and OSs work in order to become proficient, and in time your code will become better because of it - even when using other languages.
Of course, the ugly thing about C is that it is as close to portable assembler as a language can get.
Re: (Score:2)
Fair enough. You can always write C code in C++. But at the risk of being flamed, IMHO both are high-level languages. C++ is just higher level.
I recall reading (maybe in Stanley Lippman's primer book) an exercise question that asked why the language is called C++ and not ++C. Worth pondering.
Re: (Score:2)
Most of the arguments that embedded C developers had against C++ are obsolete. The language has been updated over the decades, and the compilers are much better. The C++ runtime libraries used to be a lot of black magic that was hard to get working in kernel space, and many kernels had to limit themselves to a subset of C++ in order to function (L4Ka, Symbian, and others). There are now lots of compilers that work reliably and lots of papers on how to get as many of the C++ features working on supervisor mo
Re: (Score:3)
I dunno. There is still significant overhead with C++, which is a big problem with small devices (128KB of code for example). There's the issue of the linker not really figuring out how to share object code for templates to avoid typical bloat. There's the hidden operations behind the scenes that cause problems for novice programmers running out of space or wondering why their code is so slow. There really is no good way to make exceptions both fast and use little code space (a shame since I like them i
Re: (Score:3)
I disagree, I think objects, templates and the static type system are core parts of C++. Exceptions and RTTI are nice features but not necessarily helpful to have in every C++ program.
Re:Arduino uses C++, Pi uses Linux (Score:5, Interesting)
I work at a company that makes chips for IoT, though in this case the chips are targeted at things like routers, switches, network security appliances and highly intelligent network cards. While C++ is supported, all of the stuff I work on is C. Our SDK is C. All of the vendor SDKs I've come across for dealing with different devices are written in C. Interfacing to C is fairly simple and well understood compared to introducing C++ in an embedded environment. Now the environment I deal with is either the Linux kernel or lower, usually lower since I work with bare metal most of the time.
C is used for a number of reasons.
1. The generated code is quite fast and fairly compact. With my experience with MIPS the output of the compiler is pretty close to hand-tuned assembly in most cases.
2. It's easy to deal with hardware registers in C. Hardware registers can be defined by volatile bitfields so a simple pointer can be used to access them.
3. There is no unintended overhead or hidden behavior. With C it is very much what you see is what you get. It doesn't do stuff under the covers.
4. Memory mapping data structures and things like that are very easy in C.
5. One is not dependent on things like a certain standard library. The amount of code needed for basic C support is fairly minimal. All you really need is a stack. I have plenty of code that does not have a heap.
6. Things like interrupt handlers are fairly trivial to code in C.
7. One can do interesting things using the linker with C code that are not really possible with most other languages. For example, I can easily link my code to execute at a particular address and generate a binary without any elf headers or any other cruft and there are interesting things that can be done with linker scripts.
8. There is no unexpected overhead due to the language. There is no background garbage collection that can run at some inopportune time. There's no extra code to do bounds or pointer checking to slow down the code or even get in the way.
9. Generally it is pretty easy to move between different versions of the toolchain. C generally doesn't change much.
10. C seems to resist bloat better than other languages, in part because it does exactly what you tell it to and nothing more.
Much of this can apply to C++ as well, though C++ requires a lot more overhead in order to properly support it due to some of the language features and C++ can hide certain things if you aren't careful.
That's not to say that things can't be written in high-level languages. There is plenty of flexibility once you get to something like a Raspberry Pi user-space program.
Arduino uses a subset of C++ but it's such a small subset that it might as well be C.
I write this as someone who has been writing embedded C code and assembly (98% C) for the last 20 years, though I have also worked on a few C++ projects as well. Most of this was device drivers, VxWorks, bootloaders (U-Boot and custom), bare-metal applications and SDKs (dealing with high-speed networking, 1, 2.5, 5, 10, 25 and 40Gbps) and some Linux kernel work and Arduino. I've worked with a variety of different CPU architectures (Intel, MIPS, ARM, PowerPC and more), including one that ran a functional programming language natively in hardware (the processor was physically incapable of running C code).
Re: (Score:3)
Hardware registers can be defined by volatile bitfields so a simple pointer can be used to access them.
Unless, of course, you have multiple registers with ordering constraints between them (e.g. write some data into one register, toggle a flag in another), because the volatile keyword in C does not give any guarantees of ordering between accesses to different volatile objects and the compiler is completely free to reorder the write to the flag before the write to the data.
Things like interrupt handlers are fairly trivial to code in C.
As long as someone else is writing the assembly code that preserves state for the interrupted context, prevents the interrupt handler from
Re:snarky: managed languages RulZ! (Score:4, Interesting)
If you're applying at a shop that does a lot of low-level coding, or coding on processor-, memory- and/or storage-restricted platforms, and your only experience is in Java or C#, I'd say your chances are pretty low. Walk in with a good practical grounding in C coding and I would imagine your chances go up. Not every shop is occupied by hipsters looking for keywords like "Python".
Re: (Score:2)
Then again if you claim to be a programmer who can't pick up python in a few sittings, you don't deserve the title.
Re: (Score:2)
Yes, but shops want "senior" Python programmers. You can't claim to be that after a few sittings.
Re: (Score:2)
Where do you find these shops? I look on job postings and everyone is asking for the latest buzzwords it seems.
Re: (Score:2)
Does "Python, PHP, SQL, C, and MOS 6502 assembly language" look good on a generalist resume? For example, I've used Python to make data conversion tools for games on 6502-based retro platforms.
Re: (Score:3)
There are a lot of people who say they know C who can't handle some of the simple stuff in C, as I have seen in many interviews.
I started saying "I'm sorry that I will ask some very simple questions..." because I felt embarrassed asking someone with so much on-paper experience, who is applying for a hands-on development job, to do such simple things. Then so many of them flub it. Now I'm wondering if my apologies that the question is "simple" actually end up as an insult for someone who can't figure them o
Re: (Score:3)
Sadly I run into that a lot, and I've interviewed a lot of people for low-level (i.e. bare metal) programming. Generally speaking, C is a fairly simple language, but in my line of work you better know damned well what various keywords mean and how they behave (like volatile) and know pointers backwards and forwards.
Re: (Score:2)
MIPS ASM or GTFO
Grandpa, get off Slashdot. It's time for your Metamucil.
Re: (Score:2)
Grandpa, get off Slashdot. It's time for your Metamucil.
Spoken by someone who probably thinks javascript is an awesome language.
Actually, no I don't. But it does have some awesome libraries, like d3. [d3js.org]