Thanks to AI, the Hottest New Programming Language is... English (analyticsindiamag.com)
"Generative AI is transforming software development by enabling natural language prompts to generate code, reducing the need for traditional programming skills," argues Analytics India magazine.
Traditionally, coding was the bastion of the select few who had mastered mighty languages like C++, Python, or Java. The idea of programming seemed exclusively reserved for those fluent in syntax and logic. However, the narrative is now being challenged by natural language coding being implemented in AI tools like GitHub Copilot. Andrej Karpathy, senior director of AI at Tesla, predicted this trend last year.... English is emerging as the universal coding language.
NVIDIA CEO Jensen Huang believes that English is becoming a new programming language thanks to AI advancements. Speaking at the World Government Summit, Huang explained, "It is our job to create computing technology such that nobody has to program and that the programming language is human"... He calls this a "miracle of AI," emphasising how it closes the technology divide and empowers people from all fields to become effective technologists without traditional coding skills... "In the future, you will tell the computer what you want, and it will do it," Huang commented. Large language models (LLMs) like OpenAI's GPT-4 and its successors have made this possible...
Microsoft CEO Satya Nadella has been equally vocal about the potential of English for coding. Microsoft's GitHub Copilot, an AI code assistant, enables developers to describe their needs in natural language and receive functional code in response. Nadella describes this as part of a broader mission to "empower every person and every organisation on the planet to achieve more".... In a discussion earlier last year, Stability AI CEO Emad Mostaque claimed, "41% of codes on GitHub are AI-generated"...
In 2024, the ability to program is no longer reserved for a few. It's a skill anyone can wield, thanks to the power of natural language processing and AI.
"No longer is the power to create software restricted to those who can decipher programming languages," the article concludes. "Anyone with a problem to solve and a clear enough articulation of that problem can now write software."
Although the article also includes this consoling quote from Nvidia's Huang in March: "There is an artistry to prompt engineering. It's how you fine-tune the instructions to get exactly what you want."
Soo, fuzzy meaning is now enough? (Score:5, Interesting)
Does nobody write real code that does exactly what you want it to do anymore?
Ah, I see they interviewed some of the usual asshole CEOs. So this "story" is actually without meaning.
Re: (Score:2)
I take it that you haven't yet written up a few prompts to tell an AI what code you want it to write. Which would take it maybe 15 seconds. You might be surprised.
Re: (Score:2)
By the time I am finished telling an AI what code I want it to write, I am finished with the code.
Unless I am writing a bash script (then I use vim), I use an IDE. Autocompletion in an IDE writes the code I want, and does not give me odd variable names that I have to refactor to get readable code.
But if you have a good hint for a "coding AI" I might try it out :P
Re: (Score:2)
I don't know what kind of work you do, and a coding assistant may not be helpful in all situations, but have a look at Windsurf.
https://codeium.com/windsurf/ [codeium.com]
Re: Soo, fuzzy meaning is now enough? (Score:2)
Well yeah, I never made any jokes about eve online. And why would I bother with a live subscription? I don't even have an Xbox.
Also didn't you mean to say "Zats not funny!"?
Re: Soo, fuzzy meaning is now enough? (Score:2)
Oh, also I've never once made any effort to learn that filthy pig latin, so I didn't particularly care how good or bad you think it is.
Re: Soo, fuzzy meaning is now enough? (Score:2)
Germans don't even know what jokes are.
https://youtu.be/DX8K8P_iUiQ [youtu.be]
Re:Soo, fuzzy meaning is now enough? (Score:5, Interesting)
Unfortunately, natural languages had to be ambiguous or the squeamish would be offended by the tiniest mistake, so ambiguity was a required feature of natural languages, as it had to be possible to wriggle out of your blunders.
"Talking" to computers was different. Computers must not be allowed to wriggle out of their blunders, as no one would buy them if they could. Preferably there should be no wriggle room for blunders in the first place.
It took relatively few monster screw-ups for key people to realise that, if the data was to have an impact on real life, something must be done - That something was SQL!
Re: Soo, fuzzy meaning is now enough? (Score:2)
Re:Soo, fuzzy meaning is now enough? (Score:4)
Ah, yes. That mess caused by absolutely frigging idiots. The same morons believed that you do not need to know how to code to solve complex problems in COBOL. This stupidity is incidentally still around, in the form of "no code" "coding" environments. Does not work, never worked, cannot work. Insight into the computing mechanism and the algorithms used is not optional.
Re:Soo, fuzzy meaning is now enough? (Score:5, Insightful)
I feel like there's a bunch of people gaslighting me.
I've been through many iterations of the post-GPT-3.5 cycle: folks saying "it can code what I want", me trying it, it failing to impress, and then people telling me "well of *course* that old model didn't work well, but this *new* model works well", and then getting the same result.
If it is a prompt that would pop up a ready to go stackoverflow answer in an internet search, or a library that exactly does what I was looking for, then the LLM code generation can do a passable job... sometimes. However in those cases, I could have just done an internet search and been in better shape anyway.
Problems beyond the obvious failing to work or not being able to do much:
It loves to invent convenient-sounding attributes/functions in existing libraries that would be awfully convenient, except they don't actually exist. I've started fielding quite a few questions from people asking why their attempt to call a non-existent function fails, but their LLM said that was how to do it (see the sketch at the end of this comment).
There were security implications in one attempt, and if the code it suggested had worked as described (it had invented a non-existent attribute, as above), then it would have had a glaring security vulnerability.
When it would suggest use of a library, it had to be double checked. It has the "stack overflow" problem of tending to suggest "the hot way to do it in 2013", without realizing that by 2020 that library was defunct and is no longer compatible with the use case in 2024.
Related to that, when it bakes in logic that you could have gotten from a library, *particularly* when dealing with third party internet served APIs, then you have a problem where no one is maintaining your client "library", and changes by that third party will catch you by surprise at the worst time.
It's like dealing with an offshore junior developer that scammed their way in without any actual training but worse, because you can't really teach it to do better.
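For anyone who hasn't hit that first failure mode, here is a hypothetical illustration in Python. The method drop_outliers is deliberately made up; pandas has no such method, which is exactly the kind of plausible-sounding invention described above.

```python
# Hypothetical illustration of an LLM-invented API call.
# pandas DataFrames have no method named "drop_outliers"; the name is
# plausible-sounding but made up, so the call fails at runtime.
import pandas as pd

df = pd.DataFrame({"x": [1, 2, 3, 250]})

try:
    cleaned = df.drop_outliers(column="x", zscore=3)   # AttributeError
except AttributeError as err:
    print(err)  # 'DataFrame' object has no attribute 'drop_outliers'

# What actually works: filter explicitly, keeping rows within 3 standard
# deviations of the mean.
cleaned = df[(df["x"] - df["x"].mean()).abs() <= 3 * df["x"].std()]
print(cleaned)
```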
Re:Soo, fuzzy meaning is now enough? (Score:5, Insightful)
Yep, that is a nice selection of some of the problems with letting AI "write" "code".
But these people are not gaslighting you. They are really too stupid to see how bad the quality of AI "generated" code actually is, and they deeply, deeply want to believe it can code well. Why? Clearly they are too incapable to code well by themselves, so they now want this one magic tool that will make them and their code not suck. Of course that is stupid in the second degree, because if AI could actually code well, these people would simply lose their jobs.
Re: (Score:2)
Maybe you just weren't articulating what you want properly.
Re:Soo, fuzzy meaning is now enough? (Score:4)
"And don't make up function calls that don't exist"
"Don't create glaring security holes"
"Double check that the cited libraries are actually still usable"
"Please commit your suggestions to a library on github and maintain it properly so I don't have to rewrite my code"
I don't think LLMs can follow those instructions, and to the extent they could, why would those not be the default behaviors?
Also, at some point, trying to get an LLM codegen to do what I want is more trouble than just doing it.
Now I could see potential for LLM handily augmenting code completion, as it becomes more seamlessly integrated. Current code completion is pretty good, with the LSP implementations out there to help, but there are occasions where even a braindead person would know where a segment of typing is going beyond what current code completion can really help with, and that's an area I could believe in LLM providing value.
Re: (Score:3)
Also, at some point, trying to get an LLM codegen to do what I want is more trouble than just doing it.
And you know what? Doing so makes you better at it and you may even learn some things on the side! One of the severe problems I see with AI code generators is that they make it far harder for people to learn coding. At the same time, they cannot actually replace that skill.
Re: (Score:2)
You can easily tell the coding assistant to generate a test case that exercises the code. I find it is particularly useful when working on code related to data science.
"Write a class that takes a data frame as input, swizzles it in the following way, provides methods that expose these aspects of the data, and be able to generate these kinds of plots that show the various distributions of the data."
Obviously it isn't going to check your code into github and respond to pull requests, that's your job as the su
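For concreteness, a minimal sketch of the kind of class (plus test case) that prompt describes, assuming pandas and matplotlib; the column names and the "swizzle" step are placeholders, since the real prompt was not included.

```python
# Minimal sketch of a data-frame wrapper class of the kind described above.
import pandas as pd
import matplotlib.pyplot as plt


class FrameSummary:
    """Wraps a DataFrame and exposes simple summary views of it."""

    def __init__(self, df: pd.DataFrame):
        # Placeholder "swizzle": drop all-null columns and sort column order.
        self.df = df.dropna(axis=1, how="all").sort_index(axis=1)

    def numeric_summary(self) -> pd.DataFrame:
        """Basic statistics for every numeric column."""
        return self.df.describe()

    def plot_distributions(self) -> None:
        """One histogram per numeric column."""
        self.df.select_dtypes("number").hist(bins=20)
        plt.tight_layout()
        plt.show()


def test_frame_summary():
    # The sort of test case an assistant typically generates alongside it.
    df = pd.DataFrame({"a": [1, 2, 3], "b": [2.0, 4.0, 6.0]})
    summary = FrameSummary(df)
    assert list(summary.df.columns) == ["a", "b"]
    assert summary.numeric_summary().loc["mean", "a"] == 2
```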
Re: Soo, fuzzy meaning is now enough? (Score:2)
Write a class that takes a data frame as input, swizzles it in the following way, provides methods that expose these aspects of the data, and be able to generate these kinds of plots that show the various distributions of the data.
Any AI assistant is going to fail spectacularly with a prompt like this. And worse, if you rely on it, there's fuck all you'll be able to do when it inevitably doesn't work right.
Let's start at the beginning: A class takes a dataframe as an input. An input to what? One of its fields? One of its setters? One of its constructors?
Oh wait, you probably mean a Python style "class" instantiation, which only has one constructor. Which is fine, I guess, until you run into one of the many libraries (e.g. sq
Re: (Score:2)
>> A class takes a dataframe as an input. An input to what? One of its fields?
I didn't include the actual prompt here, but pandas dataframes have a standard format and can be accepted as-is by many utilities. You can easily dump a database into a dataframe with a few lines of code for example, and pass that right along to various analysis engines. I've used coding assistants to generate complete classes that do this kind of thing successfully on several occasions and they are in my toolbox.
>> Le
Re: Soo, fuzzy meaning is now enough? (Score:2)
I didn't include the actual prompt here, but pandas dataframes have a standard format and can be accepted as-is by many utilities. You can easily dump a database into a dataframe with a few lines of code for example, and pass that right along to various analysis engines. I've used coding assistants to generate complete classes that do this kind of thing successfully on several occasions and they are in my toolbox.
You're not doing software development in this case, you're scripting. In other words, you're stitching together existing code somebody else wrote without having any understanding of how it works, and then you're relying on the AI to do very basic flow control. Pandas in particular is pretty inefficient, by the way. At its core it's not even Python, it's C. But I'm guessing efficiency isn't a goal, otherwise you'd opt for something like Polars. Which means you're also unlikely to be doing anything particular
Re: (Score:2)
>> you're stitching together existing code somebody else wrote without having any understanding of how it works
You are blabbering nonsense. The generated code is in plain sight and I fully understand it. Any developer is going to be looking at code other people wrote on a regular basis and debugging or enhancing it. But as I said, don't use this tech. Who cares?
Meanwhile "92% of U.S.-based developers are already using AI coding tools both in and outside of work."
https://github.blog/news-insig... [github.blog]
Re: (Score:2)
I have had a whole class of students with relevant experience do it, repeatedly.
Your "take" is bereft of insight and so are you. Pathetic as well. Also, you might want to look up what a "fallacy" is. Here is a hint: It just makes you sound dumb.
Re: (Score:2)
So you haven't done it, and all you've got is lame insults.
Re: (Score:2)
If that is your take here then you are _really_ dumb. I will ignore you now, you have no insight, no understanding and a big ego on top.
Re: (Score:2)
You continue to demonstrate that all you've got is a nasty disposition.
Maybe a well documented algorithm (Score:2)
I take it that you haven't yet written up a few prompts to tell an AI what code you want it to write. Which would take it maybe 15 seconds. You might be surprised.
I would not be surprised if it could handle some data structures class quiz questions. Write a few lines of code based on some well-documented algorithms: Given an array of foo objects, Perform an insertion sort using the member foo.z.
Then again, did it just copy a solution from Stack Overflow, or did it correctly understand the instructions and "reason" out a solution itself?
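For reference, a quiz-level Python sketch of exactly that prompt; Foo here is just a stand-in class.

```python
# Insertion sort over a list of foo objects, ordered by the member foo.z.
class Foo:
    def __init__(self, z):
        self.z = z


def insertion_sort_by_z(items):
    """Sort a list of Foo objects in place by their .z member."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Shift larger elements right until current.z fits.
        while j >= 0 and items[j].z > current.z:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current
    return items


foos = [Foo(3), Foo(1), Foo(2)]
print([f.z for f in insertion_sort_by_z(foos)])  # [1, 2, 3]
```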
Re: (Score:2)
>> Given an array of foo objects, Perform an insertion sort using the member foo.z.
Nothing wrong with that, you saved yourself some time. In my experience the existing code assistants can write a very useful class for you if you can accurately describe what you want. It will be well documented and there will be a test case.
Re: (Score:2)
>> Given an array of foo objects, Perform an insertion sort using the member foo.z.
Nothing wrong with that, you saved yourself some time. In my experience the existing code assistants can write a very useful class for you if you can accurately describe what you want. It will be well documented and there will be a test case.
Many of us have certainly been familiar with code assistants with respect to GUI editors for decades. Design your views, place buttons, checkboxes, etc ... the assistant generates all the code to implement the GUI. Compile, run, you can navigate around. It does no work, but there are empty functions for all those buttons, checkboxes, etc waiting for you to add functionality.
Now we are getting tools to help with the simpler parts of that functionality.
Re: (Score:2)
Indeed. And then you get badly broken security caused by an absolutely stupid beginner's mistake.
Problem domain (Score:4, Insightful)
Generative AI is about 30 years behind the curve. We were writing automated functional tests 30 years ago .... in English. And we didn't need a stinkin' warehouse full of NVIDIA chips to do it. A '386 or 68000 series processor was good enough.
One sentence debunks the article (Score:5, Insightful)
Most programming languages are now at a high enough level that programming with them is pretty much this same exercise. The key to software engineering is figuring out what the code ought to do in excruciating detail.
So the fact that there is some sort of prompt language that emulates a programming language is like saying there's a new high-level language/low-code approach around... not really earth-shattering. There have been many low-code platforms around for decades. They mostly work fine for small-scale solutions that don't really require reorganizing how people interact with one another and don't raise complicated questions about how to represent what people are doing in code and data.
Not only is the code generated by AI of unclear reliability and maintainability (a typical problem with automatically generated code), there is essentially no concept of problem definition or design in it whatsoever.
This is typical later-stage-tech-hype-cycle kind of stuff. Color me unimpressed.
Re: (Score:1)
Re:One sentence debunks the article (Score:4)
Back in the 6-character-identifier, pre-ASCII assembly language era, we joked about a miraculous DWIM macro: Do What I Mean. I am afraid that occurrence-frequency language "understanding" just advances us to the next iteration of the Tree Swing Diagram [businessballs.com] process. It's not worthless, but it deserves skepticism.
Will this work? (Score:3)
Ok Chat-GPT, code me a relational database management system with full transaction support, indexing, constraint management, stored procedures, triggers, user security management compatible with windows active directory, and full support for ANSI SQL.
Re: (Score:3)
Wouldn't that be something.
My fantasy use is telling the AI to write a nwn:ee module with my described story. Ideally, it would ignore copyright as well and I could tell it to take the Wheel of Time or LoTRs and create a module out of it from the perspective of each character. Now that would be incredible. Or do that but make it a game that I could use one of those 3d headsets for. Really feel like you were in that world described on the pages.
I want to be able to tell AI to take my favorite book and make i
Re:Will this work? (Score:4, Insightful)
"AI, create me a Halodeck and make it cheap to operate. Solve world hunger and global warming while you are at it. Without killing all the humans."
Kill all but one human (Score:4, Funny)
Kill all but one human, as you asked.
Re: (Score:2)
That would be something. But I think long before we develop that there will be an "AI" system that can interpret "Give me an animated movie of at least 15 minutes featuring Queen Amidala naked and covered in hot grits, being molested by Jar Jar Binks and filmed by CowboyNeal." We're getting disturbingly close to it.
Re: (Score:2)
Ok Chat-GPT, code me a relational database management system with full transaction support, indexing, constraint management, stored procedures, triggers, user security management compatible with windows active directory, and full support for ANSI SQL.
Currently, with a more modest goal, the following iterative approach works:
Iteration 1: "Write a python program that can play wordle - i.e. play the guessing player".
Iteration 2: "Now modify the code to allow me to pass on a guess".
etc.
This is similar to the Agile mode of development where the programmer plays a BA role, shows the product owner working code, and then receives feedback for more modification, iterating towards something which satisfies the owner.
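A minimal sketch of what "iteration 1" above might produce, assuming a tiny built-in word list (a stand-in for a real dictionary) and deliberately simplified handling of duplicate letters.

```python
# Minimal wordle guesser: narrow a word list from green/yellow/gray feedback.
WORDS = ["crane", "slate", "grape", "plate", "trace"]


def consistent(word, guess, feedback):
    """True if `word` could still be the answer given `feedback`
    ('g' = right spot, 'y' = in word elsewhere, 'b' = absent)."""
    for i, (g, f) in enumerate(zip(guess, feedback)):
        if f == "g" and word[i] != g:
            return False
        if f == "y" and (g not in word or word[i] == g):
            return False
        if f == "b" and g in word:   # simplification: ignores duplicates
            return False
    return True


candidates = WORDS[:]
guess = candidates[0]
print("guess:", guess)                      # e.g. "crane"
feedback = input("feedback (g/y/b): ")      # e.g. "bygbb"
candidates = [w for w in candidates if consistent(w, guess, feedback)]
print("remaining candidates:", candidates)
```

Iteration 2 ("allow me to pass on a guess") would then be a small change on top of this, which is exactly the agile-style loop the comment describes.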
Re: (Score:2)
That is not how agile programming works.
Facepalm.
Ok Chat-GPT, code me a relational database management system with full transaction support, indexing, constraint management, stored procedures, triggers, user security management compatible with windows active directory, and full support for ANSI SQL.
No one would start coding on something like this.
Iteration 1: "Write a python program that can play wordle - i.e. play the guessing player".
Iteration 2: "Now modify the code to allow me to pass on a guess".
No ne
Re: (Score:2)
That is a fine example of two things:
a) Something that I fully believe is within the reach of LLM codegen solutions.
b) Something that has a ton of ready-to-go projects providing that already on github, and you don't need an LLM to generate it.
That has been my experience, that in experiments it *can* handle little demonstrations like you describe, but anything it can wrangle is readily offered in the very first link of an internet search.
Re: (Score:2)
Logic and reason is hard (Score:3)
Formalization is tough in any language, especially English. There are functional languages that offer concise syntax for very common types of definitions that can be a real pain in the ass to correctly define in a natural language. In a way, AI letting us use the wrong tool for the job is similar to the revelation that a crescent wrench is also a hammer.
Re: (Score:1)
But it is a wrench, bug killer, 'whack-a-mole' device, 'just give it a tap' device, and a hand-held or thrown object for self defense.
The latter usually against spiders.
Re: (Score:2)
... 'just give it a tap' device...
So, a hammer....
Cool, but then what? (Score:5, Interesting)
Assume that I have successfully managed to get WhateverGPT to make me a nice program and shipped it to customers.
Now someone files a bug report or a feature request... how do I go about debugging and/or extending this program?
Hopefully the answer is not just "tweak your prompts and regenerate from scratch" (which seems just as likely to generate new bugs as to fix existing ones) or "read and understand the generated codebase so that you can manually fix/update the code yourself" (which would largely defeat the purpose of using AI-generated code, since then you're back to needing to be a Real Programmer)
If I can assign the new Jira cases to the AI and have it read the reports and modify its existing codebase appropriately to address them.... that would be genuinely useful.
Re:Cool, but then what? (Score:5, Insightful)
If I can assign the new Jira cases to the AI and have it read the reports and modify its existing codebase appropriately to address them.... that would be genuinely useful.
It would be, but the generative error generators can't do that (and never will) because they can't extract meaning. As a software developer, writing syntactically correct code is the least important part of my job. The most important parts are understanding requirements and the logic of why and wherefore.
Writing the code is trivial. Knowing why the code needs to be written is far more important. And despite NVidia's sleazy profit motives*, no generative error generator can reason.
* Being motivated by profits is not sleazy. Lying and deceiving to any degree necessary to make such profits, like NVidia's CEO does, is sleazy.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Those people are trained differently. They are instructed to ask, "do you want fries with that?" They are not tasked with writing software. And if they are, they quickly earn the title, "Unemployed."
Re: (Score:2)
You don't necessarily have to ship it to customers. Much programming happens internally; many businesses rely on kludges written in Excel. I know people without programming knowledge who managed to have ChatGPT make a GUI in python/tk, for some internal workflow. Something that previously they would have hired someone for a few hours to do. The "someone" was making easy money writing simple python for noobs. Now a ChatGPT subscription does this for a fraction of the cost.
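A minimal sketch of that kind of python/tk utility, assuming a trivial internal workflow (prefixing file names in a folder); the actual tools people generate will of course differ.

```python
# Tiny tkinter utility: pick a folder and add a prefix to every file name.
import os
import tkinter as tk
from tkinter import filedialog, messagebox


def add_prefix():
    folder = filedialog.askdirectory()
    if not folder:
        return
    prefix = prefix_entry.get()
    for name in os.listdir(folder):
        os.rename(os.path.join(folder, name),
                  os.path.join(folder, prefix + name))
    messagebox.showinfo("Done", f"Renamed files in {folder}")


root = tk.Tk()
root.title("Prefix renamer")
tk.Label(root, text="Prefix:").pack(side="left", padx=5, pady=5)
prefix_entry = tk.Entry(root)
prefix_entry.pack(side="left", padx=5)
tk.Button(root, text="Choose folder and rename",
          command=add_prefix).pack(side="left", padx=5)
root.mainloop()
```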
Re: (Score:2)
I know people without programming knowledge who managed to have ChatGPT make a GUI in python/tk, for some internal workflow.
In that case, their "customers" are their co-workers, or even just themselves, but the problem otherwise remains the same. In particular, if their "customers" find the program useful, it's very likely they will want to make it better, and will ask them to make that happen. What will they do then?
Better work on your grammar, kids. (Score:2)
That's it, really.
The real motivation (Score:5, Insightful)
It's as if creating more buzz around AI somehow benefits nVIDIA's share price. Who knew?
Mandarin (Score:2)
Re: (Score:2)
Re: (Score:2)
And most of the Chinese translations of English I see are full of errors.
Such as this one [imgur.com]? (it's supposed to be jerked chicken)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
The upper half of the Hanzi is missing, so there is no point in guessing.
What the OP was referring to just translates to Sichuan Spicy Chicken.
Re: (Score:1)
Congratulations, you made me laugh :P
Re: (Score:2)
Re:Mandarin (Score:4)
To be fair, English is a bastard language that must cause a lot of learners to think "WTF were they thinking" at some point.
Re: Mandarin (Score:2)
Re: (Score:1)
That is not a problem. That is a benefit. Every pictogram has its own Unicode code point ... you do not even need to parse anything to know what it is.
If you are scared of Hanzi/Kanji, watch the videos linked here: https://www.chineasy.com/ [chineasy.com]
Re: (Score:2)
There are over 100,000 different Chinese characters
In terms of clear communication with AI, I would think that could be an advantage. But I suspect most people are going to use their native language to communicate with AI if that is an option. It's not clear to me whether that will be an option with AI trained to specific languages.
Re: Mandarin (Score:2)
Re: (Score:2)
Only around 4000 Hanzi/Kanji (actually fewer) are in use.
There are three types of Hanzi:
Pictogram/pictograph: even if it is loosely sketched, it is a picture of what it means. Examples: one, two, three, mountain, river, sun, moon, human, mouth.
Ideogram: often two pictures combined, giving an idea of something that is not necessarily obvious. Example: a human inside a mouth (well, the "mouth" in this case is just a fence): a prisoner. Or a king inside a much bigger mouth/fence: kingdom/empire.
Logograph: "logos", Greek for word
Re: (Score:3)
Well my reply did get distracted from the original issue of using human language for programming and if different human languages would factor into that idea. I have spent several years trying to learn Russian, Mandarin and no
Re: (Score:2)
The German language has genders for non-living objects. Yet, for a native English speaker, learning German is orders of magnitude easier than either Mandarin or Japanese.
Hell, as someone whose native language is not English, I've noticed that even English has genders for non-living objects... for example, ships.
Re: (Score:1)
Esperanto or Japanese would be easier.
I am not super sure, but I think the same holds for Korean/Hangul.
I would think it is likely that Mandarin will be the new programming language given that is the most common language in the world
For an AI, perhaps. As most programming languages are themselves English-based, I do not think too much about that.
Keep in mind: Mandarin is only the "government language"; in daily life, Chinese speak and use their local language. For random "Chinese" from all over China, using Mandarin as
Words matter (Score:5, Insightful)
How can they be so clueless? (Score:2)
Re: (Score:3, Insightful)
"A Clear Enough Articulation of That Problem" (Score:5, Insightful)
Anyone with any experience in programming will tell you that's where most of the effort lies -- articulating the problem. Even with constructed languages -- COBOL, FORTRAN, LISP, Smalltalk, C, Pascal, Modula-2, Java, C#, OCaml, Haskell, Rust -- all of which were designed to eliminate ambiguity and force the programmer to articulate operations very precisely, we still spend most of our time discovering where the holes in our mental models are. Those are the holes where bugs creep in, or where the user is unpleasantly surprised by unexpected behavior, which may be logical from the program's perspective, but isn't what the user expected/wanted -- because the problem wasn't articulated precisely enough.
"Programming" in English will not help. It might get you to a crude tech demo, but that's about it. It's not a tool. It's a parlor trick.
Re: (Score:2)
LAWYERS. maybe they can do it. Just think of all the issues we have with laws, regulations, and policies all written and resolved with English...
"Youâ(TM)ll never find a programming language that frees you from the burden of clarifying your ideas."
-Randall Munroe
https://xkcd.com/568 [xkcd.com]
Re: (Score:2)
(*derisive snort*)
I thought of putting an aside in my original comment touching on that very idea -- that, even with centuries of practice, laws written in English still have uncovered edge cases and exploitable loopholes. Indeed, if all laws were articulated perfectly clearly, unambiguously, and covered all edge cases, we'd need far fewer lawyers.
But I chose not to put that in, because I thought it was already blazingly obvious. (Indeed, one could argue that the current
Re: (Score:2)
Oh no, even among college students the obvious frequently gets missed by a few. What passes for "reporters" today is far worse, and their readers, who barely skim an old tweet's worth of depth while distracted, miss so much that they'd elect a con-man rapist failure, twice.
Re: (Score:1)
we still spend most of our time discovering where the holes in our mental models are.
Exactly.
Umm no (Score:2)
Prompting is a horrible way to make anything. IF they could make it good at generating TEMPLATES, that would be good. For example, instead of generating an image, generate some Photoshop layers. Or a 3D file containing the objects and animation paths that can be modified. The biggest flaw with having AI do things other than generating programming code is the lack or difficulty of post-generation modifiability.
thats you lot (US) fooked then (Score:2)
Re: (Score:2)
It's the Holodeck (Score:2)
Old promises (Score:2)
"... nobody has to program ...programming language is human ..."
I've found the real product: any high-school drop-out can give a computer orders and the computer "will do it". Businessmen have been promising exactly that for 40 years, and 'easy' answers since computers were invented.
Inventing and using block-structured, object-oriented, functional languages was wrong and a waste of man-power: we just need a computer that can take orders.
Those languages exist because giving a computer orders is difficult, because treating everything as a number is not easy t
Future or now (Score:5, Insightful)
NVIDIA CEO Jensen Huang believes that English is becoming a new programming language
Microsoft CEO Satya Nadella has been equally vocal about the potential of English for coding
Andrej Karpathy, senior director of AI at Tesla predicted this trend last year.... English is emerging as the universal coding language.
These are all quotes about what AI could do in the future. They are not quotes about what AI can do now. The difference is important, because when AI marketers talk in BS mode, they frame it in the future (because then they are technically not wrong, because they don't have to talk about current capability).
Meanwhile, a quote from Alan Perlis [yale.edu]:
93. When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop. COBOL did the same thing.
Re: (Score:2)
Why English, guys? Why not Mandarin, Mr. Jensen Huang? Why not Telugu, Mr. Satya Nadella? Why not Slovak, Mr. Andrej Karpathy?
Sit back and watch the fires (Score:3)
Oh those precious non-programmers and their thoughts.
must be some interesting dialect (Score:4, Interesting)
Spent nearly a half hour last night trying to get ChatGPT to stop referencing Java classes in Javascript code it generated. Completely futile attempt. It just doesn't understand what's being asked of it, period. Even for small stuff for which it usually does OK, like shell scripts and Python, you better use source control and check every single diff. It often brings back old problems or creates new ones, along with sometimes fixing the one you asked it to.
Re: (Score:1)
Well,
perhaps the AI was WaySmarterThanYou?
Java and JavaScript are interwoven. It is completely legal to refer to Java from JavaScript, and if you know how to do it, vice versa.
You probably missed a simple check box: exclude Java. Or something similar.
ENGLISH was the name of a real computer language (Score:2)
Dick Pick's "Reality" OS and ENGLISH programming language. Surely some slashdot readers know a bit about computer history...
https://en.m.wikipedia.org/wik... [wikipedia.org]
Its very simple (Score:4, Insightful)
If you can't articulate in English what you want to do then you probably won't be able to write the code properly yourself.
There are a number of AI code generators available these days that can be embedded in your everyday IDE. You can pull up a panel and tell it what code you are planning to write and it will generate a reasonable facsimile in seconds. You can even just write a block of comments that explains your intentions and it will do it all as a code completion. Will it be perfect? Probably not, but you can then tweak it however you want, which is an awful lot faster than writing the whole thing yourself.
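A sketch of that comment-driven style: the intent is written as comments, and the function underneath is the kind of first draft a completion typically fills in (the task and names here are illustrative, not tied to any particular assistant).

```python
# Read a CSV of orders, keep only rows from the last 30 days,
# and return total revenue per customer as a dict.
import csv
from datetime import datetime, timedelta


def revenue_per_customer(path):
    cutoff = datetime.now() - timedelta(days=30)
    totals = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumes an ISO-format "date" column and a numeric "amount".
            if datetime.fromisoformat(row["date"]) >= cutoff:
                totals[row["customer"]] = (
                    totals.get(row["customer"], 0.0) + float(row["amount"])
                )
    return totals
```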
reminiscent of Hypercard (Score:1)
The article reminds me of when HyperCard was released in 1987.
HyperTalk? AppleScript? (Score:1)
Strictly speaking, they could just use AppleScript or a similar X-Talk language.
However, it proved that "expressing something" in "plain English" still requires you to know odd quirks of the interpreter. And those quirks are often difficult to memorize. So you spend more time googling the "proper English" for AppleScript than you would need to write it in JavaScript (or Swift, or whatever) right away.
I guess this "new AI" stuff is just a clever search engine. As probably 90% of any code anyone ever will
Refinement (Score:2)
We need to draw line at the term 'coding'. (Score:2)
- Stop lying.
AI generated code is a cargo cult (Score:2)
I support an SDK... The other day I had a customer make a case where they said they tried having AI generate code to accomplish a task but this particular class was "missing" a particular method.
Dude, our class isn't missing the method; your AI code generator is the digital equivalent of a cargo cult.
Of course I phrased it more... tactfully, but I may have in fact quoted his "I tried using AI to generate the needed implementation but it didn't work" and responded "I am not at all surprised" and then explai
English is a terrible programming language (Score:3)
Asking someone to do something with precision using plain English is hard. It takes skill on both sides, and it usually involves followup questions and supervision. English is not very precise without being verbose and full of technical terms, which require a lot of expertise, more than mastering a programming language.
What many people don't realize is that programming languages are simply orders of magnitude simpler than any natural language. A beginner can learn the basics of a programming language in a few days; for an experienced programmer, it can be less than an hour. It is one of the easiest parts of programming.
Programming is hard, and programming languages make the task easier, not harder. This is because what programming really is, is making a machine do exactly what we intend it to do, and machines are dumb, so programmers have to be extremely precise. With current-day AI, machines are still dumb, just less obviously so. They can do some simple tasks right without precise instructions, but they will screw up at the first slightly unusual thing, and do so without telling you. So if you want the machine to do a good job, you still need precision, and this is best achieved by someone skilled in that task (a programmer), using the right tools (including a programming language).
Software Engineers jobs (Score:2)
It irks me that people who do not understand what a job is think AI can do it. A developer probably spends about 10% of their time actually writing code. The rest of the time is spent figuring out what the users need/want, or modifying/debugging existing code.
The joke is "AI will replace developers, as soon as users are able to tell it exactly what they want."
Also, wasn't COBOL, the "english-like" language, supposed to make programmers obsolete?
IMHO we should create an AI to replace VCs and CEOs - should be
yes but no (Score:2)
Ok, I don't code as part of my living any more. But when I do need some small amount of code, and need help, an AI has been very helpful. HOWEVER, the provided code never works. I'm the thing that figures out WHY it doesn't work, although once I do, I have been able to get the AI to fix their code.
This is not going to work for anything complicated, though. If you have to consider how you are going to trap 50 possible errors that might be thrown, I don't think an AI is going to save you. Nor if you have
That's what the COBOL designers said too (Score:2)
They wanted to create a programming language that was very English-like. They succeeded neither in building a good programming language nor in making it English-like. Though AI might be better at interpreting English, the issues remain, such as the necessity to be explicit and to build a lexicon of terms that have specific meanings.
"Make me a program that calculates debt-to-income ratio." Buried in that sentence is a ton of understanding of legal regulations, economics, and use case specifics. It's still going t
not about coding (Score:2)