Tech Companies To K-12 Schoolchildren: Learn To AI Is the New Learn To Code

theodp writes: From Thursday's Code.org press release announcing the replacement of the annual Hour of Code for K-12 schoolkids with the new Hour of AI: "A decade ago, the Hour of Code ignited a global movement that introduced millions of students to computer science, inspiring a generation of creators. Today, Code.org announced the next chapter: the Hour of AI, a global initiative developed in collaboration with CSforALL and supported by dozens of leading organizations. [...] As artificial intelligence rapidly transforms how we live, work, and learn, the Hour of AI reflects an evolution in Code.org's mission: expanding from computer science education into AI literacy. This shift signals how the education and technology fields are adapting to the times, ensuring that students are prepared for the future unfolding now."

"Just as the Hour of Code showed students they could be creators of technology, the Hour of AI will help them imagine their place in an AI-powered world," said Hadi Partovi, CEO and co-founder of Code.org. "Every student deserves to feel confident in their understanding of the technology shaping their future. And every parent deserves the confidence that their child is prepared for it."

"Backed by top organizations such as Microsoft, Amazon, Anthropic, Zoom, LEGO Education, Minecraft, Pearson, ISTE, Common Sense Media, American Federation of Teachers (AFT), National Education Association (NEA), and Scratch Foundation, the Hour of AI is designed to bring AI education into the mainstream. New this year, the National Parents Union joins Code.org and CSforALL as a partner to emphasize that AI literacy is not only a student priority but a parent imperative."

The announcement of the tech-backed K-12 CS education nonprofit's mission shift into AI literacy comes just days after Code.org's co-founders took umbrage with a NY Times podcast that discussed "how some of the same tech companies that pushed for computer science are now pivoting from coding to pushing for AI education and AI tools in schools" and advancing the narrative that "the country needs more skilled AI workers to stay competitive, and kids who learn to use AI will get better job opportunities."

Comments:
  • by ffkom ( 3519199 ) on Friday October 03, 2025 @03:30PM (#65701648)
    ... is not something anyone can give at this point. It might be that "schoolchildren" would be better off learning how to be plumbers, if that happens to remain a job where humans are still cheaper to employ than robots. Or maybe learning how to hide from or fight off Terminators becomes the most important human job at that future point in time. But it is relatively unlikely that "operating the currently available LLMs" will still be something they can get paid for a decade from now.
  • Learn to Code wasn't really about learning to code. If you're talking about the efforts aimed at young students, it was too much about pushing coding careers that weren't a reasonable match for everyone; if you're referring to the general advice given to future workers going through school, it created a lot of low-skill coders who also weren't actually suited for the occupation.

    Learn to AI is bullshit for a whole ton of reasons, but primary among them are 1) you're training what they want to replace you with and 2) if you don't actually know the subject, you can't figure out when AI is bullshitting you. Both of them make leaning on the computer inferior to learning on the computer. If you don't learn to code, you can't learn to AI code.

    • by Gordo_1 ( 256312 ) on Friday October 03, 2025 @04:45PM (#65701800)

      This. I am a shitty coder. I have vibe-coded applications that are far more complex than anything I could have created myself. But after a few rounds of added features and bugfixes, the LLM runs out of capacity to fix issues and gets stuck in a loop. Then I'm largely helpless to troubleshoot or add features myself, since I didn't write the code in the first place and it's too complex for me to troubleshoot on my own. What then? If I'd learned to code properly in the first place, I could have helped the LLM get over the inevitable bumps in the road...

      • There is hope. Learn the real basics of software engineering and you'll make do. AI can help you, but the hype will go away in a year or two. You'll need a real, non-vibe coding background by then, and people will need you... then.

        The ideas foisted in the post are plainly bad, but each of the sponsors is now roped into making the Kool-Aid look palatable. Sam Altman, leader of this Jonestown, will retire someplace offshore.

      • by Kokuyo ( 549451 )

        I am a shitty scripter, not even a coder, and I don't think I've ever been given a PowerShell script that just did what I wanted.

        I have much more success asking for the basic framework of the logic and then filling it in myself with proper code.

        I do ask what command to use a lot. If you don't know what the thing you're looking for is called, it's often damned hard to find the appropriate syntax. Or I'm unaware there is already a command for the concept I want to code. When it isn't hallucinating, that's whe

    • by Ogive17 ( 691899 )
      My son took some coding classes in elementary school. In my opinion, it was a good method for teaching logic and how to solve more complex problems through planning. The actual coding was very minimal.

      Learn to AI is dumb, I agree. If it's such a great technology, 5 to 10 years from now it will be extremely easy to use.
    • The high-end mathematics is tough, but the rank-and-file programmer doesn't do that; they use libraries for it.

      Learn to code was about flooding the job market with labor to drive down wages.

      CEOs and elites doing that isn't something any of us like to think about, so we all just kind of pretend it's not a thing. The idea that somebody as powerful as Jeff Bezos is literally out to get us, not personally, just as a group, but still, is kind of a cosmic-horror level of problem.

      So it's just best not to
  • by gweihir ( 88907 ) on Friday October 03, 2025 @03:35PM (#65701668)

    Same as the old bullshit fetish, just a bit less useful.

    Seriously, you can judge how smart a kid is by whether they fall for this nonsense or not.

  • by MpVpRb ( 1423381 ) on Friday October 03, 2025 @03:49PM (#65701696)

    The correct advice:
    Learn as much as you can about a variety of subjects
    Identify your talent, and focus on things you are good at
    Work really hard to get really good at whatever you have talent for

    • If you're good at math but not in the top 10% of geniuses rocking a master's or doctorate, there are a couple hundred million guys like you and not nearly enough jobs to go around.

      If you don't have great eyesight, good color vision, and extremely steady hands, you're not going to make it as a doctor or a dentist, no matter how hard you work.

      There are just some people who are now going to be left behind without any useful work for them to do. But there are some people who we are absolutely st
  • by Anonymous Coward
    How do the code.org people feel about their lives knowing that they are making the world a worse place all the time? In an era where anyone can get a lightweight Linux computer for like $20 that would absolutely smoke just about any workstation made during the Real Unix era, and pretty much all the software you'd want is available for free, plus learning resources aplenty and even ways of asking professional programmers for help when they get stuck, there is absolutely no need to turn computer science into
  • No. (Score:5, Insightful)

    by ebunga ( 95613 ) on Friday October 03, 2025 @03:56PM (#65701706)

    Learning to code is still learning to code. At the end you have learned something. At the end, you have accomplished something. Learning to use genAI is learning how to be too lazy to even copy-and-paste from Wikipedia. At the end you have learned nothing, and someone else has to suffer with whatever it is you gave them, even though you have no idea what it is or what it means. You have accomplished nothing, though you have generated something that looks like productive output, and the waste heat from the AI data center that generated your response actually has more value.

    • That's not completely true. Even modern LLMs have the potential, as they are, to be the greatest teaching tools ever created. They are already capable of supporting self-guided inquiry that can lead to greater mastery than mediocre instructor-led work. That will not change. (Note that I said CAPABLE, not that they are being, or must be, used in such a manner.)

      "Learning to AI" is not, fundamentally, any different than "Learning to Code" - if only because "Learning to Code" is not actually learning
      • Not disagreeing with the entirety of the second paragraph, but I'd say "learning to AI" is the polar opposite of learning to code.

        In programming, you learn to analyse problems, design solutions, and build them. "Prompt engineering", which I assume is what "learning to AI" boils down to, is about how to avoid doing all of that, and by extension also to avoid any learning. Building software is about leveraging a (relatively) new technology to empower yourself and come up with cool and useful stuff. Using LLMs
        • Could not agree more. Creating dependency on MY BRAND AI is the purpose. Teaching people to think only in the idioms of MY BRAND SOFTWARE is a strategy proven by Microsoft and Apple.

          Apple captives defend their servitude with zeal. Microsoft captives just accept thin gruel as their lot in life. But the outcomes are the same: people who can no longer conceive of ideas beyond their tools.
    • And when you know what you're doing, you don't spend hours trying to fix invalid code coming from AI.
  • by swillden ( 191260 ) <shawn-ds@willden.org> on Friday October 03, 2025 @04:04PM (#65701716) Journal

    The whole point of AI is that it's supposed to be able to adapt to us, allowing us to give it direction in natural language and expect it to deal correctly with our ambiguities. While it's true that current-generation AI does require a learning curve, it's improving very rapidly, so anything you learn about how to use it today will be obsolete next year. "Prompt engineering" shouldn't ultimately be a thing at all, and if AI development stalls out at some point so that it actually is a thing people have to do a decade from now, it will not be what it is today.

    It makes sense to learn how to work around the idiosyncrasies and limitations of today's AI tools if you can use them to accomplish useful work today, but there's no point in learning those things in order to use the tools of 2035.

  • by ukoda ( 537183 ) on Friday October 03, 2025 @04:32PM (#65701760) Homepage
    I like the idea of AI but I am uncomfortable with the reality of how it is unfolding.

    Learn to code was a reflection of the rise of the power of the individual in the 1990s. The mainstreaming of the Internet allowed individuals to freely communicate and share ideas. The rise of coding tools like gcc allowed individuals to create new things. The rise of open source allowed those individuals to create communities that further improved what individuals could do. All of these things gave individuals the freedom to create and achieve things they wanted under their own control.

    Enter AI. At first you had things like ML, where individuals could train models with resources at home and potentially create a useful solution to a niche problem. But then it pivoted to AI being all about LLMs, with massive resources used to create the latest models. The power to do this was only in the hands of corporations with deep pockets. The role of the individual in this new vision of AI was to be the consumer. 'Learn To AI' is just a way of saying learn to pick which corporation you will be paying your subscription to.

    So I am learning to use AI to ensure I can best understand its current abilities and limitations, but I am not excited to do so. AI had so much promise back when it was science fiction, but the reality today is depressing. I would be interested in replies that see a different path for AI, one that gives some control back to the individual.
    • Thank you for posting that - I wholeheartedly agree. Your post is a very succinct outline of what I see as absolutely central problems with where the technology seems to be headed. I worry a great deal about the rush to deploy it in contexts where it is not ultimately beneficial or desirable, may disempower or impoverish individuals, or may degrade human communication and societal cohesion.

  • "the Hour of AI will help them imagine their place in an AI-powered world," train you to be a user, trapped in your incomplete knowledge base.
  • "Hour of AI will help them imagine their place in an AI-powered world"

    Increasingly subservient and/or irrelevant?

  • by DarkOx ( 621550 ) on Friday October 03, 2025 @04:47PM (#65701806) Journal

    Kids are not dumb. They can plainly see how fast the world is changing, even if it is a world they don't completely understand yet.

    They know perfectly well that investing a bunch of energy into learning all about the tech toy du jour isn't going to mean **** for their futures. If they are interested, they're interested, but if not, the ones who feel it is a waste of time are absolutely correct!

    These companies cynically see schools and universities as a marketplace in their own right, or worse, as an opportunity to get 'em while they're young and hook them on their tech ecosystem for a future sales pipeline.

    Primary and secondary ed need to get back to basics if kids are going to have any chance the way things are going. If you get good at reading and composition and have a decent background in mathematics, at least up to differential calculus, there is very little else you could not teach yourself if you have to...

    Secondary schools should also do a little working-with-your-hands stuff. Not so much so kids learn to be wizards at getting worn-out two-stroke mowers to run, but so they have some sense of what it feels like to hold a wrench, and a bit of a sense of what the appropriate torque on fine-thread steel fasteners holding two aluminum structures together might be/feel like.

    There is time to learn applied {insert topic} in university. But sending kids out into the world with a lot of knowledge about Python syntax but no idea what AST probably stands for in the context of software is a recipe for having a lot of trouble adapting later. Now imagine teaching them to be "prompt engineers". Gee whiz, let's be honest here for just a moment: "prompt engineering" is basically just working around the quirks and deficiencies of the current tech; it is supposed to be natural language processing, after all! If you have to be some prompt wizard to get the most useful output, all that means is the AI ain't all that smart yet, and it is bound to be improved until such skill isn't needed either.

    Now, if you instead taught kids about matrix math and model weights, well, that might translate...
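
    As an illustration, here is a minimal sketch of what "matrix math and model weights" amounts to (it assumes NumPy, with layer sizes chosen purely for this example): a tiny two-layer network is nothing but matrix multiplies, biases, and a nonlinearity.

        # Minimal sketch: the "model weights" are just arrays of numbers,
        # and a forward pass is ordinary matrix math.
        import numpy as np

        rng = np.random.default_rng(0)

        # Toy sizes (illustrative only): 4 inputs, 3 hidden units, 2 outputs.
        W1 = rng.normal(size=(4, 3))   # first-layer weights
        b1 = np.zeros(3)               # first-layer bias
        W2 = rng.normal(size=(3, 2))   # second-layer weights
        b2 = np.zeros(2)               # second-layer bias

        def forward(x):
            # One forward pass: matrix multiply, add bias, ReLU, repeat.
            hidden = np.maximum(0, x @ W1 + b1)
            return hidden @ W2 + b2

        print(forward(np.array([1.0, 0.5, -0.2, 3.0])))

    Training is then just nudging those weight matrices to shrink an error measure, which is exactly the kind of thing a matrix-algebra background prepares you for.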

  • LTIH, AKA "critical thinking skills"

  • "...And those of you too dumb to actually be able to build, code, repair, or otherwise interact with a computer will have to be put to work in the slave labor camps building additional generating capacity so that your elders can charge their phones long enough so that they can actually interact with all of these AIs. Eventually enough power generating capacity will be built that we can go back to powering actual homes, schools, and industries. All hail the men in the clouds. Work will set you free. Lear

  • by ihadafivedigituid ( 8391795 ) on Friday October 03, 2025 @05:02PM (#65701848)
    Same as the old BS ...

    I'll tip my hat to the new Constitution
    Take a bow for the new revolution
    Smile and grin at the change all around
    Pick up my guitar and play
    Just like yesterday
    Then I'll get on my knees and pray
    We don't get fooled again
  • 1) Learn to code was not a success. People who are not good at coding went into it, did poorly, and left. Similar to those idiots who thought going to law school/an MBA/medical school was a good idea if they had nothing better to do.

    2) AI is now reducing the need for coders. In 10 years, something new will reduce the need for what we call AI.

    3) Even assuming AI skills are still needed, whatever tech skills will be needed in a decade will likely be new skills that the current tech workers will create.

    Hint,

  • "You don't need to know how to learn; In fact don't need to know anything. Just ask 'Brother AI', he will tell you everything." [In a soothing big brother voice.]
    Keep the masses ignorant and only tell them stuff you want them to hear. It's the next step in making the rich richer, and the poor poorer.
    .
  • by rsilvergun ( 571051 ) on Friday October 03, 2025 @06:15PM (#65702006)
    Is to automate jobs away. How the hell do you learn to do a job that's being automated? This isn't like a tool being deployed; this is something that exists to automate work.

    I mean, I get what they are doing. We are about to head into a third industrial revolution. And if anyone around here actually knew anything about history, they would understand that's bad juju.

    The second industrial revolution especially led to mass unemployment and a metric fuck ton of social unrest.

    We had 25% unemployment going into World War II, and that wasn't an accident.

    You cannot have a civilization where, if you don't work, you don't eat, and there is not enough work to go around, and that's what we are looking at here.

    This isn't like losing your job at the buggy-whip factory and going to work for Henry Ford. Ford doesn't need you; they've got robots. This is complete automation of entire lines of work, as well as huge productivity increases from other forms of automation.

    Companies don't hire because they have more money; they hire because they have more demand. If large numbers of people start losing their jobs and can't buy things anymore, then demand drops and companies hire less.

    Just because we think technology is cool doesn't mean the laws of supply and demand cease to exist.

    And the last time we did a nice big world war, we didn't have nukes. We also, bizarrely, had fewer religious extremists. So even if you're sitting pretty thinking you're not going to get drafted, that's not going to help you when a nuke hits your city. And if you think you're all cool because you live out in the sticks, ask yourself how many times you have to drive into town to buy shit.

    We need to be making adjustments that we aren't emotionally capable of making.
  • The AI bubble will have burst long before today's 12-year-old children enter the job market.
  • I thought the point of AI is that it is accessible to the common man. Are today's children so far behind that they have to be educated in using a natural language interface? Do we really have to teach them how to ask questions?
  • Think about it: according to all of the "you should-ers", kids' skills are becoming "obsolete" before they even leave school.
  • Soon, your kindergartener can just stay home and send his AI avatar bot to school to learn for him. Then, when he needs to do or say anything, the avatar can take care of the task.
