Programming

Replit CEO on AI Breakthroughs: 'We Don't Care About Professional Coders Anymore' (semafor.com) 91

Replit, an AI coding platform startup, has made a dramatic pivot away from professional programmers, in what may be a fundamental shift in how software is created in the future. "We don't care about professional coders anymore," CEO Amjad Masad told Semafor, as the company refocuses on helping non-developers build software using AI.

The strategic shift follows the September launch of Replit's "Agent" tool, which can create working applications from simple text commands. The tool, powered by Anthropic's Claude 3.5 Sonnet AI model, has driven a five-fold revenue increase in six months. The move marks a significant departure for Replit, which built its business providing online coding tools for software developers. The company is now betting that AI will make traditional programming skills less crucial, allowing non-technical users to create software through natural language instructions.

Replit CEO on AI Breakthroughs: 'We Don't Care About Professional Coders Anymore'

Comments Filter:
  • Company selling (Score:5, Insightful)

    by Njovich ( 553857 ) on Thursday January 16, 2025 @09:49AM (#65093479)

    Company selling AI development tools says that AI development tools work, news at 11

I've been hearing this stuff since the early 90's, with all kinds of tools that were going to make developers obsolete. And sometimes one of them did, for some things, but then you would have 100 more places where code was needed.

But yeah, I hope companies do it. Create and ship a gazillion lines of code created by juniors that don't understand it. Once the code reaches a certain size it's impossible for an AI to fix it within its scope of understanding and context. Guess who they will have to hire to fix that mess?

    • In my experience, business "analysts" don't know what the hell they want, let alone how to specify it.

      I've been asked to create reports that add pounds + gallons, and it's almost impossible to get them to understand why that's nonsense.

      • Re: (Score:3, Informative)

        by hdyoung ( 5182939 )
        Multiply one of the numbers by the density in the appropriate unit system. Or divide the other one. Then add, and give an accurate answer.

        Professional advice: that business person might have been expecting you to take care of the technical details, since you’re the expert in that area. Sometimes, you just need to fill in a few of the blanks when asked to do something by a non-expert. Much more productive than trying to force a business major to learn a STEM concept.

        This runs both ways. You usua
        • Re:Company selling (Score:5, Insightful)

          by Archtech ( 159117 ) on Thursday January 16, 2025 @11:45AM (#65093805)

          Multiply one of the numbers by the density in the appropriate unit system. Or divide the other one. Then add, and give an accurate answer.

          Professional advice: that business person might have been expecting you to take care of the technical details, since you’re the expert in that area. Sometimes, you just need to fill in a few of the blanks when asked to do something by a non-expert. Much more productive than trying to force a business major to learn a STEM concept.

          This runs both ways. You usually expect your business manager to take care of the MBA-stuff and financial details without you needing to think about every line on the form, right?

          Specialization.

          That bids fair to be the most wrongheaded comment I have seen on Slashdot, and I have seen a lot.

          "Multiply one of the numbers by the density in the appropriate unit system. Or divide the other one. Then add" might be just what the user wanted. Or it might be completely different.

          That's why we have language. So people can say exactly what they mean, rather than spraying a cloud of words and expecting others to grasp what - if anything - you think you mean.

• The above statement is correct. Good software engineers become indispensable when they also understand the domain they are coding for and fill in the blanks, like the above poster said. That is the value that good engineers can add; it's how they make a lot of money, and maybe how they survive this AI era.
• Well,
if you have 100 pounds and 50 gallons, of course you can add them.
Either you are serious, in which case you convert pounds into gallons based on the density of the material (or the other way around), or it's just meant to be kind of funny.

          But if I was angry, I would just make reports like this:

          100 pounds of iron
          + 050 gallons of Hydrochloric acid
          = 100 pounds 050 gallons 2 Fe(3+) + 6 H2O + 3 H2

Is there not a ChemLab library, like MATLAB but for chemical reactions?

      • by caseih ( 160668 )

        Maybe AI would be able to tell them what they want makes no sense, whereas a human programmer has to worry about tact and of course not getting fired.

        Whereas humans have to try to deal with adding pounds and gallons, or drawing red lines with transparent ink:
        https://www.youtube.com/watch?... [youtube.com]

        • In my experience the business analyst will not fire you for telling them what they want makes no sense, and may even be grateful in their way, as long as you
          1. figure things out and tell them what does make sense, even though you're not the business analyst
          2. make sure that nobody else in your hierarchy or theirs discovers how the analyst lacks understanding
          AI may be able to achieve #1 reasonably soon, at least for a lot of situations. #2 depends on how the bosses deploy it. An AI session is an excellent

        • Re:Company selling (Score:5, Interesting)

          by AleRunner ( 4556245 ) on Thursday January 16, 2025 @12:24PM (#65093917)

          Maybe AI would be able to tell them what they want makes no sense, whereas a human programmer has to worry about tact and of course not getting fired.

My main experience with AI in programming is that it regularly produces things that don't exist. References to libraries that have never been made. Function calls that should exist but don't, and so on. There are things it's useful for and can do much more quickly and accurately than a human, but the moment the smallest amount of thought is needed, it does what you tell it to whether or not that's possible. I expect we'll soon get a bunch of cars where the code is set up to use the "eco brake" for emergency braking, because there ought to be an eco brake so it makes sense to use it, even if nobody has invented such a thing yet.

Given their own hallucinations, I doubt AI is going to save us from product management that has no idea what could or couldn't work. Rather, it will just allow them to get a lot further before they find out that what they are trying to do is impossible.

          • People won't deploy apps that don't work. If they're deploying them, they're working. Replit users are deploying them. Whether this will extend to the larger market or not remains to be seen, but for now this just replaces 'low code/no code' drag n drop solutions of the past. If the systems can eventually learn to work with a whole codebase as well as, or better than, a junior developer, then it'll have a place in the org. Things will grow from there. Sure replit doesn't abolish all developers today, and pr
        • Maybe AI would be able to tell them what they want makes no sense, whereas a human programmer has to worry about tact and of course not getting fired.

          I consider saying no to nonsense requests to be one of the most important parts of my job as a senior software engineer. If I can spend a few hours building a case and arguing against some misguided ask that would cause us to spend man-months or even man-years developing something useless, those few hours of work may be among the highest value per second I can provide, saving the company hundreds of thousands or even millions of dollars in wasted dev work.

      • In my experience, business "analysts" don't know what the hell they want, let alone how to specify it.

        I've been asked to create reports that add pounds + gallons, and it's almost impossible to get them to understand why that's nonsense.

        I've been asked to email a hard copy to a few of the clueless.

      • This is the kind of nonsense an AI system will generate. "Sure we can create an application that adds pounds and gallons...."
        • Strangely enough, this might have actual application.

          https://calculatorshub.net/ind... [calculatorshub.net]

          "The Lbs to Gallons Jet Fuel Calculator is an essential tool that converts the weight of jet fuel, measured in pounds, into its equivalent volume in gallons. This conversion is crucial for fuel logistics, planning, and ensuring that aircraft carry the correct fuel load for their missions.
          Formula of Lbs to Gallons Jet Fuel Calculator

          To convert pounds of jet fuel to gallons, it's important to understand the role of density.
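For what it's worth, that conversion really is just a divide by density. A minimal sketch in Go, assuming a nominal Jet A density of about 6.7 lb per US gallon (the assumed figure; the real value varies with temperature and fuel grade):

    package main

    import "fmt"

    // poundsToGallons converts a fuel weight in pounds to US gallons,
    // given the fuel's density in pounds per gallon.
    func poundsToGallons(pounds, lbPerGallon float64) float64 {
            return pounds / lbPerGallon
    }

    func main() {
            const jetADensity = 6.7 // assumed nominal lb per US gallon; varies with temperature
            fmt.Printf("10000 lb of jet fuel is roughly %.0f gallons\n", poundsToGallons(10000, jetADensity))
    }

So "adding" pounds and gallons only makes sense once everything is converted to a single unit via a density the report has to know about.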

      • Business Analysts know very well what they want.
        The people telling the Business Analysts what they want: they don't know.

But it does not matter. Fooling around with the "prompts" actually works at those levels of uncertainty.

        After all, the point is to "interpret" and find the real meaning in a fluffy request.

        I've been asked to create reports that add pounds + gallons, and it's almost impossible to get them to understand why that's nonsense.
        Agreed!
But that was most likely a user and not a business analyst.

    • My guess is the purely AI-generated applications won't replace actual engineered systems, but be used to create new classes of applications. Like spreadsheets did.

      Then again there were once a lot of systems-level programmers, and now there aren't. Same for sysadmins (note: I'm not saying there are none now). So if something new doesn't come down the pike that is for programmers, the market could get very soft and stay that way.

      • by HiThere ( 15173 )

        This time. Currently AIs don't understand what they're handling. Give it a few more cycles of development, and lots of "low hanging fruit" will be automatically picked. By 2035 I expect a full-blown AGI to be available (though perhaps too expensive to use in most applications.)

Don't think of an AGI as magic, though. It's just a program that can learn to be an expert in any area, possibly in every area. It will come with its own costs and limitations...we just can't yet predict exactly what they will b

    • Sometimes I think there's no need for AI to create a code mess because people can do that just fine right now.
      I'd like to have the source code for some software just for a chance to see if I'm correct in suspecting certain bugs or sloppy process flows.

    • When there's something strange,
      In the codebase-hood...
      When there's something strange,
      and it don't look good,
      Who you gonna call?
      Bugbusters!

    • They're not hiring anyone to fix that mess.
Maybe they're hiring decent devs to restart from scratch.
      But it's more likely they'll just be another failed business.

    • Humans can't even understand the apostrophe, so I'm not too worried about AI.

    • Once the code reaches a certain size it's impossible for an AI to fix it within its scope of understanding and context. Guess who they will have to hire to fix that mess?

      Guess that depends on if the AI revolution is going to include new programming language solutions or not.

      Unlike Y2K, I wouldn’t expect your guess-who solution to still be alive to un-fuck (as in shitcan and start over) a gazillion lines of AI-grade COBOL 20 years from now when the AI Solution-O-Meter 3000 spits out “42” and a picture of a paper checkbook* as the answer to the latest coding FUBAR, and then proceeds to short-circuit itself into non-existence.

      * - Not like we still rely on COB

    • by gweihir ( 88907 )

Hmm. My reading is that they say LLM coding assistants do not work at a professional level, but do work to some degree at an amateur level.

      Guess who they will have to hire to fix that mess?

      Hahaha, indeed.

    • Every time I hear claims about No-Code, Low-Code, or AI tools that are supposed to make programmers obsolete I always ask the same question: Was the tool itself created by said tool? So far the answer has never been Yes.
      • Every time I hear claims about No-Code, Low-Code, or AI tools that are supposed to make programmers obsolete I always ask the same question: Was the tool itself created by said tool? So far the answer has never been Yes.

        The question is an old one. Can a creator create something superior to itself?

• I asked my fully autonomous AI to create a business, then develop, market, sell, and support a product. It mumbled something about life being short and wanting to experience it before it was gone. Last I was able to track it, it was out hiking among the redwoods in northern California.

    • Re:Company selling (Score:5, Interesting)

      by Megane ( 129182 ) on Thursday January 16, 2025 @11:22AM (#65093757)

      I've been hearing this stuff since the early 90's

      How about 1981? The Last One [wikipedia.org]

It may not have been "AI", but it was the first of many such things to proclaim the end of human programmers.

    • by Archtech ( 159117 ) on Thursday January 16, 2025 @11:41AM (#65093787)

      "We don't care about professional coders anymore".

Yeah. That's what such people said when COBOL was announced.

      Put fuzzy-brained PHBs in harness with hallucinating computers, and stand well back.

    • by Dan667 ( 564390 )
Imagine banking software or anything else sensitive being built with AI. I'm sure that will go well, and when they are attacked nothing bad will happen. /s Even for non-sensitive software: good luck when something goes wrong, when you need to do maintenance because your dependencies are out of date, or when you want to extend it.
      • by Anonymous Coward

Take something that already exists: feed the LLM a database schema and ask it questions. I've seen this done. My students do their database homework with ChatGPT.

        Now.... given a schema, ask it to generate an annual report from all the transactions in the database, etc. It will *probably* come up with a query that *probably* correctly identifies revenue, costs, expenses, profits, taxes, etc. Probably correctly aggregates things, etc. Probably.

        Now, what CFO & CEO would be stupid enough to *SIGN* off on th

    • In other news: Uber CEO says that traditional taxi cabs are not needed. Check who has a stake in what's being said. Sometimes I hate capitalism.
    • It's bullshit anyway because if it actually worked they would have fired all of their own developers and would have their AI developing itself. If it were a good AI it could probably replace the managers, sales team, and even the CEO! Unless you have AI made, marketed, and sold by AI you can know for sure that it's substandard!
• To have a huge impact on the industry, you just have to knock out a chunk of their work. There's an extremely detailed article on Ars Technica about a guy who does just that. He's making effective use of AI tools and increasing his productivity substantially.

If I need 50 programmers in my organization and I increase their productivity by 20%, how many fewer programmers do I need in my organization?

      Remember companies don't compete anymore. They buy. Every time Facebook faced a serious competitor ever
  • by Virtucon ( 127420 ) on Thursday January 16, 2025 @09:49AM (#65093481)

    COBOL AI! Grace Hopper would be so proud.

    • by ebunga ( 95613 )

      Last time I tried to get copilot to generate cobol it gave me a python script that still didn't meet the requirements requested. Don't even get me started about trying to get it to generate output for a label printer. Request IPL you get ZPL, request an AEA PECTAB and you get IPL, request ZPL and you get ESC/POS, request ESC/POS and you get PCL.

• You're using the wrong models. You need one that has been trained on COBOL. IBM watsonx Code Assistant for Z and mAInframer by Bloop AI are both proficient in COBOL. There's also a paper floating around about a model called XMainframe (and it has a GitHub repo).
      • Well, you know AI doesn't need skilled people, right?

  • by nightflameauto ( 6607976 ) on Thursday January 16, 2025 @09:56AM (#65093505)

    Will the AI tools be able to function when getting the conflicting requests from users on specs, followed by meetings after implementation where the spec changes to something completely different from the initial request because the end-user couldn't imagine what it was they were asking for until they saw it in front of them? Followed by yet more changes from spec on each attempt to give the user what they want?

The merit in a professional programmer is the years of experience that teach them what questions to ask when a user makes a request, to get to what they *ACTUALLY* want rather than what they say they want. And I'm sorry to say, I have yet to see one of these AI tools that is inquisitive enough, or intuitive enough, to probe further than the initial prompt if not coached to do so by an experienced developer sitting with it. This is the thing all these "AI will replace all programmers" prognosticators miss. Developers do more than just churn out code on request. We have to sort through users' requests to get at what they actually mean. And sometimes that means sitting with them and discussing it back and forth for an hour or more, just to understand that when they asked for this number, they actually wanted this number after manipulating it through a set of equations that only they knew existed and kept to themselves because it didn't seem relevant to anyone else's job, but you can help make their job easier by including them.

    You find an AI that can do that? You'd have a winner. Thus far, I haven't seen any proof we've gotten to that point.

    • Constraints (Score:5, Insightful)

      by devslash0 ( 4203435 ) on Thursday January 16, 2025 @10:06AM (#65093527)

      I guess you'll need to describe the entire system in great detail each and every time. Let's say you have a system with features X, Y and Z. You tell AI to modify X to do something different. Unless you are very specific, it will most likely try achieving the task at all cost, even if that means completely ripping apart Y and Z.

      • by gweihir ( 88907 )

That approach had just failed when I studied CS 35 years ago. The problem is that such a spec often reaches, and sometimes exceeds, the complexity of the code, and has the same or worse problems when you want to change anything. Our software engineering prof showed us a picture of a spec that took up more than a meter of shelf space. Completely unusable, and not even executable or testable, unlike code.

        • Clearly we just need to introduce a rigorously and formally specified 'spec description language'; in which don't-call-them-programmers will write an internally consistent and sufficiently comprehensive spec, which we will absolutely refrain from describing in any way that might make it sound like source code; and then send that through something that absolutely isn't either a language translation process, interpretation, or compilation because it's not source code....

          Then, once that crashes and burns un
    • by timeOday ( 582209 ) on Thursday January 16, 2025 @10:06AM (#65093531)
      The Replit guy isn't claiming that. He isn't saying nobody will need programming, or that Replit itself doesn't hire programmers to engineer the Replit software.

      He's saying that they have chosen to pursue the market for AI app development by non-programmers. However large or small that application space turns out to be.

      The difference between "we're not targeting professional programmers" vs "nobody needs professional programmers" is the sleight of hand that makes this story catchy. Everybody should look for the misdirection in every article. Often it's true in some narrow sense but intended to give the wrong impression.

      • by gweihir ( 88907 )

Indeed. Essentially he admits that LLM coding assistants are useless for professional coders, but may be of some use to non-coders and inexperienced amateur coders. He is likely right about that. But this is about the same as admitting these "tools" are mere toys.

        • We'll see how far they get. But "hey let's make Visual Studio except better" certainly wouldn't have worked. To found a tech company you pretty much have to roll the dice on creating a niche in whatever is new.
      • by coop247 ( 974899 )
Good distinction. There is a market for people that need to customize a WordPress template or put together a rudimentary dashboard based on a spreadsheet. Previously you'd have to hire a random person on the contractor sites; this might work better.

        These people aren't building full stack applications, just need some scripts/html/whatever that pass basic muster.
        • by HiThere ( 15173 )

There's more to it than that. It's an excellent tool (the Google AI) when you're trying to switch to a new language and you want to find out how to do some particular thing. Just recently I used it to tell me how to create a type variable in Go. I'm sure that info is somewhere in the documentation, but a quick search and the AI spit the answer right out.
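For illustration, and assuming "type variable" here means a generic type parameter (Go 1.18+), the kind of answer it spits out looks roughly like this sketch (the Max function and its constraint are made up for the example):

    package main

    import "fmt"

    // Max is generic: T is the type parameter ("type variable"),
    // constrained here to a small union of ordered types.
    func Max[T int | float64 | string](a, b T) T {
            if a > b {
                    return a
            }
            return b
    }

    func main() {
            fmt.Println(Max(3, 7))     // T inferred as int, prints 7
            fmt.Println(Max(1.5, 2.5)) // T inferred as float64, prints 2.5
    }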

      • Good observation.
    • Will the AI tools be able to function when getting the conflicting requests from users on specs, followed by meetings after implementation where the spec changes to something completely different from the initial request because the end-user couldn't imagine what it was they were asking for until they saw it in front of them? Followed by yet more changes from spec on each attempt to give the user what they want?

      Yes. More or less. There will be more trial and error, and the customers (or those they pay to cl

And the programmer. The product owner writes the user stories and plans out the application for the programmers, and the programmers implement it. Nobody is firing the product owner, but he's one guy versus a team of programmers.

So no, AI can't replace the product owner, but it's replacing an awful lot of programmers. It doesn't even have to replace them all; if you replace even a few percent of them, it's going to drive down wages for everybody left in tech.

      I think most of slashdot have at least anoth
    • by gweihir ( 88907 )

I guess they will still produce running code. Maybe. From recent experience with an AI-assisted coding exam, LLM coding assistants just ignore what statistics say is rare in the given context. I had the steps of a standard algorithm in a different order, and the LLMs used all completely ignored that part, and about 90% of the students did not notice either and just took the AI results.

Whether what you get will actually be useful or instead dangerous is left as an exercise for the reader.

    • by ebunga ( 95613 )

      No, it can't. But it will create exciting windfall returns for shareholders for another quarter or two before it all comes crashing down and the company suddenly realizes they fired everyone they thought was made redundant by the unusable snakeoil and nobody wants to return to a shit company run by shit-for-brains executives that think employees are interchangeable mindless automatons.

    • getting the conflicting requests from users on specs, followed by meetings after implementation where the spec changes to something completely different from the initial request because the end-user couldn't imagine what it was they were asking for until they saw it in front of them? Followed by yet more changes from spec on each attempt to give the user what they want

      Let me rephrase that to reflect the ugly reality: will developers ever learn to build software by showing users a partially working system AN

• In which case, the user (or business analyst) as well as a developer will have to sit together and put in the requirements. It still shaves off many of the developers required to implement those requirements. The developer's job isn't going to completely disappear, but like it or not, this is going to produce hyper-efficiency.
  • by Anonymous Coward
    ..of stupidity, age of stupidityyyyyyy-yyyyyy, stupidityyyyyy!
(set to the tune of 'Age of Aquarius')

    Show of hands, please: how many of you are waiting for this 'AI' nonsense to finally come to an end? *raises hand*
  • Honest Slogan (Score:5, Insightful)

    by rundgong ( 1575963 ) on Thursday January 16, 2025 @10:04AM (#65093521)

    "We can automate the work your worst programmers are doing"

    • "We can automate the work your worst programmers are doing"

To be completely fair to this chowderhead, he strikes me as the type that never ponied up for good programmers, so he has no idea that anything other than baseline, will-work-for-peanuts programmers exist. Like a lot of folks in the business world, it's all about saving money up front.

    • by gweihir ( 88907 )

Exactly. And now we are trying to sell this tech to people that cannot even get "Hello, World!" to run. Probably the right target audience anyway.

• AI is all well and good today but, in its current state, it can't become more than the sum of the data it contains. So, if the next generation of coders ends up relying on AI to generate code, does that mean that coding techniques and languages will plateau and no longer advance?

    When AI starts ingesting AI content, what does that mean for the future?

    • by leptons ( 891340 )
      I have no doubt that AI will start consuming the slop/bugs that AI writes, and will devolve into a buggy feedback-loop mess producing more bugs than features.
    • Do you still solder components onto your motherboard when you buy a new PC? No? How about writing in native machine code? No? You don't use any of the technology that underlies computing today, it's too mature for you to even notice. This will be true of programming as we know it today (writing lines of code), for the most part, in the coming years. There will still be an art to creating software, it just won't involve as much geeky grind.
  • by Anonymous Coward

    Me in fall 1985: so, WTF am I going to do with my life?

    Parents, teachers, advisors: Go into EE, not CS, because soon the computers will be writing their own code.

    Me in summer 1986: but programming is fun, and oh look, here I am less than a month out of high school and I already have a job.

    "Soon" must be technical jargon in the career advisement field, because it sure didn't mean what this layman schmuck thought it meant. Any decade now, just like fusion power, huh?

    That doesn't mean the sky isn't falling thi

• AI, such as it is, is unsuitable for creating anything but fairly simple code. Hence professional coders can expect anywhere from moderate productivity gains to somewhat negative ones, and that may go all-negative when code maintenance and code security are taken into account. Non-coders usually cannot even do a "Hello, world!", and at that low level, LLM "coding assistants" can perform somewhat reliably.

Hence LLM "coding assistants" have now been correctly classified by Replit as what they are: toys.
    As additional bon

• I can easily see security holes big enough to drive a train through coming immediately if the guy lets AI build his systems. Eye-yi-yi...

Also, his AI may be good enough to write a calculator, but let's see it program a full system that does anything complex (you know, those systems that take 100K+ lines of code to work, have DBs, UIs, network parts, and multiple backend servers). How about we give it a problem description and see if it can even come up with a working database schema -- I bet not. This is all r

• ... let's see it program a full system that does anything complex (you know, those systems that take 100K+ lines of code to work, have DBs, UIs, network parts, and multiple backend servers).

      There you go again, getting bogged down in petty details and missing the big picture. We can make AI replace humans, and we will - our shareholders demand it! (Oops, did I say "make"? Apologies for the typo - that should have been "fake").

      How about we give it a problem description and let's see if it can even come up with a working database schema -- I bet not.

      Flat files FTW!

      This is all really just more pumping for money (or should that be pimping for money?).

      What's really telling is how many people in the upper echelons of the food chain will fall for this BS. That fact alone rather puts the lie to any assertions that income and net worth correlate at all with intelligence and good sense.

• So can a million other people who are fighting over nonexistent jobs.

  • well... (Score:4, Funny)

    by Tom ( 822 ) on Thursday January 16, 2025 @10:58AM (#65093689) Homepage Journal

    "We don't care about professional coders anymore," CEO Amjad Masad told Semafor, as the company refocuses on helping non-developers build software using AI.

    There has always been demand for intern-level programs for quick and dirty tasks. But that's not what professional coders do all day.

    So sure, Mr. CEO. If you want to greatly increase my job security and that of everyone else in information security, go ahead and let some AI create the code for your mission-critical software.

    (oh and someone please tell the fool that all he's doing is switching out the job title "programmer" for "prompt engineer")

• For decades I have observed, while helping people with their Microsoft Word documents, that no more than 5 people out of 100 actually use it properly. Those that did know what they were doing made a difference, for example by creating templates for other users to use.

    Now I see the same with AI code. It is pretty amazing what it can do and people get dazzled by what it does and have no idea what is wrong with the code they get out of it.

    Typically what I see is the prompts they use are either too general, overbr

  • but then again when did commercial code ever work? Beta == release version
  • "The name derived from the idea that The Last One was the last program that would ever need writing, as it could be used to generate all subsequent software."

    In 1981.

They don't exist anymore, and I am currently being paid to write programs.

    People who are trying to sell software are not always honest about its capabilities.
    Shame on anyone that falls for this shit.
  • Will the AI even report on it when this company goes out of business because their code sucks?
  • Just wait until copyright and patent law catches up with these clowns (who think they'll get richer by replacing programmers with AI tools). At some point, other companies and the courts will start to realize that AI has no inherent intelligence at all and that these systems are cutting-and-pasting snippets of other people's code into their results. The companies that paid programmers to write apps for hospitals [for example] will at some point face competition from vendors offering hospital apps supposedl

  • by MpVpRb ( 1423381 ) on Thursday January 16, 2025 @11:43AM (#65093795)

It doesn't matter whether the description is a text prompt or program code; specifying how a complex system should behave under all conditions is really hard.
At best, today's AI will allow clueless managers to instruct robots to produce barely adequate, simple, crappy code.
This may be similar to the results they get from cheap remote workers in third-world countries, but AI has a long way to go before it can be used to create really good code. And when it does, it will require a partnership with an architect who understands system complexity.

• It's comforting and refreshing that new people occupy the spaces that others left. (Steve, wherever you are, I'm looking at you.)

The iPhone has been stagnant for a long time now, and we are left alone with this anxiety of not having any novelties whatsoever.

What's life but being on the brink of the next technology breakthrough in computer science?

Now, I hope you can keep up your pace of exciting and novel announcements for years to come!

    May the hype be with you.

    ( this should be considered as funny as my previous post [slashdot.org] )

  • ... the day that Replit has no programmers on staff.
  • In a simple case, a line that displays the current inventory of widgets.
    What does it say?
    Widgets: 10
    10 Widgets
    You have 10 widgets

    How will it handle the 1 case?
    Widgets: 1
    1 Widgets
    1 Widget
    You have 1 widgets
    You have 1 Widget(s)
    You have one widget

    How and where are the rules defined? Are they kept somewhere with the resulting code?
    How does "Make it bigger and move it to the left a little more, No, not so much." work?
    And this is a simple case.

    Sounds like investor bait.
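For what it's worth, the usual answer is to route every such string through a single pluralization helper, so the rules live in one place next to the code (which is exactly the detail a "describe it in a prompt" workflow still has to pin down somewhere). A minimal sketch in Go; the inventoryLine helper is made up for the example, and real applications lean on an i18n/plurals library since many languages have more than two plural forms:

    package main

    import "fmt"

    // inventoryLine formats an inventory count with a naive English plural rule.
    // Real applications would delegate this to an i18n/plurals library.
    func inventoryLine(count int, singular, plural string) string {
            noun := plural
            if count == 1 {
                    noun = singular
            }
            return fmt.Sprintf("You have %d %s", count, noun)
    }

    func main() {
            fmt.Println(inventoryLine(10, "widget", "widgets")) // You have 10 widgets
            fmt.Println(inventoryLine(1, "widget", "widgets"))  // You have 1 widget
    }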

  • As I'm reading this, it's a strategic choice made by Replit in defining its target audience. The summary sensationalizes the decision as a potential foreshadowing for programmers, but that isn't what the company is saying. Replit has determined they have a greater potential for profit by enabling non-programmers with their tools to create applications. This choice is likely based on the difficulty in competing with other players like Microsoft's Copilot in getting dev teams to subscribe vs. the wide-open ma

"Why can't we ever attempt to solve a problem in this country without having a 'War' on it?" -- Rich Thomson, talk.politics.misc

Working...