
Help ESR Stamp Out CVS and SVN In Our Lifetime 245

mtaht writes: ESR is collecting specifications and donations toward getting a new high-end machine to be used for massive CVS and SVN repository conversions, after encountering problems with converting the whole of NetBSD over to git. What he's doing now sort of reminds me of holding a bake sale to build a bomber, but he's well on his way toward a Xeon-class machine or better for the work. What else can be done to speed up adoption of git and preserve all the computer history kept in source code repositories? ESR says he'll match funds toward the purchase of the needed hardware, so if you want to help drive him into bankruptcy, now's your chance.
This discussion has been archived. No new comments can be posted.

  • by gmhowell ( 26755 ) <gmhowell@gmail.com> on Monday October 20, 2014 @05:15PM (#48189805) Homepage Journal

    You had my curiosity, then I read this:

    ESR says he'll match funds toward the purchase of the needed hardware, so if you want to help drive him into bankruptcy, now's your chance.

    Now you have my attention.

    (Not enough to RTFS however. This is Slashdot...)

    • by tqk ( 413719 ) <s.keeling@mail.com> on Monday October 20, 2014 @08:35PM (#48191493)

      ESR says he'll match funds toward the purchase of the needed hardware, so if you want to help drive him into bankruptcy, now's your chance.

      Now you have my attention.

Not a bad way to go out if you think about it. Imagine ESR in his doctor's office: "You have three months to live."

      "Cool! I'm going to instigate a full on crapfest buying spree fueled by my "GeekNation." We'll pre-order *everything* Intel, Foxconn, and Cisco are going to produce in the next two decades, I'll be long dead once the bills start to roll in, and Western Civilization will collapse into a black hole of insolvency before anyone realizes what's going on. Suck it PRC bastards! I'll be bigger than Hitler!!! Woohoo!"

      <Dilbert>Was that better or worse? I don't know how to tell.</Dilbert> # PHB: "You don't show enough passion for your job."

      [A bit over the top, I admit, but we are discussing ESR.]

  • Save-On -> Longs Drugs -> CVS -> ESR
  • by istartedi ( 132515 ) on Monday October 20, 2014 @05:17PM (#48189825) Journal

    Git set and aid esr for cvs and svn fix. HTH.

  • by Anonymous Coward on Monday October 20, 2014 @05:19PM (#48189845)

    Nice trolling, dice.

  • by ModernGeek ( 601932 ) on Monday October 20, 2014 @05:21PM (#48189865)

I might do it for some things, but right now the ability to check out only a subdirectory [source [stackoverflow.com]] is paramount to the way we use svn around here. Coupled with the fact that there are so many git solutions that are third-party hosted only, and so many hostable open source subversion options available, I'll stick with svn.

    Moving everything to the cloud, which is marketing speak for someone else's servers, for increased functionality is not an acceptable solution. Sure, you can host your own git repository, but the functionality in the available F/OSS solutions blows.

    • by gbjbaanb ( 229885 ) on Monday October 20, 2014 @05:47PM (#48190103)

Exactly. I don't know why there's such nerdage against SVN, except that git is hard, so therefore it's better somehow. Despite the fact that you can lose your history (irrevocably, if you try) and screw things up even if you don't.

      If something is working, there's no point in trying to break it. And if you were to go break it, you'd go with Fossil anyway, git done right.

      • Despite the fact you can lose your history (irrevocably if you try) and screw things up even if you don't.

Probably not; almost everything in git can be undone (except something like rm -rf * .*, and even then your repository can be restored if you have a backup).

The trouble with DVCSs is that there is no single repository to back up. Everyone has their own copy, and a vape in one can (and will) be propagated to the others. Restore your repo from backups and watch as someone then commits the vape when they push their changes to you - the system doesn't know that it shouldn't take that commit.

It's not like a centralized system where you can have proper backups.

      • Despite the fact you can lose your history (irrevocably if you try) and screw things up even if you don't.

        The only way to lose history irrevocably in git is if both of the following two conditions are met:

        1. You have no backups ("backups" includes pushing your commits to a remote server), and
        2. You go mucking about in your .git directory and start randomly changing or deleting files in there without knowing what you're doing

        Most people do push to a central repository, so condition #1 is usually not met. And mucking about in your .git directory is not a normal use case, either. You should use the git comma

    • We just converted from CVS to SVN about three years ago, so it's too soon to move on :-)

Git may be OK, but I hear lots of horror stories from groups trying to get it to work well, which then invokes the fans to respond that they're doing it wrong; then it all goes downhill, with twenty-year-olds scuffling with sixty-year-olds until the forty-year-olds break it up.

    • by Alef ( 605149 )

      I don't think git and svn are even really comparable in that way. I used to be a proponent of svn, but since I learnt git properly, there is no going back to svn ever again. The entire philosophy is different at a fundamental level, completely changing the way at least I work with version control. Git is more like a flexible framework where I can juggle different versions and multiple development threads, reordering things, rebasing onto different branches, or even create completely new workflows (such as a

I've used sccs, rcs, cvs, svn, and git [all in production environments], spanning a 30-year period. git is easier to use than svn, and more powerful [features that git pioneered have been backported into svn, so you're getting the benefits even if you don't realize it].

      Ultimately, it's all about the merging, which is the premise that git is based on. See:
      http://www.youtube.com/watch?v... [youtube.com]
      or
      http://www.youtube.com/watch?v... [youtube.com]

    • I might do it for some things, but right now the ability to only checkout a subdirectory[source [stackoverflow.com]] is paramount in the way we use svn around here.

      You should go with what works for you, but just so you know, git has supported sparse checkouts for a few years now. Here is a blog post from 2011 [jasonkarns.com] that shows you how to do it, if you're interested. Admittedly, it has a not-obvious name, and is accomplished via not-obvious methods, but c'est la vie, sometimes.
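For the curious, the recipe from that era boils down to a couple of config steps. A minimal, self-contained sketch (the repo layout and directory names below are made up for illustration):

```shell
# Build a throwaway repo with two subdirectories, then clone it and
# restrict the working tree to just one of them (the classic
# core.sparseCheckout recipe).
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q upstream
cd upstream
mkdir frontend backend
echo "ui code" > frontend/app.js
echo "server code" > backend/server.js
git add .
git -c user.email=demo@example.org -c user.name=demo commit -qm "initial"
cd ..

# Clone without checking anything out, then enable sparse checkout
# and populate the working tree from HEAD.
git clone -q --no-checkout upstream partial
cd partial
git config core.sparseCheckout true
echo "frontend/" > .git/info/sparse-checkout
git read-tree -mu HEAD

ls   # only frontend/ is materialized; backend/ stays out of the tree
```

Newer git wraps this in a `git sparse-checkout` subcommand, but the underlying mechanism is the same.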

      Nestled with the fact that there are so many git solutions that are third-party hosted only, and so many hostable open source subversion options available, I'll stick with svn.

      Again, you should use what works for you, but FYI, you're looking for gitlab [gitlab.com], or one of the other options out there (but I like gitlab, so that's what I'm going to link to).

      All that being said, I don't

    • You can checkout a subdirectory if (and that's a big proviso) you structure your code in such a way that each directory is a separate git repository, referenced as a submodule. The submodule points to a specific version of the other repository. Unfortunately, there are still a lot of issues with this approach:

      The biggest is that you have to think about what parts of the project you might want to check out individually before you start. For new (small) projects, it's sometimes easy, but typically project
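The parent's directory-per-repo arrangement can be sketched like this (repo names are invented; the protocol.file.allow override is only needed so newer git accepts a local-path submodule in this demo):

```shell
# One directory lives in its own repo; the umbrella project pins it
# as a submodule at a specific commit. All names are illustrative.
set -e
tmp=$(mktemp -d)
cd "$tmp"

git init -q libfoo
cd libfoo
echo "int foo;" > foo.c
git add .
git -c user.email=demo@example.org -c user.name=demo commit -qm "libfoo"
cd ..

git init -q project
cd project
echo "top-level" > README
git add .
git -c user.email=demo@example.org -c user.name=demo commit -qm "top"

# Record libfoo as a submodule; .gitmodules maps path -> URL, and the
# superproject's tree pins the exact commit being referenced.
git -c protocol.file.allow=always submodule add "$tmp/libfoo" libfoo
git -c user.email=demo@example.org -c user.name=demo commit -qm "add submodule"
cat .gitmodules
```

Note that the superproject records a commit, not a branch, which is exactly the "think about it up front" cost the parent describes.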

  • by Anonymous Coward on Monday October 20, 2014 @05:23PM (#48189897)

    TFA is about killing CVS for Git. It says NOTHING of SVN. This is probably because SVN is still a decent system and Git is no replacement (and the reverse is also true).

    CVS should die though, yes. Move to SVN or Git depending on your particular needs.

    • by TapeCutter ( 624760 ) on Monday October 20, 2014 @06:19PM (#48190441) Journal
Like the so-called "death of the mainframe", the death of CVS is still a long way off. From a business POV, moving a large, well-managed CVS repository to something else is simply not worth the effort in most cases. I look after a CVS repository for ~25 devs; some of the (active) code has been there for well over a decade. We looked long and hard at git; the benefits are not enough to justify turning the whole shop upside down for a few months. Physically converting the repository is just part of the problem; there are also the automated build and tracking scripts that depend on CVS. You can also add to that the downtime for at least half the devs to learn the new system - it's quite disturbing how many experienced devs have only a marginal understanding of source control in the first place.

Of course, if you're starting a new repository, then use the shiny new hammer with the rubber grip.
The problem is that you're building more and more tooling on top of a painfully decrepit system. Every time you spend more than zero seconds dealing with renaming a file, you've lost money on the deal. Every time you work off HEAD because it's too painful to branch, you're spending developer salaries. I get that "if it ain't broke, don't fix it", but CVS is utterly and fundamentally broken. You're throwing good money after bad trying to keep it alive.
If you like the paradigm of the centralized repository, which certainly has its uses, then consider moving to SVN instead. It will make things easier and nicer; it is CVS done right.

        The git/mercurial crowd just cannot stand the CVS/SVN workflow and wants a lot more flexibility which makes perfect sense for a global open source project. If you don't need all that, move to SVN and be happy.

    • by cdecoro ( 882384 ) on Monday October 20, 2014 @07:54PM (#48191235)

      CVS should die though, yes. Move to SVN or Git depending on your particular needs.

      My particular needs are to (1) check out only a subset of files, because those files are binary and very large, and (2) permanently delete those files that I know I will no longer need. Unfortunately, neither SVN nor Git meets those needs, but CVS does. (And as much as I like SVN, rebuilding the entire repository doesn't count for (2)).

      • (1) check out only a subset of files, because those files are binary and very large, and

Git can do this with sparse checkouts but, as you probably know, neither git nor svn is really meant to manage large binary files. You'd do best to use a repository that was designed to fit your needs, so you're not constantly butting heads with your version control system.

        (2) permanently delete those files that I know I will no longer need.

Most source code version control systems are specifically designed to avoid permanent data loss, so again, you're going to be fighting your VCS if you use one that was meant to handle source code. That being said, BFG [github.io] will permanently delete
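BFG's exact invocation aside, the "really delete it everywhere" operation can be sketched with stock git filter-branch (much slower than BFG on large repos; the file names below are stand-ins):

```shell
# Commit a large-file stand-in, then rewrite history to drop it and
# gc so the old objects can actually be reclaimed.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
echo "huge binary stand-in" > big.bin
echo "keep me" > code.c
git add .
git -c user.email=demo@example.org -c user.name=demo commit -qm "initial"

# Rewrite every commit on every ref, removing big.bin from each tree.
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch --index-filter \
  'git rm --cached --ignore-unmatch big.bin' -- --all

# Drop the backup refs filter-branch leaves behind, expire reflogs,
# and prune; only after this is the data actually gone.
git for-each-ref --format='%(refname)' refs/original/ \
  | xargs -n1 git update-ref -d
git reflog expire --expire=now --all
git gc --prune=now --quiet
```

Until the backup refs and reflogs are cleared, the "deleted" objects are still recoverable, which is the point the parent makes about VCSs resisting permanent loss.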

        • by cdecoro ( 882384 )

          You'd do best to use a repository that was designed to fit your needs so you're not constantly butting heads with your version control system.

I'd love to know if such a thing exists (in an open source form, ideally). As far as I'm aware, it doesn't. (Boar [google.com] is getting close, however, so I'm hopeful.)

      • (2) permanently delete those files that I know I will no longer need.

        I'm confused. The entire purpose of an archive database is to KEEP things, forever, so you can go back to them when you need to. If you have files that you expect to delete, maybe they shouldn't be going into the database.

        • by cdecoro ( 882384 )

          (2) permanently delete those files that I know I will no longer need.

          I'm confused. The entire purpose of an archive database is to KEEP things, forever, so you can go back to them when you need to. If you have files that you expect to delete, maybe they shouldn't be going into the database.

          No you're not; you're being patronizing. I am a photographer. I take lots of photos. The negatives are over 30 MB each. A single day's shooting can be dozens of GB. And while I expect to delete many of them, I don't know which of those will be deleted until I've gone through and worked with them. But I want them in an archive immediately, so I can have them tracked, versioned (the XMP sidecars/DNG headers are plaintext), backed-up, remotely accessible, etc.

          The funny thing is, despite pretentious boosters of

          • If you're a photographer dealing with 30MB images, then you're almost certainly using non-destructive editing on them. It sounds like the tool that you want is not a version control system at all, it's a filesystem.
      • If by large you mean tens of megabytes. Downhill, with a stiff breeze at its back.

      • by orzetto ( 545509 )

        [...] because those files are binary and very large

No VCS is meant to do this: neither Git nor SVN, and certainly not CVS. Those files don't belong in a VCS because you cannot make a diff between them. Small binary files (e.g. icons on a website) are a small nuisance, but there is no point in storing large binary blobs in a VCS regularly. What you need is a backup system, not a version-control system.

        (2) permanently delete those files that I know I will no longer need

SVN allows you to do this with svndumpfilter (an

  • Horrific Summary (Score:5, Informative)

    by Anonymous Coward on Monday October 20, 2014 @05:34PM (#48189993)

    He isn't raising new money, he's opening up a discussion on what to do with the remaining money in some fund he started, and he said he'll match what's currently in the fund.

    Wish we could talk the editors into doing basic fact checking on the article submissions they allow through.

  • Counterpoint (Score:5, Insightful)

    by Tenebrousedge ( 1226584 ) <tenebrousedge@gmai[ ]om ['l.c' in gap]> on Monday October 20, 2014 @05:36PM (#48190013)

Git's subtree / subproject management is extremely painful. The information manager from hell, indeed. I dislike SVN/CVS intensely, but they make it much easier to do sub-repositories. For example, Arch's ABS is entirely under SVN, which works well enough for them, but using git the same way sounds like torture.

    • by Alef ( 605149 )
I agree there are some things that could be improved about Git's submodule feature (I assume you are aware of it, though you called it something else), but are SVN's externals really that much better? In my experience, both work, each with its own pros and cons, though neither of them is "extremely painful".
      • There's submodules and subtrees, neither works all that well. SVN can pull down part of a repo, which is (IMO) slightly more sane behavior.

        • I eventually got submodules to work properly for me, and have been using them effectively (I think) for a few years now. But it's not easy teaching other devs. Which is why I need to spend some time investigating hg properly. Although you can do sparse checkouts with git, apparently hg has some plugins which allow you to partially clone a repo without necessarily cloning all of the objects in its history (supposedly plugins can fetch that on demand, rather than in the initial clone). It seems this is possib
  • by jdege ( 88942 ) on Monday October 20, 2014 @06:34PM (#48190591)

    If Git did all that SVN does, I'd be glad to switch.

    But there are capabilities in SVN that Git not only doesn't have, it has decided it will never have. And that's a problem.

    Biggest issue for me? In SVN, I can create an extern to a subdirectory of a project. Git's subprojects always point to the root of a project. And for us, that's a big deal.
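For readers who haven't used it, an svn:externals definition can point anywhere inside another repository; the property value is just a path-to-URL mapping (the paths and URL below are made up for illustration):

```
# svn:externals property set on a directory of the consuming project:
#   <local-dir>  <URL, which may reach into a subdirectory of another repo>
shared/utils  https://svn.example.org/repos/bigproject/trunk/lib/utils
```

A git submodule, by contrast, records the URL of a whole repository plus a commit, which is why it can only point at another project's root.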

  • Hosting Git is dirt cheap. Converting from ${old_terrible_system} to Git is the painful one-time expense. Here's how you do it:

1. Fire up a suitably big AWS cloud server.
    2. Copy your repo to it.
    3. Run the command to convert your old repo to Git.
    4. Download the new Git repo.
    5. Shut down the instance.

You don't buy expensive, power-hungry hardware that's going to cost an arm and a leg to store, power, and cool for the next year when you only need its brute force for a few hours. The Cloud isn't a magical
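Step 3 is where the real work happens: converters like cvs2git emit a git fast-import stream that git replays into a fresh repository. As a flavor of that final step, here is a tiny hand-written stream being imported (the repo name, author, and file are all made up):

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q converted
cd converted

# A minimal fast-import stream: one commit adding one file. Real
# converters generate streams like this mechanically from the old repo.
git fast-import --quiet <<'EOF'
commit refs/heads/master
committer Converter <converter@example.org> 1413850000 +0000
data 14
from cvs, v1.1
M 644 inline hello.c
data 12
int main();
EOF

git checkout -qf master
cat hello.c
```

Because the stream format is plain text, the conversion step is embarrassingly portable: generate the stream wherever the old repo lives, import it anywhere.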

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      You don't buy expensive, power-hungry [hard]ware that's going to cost an arm and a leg to store, power, and cool for the next year when you only need its brute force for a few hours.

      But he is planning to do conversions over and over, one after another, handling problems as they occur. As such, one of his goals is that the conversion be as speedy as possible, and he specifically said that he doesn't want to share a CPU with other cloud users. He wants one fast CPU devoted 100% to his project.

      Makes sense to

  • by steveha ( 103154 ) on Monday October 20, 2014 @06:39PM (#48190645) Homepage

ESR has already helped several free software projects convert from CVS to Git using his existing computer. The bigger the project, the longer it takes. (Each attempt to convert the Emacs repos takes 8 hours on his current computer.) He has studied the C code for doing the conversion, and determined that the best sort of computer for these conversions would be as fast as possible (it doesn't matter how many cores; this is a single-threaded process) and would have as much RAM as possible. Graphics card? Whatever, who cares. Keyboard, mouse? Not going to buy those; he already has them. Oh, and he would prefer it not sound like a leaf blower, so he is looking for a quiet power supply and a case with large, quiet fans.

    He says that several people spontaneously donated money to help him buy a better computer. So he opened up a discussion for how to best spend the money.

    Several people urged him to only use ECC RAM, which means either an AMD chip or a Xeon. Someone just donated $1000 (!!!) so he has pretty much settled on the Xeon.

    Once he has this, he will go around to free software projects and offer to do the conversion for them. His plan is to grab a copy of the CVS repo, run the conversion to make sure there are no surprises, then ask the project maintainers to stop modifying the CVS repo while he runs the final conversion.

    This seems like a reasonable service for him to be offering. Instead of each project figuring out the conversion process, he will become an expert on CVS to Git conversions (with more experience than anyone else) and he will have the purpose-built computer to do the conversions as quickly as possible. So he really will be saving time and hassle for the various projects.

    P.S. He converted the NetHack repos, and stirred up a hornets' nest. Read about it here: http://esr.ibiblio.org/?p=6389&cpage=1#comment-1207141 [ibiblio.org]

    • by Nimey ( 114278 )

      Why do this instead of spinning up an Amazon AWS instance on demand? It shouldn't cost much to convert a source repository with that method, especially compared to buying a whole computer.

    • by leiz ( 35205 )

      For ECC, many modern Pentium and i3 CPUs also support ECC. See http://ark.intel.com/products/... [intel.com] for example - 2 core @ 3.8 GHz for $150, perfect for a single process task. Most i5 and i7s have ECC disabled. At that point, just pay the relatively small premium for the Xeon versions.

    • Fire up an amazon cloud instance configured for number crunching, do conversion, shut it down again. This shouldn't cost more than 50 bucks...

  • by JPyObjC Dude ( 772176 ) on Monday October 20, 2014 @06:47PM (#48190693)

There are many use cases where people will need a centralized version control system (VCS). SVN was written from the ground up to be a best-in-class centralized VCS, and they accomplished this goal while building a very elegant and efficient client-server framework to boot.

    However IMO, SVN is still missing one very important feature: obliterate. This is a huge weakness with large repositories.

I also missed branch-aware metadata in the early versions (I've been using it since 0.8x) and I'm not sure whether this has changed. Git has done a very good job there.

    • by Dredd13 ( 14750 )

There are many use cases where people will need a centralized version control system (VCS). SVN was written from the ground up to be a best-in-class centralized VCS, and they accomplished this goal while building a very elegant and efficient client-server framework to boot.

      this, this, a thousand times this.

      I wish I'd seen your comment before I posted my own similar comment.

  • After going through the process of moving from VSS to Subversion here some 10 years ago, I am in no hurry to change source control again.

    We had to educate a whole bunch of people, many of whom had never used anything more sophisticated than a tarball for version control, on how to use subversion.

    Much as I like DVCS, and in hindsight it would be useful to have some form of it here, it would have to:
    1) Be able to import our large and complex codebase,
2) Cope with the fact that we have one repo with multiple

  • > What he's doing now sort of reminds me of holding a bake sale to build a bomber

OP speaks as if he's witnessed this before

  • So is ESR trying to convert the NetBSD CVS repo in some weird and special way or something, and that's why it failed? Because it has already been converted and is on Github [github.com]; if he needs info on how it was done, there's probably someone on the tech-repository [netbsd.org] mailing list that can help. It's been converted to Fossil [sonnenberger.org] too.
  • by Dredd13 ( 14750 ) <dredd@megacity.org> on Monday October 20, 2014 @10:21PM (#48192051) Homepage

    I wish people would stop pretending that the DSCM model is the "only way of the future". There are plenty of completely valid use-cases for monolithic source control models. For instance, I am a firm believer that configuration management repos belong in a strictly monolithic architecture, with a single source of truth, deterministic version numbering, etc., etc....

    Certainly I could see a case for moving people from CVS to something more modern (but in the same basic vein) like SVN, but here's the thing:

    If their existing SCM application is working for them, and they're happy with it, then it's perfectly fine.

    • A single point of failure is a big problem. The biggest advantage of a distributed system is that the main repo doesn't have to take a variable client load that might interfere with developer pushes. You can distribute the main repo to secondary servers and have the developers commit/push to the main repo, but all readers (including web services) can simply access the secondary servers. This works spectacularly well for us.

      The second biggest advantage is that backups are completely free. If something br

      • by dwpro ( 520418 )

        A single point of failure is a big problem.

        Obviously, that's why you back it up and have fail-over if that's necessary. A single source of truth is a big plus, as is being able to use that single source of truth for code migrations to environments, history for audits, etc.

        The second biggest advantage is that backups are completely free.

        Nothing in this world is free. Using developer machines for backup isn't an optimal (or, IMO tenable) solution if you're serious about business continuity.

It's just that ESR has an old, decrepit machine to do it on. A low-end Xeon with 16-32G of ECC RAM and, most importantly, a nice SSD for the input data set, plus a large HDD for the output (so as not to wear out the SSD), would do the job easily on repos far larger than 16GB. The IPS of those CPUs is insane. Just one of our E3-1240v3 (Haswell) blades can compile the entire FreeBSD ports repo from scratch in less than 24 hours.

    For quiet, nothing fancy is really needed. These cpus run very cool, so you just ge

  • I read through half the comments here before twigging to what's really going on. He's got all of $700 for this project. And we're paying attention to this because???

    • Over $900, and he will match the donations with his own funds so... that's definitely enough for a pretty nice machine. And with the slashdotting, probably a lot more now.

      The bigger problem is likely network bandwidth to his home if he's actually trying to run the server at home. He'd need uplink and downlink bandwidth so if he doesn't have FIOS or Google Fiber, that will be a bottleneck.

      -Matt

Compared to the code you'll find in a CVS or SVN repository these days, the VCS itself is the least of your problems.

  • Go back to the 90s (Score:2, Interesting)

    by HnT ( 306652 )

It is high time Eric goes back to the 90s and leaves real programmers alone. Doesn't anyone else feel at least a little ashamed of how this guy, who gave us the fetchmail plague, always grabs the limelight? The same guy who is notorious for harassing women at conferences and for generally being a nutbag.
    I for one am going to create a whole bunch of SVN repositories now, just because.
