
On the Benefits of Speedy Software, and How It Affects User Perception of Engineering Quality and Overall Usability (craigmod.com) 140

Craig Mod: I love fast software. That is, software speedy both in function and interface. Software with minimal to no lag between wanting to activate or manipulate something and the thing happening. Lightness. Software that's speedy usually means it's focused. Like a good tool, it often means that it's simple, but that's not necessarily true. Speed in software is probably the most valuable, least valued asset. To me, speedy software is the difference between an application smoothly integrating into your life, and one called upon with great reluctance. Fastness in software is like great margins in a book -- makes you smile without necessarily knowing why. [...]

Speed and reliability are often intuited hand-in-hand. Speed can be a good proxy for general engineering quality. If an application slows down on simple tasks, then it can mean the engineers aren't obsessive detail sticklers. Not always, but it can mean disastrous other issues lurk. I want all my craftspeople to stickle. I don't think Ulysses (a popular text editing application) is badly made, but I am less confident in it than if it handled input and interface speed with more grace. Speed would make me trust it more.

  • by QuietLagoon ( 813062 ) on Monday July 29, 2019 @09:58AM (#59005528)
    ... and nothing useful to write about.
    • Actually (Score:5, Insightful)

      by Anonymous Coward on Monday July 29, 2019 @10:06AM (#59005554)

      This is one aspect of why the "best current practice" in "software engineering" is mistaken. There are other reasons, but this is one of them: "Modern" software is slow, big, bloaty, and guaranteed to get worse over time.

      That you don't get it doesn't change the point's validity.

      • and the mantra for the last few years has been "developer productivity", as if the only people who matter are developers and users are an unfortunate irrelevance.

        Hence all the frameworks and tools designed to make a developer's life easier, even though so many are bloated beyond sanity in order to make things just that bit easier to code. (which IMHO always turns out to be a false promise anyway)

      • One of the biggest obstacles to speed is the human equation.

        When writing a program with human interaction, there is a lot of code to make sure the person doesn't do something stupid or unexpected: data validation (on both the client and the server side), security checks, isolating servers so they talk over a rather slow bus to reduce the risk if any one system gets compromised.
        Those quick little programs we did for our college Computer Science Assignments wouldn't work in real life, because people would d

        • I just don't know where to start. How about best practices? I've seen posts on best practices that in general are not even good practices and this is where the new generation of coders are getting their information. Poor use of threading and general misunderstanding of overall design. I'm not going to make a super long list but it goes on and on.

          If you have ever worked with someone who has graduated within the last 10 yrs, you know what I'm talking about.

          • I have worked with people who had recently graduated, and I have been doing this for a while now. One of the jobs when we get a Jr. developer is to show them best practices, and actually be a bit hard on them to get the job done right.
            Oh look, you have been ignoring source control; now you've lost your document. It looks like you are going to have to code it again.

            It is a fine line between letting them make mistakes that they can learn from and being a dictator telling them how to do their job in detail, not allowing

        • by Bengie ( 1121981 )
          There is a fine line between "scope creep" and design gap(defect). Where I work, the design is always at least "what a reasonable person would expect", even if not explicitly required. There is also the issue that "design" can be made extendable. Design is the art of writing code that can have new features added with a minimal amount of code changes. Well factored code is highly correlated with extendable code.
          • Yes, I agree; however, whenever you add hooks in your code to allow expansion, things tend to get slower. "Best Practices" and a lot of the Microsoft tool sets are set on the ideology that you know everything from the final release to years out in deployment, and that a recompile and redeploy is just a button click away.

            • by Bengie ( 1121981 )
              I should have added that there are always exceptions to the rules. A lot of people use the idiom "A good craftsman never blames his tools", but they forget to include "A good craftsman always picks the best tool for the job". If someone forces me to use an inferior tool, I will blame the tool. My department actually makes nearly all of its own tools. We dog food everything until it no longer bothers us before handing it off. We generally make tools for other departments, but until we feel the tool is perfec
      • Yep, I hate with all my strength software built on "Electron" (essentially a browser). The binaries are huge and performance is nothing to write home about. I understand the appeal (using web technologies which are second nature to many devs nowadays and also having code that runs everywhere) but I'd rather avoid them as long as I can
      • Ugh, fastest small computer I've ever used, and yet when I use Word I have to type slowly because it can't keep up with me. You know things are weird when I point to Emacs as being a quick and fast application for editing.

    • by PPH ( 736903 ) on Monday July 29, 2019 @10:21AM (#59005650)

      Thanks to speedy software, he got it over with quickly.

      • Thanks to what must be slow software, I don't have any mod points to mod you up.

      • by epine ( 68316 )

        Thanks to speedy software, he got it over with quickly.

        So many young men these days are addicted to extreme porn that a growing category of E.D. treats young stallions for whom pussy IRL is no longer sufficiently stimulating to trigger any response at all.

        Like much of everything else, good ideas tend to arrive in a Pareto distribution. Your note-taking software had better not hitch in the peak of the flood. But this kind of true urgency is rare on the ground.

        On the other side of the coin, there's always bee

    • He wrote it using the AI code autocomplete software posted about yesterday.

    • by ArchieBunker ( 132337 ) on Monday July 29, 2019 @11:15AM (#59005970)

      How much faster is your computer today than it was in 1999? Orders of magnitude. Gigabytes of RAM and solid state drives with tons of bandwidth. Browsing the internet or loading AutoCAD takes just as long today as it did two decades ago. Why? It's not the hardware.

      • How much faster is your computer today than it was in 1999? Orders of magnitude.

        Correct. So why does it take me the same amount of time to write a letter on a computer nowadays as it did 30 years ago? The majority of the slowness we run into is not the computer being slow. And conflating speed with reliability is just ridiculous; in my experience, it is just the opposite. Programming shortcuts give you speed and a lack of reliability.
    • by Anonymous Coward

      Software responsiveness is very useful to write about.
      USER TIME IS MORE VALUABLE THAN DEVELOPER TIME. Citing your "improved productivity" or even your laziness is not a good excuse to let software performance slide.
      Repeat that until you and every other developer get it through your thick fucking skulls.

  • by AlanObject ( 3603453 ) on Monday July 29, 2019 @10:06AM (#59005552)

    It is called the command line interface. Or the shell if you prefer.

    • Which works very well for a limited range of work. UIs appeared as a direct result of command line being too slow and cumbersome for certain activities. The downside was that UI (which was supposed to make a lot of work faster) gradually became slower and slower, to the point where it is now faster to use command line tools in many cases.

      UI slowness today is ubiquitous and has many reasons, most of them wrong. For example, a news website I access multiple times a day from my mobile device started misbehavin

    • Not so much the command line interface as the design approach it was meant to embody.

      "Do one thing, and do it well"

      Bosses and Execs don't want to deal with a lot of tiny programs, but one big monolithic Enterprise Solution, which is easier for Them to manage, as they just need to work with one vendor.

    • Most places with legacy programs may have those VB6 programs with forms that basically do one function rather well, and are hard to replace because they do what needs to be done so well and so fast.

    • So you're happy to use an unoptimized CLI executable if it takes 10 minutes to run, instead of an optimized one that does the same job with a more efficient algorithm in 10 seconds?

  • by omfglearntoplay ( 1163771 ) on Monday July 29, 2019 @10:08AM (#59005562)

    Consistently fast programs absolutely affect people's moods. Where I work, an upgrade slowed down a piece of software enough that we basically had a revolt on our hands. The devs have never been known to be super competent, more focused on new features and marketing than doing things "right". Meanwhile, they are losing old customers to gain new ones. I think they are scared.

    Anyway, humans are animals. When we press a button, something is supposed to happen quick. If you need to wait 3 seconds, it feels wrong. It is annoying. Even worse if it takes 2 seconds sometimes and 5 seconds another time. When you repeat that process all day long, your brain and body are trying to find ways to do it better and faster, but it's just "randomly" slow for most users. It's like driving in traffic that has sudden stops. It is maddening because you can't get your flow.

    So for the love of humanity and sanity, please make your programs fast.

    • The devs have never been known to be super competent, more focused on new features and marketing than doing things "right".

      This describes probably 80% of extant software.

    • What I find amusing is that you can give a developer a supercomputer with unlimited storage, RAM, and CPU... and they will still make a slow GUI, focusing more on how a box animates when you mouse over it, as opposed to a responsive interface.

      One of the best UIs I've seen was HyperCard on the Mac IIx. Yes, this machine was rare, but HyperCard was originally written to run from an 800-kilobyte floppy on a 68000 machine, so it worked extremely well on a 68030. 30 years later, why can't we duplicate that, o

      • One of the best UIs I've seen was HyperCard on the Mac IIx.

        Another application that was super useful and fast along those lines was Zoomracks [wikipedia.org]

        A Hypercard clone called Vipercard [theregister.co.uk] was launched last year, but it's written in Javascript and feels a bit slow to me...

    • When we press a button, something is supposed to happen quick. If you need to wait 3 seconds, it feels wrong.

      Yeah, this is why I have a tabulator that runs in rounds and gives feedback; but that's still not quite good enough. When you want to tabulate millions of ballots through a method relying on both Tarjan's algorithm and Floyd-Warshall, it takes time. Lots of time.

      ...does it?

      I rewrote my graph generation function to use a straight pairwise hash set instead of physical graph nodes, and that made it much faster and simpler. It also became easy to split and merge: if I generate two graphs on subsets of ballo
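
      For anyone wondering what "a straight pairwise hash set instead of physical graph nodes" might look like in practice, here is a minimal illustrative sketch in Python (this is not the poster's linked code; the ballot format and function names are assumptions). It also shows the split-and-merge property mentioned above: tallies built from disjoint subsets of ballots combine by simple addition.

      from collections import defaultdict
      from itertools import combinations

      def pairwise_tally(ballots):
          # ballots: iterable of rankings, each a list of candidates, best first
          tally = defaultdict(int)                 # (preferred, other) -> ballot count
          for ranking in ballots:
              for i, j in combinations(range(len(ranking)), 2):
                  tally[(ranking[i], ranking[j])] += 1
          return tally

      def merge(a, b):
          # tallies from disjoint ballot subsets merge by plain addition
          merged = defaultdict(int, a)
          for pair, count in b.items():
              merged[pair] += count
          return merged

      ballots = [["A", "B", "C"], ["B", "C", "A"], ["A", "C", "B"]]
      print(pairwise_tally(ballots)[("A", "B")])   # 2 ballots prefer A over B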

      • Interesting... Is your tabulator FOSS? If so, a link to the code, please!

        • Yes and the code is currently uglified [github.com] (it goes back and forth) and needs a rewrite. I'm going to rework how ballots work soon--I tend to move through projects in blocks, and wrote the tabulator in a short time due to experiencing (for the first time in my life) a surprisingly-useful (severe) addiction response to the task itself that taught me quite a lot about addiction that I thought I already knew, but didn't understand in nearly that clarity. Right now I have a few political projects, some papers an

      • by kiatoa ( 66945 )

        The problem with your code is that you've chosen the wrong solution. Ranked voting will forever go through the cycle "plurality is broke" ==> "wow, IRV is cool, let's do that" ==> "crap, that outcome was fucked, this ranked voting stuff reeks, back to plurality." ==> "plurality is broke" ==> ad infinitum.

        The pragmatic solution is approval voting. Simple, combinable (combining results from districts is trivial summation), explainable and adequate. Yep, adequate. Not perfect, not fantastic but ade

        • Ranked voting will forever go through the cycle "plurality is broke" ==> "wow, IRV is cool, let's do that" ==> "crap, that outcome was fucked, this ranked voting stuff reeks, back to plurality."

          Actually, I'm writing a paper about how IRV is broken. There's a minority lock-out problem: if every voter in a mutual majority M prefers each candidate in a set C[M] to all candidates in a set C[m], and a mutual minority m prefers each candidate in C[m] to those in C[M], then, even if the number of voters |M| is less than twice the number of voters |m|, the voters in m still have zero impact on the election. This failure mode occurs in many other mathematical situations not described by this axio

          • by kiatoa ( 66945 )

            Whew! Over 1200 words. Thanks for the detailed response.

            With 100% sincerity I wish you good luck with your campaign to improve or fix the system.

            To quote Richard Branson: Complexity is your enemy. Any fool can make something complicated. It is hard to keep things simple.

            • Complexity is sometimes necessary, and the organization of information often must come in layers to provide the high-level explanation and then the deep details.

              Think about the basic propositions. We have an electoral system which disenfranchises voters by its nature, which is dominated by oligarchy, and which can be manipulated readily simply by adding candidates. I've invented a system which removes all these flaws: we vote in a primary to get candidates representing the views of the span of the popu

    • 'more focused on new features and marketing than doing things "right"'

      That's not your devs, bro - that's management.

  • kids today (Score:5, Informative)

    by bugs2squash ( 1132591 ) on Monday July 29, 2019 @10:10AM (#59005578)

    I think people forget that many things in everyday use of technology used to be near instantaneous. You could answer your rotary dial phone the instant it rang, and latency was undetectable for local calls. Now my mobile phone starts ringing and there are two or three rings before it is able to even display the call answer button, and once in a call the lag is noticeable.

    Similarly, starting and stopping playback was instant, and changing TV channels was instant; all these things take noticeably longer now.

    • I think people forget that many things in everyday use of technology used to be near instantaneous.

      No bloody way. The constantly increasing latency on all kinds of digital interfaces pisses me off every damn day. Like when I touch an icon on the McDonald's kiosk and it takes a full second to respond... that's BS.

      • I was actually going to comment on the McDonald's kiosks because I think they're a good example of how things end up slow.

        We were part of the "beta" program in my city, so we saw the initial version - and it was pretty good. It was super fast UI, didn't ask any superfluous questions, and the "menu" was laid out in a super simple/consistent way. On first use, I could order for me and all my kids as fast as I could think.

        Now the kiosks are terrible. Partly this is a problem with the touchscreens degrading

        • I was actually going to comment on the McDonald's kiosks because I think they're a good example of how things end up slow.

          We were part of the "beta" program in my city, so we saw the initial version - and it was pretty good. It was super fast UI, didn't ask any superfluous questions, and the "menu" was laid out in a super simple/consistent way. On first use, I could order for me and all my kids as fast as I could think.

          Now the kiosks are terrible. Partly this is a problem with the touchscreens degrading - but mostly it's software changes. Over time, they've added fancy animations/scroll effects. They've changed the menu organization to focus on marketing instead of letting people select the food they want. Every transition is animated, and they've added a bunch of unnecessary questions.

          I imagine the first design was done by UX engineers watching how people interacted with the things. Later versions were influenced by branding managers who will never use the kiosks more than once, and by people trying to steer around complaints from 1-in-a-thousand-level morons who really just need to talk to a person.

          I'm guessing the beta kiosks may have also had more robust hardware. It's great to get the demo running, get the bugs sorted out, get some consumer feedback. But then, when it comes to wide-scale deployment, people start thinking real hard about how to decrease the cost per unit. So it runs a little slower, they think; it still works, doesn't it?

        • The problem with most fancy animations is that they look great to the developer/UI designer, but no one really wants them. On web sites they are really easy to add and they look cool, but they just slow down interactions.

          If the fade/ease/transition effects are long enough to be noticed (at least 1 second), they just become annoying. If they are short enough to not get in the way, why bother?

          • If they are short enough to not get in the way, why bother?

            Well, this is the realm where Apple is King.

            Seriously, they have mastered this tradeoff. The reason a lot of Apple's UIs feel and look so smooth and slick is because they've put in tiny little touches like short fades or animations etc... which are virtually unnoticeable consciously, but you can see them if you focus.

            Because they are almost subliminal, they don't intrude consciously, but contribute enough to the experience for it to feel "cooler".

            • The point is not that it feels cooler; it's that, when used properly, it feels faster.

              When you use, let's say, 150 ms for a transition effect, the user perceives it as instantaneous, but the machine has had a reasonable amount of time to do its operations (see the sketch below).

              Usually this is the work of animators and/or front-end developers, who want their work to shine, not to become invisible. So proper implementation requires a lot of case study, fine tuning, user-feedback loops and effort.

              At the same time, "perceive
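
              A rough sketch of that timing trick, assuming an asynchronous UI loop (the function names and durations are illustrative, not from any particular framework): the real work and a ~150 ms transition are started together, so the animation masks the latency instead of adding to it.

              import asyncio, time

              async def play_transition(duration=0.150):
                  await asyncio.sleep(duration)      # stand-in for a fade/slide animation

              async def load_next_view():
                  await asyncio.sleep(0.120)         # stand-in for I/O or layout work

              async def on_click():
                  start = time.perf_counter()
                  # run both at once: the user waits for whichever finishes last,
                  # not for the sum of the two
                  await asyncio.gather(play_transition(), load_next_view())
                  print(f"perceived wait: {time.perf_counter() - start:.3f}s")

              asyncio.run(on_click())
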
    • I miss the instant response of the old fashioned potentiometer volume control on my car radio. Now many of them seem to purposely fade in and out slowly as you turn the knob one way or the other. It seems I always overshoot the volume and have to back it down a little. Does anyone find that a useful feature?

      Sort of off topic: anyone remember the silly kids' game of turning the volume all the way down during a newscast or talk show, then flicking it up and down quickly to produce a funny word fragment?

      • by Falos ( 2905315 )

        I don't think I've encountered fade on physical dials.

        I'll be sure to make a point of being turned on/off (lul) next time I'm buying a new car.

        Imagine if the physical feedback in your steering wheel was laggy.

  • by hey! ( 33014 ) on Monday July 29, 2019 @10:14AM (#59005606) Homepage Journal

    A lot of Windows Vista's problems came from being overly aggressively optimized for speed. One of the things that I hated about it was that operations were *inconsistent*; sometimes something that was perceptually instantaneous most of the time took just a second or so. If it was a common operation like opening a folder in the file manager, an "occasional" performance hiccup would trip you up all day long.

    It reminds me of something I once heard a sewage treatment plant operator say. Question: What do you get if you mix one gallon of sewage with 99 gallons of drinking water? Answer: 100 gallons of sewage. If you mix 99 instances of quick response with 1 instance of slow response, you get the perception of a slow system.

    So *average* speed doesn't really tell you much about the quality of a system. I think if something is fast enough *consistently* that's a better proxy for the underlying quality.
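
    A toy illustration of that 99-to-1 point (numbers made up): the average of the measurements looks healthy, while the single hiccup is what dominates the experience.

    # 99 quick responses and 1 hiccup: the mean looks fine, the worst case does not
    latencies_ms = [50] * 99 + [2000]

    mean = sum(latencies_ms) / len(latencies_ms)
    worst = max(latencies_ms)

    print(f"mean : {mean:.0f} ms")    # ~70 ms -- looks fast on paper
    print(f"worst: {worst} ms")       # 2000 ms -- the hiccup that trips you up all day
    print(f"share of total waiting spent in the hiccup: {worst / sum(latencies_ms):.0%}")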

    • Sewage water is actually a bad metaphor, because actually...

      Question: What do you get if you mix one gallon of sewage with a 99 gallons of drinking water? Answer:

      Actual answer: you get 100 gallons of water that is still considered drinkable, because its pollutant concentrations are all under the threshold values.

      (And the less scrupulous actors in the food industry actually bet on this).

  • by Anonymous Coward

    he lost all credibility with me when he implied there's a valid case for $1000 gardening shears vs. $150 ones. Even $150 gardening shears is an insane premium.

  • by JoeDuncan ( 874519 ) on Monday July 29, 2019 @10:26AM (#59005672)

    Response latency is the *one* metric where computers have been getting consistently SLOWER over the years:

    https://danluu.com/input-lag/

  • Is there absolutely nothing happening in the tech world today? Why can't some people be more succinct?

    • It's almost like domination of the software industry by a finance cartel and a handful of monopolists has already stifled tech innovation. Good thing that's UNPOSSIBLE...

  • ...at least in software. They will happily spend an extra $200 for a CPU, graphics card, or memory if it promises a 10-20% speed improvement. But they won't pay $10 for a software product that is twice as fast as (or faster than) the one they got for free. I developed a new kind of database that is twice as fast as Postgres, SQL Server, or MySQL at query speeds (without needing separate indexes). You would think that people would be jumping all over that kind of thing, but it is like pulling teeth to get people to even try it.
    • Some of us are looking for your Ph.D. thesis that explains how it's faster.

      • Write it in C and ASM instead of a virtual python instance inside a docker container.

      • by Pascoea ( 968200 )

        Some of us are looking for your Ph.D. thesis that explains how it's faster.

        We're also looking for a link to the project. You would think that if a person were making bold statements like "it's twice as fast as SQL Server", and bitching about how nobody will use it, they'd at least post a link where we could go take a look at it.

        • My homepage has links to videos where I demonstrate it and to blog postings about it.
          • Well,

            I just tried to check your DB, but Didgets.io just returns a nice 404 (Slashdot effect?). I will try again in a couple of days.

            That said, IF your DB is MySQL compatible, so I don't have to port all the queries, and it works on low-end machines (a Raspberry Pi, or even containers inside one), I will become an evangelist.

            Anyway, remember that a good product is just like a good idea: without sales it leads to nothing.
      • I am not an academic who writes papers all day. I am an old-time programmer who grew up with slow CPUs, limited memory, and high latency for storage devices. I love algorithms, so I rework the code until it is as fast as I can make it. I program in C++ and don't use bloated libraries just because they speed up development.
        • don't use bloated libraries just because they speed up development.

          That's the key right there.

          Clients pay for time; quality is secondary.

          If I quote 20 hrs with a popular library as a linchpin (but with a couple of quirks), or 80 hrs with an in-house library, 9/10 times the client will go for the cheaper option, performance be damned.

          • I also take great pleasure in tackling something that takes 10 or 20 seconds (or longer) to run against a large data set and finding a way in software to do it much faster without simply throwing another dozen cloud servers at the problem (and the costs associated with them). I am afraid that this is a dying art.
        • I program in C# because it's faster and more efficient in many cases than C++ or C, as it can self-optimize from runtime statistics. Still, the speed of the code path is less of a problem than the algorithm: if an algorithm requires many, many operations per unit of data, expanding exponentially for linear data growth, then that algorithm will take a long time versus one which grows linearly per unit of data.

          I've found, oddly, that adding more encapsulation and obvious overhead caused reductions of up to 70% of

          • This is one of the secrets to my system. It is a columnar store that de-duplicates data. If I have a billion row table and column 4 only has 10,000 distinct values, then I only have 10,000 objects to look at (not a billion) with respect to that column. You still have to keep track of all the keys each value is mapped to, but it can be much faster than conventional systems. I also don't need a separate index for each column to speed things up (or technically you could say that each column is nothing but an i
            • 10,000 distinct values, then I only have 10,000 objects to look at (not a billion) with respect to that column

              Yeah, typical index stuff has a hash table that uses a hash to index into a bucket, which may contain one or many values, and those values get iterated. Each leaf in this short tree lists all the rows with that value. Iteration of a huge table of duplicate values is fast, and a pointer to one of 4 billion unique values must be 32 bits. Looking up a row with its indexes and then seeking to all the deduplicated values requires extra IO operations. Modern machines can handle more IOPS, so that can be done kin
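
              A simplified sketch of the dictionary-encoded column idea being discussed (illustrative only, not the poster's actual product): each distinct value is stored once and mapped to the row ids that contain it, so a filter is evaluated once per distinct value rather than once per row.

              from collections import defaultdict

              class DedupColumn:
                  def __init__(self):
                      self.rows_by_value = defaultdict(list)   # distinct value -> row ids

                  def append(self, row_id, value):
                      self.rows_by_value[value].append(row_id)

                  def where(self, predicate):
                      # evaluate the predicate once per distinct value, not once per row
                      hits = []
                      for value, rows in self.rows_by_value.items():
                          if predicate(value):
                              hits.extend(rows)
                      return hits

              col = DedupColumn()
              for row_id, status in enumerate(["open", "closed", "open", "open"]):
                  col.append(row_id, status)

              print(col.where(lambda v: v == "open"))   # [0, 2, 3]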

              • We are a little startup that did not get a few million dollars in VC money or find a dozen engineers willing to work for free for several years, so of course we are missing some important features. If no one will use our software until every checkbox you can think of has been implemented and tested fully, then we will never get off the ground. If instead, companies are willing to support our efforts as we can add value in some critical areas, then we will be able to fill in the missing pieces over time.
                • And that's your problem: businesses have to take risks with your product being supported, and have to integrate with your product instead of some other product.

                  PostgreSQL's license (being so permissive as to allow you to sell a modified version and to not release your source code, if that's your business strategy) allows you to circumvent that essentially by using an existing RDBMS to produce the service product while applying your new invention to the method in which the service product stores data.

                  Th

  • The more abstractions you throw on top of something, the more time it takes to do anything. Single page web applications are a really good example of this... you hit a UI element and it takes a perceptible amount of time for the page to call the function that calls the framework that calls 800 other elements to produce the message telling the user something is happening. No end user is going to realize that their one action triggered that crazy amount of interaction on the back-end... they just see that they push

  • Alas I've seen a trend where product designers and front-end engineers like to add animated transitions to UI elements. I find it just slows me down, waiting for the software to be done being clever. I do think animations have a place, when the computer wants to draw my attention to something. However if I initiated the action (clicked on something, etc) I don't really need to wait for that to "take effect" unless there's some slow action behind the scenes which is being reflected by the animation.

    Please: b

  • Than to make a fast program work.

    (from Karl Nyberg)

  • I often work with Ardour, an open source DAW. Internally it has hard real-time deadlines of 2.7 ms or less. It's a pleasure to use... on OS X. On Linux, even with an RT kernel, it's no longer as fun: Linux has just made too many compromises for the datacenter to be anywhere near as snappy as the OS X version. Ardour USED to be pretty good on Linux, but no longer. I was really sad to finally give up on it and move to OS X for my DAW needs, but...
  • ... the software automatically gets better. To my ear, there's something suspect in this simplistic model of "quality".

    Does the user's perception of quality increase when the software is faster? Yes. A graph of the perception of quality vs. speed will slope upward rapidly at first. But it eventually hits a saturation point where increases in speed are no longer perceived as an increase in quality. The problem is that this perception curve is easily saturated using the capabilities of modern hardware with o

    • by djinn6 ( 1868030 )

      There are also features competing for developer time. When the user needs to do X, and it's not possible in the software or is currently done manually (which could take a few minutes to a few hours), would the user consider the software higher quality if it did that work for them, even if it took 10 seconds?

  • My bank recently changed the software pushing its ATM GUI. It now has modern-looking "flat" boxes that float or glide in from the sides like a PowerPoint transition effect or something. I can tell they meant well. But it's REALLY FUCKING SLOW. I need a pile of cash. It doesn't have to be pretty. If it's between a monotone, no-frills GUI on a tube display that runs fast, and a fancy flat screen with pretty graphics that runs SLOW AS MOLASSES IN JANUARY, I'll take the former.

    Thing is, it's probably only becaus

    • by Dunbal ( 464142 ) *
      Hate to break it to you but the bank really doesn't give a damn what YOU want. They already have your money. This was about someone impressing someone else at a meeting.
      • Hate to break it to you but the bank really doesn't give a damn what YOU want. They already have your money. This was about someone impressing someone else at a meeting.

        I'm sure that might be partially true. I can always take my pile of cash to another bank. Am I prepared to do that because of the GUI on the ATM? No, not at this time. But it is slow software, and I do think it sucks, so it kind of fits the theme of this discussion.

  • Someone to speak for the trees.

    Unfortunately, it is a doomed project. The single biggest determinant of speed in software is deciding what NOT to do. If you're going to throw all the features in, you are going to be slow, partly because all those features require flexible and general architecture, rather than focussed architecture, and partly because nobody has time to make things fast.

    As a secondary issue, far too many software engineers equate speed with raw horsepower, when actually it's about latency.

  • "Even now, I’m writing this in Ulysses. Ulysses works well for organizing large-ish bodies of writing. The organization is mostly of why I use it."

    Well, we know you don't use it for the grammar checker.

  • I like speedy software because I can (more easily) trust that it's doing what it's supposed to do, and not enabling features that I not only don't need but which get in the way of getting productive work done. And that's not even counting software actively spying on me.

    The latest word processor runs slower on my current hardware than Word for Windows 95 did on a machine less than a tenth as powerful, yet offers me nothing in the way of additional features that I or 99% of users would actually want.

    I u

  • Software doesn't get fast by accident. Whenever it operates smoothly, it means that the creator took the time and effort to make it work well. If they took the time to do that, it's highly likely that they also took the time to work the bugs out.

  • by Tony Isaac ( 1301187 ) on Monday July 29, 2019 @11:14PM (#59010182) Homepage

    I hate using Skype for Android. It is DOG slow, whatever that means. Why does it take 10-15 seconds to show me the message it just notified me about? WhatsApp and Google Messages are far faster. Unfortunately, my company uses Skype, so I'm stuck. It's no wonder everybody is migrating to Slack!

  • Slow software is probably slow because it's doing too much. It's a resource hog. It's like having a car that seats only two passengers but takes up two parking spaces due to poor design, whereas another would take up one space, for example by removing the completely useless wings. On the other hand, software that runs too fast might not be doing something it's meant to do. Anyone who has optimised software will attest to this. Accidental no-ops can be as much of a problem as accidental sleeps.
  • ... article that circulated as a memo in the company becomes relevant: http://seriss.com/people/erco/... [seriss.com]

    Every single time my Mac, Excel or Jira are doing something simple and it takes them ages, I recall that one article.

    I remember OS X 10.4 used to be blazing fast on my white MacBook, a Core 2 Duo with 1 GB of RAM. I used Office, Photoshop, InDesign and Final Cut to edit videos smoothly. Nowadays OS X won't even boot with 1 GB of RAM. We are doing something wrong.
