Software | The Internet | Technology

Developer's View: Real Life Inspirations Or Abstract Ideas? 144

StormDriver writes "According to writer Marc Prensky, most of us come from a generation of digital immigrants. It basically means the modern web developed during our lifetimes: it is a place we migrated to, discovering its potential. But people aged 20 and younger are not like that at all. They are digital natives; they've spent their whole lives here. 'Hey, let's do a digital version of our college facebook' is a digital immigrant's idea, just like 'Hey, let's make something like a classifieds section of a newspaper, only this one will be online.' Or 'Hey, let's make an online auction house.' 'Hey, let's make a place for online video rentals.' The thing is, recreating items, ideas and interactions from the physical realm on the Web has already run its course." To me, this sounds like the gripe that "Everything that can be invented, has been invented." There are a lot of real-life services and experiences that have yet to be replicated, matched, or improved upon in the online realm; I wouldn't want people to stop taking inspiration from "old fashioned" goods as starting points for digital products.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by PeanutButterBreath ( 1224570 ) on Thursday February 23, 2012 @01:40PM (#39139083)

    "Hey, lets make another Facebook, only more betterer!"

    • by PatPending ( 953482 ) on Thursday February 23, 2012 @01:47PM (#39139191)
      Fortunately there's no need to improve upon Slashdot.
      • Re: (Score:2, Funny)

        by Anonymous Coward

        Famous last words.

        Before modding me down into oblivion, note that I am making this statement in a general sense. Although I currently have no suggestions for improvement, eventually there will be some change that will make it better.

        • by Ihmhi ( 1206036 )

          Famous last words.

          Before modding me down into oblivion, note that I am making this statement in a general sense. Although I currently have no suggestions for improvement, eventually there will be some change that will make it better.

          Oh, I do, although it's a generic one - how about making Slashcode less of a buggy, laggy mess? Let's go one better: trash Slashcode completely and rewrite a new system from scratch that actually works efficiently.

It's not as if we'd lose much - you already can't post in an article after a set period of time. Everything would be automatically archived; migrate the user accounts, and bam, done.

I'm curious if the threaded discussion for a Slashdot article with a lot of comments would fit in the 16MB record constraint in MongoDB. That, and a porting project to Node.js. Does Slashdot offer a data sample for testing Slashcode against?
            • Why would you store all the comments for an article in the same record? Makes for update/clashing hell. Just store the article id and reply-to comment-id along with each comment, and let the db/caching take care of the heavy lifting.
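A minimal sketch of that one-document-per-comment layout, using PyMongo; the database, collection, and field values below are invented for illustration. The 16MB cap then applies per comment, never to the thread as a whole:

```python
from pymongo import MongoClient

client = MongoClient()                # assumes a local mongod instance
comments = client.slashcode.comments  # hypothetical db/collection names

# One document per comment: no single record ever nears the 16MB cap,
# and concurrent replies never contend for the same document.
comments.create_index([("article_id", 1), ("parent_id", 1)])
comments.insert_one({
    "article_id": 12345,              # illustrative IDs, not real data
    "parent_id": None,                # None marks a top-level comment
    "author": "example_user",
    "body": "First post!",
})

# Rebuilding a thread is one indexed query; the tree is reassembled
# in memory from the parent_id links.
thread = list(comments.find({"article_id": 12345}))
```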
      • by fyngyrz ( 762201 ) on Thursday February 23, 2012 @01:58PM (#39139311) Homepage Journal

        ...that the low-hanging fruit has been picked.

Personally, I'm looking forward to the next real game changers. Some that might qualify: real AI and robots; ultracaps capable of replacing batteries; political landscape shifts, such as the adoption of the idea that the communications infrastructure is as important as, and for the same reasons as, the transport infrastructure, with associated rights of passage and removal from commercial interests, just as private toll roads are almost unknown today; a space elevator or other means of inexpensive space travel; a confluence of insulation, local power generation, and storage to free the "average" home and vehicle from the power grid and oil interests; real 3D display technology... Web innovations are rarely, at least recently, of a great deal of interest to me. Maybe it's just me, though.

        • lol, I've made ultracaps capable of replacing batteries - currently setting up a production line - thanks for the encouraging words, dealing with China's Rare Earth policies is daunting :)
        • by bzipitidoo ( 647217 ) <bzipitidoo@yahoo.com> on Thursday February 23, 2012 @02:36PM (#39139711) Journal

          Low hanging fruit has been picked? Not hardly!

XML sucks, HTTPS was never as good as SHTTP, and we don't have much experience with HTML5 yet, so we don't have a good feel for its limitations. IPv6 certainly won't be the final word in network protocols.

Plenty of room for more new programming languages, too, since coding is still awkward and painful. I learned of an interesting one called Racket just last month. Perl 6 is still being developed. The latest revision to C/C++ just came out last year.

Code reuse is still incredibly messy, and languages still mostly reinvent the wheel, recreating libraries natively because it's still too hard to call library functions written in a different language. Java is especially bad there. The whole OOP thing with CORBA, and marshaling or serializing, never really caught on, and we still use an awful lot of plain old C library code with wrappers. We could really, really use a good standard for creating and interfacing with library code. And SWIG? Bleh, major code bloat. Imagine what it would take to persuade OS developers to migrate away from C to a better language. C is old, but some still use Fortran and even Cobol.

On the hardware side, we're still stuck with all kinds of legacy ugliness going all the way back to the original PC design. Be glad they at least anticipated there could be more than one operating system, and designed the hard drive I/O routines to work with partitions.

          "Real" AI and space travel is sexy. But there's a great deal of more mundane work to do. It won't be glamorous but it will be a big help.

          • by CastrTroy ( 595695 ) on Thursday February 23, 2012 @02:47PM (#39139859)

            HTTPS was never as good as SHTTP

A real problem is that I have to send my credit card credentials to a website in order to buy something. The real fix would be for me to be able to buy something off a website without sending them my credentials. Instead, I'd only have to communicate to my credit card provider that I authorized the site to charge a certain amount to my credit card, and have the money transferred, without the site knowing the information on my card. The same could be done for recurring payments: authorize the merchant to charge a certain amount against my card every month, without them actually having to know my credit card information. This is kind of like PayPal, but without the middleman.
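What's being described is essentially a single-use authorization token issued by the card provider instead of credentials handed to the merchant. A toy sketch of the idea; every name here is invented for illustration, and a real scheme would be a signed three-party protocol rather than a shared dictionary:

```python
import secrets

# Toy in-memory "issuer": maps single-use tokens to the charge they approve.
_authorized = {}

def authorize(merchant: str, amount_cents: int) -> str:
    """Cardholder asks the issuer to approve one charge; receives a token."""
    token = secrets.token_urlsafe(16)
    _authorized[token] = (merchant, amount_cents)
    return token

def redeem(token: str, merchant: str, amount_cents: int) -> bool:
    """Merchant presents the token; it pays out once, and only as scoped."""
    return _authorized.pop(token, None) == (merchant, amount_cents)

# The merchant never sees a card number:
t = authorize("example-store", 1999)
print(redeem(t, "example-store", 1999))   # True: the charge goes through
print(redeem(t, "example-store", 1999))   # False: the token was single-use
```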

            • by zidium ( 2550286 )

              Paypal!!

Sounds more like Facebook Connect or Twitter's OAuth. A great idea - already taken by PayPal, technically - but I would love to see it adopted more universally.

Indeed, giving someone your credit card details so that they can then make a withdrawal is a broken model from the start. The PayPal solution, where you push money, makes far more sense. Given the amount of online fraud, I have always been surprised that banks haven't pushed their own PayPal-style solution.
              • by acooks ( 663747 )

                The credit card companies have designed the system in such a way that they never carry the risk for fraud. It's brilliant - they're basically printing money.

                The consumer was "safe" on the old card-not-present system in the sense that the merchant has to refund the payment when the consumer cries fraud to the card company. If enough consumers cry fraud, the payment processing gateway (another middle man) may decide to stop the merchant's transaction processing.

                The new MasterCard SecureCode and Visa 3-D Secur

            • by smelch ( 1988698 )
              How would they know how to identify the account to charge? Some kind of identifier, like a 16 digit number?
          • by Anonymous Coward

            Code reuse is still incredibly messy, and languages still mostly reinvent the wheel, recreating libraries natively because it's still too hard to call library functions written in a different language. Java is especially bad there. The whole OOP thing with CORBA, and marshaling or serializing, never really caught on, and we still use an awful lot of plain old C library code with wrappers. We could really, really use a good standard for creating and interfacing with library code.

It exists. It's twenty years old. It doesn't require marshalling. It allows object-oriented components to be reused in almost any language. It's in-process COM.

            • by dkf ( 304284 )

              It exists. It's twenty years old. It doesn't require marshalling. It allows object-oriented components to be reused in almost any language. It's in-process COM.

              It has a butt-load of problems in that it requires particular models of type systems and memory management. It's also a problem with respect to security boundaries. Managed code systems such as Java or C#/.Net do better (within their restrictions) and you actually do see a lot of code reuse in the wild now.

The deep problem doesn't go away though: a good reusable component needs a properly considered API. You can't just use a chainsaw on some poor program and get a nice set of organs, ready to trans

COM is a stupid idea; this is why no one outside Windows does anything like that. The only reason it exists on Windows is that Microsoft developers believed it opened some kind of bright new object-handling future that they would be lords and masters of. It doesn't; the whole approach is stupid and dangerous, and it must be abandoned along with everything else Microsoft originated.

          • by smelch ( 1988698 )
            "XML sucks"

            Really? You think so? You have any more complaints from 5 years ago? Use JSON or binary serialization, then stop complaining.
JSON sucks too. Binary JSON doesn't address the problem that JSON is fundamentally hierarchical and small-time; it just packs the data in better. BSON, as used in MongoDB? Better than JSON, but still bad. Try to format several terabytes of data as JSON and use it, and you'll see how bad even binary JSON can be. HDF5 was made for huge amounts of data, but has its limitations too.

              Maybe you think YAML is the answer? YAML is a superset of JSON, in case you didn't know. And no, sorry, still very wastef
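The waste is easy to measure for bulk numeric data, where a text encoding spends many bytes per value on digits and separators while a fixed-width binary layout spends exactly what the type needs. A quick standard-library comparison:

```python
import json
import random
import struct

values = [random.random() for _ in range(100_000)]    # stand-in bulk data

as_json = json.dumps(values).encode()                 # text: ~18 bytes/value
as_binary = struct.pack(f"<{len(values)}d", *values)  # exactly 8 bytes/value

print(len(as_json), len(as_binary))
# The text form is roughly twice the size before any parsing cost is paid;
# at terabyte scale that overhead, not readability, is what dominates.
```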

      • Fortunately there's no need to improve upon Slashdot.

        Seriously, the more time I spend on the rest of the web, the more convinced I am that this statement is absolutely true.

    • Re: (Score:2, Offtopic)

      Comment removed based on user account deletion
    • With blackjack and hookers?
    • by RKBA ( 622932 )
      "Hey, lets make another Myspace, only more betterer! We will call it Facebook."
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday February 23, 2012 @01:43PM (#39139117)
    Comment removed based on user account deletion
Hate to point this out to you, but the article isn't talking about "you and your fellow millennials"; you're about 10 years too old.

      • by DrgnDancer ( 137700 ) on Thursday February 23, 2012 @02:18PM (#39139509) Homepage

I also hate to point out that these articles are constantly pushing back the age at which people become digital natives. People the OP's age (also my brother's age) were "digital natives" when these articles were first being written 5-10 years ago. Now they're too old to be real digital natives. What constitutes this mysterious age group? My brother has had a computer in the house literally as long as he can remember. My parents got the first computer in our house in 1986 when I was 12 and he was 2. Unless his lack of opportunity to Google "learn how to walk and talk" disqualifies him, he's a digital native.

        Basically they moved the bar from "had a computer since infancy" to "had access to the Internet since infancy" after they realized that the first "digital natives" were no better with computers than anyone else. At some point they'll realize that having had access to the Internet since infancy isn't enough to magically impart computer skills either and they'll move the bar again. Ten years from now they'll be talking about the kids in their early twenties who have had access to mobile computing devices since infancy and are the real "digital natives". Meanwhile some tech will still be cursing while he cleans up the viruses on some "digital native's" computer because the kid is no smarter than any other kid and got his box owned looking for porn.

There's no magic in being younger. To do anything more substantial than basic word processing and web browsing you still need a combination of mindset and training whether you're 15, 25, or 65. The basics are somewhat easier if you've been around computers all your life, but beyond that I don't think it matters much.

        • by nine-times ( 778537 ) <nine.times@gmail.com> on Thursday February 23, 2012 @03:18PM (#39140151) Homepage

          You're right, they keep pushing it back, and I'm guessing it's for 2 reasons:

1) The concept of "digital" keeps changing. People who are 35 now may have grown up with computers already, but they didn't grow up with smartphones and Facebook.

          2) Journalists keep predicting that the world will change once the "digital natives" start taking over, and they're not willing to give up the idea. As the "digital natives" enter the workforce and no one is observing the magical effects everyone predicted, they say, "Well these aren't true digital natives! Wait until the real digital natives arrive!"

The fact is, most people under 20 don't understand computers very well. They're comfortable using them, but they have even less understanding of what's going on than earlier generations, who had to do a lot more manual configuration.

People who are 35 grew up with computers which would be completely unusable by anyone who grew up with modern computers. :) I tried giving an 800MHz computer with 512MB RAM to an 18 year old relative who was expressing an interest in computer science, as it runs Linux very well. He essentially said that he'd never seen a computer with RAM measured in less than gigabytes, or with a processor whose speed was not measured in gigahertz. And no, "0.5" or "0.8" don't count. My first computer was a 12MHz 286 with a 125MB hard drive and like 2 MB of ram - and it was super badass at the time, with DOS 4.

            • My first computer was a 12MHz 286 with a 125MB hard drive and like 2 MB of ram - and it was super badass at the time, with DOS 4.

I'm not sure we want to start this game on Slashdot, but I got you beat. My first (IIRC) was a 4MHz 8088 with 512K of RAM.

        • by clodney ( 778910 )

There's no magic in being younger. To do anything more substantial than basic word processing and web browsing you still need a combination of mindset and training whether you're 15, 25, or 65. The basics are somewhat easier if you've been around computers all your life, but beyond that I don't think it matters much.

I think there is more to it than that. It is true that a child's brain is wired differently than an adult's. Very young children can learn multiple languages simultaneously and speak them all like a native, something that is very difficult for adults. Not sure how far into adolescence/adulthood those physiological changes persist, but it is at least a factor in early childhood development. And from a psychological perspective, growing up with something means you take it for granted in a way that is diff

      • by wanzeo ( 1800058 ) on Thursday February 23, 2012 @03:06PM (#39140035)

Ok, try on this car analogy. Cars have been around for 100 years. They have been a defining aspect of our culture for 60 years, and nearly 100% of people know how to drive. And yet, only a very small subset of the population has any idea of how a car works.

        I don't see how any other technology should be expected to be different. People who grow up on computers and the internet will either take an interest in them because they are curious, or they will treat them like an appliance and have occasional problems, just like people have problems with their cars.

        I would use caution when attributing characteristics to "generations", the effects of individual personalities seem much stronger.

Given the crap that I hear kids spewing on the train or bus, his point still applies to teenagers. Some are technically inclined and some are as dim, when it comes to computers, as my parents.
You're 28? Well, you're 8 years past the point the article is talking about.

    • I very much agree with this. While there is a certain percentage of the population that does understand computers, there's a surprising number of people who just don't get it. I know 20 year olds who still get viruses all the time, even though they've been using computers their entire life. Not only that, but basic operating skills aren't even there. Switch up the UI a little bit, and they are completely lost. They know very little of how to actually operate computers.
    • I agree that the idea of a "digital native" is a bit silly, but for slightly different reasons.

      The article suggests that most computer services are inappropriate metaphors of real-world services. There's a hulking great elephant in the room that's being ignored: almost all computer interfaces are inappropriate metaphors of real-world services. How can these so-called "digital natives" be native if the computers all "speak human" rather than having the users "speak computer"?

      I don't consider myself a "digi

  • by SmallFurryCreature ( 593017 ) on Thursday February 23, 2012 @01:43PM (#39139123) Journal

    The inventors are a handful of people at any time. The rest are consumers.

Once, if you had a computer you had to know its ins and outs. Same as people who once drove cars or flew planes. Nowadays cars just work and people barely know where to put the fuel in; cue people putting in the wrong fuel. No owner of a Spyker would ever have done that: they KNEW their car and its needs.

There will be new inventions made by old and young people, but what they all have in common is that they don't just consume whatever tech is available in their time; they think about what is lacking or missing. The man who made lighthouses saw how his wife was cooking and made a better stove. Simple as that. Could easily have been her son as well. Or a grandpa watching his granddaughter. Inspiration comes from looking at the world and not just assuming, but questioning. And no, kids are NOT better at it. If they were, they would be far harder to teach.

    • by Fned ( 43219 )

      It's the consumers that decide which inventions matter... and an invention which is "like X, but online" might not resonate as well with someone who's never seen X not online.

  • by Kenja ( 541830 ) on Thursday February 23, 2012 @01:48PM (#39139203)
All these metaphors and hubris over the years: "who will build the off ramps from the information super-highway into the digital ghetto" etc.
  • I'm working on creating a digital mouse trap.
  • by Anonymous Coward

The divide between those who grew up with the internet and those who did not is huge. You can see it in policy debates and philosophical problems: the previous generation thinks the internet is a neat toy, while the native generation thinks it is as essential as the air we breathe. The previous generation thinks intellectual property is a big deal, where people can and should control/profit from their ideas, while the native generation could not give the slightest care to these antiquated notions of copyright

    • Re: (Score:3, Insightful)

      by Pope ( 17780 )

      Sounds like these "digital natives" as you describe them have a monstrous sense of entitlement. "I want everything and I want it MY way and I want it NOW."

      • by gutnor ( 872759 )

They are also too young to need money. When they start looking for work and feeding a family, their enlightenment will dim pretty fast.

We have had that before; remember the hippies, flower power and everything (or the May 1968 generation in France)? All this 1% business, financial crisis, threat against the middle class, ... has been happening under their watch. The society they now control has values almost diametrically opposed to the ones they promoted in their youth.

The divide between those who grew up with the internet and those who did not is huge. You can see it in policy debates and philosophical problems: the previous generation thinks the internet is a neat toy, while the native generation thinks it is as essential as the air we breathe.

      Those who "grew up with" (e.g., had at a very young age) access to the internet barely show up in policy debates, as most of them aren't even old enough to vote.

      The big thing is that we still haven't collectively decided what th

  • by Sloppy ( 14984 ) on Thursday February 23, 2012 @01:51PM (#39139241) Homepage Journal

    Anyone else notice how the summary excerpt started talking about the web's recent appearance, and then segued into talking about "digital" as though it were a synonym, thereby implying "digital" tech appeared fairly recently?

    TFA is just one example of "digital"'s abuse, but it's ubiquitous. That word is now so rarely used in any connection with its meaning, that I think hackers have bricked the word from our language. That is so gay!

    • I see what you did there. Don't think I'm not on to you. ... because I am.

But in wholehearted sincerity I cannot help but agree with you. Much like the chemist who decided to put an end to 'comparing apples and oranges' by carrying around a comparison of the IR spectra for both [improbable.com] (and showing not only that they were directly comparable, but also that they are obviously quite similar chemically), I propose each software engineer carry around an abacus, just in case the day comes when 'digital' is used as a sy

    • TFA is just one example of "digital"'s abuse, but it's ubiquitous. That word is now so rarely used in any connection with its meaning,

      Yes, it seems to just mean "on the internet" or "on a computer" now. I've seen DVDs for sale with the claim "includes digital copy". Silly me with my old analogue DVDs...

Yes, it includes a secondary digital copy for playback on devices other than a PC or your DVD player. What's so hard to understand about it?

        • Digital is not the defining characteristic of that secondary copy, as the main DVD content is also digital. So the adjective is useless.

  • by ScottyBlues ( 310677 ) on Thursday February 23, 2012 @01:56PM (#39139295)

When the electric guitar was first invented, it was played just like an acoustic guitar but with amplification. Later, artists like Jimi Hendrix came along and played it like it was a fundamentally different instrument. I think that a similar cycle is likely going on with the web, as the original article says. Like the electric guitar, the web has ways of "playing it" that are fundamentally different from the non-web counterparts. The best innovations have come and will come not from porting non-web faculties, but inventing new ones that could not exist without this medium.

History repeating (Score:4, Insightful)

    by Ukab the Great ( 87152 ) on Thursday February 23, 2012 @01:56PM (#39139299)

Since the 1970's, every generation has independently invented disco and thought they had something new. Donna Summer, techno, Lady Gaga, etc. But eventually they get over it.

    • public dance halls for dancing to pop music are WAY older than the 1970s, young whipper-snapper. my grandma went to the discotheque of her day in the 1920s
  • The latest trend seems to be to abandon the desktop metaphor and make everything look like an iPhone. Big graphical buttons arranged in a grid, easy scrolling, no overlapping windows.

    • by Sique ( 173459 )

      This sounds awfully like Microsoft Windows 1.02. You know, the Comdex 1985 version.

  • Eh. (Score:5, Insightful)

    by fuzzyfuzzyfungus ( 1223518 ) on Thursday February 23, 2012 @02:02PM (#39139351) Journal
    Anybody who uses the phrase "digital natives" without a heavy dose of irony can usually be safely ignored.

Are there cases where dragging physical metaphors into computing is brutally-old-and-busted? Sure; but making MP3 players with UIs consisting of elaborate (but non-resizeable) bitmap renditions of 1970s stereo gear was a moronic idea back in the 90s, just as it is now. Outside of agonizing over-literal nonsense like that, 'real life inspirations' seem to take two forms, neither obviously outmoded:

1. Remnants in name only: Your email client likely still has an 'inbox' and an 'outbox' because, at some point, somebody actually had two boxes on their desk. Guess what, it doesn't matter. The computerized abstractions have gained so many features (instant search, threading, sort-by-whatever-you-want, etc., etc.) that they bear almost no relation to their physical counterpart. They have to be called something, so the legacy name is harmless enough.

2. Borrowings that make sense because people want them: Y'know why stuff exists in 'real life'? Because people wanted it. If they wanted the dead-tree version, they will probably want an electronic one as well. Once that gets built, it will eventually be polished (having features added and archaisms removed) until it moves into category #1.

This argument also seems to implicitly overstate the number of things that are somehow fundamentally digital. There are a lot of (mostly failed) ideas involving the dissemination of information in surprisingly modern ways within the constraints of antique media. Making variants of these ideas actually not fail this time will be a change; but it won't be one fundamentally tied to the internet (in anything other than an economic sense).
Are there cases where dragging physical metaphors into computing is brutally-old-and-busted? Sure; but making MP3 players with UIs consisting of elaborate (but non-resizeable) bitmap renditions of 1970s stereo gear was a moronic idea back in the 90s, just as it is now. Outside of agonizing over-literal nonsense like that, 'real life inspirations' seem to take two forms, neither obviously outmoded:

There is one important change that happened with media players: the interface kept its "1970s stereo gear" look but evolved in various ways -- the "Stop" button disappeared, and "Play" now shares space with "Pause". This reflects improved understanding of playing media files -- there is no point in specifically telling a player that you are not going to resume playing something, and there is no additional effort necessary to combine two mutually-exclusive buttons. The idea of playlist (and "Previous"/"N

  • by __aamdvq1432 ( 2566673 ) on Thursday February 23, 2012 @02:08PM (#39139421) Journal
    From the article: "It’s time to embrace digital natives and give them something cool, that doesn’t try to imitate existing concepts." Maybe. There's still a huge, wealthy immigrant population that has lots more dough than the natives. Before I set about catering to either group, I need a business model. "Something cool" may be part of it - I won't ignore native sensibilities about "coolness." Something saleable will be a larger part, whether conceptually imitative or not.
    • As a digital immigrant, I request you create me a digital lawn which I can instruct the natives to stay the hell off of.
  • 1976 (Score:4, Insightful)

    by poena.dare ( 306891 ) on Thursday February 23, 2012 @02:14PM (#39139473)

    "Dammit, all these charts a tables need to be on a computer!" -- me in 1976 playing D&D.

    I would argue that the "immigrants" have a more pressing desire to innovate because they felt the crushing limitations of the non-virtual world first hand.

  • by na1led ( 1030470 ) on Thursday February 23, 2012 @02:17PM (#39139499)
I remember in the early 80's, computers were considered a novelty. No one needed one to run their business, and no one cared about knowing the world's problems every hour of the day. Today, people can't live without email or a cell phone. At my work, employees go nuts if their calendars and emails aren't syncing with their smart phones.
  • by gurps_npc ( 621217 ) on Thursday February 23, 2012 @02:19PM (#39139517) Homepage
    All dating websites SUCK.

    Not just for geeks, but for everyone.

    They suffer from non-participating members, and spam.

There is definitely room to create something new and better.

  • by elrous0 ( 869638 ) * on Thursday February 23, 2012 @02:29PM (#39139631)

I'm sorry, but the college kids that I've been around don't strike me as being particularly internet savvy at all. For all this hype about "They were born on the internet, they were raised at its tit, etc." they actually strike me as being no more tech savvy than any other generation. Sure, they all have Facebook profiles and play a lot of those Farmville-type games, but they still have to call someone to set up a router. They still have to ask me how to do a complex Google search. They still seem to know fuck-all about internet security. My brother-in-law had to call me in to fix his laptop after my Generation-Y super-internet-savvy niece infected it with about every phishing virus known to man. The young programming students I've dealt with seem no more or less comfortable with programming than any other young programmers from other generations (and I go back a while).

    So where exactly are all these Generation Y ubermensches I keep hearing about? Because I sure haven't met many of them. There are geeks in that generation like any other, but, as with all generations before them, most of them seem pretty clueless about tech.

    • by na1led ( 1030470 )
Generation Y are experts with Facebook and Twitter, and really good at playing video games. Beyond that, they know absolutely nothing!
    • The young programming students I've dealt with seem no more or less comfortable with programming than any other young programmers from other generations (and I go back a while).

      Well there you have it -- you are obviously too old to truly understand their specialness!

      I know the feeling.

      • by elrous0 ( 869638 ) *

        Obviously, they're being incredibly savvy behind our backs somewhere--probably in some secret dance club.

  • by joh ( 27088 ) on Thursday February 23, 2012 @02:40PM (#39139765)

    One important thing to note is that the laws of the physical world are pretty much ingrained in us. Not only in us, even in animals and their reactions to things. Things from the physical realm *have* to obey these laws (or they wouldn't work) and just imitating them can help here. *Understanding* why they work is better, though.

    One reason the iPhone took off as it did despite its touchscreen was the fact that the scrolling was modelled closely on the behaviour of "real" things: There is friction and inertia, you can "throw" a page, everything works in a reliable, predictable way because it's the same way every physical thing behaves. There is no abstraction here at all, it even painfully emulates things that have no real meaning in the digital world. They have meaning for us and our animal minds and bodies, though. We are a product of millions of years of evolution in the physical world and while there is freedom in breaking out of this there's also much to work with in this.
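That scrolling behaviour is simple enough to model directly: a "throw" just hands the scroll position a velocity, and each frame friction removes a fixed fraction of it. A toy sketch, with all constants invented for illustration:

```python
# Toy inertial-scroll model: position advances by the current velocity,
# and per-frame friction decays that velocity exponentially.
FRICTION = 0.95      # fraction of velocity kept each frame
FRAME_DT = 1 / 60    # seconds per frame at 60fps

def fling(position: float, velocity: float, min_speed: float = 1.0):
    """Yield successive scroll positions after a throw until motion settles."""
    while abs(velocity) > min_speed:
        position += velocity * FRAME_DT
        velocity *= FRICTION          # exponential decay reads as friction
        yield position

# A hard throw coasts further than a gentle one, just as a real object would:
print(sum(1 for _ in fling(0.0, 2000.0)))  # frames before a fast fling stops
print(sum(1 for _ in fling(0.0, 200.0)))   # a slow fling settles sooner
```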

  • There are a lot of real-life services and experiences that have yet to be replicated, matched, or improved upon in the online realm

    Really? Name them.

And no, the fact that some things are stupid when done online rather than in real life doesn't mean they can necessarily be improved upon.

This is a blog post on a website for sharing bookmarks, and the author complains of lack of originality! StumbleUpon started in 2001 and even it had predecessors like linkbook.com.
  • No such thing (Score:4, Insightful)

    by starfishsystems ( 834319 ) on Thursday February 23, 2012 @02:57PM (#39139977) Homepage
    There's no such thing as one optimal point of view when it comes to understanding the universe or creating artifacts or inquiring into the human condition, just to cite a few examples.

    Each generation has its peculiar fashions and prejudices. Each generation is imaginative. New generations tend to bring a refreshing skepticism of preexisting paradigms - and this is good, or anyway better than complacency - but there's no guarantee that what they come up with will be any better than what came before. Less experience is not intrinsically an advantage over more experience; it stands to reason that more often the converse is true.

    One certainty is that, as the volume of human knowledge grows, its surface area increases also. It's at this surface that genuinely new discoveries and new ideas can take place. Unless it turns out that we're living in a bounded space, it's not the case that we're in any danger of running out of new material. And new generations do tend to be especially comfortable at this surface, because their life experience is all about new discoveries and new ideas - at least, discoveries and ideas which are new to them.

    It doesn't follow that all new discoveries and new ideas are revolutionary, or even necessarily very interesting. Most aren't, in my experience. Most are either prosaically obvious or shallowly misguided. I'm old enough to have seen a dozen generations of computer hardware come and go. Certainly there's been much incremental evolution along the way, but of all the hundreds of shiny new technologies that were supposed to be revolutionary, only a handful have actually stood the test of time. I'm happy to see anyone, young or old, propose a new one. But please, let's dispense with the hubris.
    • by dkf ( 304284 )

      Each generation has its peculiar fashions and prejudices.

      Even the word "generation" is prejudicial. It assumes there's some massive sea change, a moment when everyone born after thinks and acts in a different way. That's just totally BS. Reality's one big mess of stuff, with its early adopters, bellwethers and luddites. The same person may even be in all of those categories simultaneously (though for different things).

      • Well said.

        I agree that popular sentiment tends to reach a bit hastily for the generational label. The justification - such as it may be - comes from observing the effect of some characteristically disruptive event on a particular generation. The postwar baby boom is an obvious, and fairly credible, example.

        Now along comes the Internet or the Web (in popular culture these are nearly synonymous) whose advent, someone is bound to suggest, might have produced a generational paradigm shift. Well, before
  • by SoftwareArtist ( 1472499 ) on Thursday February 23, 2012 @02:59PM (#39139995)

    So "digital natives" have no experience with physical objects, and can never draw inspiration from things they encounter in the non-digital world?

    Um, yeah.

  • Probably not what you expect: When I started teaching about 10 years ago, I was given a class on the Software Development Life Cycle. I dug into the textbook and had this realization: All of the examples and case-studies in the book start from the position of some paper-based business wanting to computerize their processes -- but that's crazy, right!? Here we are post-2000 (by a year or two at the time), so obviously any business at this point is already computerized.

    But here I am 10 years after that, and e

    • Manila files don't crash. Doctors (and many other professions) don't like to lose time/money sitting around waiting for their filing system to be repaired.
  • by Phoenix666 ( 184391 ) on Thursday February 23, 2012 @03:21PM (#39140197)

    I agree that "digital native" and other terms are contrived and fluid. Rather than argue the definition of terms invented by marketing droids, let's ask the better question of what the next step from the Information Age is.

Ideas precede action, so in that sense there is no limit to the evolution of how we organize and present information. But no matter how ornate our ideas, the physical world simply is. Ideas influence the material world, to be sure, but put a bullet in your head and no idea in the world will save you.

    So it's worth asking if the skills we have gained organizing and processing information on the level of ideas will help us master the physical world better. Can we make our homes, goods, and surroundings reflect the order we have imposed on abstractions housed within 1's and 0's?

    I believe they can, and the blood/brain barrier, as it were, is being breached on at least two fronts: 3D printing/additive manufacturing, and bioengineering. If we can materialize CAD drawings and DNA sequences directly, our physical world may come to echo virtual reality more quickly than any of us can now possibly imagine.

    The digital natives [sic] will likely look at the physical world and wonder why it does not reflect the virtual one, rather than the digital immigrants [sic] who look at the virtual world and wonder why it does not reflect the physical one. They will probably expand upon the Internet of Things, 3D printing, bioengineering, and do it at the pace they've become used to on the Internet rather than in the pre-Internet material world. Their frustrations, and therefore their actions, will be driven by the physical world's inability to live up to the expectations acquired in their virtual worlds.

    For better or worse, I expect that we are sliding down the event horizon of permanent dis-equilibrium until we reach the singularity.

  • by scorp1us ( 235526 ) on Thursday February 23, 2012 @03:28PM (#39140249) Journal

Whatever you do, do not believe these two reasons. Never use them as a justification not to do something.

    1. If it was worth doing, someone else would have already done it. (No market)
    2. Someone else is already doing it, so there is no point in you doing it too. (No profit, too much competition)

    Commerce happens because of value and value alone. No one has done it just like you, or will do it just like you. Facebook wasn't first but their way won. Apple didn't invent computers or phones but they went on to make the best, and incredible profits even while charging a premium.

More like a generation of kids getting really good at doing mundane, dumbed-down tasks in a virtual space. The reason things--shopping, dating, taxes, whatever--were ported from the real world to the virtual one is that it made real life easier. If using computers to make real life easier has run its course, then why should I care about a new generation innovating a bunch of useless crap on the internet that doesn't better my real life?
  • ... and emphasizes why most of the software or process patents make no sense. When people came out with plastic sleeve pages for photos instead of little glued paper corners on blank paper, they didn't need to explain it as a new way of making a photo album; but digital manipulation of digital photos somehow became a whole new concept. Email is not paper mail, but it's like voicemail or fax (themselves already legally equated to older technology), and the *concept* of mail isn't new.

    I remember when it
  • by TemperedAlchemist ( 2045966 ) on Thursday February 23, 2012 @04:07PM (#39140583)

    I am the "digital native" people are looking for. At 20, I did practically grow up with the internet. It was there, always, ever since I was born. Of course it never really hit mainstream until the turn of the century, at least for me. Dial-up just never could get me plugged in (can't tie up the phone line too long). I remember only getting online when my parents would leave and work on stuff offline that I would submit to online communities. But that's not really being immersed.

    But now, I am plugged-in. I spend an outrageous amount of time on the internet, and fully admit to an internet addiction. My first instinct when I don't know something is to use Google. I have a wealth of resources at my fingertips almost all of the time and there is a feeling of detachment from reality I get when I get on. I lose a feeling of embodiment and feel more like an entity, free to roam wherever he chooses.

    But, whether or not this creates competence about computers is another thing entirely. Other people my age know where the power button is, they probably know what a graphics card is, and probably a few internet memes. That's the extent of it in my experience. I still get young people on internet forums who also can't seem to latch onto the idea they can use Google to answer their questions.

    • by Leolo ( 568145 )

The sense of disembodiment you talk about is nothing unique to Internet natives. I'm over twice your age and I can remember experiencing the same thing back in the early 90s. William Gibson noticed the same thing happening to people playing Space Invaders even earlier, which led him to coin the term "Cyberspace."

  • Abstract Ideas.
  • by WOOFYGOOFY ( 1334993 ) on Thursday February 23, 2012 @06:21PM (#39141911)

All of us inherit a world which is in a state of continually becoming. What that means is even people who are born today exist in a world of offline auction houses, just as people who were born 30 years ago did. Offline auction houses are no more a property of the people who were born thirty years ago than they are of people born today. Neither generation invented them.

    Old people don't get the internet but that's because they're not interested. They don't get a lot of other non-internet things also. Don't forget, a lot of human activity and interest has as its unspoken ulterior motive getting laid....

This is the same argument used by people who said that the internet was going to change the form of the literary novel entirely. Maybe something new and impossible pre-internet will happen (has it not already?) in the literary world, but mere words being read in a linear fashion are still the fastest and richest way to mainline an arbitrary story into someone's head.

So some people are now younger than the internet. Meh. From this, nothing follows. TV's been around since the '50s but it still took until 2004-5 for it to consistently have anything good on it....

  • "It's like FaceBook, but it's entirely for gamers with registered stats, competition organization, walkthrough wikis, easter eggs, cheat codes and skins!" I've heard this before, it's called, "High Concept." It is the reason why 90% of US movies are easy to forget within a few years. "It's Jaws...in Space!" The description of 'Alien' which lead to a franchise that has lasting impact, but the vast majority of high concept ideas end up as forgettable works. There are lots of things in the digital world t

We of all generations are experiencing new technology together. It's kind of silly to say one generation is superior because it is fresh, or another because it is experienced. We see this in the job marketplace, where there is salary-experience compression: the capable new employee is paid about the same as the older guy.

On the other hand, outside of technology there may be value in other aspects of business, such as management and social skills. Age-stratified companies like all boomers or all college kids
