Science Project Quadruples Surfing Speed - Reportedly

johnp. writes "A computer browser that is said to at least quadruple surfing speeds on the Internet has won the top prize at an Irish exhibition for young scientists, it was announced on Saturday. Adnan Osmani, 16, a student at Saint Finian's College in Mullingar, central Ireland, spent 18 months writing 780,000 lines of computer code to develop the browser. Known as "XWEBS", the system works with an ordinary Internet connection using a 56K modem on a normal telephone line." A number of people submitted this over the weekend. There's absolutely no hard data that I can find to go along with this, so if you find anything more on it, please post below; somehow 1,500 lines of code per day and "every media player" built in doesn't ring true for me.
Comments Filter:
  • 10 megabits (OOL) is just too slow; I need that 4x increase. Now, a 4x increase in uploads across a cable modem would be a different story, though I don't really need more than a megabit, an increase there would sometimes be nice.
  • Basic maths. (Score:4, Interesting)

    by psychofox ( 92356 ) on Monday January 13, 2003 @07:18AM (#5071395)
    780,000 lines of code in 18 months is approximately 1500 lines per day every single day. I'm skeptical.
    • by SHiVa0 ( 608471 ) on Monday January 13, 2003 @07:19AM (#5071401)
      CTRL-C then CTRL-V...

      you see, it's not that hard to make 1500 lines of code per day!
    • Re:Basic maths. (Score:2, Interesting)

      by byolinux ( 535260 )
      We'll probably find he's just compiled Phoenix and put his own name in the title bar...
      • Re:Basic maths. (Score:4, Interesting)

        by Zeinfeld ( 263942 ) on Monday January 13, 2003 @10:00AM (#5072400) Homepage
        We'll probably find he's just compiled Phoenix and put his own name in the title bar...

        Most likely he has taken an open source browser and added in his own extensions. This is the type of innovation that making the browser open source is meant to support.

        As for speeding up browsing by 100%, that is pretty easy. We did a lot of work on HTTP-NG, and you can speed up downloads a lot just by compressing the headers so that they fit into a single packet. HTML is also very compressible. The biggest mistake we made in the original Web code was not putting a lightweight compression scheme into the code, although it did make it into the specs.

        Of course, the reason this did not happen was the LZW patent fiasco around GIF and the then state of the GNU compression libraries. Even so, Microsoft has supported compression in IE for some time.

        I am rather more skeptical about the 500% claim. I don't think that there is that much redundancy unless you have completely artificial examples.
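
        To make the compression point concrete, here is a minimal sketch (Python, with made-up sample headers and markup) of how well DEFLATE does on typical response headers and repetitive HTML; a hedged illustration, not a claim about what XWEBS does:

        import zlib

        # Made-up but typical response headers and repetitive table markup.
        headers = (
            "HTTP/1.1 200 OK\r\n"
            "Date: Mon, 13 Jan 2003 12:00:00 GMT\r\n"
            "Server: Apache/1.3.27 (Unix)\r\n"
            "Content-Type: text/html; charset=ISO-8859-1\r\n"
            "Cache-Control: private, max-age=0\r\n\r\n"
        ).encode()
        html = b"<tr><td class='story'>yet another headline</td></tr>\n" * 200

        for name, data in (("headers", headers), ("html", html)):
            packed = zlib.compress(data, 9)
            print("%s: %d -> %d bytes (%.1fx)"
                  % (name, len(data), len(packed), len(data) / len(packed)))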

    • by D4MO ( 78537 )
      {
      Do curly brackets on a single line count?
      }
    • by Pentagram ( 40862 ) on Monday January 13, 2003 @08:08AM (#5071603) Homepage
      Maybe he's a big fan of whitespace?
    • by (bore_us) ( 640738 ) <b,dewilde&student,utwente,nl> on Monday January 13, 2003 @08:47AM (#5071828) Homepage
      Sure, 780,000 lines is a lot, but of course he used his own editor, which quadruples his programming speed.
      I wonder whatever happened to reading web pages while surfing... Ooohh, right, so that's what you call surfing :-))
    • Re:Basic maths. (Score:5, Interesting)

      by sql*kitten ( 1359 ) on Monday January 13, 2003 @08:56AM (#5071893)
      780,000 lines of code in 18 months is approximately 1500 lines per day every single day. I'm skeptical.

      Indeed. I remember reading that IBM reckon that, including design, coding, testing, debugging and documentation, a programmer's doing well to get 10 lines of code per day, averaged over the life of the project.

      It also depends on how he's counting lines. In C, because that can vary so much depending on individual formatting style, a good rule of thumb is to count semicolons. And even then it won't tell you whether programmer A is writing fast but hard-to-read code while programmer B is checking the return value of every system call (as you're supposed to, but few ever do), adding lines and robustness with no extra actual functionality.
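
      The semicolon rule of thumb is a one-liner to apply. A rough sketch in Python (deliberately naive: it also counts semicolons inside strings and comments):

      import sys

      # Crude statement count for C source: one "line" per semicolon.
      for path in sys.argv[1:]:
          with open(path) as src:
              print(path, src.read().count(";"))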
      • Re:Basic maths. (Score:3, Interesting)

        by jimfrost ( 58153 )
        I remember reading that IBM reckon that, including design, coding, testing, debugging and documentation, a programmer's doing well to get 10 lines of code per day, averaged over the life of the project.

        From my software engineering course way back in college I think I remember the number being 4 or 5. But that is more like an industry average. One thing about software is that the best programmers are something like two to three orders of magnitude more productive than the average. Between that and the communication costs growing exponentially in a group you find that a few very talented programmers are vastly more productive than a mass of average programmers.

        Still, sustaining 1,500 LOC per day for a year and a half ... that's beyond the productivity level of anyone I've ever seen. I personally have managed 4,500 per day for a period of about a week on occasion ... but I wasn't sleeping much during that period.

        I am not sure I'd take that number at face value though. If this were real he would almost certainly be using a lot of prewritten code for codecs and the like and that would balloon the LOC for little effort on his part. It's more than a little unlikely that he'd be able to write all his own codecs in the first place.

        So, while the LOC sounds specious, it's potentially believable given the probability of code reuse.

        The thing that makes this entirely unbelievable is the performance claim. 4x performance of existing browsers over a 56k line? That's simply not possible since the bulk of time spent waiting is data transmission time. That could be improved but only with a server side component and it's doubtful it could be improved substantially without a large loss in quality.

        I'm not going to dismiss the claim of a new web browser, but I'd be surprised if any of the size and performance claims hold water.

        • Re:Basic maths. (Score:5, Insightful)

          by Angst Badger ( 8636 ) on Monday January 13, 2003 @12:28PM (#5073593)
          Still, sustaining 1,500 LOC per day for a year and a half ... that's beyond the productivity level of anyone I've ever seen. I personally have managed 4,500 per day for a period of about a week on occasion ... but I wasn't sleeping much during that period.

          I broke 1,000 LOC per day for about a week while working for an unnamed gigantic CPU monopolist. I was behind schedule, over budget, and had a hard deadline, and the code itself was fairly repetitive and not terribly efficient. Ordinarily, I'd figure I produce closer to 250 LOC per day during a normal coding period.

          Provided this story isn't complete hogwash, my guess is that the reporter asked the boy about writing the program and he answered that it consisted of 780,000 LOC and took him a year and a half to build. He probably neglected to mention that 90% of those lines were in libraries written by other people. He may not have even intended to be deceptive in any way, figuring that any fool would know that was the case, but not realizing that the reporter was a fool.
        • Re:Basic maths. (Score:5, Informative)

          by angel'o'sphere ( 80593 ) <angelo.schneider ... e ['oom' in gap]> on Monday January 13, 2003 @02:00PM (#5074306) Journal
          well,

          when I took "software engineering" in my computer science courses, we were given these figures for LOC per day:

          Application programs: 25 - 100
          Service programs: 5 - 25
          System programs: 1

          Application programs are things like an editor (albeit some editors are rather complex), service programs are things like cc, ld, or asm (albeit some of them are not "that" complex), and system programs are things like the kernel itself, dynamic link loaders, device drivers, etc.

          Well,
          we all know that LOC is not a well-defined "value", but people who work a lot with that "measure" just define it :-)

          E.g. if you work with COCOMO or with PSP (Personal Software Process), the typical LOC is defined as a single definition, a single expression (some even say every part of an expression), an argument to a function call, every include, every define, and so on:

          fprintf(stderr, "this is an error number: %ld", errnum);

          That would be 4 LOC: one for the statement and 3 for the 3 arguments. Consider that you can make an error/bug in every argument, or mistype fprintf for fscanf...

          LOCs do not really get interesting when comparing hero programmers (10 to 20 times more effective) with standard programmers, but when comparing programming languages!

          The VERY INTERESTING point about LOCs is that the rules of thumb noted above are independent of programming language!

          A programmer writing, let's say, 12 LOC per day in C also writes ~12 LOC per day in assembler, in LISP, in PERL, or whatever language is appointed for the project.

          So: the more expressive and abstract a language is, the more "algorithm" or "computation" is defined in the lines of code.

          In other words: 10 lines of C are far more computation than 10 lines of assembler, while 10 lines of LISP, SQL, or Prolog are even more than C.

          Bottom line: the number of statements the average programmer can write depends far more on the problem domain than on the language chosen!

          Well, the productivity of the so-called hero programmer is in general not in lines of code, but in the "abstractions" he implements, or in the number of features he implements. And that is often accomplished by choosing the right language constructs (not by writing more lines)... e.g. using auto_ptr templates in C++ instead of manual exception management and manual allocation and deallocation inside a function lets you work much faster and yields more maintainable code. More readable, less to think about, and faster on to the next feature.

          angel'o'sphere
  • by grahamsz ( 150076 ) on Monday January 13, 2003 @07:19AM (#5071402) Homepage Journal
    "At seven times it actually crashes so I have limited it to six."

    This isn't a microprocessor - the speed it runs at should be completely unrelated to its caching.

    I'm very very skeptical that this is anything more than a browser with precaching.

    It also makes other ludicrous claims about how blind people can now access the web. I'm not sure how they do it presently, but I know that they do.
  • suspicious (Score:4, Informative)

    by g4dget ( 579145 ) on Monday January 13, 2003 @07:20AM (#5071404)
    If nothing else makes you suspicious about that story, this should:

    He wants to study computer engineering in Harvard University and eventually set up his own Internet or computer company.

    (For people who don't get it, Harvard's CS department, while reasonably good, is not exactly the obvious top pick among CS hotshots.)

    • by peterpi ( 585134 )
      An "internet company" heh?

      Good God, he could make a fortune if he's on the internet! Genius.

    • Re:suspicious (Score:4, Informative)

      by rcs1000 ( 462363 ) <rcs1000.gmail@com> on Monday January 13, 2003 @07:29AM (#5071448)
      Yes, but he is in Ireland. I'm not entirely sure how aware the average Dublin 17 year-old is of the relative rankings of Ivy League US universities.

      I'd be suspicious about the alleged speed of writing code. (That's well over a thousand lines a day!) It seems to me like this is just a browser which loads up links ahead of displaying them. Which, amazingly enough, is what all those "Your Internet Connection Is Not Optimized!!!" programs do.

      How doing this faster can make the computer crash is a bit of a mystery to me. (I can't think of a single program with a speed dial where, above a certain speed, the computer crashes... ;-))
      • Re:suspicious (Score:4, Insightful)

        by Sycraft-fu ( 314770 ) on Monday January 13, 2003 @07:44AM (#5071497)
        One would think that even someone not from the US would have heard of Caltech and MIT if they were in the computer field. They are, quite literally, world famous.

        No, this is either total bullshit or a huge exaggeration. Remember, with real science, computer or otherwise, the MOST important part is subjecting work to peer review. Anything that can only be demonstrated in one lab in a hands-off, no-details demonstration isn't science, and the person is hiding something.
      • Re:suspicious (Score:4, Informative)

        by sql*kitten ( 1359 ) on Monday January 13, 2003 @09:04AM (#5071949)
        Yes, but he is in Ireland. I'm not entirely sure how aware the average Dublin 17 year-old is of the relative rankings of Ivy League US universities.

        He is in Ireland, but Dublin's no tech backwater. Trinity College Dublin is world-renowned for science and maths, and a short flight away are Imperial College and UCL in London, not to mention Oxford and Cambridge. A little further than that is the Sorbonne. There's no reason he shouldn't be as familiar with the rankings as anyone else.

        And thanks to the Irish government's very sensible tax policy (i.e. less is better), the country has a sizeable presence of US high-tech firms, like Oracle and Sun.

        As others have said, though, anyone who claims to be able to sustain 1,500 LOC/day for 18 months is probably not to be taken seriously.
  • I've heard of tools in the past that claim to speed up browsing by caching ahead. These tools follow links on a page before you request them, so that they are already in the browser's cache when you come to click on a link.

    The other possibility is some heavy compression server-side, but this would require a server module (e.g. mod_gzip), and it rules out any kind of built-in compression in PPP, so the speedup would, I guess, not be as noticeable as 5x.

    Needless to say, I'm fairly sceptical that this is an actual speedup of browsing. If you can only fit 56Kbps down a line then you can only fit 56Kbps down a line...

  • I get it... (Score:4, Funny)

    by cca93014 ( 466820 ) on Monday January 13, 2003 @07:22AM (#5071412) Homepage
    This uses the same technology that manages to compress the entire British Museum, a DVD of "The Matrix" and the goatse weblogs into 42 bits of data and a packet of peanuts.

    It then makes use of network magic. You mean no-one ever told you about the magic ?

  • Hmm. (Score:5, Interesting)

    by boris_the_hacker ( 125310 ) on Monday January 13, 2003 @07:26AM (#5071433) Homepage
    Well, I have to confess to being mildly curious. I mean, a 16-year-old schoolboy writing 780,000 lines of code in 18 months? Well, I am impressed; by my meagre calculations that equates to _roughly_ 1,400 lines of code a _day_, every day, for 18 months. And this application makes the internet go up to 6 times faster [apparently 7 times makes it crash]. Not only that, it has been a secret project for the entire time. I smell a rat; either that, or a complete genius code writer.

    But what really got me were the two most important features anyone could ever want in a web browser: it can play DVDs [it incorporates every media player!], and it also has a handy animated assistant called Phoebe.

    Now, I am most probably wrong, and will happily eat my hat, but I can't help but feel that this isn't an entirely accurate article.

    ps. Does anyone know if it is standards compliant?
    • Re:Hmm. (Score:5, Interesting)

      by Sycraft-fu ( 314770 ) on Monday January 13, 2003 @07:50AM (#5071524)
      There is simply no way a browser can incorporate "every media player", because they operate on different standards. Windows Media Player plays its files using DirectShow; it's nothing more than a control program for DirectShow. Any DirectShow filter loaded onto the system with decode capabilities can be used for decoding, and any other program can use the same interface and play all those file types. Fine; however, this is Windows ONLY, the code is proprietary to MS and not available for other platforms. And then on Windows you have other things like QuickTime. QT does NOT function as a DS filter; it's a whole separate way of doing things, and again we have proprietary code. The same goes for any other media standard (Real, for example) that has its own system.

      I have a feeling this project is nothing but hot air.
      • Re:Hmm. (Score:4, Interesting)

        by WEFUNK ( 471506 ) on Monday January 13, 2003 @11:38AM (#5073199) Homepage
        Based on the level of tech knowledge exhibited by the average tech reporter (pretty low), I'd guess that "built-in" probably just means that it comes pre-loaded with standard plug-ins. Especially when it's cited in an article that seems so impressed that "Other special aspects of his browser are the fact that access to 120 Internet search engines..." - a rather useless/annoying feature that's standard in any run-of-the-mill adware/spyware package, or available by visiting one of those squatters' websites with all the pop-up ads (searching dmoz 115 times isn't going to help anyone...).

        The claim that it's 100 to 500% faster is probably accurate in some sense, but compared to what? An old version of Netscape or Explorer? And on what kind of set-up? You can probably see that kind of variation in a single browser installation just by changing to different options and settings or by closing other windows or background applications. Personally, I often find myself switching between browsers depending on what seems to be working better on a particular day or on a particular network or machine.

        On the other hand, he does sound like he's a bright kid with a good future, but probably one that just took Mozilla and turned it into snazzy looking bloatware with a bunch of extra features. Or, perhaps an even brighter kid who did the same thing from "scratch" with a lot of cutting and pasting (of his own work and from existing programs and libraries) to end up "writing" so many lines of code.
    • I think what he did was as follows:

      1) Use COM to incorporate every Active Document Web Browser there is
      2) Use IE as a basis for the rendering
      3) Use those annoying little characters that MS calls agents
      4) Develop a compression utility that works on the server as a proxy.

      My guess is that his compression is partially lossless, meaning some data gets lost. I am guessing that is why, when he has 7x compression, the system crashes. Below that, the system "ignores" the lost data.

      So what I think is unique with this browser is that it is an all-in-one solution that probably is pretty user friendly. And remember, what amazes people is not the tech, but the presentation of the tech....
      • There is no such thing as partially lossless compression. You either lose data or you don't. The meaning of lossless is NO loss.

        You are right on the presentation bit ... people like to point and stare.
        • What I am thinking is the following....

          Let's say that you want to increase compression of some data, e.g. HTML. Could there not be a technique to speed things up? Sure there is: get rid of the spaces, remove some tags, etc.

          Well, let's say that with each compression technique there are levels of what can be thrown away. And maybe when he tweaks it to level 7 he throws away too much. At that point the app crashes, since he may be throwing away something important.

          That was my point about partially lossless....
  • by GregWebb ( 26123 ) on Monday January 13, 2003 @07:27AM (#5071438)
    They've claimed that a 16 year old student has written 780,000 lines of code, and that it combines a browser accelerated way beyond what anyone else has ever claimed (one that could potentially run even faster, it just doesn't yet), a multi-format media player (actually, I don't want to watch DVDs in a little side window while browsing the web, thanks...), a meta search engine and an avatar-based help system?

    That's massive work _and_ a revolutionary breakthrough. If he's that good - and in a way that others hadn't thought of despite the efforts of several of the world's largest companies going into browser and network research - then this is remarkable. But without hard evidence (or even a mention on the competition's admittedly poor website) this just sounds way too much like a scam.
    • by Dr. Evil ( 3501 ) on Monday January 13, 2003 @10:10AM (#5072460)

      Prediction: it turns out to be some Visual Basic application which uses built-in Windows components such as Media Player... thus allowing "all media formats, and DVD playing capabilities".

      Quadrupling "surfing speed" is so bizarre a claim that I have no idea what it could mean. Maybe he's blocking banner ads... at 56k it could make a difference.

      As for the "lines of code", I strongly doubt that a kid is using the same criteria for lines of code that everyone else is using... it probably includes his HTML test suite, and all his test code, abandoned code and documentation added together. Or maybe he didn't know how to write a function, so it is a big cut-and-paste one-function VB program with GOTOs.

      It's not that I doubt that a kid can pull this sort of thing off; it's that I doubt either the schoolteachers or the media have enough knowledge to judge it or report it accurately.

  • by Zocalo ( 252965 ) on Monday January 13, 2003 @07:27AM (#5071441) Homepage
    No, not another duplicate Slashdot story, but I seem to recall a story about another young Irish student who had developed a "revolutionary" encryption engine a while back. That was largely all claim and no solid documentation as well, and what has become of her efforts since then? Not much, not even a single update.

    Why am I thinking this is just another one of those snake-oil web speedups that does lots of caching and pre-emptive downloading of pages on the off chance you are going to view them? I'll be taking this story with a large pinch of salt for now, I think.

    • by headbonz ( 156721 ) on Monday January 13, 2003 @08:17AM (#5071658)
      It's probably not fair to characterize Sarah Flannery's work as having had "no solid documentation." As this page at Cryptome [cryptome.org] points out, Sarah's work did not "revolutionize cryptography" because several mathematicians -- including Sarah herself -- identified a "definitive attack" on the technique described in her winning paper (which was an application of the Cayley-Purser algorithm). Her book [amazon.com] remains a good read, especially for young women, and I don't think anyone believes that the math in her original paper is anything less than exceptional for a 15-year-old.
    • The encryption story wasn't snake oil, and had very solid documentation. Sarah Flannery won Irish young scientist of the year, and subsequently the EU-wide prize, for her work. Her paper is here [cryptome.org].

      The Cayley-Purser algorithm [wolfram.com] she developed was subsequently shown to have security flaws; I don't recall if this was before or after the EU prize, but that's immaterial. The work was original and interesting, and worth a prize for a 16-year-old!

      She has subsequently written a book [amazon.com], which is a pop-science introduction to crypto, and I understand from the blurb she's now studying maths at Cambridge.

      -Baz

    • by Ivan the Terrible ( 115742 ) <vladimir@@@acm...org> on Monday January 13, 2003 @08:43AM (#5071813) Homepage
      I seem to recall a story about another young Irish student who had developed a "revolutionary" encryption engine a while back. That was largely all claim and no solid documentation as well, and what has become of her efforts since then? Not much, not even a single update.

      Bullshit. Get your facts straight before you malign someone. Sarah Flannery

      • won Ireland's Young Scientist of the Year, and
      • the European Young Scientist of the Year awards,
      • was awarded a third-place Karl Menger Memorial Award from the American Mathematical Society and a fourth-place Grand Award in Mathematics,
      • won an Intel Fellows Achievement Award,
      • wrote a paper [cryptome.org] on her algorithm, with a postscript exposing a successful attack,
      • wrote a book, In Code: A Mathematical Journey, [amazon.com] on her experiences (5 stars, 13 reviews, sales rank=35K).

      She used Mathematica, so the Wolfram website has a review [wolfram.com] of the book.

      Here's a quote from Bruce Schneier in his 15 Dec 99 newsletter [counterpane.com].

      To me, this makes Flannery even more impressive as a young cryptographer. As I have said many times before, anyone can invent a new cryptosystem. Very few people are smart enough to be able to break them. By breaking her own system, Flannery has shown even more promise as a cryptographer. I look forward to more work from her.

      All of this was easily found with a Google search [google.com] that garnered 24,000 hits.

  • Is that so? (Score:4, Interesting)

    by Pike65 ( 454932 ) on Monday January 13, 2003 @07:28AM (#5071443) Homepage
    Do we have any reason to believe that this has a lower bullshit quotient than that daft '100x compression of random data' story doing the rounds last year (can't find the /. link; here's The Register's one [theregister.co.uk])?

    Sure, you can leave stuff out (images, JavaScript, Flash), but "at least quadruple"? If the page is simple enough then you can't just ditch a chunk of it.

    Ooh, AND "[at] least quadruple surfing speeds" and "they found it boosted surfing speeds by between 100 and 500 [percent]". Even the article isn't making any sense . . .

    Of course, if this turns out to be true then I will be the first to eat my cat (and the first to download it), but I'm sure this isn't even possible, right?

    Just my 2 cents (actually, that was more like 5) . . .
  • We all remember the Flannery [wolfram.com] episode, right? She was awarded the first prize at the Irish Young Scientist competition in 2000 for work on speeding up the processing time of the RSA algorithm. I remember Slashdot covering this (although I can't find the story), but I also remember reading that it made breaking the encryption almost trivial. Still, the IYS award is a competition that's been running for 30-40 years now and is a credit to our small corner of the world.
  • What a load of crap (Score:5, Interesting)

    by Masa ( 74401 ) on Monday January 13, 2003 @07:30AM (#5071451) Journal
    This has to be a hoax. And not even a good one.

    A kid coding 780,000 lines of code in 18 months. All alone. In that time he would have had to design and implement the whole thing, including "every single media player built in".

    It would require some sort of dial-up-server-side module to compress and modify the contents of the data, and this kind of system would most certainly be a lossy method for transferring data. It wouldn't be possible to transfer binary data with this thing without corrupting the result completely.

    And what kind of a piece of software would choke under the load of 7x56k modem ("At seven times it actually crashes so I have limited it to six.")?

    This is just a cheap attempt to gather some attention.

    • by Yurian ( 164643 )
      Whatever else this may be, it's definitely not a hoax. This guy did indeed win Ireland's Young Scientist competition. I know because it takes place a 5 minute walk away from my house. He also made the front page of the Irish Times [ireland.com], a major national newspaper.

      As for his claims, well, I wasn't at the show this year, so I haven't seen his entry, unfortunately. They do sound fairly unbelievable, but you have to remember that they're being filtered through journalists, most of whom are really fairly tech-ignorant.

      I can say though that the Young Scientist is a major and well-respected competition. The quality of the winners varies a lot from year to year, as you'd expect, but it's not run by idiots likely to be taken in by a hoax. Two years ago they flew in a maths professor from MIT to verify some claim, so they don't just accept things blindly.

      Of course, none of this prevents this guy from having stolen chunks of Mozilla or something, and then bolting some bits on.

  • by orthogonal ( 588627 ) on Monday January 13, 2003 @07:31AM (#5071453) Journal
    If this thing's really a web browser, and it runs completely on the client computer, any web pages it's requesting are coming down the line as HTML, uncompressed (except insofar as the modem's protocol might compress). Without a compressor on the other end, the speed's not coming from compression.

    If it does require a server side piece, it's not a web browser, per se; but as a general question, is it worthwhile to look into "compressed" web pages, e.g., foo.html.zlib? (I tend to doubt the savings are that much for the "average" page, but shoving graphics into an archive might keep down the number of requests needed to fetch a whole page and its graphics.)

    If it's not server side compression, the only thing I can think of (and fortunately smarter people than me will think of other things I'm sure) is that he's pre-fetching and caching pages to make the apparent speed faster.

    So is the "secret" that he has some hueristic that sensibly guesses what links you'll click next, combined with regularly fetching, oh say, your most requested bookmarks? (In my case it might look like: slashdot -- New York Times -- slashdot -- sourceforge -- slashdot -- freshmeat -- eurekareport -- slashdot.)

    In other words, is he mirroring sites locally in the background? And if so, how much bandwidth is wasted just sitting in the cache until it's stale?

    (On the other hand, could I point his browser at /., refreshing every five seconds to make sure I got a local copy of pages about to be slashdotted?)
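
    A background prefetcher along those lines fits in a few lines. A minimal sketch in Python (crude regex link extraction, a hypothetical start URL, and no staleness handling; purely illustrative):

    import re
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    cache = {}  # url -> page bytes, standing in for the browser cache

    def fetch(url):
        with urllib.request.urlopen(url) as resp:
            cache[url] = resp.read()
        return url

    def prefetch(start_url, limit=5):
        # Fetch the page the user asked for, then quietly warm the cache
        # with its first few links (a stand-in for a smarter heuristic).
        fetch(start_url)
        links = re.findall(rb'href="(http[^"]+)"', cache[start_url])
        with ThreadPoolExecutor(max_workers=4) as pool:
            for url in pool.map(fetch, [l.decode() for l in links[:limit]]):
                print("warmed cache with", url)

    prefetch("http://slashdot.org/")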

    • by reynaert ( 264437 ) on Monday January 13, 2003 @07:52AM (#5071530)

      If it does require a server side piece, it's not a web browser, per se; but as a general question, is it worthwhile to look into "compressed" web pages, e.g., foo.html.zlib?

      This already exists; look for example at mod_gzip for Apache. This will compress pages before transmitting them if the browser claims to support it. Mozilla does; I believe IE does too.
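
      The client side of that negotiation is just one request header plus decompression. A minimal sketch (Python; www.example.com is an arbitrary placeholder host):

      import gzip
      import http.client

      # Advertise gzip support the way Mozilla does, then undo the encoding.
      conn = http.client.HTTPConnection("www.example.com")
      conn.request("GET", "/", headers={"Accept-Encoding": "gzip"})
      resp = conn.getresponse()
      body = resp.read()
      if resp.getheader("Content-Encoding") == "gzip":
          body = gzip.decompress(body)
      print(resp.status, len(body), "bytes after decoding")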

      • Yes, all of the big 3 support it. IIS supports it too. I've tried it before on a site of mine and yes, it did cut download size, but it wasn't worth it since it caused problems due to the dynamic nature of my site.

        At any rate, it's a technology that basically anyone can use, since all the big browsers and servers support it.
      • There is also CWNet's DSLBuster. That does the same thing by having a proxy sit upstream on a fat pipe, compress a whole webpage (images, etc.), and send it down to an IE plugin. It works fairly well, and some people swear by it, while others don't really notice a difference. My theory is that it does speed up browsing, but the first part of the page takes longer to appear, so the perceived time is slower, even though it does finish faster. On the upside, even DSL users say it works for them.

        I could disclaim that I work for the same company, but I have squat to do with dial up services, so it's kinda pointless. I just know about it because I walk through tech support every day to my office.

        --
        Evan

    • If it does require a server side piece, it's not a web browser, per se; but as a general question, is it worthwhile to look into "compressed" web pages, e.g., foo.html.zlib?

      Sure is. So much so that it's already been done. Mozilla, for example, sends an HTTP header Accept-Encoding: gzip, deflate, compress;q=0.9. If the server understands that (e.g. Apache with mod_gzip), it's free to compress the data on the wire. IE (as of 5.5 anyway; don't know about 6.0) doesn't appear to send any "Accept-Encoding" headers. I'd be very surprised, though, if this led to anything like a 400% speedup in anything but highly controlled test conditions.

      I'd hazard a guess that this new browser is quietly doing some background caching. What articles I could find about this, however, are short on detail and kinda long on BS (web browsing and watching DVDs at the same time is a revolutionary feature? riiight), so it's really difficult to tell what substance there is behind all this. Time will tell, though...

      • IE6, Opera 5+, Netscape 4, lynx and w3m also support gzip and compress encoding. w3m even claims to support bzip2 encoding, although that's probably a bit heavyweight for this sort of thing :)

        Anyway, gzipping content can easily make for an 8x size reduction on a large page, especially if there's a lot of repetition in there, e.g. lots of tables. Whether this translates to a significant speedup in browsing depends heavily on the size of the page and the speed of the connection; certainly on a modem, going from 80k to 10k is very noticeable :)
  • by ReD-MaN ( 27321 ) <psmithNO@SPAMmetafore.ca> on Monday January 13, 2003 @07:34AM (#5071466) Homepage
    According to this [online.ie]: "It took him nearly two years and 1.5m lines of code to write it."

    There is no way.
  • And I bet he has an Uncle Rael who just helped create the first human clone, too, right?

    This is definitely one of those "I'll believe it when I see it" articles.

  • no footprint ? (Score:4, Informative)

    by mirko ( 198274 ) on Monday January 13, 2003 @07:38AM (#5071476) Journal
    It's curious that there is so little info about Adnan Osmani.
    I did, however, find this thread in the news [google.com], but, mind you, it's based on the same story...
    They reckon that, if it's possible at all, he may have implemented some quick prefetch and/or pre-formatting subroutine...
  • Pity this guy never heard of open source. He could have taken <somebrowser/> and plugged in his mysterious bright idea.

    Maybe he found some compiler options that quadrupled the rendering speed of <somebrowser/>.

    Maybe he is just a fraud, and could sneak into the competition after creating a nice looking theme for <somebrowser/>.

    Maybe I'm just guessing and typing whatever comes to mind in <somebrowser/>.
  • ...6 56K modems, with 6 active phone lines required. Your ISP must also support multilink PPP.

    I'm sure that that's in there somewhere, oh, yeah, look...there it is commented out above line 53,425 in the code. Yep.
  • Pattern matching? (Score:5, Insightful)

    by horza ( 87255 ) on Monday January 13, 2003 @07:51AM (#5071529) Homepage
    I'm surprised that the majority of posters are resorting to unimaginative "what BS" posts instead of thinking up innovative ideas. Ok, here is my idea:

    Most web pages have a lot of static content in them, especially menus, etc. You could start rendering the page immediately from the cached copy of the last page and re-render afterwards as the new page starts to differ from the cached version.

    As the page comes in, keep switching to the cached page that is closest in structure (i.e. matching predominantly on the HTML tags). Don't render the text until the initial few chars are confirmed by the downloading version, then progressively render it (i.e. show the old version, then modify words where they differ).

    This would have the effect of progressively rendering the page as a whole, much like those progressive GIFs. It would show a large speedup on pages that contain tables, as most browsers these days won't render a table until they have received the closing </table> tag.

    This would be a 'faster' browser with no compression or pre-caching.
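
    To gauge how much of a page really survives between visits, here's a toy measurement (Python sketch; cached.html and fresh.html are hypothetical saved copies of the same page fetched at different times):

    import difflib

    # Report how many lines could have been rendered straight from cache.
    cached = open("cached.html").read().splitlines()
    fresh = open("fresh.html").read().splitlines()

    matcher = difflib.SequenceMatcher(None, cached, fresh)
    reusable = sum(block.size for block in matcher.get_matching_blocks())
    print("%d of %d lines unchanged (%.0f%%)"
          % (reusable, len(fresh), 100.0 * reusable / max(len(fresh), 1)))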

    Phillip.
    • by jonathanclark ( 29656 ) on Monday January 13, 2003 @08:17AM (#5071656) Homepage
      Another possible way to speed up transfer is by using upstream traffic as well as downstream traffic. Normally when you download a web page, the server assumes the client knows nothing about the content, but as other posts mention, the difference between two pages or updates of the same site will likely be much smaller than a complete resend. So the client can use its upstream bandwidth to start transmitting data it already has for that site (or partial data hashes), while the server transmits new data. This would require a change to the web servers or the use of a proxy server, but in general I could see this dramatically improving download speeds for sites that have a lot of common XML/CSS/menus etc.

      I think 90% of page traffic occurs on the top few websites through regular visitors, so in most cases the client will already have some data available.
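
      An rsync-flavoured sketch of that exchange (Python; fixed-size blocks rather than rsync's rolling hashes, and the page contents are invented, so treat it as illustration only):

      import hashlib

      BLOCK = 512

      def digests(data):
          # Client side: hash each block of the cached copy; only these
          # short digests go up the wire.
          return {hashlib.md5(data[i:i + BLOCK]).hexdigest()
                  for i in range(0, len(data), BLOCK)}

      def delta(new, client_has):
          # Server side: a short reference for blocks the client already
          # holds, literal bytes only for the rest.
          out = []
          for i in range(0, len(new), BLOCK):
              block = new[i:i + BLOCK]
              h = hashlib.md5(block).hexdigest()
              out.append(("ref", h) if h in client_has else ("raw", block))
          return out

      cached = b"<html>menu sidebar footer</html>" * 100
      fresh = cached[:2000] + b"<p>today's new story</p>" + cached[2000:]
      changes = delta(fresh, digests(cached))
      raw = sum(len(payload) for kind, payload in changes if kind == "raw")
      print(raw, "literal bytes instead of", len(fresh))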

  • by Spoing ( 152917 ) on Monday January 13, 2003 @07:54AM (#5071538) Homepage
    PHB: Jim! Did you fill out your TPS report?

    Jim: Yes, I --

    PHB: Jim...I'm concerned about your performance.

    Jim: Er, wha--

    PHB: You write, what, 30 maybe 80 "eL Oh Cee" a day? Right?

    Jim: Well, the TPS and project plans take --

    PHB: Says here, that this 16 year old kid can write 1500 "eL Oh Cee" a day. What do you think about that?

    PHB: Don't laugh...this is serious.

    Jim: Sorry. I meant --

    PHB: Jim, maybe you need to put in more hours. Reconsider your work habits.

    Jim: I work till 10 most nights...

    PHB: Jim, it's not the hours, it's how efficiently you handle them. I expect today's TPS report on my desk by noon, along with a status report on each programming task you've done today.

    Jim: It's 11 --

    PHB: That's it Jim! Keep up the good work. In the meantime, see if you can increase that "eL Oh Cee" to, say, about a hundred. It's good to make a good impression. Fine. Excellent. I knew I could count on you. I'll see you then! ... Brian... did you fill out that TPS report...

  • Science project? (Score:2, Insightful)

    by MondoMor ( 262881 )
    I don't know about Ireland, but whenever I needed to do a science project, I had to supply shitloads of information, especially when making bold claims. Isn't that how science works?

    Hell, even reading the hypothesis of his project would be an improvement over what we have -- nothing.

    What shitty news coverage. The media isn't skeptical enough when it comes to science. If this was some miracle dreamed up by a politician, the media would have torn him to shreds by now, digging up dirt on him, his family, his marital history... everything.

    But when a miracle science story comes around, the media swallows it hook, line, and sinker. Unacceptable for this day and age.
  • 2003's Vaporware of the year!

  • Adnan says a six-fold increase is about the maximum practical boost.
    "At seven times it actually crashes so I have limited it to six."


    Now that's a good debugging technique, no wonder the code has 780,000 lines!

    To make the software more user friendly, it features a talking animated figure called Phoebe.

    With these skills, I guess he'll be working in Redmond soon...
  • I don't think it's possible that he is rendering HTML 4 times faster, and if he's using standard protocols (TCP), then he can't be getting the data any faster. If this story has any truth to it at all, I'd imagine this kid wrote a very memory-intensive browser that kept most media players open. If you are browsing various media types (PDF, MP3, DOC, AVI, etc.), then keeping viewers/players for each of these types in memory would make browsing faster. Most people don't leave this stuff open, because it degrades overall performance when you aren't perusing multimedia.

  • C'mon Wil [wilwheaton.net] !

    Stop playing with time-shifting, and go out and play with the kids...

  • by Doomrat ( 615771 ) on Monday January 13, 2003 @08:27AM (#5071722) Homepage
    It doesn't say that it increases bandwidth, it says that it increases surfing speeds. It smells like precaching/'intelligent browsing' to me.
  • by klaasvakie ( 608359 ) on Monday January 13, 2003 @08:36AM (#5071771)
    Searching Irish Patent Office [espacenet.com]:

    Query :
    Application Date: 08/01/2003 -> 10/01/2003
    Abstract: *internet*
    Results: 0

    Query :
    Date Of Grant: 08/01/2003 -> 10/01/2003
    Abstract: *internet*
    Results: One Result: 2000/0717 82661 Server-based electronic wallet system

    That's it, so it doesn't seem he applied for the patent in Ireland then...

    P.S. The stars around "internet" are mine; I used them to indicate that I searched all abstracts that contained the word "internet".
  • Duh (Score:2, Funny)

    by brx.o ( 640734 )
    It's really simple how it works. He simply increases the local gravitational field while approaching the natural log raised to the 27th power of the speed of light. This causes the future to get entangled with the present, and Osama takes advantage of this.
  • by kramer ( 19951 ) on Monday January 13, 2003 @08:40AM (#5071792) Homepage
    there's absolutely no hard data that I can find to go along with this, so if you find anything more on it, plz. post below - somehow 1500 lines of code per day, "every media player" built in doesn't ring true for me.

    Twits who make up bullshit stories like this thrive on attention. By posting it on a major site like Slashdot, you give him exactly what he wants. Just use a little restraint, and try not to post the stories that are obviously fake -- like this one, and the one about Masters of Orion 3 being out soon (grin).
  • Client-side caching surely is most of the speed.
    Server-side caching [propel.com] could be used.
    TCP/IP non-conformity [slashdot.org] is the third option.

    Assuming this is true (and ignoring the 1,500 lines a day), what else could he be doing?

    Judging by hard disk prices, client-side caching algorithms would make sense. Caching many portal and search engine homepages is a powerful start. Combined with a central server that reviews these popular pages for changes and publishes a simple summary for the browser client to collect and compare with older summaries, a browser can then fetch only the updated portal pages for its cache, which optimizes portal rendering.

    Then less common homepages, such as the high school I attended, can be gleaned from the user's typed-in web address history and automatically cached via a cron job.

    Creating cached copies of commonly used graphics on portal websites can save a ton of bandwidth. Again, a server-based bot could rank the link counts of graphics on portal sites, note whether a graphic has changed, and then post this list for browsers to collect for caching. Scanning HTML for image files that are already stored in the cache, and modifying the page on the fly to use only the cached image, would save bandwidth, e.g. caching all of Slashdot's article category icons.

    Then the tricky part: which linked pages to cache while the user reads a page, so that when a link is clicked, the page renders fast? I would download the HTML from all of them and, while the reader reads, check for already-cached images and then start downloading image files.
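
    A digest-based change check is the simplest version of that summary idea. A Python sketch (the centrally published summary feed is from the post above; the code just shows the client-side comparison, with a placeholder URL):

    import hashlib
    import urllib.request

    known = {}  # url -> digest of the copy we have cached

    def page_digest(url):
        with urllib.request.urlopen(url) as resp:
            return hashlib.sha1(resp.read()).hexdigest()

    def refresh_if_changed(url):
        # In the scheme above, the digest would arrive in one small summary
        # file instead of being recomputed by fetching the page itself.
        digest = page_digest(url)
        if known.get(url) != digest:
            known[url] = digest
            print("refetched", url)
        else:
            print("cache still good for", url)

    refresh_if_changed("http://www.example.com/")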

    -Mac Refugee, Paper MCSE, Linux Wanna be!
    and first poster of the word "knoppix"
  • by jamie ( 78724 ) <jamie@slashdot.org> on Monday January 13, 2003 @08:59AM (#5071912) Journal
    I knew a programmer, a real hotshot, who really could write 1,500 lines of code a day.

    Then he discovered loops.

  • by Fjord ( 99230 ) on Monday January 13, 2003 @09:52AM (#5072331) Homepage Journal
    One possible explanation for the LOC count may be that he's using Borland and trusting its "count". At my first real job, we used Borland, and I wrote a relatively complex program over the course of 18 months (coincidentally enough). The line count was over 1.5 million, but the reality was that it wasn't that long; Borland was counting lines processed, which included the header files, and the OWL and Windows headers could add a lot to each module (of which there were over 100, since I was big on modularization).

    I never really knew the true line count. I just remember the Borland one because I often used to do a global compile any time I wanted a half-hour break ("Oh, the system's acting funny. Better do a global compile to make sure it's not a dependency problem." If my boss came by and I wasn't there, he'd see the compile running on the screen).
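
    The raw-versus-processed gap is easy to demonstrate. A sketch assuming a C compiler driver (cc) with a -E preprocess-only flag is on the PATH:

    import subprocess
    import sys

    # Lines in the file itself versus lines after the preprocessor has
    # pulled in every header -- the number a "lines processed" counter sees.
    path = sys.argv[1]
    with open(path) as src:
        raw = sum(1 for _ in src)
    result = subprocess.run(["cc", "-E", path], capture_output=True, text=True)
    print("raw:", raw, "preprocessed:", result.stdout.count("\n"))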
  • by batgimp ( 323956 ) on Monday January 13, 2003 @10:27AM (#5072585)
    The competition he won is the same one Sarah Flannery won, the ESAT young scientist competition. See:
    http://www.esatys.com/ [esatys.com]
    Is it possible he counted 780,000 LOC because he was including libraries and component code, etc.? The article is badly written and doesn't give a true representation of his work. He claimed on Irish TV that he had written a client-server pair. I'm still fairly suspicious myself, but it *is* possible.
  • by dieScheisse ( 554308 ) on Monday January 13, 2003 @10:28AM (#5072601)
    This is all I could find: Google Groups [google.com]
  • No data for patent (Score:3, Informative)

    by Pranjal ( 624521 ) on Monday January 13, 2003 @10:32AM (#5072634)
    The article mentions that this fellow applied for a patent last Thursday. Guess what? There's no mention of the patent on the Irish Patent Office website [patentsoffice.ie], the European Patent Office website [espacenet.com], or the WIPO website [wipo.org].
  • Yeah, right (Score:3, Interesting)

    by autopr0n ( 534291 ) on Monday January 13, 2003 @10:33AM (#5072642) Homepage Journal
    "At seven times it actually crashes so I have limited it to six."

    I call bullshit. That claim doesn't make any sense whatsoever, especially if it's just software.

    It seems (to me) like he just threw together a bunch of MS APIs (such as the Microsoft Speech API for 'Phoebe', and the Windows Media API for the DVD player and video players; he probably even used IE to display pages).

    At most he threw in an intelligent caching routine, such as pre-downloading linked pages or something. I also don't think he wrote 780 kLOC.
  • by MichaelPenne ( 605299 ) on Monday January 13, 2003 @01:26PM (#5073999) Homepage
    Seems simple enough: this kid has obviously developed an FTL browser.

    Explains why it crashes at Warp 7 too; the dilithium code just can't take it, keptin!
  • by bfree ( 113420 ) on Monday January 13, 2003 @02:05PM (#5074363)
    The Irish Times had an article on this on Saturday. The basic outline of the (admittedly brief) article was that this guy had won the Young Scientist award (a big annual competition for all Irish school children), having written a web browser that increased browsing speed somewhere near 5x and even included a DVD player. It said that he had written 1,500,000 lines of code and that he had done it in 18 months! The main thing that they mentioned, but which I haven't seen in this story, is that the judges were sceptical and took his software down to the computer labs in UCD (a Dublin university), and they verified the performance there! I still didn't believe the article, and suspect the judges have given inappropriate praise to someone, but perhaps there is something at the bottom of all of this that actually is worthwhile (though I suspect that the speedup is the only worthwhile thing he has done, and that it is little if anything more than existing techniques). The one thing I am curious about is whether this guy can actually travel to the US safely, or did he really write a DVD player and break the DMCA (there's no way he was licensed to do it!)? The other thing they mentioned was that he had not patented anything but was going to! I wonder if he will be able to, and I wonder how many other patents he violated to create the project.
  • by daveirl ( 177821 ) <slashdot+regs@@@davidoneill...net> on Monday January 13, 2003 @04:55PM (#5075802) Homepage
    Karlin Lillington [weblogs.com] has more on the browser today and this seems informed!!

    The Irish browser story: Ok folks, here's the scoop. I am just back from talking to one of MIT Media Lab Europe's researchers, who both checked out the browser and talked to Adnan. He says the browser is 'absolutely extraordinary'. He says that what Adnan has done is re-engineer the efficiency of how a browser operates, which allows it to run up to six times faster (but usually not that much faster -- two to four times faster is more common). So it's not managing bandwidth but managing the way the browser itself handles and presents information. The researcher (whom I know and will vouch for) says that instead of simply tinkering with existing code he went down to the socket layer and reworked it at the protocol level (now, many of you guys will know the significance of this better than me, I'm just reporting the conversation). He added that it is incredibly clever work and stunning that a 16 year old has done this (I am not scrimping on the superlatives because that is what was said). (NB: A conversation in a group ensued that this work perhaps suggests that because the browser market is a virtual monopoly, there's been little incentive to improve efficiency in this way -- indeed, it might be beneficial to product development to just eke out a leeeetle more efficiency now and then and advertise it as continuing innovation... but I leave that to further discussion among the well-informed).

    And Adnan has indeed worked in all the existing media players AND a DVD player so you can watch a DVD while surfing. And incorporated in a voice agent that will speak web pages, for young children or for the sight-impaired. The improved efficiency angle got the notice of the few media reports done on this so far, but it's really not what Adnan himself was emphasising -- it's the whole package, said the MIT guy.

    Not surprisingly Adnan now has more than one university interested in him. And he has apparently told the numerous companies who saw the browser in action and who wanted to commercialise it that, at least for now, he has no interest in commercialising it.

    I will note that the MIT researcher had a big grin on his face and it was clear he found the whole project a pleasure to talk about. He also said he'd heard about the browser before he arrived at the Young Scientist exhibition and made a beeline to see it. Adnan apparently didn't really think it would necessarily win an award --the researcher told me it was clear that it HAD to win. So there you go. I'm sure we'll hear a lot more about all this soon.

    And yes, he has copyrighted it.

    Read More... [weblogs.com]

  • by 3trunk ( 18817 ) on Monday January 13, 2003 @06:50PM (#5076702) Homepage
    Karlin Lillington, a respected journalist for the Irish Times newspaper, maintains a weblog and has posted a more technical analysis here [weblogs.com] after talking to some people from MIT's media lab in Dublin, Ireland.
    Some snippets:
    "He says that what Adnan has done is re-engineer the efficiency of how a browser operates, which allows it to run up to six times faster (but usually not that much faster -- two to four times faster is more common). So it's not managing bandwidth but managing the way the browser itself handles and presents information. The researcher (whom I know and will vouch for) says that instead of simply tinkering with existing code he went down to the socket layer and reworked it at the protocol level (now, many of you guys will know the significance of this better than me, I'm just reporting the conversation). He added that it is incredibly clever work and stunning that a 16 year old has done this (I am not scrimping on the superlatives because that is what was said)."
    So perhaps there is some truth in this after all.
    newsQuakes [www.skep.tk]

"Imitation is the sincerest form of television." -- The New Mighty Mouse

Working...