The Internet

Science Project Quadruples Surfing Speed - Reportedly

johnp. writes "A computer browser that is said to at least quadruple surfing speeds on the Internet has won the top prize at an Irish exhibition for young scientists, it was announced on Saturday. Adnan Osmani, 16, a student at Saint Finian's College in Mullingar, central Ireland, spent 18 months writing 780,000 lines of computer code to develop the browser. Known as "XWEBS", the system works with an ordinary Internet connection using a 56K modem on a normal telephone line." A number of people submitted this over the weekend - there's absolutely no hard data that I can find to go along with this, so if you find anything more on it, please post below - somehow 1,500 lines of code per day and "every media player" built in don't ring true for me.
  • Basic maths. (Score:4, Interesting)

    by psychofox ( 92356 ) on Monday January 13, 2003 @08:18AM (#5071395)
    780,000 lines of code in 18 months is approximately 1500 lines per day every single day. I'm skeptical.
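
    Spelling that out (a rough sketch; the only inputs are the article's own figures):

        # Back-of-the-envelope check of the claimed output, using the article's numbers.
        total_loc = 780_000            # lines of code claimed
        days = 18 * 30.4               # roughly 547 days in 18 months
        print(round(total_loc / days), "lines of code per day, every single day")  # ~1426
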
  • by grahamsz ( 150076 ) on Monday January 13, 2003 @08:19AM (#5071402) Homepage Journal
    "At seven times it actually crashes so I have limited it to six."

    This isn't a microprocessor - the speed it runs at should be completely unrelated to its caching.

    I'm very very skeptical that this is anything more than a browser with precaching.

    It also makes other ludicrous statements about how blind people can now access the web. I'm not sure how they do it presently, but I know that they do.
  • Re:Basic maths. (Score:2, Interesting)

    by byolinux ( 535260 ) on Monday January 13, 2003 @08:20AM (#5071408) Journal
    We'll probably find he's just compiled Phoenix and put his own name in the title bar...
  • by byolinux ( 535260 ) on Monday January 13, 2003 @08:25AM (#5071429) Journal
    Many Windows users use Jaws [freedomscientific.com], a popular screen reader.

    There is also BLinux [leb.net], a project to make Linux accessible for blind users.

    Jaws is pretty cool, I use it at work sometimes to test sites.
  • Hmm. (Score:5, Interesting)

    by boris_the_hacker ( 125310 ) on Monday January 13, 2003 @08:26AM (#5071433) Homepage
    Well I have to confess to being mildly curious. I mean, a 16-year-old schoolboy writing 780,000 lines of code in 18 months? Well I am impressed; by my meagre calculations that equates to _roughly_ 1,400 lines of code a _day_, every day, for 18 months. And this application makes the internet go up to 6 times faster [apparently 7 times makes it crash]. Not only that, it has been a secret project for the entire time. I smell a rat - either that, or a complete genius code writer.

    But what really got me were the two most important features someone could ever want in a Web Browser - it can play DVDs [it incorporates every media player!], and it also has a handy animated assistant called Phoebe.

    Now, I am most probably wrong, and will happily eat my hat, but I can't help but feel that this isn't an entirely accurate article.

    ps. Does anyone know if it is standards compliant?
  • by Zocalo ( 252965 ) on Monday January 13, 2003 @08:27AM (#5071441) Homepage
    No, not another duplicate Slashdot story, but I seem to recall a story about another young Irish student who had developed a "revolutionary" encryption engine a while back. That was largely all claim and no solid documentation as well, and what has become of her efforts since then? Not much, not even a single update.

    Why am I thinking this is just another one of those snake-oil web speedups that does lots of caching and pre-emptive downloading of pages on the off chance you are going to view them? I'll be taking this story with a large pinch of salt for now, I think.

  • Is that so? (Score:4, Interesting)

    by Pike65 ( 454932 ) on Monday January 13, 2003 @08:28AM (#5071443) Homepage
    Do we have any reason to believe that this has a lower bullshit quotient than that daft '100x compression of random data' story doing the rounds last year (can't find the /. link; here's The Register's one [theregister.co.uk])?

    Sure, you can leave stuff out (images, JavaScript, Flash), but "at least quadruple"? If the page is simple enough then you can't just ditch a chunk of it.

    Ooh, AND "[at] least quadruple surfing speeds" and "they found it boosted surfing speeds by between 100 and 500 [percent]". Even the article isn't making any sense . . .

    Of course, if this turns out to be true then I will be the first to eat my cat (and the first to download it), but I'm sure this isn't even possible, right?

    Just my 2 cents (actually, that was more like 5) . . .
  • by jibster ( 223164 ) on Monday January 13, 2003 @08:29AM (#5071447)
    We all remember the Flannery [wolfram.com] episode, right? She was awarded first prize at the Irish Young Scientist competition in 2000 for work on speeding up the processing time of the RSA algorithm. I remember slashdot covering this (although I can't find the story), but I also remember reading that it made breaking the encryption almost trivial. Still, the IYS award is a competition that's been running for 30-40 years now and is a credit to our small corner of the world.
  • What a load of crap (Score:5, Interesting)

    by Masa ( 74401 ) on Monday January 13, 2003 @08:30AM (#5071451) Journal
    This has to be a hoax. And not even a good one.

    A kid coding 780,000 lines of code in 18 months. All alone. In that time he would have had to design and implement the whole shit, including "every single media player built in".

    It would require some sort of dial-up server-side module to compress and modify the contents of the data, and this kind of system would most certainly be a lossy method for transferring data. It won't be possible to transfer binary data with this thing without corrupting the result completely.

    And what kind of a piece of software would choke under the load of 7x56k modem ("At seven times it actually crashes so I have limited it to six.")?

    This is just a cheap attempt to gather some attention.

  • by orthogonal ( 588627 ) on Monday January 13, 2003 @08:31AM (#5071453) Journal
    If this thing's really a web browser, and it runs completely on the client computer, any web pages it's requesting are coming down the line as HTML, uncompressed (except insofar as the modem's protocol might compress them). Without a compressor on the other end, the speed's not coming from compression.

    If it does require a server side piece, it's not a web browser, per se; but as a general question, is it worthwhile to look into "compressed" web pages, e.g., foo.html.zlib? (I tend to doubt the savings are that much for the "average" page, but shoving graphics into an archive might keep down the number of requests needed to fetch a whole page and its graphics.)

    If it's not server side compression, the only thing I can think of (and fortunately smarter people than me will think of other things I'm sure) is that he's pre-fetching and caching pages to make the apparent speed faster.

    So is the "secret" that he has some heuristic that sensibly guesses what links you'll click next, combined with regularly fetching, oh say, your most requested bookmarks? (In my case it might look like: slashdot -- New York Times -- slashdot -- sourceforge -- slashdot -- freshmeat -- eurekareport -- slashdot.)

    In other words, is he mirroring sites locally in the background? And if so, how much bandwidth is wasted just sitting in the cache until it's stale?

    (On the other hand, could I point his browser at /., refreshing every five seconds to make sure I got a local copy of pages about to be slashdotted?)
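
    If it really is just background prefetching, the guts could be as small as this hypothetical sketch (illustrative only; no relation to whatever XWEBS actually does):

        # Hypothetical link prefetcher: pull pages linked from the current page into a
        # local cache so they appear "instant" when clicked.
        import re
        import urllib.request

        cache = {}

        def fetch(url):
            if url not in cache:
                with urllib.request.urlopen(url) as resp:
                    cache[url] = resp.read()
            return cache[url]

        def prefetch_links(url, limit=5):
            html = fetch(url).decode("utf-8", errors="replace")
            for link in re.findall(r'href="(http[^"]+)"', html)[:limit]:
                try:
                    fetch(link)            # warm the cache in the background
                except OSError:
                    pass                   # ignore dead links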

  • Re:Hmm. (Score:5, Interesting)

    by Sycraft-fu ( 314770 ) on Monday January 13, 2003 @08:50AM (#5071524)
    There is simply no way a browser can incorporate "every media player", because they operate on different standards. Windows Media Player operates using DirectShow to play its files; it's nothing more than a control program for DirectShow. Any file type with a DirectShow decode filter loaded onto the system can be decoded, and any other program can use the same interface and play all those file types. Fine - however, this is Windows ONLY; the code is proprietary to MS and not available for other platforms. And then on Windows you have other things like QuickTime. QT does NOT function as a DS filter; it's a whole separate way of doing things, and again we have proprietary code. This continues for any other media standard (Real, for example) that has its own system.

    I have a feeling this project is nothing but hot air.
  • Re:Basic maths. (Score:3, Interesting)

    by mangu ( 126918 ) on Monday January 13, 2003 @09:01AM (#5071567)
    I've heard about this thing called code reuse


    I must be getting old. In my younger days, that was called "libraries", and you only counted each line once, no matter how many times they were reused.

  • by boaworm ( 180781 ) <boaworm@gmail.com> on Monday January 13, 2003 @09:12AM (#5071631) Homepage Journal
    Most likely you are correct, but there could be something to it. When I took classes in data communications we learned about how good ol' modems work. Increasing speed on an old modem is basically just a matter of how many symbols per second you send and how many bits you can encode in a single symbol. By increasing the precision of signal detection (differentiating more distinct signal levels), you can send more than one bit in a single symbol period and so increase the total amount of data.

    The article says that he uses a simple modem, so that could perhaps mean that he just wrote some new method for transmitting more bits/second by a more accurate signal detection method.


    Although this has basically nothing to do with the browser at all, so it does not make any sense to me. Sounds like the article mixes apples and oranges, or perhaps the "student" is just laying down a smoke screen so that no one will steal his ideas before he gets the patent.

    my 5 cents...
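
    For reference, the arithmetic being described is just symbols per second times bits per symbol (illustrative textbook figures, not anything from the article):

        # Bit rate = symbol rate (baud) x data bits encoded per symbol.
        # Roughly the classic 9600 bit/s case: finer signal discrimination
        # lets you pack more bits into each symbol period.
        symbol_rate = 2400        # symbols per second
        bits_per_symbol = 4       # bits recovered from each symbol
        print(symbol_rate * bits_per_symbol, "bit/s")   # 9600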

  • by Doomrat ( 615771 ) on Monday January 13, 2003 @09:27AM (#5071722) Homepage
    It doesn't say that it increases bandwidth, it says that it increases surfing speeds. It smells like precaching/'intelligent browsing' to me.
  • by Charlotte ( 16886 ) on Monday January 13, 2003 @09:29AM (#5071734)
    Including testing, debugging and jerking off (hey, he's 16 right?), a typical software engineer will write 10-20 lines of actual code a day. Mind you that's including the analysis phase and excluding empty lines which are there for readability.

    I've always been proud that when looking back on my own projects I had something like 20-30 lines a day. On really good days you can write hundreds of lines but sometimes you have to throw everything out again because it's crap.

    I hope this guy isn't for real, he'll be burnt out by the time he's 30.

  • by klaasvakie ( 608359 ) on Monday January 13, 2003 @09:36AM (#5071771)
    Searching Irish Patent Office [espacenet.com]:

    Query :
    Application Date: 08/01/2003 -> 10/01/2003
    Abstract: *internet*
    Results: 0

    Query :
    Date Of Grant: 08/01/2003 -> 10/01/2003
    Abstract: *internet*
    Results: One Result: 2000/0717 82661 Server-based electronic wallet system

    That's it, so it doesn't seem he applied for the patent in Ireland then...

    P.S. The stars around "internet" are mine; I used them to indicate that I searched all abstracts that contained the word "internet".
  • by npendleton ( 255215 ) on Monday January 13, 2003 @09:41AM (#5071802)
    Client-side caching is surely most of the speed.
    Server-side caching [propel.com] could be used.
    TCP/IP non-conformity [slashdot.org] is the third option.

    Assuming this is true, (ignoring the 1500 lines a day), what else could he be doing?

    Judging by hard disk prices, client-side caching algorithms would make sense. Caching many portal and search engine homepages is a powerful start. Combined with a central server that reviews these popular pages for changes and publishes a simple summary for the browser client to collect and compare with older summaries, the browser can then fetch only the portal pages that have changed, which optimizes portal rendering.

    Then less common homepages, such as the high school I attended, can be gleaned from the user's typed-in web address history and automatically cached as a cron job.

    Creating cached copies of commonly used graphics on portal websites can save a ton of bandwidth. Again, a server-based bot could rate the link count of graphics on portal sites and track whether each graphic has changed, then post this list for browsers to collect for caching. Searching HTML for image files that are already stored in the cache, and modifying the page on the fly to reference only the cached images, would save bandwidth - e.g. caching all of Slashdot's article category icons.

    Then the tricky part: "which linked pages to cache while the user reads a page?", so that when a link is clicked, the page renders fast. I would download the HTML from all of them, and while the reader reads, check for already-cached images and then start downloading the image files.

    -Mac Refugee, Paper MCSE, Linux Wanna be!
    and first poster of the word "knoppix"
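
    A hypothetical sketch of the image-caching idea above - rewriting img references to hit a local cache (illustrative only, not anyone's actual code):

        # Rewrite <img src="..."> to point at locally cached copies, so repeated
        # graphics (e.g. Slashdot's category icons) are fetched once and reused.
        import hashlib, os, re
        import urllib.request

        CACHE_DIR = "imgcache"

        def cached_path(url):
            os.makedirs(CACHE_DIR, exist_ok=True)
            path = os.path.join(CACHE_DIR, hashlib.md5(url.encode()).hexdigest())
            if not os.path.exists(path):
                urllib.request.urlretrieve(url, path)   # fetch once, reuse afterwards
            return path

        def rewrite_images(html):
            return re.sub(r'src="(http[^"]+)"',
                          lambda m: 'src="%s"' % cached_path(m.group(1)),
                          html)
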
  • by ipjohnson ( 580042 ) on Monday January 13, 2003 @09:48AM (#5071837)
    There is no such thing as partially lossless compression. You either lose data or you don't. The meaning of lossless is NO loss.

    You are right on the presentation bit ... people like to point and stare.
  • Re:Basic maths. (Score:5, Interesting)

    by sql*kitten ( 1359 ) on Monday January 13, 2003 @09:56AM (#5071893)
    780,000 lines of code in 18 months is approximately 1500 lines per day every single day. I'm skeptical.

    Indeed. I remember reading that IBM reckon that, including design, coding, testing, debugging and documentation, a programmer's doing well to get 10 lines of code per day, averaged over the life of the project.

    Also depends how he's counting lines. In C, because that can vary so much depending on individual formatting style, a good rule of thumb is to count semicolons. And even then it won't tell you if programmer A is writing fast but hard to read code and programmer B is checking the return value of every system call (as you're supposed to but few ever do), adding lines and robustness with no extra actual functionality.
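
    A toy version of that semicolon rule of thumb (sketch only; a real counter would skip comments and string literals):

        # Estimate "lines of code" in C source by counting semicolons rather than
        # physical lines, so the figure doesn't depend on formatting style.
        def estimate_c_loc(source: str) -> int:
            return source.count(";")

        sample = """
        int main(void) {
            int total = 0;
            for (int i = 0; i < 10; i++) { total += i; }
            return total;
        }
        """
        print(estimate_c_loc(sample))   # 5, however the code is laid out
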
  • by SerpentMage ( 13390 ) on Monday January 13, 2003 @10:02AM (#5071925)
    What I am thinking is the following....

    Let's say that you want to increase compression of some data, e.g. HTML. Could there not be a technique to speed things up? Sure there is: get rid of the spaces, remove some tags, etc.

    Well, let's say that with each compression technique there are levels of what can be thrown away. And maybe when he tweaks it to level 7 he throws away too much. At that point the app does crash, since he may be throwing away something interesting.

    That was my point of partially lossless....
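
    As a purely hypothetical illustration of "levels of what can be thrown away" (nothing to do with whatever XWEBS actually does):

        # Squeeze HTML by discarding things the renderer mostly won't miss:
        # level 1 drops comments and collapses whitespace; level 2 also drops
        # optional closing tags. Push the levels far enough and pages break.
        import re

        def squeeze_html(html, level=1):
            out = re.sub(r"<!--.*?-->", "", html, flags=re.S)
            out = re.sub(r"\s+", " ", out)
            if level >= 2:
                out = re.sub(r"</(p|li|td|tr)>", "", out)
            return out

        page = "<html> <!-- ad banner --> <body>\n  <p>Hello   world</p>\n</body></html>"
        print(len(page), "->", len(squeeze_html(page, level=2)))
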
  • [Hard/Soft]ware... (Score:2, Interesting)

    by tommck ( 69750 ) on Monday January 13, 2003 @10:25AM (#5072100) Homepage
    I'll start off by saying that I didn't read the article...


    But you're talking about things that occur in two different places... The modem is hardware, the browser is software. A new browser could not increase the _actual_ speed of the modem. Obviously, he's talking about some algorithm that makes better use of the bandwidth to show a _perceived_ increase in speed, because I seriously doubt that he's come up with a new compression algorithm that compresses 4 times better than existing stuff but doesn't require the server to know about it.

    T

  • Re:Basic maths. (Score:3, Interesting)

    by jimfrost ( 58153 ) <jimf@frostbytes.com> on Monday January 13, 2003 @10:47AM (#5072288) Homepage
    I remember reading that IBM reckon that, including design, coding, testing, debugging and documentation, a programmer's doing well to get 10 lines of code per day, averaged over the life of the project.

    From my software engineering course way back in college I think I remember the number being 4 or 5. But that is more like an industry average. One thing about software is that the best programmers are something like two to three orders of magnitude more productive than the average. Between that and the communication costs growing exponentially in a group you find that a few very talented programmers are vastly more productive than a mass of average programmers.

    Still, sustaining 1,500 LOC per day for a year and a half ... that's beyond the productivity level of anyone I've ever seen. I personally have managed 4,500 per day for a period of about a week on occasion ... but I wasn't sleeping much during that period.

    I am not sure I'd take that number at face value though. If this were real he would almost certainly be using a lot of prewritten code for codecs and the like and that would balloon the LOC for little effort on his part. It's more than a little unlikely that he'd be able to write all his own codecs in the first place.

    So, while the LOC sounds specious, it's potentially believable given the probability of code reuse.

    The thing that makes this entirely unbelievable is the performance claim. 4x performance of existing browsers over a 56k line? That's simply not possible since the bulk of time spent waiting is data transmission time. That could be improved but only with a server side component and it's doubtful it could be improved substantially without a large loss in quality.

    I'm not going to dismiss the claim of a new web browser, but I'd be surprised if any of the size and performance claims hold water.

  • Re:Basic maths. (Score:4, Interesting)

    by Zeinfeld ( 263942 ) on Monday January 13, 2003 @11:00AM (#5072400) Homepage
    We'll probably find he's just compiled Phoenix and put his own name in the title bar...

    Most likely he has taken an open source browser and added in his own extensions. This is the type of innovation that making the browser open source is meant to support.

    As for speeding up browsing by 100%, that is pretty easy. We did a lot of work on HTTP-NG, and you can speed up downloads a lot just by compressing the headers so that they fit into a single packet. HTML is also very compressible. The biggest mistake we made in the original Web code was not putting a lightweight compression scheme into the code, although it did make it into the specs.

    Of course, the reason this did not happen was the LZW/GIF patent fiasco and the then state of the GNU compression libraries. Even so, Microsoft has supported compression in IE for some time.

    I am rather more skeptical about the 500% claim. I don't think that there is that much redundancy unless you have completely artificial examples.
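
    The "HTML is very compressible" point is easy to demonstrate with plain gzip (a deliberately repetitive example, so the ratio is flattering):

        # Roughly what "Content-Encoding: gzip" buys a browser that sends
        # "Accept-Encoding: gzip"; markup-heavy pages shrink dramatically.
        import gzip

        html = ("<table><tr><td class='story'>headline</td></tr></table>\n" * 200).encode()
        packed = gzip.compress(html)
        print(len(html), "bytes ->", len(packed), "bytes")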

  • Yeah, right (Score:3, Interesting)

    by autopr0n ( 534291 ) on Monday January 13, 2003 @11:33AM (#5072642) Homepage Journal
    "At seven times it actually crashes so I have limited it to six."

    I call bullshit. That claim doesn't make any sense whatsoever, especially if it's just software.

    It seems (to me) like he just threw together a bunch of MS APIs (such as the Microsoft Speech API for 'Phoebe', the Windows Media API for the DVD player and video players, and probably even IE itself to display pages).

    At most he threw in an intelligent caching routine, such as pre-downloading linked pages or something. I also don't think he wrote 780 kloc.
  • Re:Basic maths. (Score:5, Interesting)

    by ergo98 ( 9391 ) on Monday January 13, 2003 @11:53AM (#5072838) Homepage Journal
    I dunno, maybe me and my co-workers are some kind of gods, but I don't see these "one week" numbers as outrageous. We're all gods when we don't need proof, though, right? I code 100,000 lines per day and sleep 15 minutes on my commute to work (it's a straight section of the highway). I am a GOD! Of course, then there's that silly old thing called reality. Here are some simple facts. Feel free to disagree.
    • Heroic coding is almost always destructive. Read The Mythical Man Month for a little background on this: basically, when people start putting in those 20-hour days it's usually the beginning of the end (which in some cases was the beginning of the beginning as well; see many well-known .com cases). People, even gods like yourself, have a finite amount of problem-solving cerebral ability per day, and extending that is generally counterproductive.
    • The human body can go a couple of days with minimal sleep, but it is absolute folly to extrapolate that and presume that it'll keep going for even a week: Instead you'll either require a massive sleep "make-up", or you'll become mentally dull while your immune system collapses (this is presuming you don't have a medical condition).
    • A line of code per 16 seconds again sounds good and we can all easily do it by reimplementing something that we've already done (ooh look at my reversing a string function!), but it is astoundingly unlikely that someone could continue such a rate beyond even an hour. If coding were so trivial we would have tools to generate the code.
    • Ah the number of projects I've worked on where someone has given optimistic numbers, presuming that they'll magically create line after line after line...and then a subtle bug hits. Days later their half a day of coding is eclipsed by days of problem solving. I'm sure this doesn't affect Gods, though.
  • by podperson ( 592944 ) on Monday January 13, 2003 @12:30PM (#5073132) Homepage
    ...for faking all this well enough to fool a bunch of idiots in the press, online, and among the judges.
  • Re:Hmm. (Score:4, Interesting)

    by WEFUNK ( 471506 ) on Monday January 13, 2003 @12:38PM (#5073199) Homepage
    Based on the level of tech knowledge exhibited by the average tech reporter (pretty low), I'd guess that "built-in" probably just means that it comes pre-loaded with standard plug-ins. Especially when it's cited in an article that seems so impressed that "Other special aspects of his browser are the fact that access to 120 Internet search engines..." - a rather useless/annoying feature that's standard in any run-of-the-mill adware/spyware package, or on one of those squatter websites with all the pop-up ads (searching dmoz 115 times isn't going to help anyone...).

    The claim that it's 100 to 500% faster is probably accurate in some sense, but compared to what? An old version of Netscape or Explorer? And on what kind of set-up? You can probably see that kind of variation in a single browser installation just by changing to different options and settings or by closing other windows or background applications. Personally, I often find myself switching between browsers depending on what seems to be working better on a particular day or on a particular network or machine.

    On the other hand, he does sound like he's a bright kid with a good future, but probably one that just took Mozilla and turned it into snazzy looking bloatware with a bunch of extra features. Or, perhaps an even brighter kid who did the same thing from "scratch" with a lot of cutting and pasting (of his own work and from existing programs and libraries) to end up "writing" so many lines of code.
  • Re:MIT is better (Score:3, Interesting)

    by the gnat ( 153162 ) on Monday January 13, 2003 @12:51PM (#5073296)
    Harvard's science departments are some of the best in the world (I'm a Yale alum, so it hurts to admit this). Their medical school is among the very best in the country, and this means that the biomedical sciences there are almost unparalleled. It is not, however, an engineering school. There's a world of difference.
