Programming Graphics Software IT Technology

Photon Soup Update

rkeene517 writes "Two and a half months ago I posted an article asking for spare computer cycles. I was swamped by emails and volunteers. After the first few weeks most dropped out. The die-hards kept running the program and we simulated 45.3 billion photons. The pictures are here. Thanks to all who helped out. I will be submitting the images and a paper to SIGGRAPH 2005. (P.S. Never post your email address on slashdot. I got 900 emails! ouch.)"
This discussion has been archived. No new comments can be posted.


  • Never post (Score:5, Funny)

    by bert.cl ( 787057 ) on Sunday July 04, 2004 @08:39AM (#9605220)
    "The pictures are here. Thanks to all who helped out. I will be submitting the images and a paper to SIGGRAPH 2005. (P.S. Never post your email address on slashdot. I got 900 emails! ouch.)" Within 2 months: "The paper got a prize and I would like to thank everyone who participated. PS: Never post pictures of photons on slashdot, my webserver is nothing but photons now."
  • Crap server (Score:1, Insightful)

    by Anonymous Coward
    You'd think that having a previous story on slashdot would kind of suggest getting a server that doesn't fall over because of the /. effect...

    Anybody got mirrors of the pics?
    • Auto-Mirror (Score:2, Interesting)

      by andr0meda ( 167375 )

      How difficult could it be to auto-mirror front page stories on /. itself?

      I mean, data-wise, local websites probably take up under 100 MB, and only go a few pages deep. The rest of it can still link to the outside world, since the probability of people following links more than two pages deep away from the actual report is small. So the outside server could easily survive, and is not forced to switch servers just because there is ONE spike. /. itself already takes the hit anyway, so it could easily serve a mirror as well.
      • RTFFAQ. [slashdot.org]
        • by afabbro ( 33948 )
          That's not a real answer. It's laziness on the part of the Slashdot "editors".

          "We could try asking permission, but do you want to wait 6 hours for a cool breaking story while we wait for permission to link someone?"

          Personally, yes. But why not put it to a poll? In fact, I submitted it as a poll question and...it was rejected.

          So the quick answer is: "Sure, caching would be neat." It would make things a lot easier when servers go down, but it's a complicated issue that would need to be thought through before being implemented.

          • Re:Auto-Mirror (Score:3, Interesting)

            by Ieshan ( 409693 )
            I'm feeding a troll, so I'm posting without bonus to lessen the ugly karma hit.

            Don't be ridiculous. Caching DOES have very tricky issues dealing with copyright infringement.

            My suggestion for caching, though:

            Enable submitter-optional caching; don't cache sites with any ad banners; and only cache a site AFTER a cache.txt file has been placed in the site's home directory listing the files allowed to be cached (check it once every 5 minutes or so).
            • Enable submitter-optional caching, don't cache sites with any ad banners, only cache a site AFTER a cache.txt file has been placed in the home directory of the site with a listing of the files allowed to be cached


              The cache.txt idea sounds great, but the submitter is not always the owner of the site. It sounds like you could just cache based on the presence of the cache.txt file.
            • A Website can be cached and forwarded without copyright issues - much of the web is locally cached at some intersection in any event - and all of it is buffered and forwarded.

              However, to properly serve the advertisers, a caching service would serve a frequently refreshed copy, count the copies, and then over time request page views to correspond to the copies served.

              AIK

            • Why not use a pre-existing standard, robots.txt? If Google can cache a site that does not have a robots.txt, then /. should be able to. As long as they put up some sort of disclaimer like Google does, it should be fine. But they should cache it, keep it on the server, and check every so often whether the server the page is on is down. Once it goes down, automatically use their cache to display the page. (A rough sketch of that robots.txt check follows at the end of this thread.)
          • Why is this modded as a Troll? Where are we supposed to discuss Slashdot on Slashdot?
        • "caching would be neat but it's a complicated issue that needs to be thought through before being implimented"

          So think it through already. It's been how many years? When that was written google didn't even exist. Since then google has implimented a wonderful caching system that seems to work. If the geniuses at google can figure out how to cache the entire freaking intarweb, surely people smart enough to bring us slashdot can figure out a way to cache 7 or 8 sites.
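
      A minimal sketch of the robots.txt check suggested above, for a hypothetical /. mirroring feature. This is not anything Slashdot actually implements; it only looks for a blanket "Disallow: /" under "User-agent: *", which is a deliberate simplification of the real robots.txt rules:

      import java.io.BufferedReader;
      import java.io.InputStreamReader;
      import java.net.URL;

      // Hypothetical pre-cache check: fetch a site's robots.txt and refuse to
      // mirror it if everything is disallowed for all user agents.
      public class RobotsCheck {
          public static boolean mayCache(String siteRoot) {
              try {
                  URL robots = new URL(siteRoot + "/robots.txt");
                  BufferedReader in = new BufferedReader(
                          new InputStreamReader(robots.openStream()));
                  boolean allAgents = false;
                  String line;
                  while ((line = in.readLine()) != null) {
                      line = line.trim().toLowerCase();
                      if (line.startsWith("user-agent:")) {
                          allAgents = line.endsWith("*");
                      } else if (allAgents && line.equals("disallow: /")) {
                          in.close();
                          return false;   // site opts out of blanket caching
                      }
                  }
                  in.close();
                  return true;            // no blanket disallow found
              } catch (Exception e) {
                  return true;            // no robots.txt at all: cache, Google-style
              }
          }

          public static void main(String[] args) {
              System.out.println(mayCache("http://www.example.com"));
          }
      }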
  • by marsu_k ( 701360 ) on Sunday July 04, 2004 @08:40AM (#9605227)
    (P.S. Never post your email address on slashdot. I got 900 emails! ouch.)
    One comment and cpjava.net is already inaccessible... guess you shouldn't post links either :-)
  • I see 45.3 billion photons... but man are they slow! This page is taking forever to load..

    • 10 femtoseconds (Score:3, Informative)

      by goombah99 ( 560566 )
      IIRC, 1 watt-second of light contains about 4E+19 visible-wavelength photons. So if the scene is illuminated with a 100 watt bulb, then 4 billion photons are equivalent to a 10 femtosecond shutter speed.

      Of course, if he was in fact only modeling the photons that made it to the lens, then the number might be a few hundred times larger.

      Thus I don't understand why this page is taking so long to load. If he had just put those photons into the optical fiber carrying my web connection I would have gotten them by now.
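
      A quick worked version of that back-of-the-envelope calculation, using the parent comment's own constants (the ~4E+19 photons per watt-second figure and the 100 W bulb are that comment's assumptions, not measured values). With the full 45.3 billion simulated photons the result comes out in the picosecond rather than femtosecond range, so the exact figures should be taken loosely:

      // Back-of-the-envelope exposure time implied by the simulated photon count.
      // All constants below are assumptions taken from the parent comment.
      public class PhotonExposure {
          public static void main(String[] args) {
              double photonsPerWattSecond = 4e19;   // parent comment's flux figure (assumption)
              double bulbWatts = 100.0;             // assumed 100 W illumination
              double simulatedPhotons = 45.3e9;     // photons simulated in this run

              double photonsPerSecond = photonsPerWattSecond * bulbWatts;
              double exposureSeconds = simulatedPhotons / photonsPerSecond;

              System.out.printf("Equivalent exposure: %.3g s (%.3g ps)%n",
                      exposureSeconds, exposureSeconds * 1e12);
          }
      }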

  • Never post your email address on slashdot. I got 900 emails!

    I remember sending him an email. I also remember mentioning that the methodology (him mailing you the file), as opposed to downloading it or using Java Web Start [sun.com], was not the smartest way to go about doing this.

    Finally, I also remember he never sent me an invite even though I asked. Oh well, glad it worked out for him in the long run.

  • by vadim_t ( 324782 ) on Sunday July 04, 2004 @10:11AM (#9605543) Homepage
    Computers got 3000 times faster, but Java managed to compensate for 11 years of evolution.

    The previous article says:
    Year: 1994
    Computers: 100 SparcStation 1
    Time: 1 month
    Photons: 29 billion, 29 billion/month

    Now we have:
    Year: 2004
    Computers: Unknown, supposedly 3000 times faster
    Time: 2.5 months
    Photons: 45.3 billion, 18 billion/month

    If computers are indeed 3000 times faster, or heck, even 100, you should have gotten 72 billion just out of one of those computers running for the 2.5 months (29 billion / 100 machines is about 0.29 billion per machine-month in 1994; times a 100x speedup, times 2.5 months, is roughly 72 billion).
    • Gee where to start...

      Scaling from 1 month to 2.5 doesn't mean 2.5 times the simulated photons, it could be that he didn't even have a fraction of the users he had back in '94. Also scaling raw Hz clock cycles which is where the "3000 times faster" remark expertly refers to is a terrible measure of extrapolating waht the performance should be. It must suck being so stupid.
      • Well, let me explain in more detail.

        In 1993, the computer I had was a 386 DX 40 with 4 MB RAM and a 170 MB hard disk. 486s were recent and still very expensive.

        Today, I have a dual Athlon MP 2000+ with 1 GB ECC DDR RAM and 200GB disk which when I bought it, cost me about the price of a high-end single CPU computer and definitely performed better.

        After googling a bit, I found a SparcStation 1 had a 25 MHz CPU, 64 MB RAM, and a 25 MHz bus. While I know perfectly well that MHz is not a good measure of performance...
        • by Anonymous Coward
          I don't think he's computing the same scene as in 1993. It's likely much more complex. Also, more complex algorithms could be used. In other words, you can't really compare (with the data you have).
    • Computers got 3000 times faster, but Java managed to compensate for 11 years of evolution.

      If that's not flamebait, I'm not sure what is... Geez, how can you even say that?

      The previous article says: Year: 1994, Computers: 100 SparcStation 1, Time: 1 month, Photons: 29 billion (29 billion/month)

      Now we have: Year: 2004, Computers: Unknown, supposedly 3000 times faster, Time: 2.5 months, Photons: 45.3 billion (18 billion/month)

      If computers are indeed 3000 times faster, or heck, even 100, you should have...
    • Actually, the network computing started with many people rendering the image, and after a month only about 5 or 6 people were following through and still rendering. Unfortunately I didn't put in a counter for cumulative CPU hours.
    • Who says the problem scales linearly? Maybe it's not O(n) but O(n log n) or something. I don't know enough about the subject anyway, but still...
      Just think of Fibonacci numbers: maybe to calculate one more photon you need a factor-10 speedup or something.
      • Who says the problem scales linearly? Maybe it's not O(n) but O(n log n) or something.
        It pretty much has to scale linearly to be able to distribute it the way he's doing. Also, when you think about it, he's tracing the paths of photons, and photons don't really interact with each other, so the complexity of the problem scales linearly with the number of photons.
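
        The linear-scaling point is really that photon paths are independent, so the work splits into batches that never need to talk to each other and the per-photon cost stays constant. A toy sketch of that idea (tracePhoton() here is a made-up stand-in, not the project's actual Java code):

        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;

        // Toy illustration of why independent photons scale linearly: each worker
        // traces its own batch with no shared state, and the results are summed.
        public class PhotonBatches {
            // Stand-in for tracing one photon; returns 1 if it "hit the aperture".
            static long tracePhoton(long seed) {
                return (seed * 2654435761L >>> 33) % 97 == 0 ? 1 : 0;
            }

            public static void main(String[] args) throws Exception {
                final long photonsPerBatch = 1_000_000L;
                final int batches = 8;
                ExecutorService pool = Executors.newFixedThreadPool(batches);
                List<Future<Long>> results = new ArrayList<>();

                for (int b = 0; b < batches; b++) {
                    final long offset = b * photonsPerBatch;
                    results.add(pool.submit(() -> {
                        long hits = 0;
                        for (long i = 0; i < photonsPerBatch; i++) {
                            hits += tracePhoton(offset + i);   // no communication between batches
                        }
                        return hits;
                    }));
                }

                long total = 0;
                for (Future<Long> f : results) total += f.get();
                pool.shutdown();
                System.out.println("Aperture hits: " + total);
            }
        }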
  • Mirror (Score:5, Informative)

    by uss_valiant ( 760602 ) on Sunday July 04, 2004 @10:15AM (#9605553) Homepage
    Such a story is useless without the images. So here's a temporary mirror for the resulting images of the project:

    Photon Rendering Project: image mirror [ee.ethz.ch]

    The mirror won't be up forever.
  • Another copy (Score:2, Informative)

    by andy314 ( 775757 )
    All 6 images: http://ca.geocities.com/andy314_1/photons.tar.gz
  • It would be useful if the IRS were to offer some kind of tax write-off for use of one's spare computing cycles. I formerly worked in an office where all the desks had networked computers that did...nothing... (except suck down sleep-cycle juice) for twelve to eighteen hours a day, if not 24 hours a day on weekends. Presuming a workaholic who leaves the keyboard alone for eight hours every night (to run to the pharmacy to refill that Provigil prescription?) and crashes on weekends, that's, uh, urm, 88 hours a week of idle time.
  • 650k PNG files? (Score:3, Informative)

    by fontkick ( 788075 ) on Sunday July 04, 2004 @11:21AM (#9605928)
    I would recommend that the submitter take down the zips and images and reoptimize them as smaller JPGs. A 650k file is just crazy for the actual image, which is only 512 pixels wide and blurry (due to depth-of-field effects). Just go into Photoshop, hit "Save for Web", and you can resize and change the JPG settings to your heart's content. I got one of the files down to 12k and it looks fine. These are not highly detailed images to start with. .PNG may be the format of choice for geeks worldwide, but I've always thought it was worthless. (A JDK-only sketch of the same PNG-to-JPG re-encode appears after this thread.)
    • Re:650k PNG files? (Score:2, Informative)

      by azuretongue ( 180140 )
      Heck, use pngcrush:
      Best pngcrush method = 124 for soup_one.png (32.88% reduction)

      Best pngcrush method = 124 for soup_one_2.png (33.17% reduction)

      Best pngcrush method = 16 for soup_two.png (36.67% reduction)

      Best pngcrush method = 16 for soup_two_2.png (36.85% reduction)

      Best pngcrush method = 16 for soup_three.png (28.52% reduction)

      Best pngcrush method = 16 for soup_three_2.png (28.57% reduction)

      Pngcrush is free, open, and cross-platform enough to run on those Sun SparcStation 1s he used back in 1994.
    • I'm happy to see the artifact-free result of so much computation. Sure it's blurry but not due to noise. The blur is part of the data in this case.

      I agree that the author could post JPG preview images, but getting the current pictures is useful as well.
    • ".PNG may be the format of choice for geeks worldwide, but I've always thought it was worthless."

      Try taking screenshots of open applications, or a shell.

      PNG will beat out GIF every freaking time, with no loss in image quality (no artifacts, no reduction in color depth) and still stomp any JPEG of comparable quality.

      It is, perhaps, the best image format for very accurate images. It has a size that competes with JPG and GIF, with the added benefit of an alpha channel. This alpha channel feature is greatly...
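
      For what it's worth, the "re-save as smaller JPGs" suggestion above doesn't need Photoshop; the stock JDK's ImageIO can do the re-encode. A rough sketch (the file names and the 0.75 quality setting are arbitrary placeholders):

      import java.awt.image.BufferedImage;
      import java.io.File;
      import javax.imageio.IIOImage;
      import javax.imageio.ImageIO;
      import javax.imageio.ImageWriteParam;
      import javax.imageio.ImageWriter;
      import javax.imageio.stream.ImageOutputStream;

      // Re-encode a PNG as a quality-controlled JPEG preview using only the JDK.
      public class PngToJpeg {
          public static void main(String[] args) throws Exception {
              BufferedImage src = ImageIO.read(new File("soup_one.png"));

              // JPEG has no alpha channel, so draw onto an opaque RGB image first.
              BufferedImage rgb = new BufferedImage(src.getWidth(), src.getHeight(),
                      BufferedImage.TYPE_INT_RGB);
              rgb.createGraphics().drawImage(src, 0, 0, null);

              ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
              ImageWriteParam param = writer.getDefaultWriteParam();
              param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
              param.setCompressionQuality(0.75f);   // 0.0 = smallest file, 1.0 = best quality

              ImageOutputStream out =
                      ImageIO.createImageOutputStream(new File("soup_one_preview.jpg"));
              writer.setOutput(out);
              writer.write(null, new IIOImage(rgb, null, null), param);
              writer.dispose();
              out.close();
          }
      }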

  • by theolein ( 316044 ) on Sunday July 04, 2004 @11:44AM (#9606114) Journal
    Firstly, I'm kind of irritated that the usual slashdot troll crowd expends so much hatred and ignorance on a truly creative project. The technique might not be using OpenGL, DirectX or ATi or NVidia's newest cards, but that is no reason to trash talk a technique that, in a few years time, might revolutionise CGI work in movies.

    And movie production is where this technique will most probably eventually find use. Movie studios have the budget and the server-farm equipment to make good use of a time- and resource-expensive technique such as this.

    And they certainly would want to. The images have almost exactly the same quality as grainy 1950s Kodacolor or poor images from my 1970s-vintage Kodak Instamatic. While adding grain to a movie is no problem, most rendering techniques used today produce surfaces that are simply too clean and glass effects that are too clear, and this immediately gets picked up by the human eye, which is very good at subliminally noticing differences in image quality. Tracing the paths of photons and their interaction through and with materials produces images that mimic reality in an excellent way, IMO.

    I'm pretty sure that a large cluster, such as the one of Apple G5s at Virginia Tech, running optimised C or C++ code would be able to produce usable footage for movies. And what's more, I'm pretty sure that sooner or later there will be tools to make this technique more accessible.
    • No, this is not a very useable way of rendering images. And he says so in the readme.

      Tracing photons is already used in the technique of photon mapping, which you can look up on the net; there is a load of information available on the topic, as well as numerous example images.

      It is a pity that he did not use one of the standard scenes available to test renderers because then it would have been easier to compare the results with already existing renderers.

      The thing is, while this method has a very accurate...
  • by fok ( 449027 ) on Sunday July 04, 2004 @12:14PM (#9606335) Homepage
    It takes more time to download the images from a slashdotted site than to actually render them!
  • by XO ( 250276 )
    Damn, 900 emails? I get that in spam, in just two days.

    *poor timothy* *pity party* awwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwwww

  • BitTorrent download (Score:4, Informative)

    by JackZ ( 89183 ) on Sunday July 04, 2004 @02:17PM (#9607072) Homepage
    Have a link to a BitTorrent tracker with the images [chattanoogastate.edu].
    You will want 'photon_soup'

    Jack
    • The link above [chattanoogastate.edu] also includes a download called 'photon_soup all images and software' which includes the 3 revisions of his Java code as well as some older images that are no longer available.
      I do not know how many photons are in the first results. I believe the second results have 30 billion. The 3rd set of results are the current images.

      Jack
  • Impressive results (Score:2, Interesting)

    by phamNewan ( 689644 )
    After finally finding the pictures, I was really impressed. Someone noted that rendered images are easily detected by the human eye, but these look like photographs. Granted, parts of them are fuzzy, but that is part of what makes them look so real. The actual glass images look very real.

    Great job.
  • Hope you used one; if not, bad luck.
  • I missed this before, but the way this algorithm is described, it's indeed old soup.

    Tracing photons and waiting until they hit the camera aperture is a special case of bidirectional ray tracing in which the paths starting from the eye have length 0.

    This has been described in various papers since 1993, and has nicely been summarized in some recent books on the topic (e.g. http://www.advancedglobalillumination.com/).
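
    For anyone curious what "trace photons forward from the light and wait for them to hit the camera aperture" looks like in its simplest form, here is a toy sketch of the general idea. It is not the author's simulator (no scene geometry, materials, lens model, or color, and all the constants are arbitrary): photons leave a point light in random directions, and only the tiny fraction whose straight-line paths pass through a small pinhole get binned onto a film plane.

    import java.util.Random;

    // Toy forward (light) tracing: shoot photons from a point light in random
    // directions, keep only those that pass through a small pinhole aperture,
    // and bin them onto a film plane behind it. Purely illustrative.
    public class PinholeSoup {
        public static void main(String[] args) {
            final int RES = 64;
            long[][] film = new long[RES][RES];    // photon counts per pixel
            Random rng = new Random(42);

            double lightZ = 5.0;      // point light on the z axis; aperture plane at z = 0
            double apertureR = 0.05;  // pinhole radius
            double filmZ = -1.0;      // film plane behind the aperture
            double filmHalf = 1.0;    // film half-width
            long emitted = 10_000_000L;
            long hits = 0;

            for (long i = 0; i < emitted; i++) {
                // Uniform random direction on the unit sphere.
                double dz = 2.0 * rng.nextDouble() - 1.0;
                double phi = 2.0 * Math.PI * rng.nextDouble();
                double r = Math.sqrt(1.0 - dz * dz);
                double dx = r * Math.cos(phi), dy = r * Math.sin(phi);

                if (dz >= 0) continue;                 // heading away from the camera
                double t = -lightZ / dz;               // where the ray meets the aperture plane
                double ax = dx * t, ay = dy * t;
                if (ax * ax + ay * ay > apertureR * apertureR) continue;

                // Continue the straight line through the pinhole to the film plane.
                double t2 = (filmZ - lightZ) / dz;
                double fx = dx * t2, fy = dy * t2;
                int px = (int) ((fx / filmHalf + 1.0) * 0.5 * RES);
                int py = (int) ((fy / filmHalf + 1.0) * 0.5 * RES);
                if (px >= 0 && px < RES && py >= 0 && py < RES) {
                    film[py][px]++;
                    hits++;
                }
            }
            System.out.println(hits + " of " + emitted + " photons reached the film");
        }
    }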
