Programming Graphics Software IT Technology

Photon Soup Update

rkeene517 writes "Two and a half months ago I posted an article asking for spare computer cycles. I was swamped by emails and volunteers. After the first few weeks most dropped out. The die-hards kept running the program and we simulated 45.3 billion photons. The pictures are here. Thanks to all who helped out. I will be submitting the images to SIGGRAPH 2005 along with a paper. (P.S. Never post your email address on slashdot. I got 900 emails! ouch.)"
This discussion has been archived. No new comments can be posted.

  • Re:Freecache (Score:4, Informative)

    by Halo1 ( 136547 ) on Sunday July 04, 2004 @08:59AM (#9605308)
    Since freecache only caches files >5MB, this isn't going to help anything (freecache is just going to pass those requests through to the original server)...
  • by gl4ss ( 559668 ) on Sunday July 04, 2004 @09:00AM (#9605311) Homepage Journal
    read the fucking blurb?

    ** I will be submitting the images to SIGGRAPH 2005 and a paper.**

    the images make a nice addition to the paper... to show that the technique actually works.
  • Re:Auto-Mirror (Score:2, Informative)

    by Colonel Cholling ( 715787 ) on Sunday July 04, 2004 @10:00AM (#9605498)
    RTFFAQ. [slashdot.org]
  • 10 femtoseconds (Score:3, Informative)

    by goombah99 ( 560566 ) on Sunday July 04, 2004 @10:05AM (#9605516)
    IIRC, 1 watt-second of light contains about 4×10^19 visible-wavelength photons. So if the scene is illuminated with a 100 watt bulb, then 4 billion photons is equivalent to a 10 femtosecond shutter speed.

    Of course, if he was in fact only modeling the photons that made it to the lens, then the number might be a few hundred times larger.

    Thus I don't understand why this page is taking so long to load. If he had just put those photons into the optical fiber carrying my web connection, I would have gotten them sooner.
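For anyone who wants to redo the arithmetic from first principles, here is a quick sketch. The 550 nm "average" photon and the idealized bulb that turns all 100 W into visible light are my assumptions, not the parent's; with a textbook photon energy (~3.6e-19 J rather than the parent's implied figure) the equivalent exposure comes out in the tens of picoseconds:

```python
# Redoing the shutter-speed arithmetic from first principles.
# Assumptions: 550 nm "average" visible photon, and an idealized
# bulb that converts all 100 W into visible light.
h = 6.626e-34            # Planck constant, J*s
c = 2.998e8              # speed of light, m/s
wavelength = 550e-9      # metres, mid-visible green

photon_energy = h * c / wavelength            # ~3.6e-19 J per photon
photons_per_second = 100.0 / photon_energy    # ~2.8e20 photons/s from 100 W
exposure = 4e9 / photons_per_second           # time to emit 4 billion photons

print(f"{photon_energy:.2e} J/photon, exposure = {exposure:.1e} s")
```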

  • Mirror (Score:5, Informative)

    by uss_valiant ( 760602 ) on Sunday July 04, 2004 @10:15AM (#9605553) Homepage
    Such a story is useless without the images. So here's a temporary mirror for the resulting images of the project:

    Photon Rendering Project: image mirror [ee.ethz.ch]

    The mirror won't be up forever.
  • by TheGavster ( 774657 ) on Sunday July 04, 2004 @10:28AM (#9605595) Homepage
    Photon mapping is currently used on a small scale in some rendering engines to simulate light bounces more accurately. It's particularly useful for calculating caustics (light focused through a transparent medium), which can't be done by the less intensive radiosity systems. This experiment, however, seems to try to render using photon simulation exclusively. Nice idea, though not really practical at the present state of computing, given the graininess of the images and the amount of processing time. The Brazil rendering system (http://www.splutterfish.com/sf/sf_gen_page.php3?printer=1&page=brazil), for example, uses photon mapping on a much smaller scale (usually between 1-10M photons) in combination with raytracing to provide clear, realistic imagery (though not as technically perfect as this example).
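The core idea behind photon mapping is simple enough to sketch in a few lines: fire photons from a light, record where they land, and estimate brightness from photon density. This toy scene (a point light over a flat floor, with all names and numbers invented for illustration; real photon mappers use kd-trees, BRDFs, and importance sampling) shows the density estimate at work:

```python
import math
import random

# Toy "photon soup": fire photons from a point light into the lower
# hemisphere, record where they land on the floor (the plane y = 0),
# then estimate irradiance from photon density -- the core idea
# behind photon mapping.
random.seed(1)
LIGHT_HEIGHT = 1.0        # light sits 1 m above the floor at (0, 1, 0)
POWER = 100.0             # light power in watts
N = 200_000               # photons to trace

hits = []                 # (x, z) landing spots; each carries POWER / N watts
for _ in range(N):
    # Rejection-sample a uniform direction in the lower hemisphere.
    while True:
        dx = random.uniform(-1.0, 1.0)
        dy = random.uniform(-1.0, 0.0)
        dz = random.uniform(-1.0, 1.0)
        r2 = dx * dx + dy * dy + dz * dz
        if 0.0 < r2 <= 1.0 and dy < 0.0:
            break
    # Intersect the ray (0, 1, 0) + t*(dx, dy, dz) with the plane y = 0
    # (no need to normalize the direction; t just rescales).
    t = -LIGHT_HEIGHT / dy
    hits.append((t * dx, t * dz))

def irradiance(x, z, radius=0.1):
    """Photon power landing within `radius` of (x, z), per unit area."""
    in_disc = sum(1 for hx, hz in hits
                  if (hx - x) ** 2 + (hz - z) ** 2 <= radius * radius)
    return in_disc * (POWER / N) / (math.pi * radius * radius)

# Brightness falls off away from the spot directly under the light.
print(irradiance(0.0, 0.0), irradiance(1.5, 0.0))
```

The grain in the project's images is exactly the noise of this density estimate: it only fades as more photons pile into each disc.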
  • Another copy (Score:2, Informative)

    by andy314 ( 775757 ) on Sunday July 04, 2004 @10:43AM (#9605671)
    All 6 images: http://ca.geocities.com/andy314_1/photons.tar.gz
  • 650k PNG files? (Score:3, Informative)

    by fontkick ( 788075 ) on Sunday July 04, 2004 @11:21AM (#9605928)
    I would recommend that the submitter take down the zips and images and re-optimize them as smaller JPGs. A 650k file is just crazy for the actual image, which is only 512 pixels wide and blurry (due to depth-of-field effects). Just go into Photoshop, hit "Save for Web", and you can resize and change the JPG settings to your heart's content. I got one of the files down to 12k and it looks fine. These are not highly detailed images to start with. PNG may be the format of choice for geeks worldwide, but I've always thought it was worthless.
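The "Save for Web" step can also be scripted. Here is a sketch using the Pillow imaging library; Pillow, the file names, and the synthetic noise stand-in image are my substitutions, not anything from the project (the real blurry renders would compress far better than noise does):

```python
import os
import random

from PIL import Image  # third-party: pip install Pillow

random.seed(0)
# Stand-in for one of the renders: a 512x512 noise image (a worst case
# for both formats, used here only so the script is self-contained).
data = bytes(random.randrange(256) for _ in range(512 * 512 * 3))
img = Image.frombytes("RGB", (512, 512), data)

img.save("soup_test.png", optimize=True)
img.save("soup_test.jpg", quality=60, optimize=True)  # "Save for Web"-style

png_size = os.path.getsize("soup_test.png")
jpg_size = os.path.getsize("soup_test.jpg")
print(f"PNG: {png_size} bytes, JPG: {jpg_size} bytes")
```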
  • by vadim_t ( 324782 ) on Sunday July 04, 2004 @12:00PM (#9606234) Homepage
    Well, let me explain in more detail.

    In 1993, the computer I had was a 386 DX 40 with 4 MB of RAM and a 170 MB hard disk. 486s were recent and still very expensive.

    Today, I have a dual Athlon MP 2000+ with 1 GB of ECC DDR RAM and a 200 GB disk, which, when I bought it, cost about the price of a high-end single-CPU computer and definitely performed better.

    After googling a bit, I found that a SparcStation 1 had a 25 MHz CPU, 64 MB of RAM, and a 25 MHz bus. While I know perfectly well that MHz is not a good measure of performance, just the 25 MHz bus would ensure that machine had MUCH lower performance than anything modern.

    Now, while Sun hardware at that point was probably way beefier than consumer stuff, these days that doesn't seem to be true for Sun hardware at an affordable price.

    So, let's try a little estimation. Assuming current hardware has only 10x the performance of the SparcStation 1, he should have gotten the same result with just 10 volunteers running 24/7. From the 900 emails figure, it sounds like he got quite a few more than that.

    And despite this increase in computing power, somehow his per-month throughput was lower than 11 years ago.
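The parent's estimate, spelled out as arithmetic (the 10x speedup over a SparcStation 1 is the parent's assumption, and almost certainly a low-ball):

```python
# 1994 run: 100 SparcStation 1s for 1 month = 100 SS1 machine-months.
ss1_machine_months = 100 * 1.0
speedup = 10   # assumed modern-box / SS1 performance ratio

# Always-on volunteers needed to match the 1994 run in one month:
volunteers = ss1_machine_months / speedup
print(volunteers)  # -> 10.0
```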

  • BitTorrent download (Score:4, Informative)

    by JackZ ( 89183 ) on Sunday July 04, 2004 @02:17PM (#9607072) Homepage
    Have a link to a BitTorrent tracker with the images [chattanoogastate.edu].
    You will want 'photon_soup'

    Jack
  • by QuantumFTL ( 197300 ) on Sunday July 04, 2004 @03:09PM (#9607457)
    Computers got 3000 times faster, but Java managed to compensate for 11 years of evolution.

    If that's not flamebait, I'm not sure what is... Geez, how can you even say that?

    The previous article says:
    Year: 1994, Computers: 100 SparcStation 1s, Time: 1 month, Photons: 29 billion (29 billion/month)

    Now we have:
    Year: 2004, Computers: unknown, supposedly 3000 times faster, Time: 2.5 months, Photons: 45.3 billion (18 billion/month)

    If computers are indeed 3000 times faster, or heck, even 100, you should have gotten 72 billion out of just one of those computers running for the 2.5 months.


    Blockquoth the old article:
    Now computers are 3000 times faster and I am doing it again only much better, with a smaller aperture, in stereo, with 3 cameras, and with some errors fixed, and in Java.


    I don't know about you, but that sounds to me like he is complicating things quite a bit. It's hard to blame the slow execution on his choice of Java when he's making the problem considerably harder!

    Not to mention the fact that his code may not be as efficient... I can write a slow C program too if I use the wrong data structures or a bad algorithm.

    You're comparing apples to oranges here. Photon mapping is insanely complex in terms of the computational requirements, and doing 3 cameras with stereo, etc... Yeah.
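The 72-billion figure in the thread is easy to check from the quoted numbers:

```python
# 1994: 100 SparcStation 1s produced 29 billion photons in 1 month.
per_ss1_month = 29e9 / 100          # photons per SS1 per month

speedup = 100                       # the deliberately low-balled estimate
months = 2.5                        # length of the 2004 run

# One modern machine at 100x an SS1, running the full 2.5 months:
expected = per_ss1_month * speedup * months
print(f"{expected / 1e9:.1f} billion photons")  # -> 72.5 billion photons
```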
  • by rkeene517 ( 637993 ) on Sunday July 04, 2004 @03:35PM (#9607676) Homepage
    My most sincere apologies. I had so many emails that I couldn't reply to them all. I ended up putting the code on my web site, which is what I should have done in the first place. This was the first time I ever posted to slashdot, and it was a learning experience.
  • Re:650k PNG files? (Score:2, Informative)

    by azuretongue ( 180140 ) on Sunday July 04, 2004 @04:38PM (#9608132)
    Heck, use pngcrush:
    Best pngcrush method = 124 for soup_one.png (32.88% reduction)

    Best pngcrush method = 124 for soup_one_2.png (33.17% reduction)

    Best pngcrush method = 16 for soup_two.png (36.67% reduction)

    Best pngcrush method = 16 for soup_two_2.png (36.85% reduction)

    Best pngcrush method = 16 for soup_three.png (28.52% reduction)

    Best pngcrush method = 16 for soup_three_2.png (28.57% reduction)

    Pngcrush is free, open, and cross-platform enough to run on those Sun SparcStation 1s he seems to be using.
  • by Hast ( 24833 ) on Monday July 05, 2004 @01:36AM (#9610967)
    No, this is not a very usable way of rendering images, and he says so in the readme.

    Tracing photons is already used in the technique of photon mapping, which you can look up on the net; there is a load of information available on the topic, as well as numerous example images.

    It is a pity that he did not use one of the standard scenes available for testing renderers, because then it would have been easier to compare the results with existing renderers.

    The thing is, while this method simulates the scene very accurately, the scene itself is very simple. If you add effects used by normal photon mapping today, such as sub-surface scattering (used to create realistic skin on, e.g., Gollum) or hair/fur renderers, the computation time increases to something ridiculous. Since modern render farms struggle with less accurate simulations today, it is doubtful that this particular technique is useful.

    In fact, I doubt that this technique produces a result significantly more accurate than, e.g., photon mapping. (Which is, again, why the lack of a standard scene is regrettable.)

    But I do agree with you that there are a lot of less-than-clued-in people here at Slashdot who ridicule ideas they have no grasp of. (And as such make fools of themselves.) And while that may sound like a thinly veiled attack on you, it isn't meant as such. If you have some spare time, I recommend you play around with coding rendering software; it's quite easy to produce a working system, and in my experience it is very rewarding.
