
Google Programming Contest Winner 229

asqui writes "The First Annual Google Programming Contest, announced about 4 months ago, has ended. The winner is Daniel Egnor, a former Microsoft employee. His project converted street addresses found in documents into latitude-longitude coordinates and built a two-dimensional index of those coordinates, allowing you to limit a query to a certain radius around a geographical location. Good for difficult questions like 'Where is the nearest all-night pizza place that will deliver at this hour?' Unfortunately there is no mention of whether this technology is on its way to Google Labs yet. There are also details of the 5 other excellent project submissions that didn't quite make it."
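The winning approach can be sketched roughly as: geocode each document's address, bucket the coordinates into a coarse grid, and answer radius queries by scanning only nearby cells. A minimal illustration follows; the 1-degree grid, the haversine filter, and all names here are assumptions for demonstration, not Egnor's actual design.

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

class GeoIndex:
    """Toy two-dimensional index: documents bucketed into 1-degree grid cells."""

    def __init__(self):
        self.cells = {}  # (lat_cell, lon_cell) -> [(doc, lat, lon), ...]

    def add(self, doc, lat, lon):
        key = (math.floor(lat), math.floor(lon))
        self.cells.setdefault(key, []).append((doc, lat, lon))

    def within(self, lat, lon, radius_km):
        # One degree of latitude is ~111 km; scan every cell the radius
        # could touch. (A real index would widen the longitude span near
        # the poles, where degrees of longitude shrink.)
        span = int(radius_km / 111.0) + 1
        base = (math.floor(lat), math.floor(lon))
        hits = []
        for dlat in range(-span, span + 1):
            for dlon in range(-span, span + 1):
                cell = (base[0] + dlat, base[1] + dlon)
                for doc, plat, plon in self.cells.get(cell, []):
                    if haversine_km(lat, lon, plat, plon) <= radius_km:
                        hits.append(doc)
        return hits
```

The point of the grid is that a "pizza within 5 km" query touches a handful of cells instead of every indexed page.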

  • if i'd only known (Score:2, Interesting)

    by oogoody ( 302342 ) on Friday May 31, 2002 @09:33AM (#3616620)
    they wanted such boring submissions. The winning idea was cool, but the rest looks like free development for Google rather than something novel.
  • by Masem ( 1171 ) on Friday May 31, 2002 @09:34AM (#3616628)
    From the hon. mentions:
    Laird Breyer, for his project, Markovian Page Ranking Distributions: Some Theory and Simulations. This project examined various properties of the Markovian process behind Google's PageRank algorithm, and suggested some modifications to take into account the "age" of each link to reduce Pagerank's tendency to bias against newly-created pages.

    This may help to defeat the current practice of "Googlebombing": inflating the PageRank of a given page for a given keyword by having many people link to that page with link text containing that keyword. While I think the winner is a very interesting and useful project, this latter one will probably be implemented ASAP.
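For readers unfamiliar with the Markov process the honorable mention studies: PageRank is the stationary distribution of a random surfer who follows an outgoing link with probability d and teleports to a uniformly random page otherwise. A bare-bones power iteration is sketched below; Breyer's age-weighting modification is not specified in the summary, so only the baseline process is shown.

```python
def pagerank(links, damping=0.85, iters=50):
    """Basic PageRank by power iteration.

    `links` maps each page to the list of pages it links to.
    Returns a dict of ranks that sums to 1.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Teleportation mass, shared equally by all pages.
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                # Dangling page: spread its rank evenly over everything.
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank
```

The age bias the comment mentions falls out of this model: a new page has had no time to accumulate inlinks, so its stationary probability starts near the teleportation floor.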

  • 404 Page Not Found ? (Score:5, Interesting)

    by bigmouth_strikes ( 224629 ) on Friday May 31, 2002 @09:40AM (#3616667) Journal
    I'm surprised that there are so many 404 Page Not Found errors in Google's search results, even on the top hits.

    Shouldn't Google automatically check results that a user follows and flag those that cannot be displayed ?
  • by Peyna ( 14792 ) on Friday May 31, 2002 @09:51AM (#3616735) Homepage
    Sounds like it wasn't doing IP addresses or hostnames, but street addresses found in the text of pages. Using enough rules and a funky algorithm, you could probably get pretty accurate results for a good number of pages, enough to produce good search results at least.
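A rules-based extractor of the kind this comment imagines might start with nothing more than a pattern for "house number + capitalised street name + street suffix". The single rule below is a deliberately crude, hypothetical sketch; a real geocoder would need many more rules plus a gazetteer to turn the match into coordinates.

```python
import re

# One loose, hypothetical rule: house number, capitalised street name, suffix.
STREET = re.compile(
    r"\b(\d{1,5})\s+"
    r"([A-Z][A-Za-z]*(?:\s+[A-Z][A-Za-z]*)*)\s+"
    r"(St|Street|Ave|Avenue|Blvd|Boulevard|Rd|Road|Dr|Drive)\b"
)

def extract_addresses(text):
    """Return candidate street addresses found in free text."""
    return [" ".join(m.groups()) for m in STREET.finditer(text)]
```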
  • Nice (Score:3, Interesting)

    by Mr_Silver ( 213637 ) on Friday May 31, 2002 @09:52AM (#3616746)
    Whilst I'm very impressed with the winner, the entry "Robust Hyperlinks" is something that I do like a lot.

    What would be cool, would be the option to right click on the hyperlink and have the option "Find alternative location".

    Or even cooler, have IE (or your favourite browser), when putting up the 404 message, include a hyperlink which does the same. Hell, it's easy enough to do with Apache.
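The "Robust Hyperlinks" idea attaches a small lexical signature of the target page to the URL, so that on a 404 the signature can be fed to a search engine to relocate the page. A toy version of signature selection is sketched below; the TF/DF scoring and the `doc_freq` table are assumptions for illustration, not the entry's exact method.

```python
from collections import Counter

def lexical_signature(text, doc_freq, k=5):
    """Pick the k terms that best identify a page: common in the page,
    rare in the corpus.  `doc_freq` maps term -> number of documents
    containing that term (assumed corpus statistics)."""
    tf = Counter(word.lower() for word in text.split())

    def score(term):
        return tf[term] / (1 + doc_freq.get(term, 0))  # crude TF/DF weight

    return sorted(tf, key=score, reverse=True)[:k]
```

Appending such a signature to a link (say, as a query fragment) is exactly what would let a browser or an Apache 404 handler offer a "Find alternative location" search.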

  • by PhilHibbs ( 4537 ) <snarks@gmail.com> on Friday May 31, 2002 @10:03AM (#3616819) Journal
    Google's links are not redirected via their server, and a lot of people would object to them "gathering data on their users' browsing activities". However, automatically checking the top link after each search (or scheduling it for checking) should be possible.

    What should they do if a page is unavailable, though? What if it's only down for a few seconds?
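Distinguishing a dead page from a momentary outage, as this thread worries about, mostly comes down to rechecking before acting. A sketch of one policy is below; the HEAD probe and the three-strikes rule are arbitrary illustrative choices, not anything Google has documented.

```python
import urllib.request
import urllib.error

def link_status(url, timeout=5):
    """Return the HTTP status code for `url`, or None if unreachable."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # the server answered, e.g. 404
    except (urllib.error.URLError, OSError):
        return None  # network trouble: inconclusive, not a dead link

def looks_dead(statuses, strikes=3):
    """Flag a link only after `strikes` consecutive conclusive 404s,
    so a page that is down for a few seconds is not delisted."""
    conclusive = [s for s in statuses if s is not None]
    return len(conclusive) >= strikes and all(s == 404 for s in conclusive[-strikes:])
```

Treating network errors as "inconclusive" rather than "dead" is the key design choice: it answers the "down for a few seconds" objection without letting truly dead links linger forever.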
  • Wow (Score:1, Interesting)

    by Anonymous Coward on Friday May 31, 2002 @11:34AM (#3617479)
    What he failed to mention from his resume [ofb.net]:
    Miscellaneous Projects

    1995 - ongoing: Free Software

    I wrote and maintain Gale, an open source instant messaging system. Other free software projects of mine include Airhook, Liboop, and some XML processing tools.

    2001 - ongoing: Sweetcode

    I am the sole proprietor of Sweetcode, a web site that reports interesting free software. Sweetcode receives thousands of visitors daily; media reports include NTK, memepool, the Linux Weekly News, and others.

    2000 - ongoing: SeattleWireless

    I maintain the Node Map, a simple XML-based GIS which uses public mapping engines to display the location of community 802.11b wireless nodes in Seattle.
  • by po8 ( 187055 ) on Friday May 31, 2002 @12:13PM (#3617760)

    In a weird coincidence, I just spent a half-hour last night lecturing about Daniel Egnor's Iocaine Powder [ofb.net], winner of the First International RoShamBo Programming Competition [ualberta.ca]. Credit this guy with two award-winning pieces of extreme programming cleverness!

  • by JoeBuck ( 7947 ) on Friday May 31, 2002 @02:17PM (#3618577) Homepage
    chrysalis writes: Yet this is not the case. Trust me, all well ranked web sites for common keywords belong to a few companies that are actively cheating.

    This statement is easily refuted. Type "linux". The 10 sites you see all belong there, and I can guarantee you that most of them are not engaging in cheating.

    But since you admit that you work for a company that engages in this practice, perhaps it helps you sleep at night to believe that "everybody does it".
