And the Pulitzer Prize For SQL Reporting Goes To... (padjo.org) 27
theodp writes: Over at the Stanford Computational Journalism Lab, Dan Nguyen's Exploring the Wall Street Journal's Pulitzer-Winning Medicare Investigation with SQL is a pretty epic post on how one can use SQL to learn about Medicare data and controversial practices in Medicare billing, giving the reader a better appreciation for what was involved in the WSJ's Medicare Unmasked data investigation. So, how long until a journalist wins a Pulitzer for SQL reporting? And for all you amateur and professional Data Scientists, what data would you want to SELECT if you were a Pulitzer-seeking reporter?
And the award goes to... (Score:5, Funny)
Little Bobby Tables!
Wait, where did the award go...?
Re: (Score:2)
Re: (Score:2)
The duck goes quack. The pig goes oink. The sheep goes baaaa.
Yeah, we've all seen a See 'N Say, but few of us are still playing with one.
Anything but lecture (Score:2)
Yes. When I went to college, "lab" was the name for any section that students in a particular class were required to take in addition to a lecture.
If we can choose.... (Score:3)
And for all you amateur and professional Data Scientists, what data would you want to SELECT if you were a Pulitzer-seeking reporter?
SELECT convert_style(story, MY_WRITING_STYLE) FROM all_the_stories WHERE interest_score >= PULITZER_LEVEL;
Though I'd probably put a LIMIT on there so I don't publish too many Pulitzer winning stories at once.
Trend Analysis (Score:5, Insightful)
An award winning SELECT statement, in my opinion, would simply be one that asks an insightful question.
That's not what a Pulitzer Prize is for (Score:1)
I think people don't understand what the Pulitzer Prize is for these days. Once upon a time, it was given to inspire journalistic excellence, but these days it is just back-patting that says, "we endorse and agree with your leftist views." Just look at the prizes for Investigative Reporting - seven years into Obama's term and not a single Pulitzer has been awarded for investigating corruption and criminal behavior in his administration. [pulitzer.org] Not one. And this from a scandal-ridden Presidency that is ripe for
Re: (Score:2)
Just look at the prizes for Investigative Reporting - seven years into Obama's term and not a single Pulitzer has been awarded for investigating corruption and criminal behavior in his administration. [pulitzer.org] Not one.
How do you define "in his (Obama's) administration?" And versus how many during the Bush administration? I'm not suggesting there weren't any; I just don't know the numbers. I assume you do, since your statement implies, at the very least, that the answer is "more than one," and presumably also that the number is high enough to be statistically significant when it comes to exposing the bias you propose exists (and which, let me be clear, I have no reason to actively doubt).
Here ya go: (Score:1)
SELECT a.Headline, AVG(r.Rating) AS AvgRating, COUNT(DISTINCT v.IPAddress) AS Views
FROM `slashdot`.`articles` a
JOIN `slashdot`.`sources` s ON (s.ID = a.SourceID)
LEFT JOIN `slashdot`.`ratings` r ON (r.ArticleID = a.ID)
LEFT JOIN `slashdot`.`articleviews` v ON (v.ArticleID = a.ID)
WHERE s.Name LIKE '%dice%' OR (s.Name LIKE '%cowboy%' AND s.Name LIKE '%neal%')
GROUP BY a.ID, a.Headline
ORDER BY AvgRating DESC, Views DESC;
Reminds me of a prior job (Score:2)
I used to do similar stuff for marketing research: How many customers fitting a certain profile purchased product X and also product Y, with and without promotion Z.
The hard part was that the tables and product codes were messy. Historical baggage plagued the design. It would have been a relatively simple job with "clean" databases. (A lot of orgs have messy databases, by the way.)
I pushed the idea of views or re-constituted table copies for marketing queries, but the DBA was too booked on other projects an
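For what it's worth, the kind of "bought X and Y, with and without promotion Z" question described above might be sketched like this - against a purely hypothetical schema (customers, orders, order_items and all column names here are invented, not anyone's real tables), and assuming for simplicity that both products land on the same order:

-- Count profiled customers who bought both product X and product Y on one
-- order, split by whether promotion Z was applied to that order.
SELECT (o.promotion_code = 'Z') AS used_promo_z,
       COUNT(DISTINCT c.id) AS num_customers
FROM customers c
JOIN orders o       ON o.customer_id = c.id
JOIN order_items ix ON ix.order_id = o.id AND ix.product_code = 'X'
JOIN order_items iy ON iy.order_id = o.id AND iy.product_code = 'Y'
WHERE c.segment = 'target_profile'
GROUP BY used_promo_z;

With the messy product codes and historical baggage the parent describes, the real work would be in the WHERE and JOIN conditions, not the shape of the query.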
Sadly promotes the Flaws in Today's Reporting (Score:3)
The SQL tutorial looks at the numbers but doesn't emphasize two kinds of glaring omissions in the WSJ article:
a) Dr Weaver is charging for a procedure _labeled_ 'cardiac', but there is no mention of what the procedure is, its relevance to cardiology (if the label is accurate), or its relevance to internal medicine (Dr Weaver's _labeled_ current specialty). For all we know, Dr Weaver is an ex-cardiologist, now practicing internal medicine, for which he has found this procedure to be extremely useful in the patients he treats. For all we know, the procedure was mislabeled (esp. since it is pointed out that the data is noisy incl. spelling errors, multiple labels for the same thing, etc.)
b) At one point, Dr. Weaver's _statistical_ use of the procedure (99.5%) is compared to a raw numerical value (6) by Cleveland Clinic cardiologists. For all we know, the clinic cardiologists only saw 6 patients for whom the procedure was relevant, or they never use the procedure because they have other more relevant/current techniques, or patients who are seen by the clinic are at a point where the procedure isn't required.
While the SQL tutorial is an interesting look at how to verify the accuracy of the statistics in an article, it tacitly validates what is still poor reporting, i.e., the statistics need explanation and validation beyond simple numbers.
If you assume that most people are pretty honest (statistically they are), then the SQL queries are a neat way to highlight that the billing system (not the practitioners) is in need of a second or third look.
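The apples-to-apples check point (b) asks for would report rate and volume side by side, so a 99.5% share can be read next to its raw claim count. A sketch, assuming a hypothetical billing table (provider_id, procedure_code, claim_count - invented names, not the actual CMS schema):

-- Each provider's use of procedure 'P' as both a raw count and a share
-- of all their claims.
SELECT provider_id,
       SUM(CASE WHEN procedure_code = 'P' THEN claim_count ELSE 0 END) AS p_claims,
       SUM(claim_count) AS total_claims,
       100.0 * SUM(CASE WHEN procedure_code = 'P' THEN claim_count ELSE 0 END)
             / SUM(claim_count) AS pct_of_claims
FROM billing
GROUP BY provider_id
ORDER BY pct_of_claims DESC;

Comparing Dr. Weaver's pct_of_claims to the Cleveland Clinic's p_claims, as the article apparently did, mixes the two columns.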
Hmm, any select? (Score:2)
SELECT * FROM NSA.listeningDB UNION SELECT * FROM GCHQ.listeningDB;
SELECT * FROM GOV.lobbyists l WHERE l.funded > 0 AND l.friend IN (SELECT name FROM GOV.inpower);