Stupid pet tricks, aka Google diagnosing

From the BBC

A team of Australian doctors googled the symptoms of 26 cases for a study in the New England Journal of Medicine.

In 15 cases, the web search came up with the right diagnosis, the paper published on the British Medical Journal website reports.

In each of the 26 cases studied, researchers based at the Princess Alexandra Hospital in Brisbane selected three to five search terms from each case and did a Google search without knowing the correct diagnoses.

They then recorded the three diagnoses that were ranked most prominently and selected the one which seemed most relevant to the signs.

The doctors then compared the results with the correct diagnoses as published in the journal.

Google searches found the correct diagnosis in just over half of the cases.

The crucial paragraph from the BMJ paper seems to be:

We then did a Google search for each case while blind to the correct diagnosis (that is, before reading the differential diagnosis and conclusion of each case record). We selected and recorded the three most prominent diagnoses that seemed to fit the symptoms and signs.

They looked through 30-50 results to find three that seemed reasonable. If a real decision support tool like SimulConsult doesn’t rank the correct diagnosis within the top 10, that is considered evidence of failure, not success. When such a failure occurs, there is typically an important fact missing from the database, a situation that is then corrected. It is not clear whether Google has a similar ability to learn from the wisdom of its community of users.

Searches are less likely to be successful in complex diseases with non-specific symptoms … or common diseases with rare presentations …

These are the situations where decision support software is most useful.
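The ranking criterion described above amounts to a simple top-k evaluation: a search (or decision support tool) returns a ranked list of candidate diagnoses, and a case counts as a success only if the correct diagnosis appears within the top k. A minimal sketch, with illustrative function names and toy data that are not from the paper:

```python
def top_k_success(ranked_diagnoses, correct, k=10):
    """True if the correct diagnosis appears among the first k candidates."""
    return correct in ranked_diagnoses[:k]

def success_rate(cases, k=10):
    """Fraction of cases where the correct diagnosis ranks in the top k."""
    hits = sum(top_k_success(ranked, correct, k) for ranked, correct in cases)
    return hits / len(cases)

# Toy data, not from the study: each case pairs a ranked candidate list
# with the correct diagnosis.
cases = [
    (["lupus", "sarcoidosis", "tuberculosis"], "sarcoidosis"),
    (["migraine", "tension headache"], "temporal arteritis"),
]
print(success_rate(cases, k=3))  # 0.5
```

Under this criterion, a tool that buries the right answer at result 40 scores a failure even though a patient reader might eventually find it — which is the gap between the study's lenient method and how decision support tools are actually judged.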

Google is incredibly powerful: even if you are too lazy to think, you can type any thoughtless thing in and get something at least somewhat relevant out. That doesn’t mean Google should be used for clinical decision support.

November 10, 2006

5 thoughts on “Stupid pet tricks, aka Google diagnosing”

  1. I haven’t seen the paper, but I thought the background of the study was that laypeople, not docs, were using Google to make diagnoses.

    I honestly hadn’t thought of using it for my diagnostic dilemmas. I tend to stick to the old standbys: experience and training, and judicious use of consultative opinion.



  2. One of the comments: “I hope my doctor has better tools than Google.”

    He doesn’t. The medical profession is hampered by the fact that there are no computer diagnostic tools available to doctors. None at all.

    This is a result of the “drilling down”, the specialization that occurs, and the desire of Big Pharma to get their product prescribed, even if it is a misdiagnosis. The advantage of a misdiagnosis is that large quantities of expensive drugs get used experimentally, and without any diagnostic software, the ability of the doctor to converge on the truth is non-existent. The doctor is dependent on his own memory, or the expertise of colleagues.

    The Google effort of 50% correct is substantially ahead of the rate of a medical practitioner. For stupid pet tricks, rely on your doctor. For first diagnosis, he is more likely to be wrong. However, once the diagnosis is correct, you are more likely to get prescribed the drugs that are acceptable for current medical therapy. However, we are already aware that the medical profession eschews environmental, nutritional, and lifestyle approaches, so once more, you are on your own in a long-term therapeutic model where your doctor will recommend you take drugs for the rest of your life. Hmmmm …. I think I will give Google one more try ……. 🙂
