When we have concerns about our health, I would have thought that AI would be the last resource we should turn to for advice. Yet it seems that increasing numbers of people are using ChatGPT to help them ‘self-diagnose’.
I know I usually trawl the BBC website for stories to comment on, exposing the lies within their reports and thereby helping to dispel fear, but I recently came across an article on the CNET website that caught my eye.
This article, posted on 1st January 2024, is entitled ‘The AI Doctor Is In. Here's How ChatGPT May Pave a New Era of Self-Diagnosis’. The subtitle claims, ‘The chatbot is more than fun to use: It may be the new health assistant for those who need it most in 2024 and beyond.’
What is surprising to me is that, according to the article, some doctors are not totally against the idea. Although the subtitle refers to AI as a ‘health assistant’, have doctors not considered the likelihood that AI is being ‘trained’ to replace them? Do they not realise they are no longer regarded as indispensable, especially if patients are able to ‘self-diagnose’?
Although it is acknowledged later in the article that chatbots should not be used as a total replacement for doctors, the headline, subtitle, and opening of the article, which are probably all that most people will read, suggest that using AI to ‘self-diagnose’ is a useful way for people to ‘know’ what is wrong with them.
The article adds that,
“According to one estimate, 30 million people in the US are living with an undiagnosed disease. People who've lived for years with a health problem and no real answers may benefit most from new tools that allow doctors more access to information on complicated patient cases.”
This suggests that AI has access to more information than doctors, but why is this the case? Surely doctors have access to the same database of information and can therefore perform the same searches?
I’m certainly not denying that not knowing ‘what is wrong with you’ when you experience a variety of symptoms is a miserable situation to be in.
The point I want to make here is that the idea that there is ‘something wrong with you’, because you are experiencing symptoms and therefore ‘have’ a disease, is grossly misleading; or, to be more precise, it is completely false, because that is not how the body works.
The human body is perfectly capable of looking after itself through its innate self-healing processes. This, of course, is not good news for the medical establishment, which views these processes - the ones that manifest as ‘symptoms’ - as a problem for which only it can provide the solutions, and which therefore needs us to remain victims of our ‘diseases’.
It is therefore of paramount importance to them that everyone is made to believe that using AI to ‘self-diagnose’ is a good idea.
After all, who would need to visit a human doctor if people can use AI from the comfort of their home?
Another question to ponder is whether AI will be trained to provide prescriptions. I would suggest that, if it isn’t already happening, it will happen very soon!
This situation poses a serious challenge to the future career of every single medical practitioner, all of whom have spent a great deal of money and many years studying to become qualified doctors.
Their ability to diagnose correctly is, however, under question. Referring to comments by Sheila Wall, who is cited as the admin for an online group called Years of Misdiagnosed or Undiagnosed Medical Conditions, the CNET article states,
“Being undiagnosed is a miserable situation, and people need somewhere to talk about it and get information," she explained. Living with a health condition that hasn't been properly treated or diagnosed forces people to be more "medically savvy," Wall added.”
Although the purpose of this statement would seem to be to raise the question of why people’s health issues are being left undiagnosed, misdiagnosed, or improperly treated, there is an underlying intimation that AI could do a better job.
On this issue, I would like to make the point that technology and the software it utilises are only as good as the men and women who create them.
I would also like to mention the well-known computing acronym GIGO: garbage in, garbage out.
So, if AI is drawing data from the same information set, one that is based on the flawed assumption that ‘symptoms = disease’, it too will misdiagnose people and recommend incorrect treatments. The only difference is that AI can perform the search far more quickly than a human.
But speed is not a guarantee of greater accuracy. Far from it!
In fact, the CNET article admits that AI cannot always correctly identify health problems. It refers to a study published in the Journal of Medical Internet Research, which found that AI correctly identified the ‘disease’ in only certain cases and was almost completely ineffective at identifying rarer conditions.
Interestingly, one doctor is cited in the article as stating that ‘making the diagnosis is usually the easy part’, although it would seem that, according to Sheila Wall’s group, doctors do not always succeed in doing so. Nor are they able to provide people with genuine, long-term solutions. As a result of their training, they usually offer prescriptions for treatments that merely manage people’s symptoms, often for life.
The reason for these poor results is that the medical system, and consequently AI, which works from the same basic assumptions, possesses a flawed understanding of the human body and how it functions.
It’s important to remember that medical research studies only ever investigate ‘disease’, and these investigations occur mainly within a laboratory setting. For the most part, and again because of their training, medical researchers have no understanding of what disease is or why symptoms actually occur within the living human body.
The allopathic model of human health problems is based on the notion that diseases are distinct and identifiable conditions that just ‘happen’ to people and can be treated if correctly diagnosed.
This has never been proven to be the case.
Although they take a slightly different approach, many, but not all, ‘alternative’ health systems are also based on the same concept of ‘disease’; the main difference lies in the treatments they offer.
Within the mainstream system, diseases are classified as either ‘infectious’ or non-communicable (NCDs). I have discussed at length in previous articles here and elsewhere that there is no evidence that any disease is infectious.
So this leaves NCDs.
According to the September 2023 WHO fact sheet entitled ‘Noncommunicable diseases’,
“Noncommunicable diseases (NCDs), also known as chronic diseases, tend to be of long duration and are the result of a combination of genetic, physiological, environmental and behavioural factors.”
The fact sheet also claims that,
“Tobacco use, physical inactivity, the harmful use of alcohol, unhealthy diets and air pollution all increase the risk of dying from an NCD.”
Despite this claim, the medical establishment frequently admits in its documents and fact sheets to not knowing the causes of NCDs, including the main ‘killer’ diseases, such as heart disease and cancer. This also applies to conditions referred to as ‘autoimmune’ diseases, another misapplied label, because the body does not make mistakes and attack itself.
As I mentioned above, the symptoms that are claimed to demonstrate the presence of ‘a disease’ within the body are more correctly understood to be indicators of the body’s natural processes of self-healing and restoring homeostasis. Symptoms such as fevers, coughs, sore throats, fatigue, and rashes, which are commonly associated with ‘infectious diseases’, are more easily recognised as efforts of the body to self-cleanse and self-heal.
The symptoms associated with chronic conditions that are usually diagnosed as an NCD, on the other hand, are less easy to recognise as being the body’s own self-healing processes. But that does not mean this explanation is incorrect.
The different sets of symptoms associated with chronic conditions are indicative of problems at a deeper level within the body, a situation that is likely to be due to an accumulation of toxins over a prolonged period of time. Although toxins are known to often play a significant role in causing damage within the body, they are not solely chemical and electromagnetic in nature; the label ‘toxin’ can also be applied to emotional stressors.
It is clear that the past four years have been extremely stressful for everyone, so it is hardly surprising that there has been, and will continue to be, an upsurge in the number of people who experience symptoms of a more chronic nature.
If these people use ChatGPT, or any other AI system, to investigate their health problems, they are likely to believe they actually do ‘have’ the condition that AI suggests. This will set up a whole cascade of beliefs and ideas, none of which are conducive to the restoration of health; they are far more likely to lead to the perpetuation, and even worsening, of their health problems.
I have much more to say about the role of our thoughts, beliefs, emotions, fears and ideas, and will do so in future writings.
For now, I would just like to emphasise the importance of understanding that our health is largely in our own hands. And even if asking AI for a diagnosis of our aches and pains seems like ‘fun’, as CNET suggests, this will never help us restore ourselves to the state of health.
It is only by understanding how to support our body in its self-healing processes that we are able to experience the state of health, which includes our emotions.
Although it should be self-evident, AI, by contrast, has absolutely no understanding of human emotions - nor can it ever gain this understanding.
Even if, as I have recently discovered, AI can produce beautiful poems, and even amazing lyrics and music, I would emphasise that no machine can actually experience the emotions that a poem or song conveys to us as men and women.
I would add that no machine can ever experience and appreciate the beauty of nature or the feeling of love.
❤️❤️❤️