Drug War Facts: Addictive Qualities of Popular Drugs

Withdrawal: Presence and severity of characteristic withdrawal symptoms.
Reinforcement: A measure of the substance's ability, in human and animal tests, to get users to take it again and again, and in preference to other substances.
Tolerance: How much of the substance is needed to satisfy increasing cravings for it, and the level of stable need that is eventually reached.
Dependence: How difficult it is for the user to quit, the relapse rate, the percentage of people who eventually become dependent, the rating users give their own need for the substance and the degree to which the substance will be used in the face of evidence that it causes harm.
Intoxication: Though not usually counted as a measure of addiction in itself, the level of intoxication is associated with addiction and increases the personal and social damage a substance may do.

Source: Jack E. Henningfield, PhD, for NIDA; reported by Philip J. Hilts, "Is Nicotine Addictive? It Depends on Whose Criteria You Use," New York Times, Aug. 2, 1994.

Barry Bonds #756

Career home run #756 for Barry Bonds, putting him first on the all-time list, and the celebration that followed.
YouTube - Barry Bonds #756

Facts About Money & People

- More of our fantasies are about money... than sex.
- Money is the leading cause of disagreements in marriages.
- 65% of Americans would live on a deserted island all by themselves for an entire year for $1,000,000.
- For $10,000,000, most of us would do almost ANYTHING, including abandoning our family, our friends and our church. A very high percentage of us would, for that same amount of money, change our race or sex.
What's really strange is that the statistics stay the same all the way from ten million dollars down to three million. For three million bucks, most of us would do the same horrible things we would do for ten million. But, guess what? Few of us would do these things for a "measly" two million. And 1 in every 14 of us would even murder someone for ten million bucks.
- If we could have any luxury in the world (and money didn't matter) more of us would choose to spend money on a butler and a maid than anything else.
- 90% of Americans who own pets buy them Christmas gifts.
- 92% of us would rather be rich than find the love of our lives.
- Money (or the lack thereof) is the biggest stress inducer in the lives of Americans. We worry more about money than about our marriages or our health.
- Women have very fixed ideas about how much they are willing to spend on a bra. 38.3% of women won't spend $30 on a bra. 28.4% won't spend $50. 10% would pay as much as $75. And only 3.5% would shell out $100. But, you know what? Almost 20% of women say they would pay almost anything for a bra, because they consider the contents those bras encase to be of extremely high value.
- Nearly half of the people who sell their houses with furniture included will take all the light bulbs out of all the lamps when they vacate the premises.
- Most people won't bend over to pick up money lying on the sidewalk unless it's at least a dollar.
- There are about 405 billion dollars in circulation. Only 32 million of that is counterfeit, which means the percentage of counterfeit money in America is just 0.0079%. And $20 bills are counterfeited more often than $100 bills.
- Fresh, crisp, clean bills are considered much more valuable than those which are old, wrinkled and dirty.
- Let's flip a coin and try to guess whether it will come up heads or tails. Three times as many people guess 'heads' as guess 'tails'.
- One out of every four Americans believe their best chance of getting rich is by playing the lottery.
- 5% of lottery ticket buyers buy 51% of all tickets sold.

- A staggering 74% of us are influenced by how much we can win in a lottery as opposed to the odds of us winning.
- The odds of winning a lottery jackpot are about 10 million to 1.
- A person who drives 10 miles to buy a lottery ticket is 3 times more likely to be killed in a car accident while driving to buy the ticket... than... he is to win the jackpot.
- Only 6% of people in America regularly buy clothes tailor made just for them.
- 63% of us decide NOT to buy a product advertised on the Internet... because... we think the shipping and handling charges add too much to the order.
- Eight times as many Americans would rather use an ATM than deal with a real live teller.
- 83% of Americans still pay with checks instead of credit cards!
- Almost 30% of us say we would need $3 million to feel rich. This ties in with the fact that most of us would do anything for as little as $3 million... but... not nearly as many of us would do those identical things for a measly $2 million.
- 80% of Americans say giving personal information (especially their credit card information) over the Internet scares the living shit out of them.
- Two-thirds of Americans say they wouldn't let their spouse spend the night and have sex with another person for a million dollars.
- The average wedding in America costs a staggering $20,000.00.
- More than one-third of American women consider money more important than good sex to the success of a marriage.
- When it comes to houses, more than anything else, people want a state-of-the-art kitchen.
- When people shop for a car, what they want more than anything else is reliability for the best possible price.
- People tip more on sunny days than they do on dreary days.
- Almost two out of three people have modified their financial behavior because of their fears.
- More women would actually rather have an unlimited shopping spree than a fabulous lover.
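Two of the numbers in the list above can be sanity-checked with quick arithmetic. This is only a sketch: the counterfeit figures are the article's own, while the driving-fatality rate of roughly 1.25 deaths per 100 million vehicle-miles is an assumed ballpark US figure, not something the article states.

```python
# Counterfeit-currency share: $32 million counterfeit out of ~$405 billion in circulation.
total_usd = 405e9
counterfeit_usd = 32e6
share_pct = counterfeit_usd / total_usd * 100
print(f"{share_pct:.4f}%")  # matches the article's 0.0079%

# Lottery-vs-driving comparison: jackpot odds from the article,
# fatality rate per vehicle-mile is an assumed ballpark figure.
deaths_per_mile = 1.25 / 100_000_000
p_death_round_trip = deaths_per_mile * 20  # 10 miles each way
p_jackpot = 1 / 10_000_000
print(p_death_round_trip / p_jackpot)  # ratio on the order of the article's "3 times"
```

Under these assumptions the counterfeit share works out to 0.0079% exactly as claimed, and the driving-vs-jackpot ratio lands in the low single digits, the same order of magnitude as the "3 times more likely" claim.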

Money Facts NicheGeek.com

One on One - The Dalai Lama


He calls himself a simple Tibetan monk, but he is far more than that to his people and most of the Buddhist world.
He is a Nobel Peace laureate who has had to live in exile in northern India since he fled his homeland as a teenager.
Tenzin Gyatso, known around the world simply as the Dalai Lama, was the first Tibetan spiritual leader to travel to the West.
It is a trend he has continued as he regularly meets with world leaders to publicise the cause for a free Tibet.
Tibet is governed by China, and the Dalai Lama has led a government-in-exile in Dharamsala, in northern India, since fleeing his homeland in 1959.
Tenzin Gyatso has had a colourful life from an early age. He was taken from his family around the age of two, for intensive training, after being identified as the reincarnation of the 13th Dalai Lama. He was enthroned as Tibet's head of state at the age of 15.
In 1989, this simple monk – to use his words – was awarded the Nobel Peace Prize for his efforts for a peaceful resolution in the struggle for a free Tibet.

The Placebo Effect

A growing body of research reveals not just psychological and perceptual components to the placebo effect but also a biochemical substrate to the mechanism.

Extracted from Nexus Magazine, Volume 14, Number 4 (June - July 2007)
PO Box 30, Mapleton Qld 4560 Australia. editor@nexusmagazine.com
Telephone: +61 (0)7 5442 9280; Fax: +61 (0)7 5442 9381
From our web page at: www.nexusmagazine.com
by Peter Arguriou © 2007
A neglected phenomenon
One of the most commonly used terms in medical language is the word placebo. The placebo effect is used as a scale for evaluating the effectiveness of new drugs. But what exactly is the placebo effect and what are its consequences in the deterministic structure of Western medicine?
The placebo effect has frequently been abused by health professionals to denote and stigmatise a fraud or fallacy, and alternative therapies have often been characterised as mere placebos. But the placebo effect is not a fraudulent, useless or malevolent phenomenon. It occurs independently of the intentions of charlatans or health professionals. It is a spontaneous, authentic and very factual phenomenon, referring to well-observed but uninterpreted therapies or health improvements that occur in the absence of an active chemical/pharmacological substance. Make-believe drugs (drugs that carry no active chemical substances) often act like real drugs, provoking therapeutic effects when administered to patients.
In many drug trials, the manufacturers of the drug sadly discover that their product is in no way superior to the effect of a placebo. But that does not mean that a placebo equates to a null response of the human organism. On the contrary, a placebo denotes non-chemical stimuli that strongly motivate the organism towards a therapeutic course. That is, the placebo effect is dependent not on the drug's effectiveness but solely on therapeutic intention and expectation.

Effects of positive and negative thinking
The placebo effect has been often misunderstood as a solely psychological and highly subjective phenomenon. The patient, convinced of the therapy's effectiveness, ignores his symptoms or perceives them faintly without any substantial improvement of his health; that is, the patient feels better but is not healthier. But can the subjective psychological aspect of the placebo effect account for all of its therapeutic properties? The answer is definite: the placebo effect refers to an alternative curative mechanism that is inherent in the human entity, is motivated by therapeutic intention or belief in the therapeutic potential of a treatment, and implies biochemical responses and reactions to the stimulus of therapeutic intention or belief.
But placebos are not always beneficial: they can also have adverse effects. For example, administering a pharmacologically inactive substance to some patients can sometimes bring about unexpected health deteriorations. A review of 109 double-blind studies estimated that 19 per cent of placebo recipients manifested the nocebo effect: unexpected deteriorations of health.1 In a related experiment, researchers falsely declared to the volunteers that a weak electrical current would pass through their head; although there was no electrical current, 70 per cent of the volunteers (who were medical students) complained of a headache after the experiment.2
In a group of patients suffering from carotid atherosclerosis, prognosis and progression of the disease were burdened when their psychological health was bad (i.e., they were affected by hopelessness or depression). In another group of carotid atherosclerosis patients, prognosis and progression were burdened not only by hopelessness but also by hostility. In patients with coronary heart disease, hopelessness was a determinative risk factor. Social isolation, work stress and hostility comprised additional risk factors.
Positive or negative thinking seems to be a decisive risk factor for every treatment, perhaps even more important than medical intervention.
The nocebo effect appears to have a specific biological substrate. A group of 15 men whose wives suffered from terminal cancer participated in a small prospective study. After their wives' deaths, the men experienced severe grief that caused immunodepression: for a period after the deaths, their lymphocytes responded poorly to mitogens. Grief had assaulted their immune systems. The study proposed that grief and grief-induced immunodepression resulted in the high mortality of this group.

A short history of a small miracle
The term placebo (meaning "I shall please") was used in mediaeval prayer in the context of the phrase Placebo Domino ("I shall please the Lord") and originated from a biblical translation of the fifth century AD.7 During the 18th century, the term was adopted by medicine and was used for preparations of no therapeutic value that were administered to patients as "decoy drugs". The term began to transform in 1920 (Graves) and, through various intermediate stages (Evans and Hoyle, 1933; Gold, Kwit and Otto, 1937; Jellinek, 1946), was fully transformed in 1955, when it finally claimed an important portion of the therapeutic effect in general. Henry K. Beecher, in his 1955 paper "The Powerful Placebo", attributed roughly 30 per cent of the overall therapeutic benefit to the placebo effect.
In certain later studies, the placebo effect was estimated even higher, at 60 per cent of the overall therapeutic outcome. In a recent review of 39 studies regarding the effectiveness of antidepressant drugs, psychologist Guy Sapirstein concluded that 50 per cent of the therapeutic benefit came from the placebo effect, with a modest 27 per cent attributed to drug intervention (fluoxetine, sertraline and paroxetine). Three years later, Sapirstein, along with fellow psychologist Irving Kirsch, processed the data from 19 double-blind studies of depression and reached an even higher percentage of therapeutic results attributed to the placebo effect: 75 per cent of depression therapies or ameliorations were placebo induced!
Hróbjartsson and Gotzsche (2001, 2004) doubted the effectiveness of the placebo phenomenon, attributing it solely to the subjective factors of human psychology. And indeed, there is a major aspect of the placebo effect related to psychology. In two studies where only placebos were administered, the placebo effect seemed to be shaped by the subject's perception of the applied therapy: two placebo pills were better than one, bigger pills were better than smaller ones, and injections were better still.16
The placebo induced a reaction not only to the therapy but also to its form, suggesting that the placebo phenomenon is shaped according to the personal symbolic universe of the patient. Before the placebo response occurs, human perception has already interpreted the applied therapy and has prepared a certain response to it. It would appear that not only chemical but also non-chemical stimuli participate in the motivation of the human organism towards therapy.
But is the placebo reaction solely a psychological phenomenon or does it have additional tangible somatic effects?
One of the more dramatic events regarding placebo therapy was reported in 1957, when a new wonder drug, Krebiozen, held promise as the final solution to the cancer problem. A patient with metastatic tumours and fluid collection in his lungs, who required a daily intake of oxygen and the use of an oxygen mask, heard of Krebiozen. His doctor was participating in Krebiozen research, and the patient begged to be given the revolutionary drug. Moved by the patient's desperation, the doctor did so and witnessed a miraculous recovery: the tumours melted away and the patient returned to an almost normal lifestyle. The recovery didn't last long. The patient read articles reporting that Krebiozen was not delivering what it promised in cancer therapy, and he relapsed; his tumours were back. His doctor, deeply affected by the deterioration, resorted to a desperate trick. He told his patient that he had in his possession a new, improved version of Krebiozen. It was simply distilled water. The patient fully recovered after the placebo treatment and remained functional for two months. The final verdict on Krebiozen, published in the press, proved the drug to be totally ineffective. That was the coup de grâce for the patient, who died a few days later.
In spite of the melodrama of the Krebiozen case, no single case or personal testimony can prove a therapy to be effective. Statistical studies, not personal testimonies, verify a proposed therapy's effectiveness, and well-planned studies confirm that the placebo phenomenon has somatic properties.
One such study was implemented in 1997. The two study groups consisted of patients with benign prostatic hypertrophy. One group took actual medication while the control group received placebo treatment. The placebo recipients reported relief from their symptoms and even amelioration of their urinary function. A placebo has also been reported to act as a bronchodilator in asthmatic patients, or to have the exact opposite effect (respiratory depression), depending on the description of the pharmacological action the researchers gave to the patients and therefore the effect the patients anticipated.
A placebo proved highly efficient against food allergies and, subsequently, impressively effective at sinking a biotechnology company on the stock market. How could that be? Peptide Therapeutics Group, a biotech company, was preparing to launch a novel vaccine for food allergies. The first reports were encouraging. When the experimental vaccine reached the clinical-trials stage, the company's spokesperson boasted that the vaccine had proved effective in 75 per cent of cases, a percentage that usually suffices to prove a drug's effectiveness. But the good news didn't last long. The control group, given a placebo, did almost as well: seven out of 10 patients reported getting rid of their food allergies. The company's stock value plunged by 33 per cent. The placebo effect on food allergies created a nocebo effect on the stock market! In another case, a genetically designed heart drug that had raised high hopes for Genentech was clobbered by a placebo.
As aptly put by science historian Anne Harrington, placebos are "ghosts that haunt our house of biomedical objectivity and expose the paradoxes and fissures in our own self-created definitions of the real and active factors in treatment".
The placebo's pharmacomimetic behaviour can even imitate a drug's side effects. In a 1997 study of patients with benign prostate hypertrophy, some patients on a placebo complained of various side effects ranging from impotence and reduced sexual activity to nausea, diarrhoea and constipation. Another study reported placebo side effects as including headaches, vomiting, nausea and a variety of other symptoms.

The placebo effect in surgery
But how deep can the placebo effect trespass into the well-defined area of medicine? Surely it can't joust with medicine's strike force; it cannot challenge surgery. Or can it?
In 1939, an Italian surgeon named Davide Fieschi invented a new technique for treating angina pectoris (chest pain due to ischaemia, a lack of blood/oxygen reaching the heart muscle, usually caused by obstruction of the coronary arteries).24 Reasoning that increased blood flow to the heart would reduce his patients' pain, he made tiny incisions in their chests and tied off the two internal mammary arteries. Three quarters of the patients showed improvement; one quarter of them were cured. The surgical intervention became standard procedure for the treatment of angina for the next 20 years. But in 1959, a young cardiologist, Leonard Cobb, put the Fieschi procedure to the test. He operated on 17 patients: on eight of them he followed the standard procedure; on the other nine he performed only the tiny incisions, letting the patients believe that they'd had the real thing. The result was a real upset: those who'd had the sham surgery did as well as those who'd had the Fieschi technique.25 That was the end of the Fieschi technique and the beginning of the documented surgical placebo effect.
In 1994, surgeon J. Bruce Moseley experimented with the surgical placebo. He split a small group of patients suffering from osteoarthritis of the knee into two equal groups. Both groups were told that they would undergo arthroscopic surgery, but only the first group got the real thing. The other group was left virtually untreated, with the doctor performing only tiny incisions to make the arthroscopic scenario credible. Similar results were reported in both groups.26
Moseley, stunned by the outcome, decided to perform the trial with a larger statistical sample in order to reach safer conclusions. The results were replicated: arthroscopic surgery was equal therapeutically to the placebo effect.27 The placebo had found its way into surgical rooms.
Perhaps the most impressive aspect of surgical placebo arose in a groundbreaking 2004 study. In the innovative field of stem-cell research, a new approach was taken with Parkinson's disease. Human embryonic dopamine neurons were implanted through tiny holes in the patients' brains. Once again, the results were encouraging. And once again, the procedure failed to do better than a placebo. In this case, the placebo involved tiny holes incised in the skull without implantation of stem cells. As the researchers confessed, "The placebo effect was very strong in this study".28
But how can it be that the therapeutic expectancy alone often produces results equal to those from actual surgery? It appears that the mind is exerting control over somatic processes, including diseases. The biochemical traces of this influence are only beginning to be outlined. Modern research indicates a biological, tangible substrate to the placebo effect.

Somatic pathways
In the mid-1990s, researcher Fabrizio Benedetti conducted a novel experiment whereby he induced ischaemic pain and soothed it by administering morphine. When morphine was replaced by a saline solution, the placebo presented analgesic properties. However, when naloxone (an opiate antagonist) was added to the saline solution, the analgesic properties of the water were blocked. Benedetti reached the conclusion that the placebo's analgesic properties were a result of specific biochemical paths. Naloxone blocked not only morphine but also endogenous opioids-the physical pain-relievers.29
The endogenous opioids, endorphins, were discovered in 1974 and act as pain antagonists. Benedetti's suggestion of a placebo-induced release of endorphins was supported by findings produced with MRI and PET scans.30 Placebo-induced endorphin release also affects heart rate and respiratory activity.31 As researcher Jon-Kar Zubieta described, "...this [finding] deals another serious blow to the idea that the placebo effect is a purely psychological, not physical, phenomenon".32
Further findings support the notion that the placebo effect presents a biochemical substrate in both depression and Parkinson's disease. Analysing the results of PET scans, researchers estimated the glucose metabolism in the brains of patients with depression. Glucose metabolism under placebo presented differentiations that were similar to those caused by antidepressants such as fluoxetine.33 In patients suffering from Parkinson's disease, a placebo injection promoted dopamine secretion in a similar way to that caused by amphetamine administration.34 Benedetti demonstrated that the placebo effect provoked decreased activity in single neurons of the subthalamic nucleus in patients with Parkinson's disease.35
From numerous research findings, it is logical and rather safe to conclude that there is a biochemical substrate to the placebo effect. But what is more intriguing still is its relation to perception. It would appear that perception, and the codes and symbols that the animate computer, the brain, utilises to process internal and external information, strongly determine the potency and form of the placebo response.
In a recent study, patients were purposely misinformed that they had been infected by hazardous bacilli and they subsequently underwent treatment. However, there were no bacilli and the treatment administered was a placebo. Guess what? Some of the study subjects developed infection-like conditions that were not treatable by the placebo medication.36 The mind interpreted the fictional bacilli as hazardous and instructed the body to respond to them as if they were real.
Despite the placebo's potency and its importance for a new perception of health where body and mind heavily interact, large numbers of scientists continue to regard the placebo as an insignificant systematic error, a troublesome nought. According to cancer researcher Gershom Zajicek: "There is nothing in the pharmacokinetic theory which accounts for the placebo effect. In order to keep the theory consistent, the placebo effect is regarded as random error or noise which can be ignored."37
One of the most perceptive placebo researchers was Stewart Wolf, "the father of psychosomatic medicine", who as early as 1949 had given it a thorough description. Wolf not only defended the placebo as a non-fictional and very "real" phenomenon but also described the placebo's pharmacomimetic behaviour. He was perhaps the first researcher to correlate the placebo effect not only with psychology and predisposition but also with perception. More than half a century ago, he stated that "the mechanisms of the body are capable of reacting not only to direct physical and chemical stimulation but also to symbolic stimuli, words and events which have somehow acquired special meaning for the individual".38
In this context, a pill is not merely an active substance but also a therapeutic symbol and thus the organism is able to respond not only to its chemical content but also to its symbolic content. Likewise a bacillus, beyond its physical properties, acquires symbolic properties that can cause an organism's reaction even in the absence of the bacillus.
The presence and extent of the nocebo effect should also be studied in regard to drug resistance. Perhaps drug resistance is a multifactorial phenomenon involving not only microbial evolutionary aptness but also human psyche mechanics. Placebo and nocebo phenomena might prove fundamental not only on the personal level but also in the public health arena.
They might even provide the foundation stone for a new model of health, a new medicine that was envisioned by Wolf in the 1950s: "...in the future, drugs will be assessed not only with reference to their pharmacologic action but also to the other forces at play and to the circumstances surrounding their administration".39
Five centuries ago, Swiss alchemist and physician Paracelsus (1493-1541) wrote: "You must know that the will is a powerful adjuvant of medicine." It seems that our scientific arrogance has blinded us to the teachings of the past.

NEXUS: The Placebo Effect

Origins of the New Testament

(Warning: religious content. If you have strong religious beliefs, this video might be offensive.)

In the fourth century, the Roman Emperor Constantine united all religious factions under one composite deity, and ordered the compilation of new and old writings into a uniform collection that became the New Testament.

It has often been emphasised that Christianity is unlike any other religion, for it stands or falls by certain events which are alleged to have occurred during a short period of time some 20 centuries ago. Those stories are presented in the New Testament, and as new evidence is revealed it will become clear that they do not represent historical realities. The Church agrees, saying:
"Our documentary sources of knowledge about the origins of Christianity and its earliest development are chiefly the New Testament Scriptures, the authenticity of which we must, to a great extent, take for granted."
(Catholic Encyclopedia, Farley ed., vol. iii, p. 712)

The Church makes extraordinary admissions about its New Testament. For example, when discussing the origin of those writings, "the most distinguished body of academic opinion ever assembled" (Catholic Encyclopedias, Preface) admits that the Gospels "do not go back to the first century of the Christian era" (Catholic Encyclopedia, Farley ed., vol. vi, p. 137, pp. 655-6). This statement conflicts with priesthood assertions that the earliest Gospels were progressively written during the decades following the death of the Gospel Jesus Christ. In a remarkable aside, the Church further admits that "the earliest of the extant manuscripts [of the New Testament], it is true, do not date back beyond the middle of the fourth century AD" (Catholic Encyclopedia, op. cit., pp. 656-7). That is some 350 years after the time the Church claims that a Jesus Christ walked the sands of Palestine, and here the true story of Christian origins slips into one of the biggest black holes in history. There is, however, a reason why there were no New Testaments until the fourth century: they were not written until then, and here we find evidence of the greatest misrepresentation of all time.
It was British-born Flavius Constantinus (Constantine, originally Custennyn or Custennin) (272-337) who authorised the compilation of the writings now called the New Testament. After the death of his father in 306, Constantine became King of Britain, Gaul and Spain, and then, after a series of victorious battles, Emperor of the Roman Empire. Christian historians give little or no hint of the turmoil of the times and suspend Constantine in the air, free of all human events happening around him. In truth, one of Constantine's main problems was the uncontrollable disorder amongst presbyters and their belief in numerous gods.
The majority of modern-day Christian writers suppress the truth about the development of their religion and conceal Constantine's efforts to curb the disreputable character of the presbyters who are now called "Church Fathers" (Catholic Encyclopedia, Farley ed., vol. xiv, pp. 370-1). They were "maddened", he said (Life of Constantine, attributed to Eusebius Pamphilius of Caesarea, c. 335, vol. iii, p. 171; The Nicene and Post-Nicene Fathers, cited as N&PNF, attributed to St Ambrose, Rev. Prof. Roberts, DD, and Principal James Donaldson, LLD, editors, 1891, vol. iv, p. 467). The "peculiar type of oratory" expounded by them was a challenge to a settled religious order (The Dictionary of Classical Mythology, Religion, Literature and Art, Oskar Seyffert, Gramercy, New York, 1995, pp. 544-5). Ancient records reveal the true nature of the presbyters, and the low regard in which they were held has been subtly suppressed by modern Church historians. In reality, they were:
"...the most rustic fellows, teaching strange paradoxes. They openly declared that none but the ignorant was fit to hear their discourses ... they never appeared in the circles of the wiser and better sort, but always took care to intrude themselves among the ignorant and uncultured, rambling around to play tricks at fairs and markets ... they lard their lean books with the fat of old fables ... and still the less do they understand ... and they write nonsense on vellum ... and still be doing, never done."
(Contra Celsum ["Against Celsus"], Origen of Alexandria, c. 251, Bk I, p. lxvii, Bk III, p. xliv, passim)
Clusters of presbyters had developed "many gods and many lords" (1 Cor. 8:5) and numerous religious sects existed, each with differing doctrines (Gal. 1:6). Presbyterial groups clashed over attributes of their various gods and "altar was set against altar" in competing for an audience (Optatus of Milevis, 1:15, 19, early fourth century). From Constantine's point of view, there were several factions that needed satisfying, and he set out to develop an all-embracing religion during a period of irreverent confusion. In an age of crass ignorance, with nine-tenths of the peoples of Europe illiterate, stabilising religious splinter groups was only one of Constantine's problems. The smooth generalisation, which so many historians are content to repeat, that Constantine "embraced the Christian religion" and subsequently granted "official toleration", is "contrary to historical fact" and should be erased from our literature forever (Catholic Encyclopedia, Pecci ed., vol. iii, p. 299, passim). Simply put, there was no Christian religion at Constantine's time, and the Church acknowledges that the tale of his "conversion" and "baptism" are "entirely legendary" (Catholic Encyclopedia, Farley ed., vol. xiv, pp. 370-1).
Constantine "never acquired a solid theological knowledge" and "depended heavily on his advisers in religious questions" (Catholic Encyclopedia, New Edition, vol. xii, p. 576, passim). According to Eusebius (260-339), Constantine noted that among the presbyterian factions "strife had grown so serious, vigorous action was necessary to establish a more religious state", but he could not bring about a settlement between rival god factions (Life of Constantine, op. cit., pp. 26-8). His advisers warned him that the presbyters' religions were "destitute of foundation" and needed official stabilisation (ibid.).
Constantine saw in this confused system of fragmented dogmas the opportunity to create a new and combined State religion, neutral in concept, and to protect it by law. When he conquered the East in 324 he sent his Spanish religious adviser, Osius of Córdoba, to Alexandria with letters to several bishops exhorting them to make peace among themselves. The mission failed and Constantine, probably at the suggestion of Osius, then issued a decree commanding all presbyters and their subordinates "be mounted on asses, mules and horses belonging to the public, and travel to the city of Nicaea" in the Roman province of Bithynia in Asia Minor. They were instructed to bring with them the testimonies they orated to the rabble, "bound in leather" for protection during the long journey, and surrender them to Constantine upon arrival in Nicaea (The Catholic Dictionary, Addis and Arnold, 1917, "Council of Nicaea" entry). Their writings totalled "in all, two thousand two hundred and thirty-one scrolls and legendary tales of gods and saviours, together with a record of the doctrines orated by them" (Life of Constantine, op. cit., vol. ii, p. 73; N&PNF, op. cit., vol. i, p. 518).

The First Council of Nicaea and the "missing records"
Thus, the first ecclesiastical gathering in history was summoned and is today known as the Council of Nicaea. It was a bizarre event that provided many details of early clerical thinking and presents a clear picture of the intellectual climate prevailing at the time. It was at this gathering that Christianity was born, and the ramifications of decisions made at the time are difficult to calculate. About four years prior to chairing the Council, Constantine had been initiated into the religious order of Sol Invictus, one of the two thriving cults that regarded the Sun as the one and only Supreme God (the other was Mithraism). Because of his Sun worship, he instructed Eusebius to convene the first of three sittings on the summer solstice, 21 June 325 (Catholic Encyclopedia, New Edition, vol. i, p. 792), and it was "held in a hall in Osius's palace" (Ecclesiastical History, Bishop Louis Dupin, Paris, 1686, vol. i, p. 598). In an account of the proceedings of the conclave of presbyters gathered at Nicaea, Sabinius, Bishop of Hereclea, who was in attendance, said, "Excepting Constantine himself and Eusebius Pamphilius, they were a set of illiterate, simple creatures who understood nothing" (Secrets of the Christian Fathers, Bishop J. W. Sergerus, 1685, 1897 reprint).
This is another luminous confession of the ignorance and uncritical credulity of early churchmen. Dr Richard Watson (1737-1816), a disillusioned Christian historian and one-time Bishop of Llandaff in Wales (1782), referred to them as "a set of gibbering idiots" (An Apology for Christianity, 1776, 1796 reprint; also, Theological Tracts, Dr Richard Watson, "On Councils" entry, vol. 2, London, 1786, revised reprint 1791). From his extensive research into Church councils, Dr Watson concluded that "the clergy at the Council of Nicaea were all under the power of the devil, and the convention was composed of the lowest rabble and patronised the vilest abominations" (An Apology for Christianity, op. cit.). It was that infantile body of men who were responsible for the commencement of a new religion and the theological creation of Jesus Christ.
The Church admits that vital elements of the proceedings at Nicaea are "strangely absent from the canons" (Catholic Encyclopedia, Farley ed., vol. iii, p. 160). We shall see shortly what happened to them. However, according to records that endured, Eusebius "occupied the first seat on the right of the emperor and delivered the inaugural address on the emperor's behalf" (Catholic Encyclopedia, Farley ed., vol. v, pp. 619-620). There were no British presbyters at the council but many Greek delegates. "Seventy Eastern bishops" represented Asiatic factions, and small numbers came from other areas (Ecclesiastical History, ibid.). Caecilian of Carthage travelled from Africa, Paphnutius of Thebes from Egypt, Nicasius of Die (Dijon) from Gaul, and Donnus of Stridon made the journey from Pannonia.
It was at that puerile assembly, and with so many cults represented, that a total of 318 "bishops, priests, deacons, subdeacons, acolytes and exorcists" gathered to debate and decide upon a unified belief system that encompassed only one god (An Apology for Christianity, op. cit.). By this time, a huge assortment of "wild texts" (Catholic Encyclopedia, New Edition, "Gospel and Gospels") circulated amongst presbyters and they supported a great variety of Eastern and Western gods and goddesses: Jove, Jupiter, Salenus, Baal, Thor, Gade, Apollo, Juno, Aries, Taurus, Minerva, Rhets, Mithra, Theo, Fragapatti, Atys, Durga, Indra, Neptune, Vulcan, Kriste, Agni, Croesus, Pelides, Huit, Hermes, Thulis, Thammus, Eguptus, Iao, Aph, Saturn, Gitchens, Minos, Maximo, Hecla and Phernes (God's Book of Eskra, anon., ch. xlviii, paragraph 36).
Up until the First Council of Nicaea, the Roman aristocracy primarily worshipped two Greek gods, Apollo and Zeus, but the great bulk of common people idolised either Julius Caesar or Mithras (the Romanised version of the Persian deity Mithra). Caesar was deified by the Roman Senate after his death (15 March 44 BC) and subsequently venerated as "the Divine Julius". The word "Saviour" was affixed to his name, its literal meaning being "one who sows the seed", i.e., he was a phallic god. Julius Caesar was hailed as "God made manifest and universal Saviour of human life", and his successor Augustus was called the "ancestral God and Saviour of the whole human race" (Man and his Gods, Homer Smith, Little, Brown & Co., Boston, 1952). Emperor Nero (54-68), whose original name was Lucius Domitius Ahenobarbus (37-68), was immortalised on his coins as the "Saviour of mankind" (ibid.). The Divine Julius as Roman Saviour and "Father of the Empire" was considered "God" among the Roman rabble for more than 300 years. He was the deity in some Western presbyters' texts, but was not recognised in Eastern or Oriental writings.
Constantine's intention at Nicaea was to create an entirely new god for his empire who would unite all religious factions under one deity. Presbyters were asked to debate and decide who their new god would be. Delegates argued among themselves, expressing personal motives for inclusion of particular writings that promoted the finer traits of their own special deity. Throughout the meeting, howling factions were immersed in heated debates, and the names of 53 gods were tabled for discussion. "As yet, no God had been selected by the council, and so they balloted in order to determine that matter... For one year and five months the balloting lasted..." (God's Book of Eskra, Prof. S. L. MacGuire's translation, Salisbury, 1922, chapter xlviii, paragraphs 36, 41).
At the end of that time, Constantine returned to the gathering to discover that the presbyters had not agreed on a new deity but had balloted down to a shortlist of five prospects: Caesar, Krishna, Mithra, Horus and Zeus (Historia Ecclesiastica, Eusebius, c. 325). Constantine was the ruling spirit at Nicaea and he ultimately decided upon a new god for them. To involve British factions, he ruled that the name of the great Druid god, Hesus, be joined with the Eastern Saviour-god, Krishna (Krishna is Sanskrit for Christ), and thus Hesus Krishna would be the official name of the new Roman god. A vote was taken and it was with a majority show of hands (161 votes to 157) that both divinities became one God. Following longstanding heathen custom, Constantine used the official gathering and the Roman apotheosis decree to legally deify two deities as one, and did so by democratic consent. A new god was proclaimed and "officially" ratified by Constantine (Acta Concilii Nicaeni, 1618). That purely political act of deification effectively and legally placed Hesus and Krishna among the Roman gods as one individual composite. That abstraction lent Earthly existence to amalgamated doctrines for the Empire's new religion; and because there was no letter "J" in alphabets until around the ninth century, the name subsequently evolved into "Jesus Christ".

How the Gospels were created
Constantine then instructed Eusebius to organise the compilation of a uniform collection of new writings developed from primary aspects of the religious texts submitted at the council. His instructions were:
"Search ye these books, and whatever is good in them, that retain; but whatsoever is evil, that cast away. What is good in one book, unite ye with that which is good in another book. And whatsoever is thus brought together shall be called The Book of Books. And it shall be the doctrine of my people, which I will recommend unto all nations, that there shall be no more war for religions' sake."
(God's Book of Eskra, op. cit., chapter xlviii, paragraph 31)

"Make them to astonish," said Constantine, and "the books were written accordingly" (Life of Constantine, vol. iv, pp. 36-39). Eusebius amalgamated the "legendary tales of all the religious doctrines of the world together as one", using the standard god-myths from the presbyters' manuscripts as his exemplars. Merging the supernatural "god" stories of Mithra and Krishna with British Culdean beliefs effectively joined the orations of Eastern and Western presbyters together "to form a new universal belief" (ibid.). Constantine believed that the amalgamated collection of myths would unite variant and opposing religious factions under one representative story. Eusebius then arranged for scribes to produce "fifty sumptuous copies ... to be written on parchment in a legible manner, and in a convenient portable form, by professional scribes thoroughly accomplished in their art" (ibid.). "These orders," said Eusebius, "were followed by the immediate execution of the work itself ... we sent him [Constantine] magnificently and elaborately bound volumes of three-fold and four-fold forms" (Life of Constantine, vol. iv, p. 36). They were the "New Testimonies", and this is the first mention (c. 331) of the New Testament in the historical record.
With his instructions fulfilled, Constantine then decreed that the New Testimonies would thereafter be called the "word of the Roman Saviour God" (Life of Constantine, vol. iii, p. 29) and official to all presbyters sermonising in the Roman Empire. He then ordered earlier presbyterial manuscripts and the records of the council "burnt" and declared that "any man found concealing writings should be stricken off from his shoulders" (beheaded) (ibid.). As the record shows, presbyterial writings previous to the Council of Nicaea no longer exist, except for some fragments that have survived.
Some council records also survived, and they provide alarming ramifications for the Church. Some old documents say that the First Council of Nicaea ended in mid-November 326, while others say the struggle to establish a god was so fierce that it extended "for four years and seven months" from its beginning in June 325 (Secrets of the Christian Fathers, op. cit.). Regardless of when it ended, the savagery and violence it encompassed were concealed under the glossy title "Great and Holy Synod", assigned to the assembly by the Church in the 18th century. Earlier Churchmen, however, expressed a different opinion.

The Second Council of Nicaea in 786-87 denounced the First Council of Nicaea as "a synod of fools and madmen" and sought to annul "decisions passed by men with troubled brains" (History of the Christian Church, H. H. Milman, DD, 1871). If one chooses to read the records of the Second Nicaean Council and notes references to "affrighted bishops" and the "soldiery" needed to "quell proceedings", the "fools and madmen" declaration is surely an example of the pot calling the kettle black.
Constantine died in 337, and his fusion of many now-called pagan beliefs into a new religious system attracted many converts. Later Church writers made him "the great champion of Christianity", which he gave "legal status as the religion of the Roman Empire" (Encyclopedia of the Roman Empire, Matthew Bunson, Facts on File, New York, 1994, p. 86). Historical records reveal this to be incorrect, for it was "self-interest" that led him to create Christianity (A Smaller Classical Dictionary, J. M. Dent, London, 1910, p. 161). Yet it wasn't called "Christianity" until the 15th century (How The Great Pan Died, Professor Edmond S. Bordeaux [Vatican archivist], Mille Meditations, USA, MCMLXVIII, pp. 45-7).
Over the ensuing centuries, Constantine's New Testimonies were expanded upon, "interpolations" were added and other writings included (Catholic Encyclopedia, Farley ed., vol. vi, pp. 135-137; also, Pecci ed., vol. ii, pp. 121-122). For example, in 397 John "golden-mouthed" Chrysostom restructured the writings of Apollonius of Tyana, a first-century wandering sage, and made them part of the New Testimonies (Secrets of the Christian Fathers, op. cit.). The Latinised name for Apollonius is Paulus (A Latin-English Dictionary, J. T. White and J. E. Riddle, Ginn & Heath, Boston, 1880), and the Church today calls those writings the Epistles of Paul. Apollonius's personal attendant, Damis, an Assyrian scribe, is Demis in the New Testament (2 Tim. 4:10).
The Church hierarchy knows the truth about the origin of its Epistles, for Cardinal Bembo (d. 1547), secretary to Pope Leo X (d. 1521), advised his associate, Cardinal Sadoleto, to disregard them, saying "put away these trifles, for such absurdities do not become a man of dignity; they were introduced on the scene later by a sly voice from heaven" (Cardinal Bembo: His Letters and Comments on Pope Leo X, A. L. Collins, London, 1842 reprint).
The Church admits that the Epistles of Paul are forgeries, saying, "Even the genuine Epistles were greatly interpolated to lend weight to the personal views of their authors" (Catholic Encyclopedia, Farley ed., vol. vii, p. 645). Likewise, St Jerome (d. 420) declared that the Acts of the Apostles, the fifth book of the New Testament, was also "falsely written" ("The Letters of Jerome", Library of the Fathers, Oxford Movement, 1833-45, vol. v, p. 445).

The shock discovery of an ancient Bible
The New Testament subsequently evolved into a fulsome piece of priesthood propaganda, and the Church claimed it recorded the intervention of a divine Jesus Christ into Earthly affairs. However, a spectacular discovery in a remote Egyptian monastery revealed to the world the extent of later falsifications of the Christian texts, themselves only an "assemblage of legendary tales" (Encyclopédie, Diderot, 1759). On 4 February 1859, 346 leaves of an ancient codex were discovered in the furnace room at St Catherine's monastery at Mt Sinai, and its contents sent shockwaves through the Christian world. Along with other old codices, it was scheduled to be burned in the kilns to provide winter warmth for the inhabitants of the monastery. Written in Greek on donkey skins, it carried both the Old and New Testaments, and archaeologists later dated its composition to around the year 380. It was discovered by Dr Constantin von Tischendorf (1815-1874), a brilliant and pious German biblical scholar, and he called it the Sinaiticus, the Sinai Bible. Tischendorf was a professor of theology who devoted his entire life to the study of New Testament origins, and his desire to read all the ancient Christian texts led him on the long, camel-mounted journey to St Catherine's Monastery.
During his lifetime, Tischendorf had access to other ancient Bibles unavailable to the public, such as the Alexandrian (or Alexandrinus) Bible, believed to be the second oldest Bible in the world. It was so named because in 1627 it was taken from Alexandria to Britain and gifted to King Charles I (1600-49). Today it is displayed alongside the world's oldest known Bible, the Sinaiticus, in the British Library in London. During his research, Tischendorf had access to the Vaticanus, the Vatican Bible, believed to be the third oldest in the world and dated to the mid-sixth century (The Various Versions of the Bible, Dr Constantin von Tischendorf, 1874, available in the British Library). It was locked away in the Vatican's inner library. Tischendorf asked if he could extract handwritten notes, but his request was declined. However, when his guard took refreshment breaks, Tischendorf wrote comparative narratives on the palm of his hand and sometimes on his fingernails ("Are Our Gospels Genuine or Not?", Dr Constantin von Tischendorf, lecture, 1869, available in the British Library).
Today, there are several other Bibles written in various languages during the fifth and sixth centuries, examples being the Syriacus, the Cantabrigiensis (Bezae), the Sarravianus and the Marchalianus.
A shudder of apprehension echoed through Christendom in the last quarter of the 19th century when English-language versions of the Sinai Bible were published. Recorded within these pages is information that disputes Christianity's claim of historicity. Christians were provided with irrefutable evidence of wilful falsifications in all modern New Testaments. So different was the Sinai Bible's New Testament from versions then being published that the Church angrily tried to annul the dramatic new evidence that challenged its very existence. In a series of articles published in the London Quarterly Review in 1883, John W. Burgon, Dean of Chichester, used every rhetorical device at his disposal to attack the Sinaiticus' earlier and opposing story of Jesus Christ, saying that "...without a particle of hesitation, the Sinaiticus is scandalously corrupt ... exhibiting the most shamefully mutilated texts which are anywhere to be met with; they have become, by whatever process, the depositories of the largest amount of fabricated readings, ancient blunders and intentional perversions of the truth which are discoverable in any known copies of the word of God". Dean Burgon's concerns mirror opposing aspects of Gospel stories then current, having by now evolved to a new stage through centuries of tampering with the fabric of an already unhistorical document.

The revelations of ultraviolet light testing
In 1933, the British Museum in London purchased the Sinai Bible from the Soviet government for £100,000, of which £65,000 was gifted by public subscription. Prior to the acquisition, this Bible was displayed in the Imperial Library in St Petersburg, Russia, and "few scholars had set eyes on it" (The Daily Telegraph and Morning Post, 11 January 1938, p. 3). When it went on display in 1933 as "the oldest Bible in the world" (ibid.), it became the centre of a pilgrimage unequalled in the history of the British Museum.
Before I summarise its contradictions, it should be noted that this old codex is by no means a reliable guide to New Testament study, as it contains superabundant errors and serious re-editing. These anomalies were exposed as a result of months of ultraviolet-light tests carried out at the British Museum in the mid-1930s. The findings revealed replacements of numerous passages by at least nine different editors. Photographs taken during testing revealed that ink pigments had been retained deep in the pores of the skin, and the original words were readable under ultraviolet light. Anybody wishing to read the results of the tests should refer to the book written by the researchers who did the analysis: the Keepers of the Department of Manuscripts at the British Museum (Scribes and Correctors of the Codex Sinaiticus, H. J. M. Milne and T. C. Skeat, British Museum, London, 1938).

Forgery in the Gospels
When the New Testament in the Sinai Bible is compared with a modern-day New Testament, a staggering 14,800 editorial alterations can be identified. These amendments can be recognised by a simple comparative exercise that anybody can and should do. Serious study of Christian origins must emanate from the Sinai Bible's version of the New Testament, not modern editions.
Of importance is the fact that the Sinaiticus carries three Gospels since rejected: the Shepherd of Hermas (written by two resurrected ghosts, Charinus and Lenthius), the Missive of Barnabas and the Odes of Solomon. Space excludes elaboration on these bizarre writings and also discussion on dilemmas associated with translation variations.
Modern Bibles are five removes in translation from early editions, and disputes rage between translators over variant interpretations of more than 5,000 ancient words. However, it is what is not written in that old Bible that embarrasses the Church, and this article discusses only a few of those omissions. One glaring example is subtly revealed in the Encyclopaedia Biblica (Adam & Charles Black, London, 1899, vol. iii, p. 3344), where the Church divulges its knowledge about exclusions in old Bibles, saying: "The remark has long ago and often been made that, like Paul, even the earliest Gospels knew nothing of the miraculous birth of our Saviour". That is because there never was a virgin birth.
It is apparent that when Eusebius assembled scribes to write the New Testimonies, he first produced a single document that provided an exemplar or master version. Today it is called the Gospel of Mark, and the Church admits that it was "the first Gospel written" (Catholic Encyclopedia, Farley ed., vol. vi, p. 657), even though it appears second in the New Testament today. The scribes of the Gospels of Matthew and Luke were dependent upon the Mark writing as the source and framework for the compilation of their works. The Gospel of John is independent of those writings, and the late-15th-century theory that it was written later to support the earlier writings holds true (The Crucifixion of Truth, Tony Bushby, Joshua Books, 2004, pp. 33-40).
Thus, the Gospel of Mark in the Sinai Bible carries the "first" story of Jesus Christ in history, one completely different to what is in modern Bibles. It starts with Jesus "at about the age of thirty" (Mark 1:9), and doesn't know of Mary, a virgin birth or mass murders of baby boys by Herod. Words describing Jesus Christ as "the son of God" do not appear in the opening narrative as they do in today's editions (Mark 1:1), and the modern-day family tree tracing a "messianic bloodline" back to King David is non-existent in all ancient Bibles, as are the now-called "messianic prophecies" (51 in total). The Sinai Bible carries a conflicting version of events surrounding the "raising of Lazarus", and reveals an extraordinary omission that later became the central doctrine of the Christian faith: the resurrection appearances of Jesus Christ and his ascension into Heaven. No supernatural appearance of a resurrected Jesus Christ is recorded in any ancient Gospels of Mark, but a description of over 500 words now appears in modern Bibles (Mark 16:9-20).
Despite a multitude of long-drawn-out self-justifications by Church apologists, there is no unanimity of Christian opinion regarding the non-existence of "resurrection" appearances in ancient Gospel accounts of the story. Not only are those narratives missing in the Sinai Bible, but they are absent in the Alexandrian Bible, the Vatican Bible, the Bezae Bible and an ancient Latin manuscript of Mark, code-named "K" by analysts. They are also lacking in the oldest Armenian version of the New Testament, in sixth-century manuscripts of the Ethiopic version and ninth-century Anglo-Saxon Bibles. However, some 12th-century Gospels have the now-known resurrection verses written within asterisks, marks used by scribes to indicate spurious passages in a literary document.
The Church claims that "the resurrection is the fundamental argument for our Christian belief" (Catholic Encyclopedia, Farley ed., vol. xii, p. 792), yet no supernatural appearance of a resurrected Jesus Christ is recorded in any of the earliest Gospels of Mark available. A resurrection and ascension of Jesus Christ is the sine qua non ("without which, nothing") of Christianity (Catholic Encyclopedia, Farley ed., vol. xii, p. 792), confirmed by words attributed to Paul: "If Christ has not been raised, your faith is in vain" (1 Cor. 15:17). The resurrection verses in today's Gospels of Mark are universally acknowledged as forgeries and the Church agrees, saying "the conclusion of Mark is admittedly not genuine ... almost the entire section is a later compilation" (Encyclopaedia Biblica, vol. ii, p. 1880, vol. iii, pp. 1767, 1781; also, Catholic Encyclopedia, vol. iii, under the heading "The Evidence of its Spuriousness"; Catholic Encyclopedia, Farley ed., vol. iii, pp. 274-9 under heading "Canons"). Undaunted, however, the Church accepted the forgery into its dogma and made it the basis of Christianity.
The trend of fictitious resurrection narratives continues. The final chapter of the Gospel of John (21) is a sixth-century forgery, one entirely devoted to describing Jesus' resurrection to his disciples. The Church admits: "The sole conclusion that can be deduced from this is that the 21st chapter was afterwards added and is therefore to be regarded as an appendix to the Gospel" (Catholic Encyclopedia, Farley ed., vol. viii, pp. 441-442; New Catholic Encyclopedia (NCE), "Gospel of John", p. 1080; also NCE, vol. xii, p. 407).

"The Great Insertion" and "The Great Omission"
Modern-day versions of the Gospel of Luke have a staggering 10,000 more words than the same Gospel in the Sinai Bible. Six of those words say of Jesus "and was carried up into heaven", but this narrative does not appear in any of the oldest Gospels of Luke available today ("Three Early Doctrinal Modifications of the Text of the Gospels", F. C. Conybeare, The Hibbert Journal, London, vol. 1, no. 1, Oct 1902, pp. 96-113). Ancient versions do not verify modern-day accounts of an ascension of Jesus Christ, and this falsification clearly indicates an intention to deceive.
Today, the Gospel of Luke is the longest of the canonical Gospels because it now includes "The Great Insertion", an extraordinary 15th-century addition totalling around 8,500 words (Luke 9:51-18:14). The insertion of these forgeries into that Gospel bewilders modern Christian analysts, and of them the Church said: "The character of these passages makes it dangerous to draw inferences" (Catholic Encyclopedia, Pecci ed., vol. ii, p. 407).
Just as remarkable, the oldest Gospels of Luke omit all verses from 6:45 to 8:26, known in priesthood circles as "The Great Omission", a total of 1,547 words. In today's versions, that hole has been "plugged up" with passages plagiarised from other Gospels. Dr Tischendorf found that three paragraphs in newer versions of the Gospel of Luke's account of the Last Supper appeared in the 15th century, but the Church still passes its Gospels off as the unadulterated "word of God" ("Are Our Gospels Genuine or Not?", op. cit.).

The "Expurgatory Index"
As was the case with the New Testament, so also were damaging writings of early "Church Fathers" modified in centuries of copying, and many of their records were intentionally rewritten or suppressed.
Adopting the decrees of the Council of Trent (1545-63), the Church subsequently extended the process of erasure and ordered the preparation of a special list of specific information to be expunged from early Christian writings (Delineation of Roman Catholicism, Rev. Charles Elliott, DD, G. Lane & P. P. Sandford, New York, 1842, p. 89; also, The Vatican Censors, Professor Peter Elmsley, Oxford, p. 327, pub. date n/a).
In 1562, the Vatican established a special censoring office called Index Expurgatorius. Its purpose was to prohibit publication of "erroneous passages of the early Church Fathers" that carried statements opposing modern-day doctrine.
When Vatican archivists came across "genuine copies of the Fathers, they corrected them according to the Expurgatory Index" (Index Expurgatorius Vaticanus, R. Gibbings, ed., Dublin, 1837; The Literary Policy of the Church of Rome, Joseph Mendham, J. Duncan, London, 1830, 2nd ed., 1840; The Vatican Censors, op. cit., p. 328). This Church record provides researchers with "grave doubts about the value of all patristic writings released to the public" (The Propaganda Press of Rome, Sir James W. L. Claxton, Whitehaven Books, London, 1942, p. 182).
Important for our story is the fact that the Encyclopaedia Biblica reveals that around 1,200 years of Christian history are unknown: "Unfortunately, only few of the records [of the Church] prior to the year 1198 have been released". It was not by chance that, in that same year (1198), Pope Innocent III (1198-1216) suppressed all records of earlier Church history by establishing the Secret Archives (Catholic Encyclopedia, Farley ed., vol. xv, p. 287). Some seven-and-a-half centuries later, and after spending some years in those Archives, Professor Edmond S. Bordeaux wrote How The Great Pan Died. In a chapter titled "The Whole of Church History is Nothing but a Retroactive Fabrication", he said this (in part):
"The Church ante-dated all her late works, some newly made, some revised and some counterfeited, which contained the final expression of her history ... her technique was to make it appear that much later works written by Church writers were composed a long time earlier, so that they might become evidence of the first, second or third centuries."
(How The Great Pan Died, op. cit., p. 46)
Supporting Professor Bordeaux's findings is the fact that, in 1587, Pope Sixtus V (1585-90) established an official Vatican publishing division and said in his own words, "Church history will now be established ... we shall seek to print our own account" (Encyclopédie, Diderot, 1759). Vatican records also reveal that Sixtus V spent 18 months of his life as pope personally writing a new Bible and then introduced into Catholicism a "New Learning" (Catholic Encyclopedia, Farley ed., vol. v, p. 442, vol. xv, p. 376). The evidence that the Church wrote its own history is found in Diderot's Encyclopédie, and it reveals the reason why Pope Clement XIII (1758-69) ordered all volumes to be destroyed immediately after publication in 1759.

Gospel authors exposed as imposters
There is something else involved in this scenario and it is recorded in the Catholic Encyclopedia. An appreciation of the clerical mindset arises when the Church itself admits that it does not know who wrote its Gospels and Epistles, confessing that all 27 New Testament writings began life anonymously:
"It thus appears that the present titles of the Gospels are not traceable to the evangelists themselves ... they [the New Testament collection] are supplied with titles which, however ancient, do not go back to the respective authors of those writings." (Catholic Encyclopedia, Farley ed., vol. vi, pp. 655-6)
The Church maintains that "the titles of our Gospels were not intended to indicate authorship", adding that "the headings ... were affixed to them" (Catholic Encyclopedia, Farley ed., vol. i, p. 117, vol. vi, pp. 655, 656). Therefore they are not Gospels written "according to Matthew, Mark, Luke or John", as publicly stated. The full force of this confession reveals that there are no genuine apostolic Gospels, and that the Church's shadowy writings today embody the very ground and pillar of Christian foundations and faith. The consequences are fatal to the pretence of Divine origin of the entire New Testament and expose Christian texts as having no special authority. For centuries, fabricated Gospels bore Church certification of authenticity now confessed to be false, and this provides evidence that Christian writings are wholly fallacious.
After years of dedicated New Testament research, Dr Tischendorf expressed dismay at the differences between the oldest and newest Gospels, and had trouble understanding...
"...how scribes could allow themselves to bring in here and there changes which were not simply verbal ones, but such as materially affected the very meaning and, what is worse still, did not shrink from cutting out a passage or inserting one."
(Alterations to the Sinai Bible, Dr Constantin von Tischendorf, 1863, available in the British Library, London)
After years of validating the fabricated nature of the New Testament, a disillusioned Dr Tischendorf confessed that modern-day editions have "been altered in many places" and are "not to be accepted as true" (When Were Our Gospels Written?, Dr Constantin von Tischendorf, 1865, British Library, London).

Just what is Christianity?
The important question then to ask is this: if the New Testament is not historical, what is it?
Dr Tischendorf provided part of the answer when he said in his 15,000 pages of critical notes on the Sinai Bible that "it seems that the personage of Jesus Christ was made narrator for many religions". This explains how narratives from the ancient Indian epic, the Mahabharata, appear verbatim in the Gospels today (e.g., Matt. 1:25, 2:11, 8:1-4, 9:1-8, 9:18-26), and why passages from the Phaenomena of the Greek poet Aratus of Soli (c. 315-240 BC) are in the New Testament.
Extracts from the Hymn to Zeus, written by Greek philosopher Cleanthes (c. 331-232 BC), are also found in the Gospels, as are 207 words from the Thais of the Greek comic dramatist Menander (c. 342-291 BC). Quotes from the semi-legendary Greek poet Epimenides (7th or 6th century BC) are applied to the lips of Jesus Christ, and seven passages from the curious Ode of Jupiter (c. 150 BC; author unknown) are reprinted in the New Testament.
Tischendorf's conclusion also supports Professor Bordeaux's Vatican findings that reveal the allegory of Jesus Christ derived from the fable of Mithra, the divine son of God (Ahura Mazda) and messiah of the first kings of the Persian Empire around 400 BC. His birth in a grotto was attended by magi who followed a star from the East. They brought "gifts of gold, frankincense and myrrh" (as in Matt. 2:11) and the newborn baby was adored by shepherds. He came into the world wearing the Mithraic cap, which popes imitated in various designs until well into the 15th century.
Mithra, one of a trinity, stood on a rock, the emblem of the foundation of his religion, and was anointed with honey. After a last supper with Helios and 11 other companions, Mithra was crucified on a cross, bound in linen, placed in a rock tomb and rose on the third day, around 25 March (the full moon at the spring equinox, a time now called Easter after the Babylonian goddess Ishtar). The fiery destruction of the universe was a major doctrine of Mithraism, at which time Mithra promised to return in person to Earth and save deserving souls. Devotees of Mithra partook in a sacred communion banquet of bread and wine, a ceremony that paralleled the Christian Eucharist and preceded it by more than four centuries.
Christianity is an adaptation of Mithraism welded with the Druidic principles of the Culdees, some Egyptian elements (the pre-Christian Book of Revelation was originally called The Mysteries of Osiris and Isis), Greek philosophy and various aspects of Hinduism.

Why there are no records of Jesus Christ
It is not possible to find in any legitimate religious or historical writings compiled between the beginning of the first century and well into the fourth century any reference to Jesus Christ and the spectacular events that the Church says accompanied his life. This confirmation comes from Frederic Farrar (1831-1903) of Trinity College, Cambridge:
"It is amazing that history has not embalmed for us even one certain or definite saying or circumstance in the life of the Saviour of mankind ... there is no statement in all history that says anyone saw Jesus or talked with him. Nothing in history is more astonishing than the silence of contemporary writers about events relayed in the four Gospels."
(The Life of Christ, Frederic W. Farrar, Cassell, London, 1874)
This situation arises from a conflict between history and New Testament narratives. Dr Tischendorf made this comment:
"We must frankly admit that we have no source of information with respect to the life of Jesus Christ other than ecclesiastic writings assembled during the fourth century."
(Codex Sinaiticus, Dr Constantin von Tischendorf, British Library, London)

NEXUS: Forged Origins of the New Testament

World fresh water usage and supply graph

Evolutionary Psychology - Steven Pinker

59 min 2 sec - Jul 1, 2002
Steven Pinker is Johnstone Family Professor of Psychology at Harvard University; he previously taught at MIT.

Simplified schematic of a multistage thermonuclear weapon

Sequence of events in explosion:

STAGE 1: fission explosion
- Multiple detonators (3) simultaneously initiate detonation of high explosives (4).
- As detonation progresses through high explosives (4), shaping of these charges transforms the explosive shock front to one that is spherically symmetric, travelling inward.
- Explosive shock front compresses and transits the pusher (5) which facilitates transition of the shock wave from low-density high explosive to high-density core material.
- Shock front in turn compresses the reflector (5), tamper (6), and fissile core (7) inward.
- When compression of the fissile core (7) reaches optimum density, a neutron initiator (either in the center of the fissile core or outside the high explosive assembly) releases a burst of neutrons into the core.
- The neutron burst initiates a fission chain reaction in the fissile core (7): a neutron splits a plutonium/uranium-235 atom, releasing perhaps two or three neutrons to do the same to other atoms, and so on; energy release increases geometrically.
- Many neutrons escaping from the fissile core (7) are reflected back to it by the tamper (6) and reflector (5), improving the chain reaction.
- The mass of the tamper (6) delays the fissile core (7) from expanding under the heat of the building energy release.
- Neutrons from the chain reaction in the fissile core (7) cause transmutation of atoms in the uranium-238 tamper (6).
- As the superheated core expands under the energy release, the chain reaction ends.

STAGE 2: fusion explosion
- Gamma radiation from the fission explosion superheats the filler material (2), turning it into a plasma.
- The vaporized filler material (2) is delayed from expanding outward by the bomb casing (1), increasing its tendency to compress the fusion pusher/tamper (9).
- Compression reaches the fusion fuel (10), which has been partially protected from gamma radiation by the radiation shield (8).
- Compression reaches the fissile sparkplug (11), compressing it to a super-critical mass.
- Neutrons from the explosion of stage 1 reach the fissile sparkplug (11) through the channel in the radiation shield (8), initiating a fission chain reaction.
- The sparkplug (11) explodes outward.
- The fusion fuel (10) is now supercompressed between the fusion pusher/tamper (9) from without and the sparkplug (11) from within, turning it into a superheated plasma.
- Lithium and deuterium nuclei collide in the fusion fuel (10) to produce tritium, and tritium and deuterium nuclei engage in fusion reactions: nuclei fuse by pairs into helium nuclei, producing a large release of energy as gamma rays, neutrons, and heat.
- The large release of neutrons from fusion in the fusion fuel (10) causes transmutation of uranium-238 atoms in the fusion pusher/tamper (9), releasing additional energy.
- All reactions end as the superheated remnants expand under the energy release; the entire weapon is vaporized.
- Total elapsed time: about 0.00002 seconds.
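The "geometric increase" in the Stage 1 chain reaction can be sketched with a toy model. Every number here (the multiplication factor k, the generation time t_gen, the number of generations) is an illustrative assumption for showing the arithmetic, not a real weapon parameter:

```python
# Toy model of geometric neutron multiplication in a fission chain reaction.
# k      -- assumed average neutrons per fission that trigger another fission
# t_gen  -- assumed time between successive fission generations, in seconds
# All values are illustrative, not real weapon parameters.

def chain_reaction(k=2.0, t_gen=1e-8, generations=80):
    """Return (total fissions, elapsed time in seconds) after n generations."""
    neutrons = 1          # the initiator's first neutron
    total_fissions = 0
    for _ in range(generations):
        total_fissions += neutrons
        neutrons = int(neutrons * k)   # each generation multiplies by k
    return total_fissions, generations * t_gen

fissions, elapsed = chain_reaction()
print(f"{fissions:.2e} fissions in {elapsed * 1e6:.2f} microseconds")
```

With k = 2 each generation doubles the neutron population, so 80 generations give roughly 2^80 (about 10^24) fissions in under a microsecond of model time, which is why the entire sequence above can plausibly finish in a few hundred-thousandths of a second.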

Greek Gods Family Tree

Good explanation.

Diagram of Programming Languages

When Worry Hijacks The Brain - TIME

Thursday, Aug. 02, 2007
When Worry Hijacks The Brain
By Jeffrey Kluger

Even the most stable brain operates just a millimeter from madness. In such a finely tuned cognitive engine, only a small part must start to sputter before the whole machine comes crashing down. When that happens, reason and function come undone, rarely more dramatically than in the neurochemical storm that is obsessive-compulsive disorder.
Say you leave work at 6 p.m. for what should be a 12-minute drive home. Say just as you're pulling onto the street, a child on a bicycle crosses in front of you. A few feet later, you feel the thump of a pothole. But what if it wasn't a pothole? Suppose you hit the child. You look in your rearview mirror, and all is clear, but can you be sure? So you circle back around the block. Still clear--except for a lumpy bag of leaves on the curb. But is it a bag or a child? So you circle once more. Four hours later, you finally arrive home, mutter something to your spouse about a late meeting and go to bed spent and ashamed. Tomorrow you'll do it all over again.
Devoting an entire evening to a 12-minute drive is not the only way to know you've got obsessive-compulsive disorder (OCD). You know it when you shrink from the sight of a kitchen knife, worried that you'll inexplicably snatch it up and hurt yourself or a family member. You know it when leaving the house consumes hours of your day because the pillows on your bed must be placed just right. You know it when you can't leave the house at all for fear of a vast and vague contamination that you can't even name.
We all think we know what OCD is, and most of the time we're all wrong. It's the nervous guy from Monk; it's cranky Jack Nicholson in As Good As It Gets. In the end, though, things usually work out for them. They even get the girl, who sees them as a kind of adorable emotional fixer-upper.
But OCD isn't adorable. About 7 million adults, teens and children in the U.S. are now thought to have it in one form or another, and their pain is far worse than you probably know. What's more, since one family member disabled by the disorder can destabilize an entire household, a single diagnosed case can mean several collateral victims. Worse, OCD is a condition that often masquerades as other things. It is routinely labeled depression, bipolar disorder, attention-deficit/hyperactivity disorder (ADHD), autism, even schizophrenia. Victims often conceal their problem for years, ensuring that no diagnosis--right or wrong--can begin to be made.
With the twin obstacles of secrecy and mislabeling, the average lag time between the onset of the disorder and a proper diagnosis is now a shocking nine years, according to surveys of doctors conducted by the Obsessive Compulsive Foundation, a 21-year-old organization with headquarters in New Haven, Conn. It takes an average of eight additional years before effective treatment is prescribed. If the disorder strikes a young person, as it often does, that can mean an entire childhood lost to illness. "OCD has had a slow research start," says Gerald Nestadt, co-director of the OCD clinic at Johns Hopkins University. "It's behind schizophrenia, bipolar disorder, autism and ADHD."
But all that is changing. A burst of new genetics studies is turning up insights into the causes of the disorder. Scanning technologies are pinpointing the parts of the brain that trigger the symptoms. New treatments are being developed. And refinements of old treatments, like talk and behavioral therapy, are proving more effective than ever.
"Everyone has intrusive thoughts, but most people consider them meaningless and can move on with their lives," says psychologist Sabine Wilhelm, associate professor at the Harvard Medical School and director of the OCD clinic at Massachusetts General Hospital. "For people with OCD, the thoughts become their lives. We can give those lives back to them."


ON THE WHOLE, A LITTLE ANXIETY IS A VERY good thing. It was not enough for humans in the state of nature to know there was no lion near the family cave; they also had to be able to imagine all the other places a lion could lurk. The same is true for other eccentricities of human behavior. Our anxiety about all the ways harm may befall someone else keeps us mindful of the safety of family and community. "There's a creative, what-if quality to this thinking," says clinical psychologist Jonathan Grayson of the Anxiety and Agoraphobia Treatment Center in Bala Cynwyd, Pa. "It's evolutionarily valuable."
Something woven so tightly into the genome is not likely to be shaken loose by a few thousand years of modern living. But that doesn't mean every person with eccentric traits--the woman in the office next to yours who keeps her desk impeccably neat and gets edgy if something is moved out of place, for example--has OCD. "Having these OCD-like traits is a universal experience," says Judith Rapoport, author of the landmark book The Boy Who Couldn't Stop Washing and chief of child psychiatry at the National Institute of Mental Health. "I sometimes count on my fingers when I have nothing to count." The key to diagnosing whether such behavior is authentic OCD is how great an impact the behavior has on your life. "You have to show longstanding interference with function, and that eliminates most people," Rapoport explains.
What causes some people to suffer that interference and most not? Why does their internal alarm keep shouting "Lion!" long after they've checked every place a lion could plausibly be? The answer has always been thought to lie principally in a small, almond-shaped structure in the brain called the amygdala--the place where danger is processed and evaluated. It stands to reason that if this risk center is overactive, it would keep on alerting you to peril even after you've attended to the problem.
As it turns out, the amygdala is indeed a big player in the pathological process of OCD but only one of several players. Functional magnetic resonance imaging (fMRI) and other scanning technologies have allowed researchers to peer deeper than ever into the OCD-tossed brain. In addition to the amygdala, there are three other anatomical hot spots involved in the disorder: the orbital frontal cortex, the caudate nucleus and the thalamus--the first two seated high in the brain, the third lying deeper within.
"Those areas are linked along a circuit," says Dr. Sanjaya Saxena, director of the OCD program at the University of California at San Diego. It's the job of that wiring to regulate your response to the stimuli around you, including how anxious you are in the face of threatening or frustrating things. "That circuit," says Saxena, "is abnormally active in people with OCD."
Saxena, who has conducted extensive scanning research, has even come to recognize the neural fingerprint that distinguishes one less common type of OCD behavior--hoarding--from better-known ones. Hoarders who live alone have been known to crowd themselves into small areas of their home, with clear paths left from sofa to kitchen to bathroom, and the rest piled high with debris. When Saxena scanned the brains of these highly particular people, he found that they had equally particular abnormalities. Instead of hyperactivity in any area, they had reduced activity in the anterior cingulate gyrus, the part of the brain that helps you focus your attention and make decisions. "Those are things that compulsive hoarders have a lot of trouble with," he says.


ALTHOUGH SCANS CAN TELL YOU THE landscape of the obsessive-compulsive brain, they can't tell you how it got to be that way. As with many other psychological disorders, research is revealing that OCD has a powerful genetic component. Having any blood relative with OCD puts your risk of the disorder at 12%, and while that seems low, it's still more than four times as high as that of the U.S. population as a whole.
If the disorder comes to you through the genes, the next job is to determine which ones. A team of investigators at Johns Hopkins University last summer discovered half a dozen areas in the human genome that appear to be linked to the development of OCD. Analyzing 1,008 blood samples from 219 families in which at least two siblings had the disorder, they discovered gene markers at six sites on five chromosomes that appear more frequently in those kids than in family members and other people without OCD. That study did not tease out how those genes do their damage, but another group has identified a seventh gene whose mechanism is clearer.
Located on the ninth chromosome, that gene--discovered in two studies by researchers at several universities including the University of Michigan and the University of Toronto--appears to regulate a brain chemical known as glutamate. One of a number of substances that stimulate signaling among neurons, glutamate works fine unless you've got too much on hand. Then the signals just keep coming. In the case of the alarm centers in the brain, that means the warning bell just keeps on ringing. "Glutamate has to be taken up quickly because otherwise it becomes toxic to the brain cells," says Vladimir Coric, director of OCD research at Yale University and a leader in studies of the chemical.
What makes the glutamate-related gene especially suspect is the particular people it affects the most. OCD strikes males and females about evenly, but early-onset forms tend to target boys more than girls. This is particularly true in cases in which the boys also exhibit the involuntary tics or vocalizations often associated with Tourette's syndrome. Interacting with the glutamate gene are three genes related to androgens, or masculinizing hormones. Interacting with those is another gene that has been implicated in Tourette's. Gather all these together in the same chromosomal neighborhood, and they can make trouble. "Kids who start early tend to be boys, tend to have tic disorders and, in genetic analyses, tend to have parents with tic disorders too," says John March, chief of child psychiatry at Duke University.
Other compelling, if controversial, research has long pursued an entirely different cause of OCD: streptococcal infection. As long ago as the 17th century, British physician Thomas Sydenham first noticed a link between childhood strep and the later onset of a tic condition that became known as Sydenham's chorea. Modern researchers who saw a link between tics and OCD began wondering if, in some cases, strep might be involved with both.
Last year investigators from the University of Chicago and the University of Washington studied a group of 144 children-- 71% of whom were boys--who had tics or OCD. All the kids, it turned out, were more than twice as likely as others to have had a strep infection in the previous three months. For those with Tourette's symptoms, the strep incidence was a whopping 13 times as great.
The tics and OCD are probably the result of an autoimmune response, in which the body begins attacking its own healthy tissue. Blood tests of kids with strep-related tics and OCD have turned up antibodies hostile to neural tissue, particularly in the brain's caudate nucleus and putamen, regions associated with reinforcement learning. "There certainly seems to be an epidemiological relationship there," says Dr. Cathy Budman, associate professor of psychiatry and neurology at New York University, "but what it means needs to be further investigated."


NO MATTER HOW OR WHEN THE DISORDER hits, the first step in striking back is usually comparatively short-term behavioral therapy, using a technique known as exposure and response prevention (ERP), in which OCD sufferers don't try to avoid their particular source of anxiety but actually seek it out. Eventually, emotional nerve endings grow desensitized to the stimulus. The point is to tough it out until that happens.
At the Obsessive Compulsive Foundation convention in Atlanta last summer, Grayson, the Pennsylvania-based clinical psychologist, gave those in attendance who had OCD a quick taste of ERP. Inviting the ones in the audience with dirt and germ anxieties to come forward, he instructed them to sit beside him on the ballroom carpet. Then he told them to touch the carpet and bring their fingers to their lips. Left to themselves, most would have refused or, if they went along, would have then found the nearest bathroom and spent long minutes--perhaps long hours--scrubbing. Instead, they sat with Grayson and the anxiety, learning a very early lesson that the pain does subside. Extended ERP treatment involves a graduated series of such exposures, each a bit more challenging than the one before it.
Such tactical jujitsu works for all manner of OCD, though it's not always easy to find a doctor skilled at administering it. Patients obsessed about their sexual orientation, who become intolerably anxious if they so much as notice an attractive member of the same sex, are assigned to do just that: flip through magazines for scantily clad same-sex models. People plagued by what's known as relationship substantiation, who become consumed by inconsequential defects in a partner, are encouraged to seek out those flaws and even exaggerate them in their mind.
Medication helps too. Antidepressants such as Prozac and other selective serotonin reuptake inhibitors (SSRIs) can help dial down the anxiety enough that patients can get started with ERP and, significantly, stay with it. When patients are children, practitioners are more reluctant to prescribe medication, but they are careful not to stay too long with ERP alone if it's not producing results. "The longer a child struggles with an illness, the more impact it's going to have," says Dr. John Piacentini, director of UCLA's child OCD clinic. Still, there are some people--kids and adults--whose OCD is so acute that more extreme methods are needed, such as hospitalization, more intensive exposure therapy and other medications.
Coric, of Yale, is among the growing group of investigators experimenting with drugs targeting the glutamate problem. The best medication so far, riluzole, was originally developed for Lou Gehrig's disease and works simply by turning down the glutamate spigot, reducing the amount that's available in the brain. In Coric's admittedly small studies and clinical observations, half of about 50 subjects experienced at least a 35% remission, and almost all the rest improved at least a little.
Much more invasively, investigators are looking into deep-brain stimulation (DBS), in which electrodes are implanted in the brain and connected by wires embedded in the skin to a pacemaker-like device in the chest. Low doses of current can then be applied as needed to calm the turmoil in the regions of the brain that cause OCD. The procedure sounds extreme--and it is--but it's already been used in about 35,000 people worldwide to treat Parkinson's disease, and FDA approval to use DBS for OCD as well is pending. "Many of our OCD patients are able to re-engage in life rather than being stuck at home," says neurosurgeon Ali Rezai of the Cleveland Clinic, who performs DBS surgery for Parkinson's and has researched it for OCD.
For the vast majority of people, the treatment never needs to go so far. OCD, for all the suffering it inflicts, is nothing more than the brain doing something it's supposed to do--warning you of danger--but doing it very badly. Living in the world means living with risks: real ones, imagined ones, exaggerated ones. That's not an easy lesson, but it's a powerful lesson--one that, once learned, can offer a paradoxical state of peace.

When Worry Hijacks The Brain - TIME

Hot and Cold Emotions Make Us Poor Judges

Hot and Cold Emotions Make Us Poor Judges
By Shankar Vedantam
Monday, August 6, 2007

Why would David Vitter, a U.S. senator with four young children, have gotten involved with a seedy escort service? Why would Michael Vick, a gifted NFL quarterback, get mixed up with the sordid world of dog fighting? Why would Bill Clinton, a Rhodes scholar, five-time governor and president of the United States at 46, have an affair with an intern in the Oval Office?

It isn't just men behaving badly. Remember Lisa Nowak, the married NASA astronaut who drove from Houston to Orlando (wearing diapers so she wouldn't have to make bathroom stops, police said) allegedly in order to kidnap her rival in a love triangle?
Whenever these scandals break, the rest of us shake our heads and ask, "What were they thinking?"
That feeling of incredulity is now the subject of a growing body of research. It isn't just that people find it difficult to understand or empathize with others who do crazy things. People find it very difficult to imagine how they themselves would behave when strong emotions are involved.
Studies have found that, for some reason, an enormous mental gulf separates "cold" emotional states from "hot" emotional states. When we are not hungry or thirsty or sexually aroused, we find it difficult to understand what effects those factors can have on our behavior. Similarly, when we are excited or angry, it is difficult to think about the consequences of our behavior -- outcomes that are glaringly obvious when we are in a cold emotional state.
Vitter (R-La.), for example, demanded in late June that the Title V Abstinence Education program be reauthorized: "These programs have been shown to effectively reduce the risks of out-of-wedlock pregnancy and sexually transmitted diseases by teaching teenagers that saving sex until marriage and remaining faithful afterwards is the best choice for health and happiness," he declared.
A little more than two weeks later, Vitter was apologizing for a "serious sin" in his past, after his telephone number was found among the telephone lists of the alleged D.C. Madam. Hypocrisy? Possibly. But if the research is accurate, what it suggests is that Vitter-the-policymaker probably finds Vitter-the-escort-service-client as incomprehensible as everyone else does.
"We tend to exaggerate the importance of willpower," said George Loewenstein, a professor of economics and psychology at Carnegie Mellon University who has studied the phenomenon of hot and cold emotional states and the surprisingly diverse implications of the gulf that separates them.
Many health resolutions, for example, are made when people are in a cold state. But while they may intellectually grasp the temptation of a potato chip or a cigarette, they do not appreciate in advance how visceral the desire can be -- which is why many resolutions fail when put to the test.
Psychologist Louis Giordano once asked heroin addicts on a maintenance course of the heroin substitute buprenorphine whether they would prefer an extra dose five days later or a sum of money. He found that when addicts were asked the question right before they got a dose -- when their craving was highest -- they valued the extra dose more than twice as much as addicts who had just taken their buprenorphine. The addicts who were in a craving state viscerally understood how much they would want the extra dose later; the satiated addicts, on the other hand, overestimated how easily they could do without the fix.
Similarly, when cancer researcher Maurice Slevin quizzed medical professionals about whether they would endure grueling chemotherapy to extend their lives by only a few months, fewer than one in 10 said it was worth it -- they were evaluating the question in a cold state. When he asked patients who actually had cancer the same question -- these were dying people who were in a very hot state -- nearly half said a few more weeks of life was worth the pain of chemo.
The empathy gulf between hot and cold states, Loewenstein said, might also explain why many patients are undertreated for pain. Patients viscerally experience their agony; doctors who are coolly evaluating the situation have to make a leap of imagination across the gulf that separates hot and cold states.
Other experiments have found that shoppers at grocery stores spend more when they are hungry than they do when they are full.
The empathy gap between hot and cold states not only keeps people from realizing how prone they can be to temptation but from enjoying things as much as they could: Marriage therapists, for instance, find that couples who report being uninterested in sex are usually surprised to find how much they enjoy intimacy once an encounter takes place. Couples in a cold state don't realize how they will feel once they are in a hot state.
Loewenstein said his research made it difficult for him to serve on a university disciplinary committee, because he now empathizes with students who make mistakes in the heat of the moment. And when big public scandals break, he automatically thinks about the empathy gap that prompts so many people to be judgmental of others.
"Most people have their own vices," he said. "When we are dealing with our vices, we are shortsighted, impulsive and make ridiculous sacrifices to satisfy our vices. But when we see other people succumbing to their vices, we think, 'How pathetic.' "

Shankar Vedantam - Hot and Cold Emotions Make Us Poor Judges - washingtonpost.com

Derren Brown - Messiah

This documentary-styled film sees Derren in America attempting to raise questions about the validity of certain religious and spiritual belief systems; belief systems that people are encouraged to base their lives upon - such as new-age faiths and mainstream Christianity. Can he get certain authority figures to endorse him as the real thing?

The Psychology of the Obvious

Isn’t it all just obvious?

Psychology? Well, it’s all obvious, isn’t it? Just common sense, but dressed up with big words to confuse people. Many of us are familiar with this kind of accusation. Tell someone you are a psychologist and it is nearly as popular a response as ‘can you tell what I’m thinking, then?’. Hell, I’m sure that during the long dark afternoons many a researcher has found themselves wondering if it might even be true. What, after all, is most published research really saying? When you convert it into everyday language, what exciting message is there left to tell the ordinary person in the street?
I used to keep a stock of ‘unobvious’ findings ready to hand for occasions like this. Is it really obvious that people can be made to enjoy a task more by being paid less to recruit others for it (cognitive dissonance: Festinger & Carlsmith, 1959)? That a saline solution can be as effective as morphine in killing pain (the placebo effect: Hrobjartsson, 2001)? That students warned that excessive drinking is putting many of their peers at risk may actually drink more, whereas advertising the fact that most students don’t drink, or drink in moderation, actually reduces binge drinking (Perkins et al., 2005)? That over a third of normal people report having had hallucinations, something we normally associate solely with mental illness or substance abuse (Ohayon, 2000)? Or that the majority of ordinary Americans could be persuaded to deliver apparently lethal electric shocks to another person merely because a scientist in a white coat asked them to (Milgram, 1974)?
Other notions that have been challenged include that children need to be taught language (Chomsky, Pinker), that parenting style has a significant effect on child development (Rich Harris) or that anti-social behaviour is caused by low self-esteem (Baumeister). (Let’s not dwell on the uncomfortable fact that many ‘popular notions’ were originally propagated by psychologists!)
Another tack you can try in response to the ‘obvious’ accusation is attacking the very idea of what obvious is. Saying ‘I could have told you that’ after the fact is far easier than getting there first. Like inventions, the best psychological research findings should be obvious after they have been discovered – but it is surely the case that they aren’t so obvious beforehand.
In fact, there is good evidence that most of us share a common cognitive bias in the form of the illusion of explanatory depth: we mistake our familiarity with a situation for an understanding of how it works. Rozenblit and Keil (2002) showed this by asking people how well they understood certain common devices (such as a flush toilet, a clothes zip or a cylinder lock) and then asking them to describe in detail how these things worked. People’s self-ratings of their understanding dropped as they were forced to confront their ignorance. Furthermore, when asked detailed diagnostic questions about the devices, their self-ratings dropped still lower. We can speculate that additional evidence that they didn’t really understand these devices as well as they thought (such as being asked to assemble them from component parts) would have produced even further drops in self-ratings of understanding. Perhaps, similarly, people don’t realise how difficult it is to make psychological judgements before the fact – we know that in other domains people can be prevented from insight into their inability by the same lack of knowledge that generates that inability (Kruger & Dunning, 1999). In other words, what we don’t know can get in the way of our figuring out that we don’t know it and correcting the problem.
Some wise experimenters have insured themselves against being charged with proving the obvious by asking in advance what people would expect. Milgram did this with his experiments on obedience to authority. He described the procedure to psychiatrists and to ordinary people and asked for their predictions – both groups were wildly inaccurate about how few people would defy authority, and how many would proceed to the highest level of ‘shock’ (the psychiatrists were furthest out: Milgram, 1974). In another example, drivers of low-status cars are more likely to be honked at if they pause at green traffic lights: a finding in direct contradiction of the reports of (male) interviewees, who said they would be more likely to honk at a high-status car driver than a low-status one (Doob & Gross, 1968).
The illusion of explanatory depth works both ways, of course. Non-psychologists might have a tendency to mistake their familiarity with psychological processes for an understanding of their operation, but there is nothing that makes individual psychologists immune from this. Describing and categorising psychological phenomena is vital to understanding, but within all of us the illusion of explanatory depth is primed to make us mistake a detailed description of what occurs for an understanding of why it occurs.

Obviously conflicting
People’s everyday beliefs are rather wobbly foundations for psychological science. Things which people think to be ‘obvious’ don’t make up a coherent theory of human behaviour (see the brief but excellent discussion in Stanovich, 1998). Research has shown that people are perfectly willing to endorse contradictory statements – so that something like ‘absence makes the heart grow fonder’ can be seen as obviously true, yet so is ‘out of sight out of mind’ (Teigen, 1986).
It has been argued that many currently popular beliefs are false (Kohn, 1990; see also Mackay, 1841/1995). This shouldn’t surprise us when considered in the light of popular beliefs from the past and how ridiculous they seem now (for example the notion, popular in parts of Europe during the 14th century, that the Black Death was caused by the fashion of wearing pointed shoes: Hecker, 1844). Many modern beliefs have been overturned, or at least challenged, by psychologists. For example (provided in this context by Stanovich, 1998), there is the modern American dictum that making teenagers work during high school is character building and a generally positive thing to do. Research has shown that working during high school tends to harm teenagers’ academic work. What evidence there is on it being character building is also disappointing – work appears to make teenagers cynical about corporate culture and the value of hard work while promoting rather than deterring at least some forms of delinquent behaviour (Steinberg et al., 1993; Greenberger & Steinberg, 1986).

Embracing the obvious
Is the purpose of the study of psychology to produce wise and insightful individuals, to whom many things are obvious? It is not (and perhaps this is a good thing, given the history of abuses perpetrated by some psychologists who have considered themselves to have become wise and insightful: see Masson, 1989). Instead the purpose of psychological science is making findings about the human mind and behaviour available – obvious! – to everyone. By explicitly, rigorously, stating propositions about psychology and laying them open to testing we are democratising knowledge. We are making knowledge public, explicit and usable by everyone. That means stating the obvious, so that anyone can come and disagree with it (and so that we can be sure we aren’t deluding ourselves about what is true or what we know).
If psychology is to be a science then the obvious needs to be thoroughly explored. Most research is Kuhnian ‘normal science’. It is what happens between scientific revolutions; the monotonous process of confirming what we think is probably true, and disconfirming what we think probably isn’t (Kuhn, 1962). But, in rare cases, there is an unusual result – something we thought was obvious turns out not to be – and it is via these anomalies, Kuhn argued, that science progresses. Popper (1963; discussed in Chalmers, 1982) said something similar – that science progresses through the confirmation of bold conjectures and the falsification of cautious ones. By definition it would be unfair of us to expect the falsification of cautious conjectures to be routine. We have to test lots of statements that seem obvious before we find the door of discovery ajar and the situation reveals itself to be more complex than we thought.
Many of the activities of normal science are essential and involve delineating the exact form of a phenomenon, determining its magnitude, sphere of influence and limitations of effects. This kind of incremental addition to the heap of knowledge isn’t headline grabbing – especially if an uninvolved media report the findings of the study, rather than why it was needed or novel.
A related point is that most psychology may be obvious and, indeed, possible to judge as worthless retrospectively. But perhaps this majority needs to exist so that the minority of unobvious, worthwhile findings can be brought forth. After all, the majority of new products quickly fail, just as the vast majority of new species have rapidly become extinct (Ormerod, 2005). Theodore Sturgeon was generalising this even further when he famously said ‘90 per cent of everything is crap’. The point is that 90 per cent of psychology research might be worthless, but it is generated by the same processes that create the worthwhile 10 per cent. Individual readers can decide for themselves what they think the true proportion of worthless to worthwhile research is – the essential point remains that it isn’t possible to judge the difference in advance. Nor is it possible for any individual to judge on their own, and the collaborative sifting of findings, methods and theories is the wider process of science.
